How to Play Multiple RTSP or RTMP Streams Simultaneously in Unity3D on Windows
Many developers building AR, VR, or education products struggle to get a stable, low-latency RTSP or RTMP player on Windows. Writing a player from scratch on top of Unity3D is costly and slow, and ill-suited to shipping a product quickly. In our view, the best current approach is to integrate a mature, stable player from the native platform, have it call RGB/YUV data back to the upper layer, and let the upper layer do the rendering.
Without further ado, here is multi-stream playback on Windows as an example:
1. The native player SDK must support delivering frames in at least one uncompressed image format such as RGB/YUV420/NV12
On Windows, for example, we have YUV called back up (NT_SP_E_VIDEO_FRAME_FROMAT_I420). This article uses the Windows RTSP/RTMP player SDK of daniulive (大牛直播SDK, on Github) as the example; the code is as follows:
```csharp
public void Play(int sel)
{
    if (videoctrl[sel].is_running)
    {
        Debug.Log("Already playing..");
        return;
    }

    lock (videoctrl[sel].frame_lock_)
    {
        videoctrl[sel].cur_video_frame_ = null;
    }

    OpenPlayer(sel);

    if (videoctrl[sel].player_handle_ == IntPtr.Zero)
        return;

    // Set the playback URL
    NTSmartPlayerSDK.NT_SP_SetURL(videoctrl[sel].player_handle_, videoctrl[sel].videoUrl);

    /* ++ Pre-playback parameter configuration goes here ++ */
    int play_buffer_time_ = 100;
    NTSmartPlayerSDK.NT_SP_SetBuffer(videoctrl[sel].player_handle_, play_buffer_time_);    // set buffer time

    int is_using_tcp = 0;    // TCP mode
    NTSmartPlayerSDK.NT_SP_SetRTSPTcpMode(videoctrl[sel].player_handle_, is_using_tcp);

    int timeout = 10;
    NTSmartPlayerSDK.NT_SP_SetRtspTimeout(videoctrl[sel].player_handle_, timeout);

    int is_auto_switch_tcp_udp = 1;
    NTSmartPlayerSDK.NT_SP_SetRtspAutoSwitchTcpUdp(videoctrl[sel].player_handle_, is_auto_switch_tcp_udp);

    Boolean is_mute_ = false;
    NTSmartPlayerSDK.NT_SP_SetMute(videoctrl[sel].player_handle_, is_mute_ ? 1 : 0);    // whether playback starts muted

    int is_fast_startup = 1;
    NTSmartPlayerSDK.NT_SP_SetFastStartup(videoctrl[sel].player_handle_, is_fast_startup);    // enable fast-startup mode

    Boolean is_low_latency_ = false;
    NTSmartPlayerSDK.NT_SP_SetLowLatencyMode(videoctrl[sel].player_handle_, is_low_latency_ ? 1 : 0);    // enable low-latency mode or not

    // Set rotation (only 0, 90, 180, 270 degrees are valid; other values are ignored)
    int rotate_degrees = 0;
    NTSmartPlayerSDK.NT_SP_SetRotation(videoctrl[sel].player_handle_, rotate_degrees);

    int volume = 100;
    NTSmartPlayerSDK.NT_SP_SetAudioVolume(videoctrl[sel].player_handle_, volume);    // playback volume, range [0, 100]; 0 is mute, 100 is maximum (default 100)

    // Report upload/download speed
    int is_report = 0;
    int report_interval = 1;
    NTSmartPlayerSDK.NT_SP_SetReportDownloadSpeed(videoctrl[sel].player_handle_, is_report, report_interval);
    /* -- Pre-playback parameter configuration goes here -- */

    // Video frame callback (YUV/RGB)
    videoctrl[sel].video_frame_call_back_ = new SP_SDKVideoFrameCallBack(NT_SP_SetVideoFrameCallBack);
    NTSmartPlayerSDK.NT_SP_SetVideoFrameCallBack(videoctrl[sel].player_handle_,
        (Int32)NT.NTSmartPlayerDefine.NT_SP_E_VIDEO_FRAME_FORMAT.NT_SP_E_VIDEO_FRAME_FROMAT_I420,
        window_handle_, videoctrl[sel].video_frame_call_back_);

    UInt32 flag = NTSmartPlayerSDK.NT_SP_StartPlay(videoctrl[sel].player_handle_);

    if (flag == DANIULIVE_RETURN_OK)
    {
        videoctrl[sel].is_need_get_frame_ = true;
        Debug.Log("Playback started successfully");
    }
    else
    {
        videoctrl[sel].is_need_get_frame_ = false;
        Debug.LogError("Failed to start playback");
    }

    videoctrl[sel].is_running = true;
}
```

2. Handle the frames delivered by the callback
```csharp
private void SDKVideoFrameCallBack(UInt32 status, IntPtr frame, int sel)
{
    // Marshal the native frame here and copy its planes for later use
    NT_SP_VideoFrame video_frame = (NT_SP_VideoFrame)Marshal.PtrToStructure(frame, typeof(NT_SP_VideoFrame));

    VideoFrame u3d_frame = new VideoFrame();
    u3d_frame.width_  = video_frame.width_;
    u3d_frame.height_ = video_frame.height_;
    u3d_frame.timestamp_ = (UInt64)video_frame.timestamp_;

    int d_y_stride = video_frame.width_;
    int d_u_stride = (video_frame.width_ + 1) / 2;
    int d_v_stride = d_u_stride;

    int d_y_size = d_y_stride * video_frame.height_;
    int d_u_size = d_u_stride * ((video_frame.height_ + 1) / 2);
    int d_v_size = d_u_size;

    int u_v_height = (u3d_frame.height_ + 1) / 2;

    u3d_frame.y_stride_ = d_y_stride;
    u3d_frame.u_stride_ = d_u_stride;
    u3d_frame.v_stride_ = d_v_stride;

    u3d_frame.y_data_ = new byte[d_y_size];
    u3d_frame.u_data_ = new byte[d_u_size];
    u3d_frame.v_data_ = new byte[d_v_size];

    CopyFramePlane(u3d_frame.y_data_, d_y_stride, video_frame.plane0_, video_frame.stride0_, u3d_frame.height_);
    CopyFramePlane(u3d_frame.u_data_, d_u_stride, video_frame.plane1_, video_frame.stride1_, u_v_height);
    CopyFramePlane(u3d_frame.v_data_, d_v_stride, video_frame.plane2_, video_frame.stride2_, u_v_height);

    lock (videoctrl[sel].frame_lock_)
    {
        videoctrl[sel].cur_video_frame_ = u3d_frame;
        //Debug.LogError("sel: " + sel + " w:" + u3d_frame.width_ + " h:" + u3d_frame.height_);
    }
}
```

3. Create the corresponding RGB/YUV420 shader in Unity3D and fill its textures with the image data
```csharp
private void UpdateYUVTexture(VideoFrame video_frame, int sel)
{
    if (video_frame.y_data_ == null || video_frame.u_data_ == null || video_frame.v_data_ == null)
    {
        Debug.Log("video frame with null..");
        return;
    }

    if (videoctrl[sel].yTexture_ != null)
    {
        videoctrl[sel].yTexture_.LoadRawTextureData(video_frame.y_data_);
        videoctrl[sel].yTexture_.Apply();
    }

    if (videoctrl[sel].uTexture_ != null)
    {
        videoctrl[sel].uTexture_.LoadRawTextureData(video_frame.u_data_);
        videoctrl[sel].uTexture_.Apply();
    }

    if (videoctrl[sel].vTexture_ != null)
    {
        videoctrl[sel].vTexture_.LoadRawTextureData(video_frame.v_data_);
        videoctrl[sel].vTexture_.Apply();
    }
}
```

4. The playback effect is as follows
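The `CopyFramePlane` helper called in step 2 is not shown in the original post. As a minimal sketch, assuming the semantics implied by its call sites (destination managed buffer, destination stride, native source pointer, source stride, row count — the implementation below is hypothetical, not taken from the SDK), a stride-aware row copy might look like this:

```csharp
// Hypothetical sketch of the CopyFramePlane helper referenced in step 2:
// copies 'height' rows from a native plane into a managed buffer,
// honoring the (possibly padded) source stride.
private static void CopyFramePlane(byte[] dst, int dst_stride, IntPtr src, int src_stride, int height)
{
    // Copy row by row; only dst_stride bytes per row carry pixel data,
    // and any extra src_stride padding in the native plane is skipped.
    for (int row = 0; row < height; ++row)
    {
        Marshal.Copy(new IntPtr(src.ToInt64() + (long)row * src_stride),
                     dst, row * dst_stride, dst_stride);
    }
}
```

If the source stride happens to equal the destination stride, the whole plane could be copied with a single `Marshal.Copy`, but decoders often pad rows, so the per-row copy is the safe general form.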
Summary
For multi-stream playback under Unity3D, first make sure the module you call for pulling and decoding the streams can call back YUV/RGB data; once the data comes up, the upper layer simply refreshes the display. Not as complicated as you expected, is it?
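Step 3 above assumes a shader that converts the three I420 planes back to RGB on the GPU. A minimal ShaderLab sketch is shown below; the texture property names (`_YTex`/`_UTex`/`_VTex`) are my own, it assumes the three plane textures were created as single-channel `Alpha8` textures (so the sample lands in `.a`), and the BT.601 conversion constants are one common choice, not necessarily what the SDK's sample uses:

```shaderlab
Shader "Unlit/I420ToRGB"
{
    Properties
    {
        _YTex ("Y plane", 2D) = "black" {}
        _UTex ("U plane", 2D) = "gray" {}
        _VTex ("V plane", 2D) = "gray" {}
    }
    SubShader
    {
        Pass
        {
            CGPROGRAM
            #pragma vertex vert_img
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _YTex, _UTex, _VTex;

            fixed4 frag(v2f_img i) : SV_Target
            {
                // Sample the three planes (stored as single-channel Alpha8 textures)
                float y = tex2D(_YTex, i.uv).a;
                float u = tex2D(_UTex, i.uv).a - 0.5;
                float v = tex2D(_VTex, i.uv).a - 0.5;

                // BT.601 full-range YUV -> RGB conversion (one common choice)
                fixed3 rgb;
                rgb.r = y + 1.402 * v;
                rgb.g = y - 0.344 * u - 0.714 * v;
                rgb.b = y + 1.772 * u;
                return fixed4(rgb, 1.0);
            }
            ENDCG
        }
    }
}
```

A material using this shader gets its three textures assigned once at setup, and `UpdateYUVTexture` from step 3 then only needs to refresh the raw plane data each frame.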