live555: streaming real-time H.264 video and MP3 audio


This post uses live555's RTSP server to send real-time video and audio streams, and enlarges the video buffers so that data is not lost when individual video frames are large.

Real-time H.264 video stream

The H.264 streaming follows https://blog.csdn.net/caoshangpa/article/details/53200527.

The key change is made when the server is created: in sms->addSubsession(H264VideoFileServerMediaSubsession::createNew(*env, inputFileName, reuseFirstSource)), replace H264VideoFileServerMediaSubsession with your own subsession class.

H264VideoFileServerMediaSubsession calls ByteStreamFileSource::createNew(envir(), fFileName) in its createNewStreamSource(unsigned /*clientSessionId*/, unsigned& estBitrate) function, and the frames themselves are fetched in ByteStreamFileSource's doGetNextFrame(). So the approach is to derive from H264VideoFileServerMediaSubsession and ByteStreamFileSource, override createNewStreamSource and doGetNextFrame, and fetch the data in doGetNextFrame in your own way.

Code

The code in main.cpp that creates the RTSP server:

// Create the task scheduler and initialize the usage environment
TaskScheduler* scheduler = BasicTaskScheduler::createNew();
UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);
UserAuthenticationDatabase* authDB = NULL;

// Create the RTSP server and start listening for client connections.
// Note that the port is not the default 554, so it must be given in the URL.
RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554, authDB);
if (rtspServer == NULL)
{
    *env << "Failed to create RTSP server: " << env->getResultMsg() << "\n";
    exit(1);
}

char const* descriptionString = "Session streamed by \"video\"";
// Stream name (media name); a client requests this stream by passing streamName to the RTSP server
char const* streamName = "video";

// Create the media session. The mapping between the stream name and the sources is
// established by adding subsessions; the session manages the description, duration,
// stream name and other session-level information.
// Parameters 2-4: media name, media info, media description.
ServerMediaSession* sms = ServerMediaSession::createNew(*env, streamName, streamName, descriptionString);

OutPacketBuffer::maxSize = 90000;

// Use the H264LiveVideoServerMediaSubssion implemented below instead of the file-based subsession
H264LiveVideoServerMediaSubssion* sub = H264LiveVideoServerMediaSubssion::createNew(*env, reuseFirstSource);
sms->addSubsession(sub);
Mp3LiveServerMediaSubssion* subAudio = Mp3LiveServerMediaSubssion::createNew(*env, reuseFirstSource);
sms->addSubsession(subAudio);

// Register the session with the RTSP server
rtspServer->addServerMediaSession(sms);

qInfo() << "\n url" << rtspServer->rtspURL(sms) << "\n stream" << streamName << "\n";
*env << "\n url" << rtspServer->rtspURL(sms) << "\n stream" << streamName << "\n";

media->startVideo();

// Enter the event loop; socket reads and delayed media sends are all handled here
env->taskScheduler().doEventLoop();
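
The snippet above uses two names that come from the surrounding program and are not shown in this post: reuseFirstSource and media. A minimal sketch of what those declarations might look like; the capture class and its methods are hypothetical placeholders, not part of live555:

// Hypothetical declarations assumed by the snippet above (not from the original post):
Boolean reuseFirstSource = True;            // live555 Boolean; True lets all RTSP clients share the single live source
MyLiveCapture* media = new MyLiveCapture(); // hypothetical capture object that provides startVideo() and
                                            // the real-time frame getters used by the live sources below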

h264livevideoservermediasubssion.h

#ifndef H264LIVEVIDEOSERVERMEDIASUBSSION_H
#define H264LIVEVIDEOSERVERMEDIASUBSSION_H

#include "H264VideoFileServerMediaSubsession.hh"

class H264LiveVideoServerMediaSubssion : public H264VideoFileServerMediaSubsession
{
public:
    static H264LiveVideoServerMediaSubssion* createNew(UsageEnvironment& env, Boolean reuseFirstSource);

protected:
    H264LiveVideoServerMediaSubssion(UsageEnvironment& env, Boolean reuseFirstSource);
    ~H264LiveVideoServerMediaSubssion();

protected:
    // Override the virtual function that creates the stream source
    FramedSource* createNewStreamSource(unsigned clientSessionId, unsigned& estBitrate);
};

#endif // H264LIVEVIDEOSERVERMEDIASUBSSION_H

h264livevideoservermediasubssion.cpp

#include "h264livevideoservermediasubssion.h" #include "h264LiveFramedSource.h" #include "H264VideoStreamFramer.hh"H264LiveVideoServerMediaSubssion* H264LiveVideoServerMediaSubssion::createNew(UsageEnvironment& env, Boolean reuseFirstSource) {return new H264LiveVideoServerMediaSubssion(env, reuseFirstSource); }H264LiveVideoServerMediaSubssion::H264LiveVideoServerMediaSubssion(UsageEnvironment& env, Boolean reuseFirstSource) : H264VideoFileServerMediaSubsession(env, 0, reuseFirstSource) {}H264LiveVideoServerMediaSubssion::~H264LiveVideoServerMediaSubssion() { }FramedSource* H264LiveVideoServerMediaSubssion::createNewStreamSource(unsigned clientSessionId, unsigned& estBitrate) {//estimate bitrate:估計的比特率,記得根據(jù)需求修改estBitrate = 100; // kbps//創(chuàng)建視頻源H264LiveFramedSource* liveSource = H264LiveFramedSource::createNew(envir());if (liveSource == NULL){return NULL;}//為視頻流創(chuàng)建Framerreturn H264VideoStreamFramer::createNew(envir(), liveSource); }

h264liveframedsource.h

#ifndef _H264LIVEFRAMEDSOURCE_H
#define _H264LIVEFRAMEDSOURCE_H

#include "ByteStreamFileSource.hh"
#include "UsageEnvironment.hh"

class H264LiveFramedSource : public ByteStreamFileSource
{
public:
    static H264LiveFramedSource* createNew(UsageEnvironment& env, unsigned preferredFrameSize = 0, unsigned playTimePerFrame = 0);
    unsigned int maxFrameSize() const;

protected:
    H264LiveFramedSource(UsageEnvironment& env, unsigned preferredFrameSize, unsigned playTimePerFrame);
    ~H264LiveFramedSource();

private:
    // Override the virtual function that delivers the next frame
    virtual void doGetNextFrame();
};

#endif

h264liveframedsource.cpp:

#include "h264LiveFramedSource.h" #include "GroupsockHelper.hh" #include <QByteArray>H264LiveFramedSource::H264LiveFramedSource(UsageEnvironment& env, unsigned preferredFrameSize, unsigned playTimePerFrame) : ByteStreamFileSource(env, 0, preferredFrameSize, playTimePerFrame) {}H264LiveFramedSource* H264LiveFramedSource::createNew(UsageEnvironment& env, unsigned preferredFrameSize, unsigned playTimePerFrame) {H264LiveFramedSource* newSource = new H264LiveFramedSource(env, preferredFrameSize, playTimePerFrame);return newSource; }H264LiveFramedSource::~H264LiveFramedSource() {}// This function is called when new frame data is available from the device. // We deliver this data by copying it to the 'downstream' object, using the following parameters (class members): // 'in' parameters (these should *not* be modified by this function): // fTo: The frame data is copied to this address. // (Note that the variable "fTo" is *not* modified. Instead, // the frame data is copied to the address pointed to by "fTo".) // fMaxSize: This is the maximum number of bytes that can be copied // (If the actual frame is larger than this, then it should // be truncated, and "fNumTruncatedBytes" set accordingly.) // 'out' parameters (these are modified by this function): // fFrameSize: Should be set to the delivered frame size (<= fMaxSize). // fNumTruncatedBytes: Should be set iff the delivered frame would have been // bigger than "fMaxSize", in which case it's set to the number of bytes // that have been omitted. // fPresentationTime: Should be set to the frame's presentation time // (seconds, microseconds). This time must be aligned with 'wall-clock time' - i.e., the time that you would get // by calling "gettimeofday()". // fDurationInMicroseconds: Should be set to the frame's duration, if known. // If, however, the device is a 'live source' (e.g., encoded from a camera or microphone), then we probably don't need // to set this variable, because - in this case - data will never arrive 'early'.void H264LiveFramedSource::doGetNextFrame() {QByteArray data = nullptr;//這里獲取實時自己的實時數(shù)據(jù)// media_->getData(data);fFrameSize = data.size();if (fFrameSize > fMaxSize){fNumTruncatedBytes = fFrameSize - fMaxSize;fFrameSize = fMaxSize;envir()<<"frame size "<<fFrameSize<<" MaxSize size "<<fMaxSize<<"fNumTruncatedBytes\n";}else{fNumTruncatedBytes = 0;}if(data.size()!=0){//把得到的實時數(shù)據(jù)復制進輸出端memmove(fTo, data.data(), fFrameSize);}gettimeofday(&fPresentationTime, NULL);//時間戳//表示延遲0秒后再執(zhí)行afterGetting函數(shù),也可以直接用afterGetting(this)nextTask() = envir().taskScheduler().scheduleDelayedTask(0, (TaskFunc*)FramedSource::afterGetting, this); // nextTask() = (TaskFunc*)FramedSource::afterGetting(this);}//這里返回的數(shù)值是BANK_SIZE的最大值 unsigned int H264LiveFramedSource::maxFrameSize() const {return 300000; }


MP3 audio stream

I originally tried adapting https://blog.csdn.net/taixinlfx/article/details/8854440, but never got it to work.

So I used the same approach as for H.264: the demo testOnDemandRTSPServer.cpp in the testProgs folder shows that the subsession class for MP3 is MP3AudioFileServerMediaSubsession, which can be adapted in the same way.

When the server is created, replace MP3AudioFileServerMediaSubsession in sms->addSubsession(MP3AudioFileServerMediaSubsession::createNew(*env, inputFileName, reuseFirstSource)) with your own subsession class.

MP3AudioFileServerMediaSubsession calls ByteStreamFileSource::createNew in its createNewStreamSource function, and the frames are fetched in ByteStreamFileSource's doGetNextFrame(). So, as before, derive from MP3AudioFileServerMediaSubsession and ByteStreamFileSource, override createNewStreamSource and doGetNextFrame, and fetch the data in doGetNextFrame in your own way.

The same pattern can be tried for streaming real-time data in any other format.

Code

mp3liveservermediasubssion.h

#ifndef MP3LIVESERVERMEDIASUBSSION_H
#define MP3LIVESERVERMEDIASUBSSION_H

#include "MP3AudioFileServerMediaSubsession.hh"

class Mp3LiveServerMediaSubssion : public MP3AudioFileServerMediaSubsession
{
public:
    static Mp3LiveServerMediaSubssion* createNew(UsageEnvironment& env, Boolean reuseFirstSource);

protected:
    Mp3LiveServerMediaSubssion(UsageEnvironment& env, Boolean reuseFirstSource);
    ~Mp3LiveServerMediaSubssion();

protected:
    // Override the virtual function that creates the stream source
    FramedSource* createNewStreamSource(unsigned clientSessionId, unsigned& estBitrate);
};

#endif // MP3LIVESERVERMEDIASUBSSION_H

mp3liveservermediasubssion.cpp

#include "mp3liveservermediasubssion.h" #include "mp3liveframedsource.h"Mp3LiveServerMediaSubssion *Mp3LiveServerMediaSubssion::createNew(UsageEnvironment &env, Boolean reuseFirstSource) {return new Mp3LiveServerMediaSubssion(env, reuseFirstSource); }Mp3LiveServerMediaSubssion::Mp3LiveServerMediaSubssion(UsageEnvironment &env, Boolean reuseFirstSource) : MP3AudioFileServerMediaSubsession(env, 0, reuseFirstSource, false, NULL) { }Mp3LiveServerMediaSubssion::~Mp3LiveServerMediaSubssion() { }FramedSource* Mp3LiveServerMediaSubssion::createNewStreamSource(unsigned clientSessionId, unsigned& estBitrate) {//estimate bitrate:估計的比特率,記得根據(jù)需求修改estBitrate = 100; // kbps//創(chuàng)建視頻源Mp3LiveFramedSource* liveSource = Mp3LiveFramedSource::createNew(envir());if (liveSource == NULL){return NULL;}//為視頻流創(chuàng)建Framerreturn createNewStreamSourceCommon(liveSource, liveSource->fileSize(),estBitrate); }

mp3liveframedsource.h

#ifndef MP3LIVEFRAMEDSOURCE_H
#define MP3LIVEFRAMEDSOURCE_H

#include "ByteStreamFileSource.hh"
#include "UsageEnvironment.hh"
#include "GroupsockHelper.hh"

class Mp3LiveFramedSource : public ByteStreamFileSource
{
public:
    static Mp3LiveFramedSource* createNew(UsageEnvironment& env, unsigned preferredFrameSize = 0, unsigned playTimePerFrame = 0);
    unsigned int maxFrameSize() const;

protected:
    Mp3LiveFramedSource(UsageEnvironment& env, unsigned preferredFrameSize, unsigned playTimePerFrame);
    ~Mp3LiveFramedSource();

private:
    // Override the virtual function that delivers the next frame
    virtual void doGetNextFrame();
};

#endif // MP3LIVEFRAMEDSOURCE_H

mp3liveframedsource.cpp

#include "mp3liveframedsource.h"Mp3LiveFramedSource *Mp3LiveFramedSource::createNew(UsageEnvironment &env, unsigned preferredFrameSize, unsigned playTimePerFrame) {Mp3LiveFramedSource* newSource = new Mp3LiveFramedSource(env, preferredFrameSize, playTimePerFrame);return newSource; }Mp3LiveFramedSource::Mp3LiveFramedSource(UsageEnvironment &env, unsigned preferredFrameSize, unsigned playTimePerFrame) : ByteStreamFileSource(env, 0, preferredFrameSize, playTimePerFrame) {}Mp3LiveFramedSource::~Mp3LiveFramedSource() {}void Mp3LiveFramedSource::doGetNextFrame() {QByteArray data = nullptr;//這里獲取自己的實時音頻數(shù)據(jù)//media_->getDataAudio(data);...fFrameSize = data.size();if (fFrameSize > fMaxSize){fNumTruncatedBytes = fFrameSize - fMaxSize;fFrameSize = fMaxSize;envir()<<"frame size "<<fFrameSize<<" MaxSize size "<<fMaxSize<<"fNumTruncatedBytes\n";}else{fNumTruncatedBytes = 0;}if(data.size()!=0){memmove(fTo, data.data(), fFrameSize);}gettimeofday(&fPresentationTime, NULL);//時間戳nextTask() = envir().taskScheduler().scheduleDelayedTask(40000, (TaskFunc*)FramedSource::afterGetting, this); }unsigned int Mp3LiveFramedSource::maxFrameSize() const {return 300000; }


Enlarging the video buffers

When the video being streamed has a high resolution, live555 needs a few adjustments.


1. Increase OutPacketBuffer::maxSize on the sending side. live555's default OutPacketBuffer size is only 60000 bytes, which can cause outgoing data to be dropped and the receiver to show a corrupted picture. So call the following in main():

OutPacketBuffer::maxSize = 90000;

If the value is too small (here it was left at live555's default), you may see an error like this:

MultiFramedRTPSink::afterGettingFrame1(): The input frame data was too large for our buffer size (61140). 12748 bytes of trailing data was dropped! Correct this by increasing "OutPacketBuffer::maxSize" to at least 72748, *before* creating this 'RTPSink'. (Current value is 60000.)


2. Increase the frame-parsing buffer size, i.e. BANK_SIZE. The default is 150 KB; because the video frames being streamed are large, it is set to 300 KB here. Otherwise frames that exceed the buffer size are discarded by live555.

The value is defined in StreamParser.cpp in the live555 source; change it to #define BANK_SIZE 300000 and rebuild live555.
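
For reference, the edit in the live555 source is a single line (shown here for liveMedia/StreamParser.cpp; the surrounding code may differ slightly between live555 releases):

// liveMedia/StreamParser.cpp
// #define BANK_SIZE 150000      // original default (150 KB)
#define BANK_SIZE 300000         // enlarged to accommodate bigger video frames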


3. Override maxFrameSize() in the ByteStreamFileSource subclasses so that it returns the new BANK_SIZE value directly, as shown in the source classes above.
