Hardware-Decoding an H.264 Video Stream on iOS

Before iOS 8.0, audio/video development on Apple platforms required third-party libraries for encoding and decoding (see the earlier post on soft-decoding H.264 with FFmpeg), which meant a steep learning curve and the risk of a project slipping past schedule. Starting with iOS 8.0, Apple opened up the VideoToolbox framework for video encoding and decoding, and audio/video development has been considerably simpler since.
1. Hardware-decoding terms (data structures)
1) VTDecompressionSessionRef: the decoder session object;
2) CMVideoFormatDescriptionRef: format and description information for video decoding;
3) CVPixelBufferRef: the image data structure before encoding and after decoding;
4) CMBlockBufferRef: the memory structure holding the compressed image data before decoding;
5) CMSampleBufferRef: the container structure holding a video frame before decoding;
6) AVSampleBufferDisplayLayer: a layer that decodes and displays a CMSampleBufferRef directly;
7) SPS, PPS: H.264 decoder parameter sets; IDR: an I-frame in the H.264 stream.
2. H.264 hardware-decoding flow chart
(Flow chart omitted: parse SPS/PPS from the stream → create CMVideoFormatDescriptionRef → create VTDecompressionSessionRef → wrap each NALU in CMBlockBufferRef/CMSampleBufferRef → VTDecompressionSessionDecodeFrame → display via CVPixelBufferRef or AVSampleBufferDisplayLayer.)
3. Raw network-stream layout of an IDR (I) frame
Normally, an I-frame in a raw network video stream carries SPS, PPS, SEI, and IDR data together, as shown in the figure below. Some streams, however, carry only the IDR data in the frame, with the other decoder parameters delivered as separate slices.
(Figure omitted: an I-frame laid out as SPS, PPS, SEI, and IDR NAL units, each preceded by a start code.)
4. Hardware-decoding APIs
1. Initializing the H.264 hardware decoder
1) Build the decode format description, CMVideoFormatDescriptionRef, with CMVideoFormatDescriptionCreateFromH264ParameterSets:
const uint8_t *const parameterSetPointers[2] = {pSPS, pPPS};
const size_t parameterSetSizes[2] = {mSpsSize, mPpsSize};
OSStatus status = CMVideoFormatDescriptionCreateFromH264ParameterSets(
                      kCFAllocatorDefault,
                      2,    // number of parameter sets: SPS and PPS
                      parameterSetPointers,
                      parameterSetSizes,
                      4,    // NAL unit header (length-prefix) size in bytes
                      &mDecoderFormatDescription);

2) Build the decoder session, VTDecompressionSessionRef, with VTDecompressionSessionCreate:
uint32_t pixelFormatType = kCVPixelFormatType_420YpCbCr8BiPlanarFullRange; // NV12
const void *keys[]   = { kCVPixelBufferPixelFormatTypeKey };
const void *values[] = { CFNumberCreate(NULL, kCFNumberSInt32Type, &pixelFormatType) }; // 32-bit
CFDictionaryRef attrs = CFDictionaryCreate(NULL, keys, values, 1, NULL, NULL);

VTDecompressionOutputCallbackRecord callBackRecord;
callBackRecord.decompressionOutputCallback = didDecompress;
callBackRecord.decompressionOutputRefCon = NULL;

status = VTDecompressionSessionCreate(kCFAllocatorDefault,
                                      mDecoderFormatDescription,
                                      NULL, attrs,
                                      &callBackRecord,
                                      &mDeocderSession);
CFRelease(attrs);

2. H.264 hardware decoding
1) Wrap the raw video data in a CMBlockBufferRef, mainly as a stepping stone to a CMSampleBufferRef:
CMBlockBufferRef blockBuffer = NULL;
OSStatus status = CMBlockBufferCreateWithMemoryBlock(kCFAllocatorDefault, (void *)videoBuffer, videoBufferSize, kCFAllocatorNull, NULL, 0, videoBufferSize, 0, &blockBuffer);

CMSampleBufferRef sampleBuffer = NULL;
const size_t sampleSizeArray[] = { videoBufferSize };
status = CMSampleBufferCreateReady(kCFAllocatorDefault, blockBuffer, mDecoderFormatDescription, 1, 0, NULL, 1, sampleSizeArray, &sampleBuffer);

2) Feed the CMSampleBufferRef to VTDecompressionSessionDecodeFrame for decoding:
VTDecodeFrameFlags flags = 0;
VTDecodeInfoFlags flagOut = 0;
CVPixelBufferRef outputPixelBuffer = NULL;
OSStatus decodeStatus = VTDecompressionSessionDecodeFrame(mDeocderSession, sampleBuffer, flags, &outputPixelBuffer, &flagOut);

3) When rendering with an AVSampleBufferDisplayLayer, you can skip the previous step and hand the CMSampleBufferRef straight to the layer for display:
CFArrayRef attachments = CMSampleBufferGetSampleAttachmentsArray(sampleBuffer, YES);
CFMutableDictionaryRef dict = (CFMutableDictionaryRef)CFArrayGetValueAtIndex(attachments, 0);
CFDictionarySetValue(dict, kCMSampleAttachmentKey_DisplayImmediately, kCFBooleanTrue);

if ([self.displayLayer isReadyForMoreMediaData]) {
    @weakify(self);
    dispatch_sync(dispatch_get_main_queue(), ^{
        @strongify(self);
        [self.displayLayer enqueueSampleBuffer:sampleBuffer];
    });
}

3. Displaying the decoded data
This article supports three output modes: UIImage, CVPixelBufferRef, and AVSampleBufferDisplayLayer. Because my project needs UIImage, it is the default conversion mode.
CVPixelBufferRef: outputs the pixel buffer directly, without a UIImage conversion;
AVSampleBufferDisplayLayer: no explicit decoding in code; the layer decodes and displays on its own;
UIImage: obtained by a further conversion from CVPixelBufferRef (two conversion methods are provided; see the full code below):
CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
CIContext *temporaryContext = [CIContext contextWithOptions:nil];
CGImageRef videoImage = [temporaryContext createCGImage:ciImage
                                               fromRect:CGRectMake(0, 0, CVPixelBufferGetWidth(pixelBuffer), CVPixelBufferGetHeight(pixelBuffer))];
image = [[UIImage alloc] initWithCGImage:videoImage];
CGImageRelease(videoImage);

5. Complete H.264 decoding code
In principle I encourage writing this code yourself and learning the underlying concepts. That said, the complete iOS code for hardware-decoding a raw H.264 stream is posted below for reference and study; if you spot a problem or have a question, feel free to leave a comment. The CLog calls in the code are logging output and can be disabled.
//
//  H264HwDecoder.h
//  IOTCamera
//
//  Created by lzj<lizhijian_21@163.com> on 2017/2/18.
//  Copyright (c) 2017 LZJ. All rights reserved.
//

#import <Foundation/Foundation.h>
#import <VideoToolbox/VideoToolbox.h>
#import <AVFoundation/AVSampleBufferDisplayLayer.h>

typedef enum : NSUInteger {
    H264HWDataType_Image = 0,
    H264HWDataType_Pixel,
    H264HWDataType_Layer,
} H264HWDataType;

@interface H264HwDecoder : NSObject

@property (nonatomic, assign) H264HWDataType showType;                  // output type
@property (nonatomic, strong) UIImage *image;                           // decoded frame when decoding to RGB
@property (nonatomic, assign) CVPixelBufferRef pixelBuffer;             // decoded buffer when decoding to YUV
@property (nonatomic, strong) AVSampleBufferDisplayLayer *displayLayer; // display layer

@property (nonatomic, assign) BOOL isNeedPerfectImg; // whether to build a complete UIImage (only effective when showType is 0)

- (instancetype)init;

/**
 Decode an H.264 video stream

 @param videoData frame data
 @param videoSize frame size
 @return the frame's (width, height); invalid when rendering to an AVSampleBufferDisplayLayer
 */
- (CGSize)decodeH264VideoData:(uint8_t *)videoData videoSize:(NSInteger)videoSize;

/** Release the decoder */
- (void)releaseH264HwDecoder;

/**
 Snapshot of the current video frame
 @return image
 */
- (UIImage *)snapshot;

@end

//
//  H264HwDecoder.m
//  IOTCamera
//
//  Created by lzj<lizhijian_21@163.com> on 2017/2/18.
//  Copyright (c) 2017 LZJ. All rights reserved.
//#import "H264HwDecoder.h"#ifndef FreeCharP #define FreeCharP(p) if (p) {free(p); p = NULL;} #endiftypedef enum : NSUInteger {HWVideoFrameType_UNKNOWN = 0,HWVideoFrameType_I,HWVideoFrameType_P,HWVideoFrameType_B,HWVideoFrameType_SPS,HWVideoFrameType_PPS,HWVideoFrameType_SEI, } HWVideoFrameType;@interface H264HwDecoder () {VTDecompressionSessionRef mDeocderSession;CMVideoFormatDescriptionRef mDecoderFormatDescription;uint8_t *pSPS;uint8_t *pPPS;uint8_t *pSEI;NSInteger mSpsSize;NSInteger mPpsSize;NSInteger mSeiSize;NSInteger mINalCount; //I幀起始碼個數(shù)NSInteger mPBNalCount; //P、B幀起始碼個數(shù)NSInteger mINalIndex; //I幀起始碼開始位BOOL mIsNeedReinit; //需要重置解碼器 }@endstatic void didDecompress(void *decompressionOutputRefCon, void *sourceFrameRefCon, OSStatus status, VTDecodeInfoFlags infoFlags, CVImageBufferRef pixelBuffer, CMTime presentationTimeStamp, CMTime presentationDuration ) {CVPixelBufferRef *outputPixelBuffer = (CVPixelBufferRef *)sourceFrameRefCon;*outputPixelBuffer = CVPixelBufferRetain(pixelBuffer); }@implementation H264HwDecoder- (instancetype)init {if (self = [super init]) {pSPS = pPPS = pSEI = NULL;mSpsSize = mPpsSize = mSeiSize = 0;mINalCount = mPBNalCount = mINalIndex = 0;mIsNeedReinit = NO;_showType = H264HWDataType_Image;_isNeedPerfectImg = NO;_pixelBuffer = NULL;}return self; }- (void)dealloc {[self releaseH264HwDecoder]; }- (BOOL)initH264HwDecoder {if (mDeocderSession) {return YES;}const uint8_t *const parameterSetPointers[2] = {pSPS,pPPS};const size_t parameterSetSizes[2] = {mSpsSize, mPpsSize};OSStatus status = CMVideoFormatDescriptionCreateFromH264ParameterSets(kCFAllocatorDefault, 2, parameterSetPointers, parameterSetSizes, 4, &mDecoderFormatDescription);if (status == noErr) {// kCVPixelFormatType_420YpCbCr8Planar is YUV420// kCVPixelFormatType_420YpCbCr8BiPlanarFullRange is NV12// kCVPixelFormatType_24RGB //使用24位bitsPerPixel// kCVPixelFormatType_32BGRA //使用32位bitsPerPixel,kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirstuint32_t 
pixelFormatType = kCVPixelFormatType_420YpCbCr8BiPlanarFullRange; //NV12if (self.showType == H264HWDataType_Pixel) {pixelFormatType = kCVPixelFormatType_420YpCbCr8Planar;}const void *keys[] = { kCVPixelBufferPixelFormatTypeKey };const void *values[] = { CFNumberCreate(NULL, kCFNumberSInt32Type, &pixelFormatType) };CFDictionaryRef attrs = CFDictionaryCreate(NULL, keys, values, 1, NULL, NULL);VTDecompressionOutputCallbackRecord callBackRecord;callBackRecord.decompressionOutputCallback = didDecompress;callBackRecord.decompressionOutputRefCon = NULL;status = VTDecompressionSessionCreate(kCFAllocatorDefault,mDecoderFormatDescription,NULL, attrs,&callBackRecord,&mDeocderSession);CFRelease(attrs);CLog(@"Init H264 hardware decoder success");} else {CLog([NSString stringWithFormat:@"Init H264 hardware decoder fail: %d", (int)status]);return NO;}return YES; }- (void)removeH264HwDecoder {if(mDeocderSession) {VTDecompressionSessionInvalidate(mDeocderSession);CFRelease(mDeocderSession);mDeocderSession = NULL;}if(mDecoderFormatDescription) {CFRelease(mDecoderFormatDescription);mDecoderFormatDescription = NULL;} }- (void)releaseH264HwDecoder {[self removeH264HwDecoder];[self releaseSliceInfo];if (_pixelBuffer) {CVPixelBufferRelease(_pixelBuffer);_pixelBuffer = NULL;} }- (void)releaseSliceInfo {FreeCharP(pSPS);FreeCharP(pPPS);FreeCharP(pSEI);mSpsSize = 0;mPpsSize = 0;mSeiSize = 0; }//將視頻數(shù)據(jù)封裝成CMSampleBufferRef進行解碼 - (CVPixelBufferRef)decode:(uint8_t *)videoBuffer videoSize:(NSInteger)videoBufferSize {CVPixelBufferRef outputPixelBuffer = NULL;CMBlockBufferRef blockBuffer = NULL;OSStatus status = CMBlockBufferCreateWithMemoryBlock(kCFAllocatorDefault, (void *)videoBuffer, videoBufferSize, kCFAllocatorNull, NULL, 0, videoBufferSize, 0, &blockBuffer);if (status == kCMBlockBufferNoErr) {CMSampleBufferRef sampleBuffer = NULL;const size_t sampleSizeArray[] = { videoBufferSize };status = CMSampleBufferCreateReady(kCFAllocatorDefault, blockBuffer, mDecoderFormatDescription , 1, 0, 
NULL, 1, sampleSizeArray, &sampleBuffer);if (status == kCMBlockBufferNoErr && sampleBuffer) {if (self.showType == H264HWDataType_Layer && _displayLayer) {CFArrayRef attachments = CMSampleBufferGetSampleAttachmentsArray(sampleBuffer, YES);CFMutableDictionaryRef dict = (CFMutableDictionaryRef)CFArrayGetValueAtIndex(attachments, 0);CFDictionarySetValue(dict, kCMSampleAttachmentKey_DisplayImmediately, kCFBooleanTrue);if ([self.displayLayer isReadyForMoreMediaData]) {@weakify(self);dispatch_sync(dispatch_get_main_queue(),^{@strongify(self);[self.displayLayer enqueueSampleBuffer:sampleBuffer];});}CFRelease(sampleBuffer);} else {VTDecodeFrameFlags flags = 0;VTDecodeInfoFlags flagOut = 0;OSStatus decodeStatus = VTDecompressionSessionDecodeFrame(mDeocderSession, sampleBuffer, flags, &outputPixelBuffer, &flagOut);CFRelease(sampleBuffer);if (decodeStatus == kVTVideoDecoderMalfunctionErr) {CLog(@"Decode failed status: kVTVideoDecoderMalfunctionErr");CVPixelBufferRelease(outputPixelBuffer);outputPixelBuffer = NULL;} else if(decodeStatus == kVTInvalidSessionErr) {CLog(@"Invalid session, reset decoder session");[self removeH264HwDecoder];} else if(decodeStatus == kVTVideoDecoderBadDataErr) {CLog([NSString stringWithFormat:@"Decode failed status=%d(Bad data)", (int)decodeStatus]);} else if(decodeStatus != noErr) {CLog([NSString stringWithFormat:@"Decode failed status=%d", (int)decodeStatus]);}}}CFRelease(blockBuffer);}return outputPixelBuffer; }- (CGSize)decodeH264VideoData:(uint8_t *)videoData videoSize:(NSInteger)videoSize {CGSize imageSize = CGSizeMake(0, 0);if (videoData && videoSize > 0) {HWVideoFrameType frameFlag = [self analyticalData:videoData size:videoSize];if (mIsNeedReinit) {mIsNeedReinit = NO;[self removeH264HwDecoder];}if (pSPS && pPPS && (frameFlag == HWVideoFrameType_I || frameFlag == HWVideoFrameType_P || frameFlag == HWVideoFrameType_B)) {uint8_t *buffer = NULL;if (frameFlag == HWVideoFrameType_I) {int nalExtra = (mINalCount==3?1:0); 
//如果是3位的起始碼,轉(zhuǎn)為大端時需要增加1位videoSize -= mINalIndex;buffer = (uint8_t *)malloc(videoSize + nalExtra);memcpy(buffer + nalExtra, videoData + mINalIndex, videoSize);videoSize += nalExtra;} else {int nalExtra = (mPBNalCount==3?1:0);buffer = (uint8_t *)malloc(videoSize + nalExtra);memcpy(buffer + nalExtra, videoData, videoSize);videoSize += nalExtra;}uint32_t nalSize = (uint32_t)(videoSize - 4);uint32_t *pNalSize = (uint32_t *)buffer;*pNalSize = CFSwapInt32HostToBig(nalSize);CVPixelBufferRef pixelBuffer = NULL;if ([self initH264HwDecoder]) {pixelBuffer = [self decode:buffer videoSize:videoSize];if(pixelBuffer) {NSInteger width = CVPixelBufferGetWidth(pixelBuffer);NSInteger height = CVPixelBufferGetHeight(pixelBuffer);imageSize = CGSizeMake(width, height);if (self.showType == H264HWDataType_Pixel) {if (_pixelBuffer) {CVPixelBufferRelease(_pixelBuffer);}self.pixelBuffer = CVPixelBufferRetain(pixelBuffer);} else {if (frameFlag == HWVideoFrameType_B) { //若B幀未進行亂序解碼,順序播放,則在此需要去除,否則解碼圖形則是灰色。size_t planeCount = CVPixelBufferGetPlaneCount(pixelBuffer);if (planeCount >= 2 && planeCount <= 3) {CVPixelBufferLockBaseAddress(pixelBuffer, 0);u_char *yDestPlane = (u_char *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);if (planeCount == 2) {u_char *uvDestPlane = (u_char *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1);if (yDestPlane[0] == 0x80 && uvDestPlane[0] == 0x80 && uvDestPlane[1] == 0x80) {frameFlag = HWVideoFrameType_UNKNOWN;NSLog(@"Video YUV data parse error: Y=%02x U=%02x V=%02x", yDestPlane[0], uvDestPlane[0], uvDestPlane[1]);}} else if (planeCount == 3) {u_char *uDestPlane = (u_char *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1);u_char *vDestPlane = (u_char *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 2);if (yDestPlane[0] == 0x80 && uDestPlane[0] == 0x80 && vDestPlane[0] == 0x80) {frameFlag = HWVideoFrameType_UNKNOWN;NSLog(@"Video YUV data parse error: Y=%02x U=%02x V=%02x", yDestPlane[0], uDestPlane[0], 
vDestPlane[0]);}}CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);}}if (frameFlag != HWVideoFrameType_UNKNOWN) {self.image = [self pixelBufferToImage:pixelBuffer];}}CVPixelBufferRelease(pixelBuffer);}}FreeCharP(buffer);}}return imageSize; }- (UIImage *)pixelBufferToImage:(CVPixelBufferRef)pixelBuffer {UIImage *image = nil;if (!self.isNeedPerfectImg) {//第1種繪制(可直接顯示,不可保存為文件(無效缺少圖像描述參數(shù)))CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];image = [UIImage imageWithCIImage:ciImage];} else {//第2種繪制(可直接顯示,可直接保存為文件,相對第一種性能消耗略大)CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];CIContext *temporaryContext = [CIContext contextWithOptions:nil];CGImageRef videoImage = [temporaryContext createCGImage:ciImage fromRect:CGRectMake(0, 0, CVPixelBufferGetWidth(pixelBuffer), CVPixelBufferGetHeight(pixelBuffer))];image = [[UIImage alloc] initWithCGImage:videoImage];CGImageRelease(videoImage);}return image; }- (UIImage *)snapshot {UIImage *img = nil;if (self.displayLayer) {UIGraphicsBeginImageContext(self.displayLayer.bounds.size);[self.displayLayer renderInContext:UIGraphicsGetCurrentContext()];img = UIGraphicsGetImageFromCurrentImageContext();UIGraphicsEndImageContext();} else {if (self.showType == H264HWDataType_Pixel) {if (self.pixelBuffer) {img = [self pixelBufferToImage:self.pixelBuffer];}} else {img = self.image;}if (!self.isNeedPerfectImg) {UIGraphicsBeginImageContext(CGSizeMake(img.size.width, img.size.height));[img drawInRect:CGRectMake(0, 0, img.size.width, img.size.height)];img = UIGraphicsGetImageFromCurrentImageContext();UIGraphicsEndImageContext();}}return img; }//從起始位開始查詢SPS、PPS、SEI、I、B、P幀起始碼,遇到I、P、B幀則退出 //存在多種情況: //1、起始碼是0x0 0x0 0x0 0x01 或 0x0 0x0 0x1 //2、每個SPS、PPS、SEI、I、B、P幀為單獨的Slice //3、I幀中包含SPS、PPS、I數(shù)據(jù)Slice //4、I幀中包含第3點的數(shù)據(jù)之外還包含SEI,順序:SPS、PPS、SEI、I //5、起始位是AVCC協(xié)議格式的大端數(shù)據(jù)(不支持多Slice的視頻幀) - (HWVideoFrameType)analyticalData:(const uint8_t *)buffer size:(NSInteger)size {NSInteger preIndex = 0;HWVideoFrameType 
preFrameType = HWVideoFrameType_UNKNOWN;HWVideoFrameType curFrameType = HWVideoFrameType_UNKNOWN;for (int i=0; i<size && i<300; i++) { //一般第四種情況下的幀起始信息不會超過(32+256+12)位,可適當(dāng)增大,為了不循環(huán)整個幀片數(shù)據(jù)int nalSize = [self getNALHeaderLen:(buffer + i) size:size-i];if (nalSize == 0 && i == 0) { //當(dāng)每個Slice起始位開始若使用AVCC協(xié)議則判斷幀大小是否一致uint32_t *pNalSize = (uint32_t *)(buffer);uint32_t videoSize = CFSwapInt32BigToHost(*pNalSize); //大端模式轉(zhuǎn)為系統(tǒng)端模式if (videoSize == size - 4) { //是大端模式(AVCC)nalSize = 4;}}if (nalSize && i + nalSize + 1 < size) {int sliceType = buffer[i + nalSize] & 0x1F;if (sliceType == 0x1) {mPBNalCount = nalSize;if (buffer[i + nalSize] == 0x1) { //B幀curFrameType = HWVideoFrameType_B;} else { //P幀curFrameType = HWVideoFrameType_P;}break;} else if (sliceType == 0x5) { //IDR(I幀)if (preFrameType == HWVideoFrameType_PPS) {mIsNeedReinit = [self getSliceInfo:buffer slice:&pPPS size:&mPpsSize start:preIndex end:i];} else if (preFrameType == HWVideoFrameType_SEI) {[self getSliceInfo:buffer slice:&pSEI size:&mSeiSize start:preIndex end:i];}mINalCount = nalSize;mINalIndex = i;curFrameType = HWVideoFrameType_I;goto Goto_Exit;} else if (sliceType == 0x7) { //SPSpreFrameType = HWVideoFrameType_SPS;preIndex = i + nalSize;i += nalSize;} else if (sliceType == 0x8) { //PPSif (preFrameType == HWVideoFrameType_SPS) {mIsNeedReinit = [self getSliceInfo:buffer slice:&pSPS size:&mSpsSize start:preIndex end:i];}preFrameType = HWVideoFrameType_PPS;preIndex = i + nalSize;i += nalSize;} else if (sliceType == 0x6) { //SEIif (preFrameType == HWVideoFrameType_PPS) {mIsNeedReinit = [self getSliceInfo:buffer slice:&pPPS size:&mPpsSize start:preIndex end:i];}preFrameType = HWVideoFrameType_SEI;preIndex = i + nalSize;i += nalSize;}}}//SPS、PPS、SEI為單獨的Slice幀片if (curFrameType == HWVideoFrameType_UNKNOWN && preIndex != 0) {if (preFrameType == HWVideoFrameType_SPS) {mIsNeedReinit = [self getSliceInfo:buffer slice:&pSPS size:&mSpsSize start:preIndex end:size];curFrameType = 
HWVideoFrameType_SPS;} else if (preFrameType == HWVideoFrameType_PPS) {mIsNeedReinit = [self getSliceInfo:buffer slice:&pPPS size:&mPpsSize start:preIndex end:size];curFrameType = HWVideoFrameType_PPS;} else if (preFrameType == HWVideoFrameType_SEI) {[self getSliceInfo:buffer slice:&pSEI size:&mSeiSize start:preIndex end:size];curFrameType = HWVideoFrameType_SEI;}}Goto_Exit:return curFrameType; }//獲取NAL的起始碼長度是3還4 - (int)getNALHeaderLen:(const uint8_t *)buffer size:(NSInteger)size {if (size >= 4 && buffer[0] == 0x0 && buffer[1] == 0x0 && buffer[2] == 0x0 && buffer[3] == 0x1) {return 4;} else if (size >= 3 && buffer[0] == 0x0 && buffer[1] == 0x0 && buffer[2] == 0x1) {return 3;}return 0; }//給SPS、PPS、SEI的Buf賦值,返回YES表示不同于之前的值 - (BOOL)getSliceInfo:(const uint8_t *)videoBuf slice:(uint8_t **)sliceBuf size:(NSInteger *)size start:(NSInteger)start end:(NSInteger)end {BOOL isDif = NO;NSInteger len = end - start;uint8_t *tempBuf = (uint8_t *)(*sliceBuf);if (tempBuf) {if (len != *size || memcmp(tempBuf, videoBuf + start, len) != 0) {free(tempBuf);tempBuf = (uint8_t *)malloc(len);memcpy(tempBuf, videoBuf + start, len);*sliceBuf = tempBuf;*size = len;isDif = YES;}} else {tempBuf = (uint8_t *)malloc(len);memcpy(tempBuf, videoBuf + start, len);*sliceBuf = tempBuf;*size = len;}return isDif; }@end總結(jié)