
Video capture, parameter settings, and camera operations in iOS

Published: 2023/12/15

Overview

In live-streaming apps, video capture is generally done with the AVFoundation framework, because it lets us customize the capture parameters; it also supports camera operations such as switching between the front and back cameras, taking photos, and turning on the torch; and, most importantly, it gives us access to the raw video data for encoding and other processing. This article covers the following:

  • The key classes involved in video capture
  • The steps of video capture
  • How to change capture parameters, e.g. resolution, frame rate, zooming the preview layer, and exposure
  • Camera operations in detail, e.g. taking photos, switching between the front and back cameras, and toggling the torch

Code:

  • github
  • Feel free to fork & star

Key classes for video capture

AVCaptureDevice

Represents a hardware device; through this class we can access the phone's cameras, microphone, and other capture hardware. Whenever we change a device property (e.g. the flash mode or the focus setting), we must call lockForConfiguration to lock the device before making the change, and call unlockForConfiguration to unlock it afterwards.
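The lock/unlock pattern looks like this (a minimal sketch; the torch property is used here just as an example of a device property):

```objc
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
// Lock before mutating any device property...
if ([device lockForConfiguration:&error]) {
    if ([device isTorchModeSupported:AVCaptureTorchModeOn]) {
        device.torchMode = AVCaptureTorchModeOn;
    }
    // ...and always unlock when done
    [device unlockForConfiguration];
} else {
    NSLog(@"could not lock device: %@", error);
}
```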

AVCaptureDeviceInput

The input-device manager. You create an AVCaptureDeviceInput from an AVCaptureDevice and add it to an AVCaptureSession, which then manages it. It represents an input device and configures the hardware device's ports; typical input devices are the microphone and the cameras.

AVCaptureOutput

Represents output data; the output can be a still image (AVCaptureStillImageOutput) or video (AVCaptureMovieFileOutput).

AVCaptureSession

The media capture session, which routes captured audio and video data to its outputs. A single AVCaptureSession can have multiple inputs and outputs. It is the bridge between AVCaptureInput and AVCaptureOutput, coordinating the flow of data from input to output. The startRunning and stopRunning methods start and stop the session.

Each session instance represents one capture session. If you need to change the session's configuration while the app is running (e.g. switching cameras), you must first begin the configuration, make the changes, and then commit the configuration.
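The begin/commit pattern described above can be sketched as follows (a minimal outline; `session` is assumed to be an already-configured AVCaptureSession):

```objc
// Batch configuration changes so the session applies them atomically
[session beginConfiguration];
// ... e.g. swap inputs or change the preset ...
if ([session canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
    session.sessionPreset = AVCaptureSessionPreset1280x720;
}
[session commitConfiguration];  // changes take effect here
```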

AVCaptureConnection

AVCaptureConnection represents a connection between an AVCaptureInputPort or ports, and an AVCaptureOutput or AVCaptureVideoPreviewLayer present in an AVCaptureSession. In other words, it is a connection either between input ports and an output, or between the session and its video preview layer.

AVCaptureVideoPreviewLayer

The preview layer. How do the captured photos and video frames get displayed on screen? By adding this object as a sublayer of a UIView's layer.

Steps of video capture

Below is the capture code, with a frame rate of 30 FPS and a resolution of 1920x1080.

```objc
#import "MiVideoCollectVC.h"
#import <AVFoundation/AVFoundation.h>

@interface MiVideoCollectVC () <AVCaptureVideoDataOutputSampleBufferDelegate>
@property (nonatomic, strong) AVCaptureVideoDataOutput *video_output;
@property (nonatomic, strong) AVCaptureSession *m_session;
@property (weak, nonatomic) IBOutlet UIView *m_displayView;
@end

@implementation MiVideoCollectVC

- (void)viewDidLoad {
    [super viewDidLoad];
    [self startCaptureSession];
}

- (void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];
    [self startPreview];
}

- (IBAction)onpressedBtnDismiss:(id)sender {
    [self dismissViewControllerAnimated:YES completion:^{
        [self stopPreview];
    }];
}

- (void)startCaptureSession {
    NSError *error = nil;
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    if ([session canSetSessionPreset:AVCaptureSessionPreset1920x1080]) {
        session.sessionPreset = AVCaptureSessionPreset1920x1080;
    } else {
        session.sessionPreset = AVCaptureSessionPresetHigh;
    }

    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (error || !input) {
        NSLog(@"get input device error...");
        return;
    }
    [session addInput:input];

    _video_output = [[AVCaptureVideoDataOutput alloc] init];
    [session addOutput:_video_output];

    // Specify the pixel format
    _video_output.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarFullRange]
                                                              forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    _video_output.alwaysDiscardsLateVideoFrames = NO;
    dispatch_queue_t video_queue = dispatch_queue_create("MIVideoQueue", NULL);
    [_video_output setSampleBufferDelegate:self queue:video_queue];

    // Lock the frame rate at 30 FPS if the active format supports it
    CMTime frameDuration = CMTimeMake(1, 30);
    BOOL frameRateSupported = NO;
    for (AVFrameRateRange *range in [device.activeFormat videoSupportedFrameRateRanges]) {
        if (CMTIME_COMPARE_INLINE(frameDuration, >=, range.minFrameDuration) &&
            CMTIME_COMPARE_INLINE(frameDuration, <=, range.maxFrameDuration)) {
            frameRateSupported = YES;
        }
    }
    if (frameRateSupported && [device lockForConfiguration:&error]) {
        [device setActiveVideoMaxFrameDuration:frameDuration];
        [device setActiveVideoMinFrameDuration:frameDuration];
        [device unlockForConfiguration];
    }

    [self adjustVideoStabilization];
    _m_session = session;

    // Attach the preview layer to the display view
    CALayer *previewViewLayer = [self.m_displayView layer];
    previewViewLayer.backgroundColor = [[UIColor blackColor] CGColor];
    AVCaptureVideoPreviewLayer *newPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_m_session];
    [newPreviewLayer setFrame:[UIApplication sharedApplication].keyWindow.bounds];
    [newPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    [previewViewLayer insertSublayer:newPreviewLayer atIndex:0];
}

- (void)adjustVideoStabilization {
    NSArray *devices = [AVCaptureDevice devices];
    for (AVCaptureDevice *device in devices) {
        if ([device hasMediaType:AVMediaTypeVideo]) {
            if ([device.activeFormat isVideoStabilizationModeSupported:AVCaptureVideoStabilizationModeAuto]) {
                for (AVCaptureConnection *connection in _video_output.connections) {
                    for (AVCaptureInputPort *port in [connection inputPorts]) {
                        if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                            if (connection.supportsVideoStabilization) {
                                connection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeStandard;
                                NSLog(@"now videoStabilizationMode = %ld", (long)connection.activeVideoStabilizationMode);
                            } else {
                                NSLog(@"connection does not support video stabilization");
                            }
                        }
                    }
                }
            } else {
                NSLog(@"device does not support video stabilization");
            }
        }
    }
}

- (void)startPreview {
    if (![_m_session isRunning]) {
        [_m_session startRunning];
    }
}

- (void)stopPreview {
    if ([_m_session isRunning]) {
        [_m_session stopRunning];
    }
}

#pragma mark - AVCaptureVideoDataOutputSampleBufferDelegate

- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    NSLog(@"%s", __func__);
}

// Called when a frame is dropped
- (void)captureOutput:(AVCaptureOutput *)output didDropSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    NSLog(@"MediaIOS: dropped frame...");
}

@end
```

The steps of video capture are summarized as follows:

  • Create an AVCaptureSession object, create the input and output devices, and add them to the session.
  • Set the video resolution on the session.
  • Set the capture frame rate.
  • Create the video preview layer and insert it into the view's layer.

Changing capture parameters: resolution and frame rate

Before covering how to change the resolution and frame rate, let's look at how to monitor these parameters, because only by monitoring them can we verify that our settings actually took effect.

Monitoring the resolution:

We can read it directly from the AVCaptureSession's sessionPreset, which is a string; just print it after setting it.

Monitoring the frame rate:

The frame rate is the number of video frames captured per second. We can start a timer that fires once per second and prints the current capture frame rate. Here is the code that counts the frames captured within one second:

```objc
// Count how many video frames are captured per second
static int captureVideoFPS;

+ (void)calculatorCaptureFPS {
    static int count = 0;
    static float lastTime = 0;
    CMClockRef hostClockRef = CMClockGetHostTimeClock();
    CMTime hostTime = CMClockGetTime(hostClockRef);
    float nowTime = CMTimeGetSeconds(hostTime);
    if (nowTime - lastTime >= 1) {
        captureVideoFPS = count;
        lastTime = nowTime;
        count = 0;
    } else {
        count++;
    }
}

// Read the most recently measured frame rate
+ (int)getCaptureVideoFPS {
    return captureVideoFPS;
}
```
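The counter above is meant to be driven from the capture delegate and polled by a one-second timer; a minimal sketch (the class name MiVideoTool is an assumption, since the original does not name the class holding these methods):

```objc
// Call [MiVideoTool calculatorCaptureFPS] once per frame inside
// captureOutput:didOutputSampleBuffer:fromConnection:, then poll the result:
NSTimer *fpsTimer = [NSTimer scheduledTimerWithTimeInterval:1.0
                                                    repeats:YES
                                                      block:^(NSTimer *t) {
    NSLog(@"current capture FPS: %d", [MiVideoTool getCaptureVideoFPS]);
}];
```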

Changing the resolution

```objc
/**
 * Reset the capture resolution
 *
 * @param m_session  AVCaptureSession instance
 * @param resolution target vertical resolution (1080 / 720 / 480 / 360)
 */
+ (void)resetSessionPreset:(AVCaptureSession *)m_session resolution:(int)resolution {
    [m_session beginConfiguration];
    switch (resolution) {
        case 1080:
            m_session.sessionPreset = [m_session canSetSessionPreset:AVCaptureSessionPreset1920x1080] ? AVCaptureSessionPreset1920x1080 : AVCaptureSessionPresetHigh;
            break;
        case 720:
            m_session.sessionPreset = [m_session canSetSessionPreset:AVCaptureSessionPreset1280x720] ? AVCaptureSessionPreset1280x720 : AVCaptureSessionPresetMedium;
            break;
        case 480:
            m_session.sessionPreset = [m_session canSetSessionPreset:AVCaptureSessionPreset640x480] ? AVCaptureSessionPreset640x480 : AVCaptureSessionPresetMedium;
            break;
        case 360:
            m_session.sessionPreset = AVCaptureSessionPresetMedium;
            break;
        default:
            break;
    }
    [m_session commitConfiguration];
}
```

Changing the frame rate

```objc
+ (void)settingFrameRate:(int)frameRate {
    AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    [captureDevice lockForConfiguration:NULL];
    @try {
        [captureDevice setActiveVideoMinFrameDuration:CMTimeMake(1, frameRate)];
        [captureDevice setActiveVideoMaxFrameDuration:CMTimeMake(1, frameRate)];
    } @catch (NSException *exception) {
        NSLog(@"MediaIOS, the device does not support this frame rate, error: %@", exception.description);
    } @finally {
    }
    [captureDevice unlockForConfiguration];
}
```

Adding a pinch gesture to the preview layer

With a two-finger pinch gesture, the previewed video can be zoomed in and out.

```objc
#define MiMaxZoomFactor 5.0f
#define MiPrinchVelocityDividerFactor 20.0f

+ (void)zoomCapture:(UIPinchGestureRecognizer *)recognizer {
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([recognizer state] == UIGestureRecognizerStateChanged) {
        NSError *error = nil;
        if ([videoDevice lockForConfiguration:&error]) {
            // Scale the zoom step by the pinch velocity, clamped to the supported range
            CGFloat desiredZoomFactor = videoDevice.videoZoomFactor + atan2f(recognizer.velocity, MiPrinchVelocityDividerFactor);
            videoDevice.videoZoomFactor = desiredZoomFactor <= MiMaxZoomFactor ? MAX(1.0, MIN(desiredZoomFactor, videoDevice.activeFormat.videoMaxZoomFactor)) : MiMaxZoomFactor;
            [videoDevice unlockForConfiguration];
        } else {
            NSLog(@"error: %@", error);
        }
    }
}
```

Camera operations

While capturing video, you may also need to switch between the front and back cameras, toggle the torch, take photos, and so on.

Switching between the front and back cameras

After switching cameras I set the resolution to a fixed 720p, because on some devices the front camera may not support 1080p. In a real project this should instead be whatever preset you configured earlier, with a fallback strategy for front cameras that don't support it.

```objc
// Switch between the front and back cameras
- (void)switchCamera {
    [_m_session beginConfiguration];
    if ([[_video_input device] position] == AVCaptureDevicePositionBack) {
        NSArray *devices = [AVCaptureDevice devices];
        for (AVCaptureDevice *device in devices) {
            if ([device hasMediaType:AVMediaTypeVideo] && [device position] == AVCaptureDevicePositionFront) {
                [self rePreviewWithCameraType:MiCameraType_Front device:device];
                break;
            }
        }
    } else {
        NSArray *devices = [AVCaptureDevice devices];
        for (AVCaptureDevice *device in devices) {
            if ([device hasMediaType:AVMediaTypeVideo] && [device position] == AVCaptureDevicePositionBack) {
                [self rePreviewWithCameraType:MiCameraType_Back device:device];
                break;
            }
        }
    }
    [_m_session commitConfiguration];
}

- (void)rePreviewWithCameraType:(MiCameraType)cameraType device:(AVCaptureDevice *)device {
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) return;

    [_m_session removeInput:_video_input];
    // Drop to a low preset first so the new input can always be added
    _m_session.sessionPreset = AVCaptureSessionPresetLow;
    if ([_m_session canAddInput:input]) {
        [_m_session addInput:input];
    } else {
        return;
    }
    _video_input = input;
    _m_cameraType = cameraType;

    // Raise to 720p if both the device and the session support it
    NSString *preset = AVCaptureSessionPreset1280x720;
    if ([device supportsAVCaptureSessionPreset:preset] && [_m_session canSetSessionPreset:preset]) {
        _m_session.sessionPreset = preset;
    }
}
```

Toggling the torch

```objc
// Turn the torch on or off
- (void)switchTorch {
    [_m_session beginConfiguration];
    [[_video_input device] lockForConfiguration:NULL];
    self.m_torchMode = [_video_input device].torchMode == AVCaptureTorchModeOn ? AVCaptureTorchModeOff : AVCaptureTorchModeOn;
    if ([[_video_input device] isTorchModeSupported:_m_torchMode]) {
        [_video_input device].torchMode = self.m_torchMode;
    }
    [[_video_input device] unlockForConfiguration];
    [_m_session commitConfiguration];
}
```

Taking a photo and saving it to the photo album

The approach is as follows:

  • Keep a flag that is checked in the capture delegate; when the photo action is triggered, set the flag.
  • In the capture delegate, check whether the flag indicates a photo was requested; if so, convert the current CMSampleBufferRef frame to a UIImage and save the UIImage to the album.

Note: the code below saves a correctly colored photo to the album only when the capture pixel format is set to RGB (e.g. kCVPixelFormatType_32BGRA), since the bitmap context assumes 32-bit BGRA data.

```objc
- (UIImage *)convertSameBufferToUIImage:(CMSampleBufferRef)sampleBuffer {
    // Get the Core Video image buffer backing the sample buffer
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    // Base address, row stride, and dimensions of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    // Create a bitmap graphics context from the pixel data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    // Create a Quartz image from the pixel data in the context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);
    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    // Release the context and color space
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    // Create a UIImage from the Quartz image, then release it
    UIImage *image = [UIImage imageWithCGImage:quartzImage];
    CGImageRelease(quartzImage);
    return image;
}

+ (void)saveImageToSysphotos:(UIImage *)image {
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library writeImageToSavedPhotosAlbum:image.CGImage metadata:nil completionBlock:^(NSURL *assetURL, NSError *error) {
        if (error) {
            NSLog(@"MediaIos, save photo to photos error, error info: %@", error.description);
        } else {
            NSLog(@"MediaIos, save photo success...");
        }
    }];
}
```

(Note that ALAssetsLibrary has been deprecated since iOS 9; modern code should use PHPhotoLibrary from the Photos framework instead.)
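The flag-based trigger described in the steps above is not shown in the original; a minimal sketch of how the pieces fit together (the property name m_needCapturePhoto is an assumption):

```objc
// Assumed property on the view controller:
// @property (atomic, assign) BOOL m_needCapturePhoto;

// Called when the user taps the shutter button
- (void)onPressedTakePhoto {
    self.m_needCapturePhoto = YES;
}

// In the capture delegate: grab exactly one frame while the flag is set
- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    if (self.m_needCapturePhoto) {
        self.m_needCapturePhoto = NO;
        UIImage *photo = [self convertSameBufferToUIImage:sampleBuffer];
        [[self class] saveImageToSysphotos:photo];
    }
}
```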

Setting autofocus

```objc
// Tap to focus
- (void)mifocus:(UITapGestureRecognizer *)sender {
    CGPoint point = [sender locationInView:self.m_displayView];
    [self miAutoFocusWithPoint:point];
    NSLog(@"MediaIos, auto focus complete...");
}

- (void)miAutoFocusWithPoint:(CGPoint)touchPoint {
    AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([captureDevice isFocusPointOfInterestSupported] && [captureDevice isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
        NSError *error;
        if ([captureDevice lockForConfiguration:&error]) {
            // Set the exposure point
            [captureDevice setExposurePointOfInterest:touchPoint];
            [captureDevice setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
            // Set the focus point
            [captureDevice setFocusPointOfInterest:touchPoint];
            [captureDevice setFocusMode:AVCaptureFocusModeAutoFocus];
            [captureDevice unlockForConfiguration];
        }
    }
}
```

Note that focusPointOfInterest and exposurePointOfInterest expect normalized coordinates from (0,0) to (1,1) in the capture device's space, so in practice the tap location should first be converted with AVCaptureVideoPreviewLayer's captureDevicePointOfInterestForPoint: rather than passed in view coordinates as above.

Adjusting exposure

```objc
// Exposure adjustment driven by a slider
- (void)changeExposure:(id)sender {
    UISlider *slider = (UISlider *)sender;
    [self michangeExposure:slider.value];
}

- (void)michangeExposure:(CGFloat)value {
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error;
    if ([device lockForConfiguration:&error]) {
        [device setExposureTargetBias:value completionHandler:nil];
        [device unlockForConfiguration];
    }
}
```

Setting the white balance

```objc
- (AVCaptureWhiteBalanceGains)recalcGains:(AVCaptureWhiteBalanceGains)gains
                                 minValue:(CGFloat)minValue
                                 maxValue:(CGFloat)maxValue {
    // Clamp each gain into the device's supported range
    AVCaptureWhiteBalanceGains tmpGains = gains;
    tmpGains.blueGain = MAX(MIN(tmpGains.blueGain, maxValue), minValue);
    tmpGains.redGain = MAX(MIN(tmpGains.redGain, maxValue), minValue);
    tmpGains.greenGain = MAX(MIN(tmpGains.greenGain, maxValue), minValue);
    return tmpGains;
}

- (void)setWhiteBlanceUseTemperature:(CGFloat)temperature {
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([device isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeLocked]) {
        [device lockForConfiguration:nil];
        // Keep the current tint, change only the temperature
        AVCaptureWhiteBalanceGains currentGains = device.deviceWhiteBalanceGains;
        CGFloat currentTint = [device temperatureAndTintValuesForDeviceWhiteBalanceGains:currentGains].tint;
        AVCaptureWhiteBalanceTemperatureAndTintValues tempAndTintValues = {
            .temperature = temperature,
            .tint = currentTint,
        };
        AVCaptureWhiteBalanceGains gains = [device deviceWhiteBalanceGainsForTemperatureAndTintValues:tempAndTintValues];
        CGFloat maxWhiteBalanceGain = device.maxWhiteBalanceGain;
        gains = [self recalcGains:gains minValue:1 maxValue:maxWhiteBalanceGain];
        [device setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:gains completionHandler:nil];
        [device unlockForConfiguration];
    }
}

// White balance adjustment driven by a slider
- (void)whiteBlanceChange:(id)sender {
    UISlider *slider = (UISlider *)sender;
    [self setWhiteBlanceUseTemperature:slider.value];
}
```

Reposted from: https://juejin.im/post/5cdaee84e51d453a506b0f0f
