
Android: Qualcomm Platform Camera HFR Usecase Analysis


1. Overview of High Frame Rate Recording

  High frame rate (HFR) recording is what produces slow motion. Video is conventionally played back at about 24 fps, roughly the lowest rate that still looks smooth. If an action is captured at 120 fps and then played back at 24 fps, it appears slowed down by 5x (120 / 24).

  The Qualcomm platform offers two slow motion features:

High Speed Recording (HSR): capture at a high fps (the operating rate) and encode/save at that same high fps (the target rate); the operating rate equals the target rate.
High Frame Rate recording (HFR): capture at a high fps (the operating rate) but encode/save at 30 fps (the target rate); the operating rate is higher than the target rate (see the sketch after this list).
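  In MediaRecorder terms the HSR/HFR split above is just a difference in the target (encode) rate. Below is a minimal sketch of that idea; the class name, the 30 fps target and the use of setCaptureRate()/setVideoFrameRate() are assumptions for illustration, not taken from the SnapdragonCamera source:

    import android.media.MediaRecorder;
    import android.util.Log;

    class SlowMotionConfigSketch {
        // HFR: capture at captureFps, encode/save at 30 fps -> playback is slowed by captureFps / 30.
        // HSR: capture and encode at captureFps -> a normal-speed clip, slowed down later in an editor.
        static void configure(MediaRecorder recorder, int captureFps, boolean hsr) {
            recorder.setCaptureRate(captureFps);               // operating (capture) rate
            recorder.setVideoFrameRate(hsr ? captureFps : 30); // target (encode) rate
            int slowDown = hsr ? 1 : captureFps / 30;          // e.g. 120 / 30 = 4x slow motion
            Log.d("SlowMotionConfigSketch", "slow-down factor = " + slowDown);
        }
    }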

2. Code Flow Analysis (Qualcomm camera app source: packages/apps/SnapdragonCamera)

2.1 The app starts recording: packages/apps/SnapdragonCamera/src/com/android/camera/CaptureModule.java

    private boolean startRecordingVideo(final int cameraId) {
            ...

            if (ApiHelper.isAndroidPOrHigher()) {
                if (mHighSpeedCapture && ((int) mHighSpeedFPSRange.getUpper() > NORMAL_SESSION_MAX_FPS)) {
                    CaptureRequest initialRequest = mVideoRequestBuilder.build();
                    buildConstrainedCameraSession(mCameraDevice[cameraId], surfaces,
                            mSessionListener, mCameraHandler, initialRequest);

                } else {
                    configureCameraSessionWithParameters(cameraId, surfaces,
                            mSessionListener, mCameraHandler, mVideoRequestBuilder.build());
                }
            } else {

                //If HFR is enabled and the max frame rate is above NORMAL_SESSION_MAX_FPS (60 fps),
                //createConstrainedHighSpeedCaptureSession is used;
                //otherwise createCaptureSession is used.
                if (mHighSpeedCapture && ((int) mHighSpeedFPSRange.getUpper() > NORMAL_SESSION_MAX_FPS)) {
                    //Create the constrained high speed (HFR) capture session
                    mCameraDevice[cameraId].createConstrainedHighSpeedCaptureSession(surfaces,
                            new CameraConstrainedHighSpeedCaptureSession.StateCallback() {
                        @Override
                        public void onConfigured(CameraCaptureSession cameraCaptureSession) {
                            mCurrentSession = cameraCaptureSession;
                            Log.v(TAG, "createConstrainedHighSpeedCaptureSession onConfigured");
                            mCaptureSession[cameraId] = cameraCaptureSession;
                            CameraConstrainedHighSpeedCaptureSession session =
                                    (CameraConstrainedHighSpeedCaptureSession) mCurrentSession;
                            try {
                                setUpVideoCaptureRequestBuilder(mVideoRequestBuilder, cameraId);
                                List list = CameraUtil
                                        .createHighSpeedRequestList(mVideoRequestBuilder.build());
                                // setRepeatingBurst submits several requests at a time;
                                // the corresponding native method is submitRequestList
                                session.setRepeatingBurst(list, mCaptureCallback, mCameraHandler);
                            } catch (CameraAccessException e) {
                                Log.e(TAG, "Failed to start high speed video recording "
                                        + e.getMessage());
                                e.printStackTrace();
                            } catch (IllegalArgumentException e) {
                                Log.e(TAG, "Failed to start high speed video recording "
                                        + e.getMessage());
                                e.printStackTrace();
                            } catch (IllegalStateException e) {
                                Log.e(TAG, "Failed to start high speed video recording "
                                        + e.getMessage());
                                e.printStackTrace();
                            }
                            if (!mFrameProcessor.isFrameListnerEnabled() && !startMediaRecorder()) {
                                startRecordingFailed();
                                return;
                            }
                        }
                    }, null);
                } else {
                    surfaces.add(mVideoSnapshotImageReader.getSurface());
                    String zzHDR = mSettingsManager.getValue(SettingsManager.KEY_VIDEO_HDR_VALUE);
                    boolean zzHdrStatue = zzHDR.equals("1");
                    // if enable ZZHDR mode, don`t call the setOpModeForVideoStream method.
                    if (!zzHdrStatue) {
                        setOpModeForVideoStream(cameraId);
                    }
                    String value = mSettingsManager.getValue(SettingsManager.KEY_FOVC_VALUE);
                    if (value != null && Boolean.parseBoolean(value)) {
                        mStreamConfigOptMode = mStreamConfigOptMode | STREAM_CONFIG_MODE_FOVC;
                    }
                    if (zzHdrStatue) {
                        mStreamConfigOptMode = STREAM_CONFIG_MODE_ZZHDR;
                    }
                    if (DEBUG) {
                        Log.v(TAG, "createCustomCaptureSession mStreamConfigOptMode :"
                                + mStreamConfigOptMode);
                    }
                    if (mStreamConfigOptMode == 0) {
                        //Normal stream, but setOpModeForVideoStream above changes config->operation_mode.
                        mCameraDevice[cameraId].createCaptureSession(surfaces, mCCSSateCallback, null);
                    } else {
                        List<OutputConfiguration> outConfigurations = new ArrayList<>(surfaces.size());
                        for (Surface sface : surfaces) {
                            outConfigurations.add(new OutputConfiguration(sface));
                        }
                        mCameraDevice[cameraId].createCustomCaptureSession(null, outConfigurations,
                                mStreamConfigOptMode, mCCSSateCallback, null);
                    }
                }
            }
        }
        ...
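  Stripped of the SnapdragonCamera settings plumbing, the constrained high speed branch above reduces to the standard Camera2 sequence. A minimal sketch; the surface parameters and the fixed 120 fps range are placeholder assumptions:

    import android.hardware.camera2.*;
    import android.os.Handler;
    import android.util.Range;
    import android.view.Surface;
    import java.util.Arrays;
    import java.util.List;

    class HighSpeedRecordingSketch {
        void start(CameraDevice camera, Surface previewSurface, Surface recorderSurface,
                   Handler handler) throws CameraAccessException {
            List<Surface> outputs = Arrays.asList(previewSurface, recorderSurface);
            camera.createConstrainedHighSpeedCaptureSession(outputs,
                    new CameraCaptureSession.StateCallback() {
                @Override public void onConfigured(CameraCaptureSession session) {
                    try {
                        CameraConstrainedHighSpeedCaptureSession hsSession =
                                (CameraConstrainedHighSpeedCaptureSession) session;
                        CaptureRequest.Builder builder =
                                camera.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
                        builder.addTarget(previewSurface);
                        builder.addTarget(recorderSurface);
                        builder.set(CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE, new Range<>(120, 120));
                        // One request is expanded into a batched burst of requests internally
                        List<CaptureRequest> burst = hsSession.createHighSpeedRequestList(builder.build());
                        hsSession.setRepeatingBurst(burst, /*callback*/ null, handler);
                    } catch (CameraAccessException e) {
                        e.printStackTrace();
                    }
                }
                @Override public void onConfigureFailed(CameraCaptureSession session) { }
            }, handler);
        }
    }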

2.2 HFR stream configuration: frameworks/base/core/java/android/hardware/camera2/CameraDevice.java

 //Interface for creating a constrained high speed capture session
public abstract void createConstrainedHighSpeedCaptureSession(@NonNull List<Surface> outputs, @NonNull CameraCaptureSession.StateCallback callback, @Nullable Handler handler) throws CameraAccessException;

  The implementation is in frameworks/base/core/java/android/hardware/camera2/impl/CameraDeviceImpl.java:

    @Override
    public void createConstrainedHighSpeedCaptureSession(List<Surface> outputs,
            android.hardware.camera2.CameraCaptureSession.StateCallback callback, Handler handler)
            throws CameraAccessException {
        if (outputs == null || outputs.size() == 0 || outputs.size() > 2) {
            throw new IllegalArgumentException(
                    "Output surface list must not be null and the size must be no more than 2");
        }
        List<OutputConfiguration> outConfigurations = new ArrayList<>(outputs.size());
        for (Surface surface : outputs) {
            outConfigurations.add(new OutputConfiguration(surface));
        }
        createCaptureSessionInternal(null, outConfigurations, callback,
                checkAndWrapHandler(handler),
                /*operatingMode*/ICameraDeviceUser.CONSTRAINED_HIGH_SPEED_MODE,
                /*sessionParams*/ null);
    }

  The createCaptureSessionInternal() function is implemented as follows:

    private void createCaptureSessionInternal(InputConfiguration inputConfig,
            List<OutputConfiguration> outputConfigurations,
            CameraCaptureSession.StateCallback callback, Executor executor,
            int operatingMode, CaptureRequest sessionParams) throws CameraAccessException {
        synchronized(mInterfaceLock) {
            if (DEBUG) {
                Log.d(TAG, "createCaptureSessionInternal");
            }

            checkIfCameraClosedOrInError();

            boolean isConstrainedHighSpeed =
                    (operatingMode == ICameraDeviceUser.CONSTRAINED_HIGH_SPEED_MODE);
            if (isConstrainedHighSpeed && inputConfig != null) {
                throw new IllegalArgumentException("Constrained high speed session doesn't support"
                        + " input configuration yet.");
            }

            // Notify current session that it's going away, before starting camera operations
            // After this call completes, the session is not allowed to call into CameraDeviceImpl
            if (mCurrentSession != null) {
                mCurrentSession.replaceSessionClose();
            }

            // TODO: dont block for this
            boolean configureSuccess = true;
            CameraAccessException pendingException = null;
            Surface input = null;
            try {
                // configure streams and then block until IDLE
                // This call also queries the device characteristics
                configureSuccess = configureStreamsChecked(inputConfig, outputConfigurations,
                        operatingMode, sessionParams);
                if (configureSuccess == true && inputConfig != null) {
                    input = mRemoteDevice.getInputSurface();
                }
            } catch (CameraAccessException e) {
                configureSuccess = false;
                pendingException = e;
                input = null;
                if (DEBUG) {
                    Log.v(TAG, "createCaptureSession - failed with exception ", e);
                }
            }

            // Fire onConfigured if configureOutputs succeeded, fire onConfigureFailed otherwise.
            CameraCaptureSessionCore newSession = null;
            if (isConstrainedHighSpeed) {
                ArrayList<Surface> surfaces = new ArrayList<>(outputConfigurations.size());
                for (OutputConfiguration outConfig : outputConfigurations) {
                    surfaces.add(outConfig.getSurface());
                }
                StreamConfigurationMap config =
                    getCharacteristics().get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);

                // Check that the surface formats are valid, the fps range is supported,
                // and the surfaces are preview/video-encoder streams
                SurfaceUtils.checkConstrainedHighSpeedSurfaces(surfaces, /*fpsRange*/null, config);

                newSession = new CameraConstrainedHighSpeedCaptureSessionImpl(mNextSessionId++,
                        callback, executor, this, mDeviceExecutor, configureSuccess,
                        mCharacteristics);
            } else {
                newSession = new CameraCaptureSessionImpl(mNextSessionId++, input,
                        callback, executor, this, mDeviceExecutor, configureSuccess);
            }

            // TODO: wait until current session closes, then create the new session
            mCurrentSession = newSession;

            if (pendingException != null) {
                throw pendingException;
            }

            mSessionStateCallback = mCurrentSession.getDeviceStateCallback();
        }
    }

  Next, look at configureStreamsChecked():

    public boolean configureStreamsChecked(InputConfiguration inputConfig,
            List<OutputConfiguration> outputs, int operatingMode, CaptureRequest sessionParams)
                    throws CameraAccessException {
        // Treat a null input the same an empty list
        if (outputs == null) {
            outputs = new ArrayList<OutputConfiguration>();
        }
        if (outputs.size() == 0 && inputConfig != null) {
            throw new IllegalArgumentException("cannot configure an input stream without " +
                    "any output streams");
        }

        checkInputConfiguration(inputConfig);

        boolean success = false;

        synchronized(mInterfaceLock) {
            checkIfCameraClosedOrInError();
            // Streams to create
            HashSet<OutputConfiguration> addSet = new HashSet<OutputConfiguration>(outputs);
            // Streams to delete
            List<Integer> deleteList = new ArrayList<Integer>();

            // Determine which streams need to be created, which to be deleted
            for (int i = 0; i < mConfiguredOutputs.size(); ++i) {
                int streamId = mConfiguredOutputs.keyAt(i);
                OutputConfiguration outConfig = mConfiguredOutputs.valueAt(i);

                if (!outputs.contains(outConfig) || outConfig.isDeferredConfiguration()) {
                    // Always delete the deferred output configuration when the session
                    // is created, as the deferred output configuration doesn't have unique surface
                    // related identifies.
                    deleteList.add(streamId);
                } else {
                    addSet.remove(outConfig);  // Don't create a stream previously created
                }
            }

            mDeviceExecutor.execute(mCallOnBusy);
            stopRepeating();

            try {
                waitUntilIdle();

                // Begin the stream configuration
                mRemoteDevice.beginConfigure();

                // reconfigure the input stream if the input configuration is different.
                InputConfiguration currentInputConfig = mConfiguredInput.getValue();
                if (inputConfig != currentInputConfig &&
                        (inputConfig == null || !inputConfig.equals(currentInputConfig))) {
                    if (currentInputConfig != null) {
                        mRemoteDevice.deleteStream(mConfiguredInput.getKey());
                        mConfiguredInput = new SimpleEntry<Integer, InputConfiguration>(
                                REQUEST_ID_NONE, null);
                    }
                    if (inputConfig != null) {
                        int streamId = mRemoteDevice.createInputStream(inputConfig.getWidth(),
                                inputConfig.getHeight(), inputConfig.getFormat());
                        mConfiguredInput = new SimpleEntry<Integer, InputConfiguration>(
                                streamId, inputConfig);
                    }
                }

                // Delete all streams first (to free up HW resources)
                for (Integer streamId : deleteList) {
                    mRemoteDevice.deleteStream(streamId);
                    mConfiguredOutputs.delete(streamId);
                }

                // Add all new streams
                for (OutputConfiguration outConfig : outputs) {
                    if (addSet.contains(outConfig)) {
                        int streamId = mRemoteDevice.createStream(outConfig);
                        mConfiguredOutputs.put(streamId, outConfig);
                    }
                }

                //customOpMode can be changed via setOpModeForVideoStream;
                //CameraConstrainedHighSpeedCaptureSessionImpl does not change it
                operatingMode = (operatingMode | (customOpMode << 16));

                //Finish the stream configuration.
                //mRemoteDevice is of type ICameraDeviceUserWrapper,
                //obtained when the camera was opened.
                if (sessionParams != null) {
                    mRemoteDevice.endConfigure(operatingMode, sessionParams.getNativeCopy());
                } else {
                    mRemoteDevice.endConfigure(operatingMode, null);
                }

                success = true;
            } catch (IllegalArgumentException e) {
                // OK. camera service can reject stream config if it's not supported by HAL
                // This is only the result of a programmer misusing the camera2 api.
                Log.w(TAG, "Stream configuration failed due to: " + e.getMessage());
                return false;
            } catch (CameraAccessException e) {
                if (e.getReason() == CameraAccessException.CAMERA_IN_USE) {
                    throw new IllegalStateException("The camera is currently busy." +
                            " You must wait until the previous operation completes.", e);
                }
                throw e;
            } finally {
                if (success && outputs.size() > 0) {
                    mDeviceExecutor.execute(mCallOnIdle);
                } else {
                    // Always return to the 'unconfigured' state if we didn't hit a fatal error
                    mDeviceExecutor.execute(mCallOnUnconfigured);
                }
            }
        }

        return success;
    }

  The mRemoteDevice object above is an ICameraDeviceUserWrapper obtained when the camera was opened:

  frameworks/base/core/java/android/hardware/camera2/CameraManager.java

    private CameraDevice openCameraDeviceUserAsync(String cameraId,
            CameraDevice.StateCallback callback, Executor executor, final int uid)
            throws CameraAccessException {
        CameraCharacteristics characteristics = getCameraCharacteristics(cameraId);
        CameraDevice device = null;

        synchronized (mLock) {

            ICameraDeviceUser cameraUser = null;

            //Create the CameraDeviceImpl object
            android.hardware.camera2.impl.CameraDeviceImpl deviceImpl =
                    new android.hardware.camera2.impl.CameraDeviceImpl(
                        cameraId,
                        callback,
                        executor,
                        characteristics,
                        mContext.getApplicationInfo().targetSdkVersion);

            ICameraDeviceCallbacks callbacks = deviceImpl.getCallbacks();

            try {
                if (supportsCamera2ApiLocked(cameraId)) {
                    // Use cameraservice's cameradeviceclient implementation for HAL3.2+ devices
                    //Get the CameraService binder proxy
                    ICameraService cameraService = CameraManagerGlobal.get().getCameraService();
                    if (cameraService == null) {
                        throw new ServiceSpecificException(
                            ICameraService.ERROR_DISCONNECTED,
                            "Camera service is currently unavailable");
                    }
                    //Open the camera through the CameraService proxy and obtain the ICameraDeviceUser (cameraUser) object
                    cameraUser = cameraService.connectDevice(callbacks, cameraId,
                            mContext.getOpPackageName(), uid);
                } else {
                    // Use legacy camera implementation for HAL1 devices
                    int id;
                    try {
                        id = Integer.parseInt(cameraId);
                    } catch (NumberFormatException e) {
                        throw new IllegalArgumentException("Expected cameraId to be numeric, but it was: "
                                + cameraId);
                    }

                    Log.i(TAG, "Using legacy camera HAL.");
                    cameraUser = CameraDeviceUserShim.connectBinderShim(callbacks, id);
                }
            } catch (ServiceSpecificException e) {
                if (e.errorCode == ICameraService.ERROR_DEPRECATED_HAL) {
                    throw new AssertionError("Should've gone down the shim path");
                } else if (e.errorCode == ICameraService.ERROR_CAMERA_IN_USE ||
                        e.errorCode == ICameraService.ERROR_MAX_CAMERAS_IN_USE ||
                        e.errorCode == ICameraService.ERROR_DISABLED ||
                        e.errorCode == ICameraService.ERROR_DISCONNECTED ||
                        e.errorCode == ICameraService.ERROR_INVALID_OPERATION) {
                    // Received one of the known connection errors
                    // The remote camera device cannot be connected to, so
                    // set the local camera to the startup error state
                    deviceImpl.setRemoteFailure(e);

                    if (e.errorCode == ICameraService.ERROR_DISABLED ||
                            e.errorCode == ICameraService.ERROR_DISCONNECTED ||
                            e.errorCode == ICameraService.ERROR_CAMERA_IN_USE) {
                        // Per API docs, these failures call onError and throw
                        throwAsPublicException(e);
                    }
                } else {
                    // Unexpected failure - rethrow
                    throwAsPublicException(e);
                }
            } catch (RemoteException e) {
                // Camera service died - act as if it's a CAMERA_DISCONNECTED case
                ServiceSpecificException sse = new ServiceSpecificException(
                    ICameraService.ERROR_DISCONNECTED,
                    "Camera service is currently unavailable");
                deviceImpl.setRemoteFailure(sse);
                throwAsPublicException(sse);
            }

            // TODO: factor out callback to be non-nested, then move setter to constructor
            // For now, calling setRemoteDevice will fire initial
            // onOpened/onUnconfigured callbacks.
            // This function call may post onDisconnected and throw CAMERA_DISCONNECTED if
            // cameraUser dies during setup.
            //Hand the cameraUser object obtained from the open call to the CameraDeviceImpl (deviceImpl)
            deviceImpl.setRemoteDevice(cameraUser);
            device = deviceImpl;
        }
        
        //Return the CameraDeviceImpl object (deviceImpl)
        return device;
    }

 From the analysis above, configureStreamsChecked performs three main steps:

mRemoteDevice.beginConfigure(); // begin the configuration
mRemoteDevice.deleteStream(streamId) and mRemoteDevice.createStream(outConfig) // delete and create streams
mRemoteDevice.endConfigure(operatingMode); // finish the configuration; the first two steps are preparation, this call actually configures the streams

The key call is endConfigure:
  frameworks/base/core/java/android/hardware/camera2/impl/ICameraDeviceUserWrapper.java

    public void endConfigure(int operatingMode, CameraMetadataNative sessionParams)
           throws CameraAccessException {
        try {
            // Binder IPC call; the actual implementation is in CameraDeviceClient.cpp
            mRemoteDevice.endConfigure(operatingMode, (sessionParams == null) ?
                    new CameraMetadataNative() : sessionParams);
        } catch (Throwable t) {
            CameraManager.throwAsPublicException(t);
            throw new UnsupportedOperationException("Unexpected exception", t);
        }
    }

  Next is frameworks/av/services/camera/libcameraservice/api2/CameraDeviceClient.cpp:

binder::Status CameraDeviceClient::endConfigure(int operatingMode,
        const hardware::camera2::impl::CameraMetadataNative& sessionParams) {
    ATRACE_CALL();
    ALOGV("%s: ending configure (%d input stream, %zu output surfaces)",
            __FUNCTION__, mInputStream.configured ? 1 : 0,
            mStreamMap.size());
......
    // Sanitize the high speed session against necessary capability bit.
    bool isConstrainedHighSpeed = (operatingMode == ICameraDeviceUser::CONSTRAINED_HIGH_SPEED_MODE);
    // Check whether CONSTRAINED_HIGH_SPEED_MODE is supported
    if (isConstrainedHighSpeed) {
        CameraMetadata staticInfo = mDevice->info();
        camera_metadata_entry_t entry = staticInfo.find(ANDROID_REQUEST_AVAILABLE_CAPABILITIES);
        bool isConstrainedHighSpeedSupported = false;
        for(size_t i = 0; i < entry.count; ++i) {
            uint8_t capability = entry.data.u8[i];
            if (capability == ANDROID_REQUEST_AVAILABLE_CAPABILITIES_CONSTRAINED_HIGH_SPEED_VIDEO) {
                isConstrainedHighSpeedSupported = true;
                break;
            }
        }
        if (!isConstrainedHighSpeedSupported) {
            String8 msg = String8::format(
                "Camera %s: Try to create a constrained high speed configuration on a device"
                " that doesn't support it.", mCameraIdStr.string());
            ALOGE("%s: %s", __FUNCTION__, msg.string());
            return STATUS_ERROR(CameraService::ERROR_ILLEGAL_ARGUMENT, msg.string());
        }
    }

    // Once the checks pass, configure the streams
    status_t err = mDevice->configureStreams(sessionParams, operatingMode);
    if (err == BAD_VALUE) {
        String8 msg = String8::format("Camera %s: Unsupported set of inputs/outputs provided",
                mCameraIdStr.string());
        ALOGE("%s: %s", __FUNCTION__, msg.string());
        res = STATUS_ERROR(CameraService::ERROR_ILLEGAL_ARGUMENT, msg.string());
    } else if (err != OK) {
        String8 msg = String8::format("Camera %s: Error configuring streams: %s (%d)",
                mCameraIdStr.string(), strerror(-err), err);
        ALOGE("%s: %s", __FUNCTION__, msg.string());
        res = STATUS_ERROR(CameraService::ERROR_INVALID_OPERATION, msg.string());
    }

    return res;
}

 Execution then enters frameworks/av/services/camera/libcameraservice/device3/Camera3Device.cpp:

status_t Camera3Device::configureStreams(const CameraMetadata& sessionParams, int operatingMode) {
    ATRACE_CALL();
    ALOGV("%s: E", __FUNCTION__);

    Mutex::Autolock il(mInterfaceLock);
    Mutex::Autolock l(mLock);

    // In case the client doesn't include any session parameter, try a
    // speculative configuration using the values from the last cached
    // default request.
    if (sessionParams.isEmpty() &&
            ((mLastTemplateId > 0) && (mLastTemplateId < CAMERA3_TEMPLATE_COUNT)) &&
            (!mRequestTemplateCache[mLastTemplateId].isEmpty())) {
        ALOGV("%s: Speculative session param configuration with template id: %d", __func__,
                mLastTemplateId);
        return filterParamsAndConfigureLocked(mRequestTemplateCache[mLastTemplateId],
                operatingMode);
    }

    return filterParamsAndConfigureLocked(sessionParams, operatingMode);
}

 This then calls the following function:

status_t Camera3Device::configureStreamsLocked(int operatingMode,
        const CameraMetadata& sessionParams, bool notifyRequestThread) {
    ATRACE_CALL();
    status_t res;

    if (mStatus != STATUS_UNCONFIGURED && mStatus != STATUS_CONFIGURED) {
        CLOGE("Not idle");
        return INVALID_OPERATION;
    }

    if (operatingMode < 0) {
        CLOGE("Invalid operating mode: %d", operatingMode);
        return BAD_VALUE;
    }

    // Check whether this is a constrained high speed configuration
    bool isConstrainedHighSpeed =
            static_cast<int>(StreamConfigurationMode::CONSTRAINED_HIGH_SPEED_MODE) ==
            operatingMode;

    if (mOperatingMode != operatingMode) {
        mNeedConfig = true;
        mIsConstrainedHighSpeedConfiguration = isConstrainedHighSpeed;
        mOperatingMode = operatingMode;
    }

    if (!mNeedConfig) {
        ALOGV("%s: Skipping config, no stream changes", __FUNCTION__);
        return OK;
    }

    // Workaround for device HALv3.2 or older spec bug - zero streams requires
    // adding a dummy stream instead.
    // TODO: Bug: 17321404 for fixing the HAL spec and removing this workaround.
    if (mOutputStreams.size() == 0) {
        addDummyStreamLocked();
    } else {
        tryRemoveDummyStreamLocked();
    }

    // Start configuring the streams
    ALOGV("%s: Camera %s: Starting stream configuration", __FUNCTION__, mId.string());

    mPreparerThread->pause();

    camera3_stream_configuration config;
    config.operation_mode = mOperatingMode; // propagate mOperatingMode into config.operation_mode
    config.num_streams = (mInputStream != NULL) + mOutputStreams.size();

    Vector<camera3_stream_t*> streams;
    streams.setCapacity(config.num_streams);
    std::vector<uint32_t> bufferSizes(config.num_streams, 0);


    if (mInputStream != NULL) {
        camera3_stream_t *inputStream;
        inputStream = mInputStream->startConfiguration();
        if (inputStream == NULL) {
            CLOGE("Can't start input stream configuration");
            cancelStreamsConfigurationLocked();
            return INVALID_OPERATION;
        }
        streams.add(inputStream);
    }

    // Configure the output streams
    for (size_t i = 0; i < mOutputStreams.size(); i++) {

        // Don't configure bidi streams twice, nor add them twice to the list
        if (mOutputStreams[i].get() ==
            static_cast<Camera3StreamInterface*>(mInputStream.get())) {

            config.num_streams--;
            continue;
        }

        camera3_stream_t *outputStream;
        outputStream = mOutputStreams.editValueAt(i)->startConfiguration();
        if (outputStream == NULL) {
            CLOGE("Can't start output stream configuration");
            cancelStreamsConfigurationLocked();
            return INVALID_OPERATION;
        }
        streams.add(outputStream);

        if (outputStream->format == HAL_PIXEL_FORMAT_BLOB &&
                outputStream->data_space == HAL_DATASPACE_V0_JFIF) {
            size_t k = i + ((mInputStream != nullptr) ? 1 : 0); // Input stream if present should
                                                                // always occupy the initial entry.
            bufferSizes[k] = static_cast<uint32_t>(
                    getJpegBufferSize(outputStream->width, outputStream->height));
        }
    }

    config.streams = streams.editArray();

    // Do the HAL configuration; will potentially touch stream
    // max_buffers, usage, priv fields.

    const camera_metadata_t *sessionBuffer = sessionParams.getAndLock();

    //Ask the HAL to configure the streams
    res = mInterface->configureStreams(sessionBuffer, &config, bufferSizes);
    sessionParams.unlock(sessionBuffer);
......
    return OK;
}

 Finally the HIDL interface is invoked, still in frameworks/av/services/camera/libcameraservice/device3/Camera3Device.cpp:

status_t Camera3Device::HalInterface::configureStreams(const camera_metadata_t *sessionParams,
        camera3_stream_configuration *config, const std::vector<uint32_t>& bufferSizes) {
    ATRACE_NAME("CameraHal::configureStreams");
    if (!valid()) return INVALID_OPERATION;
    status_t res = OK;

......
    // See if we have v3.4 or v3.3 HAL
    if (mHidlSession_3_4 != nullptr) {
        // We do; use v3.4 for the call
        ALOGV("%s: v3.4 device found", __FUNCTION__);
        device::V3_4::HalStreamConfiguration finalConfiguration3_4;
        auto err = mHidlSession_3_4->configureStreams_3_4(requestedConfiguration3_4,
            [&status, &finalConfiguration3_4]
            (common::V1_0::Status s, const device::V3_4::HalStreamConfiguration& halConfiguration) {
                finalConfiguration3_4 = halConfiguration;
                status = s;
            });
        if (!err.isOk()) {
            ALOGE("%s: Transaction error: %s", __FUNCTION__, err.description().c_str());
            return DEAD_OBJECT;
        }
        finalConfiguration.streams.resize(finalConfiguration3_4.streams.size());
        for (size_t i = 0; i < finalConfiguration3_4.streams.size(); i++) {
            finalConfiguration.streams[i] = finalConfiguration3_4.streams[i].v3_3;
        }
    } else if (mHidlSession_3_3 != nullptr) {
        // We do; use v3.3 for the call
        ALOGV("%s: v3.3 device found", __FUNCTION__);
        auto err = mHidlSession_3_3->configureStreams_3_3(requestedConfiguration3_2,
            [&status, &finalConfiguration]
            (common::V1_0::Status s, const device::V3_3::HalStreamConfiguration& halConfiguration) {
                finalConfiguration = halConfiguration;
                status = s;
            });
        if (!err.isOk()) {
            ALOGE("%s: Transaction error: %s", __FUNCTION__, err.description().c_str());
            return DEAD_OBJECT;
        }
    } else {
        // We don't; use v3.2 call and construct a v3.3 HalStreamConfiguration
        ALOGV("%s: v3.2 device found", __FUNCTION__);
        HalStreamConfiguration finalConfiguration_3_2;
        auto err = mHidlSession->configureStreams(requestedConfiguration3_2,
            [&status, &finalConfiguration_3_2]
            (common::V1_0::Status s, const HalStreamConfiguration& halConfiguration) {
                finalConfiguration_3_2 = halConfiguration;
                status = s;
            });
        if (!err.isOk()) {
            ALOGE("%s: Transaction error: %s", __FUNCTION__, err.description().c_str());
            return DEAD_OBJECT;
        }
        finalConfiguration.streams.resize(finalConfiguration_3_2.streams.size());
        for (size_t i = 0; i < finalConfiguration_3_2.streams.size(); i++) {
            finalConfiguration.streams[i].v3_2 = finalConfiguration_3_2.streams[i];
            finalConfiguration.streams[i].overrideDataSpace =
                    requestedConfiguration3_2.streams[i].dataSpace;
        }
    }
......
    return res;
}

 The various HIDL interface versions live under hardware/interfaces/camera/device/.

 The HAL-layer entry point is in vendor/qcom/proprietary/camx/src/core/hal/camxhal3.cpp:

static int configure_streams(
    const struct camera3_device*    pCamera3DeviceAPI,
    camera3_stream_configuration_t* pStreamConfigsAPI)
{
    CAMX_ENTRYEXIT_SCOPE(CamxLogGroupHAL, SCOPEEventHAL3ConfigureStreams);
......

        Camera3StreamConfig* pStreamConfigs = reinterpret_cast<Camera3StreamConfig*>(pStreamConfigsAPI);

        result = pHALDevice->ConfigureStreams(pStreamConfigs);

        if ((CamxResultSuccess != result) && (CamxResultEInvalidArg != result))
        {
            // HAL interface requires -ENODEV (EFailed) if a fatal error occurs
            result = CamxResultEFailed;
        }
        if (CamxResultSuccess == result)
        {
            for (UINT32 stream = 0; stream < pStreamConfigsAPI->num_streams; stream++)
            {
                CAMX_ASSERT(NULL != pStreamConfigsAPI->streams[stream]);

                if (NULL == pStreamConfigsAPI->streams[stream])
                {
                    CAMX_LOG_ERROR(CamxLogGroupHAL, "Invalid argument 2 for configure_streams()");
                    // HAL interface requires -EINVAL (EInvalidArg) for invalid arguments
                    result = CamxResultEInvalidArg;
                    break;
                }
                else
                {
                    CAMX_LOG_CONFIG(CamxLogGroupHAL, " FINAL stream[%d] = %p - info:", stream,
                        pStreamConfigsAPI->streams[stream]);
                    CAMX_LOG_CONFIG(CamxLogGroupHAL, "            format       : %d, %s",
                        pStreamConfigsAPI->streams[stream]->format,
                        FormatToString(pStreamConfigsAPI->streams[stream]->format));
                    CAMX_LOG_CONFIG(CamxLogGroupHAL, "            width        : %d",
                        pStreamConfigsAPI->streams[stream]->width);
                    CAMX_LOG_CONFIG(CamxLogGroupHAL, "            height       : %d",
                        pStreamConfigsAPI->streams[stream]->height);
                    CAMX_LOG_CONFIG(CamxLogGroupHAL, "            stream_type  : %08x, %s",
                        pStreamConfigsAPI->streams[stream]->stream_type,
                        StreamTypeToString(pStreamConfigsAPI->streams[stream]->stream_type));
                    CAMX_LOG_CONFIG(CamxLogGroupHAL, "            usage        : %08x",
                        pStreamConfigsAPI->streams[stream]->usage);
                    CAMX_LOG_CONFIG(CamxLogGroupHAL, "            max_buffers  : %d",
                        pStreamConfigsAPI->streams[stream]->max_buffers);
                    CAMX_LOG_CONFIG(CamxLogGroupHAL, "            rotation     : %08x, %s",
                        pStreamConfigsAPI->streams[stream]->rotation,
                        RotationToString(pStreamConfigsAPI->streams[stream]->rotation));
                    CAMX_LOG_CONFIG(CamxLogGroupHAL, "            data_space   : %08x, %s",
                        pStreamConfigsAPI->streams[stream]->data_space,
                        DataSpaceToString(pStreamConfigsAPI->streams[stream]->data_space));
                    CAMX_LOG_CONFIG(CamxLogGroupHAL, "            priv         : %p",
                        pStreamConfigsAPI->streams[stream]->priv);
                    CAMX_LOG_CONFIG(CamxLogGroupHAL, "            reserved[0]         : %p",
                        pStreamConfigsAPI->streams[stream]->reserved[0]);
                    CAMX_LOG_CONFIG(CamxLogGroupHAL, "            reserved[1]         : %p",
                        pStreamConfigsAPI->streams[stream]->reserved[1]);

                    Camera3HalStream* pHalStream =
                        reinterpret_cast<Camera3HalStream*>(pStreamConfigsAPI->streams[stream]->reserved[0]);
                    if (pHalStream != NULL)
                    {
                        // GetInstance() initializes the capabilities of the current device and sensor
                        if (TRUE == HwEnvironment::GetInstance()->GetStaticSettings()->enableHALFormatOverride)
                        {
                            pStreamConfigsAPI->streams[stream]->format =
                                static_cast<HALPixelFormat>(pHalStream->overrideFormat);
                        }
                        CAMX_LOG_CONFIG(CamxLogGroupHAL,
                            "   pHalStream: %p format : 0x%x, overrideFormat : 0x%x consumer usage: %llx, producer usage: %llx",
                            pHalStream, pStreamConfigsAPI->streams[stream]->format,
                            pHalStream->overrideFormat, pHalStream->consumerUsage, pHalStream->producerUsage);
                    }
                }
            }
        }
 ......
  return Utils::CamxResultToErrno(result);
}

 ConfigureStreams itself is implemented in vendor/qcom/proprietary/camx/src/core/hal/camxhaldevice.cpp:

CamxResult HALDevice::ConfigureStreams(
    Camera3StreamConfig* pStreamConfigs)
{
    CamxResult result = CamxResultSuccess;

    // Validate the incoming stream configurations
    result = CheckValidStreamConfig(pStreamConfigs);

......
    if (CamxResultSuccess == result)
    {
        ClearFrameworkRequestBuffer();

        m_numPipelines = 0;

        if (TRUE == m_bCHIModuleInitialized)
        {
            GetCHIAppCallbacks()->chi_teardown_override_session(
                reinterpret_cast<camera3_device*>(&m_camera3Device), 0, NULL);
        }

        m_bCHIModuleInitialized = CHIModuleInitialize(pStreamConfigs); // Initialize the CHI module
        ......
    }

    return result;
}

  CHIModuleInitialize invokes the callback that the CHI layer registered:

////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
// HALDevice::CHIModuleInitialize
////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
BOOL HALDevice::CHIModuleInitialize(
    Camera3StreamConfig* pStreamConfigs)
{
    BOOL isOverrideEnabled = FALSE;

    if (TRUE == HAL3Module::GetInstance()->IsCHIOverrideModulePresent())
    {
        /// @todo (CAMX-1518) Handle private data from Override module
        VOID*                   pPrivateData;
        chi_hal_callback_ops_t* pCHIAppCallbacks  = GetCHIAppCallbacks();
     
        // Invoke the callback registered by the CHI layer
        pCHIAppCallbacks->chi_initialize_override_session(GetCameraId(),
            reinterpret_cast<const camera3_device_t*>(&m_camera3Device),
            &m_HALCallbacks,
            reinterpret_cast<camera3_stream_configuration_t*>(pStreamConfigs),
            &isOverrideEnabled,
            &pPrivateData);
    }

    return isOverrideEnabled;
}

 The CHI-layer callback is implemented in vendor/qcom/proprietary/chi-cdk/vendor/chioverride/default/chxextensioninterface.cpp:

CDKResult ExtensionModule::InitializeOverrideSession(
    uint32_t                        logicalCameraId,
    const camera3_device_t*         pCamera3Device,
    const chi_hal_ops_t*            chiHalOps,
    camera3_stream_configuration_t* pStreamConfig,
    int*                            pIsOverrideEnabled,
    VOID**                          pPrivate)
{
    CDKResult          result             = CDKResultSuccess;
    UINT32             modeCount          = 0;
    ChiSensorModeInfo* pAllModes          = NULL;
    UINT32             fps                = *m_pDefaultMaxFPS;
    BOOL               isVideoMode        = FALSE;
    uint32_t           operation_mode;
    static BOOL        fovcModeCheck      = EnableFOVCUseCase();
    UsecaseId          selectedUsecaseId  = UsecaseId::NoMatch;
    UINT               minSessionFps      = 0;
    UINT               maxSessionFps      = 0;
......
    if ((isVideoMode == TRUE) && (operation_mode != 0))
    {
        UINT32 numSensorModes = m_logicalCameraInfo[logicalCameraId].m_cameraCaps.numSensorModes;
        // Sensor mode info; the HFR-related fields include frameRate, batchedFrames, etc.
        CHISENSORMODEINFO* pAllSensorModes = m_logicalCameraInfo[logicalCameraId].pSensorModeInfo;
        if ((operation_mode - 1) >= numSensorModes)
        {
            result = CDKResultEOverflow;
            CHX_LOG_ERROR("operation_mode: %d, numSensorModes: %d", operation_mode, numSensorModes);
        }
        else
        {
            fps = pAllSensorModes[operation_mode - 1].frameRate;
        }
    }

    if (CDKResultSuccess == result)
    {
#if defined(CAMX_ANDROID_API) && (CAMX_ANDROID_API >= 28) //Android-P or better
        camera_metadata_t *metadata = const_cast<camera_metadata_t*>(pStreamConfig->session_parameters);
        camera_metadata_entry_t entry = { 0 };
        entry.tag = ANDROID_CONTROL_AE_TARGET_FPS_RANGE;

        // The client may choose to send NULL sesssion parameter, which is fine. For example, torch mode
        // will have NULL session param.
        if (metadata != NULL)
        {
            // Look up the entry for this tag and copy its data into 'entry'
            int ret = find_camera_metadata_entry(metadata, entry.tag, &entry);
            if(ret == 0)
            {
                minSessionFps   = entry.data.i32[0];
                maxSessionFps   = entry.data.i32[1];
                m_usecaseMaxFPS = maxSessionFps;
            }
        }
#endif

        if ((StreamConfigModeConstrainedHighSpeed == pStreamConfig->operation_mode) ||
            (StreamConfigModeSuperSlowMotionFRC == pStreamConfig->operation_mode))
        {
            // For an HFR configuration:
            // 1) Find the entry in SupportedHFRVideoSizes that matches the video/preview streams.
            //    Note: the preview and recording stream sizes must be identical, otherwise the
            //    high speed session creation fails.
            // 2) If a single entry is found in SupportedHFRVideoSizes, take the batch size from that entry.
            SearchNumBatchedFrames(logicalCameraId, pStreamConfig,
                                   &m_usecaseNumBatchedFrames, &m_usecaseMaxFPS, maxSessionFps);
            if (480 > m_usecaseMaxFPS)
            {
                m_CurrentpowerHint = PERF_LOCK_POWER_HINT_VIDEO_ENCODE_HFR;
            }
            else
            {
                // For 480FPS or higher, require more aggresive power hint
                m_CurrentpowerHint = PERF_LOCK_POWER_HINT_VIDEO_ENCODE_HFR_480FPS;
            }
        }
        else
        {
            // Not a HFR usecase, batch frames value need to be set to 1.
            m_usecaseNumBatchedFrames = 1;
            if (maxSessionFps == 0)
            {
                m_usecaseMaxFPS = fps;
            }
            if (TRUE == isVideoMode)
            {
                if (30 >= m_usecaseMaxFPS)
                {
                    m_CurrentpowerHint = PERF_LOCK_POWER_HINT_VIDEO_ENCODE;
                }
                else
                {
                    m_CurrentpowerHint = PERF_LOCK_POWER_HINT_VIDEO_ENCODE_60FPS;
                }
            }
            else
            {
                m_CurrentpowerHint = PERF_LOCK_POWER_HINT_PREVIEW;
            }
        }

        if ((NULL != m_pPerfLockManager[logicalCameraId]) && (m_CurrentpowerHint != m_previousPowerHint))
        {
            m_pPerfLockManager[logicalCameraId]->ReleasePerfLock(m_previousPowerHint);
        }

        // Example [B == batch]: (240 FPS / 4 FPB = 60 BPS) / 30 FPS (Stats frequency goal) = 2 BPF
        // i.e. skip every other stats
        *m_pStatsSkipPattern = m_usecaseMaxFPS / m_usecaseNumBatchedFrames / 30;
        if (*m_pStatsSkipPattern < 1)
        {
            *m_pStatsSkipPattern = 1;
        }

        m_VideoHDRMode       = (StreamConfigModeVideoHdr == pStreamConfig->operation_mode);
        m_torchWidgetUsecase = (StreamConfigModeQTITorchWidget == pStreamConfig->operation_mode);

        // this check is introduced to avoid set *m_pEnableFOVC == 1 if fovcEnable is disabled in
        // overridesettings & fovc bit is set in operation mode.
        // as well as to avoid set, when we switch Usecases.
        if (TRUE == fovcModeCheck)
        {
            *m_pEnableFOVC =
                ((pStreamConfig->operation_mode & StreamConfigModeQTIFOVC) == StreamConfigModeQTIFOVC) ? 1 : 0;
        }

        SetHALOps(chiHalOps, logicalCameraId);

        m_logicalCameraInfo[logicalCameraId].m_pCamera3Device = pCamera3Device;

        // Pick the matching usecase from the camera info and stream configuration
        selectedUsecaseId = m_pUsecaseSelector->GetMatchingUsecase(&m_logicalCameraInfo[logicalCameraId],
                                                                   pStreamConfig);
        CHX_LOG_CONFIG("Session_parameters FPS range %d:%d, BatchSize: %u FPS: %u SkipPattern: %u, "
                       "cameraId = %d selected use case = %d",
                       minSessionFps, maxSessionFps, m_usecaseNumBatchedFrames, m_usecaseMaxFPS,
                       *m_pStatsSkipPattern, logicalCameraId, selectedUsecaseId);

        // FastShutter mode supported only in ZSL usecase.
        if ((pStreamConfig->operation_mode == StreamConfigModeFastShutter) &&
            (UsecaseId::PreviewZSL != selectedUsecaseId))
        {
            pStreamConfig->operation_mode = StreamConfigModeNormal;
        }
        m_operationMode[logicalCameraId] = pStreamConfig->operation_mode;
    }

    if (UsecaseId::NoMatch != selectedUsecaseId)
    {
        // Create the Usecase object for the selected UsecaseId; for HFR the UsecaseId is the default one
        m_pSelectedUsecase[logicalCameraId] =
            m_pUsecaseFactory->CreateUsecaseObject(&m_logicalCameraInfo[logicalCameraId],
                                                   selectedUsecaseId, pStreamConfig);
    }
.....

    return result;
}
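  The stats-skip arithmetic in the comment above, (240 fps / 4 frames per batch = 60 batches per second) / 30 = 2, meaning 3A stats are processed on every other batch, can be reproduced in isolation. A small illustration, not part of the CHI code:

    class StatsSkipPattern {
        // Mirrors *m_pStatsSkipPattern = m_usecaseMaxFPS / m_usecaseNumBatchedFrames / 30 (clamped to >= 1)
        static int compute(int usecaseMaxFps, int numBatchedFrames) {
            int skip = usecaseMaxFps / numBatchedFrames / 30;
            return Math.max(skip, 1);
        }

        public static void main(String[] args) {
            // 240 fps with 4 frames per batch -> 60 batches/s -> process stats on every 2nd batch
            System.out.println(compute(240, 4));  // 2
            // 120 fps with 4 frames per batch -> 30 batches/s -> every batch
            System.out.println(compute(120, 4));  // 1
        }
    }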

  This ends up in AdvancedCameraUsecase::Create, implemented in vendor/qcom/proprietary/chi-cdk/vendor/chioverride/default/chxadvancedcamerausecase.cpp:

AdvancedCameraUsecase* AdvancedCameraUsecase::Create(
    LogicalCameraInfo*              pCameraInfo,   ///< Camera info
    camera3_stream_configuration_t* pStreamConfig, ///< Stream configuration
    UsecaseId                       usecaseId)     ///< Identifier for usecase function
{
    CDKResult              result                 = CDKResultSuccess;
    AdvancedCameraUsecase* pAdvancedCameraUsecase = CHX_NEW AdvancedCameraUsecase;

    if ((NULL != pAdvancedCameraUsecase) && (NULL != pStreamConfig))
    {
        // Initialize() in turn calls CameraUsecaseBase::Initialize(m_pCallbacks) and then CameraUsecaseBase::CreatePipeline
        result = pAdvancedCameraUsecase->Initialize(pCameraInfo, pStreamConfig, usecaseId);

        if (CDKResultSuccess != result)
        {
            pAdvancedCameraUsecase->Destroy(FALSE);
            pAdvancedCameraUsecase = NULL;
        }
    }
    else
    {
        result = CDKResultEFailed;
    }

    return pAdvancedCameraUsecase;
}

 At this point the CHI layer has selected the matching UsecaseId, based on the app's parameters, the platform and sensor information, and the topology XML, and has created the required pipelines.

 Restrictions when configuring HFR streams (a hedged sketch follows this list):

The high speed streams are configured via createConstrainedHighSpeedCaptureSession.
Only one or two output streams can be configured: a preview stream and a video recording stream.
The preview stream is expected to carry usage GRALLOC_USAGE_HW_TEXTURE | GRALLOC_USAGE_HW_COMPOSER | GRALLOC_USAGE_HW_RENDER.
The recording stream is expected to carry usage GRALLOC_USAGE_HW_VIDEO_ENCODER.
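  Putting these restrictions together, the two surfaces handed to createConstrainedHighSpeedCaptureSession are typically the preview SurfaceTexture and the MediaRecorder input surface; the usage flags are attached by those producers, not by app code. A minimal sketch with placeholder names, assuming the MediaRecorder has been prepared with a SURFACE video source:

    import android.graphics.SurfaceTexture;
    import android.hardware.camera2.CameraAccessException;
    import android.hardware.camera2.CameraCaptureSession;
    import android.hardware.camera2.CameraDevice;
    import android.media.MediaRecorder;
    import android.os.Handler;
    import android.util.Size;
    import android.view.Surface;
    import java.util.Arrays;

    class HighSpeedSurfacesSketch {
        void createSession(CameraDevice camera, SurfaceTexture previewTexture,
                           MediaRecorder recorder, Size videoSize,
                           CameraCaptureSession.StateCallback callback, Handler handler)
                throws CameraAccessException {
            // Preview and video sizes must match one of the advertised high speed video sizes
            previewTexture.setDefaultBufferSize(videoSize.getWidth(), videoSize.getHeight());
            Surface previewSurface = new Surface(previewTexture);   // HW_TEXTURE/HW_COMPOSER usage
            Surface recordSurface  = recorder.getSurface();         // HW_VIDEO_ENCODER usage
            camera.createConstrainedHighSpeedCaptureSession(
                    Arrays.asList(previewSurface, recordSurface), callback, handler);
        }
    }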

、高通平臺如何獲取platform和camera sensor的capabilities?

  In the Snapdragon camera settings UI you can choose a Video quality and a corresponding Video High Frame Rate; which entries are offered is decided when the camera service starts, jointly by the platform and the camera sensor's output capability.
  Once a video quality is selected, the HFR option list is refreshed by querying the fps values supported at that resolution. The flow is as follows:

 (1) packages/apps/SnapdragonCamera/src/com/android/camera/SettingsManager.java

    // Query the supported fps values and update the HFR option list
    private void filterHFROptions() {
        ListPreference hfrPref = mPreferenceGroup.findPreference(KEY_VIDEO_HIGH_FRAME_RATE);
        if (hfrPref != null) {
            hfrPref.reloadInitialEntriesAndEntryValues();
            if (filterUnsupportedOptions(hfrPref, getSupportedHighFrameRate())) {
                mFilteredKeys.add(hfrPref.getKey());
            }
        }
    }
    private List<String> getSupportedHighFrameRate() {
        ArrayList<String> supported = new ArrayList<String>();
        supported.add("off");
        ListPreference videoQuality = mPreferenceGroup.findPreference(KEY_VIDEO_QUALITY);
        ListPreference videoEncoder = mPreferenceGroup.findPreference(KEY_VIDEO_ENCODER);
        if (videoQuality == null || videoEncoder == null) return supported;
        String videoSizeStr = videoQuality.getValue();
        int videoEncoderNum = SettingTranslation.getVideoEncoder(videoEncoder.getValue());
        VideoCapabilities videoCapabilities = null;
        boolean findVideoEncoder = false;
        if (videoSizeStr != null) {
            Size videoSize = parseSize(videoSizeStr);
            MediaCodecList allCodecs = new MediaCodecList(MediaCodecList.ALL_CODECS);
            for (MediaCodecInfo info : allCodecs.getCodecInfos()) {
                if (!info.isEncoder() || info.getName().contains("google")) continue;
                for (String type : info.getSupportedTypes()) {
                    if ((videoEncoderNum == MediaRecorder.VideoEncoder.MPEG_4_SP && type.equalsIgnoreCase(MediaFormat.MIMETYPE_VIDEO_MPEG4))
                            || (videoEncoderNum == MediaRecorder.VideoEncoder.H263 && type.equalsIgnoreCase(MediaFormat.MIMETYPE_VIDEO_H263))
                            || (videoEncoderNum == MediaRecorder.VideoEncoder.H264 && type.equalsIgnoreCase(MediaFormat.MIMETYPE_VIDEO_AVC))
                            || (videoEncoderNum == MediaRecorder.VideoEncoder.HEVC && type.equalsIgnoreCase(MediaFormat.MIMETYPE_VIDEO_HEVC))) {
                        CodecCapabilities codecCapabilities = info.getCapabilitiesForType(type);
                        videoCapabilities = codecCapabilities.getVideoCapabilities();
                        findVideoEncoder = true;
                        break;
                    }
                }
                if (findVideoEncoder) break;
            }

            try {
                // Query the fps ranges supported for the current video size
                Range[] range = getSupportedHighSpeedVideoFPSRange(mCameraId, videoSize);
                for (Range r : range) {
                    // To support HFR for both preview and recording,
                    // minmal FPS needs to be equal to maximum FPS
                    if ((int) r.getUpper() == (int) r.getLower()) {
                        if (videoCapabilities != null) {
                            if (videoCapabilities.areSizeAndRateSupported(
                                    videoSize.getWidth(), videoSize.getHeight(), (int) r.getUpper())) {
                                supported.add("hfr" + String.valueOf(r.getUpper()));
                                supported.add("hsr" + String.valueOf(r.getUpper()));
                            }
                        }
                    }
                }
            } catch (IllegalArgumentException ex) {
                Log.w(TAG, "HFR is not supported for this resolution " + ex);
            }
      .......
        }
        return supported;
    }
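  The encoder-side part of this check can be exercised on its own. A small sketch using MediaCodecList, restricted to AVC for brevity; the size and rate arguments are whatever the caller wants to validate:

    import android.media.MediaCodecInfo;
    import android.media.MediaCodecList;
    import android.media.MediaFormat;

    class EncoderHfrCheck {
        static boolean supports(int width, int height, int fps) {
            MediaCodecList codecs = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
            for (MediaCodecInfo info : codecs.getCodecInfos()) {
                if (!info.isEncoder()) continue;
                for (String type : info.getSupportedTypes()) {
                    if (!type.equalsIgnoreCase(MediaFormat.MIMETYPE_VIDEO_AVC)) continue;
                    MediaCodecInfo.VideoCapabilities caps =
                            info.getCapabilitiesForType(type).getVideoCapabilities();
                    if (caps != null && caps.areSizeAndRateSupported(width, height, fps)) {
                        return true;
                    }
                }
            }
            return false;
        }
    }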

 (2) frameworks/base/core/java/android/hardware/camera2/params/StreamConfigurationMap.java

    public Range<Integer>[] getHighSpeedVideoFpsRangesFor(Size size) {
        // Check whether the currently selected video size supports HFR
        Integer fpsRangeCount = mHighSpeedVideoSizeMap.get(size);
        if (fpsRangeCount == null || fpsRangeCount == 0) {
            throw new IllegalArgumentException(String.format(
                    "Size %s does not support high speed video recording", size));
        }

        @SuppressWarnings("unchecked")
        Range<Integer>[] fpsRanges = new Range[fpsRangeCount];
        int i = 0;
        // Collect the fps ranges supported for this video size
        for (HighSpeedVideoConfiguration config : mHighSpeedVideoConfigurations) {
            if (size.equals(config.getSize())) {
                fpsRanges[i++] = config.getFpsRange();
            }
        }
        return fpsRanges;
    }
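  On the application side the same table is reachable through the public API. A minimal sketch, with an illustrative log tag:

    import android.hardware.camera2.CameraAccessException;
    import android.hardware.camera2.CameraCharacteristics;
    import android.hardware.camera2.CameraManager;
    import android.hardware.camera2.params.StreamConfigurationMap;
    import android.util.Range;
    import android.util.Size;

    class HighSpeedQuerySketch {
        static void dump(CameraManager manager, String cameraId) throws CameraAccessException {
            CameraCharacteristics chars = manager.getCameraCharacteristics(cameraId);
            StreamConfigurationMap map =
                    chars.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
            if (map == null) return;
            for (Size size : map.getHighSpeedVideoSizes()) {
                // getHighSpeedVideoFpsRangesFor() is the method shown above
                for (Range<Integer> range : map.getHighSpeedVideoFpsRangesFor(size)) {
                    android.util.Log.d("HFR", size + " -> " + range);
                }
            }
        }
    }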

mHighSpeedVideoConfigurations is initialized in frameworks/base/core/java/android/hardware/camera2/impl/CameraMetadataNative.java:

    private StreamConfigurationMap getStreamConfigurationMap() {
        StreamConfiguration[] configurations = getBase(
                CameraCharacteristics.SCALER_AVAILABLE_STREAM_CONFIGURATIONS);
        StreamConfigurationDuration[] minFrameDurations = getBase(
                CameraCharacteristics.SCALER_AVAILABLE_MIN_FRAME_DURATIONS);
        StreamConfigurationDuration[] stallDurations = getBase(
                CameraCharacteristics.SCALER_AVAILABLE_STALL_DURATIONS);
        StreamConfiguration[] depthConfigurations = getBase(
                CameraCharacteristics.DEPTH_AVAILABLE_DEPTH_STREAM_CONFIGURATIONS);
        StreamConfigurationDuration[] depthMinFrameDurations = getBase(
                CameraCharacteristics.DEPTH_AVAILABLE_DEPTH_MIN_FRAME_DURATIONS);
        StreamConfigurationDuration[] depthStallDurations = getBase(
                CameraCharacteristics.DEPTH_AVAILABLE_DEPTH_STALL_DURATIONS);

        // Fetch the high speed video configurations reported by the camx layer
        HighSpeedVideoConfiguration[] highSpeedVideoConfigurations = getBase(
                CameraCharacteristics.CONTROL_AVAILABLE_HIGH_SPEED_VIDEO_CONFIGURATIONS);

        ReprocessFormatsMap inputOutputFormatsMap = getBase(
                CameraCharacteristics.SCALER_AVAILABLE_INPUT_OUTPUT_FORMATS_MAP);
        int[] capabilities = getBase(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES);
        boolean listHighResolution = false;
        for (int capability : capabilities) {
            if (capability == CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES_BURST_CAPTURE) {
                listHighResolution = true;
                break;
            }
        }

        // Create the StreamConfigurationMap and validate the configuration data
        return new StreamConfigurationMap(
                configurations, minFrameDurations, stallDurations,
                depthConfigurations, depthMinFrameDurations, depthStallDurations,
                highSpeedVideoConfigurations, inputOutputFormatsMap,
                listHighResolution);
    }

(3) The HAL-layer cameraInfo is the one obtained while analyzing ConfigureStreams in part 2: vendor/qcom/proprietary/camx/src/core/hal/camxhal3.cpp calls pHALDevice->ConfigureStreams(pStreamConfigs),
   which lands in vendor/qcom/proprietary/camx/src/core/hal/camxhaldevice.cpp:

////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
// HALDevice::ConfigureStreams
////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
CamxResult HALDevice::ConfigureStreams(
    Camera3StreamConfig* pStreamConfigs)
{
    CamxResult result = CamxResultSuccess;

    // Validate the incoming stream configurations 
    result = CheckValidStreamConfig(pStreamConfigs); // This calls pHWEnvironment->GetCameraInfo(logicalCameraId, &cameraInfo)

    if ((StreamConfigModeConstrainedHighSpeed == pStreamConfigs->operationMode) ||
        (StreamConfigModeSuperSlowMotionFRC == pStreamConfigs->operationMode))
    {
        SearchNumBatchedFrames (pStreamConfigs, &m_usecaseNumBatchedFrames, &m_FPSValue);
        CAMX_ASSERT(m_usecaseNumBatchedFrames > 1);
    }
    else
    {
        // Not a HFR usecase batch frames value need to set to 1.
        m_usecaseNumBatchedFrames = 1;
    }

    ......
    return result;
}

(4) The platform and sensor capabilities are initialized in vendor/qcom/proprietary/camx/src/core/camxhwenvironment.cpp:

VOID HwEnvironment::InitCaps()
{
......
    if (CamxResultSuccess == result)
    {
        // The platform/sensor capabilities are initialized mainly by the calls below
        ProbeImageSensorModules();      // Creates ImageSensorModuleDataManager; its Initialize() calls
                                        // CreateAllSensorModuleSetManagers, which loads the sensor .bin files
        EnumerateDevices();
        InitializeSensorSubModules();
        InitializeSensorStaticCaps();   // Calls ImageSensorModuleData::GetStaticCaps to read the sensor capability

        result = m_staticEntryMethods.GetStaticCaps(&m_platformCaps[0]);

        // copy the static capacity to remaining sensor's
        for (UINT index = 1; index < m_numberSensors; index++)
        {
            Utils::Memcpy(&m_platformCaps[index], &m_platformCaps[0], sizeof(m_platformCaps[0]));
        }

        if (NULL != m_pOEMInterface->pInitializeExtendedPlatformStaticCaps)
        {
            m_pOEMInterface->pInitializeExtendedPlatformStaticCaps(&m_platformCaps[0], m_numberSensors);
        }
    }
......
}

(5) When ProbeImageSensorModules creates the ImageSensorModuleDataManager, it loads the camera module .bin files to obtain the sensor information: vendor/qcom/proprietary/camx/src/core/camximagesensormoduledatamanager.cpp

CamxResult ImageSensorModuleDataManager::CreateAllSensorModuleSetManagers()
{
    CamxResult                   result                  = CamxResultSuccess;
    ImageSensorModuleSetManager* pSensorModuleSetManager = NULL;
    UINT16                       fileCount               = 0;
    CHAR                         binaryFiles[MaxSensorModules][FILENAME_MAX];

    // The 8150 MTP here uses the ov12a10 (wide) camera module, so com.qti.sensormodule.ofilm_ov12a10.bin is loaded
    fileCount = OsUtils::GetFilesFromPath(SensorModulesPath, FILENAME_MAX, &binaryFiles[0][0], "*", "sensormodule", "*", "bin");
    CAMX_ASSERT((fileCount != 0) && (fileCount < MaxSensorModules));

    m_numSensorModuleManagers = 0;

    if ((fileCount == 0) || (fileCount >= MaxSensorModules))
    {
        CAMX_LOG_ERROR(CamxLogGroupSensor, "Invalid fileCount", fileCount);
        result = CamxResultEFailed;
    }
    else
    {
        for (UINT i = 0; i < fileCount; i++)
        {
            result = GetSensorModuleManagerObj(&binaryFiles[i][0], &pSensorModuleSetManager);
            if (CamxResultSuccess == result)
            {
                m_pSensorModuleManagers[m_numSensorModuleManagers++] = pSensorModuleSetManager;
            }
            else
            {
                CAMX_LOG_ERROR(CamxLogGroupSensor,
                "GetSensorModuleManagerObj failed i: %d binFile: %s",
                i, &binaryFiles[i][0]);
            }
        }

        CAMX_ASSERT(m_numSensorModuleManagers > 0);
        if (0 == m_numSensorModuleManagers)
        {
            CAMX_LOG_ERROR(CamxLogGroupSensor, "Invalid number of sensor module managers");
            result = CamxResultEFailed;
        }
    }

    return result;
}

 The sensor static capability is retrieved along a call sequence shown as a diagram in the original post (sequence diagram omitted).

 The com.qti.sensormodule.ofilm_ov12a10.bin above can be regenerated by editing the corresponding XML and rebuilding; the XML lives under vendor/qcom/proprietary/chi-cdk/vendor/sensor/default/ov12a10.

 ov12a10_sensor.xml shows that 1080p supports at most 60 fps.

  Note that if you change the frameRate in the XML to 120 and rebuild the .bin, a 120 fps option does appear in the app settings; but if the sensor can only output 1080p@60fps, the recorded clip will stutter: the sensor's output frame rate is lower than the encode rate, so many duplicate frames are inserted.

  P.S.: The HFR usecase requires a frame rate of at least 120 fps.
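  Whether a device can do constrained high speed capture at all is advertised through the capability bit checked earlier in CameraDeviceClient::endConfigure(); an app can query the same bit before offering any slow-motion option. A small sketch:

    import android.hardware.camera2.CameraAccessException;
    import android.hardware.camera2.CameraCharacteristics;
    import android.hardware.camera2.CameraManager;
    import android.hardware.camera2.CameraMetadata;

    class ConstrainedHighSpeedSupport {
        static boolean isSupported(CameraManager manager, String cameraId) throws CameraAccessException {
            int[] caps = manager.getCameraCharacteristics(cameraId)
                    .get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES);
            if (caps == null) return false;
            for (int cap : caps) {
                // Same capability bit CameraDeviceClient::endConfigure() checks on the native side
                if (cap == CameraMetadata.REQUEST_AVAILABLE_CAPABILITIES_CONSTRAINED_HIGH_SPEED_VIDEO) {
                    return true;
                }
            }
            return false;
        }
    }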
