Camera2 createCaptureSession Source Code Analysis

After the application obtains an opened camera device through CameraManager#openCamera, it calls createCaptureSession to create the camera streams and apply the related stream configuration. Inside createCaptureSession, the application's surfaces are first wrapped into OutputConfiguration objects that can be passed across binder, and then createCaptureSessionInternal is called to carry out the actual configuration.
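As a concrete reference point, here is a minimal app-side sketch using the SessionConfiguration / OutputConfiguration variant of the API (API 28+), where the OutputConfiguration wrapping is visible to the app. The surfaces, executor and callback are assumed to already exist; this is an illustrative sketch, not code from the article:

```java
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.params.OutputConfiguration;
import android.hardware.camera2.params.SessionConfiguration;
import android.view.Surface;

import java.util.Arrays;
import java.util.concurrent.Executor;

class SessionSetup {
    // previewSurface/jpegSurface are assumed to be valid, already-created Surfaces.
    static void startSession(CameraDevice device, Surface previewSurface, Surface jpegSurface,
            Executor executor, CameraCaptureSession.StateCallback callback)
            throws CameraAccessException {
        // Each app Surface is wrapped into an OutputConfiguration, the parcelable
        // form that is sent across binder to the camera service.
        OutputConfiguration previewConfig = new OutputConfiguration(previewSurface);
        OutputConfiguration jpegConfig = new OutputConfiguration(jpegSurface);

        SessionConfiguration sessionConfig = new SessionConfiguration(
                SessionConfiguration.SESSION_REGULAR,
                Arrays.asList(previewConfig, jpegConfig),
                executor, callback);

        // Internally this ends up in CameraDeviceImpl#createCaptureSessionInternal.
        device.createCaptureSession(sessionConfig);
    }
}
```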

createCaptureSessionInternal does three main things: 1) check the camera state and whether a session has already been created; if one exists, reset (tear down) that session; 2) create and configure the camera streams through configureStreamsChecked; 3) create the appropriate CameraCaptureSession object depending on the isConstrainedHighSpeed flag; the session constructor then reports the session object and its state through the corresponding callbacks based on the stream configuration result. A simplified sketch of this flow follows the file references below.

/frameworks/base/core/java/android/hardware/camera2/impl/CameraDeviceImpl.java

/frameworks/base/core/java/android/hardware/camera2/impl/CameraCaptureSessionImpl.java
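The three steps map onto the rough shape of CameraDeviceImpl#createCaptureSessionInternal. The sketch below is a hedged, simplified paraphrase of that flow, not the verbatim AOSP source: constructor arguments and error handling are elided, and it is not meant to compile on its own.

```java
// Simplified paraphrase of CameraDeviceImpl#createCaptureSessionInternal (not verbatim).
private void createCaptureSessionInternal(InputConfiguration inputConfig,
        List<OutputConfiguration> outputs, CameraCaptureSession.StateCallback callback,
        Executor executor, int operatingMode, CaptureRequest sessionParams)
        throws CameraAccessException {
    synchronized (mInterfaceLock) {
        // 1) Check the device state; if a session already exists, tear it down first.
        checkIfCameraClosedOrInError();
        if (mCurrentSession != null) {
            mCurrentSession.replaceSessionClose();
        }

        // 2) (Re)create and configure the camera streams.
        boolean configureSuccess =
                configureStreamsChecked(inputConfig, outputs, operatingMode, sessionParams);

        // 3) Build the session object. Its constructor reports the result back to the
        //    app through the StateCallback, based on configureSuccess.
        boolean isConstrainedHighSpeed =
                (operatingMode == ICameraDeviceUser.CONSTRAINED_HIGH_SPEED_MODE);
        CameraCaptureSessionCore newSession = isConstrainedHighSpeed
                ? new CameraConstrainedHighSpeedCaptureSessionImpl(/* ... */)
                : new CameraCaptureSessionImpl(/* ... */);
        mCurrentSession = newSession;
    }
}
```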

Next, the configureStreamsChecked flow. This step resets the outstanding camera stream capture requests and then recreates and reconfigures the camera streams (input and output) from the surface parameters passed by the application. It breaks down into four steps: 1) diff the locally cached configuration against the new one to work out which camera streams need to be removed; 2) stop and drain the outstanding capture requests at the lower layers, preparing for the reconfiguration; 3) based on the result of step 1, update the set of camera streams; 4) call endConfigure to finish the camera stream configuration.
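The four steps map onto the rough shape of CameraDeviceImpl#configureStreamsChecked. Below is a hedged, simplified paraphrase of that skeleton (fields such as mConfiguredOutputs and mRemoteDevice are framework members; error paths and input-stream handling are elided, and the delete/create loops of step 3 are sketched separately after the deleteStream and createStream discussions below):

```java
// Simplified paraphrase of CameraDeviceImpl#configureStreamsChecked (not verbatim).
public boolean configureStreamsChecked(InputConfiguration inputConfig,
        List<OutputConfiguration> outputs, int operatingMode, CaptureRequest sessionParams)
        throws CameraAccessException {
    synchronized (mInterfaceLock) {
        // 1) Diff the cached configuration against the requested one.
        HashSet<OutputConfiguration> addSet = new HashSet<>(outputs); // streams to create
        List<Integer> deleteList = new ArrayList<>();                 // streams to remove
        for (int i = 0; i < mConfiguredOutputs.size(); i++) {
            int streamId = mConfiguredOutputs.keyAt(i);
            OutputConfiguration outConfig = mConfiguredOutputs.valueAt(i);
            if (!outputs.contains(outConfig)) {
                deleteList.add(streamId);   // no longer wanted
            } else {
                addSet.remove(outConfig);   // unchanged, keep the existing stream
            }
        }

        // 2) Stop and drain outstanding requests before touching the streams.
        stopRepeating();
        waitUntilIdle();
        mRemoteDevice.beginConfigure();

        // 3) Update the stream set: delete the streams in deleteList and create
        //    the ones remaining in addSet (sketched in the sections below).
        // ...

        // 4) Finish the configuration on the native side.
        mRemoteDevice.endConfigure(operatingMode, /*sessionParams*/ null);
        return true;
    }
}
```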

Next, the deleteStream flow used when updating camera streams. Here mRemoteDevice crosses binder into CameraDeviceClient#deleteStream, which does two things: 1) update the locally cached camera stream list, removing the stream that corresponds to streamId; 2) call Camera3Device#deleteStream, which looks up the stream with that id (deletedStream) in the device's local stream list, calls deletedStream#disconnect, and finally reaches Camera3OutputStream#disconnectLocked, which disconnects the surface and drops the local record of the deleted stream. A framework-side sketch of the trigger follows the file references below.

/frameworks/av/service/camera/libcameraservice/api2/CameraDeviceClient.cpp

/frameworks/av/service/camera/libcameraservice/device3/Camera3Device.cpp

/frameworks/av/service/camera/libcameraservice/device3/Camera3OutputStream.cpp
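On the framework side, this whole native path is driven by one ICameraDeviceUser#deleteStream binder call per removed stream. A hedged sketch of that trigger (simplified from CameraDeviceImpl, not verbatim), with the native handling described above summarized in comments:

```java
// Hedged sketch of the framework-side trigger (CameraDeviceImpl, simplified).
for (int streamId : deleteList) {
    // Binder call into CameraDeviceClient::deleteStream.
    mRemoteDevice.deleteStream(streamId);
    mConfiguredOutputs.delete(streamId);
    // Native side, per the walkthrough above:
    //  1) CameraDeviceClient removes the cached stream info for streamId
    //     from its local stream map;
    //  2) Camera3Device::deleteStream finds the matching stream, calls
    //     disconnect() on it, and Camera3OutputStream::disconnectLocked
    //     finally disconnects the Surface (buffer producer) and drops the
    //     local record of the deleted stream.
}
```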

Next, the createStream flow used when updating camera streams. Here mRemoteDevice crosses binder into CameraDeviceClient#createStream, which does four things: 1) check the calling application's pid/state, the constraints between surfaces and streams, and the validity of the camera physical id; 2) check whether the buffer producer referenced by the outputConfiguration parameter has already been created, then call createSurfaceFromGbp to create the camera surface that best matches the outputConfiguration and cache it locally; 3) validate the attributes of the stream to be created, call Camera3Device#createStream to actually create it, and record the outputConfiguration and the created surface against the new camera stream id; 4) call setStreamTransformLocked to apply the stream's orientation transform and return the camera stream id to the framework. A framework-side sketch of the trigger follows the file references below.

/frameworks/av/service/camera/libcameraservice/api2/CameraDeviceClient.cpp

/frameworks/av/service/camera/libcameraservice/utils/SessionConfigurationUtils.cpp

/frameworks/av/service/camera/libcameraservice/device3/Camera3Device.cpp
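Again, the framework side triggers this path with a single binder call per new stream, ICameraDeviceUser#createStream, and records the returned stream id. A hedged sketch (simplified from CameraDeviceImpl, not verbatim), with the native-side steps summarized in comments:

```java
// Hedged sketch of the framework-side trigger (CameraDeviceImpl, simplified).
for (OutputConfiguration outConfig : outputs) {
    if (!addSet.contains(outConfig)) continue;   // already configured, skip
    // Binder call into CameraDeviceClient::createStream; the new stream id
    // is handed back to the framework.
    int streamId = mRemoteDevice.createStream(outConfig);
    mConfiguredOutputs.put(streamId, outConfig);
    // Native side, per the walkthrough above:
    //  1) validate the caller's pid/state, the surface-to-stream limits and
    //     the physical camera id;
    //  2) createSurfaceFromGbp builds the camera surface that best matches
    //     this OutputConfiguration and caches it;
    //  3) Camera3Device::createStream creates the stream, and the client keeps
    //     the OutputConfiguration and surface keyed by the new stream id;
    //  4) setStreamTransformLocked applies the stream's orientation transform.
}
```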

Once the camera streams have been created, mRemoteDevice#endConfigure crosses binder into CameraDeviceClient#endConfigure and finally reaches Camera3Device#configureStreams to complete the stream configuration. configureStreams obtains the default session parameters through filterParamsAndConfigureLocked and then calls configureStreamsLocked to do the actual work.

configureStreamsLocked first pauses Camera3Device's PreparerThread, then calls inputStream/outputStream#startConfiguration on the streams created earlier and fills in the camera_stream_configuration parameters. Next it calls mInterface#configureStreams, which configures the HAL-side streams through the previously created cameraDeviceSession; it then calls inputStream/outputStream#finishConfiguration, which ends up in Camera3OutputStream#configureConsumerQueueLocked to initialise each stream's surface parameters; finally it restarts the PreparerThread, which loops waiting for streams in mPendingStreams and performs buffer allocation for them. At this point the createCaptureSession analysis is complete: the framework has created the CameraCaptureSession, the native/HAL layers have created and configured the streams, and the PreparerThread has been started, waiting for further initialisation work.

/frameworks/av/service/camera/libcameraservice/device3/Camera3Device.cpp

/hardware/interfaces/camera/device/3.2/default/CameraDeviceSession.cpp

/frameworks/av/service/camera/libcameraservice/device3/Camera3OutputStream.cpp

/frameworks/av/service/camera/libcameraservice/device3/Camera3Device.cpp
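As a side note, the PreparerThread mentioned above is also what backs the public CameraCaptureSession#prepare API: an app can ask for a stream's buffers to be pre-allocated before the first request targets it, and completion is reported through StateCallback#onSurfacePrepared. A minimal, hedged app-side example (the session and surface are assumed to exist; this is an illustration, not code from the article):

```java
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.view.Surface;

class PrepareExample {
    // Pre-allocate buffers for 'surface' before the first request targets it.
    // The allocation work is performed on the camera service side (PreparerThread);
    // completion is reported to the session's StateCallback#onSurfacePrepared.
    static void preAllocate(CameraCaptureSession session, Surface surface)
            throws CameraAccessException {
        session.prepare(surface);
    }

    // In the StateCallback passed to createCaptureSession:
    // @Override
    // public void onSurfacePrepared(CameraCaptureSession session, Surface surface) {
    //     // Buffers for 'surface' are now allocated; it is cheap to start using it.
    // }
}
```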

Below is a simple example of creating multiple CaptureSessions with the Camera2 API:

```java
import android.app.Activity;
import android.content.Context;
import android.graphics.ImageFormat;
import android.graphics.SurfaceTexture;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.CaptureRequest;
import android.hardware.camera2.params.StreamConfigurationMap;
import android.media.Image;
import android.media.ImageReader;
import android.os.Bundle;
import android.util.Size;
import android.view.LayoutInflater;
import android.view.Surface;
import android.view.TextureView;
import android.view.View;
import android.view.ViewGroup;
import android.widget.Toast;

import androidx.annotation.NonNull;
import androidx.annotation.Nullable;
import androidx.fragment.app.Fragment;

import java.util.Arrays;

public class CameraFragment extends Fragment {

    private CameraDevice mCameraDevice;
    private CameraCaptureSession mPreviewSession;
    private CameraCaptureSession mImageSession;
    private Size mPreviewSize;
    private Size mImageSize;
    private ImageReader mImageReader;
    // Keep the preview Surface so the same object is used for both session
    // configuration and the repeating preview request.
    private Surface mPreviewSurface;

    private final CameraDevice.StateCallback mStateCallback = new CameraDevice.StateCallback() {
        @Override
        public void onOpened(@NonNull CameraDevice cameraDevice) {
            mCameraDevice = cameraDevice;
            createPreviewSession();
            createImageSession();
        }

        @Override
        public void onDisconnected(@NonNull CameraDevice cameraDevice) {
            cameraDevice.close();
            mCameraDevice = null;
        }

        @Override
        public void onError(@NonNull CameraDevice cameraDevice, int error) {
            cameraDevice.close();
            mCameraDevice = null;
            Activity activity = getActivity();
            if (null != activity) {
                activity.finish();
            }
        }
    };

    private final CameraCaptureSession.StateCallback mPreviewSessionCallback =
            new CameraCaptureSession.StateCallback() {
        @Override
        public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
            mPreviewSession = cameraCaptureSession;
            updatePreview();
        }

        @Override
        public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
            Activity activity = getActivity();
            if (null != activity) {
                Toast.makeText(activity, "Failed", Toast.LENGTH_SHORT).show();
            }
        }
    };

    private final CameraCaptureSession.StateCallback mImageSessionCallback =
            new CameraCaptureSession.StateCallback() {
        @Override
        public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
            mImageSession = cameraCaptureSession;
        }

        @Override
        public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
            Activity activity = getActivity();
            if (null != activity) {
                Toast.makeText(activity, "Failed", Toast.LENGTH_SHORT).show();
            }
        }
    };

    private final ImageReader.OnImageAvailableListener mOnImageAvailableListener =
            new ImageReader.OnImageAvailableListener() {
        @Override
        public void onImageAvailable(ImageReader reader) {
            Image image = reader.acquireLatestImage();
            if (image != null) {
                // Process the captured image
                image.close();
            }
        }
    };

    @Nullable
    @Override
    public View onCreateView(LayoutInflater inflater, @Nullable ViewGroup container,
            @Nullable Bundle savedInstanceState) {
        return inflater.inflate(R.layout.fragment_camera, container, false);
    }

    @Override
    public void onViewCreated(View view, @Nullable Bundle savedInstanceState) {
        super.onViewCreated(view, savedInstanceState);
        // Initialize the camera
        openCamera();
    }

    @Override
    public void onResume() {
        super.onResume();
        // Restart the preview session
        if (null != mCameraDevice) {
            createPreviewSession();
        }
    }

    @Override
    public void onPause() {
        closeCamera();
        super.onPause();
    }

    private void openCamera() {
        Activity activity = getActivity();
        if (null == activity || activity.isFinishing()) {
            return;
        }
        CameraManager manager = (CameraManager) activity.getSystemService(Context.CAMERA_SERVICE);
        try {
            String cameraId = manager.getCameraIdList()[0];
            CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
            StreamConfigurationMap map = characteristics.get(
                    CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
            mPreviewSize = map.getOutputSizes(SurfaceTexture.class)[0];
            mImageSize = map.getOutputSizes(ImageFormat.JPEG)[0];
            mImageReader = ImageReader.newInstance(mImageSize.getWidth(), mImageSize.getHeight(),
                    ImageFormat.JPEG, /*maxImages*/ 2);
            mImageReader.setOnImageAvailableListener(mOnImageAvailableListener, null);
            // Assumes the CAMERA runtime permission has already been granted.
            manager.openCamera(cameraId, mStateCallback, null);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    private void closeCamera() {
        if (null != mPreviewSession) {
            mPreviewSession.close();
            mPreviewSession = null;
        }
        if (null != mImageSession) {
            mImageSession.close();
            mImageSession = null;
        }
        if (null != mCameraDevice) {
            mCameraDevice.close();
            mCameraDevice = null;
        }
        if (null != mImageReader) {
            mImageReader.close();
            mImageReader = null;
        }
    }

    private void createPreviewSession() {
        try {
            SurfaceTexture texture = getSurfaceTexture();
            if (null == texture) {
                return;
            }
            texture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
            mPreviewSurface = new Surface(texture);
            CaptureRequest.Builder builder =
                    mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
            builder.addTarget(mPreviewSurface);
            mCameraDevice.createCaptureSession(Arrays.asList(mPreviewSurface),
                    mPreviewSessionCallback, null);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    private void createImageSession() {
        try {
            Surface surface = mImageReader.getSurface();
            CaptureRequest.Builder builder =
                    mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
            builder.addTarget(surface);
            mCameraDevice.createCaptureSession(Arrays.asList(surface),
                    mImageSessionCallback, null);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    private void updatePreview() {
        if (null == mCameraDevice || null == mPreviewSurface) {
            return;
        }
        try {
            CaptureRequest.Builder builder =
                    mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
            // Reuse the Surface that was configured into the preview session.
            builder.addTarget(mPreviewSurface);
            mPreviewSession.setRepeatingRequest(builder.build(), null, null);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    private SurfaceTexture getSurfaceTexture() {
        Activity activity = getActivity();
        if (null == activity) {
            return null;
        }
        TextureView textureView = activity.findViewById(R.id.texture_view);
        return textureView.getSurfaceTexture();
    }
}
```

In this example we create two CaptureSessions: one for preview and one for still capture. An ImageReader captures JPEG images, which are handled in mOnImageAvailableListener. openCamera() initialises the ImageReader and calls manager.openCamera(); once the device is opened, onOpened() creates the two sessions. createPreviewSession() obtains the SurfaceTexture, wraps it in a Surface, adds it to a CaptureRequest.Builder and calls mCameraDevice.createCaptureSession() to create the preview session; createImageSession() does the same with the ImageReader's Surface for the still-capture session. updatePreview() reuses the preview Surface, builds a repeating preview request and calls mPreviewSession.setRepeatingRequest() to start the preview.
