An automotive head-unit platform, built on a Qualcomm SoC running Android P. The low-level data path was brought up by the chip vendor, which ported Android's V4L2 camera code.
Path of the stock AOSP V4L2 code: hardware\libhardware\modules\camera\3_4
The SoC vendor reused most of that code, with a small number of modifications, to implement the camera feature.
The problem: while the camera test app runs, the camera HAL crashes intermittently, roughly 50% of the time.
The debugging steps are recorded below.
Walking through the camera flow
The call path is roughly as follows.
Two camera-related processes exist in the system:
- cameraserver
- android.hardware.camera.provider@2.4-service
The camera app goes through the cameraManager API and makes a binder call into the cameraserver process, a native camera service. cameraserver then calls through CameraProviderManager into the HAL process android.hardware.camera.provider@2.4-service.
On the HAL side, CameraProvider dlopen()s camera.v4l2.so during initialization. The blue part in the diagram above is the OEM implementation.
Crash stack analysis
09-24 09:22:44.657 413 5032 F libc : Fatal signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 0x14 in tid 5032 (Dequeue buffers), pid 413 (provider@2.4-se)
09-24 09:22:44.839 5047 5047 F DEBUG : *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** ***
09-24 09:22:44.839 5047 5047 F DEBUG : Build fingerprint: 'qti/msmnile_gvmq/msmnile_gvmq:9/PQ1A.190105.004/uzhay34509091133:userdebug/test-keys'
09-24 09:22:44.839 5047 5047 F DEBUG : Revision: '0'
09-24 09:22:44.839 5047 5047 F DEBUG : ABI: 'arm'
09-24 09:22:44.839 5047 5047 F DEBUG : pid: 413, tid: 5032, name: Dequeue buffers >>> /vendor/bin/hw/android.hardware.camera.provider@2.4-service <<<
09-24 09:22:44.839 5047 5047 F DEBUG : signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 0x14
09-24 09:22:44.839 5047 5047 F DEBUG : Cause: null pointer dereference
09-24 09:22:44.839 5047 5047 F DEBUG : r0 00000000 r1 00000000 r2 00000001 r3 00000000
09-24 09:22:44.839 5047 5047 F DEBUG : r4 f15a2cdc r5 f15ce0b4 r6 f15a2cc0 r7 efe018fc
09-24 09:22:44.839 5047 5047 F DEBUG : r8 efe01928 r9 efe01924 r10 f1595b30 r11 f1595b3c
09-24 09:22:44.839 5047 5047 F DEBUG : ip f1c39244 sp efe017e0 lr f1c17243 pc f0566f48
09-24 09:22:44.862 5047 5047 F DEBUG :
09-24 09:22:44.862 5047 5047 F DEBUG : backtrace:
09-24 09:22:44.862 5047 5047 F DEBUG : #00 pc 00017f48 /vendor/lib/hw/camera.v4l2.so (v4l2_camera_hal::V4L2Wrapper::DequeueRequest(std::__1::shared_ptr<default_camera_hal::CaptureRequest>*)+148)
09-24 09:22:44.862 5047 5047 F DEBUG : #01 pc 0000ca5b /vendor/lib/hw/camera.v4l2.so (v4l2_camera_hal::V4L2Camera::dequeueRequestBuffers()+42)
09-24 09:22:44.862 5047 5047 F DEBUG : #02 pc 0000c0cb /system/lib/vndk-sp-28/libutils.so (android::Thread::_threadLoop(void*)+286)
09-24 09:22:44.862 5047 5047 F DEBUG : #03 pc 00063c85 /system/lib/libc.so (__pthread_start(void*)+22)
09-24 09:22:44.862 5047 5047 F DEBUG : #04 pc 0001e085 /system/lib/libc.so (__start_thread+22)
What? A null pointer dereference, crashing inside camera.v4l2.so. Let's translate the addresses with addr2line.
Code at address 00017f48:
uzhay345@cn1690vdi2034:~/8155_es15$ ./prebuilts/gcc/linux-x86/aarch64/aarch64-linux-android-4.9/bin/aarch64-linux-android-addr2line -e out/target/product/msmnile_gvmq/symbols/vendor/lib/hw/camera.v4l2.so -a 00017f48
0x00017f48
external/libcxx/include/vector:1504
vector itself should be fine; the plausible causes are:
1. A pointer that points to a vector is null.
2. A pointer that points to an object containing a vector member is null.
Next, the code at address 0000ca5b:
uzhay345@cn1690vdi2034:~/8155_es15$ ./prebuilts/gcc/linux-x86/aarch64/aarch64-linux-android-4.9/bin/aarch64-linux-android-addr2line -e out/target/product/msmnile_gvmq/symbols/vendor/lib/hw/camera.v4l2.so -a 0000ca5b
0x0000ca5b
vendor/qcom/proprietary/ais/hal/v4l2_camera_hal/v4l2_camera.cpp:289
Here is v4l2_camera.cpp (excerpt):
bool V4L2Camera::dequeueRequestBuffers() {
  HAL_LOGV("v4l2api::dequeueRequestBuffers");
  // Dequeue a buffer.
  std::shared_ptr<default_camera_hal::CaptureRequest> request;
  int res;
  {
    std::unique_lock<std::mutex> lock(in_flight_lock_);
    res = device_->DequeueRequest(&request);
    if (!res) {
      if (request) {
        completeRequest(request, res);
        in_flight_buffer_count_--;
      }
      return true;
    }
  }
  if (res == -EAGAIN) {
    // EAGAIN just means nothing to dequeue right now.
    // Wait until something is available before looping again.
    std::unique_lock<std::mutex> lock(in_flight_lock_);
    while (in_flight_buffer_count_ == 0) {
      buffers_in_flight_.wait(lock);
    }
  } else {
    HAL_LOGE("Device failed to dequeue buffer: %d", res);
  }
  return true;
}
Line 289 is:
res = device_->DequeueRequest(&request);
So the null pointer dereference surfaced inside the request-dequeue function; exactly where it crashed still needs analysis. Next, the log of android.hardware.camera.provider@2.4-service:
130|console:/ # logcat -b all --pid 413
--------- beginning of events
09-24 09:22:44.489 413 413 I auditd : type=1400 audit(0.0:747): avc: denied { read } for comm="HwBindject_r:default_prop:s0" dev="tmpfs" ino=9280 scontext=u:r:hal_camera_default:s0 tcontext=u:object_r:defaule permissive=1
--------- beginning of main
09-24 09:22:44.489 413 413 I HwBinder:413_1: type=1400 audit(0.0:747): avc: denied { read } for name="rop:s0" dev="tmpfs" ino=9280 scontext=u:r:hal_camera_default:s0 tcontext=u:object_r:default_prop:s0 tclass
09-24 09:22:44.489 413 413 I auditd : type=1400 audit(0.0:748): avc: denied { open } for comm="HwBind/__properties__/u:object_r:default_prop:s0" dev="tmpfs" ino=9280 scontext=u:r:hal_camera_default:s0 tcontet_prop:s0 tclass=file permissive=1
09-24 09:22:44.489 413 413 I HwBinder:413_1: type=1400 audit(0.0:748): avc: denied { open } for path="u:object_r:default_prop:s0" dev="tmpfs" ino=9280 scontext=u:r:hal_camera_default:s0 tcontext=u:object_r:de=file permissive=1
09-24 09:22:44.489 413 413 I auditd : type=1400 audit(0.0:749): avc: denied { getattr } for comm="HwBdev/__properties__/u:object_r:default_prop:s0" dev="tmpfs" ino=9280 scontext=u:r:hal_camera_default:s0 tcoault_prop:s0 tclass=file permissive=1
09-24 09:22:44.489 413 413 I HwBinder:413_1: type=1400 audit(0.0:749): avc: denied { getattr } for pat__/u:object_r:default_prop:s0" dev="tmpfs" ino=9280 scontext=u:r:hal_camera_default:s0 tcontext=u:object_rass=file permissive=1
09-24 09:22:44.489 413 413 I auditd : type=1400 audit(0.0:750): avc: denied { map } for comm="HwBinde__properties__/u:object_r:default_prop:s0" dev="tmpfs" ino=9280 scontext=u:r:hal_camera_default:s0 tcontex_prop:s0 tclass=file permissive=1
09-24 09:22:44.489 413 413 I HwBinder:413_1: type=1400 audit(0.0:750): avc: denied { map } for path="/:object_r:default_prop:s0" dev="tmpfs" ino=9280 scontext=u:r:hal_camera_default:s0 tcontext=u:object_r:deffile permissive=1
09-24 09:22:44.490 413 729 D Camera : openDevice:0:Opening camera device
09-24 09:22:44.490 413 729 E Camera : initialize:0: callback_ops=0xeff11004
09-24 09:22:44.504 413 5033 E Camera : constructDefaultRequestSettings:0: type=1
09-24 09:22:44.508 413 5033 E Camera : configureStreams:0: stream_config=0xefb01538
09-24 09:22:44.508 413 5033 E Camera : validateStreamConfiguration:validateStreamConfiguration
09-24 09:22:44.508 413 5033 E V4L2CameraHAL: setupStreams:367: (stream 0 is format 17, width 1920, heig format 33, width 1920, height 1080).
09-24 09:22:44.508 413 5033 I Adreno-GSL_RPC: <gsl_library_open:1467>: library open -- refcount=1
09-24 09:22:44.499 413 413 I auditd : type=1400 audit(0.0:751): avc: denied { read write } for comm="="hab" dev="tmpfs" ino=11008 scontext=u:r:hal_camera_default:s0 tcontext=u:object_r:hab_device:s0 tclass=c
09-24 09:22:44.499 413 413 I HwBinder:413_2: type=1400 audit(0.0:751): avc: denied { read write } for s" ino=11008 scontext=u:r:hal_camera_default:s0 tcontext=u:object_r:hab_device:s0 tclass=chr_file permissi
09-24 09:22:44.499 413 413 I auditd : type=1400 audit(0.0:752): avc: denied { open } for comm="HwBind/hab" dev="tmpfs" ino=11008 scontext=u:r:hal_camera_default:s0 tcontext=u:object_r:hab_device:s0 tclass=ch
09-24 09:22:44.499 413 413 I HwBinder:413_2: type=1400 audit(0.0:752): avc: denied { open } for path="" ino=11008 scontext=u:r:hal_camera_default:s0 tcontext=u:object_r:hab_device:s0 tclass=chr_file permissiv
09-24 09:22:44.499 413 413 I auditd : type=1400 audit(0.0:753): avc: denied { ioctl } for comm="HwBinv/hab" dev="tmpfs" ino=11008 ioctlcmd=0xa04 scontext=u:r:hal_camera_default:s0 tcontext=u:object_r:hab_devle permissive=1
09-24 09:22:44.499 413 413 I HwBinder:413_2: type=1400 audit(0.0:753): avc: denied { ioctl } for path=s" ino=11008 ioctlcmd=0xa04 scontext=u:r:hal_camera_default:s0 tcontext=u:object_r:hab_device:s0 tclass=ch
09-24 09:22:44.508 413 5033 I Adreno-GSL_RPC: <gsl_rpc_connect:1080>: connecting using conn_id 0
09-24 09:22:44.510 413 5033 I uhab : habmm_socket_open: opened fd 11, refcnt 1, return 0, vcid 19100
09-24 09:22:44.510 413 5033 I Adreno-GSL_RPC: <rpc_sockfd_create:580>: gsl_rpc_initialize: opening init
09-24 09:22:44.510 413 5033 I Adreno-GSL_RPC: <rpc_handshake:1029>: client process name is /vendor/bin/camera.provider@2.4-service(android.hardware.camera.provider@2.4-service)
09-24 09:22:44.511 413 5033 I Adreno-GSL_RPC: <rpc_handshake:1041>: client successfully connected to se id 7
09-24 09:22:44.511 413 5033 I uhab : habmm_socket_close: close fd 11, refcnt 0, vcid 19100011
09-24 09:22:44.511 413 5033 I Adreno-GSL_RPC: <gsl_rpc_connect:1080>: connecting using conn_id 7
09-24 09:22:44.520 413 5033 I uhab : habmm_socket_open: opened fd 11, refcnt 1, return 0, vcid 19100
09-24 09:22:44.521 413 5033 I Adreno-GSL_RPC: <rpc_sockfd_create:589>: gsl_rpc_initialize: connecting f
09-24 09:22:44.521 413 5033 I Adreno-GSL_RPC: <rpc_sub_handshake:1068>: thread successfully connected t
09-24 09:22:44.521 413 5033 I Adreno-GSL_RPC: <gsl_rpc_initialize:1205>: using /data/misc/gpu/gsl_rpc_mmempool settings
09-24 09:22:44.547 413 5039 I Adreno-GSL_RPC: <gsl_rpc_connect:1080>: connecting using conn_id 7
09-24 09:22:44.554 413 5039 I uhab : habmm_socket_open: opened fd 11, refcnt 2, return 0, vcid 19100
09-24 09:22:44.554 413 5039 I Adreno-GSL_RPC: <rpc_sockfd_create:589>: gsl_memory_alloc_rpc: connecting
09-24 09:22:44.555 413 5039 I Adreno-GSL_RPC: <rpc_sub_handshake:1068>: thread successfully connected t
09-24 09:22:44.598 413 5039 W Adreno-GSL_RPC: <gsl_dbq_create:2798>: HGSL:gsl_dbq_create failed, ret: -
09-24 09:22:44.611 413 5039 I Adreno-GSL_RPC: <gsl_rpc_disconnect:1109>: send disconnect command to the
09-24 09:22:44.612 413 5039 I uhab : habmm_socket_close: skip close fd 11, refcnt 1, vcid 19100013
09-24 09:22:44.629 413 729 E Camera : processCaptureRequest: processCaptureRequest
09-24 09:22:44.629 413 729 E Camera : preprocessCaptureBuffer called
09-24 09:22:44.629 413 729 E Camera : preprocessCaptureBuffer called
09-24 09:22:44.657 413 5032 E V4L2CameraHAL: DequeueRequest:1095: Failed to map output frame.
09-24 09:22:44.657 413 5032 E V4L2CameraHAL: dequeueRequestBuffers:307: Device failed to dequeue buffer
--------- beginning of crash
09-24 09:22:44.657 413 5032 F libc : Fatal signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 0x1ue buffers), pid 413 (provider@2.4-se)
Look at the last two log lines:
09-24 09:22:44.657 413 5032 E V4L2CameraHAL: DequeueRequest:1095: Failed to map output frame.
09-24 09:22:44.657 413 5032 E V4L2CameraHAL: dequeueRequestBuffers:307: Device failed to dequeue buffer
The first one is printed inside res = device_->DequeueRequest(&request), by this line:
HAL_LOGE("Failed to map output frame.");
v4l2_wrapper.cpp:
int V4L2Wrapper::DequeueRequest(std::shared_ptr<CaptureRequest>* request) {
  v4l2_buffer buffer;
  const camera3_stream_buffer_t* stream_preview = {0};
  const camera3_stream_buffer_t* stream_buffer = {0};
  uint32_t fourcc = 0;
  uint32_t fourcc_snapsht_videorec = 0;
  memset(&buffer, 0, sizeof(buffer));
  ......
  std::lock_guard<std::mutex> guard(buffer_queue_lock_);
  RequestContext* request_context = &buffers_[buffer.index];
  if (request_context->active == false) {
    HAL_LOGE("Request context fails\n");
    return -EAGAIN;
  }
  // Lock the camera stream buffer for painting.
  stream_preview = &request_context->request->output_buffers[PREVIEW_BUF_IDX];
  fourcc = StreamFormat::HalToV4L2PixelFormat(stream_preview->stream->format);
  if (request_context->request->output_buffers.size() == 2) {
    stream_buffer = &request_context->request->output_buffers[SNAPSHOT_BUF_IDX];
  }
  if ((stream_buffer != NULL && stream_buffer->stream != NULL) &&
      (stream_buffer->stream->format == HAL_PIXEL_FORMAT_BLOB ||
       stream_buffer->stream->format == HAL_PIXEL_FORMAT_RGBA_8888)) {
    fourcc_snapsht_videorec = StreamFormat::HalToV4L2PixelFormat(stream_buffer->stream->format);
    HAL_LOGV("Second buffer format is %d\n", stream_buffer->stream->format);
  }
  if (request) {
    *request = request_context->request;
  }
  // Note that the device buffer length is passed to the output frame. If the
  // GrallocFrameBuffer does not have support for the transformation to
  // |fourcc|, it will assume that the amount of data to lock is based on
  // |buffer.length|, otherwise it will use the ImageProcessor::ConvertedSize.
  arc::GrallocFrameBuffer output_frame(
      *stream_preview->buffer, stream_preview->stream->width,
      stream_preview->stream->height, fourcc, buffer.length,
      stream_preview->stream->usage);
  res = output_frame.Map();
  if (res) {
    HAL_LOGE("Failed to map output frame.");
    request_context->request.reset();
    return -EINVAL;
  }
  ......
  request_context->request.reset();
  // Mark the buffer as not in flight.
  request_context->active = false;
  return 0;
}
So an error case was hit; but since the error is already checked for, where does the null pointer come from?
Per the android.hardware.camera.provider@2.4-service log, the two error lines are immediately followed by "--------- beginning of crash". In other words, the null dereference happens right after bool V4L2Camera::dequeueRequestBuffers() finishes. Next, where is this function called from?
v4l2_camera.cpp:
V4L2Camera::V4L2Camera(int id,
                       std::shared_ptr<V4L2Wrapper> v4l2_wrapper,
                       std::unique_ptr<Metadata> metadata)
    : default_camera_hal::Camera(id),
      device_(std::move(v4l2_wrapper)),
      metadata_(std::move(metadata)),
      max_input_streams_(0),
      max_output_streams_({{0, 0, 0}}),
      buffer_enqueuer_(new FunctionThread(
          std::bind(&V4L2Camera::enqueueRequestBuffers, this))),
      buffer_dequeuer_(new FunctionThread(
          std::bind(&V4L2Camera::dequeueRequestBuffers, this))) {
  HAL_LOG_ENTER();
}
So it is a thread: V4L2Camera::dequeueRequestBuffers is called in a loop as the consumer side. One thread keeps enqueuing requests while another keeps dequeuing and handling them. The previous iteration took the error path, and the next call to dequeueRequestBuffers then hit the null pointer.
Adding some logging pinned the null dereference down to this line:
// Lock the camera stream buffer for painting.
stream_preview = &request_context->request->output_buffers[PREVIEW_BUF_IDX];
Because the previous iteration of the thread had entered the error path:
if (res) {
  HAL_LOGE("Failed to map output frame.");
  request_context->request.reset();
  return -EINVAL;
}
the std::shared_ptr request_context->request was reset and its object destroyed. On the next iteration request_context->request is null; with no guarding check, reaching straight for request's output_buffers crashes. And since request->output_buffers happens to be a vector, the crash address resolves to external/libcxx/include/vector:1504.
With the cause identified, the fix adds a null check:
v4l2_wrapper.cpp:
int V4L2Wrapper::DequeueRequest(std::shared_ptr<CaptureRequest>* request) {
  ......
  // fix the null pointer seg fault.
  if (request_context->request == nullptr || request_context->request->output_buffers.empty()) {
    HAL_LOGE("request_context->request is null or request_context->request->output_buffers is empty\n");
    return -EAGAIN;
  }
  // Lock the camera stream buffer for painting.
  stream_preview = &request_context->request->output_buffers[PREVIEW_BUF_IDX];
  ......
}
On the next run, sure enough, the HAL process android.hardware.camera.provider@2.4-service no longer crashes.
Fixing the error case
At this point the bug is only half fixed: the service survives, but the camera picture still intermittently fails to appear.
By the code's logic, res = output_frame.Map(); should succeed in the normal case.
// Note that the device buffer length is passed to the output frame. If the
// GrallocFrameBuffer does not have support for the transformation to
// |fourcc|, it will assume that the amount of data to lock is based on
// |buffer.length|, otherwise it will use the ImageProcessor::ConvertedSize.
arc::GrallocFrameBuffer output_frame(
    *stream_preview->buffer, stream_preview->stream->width,
    stream_preview->stream->height, fourcc, buffer.length,
    stream_preview->stream->usage);
res = output_frame.Map();
if (res) {
  HAL_LOGE("Failed to map output frame.");
  request_context->request.reset();
  return -EINVAL;
}
Next, let's dig into output_frame.Map() to see why it returns an error:
frame_buffer.cpp:
int GrallocFrameBuffer::Map() {
  base::AutoLock l(lock_);
  if (is_mapped_) {
    LOGF(ERROR) << "The buffer is already mapped";
    return -EINVAL;
  }
  void* addr = NULL;
  int ret = 0;
  switch (fourcc_) {
    case V4L2_PIX_FMT_YUV420:
    case V4L2_PIX_FMT_YVU420:
    case V4L2_PIX_FMT_YUYV: {
      ......
      else
        return -EINVAL;
      break;
    }
    case V4L2_PIX_FMT_RGB32:
    case V4L2_PIX_FMT_BGR32: {
      ......
      {
        LOGF(ERROR) << "In GRALLOC1_FUNCTION_LOCK failed";
        free(ppBaseAddress_rec);
        return -EINVAL;
      }
      ......
      else
        return -EINVAL;
      break;
    }
    default:
      return -EINVAL;
  }
  data_ = static_cast<uint8_t*>(addr);
  if (fourcc_ == V4L2_PIX_FMT_YVU420 || fourcc_ == V4L2_PIX_FMT_YUV420 ||
      fourcc_ == V4L2_PIX_FMT_NV21 || fourcc_ == V4L2_PIX_FMT_RGB32 ||
      fourcc_ == V4L2_PIX_FMT_BGR32) {
    buffer_size_ = ImageProcessor::GetConvertedSize(fourcc_, width_, height_);
    LOGF(ERROR) << "buffer_size_ optined is " << buffer_size_;
  }
  is_mapped_ = true;
  return 0;
}
By adding logs and diffing a good run against a bad one, it finally emerged that when the camera picture is normal, the fourcc_ variable inside output_frame.Map() is 0x32315559: the ASCII characters 'Y' 'U' '1' '2' shifted and OR-ed together. Looking up the definition, that is V4L2_PIX_FMT_YUV420.
When the camera misbehaves, fourcc_ is 0x4745504a, the ASCII characters 'J' 'P' 'E' 'G', i.e. V4L2_PIX_FMT_JPEG.
videodev2.h:
......
#define V4L2_PIX_FMT_JPEG v4l2_fourcc('J', 'P', 'E', 'G') /* JFIF JPEG */
......
#define V4L2_PIX_FMT_YUV420 v4l2_fourcc('Y', 'U', '1', '2') /* 12 YUV 4:2:0 */
......
So here is the problem: V4L2_PIX_FMT_JPEG is passed in, but int GrallocFrameBuffer::Map() has no case for it, so it takes the error path and returns -EINVAL.
Tracing the code shows fourcc_ is assigned here:
v4l2_wrapper.cpp:
fourcc = StreamFormat::HalToV4L2PixelFormat(stream_preview->stream->format);
That is, the incoming stream_preview->stream->format is mapped to the corresponding V4L2 format value.
So the question becomes:
Good case: stream_preview->stream->format is 0x11, the enum HAL_PIXEL_FORMAT_YCrCb_420_SP.
Bad case: stream_preview->stream->format is 0x21, the enum HAL_PIXEL_FORMAT_BLOB.
After adding plenty more logging and carefully walking through cameraserver and the HAL code, it turned out that:
Good case: V4L2Camera::enqueueRequestBuffers() enqueues with buffers_[0] format 0x11.
Bad case: V4L2Camera::enqueueRequestBuffers() enqueues with buffers_[0] format 0x21.
In effect, the order of the buffers_ member of V4L2Wrapper is reversed. buffers_ is a vector:
v4l2_wrapper.h:
std::vector<RequestContext> buffers_;
Good case: buffers_[0] has format 0x11, buffers_[1] has format 0x21.
Bad case: buffers_[0] has format 0x21, buffers_[1] has format 0x11.
Why on earth does the vector's order flip? Reading the source more carefully revealed:
The buffers_ vector actually maps to the two Surfaces handed down from the app layer. In my camera test demo, the CaptureRequest has two targets: a preview stream for the live picture and a snap stream for taking photos. The two surfaces have different formats: one is a raw surface, the other a JPEG image. The code looks roughly like this:
private void createCameraPreviewSession() {
    try {
        SurfaceTexture texture = mTextureView.getSurfaceTexture();
        assert texture != null;
        // We configure the size of default buffer to be the size of camera preview we want.
        texture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
        // This is the output Surface we need to start preview.
        Surface surface = new Surface(texture);
        // We set up a CaptureRequest.Builder with the output Surface.
        mPreviewRequestBuilder
                = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
        mPreviewRequestBuilder.addTarget(surface); // surface is the live preview stream
        mPreviewRequestBuilder.addTarget(mImageReader.getSurface()); // capture stream; mImageReader's format is JPEG
        // Here, we create a CameraCaptureSession for camera preview.
        mCameraDevice.createCaptureSession(Arrays.asList(surface, mImageReader.getSurface()),
                mCaptureStateCallback, mBackgroundHandler
        );
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}
But, completely unexpectedly, the Android framework stores these two surfaces in an ArraySet container; see:
frameworks\base\core\java\android\hardware\camera2\CaptureRequest.java:
public final class CaptureRequest extends CameraMetadata<CaptureRequest.Key<?>>
        implements Parcelable {
    ......
    private final ArraySet<Surface> mSurfaceSet = new ArraySet<Surface>();
    ......
    public void addTarget(@NonNull Surface outputTarget) {
        mRequest.mSurfaceSet.add(outputTarget);
    }
    ......
}
ArraySet is a hash-based container with no defined order. Once the framework wraps the surfaces into the CaptureRequest, their order is unspecified, and by the time they reach the low-level V4L2 library, the order of buffers_[] is unspecified too.
That makes this piece of low-level logic very questionable:
v4l2_wrapper.cpp:
int V4L2Wrapper::DequeueRequest(std::shared_ptr<CaptureRequest>* request) {
  ......
  // Lock the camera stream buffer for painting.
  stream_preview = &request_context->request->output_buffers[PREVIEW_BUF_IDX];
  ......
}
Here PREVIEW_BUF_IDX is a macro whose value is 0: the dequeue path simply assumes output_buffers[0] is the preview stream. Ouch.
With the root cause found, the fix adds a check: when the formats don't match, swap the two entries to force the expected order. (There are likely better fixes; I'll dig into the optimal solution when time allows.)
v4l2_wrapper.cpp:
int V4L2Wrapper::DequeueRequest(std::shared_ptr<CaptureRequest>* request) {
  ......
  if (request_context->request->output_buffers.size() > 1) {
    uint32_t tmpFormat1 = request_context->request->output_buffers[PREVIEW_BUF_IDX].stream->format;
    uint32_t tmpFormat2 = request_context->request->output_buffers[SNAPSHOT_BUF_IDX].stream->format;
    if (tmpFormat1 != HAL_PIXEL_FORMAT_YCrCb_420_SP && tmpFormat2 == HAL_PIXEL_FORMAT_YCrCb_420_SP) {
      // Formats are swapped: treat output_buffers[1] as the preview stream.
      stream_preview = &request_context->request->output_buffers[SNAPSHOT_BUF_IDX];
      stream_buffer = &request_context->request->output_buffers[PREVIEW_BUF_IDX];
    } else {
      // Keep the original order.
      stream_preview = &request_context->request->output_buffers[PREVIEW_BUF_IDX];
      stream_buffer = &request_context->request->output_buffers[SNAPSHOT_BUF_IDX];
    }
  } else {
    // Only one surface: keep unchanged.
    stream_preview = &request_context->request->output_buffers[PREVIEW_BUF_IDX];
  }
  ......
}
After many more test runs, the camera displays correctly every time. Recording this here as a reminder: vendor code also deserves careful scrutiny, and writing down the fix keeps it from being forgotten.
Camera test demo:
Camera2Demo-master.zip