Raspberry Pi External Camera 2 (libcamera)

1 Introduction

For the current Raspberry Pi camera libraries, the most authoritative introduction is still the official documentation: https://www.raspberrypi.com/documentation/computers/camera_software.html#libcamera

The Raspberry Pi actually has two camera software interfaces: one is rpicam and the other is libcamera. rpicam is the older interface and appears to be Raspberry Pi specific, so it is not worth spending more time on. libcamera, on the other hand, is an open-source library that seems to be getting more and more widely used, so it is well worth learning.

On top of that there is also the picamera2 wrapper, which lowers the learning curve even further.

2 Basic Usage

Its home page is: Welcome — libcamera

A quick look at the overview shows it also provides a GStreamer plugin (the libcamerasrc element).

For the common commands, borrowing an image from: 【活動/社群聚會】Raspberry Pi 社群聚會 #35 會後資料 (Raspberry Pi 新產品介紹) - 台灣樹莓派

The commonly used commands are all shown above, although the number of options looks quite large. Here is the full help output of libcamera-hello:

pi@raspberrypi:~/cam $ libcamera-hello -h
Valid options are:
  -h [ --help ] [=arg(=1)] (=0)         Print this help message
  --version [=arg(=1)] (=0)             Displays the build version number
  --list-cameras [=arg(=1)] (=0)        Lists the available cameras attached to the system.
  --camera arg (=0)                     Chooses the camera to use. To list the available indexes, use the
                                        --list-cameras option.
  -v [ --verbose ] [=arg(=2)] (=1)      Set verbosity level. Level 0 is no output, 1 is default, 2 is verbose.
  -c [ --config ] [=arg(=config.txt)]   Read the options from a file. If no filename is specified, default to
                                        config.txt. In case of duplicate options, the ones provided on the command line
                                        will be used. Note that the config file must only contain the long form
                                        options.
  --info-text arg (=#%frame (%fps fps) exp %exp ag %ag dg %dg)
                                        Sets the information string on the titlebar. Available values:
                                        %frame (frame number)
                                        %fps (framerate)
                                        %exp (shutter speed)
                                        %ag (analogue gain)
                                        %dg (digital gain)
                                        %rg (red colour gain)
                                        %bg (blue colour gain)
                                        %focus (focus FoM value)
                                        %aelock (AE locked status)
                                        %lp (lens position, if known)
                                        %afstate (AF state, if supported)
  --width arg (=0)                      Set the output image width (0 = use default value)
  --height arg (=0)                     Set the output image height (0 = use default value)
  -t [ --timeout ] arg (=5sec)          Time for which program runs. If no units are provided default to ms.
  -o [ --output ] arg                   Set the output file name
  --post-process-file arg               Set the file name for configuring the post-processing
  --post-process-libs arg               Set a custom location for the post-processing library .so files
  -n [ --nopreview ] [=arg(=1)] (=0)    Do not show a preview window
  -p [ --preview ] arg (=0,0,0,0)       Set the preview window dimensions, given as x,y,width,height e.g. 0,0,640,480
  -f [ --fullscreen ] [=arg(=1)] (=0)   Use a fullscreen preview window
  --qt-preview [=arg(=1)] (=0)          Use Qt-based preview window (WARNING: causes heavy CPU load, fullscreen not
                                        supported)
  --hflip [=arg(=1)] (=0)               Request a horizontal flip transform
  --vflip [=arg(=1)] (=0)               Request a vertical flip transform
  --rotation arg (=0)                   Request an image rotation, 0 or 180
  --roi arg (=0,0,0,0)                  Set region of interest (digital zoom) e.g. 0.25,0.25,0.5,0.5
  --shutter arg (=0)                    Set a fixed shutter speed. If no units are provided default to us
  --analoggain arg (=0)                 Set a fixed gain value (synonym for 'gain' option)
  --gain arg                            Set a fixed gain value
  --metering arg (=centre)              Set the metering mode (centre, spot, average, custom)
  --exposure arg (=normal)              Set the exposure mode (normal, sport)
  --ev arg (=0)                         Set the EV exposure compensation, where 0 = no change
  --awb arg (=auto)                     Set the AWB mode (auto, incandescent, tungsten, fluorescent, indoor, daylight,
                                        cloudy, custom)
  --awbgains arg (=0,0)                 Set explict red and blue gains (disable the automatic AWB algorithm)
  --flush [=arg(=1)] (=0)               Flush output data as soon as possible
  --wrap arg (=0)                       When writing multiple output files, reset the counter when it reaches this
                                        number
  --brightness arg (=0)                 Adjust the brightness of the output images, in the range -1.0 to 1.0
  --contrast arg (=1)                   Adjust the contrast of the output image, where 1.0 = normal contrast
  --saturation arg (=1)                 Adjust the colour saturation of the output, where 1.0 = normal and 0.0 =
                                        greyscale
  --sharpness arg (=1)                  Adjust the sharpness of the output image, where 1.0 = normal sharpening
  --framerate arg (=-1)                 Set the fixed framerate for preview and video modes
  --denoise arg (=auto)                 Sets the Denoise operating mode: auto, off, cdn_off, cdn_fast, cdn_hq
  --viewfinder-width arg (=0)           Width of viewfinder frames from the camera (distinct from the preview window
                                        size
  --viewfinder-height arg (=0)          Height of viewfinder frames from the camera (distinct from the preview window
                                        size)
  --tuning-file arg (=-)                Name of camera tuning file to use, omit this option for libcamera default
                                        behaviour
  --lores-width arg (=0)                Width of low resolution frames (use 0 to omit low resolution stream)
  --lores-height arg (=0)               Height of low resolution frames (use 0 to omit low resolution stream)
  --lores-par [=arg(=1)] (=0)           Preserve the pixel aspect ratio of the low res image (where possible) by
                                        applying a different crop on the stream.
  --mode arg                            Camera mode as W:H:bit-depth:packing, where packing is P (packed) or U
                                        (unpacked)
  --viewfinder-mode arg                 Camera mode for preview as W:H:bit-depth:packing, where packing is P (packed)
                                        or U (unpacked)
  --buffer-count arg (=0)               Number of in-flight requests (and buffers) configured for video, raw, and
                                        still.
  --viewfinder-buffer-count arg (=0)    Number of in-flight requests (and buffers) configured for preview window.
  --no-raw [=arg(=1)] (=0)              Disable requesting of a RAW stream. Will override any manual mode reqest the
                                        mode choice when setting framerate.
  --autofocus-mode arg (=default)       Control to set the mode of the AF (autofocus) algorithm.(manual, auto,
                                        continuous)
  --autofocus-range arg (=normal)       Set the range of focus distances that is scanned.(normal, macro, full)
  --autofocus-speed arg (=normal)       Control that determines whether the AF algorithm is to move the lens as quickly
                                        as possible or more steadily.(normal, fast)
  --autofocus-window arg (=0,0,0,0)     Sets AfMetering to  AfMeteringWindows an set region used, e.g.
                                        0.25,0.25,0.5,0.5
  --lens-position arg                   Set the lens to a particular focus position, expressed as a reciprocal distance
                                        (0 moves the lens to infinity), or "default" for the hyperfocal distance
  --hdr [=arg(=auto)] (=off)            Enable High Dynamic Range, where supported. Available values are "off", "auto",
                                        "sensor" for sensor HDR (e.g. for Camera Module 3), "single-exp" for PiSP based
                                        single exposure multiframe HDR
  --metadata arg                        Save captured image metadata to a file or "-" for stdout
  --metadata-format arg (=json)         Format to save the metadata in, either txt or json (requires --metadata)
  --flicker-period arg (=0s)            Manual flicker correction period
                                        Set to 10000us to cancel 50Hz flicker.
                                        Set to 8333us to cancel 60Hz flicker.

These documents are also worth a look as references: https://docs.arducam.com/Raspberry-Pi-Camera/Native-camera/Libcamera-User-Guide/

Raspberry Pi RPi FPC Camera (B) tutorial

DIY Camera (1): the libcamera library - CSDN blog

3 Python demo

On the Raspberry Pi, picamera2 is now wrapped on top of libcamera. Following the creed of "life is short, I use Python", let's start learning with picamera2.

It is very simple to use.

1 demo1.py

#!/usr/bin/python3

from picamera2 import Picamera2

picam2 = Picamera2()
config = picam2.create_still_configuration()
picam2.configure(config)

picam2.start()

# Grab one frame as a NumPy array, then save a JPEG to disk.
np_array = picam2.capture_array()
print(np_array)
picam2.capture_file("demo.jpg")
picam2.stop()

An image comes out in just a few lines of code; it really is very convenient.
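Beyond the defaults, the command-line options from section 2 (--shutter, --gain, --awbgains and so on) map onto libcamera controls, and picamera2 exposes them through set_controls(). Below is a minimal sketch assuming the usual control names (ExposureTime, AnalogueGain, AwbEnable, ColourGains); exact availability depends on the sensor.

#!/usr/bin/python3
# Sketch: manual exposure/gain/white balance with picamera2 controls.
import time
from picamera2 import Picamera2

picam2 = Picamera2()
picam2.configure(picam2.create_still_configuration())
picam2.start()

picam2.set_controls({
    "ExposureTime": 10000,       # shutter time in microseconds (like --shutter)
    "AnalogueGain": 2.0,         # like --gain / --analoggain
    "AwbEnable": False,          # turn off auto white balance
    "ColourGains": (1.8, 1.5),   # manual red/blue gains (like --awbgains)
})
time.sleep(0.5)                  # give the pipeline a few frames to apply them

picam2.capture_file("manual.jpg")
picam2.stop()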

2 demo2.py

One thing I have been trying to do for the past few days is to display the camera image on my underwhelming little ST7735 screen, but several attempts failed. With ffmpeg and mplayer commands it either errored out while running or nothing appeared on the display. Doing it in Python turned out to be a lot easier.

import time
import numpy as np
from picamera2 import Picamera2
from PIL import Image
import struct

# Initialise the camera
picam2 = Picamera2()
camera_config = picam2.create_still_configuration(main={"size": (128, 160)})
picam2.configure(camera_config)
picam2.start()

# Open the framebuffer of the ST7735 display
fb = open("/dev/fb0", "wb")

# Grab camera frames and convert them for the display
while True:
    # Capture a frame
    frame = picam2.capture_array()

    # Resize the image to 128x160
    img = Image.fromarray(frame)
    img = img.resize((128, 160))

    # Convert the image to RGB565
    raw_rgb = img.convert("RGB").tobytes("raw", "RGB")
    img_rgb565 = [((r & 0xF8) << 8) | ((g & 0xFC) << 3) | (b >> 3)
                  for r, g, b in zip(raw_rgb[0::3], raw_rgb[1::3], raw_rgb[2::3])]
    img_rgb565_bytes = struct.pack(f'{len(img_rgb565)}H', *img_rgb565)

    # Write the converted frame to the framebuffer
    # (seek on the buffered file object itself, not on the raw fd)
    fb.seek(0)
    fb.write(img_rgb565_bytes)
    fb.flush()

    time.sleep(0.1)  # crude frame-rate control

The key parts here are resizing the image and converting the colour format, both of which are in the code above. Admittedly the code was generated with GPT; doing it by hand would probably have taken me a whole day of debugging... Of course, Python is still not efficient enough, so a real product would need to be rewritten in C for release.
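On the efficiency point: before rewriting everything in C, the per-pixel RGB565 loop above can be replaced by a vectorized NumPy conversion, which is usually plenty fast for a 128x160 panel. Below is a rough sketch under the same assumption as above (a little-endian RGB565 framebuffer on /dev/fb0).

import numpy as np

def rgb888_to_rgb565_bytes(frame_rgb):
    # frame_rgb: HxWx3 uint8 array in RGB order
    r = frame_rgb[:, :, 0].astype(np.uint16)
    g = frame_rgb[:, :, 1].astype(np.uint16)
    b = frame_rgb[:, :, 2].astype(np.uint16)
    rgb565 = ((r & 0xF8) << 8) | ((g & 0xFC) << 3) | (b >> 3)
    # fbtft-style framebuffers are normally little-endian RGB565
    return rgb565.astype('<u2').tobytes()

# In the loop of demo2.py, the list comprehension could then become:
#     frame = np.asarray(img.convert("RGB"))
#     fb.seek(0)
#     fb.write(rgb888_to_rgb565_bytes(frame))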

4 C demo

The code below comes from this Stack Overflow question: c - Copying frames from camera (/dev/video0) to framebuffer (/dev/fb0) gives unexpected result - Stack Overflow

#include <errno.h>
#include <fcntl.h>
#include <linux/videodev2.h>
#include <linux/fb.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <sys/types.h>
#include <sys/ioctl.h>
#include <sys/select.h>

#define CAM_WIDTH 1280
#define CAM_HEIGHT 720
#define CAM_FORMAT (v4l2_fourcc('B','G','R','A'))

#define CAM_BUFFERS 1

int main(void)
{
    int cam_fd; 
    int fb_fd;

    struct v4l2_capability cap;
    struct v4l2_requestbuffers reqbuf;
    struct v4l2_format fmt;
    struct v4l2_buffer buffinfo;
    enum v4l2_buf_type bufftype;

    char *cam_buffers[6];
    int cam_buffer_size;
    int cam_buffer_index = -1;

    char *fb_p;
    struct fb_var_screeninfo vinfo;
    struct fb_fix_screeninfo finfo;

    /* Setting framebuffer */
    fb_fd = open("/dev/fb0", O_RDWR);
    if(fb_fd < 0)            /* open() returns -1 on error, 0 is a valid fd */
    {
        fprintf(stderr, "%s:%i: Unable to open framebuffer\n", __FILE__, __LINE__);
        return -1;
    }
    ioctl(fb_fd, FBIOGET_FSCREENINFO, &finfo);
    if(ioctl(fb_fd, FBIOGET_VSCREENINFO, &vinfo) == -1)
    {
        fprintf(stderr, "%s:%i: Unable to get framebuffer info\n", __FILE__, __LINE__);
        return -1;
    }
    printf("Framebuffer: resolution %dx%d with %dbpp\n\r", vinfo.xres, vinfo.yres, vinfo.bits_per_pixel);

    /* Map the framebuffer; this assumes a 32bpp (4 bytes per pixel) display mode */
    fb_p = (char*)mmap(0, vinfo.xres*vinfo.yres*4, PROT_READ | PROT_WRITE, MAP_SHARED, fb_fd, 0);

    memset(fb_p, 0, vinfo.xres*vinfo.yres*4);

    /* Setting camera */
    cam_fd = open("/dev/video0", O_RDWR | O_NONBLOCK, 0);
    if(cam_fd < 0){          /* open() returns -1 on error */
        fprintf(stderr, "%s:%i: Couldn't open device\n", __FILE__, __LINE__);
        return -1;
    }
    if(ioctl(cam_fd, VIDIOC_QUERYCAP, &cap))
    {
        fprintf(stderr, "%s:%i: Couldn't retrieve device capabilities\n", __FILE__, __LINE__);
        return -1;
    }
    if((cap.capabilities & V4L2_CAP_VIDEO_CAPTURE) == 0)   /* parentheses needed: == binds tighter than & */
    {
        fprintf(stderr, "%s:%i: Device is not a capture device\n", __FILE__, __LINE__);
        return -1;
    }
    if((cap.capabilities & V4L2_CAP_STREAMING) == 0)
    {
        fprintf(stderr, "%s:%i: Device is not available for streaming\n", __FILE__, __LINE__);
        return -1;
    }
    
    /* Set image format */
    memset(&fmt, 0, sizeof(fmt));
    fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    fmt.fmt.pix.width = CAM_WIDTH;
    fmt.fmt.pix.height = CAM_HEIGHT;
    fmt.fmt.pix.pixelformat = CAM_FORMAT;
    fmt.fmt.pix.field = V4L2_FIELD_NONE;
    if(ioctl(cam_fd, VIDIOC_S_FMT, &fmt) == -1)
    {
        fprintf(stderr, "%s:%i: Unable to set image format\n", __FILE__, __LINE__);
        return -1;
    }
    cam_buffer_size = fmt.fmt.pix.sizeimage;

    /* Request buffers */
    memset(&reqbuf, 0, sizeof(reqbuf));
    reqbuf.count = CAM_BUFFERS;
    reqbuf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    reqbuf.memory = V4L2_MEMORY_MMAP;
    if(ioctl(cam_fd, VIDIOC_REQBUFS, &reqbuf) == -1)
    {
        fprintf(stderr, "%s:%i: Mmap streaming not supported\n", __FILE__, __LINE__);
        return -1;
    }
    if(reqbuf.count != CAM_BUFFERS)
    {
        fprintf(stderr, "%S:%i: Not all requared buffers are allocated\n", __FILE__, __LINE__);
        return -1;
    }
    
    /* Query and Mmap buffers */
    for (int i=0; i < CAM_BUFFERS; i++)
    {
        memset(&buffinfo, 0, sizeof(buffinfo));
        buffinfo.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buffinfo.memory = V4L2_MEMORY_MMAP;
        buffinfo.index = i;
        
        if(ioctl(cam_fd, VIDIOC_QUERYBUF, &buffinfo) == -1)
        {
            fprintf(stderr, "%s:%i: Unable to query buffers\n", __FILE__, __LINE__);
            return -1;
        }
        
        cam_buffers[i] = mmap(NULL, buffinfo.length, PROT_READ | PROT_WRITE, MAP_SHARED, cam_fd, buffinfo.m.offset);
    
        if(cam_buffers[i] == MAP_FAILED)
        {
            fprintf(stderr, "%s:%i: Unable to mmap buffers\n", __FILE__, __LINE__);
            return -1;
        }
    }   
    
    /* Enqueue buffers */
    for (int i=0; i < CAM_BUFFERS; i++)
    {
        memset(&buffinfo, 0, sizeof(buffinfo));
        buffinfo.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buffinfo.memory = V4L2_MEMORY_MMAP;
        buffinfo.index = i;

        if(ioctl(cam_fd, VIDIOC_QBUF, &buffinfo) == -1)
        {
            fprintf(stderr, "%s:%i: Unable to enqueue buffers\n", __FILE__, __LINE__);
            return -1;
        }
    }

    /* Start Streaming */
    bufftype = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    if(ioctl(cam_fd, VIDIOC_STREAMON, &bufftype) == -1)
    {
        fprintf(stderr, "%s:%i: Unable to start streaming\n", __FILE__, __LINE__);
        return -1;
    }

    while(1)
    {
        fd_set fds;
        struct timeval tv;
        int r;

        FD_ZERO(&fds);
        FD_SET(cam_fd, &fds);
        tv.tv_sec = 2;
        tv.tv_usec = 0;
        
        r = select(cam_fd+1, &fds, NULL, NULL, &tv);
        if (r == -1) {
            if (errno == EINTR)   /* interrupted by a signal, just retry */
                continue;
            fprintf(stderr, "%s:%i: Call to select() failed\n", __FILE__, __LINE__);
            return -1;
        }
        if (r == 0) {
            fprintf(stderr, "%s:%i: Call to select() timeout\n", __FILE__, __LINE__);
            continue;
        }

        if (!FD_ISSET(cam_fd, &fds))
            continue;
        
        memset(&buffinfo, 0, sizeof(buffinfo));
        buffinfo.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buffinfo.memory = V4L2_MEMORY_MMAP;
        if(ioctl(cam_fd, VIDIOC_DQBUF, &buffinfo) == -1) {
            if(errno == EAGAIN)
                continue;
            fprintf(stderr, "%s:%i: Unable to dequeue buffer\n", __FILE__, __LINE__);
            return -1;
        }

        cam_buffer_index = buffinfo.index;

        /* Copy the frame straight into the framebuffer; this only looks right when the
           camera resolution and stride match the framebuffer's 32bpp layout */
        memcpy(fb_p, cam_buffers[cam_buffer_index], vinfo.xres*vinfo.yres*4);

        memset(&buffinfo, 0, sizeof(buffinfo));
        buffinfo.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buffinfo.memory = V4L2_MEMORY_MMAP;
        buffinfo.index = cam_buffer_index;
        if(ioctl(cam_fd, VIDIOC_QBUF, &buffinfo) == -1) {
            fprintf(stderr, "%s:%i: Unable to enqueue buffer\n", __FILE__, __LINE__);
            return -1;
        }
    }
    return 0;
}

...On a second look, the example above is still based on V4L2; the one below uses the genuine libcamera API.

#include <iomanip>
#include <iostream>
#include <memory>
#include <thread>

#include <libcamera/libcamera.h>

using namespace libcamera;
using namespace std::chrono_literals;

int main()
{
    // Minimal first step (as in the libcamera application writer's guide):
    // start the camera manager and list the cameras attached to the system.
    std::unique_ptr<CameraManager> cm = std::make_unique<CameraManager>();
    cm->start();

    for (auto const &camera : cm->cameras())
        std::cout << camera->id() << std::endl;

    cm->stop();

    return 0;
}

This is the API manual: libcamera: libcamera API reference

 
