After a round of research, capturing raw images directly turns out to be hard. Almost all the camera modules sold on Taobao output ISP-processed images, with at least a demosaicing step already applied, which makes raw capture nearly impossible; on Taobao you would essentially have to order a custom module. Surprisingly, camera modules *without* ISP processing are the hard ones to find. Going through a development board instead, there is also very little material, and most of it targets Raspberry Pi boards, which I have never worked with. After some digging, a Jetson Nano should work as well. The only few resources I found are listed below:
2. Using libargus on the Jetson Nano platform to take photos with a Raspberry Pi camera module
3,GitHub - zhaoxuhui/Jetpack-Argus-Demo-OneShot
Resource 1 briefly covers how the code acquires the raw data (it does not save a raw image); Resource 2 is the overall walkthrough, including how to set up a minimal project and write the CMake files; Resource 3 is the GitHub code accompanying Resource 2.
In practice, I took the GitHub code as the starting point and, following Resource 1, turned it into a raw-capture command, then worked out how to save the raw data and how to pre-process the IMX219's image buffer. The camera in Resource 1 is a 13.2 MP camera with a pitch of 8192, while mine is (3264+64)*2; also, on my camera the unused data in each row sits at the front, whereas on that camera it sits at the end. The 64 was found by trial and error: without skipping the padding, the image comes out scrambled, with very regular black lines row after row; the black segment looks to be about 64 pixels long and starts at the beginning of each row. In theory, since the IMX219's maximum width is 3280, the padding should be 3280 − 3264 = 16, so I do not know why it is 64, but 64 is what actually works; I did not dig into it further.
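The row layout described above can be sketched in a few lines. This is a minimal Python illustration (not the capture code itself) of stripping a fixed front padding from each row of a flat buffer; `strip_front_padding` is a hypothetical helper name, and the 64/3264/2464 constants are the trial-and-error values from the text:

```python
import numpy as np

PAD, WIDTH, HEIGHT = 64, 3264, 2464  # values found above by trial and error

def strip_front_padding(buf, pad=PAD, width=WIDTH, height=HEIGHT):
    """Drop the `pad` unused pixels at the start of each (pad+width)-pixel row."""
    rows = buf.reshape(height, pad + width)
    return rows[:, pad:].copy()

# tiny synthetic check: pad=2, width=3, height=2; the 9s stand in for padding
demo = np.array([9, 9, 1, 2, 3,
                 9, 9, 4, 5, 6], dtype=np.uint16)
out = strip_front_padding(demo, pad=2, width=3, height=2)
print(out.tolist())  # [[1, 2, 3], [4, 5, 6]]
```

The C++ loop further down does the same thing pointer-style: advance past the padding once per row, then copy `width` pixels.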
Attached is my main-raw.cpp, modified from main-all.cpp in Resource 3; the other parts can simply follow Resource 3:
#include <stdio.h>
#include <stdlib.h>
#include <Argus/Argus.h>
#include <EGLStream/EGLStream.h>
#include <iostream>
#include <fstream>
#include <string>
int main(int argc, char** argv)
{
// Step 1: create the Argus environment
Argus::UniqueObj<Argus::CameraProvider> cameraProvider(Argus::CameraProvider::create());
Argus::ICameraProvider *iCameraProvider = Argus::interface_cast<Argus::ICameraProvider>(cameraProvider);
// status output
std::cout<<"Step1: Create Argus Session and Get Camera Provider"<<std::endl;
if(!iCameraProvider){
std::cout<<"==>Cannot get core camera provider interface"<<std::endl<<std::endl;
}else{
std::cout<<"==>Argus Version:"<<iCameraProvider->getVersion().c_str()<<std::endl<<std::endl;
}
// Step 2: get the available camera devices
std::vector<Argus::CameraDevice*> cameraDevices;
iCameraProvider->getCameraDevices(&cameraDevices);
// status output
std::cout<<"Step2: Get Available Camera Devices by Camera Provider"<<std::endl;
if(cameraDevices.size()==0){
std::cout<<"==>No cameras available"<<std::endl<<std::endl;
}else{
std::cout<<"==>"<<cameraDevices.size()<<" Camera device(s) available"<<std::endl<<std::endl;
}
// Step 3: get the camera properties interface
Argus::ICameraProperties *iCameraProperties = Argus::interface_cast<Argus::ICameraProperties>(cameraDevices[0]);
// status output
std::cout<<"Step3: Get Properties of Available Camera"<<std::endl;
if(!iCameraProperties){
std::cout<<"==>Failed to get iCameraProperties interface"<<std::endl<<std::endl;
}else{
std::cout<<"==>Succeed to get iCameraProperties interface"<<std::endl<<std::endl;
}
// Step 4: get the sensor modes supported by the camera; use the first one by default
std::vector<Argus::SensorMode*> sensorModes;
iCameraProperties->getAllSensorModes(&sensorModes);
Argus::ISensorMode *iSensorMode = Argus::interface_cast<Argus::ISensorMode>(sensorModes[0]);
// status output
std::cout<<"Step4: Get Available Camera Modes"<<std::endl;
if(sensorModes.size()==0){
std::cout<<"==>Failed to get sensor modes"<<std::endl<<std::endl;
}else{
std::cout<<"==>Succeed to get sensor modes"<<std::endl;
for (int i = 0; i < sensorModes.size(); ++i)
{
Argus::ISensorMode *tmpSensorMode = Argus::interface_cast<Argus::ISensorMode>(sensorModes[i]);
Argus::Size2D<uint32_t> resolution = tmpSensorMode->getResolution();
std::cout<<"\tMode "<<i<<": Width="<<resolution.width()<<" Height="<<resolution.height()<<std::endl;
}
std::cout<<std::endl;
}
// Step 5: create the capture session and get its control interface
Argus::Status status;
Argus::UniqueObj<Argus::CaptureSession> captureSession(iCameraProvider->createCaptureSession(cameraDevices[0], &status));
Argus::ICaptureSession *iSession = Argus::interface_cast<Argus::ICaptureSession>(captureSession);
// status output
std::cout<<"Step5: Get the Interface of Capture Session"<<std::endl;
if(status!=Argus::STATUS_OK){
std::cout<<"==>Failed to get the interface of capture session"<<std::endl<<std::endl;
}else{
std::cout<<"==>Succeed to get the interface of capture session"<<std::endl<<std::endl;
}
// Step 6: set the output stream parameters
Argus::UniqueObj<Argus::OutputStreamSettings> streamSettings(iSession->createOutputStreamSettings(Argus::STREAM_TYPE_EGL));
Argus::IEGLOutputStreamSettings *iEGLStreamSettings = Argus::interface_cast<Argus::IEGLOutputStreamSettings>(streamSettings);
iEGLStreamSettings->setPixelFormat(Argus::PIXEL_FMT_RAW16);
//iEGLStreamSettings->setPixelFormat(Argus::PIXEL_FMT_YCbCr_420_888); // the format main-all.cpp used
iEGLStreamSettings->setResolution(iSensorMode->getResolution());
iEGLStreamSettings->setMetadataEnable(true);
// status output
std::cout<<"Step6: Set Parameters of Output Stream"<<std::endl;
if(!iEGLStreamSettings){
std::cout<<"==>Failed to set the parameters of output stream"<<std::endl<<std::endl;
}else{
std::cout<<"==>Succeed to set the parameters of output stream"<<std::endl<<std::endl;
}
// Step 7: create the output stream
Argus::UniqueObj<Argus::OutputStream> stream(iSession->createOutputStream(streamSettings.get()));
// status output
std::cout<<"Step7: Create Output Stream Object"<<std::endl;
if(!stream){
std::cout<<"==>Failed to create output stream object"<<std::endl<<std::endl;
}else{
std::cout<<"==>Succeed to create output stream object"<<std::endl<<std::endl;
}
// Step 8: create the consumer object and get its control interface
Argus::UniqueObj<EGLStream::FrameConsumer> consumer(EGLStream::FrameConsumer::create(stream.get()));
EGLStream::IFrameConsumer *iFrameConsumer = Argus::interface_cast<EGLStream::IFrameConsumer>(consumer);
// status output
std::cout<<"Step8: Create Consumer Object and Interface"<<std::endl;
if(!iFrameConsumer){
std::cout<<"==>Failed to create consumer object"<<std::endl<<std::endl;
}else{
std::cout<<"==>Succeed to create consumer object"<<std::endl<<std::endl;
}
// Step 9: create the request and attach it to the stream
Argus::UniqueObj<Argus::Request> request(iSession->createRequest(Argus::CAPTURE_INTENT_STILL_CAPTURE));
Argus::IRequest *iRequest = Argus::interface_cast<Argus::IRequest>(request);
status = iRequest->enableOutputStream(stream.get());
// status output
std::cout<<"Step9: Create Request"<<std::endl;
if(status!=Argus::STATUS_OK){
std::cout<<"==>Failed to create request"<<std::endl<<std::endl;
}else{
std::cout<<"==>Succeed to create request"<<std::endl<<std::endl;
}
// Step 10: set the sensor mode and submit the capture request
Argus::ISourceSettings *iSourceSettings = Argus::interface_cast<Argus::ISourceSettings>(request);
iSourceSettings->setSensorMode(sensorModes[0]);
uint32_t requestId = iSession->capture(request.get());
// status output
std::cout<<"Step10: Set Sensor Mode"<<std::endl;
if(!iSourceSettings){
std::cout<<"==>Failed to set sensor mode"<<std::endl<<std::endl;
}else{
std::cout<<"==>Succeed to set sensor mode"<<std::endl<<std::endl;
}
// Step 11: acquire the frame object and receive the data
const uint64_t FIVE_SECONDS_IN_NANOSECONDS = 5000000000;
Argus::UniqueObj<EGLStream::Frame> frame(iFrameConsumer->acquireFrame(FIVE_SECONDS_IN_NANOSECONDS, &status));
EGLStream::IFrame *iFrame = Argus::interface_cast<EGLStream::IFrame>(frame);
EGLStream::Image *image = iFrame->getImage();
// status output
std::cout<<"Step11: Create Frame Object and Receive Data"<<std::endl;
if(!image){
std::cout<<"==>Failed to create frame object and receive data"<<std::endl<<std::endl;
}else{
std::cout<<"==>Succeed to create frame object and receive data"<<std::endl<<std::endl;
}
// Step 12: get the image interface (main-all.cpp built a JPEG object here instead)
//EGLStream::IImageJPEG *iImageJPEG = Argus::interface_cast<EGLStream::IImageJPEG>(image);
//status = iImageJPEG->writeJPEG("argus_oneShot.jpg");
EGLStream::IImage *iImage = Argus::interface_cast<EGLStream::IImage>(image);
// status output
std::cout<<"Step12: Create Image Object and Save Data"<<std::endl;
if(!iImage){
std::cout<<"==>Failed to create image object and save data"<<std::endl<<std::endl;
}else{
std::cout<<"==>Succeed to create image object and save data"<<std::endl<<std::endl;
}
const uint16_t* p_s = (const uint16_t*)iImage->mapBuffer();
std::cout<<"Buffer size: "<<iImage->getBufferSize()<<" bytes"<<std::endl;
uint16_t* output_data = new uint16_t[3264*2464];
size_t destIndex = 0;
for(int y=0;y<2464;y++){
p_s+=64;// skip the 64 padding pixels at the start of each row (found by trial and error)
for(int x=0;x<3264;x++){
// the IMX219 is a 10-bit sensor, but the values in the camera buffer are 14-bit,
// so shift right by 4; I do not know whether this is a driver bug
output_data[destIndex++] = *p_s>>4;
p_s++;
}
//p_s+=64;// many cameras put the unused data here, at the end of the row, but the IMX219 does not
}
std::ofstream file("oneShot.raw", std::ios::out | std::ios::binary);
file.write(reinterpret_cast<const char*>(output_data), sizeof(uint16_t)*3264*2464);
file.close();
delete[] output_data;
// Step 13: shut down Argus
cameraProvider.reset();
// status output
std::cout<<"Step13: Shut Down Argus"<<std::endl;
return 0;
}
Below is the raw image after visualization. No demosaicing or similar processing has been done, so it is a blurry black-and-white image. The visualization principle is very simple; here is the Python for it:
import numpy as np
import cv2

x = 3264
y = 2464
rawpath = "oneShot.raw"
Bindata = np.fromfile(rawpath, dtype='uint16')
print(Bindata[:x*y].max())  # peak raw value; at most 1023 for 10-bit data
Binimg = Bindata[0:x*y].reshape(y, x)
# map the 10-bit range (0..1023) down to 8 bits for viewing
Mat_img = np.clip(Binimg/4, 0, 255.0).astype("uint8")
cv2.imwrite("oneShot_ir.jpg", Mat_img)
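If you want a rough color view instead of the blurry grayscale one, a crude half-resolution demosaic can be sketched as below. This is a minimal illustration, not a proper demosaicing pipeline: it assumes an RGGB Bayer layout (the IMX219's actual channel order depends on the driver and flip settings, so swap channels if the colors look wrong), and `bayer_to_rgb_half` is a hypothetical helper name:

```python
import numpy as np

def bayer_to_rgb_half(raw):
    """Half-resolution demosaic: each 2x2 Bayer cell becomes one RGB pixel.
    Assumes an RGGB layout: R at (0,0), G at (0,1) and (1,0), B at (1,1)."""
    r  = raw[0::2, 0::2].astype(np.float32)
    g1 = raw[0::2, 1::2].astype(np.float32)
    g2 = raw[1::2, 0::2].astype(np.float32)
    b  = raw[1::2, 1::2].astype(np.float32)
    rgb = np.dstack([r, (g1 + g2) / 2, b])
    # scale the 10-bit range (0..1023) down to 8 bits for display
    return np.clip(rgb / 4, 0, 255).astype(np.uint8)

# tiny 2x2 example: one RGGB cell with R=400, G=800 and 600, B=200
cell = np.array([[400, 800],
                 [600, 200]], dtype=np.uint16)
print(bayer_to_rgb_half(cell).tolist())  # [[[100, 175, 50]]]
```

The result is half the resolution of the raw frame, which is usually fine for a quick sanity check of the captured data.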