Secondary Development of the Hikvision Camera SDK in Java: Photo Capture and Video Recording

Preface

In a Spring Boot project, each workstation on the shop floor is equipped with a Hikvision MV-CU120-0UC camera (USB connection) and a tablet/industrial PC (the workstation client). The workstations communicate with a server that hosts a workstation management system. Workers log in to the workstation management system on the client to do their jobs and use its front-end UI to take photos, record video and view a live feed from the camera. Images and videos must be saved locally on the client under a specified path and also uploaded to the server. Because the camera is mounted at the workstation, driving it from the server side requires a separate camera program running on the workstation client, which is where the camera SDK secondary development comes in.

Main references

1. The camera secondary development mainly follows the article below; the Hikvision SDK also ships with complete demo examples. I added the recording feature myself afterwards. This camera model does not natively support video recording: recording is implemented by capturing images continuously, which is also how MVS records.

 海康威视相机SDK二次开发(JAVA语言)_海康sdk二次开发-CSDN博客

2. For reference, here is the official MVS download link. MVS is the camera test software provided on the Hikvision website; it can be used to test both physical and virtual cameras, and there are plenty of installation and usage guides online.

海康机器人-机器视觉-下载中心

【Tools】机器视觉工业相机客户端安装详解教程_机器视觉工业相机客户端mvs v3.3.1-CSDN博客

Official demos

Below are the SDK secondary-development demo examples found in the MVS installation directory. ImageSave.java, Grab_Callback.java and GrabImage.java cover grab-and-save, grab callback and continuous grabbing respectively, and can be copied straight into a project and run.

This is the officially provided SDK package; it can be dropped into a project and called directly. See the link above for details.

Note: with a real camera device, TriggerMode is set to Off here, i.e. internal triggering, where the camera acquires images from its own internal signal; with a virtual camera it is set to On, i.e. external triggering.
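For reference, the equivalent SDK call (the same one used later in the demos) looks roughly like the sketch below; the wrapper class itself is only an illustration.

import MvCameraControlWrapper.MvCameraControl;

import static MvCameraControlWrapper.MvCameraControlDefines.*;

// Illustrative helper only: switch trigger mode on an already-opened camera handle.
public class TriggerModeConfig {

    public static void applyTriggerMode(Handle hCamera, boolean virtualCamera) {
        // Real camera: "Off" (continuous acquisition); virtual camera in this setup: "On" (triggered)
        String mode = virtualCamera ? "On" : "Off";
        int nRet = MvCameraControl.MV_CC_SetEnumValueByString(hCamera, "TriggerMode", mode);
        if (MV_OK != nRet) {
            System.err.printf("SetTriggerMode failed, errcode: [%#x]\n", nRet);
        }
    }
}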

Project structure

The project structure of the camera secondary-development mini-program running on the client is as follows.

Secondary development

Photo capture

Clicking "Take photo" in the workstation management system's front end calls the takePhotoSave endpoint under DeviceAPI in its back end. The url inside that endpoint is the interface path of the secondary-development camera mini-program running on the workstation client.

Calling the camera mini-program via that url triggers ImageSave.SaveimageWithTaskId(), which performs the actual capture.
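A minimal sketch of what that client-side endpoint might look like; the mapping path, class name and parameter names are assumptions, only ImageSave.SaveimageWithTaskId() comes from the demo code below.

import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

import jzy.api.ImageSave;

// Hypothetical endpoint of the camera mini-program running on the workstation client.
@RestController
public class CameraPhotoController {

    @GetMapping("/camera/takePhoto")
    public String takePhoto(@RequestParam String imagePath, @RequestParam String picId) {
        // deleteDay and camIndex are simply forwarded to the SDK demo method
        ImageSave.SaveimageWithTaskId(7, 0, imagePath, picId);
        // ImageSave.imageUrl is set to "1" once the frame has been saved locally
        return ImageSave.imageUrl;
    }
}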

The captured photo is stored locally on the client. To display it in the workstation management system's front end, the server-side displayPicture endpoint is called to upload the local image into the server project's resource path; this requires some configuration in the workstation management system.

This endpoint does not return the image byte array directly. Instead it uploads the image into the server's resource path and returns a browser-accessible URL. With the resource mapping in place, the front end uses that URL to load the image straight from the server's resource path and display it. The main resource-mapping configuration is outlined below.

This is the resource path on the server to which images are uploaded; see the reference below for details of the resource-mapping configuration.

springboot映射的配置文件在哪 springboot地址映射_mob64ca1418aeab的技术博客_51CTO博客
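As a rough illustration of the mapping that article describes, a Spring Boot WebMvcConfigurer can expose a local directory under a URL prefix; the prefix and directory below are placeholders, not the project's actual values.

import org.springframework.context.annotation.Configuration;
import org.springframework.web.servlet.config.annotation.ResourceHandlerRegistry;
import org.springframework.web.servlet.config.annotation.WebMvcConfigurer;

// Sketch of the static-resource mapping: requests to /upload/** are served
// from a local directory on the server.
@Configuration
public class ResourceMappingConfig implements WebMvcConfigurer {

    @Override
    public void addResourceHandlers(ResourceHandlerRegistry registry) {
        registry.addResourceHandler("/upload/**")
                .addResourceLocations("file:/home/hello/upload/");
    }
}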

In this flow, the server fetches the image byte array from the client-side camera mini-program over its URL and writes it into the corresponding resource path on the server.
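A sketch, under the same placeholder paths, of how displayPicture could fetch the bytes from the client URL and write them into the mapped resource directory (uses InputStream.readAllBytes, so Java 9+ is assumed); only the endpoint name comes from the article, everything else is illustrative.

import java.io.InputStream;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

// Sketch only: pull the image bytes from the client-side camera program over HTTP,
// write them under the server's mapped resource directory, and return the URL
// that the front end can load directly.
public class PictureTransfer {

    public static String displayPicture(String clientImageUrl, String fileName) throws Exception {
        byte[] imageBytes;
        try (InputStream in = new URL(clientImageUrl).openStream()) {
            imageBytes = in.readAllBytes();                       // image bytes from the client
        }
        Path target = Paths.get("/home/hello/upload/", fileName); // directory mapped as /upload/** above
        Files.createDirectories(target.getParent());
        Files.write(target, imageBytes);
        return "/upload/" + fileName;                             // browser-accessible URL
    }
}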

The complete photo-capture demo is as follows:

package jzy.api;

/***************************************************************************************************
 * @file      ImageSave.java
 * @brief     This demo shows how to save raw/jpeg/bmp/tiff images to local files
 * @author    **
 * @date      2024/07/26
 *
 **************************************************************************************************/
import MvCameraControlWrapper.CameraControlException;
import MvCameraControlWrapper.MvCameraControl;
import jzy.util.FileUtil;
import jzy.util.ImageToBase64URL;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.text.SimpleDateFormat;
import java.util.*;

import static MvCameraControlWrapper.MvCameraControlDefines.*;

public class ImageSave
{
    private static SimpleDateFormat sfSecond = new SimpleDateFormat("yyyy-M-d_HH-mm-ss-SSS");
    private static SimpleDateFormat sfTime = new SimpleDateFormat("HH:mm:ss");
    private static SimpleDateFormat sfSave = new SimpleDateFormat("yyyy-M-d");
    public static String imageUrl;
	public static Handle hCamera = null;
	public static Scanner scanner;
    /**
     * Build the image save directory from the current date (year-month-day)
     * @return the image save path
     */
    private static String imagePathFun(){
        SimpleDateFormat sfDate = new SimpleDateFormat("yyyy-M-d");
        return "/home/hello/MVS/catch_image/"+sfDate.format(new Date())+"/";
    }
	private static boolean IsHBPixelFormat(MvGvspPixelType ePixelType)
    {
		switch (ePixelType)
		{
			case PixelType_Gvsp_HB_Mono8:
			case PixelType_Gvsp_HB_Mono10:
			case PixelType_Gvsp_HB_Mono10_Packed:
			case PixelType_Gvsp_HB_Mono12:
			case PixelType_Gvsp_HB_Mono12_Packed:
			case PixelType_Gvsp_HB_Mono16:
			case PixelType_Gvsp_HB_RGB8_Packed:
			case PixelType_Gvsp_HB_BGR8_Packed:
			case PixelType_Gvsp_HB_RGBA8_Packed:
			case PixelType_Gvsp_HB_BGRA8_Packed:
			case PixelType_Gvsp_HB_RGB16_Packed:
			case PixelType_Gvsp_HB_BGR16_Packed:
			case PixelType_Gvsp_HB_RGBA16_Packed:
			case PixelType_Gvsp_HB_BGRA16_Packed:
			case PixelType_Gvsp_HB_YUV422_Packed:
			case PixelType_Gvsp_HB_YUV422_YUYV_Packed:
			case PixelType_Gvsp_HB_BayerGR8:
			case PixelType_Gvsp_HB_BayerRG8:
			case PixelType_Gvsp_HB_BayerGB8:
			case PixelType_Gvsp_HB_BayerBG8:
			case PixelType_Gvsp_HB_BayerRBGG8:
			case PixelType_Gvsp_HB_BayerGB10:
			case PixelType_Gvsp_HB_BayerGB10_Packed:
			case PixelType_Gvsp_HB_BayerBG10:
			case PixelType_Gvsp_HB_BayerBG10_Packed:
			case PixelType_Gvsp_HB_BayerRG10:
			case PixelType_Gvsp_HB_BayerRG10_Packed:
			case PixelType_Gvsp_HB_BayerGR10:
			case PixelType_Gvsp_HB_BayerGR10_Packed:
			case PixelType_Gvsp_HB_BayerGB12:
			case PixelType_Gvsp_HB_BayerGB12_Packed:
			case PixelType_Gvsp_HB_BayerBG12:
			case PixelType_Gvsp_HB_BayerBG12_Packed:
			case PixelType_Gvsp_HB_BayerRG12:
			case PixelType_Gvsp_HB_BayerRG12_Packed:
			case PixelType_Gvsp_HB_BayerGR12:
			case PixelType_Gvsp_HB_BayerGR12_Packed:
				return true;
			default:
				return false;
		}
	}
	 private static void printDeviceInfo(MV_CC_DEVICE_INFO stDeviceInfo)
    {
        if (null == stDeviceInfo) {
            System.out.println("stDeviceInfo is null");
            return;
        }

        if ((stDeviceInfo.transportLayerType == MV_GIGE_DEVICE)||( stDeviceInfo.transportLayerType == MV_GENTL_GIGE_DEVICE))
		{
            System.out.println("\tCurrentIp:       " + stDeviceInfo.gigEInfo.currentIp);
            System.out.println("\tModel:           " + stDeviceInfo.gigEInfo.modelName);
            System.out.println("\tUserDefinedName: " + stDeviceInfo.gigEInfo.userDefinedName);
        } 
		else if (stDeviceInfo.transportLayerType == MV_USB_DEVICE) {
            System.out.println("\tUserDefinedName: " + stDeviceInfo.usb3VInfo.userDefinedName);
            System.out.println("\tSerial Number:   " + stDeviceInfo.usb3VInfo.serialNumber);
            System.out.println("\tDevice Number:   " + stDeviceInfo.usb3VInfo.deviceNumber);
        }
		else if (stDeviceInfo.transportLayerType == MV_GENTL_CAMERALINK_DEVICE)
        {
            System.out.println("\tUserDefinedName: " + stDeviceInfo.cmlInfo.userDefinedName);
            System.out.println("\tSerial Number:   " + stDeviceInfo.cmlInfo.serialNumber);
            System.out.println("\tDevice Number:   " + stDeviceInfo.cmlInfo.DeviceID);
        }
        else if (stDeviceInfo.transportLayerType == MV_GENTL_CXP_DEVICE)
        {
            System.out.println("\tUserDefinedName: " + stDeviceInfo.cxpInfo.userDefinedName);
            System.out.println("\tSerial Number:   " + stDeviceInfo.cxpInfo.serialNumber);
            System.out.println("\tDevice Number:   " + stDeviceInfo.cxpInfo.DeviceID);
        }
        else if (stDeviceInfo.transportLayerType == MV_GENTL_XOF_DEVICE)
        {
            System.out.println("\tUserDefinedName: " + stDeviceInfo.xofInfo.userDefinedName);
            System.out.println("\tSerial Number:   " + stDeviceInfo.xofInfo.serialNumber);
            System.out.println("\tDevice Number:   " + stDeviceInfo.xofInfo.DeviceID);
        }
	    else {
            System.err.print("Device is not supported! \n");
        }

        System.out.println("\tAccessible:      "
            + MvCameraControl.MV_CC_IsDeviceAccessible(stDeviceInfo, MV_ACCESS_Exclusive));
        System.out.println("");
    }
    private static void printFrameInfo(MV_FRAME_OUT_INFO stFrameInfo)
    {
        if (null == stFrameInfo)
        {
            System.err.println("stFrameInfo is null");
            return;
        }

        StringBuilder frameInfo = new StringBuilder("");
        frameInfo.append(("\tFrameNum[" + stFrameInfo.frameNum + "]"));
        frameInfo.append("\tWidth[" + stFrameInfo.width + "]");
        frameInfo.append("\tHeight[" + stFrameInfo.height + "]");
        frameInfo.append(String.format("\tPixelType[%#x]", stFrameInfo.pixelType.getnValue()));

        System.out.println(frameInfo.toString());
    }
    public static void saveDataToFile(byte[] dataToSave, int dataSize, String fileName,String savePath)
    {
        OutputStream os = null;

        try
        {
			if((null == dataToSave)||(dataSize <= 0))
			{
				System.out.println("saveDataToFile param error.");
				return;
			}

            // Create directory
            /*
             * savePath is the image save directory, passed in as a method parameter
             */
            File tempFile = new File(savePath);
            if (!tempFile.exists()) {
                tempFile.mkdirs();
            }
            os = new FileOutputStream(tempFile.getPath() + File.separator + fileName);
			if(null != os)
			{
				os.write(dataToSave, 0, dataSize);
                /*
                 * Print the current time once the image has been saved successfully
                 */
                System.out.println("--- " + sfTime.format(new Date()) + " Save Image succeed ---");

			}
            
        }
        catch (IOException e)
        {
            e.printStackTrace();
        }
        finally
        {
            // close file stream
            try 
            {
				if(os != null)
				{
					 os.close();
				}
               
            } catch (IOException e) 
            {
                e.printStackTrace();
            }
        }
    }
   public static int chooseCamera(ArrayList<MV_CC_DEVICE_INFO> stDeviceList,int camIndex)
    {
        if (null == stDeviceList)
        {
            return -1;
        }
        
		 // Choose a device to operate
//        int camIndex = -1;

        while (true)
        {
//			System.out.print("Please input camera index:");
			

                try
                {
//				   camIndex = scanner.nextInt();
                   if ((camIndex >= 0 && camIndex < stDeviceList.size()) || -1 == camIndex)
                   {
                       break;
                   }
                   else
                  {
                      System.out.println("Input error: " + camIndex + " Over Range:( 0 - " + (stDeviceList.size()-1) + " )");
                      camIndex = -1;
                      break;
                   }
                }
                catch (Exception e)
                {
			       System.out.println("Input not number.");
			       e.printStackTrace();
                   camIndex = -1;
                   break;
                }


			
        }
       if (-1 == camIndex) {
            System.out.println("Input error.exit");
            return camIndex;
        }

        if (0 <= camIndex && stDeviceList.size() > camIndex)
        {
            if ((MV_GIGE_DEVICE == stDeviceList.get(camIndex).transportLayerType)||(MV_GENTL_GIGE_DEVICE == stDeviceList.get(camIndex).transportLayerType))
            {
                System.out.println("Connect to camera[" + camIndex + "]: " + stDeviceList.get(camIndex).gigEInfo.userDefinedName);
            }
            else if (MV_USB_DEVICE == stDeviceList.get(camIndex).transportLayerType)
            {
                System.out.println("Connect to camera[" + camIndex + "]: " + stDeviceList.get(camIndex).usb3VInfo.userDefinedName);
            }
			else if (MV_GENTL_CAMERALINK_DEVICE == stDeviceList.get(camIndex).transportLayerType)
            {
				System.out.println("Connect to camera[" + camIndex + "]: " + stDeviceList.get(camIndex).cmlInfo.DeviceID);
            }
            else if (MV_GENTL_CXP_DEVICE == stDeviceList.get(camIndex).transportLayerType)
            {
               System.out.println("Connect to camera[" + camIndex + "]: " + stDeviceList.get(camIndex).cxpInfo.DeviceID);
            }
            else if (MV_GENTL_XOF_DEVICE == stDeviceList.get(camIndex).transportLayerType)
            {
                System.out.println("Connect to camera[" + camIndex + "]: " + stDeviceList.get(camIndex).xofInfo.DeviceID);
            }
            else
            {
                System.out.println("Device is not supported.");
            }
        }
        else
        {
            System.out.println("Invalid index " + camIndex);
            camIndex = -1;
        }

		return camIndex;
    }
    public static int SaveImage(MV_SAVE_IAMGE_TYPE enSaveImageType,byte[] dataToSave,MV_FRAME_OUT_INFO stImageInfo,String savePath)
    {
        MV_SAVE_IMAGE_TO_FILE_PARAM_EX stSaveFileParam = new MV_SAVE_IMAGE_TO_FILE_PARAM_EX();

        stSaveFileParam.imageType = enSaveImageType;						// ch:需要保存的图像类型 | en:Image format to save
        stSaveFileParam.pixelType = stImageInfo.pixelType;  // ch:相机对应的像素格式 | en:Camera pixel type
        stSaveFileParam.width      = stImageInfo.ExtendWidth;         // ch:相机对应的宽 | en:Width
        stSaveFileParam.height     = stImageInfo.ExtendHeight;          // ch:相机对应的高 | en:Height
        stSaveFileParam.dataLen    = stImageInfo.frameLen;
        stSaveFileParam.data      = dataToSave;
        stSaveFileParam.methodValue = 1;

        // ch:jpg图像质量范围为(50-99]| en:jpg image nQuality range is (50-99]
        stSaveFileParam.jpgQuality = 99;
        if (MV_SAVE_IAMGE_TYPE.MV_Image_Bmp == stSaveFileParam.imageType)
        {
            stSaveFileParam.imagePath  = String.format("Image_w%d_h%d_fn%d.bmp",  stImageInfo.ExtendWidth, stImageInfo.ExtendHeight, stImageInfo.frameNum);
        }
        else if (MV_SAVE_IAMGE_TYPE.MV_Image_Jpeg == stSaveFileParam.imageType)
        {
            stSaveFileParam.imagePath  = String.format("Image_w%d_h%d_fn%d.jpg",  stImageInfo.ExtendWidth, stImageInfo.ExtendHeight, stImageInfo.frameNum);
//            stSaveFileParam.imagePath  = savePath;

        }
        else if (MV_SAVE_IAMGE_TYPE.MV_Image_Tif == stSaveFileParam.imageType)
        {
            stSaveFileParam.imagePath  = String.format("Image_w%d_h%d_fn%d.tif",  stImageInfo.ExtendWidth, stImageInfo.ExtendHeight, stImageInfo.frameNum);
        }
        else if (MV_SAVE_IAMGE_TYPE.MV_Image_Png == stSaveFileParam.imageType)
        {
            stSaveFileParam.imagePath  = String.format("Image_w%d_h%d_fn%d.png",  stImageInfo.ExtendWidth, stImageInfo.ExtendHeight, stImageInfo.frameNum);
         }

        int nRet = MvCameraControl.MV_CC_SaveImageToFileEx(hCamera,stSaveFileParam);
        return nRet;
    }
    public static void SaveimageWithTaskId( int  deleteDay, int  camIndex, String imagePath,String picId) {

        int nRet = MV_OK;
        Handle hCamera = null;
        ArrayList<MV_CC_DEVICE_INFO> stDeviceList=null;
        boolean needConnectDevice = true;

        while (needConnectDevice) {
            System.out.println("SDK Version " + MvCameraControl.MV_CC_GetSDKVersion());
            // Enumerate GigE and USB devices
            try {
                stDeviceList = MvCameraControl.MV_CC_EnumDevices(MV_GIGE_DEVICE | MV_USB_DEVICE | MV_GENTL_GIGE_DEVICE | MV_GENTL_CAMERALINK_DEVICE | MV_GENTL_CXP_DEVICE | MV_GENTL_XOF_DEVICE);
                if (0 >= stDeviceList.size()) {
                    System.out.println("No devices found!");
                }
                int i = 0;
                for (MV_CC_DEVICE_INFO stDeviceInfo : stDeviceList) {
                    System.out.println("[camera " + (i++) + "]");
                    printDeviceInfo(stDeviceInfo);
                }
            } catch (CameraControlException e) {
                System.err.println("Enumrate devices failed!" + e.toString());
                e.printStackTrace();
            }

            // choose camera
            camIndex = chooseCamera(stDeviceList,camIndex);
            if (camIndex == -1) {
                // invalid camera index: give up instead of indexing the device list with -1
                return;
            }

            // Create handle
            try {
                hCamera = MvCameraControl.MV_CC_CreateHandle(stDeviceList.get(camIndex));
            } catch (CameraControlException e) {
                System.err.println("Create handle failed!" + e.toString());
                e.printStackTrace();
                hCamera = null;
            }

            // Open device
            nRet = MvCameraControl.MV_CC_OpenDevice(hCamera);
            if (MV_OK != nRet) {
                System.err.printf("Connect to camera failed, errcode: [%#x]\n", nRet);
            }
            if(MV_OK == nRet){
                needConnectDevice=false;
                break;
            }
            try {
                Thread.sleep(2000);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }

        /*
         * LineSelector test (placeholder)
         */

        // Make sure that trigger mode is off
        nRet = MvCameraControl.MV_CC_SetEnumValueByString(hCamera, "TriggerMode", "Off");
        if (MV_OK != nRet) {
            System.err.printf("SetTriggerMode failed, errcode: [%#x]\n", nRet);

        }

        // Get payload size
        MVCC_INTVALUE stParam = new MVCC_INTVALUE();
        nRet = MvCameraControl.MV_CC_GetIntValue(hCamera, "PayloadSize", stParam);
        if (MV_OK != nRet) {
            System.err.printf("Get PayloadSize fail, errcode: [%#x]\n", nRet);

        }

        // Start grabbing
        nRet = MvCameraControl.MV_CC_StartGrabbing(hCamera);
        if (MV_OK != nRet) {
            System.err.printf("Start Grabbing fail, errcode: [%#x]\n", nRet);

        }

        // Get one frame
        MV_FRAME_OUT_INFO stImageInfo = new MV_FRAME_OUT_INFO();
        byte[] pData = new byte[(int) stParam.curValue];
        nRet = MvCameraControl.MV_CC_GetOneFrameTimeout(hCamera, pData, stImageInfo, 1000);
        if (MV_OK != nRet) {
            System.err.printf("GetOneFrameTimeout fail, errcode:[%#x]\n", nRet);

        }

        printFrameInfo(stImageInfo);

        int imageLen = stImageInfo.width * stImageInfo.height * 3;    // Every RGB pixel takes 3 bytes
        byte[] imageBuffer = new byte[imageLen];

        // Call MV_CC_SaveImage to save image as JPEG
        MV_SAVE_IMAGE_PARAM_EX3 stSaveParam = new MV_SAVE_IMAGE_PARAM_EX3();
        stSaveParam.width = stImageInfo.width;                                  // image width
        stSaveParam.height = stImageInfo.height;                                // image height
        stSaveParam.data = pData;                                               // image data
        stSaveParam.dataLen = stImageInfo.frameLen;                             // image data length
        stSaveParam.pixelType = stImageInfo.pixelType;                          // image pixel format
        stSaveParam.imageBuffer = imageBuffer;                                  // output image buffer
        stSaveParam.imageType = MV_SAVE_IAMGE_TYPE.MV_Image_Jpeg;               // output image pixel format
        stSaveParam.methodValue = 0;                                            // Interpolation method that converts Bayer format to RGB24.  0-Nearest 1-Bilinear 2-Hamilton
        stSaveParam.jpgQuality = 60;                                            // JPG encoding quality (50-99]

        nRet = MvCameraControl.MV_CC_SaveImageEx3(hCamera, stSaveParam);
        if (MV_OK != nRet) {
            System.err.printf("SaveImage fail, errcode: [%#x]\n", nRet);
        } else {
            // Save the encoded JPEG buffer to file
            saveDataToFile(imageBuffer, stSaveParam.imageLen, picId, imagePath);
            imageUrl = "1"; // mark the capture as saved successfully
        }

        // Stop grabbing
        nRet = MvCameraControl.MV_CC_StopGrabbing(hCamera);
        if (MV_OK != nRet) {
            System.err.printf("StopGrabbing fail, errcode: [%#x]\n", nRet);

        }

        if (null != hCamera) {
            // Destroy handle
            nRet = MvCameraControl.MV_CC_DestroyHandle(hCamera);
            if (MV_OK != nRet) {
                System.err.printf("DestroyHandle failed, errcode: [%#x]\n", nRet);
            }
        }
    }
}

Video recording

Note: this is essentially a stripped-down version of what MVS does. The current code saves about 8 images per second and then assembles them into a video at 10 frames per second, so it is suitable for static scenes or slowly moving objects; anything faster will look distorted. MVS's live display also runs at roughly 8 fps, so how to optimize this further in Java is still an open question.

Clicking start/stop recording in the workstation management system's front end calls a back-end endpoint on the server, which in turn calls the client-side camera mini-program's endpoint to control recording; the flow is similar to photo capture.

When the camera mini-program receives the start signal, it spawns a new thread that runs the recording method VideoSave.main(videoSavePath).

videoSavePath is the path where the video will be saved. While VideoSave.main() is running, VideoSave.flag stays true; when the stop signal arrives, the flag is set to false, which ends the recording method.
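A minimal sketch of the start/stop endpoints on the camera mini-program; the paths and class name are assumptions, while flag, videoOverflag and VideoSave.main() come from the class below.

import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

import jzy.api.VideoSave;

// Hypothetical recording endpoints of the camera mini-program.
@RestController
public class RecordController {

    @GetMapping("/camera/record/start")
    public String startRecording(@RequestParam String videoSavePath) {
        VideoSave.flag = true;            // keep the capture loop running
        VideoSave.videoOverflag = false;  // conversion not finished yet
        new Thread(() -> VideoSave.main(videoSavePath)).start();
        return "recording";
    }

    @GetMapping("/camera/record/stop")
    public String stopRecording() {
        VideoSave.flag = false;           // lets GetImage() and the wait loop exit
        return "stopping";
    }
}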

The complete VideoSave class is as follows:

package jzy.api;

/***************************************************************************************************
 * @file VideoSave.java (adapted from the Grab_Callback.java demo)
 * @brief Use functions provided in MvCameraControlWrapper.jar to grab images
 * @author zhanglei72
 * @date 2020/01/12
 *
 * @warning
 * @version V1.0.0  2020/01/12 Create this file
 *          V1.0.1  2020/02/10 add parameter checking
 * @since 2020/02/10
 **************************************************************************************************/

import MvCameraControlWrapper.CameraControlException;
import MvCameraControlWrapper.MvCameraControl;
import jzy.util.FileUtil;
import jzy.util.ImageToVideo;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Date;
import java.util.Scanner;

import static MvCameraControlWrapper.MvCameraControlDefines.*;
//@Component
public class VideoSave {
    private static SimpleDateFormat sfSave = new SimpleDateFormat("yyyy-M-d");
    private static SimpleDateFormat sfTime = new SimpleDateFormat("HH:mm:ss");
    public static Scanner scanner;
    static Handle hCamera = null;
    static int nRet = MV_OK;
     public static boolean flag = true;
     public static boolean videoOverflag=false;
    static int camIndex = -1;
    public static int j=0;
    @Value("${device.camera.imagesInputFolder}") //D:/ImagesToVideo/videoImages/  拍摄视频图片存放位置
     static  String imagesInputFolder;
    @Value("${device.camera.outputVideo}") //D:/Videos/2024-11-04/proId/craftId/   1.mp4  视频本地存放位置
     static  String outputVideo;

    private static void printDeviceInfo(MV_CC_DEVICE_INFO stDeviceInfo) {
        if (null == stDeviceInfo) {
            System.out.println("stDeviceInfo is null");
            return;
        }

        if ((stDeviceInfo.transportLayerType == MV_GIGE_DEVICE) || (stDeviceInfo.transportLayerType == MV_GENTL_GIGE_DEVICE)) {
            System.out.println("\tCurrentIp:       " + stDeviceInfo.gigEInfo.currentIp);
            System.out.println("\tModel:           " + stDeviceInfo.gigEInfo.modelName);
            System.out.println("\tUserDefinedName: " + stDeviceInfo.gigEInfo.userDefinedName);
        } else if (stDeviceInfo.transportLayerType == MV_USB_DEVICE) {
            System.out.println("\tUserDefinedName: " + stDeviceInfo.usb3VInfo.userDefinedName);
            System.out.println("\tSerial Number:   " + stDeviceInfo.usb3VInfo.serialNumber);
            System.out.println("\tDevice Number:   " + stDeviceInfo.usb3VInfo.deviceNumber);
        } else if (stDeviceInfo.transportLayerType == MV_GENTL_CAMERALINK_DEVICE) {
            System.out.println("\tUserDefinedName: " + stDeviceInfo.cmlInfo.userDefinedName);
            System.out.println("\tSerial Number:   " + stDeviceInfo.cmlInfo.serialNumber);
            System.out.println("\tDevice Number:   " + stDeviceInfo.cmlInfo.DeviceID);
        } else if (stDeviceInfo.transportLayerType == MV_GENTL_CXP_DEVICE) {
            System.out.println("\tUserDefinedName: " + stDeviceInfo.cxpInfo.userDefinedName);
            System.out.println("\tSerial Number:   " + stDeviceInfo.cxpInfo.serialNumber);
            System.out.println("\tDevice Number:   " + stDeviceInfo.cxpInfo.DeviceID);
        } else if (stDeviceInfo.transportLayerType == MV_GENTL_XOF_DEVICE) {
            System.out.println("\tUserDefinedName: " + stDeviceInfo.xofInfo.userDefinedName);
            System.out.println("\tSerial Number:   " + stDeviceInfo.xofInfo.serialNumber);
            System.out.println("\tDevice Number:   " + stDeviceInfo.xofInfo.DeviceID);
        } else {
            System.err.print("Device is not supported! \n");
        }

        System.out.println("\tAccessible:      "
                + MvCameraControl.MV_CC_IsDeviceAccessible(stDeviceInfo, MV_ACCESS_Exclusive));
        System.out.println("");
    }


    public static int chooseCamera(ArrayList<MV_CC_DEVICE_INFO> stDeviceList) {
        if (null == stDeviceList) {
            return -1;
        }

        // Choose a device to operate
        int camIndex = 0;

        while (true) {
                try {
                    if ((camIndex >= 0 && camIndex < stDeviceList.size()) || -1 == camIndex) {
                        break;
                    } else {
                        System.out.println("Input error: " + camIndex + " Over Range:( 0 - " + (stDeviceList.size() - 1) + " )");
                    }
                } catch (NumberFormatException e) {
                    System.out.println("Input not number.");
                    camIndex = -1;
                    break;
                }


        }
        if (-1 == camIndex) {
            System.out.println("Input error.exit");
            return camIndex;
        }

        if (0 <= camIndex && stDeviceList.size() > camIndex) {
            if ((MV_GIGE_DEVICE == stDeviceList.get(camIndex).transportLayerType) || (MV_GENTL_GIGE_DEVICE == stDeviceList.get(camIndex).transportLayerType)) {
                System.out.println("Connect to camera[" + camIndex + "]: " + stDeviceList.get(camIndex).gigEInfo.userDefinedName);
            } else if (MV_USB_DEVICE == stDeviceList.get(camIndex).transportLayerType) {
                System.out.println("Connect to camera[" + camIndex + "]: " + stDeviceList.get(camIndex).usb3VInfo.userDefinedName);
            } else if (MV_GENTL_CAMERALINK_DEVICE == stDeviceList.get(camIndex).transportLayerType) {
                System.out.println("Connect to camera[" + camIndex + "]: " + stDeviceList.get(camIndex).cmlInfo.DeviceID);
            } else if (MV_GENTL_CXP_DEVICE == stDeviceList.get(camIndex).transportLayerType) {
                System.out.println("Connect to camera[" + camIndex + "]: " + stDeviceList.get(camIndex).cxpInfo.DeviceID);
            } else if (MV_GENTL_XOF_DEVICE == stDeviceList.get(camIndex).transportLayerType) {
                System.out.println("Connect to camera[" + camIndex + "]: " + stDeviceList.get(camIndex).xofInfo.DeviceID);
            } else {
                System.out.println("Device is not supported.");
            }
        } else {
            System.out.println("Invalid index " + camIndex);
            camIndex = -1;
        }

        return camIndex;
    }

    public static int SaveImage(MV_SAVE_IAMGE_TYPE enSaveImageType, byte[] dataToSave, MV_FRAME_OUT_INFO stImageInfo) {
        MV_SAVE_IMAGE_TO_FILE_PARAM_EX stSaveFileParam = new MV_SAVE_IMAGE_TO_FILE_PARAM_EX();

        stSaveFileParam.imageType = enSaveImageType;                        // ch:需要保存的图像类型 | en:Image format to save
        stSaveFileParam.pixelType = stImageInfo.pixelType;  // ch:相机对应的像素格式 | en:Camera pixel type
        stSaveFileParam.width = stImageInfo.ExtendWidth;         // ch:相机对应的宽 | en:Width
        stSaveFileParam.height = stImageInfo.ExtendHeight;          // ch:相机对应的高 | en:Height
        stSaveFileParam.dataLen = stImageInfo.frameLen;
        stSaveFileParam.data = dataToSave;
        stSaveFileParam.methodValue = 1;

        // ch:jpg图像质量范围为(50-99]| en:jpg image nQuality range is (50-99]
        stSaveFileParam.jpgQuality = 99;
        if (MV_SAVE_IAMGE_TYPE.MV_Image_Bmp == stSaveFileParam.imageType) {
            stSaveFileParam.imagePath = String.format("Image_w%d_h%d_fn%d.bmp", stImageInfo.ExtendWidth, stImageInfo.ExtendHeight, stImageInfo.frameNum);
        } else if (MV_SAVE_IAMGE_TYPE.MV_Image_Jpeg == stSaveFileParam.imageType) {
//            stSaveFileParam.imagePath = String.format("Image_w%d_h%d_fn%d.jpg", stImageInfo.ExtendWidth, stImageInfo.ExtendHeight, stImageInfo.frameNum);
            System.out.println("D:/ImagesToVideo/videoImages/"+j+".jpg");
            stSaveFileParam.imagePath  ="D:/ImagesToVideo/videoImages/"+j+".jpg";
        } else if (MV_SAVE_IAMGE_TYPE.MV_Image_Tif == stSaveFileParam.imageType) {
            stSaveFileParam.imagePath = String.format("Image_w%d_h%d_fn%d.tif", stImageInfo.ExtendWidth, stImageInfo.ExtendHeight, stImageInfo.frameNum);
        } else if (MV_SAVE_IAMGE_TYPE.MV_Image_Png == stSaveFileParam.imageType) {
            stSaveFileParam.imagePath = String.format("Image_w%d_h%d_fn%d.png", stImageInfo.ExtendWidth, stImageInfo.ExtendHeight, stImageInfo.frameNum);
        }
        int nRet = MvCameraControl.MV_CC_SaveImageToFileEx(hCamera, stSaveFileParam);
        return nRet;
    }

    public static void GetImage() {

        MV_FRAME_OUT stFrameOut = new MV_FRAME_OUT();
        while (flag) {
            nRet = MvCameraControl.MV_CC_GetImageBuffer(hCamera, stFrameOut, 1000);
            // ch:获取一帧图像 | en:Get one image
            if (MV_OK == nRet) {
//                System.out.println("\n Get Image Buffer Width:" + stFrameOut.mvFrameOutInfo.ExtendWidth
//                        + " Height: " + stFrameOut.mvFrameOutInfo.ExtendHeight
//                        + " FrameNum: " + stFrameOut.mvFrameOutInfo.frameNum);
                nRet = SaveImage(MV_SAVE_IAMGE_TYPE.MV_Image_Jpeg, stFrameOut.buffer, stFrameOut.mvFrameOutInfo);
                j++;
                nRet = MvCameraControl.MV_CC_FreeImageBuffer(hCamera, stFrameOut);
//                try {
//                    Thread.sleep(50);
//                } catch (InterruptedException e) {
//                    e.printStackTrace();
//                }
                if (MV_OK != nRet) {
                    System.err.printf("\n Free ImageBuffer failed, errcode: [%#x]\n", nRet);
                }
            }
        }
        
    }

    public static void main(String videoSavePath) {
        ArrayList<MV_CC_DEVICE_INFO> stDeviceList = null;
        scanner = new Scanner(System.in);
        do {
            System.out.println("SDK Version " + MvCameraControl.MV_CC_GetSDKVersion());

            // Initialize SDK
            nRet = MvCameraControl.MV_CC_Initialize();
            if (MV_OK != nRet) {
                System.err.printf("Initialize SDK fail! nRet [0x%x]\n\n", nRet);
                break;
            }

            // Enumerate  devices
            try {
                stDeviceList = MvCameraControl.MV_CC_EnumDevices(MV_GIGE_DEVICE | MV_USB_DEVICE | MV_GENTL_GIGE_DEVICE | MV_GENTL_CAMERALINK_DEVICE | MV_GENTL_CXP_DEVICE | MV_GENTL_XOF_DEVICE);
            } catch (CameraControlException e) {
                System.err.println("Enumrate devices failed!" + e.toString());
                e.printStackTrace();
                break;
            }

            if (0 >= stDeviceList.size()) {
                System.out.println("No devices found!");
                break;
            }
            int i = 0;
            for (MV_CC_DEVICE_INFO stDeviceInfo : stDeviceList) {
                if (null == stDeviceInfo) {
                    continue;
                }
                System.out.println("[camera " + (i++) + "]");
                printDeviceInfo(stDeviceInfo);
            }

            // choose camera
            camIndex = chooseCamera(stDeviceList);
            if (-1 == camIndex) {
                break;
            }

            // Create device handle
            try {
                hCamera = MvCameraControl.MV_CC_CreateHandle(stDeviceList.get(camIndex));
            } catch (CameraControlException e) {
                System.err.println("Create handle failed!" + e.toString());
                e.printStackTrace();
                hCamera = null;
                break;
            }

            // Open selected device
            nRet = MvCameraControl.MV_CC_OpenDevice(hCamera);
            if (MV_OK != nRet) {
                System.err.printf("Connect to camera failed, errcode: [%#x]\n", nRet);
                break;
            } else {
                System.err.printf("Connect suc.\n");
            }
            
            // Make sure that trigger mode is off
            nRet = MvCameraControl.MV_CC_SetEnumValueByString(hCamera, "TriggerMode", "Off");
            if (MV_OK != nRet) {
                System.err.printf("SetTriggerMode failed, errcode: [%#x]\n", nRet);
                break;
            }
            
            // Start grabbing
            nRet = MvCameraControl.MV_CC_StartGrabbing(hCamera);
            if (MV_OK != nRet) {
                System.err.printf("StartGrabbing failed, errcode: [%#x]\n", nRet);
                break;
            } else {
                System.err.printf("StartGrabbing  suc.\n");
            }


            Thread thread = new Thread(new Runnable() {
                @Override
                public void run() {
                    // Run the capture loop in this separate thread
                    GetImage();
                }
            });
            thread.start();

            scanner.useDelimiter("");
            System.out.println("Press Enter to exit.");

            while (flag) {
//                String input = scanner.nextLine();
//                if (scanner.hasNextLine()) {
//                    flag = false;
//                    break;
//
//                } else {
                    try {
                        Thread.sleep(1 * 10);
                    } catch (InterruptedException e) {
                        e.printStackTrace();
//                        flag = false;
//                        break;
                    }
//                }

            }


            // Stop grabbing
            nRet = MvCameraControl.MV_CC_StopGrabbing(hCamera);
            if (MV_OK != nRet) {
                System.err.printf("StopGrabbing failed, errcode: [%#x]\n", nRet);
                break;
            }

            // close device
            nRet = MvCameraControl.MV_CC_CloseDevice(hCamera);
            if (MV_OK != nRet) {
                System.err.printf("CloseDevice failed, errcode: [%#x]\n", nRet);
                break;
            }

        } while (false);

        if (null != hCamera) {
            // Destroy handle
            nRet = MvCameraControl.MV_CC_DestroyHandle(hCamera);
            if (MV_OK != nRet) {
                System.err.printf("DestroyHandle failed, errcode: [%#x]\n", nRet);
            }
        }
        MvCameraControl.MV_CC_Finalize();
        scanner.close();
        // Convert the captured frame images into a video at the requested output path
        ImageToVideo.main(videoSavePath);
        videoOverflag = true; // video conversion finished
        // Delete the temporary frame images ("D:/ImagesToVideo/videoImages/") after conversion
        String filesDele = "D:/ImagesToVideo/videoImages/";
        FileUtil.deleteAllFile(filesDele);
    }
}

The main flow of VideoSave: after Start grabbing, a new thread runs the loop that keeps saving images. Only when recording is stopped does while(flag) let execution continue, after which the camera stops grabbing, the device is closed and the handle destroyed. The saved images are then converted into a video at the configured path by ImageToVideo.main(videoSavePath), and finally the frame images saved during the loop are deleted.

When recording stops, the workstation management system starts polling to check whether the mini-program has finished converting the images into a video. Because the conversion takes a while, front-end polling is used to avoid a timeout on the video-fetch endpoint. Once the conversion is confirmed complete, i.e. VideoSave.videoOverflag is true, the front end calls the displayPicture endpoint to upload the locally saved video to the server's resource path, using the same resource mapping as for images. That completes the video feature.
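A small sketch of the kind of status endpoint the front end could poll; the path and class name are assumptions, videoOverflag is from the VideoSave class above.

import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

import jzy.api.VideoSave;

// Hypothetical polling endpoint: the front end keeps asking until the
// image-to-video conversion has finished (VideoSave.videoOverflag == true).
@RestController
public class RecordStatusController {

    @GetMapping("/camera/record/status")
    public boolean conversionFinished() {
        return VideoSave.videoOverflag;
    }
}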

The ImageToVideo utility class used for the image-to-video conversion:

package jzy.util;
import java.io.IOException;
public class ImageToVideo {
    public static void main(String videoSavePath) {
        // Input folder containing the captured frame images
        String inputFolder = "D:/ImagesToVideo/videoImages/";
        // FFmpeg command line: assemble the numbered JPEGs into an H.264 MP4 at 10 fps
        String command = "ffmpeg -framerate 10 -i " + inputFolder + "%d.jpg -c:v libx264 -pix_fmt yuv420p " + videoSavePath;
        try {
            // Run the FFmpeg command
            Process process = Runtime.getRuntime().exec(command);
            // Drain stderr and stdout in separate threads so FFmpeg never blocks on a full buffer
            new PrintStream(process.getErrorStream()).start();
            new PrintStream(process.getInputStream()).start();
            process.waitFor(); // wait for the command to finish
            System.out.println("Video generated successfully: " + videoSavePath);
        } catch (IOException e) {
            e.printStackTrace(); // I/O error while launching FFmpeg
        } catch (InterruptedException e) {
            e.printStackTrace(); // interrupted while waiting for FFmpeg
        }
    }
}

The PrintStream utility class that prints FFmpeg's output while the conversion runs:

package jzy.util;

public class PrintStream extends Thread {
    java.io.InputStream __is = null;
    public PrintStream(java.io.InputStream is) {
        __is = is;
    }
    public void run() {
        try {
            // Keep reading and printing until the stream ends, so the child
            // process's output buffer never fills up and blocks it
            while (true) {
                int _ch = __is.read();
                if (_ch != -1)
                    System.out.print((char) _ch);
                else break;
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

}

The FileUtil utility class whose deleteAllFile method removes the temporary images:

package jzy.util;

import java.io.File;

public class FileUtil {
    public static void deleteAllFile(String dir){
        File dirFile = new File(dir);
        // Check that dir exists and is a directory
        if (!dirFile.exists()||(!dirFile.isDirectory())) {
            return;
        }
        // Delete every file in the folder, including files in subfolders
        File[] files = dirFile.listFiles();
        for (int i = 0; i < files.length; i++) {
            if (files[i].isFile()) {
                // Plain file: delete it directly
                files[i].delete();
            }else if(files[i].isDirectory()){
                // Subdirectory: delete its contents recursively
                deleteAllFile(files[i].getAbsolutePath());
            }
        }
        // Intentionally keep the folder itself
//        dirFile.delete();
    }

}

Live view

There is currently no hard requirement in the project to show a live camera feed, so this part is still to be added. Roughly, the idea is to use the SDK's GrabImage.java to read frames in a loop and push them to the front end over WebSocket for real-time display.
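As a rough sketch only (nothing here is implemented in the project): each JPEG frame produced by a grab loop like GetImage() above could be pushed to the browser through Spring's WebSocket API. To keep the sketch independent of the SDK types, the frame is read back from the file the grab loop just wrote; the class and method names are assumptions.

import org.springframework.web.socket.BinaryMessage;
import org.springframework.web.socket.WebSocketSession;

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

// Illustrative helper: send one JPEG frame to a connected front-end WebSocket session.
public class LiveViewPusher {

    public static void pushFrame(WebSocketSession session, String jpegPath) throws IOException {
        byte[] jpeg = Files.readAllBytes(Paths.get(jpegPath));
        if (session.isOpen()) {
            session.sendMessage(new BinaryMessage(jpeg)); // front end renders it into an <img>/<canvas>
        }
    }
}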

FFmpeg

The recording feature relies on FFmpeg, a powerful set of tools and libraries for processing video, to turn the images into a video. See the reference below for download, installation and usage:

FFmpeg 简介及其下载安装步骤_ffmpeg安装-CSDN博客

When I first started using FFmpeg, it would sometimes stop responding for a long time (a deadlock). The problem was solved by following the article below, and the code examples given here now work correctly.

javaweb中ffmpeg视频转码h264出现卡住不执行的解决办法(看到最后面就是答案了)_java执行ffmpeg命令卡住-CSDN博客

The code shown earlier (draining both streams with PrintStream threads) resolves the deadlock. The JVM only provides a limited buffer; when the external program's (child process's) output exceeds that buffer and the parent process does not read the data, the child process blocks, waitFor() never returns, and you get a deadlock. On the Process class, getInputStream returns the child's standard output, getOutputStream its standard input, and getErrorStream its error stream. To be safe, read both the child's standard output and its error stream when consuming its output, which guarantees the buffers get drained.
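An alternative sketch that achieves the same thing without a separate drain class: merge stderr into stdout via ProcessBuilder and read the single stream before waitFor(). The class and method names here are only for illustration.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

// Alternative: redirect stderr into stdout and drain one stream, so the
// FFmpeg child process can never block on a full output buffer.
public class FfmpegRunner {

    public static int run(String... command) throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(command);
        pb.redirectErrorStream(true);                     // stderr goes into the same stream as stdout
        Process process = pb.start();
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(process.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);                 // keep emptying the buffer
            }
        }
        return process.waitFor();
    }
}

It could then be invoked with the FFmpeg arguments split into separate strings, e.g. FfmpegRunner.run("ffmpeg", "-framerate", "10", ...).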

I don't fully understand all the details yet, but it is impressive stuff; still working toward that level.

Packaging

After finishing the camera development, the program needs to be packaged as a jar and run on the workstation client. Because I imported the SDK by putting MvCameraControlWrapper.jar in the project's lib folder, the project packaged fine, but running the jar failed with an error that the MvCameraControlWrapper classes could not be found. The reason is that a jar imported this way is not installed into the Maven repository during packaging. After installing it into the local Maven repository as described in the reference below and repackaging, the program runs normally.

一键搞定!用Maven命令把本地JAR包装进Maven仓库_maven install jar包到仓库-CSDN博客

References

Python 实现海康机器人工业相机 MV-CU060-10GM 的实时显示视频流及拍照功能 - 龙凌云端 - 博客园
