Integrating the Hikvision SDK in Java and Pushing to ZLM4J for Ultra-Low-Latency Playback

Preface

I previously wrote an article on using JavaCV with the Hikvision and Dahua camera SDKs to pull streams and re-push them to an RTMP server for playback. That approach has noticeable latency, and the delay becomes especially obvious when operating the PTZ. So here I use ZLM4J to feed the raw stream data from the Hikvision SDK directly into ZLMediaKit; in my tests the resulting latency is essentially the same as viewing the camera through its own web interface.

Required resources

1. ZLM4J basics: see the ZLM4J documentation

2. Hikvision SDK (HCNetSDK) development basics. Both native libraries must be loadable at runtime; a minimal loading sketch follows this list.
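
The demo below assumes the ZLM4J native library (mk_api) and the Hikvision HCNetSDK libraries are already installed and resolvable by JNA. The following bootstrap sketch is not from the original project: the directory in it is a placeholder, and ZLMApi / HCNetSDK are the same JNA interfaces used in the demo code further down.

public class NativeBootstrap {
    public static void main(String[] args) {
        // JNA resolves Native.load(...) against jna.library.path, so point it at the
        // directories that contain libmk_api.so / mk_api.dll and the HCNetSDK libraries
        // before the first load. The path below is a placeholder.
        System.setProperty("jna.library.path", "D:/sdk/zlm;D:/sdk/hcnetsdk");
        ZLMApi zlm = com.sun.jna.Native.load("mk_api", ZLMApi.class);
        HCNetSDK hik = com.sun.jna.Native.load("HCNetSDK", HCNetSDK.class);
        System.out.println("Native libraries loaded");
    }
}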

Implementation

Logging in through the Hikvision SDK and setting up ZLM4J

public class RealPlayDemo {
    public static ZLMApi ZLM_API = Native.load("mk_api", ZLMApi.class);
    public static HCNetSDK hCNetSDK = Native.load("HCNetSDK", HCNetSDK.class);
    static int lUserID = 0;

    public static void main(String[] args) throws InterruptedException {
        // Initialize the ZLMediaKit environment
        ZLM_API.mk_env_init1(1, 1, 1, null, 0, 0, null, 0, null, null);
        // Start the HTTP server (returns 0 on failure, otherwise the port number)
        short http_server_port = ZLM_API.mk_http_server_start((short) 7788, 0);
        // Start the RTSP server (returns 0 on failure, otherwise the port number)
        short rtsp_server_port = ZLM_API.mk_rtsp_server_start((short) 7554, 0);
        // Start the RTMP server (returns 0 on failure, otherwise the port number)
        short rtmp_server_port = ZLM_API.mk_rtmp_server_start((short) 7935, 0);
        // Initialize the Hikvision SDK
        boolean initSuc = hCNetSDK.NET_DVR_Init();
        if (!initSuc) {
            System.out.println("Hikvision SDK initialization failed");
            return;
        }
        // Log in to the Hikvision device
        Login_V40("172.16.6.236", (short) 8000, "admin", "telit123");
        MK_INI mkIni = ZLM_API.mk_ini_create();
        ZLM_API.mk_ini_set_option(mkIni, "enable_rtsp", "1");
        ZLM_API.mk_ini_set_option(mkIni, "enable_rtmp", "1");
        // Create the media source
        MK_MEDIA mkMedia = ZLM_API.mk_media_create2("__defaultVhost__", "live", "test", 0, mkIni);
        // Release the ini handle
        ZLM_API.mk_ini_release(mkIni);
        // Resolution, frame rate and bitrate only need to be rough values here. Codec id: 0 = H.264, 1 = H.265.
        // The codec can be hard-coded, or detected inside the callback before initializing the track
        // (see the sketch after this class).
        ZLM_API.mk_media_init_video(mkMedia, 0, 2688, 1520, 25.0f, 2500);
        ZLM_API.mk_media_init_audio(mkMedia, 2, 8000, 1, 16);
        ZLM_API.mk_media_init_complete(mkMedia);
        // Create the Hikvision SDK real-time stream callback
        FRealDataCallback fRealDataCallBack = new FRealDataCallback(mkMedia, 25.0);
        HCNetSDK.NET_DVR_PREVIEWINFO netDvrPreviewinfo = new HCNetSDK.NET_DVR_PREVIEWINFO();
        netDvrPreviewinfo.lChannel = 1;
        netDvrPreviewinfo.dwStreamType = 0;
        netDvrPreviewinfo.bBlocked = 0;
        netDvrPreviewinfo.dwLinkMode = 0;
        netDvrPreviewinfo.byProtoType = 0;
        // Start the live preview
        long ret = hCNetSDK.NET_DVR_RealPlay_V40(lUserID, netDvrPreviewinfo, fRealDataCallBack, Pointer.NULL);
        if (ret == -1) {
            System.out.println("【海康SDK】开始sdk播放视频失败! 错误码:" + hCNetSDK.NET_DVR_GetLastError());
            return;
        }
        // Keep streaming for two minutes
        Thread.sleep(120000);
        // Release resources
        fRealDataCallBack.release();
        hCNetSDK.NET_DVR_StopRealPlay(ret);
        Logout();
    }

    /**
     * Log in to the device
     *
     * @param m_sDeviceIP device IP address
     * @param wPort       port number; the SDK login port defaults to 8000
     * @param m_sUsername username
     * @param m_sPassword password
     */
    public static void Login_V40(String m_sDeviceIP, short wPort, String m_sUsername, String m_sPassword) {
        /* Register */
        // Device login info
        HCNetSDK.NET_DVR_USER_LOGIN_INFO m_strLoginInfo = new HCNetSDK.NET_DVR_USER_LOGIN_INFO();
        // Device info
        HCNetSDK.NET_DVR_DEVICEINFO_V40 m_strDeviceInfo = new HCNetSDK.NET_DVR_DEVICEINFO_V40();
        m_strLoginInfo.sDeviceAddress = new byte[HCNetSDK.NET_DVR_DEV_ADDRESS_MAX_LEN];
        System.arraycopy(m_sDeviceIP.getBytes(), 0, m_strLoginInfo.sDeviceAddress, 0, m_sDeviceIP.length());
        m_strLoginInfo.wPort = wPort;
        m_strLoginInfo.sUserName = new byte[HCNetSDK.NET_DVR_LOGIN_USERNAME_MAX_LEN];
        System.arraycopy(m_sUsername.getBytes(), 0, m_strLoginInfo.sUserName, 0, m_sUsername.length());
        m_strLoginInfo.sPassword = new byte[HCNetSDK.NET_DVR_LOGIN_PASSWD_MAX_LEN];
        System.arraycopy(m_sPassword.getBytes(), 0, m_strLoginInfo.sPassword, 0, m_sPassword.length());
        // Asynchronous login: false = no, true = yes
        m_strLoginInfo.bUseAsynLogin = false;
        // The data is only written to native memory after write() is called
        m_strLoginInfo.write();

        lUserID = hCNetSDK.NET_DVR_Login_V40(m_strLoginInfo, m_strDeviceInfo);
        if (lUserID == -1) {
            System.out.println("Login failed, error code: " + hCNetSDK.NET_DVR_GetLastError());
            return;
        }
        System.out.println("Login succeeded!");
        // The structure only contains data after read() is called
        m_strDeviceInfo.read();
    }

    // Log out of the device and release the SDK
    public static void Logout() {
        if (lUserID >= 0) {
            if (!hCNetSDK.NET_DVR_Logout(lUserID)) {
                System.out.println("Logout failed, error code: " + hCNetSDK.NET_DVR_GetLastError());
            } else {
                System.out.println("Logout succeeded");
            }
        } else {
            System.out.println("Device is not logged in");
        }
        hCNetSDK.NET_DVR_Cleanup();
    }
}
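
As the comment before mk_media_init_video notes, the video track does not have to be hard-coded as H.264. Below is a minimal sketch of the alternative, assuming a helper like this lives inside the FRealDataCallback class shown next (it is not part of the original code); in that case the three mk_media_init_* calls in main() would move into the callback.

    // Called once the PSM has been parsed: videoType 0x1B = H.264, 0x24 = H.265.
    private boolean mediaInitialized = false;

    private void initTracksOnce(int videoType) {
        if (mediaInitialized) {
            return;
        }
        // Per the original comment, width/height/fps/bitrate only need to be rough values.
        int videoCodec = (videoType == 0x24) ? 1 : 0;  // 0 = H.264, 1 = H.265
        ZLM_API.mk_media_init_video(mkMedia, videoCodec, 2688, 1520, 25.0f, 2500);
        ZLM_API.mk_media_init_audio(mkMedia, 2, 8000, 1, 16);
        ZLM_API.mk_media_init_complete(mkMedia);
        mediaInitialized = true;
    }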

Hikvision SDK stream data callback

public class FRealDataCallback implements HCNetSDK.FRealDataCallBack_V30 {
    private final MK_MEDIA mkMedia;
    private final Memory buffer = new Memory(1024 * 1024 * 5);
    private int bufferSize = 0;
    private long pts;
    private double fps;
    private long time_base;
    private int videoType = 0;
    private int audioType = 0;


    public FRealDataCallback(MK_MEDIA mkMedia, double fps) {
        this.mkMedia = mkMedia;
        this.fps = fps;
        // ZLMediaKit works on a 1/1000 s (millisecond) time base; at 25 fps this gives 40 ms per frame
        time_base = (long) (1000 / fps);
        // Run all callbacks on the same dedicated thread
        Native.setCallbackThreadInitializer(this, new CallbackThreadInitializer(true, false, "HikRealStream"));
    }


    @Override
    public void invoke(long lRealHandle, int dwDataType, ByteByReference pBuffer, int dwBufSize, Pointer pUser) throws IOException {
        // PS-encapsulated stream data
        if (dwDataType == HCNetSDK.NET_DVR_STREAMDATA) {
            Pointer pointer = pBuffer.getPointer();
            int offset = 0;
            // Parse the PS pack header, PS system header and program stream map (PSM)
            offset = readPSHAndPSMAndPSMT(pointer, offset);
            // Read the PES packets
            readPES(pointer, offset);
        }
    }

    /**
     * Read a PES packet and its payload
     *
     * @param pointer
     * @param offset
     */
    private void readPES(Pointer pointer, int offset) {
        //pes header
        byte[] pesHeaderStartCode = new byte[3];
        pointer.read(offset, pesHeaderStartCode, 0, pesHeaderStartCode.length);
        if ((pesHeaderStartCode[0] & 0xFF) == 0x00 && (pesHeaderStartCode[1] & 0xFF) == 0x00 && (pesHeaderStartCode[2] & 0xFF) == 0x01) {
            offset = offset + pesHeaderStartCode.length;
            byte[] streamTypeByte = new byte[1];
            pointer.read(offset, streamTypeByte, 0, streamTypeByte.length);
            offset = offset + streamTypeByte.length;
            int streamType = streamTypeByte[0] & 0xFF;
            // Video stream (stream_id 0xE0-0xEF)
            if (streamType >= 0xE0 && streamType <= 0xEF) {
                // Video payload
                readVideoES(pointer, offset);
            } else if (streamType >= 0xC0 && streamType <= 0xDF) {
                // Audio payload (stream_id 0xC0-0xDF)
                readAudioES(pointer, offset);
            }
        }
    }

    /**
     * Read the video elementary stream data
     *
     * @param pointer
     * @param offset
     */
    private void readVideoES(Pointer pointer, int offset) {
        byte[] pesLengthByte = new byte[2];
        pointer.read(offset, pesLengthByte, 0, pesLengthByte.length);
        offset = offset + pesLengthByte.length;
        int pesLength = (pesLengthByte[0] & 0xFF) << 8 | (pesLengthByte[1] & 0xFF);
        // PES payload present
        if (pesLength > 0) {
            byte[] pts_dts_length_info = new byte[3];
            pointer.read(offset, pts_dts_length_info, 0, pts_dts_length_info.length);
            offset = offset + pts_dts_length_info.length;
            int pesHeaderLength = (pts_dts_length_info[2] & 0xFF);
            // Check whether a PTS is present (the DTS is ignored)
            int i = (pts_dts_length_info[1] & 0xFF) >> 6;
            if (i == 0x02 || i == 0x03) {
                //byte[] pts_dts = new byte[5];
                //pointer.read(offset, pts_dts, 0, pts_dts.length);
                // The PTS read here is on a 90 kHz time base and would have to be converted to 1/1000 s.
                // In practice the device PTS is not smooth enough and causes stuttering, so it is not used;
                // instead the PTS is simply advanced by one frame interval.
                //long pts_90000 = ((pts_dts[0] & 0x0e) << 29) | (((pts_dts[1] << 8 | pts_dts[2]) & 0xfffe) << 14) | (((pts_dts[3] << 8 | pts_dts[4]) & 0xfffe) >> 1);
                pts = time_base + pts;
            }
            offset = offset + pesHeaderLength;
            byte[] naluStart = new byte[5];
            pointer.read(offset, naluStart, 0, naluStart.length);
            // NALU start code (00 00 00 01)
            if ((naluStart[0] & 0xFF) == 0x00 && (naluStart[1] & 0xFF) == 0x00 && (naluStart[2] & 0xFF) == 0x00 && (naluStart[3] & 0xFF) == 0x01) {
                if (bufferSize != 0) {
                    // NALU type (lower 5 bits of the first NALU byte)
                    int naluType = (naluStart[4] & 0x1F);
                    // SPS/PPS should not advance the PTS, so roll it back by one frame interval
                    if (naluType == 7 || naluType == 8) {
                        pts = pts - time_base;
                    }
                    if (videoType == 0x1B) {
                        // Push the buffered H.264 frame into ZLMediaKit
                        ZLM_API.mk_media_input_h264(mkMedia, buffer.share(0), bufferSize, pts, pts);
                    } else if (videoType == 0x24) {
                        // Push the buffered H.265 frame into ZLMediaKit
                        ZLM_API.mk_media_input_h265(mkMedia, buffer.share(0), bufferSize, pts, pts);
                    }
                    bufferSize = 0;
                }
            }
            int naluLength = pesLength - pts_dts_length_info.length - pesHeaderLength;
            byte[] temp = new byte[naluLength];
            pointer.read(offset, temp, 0, naluLength);
            buffer.write(bufferSize, temp, 0, naluLength);
            bufferSize = naluLength + bufferSize;
        }
    }

    /**
     * Read the audio elementary stream data
     *
     * @param pointer
     * @param offset
     */
    private void readAudioES(Pointer pointer, int offset) {
        byte[] pesLengthByte = new byte[2];
        pointer.read(offset, pesLengthByte, 0, pesLengthByte.length);
        offset = offset + pesLengthByte.length;
        int pesLength = (pesLengthByte[0] & 0xFF) << 8 | (pesLengthByte[1] & 0xFF);
        // PES payload present
        if (pesLength > 0) {
            byte[] pts_dts_length_info = new byte[3];
            pointer.read(offset, pts_dts_length_info, 0, pts_dts_length_info.length);
            offset = offset + pts_dts_length_info.length;
            int pesHeaderLength = (pts_dts_length_info[2] & 0xFF);
            // Check whether a PTS is present (the DTS is ignored)
            int i = (pts_dts_length_info[1] & 0xFF) >> 6;
            long pts_90000 = 0;
            if (i == 0x02 || i == 0x03) {
                byte[] pts_dts = new byte[5];
                pointer.read(offset, pts_dts, 0, pts_dts.length);
                // The PTS in the PES header is on a 90 kHz time base
                pts_90000 = ((pts_dts[0] & 0x0e) << 29) | (((pts_dts[1] << 8 | pts_dts[2]) & 0xfffe) << 14) | (((pts_dts[3] << 8 | pts_dts[4]) & 0xfffe) >> 1);
                //pts = time_base + pts;
            }
            offset = offset + pesHeaderLength;
            int audioLength = pesLength - pts_dts_length_info.length - pesHeaderLength;
            byte[] bytes = G711ACodec._toPCM(pointer.getByteArray(offset, audioLength));
            Memory temp = new Memory(bytes.length);
            temp.write(0, bytes, 0, bytes.length);
            ZLM_API.mk_media_input_pcm(mkMedia, temp.share(0), bytes.length, pts_90000);
            temp.close();
        }
    }

    /**
     * Read the PS pack header, PS system header and program stream map (PSM)
     *
     * @param pointer
     * @param offset
     * @return the offset just past the headers that were parsed
     */
    private int readPSHAndPSMAndPSMT(Pointer pointer, int offset) {
        // PS pack header start code
        byte[] psHeaderStartCode = new byte[4];
        pointer.read(offset, psHeaderStartCode, 0, psHeaderStartCode.length);
        // Check for the pack header (00 00 01 BA)
        if ((psHeaderStartCode[0] & 0xFF) == 0x00 && (psHeaderStartCode[1] & 0xFF) == 0x00 && (psHeaderStartCode[2] & 0xFF) == 0x01 && (psHeaderStartCode[3] & 0xFF) == 0xBA) {
            byte[] stuffingLengthByte = new byte[1];
            offset = 13;
            pointer.read(offset, stuffingLengthByte, 0, stuffingLengthByte.length);
            int stuffingLength = stuffingLengthByte[0] & 0x07;
            offset = offset + stuffingLength + 1;
            // Next start code
            byte[] psSystemHeaderStartCode = new byte[4];
            pointer.read(offset, psSystemHeaderStartCode, 0, psSystemHeaderStartCode.length);
            // PS system header (00 00 01 BB)
            if ((psSystemHeaderStartCode[0] & 0xFF) == 0x00 && (psSystemHeaderStartCode[1] & 0xFF) == 0x00 && (psSystemHeaderStartCode[2] & 0xFF) == 0x01 && (psSystemHeaderStartCode[3] & 0xFF) == 0xBB) {
                offset = offset + psSystemHeaderStartCode.length;
                byte[] psSystemLengthByte = new byte[1];
                // PS system header length
                pointer.read(offset, psSystemLengthByte, 0, psSystemLengthByte.length);
                int psSystemLength = psSystemLengthByte[0] & 0xFF;
                // Skip the system header
                offset = offset + psSystemLength;
                pointer.read(offset, psSystemHeaderStartCode, 0, psSystemHeaderStartCode.length);
            }
            // If a PSM (00 00 01 BC) follows, this packet carries an IDR frame
            if ((psSystemHeaderStartCode[0] & 0xFF) == 0x00 && (psSystemHeaderStartCode[1] & 0xFF) == 0x00 && (psSystemHeaderStartCode[2] & 0xFF) == 0x01 && (psSystemHeaderStartCode[3] & 0xFF) == 0xBC) {
                offset = offset + psSystemHeaderStartCode.length;
                // PSM length
                byte[] psmLengthByte = new byte[2];
                pointer.read(offset, psmLengthByte, 0, psmLengthByte.length);
                int psmLength = (psmLengthByte[0] & 0xFF) << 8 | (psmLengthByte[1] & 0xFF);
                // Extract the video and audio stream types from the PSM (only done once)
                if (videoType == 0 || audioType == 0) {
                    // program_stream_info_length (descriptors to skip)
                    byte[] detailStreamLengthByte = new byte[2];
                    int tempOffset = offset + psmLengthByte.length + 2;
                    pointer.read(tempOffset, detailStreamLengthByte, 0, detailStreamLengthByte.length);
                    int detailStreamLength = (detailStreamLengthByte[0] & 0xFF) << 8 | (detailStreamLengthByte[1] & 0xFF);
                    tempOffset = detailStreamLength + detailStreamLengthByte.length + tempOffset + 2;
                    byte[] videoStreamTypeByte = new byte[1];
                    pointer.read(tempOffset, videoStreamTypeByte, 0, videoStreamTypeByte.length);
                    videoType = videoStreamTypeByte[0] & 0xFF;
                    tempOffset = tempOffset + videoStreamTypeByte.length + 1;
                    byte[] videoStreamDetailLengthByte = new byte[2];
                    pointer.read(tempOffset, videoStreamDetailLengthByte, 0, videoStreamDetailLengthByte.length);
                    int videoStreamDetailLength = (videoStreamDetailLengthByte[0] & 0xFF) << 8 | (videoStreamDetailLengthByte[1] & 0xFF);
                    tempOffset = tempOffset + videoStreamDetailLengthByte.length + videoStreamDetailLength;
                    byte[] audioStreamTypeByte = new byte[1];
                    pointer.read(tempOffset, audioStreamTypeByte, 0, audioStreamTypeByte.length);
                    audioType = audioStreamTypeByte[0] & 0xFF;
                }
                offset = offset + psmLengthByte.length + psmLength;
            }
        }
        return offset;
    }

    /**
     * Release resources
     */
    public void release() {
        ZLM_API.mk_media_release(mkMedia);
        buffer.close();
    }
}
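
For readability, here is a small reference sketch (not part of the original project) collecting the MPEG-PS markers the callback above checks for:

public final class PsMarkers {
    public static final int PACK_HEADER        = 0xBA; // 00 00 01 BA - PS pack header
    public static final int SYSTEM_HEADER      = 0xBB; // 00 00 01 BB - PS system header
    public static final int PROGRAM_STREAM_MAP = 0xBC; // 00 00 01 BC - PSM, present on IDR frames
    public static final int VIDEO_PES_MIN      = 0xE0; // video PES stream_id range E0..EF
    public static final int VIDEO_PES_MAX      = 0xEF;
    public static final int AUDIO_PES_MIN      = 0xC0; // audio PES stream_id range C0..DF
    public static final int AUDIO_PES_MAX      = 0xDF;
    public static final int STREAM_TYPE_H264   = 0x1B; // PSM stream_type for H.264
    public static final int STREAM_TYPE_H265   = 0x24; // PSM stream_type for H.265

    private PsMarkers() {
    }
}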

G.711 A-law codec

public class G711ACodec {
    private final static int SIGN_BIT = 0x80;
    private final static int QUANT_MASK = 0xf;
    private final static int SEG_SHIFT = 4;
    private final static int SEG_MASK = 0x70;
    static short[] seg_end = {0xFF, 0x1FF, 0x3FF, 0x7FF, 0xFFF, 0x1FFF, 0x3FFF, 0x7FFF};
    private final static int cClip = 32635;
    private static final byte[] aLawCompressTable = new byte[]{1, 1, 2, 2, 3, 3, 3,
            3, 4, 4, 4, 4, 4, 4, 4, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5,
            5, 5, 5, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6,
            6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 7, 7, 7, 7, 7, 7, 7, 7, 7,
            7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7,
            7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7,
            7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7};

    private static byte linearToALawSample(short sample) {
        int sign;
        int exponent;
        int mantissa;
        int s;

        sign = ((~sample) >> 8) & 0x80;
        if (!(sign == 0x80)) {
            sample = (short) -sample;
        }
        if (sample > cClip) {
            sample = cClip;
        }
        if (sample >= 256) {
            exponent = aLawCompressTable[(sample >> 8) & 0x7F];
            mantissa = (sample >> (exponent + 3)) & 0x0F;
            s = (exponent << 4) | mantissa;
        } else {
            s = sample >> 4;
        }
        s ^= (sign ^ 0x55);
        return (byte) s;
    }

    /**
     * PCM to G.711 A-law
     *
     * @param src raw PCM data
     * @return encoded data; the result is half the length of the input
     */
    public static byte[] encode(byte[] src) {
        int j = 0;
        int len = src.length;
        int count = len / 2;
        byte[] res = new byte[count];
        short sample = 0;
        for (int i = 0; i < count; i++) {
            sample = (short) (((src[j++] & 0xff) | (src[j++]) << 8));
            res[i] = linearToALawSample(sample);
        }
        return res;
    }

    static short search(short val, short[] table, short size) {

        for (short i = 0; i < size; i++) {
            if (val <= table[i]) {
                return i;
            }
        }
        return size;
    }

    public static byte linear2alaw(short pcm_val) {
        short mask;
        short seg;
        char aval;
        if (pcm_val >= 0) {
            mask = 0xD5;
        } else {
            mask = 0x55;
            pcm_val = (short) (-pcm_val - 1);
        }

        /* Convert the scaled magnitude to segment number. */
        seg = search(pcm_val, seg_end, (short) 8);

        /* Combine the sign, segment, and quantization bits. */

        if (seg >= 8) {
            /* out of range, return maximum value. */
            return (byte) (0x7F ^ mask);
        } else {
            aval = (char) (seg << SEG_SHIFT);
            if (seg < 2) {
                aval |= (pcm_val >> 4) & QUANT_MASK;
            } else {
                aval |= (pcm_val >> (seg + 3)) & QUANT_MASK;
            }
            return (byte) (aval ^ mask);
        }
    }


    public static short alaw2linear(byte a_val) {
        short t;
        short seg;

        a_val ^= 0x55;

        t = (short) ((a_val & QUANT_MASK) << 4);
        seg = (short) ((a_val & SEG_MASK) >> SEG_SHIFT);
        switch (seg) {
            case 0:
                t += 8;
                break;
            case 1:
                t += 0x108;
                break;
            default:
                t += 0x108;
                t <<= seg - 1;
        }
        return (a_val & SIGN_BIT) != 0 ? t : (short) -t;
    }

    // G.711 A-law to PCM
    public static byte[] _toPCM(byte[] g711data) {
        byte[] pcmdata = new byte[g711data.length * 2];
        for (int i = 0, k = 0; i < g711data.length; i++) {
            short v = alaw2linear(g711data[i]);
            pcmdata[k++] = (byte) (v & 0xff);
            pcmdata[k++] = (byte) ((v >> 8) & 0xff);
        }
        return pcmdata;
    }

    // PCM to G.711 A-law
    public static byte[] _fromPCM(byte[] pcmData) {
        return encode(pcmData);
    }

    public byte[] toPCM(byte[] data) {
        byte[] temp;
        // If the first four bytes look like a HiSilicon audio header (00 01 <len> 00), strip it
        if (data[0] == 0x00 && data[1] == 0x01 && (data[2] & 0xff) == (data.length - 4) / 2 && data[3] == 0x00) {
            temp = new byte[data.length - 4];
            System.arraycopy(data, 4, temp, 0, temp.length);
        } else temp = data;

        return _toPCM(temp);
    }

    public byte[] fromPCM(byte[] data) {
        return encode(data);
    }
}
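
A quick usage sketch for the codec above (not from the original post): G.711 A-law carries 8-bit samples, so decoding to 16-bit PCM doubles the byte count and encoding halves it again. At the 8 kHz sample rate used in the demo, 160 bytes correspond to 20 ms of audio.

public class G711ACodecDemo {
    public static void main(String[] args) {
        // 160 G.711A bytes = 160 samples = 20 ms at 8 kHz (sample values here are arbitrary).
        byte[] g711Frame = new byte[160];
        byte[] pcm = G711ACodec._toPCM(g711Frame);   // 320 bytes of 16-bit little-endian PCM
        byte[] back = G711ACodec._fromPCM(pcm);      // 160 bytes of G.711A again
        System.out.println(pcm.length + " / " + back.length); // prints 320 / 160
    }
}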

# Summary
The most tedious part is parsing the PS stream inside the Hikvision SDK real-data callback. This article can also serve as a tutorial on creating a stream source with ZLM4J.

The complete project code has been pushed to Gitee: [project repository](https://gitee.com/daofuli/zlm4j_hk)
# Follow-up
For playback of recorded footage, see: [Java + Hikvision SDK pushed to ZLM4J: playback streaming](https://blog.csdn.net/weixin_42054936/article/details/138353050)