H5 playback directly in the browser, no extra plugins needed: Spring Boot + FFmpeg + Netty integrating Hikvision and Dahua devices into a WebSocket FLV video stream output

How it works

The general idea

The tricky part is getting an FLV stream out of the Hikvision devices. Most articles online use FFmpeg to push the Hikvision RTSP stream to RTMP and let nginx distribute it; because of my business scenario, web playback here has to go over WebSocket.

The Dahua SDK can hand you an FLV stream directly, so that side is fairly simple: just push the binary FLV stream from the callback into a buffer. But watch out, there is a pitfall here!

Dahua reference: "Dahua camera real-time preview (spring boot + websocket + flv.js), Java development"
IM / Netty integration reference: https://github.com/tokyohost/ruoyi-im

Note that the Dahua real-time preview data callback NetSDKLib.fRealDataCallBackEx must not perform blocking operations, otherwise you will see all sorts of strange errors!
Sending the stream over WebSocket is network I/O, and it will block!
So push the data through a buffer pool instead, and let the pool run its own thread that polls the queue and sends the stream (a minimal sketch of this pattern follows).
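A minimal sketch of this hand-off pattern (class and method names here are illustrative, not the project's real ones): the SDK callback only enqueues the chunk, while a single daemon thread drains the queue and performs the blocking WebSocket write.

import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class NonBlockingHandOff {
    private final BlockingQueue<byte[]> queue = new LinkedBlockingQueue<>(4096);

    public NonBlockingHandOff() {
        Thread sender = new Thread(() -> {
            try {
                while (true) {
                    byte[] chunk = queue.take();   // blocks only this sender thread
                    sendOverWebSocket(chunk);      // blocking network I/O is fine here
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }, "flv-sender");
        sender.setDaemon(true);
        sender.start();
    }

    /** Called from the SDK callback thread; must return immediately. */
    public void onSdkData(byte[] chunk) {
        queue.offer(chunk);   // drops the chunk when the queue is full, never blocks the callback
    }

    private void sendOverWebSocket(byte[] chunk) {
        // hypothetical placeholder for the real Netty writeAndFlush(new BinaryWebSocketFrame(...))
    }
}

The VideoBuffer below uses a ConcurrentLinkedQueue plus a polling thread instead of a BlockingQueue, but the principle is the same: the SDK callback never waits on the network.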

All of the code below is for reference only! Since the project is sensitive, only the core implementation is shown; for channelContext management and the related operations, please refer to the implementation at https://github.com/tokyohost/ruoyi-im

Stream buffer implementation
Hikvision and Dahua need separate buffer instances; do not share one, or you will not be able to tell whose packets are whose.
VideoBuffer.java

package com.device.service;

import com.device.Model.BufferCtxDh;
import com.netsdk.lib.NetSDKLib;
import com.xim.server.constants.BussinessType;
import com.xim.server.constants.SocketConstants;
import com.xim.server.store.ChannelStore;
import io.netty.buffer.ByteBuf;
import io.netty.channel.ChannelFuture;
import io.netty.channel.ChannelFutureListener;
import io.netty.channel.ChannelHandlerContext;
import io.netty.handler.codec.http.websocketx.BinaryWebSocketFrame;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;

import java.util.*;
import java.util.concurrent.*;
import java.util.stream.Collectors;

/**
 * @author xuehui_li
 * @Version 1.0
 * @date 2024/7/25 16:59
 * @Content
 */

@Component
@Slf4j
public class VideoBuffer {
    ConcurrentLinkedQueue<BufferCtxDh> bufferQueue = new ConcurrentLinkedQueue<>();
    ThreadPoolExecutor sendBufferPool = new ThreadPoolExecutor(1, 1, 300L, TimeUnit.SECONDS, new LinkedBlockingQueue<>(512*1024*1024), new ThreadPoolExecutor.DiscardPolicy());

    @Autowired
    ChannelStore channelStore;

    @Autowired
    DhDeviceService dhDeviceService;


    public synchronized void putBuffer(BufferCtxDh bufferCtxDh) {

        bufferQueue.add(bufferCtxDh);
    }

    public boolean checkQueueHasNonEmpty(NetSDKLib.LLong handle) {
        for (BufferCtxDh bufferCtxDh : this.bufferQueue) {
            if(Objects.nonNull(bufferCtxDh.getLRealHandle()) && bufferCtxDh.getLRealHandle().equals(handle)){
                return true;
            }
        }
        return false;
    }

    public VideoBuffer() {
        Thread thread = new Thread(new Runnable() {
            @Override
            public void run() {
                while (true) {
                    if (!bufferQueue.isEmpty()) {
                        BufferCtxDh poll = bufferQueue.poll();
                        Optional<HashSet<ChannelHandlerContext>> byId = channelStore.findById(poll.getCtxKey());
                        if (byId.isPresent()) {
                            HashSet<ChannelHandlerContext> channelHandlerContexts = byId.get();
                            // only forward to the WebSocket connections that are in real-time preview mode
                            List<ChannelHandlerContext> collect = channelHandlerContexts.stream().filter(item -> BussinessType.REAL_PLAY.getType().equalsIgnoreCase(item.channel().attr(SocketConstants.BUSSINESS_TYPE).get())).collect(Collectors.toList());
                            SendBufferTask sendBufferTask = new SendBufferTask(poll, collect);
                            sendBufferPool.submit(sendBufferTask);
                            // release the ByteBuf: the release task runs after the send task on the same single-threaded pool
                            BufferCtxDh bufferCtxDh = new BufferCtxDh();
                            bufferCtxDh.setByteBuf(poll.getByteBuf());
                            bufferCtxDh.setRelease(true);
                            SendBufferTask relaseBufferTask = new SendBufferTask(bufferCtxDh, null);
                            sendBufferPool.submit(relaseBufferTask);
                            long count = channelHandlerContexts.stream().filter(item -> BussinessType.REAL_PLAY.getType().equalsIgnoreCase(item.channel().attr(SocketConstants.BUSSINESS_TYPE).get())).count();
                            if (count == 0) {
                                dhDeviceService.stopRealPlay(poll.getLRealHandle());
                                log.info("video stream ended (1)");
                            }
                        } else {
                            dhDeviceService.stopRealPlay(poll.getLRealHandle());
                            log.info("video stream ended (2)");
                        }
                    } else {
                        // back off briefly so this loop does not busy-spin while the queue is empty
                        try {
                            Thread.sleep(1);
                        } catch (InterruptedException e) {
                            Thread.currentThread().interrupt();
                            break;
                        }
                    }

                }
            }
        });
        thread.setDaemon(true);
        thread.start();
        log.info("VideoBuffer pool started");
    }
    class SendBufferTask implements Runnable {

        BufferCtxDh bufferCtxDh;
        List<ChannelHandlerContext> channelHandlerContext;


        public SendBufferTask(BufferCtxDh bufferCtxDh, List<ChannelHandlerContext> channelHandlerContext) {
            this.bufferCtxDh = bufferCtxDh;
            this.channelHandlerContext = channelHandlerContext;
        }

        @Override
        public void run() {

            if (bufferCtxDh.isRelease()) {
                bufferCtxDh.getByteBuf().release();
            }else{
//                log.warn("sending packet #{}, {} connections, {} packets queued in the pool", bufferCtxDh.getSort(), channelHandlerContext.size(), sendBufferPool.getQueue().size());
                for (ChannelHandlerContext handlerContext : channelHandlerContext) {
                    synchronized (handlerContext) {
                        ByteBuf byteBuf = bufferCtxDh.getByteBuf();
                        ByteBuf copy = byteBuf.copy();
                        BinaryWebSocketFrame binaryWebSocketFrame = new BinaryWebSocketFrame(copy);
                        Object o = handlerContext.attr(SocketConstants.REAL_PLAY_HANDLE).get();
                        if (handlerContext.channel().isActive() && bufferCtxDh.getLRealHandle().equals(o)) {
                            ChannelFuture channelFuture = handlerContext.channel().writeAndFlush(binaryWebSocketFrame);
                            channelFuture.addListener(new ChannelFutureListener() {
                                @Override
                                public void operationComplete(ChannelFuture channelFuture) throws Exception {
                                    if (!channelFuture.isSuccess()) {
                                        Throwable cause = channelFuture.cause();
                                        log.error("send failed", cause);
                                    }
                                }
                            });
                        }
                    }

                    }

            }

        }
    }
}

BufferCtxDh is the wrapper for one FLV stream packet; it mainly records which connection the packet should go to and the packet order. Note!
FLV packets must be sent in order, otherwise the frontend cannot play the stream! That is why only a single thread dispatches the stream (see the small ordering demo after BufferCtx.java below).
BufferCtx.java

package com.device.common.model;

import io.netty.buffer.ByteBuf;
import lombok.Data;

/**
 * @author xuehui_li
 * @Version 1.0
 * @date 2024/7/29 16:10
 * @Content
 */

@Data
public class BufferCtx {
    ByteBuf byteBuf;
    String ctxKey;
    Long sort;

    boolean release = false;
}
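The ordering guarantee mentioned above comes from using a single worker thread with a FIFO queue: a ThreadPoolExecutor(1, 1, ...) like the one in VideoBuffer (or Executors.newSingleThreadExecutor()) runs tasks strictly in submission order. A tiny demonstration of that property:

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class OrderedDispatchDemo {
    public static void main(String[] args) {
        // a single worker thread drains its FIFO queue, so tasks run in submission order
        ExecutorService sender = Executors.newSingleThreadExecutor();
        for (int packetNo = 1; packetNo <= 5; packetNo++) {
            final int n = packetNo;
            sender.submit(() -> System.out.println("send packet " + n));   // always prints 1..5 in order
        }
        sender.shutdown();
    }
}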

The Dahua preview returns a handle of type LLong, while Hikvision uses a custom UUID, so the shared fields are pulled up into a common base class.
Dahua packet class
BufferCtxDh.java

package com.device.Model;

import com.netsdk.lib.NetSDKLib;
import lombok.Data;

/**
 * @author xuehui_li
 * @Version 1.0
 * @date 2024/7/25 17:12
 * @Content
 */

@Data
public class BufferCtxDh extends com.device.common.model.BufferCtx {
    NetSDKLib.LLong lRealHandle;
}

Hikvision packet class
BufferCtxHk.java

package com.device.hk.model;

import lombok.Data;

/**
 * @author xuehui_li
 * @Version 1.0
 * @date 2024/7/25 17:12
 * @Content
 */

@Data
public class BufferCtxHk extends com.device.common.model.BufferCtx {

    String lRealPlay;
}

Core code for wrapping and sending the Dahua stream

package com.device.service;

/**
 * @author xuehui_li
 * @Version 1.0
 * @date 2024/7/24 11:41
 * @Content
 */

import com.device.Model.BufferCtxDh;
import com.netsdk.lib.NetSDKLib;
import com.sun.jna.Pointer;
import com.xim.server.store.ChannelStore;
import io.netty.buffer.ByteBuf;
import io.netty.buffer.Unpooled;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;

import java.nio.ByteBuffer;
import java.util.List;
import java.util.concurrent.atomic.AtomicLong;

/**
 * Real-time preview data callback (extended). The pBuffer memory is allocated and released internally by the SDK.
 */
@Component
@Slf4j
public class CbfRealDataCallBackEx implements NetSDKLib.fRealDataCallBackEx {

    @Autowired
    ChannelStore channelStore;

    @Autowired
    DhDeviceService dhDeviceService;

    @Autowired
    VideoBuffer videoBuffer;
    public CbfRealDataCallBackEx() {
    }

    AtomicLong packageSort = new AtomicLong(0);
    @Override
    public void invoke(NetSDKLib.LLong lRealHandle, int dwDataType, Pointer pBuffer,
                       int dwBufSize, int param, Pointer dwUser) {
        // copy the native buffer contents into a byte array
        byte[] bytes = pBuffer.getByteArray(0, dwBufSize);
        if (dwDataType == 1005) { // dwDataType 1005: the callback data is an FLV stream
            /**
             * Send the stream data.
             * pBuffer.getByteBuffer(0, dwBufSize) returns a ByteBuffer that wraps a native pointer, so its data lives in native memory,
             * while the data sent over WebSocket must live in the ByteBuffer's backing array (hb); the ByteBuffer obtained from
             * pBuffer has hb == null. So read pBuffer into a byte array first and wrap it in a new ByteBuffer manually.
             */
            ByteBuffer buffer = ByteBuffer.wrap(bytes);
            ByteBuf bufferNetty = Unpooled.copiedBuffer(buffer);

            // send over WebSocket
            // ctxKey is the custom key of the ChannelHandlerContext, assigned when the channel is registered
            List<String> ctxKey = dhDeviceService.getCtxKeyByHandleId(lRealHandle.longValue());
            for (String key : ctxKey) {
                BufferCtxDh bufferCtxDh = new BufferCtxDh();
                bufferCtxDh.setCtxKey(key);
                bufferCtxDh.setLRealHandle(lRealHandle);
                bufferCtxDh.setByteBuf(bufferNetty);
                // for debugging only
//                bufferCtxDh.setSort(packageSort.incrementAndGet());

                videoBuffer.putBuffer(bufferCtxDh);
            }


        }
    }
}
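To make the hb / native-memory remark in the comment above concrete, here is a tiny standalone JNA sketch (it uses a Memory block as a stand-in for the SDK's pBuffer):

import com.sun.jna.Memory;

import java.nio.ByteBuffer;

public class PointerBufferDemo {
    public static void main(String[] args) {
        // stand-in for the SDK's pBuffer: 16 bytes of native memory
        Memory p = new Memory(16);
        int len = 16;

        // direct view over the native memory: no backing byte[] array
        ByteBuffer direct = p.getByteBuffer(0, len);
        System.out.println("direct hasArray = " + direct.hasArray());   // false

        // copy into the JVM heap: backed by an accessible byte[] array
        ByteBuffer heap = ByteBuffer.wrap(p.getByteArray(0, len));
        System.out.println("heap hasArray = " + heap.hasArray());       // true
    }
}

This is exactly the distinction the callback works around by copying the bytes before handing them to Netty.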

Hikvision devices are a bit more troublesome than Dahua: the RTSP stream is pulled with FFmpeg and then remuxed into an FLV stream.
Pulling the RTSP stream follows almost entirely
https://github.com/banmajio/RTSPtoHTTP-FLV

Special handling is added when pushing the stream:
CameraPush.java

package com.device.hk.push;

import com.device.hk.alarm.HkDeviceManage;
import com.device.hk.config.Config;
import com.device.hk.handle.HkVideoBuffer;
import com.device.hk.model.BufferCtxHk;
import com.device.hk.model.CameraPojo;
import com.ruoyi.common.utils.spring.SpringUtils;
import io.netty.buffer.ByteBuf;
import io.netty.buffer.Unpooled;
import lombok.Data;
import org.bytedeco.ffmpeg.avcodec.AVPacket;
import org.bytedeco.ffmpeg.avformat.AVFormatContext;
import org.bytedeco.ffmpeg.global.avcodec;
import org.bytedeco.ffmpeg.global.avutil;

import org.bytedeco.javacv.FFmpegFrameGrabber;
import org.bytedeco.javacv.FFmpegFrameRecorder;
import org.bytedeco.javacv.FFmpegLogCallback;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.context.ApplicationContext;

import java.io.ByteArrayOutputStream;
import java.nio.ByteBuffer;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.concurrent.atomic.AtomicLong;

import static org.bytedeco.ffmpeg.global.avcodec.av_packet_unref;

/**
 * @author banmajio
 * @Title RtmpPush.java
 * @description javacv pushes the data frames
 * @time 2020-03-17 14:32:42
 **/
@Data
public class CameraPush {
    private final static Logger logger = LoggerFactory.getLogger(CameraPush.class);
    private static Config config;

    /**
     * @description: obtain the Config bean from the ApplicationContext
     * @author: banmajio
     */
    public static void setApplicationContext(ApplicationContext applicationContext) {
        config = applicationContext.getBean(Config.class);
    }

    /**
     * @description: device info
     * @author: banmajio
     * @date: 2023/8/30 09:43
     */
    private CameraPojo pojo;
    /**
     * @description: recorder (FFmpeg encoder/muxer)
     * @author: banmajio
     * @date: 2023/8/30 09:43
     */
    private FFmpegFrameRecorder recorder;
    /**
     * @description: grabber (RTSP capture)
     * @author: banmajio
     * @date: 2023/8/30 09:43
     */
    private FFmpegFrameGrabber grabber;
    private AtomicLong sort = new AtomicLong(0);
    /**
     * @description: number of errors that occurred while pushing the stream
     * @author: banmajio
     * @date: 2023/8/30 09:43
     */
    private int errIndex = 0;
    /**
     * @description: exit code: 0 = normal exit; 1 = manual interrupt
     * @author: banmajio
     * @date: 2023/8/30 09:44
     */
    private int exitCode = 0;
    /**
     * @description: frame rate
     * @author: banmajio
     * @date: 2023/8/30 09:44
     */
    private double frameRate = 0;

    public CameraPush(CameraPojo pojo) {
        this.pojo = pojo;
    }

    /**
     * @return void
     * @Title: release
     * @Description: release resources
     **/
    public void release() {
        try {
            grabber.stop();
            grabber.close();
            if (recorder != null) {
                recorder.stop();
                recorder.release();
            }
        } catch (Exception e) {
            logger.error(e.getMessage());
        }
    }

    /**
     * @return void
     * @Title: push
     * @Description: push the video stream packets
     **/
    public void push() {
        Thread pushBufferThread = null;
        try {
            avutil.av_log_set_level(avutil.AV_LOG_INFO);
            FFmpegLogCallback.set();
            grabber = new FFmpegFrameGrabber(pojo.getRtsp());
            grabber.setOption("rtsp_transport", "tcp");
            // grabber connection/read timeout ("stimeout", in microseconds)
            grabber.setOption("stimeout", "2000000");
            if ("sub".equals(pojo.getStream())) {
                grabber.start(config.getSubCode());
            } else if ("main".equals(pojo.getStream())) {
                grabber.start(config.getMainCode());
            } else {
                grabber.start(config.getMainCode());
            }

            // Some cameras report a bogus frame rate (e.g. 9000) in the stream info, which breaks the dts/pts calculation and makes the stream unplayable, so fall back to 25 fps when the reported frame rate looks wrong
            if (grabber.getFrameRate() > 0 && grabber.getFrameRate() < 100) {
                frameRate = grabber.getFrameRate();
            } else {
                frameRate = 25.0;
            }
            int width = grabber.getImageWidth();
            int height = grabber.getImageHeight();
            // if the frame size is 0, pulling the stream failed; stop here
            if (width == 0 && height == 0) {
                logger.error(pojo.getRtsp() + " failed to pull the stream!");
                grabber.stop();
                grabber.close();
                release();
                return;
            }
            final ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
            recorder = new FFmpegFrameRecorder(byteArrayOutputStream, grabber.getImageWidth(), grabber.getImageHeight());
//            recorder = new FFmpegFrameRecorder(pojo.getRtmp(), grabber.getImageWidth(), grabber.getImageHeight());
            recorder.setInterleaved(true);
            // GOP size (keyframe interval), usually equal to the frame rate or twice the frame rate
            recorder.setGopSize((int) frameRate * 2);
            // video frame rate (keep at least 25 for acceptable quality; below 25 the picture flickers)
            recorder.setFrameRate(frameRate);
            // set the bitrate
//            recorder.setVideoBitrate(grabber.getVideoBitrate());
            recorder.setVideoBitrate(200000);
            // mux as FLV
            recorder.setFormat("flv");
            // H.264 encoder/decoder
            recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
            recorder.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);

            recorder.setMaxDelay(500);
//            recorder.setGopSize(10);

            Map<String, String> videoOption = new HashMap<>();

            // this option reduces latency
            videoOption.put("tune", "zerolatency");
            /**
             * Trade-off between quality and encoding speed. Values:
             * ultrafast, superfast, veryfast, faster, fast, medium, slow, slower, veryslow.
             * ultrafast gives the least compression (lowest encoder CPU) and the largest stream;
             * veryslow gives the best compression (highest encoder CPU) and the smallest stream.
             */
            videoOption.put("preset", "ultrafast");
            // quality parameter (CRF), 0-51; 18-28 is a reasonable range
            videoOption.put("crf", "28");
            recorder.setOptions(videoOption);
            AVFormatContext fc = grabber.getFormatContext();
            recorder.start(fc);
            logger.debug("start pushing stream, device info: [ip:" + pojo.getIp() + " channel:" + pojo.getChannel() + " stream:"
                    + pojo.getStream() + " starttime:" + pojo.getStarttime() + " endtime:" + pojo.getEndTime()
                    + " rtsp:" + pojo.getRtsp() + " url:" + pojo.getUrl() + "]");
            // flush the buffer left over from stream probing
            grabber.flush();

            HkVideoBuffer hkVideoBuffer = SpringUtils.getBean(HkVideoBuffer.class);
            pushBufferThread = new Thread(new Runnable() {
                @Override
                public void run() {
                    HkDeviceManage hkDeviceManage = null;
                    // When ffmpeg starts pushing, only the first packet contains the FLV header/metadata, so it has to be saved.
                    // Otherwise, WebSocket connections added later still receive packets, but the frontend player cannot tell
                    // the data is FLV and actively closes the ws connection.
                    // https://www.jianshu.com/p/f2b31ddcf200
                    // Adobe Flash Video File Format Specification Version 10.1
                    Object lock = new Object();
                    ByteBuf onMetaDataPackage = null;
                    Boolean firstMetaData = true;
                    while (!Thread.currentThread().isInterrupted()) {
                        if (byteArrayOutputStream.size() >= 2000) {
                            if (hkDeviceManage == null) {
                                hkDeviceManage =  SpringUtils.getBean(HkDeviceManage.class);
                            }
                            final List<String> ctxKeyByHandleId = hkDeviceManage.getCtxKeyByHandleId(pojo.getToken());
                            byte[] byteArray;
                            synchronized (byteArrayOutputStream) {
                                byteArray = byteArrayOutputStream.toByteArray();
                                byteArrayOutputStream.reset();
                            }
                            ByteBuffer buffer = ByteBuffer.wrap(byteArray);
                            ByteBuf bufferNetty = Unpooled.copiedBuffer(buffer);
                            if (firstMetaData) {
                                synchronized (lock) { // lock on the dedicated monitor object rather than on the Boolean flag itself
                                    if (firstMetaData) {
                                        onMetaDataPackage = bufferNetty.copy();
                                        firstMetaData = false;
                                        for (String ctxkey : ctxKeyByHandleId) {
                                            hkDeviceManage.getMetaDataMap().put(ctxkey, onMetaDataPackage);
                                        }
                                    }
                                }

                            }
                            for (String ctxkey : ctxKeyByHandleId) {
                                BufferCtxHk bufferCtxHk = new BufferCtxHk();
                                bufferCtxHk.setCtxKey(ctxkey);
                                bufferCtxHk.setByteBuf(bufferNetty);
                                bufferCtxHk.setLRealPlay(pojo.getToken());
                                bufferCtxHk.setRelease(false);
                                bufferCtxHk.setSort(sort.incrementAndGet());

                                // hand the packet off to the WebSocket buffer
                                hkVideoBuffer.putBuffer(bufferCtxHk);
                            }
                        } else {
                            // nothing buffered yet; back off briefly instead of busy-spinning
                            try {
                                Thread.sleep(5);
                            } catch (InterruptedException e) {
                                Thread.currentThread().interrupt();
                            }
                        }
                    }

                }
            });
            pushBufferThread.start();

            AVPacket pkt;
            long dts = 0;
            long pts = 0;
            int timebase = 0;
            for (int noFrameIndex = 0; noFrameIndex < 5 && errIndex < 5; ) {
                long time1 = System.currentTimeMillis();
                if (exitCode == 1) {
                    break;
                }
                pkt = grabber.grabPacket();
                if (pkt == null || pkt.size() == 0 || pkt.data() == null) {
                    // empty packet: count it and skip
                    logger.warn("JavaCV returned an empty packet, device info: [ip:" + pojo.getIp() + " channel:" + pojo.getChannel() + " stream:"
                            + pojo.getStream() + " starttime:" + pojo.getStarttime() + " endtime:" + pojo.getEndTime()
                            + " rtsp:" + pojo.getRtsp() + " url:" + pojo.getUrl() + "]");
                    noFrameIndex++;
                    continue;
                }
                // drop audio packets
                if (pkt.stream_index() == 1) {
                    av_packet_unref(pkt);
                    continue;
                }

                // Rewrite dts/pts: the SDK data does not restart its timestamps from 0 each time, which otherwise breaks resumed playback in the player
                pkt.pts(pts);
                pkt.dts(dts);
                errIndex += (recorder.recordPacket(pkt) ? 0 : 1);
                // accumulate pts/dts
                timebase = grabber.getFormatContext().streams(pkt.stream_index()).time_base().den();
                pts += timebase / (int) frameRate;
                dts += timebase / (int) frameRate;
                // decrement the packet buffer's reference count and reset the other packet fields; the buffer is freed automatically once the count reaches 0
                av_packet_unref(pkt);

                long endTime = System.currentTimeMillis();
                if ((long) (1000 / frameRate) - (endTime - time1) > 0) {
                    Thread.sleep((long) (1000 / frameRate) - (endTime - time1));
                }
            }
        } catch (Exception e) {
            logger.error("stream push error", e);
        } finally {
            release();
            if (pushBufferThread != null) {
                // the buffer thread checks its interrupt flag, so interrupt() is enough to stop it
                pushBufferThread.interrupt();
            }
            SpringUtils.getBean(HkDeviceManage.class).stopRealPlay(null, pojo.getToken());
            logger.info("push finished, device info: [ip:" + pojo.getIp() + " channel:" + pojo.getChannel() + " stream:"
                    + pojo.getStream() + " starttime:" + pojo.getStarttime() + " endtime:" + pojo.getEndTime()
                    + " rtsp:" + pojo.getRtsp() + " url:" + pojo.getUrl() + "]");
        }
    }
}

Instead of pushing to nginx, the FLV output is written into a ByteArrayOutputStream; a separate thread then drains that ByteArrayOutputStream, wraps the bytes into packets, and pushes them into hkVideoBuffer, which distributes them to the WebSocket connections (a distilled sketch of the idea follows).
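A distilled sketch of that idea (not the full CameraPush flow; the RTSP URL is only illustrative): FFmpegFrameRecorder can mux straight into an in-memory OutputStream with the "flv" format, so no RTMP/nginx hop is needed, and a drain thread can ship whatever has accumulated.

import org.bytedeco.ffmpeg.avcodec.AVPacket;
import org.bytedeco.ffmpeg.global.avcodec;
import org.bytedeco.javacv.FFmpegFrameGrabber;
import org.bytedeco.javacv.FFmpegFrameRecorder;

import java.io.ByteArrayOutputStream;

public class RtspToFlvSketch {
    public static void main(String[] args) throws Exception {
        String rtspUrl = "rtsp://user:pass@192.168.1.64:554/Streaming/Channels/101"; // illustrative URL
        FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(rtspUrl);
        grabber.setOption("rtsp_transport", "tcp");
        grabber.start();

        ByteArrayOutputStream flvOut = new ByteArrayOutputStream();
        FFmpegFrameRecorder recorder =
                new FFmpegFrameRecorder(flvOut, grabber.getImageWidth(), grabber.getImageHeight());
        recorder.setFormat("flv");                          // mux as FLV
        recorder.setFrameRate(25);
        recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
        recorder.start(grabber.getFormatContext());         // reuse the input stream parameters

        // every packet recorded below ends up as FLV bytes in flvOut, ready to be
        // drained and wrapped into BinaryWebSocketFrame chunks by another thread
        // (CameraPush above additionally rewrites pts/dts and paces the loop; omitted here)
        for (int i = 0; i < 250; i++) {
            AVPacket pkt = grabber.grabPacket();
            if (pkt == null || pkt.size() == 0 || pkt.data() == null) {
                continue;
            }
            recorder.recordPacket(pkt);
        }

        recorder.stop();
        grabber.stop();
    }
}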
Careful, here comes the pitfall!
When several pages multiplex the same channel, only one of them can play, because FLV carries a protocol header. So while the drain thread is wrapping packets, the first packet, the one that contains the FLV metaData, has to be saved; whenever a new WebSocket connection joins, send that header first and the problem is solved (a small header-detection sketch follows)!
The same would apply if you wanted to multiplex Dahua streams. I did not multiplex them here: I simply request a new preview callback from the SDK for every new FLV preview, so each stream naturally starts with its metaData, and Dahua never ran into this problem.
All of this was learned the hard way!
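If you need to detect which chunk actually carries that header, the FLV file header is easy to recognize: it always starts with the three bytes 'F', 'L', 'V' followed by version 1 (see the FLV specification referenced in CameraPush). A small helper along these lines (a sketch, not the project's actual code) can be used when deciding which chunk to cache as the metaData packet:

import io.netty.buffer.ByteBuf;

public final class FlvHeaderUtil {

    private FlvHeaderUtil() {
    }

    /** Returns true if the buffer starts with the 9-byte FLV file header ("FLV", version 1). */
    public static boolean startsWithFlvHeader(ByteBuf chunk) {
        if (chunk == null || chunk.readableBytes() < 9) {
            return false;
        }
        int i = chunk.readerIndex();
        return chunk.getByte(i) == 'F'
                && chunk.getByte(i + 1) == 'L'
                && chunk.getByte(i + 2) == 'V'
                && chunk.getByte(i + 3) == 0x01;   // FLV version 1
    }
}

In this project the first flushed chunk is simply assumed to contain the header (the firstMetaData flag in CameraPush), which works because the recorder writes the FLV header before any tag data.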

Hikvision dispatch
HkVideoBuffer.java

package com.device.hk.handle;

import com.device.hk.alarm.HkDeviceManage;
import com.device.hk.model.BufferCtxHk;
import com.xim.server.constants.BussinessType;
import com.xim.server.constants.SocketConstants;
import com.xim.server.store.ChannelStore;
import io.netty.buffer.ByteBuf;
import io.netty.channel.ChannelFuture;
import io.netty.channel.ChannelFutureListener;
import io.netty.channel.ChannelHandlerContext;
import io.netty.handler.codec.http.websocketx.BinaryWebSocketFrame;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;

import java.util.*;
import java.util.concurrent.*;
import java.util.stream.Collectors;

import static com.xim.server.constants.SocketConstants.FLV_META_DATA;

/**
 * @author xuehui_li
 * @Version 1.0
 * @date 2024/7/25 16:59
 * @Content
 */

@Component
@Slf4j
public class HkVideoBuffer {
    ConcurrentLinkedQueue<BufferCtxHk> bufferQueue = new ConcurrentLinkedQueue<>();
    ThreadPoolExecutor sendBufferPool = new ThreadPoolExecutor(1, 1, 300L, TimeUnit.SECONDS, new LinkedBlockingQueue<>(512*1024*1024), new ThreadPoolExecutor.DiscardPolicy());

    @Autowired
    ChannelStore channelStore;

    @Autowired
    HkDeviceManage hkDeviceService;

    @Autowired
    CameraService cameraService;


    public synchronized void putBuffer(BufferCtxHk bufferCtxDh) {

        bufferQueue.add(bufferCtxDh);
    }
    public HkVideoBuffer() {
        Thread thread = new Thread(new Runnable() {
            @Override
            public void run() {
                while (true) {
                    if (!bufferQueue.isEmpty()) {
                        BufferCtxHk poll = bufferQueue.poll();
                        Optional<HashSet<ChannelHandlerContext>> byId = channelStore.findById(poll.getCtxKey());
                        if (byId.isPresent()) {
                            HashSet<ChannelHandlerContext> channelHandlerContexts = byId.get();
                            // only forward to the WebSocket connections that are in real-time preview mode
                            List<ChannelHandlerContext> collect = channelHandlerContexts.stream().filter(item -> BussinessType.REAL_PLAY.getType().equalsIgnoreCase(item.channel().attr(SocketConstants.BUSSINESS_TYPE).get())).collect(Collectors.toList());
                            SendBufferTask sendBufferTask = new SendBufferTask(poll, collect);
                            sendBufferPool.submit(sendBufferTask);
                            // release the ByteBuf: the release task runs after the send task on the same single-threaded pool
                            BufferCtxHk bufferCtxDh = new BufferCtxHk();
                            bufferCtxDh.setByteBuf(poll.getByteBuf());
                            bufferCtxDh.setRelease(true);
                            SendBufferTask relaseBufferTask = new SendBufferTask(bufferCtxDh, null);
                            sendBufferPool.submit(relaseBufferTask);
                            long count = channelHandlerContexts.stream().filter(item -> BussinessType.REAL_PLAY.getType().equalsIgnoreCase(item.channel().attr(SocketConstants.BUSSINESS_TYPE).get())).count();
                            if (count == 0) {
                                cameraService.closeCamera(poll.getLRealPlay());
                                log.info("video stream ended (1)");
                            }
                        } else {
                            cameraService.closeCamera(poll.getLRealPlay());
                            log.info("video stream ended (2)");
                        }
                    } else {
                        // back off briefly so this loop does not busy-spin while the queue is empty
                        try {
                            Thread.sleep(1);
                        } catch (InterruptedException e) {
                            Thread.currentThread().interrupt();
                            break;
                        }
                    }

                }
            }
        });
        thread.setDaemon(true);
        thread.start();
        log.info("HkVideoBuffer pool started");
    }
    class SendBufferTask implements Runnable {

        BufferCtxHk bufferCtxHk;
        List<ChannelHandlerContext> channelHandlerContext;


        public SendBufferTask(BufferCtxHk bufferCtxHk, List<ChannelHandlerContext> channelHandlerContext) {
            this.bufferCtxHk = bufferCtxHk;
            this.channelHandlerContext = channelHandlerContext;
        }

        @Override
        public void run() {

            if (bufferCtxHk.isRelease()) {
                bufferCtxHk.getByteBuf().release();
            }else{
                log.warn("hk: sending packet #{}, {} connections, {} packets queued in the pool", bufferCtxHk.getSort(), channelHandlerContext.size(), sendBufferPool.getQueue().size());
                for (ChannelHandlerContext handlerContext : channelHandlerContext) {
                    synchronized (handlerContext) {
                        ByteBuf byteBuf = bufferCtxHk.getByteBuf();
                        ByteBuf copy = byteBuf.copy();
                        BinaryWebSocketFrame binaryWebSocketFrame = new BinaryWebSocketFrame(copy);
                        Object o = handlerContext.attr(SocketConstants.REAL_PLAY_HANDLE).get();
                        if (handlerContext.channel().isActive() && o.equals(bufferCtxHk.getLRealPlay())) {
                            checkFlvMetaData(handlerContext);

                            ChannelFuture channelFuture = handlerContext.channel().writeAndFlush(binaryWebSocketFrame);
                            channelFuture.addListener(new ChannelFutureListener() {
                                @Override
                                public void operationComplete(ChannelFuture channelFuture) throws Exception {
                                    if (!channelFuture.isSuccess()) {
                                        Throwable cause = channelFuture.cause();
                                        log.error("send failed", cause);
                                    }
                                }
                            });
                        }
                    }

                }

            }

        }

        private void checkFlvMetaData(ChannelHandlerContext handlerContext) {
            if (handlerContext.channel().hasAttr(FLV_META_DATA)) {
                // re-send the cached FLV metaData first
                ByteBuf metaDataByteBuf = handlerContext.channel().attr(FLV_META_DATA).getAndSet(null);
                if (Objects.isNull(metaDataByteBuf)) {
                    return;
                }
                BinaryWebSocketFrame metaDataFrame = new BinaryWebSocketFrame(metaDataByteBuf.copy());

                ChannelFuture channelFuture = handlerContext.channel().writeAndFlush(metaDataFrame);
                channelFuture.addListener(new ChannelFutureListener() {
                    @Override
                    public void operationComplete(ChannelFuture channelFuture) throws Exception {
                        if (!channelFuture.isSuccess()) {
                            Throwable cause = channelFuture.cause();
                            log.error("failed to send the FLV metaData", cause);
                        }
                    }
                });
            }
        }
    }
}

Frontend player component: flv.js. It is fairly generic; just pull it in and use it.

 loadRealPlay(element, url) {
      const videoElement = this.$refs[element]
      if (flvjs.isSupported()) {
       const flvPlayer = flvjs.createPlayer({
          type: 'flv',                    // media type
          // url: `ws://localhost:8070/im?uid=${getToken()}&BUSINESS_TYPE=RealPlay&manufacturer=dahua&deviceId=10002`,	// FLV media URL
          // url: `ws://localhost:8070/im?uid=${getToken()}&BUSINESS_TYPE=RealPlay&manufacturer=hk&deviceId=20001`,	// FLV media URL
          url: url,                       // FLV media URL
          isLive: true,                   // the source is a live stream
          hasAudio: false,                // the source carries no audio
          hasVideo: true,                 // the source carries video
          enableStashBuffer: false        // disable the stash buffer
        }, {
          enableWorker: false,            // do not use a separate worker thread
          enableStashBuffer: false,       // disable the IO stash buffer
          autoCleanupSourceBuffer: true   // automatically clean up the SourceBuffer
        });
        flvPlayer.attachMediaElement(videoElement);  // attach the player to the video element
        flvPlayer.load();                 // start loading the stream
        flvPlayer.play();                 // start playback
      }
    },

Preview result

(screenshot of the live preview in the browser)
