Merging Canvas and Video Streams on the Front End for Screenshots and Screen Recording

1. Introduction

It has been a long while since I last updated this blog; a lot has happened, and I recently changed jobs, but things have finally settled down, so I plan to keep writing here to record what I learn. A recent project needed the following feature: a camera pushes a live RTSP stream, and because the camera runs various detection algorithms, bounding boxes have to be drawn over the video. Ideally the boxes would be rendered into the stream before it is pushed, so the front end would not need to do anything (just play it), but development takes time and the schedule was tight, so the front end has to draw the boxes itself and also provide screenshot and screen-recording features (both with the boxes included).

2. Approach

We draw the boxes on a canvas (the canvas stream), while the RTSP stream plays in a video element (a video element can play a media stream directly). That leaves two streams that need to be merged into one, so we create one more canvas (myCanvas, the new media stream) to capture the merged output, and finally add the audio from the two source streams to the new media stream. The steps are:

  1. Call the box-drawing canvas stream lineStream, the RTSP stream videoStream, and the stream captured from the newly created myCanvas newStream.
  2. Draw lineStream so it fills the whole myCanvas, then draw videoStream onto myCanvas at the desired position, using drawImage(source, x, y, width, height) to set the offset and size; this composites one frame.
  3. Keep updating the contents of myCanvas with requestAnimationFrame so the mixed stream plays on the canvas, then call myCanvas.captureStream() to obtain the merged media stream.
  4. Finally, add the audio from lineStream and videoStream: use mediaStream.getAudioTracks() to get the audio tracks and add them to the new media stream (a minimal sketch of these steps follows this list).
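
As a quick illustration of steps 2–4, here is a minimal, framework-free sketch. It assumes a playing video element (#videoPlayer) and a box-drawing canvas (#line) already exist; the element ids and the fixed 1000×562 size are placeholders, and captureStream() support varies across browsers (Firefox exposes it on video elements as mozCaptureStream).

// The two sources: the <video> playing the stream and the canvas with the boxes
const videoPlayer = document.getElementById("videoPlayer");
const line = document.getElementById("line");

// The extra canvas that will hold the composited frames
const myCanvas = document.createElement("canvas");
myCanvas.width = 1000; // placeholder size; match the video in practice
myCanvas.height = 562;
const context = myCanvas.getContext("2d");

// Steps 2 and 3: redraw every frame so myCanvas always shows video + boxes
function render() {
  context.drawImage(videoPlayer, 0, 0, myCanvas.width, myCanvas.height);
  context.drawImage(line, 0, 0, myCanvas.width, myCanvas.height);
  requestAnimationFrame(render);
}
render();

// Step 3: capture the composited canvas as a new media stream
const newStream = myCanvas.captureStream();

// Step 4: copy the audio tracks of the source streams into the new stream
const videoStream = videoPlayer.captureStream
  ? videoPlayer.captureStream()
  : videoPlayer.mozCaptureStream();
videoStream.getAudioTracks().forEach((track) => newStream.addTrack(track));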

3. Implementation

<template>
  <div id="app">
    <!-- In the real project this is an RTSP stream; a local MP4 is used here for testing -->
    <video
      id="videoPlayer"
      src="../static/test.mp4"
      width="1000"
      height="562"
      ref="videoPlayer"
    ></video>
    <!-- myCanvas holds the merged (new) media stream -->
    <canvas id="myCanvas" ref="myCanvas"></canvas>
    <!-- line is the canvas that draws the boxes -->
    <canvas id="line" ref="line"></canvas>
    <button @click="screenShot()">截图</button>
    <button @click="recoder()">开始录像</button>
    <button @click="endRecoder()">结束录像</button>
  </div>
</template>

<script>
export default {
  name: "smallVideo",
  data() {
    return {
      // Coordinates pushed from the camera over WebSocket; hard-coded test data is used here
      data: {
        tgtList: {
          tgtCoords: [
            {
              classId: 0,
              rect: {
                height: 0.5,
                point: {
                  x: 0.25,
                  y: 0.25,
                },
                width: 0.5,
              },
            },
            {
              classId: 0,
              rect: {
                height: 0.5,
                point: {
                  x: 0.75,
                  y: 0.25,
                },
                width: 0.5,
              },
            },
            {
              classId: 0,
              rect: {
                height: 0.5,
                point: {
                  x: 0.25,
                  y: 0.75,
                },
                width: 0.5,
              },
            },
            {
              classId: 0,
              rect: {
                height: 0.5,
                point: {
                  x: 0.75,
                  y: 0.75,
                },
                width: 0.5,
              },
            },
            {
              classId: 0,
              rect: {
                height: 0.5,
                point: {
                  x: 0.5,
                  y: 0.5,
                },
                width: 0.5,
              },
            },
          ],
          tgtNUM: 5,
        },
        timeStamp: 5903765758,
      },
      recorder: null, // the MediaRecorder instance doing the recording
      videoData: [], // recorded Blob chunks
    };
  },
  mounted() {
    // Size the line and myCanvas canvases dynamically to match the video
    let videoPlayer = this.$refs.videoPlayer,
      myCanvas = this.$refs.myCanvas,
      line = this.$refs.line,
      width = videoPlayer.offsetWidth,
      height = videoPlayer.offsetHeight,
      lineContext = line.getContext("2d");
    line.width = myCanvas.width = width;
    line.height = myCanvas.height = height;
    // Convert the normalized coordinates received over WebSocket into pixel coordinates
    let arr = this.paintDIv(this.data.tgtList.tgtCoords, width, height);
    // Draw the boxes on the line canvas in (near) real time
    setInterval(() => {
      // Clear the whole line canvas before drawing the next box
      lineContext.clearRect(0, 0, line.width, line.height);
      this.drawRect(arr, lineContext);
    }, 300);
  },
  methods: {
    // Convert normalized, center-based coordinates into pixel-based top-left coordinates
    paintDIv(data, width, height) {
      return data.map((ele) => {
        ele.rect.width = Math.round(width * ele.rect.width);
        ele.rect.height = Math.round(height * ele.rect.height);
        ele.rect.point.x = Math.round(
          ele.rect.point.x * width - ele.rect.width / 2
        );
        ele.rect.point.y = Math.round(
          ele.rect.point.y * height - ele.rect.height / 2
        );

        if (ele.rect.point.x < 0) {
          ele.rect.width = Math.round(ele.rect.width + ele.rect.point.x);
          ele.rect.point.x = 0;
        }

        if (ele.rect.point.y < 0) {
          ele.rect.height = Math.round(ele.rect.height + ele.rect.point.y);
          ele.rect.point.y = 0;
        }
        return ele;
      });
    },

    // Draw a box in real time; for testing, a random entry from data is drawn every 300 ms
    drawRect(points, lineContext) {
      lineContext.strokeStyle = "#ff0000";
      lineContext.lineWidth = 5;
      let index = Math.round(Math.random() * 4);
      let point = points[index];
      lineContext.strokeRect(
        point.rect.point.x,
        point.rect.point.y,
        point.rect.width,
        point.rect.height
      );
    },

    // Screenshot
    screenShot() {
      let videoPlayer = this.$refs.videoPlayer,
        myCanvas = this.$refs.myCanvas,
        line = this.$refs.line,
        width = videoPlayer.offsetWidth,
        height = videoPlayer.offsetHeight,
        context = myCanvas.getContext("2d");
      context.drawImage(videoPlayer, 0, 0, width, height);
      context.drawImage(line, 0, 0, width, height);
      // Convert the canvas to a base64 data URL
      let data = myCanvas.toDataURL();
      if (data) {
        // toDataURL() returns a string prefixed with "data:image/png;base64,",
        // so split off the prefix and decode only the base64 payload
        let raw = window.atob(data.split(",")[1]),
          rawLength = raw.length,
          // Copy the decoded bytes into a Uint8Array
          uInt8Array = new Uint8Array(rawLength);
        for (var i = 0; i < rawLength; ++i) {
          uInt8Array[i] = raw.charCodeAt(i);
        }
        // Wrap the bytes in a Blob
        let blob = new Blob([uInt8Array], {
          type: "image/png",
        });
        // Trigger a download to save the image
        var a = document.createElement("a");
        a.download = `file_${new Date().getTime()}.png`;
        a.href = URL.createObjectURL(blob);
        document.body.appendChild(a);
        a.click();
        a.remove();
        window.URL.revokeObjectURL(a.href);
      }
    },

    // Merge the two streams into one
    mergeStream() {
      let videoPlayer = this.$refs.videoPlayer,
        myCanvas = this.$refs.myCanvas,
        line = this.$refs.line,
        width = videoPlayer.offsetWidth,
        height = videoPlayer.offsetHeight,
        context = myCanvas.getContext("2d");

      let videoStream = videoPlayer.captureStream(),
        lineStream = line.captureStream(),
        render = () => {
          if (videoStream) {
            context.drawImage(videoPlayer, 0, 0, width, height);
            context.drawImage(line, 0, 0, width, height);
            window.requestAnimationFrame(render);
          }
        };
      render();
      // Capture the composited canvas as the new media stream
      let newStream = myCanvas.captureStream();
      // Merge in the audio tracks from both source streams
      videoStream
        .getAudioTracks()
        .forEach((track) => newStream.addTrack(track));
      lineStream.getAudioTracks().forEach((track) => newStream.addTrack(track));
      return newStream;
    },

    // Start recording
    recoder() {
      let stream = this.mergeStream(),
        videoPlayer = this.$refs.videoPlayer,
        // Check browser support for webm first: with an unsupported mime type,
        // ondataavailable may fire with empty data (data.size === 0)
        mime = MediaRecorder.isTypeSupported("video/webm; codecs=vp9")
          ? "video/webm; codecs=vp9"
          : "video/webm";
      this.recorder = new MediaRecorder(stream, {
        mimeType: mime,
      });
      this.recorder.ondataavailable = (e) => {
        this.videoData.push(e.data);
      };
      videoPlayer.play();
      this.recorder.start();
    },

    // Stop recording
    endRecoder() {
      let videoPlayer = this.$refs.videoPlayer;
      videoPlayer.pause();
      return new Promise((resolve) => {
        // Wait for onstop so the final dataavailable chunk has been collected
        this.recorder.onstop = () => {
          // The chunks were recorded as webm, so save the file as webm
          let blob = new Blob(this.videoData, {
            type: "video/webm",
          });
          let a = document.createElement("a");
          a.download = `file_${new Date().getTime()}.webm`;
          a.href = window.URL.createObjectURL(blob);
          document.body.appendChild(a);
          a.click();
          a.remove();
          window.URL.revokeObjectURL(a.href);
          this.recorder = null;
          this.videoData = [];
          resolve();
        };
        this.recorder.stop();
      });
    },
  },
};
</script>

<style lang="less" >
body {
  margin: 0;
}
canvas {
  /* border: 1px dashed black; */
  /* display: none; */
  position: absolute;
  top: 0;
  left: 0;
}

#line {
  z-index: 9999;
}

#myCanvas {
  display: none;
}
</style>
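
As the comment in recoder() points out, container and codec support differs across browsers. A small helper along these lines (the candidate list is only an example, not exhaustive) can pick the first mime type that the current browser's MediaRecorder accepts:

// Return the first candidate mime type that MediaRecorder supports, or "" if none do.
// The candidate list is an example; extend it to match the formats you need.
function pickSupportedMimeType() {
  const candidates = [
    "video/webm; codecs=vp9",
    "video/webm; codecs=vp8",
    "video/webm",
  ];
  return candidates.find((type) => MediaRecorder.isTypeSupported(type)) || "";
}

// Usage: new MediaRecorder(stream, { mimeType: pickSupportedMimeType() });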

Here is the result in action:
(demo screenshot)
