Mastering Real-Time Communication: WebRTC Front-End Implementation (Day 3)

Preface: In the previous chapter we built the back-end signaling service with Spring Boot and gave it a quick test with Postman and a few online tools. Now we pair it with our front end: we capture the device's camera and microphone, transmit the media streams, and build a simple video-conference call interface.

Prerequisites: This chapter assumes basic knowledge of Vue 3. You will also need Node.js installed, along with the Element Plus component library and vue-router (see the official docs for how to install them with npm). If you are not yet comfortable with Vue 3, learn the basics first and then come back to this article.
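
        For reference, once everything is installed, a minimal main.js that registers Element Plus and the router might look like the sketch below (file names such as ./router/index.js are assumptions; adjust them to your own project layout):

// Minimal main.js sketch: register Element Plus and the router globally.
// The ./router/index.js path is an assumption — it should point at the router from Part 5.
import { createApp } from 'vue'
import ElementPlus from 'element-plus'
import 'element-plus/dist/index.css'
import App from './App.vue'
import router from './router/index.js'

const app = createApp(App)
app.use(ElementPlus) // make Element Plus components available app-wide
app.use(router)      // enable the meeting-list / meeting routes
app.mount('#app')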

Part 1: Building a simple call UI

        Here we imitate a video-conference interface (the example only covers a one-to-one call; you can extend it to multi-party video calls later on).

        First we build a meeting list in which every meeting has its own meeting number (meetingId). Clicking a meeting takes us to the video-conference page, which automatically opens the computer's camera and microphone when it loads. The code is below (note: you can design the UI however you like; this is only a test case, so the layout is intentionally plain).

        Meeting list view:

<template>
  <div class="meeting-list-container">
    <!-- Page header -->
    <div class="page-header">
      <el-page-header content="会议列表" @back="handleBack" />
    </div>

    <!-- Meeting cards -->
    <el-card class="meeting-card" shadow="hover" v-for="meeting in meetingList" :key="meeting.meetingId">
      <div class="card-header">
        <el-icon class="meeting-icon"><Calendar /></el-icon>
        <span class="meeting-title">{{ meeting.title }}</span>
      </div>

      <div class="card-body">
        <div class="meeting-info">
          <span class="info-label">会议ID:</span>
          <span class="info-value">{{ meeting.meetingId }}</span>
        </div>
        <div class="meeting-info">
          <span class="info-label">开始时间:</span>
          <span class="info-value">{{ meeting.startTime }}</span>
        </div>
        <div class="meeting-info">
          <span class="info-label">参会人数:</span>
          <span class="info-value">{{ meeting.attendeeCount }} 人</span>
        </div>
        <div class="meeting-info">
          <span class="info-label">会议状态:</span>
          <el-tag :type="meeting.status === '进行中' ? 'success' : 'primary'">
            {{ meeting.status }}
          </el-tag>
        </div>
      </div>

      <!-- Enter-meeting button -->
      <div class="card-footer">
        <el-button 
          type="primary" 
          icon="Right" 
          @click="handleEnterMeeting(meeting.meetingId)"
        >
          进入会议
        </el-button>
      </div>
    </el-card>

    <!-- Empty state when there are no meetings -->
    <el-empty 
      description="暂无会议数据" 
      v-if="meetingList.length === 0"
      class="empty-state"
    >
      <template #image>
        <el-image 
          src="https://element-plus.org/images/empty.svg" 
          style="width: 160px; height: 160px;" 
        />
      </template>
    </el-empty>
  </div>
</template>

<script setup>
import { useRouter } from 'vue-router'
// Import Element Plus components and icons
import { ElPageHeader, ElCard, ElIcon, ElTag, ElButton, ElEmpty, ElImage } from 'element-plus'
import { Calendar, Right } from '@element-plus/icons-vue'

// 1. Mock meeting list data (in a real project, fetch it from the back-end API)
const meetingList = [
  {
    title: "产品需求评审会",
    meetingId: "467886", // unique meeting ID (use a UUID in real projects)
    startTime: "2024-08-29 14:00",
    attendeeCount: 8,
    status: "进行中"
  },
  {
    title: "技术架构设计会",
    meetingId: "002",
    startTime: "2024-08-29 16:30",
    attendeeCount: 5,
    status: "未开始"
  },
  {
    title: " weekly 周会",
    meetingId: "20240830",
    startTime: "2024-08-30 10:00",
    attendeeCount: 12,
    status: "未开始"
  }
]

// 2. Route navigation
const router = useRouter()

// "Enter meeting" click: navigate to the meeting page and carry the meetingId (query params here; route params would also work)
const handleEnterMeeting = (meetingId) => {
  router.push({
    path: '/meeting', 
    query: { meetingId: meetingId } // pass the meeting ID along
  })
}

const handleBack = () => {
  router.go(-1) // go back to the previous page
}
</script>

<style scoped>
.meeting-list-container {
  padding: 20px;
  max-width: 1200px;
  margin: 0 auto;
}

.page-header {
  margin-bottom: 30px;
}

.meeting-card {
  margin-bottom: 20px;
  transition: all 0.3s ease;
}

.meeting-card:hover {
  transform: translateY(-5px);
}

.card-header {
  display: flex;
  align-items: center;
  margin-bottom: 15px;
}

.meeting-icon {
  color: #409eff;
  margin-right: 8px;
  font-size: 18px;
}

.meeting-title {
  font-size: 16px;
  font-weight: 500;
  color: #333;
}

.card-body {
  margin-bottom: 20px;
}

.meeting-info {
  display: flex;
  align-items: center;
  margin-bottom: 10px;
  font-size: 14px;
}

.info-label {
  color: #666;
  width: 80px;
  display: inline-block;
}

.info-value {
  color: #333;
}

.card-footer {
  display: flex;
  justify-content: flex-end;
}

.empty-state {
  margin-top: 80px;
  text-align: center;
}
</style>

        Video conference view:

<script setup>
import { onMounted, ref } from 'vue';
import mircoButton from '../components/mircoButton.vue';
import videoButton from '../components/videoButton.vue';
import screenButton from '../components/screenButton.vue';
import { useDeviceInstance } from '../method/use-device.js';
import websocket from '../method/websock.js';
import { useRoute } from 'vue-router';
import avatar from '../assets/dog.jpg'
import { ElMessage } from 'element-plus';

const { videoRef, audioRef, initDevice, audioInputStream, videoInputStream, screenShareStream } = useDeviceInstance;
const baseStream = new MediaStream();

const route = useRoute();
const remoteVideoRef = ref(null);
const remoteMember = ref(null);
const peerConnection = ref(null);

// ICE server configuration
const configuration = {
    iceServers: [
        { urls: 'stun:stun.l.google.com:19302' },
        { urls: 'stun:stun1.l.google.com:19302' } // backup STUN server
    ]
}

async function init() {
    try {
        await initDevice();
        console.log('设备初始化完成');
        
        // Make sure the media streams were obtained
        if (videoInputStream.value && audioInputStream.value) {
            // Add the tracks to the base stream
            videoInputStream.value.getTracks().forEach(track => {
                baseStream.addTrack(track);
                console.log('添加视频轨道:', track.id);
            });
            audioInputStream.value.getTracks().forEach(track => {
                baseStream.addTrack(track);
                console.log('添加音频轨道:', track.id);
            });
            
            console.log('初始化本地流', baseStream);
        } else {
            ElMessage.error('无法获取媒体流');
            return;
        }
        
        // Announce that we have entered the meeting
        websocket.sendMessage('ENTER', {
            name: '会议',
            avatar: avatar,
            meetingId: route.query.meetingId,
        });
    } catch (error) {
        console.error('初始化失败:', error);
        ElMessage.error('初始化失败: ' + error.message);
    }
}

function createRTCPeerConnection({fromId}) {
    // Close any existing connection first
    if (peerConnection.value) {
        peerConnection.value.close();
    }
    
    peerConnection.value = new RTCPeerConnection(configuration);
    console.log('创建对等连接', peerConnection.value);
    
    // Listen for the negotiationneeded event
    peerConnection.value.onnegotiationneeded = async () => {
        console.log('检测到需要协商,准备创建offer');
        try {
            const offer = await peerConnection.value.createOffer();
            console.log('创建offer', offer);
            await peerConnection.value.setLocalDescription(offer);
            
            websocket.sendMessage('OFFER', {
                targetId: fromId,
                description: peerConnection.value.localDescription
            });
        } catch (error) {
            console.error('创建offer失败:', error);
            ElMessage.error('创建连接失败: ' + error.message);
        }
    };
    
    // Add local media tracks to the connection
    const tracks = baseStream.getTracks();
    tracks.forEach(track => {
        peerConnection.value.addTrack(track, baseStream);
        console.log('已添加轨道到连接:', track.kind);
    });
    
    // Handle remote tracks
    peerConnection.value.ontrack = (event) => {
        console.log('收到远程轨道:', event.track.kind);
        if (event.streams && event.streams[0]) {
            remoteVideoRef.value.srcObject = event.streams[0];
        } else {
            ElMessage.error('未收到有效的媒体流');
        }
    };
    
    // Handle local ICE candidates
    peerConnection.value.onicecandidate = (event) => {
        if (event.candidate) {
            console.log('发送ICE候选', event.candidate);
            websocket.sendMessage('CANDIDATE', {
                targetId: fromId,
                candidate: event.candidate
            });
        }
    };
    
    // Watch ICE connection state changes
    peerConnection.value.oniceconnectionstatechange = () => {
        console.log('ICE连接状态:', peerConnection.value.iceConnectionState);
        if (peerConnection.value.iceConnectionState === 'failed') {
            ElMessage.warning('连接失败,尝试重新连接...');
        }
    };
    
    // Manually trigger negotiation (works around cases where the event does not fire automatically)
    setTimeout(() => {
        if (peerConnection.value.signalingState === 'stable') {
            console.log('手动触发协商');
            peerConnection.value.onnegotiationneeded();
        }
    }, 1000);
}

// WebSocket message handlers
websocket.addEventHandler('ENTER', (data) => {
    console.log('收到用户进入消息:', data);
    const { name, avatar, fromId, meetingId } = data;
    remoteMember.value = { name, avatar, fromId };
    createRTCPeerConnection({fromId});
});

websocket.addEventHandler('OFFER', async (result) => {
    console.log('收到offer消息', result);
    const { fromId, description } = result;
    
    if (!peerConnection.value) {
        createRTCPeerConnection({fromId});
    }
    
    try {
        await peerConnection.value.setRemoteDescription(new RTCSessionDescription(description));
        const answer = await peerConnection.value.createAnswer();
        await peerConnection.value.setLocalDescription(answer);
        
        websocket.sendMessage('ANSWER', {
            targetId: fromId,
            description: peerConnection.value.localDescription
        });
    } catch (error) {
        console.error('处理offer失败:', error);
    }
});

websocket.addEventHandler('ANSWER', async (result) => {
    console.log('收到answer消息', result);
    try {
        await peerConnection.value.setRemoteDescription(
            new RTCSessionDescription(result.description)
        );
    } catch (error) {
        console.error('处理answer失败:', error);
    }
});

websocket.addEventHandler('CANDIDATE', async (result) => {
    console.log('收到candidate消息', result);
    if (result && result.candidate && peerConnection.value) {
        try {
            await peerConnection.value.addIceCandidate(new RTCIceCandidate(result.candidate));
        } catch (error) {
            console.error('添加ICE候选失败:', error);
        }
    }
});

onMounted(() => {
    init();
});
</script>

<template>
    <div class="container">
        <div class="video-wrapper">
            <div class="main-video">
                <video ref="videoRef" muted autoplay></video>
            </div>
            <div class="remote-video">
                <video ref="remoteVideoRef" autoplay></video>
            </div>
            <audio ref="audioRef" muted autoplay></audio>
            <div class="bottom-wrapper">
                <mircoButton />
                <videoButton />
                <screenButton />
            </div>
        </div>
    </div>
</template>

<style scoped>
.video-wrapper {
    width: 800px;
    height: 450px;
    background-color: #000;
    position: relative;
    display: flex;
}

video {
    width: 800px;
    height: 400px;
    object-fit: cover;
}

.bottom-wrapper {
    display: flex;
    justify-content: center;
    align-items: center;
    gap: 20px;
    position: absolute;
    bottom: 0;
    left: 0;
    width: 100%;
    height: 70px
}

.remote-video {
    margin-left: 15px;
    background-color: #666;
}
.remote-video video {
    width: 250px;
    height: 150px;
    background-color: white;
    margin: 1rem;
    border-radius: 5px;
}
</style>
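
        One thing the page above does not do yet is clean up when the user leaves: the camera and microphone stay on and the peer connection stays open. A minimal sketch of that missing cleanup, added to the same <script setup> (onBeforeUnmount would simply join the existing vue import), could look like this:

// Cleanup sketch for the meeting page: stop local tracks, close the peer connection
// and unregister the WebSocket handlers when the component is destroyed.
import { onBeforeUnmount } from 'vue';

onBeforeUnmount(() => {
    baseStream.getTracks().forEach(track => track.stop()); // release camera / microphone
    if (peerConnection.value) {
        peerConnection.value.close();
        peerConnection.value = null;
    }
    ['ENTER', 'OFFER', 'ANSWER', 'CANDIDATE'].forEach(code => websocket.removeEventHandler(code));
});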

Part 2: Wrapping the control buttons as components

        Microphone button:

<script setup>
import { Microphone, Mute } from '@element-plus/icons-vue';
import { useDeviceInstance } from '../method/use-device.js';

const { openMirco, closeMirco, audioInputState } = useDeviceInstance;

</script>

<template>
    <div class="btn-item" v-if="audioInputState">
        <el-icon><Microphone @click="closeMirco"/></el-icon>
    </div>
    <div class="btn-item" v-else>
        <el-icon><Mute @click="openMirco"/></el-icon>
    </div>


</template>

<style scoped>
.btn-item {
    width: 50px;
    height: 50px;
    background-color: rgba(255, 255, 255, 0.2);
    border-radius: 50%;
    display: flex;
    justify-content: center;
    align-items: center;
    font-size: 26px;
    color: white;
}

.btn-item:hover {
    cursor: pointer;
    background-color: rgba(255, 255, 255, 0.5);
}
</style>

        Camera button:

<script setup>
import { VideoCamera, VideoCameraFilled } from '@element-plus/icons-vue';
import { useDeviceInstance } from '../method/use-device.js';

const { openVideo, closeVideo, videoInputState } = useDeviceInstance;

</script>

<template>
    <div class="btn-item" v-if="!videoInputState">
        <el-icon><VideoCamera @click="openVideo" /></el-icon>
    </div>
    <div class="btn-item" v-else>
        <el-icon><VideoCameraFilled @click="closeVideo" /></el-icon>
    </div>
</template>

<style scoped>
.btn-item {
    width: 50px;
    height: 50px;
    background-color: rgba(255, 255, 255, 0.2);
    border-radius: 50%;
    display: flex;
    justify-content: center;
    align-items: center;
    font-size: 26px;
    color: white;
}

.btn-item:hover {
    cursor: pointer;
    background-color: rgba(255, 255, 255, 0.5);
}
</style>

        Screen-share button:

<script setup>
import { useDeviceInstance } from '../method/use-device';
import { DataBoard, DataAnalysis } from '@element-plus/icons-vue';

const { openScreenShare, closeScreenShare, screenShareState } = useDeviceInstance;
</script>

<template>
    <div class="btn-item" v-if="!screenShareState">
        <el-icon><DataBoard  @click="openScreenShare" /></el-icon>
    </div>
    <div class="btn-item" v-else>
        <el-icon><DataAnalysis @click="closeScreenShare" /></el-icon>
    </div>
</template>

<style scoped>
.btn-item {
    width: 50px;
    height: 50px;
    background-color: rgba(255, 255, 255, 0.2);
    border-radius: 50%;
    display: flex;
    justify-content: center;
    align-items: center;
    font-size: 26px;
    color: white;
}

.btn-item:hover {
    cursor: pointer;
    background-color: rgba(255, 255, 255, 0.5);
}
</style>
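
        Note that these buttons only change what the local <video> element plays; the RTCPeerConnection built in Part 1 keeps sending whatever tracks were added when it was created. To actually push, say, the screen-share track to the remote peer, you would typically swap the track on the corresponding sender. A minimal sketch, assuming peerConnection and screenShareStream are in scope as in the meeting page:

// Sketch: after openScreenShare() succeeds, swap the outgoing video track so the remote
// peer sees the screen instead of the camera. Assumes peerConnection (the ref holding the
// RTCPeerConnection) and screenShareStream are in scope, as in the meeting page above.
async function switchToScreenTrack() {
    const screenTrack = screenShareStream.value.getVideoTracks()[0];
    const sender = peerConnection.value
        .getSenders()
        .find(s => s.track && s.track.kind === 'video');
    if (sender && screenTrack) {
        await sender.replaceTrack(screenTrack); // a plain track swap, no renegotiation needed
    }
}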

Part 3: Wrapping a WebSocket client

        Next we wrap a WebSocket client. The connection should be established before the meeting page is shown, so we register a beforeEnter guard on the route and open the socket there, before the page loads (see Part 5, "Using the router (simple version)"). This is only a bare-bones wrapper: the heartbeat I added is very crude, so feel free to flesh it out yourselves. Sample code below:

import { ElMessage } from 'element-plus';

let socket = null;
let interval = 0;
const eventMap = new Map();

function open() {
    return new Promise((resolve, reject) => {
        if (socket) {
            resolve(socket);
            return;
        }
        const WebSocketUrl = `ws://localhost:8080/websocket/12323`;
        socket = new WebSocket(WebSocketUrl);
        console.log('执行到连接后面')
        socket.onopen = () => {
            interval = setInterval(() => {
                if (socket && socket.readyState === WebSocket.OPEN) {
                    const message = JSON.stringify({
                        code: 'PING',
                        data: {},
                    });
                    socket.send(message);
                }
            }, 1000 * 10);
            resolve();
        };

        socket.onmessage = (event) => {
            console.log('收到消息', event)
            const response = event.data;
            console.log('收到消息', response)
            if (response) {
                const { code, data } = JSON.parse(response);
                handleResult(code, data);
                if (code === 'CONNECT_SUCCESS') {
                    console.log('连接成功' + data);
                }
            }
        };

        socket.onclose = () => {
            clearInterval(interval);
            socket = null; // allow a later open() to reconnect
            ElMessage.warning('断开连接');
        };

        socket.onerror = (error) => {
            console.log(error);
            reject(error);
        };
    });
}

function close() {
    if (socket) {
        socket.close();
        socket = null;
    }
}

function sendMessage(code, data) {
    // Guard: only send when the socket exists and is open
    if (!socket || socket.readyState !== WebSocket.OPEN) {
        console.warn('WebSocket is not open; message not sent:', code, data);
        return;
    }
    socket.send(JSON.stringify({ code, data }));
}

function handleResult(code, data) {
    const callback = eventMap.get(code);
    if (typeof callback === 'function') {
        callback(data);
    }
}

function addEventHandler(code, callback) {
    eventMap.set(code, callback)
}

function removeEventHandler(key) {
    eventMap.delete(key);
}

export default {
    open,
    close,
    sendMessage,
    addEventHandler,
    removeEventHandler
}

        With this we have a WebSocket wrapper that also dispatches incoming events to registered handlers; the WebRTC code uses it to listen for the OFFER, ANSWER and other messages sent by the remote peer.
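
        As noted above, the heartbeat here is deliberately minimal. One way to harden it, sketched below under the assumption that the back end answers each PING with a PONG message (not confirmed by the back-end chapter), is to track the last PONG and reconnect when it stops arriving; this block would replace the setInterval inside socket.onopen:

// Sketch only: a slightly more robust heartbeat for websock.js.
// Assumption: the server replies to PING with a { code: 'PONG' } message.
let lastPong = Date.now();

addEventHandler('PONG', () => {
    lastPong = Date.now();
});

function startHeartbeat() {
    interval = setInterval(() => {
        if (!socket || socket.readyState !== WebSocket.OPEN) return;
        if (Date.now() - lastPong > 30 * 1000) {
            // No PONG for 30 seconds: treat the connection as dead and reconnect
            close();
            open().catch(() => ElMessage.error('Reconnect failed'));
            return;
        }
        socket.send(JSON.stringify({ code: 'PING', data: {} }));
    }, 10 * 1000);
}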

Part 4: A composable for the microphone and camera

        Below, based on the MDN documentation, is how the browser accesses the device's microphone and camera. The key API is getUserMedia: when audio is true it requests the microphone, and when video is true it requests the camera. We also define a handler for the most common errors, such as the user denying camera access, so the user gets an appropriate message.

import { ref } from "vue";
import { ElMessage } from "element-plus";
export function useDevice() {

    // video / audio element refs
    const videoRef = ref(null);
    const audioRef = ref(null);

    // whether a device of each kind is present
    const hasAudioInput = ref(false);
    const hasVideoInput = ref(false);
    const hasAudioOutput = ref(false);


    // device on/off state
    const audioInputState = ref(false);
    const videoInputState = ref(false);
    const audioOutputState = ref(false);
    const screenShareState = ref(false);

    // the corresponding media streams
    const audioInputStream = ref(null);
    const videoInputStream = ref(null);
    const audioOutputStream = ref(null);
    const screenShareStream = ref(null);

    // Initialization: detect available devices and open the mic/camera if present
    async function initDevice() {
        const devices = await navigator.mediaDevices.enumerateDevices();
        const audioInput = devices.find(device => device.kind === 'audioinput');
        const videoInput = devices.find(device => device.kind === 'videoinput');
        const audioOutput = devices.find(device => device.kind === 'audiooutput');
        if (audioInput) {
            hasAudioInput.value = true;
            await openMirco();
        }
        if (videoInput) {
            hasVideoInput.value = true;
            await openVideo();
        }
        if (audioOutput) {
            hasAudioOutput.value = true;
        }
    }

    // Open the microphone
    async function openMirco() {
        if (audioInputState.value) {
            ElMessage.warning('麦克风已打开')
            return;
        }
        try {
            audioInputStream.value = await navigator.mediaDevices.getUserMedia({ audio: true, video: false });
            audioInputState.value = true;
            audioRef.value.srcObject = audioInputStream.value;
            ElMessage.success('麦克风已打开');
        } catch (error) {
            handleError(error);
        }

    }

    // Close the microphone
    function closeMirco() {
        if (audioInputStream.value === null || audioInputState.value === false) {
            ElMessage.error('请先打开麦克风');
            return;
        }
        try {
            const tracks = audioInputStream.value.getTracks();
            tracks.forEach(track => {
                if (track.kind === 'audio') {
                    track.stop();
                }
            });
            audioInputStream.value = null;
            audioInputState.value = false;
            audioRef.value.srcObject = null;
            ElMessage.warning('麦克风已关闭');
        } catch (error) {
            handleError(error);
        }
    }

    // Open the camera
    async function openVideo() {
        if (videoInputStream.value !== null) {
            ElMessage.warning('摄像头已打开');
            return;
        }
        try {
            videoInputStream.value = await navigator.mediaDevices.getUserMedia({ audio: false, video: true });
            videoRef.value.srcObject = videoInputStream.value;
            videoInputState.value = true;
            ElMessage.success('摄像头已打开');
        } catch (error) {
            videoInputState.value = false;
            handleError(error);
        }
    }

    // Close the camera
    function closeVideo() {
        try {
            if (videoInputStream.value === null) {
                ElMessage.error('请先打开摄像头');
                return;
            }
            videoInputStream.value.getTracks().forEach(track => {
                if (track.kind === 'video') {
                    track.stop();
                }
            });
            videoInputState.value = false;
            videoInputStream.value = null;
            videoRef.value.srcObject = null;
            ElMessage.warning('摄像头已关闭');
        } catch (error) {
            handleError(error);
        }
    }

    // Start screen sharing
    async function openScreenShare() {
        // Check whether screen sharing is supported
        if (!('getDisplayMedia' in navigator.mediaDevices)) {
            ElMessage.error('当前设备不支持屏幕共享');
            return;
        }
        if (screenShareState.value) {
            ElMessage.warning('屏幕共享已打开');
            return;
        }
        try {
            screenShareStream.value = await navigator.mediaDevices.getDisplayMedia({ video: true });
            screenShareState.value = true;
            if (videoInputStream.value !== null) {
                closeVideo();
            }
            videoRef.value.srcObject = screenShareStream.value;
            ElMessage.success('屏幕共享已打开');
        } catch (error) {
            handleError(error);
        }
    }

    // Stop screen sharing
    function closeScreenShare() {
        if (screenShareStream.value === null || screenShareState.value === false) {
            ElMessage.error('请先打开屏幕共享');
            return;
        }
        try {
            const tracks = screenShareStream.value.getTracks();
            console.log(tracks);
            tracks.forEach(track => {
                if (track.kind === 'video') {
                    track.stop();
                }
            });
            screenShareStream.value = null;
            screenShareState.value = false;
            videoRef.value.srcObject = null;
            ElMessage.warning('屏幕共享已关闭');
        } catch (error) {
            handleError(error);
        }
    }


    // Map common getUserMedia / getDisplayMedia errors to user-facing messages
    function handleError(error) {
        switch (error.name) {
            case 'NotFoundError':
                ElMessage.error('找不到设备');
                break;
            case 'NotReadableError':
                ElMessage.error('设备不可读');
                break;
            case 'OverconstrainedError':
                ElMessage.error('设备配置不合理');
                break;
            case 'NotAllowedError':
                ElMessage.error('用户拒绝访问设备');
                break;
            case 'TypeError':
                ElMessage.error('设备类型错误');
                break;
            case 'SecurityError':
                ElMessage.error('安全错误');
                break;
            default:
                ElMessage.error('未知错误');
                break;
        }
    }

    return {
        openMirco,
        closeMirco,
        openVideo,
        closeVideo,
        openScreenShare,
        closeScreenShare,
        initDevice,
        videoRef,
        audioRef,
        hasAudioInput,
        hasVideoInput,
        hasAudioOutput,
        audioInputState,
        videoInputState,
        audioOutputState,
        screenShareState,
        audioInputStream,
        videoInputStream,
        audioOutputStream,
        screenShareStream
    }

}

export const useDeviceInstance = useDevice();
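
        Beyond plain booleans, getUserMedia also accepts richer constraints. The sketch below (not used in the code above) shows how you could request a preferred resolution or a specific camera picked from enumerateDevices():

// Sketch (not part of useDevice above): richer getUserMedia constraints —
// a preferred resolution plus, optionally, a specific camera chosen by deviceId.
async function openVideoWithConstraints(deviceId) {
    const constraints = {
        audio: false,
        video: {
            width: { ideal: 1280 },
            height: { ideal: 720 },
            ...(deviceId ? { deviceId: { exact: deviceId } } : {})
        }
    };
    return navigator.mediaDevices.getUserMedia(constraints);
}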

Part 5: Using the router (simple version)

        Here we use a simple router and open the WebSocket connection before entering the meeting page (meeting.vue), so the rest of the flow only runs once the connection has succeeded. This is deliberately minimal: you can add more checks, for example telling the user that joining the meeting failed, or redirecting to an error page, when the WebSocket connection cannot be established; a rough sketch of that idea follows the router code below.

import { createRouter, createWebHashHistory } from 'vue-router'
import { ElMessage } from 'element-plus';
import meeting from '../views/meeting.vue'
import meetingList from '../views/meetingList.vue'
import websock from '../method/websock';
const routes = [
    {
        path: '/',
        name: 'meetingList',
        component: meetingList,
    },
    {
        path: '/meeting',
        name: 'meeting',
        component: meeting,
        beforeEnter: async (to, from, next) => {
            try {
                await websock.open();
                next();
            } catch (error) {
                console.log(error);
                ElMessage.error('连接失败,请检查网络连接');
                next(false); // block navigation when the connection fails
            }
        }
    },
];

const router = createRouter({
    history: createWebHashHistory(),
    routes
});

export default router;
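
        For completeness, the failure handling mentioned above could look roughly like this (the /error route and its errorPage.vue component are assumptions, not part of this project):

// Sketch: redirect to an error page when the WebSocket cannot be opened.
// The '/error' route and errorPage.vue are assumed additions.
{
    path: '/meeting',
    name: 'meeting',
    component: meeting,
    beforeEnter: async (to, from, next) => {
        try {
            await websock.open();
            next();
        } catch (error) {
            ElMessage.error('连接失败,请检查网络连接');
            next({ path: '/error' }); // or next(false) to simply cancel navigation
        }
    }
},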

That is it for this chapter; at this point the core of this WebRTC setup has essentially been covered. From here you can turn it into a multi-party meeting, and I will keep updating the series as time allows.
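
        If you do take it to a multi-party meeting, the usual mesh-style approach is to keep one RTCPeerConnection per remote member, keyed by that member's ID. A very rough sketch (createPeerFor is a hypothetical per-user variant of the createRTCPeerConnection function shown in the meeting page):

// Rough sketch of a mesh-style multi-party extension: one RTCPeerConnection per remote user.
// createPeerFor is a hypothetical per-user variant of createRTCPeerConnection shown earlier.
const peerConnections = new Map();

function getOrCreatePeer(fromId) {
    if (!peerConnections.has(fromId)) {
        peerConnections.set(fromId, createPeerFor(fromId));
    }
    return peerConnections.get(fromId);
}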
