[iOS] A Survey of Video Recording Features (Part 1)


1. Ways to Implement Video Recording

  • UIImagePickerController
  • AVCaptureSession + AVCaptureMovieFileOutput
  • AVCaptureSession + AVAssetWriter

UIImagePickerController: a system-provided UI that outputs a video file directly.
AVCaptureSession + AVCaptureMovieFileOutput: supports a custom UI and outputs a video file directly.
AVCaptureSession + AVAssetWriter: supports a custom UI, but it outputs raw video and audio frames that you must handle yourself and assemble into a video file.

2. The System-Provided UIImagePickerController

2.1 The UIImagePickerController approach

This is currently the simplest way to integrate the camera, but it does not support customizing the camera UI. It is a view controller that wraps the complete video capture pipeline together with the camera interface.

2.1.1 Info.plist settings

Privacy - Microphone Usage Description (NSMicrophoneUsageDescription): the message shown when asking for microphone access, e.g. "Allow this app to use your microphone?"
Privacy - Camera Usage Description (NSCameraUsageDescription): the message shown when asking for camera access, e.g. "Allow this app to use your camera?"

2.1.2 Checking whether camera recording is supported

Before instantiating the camera, first check whether the device supports camera recording:

        /// Check whether video recording is supported
        if UIImagePickerController.isSourceTypeAvailable(UIImagePickerController.SourceType.camera) {
            if let availableMediaTypes = UIImagePickerController.availableMediaTypes(for: UIImagePickerController.SourceType.camera) {
                if !availableMediaTypes.contains("public.movie") {
                    print("Video recording is not supported")
                    return
                }
            }
        } else {
            // The camera source type itself is unavailable (for example, in the simulator)
            print("The camera is not available")
            return
        }
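The "public.movie" literal is the raw uniform type identifier for movies. On iOS 14 and later it can be obtained from the UniformTypeIdentifiers framework instead of being hard-coded; a minimal sketch of that variant:

    import UniformTypeIdentifiers

    if #available(iOS 14.0, *) {
        // UTType.movie.identifier resolves to the same "public.movie" string used above
        let movieTypeIdentifier = UTType.movie.identifier
        print(movieTypeIdentifier)  // "public.movie"
    }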
2.1.3 Confirming permissions

Request and confirm camera and microphone permissions:

        /// Camera permission
        AVCaptureDevice.requestAccess(for: AVMediaType.video) { [weak self] (granted) in
            guard let weakSelf = self else {
                return
            }
            if !granted {
                print("No permission to access the camera")
                return
            }
            
            // Microphone permission
            AVCaptureDevice.requestAccess(for: AVMediaType.audio) { [weak self] (granted) in
                guard let weakSelf = self else {
                    return
                }
                
                if !granted {
                    print("No permission to access the microphone")
                    return
                }
                // Present the recording screen; the completion handler may run on a
                // background queue, so hop back to the main queue before presenting UI
                DispatchQueue.main.async {
                    weakSelf.present(weakSelf.pickerController, animated: true, completion: nil)
                }
            }
        }
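requestAccess(for:) only shows the system prompt the first time; afterwards it simply reports the stored decision. Before requesting, you can inspect the current status and, for example, point a previously denied user to Settings. A minimal sketch (how each case is handled here is illustrative):

        switch AVCaptureDevice.authorizationStatus(for: AVMediaType.video) {
        case .authorized:
            break   // already granted; safe to present the camera directly
        case .notDetermined:
            break   // requestAccess(for:) will show the system prompt
        case .denied, .restricted:
            print("Camera access denied or restricted; guide the user to Settings > Privacy")
        @unknown default:
            break
        }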
2.1.4 Creating the UIImagePickerController object

Next, create a UIImagePickerController object and set its delegate so you can further process the recorded video (for example, save it to the photo library) and respond when the user closes the camera (see the cancellation sketch after 2.1.5):

    lazy var pickerController: UIImagePickerController = {
        let pickerController = UIImagePickerController()
        // Source of the picker: camera (as opposed to the photo library)
        pickerController.sourceType = UIImagePickerController.SourceType.camera
        // Media types to capture: public.image / public.movie
        pickerController.mediaTypes = ["public.movie"]
        // Which camera to use: front or rear
        pickerController.cameraDevice = UIImagePickerController.CameraDevice.rear
        // Flash mode of the camera
        // pickerController.cameraFlashMode = UIImagePickerController.CameraFlashMode.auto
        // Capture quality
        pickerController.videoQuality = UIImagePickerController.QualityType.typeHigh
        // Maximum recording duration, in seconds
        pickerController.videoMaximumDuration = 30
        // Whether the user may edit the result
        pickerController.allowsEditing = false
        // Delegate
        pickerController.delegate = self
        return pickerController
    }()
2.1.5 Implementing the UIImagePickerControllerDelegate methods
extension RecordVideoViewController: UIImagePickerControllerDelegate, UINavigationControllerDelegate
{
    func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey : Any]) {
        let mediaType = info[UIImagePickerController.InfoKey.mediaType] as! String

        if mediaType == "public.movie" {
            // URL of the recorded video file
            let mediaURL = info[UIImagePickerController.InfoKey.mediaURL] as! URL
            // File-system path of the video
            let pathString = mediaURL.path
            print("Video path: " + pathString)
            
            DispatchQueue.global().async {
                // Check whether the video can be saved to the photo library
                if UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(pathString) {
                    // Save the video to the photo library
                    UISaveVideoAtPathToSavedPhotosAlbum(pathString, self, #selector(self.saveVideo(videoPath:didFinishSavingWithError:contextInfo:)), nil)
                }
                DispatchQueue.main.async {
                    picker.dismiss(animated: true, completion: nil)
                }
            }
            
        }
    }
    
    @objc private func saveVideo(videoPath: String, didFinishSavingWithError error: NSError?, contextInfo: UnsafeRawPointer) {
        if error != nil {
            print("Failed to save the video")
        } else {
            print("Video saved successfully")
        }
    }
}
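The delegate extension above only handles a finished recording. To also respond when the user closes the camera, as mentioned in 2.1.4, the cancel callback can be added to the same extension; a minimal sketch that simply dismisses the picker:

    func imagePickerControllerDidCancel(_ picker: UIImagePickerController) {
        // The user tapped Cancel in the camera UI; dismiss it
        picker.dismiss(animated: true, completion: nil)
    }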

3. Custom Camera with AVFoundation

The central class for video capture in AVFoundation is AVCaptureSession. It coordinates the flow of data between audio/video inputs and outputs.
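Condensed into one place, the wiring looks roughly like the sketch below; the individual steps are broken out in sections 3.1.3 to 3.1.8.

    import AVFoundation

    // camera device -> AVCaptureDeviceInput -> AVCaptureSession -> AVCaptureMovieFileOutput
    let session = AVCaptureSession()
    session.beginConfiguration()
    if let camera = AVCaptureDevice.default(for: .video),
       let input = try? AVCaptureDeviceInput(device: camera),
       session.canAddInput(input) {
        session.addInput(input)
    }
    let movieOutput = AVCaptureMovieFileOutput()
    if session.canAddOutput(movieOutput) {
        session.addOutput(movieOutput)
    }
    session.commitConfiguration()
    session.startRunning()   // starts the data flow from inputs to outputs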

3.1 The AVCaptureSession + AVCaptureMovieFileOutput approach

3.1.1 Info.plist settings

Privacy - Microphone Usage Description (NSMicrophoneUsageDescription): the message shown when asking for microphone access.
Privacy - Camera Usage Description (NSCameraUsageDescription): the message shown when asking for camera access.

3.1.2 Confirming permissions

Request and confirm camera and microphone permissions:

        /// Camera permission
        AVCaptureDevice.requestAccess(for: AVMediaType.video) { [weak self] (granted) in
            guard let weakSelf = self else {
                return
            }
            if !granted {
                print("No permission to access the camera")
                return
            }
            
            // Microphone permission
            AVCaptureDevice.requestAccess(for: AVMediaType.audio) { [weak self] (granted) in
                guard let weakSelf = self else {
                    return
                }
                
                if !granted {
                    print("No permission to access the microphone")
                    return
                }
                
            }
        }
3.1.3 Creating the AVCaptureSession

To use a capture session, you first instantiate it, add inputs and outputs, set the resolution preset, and then start the data flow from the inputs to the outputs:

    /// The video capture session
    let captureSession = AVCaptureSession()
3.1.4 Adding the video input device
    //MARK: Add the video input device
    func addInputVideo() {
        self.captureSession.beginConfiguration()

        guard let videoDevice = AVCaptureDevice.default(for: AVMediaType.video),
              let videoInput = try? AVCaptureDeviceInput(device: videoDevice) else {
            self.captureSession.commitConfiguration()
            return
        }
        if self.captureSession.canAddInput(videoInput) {
            self.captureSession.addInput(videoInput)
        }
        
        self.captureSession.commitConfiguration()
    }
3.1.5 Adding the audio input device
    //MARK: Add the audio input device
    func addInputAudio() {
        self.captureSession.beginConfiguration()
        
        guard let audioDevice = AVCaptureDevice.default(for: AVMediaType.audio),
              let audioInput = try? AVCaptureDeviceInput(device: audioDevice) else {
            self.captureSession.commitConfiguration()
            return
        }
        if self.captureSession.canAddInput(audioInput) {
            self.captureSession.addInput(audioInput)
        }
        self.captureSession.commitConfiguration()
    }
3.1.6 Setting the resolution
    //MARK: Set the resolution preset
    func setPreset() {
        self.captureSession.beginConfiguration()
        if self.captureSession.canSetSessionPreset(AVCaptureSession.Preset.hd1280x720) {
            self.captureSession.sessionPreset = AVCaptureSession.Preset.hd1280x720
        }
        self.captureSession.commitConfiguration()
    }
    
3.1.7 Configuring the output
    //MARK: Configure the movie file output
    func setOutput() {
        self.captureSession.beginConfiguration()
        
        // The video connection only exists after the output has been added to the session
        if self.captureSession.canAddOutput(self.fileOutput) {
            self.captureSession.addOutput(self.fileOutput)
        }
        
        if let captureConnection = self.fileOutput.connection(with: AVMediaType.video) {
            // Enable video stabilization to reduce camera shake
            if captureConnection.isVideoStabilizationSupported {
                captureConnection.preferredVideoStabilizationMode = .auto
            }
            // Keep the recorded orientation consistent with the preview layer
            captureConnection.videoOrientation = (self.videoLayer.connection?.videoOrientation)!
        }
        // Movie fragments are written every 10 seconds by default; setting the interval
        // to .invalid disables fragment writing so the duration is not limited by it
        self.fileOutput.movieFragmentInterval = CMTime.invalid
        self.captureSession.commitConfiguration()
    }
3.1.8 Showing the capture preview and starting the session
    /// Preview layer that shows the live camera feed
    lazy var videoLayer: AVCaptureVideoPreviewLayer = {
        let videoLayer = AVCaptureVideoPreviewLayer(session: self.captureSession)
        videoLayer.videoGravity = AVLayerVideoGravity.resizeAspectFill
        videoLayer.masksToBounds = true
        return videoLayer
    }()
    // AVCaptureVideoPreviewLayer displays the camera's live feed on the view controller
        DispatchQueue.main.async {
            weakSelf.videoLayer.frame = weakSelf.view.bounds
            weakSelf.view.layer.addSublayer(weakSelf.videoLayer)
            weakSelf.captureSession.startRunning()
            // Create the start / stop buttons
            weakSelf.setUI()
        }
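Note that startRunning() blocks the calling thread until the session is running, so Apple recommends starting the session off the main thread. A minimal sketch of that variant, assuming a dedicated serial queue (the queue label is illustrative):

    let sessionQueue = DispatchQueue(label: "record.video.session.queue")  // label is an assumption

    func startCaptureSession() {
        sessionQueue.async { [weak self] in
            // Blocking call; keeping it off the main thread avoids UI stalls
            self?.captureSession.startRunning()
        }
    }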
3.1.9 Starting the recording
    //MARK: Start recording
    @objc func starRecordVideo() {
        
        if !self.isRecording {
            // Build the file path where the recording will be saved
            let filePath = self.getNewPath(videoTyle: AVFileType.mp4)
            let fileURL = URL(fileURLWithPath: filePath)
            // Start the movie file output
            fileOutput.startRecording(to: fileURL, recordingDelegate: self)
            
            // Update state: recording...
            self.isRecording = true
            // Toggle the start / stop button appearance
            self.starButton.backgroundColor = UIColor.lightGray
            self.starButton.isEnabled = false
            
            self.stopButton.backgroundColor = UIColor.blue
            self.stopButton.isEnabled = true
        }
    }
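The getNewPath(videoTyle:) helper used here is not shown in the article; a possible implementation that writes into the temporary directory (the directory and naming scheme are assumptions):

    func getNewPath(videoTyle: AVFileType) -> String {
        // Map the file type to an extension (only the two types used in this article)
        let ext = (videoTyle == AVFileType.mp4) ? "mp4" : "mov"
        let fileName = "record_\(Int(Date().timeIntervalSince1970)).\(ext)"
        return (NSTemporaryDirectory() as NSString).appendingPathComponent(fileName)
    }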
3.1.10 Stopping the recording
    //MARK: Stop recording
    @objc func stopRecordVideo() {
        if self.isRecording {
            // Stop the movie file output
            fileOutput.stopRecording()
            
            // Update state: recording finished
            self.isRecording = false
            
            // Toggle the start / stop button appearance
            self.starButton.backgroundColor = UIColor.red
            self.starButton.isEnabled = true
            
            self.stopButton.backgroundColor = UIColor.lightGray
            self.stopButton.isEnabled = false
        }
    }

3.1.11 AVCaptureFileOutputRecordingDelegate
//MARK: AVCaptureFileOutputRecordingDelegate
extension RecordVideo2ViewController:AVCaptureFileOutputRecordingDelegate
{
    // Recording started
    func fileOutput(_ output: AVCaptureFileOutput, didStartRecordingTo fileURL: URL, from connections: [AVCaptureConnection]) {
        
    }
    // Recording finished
    func fileOutput(_ output: AVCaptureFileOutput, didFinishRecordingTo outputFileURL: URL, from connections: [AVCaptureConnection], error: Error?) {
        // Get the file size of the video
        self.getVideoSize(videoUrl: outputFileURL)
        // Get the duration of the video
        self.getVideoLength(videoUrl: outputFileURL)
        // Also save a copy to the photo library, for easier testing
        self.saveVideoToAlbum(videoUrl: outputFileURL)
        // Grab the frame at a given time
        self.getImage(videoUrl: outputFileURL, cmtime: CMTimeMake(value: 1, timescale: 1), width: 300)
        
        // Compress the video
        let newPath = self.getNewPath(videoTyle: AVFileType.mov)
        print(newPath)
        self.convertVideo(inputURL: outputFileURL, outputURL: URL(fileURLWithPath: newPath), presetName: AVAssetExportPresetMediumQuality) { (success) in
            if success {
                print("Compression succeeded")
            } else {
                print("Compression failed")
            }
        }
    }
}
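The convertVideo(inputURL:outputURL:presetName:completion:) helper is also not shown in the article; one possible implementation based on AVAssetExportSession (treat it as a sketch rather than the author's code):

    func convertVideo(inputURL: URL, outputURL: URL, presetName: String, completion: @escaping (Bool) -> Void) {
        let asset = AVAsset(url: inputURL)
        guard let exportSession = AVAssetExportSession(asset: asset, presetName: presetName) else {
            completion(false)
            return
        }
        exportSession.outputURL = outputURL
        exportSession.outputFileType = AVFileType.mov
        exportSession.shouldOptimizeForNetworkUse = true
        exportSession.exportAsynchronously {
            // .completed means the compressed file was written to outputURL
            completion(exportSession.status == .completed)
        }
    }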

3.2 The AVCaptureSession + AVAssetWriter approach

For finer control over the video, you can capture the raw video and audio sample buffers separately with AVCaptureVideoDataOutput and AVCaptureAudioDataOutput instead of AVCaptureMovieFileOutput.
Their delegates, AVCaptureVideoDataOutputSampleBufferDelegate and AVCaptureAudioDataOutputSampleBufferDelegate, let you process each sample buffer (for example, apply a filter to the video) or pass it through unchanged.
An AVAssetWriter object then writes the sample buffers to a file.

3.2.1 Info.plist settings

Privacy - Microphone Usage Description (NSMicrophoneUsageDescription): the message shown when asking for microphone access.
Privacy - Camera Usage Description (NSCameraUsageDescription): the message shown when asking for camera access.

3.2.2 Confirming permissions

Request and confirm camera and microphone permissions:

        /// Camera permission
        AVCaptureDevice.requestAccess(for: AVMediaType.video) { [weak self] (granted) in
            guard let weakSelf = self else {
                return
            }
            if !granted {
                print("No permission to access the camera")
                return
            }
            
            // Microphone permission
            AVCaptureDevice.requestAccess(for: AVMediaType.audio) { [weak self] (granted) in
                guard let weakSelf = self else {
                    return
                }
                
                if !granted {
                    print("No permission to access the microphone")
                    return
                }
                
            }
        }
3.2.3 Creating the AVCaptureSession

To use a capture session, you first instantiate it, add inputs and outputs, set the resolution preset, and then start the data flow from the inputs to the outputs:

    /// The video capture session
    let captureSession = AVCaptureSession()
3.2.4 Adding the video input device
    //MARK: Add the video input device
    func addInputVideo() {
        self.captureSession.beginConfiguration()

        guard let videoDevice = AVCaptureDevice.default(for: AVMediaType.video),
              let videoInput = try? AVCaptureDeviceInput(device: videoDevice) else {
            self.captureSession.commitConfiguration()
            return
        }
        if self.captureSession.canAddInput(videoInput) {
            self.captureSession.addInput(videoInput)
        }
        
        self.captureSession.commitConfiguration()
    }
3.2.5 Adding the audio input device
    //MARK: Add the audio input device
    func addInputAudio() {
        self.captureSession.beginConfiguration()
        
        guard let audioDevice = AVCaptureDevice.default(for: AVMediaType.audio),
              let audioInput = try? AVCaptureDeviceInput(device: audioDevice) else {
            self.captureSession.commitConfiguration()
            return
        }
        if self.captureSession.canAddInput(audioInput) {
            self.captureSession.addInput(audioInput)
        }
        self.captureSession.commitConfiguration()
    }
3.2.6 Setting the resolution
    //MARK: Set the resolution preset
    func setPreset() {
        self.captureSession.beginConfiguration()
        if self.captureSession.canSetSessionPreset(AVCaptureSession.Preset.hd1280x720) {
            self.captureSession.sessionPreset = AVCaptureSession.Preset.hd1280x720
        }
        self.captureSession.commitConfiguration()
    }
    
3.2.7 Adding the video and audio data outputs
    //MARK: Add the video and audio data outputs
    func addOutputVideoAndAudio() {
        self.captureSession.beginConfiguration()
        
        self.videoDataOutput.setSampleBufferDelegate(self, queue: self.sessionQueue!)
        if self.captureSession.canAddOutput(self.videoDataOutput) {
            self.captureSession.addOutput(self.videoDataOutput)
        }
        
        self.audioDataOutput.setSampleBufferDelegate(self, queue: self.sessionQueue)
        if self.captureSession.canAddOutput(self.audioDataOutput) {
            self.captureSession.addOutput(self.audioDataOutput)
        }
        
        self.captureSession.commitConfiguration()
    }
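The videoDataOutput, audioDataOutput and sessionQueue properties referenced above are not declared in the article; they might look like the following sketch (the pixel format and queue label are assumptions):

    lazy var videoDataOutput: AVCaptureVideoDataOutput = {
        let output = AVCaptureVideoDataOutput()
        // BGRA is a common pixel format when frames may be processed (e.g. filtered) before writing
        output.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
        // Drop late frames instead of queueing them, to keep capture real-time
        output.alwaysDiscardsLateVideoFrames = true
        return output
    }()

    lazy var audioDataOutput = AVCaptureAudioDataOutput()

    /// Serial queue on which sample buffers are delivered
    let sessionQueue: DispatchQueue? = DispatchQueue(label: "record.video.data.queue")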
3.2.8 Showing the capture preview and starting the session
    /// Preview layer that shows the live camera feed
    lazy var videoLayer: AVCaptureVideoPreviewLayer = {
        let videoLayer = AVCaptureVideoPreviewLayer(session: self.captureSession)
        videoLayer.videoGravity = AVLayerVideoGravity.resizeAspectFill
        videoLayer.masksToBounds = true
        return videoLayer
    }()
    // AVCaptureVideoPreviewLayer displays the camera's live feed on the view controller
        DispatchQueue.main.async {
            weakSelf.videoLayer.frame = weakSelf.view.bounds
            weakSelf.view.layer.addSublayer(weakSelf.videoLayer)
            weakSelf.captureSession.startRunning()
            // Create the start / stop buttons
            weakSelf.setUI()
        }
3.2.9 Starting the recording
    //MARK: Start recording
    @objc func starRecordVideo() {
        
        if !self.isRecording {
            print("Start recording")
            // Build the file path where the recording will be saved
            self.videoPath = self.getNewPath(videoTyle: AVFileType.mp4)
            print(self.videoPath)
            setAssetWriter(videoPath: self.videoPath)
            
            // Update state: recording...
            self.isRecording = true
            // Toggle the start / stop button appearance
            self.starButton.backgroundColor = UIColor.lightGray
            self.starButton.isEnabled = false
            
            self.stopButton.backgroundColor = UIColor.blue
            self.stopButton.isEnabled = true
        }
        
    }
3.2.10 Configuring the AVAssetWriter
    //MARK: Configure the AVAssetWriter
    func setAssetWriter(videoPath: String) {
        // The file URL the recording will be written to
        let fileURL = URL(fileURLWithPath: videoPath)
        
        if let assetWriter = try? AVAssetWriter.init(url: fileURL, fileType: AVFileType.mp4) {
            self.assetWriter = assetWriter
              
            var width = UIScreen.main.bounds.size.height
            var height = UIScreen.main.bounds.size.width
            // Output dimensions (in points) used for the bit-rate estimate
            let numPixels = width * height
            // Bits per pixel
            let bitsPerPixel: CGFloat = 12.0
            let bitsPerSecond = numPixels * bitsPerPixel
            if (false) // whether the device has a notch
            {
                width = UIScreen.main.bounds.size.height - 146
                height = UIScreen.main.bounds.size.width
            }
                
            let compressionProperties = [
                // Average bit rate = dimensions * bits per pixel; larger values give finer detail
                AVVideoAverageBitRateKey : bitsPerSecond,
                // Expected source frame rate
                AVVideoExpectedSourceFrameRateKey : 15,
                // Maximum key-frame interval; 1 makes every frame a key frame, larger values compress more
                AVVideoMaxKeyFrameIntervalKey : 15,
                // H.264 profile / level (picture quality)
                AVVideoProfileLevelKey : AVVideoProfileLevelH264BaselineAutoLevel
            ] as [String : Any]
            
            let videoCompressionSettings = [
                AVVideoCodecKey : AVVideoCodecH264,
                AVVideoWidthKey : width * 2,
                AVVideoHeightKey : height * 2,
//                AVVideoScalingModeKey : AVVideoScalingModeResizeAspectFill,
                AVVideoCompressionPropertiesKey : compressionProperties
            ] as [String : Any]
            
            self.assetWriterVideoInput = AVAssetWriterInput.init(mediaType: AVMediaType.video, outputSettings: videoCompressionSettings)
            // expectsMediaDataInRealTime must be true because the data arrives live from the capture session
            self.assetWriterVideoInput?.expectsMediaDataInRealTime = true
      		
            // Audio settings
            let audioCompressionSettings = [
                // Bit rate per channel
                AVEncoderBitRatePerChannelKey : 28000,
                // Recording format
                AVFormatIDKey : kAudioFormatMPEG4AAC,
                // Number of channels: 1 = mono, 2 = stereo (MP3 requires stereo)
                AVNumberOfChannelsKey : 1,
                // Sample rate; 8000 Hz is telephone quality, which is enough for plain voice recording
                AVSampleRateKey : 22050,
                // Bits per sample: 8, 16, 24 or 32
                AVLinearPCMBitDepthKey: 16,
                // Encoder quality
                AVEncoderAudioQualityKey: AVAudioQuality.medium.rawValue
            ] as [String : Any]
            
            self.assetWriterAudioInput = AVAssetWriterInput.init(mediaType: AVMediaType.audio, outputSettings: audioCompressionSettings)
            self.assetWriterAudioInput?.expectsMediaDataInRealTime = true
            if self.assetWriter!.canAdd(self.assetWriterVideoInput!) {
                self.assetWriter?.add(self.assetWriterVideoInput!)
            }
            
            if self.assetWriter!.canAdd(self.assetWriterAudioInput!) {
                self.assetWriter?.add(self.assetWriterAudioInput!)
            }
            
            self.canWrite = false
            
        } else {
            print("Failed to create the AVAssetWriter")
        }
    }
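As a rough sanity check of the bit-rate formula above: on a 375 × 812 pt screen (an assumption; the exact size depends on the device), numPixels is 375 × 812 = 304,500, so the average bit rate comes out to 304,500 × 12 ≈ 3.65 Mbps, while the width and height handed to the writer are doubled to 1624 × 750.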

3.2.11 Handling each video and audio frame
extension RecordVideo3ViewController: AVCaptureVideoDataOutputSampleBufferDelegate,AVCaptureAudioDataOutputSampleBufferDelegate
{
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {        
        autoreleasepool {
            if !isRecording {
                return
            }
            if connection == self.videoDataOutput.connection(with: AVMediaType.video) {
                objc_sync_enter(self)
                self.appendSampleBuffer(sampleBuffer: sampleBuffer, mediaType: AVMediaType.video)
                objc_sync_exit(self)
            }
            if connection == self.audioDataOutput.connection(with: AVMediaType.audio) {
                objc_sync_enter(self)
                self.appendSampleBuffer(sampleBuffer: sampleBuffer, mediaType: AVMediaType.audio)
                objc_sync_exit(self)
            }
        }
    }
    func appendSampleBuffer(sampleBuffer: CMSampleBuffer, mediaType: AVMediaType) {
        autoreleasepool {
            // Start the writer session at the timestamp of the first video frame
            if (!self.canWrite && mediaType == AVMediaType.video) {
                print("Start writing with AVAssetWriter")
                self.assetWriter?.startWriting()
                self.assetWriter?.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
                self.canWrite = true
            }
            
            // Append the video data
            if mediaType == AVMediaType.video {
                if self.assetWriterVideoInput!.isReadyForMoreMediaData {
                    let success = self.assetWriterVideoInput!.append(sampleBuffer)
                    if !success {
                        // Appending failed: stop recording
                        objc_sync_enter(self)
                        self.stopRecordVideo()
                        objc_sync_exit(self)
                    }
                }
            }
            
            // Append the audio data
            if mediaType == AVMediaType.audio {
                if self.assetWriterAudioInput!.isReadyForMoreMediaData {
                    let success = self.assetWriterAudioInput!.append(sampleBuffer)
                    if !success {
                        // Appending failed: stop recording
                        objc_sync_enter(self)
                        self.stopRecordVideo()
                        objc_sync_exit(self)
                    }
                }
            }
        }
    }
}
3.2.12 Stopping the recording
    //MARK: Stop recording
    @objc func stopRecordVideo() {
        if self.isRecording {
            print("Stop recording")
            // Finish writing the output file
            if self.assetWriter != nil && self.assetWriter?.status == AVAssetWriter.Status.writing {
                self.assetWriter?.finishWriting { [weak self] in
                    guard let weakSelf = self else {
                        return
                    }
                    weakSelf.canWrite = false
                    weakSelf.assetWriter = nil
                    weakSelf.assetWriterAudioInput = nil
                    weakSelf.assetWriterVideoInput = nil
                }
            }
            
            // Update state: recording finished
            self.isRecording = false
            
            // Toggle the start / stop button appearance
            self.starButton.backgroundColor = UIColor.red
            self.starButton.isEnabled = true
            
            self.stopButton.backgroundColor = UIColor.lightGray
            self.stopButton.isEnabled = false
        }
    }