VideoToolbox Encoding and Decoding

The video encoding workflow with VideoToolbox

Official documentation: https://developer.apple.com/documentation/videotoolbox

1.VTCompressionSessionCreate

Create a compression session using VTCompressionSessionCreate.

// Output callback invoked by VideoToolbox for every frame handed to the encoder.
static void EncodeCallBack(void *outputCallbackRefCon, void *sourceFrameRefCon, OSStatus status, VTEncodeInfoFlags infoFlags, CMSampleBufferRef sampleBuffer) {
}

VTCompressionSessionRef session;
OSStatus status = VTCompressionSessionCreate(NULL, width, height, codecType, NULL, NULL, NULL, EncodeCallBack, (__bridge void *)self, &session);
if (status != noErr) {
    // Creating the compression session failed; handle the error here.
}

2.VTSessionSetProperty or VTSessionSetProperties

Optionally, configure the session with your desired Compression Properties by calling VTSessionSetProperty or VTSessionSetProperties.

Check whether a property is supported before setting it:
- (BOOL)isSupportPropertyWithSession:(VTCompressionSessionRef)session key:(CFStringRef)key {
    OSStatus status;
    static CFDictionaryRef supportedPropertyDictionary;
    if (!supportedPropertyDictionary) {
        // Copy the dictionary of all properties this session supports.
        status = VTSessionCopySupportedPropertyDictionary(session, &supportedPropertyDictionary);
        if (status != noErr) {
            return NO;
        }
    }
    // Check whether the requested key is in the supported-property dictionary.
    BOOL isSupport = CFDictionaryContainsKey(supportedPropertyDictionary, key) ? YES : NO;
    return isSupport;
}

Set a property on the session:
- (OSStatus)setSessionPropertyWithSession:(VTCompressionSessionRef)session key:(CFStringRef)key value:(CFTypeRef)value {
    if (value == NULL) {
        return noErr;
    }
    
    OSStatus status = VTSessionSetProperty(session, key, value);
    if (status != noErr)  {
        log4cplus_error("Video Encoder:", "Set session of %s Failed, status = %d",CFStringGetCStringPtr(key, kCFStringEncodingUTF8),status);
    }
    return status;
}


- (void)setCompressionSessionPropertyWithSession:(VTCompressionSessionRef)session fps:(int)fps bitrate:(int)bitrate isSupportRealtimeEncode:(BOOL)isSupportRealtimeEncode iFrameDuration:(int)iFrameDuration EncoderType:(VideoEncoderType)encoderType {
    
    int maxCount = 3;
    if (!isSupportRealtimeEncode) {
        if([self isSupportPropertyWithSession:session key:kVTCompressionPropertyKey_MaxFrameDelayCount]) {
            CFNumberRef ref   = CFNumberCreate(NULL, kCFNumberSInt32Type, &maxCount);
            [self setSessionPropertyWithSession:session key:kVTCompressionPropertyKey_MaxFrameDelayCount value:ref];
            CFRelease(ref);
        }
    }
    
    if(fps) {
        if([self isSupportPropertyWithSession:session key:kVTCompressionPropertyKey_ExpectedFrameRate]) {
            int         value = fps;
            CFNumberRef ref   = CFNumberCreate(NULL, kCFNumberSInt32Type, &value);
            [self setSessionPropertyWithSession:session key:kVTCompressionPropertyKey_ExpectedFrameRate value:ref];
            CFRelease(ref);
        }
    }else {
        log4cplus_error("Video Encoder:", "Current fps is 0");
        return;
    }
    
    if(bitrate) {
        if([self isSupportPropertyWithSession:session key:kVTCompressionPropertyKey_AverageBitRate]) {
            int value = bitrate;
            CFNumberRef ref = CFNumberCreate(NULL, kCFNumberSInt32Type, &value);
            [self setSessionPropertyWithSession:session key:kVTCompressionPropertyKey_AverageBitRate value:ref];
            CFRelease(ref);
        }
    }else {
        log4cplus_error("Video Encoder:", "Current bitrate is 0");
        return;
    }
    
    
    if([self isSupportPropertyWithSession:session key:kVTCompressionPropertyKey_RealTime]) {
        log4cplus_info("Video Encoder:", "use realTimeEncoder");
        [self setSessionPropertyWithSession:session key:kVTCompressionPropertyKey_RealTime value:isSupportRealtimeEncode ? kCFBooleanTrue : kCFBooleanFalse];
    }
    
    // Disable B-frames by disallowing frame reordering.
    if([self isSupportPropertyWithSession:session key:kVTCompressionPropertyKey_AllowFrameReordering]) {
        [self setSessionPropertyWithSession:session key:kVTCompressionPropertyKey_AllowFrameReordering value:kCFBooleanFalse];
    }
    
    if (encoderType == H264Encoder) {
        if (isSupportRealtimeEncode) {
            if([self isSupportPropertyWithSession:session key:kVTCompressionPropertyKey_ProfileLevel]) {
                [self setSessionPropertyWithSession:session key:kVTCompressionPropertyKey_ProfileLevel value:kVTProfileLevel_H264_Main_AutoLevel];
            }
        }else {
            if([self isSupportPropertyWithSession:session key:kVTCompressionPropertyKey_ProfileLevel]) {
                [self setSessionPropertyWithSession:session key:kVTCompressionPropertyKey_ProfileLevel value:kVTProfileLevel_H264_Baseline_AutoLevel];
            }
            
            if([self isSupportPropertyWithSession:session key:kVTCompressionPropertyKey_H264EntropyMode]) {
                [self setSessionPropertyWithSession:session key:kVTCompressionPropertyKey_H264EntropyMode value:kVTH264EntropyMode_CAVLC];
            }
        }
    }else if (encoderType == H265Encoder) {
        if([self isSupportPropertyWithSession:session key:kVTCompressionPropertyKey_ProfileLevel]) {
            [self setSessionPropertyWithSession:session key:kVTCompressionPropertyKey_ProfileLevel value:kVTProfileLevel_HEVC_Main_AutoLevel];
        }
    }
    
    
    if([self isSupportPropertyWithSession:session key:kVTCompressionPropertyKey_MaxKeyFrameIntervalDuration]) {
        int         value   = iFrameDuration;
        CFNumberRef ref     = CFNumberCreate(NULL, kCFNumberSInt32Type, &value);
        [self setSessionPropertyWithSession:session key:kVTCompressionPropertyKey_MaxKeyFrameIntervalDuration value:ref];
        CFRelease(ref);
    }
    
    log4cplus_info("Video Encoder:", "The compression session max frame delay count = %d, expected frame rate = %d, average bitrate = %d, is support realtime encode = %d, I frame duration = %d",maxCount, fps, bitrate, isSupportRealtimeEncode,iFrameDuration);
}


3.VTCompressionSessionEncodeFrame, VTCompressionOutputCallback

Encode video frames using VTCompressionSessionEncodeFrame and receive the compressed video frames in the session’s VTCompressionOutputCallback.

-(void)startEncodeWithBuffer:(CMSampleBufferRef)sampleBuffer session:(VTCompressionSessionRef)session isNeedFreeBuffer:(BOOL)isNeedFreeBuffer isDrop:(BOOL)isDrop  needForceInsertKeyFrame:(BOOL)needForceInsertKeyFrame lock:(NSLock *)lock {
    [lock lock];
    
    if(session == NULL) {
        log4cplus_error("Video Encoder:", "%s,session is empty",__func__);
        [self handleEncodeFailedWithIsNeedFreeBuffer:isNeedFreeBuffer sampleBuffer:sampleBuffer];
        return;
    }
    
    // The first frame must be an I-frame; use it to establish the reference timestamp.
    static BOOL isFirstFrame = YES;
    if(isFirstFrame && g_capture_base_time == 0) {
        CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
        g_capture_base_time = CMTimeGetSeconds(pts); // absolute system time in seconds
        // g_capture_base_time = g_tvustartcaptureTime - (ntp_time_offset/1000);
        isFirstFrame = NO;
        log4cplus_error("Video Encoder:","start capture time = %u",g_capture_base_time);
    }
    
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CMTime presentationTimeStamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    
    // Switching to a different source can produce mosaic artifacts when timestamps are not monotonically increasing.
    static int64_t lastPts = 0;
    int64_t currentPts = (int64_t)(CMTimeGetSeconds(presentationTimeStamp) * 1000);
    if (currentPts - lastPts < 0) {
        log4cplus_error("Video Encoder:","Switch different source data the timestamp < last timestamp, currentPts = %lld, lastPts = %lld, duration = %lld",currentPts, lastPts, currentPts - lastPts);
        [self handleEncodeFailedWithIsNeedFreeBuffer:isNeedFreeBuffer sampleBuffer:sampleBuffer];
        return;
    }
    lastPts = currentPts;
    
    OSStatus status = noErr;
    NSDictionary *properties = @{(__bridge NSString *)kVTEncodeFrameOptionKey_ForceKeyFrame:@(needForceInsertKeyFrame)};
    status = VTCompressionSessionEncodeFrame(session,
                                             imageBuffer,
                                             presentationTimeStamp,
                                             kCMTimeInvalid,
                                             (__bridge CFDictionaryRef)properties,
                                             NULL,
                                             NULL);
    
    if(status != noErr) {
        log4cplus_error("Video Encoder:", "encode frame failed");
        [self handleEncodeFailedWithIsNeedFreeBuffer:isNeedFreeBuffer sampleBuffer:sampleBuffer];
        return;
    }
    
    [lock unlock];
    if (isNeedFreeBuffer) {
        if (sampleBuffer != NULL) {
            CFRelease(sampleBuffer);
            log4cplus_debug("Video Encoder:", "release the sample buffer");
        }
    }
}

- (void)handleEncodeFailedWithIsNeedFreeBuffer:(BOOL)isNeedFreeBuffer sampleBuffer:(CMSampleBufferRef)sampleBuffer {
    // Sample buffers that come from the system must not be released here; buffers we created ourselves must be released.
    [self.lock unlock];
    if (isNeedFreeBuffer) {
        if (sampleBuffer != NULL) {
            CFRelease(sampleBuffer);
            log4cplus_debug("Video Encoder:", "release the sample buffer");
        }
    }
}

4.VTCompressionSessionCompleteFrames

To force the completion of some or all pending frames, call VTCompressionSessionCompleteFrames.
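
A minimal sketch, assuming session is a valid VTCompressionSessionRef: passing kCMTimeInvalid completes every pending frame, while a numeric CMTime completes only frames up to that presentation timestamp.

// Emit all pending frames; their output callbacks fire before this call returns.
VTCompressionSessionCompleteFrames(session, kCMTimeInvalid);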

5. VTCompressionSessionInvalidate

When you finish with the compression session, call VTCompressionSessionInvalidate to invalidate it and CFRelease to free its memory.

-(void)tearDownSessionWithSession:(VTCompressionSessionRef)session lock:(NSLock *)lock {
    log4cplus_error("Video Encoder:","tear down session");
    [lock lock];
    
    if (session == NULL) {
        log4cplus_error("Video Encoder:", "%s current compression is NULL",__func__);
        [lock unlock];
        return;
    }else {
        VTCompressionSessionCompleteFrames(session, kCMTimeInvalid);
        VTCompressionSessionInvalidate(session);
        CFRelease(session);
        session = NULL;
    }
    
    [lock unlock];
}

CMSampleBufferRef

In the encoder callback VTCompressionOutputCallback you receive a CMSampleBufferRef; the following calls extract the relevant information from it.

1.Get the PTS: CMSampleBufferGetPresentationTimeStamp(CMSampleBufferRef)

2.Get the DTS: CMSampleBufferGetDecodeTimeStamp(CMSampleBufferRef)
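
A minimal sketch of reading both timestamps inside the callback, assuming sampleBuffer is the encoded sample delivered to VTCompressionOutputCallback:

CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
CMTime dts = CMSampleBufferGetDecodeTimeStamp(sampleBuffer);
// The DTS can be kCMTimeInvalid when the buffer carries no explicit decode timestamp
// (e.g. frame reordering disabled); fall back to the PTS in that case.
double ptsInSeconds = CMTimeGetSeconds(pts);
double dtsInSeconds = CMTIME_IS_VALID(dts) ? CMTimeGetSeconds(dts) : ptsInSeconds;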

3.Check whether the frame is a key frame:
BOOL isKeyFrame = NO;
CFArrayRef attachments = CMSampleBufferGetSampleAttachmentsArray(sampleBuffer, false);
if(attachments != NULL && CFArrayGetCount(attachments) > 0) {
    CFDictionaryRef attachment = (CFDictionaryRef)CFArrayGetValueAtIndex(attachments, 0);
    CFBooleanRef dependsOnOthers = (CFBooleanRef)CFDictionaryGetValue(attachment, kCMSampleAttachmentKey_DependsOnOthers);
    isKeyFrame = (dependsOnOthers == kCFBooleanFalse);
}

4.Get the CMBlockBufferRef that backs the CMSampleBufferRef:
CMBlockBufferRef block = CMSampleBufferGetDataBuffer(sampleBuffer);
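
From there, a sketch of reading the raw encoded bytes (variable names are illustrative; for H.264/HEVC the payload is length-prefixed NAL units in AVCC/HVCC layout, not Annex B):

size_t lengthAtOffset = 0;
size_t totalLength = 0;
char *dataPointer = NULL;
OSStatus status = CMBlockBufferGetDataPointer(block, 0, &lengthAtOffset, &totalLength, &dataPointer);
if (status == kCMBlockBufferNoErr && dataPointer != NULL) {
    // dataPointer/totalLength now describe the encoded frame data.
}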

Decoding Workflow

A decompression session supports the decompression of a sequence of video frames. Here’s the basic workflow:

1.VTDecompressionSessionCreate

Create a decompression session by calling
VTDecompressionSessionCreate(allocator:formatDescription:decoderSpecification:imageBufferAttributes:outputCallback:decompressionSessionOut:).
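
A minimal Objective-C sketch of this step, assuming formatDescription is a CMVideoFormatDescriptionRef built from the stream's parameter sets (e.g. SPS/PPS) and DecodeCallBack is your own output callback:

static void DecodeCallBack(void *decompressionOutputRefCon, void *sourceFrameRefCon, OSStatus status, VTDecodeInfoFlags infoFlags, CVImageBufferRef imageBuffer, CMTime presentationTimeStamp, CMTime presentationDuration) {
    // imageBuffer holds one decoded frame.
}

// Request NV12 output pixel buffers (an example choice).
NSDictionary *imageBufferAttributes = @{(__bridge NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)};

VTDecompressionOutputCallbackRecord callbackRecord;
callbackRecord.decompressionOutputCallback = DecodeCallBack;
callbackRecord.decompressionOutputRefCon   = (__bridge void *)self;

VTDecompressionSessionRef decodeSession = NULL;
OSStatus status = VTDecompressionSessionCreate(NULL,
                                               formatDescription,
                                               NULL,
                                               (__bridge CFDictionaryRef)imageBufferAttributes,
                                               &callbackRecord,
                                               &decodeSession);
if (status != noErr) {
    // Creating the decompression session failed; handle the error here.
}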

2.VTSessionSetProperty or VTSessionSetProperties

Optionally, configure the session with your desired Decompression Properties by calling VTSessionSetProperty(_:key:value:) or VTSessionSetProperties(_:propertyDictionary:).
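
The same support-check/set pattern used for the encoder can be reused here; a minimal direct sketch using the documented kVTDecompressionPropertyKey_RealTime key:

// Hint that decoded frames are needed in real time (e.g. live playback).
OSStatus status = VTSessionSetProperty(decodeSession, kVTDecompressionPropertyKey_RealTime, kCFBooleanTrue);
if (status != noErr) {
    // The property is unsupported or could not be set.
}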

3.VTDecompressionSessionDecodeFrame(_:sampleBuffer:flags:frameRefcon:infoFlagsOut:)

Decode video frames using VTDecompressionSessionDecodeFrame(_:sampleBuffer:flags:frameRefcon:infoFlagsOut:).
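
A sketch, assuming sampleBuffer wraps one encoded frame that matches the session's format description:

VTDecodeInfoFlags infoFlagsOut = 0;
OSStatus status = VTDecompressionSessionDecodeFrame(decodeSession,
                                                    sampleBuffer,
                                                    kVTDecodeFrame_EnableAsynchronousDecompression,
                                                    NULL, // sourceFrameRefCon, handed back to the output callback
                                                    &infoFlagsOut);
if (status != noErr) {
    // Decoding failed; on kVTInvalidSessionErr the session must be recreated.
}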

4.VTDecompressionSessionInvalidate

When you finish with the decompression session, call VTDecompressionSessionInvalidate(_:) to tear it down, and call CFRelease to free its memory.
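
A minimal teardown sketch, mirroring the encoder teardown above; waiting for asynchronous frames first is optional but lets in-flight frames finish:

if (decodeSession != NULL) {
    VTDecompressionSessionWaitForAsynchronousFrames(decodeSession);
    VTDecompressionSessionInvalidate(decodeSession);
    CFRelease(decodeSession);
    decodeSession = NULL;
}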
