Streaming iOS ARKit to WebRTC


Background

      Integrate iOS ARKit into a live-streaming SDK.

Feeding data into WebRTC

      When creating the PeerConnection, also create a VideoTrackSourceInterface object; it acts as the entry point through which external video data is fed into WebRTC.
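
      A minimal sketch of that wiring, assuming the Objective-C WebRTC SDK, where RTCVideoSource wraps the native VideoTrackSourceInterface; the track/stream ids are placeholders, and _peerConnection is assumed to be created from the same factory:

……
RTCPeerConnectionFactory *_factory = [[RTCPeerConnectionFactory alloc] init];
RTCVideoSource *_videoSource = [_factory videoSource]; // Entry point for externally produced video frames.
RTCVideoTrack *_videoTrack = [_factory videoTrackWithSource:_videoSource trackId:@"ARDAMSv0"]; // Wrap the source in a sendable track.
[_peerConnection addTrack:_videoTrack streamIds:@[ @"ARDAMS" ]]; // Attach the track to the connection that will push the stream.
……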

ARKit data sources

      ARKit yields two kinds of data: the raw frames captured directly from the camera, and the rendered AR output, i.e. the augmented-reality image we actually see. Either way, the data must go through the conversion CVPixelBuffer ==> RTCCVPixelBuffer ==> RTCVideoFrame before it can be handed to the VideoTrackSourceInterface and streamed out through WebRTC.

Raw camera data

      Implement the following method of the ARSessionDelegate protocol:

……
RTCVideoCapturer *_dummyCapturer = [[RTCVideoCapturer alloc] init];
……

- (void)session:(ARSession *)session didUpdateFrame:(ARFrame *)frame {
    CVPixelBufferRef pixelBuffer = frame.capturedImage; // Get the CVPixelBufferRef.
    RTCCVPixelBuffer *rtcPixelBuffer = [[RTCCVPixelBuffer alloc] initWithPixelBuffer:pixelBuffer]; // Wrap it as an RTCCVPixelBuffer.
    int64_t timeStampNs = frame.timestamp * 1000000000; // Convert the timestamp from seconds to nanoseconds.
    RTCVideoRotation rotation = RTCVideoRotation_0; // TBD: check the actual rotation.
    RTCVideoFrame *videoFrame = [[RTCVideoFrame alloc] initWithBuffer:rtcPixelBuffer rotation:rotation timeStampNs:timeStampNs]; // Build the RTCVideoFrame.
    [_videoSource capturer:_dummyCapturer didCaptureVideoFrame:videoFrame]; // Hand the frame to WebRTC; a dummy capturer is good enough here.
}
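
      The callback above only fires while the ARSession is running. A minimal sketch of starting it, assuming plain world tracking (the configuration choice is up to the app):

ARWorldTrackingConfiguration *configuration = [[ARWorldTrackingConfiguration alloc] init];
_arSession.delegate = self; // The object implementing session:didUpdateFrame:.
[_arSession runWithConfiguration:configuration]; // ARFrames start arriving via the delegate.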

Rendered AR data

      Create an SCNRenderer and call snapshotAtTime at a chosen frame rate to capture the scene. Because that method returns a UIImage, the UIImage must first be converted to a CVPixelBufferRef before the data can be sent to WebRTC as described above. Sample code:

#define screenWidth [UIScreen mainScreen].bounds.size.width
#define screenHeight [UIScreen mainScreen].bounds.size.height

……
ARSCNView *_arView;
ARSession *_arSession;
SCNRenderer *_scnRenderer;
……

- (ARSession *)arSession {
    if(_arSession == nil) {
        _arSession = [[ARSession alloc] init];
        _arSession.delegate = self;
    }

    return _arSession;
}

- (ARSCNView *)arView {
    if (_arView == nil) {
        _arView = [[ARSCNView alloc] initWithFrame:CGRectMake(
           0,
           0,
           screenWidth,
           screenHeight)];
        _arView.session = self.arSession;
        _arView.automaticallyUpdatesLighting = YES;
        _arView.delegate = self;
    }
    return _arView;
}

- (SCNRenderer*)scnRenderer {
    if (_scnRenderer == nil) {
        _scnRenderer = [SCNRenderer rendererWithDevice:nil options:nil];
        _scnRenderer.scene = self.arView.scene;
    }
    return _scnRenderer;
}

- (CVPixelBufferRef)capturePixelBuffer:(NSTimeInterval)timestamp {
    UIImage *image = [self.scnRenderer snapshotAtTime:timestamp withSize:CGSizeMake(_outputSize.width, _outputSize.height) antialiasingMode:SCNAntialiasingModeMultisampling4X];
    CVPixelBufferRef pixelBuffer = [self imageToPixelBuffer:image.CGImage]; // UIImage ==> CVPixelBuffer.
    return pixelBuffer;
}

// CGImageRef ==> CVPixelBufferRef, i.e. UIImage ==> CVPixelBuffer.
- (CVPixelBufferRef)imageToPixelBuffer:(CGImageRef)image {
    CGSize frameSize = CGSizeMake(_outputSize.width, _outputSize.height); // _outputSize = CGSizeMake(720, 1280); the higher the resolution, the more expensive the conversion.
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithBool:YES],kCVPixelBufferCGImageCompatibilityKey,[NSNumber numberWithBool:YES],kCVPixelBufferCGBitmapContextCompatibilityKey,nil];
    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, frameSize.width, frameSize.height,kCVPixelFormatType_32BGRA, (__bridge CFDictionaryRef)options, &pxbuffer);
    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);
    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata, frameSize.width, frameSize.height,8, CVPixelBufferGetBytesPerRow(pxbuffer),rgbColorSpace,(CGBitmapInfo)kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGContextDrawImage(context, CGRectMake(0, 0, _outputSize.width, _outputSize.height), image);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);
    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
    return pxbuffer;
}

- (void)captureAndSend {
    NSTimeInterval timestamp = [self getCurrentTimestamp];
    CVPixelBufferRef pixelBuffer = [self capturePixelBuffer:timestamp]; // Grab the rendered image at the current timestamp.
    RTCCVPixelBuffer *rtcPixelBuffer = [[RTCCVPixelBuffer alloc] initWithPixelBuffer:pixelBuffer]; // Wrap it as an RTCCVPixelBuffer.
    CVPixelBufferRelease(pixelBuffer); // imageToPixelBuffer returns a +1 buffer; release it once it has been wrapped.
    int64_t timeStampNs = timestamp * 1000000000; // Convert the timestamp from seconds to nanoseconds.
    RTCVideoRotation rotation = RTCVideoRotation_0; // TBD: check the actual rotation.
    RTCVideoFrame *videoFrame = [[RTCVideoFrame alloc] initWithBuffer:rtcPixelBuffer rotation:rotation timeStampNs:timeStampNs]; // Build the RTCVideoFrame.
    [_videoSource capturer:_dummyCapturer didCaptureVideoFrame:videoFrame]; // Hand the frame to WebRTC; the same dummy capturer works here.
}

Call captureAndSend periodically from a timer, for example as sketched below.
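
      A CADisplayLink works well as that timer; in this sketch the onDisplayLink: wrapper and the 30 fps cap are illustrative choices:

……
CADisplayLink *_displayLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(onDisplayLink:)];
_displayLink.preferredFramesPerSecond = 30; // Higher rates make snapshotAtTime and the pixel-buffer conversion more expensive.
[_displayLink addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
……

- (void)onDisplayLink:(CADisplayLink *)link {
    [self captureAndSend];
}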

Original AR demo

https://github.com/miliPolo/ARSolarPlay

Pushing this stream out over WebRTC turns out to be quite interesting; AR + WebRTC may be a direction worth exploring.
