2021-07-21

ReplayKit2 Handling (Introduction)

  • Characteristics
  1. ReplayKit2 extends ReplayKit with system-wide recording: it can capture the home screen, and recording continues even when the app goes to the background and another app is brought to the front.
  2. Inter-process communication is done over a socket.
  3. The process that handles the recorded screen data (the upload extension) may use at most 50 MB of memory; if it exceeds that, the system kills it.
  • Differences between system versions
    iOS 12 introduced a new API that lets an app bring up the screen-recording UI directly:
RPSystemBroadcastPickerView

Implementation
Declaration

@property (nonatomic, strong) RPSystemBroadcastPickerView *broadPickerView API_AVAILABLE(ios(12.0));

Initialization

- (RPSystemBroadcastPickerView *)broadPickerView API_AVAILABLE(ios(12.0)) {
  if (!_broadPickerView) {
    _broadPickerView = [[RPSystemBroadcastPickerView alloc] initWithFrame:CGRectMake(0, 0, 1, 1)];
    // whether to show the microphone button
    _broadPickerView.showsMicrophoneButton = NO;
    // preferredExtension decides which specific upload extension gets launched;
    // set it to the bundle ID of the upload extension you created
    _broadPickerView.preferredExtension = @"extensionbundleID";
  }
  return _broadPickerView;
}

Invoking the record button from inside the app

if (@available(iOS 12.0, *)) {
  // first check whether the screen is already being recorded
  if (![UIScreen mainScreen].isCaptured) {
    dispatch_async(dispatch_get_main_queue(), ^{
      [[UIApplication sharedApplication].keyWindow addSubview:self.broadPickerView];

      for (UIView *view in self.broadPickerView.subviews)
      {
        if ([view isKindOfClass:[UIButton class]])
        {
          // Pay special attention to UIControlEventAllTouchEvents here. Many
          // posts online use other event types, but those no longer work on
          // iOS 13; from iOS 13 on only UIControlEventAllTouchEvents triggers
          // the action, and it is supported from iOS 9 up, so just use it.
          [(UIButton *)view sendActionsForControlEvents:UIControlEventAllTouchEvents];
        }
      }
    });
  }
}
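
The picker only presents the system UI, so the app receives no direct callback when recording actually starts or stops. A minimal sketch for tracking this, using the standard UIScreenCapturedDidChangeNotification (iOS 11+):

// observe capture-state changes; fires whenever [UIScreen mainScreen].isCaptured flips
[[NSNotificationCenter defaultCenter] addObserverForName:UIScreenCapturedDidChangeNotification
                                                  object:nil
                                                   queue:[NSOperationQueue mainQueue]
                                              usingBlock:^(NSNotification *note) {
  BOOL recording = [UIScreen mainScreen].isCaptured;
  NSLog(@"screen recording %@", recording ? @"started" : @"stopped");
}];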

With that API covered, let's first look at how to create the broadcast upload extension:
Select the project's Targets —> Editor —> Add Target —> Broadcast Upload Extension —> Next
Click Next, then:

  1. Enter a name
  2. Choose the language
  3. Include UI Extension is optional; it won't be needed later

After you confirm, several new files appear in the project:
1. SampleHandler
2. BroadcastSetupViewController

SampleHandler is the class where we receive and process the recorded screen data. The extension behaves like a separate application: to print logs or hit breakpoints in it, you have to run the extension target itself.

// this is the method where we receive the captured data and process it
- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType {
    
    switch (sampleBufferType) {
        case RPSampleBufferTypeVideo:
            // Handle video sample buffer
            break;
        case RPSampleBufferTypeAudioApp:
            // Handle audio sample buffer for app audio
            break;
        case RPSampleBufferTypeAudioMic:
            // Handle audio sample buffer for mic audio
            break;
            
        default:
            break;
    }
}
The tricky part: if you have a separate component for uploading the video data, you can simply upload it from here. If you instead need to hand the data back to the host app for processing, you need inter-process communication. There are many IPC mechanisms; they won't be covered here (search online if interested). This post only covers IPC over a socket.

This also involves cross-process notifications, used to pass status flags back to the extension.
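
Darwin notifications (CFNotificationCenterGetDarwinNotifyCenter) are what actually cross the app/extension process boundary. A minimal sketch of the posting side in the host app, assuming the names match those the extension registers below ("customName" and "StopScreen"):

// host app: post a Darwin notification that the extension observes
static void PostDarwinNotification(NSString *name) {
  CFNotificationCenterRef center = CFNotificationCenterGetDarwinNotifyCenter();
  CFNotificationCenterPostNotification(center,
                                       (__bridge CFStringRef)name,
                                       NULL,   // object is ignored across processes
                                       NULL,   // userInfo cannot cross processes
                                       true);
}

// e.g. PostDarwinNotification(@"customName") when the host app enters the background,
// and PostDarwinNotification(@"StopScreen") when the user stops sharing from the app UI.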

For the socket we use GCDAsyncSocket.

For the video data, a third-party helper crops each frame and packs it into NSData, and the socket transmits that NSData.
The code below can essentially be treated as boilerplate: write it like this once, and any changes come down to tuning individual parameters.

//
//  SampleHandler.m
//  ClairEye
//
//  Created by AmpleSky on 2020/3/31.
//  Copyright © 2020 Facebook. All rights reserved.
//

#import "SampleHandler.h"
#import "NTESYUVConverter.h"
#import "NTESI420Frame.h"
#import "GCDAsyncSocket.h"
#import "NTESSocketPacket.h"
#import "NTESTPCircularBuffer.h"

@interface SampleHandler ()<GCDAsyncSocketDelegate>

@property (nonatomic, assign) CGFloat cropRate;
@property (nonatomic, assign) CGSize  targetSize;
@property (nonatomic, assign) NTESVideoPackOrientation orientation;

@property (nonatomic, copy) NSString *ip;
@property (nonatomic, copy) NSString *clientPort;
@property (nonatomic, copy) NSString *serverPort;
@property (nonatomic, strong) dispatch_queue_t videoQueue;
@property (nonatomic, assign) NSUInteger frameCount;
@property (nonatomic, assign) BOOL connected;
@property (nonatomic, strong) dispatch_source_t timer;

@property (nonatomic, strong) GCDAsyncSocket *socket;
@property (nonatomic, strong) dispatch_queue_t queue;
@property (nonatomic, assign) NTESTPCircularBuffer *recvBuffer;

@end

@implementation SampleHandler
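// _enterBack and _finish are process-global flags flipped by the Darwin
// notification callbacks below; processSampleBuffer: checks them to pause
// sending while the host app is in the background and to end the broadcast.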
BOOL _enterBack = false;
static void Callback(CFNotificationCenterRef center,
                     void *observer,
                     CFStringRef name,
                     const void *object,
                     CFDictionaryRef userInfo)
{
  _enterBack = true;
  // clear the flag again after 3 seconds
  dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(3 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
    _enterBack = false;
  });
}
BOOL _finish = false;
static void FinishCallback(CFNotificationCenterRef center,
                           void *observer,
                           CFStringRef name,
                           const void *object,
                           CFDictionaryRef userInfo)
{
  _finish = true;
  
}


- (void)dealloc {
  _connected = NO;
  // remove the Darwin notification observers registered in init
  CFNotificationCenterRemoveEveryObserver(CFNotificationCenterGetDarwinNotifyCenter(),
                                          (__bridge const void *)(self));

  if (_socket) {
    [_socket disconnect];
    _socket = nil;
    NTESTPCircularBufferCleanup(_recvBuffer);
    free(_recvBuffer); // the struct itself was malloc'ed in setupSocket
  }

  if (_timer) {
    _timer = nil;
  }
}

- (instancetype)init {
  if(self = [super init]) {
    
    _targetSize = CGSizeMake(540, 960);
    _cropRate = 15;
    _orientation = NTESVideoPackOrientationPortrait;
    
    _ip = @"127.0.0.1";
    _serverPort = @"8999";
    _clientPort = [NSString stringWithFormat:@"%d", arc4random()%9999];
    _videoQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
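    // The Darwin notify center is the only notification center that crosses
    // the app/extension process boundary; "customName" is posted by the host
    // app when it enters the background, "StopScreen" when the user stops
    // sharing from the app UI.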
    CFStringRef name = CFSTR("customName");
    CFNotificationCenterRef center = CFNotificationCenterGetDarwinNotifyCenter();
    CFNotificationCenterAddObserver(center,
                                    (__bridge const void *)(self),
                                    Callback,
                                    name,
                                    NULL,
                                    kCFNotificationDeliverImmediately);
    CFStringRef finishName = CFSTR("StopScreen");
    CFNotificationCenterAddObserver(center,
                                    (__bridge const void *)(self),
                                    FinishCallback,
                                    finishName,
                                    NULL,
                                    kCFNotificationDeliverImmediately);
    

  }
  return self;
}


- (void)broadcastStartedWithSetupInfo:(NSDictionary<NSString *,NSObject *> *)setupInfo {
  if (!self.connected) {
    [self.socket disconnect];
  }
  if (!self.socket.isConnected) {
    [self setupSocket];
  }
  
}

- (void)broadcastPaused {
//  if (self.connected) {
//    NSString * str =@"Paused";
//    NSData *data =[str dataUsingEncoding:NSUTF8StringEncoding];
//    [self.socket writeData:data withTimeout:5 tag:0];
//  }
}

- (void)broadcastResumed {
//  if (self.connected) {
//    NSString * str =@"Resumed";
//    NSData *data =[str dataUsingEncoding:NSUTF8StringEncoding];
//    [self.socket writeData:data withTimeout:5 tag:0];
//  }
}

- (void)broadcastFinished {
  if (self.connected) {
    NSString * str =@"Finish";
    NSData *data =[str dataUsingEncoding:NSUTF8StringEncoding];
    [self.socket writeData:data withTimeout:-1 tag:0];
  }
//  [self.socket disconnect];
}

- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType {
  switch (sampleBufferType) {
    case RPSampleBufferTypeVideo:
    {
      if (!self.connected)
      {
        return;
      }
      if (_finish) {
        NSError *error;
        // pass an error on iOS 12 (but not 13+) so the user sees why the
        // broadcast ended; the domain string is what the system displays
        if ([UIDevice currentDevice].systemVersion.doubleValue >= 12 && [UIDevice currentDevice].systemVersion.doubleValue < 13) {
          error = [NSError errorWithDomain:@"You stopped screen sharing" code:-1 userInfo:@{}];
        } else {
          error = nil;
        }
        [self finishBroadcastWithError:error];
      }
      
      
      if( CMSampleBufferDataIsReady(sampleBuffer)){
        [self sendVideoBufferToHostApp:sampleBuffer];
      }

    }
      break;
    case RPSampleBufferTypeAudioApp:
      // Handle audio sample buffer for app audio
      break;
    case RPSampleBufferTypeAudioMic:
      // Handle audio sample buffer for mic audio
      break;
      
    default:
      break;
  }
}
#pragma mark - Handle resolution switching etc.
- (void)onRecvData:(NSData *)data head:(NTESPacketHead *)head
{
  if (!data)
  {
    return;
  }
  
  switch (head->command_id)
  {
    case 1:
    {
      // the payload is not NUL-terminated, so don't use stringWithUTF8String:
      NSString *qualityStr = [[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding];
      int quality = [qualityStr intValue];
      switch (quality) {
        case 0:
          self.targetSize = CGSizeMake(480, 640);
          break;
        case 1:
          self.targetSize = CGSizeMake(144, 177);
          break;
        case 2:
          self.targetSize = CGSizeMake(288, 352);
          break;
        case 3:
          self.targetSize = CGSizeMake(320, 480);
          break;
        case 4:
          self.targetSize = CGSizeMake(375, 667);
          break;
        case 5:
          self.targetSize = CGSizeMake(414, 812);
          break;
        case 6:
          self.targetSize = CGSizeMake(414, 812);
          break;
        default:
          break;
      }
      NSLog(@"change target size %@", @(self.targetSize));
    }
      break;
    case 2:
      break;
    case 3:
    {
      // the payload is not NUL-terminated, so don't use stringWithUTF8String:
      NSString *orientationStr = [[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding];
      int orient = [orientationStr intValue];
      switch (orient) {
        case 0:
          self.orientation = NTESVideoPackOrientationPortrait;
          break;
        case 1:
          self.orientation = NTESVideoPackOrientationLandscapeLeft;
          break;
        case 2:
          self.orientation = NTESVideoPackOrientationPortraitUpsideDown;
          break;
        case 3:
          self.orientation = NTESVideoPackOrientationLandscapeRight;
          break;
        default:
          break;
      };
      NSLog(@"change orientation %@", @(self.orientation));
      
    }
      break;
    default:
      break;
  }
}



#pragma mark - Process
- (void)sendVideoBufferToHostApp:(CMSampleBufferRef)sampleBuffer {
  if (!self.socket)
  {
    return;
  }
  CFRetain(sampleBuffer);
  
  dispatch_async(self.videoQueue, ^{ // move work off the ReplayKit callback thread
    @autoreleasepool {
      if (self.frameCount > 1000)
      {
        CFRelease(sampleBuffer);
        return;
      }
      self.frameCount++;
      CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
      
      CFStringRef RPVideoSampleOrientationKeyRef = (__bridge CFStringRef)RPVideoSampleOrientationKey;
      NSNumber *orientation = (NSNumber *)CMGetAttachment(sampleBuffer, RPVideoSampleOrientationKeyRef,NULL);
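      // the attachment is a CGImagePropertyOrientation raw value:
      // 1 = up, 6 = rotated 90° clockwise, 8 = rotated 90° counterclockwise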
      
       switch ([orientation integerValue]) {
         case 1:
           self.orientation = NTESVideoPackOrientationPortrait;
            break;
         case 6:
           self.orientation = NTESVideoPackOrientationLandscapeRight;
           break;
           
         case 8:
           self.orientation = NTESVideoPackOrientationLandscapeLeft;
           break;
         default:
           break;
       }
      // To data
      NTESI420Frame *videoFrame = nil;
      videoFrame = [NTESYUVConverter pixelBufferToI420:pixelBuffer
                                              withCrop:self.cropRate
                                            targetSize:self.targetSize
                                        andOrientation:self.orientation];
      CFRelease(sampleBuffer);
      // To Host App
      if (videoFrame){
          NSData *raw = [videoFrame bytes];
          //NSData *data = [NTESSocketPacket packetWithBuffer:raw];
          NSData *headerData = [NTESSocketPacket packetWithBuffer:raw];
          if (!_enterBack) {
            if (self.connected) {
              [self.socket writeData:headerData withTimeout:-1 tag:0];
              [self.socket writeData:raw withTimeout:-1 tag:0];
            }
          }
      }
      self.frameCount--;
    };
  });
}

- (NSData *)packetWithBuffer:(NSData *)rawData
{
  NSMutableData *mutableData = [NSMutableData data];
  @autoreleasepool {
    if (rawData.length == 0)
    {
      return NULL;
    }
    
    size_t size = rawData.length;
    void *data = malloc(sizeof(NTESPacketHead));
    NTESPacketHead *head = (NTESPacketHead *)malloc(sizeof(NTESPacketHead));
    head->version = 1;
    head->command_id = 0;
    head->service_id = 0;
    head->serial_id = 0;
    head->data_len = (uint32_t)size;
    
    size_t headSize = sizeof(NTESPacketHead);
    memcpy(data, head, headSize);
    NSData *headData = [NSData dataWithBytes:data length:headSize];
    [mutableData appendData:headData];
    [mutableData appendData:rawData];
    
    free(data);
    free(head);
  }
  return [mutableData copy];
}

- (NSData *)packetWithBuffer:(const void *)buffer
                        size:(size_t)size
                  packetSize:(size_t *)packetSize
{
  if (0 == size)
  {
    return NULL;
  }
  
  void *data = malloc(sizeof(NTESPacketHead) + size);
  NTESPacketHead *head = (NTESPacketHead *)malloc(sizeof(NTESPacketHead));
  head->version = 1;
  head->command_id = 0;
  head->service_id = 0;
  head->serial_id = 0;
  head->data_len = (uint32_t)size;
  
  size_t headSize = sizeof(NTESPacketHead);
  *packetSize = size + headSize;
  memcpy(data, head, headSize);
  memcpy(data + headSize, buffer, size);
  
  
  NSData *result = [NSData dataWithBytes:data length:*packetSize];
  
  free(head);
  free(data);
  return result;
}

#pragma mark - Socket

- (void)setupSocket
{
  _recvBuffer = (NTESTPCircularBuffer *)malloc(sizeof(NTESTPCircularBuffer)); // must be freed later (see dealloc)
  NTESTPCircularBufferInit(_recvBuffer, kRecvBufferMaxSize);
  self.queue = dispatch_queue_create("com.netease.edu.rp.client", DISPATCH_QUEUE_SERIAL);
  self.socket = [[GCDAsyncSocket alloc] initWithDelegate:self delegateQueue:self.queue];
  //    self.socket.IPv6Enabled = NO;
  //    [self.socket connectToUrl:[NSURL fileURLWithPath:serverURL] withTimeout:5 error:nil];
  NSError *error;
  [self.socket connectToHost:_ip onPort:8999 error:&error];
  [self.socket readDataWithTimeout:-1 tag:0];
  NSLog(@"setupSocket:%@",error);
  if (error == nil)
  {
    NSLog(@"====开启成功");
  }
  else
  {
    NSLog(@"=====开启失败");
  }
}

- (void)socket:(GCDAsyncSocket *)sock didConnectToUrl:(NSURL *)url
{
  [self.socket readDataWithTimeout:-1 tag:0];
}

- (void)socket:(GCDAsyncSocket *)sock didConnectToHost:(NSString *)host port:(uint16_t)port
{
  if (self.connected) {
    NSString * str =@"Start";
    NSData *data =[str dataUsingEncoding:NSUTF8StringEncoding];
    [self.socket writeData:data withTimeout:15 tag:0];
  }
  [self.socket readDataWithTimeout:-1 tag:0];
  self.connected = YES;
}

- (void)socket:(GCDAsyncSocket *)sock didWriteDataWithTag:(long)tag
{
  
}

- (void)socket:(GCDAsyncSocket *)sock didReadData:(NSData *)data withTag:(long)tag
{
  NTESTPCircularBufferProduceBytes(self.recvBuffer, data.bytes, (int32_t)data.length);
  [self handleRecvBuffer];
  [sock readDataWithTimeout:-1 tag:0];
}

- (void)socketDidDisconnect:(GCDAsyncSocket *)sock withError:(NSError *)err
{
  self.connected = NO;
  [self.socket disconnect];
  self.socket = nil;
  [self setupSocket];
  [self.socket readDataWithTimeout:-1 tag:0];
}

- (void)handleRecvBuffer {
  if (!self.socket)
  {
    return;
  }
  
  int32_t availableBytes = 0;
  void * buffer = NTESTPCircularBufferTail(self.recvBuffer, &availableBytes);
  int32_t headSize = sizeof(NTESPacketHead);
  
  if (availableBytes <= headSize)
  {
    return;
  }
  
  NTESPacketHead head;
  memset(&head, 0, sizeof(head));
  memcpy(&head, buffer, headSize);
  uint64_t dataLen = head.data_len;
  
  if (dataLen > availableBytes - headSize && dataLen > 0) {
    return;
  }
  
  void *data = malloc(dataLen);
  memset(data, 0, dataLen);
  memcpy(data, buffer + headSize, dataLen);
  NTESTPCircularBufferConsume(self.recvBuffer, (int32_t)(headSize+dataLen));
  
  
  if([self respondsToSelector:@selector(onRecvData:head:)]) {
    @autoreleasepool {
      [self onRecvData:[NSData dataWithBytes:data length:dataLen] head:&head];
    };
  }
  
  free(data);
  
  if (availableBytes - headSize - dataLen >= headSize)
  {
    [self handleRecvBuffer];
  }
}
@end

In the host-app class that receives and handles the data, write the following:

- (void)setupSocket
{
  self.sockets = [NSMutableArray array];
  self.recvBuffer = (NTESTPCircularBuffer *)malloc(sizeof(NTESTPCircularBuffer)); // must be freed later
  NTESTPCircularBufferInit(self.recvBuffer, kRecvBufferMaxSize);
  //    self.queue = dispatch_queue_create("com.netease.edu.rp.server", DISPATCH_QUEUE_SERIAL);
  self.queue = dispatch_get_main_queue();
  self.socket = [[GCDAsyncSocket alloc] initWithDelegate:self delegateQueue:self.queue];
  self.socket.IPv6Enabled = NO;
  NSError *error;
  //    [self.socket acceptOnUrl:[NSURL fileURLWithPath:serverURL] error:&error];
  [self.socket acceptOnPort:8999 error:&error];
  [self.socket readDataWithTimeout:-1 tag:0];
  if (error == nil)
  {
    NSLog(@"listen succeeded");
    [[NSRunLoop mainRunLoop] run]; // keep the server from exiting
    [self setTimer];
  }
  else
  {
    NSLog(@"listen failed");
    [self setupSocket];
  }
}

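The accept side also needs the corresponding GCDAsyncSocket delegate methods. A minimal sketch (the read path mirrors handleRecvBuffer shown earlier in the extension):

- (void)socket:(GCDAsyncSocket *)sock didAcceptNewSocket:(GCDAsyncSocket *)newSocket
{
  // keep a strong reference, otherwise GCDAsyncSocket closes the new connection
  [self.sockets addObject:newSocket];
  [newSocket readDataWithTimeout:-1 tag:0];
}

- (void)socket:(GCDAsyncSocket *)sock didReadData:(NSData *)data withTag:(long)tag
{
  // feed the bytes into the circular buffer and parse NTESPacketHead-framed
  // packets out of it, mirroring handleRecvBuffer in the extension
  NTESTPCircularBufferProduceBytes(self.recvBuffer, data.bytes, (int32_t)data.length);
  [sock readDataWithTimeout:-1 tag:0];
}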

- (void)defaultsChanged:(NSNotification *)notification
{
  GCDAsyncSocket *socket = self.sockets.count ? self.sockets[0] : nil;
  
  NSUserDefaults *defaults = (NSUserDefaults*)[notification object];
  id setting = nil;
  // resolution
  static NSInteger quality;
  setting = [defaults objectForKey:@"videochat_preferred_video_quality"];
  if (quality != [setting integerValue] && setting)
  {
    quality = [setting integerValue];
    NTESPacketHead head;
    head.service_id = 0;
    head.command_id = 1; // 1: resolution 2: crop ratio 3: video orientation
    head.data_len = 0;
    head.version = 0;
    head.serial_id = 0; // initialize every field; the struct lives on the stack
    NSString *str = [NSString stringWithFormat:@"%d", [setting intValue]];
    [socket writeData:[NTESSocketPacket packetWithBuffer:[str dataUsingEncoding:NSUTF8StringEncoding] head:&head] withTimeout:-1 tag:0];
  }
  
  // video orientation
  static NSInteger orientation;
  setting = [defaults objectForKey:@"videochat_preferred_video_orientation"];
  if (orientation != [setting integerValue] && setting)
  {
    orientation = [setting integerValue];
    self.rotation = orientation;
    NTESPacketHead head;
    head.service_id = 0;
    head.command_id = 3; // 1: resolution 2: crop ratio 3: video orientation
    head.data_len = 0;
    head.version = 0;
    head.serial_id = 0;
    NSString *str = [NSString stringWithFormat:@"%@", setting];
    [socket writeData:[NTESSocketPacket packetWithBuffer:[str dataUsingEncoding:NSUTF8StringEncoding] head:&head] withTimeout:-1 tag:0];
    
  }
}
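
Note that defaultsChanged: only fires if it was registered as an observer somewhere (stopSocket below removes it); a plausible registration when the server starts would be:

[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(defaultsChanged:)
                                             name:NSUserDefaultsDidChangeNotification
                                           object:nil];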

- (void)stopSocket
{
  if (self.socket)
  {
    [self.socket disconnect];
    self.socket = nil;
    [self.sockets removeAllObjects];
    NTESTPCircularBufferCleanup(self.recvBuffer);
  }
  [[NSNotificationCenter defaultCenter] removeObserver:self name:NSUserDefaultsDidChangeNotification object:nil];
  
}