A Custom Camera Wrapper, and Fixing Image Rotation and Mirroring (Solved)

一. Wrapping a Custom Camera (AVFoundation)

For how to build a camera preview with AVFoundation, see: iOS Development (Implementing a Custom Camera) — Basic Use of AVFoundation.
  The problem: with the camera code written previously, if several screens each need a camera, the code duplication becomes severe. That is a real drawback, so here the camera is wrapped into a reusable class to keep the code concise.

1. Create a videoCaptureManager class that inherits from NSObject.

2. In the .h file, declare a custom protocol and a delegate property.


#import <Foundation/Foundation.h>
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>


@protocol CaptureDataOutputProtocol;

@interface videoCaptureManager : NSObject
@property (nonatomic, readwrite, weak) id<CaptureDataOutputProtocol> delegate;

@property (nonatomic, readwrite, assign) BOOL runningStatus;

/**
 * Selects the front or back camera.
 * AVCaptureDevicePositionFront — front camera (default)
 * AVCaptureDevicePositionBack  — back camera
 */
@property (nonatomic, readwrite, assign) AVCaptureDevicePosition position;

- (void)startSession;

- (void)stopSession;

- (void)resetSession;

- (AVCaptureSession *)returnSession;

@end

@protocol CaptureDataOutputProtocol <NSObject>

/**
 * Called with the UIImage for each captured frame.
 */
- (void)captureOutputImage:(UIImage *)image;

- (void)captureOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer;

- (void)captureError;

@end

3. Implement the functionality in the .m file.

#import "videoCaptureManager.h"

@interface videoCaptureManager() <AVCaptureVideoDataOutputSampleBufferDelegate>{
    dispatch_queue_t _queue;
}
@property (nonatomic, readwrite, retain) AVCaptureSession *session;
@property (nonatomic, readwrite, retain) AVCaptureDevice *captureDevice;
@property (nonatomic, readwrite, retain) AVCaptureDeviceInput *input;
@property (nonatomic, readwrite, retain) AVCaptureVideoDataOutput *output;
@property (nonatomic, readwrite, assign) BOOL isSessionBegin;
@end

@implementation videoCaptureManager

- (void)setPosition:(AVCaptureDevicePosition)position {
    if (_position != position) {
        _position = position;
        if (self.isSessionBegin) {
            [self resetSession];
        }
    }
}

- (instancetype)init {
    if (self = [super init]) {
        _session = [[AVCaptureSession alloc] init];
        _session.sessionPreset = AVCaptureSessionPreset640x480;
        _queue = dispatch_queue_create("myQueue", NULL);
        _isSessionBegin = NO;
        _position = AVCaptureDevicePositionFront;
    }
    return self;
}

- (AVCaptureDevice *)cameraWithPosition:(AVCaptureDevicePosition) position {
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices) {
        if ([device position] == position) {
            return device;
        }
    }
    return nil;
}
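
A side note: `devicesWithMediaType:` was deprecated in iOS 10. On newer deployment targets, the same device lookup can be written with `AVCaptureDeviceDiscoverySession` — a minimal sketch, assuming an iOS 10+ deployment target:

```objc
// iOS 10+ replacement for the deprecated devicesWithMediaType: lookup.
- (AVCaptureDevice *)cameraWithPosition:(AVCaptureDevicePosition)position {
    AVCaptureDeviceDiscoverySession *discovery =
        [AVCaptureDeviceDiscoverySession
            discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera]
                                  mediaType:AVMediaTypeVideo
                                   position:position];
    // devices is ordered by the requested criteria; nil-safe if no camera matches.
    return discovery.devices.firstObject;
}
```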

- (void)startSession {
    if ([self.session isRunning]) {
        return;
    }
    if (!self.isSessionBegin) {
        self.isSessionBegin = YES;
        // Configure the capture device
        _captureDevice = [self cameraWithPosition:_position];
        // Set up the input
        NSError *error = nil;
        _input = [[AVCaptureDeviceInput alloc] initWithDevice:_captureDevice error:&error];
        if (error == nil) {
            [_session addInput:_input];
        } else {
            if ([self.delegate respondsToSelector:@selector(captureError)]) {
                [self.delegate captureError];
            }
        }
        // Set up the output
        _output = [[AVCaptureVideoDataOutput alloc] init];
        _output.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey];
        [_output setSampleBufferDelegate:self queue:_queue];
        [_session addOutput:_output];
        AVCaptureConnection *conn = [_output connectionWithMediaType:AVMediaTypeVideo];
        conn.videoOrientation = AVCaptureVideoOrientationPortrait;
        // Mirror the video so front-camera frames match what the user sees
        [conn setVideoMirrored:YES];
        [self.session startRunning];
    }
}
- (void)stopSession {
    if (![self.session isRunning]) {
        return;
    }
    if(self.isSessionBegin){
        self.isSessionBegin = NO;
        [self.session stopRunning];
        if (nil != self.input) {
            [self.session removeInput:self.input];
        }
        if (nil != self.output) {
            [self.session removeOutput:self.output];
        }
    }
}
- (void)resetSession {
    [self stopSession];
    [self startSession];
}
- (AVCaptureSession *)returnSession {
    return _session;
}
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    if (!_runningStatus) {
        return;
    }
    UIImage *sampleImage = [self imageFromSamplePlanerPixelBuffer:sampleBuffer];
    // Provide the frame as a UIImage (callers typically run detection on it)
    if ([self.delegate respondsToSelector:@selector(captureOutputImage:)] && sampleImage != nil) {
        [self.delegate captureOutputImage:sampleImage];
    }
    // Provide the raw sample buffer
    if ([self.delegate respondsToSelector:@selector(captureOutputSampleBuffer:)] && sampleImage != nil) {
        [self.delegate captureOutputSampleBuffer:sampleBuffer];
    }
}
- (UIImage *) imageFromSamplePlanerPixelBuffer:(CMSampleBufferRef)sampleBuffer{
    @autoreleasepool {
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(imageBuffer, 0);
        void *baseAddress = CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0);
        size_t bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(imageBuffer,0);
        size_t width = CVPixelBufferGetWidth(imageBuffer);
        size_t height = CVPixelBufferGetHeight(imageBuffer);
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGImageAlphaNoneSkipFirst | kCGBitmapByteOrder32Little);
        CGImageRef quartzImage = CGBitmapContextCreateImage(context);
        CVPixelBufferUnlockBaseAddress(imageBuffer,0);
        CGContextRelease(context);
        CGColorSpaceRelease(colorSpace);
        UIImage *image = [UIImage imageWithCGImage:quartzImage];
        CGImageRelease(quartzImage);
        return (image);
    }
}
@end

With that, the camera wrapper is complete.

4. Notes on the main interfaces

The wrapper exposes two callbacks: one delivers a UIImage, the other the raw sample buffer from the capture pipeline. Use whichever fits your needs.

	// Interfaces
- (void)captureOutputImage:(UIImage *)image;

- (void)captureOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer;
	// Internal implementation (see the .m file)
    // Provide the image
    if ([self.delegate respondsToSelector:@selector(captureOutputImage:)] && sampleImage != nil) {
        [self.delegate captureOutputImage:sampleImage];
    }
    // Provide the raw sample buffer
    if ([self.delegate respondsToSelector:@selector(captureOutputSampleBuffer:)] && sampleImage != nil) {
        [self.delegate captureOutputSampleBuffer:sampleBuffer];
    }

5. Initializing the wrapped camera

 
The preview is still driven by an AVCaptureVideoPreviewLayer, which is why the wrapper exposes a method that returns its session.
  The videoLayer is then initialized with that session.
  That completes the camera setup; the code is as follows:

#import "ViewController.h"
#import "videoCaptureManager.h"
@interface ViewController ()<CaptureDataOutputProtocol>
@property (nonatomic, readwrite, retain) videoCaptureManager *videoCapture;
@property (strong, nonatomic) AVCaptureVideoPreviewLayer *videoLayer;
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    [self startCamera];
}
- (void)viewWillDisappear:(BOOL)animated {
    [super viewWillDisappear:animated];
    [self.videoCapture stopSession];
}
#pragma mark - Camera setup
- (void)startCamera {
    self.videoCapture = [[videoCaptureManager alloc] init];
    self.videoCapture.delegate = self;
    self.videoCapture.runningStatus = YES;
    [self.videoCapture startSession];
    self.videoLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:[self.videoCapture returnSession]];
    self.view.layer.masksToBounds = YES;
    self.videoLayer.frame = self.view.bounds;
    //    self.videoLayer.videoGravity=AVLayerVideoGravityResizeAspectFill;
    self.videoLayer.videoGravity = AVLayerVideoGravityResizeAspect;
    [self.view.layer addSublayer:self.videoLayer];
}
@end
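
One thing the sample above glosses over: camera access requires the user's permission, and the app's Info.plist must contain an `NSCameraUsageDescription` entry. A hedged sketch of requesting access before calling `startCamera`:

```objc
// Ask for camera permission before starting the session.
// Without NSCameraUsageDescription in Info.plist, the app will crash on access.
[AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo
                         completionHandler:^(BOOL granted) {
    // The completion handler may run on an arbitrary queue; hop to main for UI work.
    dispatch_async(dispatch_get_main_queue(), ^{
        if (granted) {
            [self startCamera];
        } else {
            // Point the user at Settings, or show a placeholder view.
        }
    });
}];
```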

6. Using the callbacks

The two delegate methods below let you process the image or the raw sample buffer, and refresh the UI (for example, drawing face-detection rectangles).

- (void)captureOutputImage:(UIImage *)image {
    // Process the image here
    __weak typeof(self) weakSelf = self;
    dispatch_async(dispatch_get_main_queue(), ^{
        // Refresh the UI, e.g. draw face-detection rectangles
        // use weakSelf ...
    });
}
- (void)captureOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    // Process the sample buffer here; convert it to whatever data you need
    __weak typeof(self) weakSelf = self;
    dispatch_async(dispatch_get_main_queue(), ^{
        // use weakSelf ...
        // Refresh the UI, e.g. draw face-detection rectangles
    });
}

7. Demo link

Here is the Demo — enjoy! ━(`∀´)ノ亻!

二. Image Mirroring and Rotation

 This problem does not occur in the Demo above, because it has already been fixed there.
  The problem: when saving an image captured by the camera, or previewing the image received in the callback with a UIImageView, the image comes out rotated 90° counterclockwise and mirrored.
  After a lot of digging, one approach surfaced:

// Change the image's orientation flag
image = [UIImage imageWithCGImage:image.CGImage scale:image.scale orientation:UIImageOrientationDown];

For reference, the possible orientation values are:

UIImageOrientationUp,            // default orientation
UIImageOrientationDown,          // default rotated 180°
UIImageOrientationLeft,          // default rotated 90° counterclockwise
UIImageOrientationRight,         // default rotated 90° clockwise
UIImageOrientationUpMirrored,    // horizontal mirror of the default
                                 // (reflected across a vertical axis at the image's edge)
UIImageOrientationDownMirrored,  // mirrored, then rotated 180°
UIImageOrientationLeftMirrored,  // mirrored, then rotated 90° counterclockwise
UIImageOrientationRightMirrored, // mirrored, then rotated 90° clockwise

 
  This does fix the rotation as displayed, so everything should be fine now, right? No. The method only rotates the image in a limited sense: it sets the orientation metadata, while the underlying pixel data is left untouched. When the "rotated" image was submitted for detection, the results came out distorted (why exactly was unclear at the time). In other words, this approach is fine for display, but any processing that reads the raw pixels still sees the original rotated data, which is a hidden trap.
  This problem lingered for a while. The eventual fix was to configure the output inside the camera wrapper so that the frames come out upright in the first place:

AVCaptureConnection *conn = [_output connectionWithMediaType:AVMediaTypeVideo];
// Rotate to portrait
conn.videoOrientation = AVCaptureVideoOrientationPortrait;
// Mirror the video so front-camera frames match what the user sees
[conn setVideoMirrored:YES];
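
If you do need to fix up an already-captured UIImage (rather than configuring the connection), redrawing it bakes the orientation into the pixel data, so downstream processing sees upright pixels. A minimal sketch:

```objc
// Redraw the image so its orientation flag is applied to the pixel data.
// Unlike imageWithCGImage:scale:orientation:, this produces new, upright pixels.
- (UIImage *)normalizedImage:(UIImage *)image {
    if (image.imageOrientation == UIImageOrientationUp) {
        return image; // already upright, nothing to do
    }
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    [image drawInRect:(CGRect){CGPointZero, image.size}];
    UIImage *normalized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return normalized;
}
```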