AVFoundation: Cropping a Specified Region from the Camera

Recently I've been working on something similar to a QR code scanner, which likewise needs to grab the image inside a particular region of the preview. Let's jump straight to the most important pieces of code.

 

The listing below is the AV initialization part; once it runs, the camera image is displayed on the view. Here is a quick walkthrough of the problems I hit along the way and how I solved them.

 

1. Cut a "hole" in a layer; the hole is the actual crop region. I use a CAShapeLayer here and rely on its even-odd fill rule, so it can act as a mask on the coverLayer that sits on top of the previewLayer.
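As a rough, condensed sketch of that idea (boiled down from the full listing further on; assume it runs inside the view once cropRect is known), the even-odd fill rule leaves the overlap of the two rectangles unfilled, so the masked cover layer gets a transparent hole over the crop region:

<pre name="code" class="objc">
// Dim cover over the whole preview, with a transparent "hole" at cropRect.
CALayer *coverLayer = [CALayer layer];
coverLayer.frame = self.bounds;
coverLayer.backgroundColor = [[[UIColor blackColor] colorWithAlphaComponent:0.6] CGColor];
[self.layer addSublayer:coverLayer];

// Two rectangles + even-odd fill rule: the inner rect (the crop region) stays unfilled.
CAShapeLayer *cropLayer = [CAShapeLayer layer];
CGMutablePathRef path = CGPathCreateMutable();
CGPathAddRect(path, NULL, self.cropRect);   // the hole
CGPathAddRect(path, NULL, self.bounds);     // the full view
cropLayer.path = path;
cropLayer.fillRule = kCAFillRuleEvenOdd;
cropLayer.fillColor = [[UIColor whiteColor] CGColor];
CGPathRelease(path);

// Only the filled part of the mask keeps the dim cover; the hole shows the live preview.
coverLayer.mask = cropLayer;
</pre>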

 

2. Getting the full frame as an image is easy: it comes from the sample buffer passed to the delegate. I use AVCaptureVideoDataOutput here, so sampled frames keep streaming in continuously.
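A trimmed-down sketch of that wiring (the queue label and the 32BGRA pixel format are simply the choices used in the full listing below):

<pre name="code" class="objc">
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
output.alwaysDiscardsLateVideoFrames = YES;   // drop frames we are too slow to process
output.videoSettings = @{ (__bridge NSString *)kCVPixelBufferPixelFormatTypeKey :
                              @(kCVPixelFormatType_32BGRA) };   // BGRA maps cleanly onto a CGBitmapContext

dispatch_queue_t queue = dispatch_queue_create("com.scan.video.sample_queue", DISPATCH_QUEUE_SERIAL);
[output setSampleBufferDelegate:self queue:queue];
// Each frame then arrives on this serial queue via
// -captureOutput:didOutputSampleBuffer:fromConnection:, where the UIImage is built and cropped.
</pre>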

 

3. Extract the crop-region image from the full frame. This is where I spent the most time and still kept getting the cropped image wrong. I first tried CGImageCreateWithImageInRect, but the resulting image had the wrong position and size; then I switched to drawing through a CGContext, which still wasn't right. After plenty of googling and head-scratching, the cause became clear: the preview layer's video gravity scales the frame (it was a fill-style presentation at first), so the actual frame size is not the same as the screen size. Once I was sure of that, the fix was to compute, for each videoGravity mode, where the crop region actually sits inside the frame and how large it is. That is what the calcRect method does: it maps the "hole" cut out on screen to the corresponding rectangle in the captured image.
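To make the mapping concrete, here is a worked example with made-up numbers (they are not from the post): suppose the preview view is 320x568 points with AVLayerVideoGravityResizeAspect, and the rotated frame is 360x480 pixels. The frame ratio (480/360 ≈ 1.33) is smaller than the screen ratio (568/320 ≈ 1.78), so the frame is fitted to the full width: it is presented at about 320x427 points with a 71-point letterbox bar above and below, and the scale factor is 320/360 ≈ 0.89. A crop rect of (60, 200, 200, 200) in view coordinates therefore maps to roughly ((60 - 0)/0.89, (200 - 71)/0.89, 200/0.89, 200/0.89) ≈ (67, 145, 225, 225) in image coordinates, which is exactly the conversion calcRect performs.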

 

 

 

It finally works. Take a look if you're interested.

 

 

<pre name= "code" class = "objc" > // 
//  ScanView.m 
//  xxoo 
// 
//  Created by Tommy on 13-11-6. 
//  Copyright (c) 2013年 Tommy. All rights reserved. 
// 
   
# import "ScanView.h" 
# import <AVFoundation/AVFoundation.h> 
   
   
static inline double radians ( double degrees) { return degrees * M_PI/ 180 ;} 
   
@interface ScanView()<AVCaptureVideoDataOutputSampleBufferDelegate> 
   
@property AVCaptureVideoPreviewLayer* previewLayer; 
@property AVCaptureSession* session; 
@property AVCaptureDevice* videoDevice; 
@property dispatch_queue_t camera_sample_queue; 
@property CALayer* coverLayer; 
@property CAShapeLayer* cropLayer; 
@property CALayer* stillImageLayer; 
@property  AVCaptureStillImageOutput* stillImageOutput; 
   
@property UIImageView* stillImageView; 
@property UIImage* cropImage; 
   
@property BOOL hasSetFocus; 
   
   
   
@end 
   
@implementation ScanView 
   
- (id)initWithFrame:(CGRect)frame
{
    self = [super initWithFrame:frame];
    if (self) {
        // Initialization code
        self.hasSetFocus = NO;
        [self initAVCaptuer];
        [self initOtherLayers];
    }
    return self;
}
/*
// Only override drawRect: if you perform custom drawing.
// An empty implementation adversely affects performance during animation.
- (void)drawRect:(CGRect)rect
{
     // Drawing code
}
*/ 
- (void)layoutSubviews
{
    [super layoutSubviews];
    [self.previewLayer setFrame:self.bounds];
    [self.coverLayer setFrame:self.bounds];
    self.coverLayer.mask = self.cropLayer;
}
   
- (void)initAVCaptuer{

    self.cropRect = CGRectZero;

    self.videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput* input = [[AVCaptureDeviceInput alloc] initWithDevice:self.videoDevice error:nil];

    AVCaptureVideoDataOutput* output = [[AVCaptureVideoDataOutput alloc] init];
    output.alwaysDiscardsLateVideoFrames = YES;
    self.camera_sample_queue = dispatch_queue_create("com.scan.video.sample_queue", DISPATCH_QUEUE_SERIAL);
    [output setSampleBufferDelegate:self queue:self.camera_sample_queue];

    NSString* key = (__bridge NSString*)kCVPixelBufferPixelFormatTypeKey;
    NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    NSDictionary* videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
    [output setVideoSettings:videoSettings];


    self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary* outputSettings = @{AVVideoCodecKey:AVVideoCodecJPEG};
    [self.stillImageOutput setOutputSettings:outputSettings];

    self.session = [[AVCaptureSession alloc] init];
    self.session.sessionPreset = AVCaptureSessionPresetMedium;

    if ([self.session canAddInput:input])
    {
        [self.session addInput:input];

        if ([self.session canAddOutput:output])
        {
            [self.session addOutput:self.stillImageOutput];
            [self.session addOutput:output];

            self.previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.session];
            self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspect;

            [self.layer addSublayer:self.previewLayer];

            return; // success
        }
    }

    self.session = nil;
}

- (void)setCropRect:(CGRect)cropRect
{
    _cropRect = cropRect;
    if (!CGRectEqualToRect(CGRectZero, self.cropRect)){

        self.cropLayer = [[CAShapeLayer alloc] init];
        CGMutablePathRef path = CGPathCreateMutable();

        CGPathAddRect(path, nil, self.cropRect);
        CGPathAddRect(path, nil, self.bounds);

        [self.cropLayer setFillRule:kCAFillRuleEvenOdd];
        [self.cropLayer setPath:path];
        [self.cropLayer setFillColor:[[UIColor whiteColor] CGColor]];
        CGPathRelease(path); // the layer retains the path; release our reference

        [self.cropLayer setNeedsDisplay];

        //[self setVideoFocus];

    }

    [self.stillImageLayer setFrame:CGRectMake(100, 450, CGRectGetWidth(cropRect), CGRectGetHeight(cropRect))];
}

- (void)setVideoFocus{

    NSError *error;
    CGPoint foucsPoint = CGPointMake(CGRectGetMidX(self.cropRect), CGRectGetMidY(self.cropRect));
    if ([self.videoDevice isFocusPointOfInterestSupported]
        && [self.videoDevice lockForConfiguration:&error] && !self.hasSetFocus){
        self.hasSetFocus = YES;
        [self.videoDevice setFocusPointOfInterest:[self convertToPointOfInterestFromViewCoordinates:foucsPoint]];
        [self.videoDevice setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
        [self.videoDevice unlockForConfiguration];
    }
//    [self.videoDevice setFocusMode:AVCaptureFocusModeAutoFocus];
    NSLog(@"error:%@", error);

}
   
- (CGPoint)convertToPointOfInterestFromViewCoordinates:(CGPoint)viewCoordinates
{
    CGPoint pointOfInterest = CGPointMake(.5f, .5f);
    CGSize frameSize = self.frame.size;

    AVCaptureVideoPreviewLayer *videoPreviewLayer = self.previewLayer;

    if ([self.previewLayer isMirrored]) {
        viewCoordinates.x = frameSize.width - viewCoordinates.x;
    }

    if ([[videoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResize]) {
        pointOfInterest = CGPointMake(viewCoordinates.y / frameSize.height, 1.f - (viewCoordinates.x / frameSize.width));
    } else {
        CGRect cleanAperture;
        for (AVCaptureInputPort *port in [[[[self session] inputs] lastObject] ports]) {
            if ([port mediaType] == AVMediaTypeVideo) {
                cleanAperture = CMVideoFormatDescriptionGetCleanAperture([port formatDescription], YES);
                CGSize apertureSize = cleanAperture.size;
                CGPoint point = viewCoordinates;

                CGFloat apertureRatio = apertureSize.height / apertureSize.width;
                CGFloat viewRatio = frameSize.width / frameSize.height;
                CGFloat xc = .5f;
                CGFloat yc = .5f;

                if ([[videoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResizeAspect]) {
                    if (viewRatio > apertureRatio) {
                        CGFloat y2 = frameSize.height;
                        CGFloat x2 = frameSize.height * apertureRatio;
                        CGFloat x1 = frameSize.width;
                        CGFloat blackBar = (x1 - x2) / 2;
                        if (point.x >= blackBar && point.x <= blackBar + x2) {
                            xc = point.y / y2;
                            yc = 1.f - ((point.x - blackBar) / x2);
                        }
                    } else {
                        CGFloat y2 = frameSize.width / apertureRatio;
                        CGFloat y1 = frameSize.height;
                        CGFloat x2 = frameSize.width;
                        CGFloat blackBar = (y1 - y2) / 2;
                        if (point.y >= blackBar && point.y <= blackBar + y2) {
                            xc = ((point.y - blackBar) / y2);
                            yc = 1.f - (point.x / x2);
                        }
                    }
                } else if ([[videoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResizeAspectFill]) {
                    if (viewRatio > apertureRatio) {
                        CGFloat y2 = apertureSize.width * (frameSize.width / apertureSize.height);
                        xc = (point.y + ((y2 - frameSize.height) / 2.f)) / y2;
                        yc = (frameSize.width - point.x) / frameSize.width;
                    } else {
                        CGFloat x2 = apertureSize.height * (frameSize.height / apertureSize.width);
                        yc = 1.f - ((point.x + ((x2 - frameSize.width) / 2)) / x2);
                        xc = point.y / frameSize.height;
                    }
                }

                pointOfInterest = CGPointMake(xc, yc);
                break;
            }
        }
    }

    return pointOfInterest;
}

- (void)initOtherLayers{
    self.coverLayer = [CALayer layer];

    self.coverLayer.backgroundColor = [[[UIColor blackColor] colorWithAlphaComponent:0.6] CGColor];
    [self.layer addSublayer:self.coverLayer];

    if (!CGRectEqualToRect(CGRectZero, self.cropRect)){

        self.cropLayer = [[CAShapeLayer alloc] init];
        CGMutablePathRef path = CGPathCreateMutable();

        CGPathAddRect(path, nil, self.cropRect);
        CGPathAddRect(path, nil, self.bounds);

        [self.cropLayer setFillRule:kCAFillRuleEvenOdd];
        [self.cropLayer setPath:path];
        [self.cropLayer setFillColor:[[UIColor redColor] CGColor]];
        CGPathRelease(path);
    }

    self.stillImageLayer = [CALayer layer];
    self.stillImageLayer.backgroundColor = [[UIColor yellowColor] CGColor];
    self.stillImageLayer.contentsGravity = kCAGravityResizeAspect;
    [self.coverLayer addSublayer:self.stillImageLayer];


    self.stillImageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 300, 100, 100)];
    self.stillImageView.backgroundColor = [UIColor redColor];
    self.stillImageView.contentMode = UIViewContentModeScaleAspectFit;
    [self addSubview:self.stillImageView];


    self.previewLayer.contentsGravity = kCAGravityResizeAspect;

}

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection{


    [self setVideoFocus];

    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
    self.cropImage = [self cropImageInRect:image];

    dispatch_async(dispatch_get_main_queue(), ^{

        [self.stillImageView setImage:image];
        // [self.stillImageLayer setContents:(id)[self.cropImage CGImage]];
    });
}

// Create a UIImage object from the sample buffer data
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    // Get the Core Video image buffer for the media data in the sample buffer
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

    // Get the number of bytes per row of the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    // Get the width and height of the pixel buffer
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    //NSLog(@"%zu,%zu",width,height);

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // Create a bitmap graphics context with the sample buffer data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                 bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);


    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);
    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    // Release the context and color space
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    // Create a UIImage from the Quartz image
    //UIImage *image = [UIImage imageWithCGImage:quartzImage];
    UIImage *image = [UIImage imageWithCGImage:quartzImage scale:1.0f orientation:UIImageOrientationRight];

    // Release the Quartz image
    CGImageRelease(quartzImage);

    return (image);
}
   
   
- (CGRect)calcRect:(CGSize)imageSize{
    NSString* gravity = self.previewLayer.videoGravity;
    CGRect cropRect = self.cropRect;
    CGSize screenSize = self.previewLayer.bounds.size;

    CGFloat screenRatio = screenSize.height / screenSize.width;
    CGFloat imageRatio = imageSize.height / imageSize.width;

    CGRect presentImageRect = self.previewLayer.bounds;
    CGFloat scale = 1.0;


    if ([AVLayerVideoGravityResizeAspect isEqual:gravity]){

        CGFloat presentImageWidth = imageSize.width;
        CGFloat presentImageHeigth = imageSize.height;
        if (screenRatio > imageRatio){
            presentImageWidth = screenSize.width;
            presentImageHeigth = presentImageWidth * imageRatio;

        } else {
            presentImageHeigth = screenSize.height;
            presentImageWidth = presentImageHeigth / imageRatio;
        }

        presentImageRect.size = CGSizeMake(presentImageWidth, presentImageHeigth);
        presentImageRect.origin = CGPointMake((screenSize.width - presentImageWidth) / 2.0, (screenSize.height - presentImageHeigth) / 2.0);

    } else if ([AVLayerVideoGravityResizeAspectFill isEqual:gravity]){

        CGFloat presentImageWidth = imageSize.width;
        CGFloat presentImageHeigth = imageSize.height;
        if (screenRatio > imageRatio){
            presentImageHeigth = screenSize.height;
            presentImageWidth = presentImageHeigth / imageRatio;
        } else {
            presentImageWidth = screenSize.width;
            presentImageHeigth = presentImageWidth * imageRatio;
        }

        presentImageRect.size = CGSizeMake(presentImageWidth, presentImageHeigth);
        presentImageRect.origin = CGPointMake((screenSize.width - presentImageWidth) / 2.0, (screenSize.height - presentImageHeigth) / 2.0);

    } else {
        NSAssert(0, @"dont support:%@", gravity);
    }

    scale = CGRectGetWidth(presentImageRect) / imageSize.width;

    CGRect rect = cropRect;
    rect.origin = CGPointMake(CGRectGetMinX(cropRect) - CGRectGetMinX(presentImageRect), CGRectGetMinY(cropRect) - CGRectGetMinY(presentImageRect));

    rect.origin.x /= scale;
    rect.origin.y /= scale;
    rect.size.width /= scale;
    rect.size.height /= scale;

    return rect;
}

#define SUBSET_SIZE 360 
   
- (UIImage*)cropImageInRect:(UIImage*)image{

    CGSize size = [image size];
    CGRect cropRect = [self calcRect:size];

    float scale = fminf(1.0f, fmaxf(SUBSET_SIZE / cropRect.size.width, SUBSET_SIZE / cropRect.size.height));
    CGPoint offset = CGPointMake(-cropRect.origin.x, -cropRect.origin.y);

    size_t subsetWidth = cropRect.size.width * scale;
    size_t subsetHeight = cropRect.size.height * scale;


    CGColorSpaceRef grayColorSpace = CGColorSpaceCreateDeviceGray();

    CGContextRef ctx =
    CGBitmapContextCreate(nil,
                          subsetWidth,
                          subsetHeight,
                          8,
                          0,
                          grayColorSpace,
                          kCGImageAlphaNone | kCGBitmapByteOrderDefault);
    CGColorSpaceRelease(grayColorSpace);
    CGContextSetInterpolationQuality(ctx, kCGInterpolationNone);
    CGContextSetAllowsAntialiasing(ctx, false);

    // adjust the coordinate system
    CGContextTranslateCTM(ctx, 0.0, subsetHeight);
    CGContextScaleCTM(ctx, 1.0, -1.0);


    UIGraphicsPushContext(ctx);
    CGRect rect = CGRectMake(offset.x * scale, offset.y * scale, scale * size.width, scale * size.height);

    [image drawInRect:rect];

    UIGraphicsPopContext();

    CGContextFlush(ctx);


    CGImageRef subsetImageRef = CGBitmapContextCreateImage(ctx);

    UIImage* subsetImage = [UIImage imageWithCGImage:subsetImageRef];

    CGImageRelease(subsetImageRef);

    CGContextRelease(ctx);


    return subsetImage;
}
   
   
   
- (void)start{

    dispatch_sync(self.camera_sample_queue, ^{
        [self.session startRunning]; });
}

- (void)stop{
    if (self.session){
        [self.session stopRunning];
    }
}

   
@end 
</pre>
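For completeness, a hedged usage sketch. It assumes ScanView.h exposes the cropRect property and the start/stop methods shown above, and that the hosting view controller builds the view in code; the frame and crop values are made up:

<pre name="code" class="objc">
#import "ScanView.h"

// In the hosting view controller (hypothetical):
- (void)viewDidLoad {
    [super viewDidLoad];

    ScanView *scanView = [[ScanView alloc] initWithFrame:self.view.bounds];
    // The on-screen "hole"; calcRect maps this rect back into the captured frame.
    scanView.cropRect = CGRectMake(60, 180, 200, 200);
    [self.view addSubview:scanView];

    [scanView start];   // spins up the AVCaptureSession on the sample queue
}
</pre>

In practice you would keep a reference to the scan view so you can call stop (for example in viewWillDisappear:) and read cropImage whenever you need the cropped frame.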