iOS learning: capturing the iPhone/iPod camera device

Reposted on 2012-03-29 10:33:49

Link: http://www.cocoachina.com/bbs/read.php?tid=51754&fpage=2

I have since written a newer version of this post; if you cannot figure out how to use this one, see the new post. If that still doesn't help, there is nothing more I can do.
iPhone camera capture (simplified, self-contained version)



Goal: open and close the front camera, draw the preview, and get the camera's raw binary data.
Required frameworks:
AVFoundation.framework, CoreVideo.framework, CoreMedia.framework, QuartzCore.framework
Camera capture must be built for a real device; it cannot run in the simulator.
Function overview

- (void)createControl
{
    // Create the UI controls
}

- (AVCaptureDevice *)getFrontCamera;
Returns the front-facing camera device.

- (void)startVideoCapture;
Opens the camera and starts capturing frames. In it, this code:

AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self->avCaptureSession];
previewLayer.frame = localView.bounds;
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[self->localView.layer addSublayer:previewLayer];

renders the camera preview into the UIView.

- (void)stopVideoCapture:(id)arg;

Closes the camera and stops capturing. In it, this code:

NSArray *sublayers = [[self->localView.layer.sublayers copy] autorelease];
for (CALayer *layer in sublayers) {
    [layer removeFromSuperlayer];
}

removes the camera preview from localView. (The preview was added as a CALayer sublayer, so removing only the subviews would not take it down.)

See the full listing below; it can be copied into a project and used directly. Over!
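Since the stated goal includes getting the camera's raw binary data, here is a minimal sketch (my addition, not from the original post) of copying the pixel bytes out of the locked buffer inside the sample-buffer delegate; `handleFrameBytes:` is a hypothetical method on your own class:

```objc
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (CVPixelBufferLockBaseAddress(pixelBuffer, 0) == kCVReturnSuccess) {
        // Base address and total size of the pixel data
        UInt8 *src = (UInt8 *)CVPixelBufferGetBaseAddress(pixelBuffer);
        size_t size = CVPixelBufferGetDataSize(pixelBuffer);
        // Copy the bytes out so they remain valid after the buffer is unlocked
        NSData *frameData = [NSData dataWithBytes:src length:size];
        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
        // Hand the raw frame to your own code (hypothetical method):
        // [self handleFrameBytes:frameData];
    }
}
```

Note that this delegate runs on the serial dispatch queue set in startVideoCapture, not on the main thread.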


Code:
Header file:

//
//  AVCallController.h
//  Pxlinstall
//
//  Created by Lin Charlie C. on 11-3-24.
//  Copyright 2011  xxxx. All rights reserved.
//


#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>




@interface AVCallController : UIViewController <AVCaptureVideoDataOutputSampleBufferDelegate>
{
    // UI
    UILabel *labelState;
    UIButton *btnStartVideo;
    UIView *localView;

    AVCaptureSession *avCaptureSession;
    AVCaptureDevice *avCaptureDevice;
    BOOL firstFrame;    // whether this is the first frame
    int producerFps;
}
@property (nonatomic, retain) AVCaptureSession *avCaptureSession;
@property (nonatomic, retain) UILabel *labelState;


- (void)createControl;
- (AVCaptureDevice *)getFrontCamera;
- (void)startVideoCapture;
- (void)stopVideoCapture:(id)arg;
@end
////////////////////////////////////////////////////////////////////////////////

Implementation file:

//
//  AVCallController.m
//  Pxlinstall
//
//  Created by Lin Charlie C. on 11-3-24.
//  Copyright 2011  高鸿移通. All rights reserved.
//


#import "AVCallController.h"




@implementation AVCallController


@synthesize avCaptureSession;
@synthesize labelState;


// The designated initializer.  Override if you create the controller programmatically and want to perform customization that is not appropriate for viewDidLoad.
/*
- (id)initWithNibName:(NSString *)nibNameOrNil bundle:(NSBundle *)nibBundleOrNil {
    self = [super initWithNibName:nibNameOrNil bundle:nibBundleOrNil];
    if (self) {
        // Custom initialization.
    }
    return self;
}
*/
- (id)init
{
    if (self = [super init])
    {
        firstFrame = YES;
        producerFps = 50;
    }
    return self;
}


// Implement loadView to create a view hierarchy programmatically, without using a nib.
- (void)loadView {
    [super loadView];
    [self createControl];
}




/*
// Implement viewDidLoad to do additional setup after loading the view, typically from a nib.
- (void)viewDidLoad {
    [super viewDidLoad];
}
*/


/*
// Override to allow orientations other than the default portrait orientation.
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation {
    // Return YES for supported orientations.
    return (interfaceOrientation == UIInterfaceOrientationPortrait);
}
*/


- (void)didReceiveMemoryWarning {
    // Releases the view if it doesn't have a superview.
    [super didReceiveMemoryWarning];

    // Release any cached data, images, etc. that aren't in use.
}


- (void)viewDidUnload {
    [super viewDidUnload];
    // Release any retained subviews of the main view.
    // e.g. self.myOutlet = nil;
}




- (void)dealloc {
    [super dealloc];
}


#pragma mark -
#pragma mark createControl
- (void)createControl
{
    // Build the UI
    self.view.backgroundColor = [UIColor grayColor];
    labelState = [[UILabel alloc] initWithFrame:CGRectMake(10, 20, 220, 30)];
    labelState.backgroundColor = [UIColor clearColor];
    [self.view addSubview:labelState];
    [labelState release];

    btnStartVideo = [[UIButton alloc] initWithFrame:CGRectMake(20, 350, 80, 50)];
    [btnStartVideo setTitle:@"Start" forState:UIControlStateNormal];
    [btnStartVideo setBackgroundImage:[UIImage imageNamed:@"Images/button.png"] forState:UIControlStateNormal];
    [btnStartVideo addTarget:self action:@selector(startVideoCapture) forControlEvents:UIControlEventTouchUpInside];
    [self.view addSubview:btnStartVideo];
    [btnStartVideo release];

    UIButton *stop = [[UIButton alloc] initWithFrame:CGRectMake(120, 350, 80, 50)];
    [stop setTitle:@"Stop" forState:UIControlStateNormal];
    [stop setBackgroundImage:[UIImage imageNamed:@"Images/button.png"] forState:UIControlStateNormal];
    [stop addTarget:self action:@selector(stopVideoCapture:) forControlEvents:UIControlEventTouchUpInside];
    [self.view addSubview:stop];
    [stop release];

    localView = [[UIView alloc] initWithFrame:CGRectMake(40, 50, 200, 300)];
    [self.view addSubview:localView];
    [localView release];
}
#pragma mark -
#pragma mark VideoCapture
- (AVCaptureDevice *)getFrontCamera
{
    // Find the front-facing camera; fall back to the default video device
    NSArray *cameras = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in cameras)
    {
        if (device.position == AVCaptureDevicePositionFront)
            return device;
    }
    return [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
}
- (void)startVideoCapture
{
    // Open the camera device and start capturing frames
    [labelState setText:@"Starting Video stream"];
    if (self->avCaptureDevice || self->avCaptureSession)
    {
        [labelState setText:@"Already capturing"];
        return;
    }

    if ((self->avCaptureDevice = [self getFrontCamera]) == nil)
    {
        [labelState setText:@"Failed to get valid capture device"];
        return;
    }

    NSError *error = nil;
    AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:self->avCaptureDevice error:&error];
    if (!videoInput)
    {
        [labelState setText:@"Failed to get video input"];
        self->avCaptureDevice = nil;
        return;
    }

    self->avCaptureSession = [[AVCaptureSession alloc] init];
    self->avCaptureSession.sessionPreset = AVCaptureSessionPresetLow;
    [self->avCaptureSession addInput:videoInput];

    // Currently, the only supported key is kCVPixelBufferPixelFormatTypeKey. Recommended pixel format choices are
    // kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange or kCVPixelFormatType_32BGRA.
    // On iPhone 3G, the recommended pixel format choices are kCVPixelFormatType_422YpCbCr8 or kCVPixelFormatType_32BGRA.
    AVCaptureVideoDataOutput *avCaptureVideoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    NSDictionary *settings = [[NSDictionary alloc] initWithObjectsAndKeys:
        //[NSNumber numberWithUnsignedInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange], kCVPixelBufferPixelFormatTypeKey,
        [NSNumber numberWithInt:240], (id)kCVPixelBufferWidthKey,
        [NSNumber numberWithInt:320], (id)kCVPixelBufferHeightKey,
        nil];
    avCaptureVideoDataOutput.videoSettings = settings;
    [settings release];
    avCaptureVideoDataOutput.minFrameDuration = CMTimeMake(1, self->producerFps);
    /* We create a serial queue to handle the processing of our frames */
    dispatch_queue_t queue = dispatch_queue_create("org.doubango.idoubs", NULL);
    [avCaptureVideoDataOutput setSampleBufferDelegate:self queue:queue];
    [self->avCaptureSession addOutput:avCaptureVideoDataOutput];
    [avCaptureVideoDataOutput release];
    dispatch_release(queue);

    AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self->avCaptureSession];
    previewLayer.frame = localView.bounds;
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self->localView.layer addSublayer:previewLayer];

    self->firstFrame = YES;
    [self->avCaptureSession startRunning];

    [labelState setText:@"Video capture started"];
}
- (void)stopVideoCapture:(id)arg
{
    // Stop camera capture
    if (self->avCaptureSession) {
        [self->avCaptureSession stopRunning];
        [self->avCaptureSession release];   // session was alloc'd in startVideoCapture
        self->avCaptureSession = nil;
        [labelState setText:@"Video capture stopped"];
    }
    self->avCaptureDevice = nil;
    // Remove the preview from localView; it was added as a CALayer sublayer,
    // so removing subviews alone would not take it down
    NSArray *sublayers = [[self->localView.layer.sublayers copy] autorelease];
    for (CALayer *layer in sublayers) {
        [layer removeFromSuperlayer];
    }
}
#pragma mark -
#pragma mark AVCaptureVideoDataOutputSampleBufferDelegate
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    // Captured frame data arrives here; process it however you like
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    /* Lock the buffer */
    if (CVPixelBufferLockBaseAddress(pixelBuffer, 0) == kCVReturnSuccess)
    {
        UInt8 *bufferPtr = (UInt8 *)CVPixelBufferGetBaseAddress(pixelBuffer);
        size_t bufferSize = CVPixelBufferGetDataSize(pixelBuffer);

        if (self->firstFrame)
        {
            // On the first frame, record the width, height and pixel format
            size_t width = CVPixelBufferGetWidth(pixelBuffer);
            size_t height = CVPixelBufferGetHeight(pixelBuffer);

            OSType pixelFormat = CVPixelBufferGetPixelFormatType(pixelBuffer);
            switch (pixelFormat) {
                case kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange:
                    //TMEDIA_PRODUCER(producer)->video.chroma = tmedia_nv12; // iPhone 3GS or 4
                    NSLog(@"Capture pixel format=NV12");
                    break;
                case kCVPixelFormatType_422YpCbCr8:
                    //TMEDIA_PRODUCER(producer)->video.chroma = tmedia_uyvy422; // iPhone 3
                    NSLog(@"Capture pixel format=UYUY422");
                    break;
                default:
                    //TMEDIA_PRODUCER(producer)->video.chroma = tmedia_rgb32;
                    NSLog(@"Capture pixel format=RGB32");
                    break;
            }

            self->firstFrame = NO;
        }
        /* Unlock the buffer */
        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    }
/*We create an autorelease pool because as we are not in the main_queue our code is
 not executed in the main thread. So we have to create an autorelease pool for the thread we are in*/
// NSAutoreleasePool * pool = [[NSAutoreleasePool alloc] init];
// 
//    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 
//    /*Lock the image buffer*/
//    CVPixelBufferLockBaseAddress(imageBuffer,0); 
//    /*Get information about the image*/
//    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer); 
//    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer); 
//    size_t width = CVPixelBufferGetWidth(imageBuffer); 
//    size_t height = CVPixelBufferGetHeight(imageBuffer);  
//    
//    /*Create a CGImageRef from the CVImageBufferRef*/
//    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 
//    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
//    CGImageRef newImage = CGBitmapContextCreateImage(newContext); 
// 
//    /*We release some components*/
//    CGContextRelease(newContext); 
//    CGColorSpaceRelease(colorSpace);
//    
//    /*We display the result on the custom layer. All the display stuff must be done in the main thread because
//  UIKit is no thread safe, and as we are not in the main thread (remember we didn't use the main_queue)
//  we use performSelectorOnMainThread to call our CALayer and tell it to display the CGImage.*/
// [self.customLayer performSelectorOnMainThread:@selector(setContents:) withObject: (id) newImage waitUntilDone:YES];
// 
// /*We display the result on the image view (We need to change the orientation of the image so that the video is displayed correctly).
//  Same thing as for the CALayer we are not in the main thread so ...*/
// UIImage *image= [UIImage imageWithCGImage:newImage scale:1.0 orientation:UIImageOrientationRight];
// 
// /*We relase the CGImageRef*/
// CGImageRelease(newImage);
// 
// [self.imageView performSelectorOnMainThread:@selector(setImage:) withObject:image waitUntilDone:YES];
// 
// /*We unlock the  image buffer*/
// CVPixelBufferUnlockBaseAddress(imageBuffer,0);
// 
// [pool drain];
}
@end
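
As a quick usage sketch (my addition, not part of the original post), the controller can be shown from a plain window-based, pre-storyboard MRC app delegate; the delegate class and its `window` property below are assumptions for illustration:

```objc
// Hypothetical app delegate snippet that puts AVCallController on screen.
- (BOOL)application:(UIApplication *)application
didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    self.window = [[[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]] autorelease];
    AVCallController *callController = [[[AVCallController alloc] init] autorelease];
    self.window.rootViewController = callController;
    [self.window makeKeyAndVisible];
    return YES;
}
```

Remember that capture only works when the app is built for and run on a real device.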
