When capturing video there are input devices and output devices, and the program coordinates the flow of data between them through an instance of AVCaptureSession.
At a minimum, the program needs:
● An instance of AVCaptureDevice to represent the input device, such as a camera or microphone
● An instance of a concrete subclass of AVCaptureInput to configure the ports from the input device
● An instance of a concrete subclass of AVCaptureOutput to manage the output to a movie file or still image
● An instance of AVCaptureSession to coordinate the data flow from the input to the output
As the figure above shows, a single AVCaptureSession can coordinate multiple input and output devices. Inputs and outputs are added to the session with AVCaptureSession's addInput: and addOutput: methods.
The link between a capture input and a capture output is represented by an AVCaptureConnection object. A capture input (AVCaptureInput) has one or more input ports (AVCaptureInputPort instances), and a capture output (an AVCaptureOutput instance) can accept data from one or more sources.
When an input or an output is added to an AVCaptureSession, the session "greedily" forms connections (AVCaptureConnection) between all compatible input ports and outputs, so you normally do not need to create the connections by hand.
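The setup described above can be sketched as follows; the variable names are illustrative, and note that canAddInput:/canAddOutput: should be checked before adding:

```objc
AVCaptureSession *session = [[AVCaptureSession alloc] init];

AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
AVCaptureMovieFileOutput *output = [[AVCaptureMovieFileOutput alloc] init];

// Ask the session first; adding an incompatible input or output raises an exception.
if ([session canAddInput:input]) {
    [session addInput:input];
}
if ([session canAddOutput:output]) {
    [session addOutput:output];
}

// The session has now formed AVCaptureConnection objects automatically;
// they can be inspected through the output's connections array.
for (AVCaptureConnection *connection in output.connections) {
    NSLog(@"connection: %@", connection);
}
```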
Input device:
AVCaptureDeviceInput *captureInput = [AVCaptureDeviceInput deviceInputWithDevice:
[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo] error:nil];
The media types include:
AVMediaTypeVideo
AVMediaTypeAudio
The output classes include:
AVCaptureMovieFileOutput — writes the output to a movie file
AVCaptureVideoDataOutput — for processing the captured video frames
AVCaptureAudioDataOutput — for processing the captured audio data
AVCaptureStillImageOutput — for capturing still images along with their metadata
Creating an output object:
AVCaptureMovieFileOutput *captureOutput = [[AVCaptureMovieFileOutput alloc] init];
1. Capturing to a movie file
Here the output object is an AVCaptureMovieFileOutput. Its superclass, AVCaptureFileOutput, declares the two methods used to start and stop recording:
- (void)startRecordingToOutputFileURL:(NSURL *)outputFileURL recordingDelegate:(id <AVCaptureFileOutputRecordingDelegate>)delegate
- (void)stopRecording
Before starting to record, the program should start the AVCaptureSession and only then call the method above. The full sequence:
Create the input devices, the output object, and the AVCaptureSession:
AVCaptureDeviceInput *captureInput = [AVCaptureDeviceInput deviceInputWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo] error:nil];
AVCaptureDeviceInput *microphone = [AVCaptureDeviceInput deviceInputWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio] error:nil];
/* We set up the output */
captureOutput = [[AVCaptureMovieFileOutput alloc] init];
self.captureSession = [[AVCaptureSession alloc] init];
Add the inputs and the output:
[self.captureSession addInput:captureInput];
[self.captureSession addInput:microphone];
[self.captureSession addOutput:self.captureOutput];
Set the session preset:
/* We use medium quality; on the iPhone 4 this demo would lag too much, since the conversion to UIImage and CGImage demands too many resources at 720p resolution. */
[self.captureSession setSessionPreset:AVCaptureSessionPresetMedium];
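Not every device supports every preset, so it is worth asking the session first; a minimal sketch (the 720p preset chosen here is just an example):

```objc
// Fall back to a lower preset if the device cannot deliver the requested one.
if ([self.captureSession canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
    [self.captureSession setSessionPreset:AVCaptureSessionPreset1280x720];
} else {
    [self.captureSession setSessionPreset:AVCaptureSessionPresetMedium];
}
```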
Other presets include AVCaptureSessionPresetHigh, AVCaptureSessionPresetLow, AVCaptureSessionPresetPhoto, AVCaptureSessionPreset640x480, and AVCaptureSessionPreset1280x720; the actual resolution a given preset yields varies from device to device.
Start recording; the video is encoded as H.264 and the audio as AAC:
[self performSelector:@selector(startRecording) withObject:nil afterDelay:10.0];
[self.captureSession startRunning];
- (void) startRecording
{
[captureOutput startRecordingToOutputFileURL:[self tempFileURL] recordingDelegate:self];
}
The class that handles recording events must conform to the AVCaptureFileOutputRecordingDelegate protocol and implement these two methods:
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didStartRecordingToOutputFileAtURL:(NSURL *)fileURL fromConnections:(NSArray *)connections
{
NSLog(@"start record video");
}
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
// Copy the movie from the temporary directory to the Photos album so it remains accessible
[library writeVideoAtPathToSavedPhotosAlbum:outputFileURL
completionBlock:^(NSURL *assetURL, NSError *error) {
if (error) {
_myLabel.text = @"Error";
} else {
_myLabel.text = [assetURL path];
}
}];
[library release];
}
Stop recording with AVCaptureFileOutput's stopRecording method.
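For completeness, tearing things down could look like this; the method name stopCamera is an illustrative assumption:

```objc
- (void)stopCamera
{
    // Finishes writing the movie file; the delegate's
    // captureOutput:didFinishRecordingToOutputFileAtURL:fromConnections:error:
    // is called once the file is finalized.
    [captureOutput stopRecording];
    [self.captureSession stopRunning];
}
```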
2. Capturing video frames for processing
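A common approach, sketched under the assumption that the session is configured as above: use an AVCaptureVideoDataOutput as the output object and receive each captured frame as a CMSampleBuffer through the AVCaptureVideoDataOutputSampleBufferDelegate protocol. The queue name and pixel format below are illustrative choices:

```objc
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];

// Ask for BGRA pixel buffers, which are convenient for converting to CGImages.
videoOutput.videoSettings = [NSDictionary dictionaryWithObject:
    [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
    forKey:(id)kCVPixelBufferPixelFormatTypeKey];

// Drop frames that arrive while the delegate is still busy with an earlier one.
videoOutput.alwaysDiscardsLateVideoFrames = YES;

// Frames are delivered on this serial queue, not on the main thread.
dispatch_queue_t queue = dispatch_queue_create("videoFrameQueue", NULL);
[videoOutput setSampleBufferDelegate:self queue:queue];

[self.captureSession addOutput:videoOutput];

// Delegate method, called once per captured frame:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    // ... read the pixel data here ...
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
}
```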
3. Capturing still images
Here the output object is an AVCaptureStillImageOutput; the session's preset determines the resolution of the captured image. The image format is configured through the output's outputSettings dictionary. For example:
AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
To capture a still image, send the output object a captureStillImageAsynchronouslyFromConnection:completionHandler: message. The first argument is the AVCaptureConnection to capture from; you must find the connection whose input port is collecting video:
AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in stillImageOutput.connections) {
for (AVCaptureInputPort *port in [connection inputPorts]) {
if ([[port mediaType] isEqual:AVMediaTypeVideo] ) {
videoConnection = connection;
break;
}
}
if (videoConnection) { break; }
}
The second argument is a block that takes two parameters; the first is a CMSampleBuffer containing the image data, which can then be processed:
[[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:videoConnection
completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
ALAssetsLibraryWriteImageCompletionBlock completionBlock = ^(NSURL *assetURL, NSError *error) {
if (error) {
// error handling
}
};
if (imageDataSampleBuffer != NULL) {
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
UIImage *image = [[UIImage alloc] initWithData:imageData];
// Save the image to the Photos album
[library writeImageToSavedPhotosAlbum:[image CGImage] orientation:(ALAssetOrientation)[image imageOrientation]
completionBlock:completionBlock];
[image release];
[library release];
}
else
completionBlock(nil, error);
if ([[self delegate] respondsToSelector:@selector(captureManagerStillImageCaptured:)]) {
[[self delegate] captureManagerStillImageCaptured:self];
}
}];