Video
MPMoviePlayerController
On iOS, video playback can be done with the MPMoviePlayerController class in MediaPlayer.framework, which supports both local and network video. The class implements the MPMediaPlayback protocol, so it offers the usual player controls such as play, pause, and stop. MPMoviePlayerController is not itself a view controller, however: to show video in the UI, its view property must be added to the interface. Its commonly used properties and methods are listed below:
| Property | Description |
| --- | --- |
| @property (nonatomic, copy) NSURL *contentURL | URL of the media to play; can be a local path or a network URL |
| @property (nonatomic, readonly) UIView *view | The player view; must be added to a view hierarchy to display video |
| @property (nonatomic, readonly) UIView *backgroundView | The player's background view |
| @property (nonatomic, readonly) MPMoviePlaybackState playbackState | Playback state, an enum: MPMoviePlaybackStateStopped, MPMoviePlaybackStatePlaying, MPMoviePlaybackStatePaused, MPMoviePlaybackStateInterrupted, MPMoviePlaybackStateSeekingForward, MPMoviePlaybackStateSeekingBackward |
| @property (nonatomic, readonly) MPMovieLoadState loadState | Network media load state, an enum: MPMovieLoadStateUnknown (unknown), MPMovieLoadStatePlayable, MPMovieLoadStatePlaythroughOK, MPMovieLoadStateStalled (stalled) |
| @property (nonatomic) MPMovieControlStyle controlStyle | Control panel style, an enum: MPMovieControlStyleNone (no control panel), MPMovieControlStyleEmbedded, MPMovieControlStyleFullscreen, MPMovieControlStyleDefault |
| @property (nonatomic) MPMovieRepeatMode repeatMode | Repeat mode, an enum: MPMovieRepeatModeNone (no repeat, the default), MPMovieRepeatModeOne (repeat playback) |
| @property (nonatomic) BOOL shouldAutoplay | Whether to start playing automatically once enough network data has buffered; defaults to YES |
| @property (nonatomic, getter=isFullscreen) BOOL fullscreen | Whether to display fullscreen; defaults to NO. Note that setting fullscreen through this property only takes effect after the view has finished displaying |
| @property (nonatomic) MPMovieScalingMode scalingMode | Video scaling/fill mode, an enum: MPMovieScalingModeNone (no scaling), MPMovieScalingModeAspectFit (scale preserving the aspect ratio to fit within the view), MPMovieScalingModeAspectFill (scale preserving the aspect ratio to fill the view, cropping if necessary), MPMovieScalingModeFill (stretch to fill the whole view; the video is not cropped, but its aspect ratio is distorted) |
| @property (nonatomic, readonly) BOOL readyForDisplay | Whether the video is ready for display |
| @property (nonatomic, readonly) MPMovieMediaTypeMask movieMediaTypes | Media types, an enum mask: MPMovieMediaTypeMaskNone, MPMovieMediaTypeMaskVideo, MPMovieMediaTypeMaskAudio |
| @property (nonatomic) MPMovieSourceType movieSourceType | Media source, an enum: MPMovieSourceTypeUnknown, MPMovieSourceTypeFile (local file), MPMovieSourceTypeStreaming |
| @property (nonatomic, readonly) NSTimeInterval duration | Media duration; returns 0 if unknown |
| @property (nonatomic, readonly) NSTimeInterval playableDuration | Playable duration, mainly indicating how much of a network media item has been downloaded |
| @property (nonatomic, readonly) CGSize naturalSize | Actual video size; returns CGSizeZero if unknown |
| @property (nonatomic) NSTimeInterval initialPlaybackTime | Start playback time |
| @property (nonatomic) NSTimeInterval endPlaybackTime | End playback time |
| @property (nonatomic) BOOL allowsAirPlay | Whether AirPlay is allowed; defaults to YES |
| @property (nonatomic, readonly, getter=isAirPlayVideoActive) BOOL airPlayVideoActive | Whether the current media is playing over AirPlay |
| @property (nonatomic, readonly) BOOL isPreparedToPlay | Whether the player is prepared to play |
| @property (nonatomic) NSTimeInterval currentPlaybackTime | Current playback time, in seconds |
| @property (nonatomic) float currentPlaybackRate | Current playback rate: 0 when paused, 1.0 for normal speed; other nonzero values are rate multipliers |
| Instance method | Description |
| --- | --- |
| - (instancetype)initWithContentURL:(NSURL *)url | Initializes a movie player controller with the given URL |
| - (void)setFullscreen:(BOOL)fullscreen animated:(BOOL)animated | Sets fullscreen display; note this only takes effect after the view has been displayed |
| - (void)requestThumbnailImagesAtTimes:(NSArray *)playbackTimes timeOption:(MPMovieTimeOption)option | Requests thumbnails at the given playback times. The first argument is an array of time points; the second is the time precision, an enum: MPMovieTimeOptionNearestKeyFrame, MPMovieTimeOptionExact (exact time) |
| - (void)cancelAllThumbnailImageRequests | Cancels all pending thumbnail requests |
| - (void)prepareToPlay | Prepares playback by loading video data into the buffer; if the player is not prepared when play is called, this method is invoked automatically |
| - (void)play | Starts playback |
| - (void)pause | Pauses playback |
| - (void)stop | Stops playback |
| - (void)beginSeekingForward | Begins seeking forward |
| - (void)beginSeekingBackward | Begins seeking backward |
| - (void)endSeeking | Stops seeking forward/backward |
| Notification | Description |
| --- | --- |
| MPMoviePlayerScalingModeDidChangeNotification | The video scaling/fill mode changed |
| MPMoviePlayerPlaybackDidFinishNotification | Playback finished or the user exited; the concrete reason is available in the notification's userInfo under the MPMoviePlayerPlaybackDidFinishReasonUserInfoKey key |
| MPMoviePlayerPlaybackStateDidChangeNotification | The playback state changed; combine with the playbackState property to get the concrete state |
| MPMoviePlayerLoadStateDidChangeNotification | The network media load state changed |
| MPMoviePlayerNowPlayingMovieDidChangeNotification | The currently playing media changed |
| MPMoviePlayerWillEnterFullscreenNotification | About to enter fullscreen |
| MPMoviePlayerDidEnterFullscreenNotification | Entered fullscreen |
| MPMoviePlayerWillExitFullscreenNotification | About to exit fullscreen |
| MPMoviePlayerDidExitFullscreenNotification | Exited fullscreen |
| MPMoviePlayerIsAirPlayVideoActiveDidChangeNotification | Playback over AirPlay started or ended |
| MPMoviePlayerReadyForDisplayDidChangeNotification | The video display readiness changed |
| MPMovieMediaTypesAvailableNotification | The available media types were determined |
| MPMovieSourceTypeAvailableNotification | The media source was determined |
| MPMovieDurationAvailableNotification | The media duration was determined |
| MPMovieNaturalSizeAvailableNotification | The natural video size was determined |
| MPMoviePlayerThumbnailImageRequestDidFinishNotification | A thumbnail request finished |
| MPMediaPlaybackIsPreparedToPlayDidChangeNotification | The player became prepared to play |
Note that MPMoviePlayerController does not communicate its state and other information to the outside world through a delegate but through the notification center, which is why the list above contains so many common notifications. Since MPMoviePlayerController wraps media playback so thoroughly, using it is quite simple: create an MPMoviePlayerController object, set its frame, and add the controller's view to the view controller's view. The following example creates a player controller and registers for the playback-state-change and playback-finished notifications:
```objc
//
//  ViewController.m
//  MPMoviePlayerController
//
//  Created by Kenshin Cui on 14/03/30.
//  Copyright (c) 2014 cmjstudio. All rights reserved.
//

#import "ViewController.h"
#import <MediaPlayer/MediaPlayer.h>

@interface ViewController ()

@property (nonatomic,strong) MPMoviePlayerController *moviePlayer; // movie player controller

@end

@implementation ViewController

#pragma mark - View lifecycle
- (void)viewDidLoad {
    [super viewDidLoad];
    // start playback
    [self.moviePlayer play];
    // register for notifications
    [self addNotification];
}

- (void)dealloc {
    // remove all notification observers
    [[NSNotificationCenter defaultCenter] removeObserver:self];
}

#pragma mark - Private methods
- (NSURL *)getFileUrl {
    NSString *urlStr = [[NSBundle mainBundle] pathForResource:@"The New Look of OS X Yosemite.mp4" ofType:nil];
    NSURL *url = [NSURL fileURLWithPath:urlStr];
    return url;
}

- (NSURL *)getNetworkUrl {
    NSString *urlStr = @"http://192.168.1.161/The New Look of OS X Yosemite.mp4";
    urlStr = [urlStr stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
    NSURL *url = [NSURL URLWithString:urlStr];
    return url;
}

- (MPMoviePlayerController *)moviePlayer {
    if (!_moviePlayer) {
        NSURL *url = [self getNetworkUrl];
        _moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:url];
        _moviePlayer.view.frame = self.view.bounds;
        _moviePlayer.view.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight;
        [self.view addSubview:_moviePlayer.view];
    }
    return _moviePlayer;
}

#pragma mark - Notifications
- (void)addNotification {
    NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
    [notificationCenter addObserver:self selector:@selector(mediaPlayerPlaybackStateChange:) name:MPMoviePlayerPlaybackStateDidChangeNotification object:self.moviePlayer];
    [notificationCenter addObserver:self selector:@selector(mediaPlayerPlaybackFinished:) name:MPMoviePlayerPlaybackDidFinishNotification object:self.moviePlayer];
}

- (void)mediaPlayerPlaybackStateChange:(NSNotification *)notification {
    switch (self.moviePlayer.playbackState) {
        case MPMoviePlaybackStatePlaying:
            NSLog(@"Playing...");
            break;
        case MPMoviePlaybackStatePaused:
            NSLog(@"Paused.");
            break;
        case MPMoviePlaybackStateStopped:
            NSLog(@"Stopped.");
            break;
        default:
            NSLog(@"Playback state: %li", (long)self.moviePlayer.playbackState);
            break;
    }
}

- (void)mediaPlayerPlaybackFinished:(NSNotification *)notification {
    NSLog(@"Playback finished. %li", (long)self.moviePlayer.playbackState);
}

@end
```
Running result:
From the API above it is not hard to see that MPMoviePlayerController is quite powerful and entirely adequate as a general-purpose media player in day-to-day development. Beyond ordinary playback and control it offers some additional capabilities, such as capturing video thumbnails. To request thumbnails, simply call the requestThumbnailImagesAtTimes:timeOption: method with an array of time points; when a thumbnail is ready, the MPMoviePlayerThumbnailImageRequestDidFinishNotification notification fires and the image can be read from its userInfo:
```objc
//
//  ViewController.m
//  MPMoviePlayerController
//
//  Created by Kenshin Cui on 14/03/30.
//  Copyright (c) 2014 cmjstudio. All rights reserved.
//  Video thumbnail capture

#import "ViewController.h"
#import <MediaPlayer/MediaPlayer.h>

@interface ViewController ()

@property (nonatomic,strong) MPMoviePlayerController *moviePlayer; // movie player controller

@end

@implementation ViewController

#pragma mark - View lifecycle
- (void)viewDidLoad {
    [super viewDidLoad];
    // start playback
    [self.moviePlayer play];
    // register for notifications
    [self addNotification];
    // request thumbnails
    [self thumbnailImageRequest];
}

- (void)dealloc {
    // remove all notification observers
    [[NSNotificationCenter defaultCenter] removeObserver:self];
}

#pragma mark - Private methods
- (NSURL *)getFileUrl {
    NSString *urlStr = [[NSBundle mainBundle] pathForResource:@"The New Look of OS X Yosemite.mp4" ofType:nil];
    NSURL *url = [NSURL fileURLWithPath:urlStr];
    return url;
}

- (NSURL *)getNetworkUrl {
    NSString *urlStr = @"http://192.168.1.161/The New Look of OS X Yosemite.mp4";
    urlStr = [urlStr stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
    NSURL *url = [NSURL URLWithString:urlStr];
    return url;
}

- (MPMoviePlayerController *)moviePlayer {
    if (!_moviePlayer) {
        NSURL *url = [self getNetworkUrl];
        _moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:url];
        _moviePlayer.view.frame = self.view.bounds;
        _moviePlayer.view.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight;
        [self.view addSubview:_moviePlayer.view];
    }
    return _moviePlayer;
}

- (void)thumbnailImageRequest {
    // request thumbnails at 13.0s and 21.5s
    [self.moviePlayer requestThumbnailImagesAtTimes:@[@13.0, @21.5] timeOption:MPMovieTimeOptionNearestKeyFrame];
}

#pragma mark - Notifications
- (void)addNotification {
    NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
    [notificationCenter addObserver:self selector:@selector(mediaPlayerPlaybackStateChange:) name:MPMoviePlayerPlaybackStateDidChangeNotification object:self.moviePlayer];
    [notificationCenter addObserver:self selector:@selector(mediaPlayerPlaybackFinished:) name:MPMoviePlayerPlaybackDidFinishNotification object:self.moviePlayer];
    [notificationCenter addObserver:self selector:@selector(mediaPlayerThumbnailRequestFinished:) name:MPMoviePlayerThumbnailImageRequestDidFinishNotification object:self.moviePlayer];
}

- (void)mediaPlayerPlaybackStateChange:(NSNotification *)notification {
    switch (self.moviePlayer.playbackState) {
        case MPMoviePlaybackStatePlaying:
            NSLog(@"Playing...");
            break;
        case MPMoviePlaybackStatePaused:
            NSLog(@"Paused.");
            break;
        case MPMoviePlaybackStateStopped:
            NSLog(@"Stopped.");
            break;
        default:
            NSLog(@"Playback state: %li", (long)self.moviePlayer.playbackState);
            break;
    }
}

- (void)mediaPlayerPlaybackFinished:(NSNotification *)notification {
    NSLog(@"Playback finished. %li", (long)self.moviePlayer.playbackState);
}

- (void)mediaPlayerThumbnailRequestFinished:(NSNotification *)notification {
    NSLog(@"Thumbnail capture finished.");
    UIImage *image = notification.userInfo[MPMoviePlayerThumbnailImageKey];
    // save the image to the photo album (the first call asks the user for album access)
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
}

@end
```
Thumbnail result:
Extension: generating thumbnails with AVFoundation
As seen above, generating thumbnails with MPMoviePlayerController is simple enough, but if the goal is only to generate thumbnails without playing the video, MPMoviePlayerController is overkill. The AVAssetImageGenerator class in the AVFoundation framework can fetch video thumbnails directly, in roughly three steps:
- Create an AVURLAsset object (this class is mainly used to obtain media information, including video and audio).
- Create an AVAssetImageGenerator object from the AVURLAsset.
- Use AVAssetImageGenerator's copyCGImageAtTime:actualTime:error: method to capture an image at the specified time.
```objc
//
//  ViewController.m
//  AVAssetImageGenerator
//
//  Created by Kenshin Cui on 14/03/30.
//  Copyright (c) 2014 cmjstudio. All rights reserved.
//

#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>

@interface ViewController ()

@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // capture a thumbnail at 13.0s
    [self thumbnailImageRequest:13.0];
}

#pragma mark - Private methods
- (NSURL *)getFileUrl {
    NSString *urlStr = [[NSBundle mainBundle] pathForResource:@"The New Look of OS X Yosemite.mp4" ofType:nil];
    NSURL *url = [NSURL fileURLWithPath:urlStr];
    return url;
}

- (NSURL *)getNetworkUrl {
    NSString *urlStr = @"http://192.168.1.161/The New Look of OS X Yosemite.mp4";
    urlStr = [urlStr stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
    NSURL *url = [NSURL URLWithString:urlStr];
    return url;
}

- (void)thumbnailImageRequest:(CGFloat)timeBySecond {
    // create the URL
    NSURL *url = [self getNetworkUrl];
    // create an AVURLAsset from the URL
    AVURLAsset *urlAsset = [AVURLAsset assetWithURL:url];
    // create an AVAssetImageGenerator from the AVURLAsset
    AVAssetImageGenerator *imageGenerator = [AVAssetImageGenerator assetImageGeneratorWithAsset:urlAsset];
    NSError *error = nil;
    // CMTime is a struct representing movie time: the first argument is the time in seconds,
    // the second the timescale (frames per second). (Use CMTimeMake to address a specific frame within a second.)
    CMTime time = CMTimeMakeWithSeconds(timeBySecond, 10);
    CMTime actualTime;
    CGImageRef cgImage = [imageGenerator copyCGImageAtTime:time actualTime:&actualTime error:&error];
    if (error) {
        NSLog(@"Error capturing video thumbnail: %@", error.localizedDescription);
        return;
    }
    CMTimeShow(actualTime);
    UIImage *image = [UIImage imageWithCGImage:cgImage]; // convert to UIImage
    // save to the photo album
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
    CGImageRelease(cgImage);
}

@end
```
Generated thumbnail result:
MPMoviePlayerViewController
In practice, unless MPMoviePlayerController is used as an embedded player (for example a video embedded in a news article), it usually occupies the whole screen during playback, especially on iPhone and iPod touch. So starting with iOS 3.2 Apple reasoned: since MPMoviePlayerController's view is almost always added to another view controller as a subview anyway, why not provide a view controller that internally creates an MPMoviePlayerController and plays fullscreen by default, so developers can use it directly? That view controller, with an MPMoviePlayerController inside, is MPMoviePlayerViewController.
```objc
//
//  ViewController.m
//  MPMoviePlayerViewController
//
//  Created by Kenshin Cui on 14/03/30.
//  Copyright (c) 2014 cmjstudio. All rights reserved.
//  MPMoviePlayerViewController usage

#import "ViewController.h"
#import <MediaPlayer/MediaPlayer.h>

@interface ViewController ()

// player view controller
@property (nonatomic,strong) MPMoviePlayerViewController *moviePlayerViewController;

@end

@implementation ViewController

#pragma mark - View lifecycle
- (void)viewDidLoad {
    [super viewDidLoad];
}

- (void)dealloc {
    // remove all notification observers
    [[NSNotificationCenter defaultCenter] removeObserver:self];
}

#pragma mark - Private methods
- (NSURL *)getFileUrl {
    NSString *urlStr = [[NSBundle mainBundle] pathForResource:@"The New Look of OS X Yosemite.mp4" ofType:nil];
    NSURL *url = [NSURL fileURLWithPath:urlStr];
    return url;
}

- (NSURL *)getNetworkUrl {
    NSString *urlStr = @"http://192.168.1.161/The New Look of OS X Yosemite.mp4";
    urlStr = [urlStr stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
    NSURL *url = [NSURL URLWithString:urlStr];
    return url;
}

- (MPMoviePlayerViewController *)moviePlayerViewController {
    if (!_moviePlayerViewController) {
        NSURL *url = [self getNetworkUrl];
        _moviePlayerViewController = [[MPMoviePlayerViewController alloc] initWithContentURL:url];
        [self addNotification];
    }
    return _moviePlayerViewController;
}

#pragma mark - UI events
- (IBAction)playClick:(UIButton *)sender {
    // recreate the player view controller on every tap to avoid the video not playing on subsequent taps
    self.moviePlayerViewController = nil;
    //[self presentViewController:self.moviePlayerViewController animated:YES completion:nil];
    // Note: MPMoviePlayerViewController.h extends UIViewController with two methods for modally
    // presenting and dismissing an MPMoviePlayerViewController, adding a slide-down presentation animation
    [self presentMoviePlayerViewControllerAnimated:self.moviePlayerViewController];
}

#pragma mark - Notifications
- (void)addNotification {
    NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
    [notificationCenter addObserver:self selector:@selector(mediaPlayerPlaybackStateChange:) name:MPMoviePlayerPlaybackStateDidChangeNotification object:self.moviePlayerViewController.moviePlayer];
    [notificationCenter addObserver:self selector:@selector(mediaPlayerPlaybackFinished:) name:MPMoviePlayerPlaybackDidFinishNotification object:self.moviePlayerViewController.moviePlayer];
}

- (void)mediaPlayerPlaybackStateChange:(NSNotification *)notification {
    switch (self.moviePlayerViewController.moviePlayer.playbackState) {
        case MPMoviePlaybackStatePlaying:
            NSLog(@"Playing...");
            break;
        case MPMoviePlaybackStatePaused:
            NSLog(@"Paused.");
            break;
        case MPMoviePlaybackStateStopped:
            NSLog(@"Stopped.");
            break;
        default:
            NSLog(@"Playback state: %li", (long)self.moviePlayerViewController.moviePlayer.playbackState);
            break;
    }
}

- (void)mediaPlayerPlaybackFinished:(NSNotification *)notification {
    NSLog(@"Playback finished. %li", (long)self.moviePlayerViewController.moviePlayer.playbackState);
}

@end
```
Running result:
One thing to emphasize: since MPMoviePlayerViewController exposes its internal player through the moviePlayer property, notifications are used the same way as with MPMoviePlayerController, except that they are registered against self.moviePlayerViewController.moviePlayer, as in the code above.
AVPlayer
MPMoviePlayerController is powerful enough that a player can be built with barely any code, but precisely because of its high-level encapsulation, customizing the player becomes complicated or even impossible. For example, when a custom player UI is needed, MPMoviePlayerController is no longer a good fit; for free control over the video, use AVPlayer instead. AVPlayer lives in AVFoundation, sits closer to the bottom of the stack, and is therefore more flexible:
AVPlayer itself cannot display video, and unlike MPMoviePlayerController it has no view property. To display anything, an AVPlayer needs a player layer, AVPlayerLayer, which inherits from CALayer; once created, the AVPlayerLayer is simply added to the layer of the view controller's view. Before using AVPlayer, it helps to know a few related classes:
AVAsset: mainly used to obtain media information; an abstract class that cannot be used directly.

AVURLAsset: a subclass of AVAsset; creates an AVURLAsset object containing the media information for a given URL path.

AVPlayerItem: a media resource management object that manages a video's basic information and state; one AVPlayerItem corresponds to one video resource.
The use of AVPlayer is demonstrated below with a simple player that looks like this:

This custom player implements video playback, pause, progress display, and a video list; each of these features is described in turn.
First, play and pause, the most basic features, which map to AVPlayer's play and pause methods. The key question is how to tell whether the current video is playing: the audio and video players covered earlier all had a state property for this, but AVPlayer does not. The usual approach is to check the player's playback rate: a rate of 0 means the player is stopped, and 1 means it is playing at normal speed.
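As a minimal sketch of this rate-based check (assuming `self.player` is an already-created AVPlayer instance):

```objc
// Sketch: toggling playback based on AVPlayer's rate property.
// Assumes self.player is an existing AVPlayer.
- (void)togglePlayPause {
    if (self.player.rate == 0) {   // rate 0 means playback is stopped/paused
        [self.player play];        // resume: rate becomes 1.0
    } else {
        [self.player pause];       // pause: rate drops back to 0
    }
}
```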
Second, showing playback progress is not as simple as with the other players. Previously, notifications reported the playback and media load states, but neither AVPlayer nor AVPlayerItem (AVPlayer's currentItem property, of type AVPlayerItem, represents the video currently playing) exposes this information that way. AVPlayerItem does post notifications, but the only one useful for playback/load state is the playback-finished notification AVPlayerItemDidPlayToEndTimeNotification; the rest must be obtained by observing AVPlayerItem's status and loadedTimeRanges properties with KVO, as the code below shows.
Finally, video switching. With every player described so far, one player object can play only one video at a time, and switching videos means creating a new object. AVPlayer, however, provides the -(void)replaceCurrentItemWithPlayerItem: method to swap in a new AVPlayerItem without recreating the player.
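As a sketch of such a switch (the URL here is a placeholder assumption, not a real resource; `self.player` is assumed to be an existing AVPlayer):

```objc
// Sketch: switching the media of an existing AVPlayer without recreating it.
// The URL below is a hypothetical placeholder.
NSURL *nextURL = [NSURL URLWithString:@"http://example.com/next.mp4"];
AVPlayerItem *nextItem = [AVPlayerItem playerItemWithURL:nextURL];
// replaceCurrentItemWithPlayerItem: swaps the current video for the new one
[self.player replaceCurrentItemWithPlayerItem:nextItem];
[self.player play];
```

Note that any KVO observers and notifications registered on the old AVPlayerItem should be removed before the switch and re-registered on the new item.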
Here is the code:
```objc
//
//  ViewController.m
//  AVPlayer
//
//  Created by Kenshin Cui on 14/03/30.
//  Copyright (c) 2014 cmjstudio. All rights reserved.
//

#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>

@interface ViewController ()

@property (nonatomic,strong) AVPlayer *player; // player object
@property (weak, nonatomic) IBOutlet UIView *container; // player container
@property (weak, nonatomic) IBOutlet UIButton *playOrPause; // play/pause button
@property (weak, nonatomic) IBOutlet UIProgressView *progress; // playback progress

@end

@implementation ViewController

#pragma mark - View lifecycle
- (void)viewDidLoad {
    [super viewDidLoad];
    [self setupUI];
    [self.player play];
}

- (void)dealloc {
    [self removeObserverFromPlayerItem:self.player.currentItem];
    [self removeNotification];
}

#pragma mark - Private methods
- (void)setupUI {
    // create the player layer
    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    playerLayer.frame = self.container.frame;
    //playerLayer.videoGravity = AVLayerVideoGravityResizeAspect; // video fill mode
    [self.container.layer addSublayer:playerLayer];
}

- (AVPlayer *)player {
    if (!_player) {
        AVPlayerItem *playerItem = [self getPlayItem:0];
        _player = [AVPlayer playerWithPlayerItem:playerItem];
        [self addProgressObserver];
        [self addObserverToPlayerItem:playerItem];
    }
    return _player;
}

- (AVPlayerItem *)getPlayItem:(int)videoIndex {
    NSString *urlStr = [NSString stringWithFormat:@"http://192.168.1.161/%i.mp4", videoIndex];
    urlStr = [urlStr stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
    NSURL *url = [NSURL URLWithString:urlStr];
    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithURL:url];
    return playerItem;
}

#pragma mark - Notifications
- (void)addNotification {
    // register for the AVPlayerItem playback-finished notification
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(playbackFinished:) name:AVPlayerItemDidPlayToEndTimeNotification object:self.player.currentItem];
}

- (void)removeNotification {
    [[NSNotificationCenter defaultCenter] removeObserver:self];
}

- (void)playbackFinished:(NSNotification *)notification {
    NSLog(@"Playback finished.");
}

#pragma mark - Observers
- (void)addProgressObserver {
    AVPlayerItem *playerItem = self.player.currentItem;
    UIProgressView *progress = self.progress;
    // invoke the block once per second
    [self.player addPeriodicTimeObserverForInterval:CMTimeMake(1.0, 1.0) queue:dispatch_get_main_queue() usingBlock:^(CMTime time) {
        float current = CMTimeGetSeconds(time);
        float total = CMTimeGetSeconds([playerItem duration]);
        NSLog(@"Already played %.2fs.", current);
        if (current) {
            [progress setProgress:(current / total) animated:YES];
        }
    }];
}

- (void)addObserverToPlayerItem:(AVPlayerItem *)playerItem {
    // observe the status property; note that AVPlayer also has a status property that can be observed to get the playback status
    [playerItem addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:nil];
    // observe the network buffering property
    [playerItem addObserver:self forKeyPath:@"loadedTimeRanges" options:NSKeyValueObservingOptionNew context:nil];
}

- (void)removeObserverFromPlayerItem:(AVPlayerItem *)playerItem {
    [playerItem removeObserver:self forKeyPath:@"status"];
    [playerItem removeObserver:self forKeyPath:@"loadedTimeRanges"];
}

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    AVPlayerItem *playerItem = object;
    if ([keyPath isEqualToString:@"status"]) {
        AVPlayerStatus status = [[change objectForKey:@"new"] intValue];
        if (status == AVPlayerStatusReadyToPlay) {
            NSLog(@"Playing..., total duration: %.2f", CMTimeGetSeconds(playerItem.duration));
        }
    } else if ([keyPath isEqualToString:@"loadedTimeRanges"]) {
        NSArray *array = playerItem.loadedTimeRanges;
        CMTimeRange timeRange = [array.firstObject CMTimeRangeValue]; // the buffered time range
        float startSeconds = CMTimeGetSeconds(timeRange.start);
        float durationSeconds = CMTimeGetSeconds(timeRange.duration);
        NSTimeInterval totalBuffer = startSeconds + durationSeconds; // total buffered length
        NSLog(@"Buffered: %.2f", totalBuffer);
    }
}

#pragma mark - UI events
- (IBAction)playClick:(UIButton *)sender {
    if (self.player.rate == 0) { // paused
        [sender setImage:[UIImage imageNamed:@"player_pause"] forState:UIControlStateNormal];
        [self.player play];
    } else if (self.player.rate == 1) { // playing
        [self.player pause];
        [sender setImage:[UIImage imageNamed:@"player_play"] forState:UIControlStateNormal];
    }
}

- (IBAction)navigationButtonClick:(UIButton *)sender {
    [self removeNotification];
    [self removeObserverFromPlayerItem:self.player.currentItem];
    AVPlayerItem *playerItem = [self getPlayItem:(int)sender.tag];
    [self addObserverToPlayerItem:playerItem];
    // switch the video
    [self.player replaceCurrentItemWithPlayerItem:playerItem];
    [self addNotification];
}

@end
```
Running result:
So far, both MPMoviePlayerController and AVPlayer have proven quite capable for video playback, but they share an unavoidable limitation: the set of supported video codecs is small — H.264 and MPEG-4, with (container) extensions such as .mp4, .mov, .m4v, .m2v, .3gp, and .3g2. On the other hand, both support the vast majority of audio codecs, so they are also worth considering for pure music playback. How, then, to support more video formats? Currently this mainly relies on third-party frameworks; commonly used video encoding/decoding frameworks on iOS include VLC and ffmpeg, whose usage is not covered in detail here.
Camera
Taking photos and recording video with UIImagePickerController
Now for taking photos and recording video on iOS. The simplest way is UIImagePickerController, which inherits from UINavigationController. Earlier articles used it mainly to pick photos, but its capabilities go further: it can also take photos and record video. First, the class's commonly used properties and methods:
| Property | Description |
| --- | --- |
| @property(nonatomic) UIImagePickerControllerSourceType sourceType | Picker source type, an enum: UIImagePickerControllerSourceTypePhotoLibrary (the default), UIImagePickerControllerSourceTypeCamera, UIImagePickerControllerSourceTypeSavedPhotosAlbum |
| @property(nonatomic,copy) NSArray *mediaTypes | Media types. By default this array contains kUTTypeImage, so it need not be set for taking photos; for recording it must be set, to kUTTypeVideo (video without sound) or kUTTypeMovie (video with sound) |
| @property(nonatomic) NSTimeInterval videoMaximumDuration | Maximum video recording duration; defaults to 10 minutes |
| @property(nonatomic) UIImagePickerControllerQualityType videoQuality | Video quality, an enum: UIImagePickerControllerQualityTypeHigh, UIImagePickerControllerQualityTypeMedium, UIImagePickerControllerQualityTypeLow, UIImagePickerControllerQualityType640x480, UIImagePickerControllerQualityTypeIFrame1280x720, UIImagePickerControllerQualityTypeIFrame960x540 |
| @property(nonatomic) BOOL showsCameraControls | Whether to show the camera control panel; defaults to YES |
| @property(nonatomic,retain) UIView *cameraOverlayView | A view overlaid on the camera, usable to customize the photo or video UI |
| @property(nonatomic) CGAffineTransform cameraViewTransform | A transform applied to the camera preview |
| @property(nonatomic) UIImagePickerControllerCameraCaptureMode cameraCaptureMode | Camera capture mode, an enum: UIImagePickerControllerCameraCaptureModePhoto, UIImagePickerControllerCameraCaptureModeVideo |
| @property(nonatomic) UIImagePickerControllerCameraDevice cameraDevice | Camera device, an enum: UIImagePickerControllerCameraDeviceRear, UIImagePickerControllerCameraDeviceFront |
| @property(nonatomic) UIImagePickerControllerCameraFlashMode cameraFlashMode | Flash mode, an enum: UIImagePickerControllerCameraFlashModeOff, UIImagePickerControllerCameraFlashModeAuto, UIImagePickerControllerCameraFlashModeOn |

| Class method | Description |
| --- | --- |
| + (BOOL)isSourceTypeAvailable:(UIImagePickerControllerSourceType)sourceType | Whether the given source type is available; sourceType is an enum: UIImagePickerControllerSourceTypePhotoLibrary, UIImagePickerControllerSourceTypeCamera, UIImagePickerControllerSourceTypeSavedPhotosAlbum |
| + (NSArray *)availableMediaTypesForSourceType:(UIImagePickerControllerSourceType)sourceType | The media types available on the given source, generally just images and video |
| + (BOOL)isCameraDeviceAvailable:(UIImagePickerControllerCameraDevice)cameraDevice | Whether the given camera is available; cameraDevice is an enum: UIImagePickerControllerCameraDeviceRear, UIImagePickerControllerCameraDeviceFront |
| + (BOOL)isFlashAvailableForCameraDevice:(UIImagePickerControllerCameraDevice)cameraDevice | Whether the given camera's flash is available |
| + (NSArray *)availableCaptureModesForCameraDevice:(UIImagePickerControllerCameraDevice)cameraDevice | The capture modes available on the given camera, an enum: UIImagePickerControllerCameraCaptureModePhoto, UIImagePickerControllerCameraCaptureModeVideo |

| Instance method | Description |
| --- | --- |
| - (void)takePicture | Takes a photo programmatically |
| - (BOOL)startVideoCapture | Starts video recording programmatically |
| - (void)stopVideoCapture | Stops video recording programmatically |

| Delegate method | Description |
| --- | --- |
| - (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info | Media picking finished |
| - (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker | Picking was cancelled |

| Helper function (mainly for saving photos/videos to the album) | Description |
| --- | --- |
| UIImageWriteToSavedPhotosAlbum() | Saves a photo to the album |
| UIVideoAtPathIsCompatibleWithSavedPhotosAlbum() | Whether a video can be saved to the album |
| UISaveVideoAtPathToSavedPhotosAlbum() | Saves a video to the album |
Taking photos or recording video with UIImagePickerController generally involves the following steps:

- Create a UIImagePickerController object.
- Set the source type. When picking photos the source is usually the photo library or an album; here it must be the camera.
- Choose the camera: front or rear.
- Set the mediaTypes. This is required for video recording but can be skipped for photos, since mediaTypes contains kUTTypeImage by default (note that the media type constants are defined in MobileCoreServices.framework).
- Set the capture mode: photo or video. (For video recording, set the media types before setting the capture mode.)
- Present the UIImagePickerController (usually modally).
- When the photo or video is done, display or save it in the delegate methods.

Many details along the way can be configured, such as whether to show the camera control panel or allow editing after shooting; the property/method list above should make these easy to understand. The following example shows how to use UIImagePickerController to take photos and record video: set _isVideo to YES for video-recording mode (the recording plays back automatically in the main view controller afterwards), or to NO for photo mode (the photo taken is displayed in the main view controller):
```objc
//
//  ViewController.m
//  UIImagePickerController
//
//  Created by Kenshin Cui on 14/04/05.
//  Copyright (c) 2014 cmjstudio. All rights reserved.
//

#import "ViewController.h"
#import <MobileCoreServices/MobileCoreServices.h>
#import <AVFoundation/AVFoundation.h>

@interface ViewController () <UIImagePickerControllerDelegate, UINavigationControllerDelegate>

@property (assign,nonatomic) BOOL isVideo; // whether to record video: YES records video, NO takes photos
@property (strong,nonatomic) UIImagePickerController *imagePicker;
@property (weak, nonatomic) IBOutlet UIImageView *photo; // photo display view
@property (strong ,nonatomic) AVPlayer *player; // player, used to play back the video after recording

@end

@implementation ViewController

#pragma mark - View lifecycle
- (void)viewDidLoad {
    [super viewDidLoad];
    // choose here whether the app takes photos or records video
    _isVideo = YES;
}

#pragma mark - UI events
// take button tapped
- (IBAction)takeClick:(UIButton *)sender {
    [self presentViewController:self.imagePicker animated:YES completion:nil];
}

#pragma mark - UIImagePickerController delegate methods
// picking finished
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    if ([mediaType isEqualToString:(NSString *)kUTTypeImage]) { // a photo was taken
        UIImage *image;
        // take the edited photo if editing is allowed, otherwise the original photo
        if (self.imagePicker.allowsEditing) {
            image = [info objectForKey:UIImagePickerControllerEditedImage]; // the edited photo
        } else {
            image = [info objectForKey:UIImagePickerControllerOriginalImage]; // the original photo
        }
        [self.photo setImage:image]; // display the photo
        UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil); // save to the album
    } else if ([mediaType isEqualToString:(NSString *)kUTTypeMovie]) { // a video was recorded
        NSLog(@"video...");
        NSURL *url = [info objectForKey:UIImagePickerControllerMediaURL]; // video path
        NSString *urlStr = [url path];
        if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(urlStr)) {
            // save the video to the album; ALAssetsLibrary could also be used for this
            UISaveVideoAtPathToSavedPhotosAlbum(urlStr, self, @selector(video:didFinishSavingWithError:contextInfo:), nil);
        }
    }
    [self dismissViewControllerAnimated:YES completion:nil];
}

- (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker {
    NSLog(@"Cancelled.");
}

#pragma mark - Private methods
- (UIImagePickerController *)imagePicker {
    if (!_imagePicker) {
        _imagePicker = [[UIImagePickerController alloc] init];
        _imagePicker.sourceType = UIImagePickerControllerSourceTypeCamera; // use the camera as the picker source
        _imagePicker.cameraDevice = UIImagePickerControllerCameraDeviceRear; // use the rear camera
        if (self.isVideo) {
            _imagePicker.mediaTypes = @[(NSString *)kUTTypeMovie];
            _imagePicker.videoQuality = UIImagePickerControllerQualityTypeIFrame1280x720;
            _imagePicker.cameraCaptureMode = UIImagePickerControllerCameraCaptureModeVideo; // capture mode: video
        } else {
            _imagePicker.cameraCaptureMode = UIImagePickerControllerCameraCaptureModePhoto; // capture mode: photo
        }
        _imagePicker.allowsEditing = YES; // allow editing
        _imagePicker.delegate = self; // set the delegate to observe picker actions
    }
    return _imagePicker;
}

// callback after the video has been saved
- (void)video:(NSString *)videoPath didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo {
    if (error) {
        NSLog(@"Error saving video: %@", error.localizedDescription);
    } else {
        NSLog(@"Video saved successfully.");
        // play the video automatically after recording
        NSURL *url = [NSURL fileURLWithPath:videoPath];
        _player = [AVPlayer playerWithURL:url];
        AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:_player];
        playerLayer.frame = self.photo.frame;
        [self.photo.layer addSublayer:playerLayer];
        [_player play];
    }
}

@end
```
Running result (video recording):
Taking photos and recording video with AVFoundation
UIImagePickerController is admittedly powerful, but like MPMoviePlayerController, its high-level encapsulation makes certain customization work complicated: building a camera UI comparable to a "beauty camera" app, for example, is quite hard with it. In that case consider AVFoundation. AVFoundation provides many ready-made players and recorders, but it also exposes much lower-level building blocks: a set of classes that talk to the underlying input and output devices. With these, the developer no longer faces the packaged audio player (AVAudioPlayer), recorder (AVAudioRecorder), or video (and audio) player (AVPlayer), but input devices (microphone, camera) and outputs (images, video) directly. First, the classes involved in photo and video capture with AVFoundation:
AVCaptureSession: the media (audio/video) capture session, responsible for moving captured audio/video data to the output objects. One AVCaptureSession can have multiple inputs and outputs.

AVCaptureDevice: an input device such as the microphone or camera; through this object the physical device's properties (e.g. camera focus, white balance) can be configured.

AVCaptureDeviceInput: the device input data management object; it is created from an AVCaptureDevice and added to the AVCaptureSession, which manages it.

AVCaptureOutput: the output data management object, which receives the various kinds of output data; in practice one of its subclasses is used, such as AVCaptureAudioDataOutput, AVCaptureStillImageOutput, AVCaptureVideoDataOutput, or AVCaptureFileOutput, added to the AVCaptureSession.

AVCaptureVideoPreviewLayer: the camera preview layer, a subclass of CALayer used to view the capture feed in real time; it is created with the corresponding AVCaptureSession.
The general steps for taking photos and recording video with AVFoundation are:

- Create an AVCaptureSession object.
- Use AVCaptureDevice's class methods to obtain the devices needed: the camera for photos and video, the microphone for audio recording.
- Initialize an AVCaptureDeviceInput object from the input AVCaptureDevice.
- Initialize the output data management object: an AVCaptureStillImageOutput for photos, or an AVCaptureMovieFileOutput for video recording.
- Add the AVCaptureDeviceInput input object and the AVCaptureOutput output object to the AVCaptureSession.
- Create an AVCaptureVideoPreviewLayer with the session, add the layer to a display container, and call AVCaptureSession's startRunning method to begin capturing.
- Write the captured audio or video data to the destination file.
Taking photos
The following shows how to implement a photo app with AVFoundation, with camera preview, front/back camera switching, flash settings, tap-to-focus, and saving photos to the album. The app looks roughly like this:

Define the session, input, and output objects in the program:
```objc
@interface ViewController ()

@property (strong,nonatomic) AVCaptureSession *captureSession; // passes data between the input and output devices
@property (strong,nonatomic) AVCaptureDeviceInput *captureDeviceInput; // obtains input data from the AVCaptureDevice
@property (strong,nonatomic) AVCaptureStillImageOutput *captureStillImageOutput; // still photo output
@property (strong,nonatomic) AVCaptureVideoPreviewLayer *captureVideoPreviewLayer; // camera preview layer
@property (weak, nonatomic) IBOutlet UIView *viewContainer;
@property (weak, nonatomic) IBOutlet UIButton *takeButton; // shutter button
@property (weak, nonatomic) IBOutlet UIButton *flashAutoButton; // auto-flash button
@property (weak, nonatomic) IBOutlet UIButton *flashOnButton; // flash-on button
@property (weak, nonatomic) IBOutlet UIButton *flashOffButton; // flash-off button
@property (weak, nonatomic) IBOutlet UIImageView *focusCursor; // focus cursor

@end
```
When the controller's view is about to appear, create and initialize the session, camera device, input, output, and preview layer, and add the preview layer to the view. Some other initialization also happens here, for example adding a tap gesture (tap the screen to focus) and setting up the UI.
```objc
- (void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];
    // initialize the session
    _captureSession = [[AVCaptureSession alloc] init];
    if ([_captureSession canSetSessionPreset:AVCaptureSessionPreset1280x720]) { // set the resolution
        _captureSession.sessionPreset = AVCaptureSessionPreset1280x720;
    }
    // obtain the input device
    AVCaptureDevice *captureDevice = [self getCameraDeviceWithPosition:AVCaptureDevicePositionBack]; // get the back camera
    if (!captureDevice) {
        NSLog(@"Problem obtaining the back camera.");
        return;
    }
    NSError *error = nil;
    // initialize the device input object from the input device, to obtain input data
    _captureDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:captureDevice error:&error];
    if (error) {
        NSLog(@"Error obtaining the device input object: %@", error.localizedDescription);
        return;
    }
    // initialize the device output object, to obtain output data
    _captureStillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = @{AVVideoCodecKey: AVVideoCodecJPEG};
    [_captureStillImageOutput setOutputSettings:outputSettings]; // output settings
    // add the device input to the session
    if ([_captureSession canAddInput:_captureDeviceInput]) {
        [_captureSession addInput:_captureDeviceInput];
    }
    // add the device output to the session
    if ([_captureSession canAddOutput:_captureStillImageOutput]) {
        [_captureSession addOutput:_captureStillImageOutput];
    }
    // create the video preview layer to show the camera feed in real time
    _captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.captureSession];
    CALayer *layer = self.viewContainer.layer;
    layer.masksToBounds = YES;
    _captureVideoPreviewLayer.frame = layer.bounds;
    _captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill; // fill mode
    // add the video preview layer to the view
    //[layer addSublayer:_captureVideoPreviewLayer];
    [layer insertSublayer:_captureVideoPreviewLayer below:self.focusCursor.layer];
    [self addNotificationToCaptureDevice:captureDevice];
    [self addGenstureRecognizer];
    [self setFlashModeButtonStatus];
}
```
Start and stop the session when the controller's view appears and disappears.
```objc
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    [self.captureSession startRunning];
}

- (void)viewDidDisappear:(BOOL)animated {
    [super viewDidDisappear:animated];
    [self.captureSession stopRunning];
}
```
Define the flash on/off/auto functionality. Note that whether changing the flash, the white balance, or any other input device property, the configuration must be locked before the change and unlocked afterwards.
```objc
// PropertyChangeBlock is the controller's block type for changing device properties:
// typedef void(^PropertyChangeBlock)(AVCaptureDevice *captureDevice);
- (void)changeDeviceProperty:(PropertyChangeBlock)propertyChange {
    AVCaptureDevice *captureDevice = [self.captureDeviceInput device];
    NSError *error;
    // Note: always call lockForConfiguration: before changing device properties,
    // and call unlockForConfiguration afterwards
    if ([captureDevice lockForConfiguration:&error]) {
        propertyChange(captureDevice);
        [captureDevice unlockForConfiguration];
    } else {
        NSLog(@"Error setting device properties: %@", error.localizedDescription);
    }
}

- (void)setFlashMode:(AVCaptureFlashMode)flashMode {
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        if ([captureDevice isFlashModeSupported:flashMode]) {
            [captureDevice setFlashMode:flashMode];
        }
    }];
}
```
Define camera switching. Switching cameras means removing the existing input and adding a new input to the session; note that modifying a running session requires beginning a configuration first and committing it once done.
```objc
#pragma mark Toggle between front and back cameras
- (IBAction)toggleButtonClick:(UIButton *)sender {
    AVCaptureDevice *currentDevice = [self.captureDeviceInput device];
    AVCaptureDevicePosition currentPosition = [currentDevice position];
    [self removeNotificationFromCaptureDevice:currentDevice];
    AVCaptureDevice *toChangeDevice;
    AVCaptureDevicePosition toChangePosition = AVCaptureDevicePositionFront;
    if (currentPosition == AVCaptureDevicePositionUnspecified || currentPosition == AVCaptureDevicePositionFront) {
        toChangePosition = AVCaptureDevicePositionBack;
    }
    toChangeDevice = [self getCameraDeviceWithPosition:toChangePosition];
    [self addNotificationToCaptureDevice:toChangeDevice];
    // obtain the device input object to switch to
    AVCaptureDeviceInput *toChangeDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:toChangeDevice error:nil];
    // always begin configuration before changing the session configuration, and commit when done
    [self.captureSession beginConfiguration];
    // remove the existing input
    [self.captureSession removeInput:self.captureDeviceInput];
    // add the new input
    if ([self.captureSession canAddInput:toChangeDeviceInput]) {
        [self.captureSession addInput:toChangeDeviceInput];
        self.captureDeviceInput = toChangeDeviceInput;
    }
    // commit the session configuration
    [self.captureSession commitConfiguration];
    [self setFlashModeButtonStatus];
}
```
Add a tap gesture so that tapping the preview view sets the focus and exposure point of interest.
-(void)focusWithMode:(AVCaptureFocusMode)focusMode exposureMode:(AVCaptureExposureMode)exposureMode atPoint:(CGPoint)point{
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        if ([captureDevice isFocusModeSupported:focusMode]) {
            [captureDevice setFocusMode:focusMode];
        }
        if ([captureDevice isFocusPointOfInterestSupported]) {
            [captureDevice setFocusPointOfInterest:point];
        }
        if ([captureDevice isExposureModeSupported:exposureMode]) {
            [captureDevice setExposureMode:exposureMode];
        }
        if ([captureDevice isExposurePointOfInterestSupported]) {
            [captureDevice setExposurePointOfInterest:point];
        }
    }];
}

-(void)addGenstureRecognizer{
    UITapGestureRecognizer *tapGesture=[[UITapGestureRecognizer alloc]initWithTarget:self action:@selector(tapScreen:)];
    [self.viewContainer addGestureRecognizer:tapGesture];
}

-(void)tapScreen:(UITapGestureRecognizer *)tapGesture{
    CGPoint point= [tapGesture locationInView:self.viewContainer];
    //Convert UI coordinates to camera coordinates
    CGPoint cameraPoint= [self.captureVideoPreviewLayer captureDevicePointOfInterestForPoint:point];
    [self setFocusCursorWithPoint:point];
    [self focusWithMode:AVCaptureFocusModeAutoFocus exposureMode:AVCaptureExposureModeAutoExpose atPoint:cameraPoint];
}
Finally, define photo capture: taking a photo means obtaining a connection from the still-image output, reading the captured data through that connection, and saving it.
#pragma mark Take a photo
- (IBAction)takeButtonClick:(UIButton *)sender {
    //Get the connection from the still-image output
    AVCaptureConnection *captureConnection=[self.captureStillImageOutput connectionWithMediaType:AVMediaTypeVideo];
    //Capture the output data through the connection
    [self.captureStillImageOutput captureStillImageAsynchronouslyFromConnection:captureConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (imageDataSampleBuffer) {
            NSData *imageData=[AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            UIImage *image=[UIImage imageWithData:imageData];
            UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
//            ALAssetsLibrary *assetsLibrary=[[ALAssetsLibrary alloc]init];
//            [assetsLibrary writeImageToSavedPhotosAlbum:[image CGImage] orientation:(ALAssetOrientation)[image imageOrientation] completionBlock:nil];
        }
    }];
}
The complete code follows:
//
//  ViewController.m
//  AVFoundationCamera
//
//  Created by Kenshin Cui on 14/04/05.
//  Copyright (c) 2014 cmjstudio. All rights reserved.
//

#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>
#import <AssetsLibrary/AssetsLibrary.h>

typedef void(^PropertyChangeBlock)(AVCaptureDevice *captureDevice);

@interface ViewController ()

@property (strong,nonatomic) AVCaptureSession *captureSession;//Coordinates data flow between the input and output devices
@property (strong,nonatomic) AVCaptureDeviceInput *captureDeviceInput;//Obtains input data from the AVCaptureDevice
@property (strong,nonatomic) AVCaptureStillImageOutput *captureStillImageOutput;//Still-image output
@property (strong,nonatomic) AVCaptureVideoPreviewLayer *captureVideoPreviewLayer;//Camera preview layer
@property (weak, nonatomic) IBOutlet UIView *viewContainer;
@property (weak, nonatomic) IBOutlet UIButton *takeButton;//Shutter button
@property (weak, nonatomic) IBOutlet UIButton *flashAutoButton;//Auto-flash button
@property (weak, nonatomic) IBOutlet UIButton *flashOnButton;//Flash-on button
@property (weak, nonatomic) IBOutlet UIButton *flashOffButton;//Flash-off button
@property (weak, nonatomic) IBOutlet UIImageView *focusCursor;//Focus cursor

@end

@implementation ViewController

#pragma mark - View lifecycle
- (void)viewDidLoad {
    [super viewDidLoad];
}

-(void)viewWillAppear:(BOOL)animated{
    [super viewWillAppear:animated];
    //Initialize the session
    _captureSession=[[AVCaptureSession alloc]init];
    if ([_captureSession canSetSessionPreset:AVCaptureSessionPreset1280x720]) {//Set the resolution
        _captureSession.sessionPreset=AVCaptureSessionPreset1280x720;
    }
    //Obtain the input device (back camera)
    AVCaptureDevice *captureDevice=[self getCameraDeviceWithPosition:AVCaptureDevicePositionBack];
    if (!captureDevice) {
        NSLog(@"Could not obtain the back camera.");
        return;
    }
    NSError *error=nil;
    //Create the device input object used to obtain input data
    _captureDeviceInput=[[AVCaptureDeviceInput alloc]initWithDevice:captureDevice error:&error];
    if (error) {
        NSLog(@"Error creating the device input: %@",error.localizedDescription);
        return;
    }
    //Create the output object used to obtain output data
    _captureStillImageOutput=[[AVCaptureStillImageOutput alloc]init];
    NSDictionary *outputSettings = @{AVVideoCodecKey:AVVideoCodecJPEG};
    [_captureStillImageOutput setOutputSettings:outputSettings];//Output settings
    //Add the device input to the session
    if ([_captureSession canAddInput:_captureDeviceInput]) {
        [_captureSession addInput:_captureDeviceInput];
    }
    //Add the device output to the session
    if ([_captureSession canAddOutput:_captureStillImageOutput]) {
        [_captureSession addOutput:_captureStillImageOutput];
    }
    //Create the preview layer used to show the live camera feed
    _captureVideoPreviewLayer=[[AVCaptureVideoPreviewLayer alloc]initWithSession:self.captureSession];
    CALayer *layer=self.viewContainer.layer;
    layer.masksToBounds=YES;
    _captureVideoPreviewLayer.frame=layer.bounds;
    _captureVideoPreviewLayer.videoGravity=AVLayerVideoGravityResizeAspectFill;//Fill mode
    //Add the preview layer to the view hierarchy
    //[layer addSublayer:_captureVideoPreviewLayer];
    [layer insertSublayer:_captureVideoPreviewLayer below:self.focusCursor.layer];
    [self addNotificationToCaptureDevice:captureDevice];
    [self addGenstureRecognizer];
    [self setFlashModeButtonStatus];
}

-(void)viewDidAppear:(BOOL)animated{
    [super viewDidAppear:animated];
    [self.captureSession startRunning];
}

-(void)viewDidDisappear:(BOOL)animated{
    [super viewDidDisappear:animated];
    [self.captureSession stopRunning];
}

-(void)dealloc{
    [self removeNotification];
}

#pragma mark - UI methods
#pragma mark Take a photo
- (IBAction)takeButtonClick:(UIButton *)sender {
    //Get the connection from the still-image output
    AVCaptureConnection *captureConnection=[self.captureStillImageOutput connectionWithMediaType:AVMediaTypeVideo];
    //Capture the output data through the connection
    [self.captureStillImageOutput captureStillImageAsynchronouslyFromConnection:captureConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (imageDataSampleBuffer) {
            NSData *imageData=[AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            UIImage *image=[UIImage imageWithData:imageData];
            UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
//            ALAssetsLibrary *assetsLibrary=[[ALAssetsLibrary alloc]init];
//            [assetsLibrary writeImageToSavedPhotosAlbum:[image CGImage] orientation:(ALAssetOrientation)[image imageOrientation] completionBlock:nil];
        }
    }];
}

#pragma mark Toggle between front and back cameras
- (IBAction)toggleButtonClick:(UIButton *)sender {
    AVCaptureDevice *currentDevice=[self.captureDeviceInput device];
    AVCaptureDevicePosition currentPosition=[currentDevice position];
    [self removeNotificationFromCaptureDevice:currentDevice];
    AVCaptureDevice *toChangeDevice;
    AVCaptureDevicePosition toChangePosition=AVCaptureDevicePositionFront;
    if (currentPosition==AVCaptureDevicePositionUnspecified||currentPosition==AVCaptureDevicePositionFront) {
        toChangePosition=AVCaptureDevicePositionBack;
    }
    toChangeDevice=[self getCameraDeviceWithPosition:toChangePosition];
    [self addNotificationToCaptureDevice:toChangeDevice];
    //Create the input object for the new device
    AVCaptureDeviceInput *toChangeDeviceInput=[[AVCaptureDeviceInput alloc]initWithDevice:toChangeDevice error:nil];
    //Always call beginConfiguration before changing the session configuration, and commit the change when done
    [self.captureSession beginConfiguration];
    //Remove the existing input
    [self.captureSession removeInput:self.captureDeviceInput];
    //Add the new input
    if ([self.captureSession canAddInput:toChangeDeviceInput]) {
        [self.captureSession addInput:toChangeDeviceInput];
        self.captureDeviceInput=toChangeDeviceInput;
    }
    //Commit the session configuration
    [self.captureSession commitConfiguration];
    [self setFlashModeButtonStatus];
}

#pragma mark Auto flash
- (IBAction)flashAutoClick:(UIButton *)sender {
    [self setFlashMode:AVCaptureFlashModeAuto];
    [self setFlashModeButtonStatus];
}

#pragma mark Flash on
- (IBAction)flashOnClick:(UIButton *)sender {
    [self setFlashMode:AVCaptureFlashModeOn];
    [self setFlashModeButtonStatus];
}

#pragma mark Flash off
- (IBAction)flashOffClick:(UIButton *)sender {
    [self setFlashMode:AVCaptureFlashModeOff];
    [self setFlashModeButtonStatus];
}

#pragma mark - Notifications
-(void)addNotificationToCaptureDevice:(AVCaptureDevice *)captureDevice{
    //Note: subject-area-change monitoring must be enabled on the device before this notification is delivered
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        captureDevice.subjectAreaChangeMonitoringEnabled=YES;
    }];
    NSNotificationCenter *notificationCenter= [NSNotificationCenter defaultCenter];
    //Subject area changed
    [notificationCenter addObserver:self selector:@selector(areaChange:) name:AVCaptureDeviceSubjectAreaDidChangeNotification object:captureDevice];
}

-(void)removeNotificationFromCaptureDevice:(AVCaptureDevice *)captureDevice{
    NSNotificationCenter *notificationCenter= [NSNotificationCenter defaultCenter];
    [notificationCenter removeObserver:self name:AVCaptureDeviceSubjectAreaDidChangeNotification object:captureDevice];
}

-(void)removeNotification{
    NSNotificationCenter *notificationCenter= [NSNotificationCenter defaultCenter];
    [notificationCenter removeObserver:self];
}

-(void)addNotificationToCaptureSession:(AVCaptureSession *)captureSession{
    NSNotificationCenter *notificationCenter= [NSNotificationCenter defaultCenter];
    //Session runtime error
    [notificationCenter addObserver:self selector:@selector(sessionRuntimeError:) name:AVCaptureSessionRuntimeErrorNotification object:captureSession];
}

-(void)deviceConnected:(NSNotification *)notification{
    NSLog(@"Device connected...");
}

-(void)deviceDisconnected:(NSNotification *)notification{
    NSLog(@"Device disconnected.");
}

-(void)areaChange:(NSNotification *)notification{
    NSLog(@"Capture area changed...");
}

-(void)sessionRuntimeError:(NSNotification *)notification{
    NSLog(@"Session runtime error.");
}

#pragma mark - Private methods
-(AVCaptureDevice *)getCameraDeviceWithPosition:(AVCaptureDevicePosition)position{
    NSArray *cameras= [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *camera in cameras) {
        if ([camera position]==position) {
            return camera;
        }
    }
    return nil;
}

-(void)changeDeviceProperty:(PropertyChangeBlock)propertyChange{
    AVCaptureDevice *captureDevice= [self.captureDeviceInput device];
    NSError *error;
    //Always call lockForConfiguration: before changing a device property, and call unlockForConfiguration when done
    if ([captureDevice lockForConfiguration:&error]) {
        propertyChange(captureDevice);
        [captureDevice unlockForConfiguration];
    }else{
        NSLog(@"Error setting device property: %@",error.localizedDescription);
    }
}

-(void)setFlashMode:(AVCaptureFlashMode)flashMode{
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        if ([captureDevice isFlashModeSupported:flashMode]) {
            [captureDevice setFlashMode:flashMode];
        }
    }];
}

-(void)setFocusMode:(AVCaptureFocusMode)focusMode{
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        if ([captureDevice isFocusModeSupported:focusMode]) {
            [captureDevice setFocusMode:focusMode];
        }
    }];
}

-(void)setExposureMode:(AVCaptureExposureMode)exposureMode{
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        if ([captureDevice isExposureModeSupported:exposureMode]) {
            [captureDevice setExposureMode:exposureMode];
        }
    }];
}

-(void)focusWithMode:(AVCaptureFocusMode)focusMode exposureMode:(AVCaptureExposureMode)exposureMode atPoint:(CGPoint)point{
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        if ([captureDevice isFocusModeSupported:focusMode]) {
            [captureDevice setFocusMode:focusMode];
        }
        if ([captureDevice isFocusPointOfInterestSupported]) {
            [captureDevice setFocusPointOfInterest:point];
        }
        if ([captureDevice isExposureModeSupported:exposureMode]) {
            [captureDevice setExposureMode:exposureMode];
        }
        if ([captureDevice isExposurePointOfInterestSupported]) {
            [captureDevice setExposurePointOfInterest:point];
        }
    }];
}

-(void)addGenstureRecognizer{
    UITapGestureRecognizer *tapGesture=[[UITapGestureRecognizer alloc]initWithTarget:self action:@selector(tapScreen:)];
    [self.viewContainer addGestureRecognizer:tapGesture];
}

-(void)tapScreen:(UITapGestureRecognizer *)tapGesture{
    CGPoint point= [tapGesture locationInView:self.viewContainer];
    //Convert UI coordinates to camera coordinates
    CGPoint cameraPoint= [self.captureVideoPreviewLayer captureDevicePointOfInterestForPoint:point];
    [self setFocusCursorWithPoint:point];
    [self focusWithMode:AVCaptureFocusModeAutoFocus exposureMode:AVCaptureExposureModeAutoExpose atPoint:cameraPoint];
}

-(void)setFlashModeButtonStatus{
    AVCaptureDevice *captureDevice=[self.captureDeviceInput device];
    AVCaptureFlashMode flashMode=captureDevice.flashMode;
    if([captureDevice isFlashAvailable]){
        self.flashAutoButton.hidden=NO;
        self.flashOnButton.hidden=NO;
        self.flashOffButton.hidden=NO;
        self.flashAutoButton.enabled=YES;
        self.flashOnButton.enabled=YES;
        self.flashOffButton.enabled=YES;
        switch (flashMode) {
            case AVCaptureFlashModeAuto:
                self.flashAutoButton.enabled=NO;
                break;
            case AVCaptureFlashModeOn:
                self.flashOnButton.enabled=NO;
                break;
            case AVCaptureFlashModeOff:
                self.flashOffButton.enabled=NO;
                break;
            default:
                break;
        }
    }else{
        self.flashAutoButton.hidden=YES;
        self.flashOnButton.hidden=YES;
        self.flashOffButton.hidden=YES;
    }
}

-(void)setFocusCursorWithPoint:(CGPoint)point{
    self.focusCursor.center=point;
    self.focusCursor.transform=CGAffineTransformMakeScale(1.5, 1.5);
    self.focusCursor.alpha=1.0;
    [UIView animateWithDuration:1.0 animations:^{
        self.focusCursor.transform=CGAffineTransformIdentity;
    } completion:^(BOOL finished) {
        self.focusCursor.alpha=0;
    }];
}

@end