Playing Video in iOS: MPMoviePlayerController

MPMoviePlayerController

To play video in iOS you can use the MPMoviePlayerController class in MediaPlayer.framework, which supports both local and network video playback. The class conforms to the MPMediaPlayback protocol, so it offers the usual player controls such as play, pause and stop. However, MPMoviePlayerController is not itself a complete view controller: to show video in the UI you must add its view property to your interface. The commonly used properties and methods of MPMoviePlayerController are listed below:

Property

Description

@property (nonatomic, copy) NSURL *contentURL

The URL of the media to play; it may be a local file path or a network URL.

@property (nonatomic, readonly) UIView *view

The player view; it must be added to the view hierarchy for the video to be visible.

@property (nonatomic, readonly) UIView *backgroundView

The player's background view.

@property (nonatomic, readonly) MPMoviePlaybackState playbackState

The current playback state, an enum:

MPMoviePlaybackStateStopped: stopped

MPMoviePlaybackStatePlaying: playing

MPMoviePlaybackStatePaused: paused

MPMoviePlaybackStateInterrupted: interrupted

MPMoviePlaybackStateSeekingForward: seeking forward

MPMoviePlaybackStateSeekingBackward: seeking backward

@property (nonatomic, readonly) MPMovieLoadState loadState

The network media load state, an enum:

MPMovieLoadStateUnknown: unknown

MPMovieLoadStatePlayable: enough data has been buffered to begin playback

MPMovieLoadStatePlaythroughOK: playback should continue to the end without stalling; in this state the movie starts automatically if shouldAutoplay is YES

MPMovieLoadStateStalled: stalled

@property (nonatomic) MPMovieControlStyle controlStyle

The control panel style, an enum:

MPMovieControlStyleNone: no controls

MPMovieControlStyleEmbedded: controls for an embedded view

MPMovieControlStyleFullscreen: fullscreen controls

MPMovieControlStyleDefault: the default style

@property (nonatomic) MPMovieRepeatMode repeatMode;

The repeat mode, an enum:

MPMovieRepeatModeNone: no repeat (the default)

MPMovieRepeatModeOne: repeat the current movie

@property (nonatomic) BOOL shouldAutoplay

Whether playback starts automatically once enough network media has been buffered; defaults to YES.

@property (nonatomic, getter=isFullscreen) BOOL fullscreen

Whether the movie is shown fullscreen; defaults to NO. Note that setting fullscreen through this property only takes effect after the view has been displayed; otherwise it does nothing.
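The timing caveat above can be sketched as follows (a minimal, hypothetical view controller, assuming a moviePlayer property configured elsewhere):

```objectivec
// Assumes self.moviePlayer is an MPMoviePlayerController whose view
// has already been added to the view hierarchy.
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    // Setting fullscreen in viewDidLoad would have no effect because the
    // view is not on screen yet; viewDidAppear: is late enough.
    [self.moviePlayer setFullscreen:YES animated:YES];
}
```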

@property (nonatomic) MPMovieScalingMode scalingMode

The video scaling/fill mode, an enum:

MPMovieScalingModeNone: no scaling

MPMovieScalingModeAspectFit: scale preserving the aspect ratio so the whole video fits; nothing is cropped

MPMovieScalingModeAspectFill: scale preserving the aspect ratio to fill the view; the video may be cropped

MPMovieScalingModeFill: stretch to fill the view without preserving the aspect ratio; nothing is cropped but the proportions are distorted

@property (nonatomic, readonly) BOOL readyForDisplay

Whether the first video frame is ready for display.

@property (nonatomic, readonly) MPMovieMediaTypeMask movieMediaTypes

The media types, an enum:

MPMovieMediaTypeMaskNone: unknown

MPMovieMediaTypeMaskVideo: video

MPMovieMediaTypeMaskAudio: audio

@property (nonatomic) MPMovieSourceType movieSourceType

The media source, an enum:

MPMovieSourceTypeUnknown: unknown source

MPMovieSourceTypeFile: local file

MPMovieSourceTypeStreaming: streaming media (live or on demand)

@property (nonatomic, readonly) NSTimeInterval duration

The media duration; returns 0 if unknown.

@property (nonatomic, readonly) NSTimeInterval playableDuration

The playable duration; for network media this mainly indicates how much video has been downloaded so far.

@property (nonatomic, readonly) CGSize naturalSize

The natural size of the video; returns CGSizeZero if unknown.

@property (nonatomic) NSTimeInterval initialPlaybackTime

The time at which playback starts.

@property (nonatomic) NSTimeInterval endPlaybackTime

The time at which playback ends.

@property (nonatomic) BOOL allowsAirPlay

Whether AirPlay is allowed; defaults to YES.

@property (nonatomic, readonly, getter=isAirPlayVideoActive) BOOL airPlayVideoActive

Whether the current media is being played via AirPlay.

@property (nonatomic, readonly) BOOL isPreparedToPlay

Whether the player is prepared to play.

@property (nonatomic) NSTimeInterval currentPlaybackTime

The current playback time, in seconds.

@property (nonatomic) float currentPlaybackRate

The current playback rate: 0 when paused, 1.0 for normal speed; other non-zero values are speed multipliers.
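For instance, currentPlaybackRate can drive variable-speed playback; a sketch (assuming a configured moviePlayer property):

```objectivec
// Any non-zero rate is a multiplier of normal speed.
self.moviePlayer.currentPlaybackRate = 2.0f; // double speed
// Setting the rate to 0 is equivalent to pausing;
// assigning 1.0 again resumes normal-speed playback.
self.moviePlayer.currentPlaybackRate = 1.0f;
```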

Instance method

Description

- (instancetype)initWithContentURL:(NSURL *)url

Initializes the movie player controller with the given URL.

- (void)setFullscreen:(BOOL)fullscreen animated:(BOOL)animated

Enters or leaves fullscreen. Note that this only takes effect after the view has been displayed; otherwise it does nothing.

- (void)requestThumbnailImagesAtTimes:(NSArray *)playbackTimes timeOption:(MPMovieTimeOption)option

Requests thumbnails of the video at the given playback times. The first parameter is an array of time points; the second is the time precision, an enum:

MPMovieTimeOptionNearestKeyFrame: the key frame nearest the requested time

MPMovieTimeOptionExact: the exact time

- (void)cancelAllThumbnailImageRequests

Cancels all pending thumbnail requests.

- (void)prepareToPlay

Prepares playback by loading video data into the buffer; if the player is not prepared when play is called, this method is invoked automatically.

- (void)play

Starts playback.

- (void)pause

Pauses playback.

- (void)stop

Stops playback.

- (void)beginSeekingForward

Begins seeking forward.

- (void)beginSeekingBackward

Begins seeking backward.

- (void)endSeeking

Stops seeking forward/backward.

Notification

Description

MPMoviePlayerScalingModeDidChangeNotification

The scaling mode changed.

MPMoviePlayerPlaybackDidFinishNotification

Playback finished or the user exited; the reason is available through the MPMoviePlayerPlaybackDidFinishReasonUserInfoKey key in the notification's userInfo.

MPMoviePlayerPlaybackStateDidChangeNotification

The playback state changed; read the playbackState property for the concrete state.

MPMoviePlayerLoadStateDidChangeNotification

The network media load state changed.

MPMoviePlayerNowPlayingMovieDidChangeNotification

The currently playing media changed.

MPMoviePlayerWillEnterFullscreenNotification

The player is about to enter fullscreen.

MPMoviePlayerDidEnterFullscreenNotification

The player entered fullscreen.

MPMoviePlayerWillExitFullscreenNotification

The player is about to exit fullscreen.

MPMoviePlayerDidExitFullscreenNotification

The player exited fullscreen.

MPMoviePlayerIsAirPlayVideoActiveDidChangeNotification

AirPlay playback started or ended.

MPMoviePlayerReadyForDisplayDidChangeNotification

The display readiness of the video changed.

MPMovieMediaTypesAvailableNotification

The available media types were determined.

MPMovieSourceTypeAvailableNotification

The media source type was determined.

MPMovieDurationAvailableNotification

The media duration was determined.

MPMovieNaturalSizeAvailableNotification

The natural size of the media was determined.

MPMoviePlayerThumbnailImageRequestDidFinishNotification

A thumbnail request finished.

MPMediaPlaybackIsPreparedToPlayDidChangeNotification

The player's prepared-to-play state changed.

Note that MPMoviePlayerController does not communicate its state to the outside world through a delegate but through the notification center, hence the list of common notifications above. Because MPMoviePlayerController wraps media playback so thoroughly, using it is quite simple: create an MPMoviePlayerController object, set the frame of its view, and add the MPMoviePlayerController's view to your view controller's view. The following example creates a player controller and registers for the playback-state-changed and playback-finished notifications:

// ViewController.m
// MPMoviePlayerController

#import "ViewController.h"
#import <MediaPlayer/MediaPlayer.h>

@interface ViewController ()

@property (nonatomic, strong) MPMoviePlayerController *moviePlayer; // movie player controller

@end

@implementation ViewController

#pragma mark - View lifecycle
- (void)viewDidLoad {
    [super viewDidLoad];
    // start playback
    [self.moviePlayer play];
    // register for notifications
    [self addNotification];
}

- (void)dealloc {
    // remove all notification observers
    [[NSNotificationCenter defaultCenter] removeObserver:self];
}

#pragma mark - Private methods
/**
 *  Get the local file URL
 *  @return file URL
 */
- (NSURL *)getFileUrl {
    NSString *urlStr = [[NSBundle mainBundle] pathForResource:@"The New Look of OS X Yosemite.mp4" ofType:nil];
    NSURL *url = [NSURL fileURLWithPath:urlStr];
    return url;
}

/**
 *  Get the network file URL
 *  @return file URL
 */
- (NSURL *)getNetworkUrl {
    NSString *urlStr = @"http://192.168.1.161/The New Look of OS X Yosemite.mp4";
    urlStr = [urlStr stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
    NSURL *url = [NSURL URLWithString:urlStr];
    return url;
}

/**
 *  Lazily create the movie player controller
 *  @return movie player controller
 */
- (MPMoviePlayerController *)moviePlayer {
    if (!_moviePlayer) {
        NSURL *url = [self getNetworkUrl];
        _moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:url];
        _moviePlayer.view.frame = self.view.bounds;
        _moviePlayer.view.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight;
        [self.view addSubview:_moviePlayer.view];
    }
    return _moviePlayer;
}

/**
 *  Register notifications to monitor the player state
 */
- (void)addNotification {
    NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
    [notificationCenter addObserver:self selector:@selector(mediaPlayerPlaybackStateChange:) name:MPMoviePlayerPlaybackStateDidChangeNotification object:self.moviePlayer];
    [notificationCenter addObserver:self selector:@selector(mediaPlayerPlaybackFinished:) name:MPMoviePlayerPlaybackDidFinishNotification object:self.moviePlayer];
}

/**
 *  Playback state changed; note that the state is "paused" when playback finishes
 *  @param notification notification object
 */
- (void)mediaPlayerPlaybackStateChange:(NSNotification *)notification {
    switch (self.moviePlayer.playbackState) {
        case MPMoviePlaybackStatePlaying:
            NSLog(@"Playing...");
            break;
        case MPMoviePlaybackStatePaused:
            NSLog(@"Paused.");
            break;
        case MPMoviePlaybackStateStopped:
            NSLog(@"Stopped.");
            break;
        default:
            NSLog(@"Playback state: %li", (long)self.moviePlayer.playbackState);
            break;
    }
}

/**
 *  Playback finished
 *  @param notification notification object
 */
- (void)mediaPlayerPlaybackFinished:(NSNotification *)notification {
    NSLog(@"Playback finished. %li", (long)self.moviePlayer.playbackState);
}
@end

Running result:

As the API above shows, MPMoviePlayerController is quite powerful, and it is perfectly adequate as a general-purpose media player in day-to-day development. Beyond ordinary playback and control it offers some advanced features, such as capturing video thumbnails. To request thumbnails, call - (void)requestThumbnailImagesAtTimes:(NSArray *)playbackTimes timeOption:(MPMovieTimeOption)option with the time points you want, then observe MPMoviePlayerThumbnailImageRequestDidFinishNotification: the notification fires once per time point as each thumbnail request completes, and in the handler you can obtain a UIImage through the MPMoviePlayerThumbnailImageKey key. The following program captures thumbnails at two time points after launch and saves them to the photo album on success:

//
// ViewController.m
// MPMoviePlayerController
// Video thumbnails

#import "ViewController.h"
#import <MediaPlayer/MediaPlayer.h>

@interface ViewController ()
@property (nonatomic, strong) MPMoviePlayerController *moviePlayer; // movie player controller

@end

@implementation ViewController

#pragma mark - View lifecycle
- (void)viewDidLoad {
    [super viewDidLoad];
    // start playback
    [self.moviePlayer play];
    // register for notifications
    [self addNotification];
    // request thumbnails
    [self thumbnailImageRequest];
}

- (void)dealloc {
    // remove all notification observers
    [[NSNotificationCenter defaultCenter] removeObserver:self];
}

#pragma mark - Private methods
/**
 *  Get the local file URL
 *  @return file URL
 */
- (NSURL *)getFileUrl {
    NSString *urlStr = [[NSBundle mainBundle] pathForResource:@"The New Look of OS X Yosemite.mp4" ofType:nil];
    NSURL *url = [NSURL fileURLWithPath:urlStr];
    return url;
}

/**
 *  Get the network file URL
 *  @return file URL
 */
- (NSURL *)getNetworkUrl {
    NSString *urlStr = @"http://192.168.1.161/The New Look of OS X Yosemite.mp4";
    urlStr = [urlStr stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
    NSURL *url = [NSURL URLWithString:urlStr];
    return url;
}

/**
 *  Lazily create the movie player controller
 *  @return movie player controller
 */
- (MPMoviePlayerController *)moviePlayer {
    if (!_moviePlayer) {
        NSURL *url = [self getNetworkUrl];
        _moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:url];
        _moviePlayer.view.frame = self.view.bounds;
        _moviePlayer.view.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight;
        [self.view addSubview:_moviePlayer.view];
    }
    return _moviePlayer;
}

/**
 *  Request video thumbnails
 */
- (void)thumbnailImageRequest {
    // request thumbnails at 13.0 s and 21.5 s
    [self.moviePlayer requestThumbnailImagesAtTimes:@[@13.0, @21.5] timeOption:MPMovieTimeOptionNearestKeyFrame];
}

#pragma mark - Notifications
/**
 *  Register notifications to monitor the player state
 */
- (void)addNotification {
    NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
    [notificationCenter addObserver:self selector:@selector(mediaPlayerPlaybackStateChange:) name:MPMoviePlayerPlaybackStateDidChangeNotification object:self.moviePlayer];
    [notificationCenter addObserver:self selector:@selector(mediaPlayerPlaybackFinished:) name:MPMoviePlayerPlaybackDidFinishNotification object:self.moviePlayer];
    [notificationCenter addObserver:self selector:@selector(mediaPlayerThumbnailRequestFinished:) name:MPMoviePlayerThumbnailImageRequestDidFinishNotification object:self.moviePlayer];
}

/**
 *  Playback state changed; note that the state is "paused" when playback finishes
 *  @param notification notification object
 */
- (void)mediaPlayerPlaybackStateChange:(NSNotification *)notification {
    switch (self.moviePlayer.playbackState) {
        case MPMoviePlaybackStatePlaying:
            NSLog(@"Playing...");
            break;
        case MPMoviePlaybackStatePaused:
            NSLog(@"Paused.");
            break;
        case MPMoviePlaybackStateStopped:
            NSLog(@"Stopped.");
            break;
        default:
            NSLog(@"Playback state: %li", (long)self.moviePlayer.playbackState);
            break;
    }
}

/**
 *  Playback finished
 *  @param notification notification object
 */
- (void)mediaPlayerPlaybackFinished:(NSNotification *)notification {
    NSLog(@"Playback finished. %li", (long)self.moviePlayer.playbackState);
}

/**
 *  Thumbnail request finished; called once for each successful snapshot
 *  @param notification notification object
 */
- (void)mediaPlayerThumbnailRequestFinished:(NSNotification *)notification {
    NSLog(@"Video thumbnail captured.");
    UIImage *image = notification.userInfo[MPMoviePlayerThumbnailImageKey];
    // save the image to the album (the first call asks the user for album access)
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
}

@end

Thumbnail result:

Extension: generating thumbnails with AVFoundation

As the previous example shows, generating thumbnails with MPMoviePlayerController is easy enough, but if you only want thumbnails and have no intention of playing the video, MPMoviePlayerController is overkill. The AVAssetImageGenerator class in the AVFoundation framework can extract video thumbnails directly. Using AVAssetImageGenerator takes roughly three steps:

Create an AVURLAsset object (this class provides access to media information: video, audio, etc.).

Create an AVAssetImageGenerator from the AVURLAsset.

Call AVAssetImageGenerator's copyCGImageAtTime:actualTime:error: method to obtain a snapshot at the given time.

// AVAssetImageGenerator

#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>

@interface ViewController ()
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // take a thumbnail at 13.0 s
    [self thumbnailImageRequest:13.0];
}

#pragma mark - Private methods
/**
 *  Get the local file URL
 *  @return file URL
 */
- (NSURL *)getFileUrl {
    NSString *urlStr = [[NSBundle mainBundle] pathForResource:@"The New Look of OS X Yosemite.mp4" ofType:nil];
    NSURL *url = [NSURL fileURLWithPath:urlStr];
    return url;
}

/**
 *  Get the network file URL
 *  @return file URL
 */
- (NSURL *)getNetworkUrl {
    NSString *urlStr = @"http://192.168.1.161/The New Look of OS X Yosemite.mp4";
    urlStr = [urlStr stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
    NSURL *url = [NSURL URLWithString:urlStr];
    return url;
}

/**
 *  Take a video thumbnail at the given time
 *  @param timeBySecond time point in seconds
 */
- (void)thumbnailImageRequest:(CGFloat)timeBySecond {
    // create the URL
    NSURL *url = [self getNetworkUrl];
    // create an AVURLAsset from the URL
    AVURLAsset *urlAsset = [AVURLAsset assetWithURL:url];
    // create an AVAssetImageGenerator from the asset
    AVAssetImageGenerator *imageGenerator = [AVAssetImageGenerator assetImageGeneratorWithAsset:urlAsset];
    /* Take the snapshot.
     * time: the time the thumbnail was requested for
     * actualTime: the time the thumbnail was actually generated at
     */
    NSError *error = nil;
    // CMTime is a struct representing movie time: the first argument is the time
    // in seconds, the second is the timescale (frames per second). To address a
    // particular frame within a second, use CMTimeMake instead.
    CMTime time = CMTimeMakeWithSeconds(timeBySecond, 10);
    CMTime actualTime;
    CGImageRef cgImage = [imageGenerator copyCGImageAtTime:time actualTime:&actualTime error:&error];
    if (error) {
        NSLog(@"Error taking video thumbnail: %@", error.localizedDescription);
        return;
    }
    CMTimeShow(actualTime);
    UIImage *image = [UIImage imageWithCGImage:cgImage]; // convert to UIImage
    // save to the photo album
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
    CGImageRelease(cgImage);
}

@end

Generated thumbnail:

MPMoviePlayerViewController

Unless MPMoviePlayerController is embedded inside other content (for example a video inside a news article), it usually plays full screen, especially on iPhone and iPod touch. Since the typical pattern is to add MPMoviePlayerController's view as a subview of some other view controller, starting with iOS 3.2 Apple provides a ready-made view controller that holds an MPMoviePlayerController internally and presents it fullscreen by default, so developers can use it directly. That view controller is MPMoviePlayerViewController, a subclass of UIViewController. MPMoviePlayerViewController adds a moviePlayer property and an initializer that takes a URL, and it implements the behaviors expected of a modally presented player: fullscreen presentation by default, automatic playback on presentation, and automatic dismissal of the modal window when the "Done" button is tapped. The following example puts the player in a modal view controller rather than the main view controller, briefly demonstrating how MPMoviePlayerViewController is used.


// MPMoviePlayerViewController
// Using MPMoviePlayerViewController

#import "ViewController.h"
#import <MediaPlayer/MediaPlayer.h>

@interface ViewController ()
// player view controller
@property (nonatomic, strong) MPMoviePlayerViewController *moviePlayerViewController;
@end

@implementation ViewController

#pragma mark - View lifecycle
- (void)viewDidLoad {
    [super viewDidLoad];
}

- (void)dealloc {
    // remove all notification observers
    [[NSNotificationCenter defaultCenter] removeObserver:self];
}

#pragma mark - Private methods
/**
 *  Get the local file URL
 *  @return file URL
 */
- (NSURL *)getFileUrl {
    NSString *urlStr = [[NSBundle mainBundle] pathForResource:@"The New Look of OS X Yosemite.mp4" ofType:nil];
    NSURL *url = [NSURL fileURLWithPath:urlStr];
    return url;
}

/**
 *  Get the network file URL
 *  @return file URL
 */
- (NSURL *)getNetworkUrl {
    NSString *urlStr = @"http://192.168.1.161/The New Look of OS X Yosemite.mp4";
    urlStr = [urlStr stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
    NSURL *url = [NSURL URLWithString:urlStr];
    return url;
}

- (MPMoviePlayerViewController *)moviePlayerViewController {
    if (!_moviePlayerViewController) {
        NSURL *url = [self getNetworkUrl];
        _moviePlayerViewController = [[MPMoviePlayerViewController alloc] initWithContentURL:url];
        [self addNotification];
    }
    return _moviePlayerViewController;
}

#pragma mark - UI events
- (IBAction)playClick:(UIButton *)sender {
    // recreate the player view controller on every tap, otherwise a second
    // presentation will not play
    self.moviePlayerViewController = nil;
//    [self presentViewController:self.moviePlayerViewController animated:YES completion:nil];
    // Note: MPMoviePlayerViewController.h extends UIViewController with two methods
    // for modally presenting and dismissing an MPMoviePlayerViewController, adding
    // a slide-down presentation animation.
    [self presentMoviePlayerViewControllerAnimated:self.moviePlayerViewController];
}

#pragma mark - Notifications
/**
 *  Register notifications to monitor the player state
 */
- (void)addNotification {
    NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
    [notificationCenter addObserver:self selector:@selector(mediaPlayerPlaybackStateChange:) name:MPMoviePlayerPlaybackStateDidChangeNotification object:self.moviePlayerViewController.moviePlayer];
    [notificationCenter addObserver:self selector:@selector(mediaPlayerPlaybackFinished:) name:MPMoviePlayerPlaybackDidFinishNotification object:self.moviePlayerViewController.moviePlayer];
}

/**
 *  Playback state changed; note that the state is "paused" when playback finishes
 *  @param notification notification object
 */
- (void)mediaPlayerPlaybackStateChange:(NSNotification *)notification {
    switch (self.moviePlayerViewController.moviePlayer.playbackState) {
        case MPMoviePlaybackStatePlaying:
            NSLog(@"Playing...");
            break;
        case MPMoviePlaybackStatePaused:
            NSLog(@"Paused.");
            break;
        case MPMoviePlaybackStateStopped:
            NSLog(@"Stopped.");
            break;
        default:
            NSLog(@"Playback state: %li", (long)self.moviePlayerViewController.moviePlayer.playbackState);
            break;
    }
}

/**
 *  Playback finished
 *  @param notification notification object
 */
- (void)mediaPlayerPlaybackFinished:(NSNotification *)notification {
    NSLog(@"Playback finished. %li", (long)self.moviePlayerViewController.moviePlayer.playbackState);
}

@end

Running result:

One point worth stressing: MPMoviePlayerViewController's initializer does a lot of work (setting the URL, enabling autoplay, wiring up the Done-button handling, and so on), so when the play button is tapped again to present a new modal window, the new object cannot complete initialization unless the previous MPMoviePlayerViewController is destroyed first, and playback will not start again.

AVPlayer

MPMoviePlayerController is powerful enough that you can build a player with almost no code, but precisely because of its heavy encapsulation, customizing it is complex or even impossible. If you need a custom player UI, MPMoviePlayerController is the wrong tool; for full control over video playback, use AVPlayer. AVPlayer lives in AVFoundation, sits closer to the bottom of the stack, and is correspondingly more flexible:

AVPlayer cannot display video by itself, and unlike MPMoviePlayerController it has no view property. To display its output you must create a player layer, AVPlayerLayer (a CALayer subclass), and add it to the layer of your view controller's view. Before using AVPlayer you should know a few related classes:

AVAsset: provides access to media information; an abstract class that cannot be used directly.

AVURLAsset: a subclass of AVAsset that creates an asset containing media information from a URL.

AVPlayerItem: a media resource management object that manages a video's basic information and state; one AVPlayerItem corresponds to one video resource.

The following simple player demonstrates how AVPlayer is used; it looks like this:

This custom player implements play, pause, progress display, and a video playlist; each feature is described below.

First, play and pause, the most basic functions, map to AVPlayer's play and pause methods. The tricky part is deciding whether a video is currently playing: the audio and video players covered earlier all expose a state for this, but AVPlayer has no such state property. The usual workaround is to inspect the player's playback rate: a rate of 0 means stopped, 1 means playing at normal speed.

Second, showing playback progress is less straightforward than with the other players. Previously we used notifications for the player state and the media load state, but neither AVPlayer nor its currentItem (of type AVPlayerItem, representing the video currently playing) provides that information through notifications, with one useful exception: the playback-finished notification AVPlayerItemDidPlayToEndTimeNotification. When playing video, especially network video, you typically need to know the load state, the buffering progress, and the playback progress. Load and buffer information comes from observing AVPlayerItem's status and loadedTimeRanges properties via KVO: when status becomes AVPlayerStatusReadyToPlay the item is ready to play, and only in that state can you read the duration and similar information; loadedTimeRanges changes each time a chunk of data is buffered and reports the newly buffered range (start time plus loaded duration), so you can track buffering in real time. Playback progress comes from AVPlayer's - (id)addPeriodicTimeObserverForInterval:(CMTime)interval queue:(dispatch_queue_t)queue usingBlock:(void (^)(CMTime time))block method, which invokes the block at the given interval and reports the current time through the time parameter. With this information, displaying progress is no longer a problem; in fact it is enough to implement the buffering indicators and drag-to-seek found in everyday players.

Finally, video switching: every player introduced so far plays one video per object, and switching videos means creating a new object, but AVPlayer provides - (void)replaceCurrentItemWithPlayerItem:(AVPlayerItem *)item for switching between videos. (AVFoundation also contains AVQueuePlayer, dedicated to playlist switching; interested readers can explore it on their own, and it will not be covered further here.)
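The AVQueuePlayer mentioned above can be sketched like this (the URLs are placeholders):

```objectivec
#import <AVFoundation/AVFoundation.h>

// Queue two items; AVQueuePlayer advances automatically as each one finishes.
AVPlayerItem *item1 = [AVPlayerItem playerItemWithURL:
    [NSURL URLWithString:@"http://example.com/1.mp4"]];
AVPlayerItem *item2 = [AVPlayerItem playerItemWithURL:
    [NSURL URLWithString:@"http://example.com/2.mp4"]];
AVQueuePlayer *queuePlayer = [AVQueuePlayer queuePlayerWithItems:@[item1, item2]];
[queuePlayer play];
// -advanceToNextItem skips to the next item in the queue manually.
```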

The code follows:

//
// ViewController.m
// AVPlayer

#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>

@interface ViewController ()

@property (nonatomic, strong) AVPlayer *player; // player object

@property (weak, nonatomic) IBOutlet UIView *container; // player container
@property (weak, nonatomic) IBOutlet UIButton *playOrPause; // play/pause button
@property (weak, nonatomic) IBOutlet UIProgressView *progress; // playback progress

@end

@implementation ViewController

#pragma mark - View lifecycle
- (void)viewDidLoad {
    [super viewDidLoad];
    [self setupUI];
    [self.player play];
}

- (void)dealloc {
    [self removeObserverFromPlayerItem:self.player.currentItem];
    [self removeNotification];
}

#pragma mark - Private methods
- (void)setupUI {
    // create the player layer
    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    playerLayer.frame = self.container.frame;
    //playerLayer.videoGravity = AVLayerVideoGravityResizeAspect; // video fill mode
    [self.container.layer addSublayer:playerLayer];
}

/**
 *  Lazily create the player
 *  @return player object
 */
- (AVPlayer *)player {
    if (!_player) {
        AVPlayerItem *playerItem = [self getPlayItem:0];
        _player = [AVPlayer playerWithPlayerItem:playerItem];
        [self addProgressObserver];
        [self addObserverToPlayerItem:playerItem];
    }
    return _player;
}

/**
 *  Get the AVPlayerItem for a video index
 *  @param videoIndex index of the video
 *  @return AVPlayerItem object
 */
- (AVPlayerItem *)getPlayItem:(int)videoIndex {
    NSString *urlStr = [NSString stringWithFormat:@"http://192.168.1.161/%i.mp4", videoIndex];
    urlStr = [urlStr stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
    NSURL *url = [NSURL URLWithString:urlStr];
    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithURL:url];
    return playerItem;
}

#pragma mark - Notifications
/**
 *  Register player notifications
 */
- (void)addNotification {
    // observe the playback-finished notification of the AVPlayerItem
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(playbackFinished:) name:AVPlayerItemDidPlayToEndTimeNotification object:self.player.currentItem];
}

- (void)removeNotification {
    [[NSNotificationCenter defaultCenter] removeObserver:self];
}

/**
 *  Playback-finished notification
 *  @param notification notification object
 */
- (void)playbackFinished:(NSNotification *)notification {
    NSLog(@"Video playback finished.");
}

#pragma mark - Observers
/**
 *  Add a periodic progress observer to the player
 */
- (void)addProgressObserver {
    AVPlayerItem *playerItem = self.player.currentItem;
    UIProgressView *progress = self.progress;
    // fire once per second
    [self.player addPeriodicTimeObserverForInterval:CMTimeMake(1, 1) queue:dispatch_get_main_queue() usingBlock:^(CMTime time) {
        float current = CMTimeGetSeconds(time);
        float total = CMTimeGetSeconds([playerItem duration]);
        NSLog(@"Already played %.2f s.", current);
        if (current) {
            [progress setProgress:(current / total) animated:YES];
        }
    }];
}

/**
 *  Add KVO observers to the AVPlayerItem
 *  @param playerItem AVPlayerItem object
 */
- (void)addObserverToPlayerItem:(AVPlayerItem *)playerItem {
    // observe the status property; note that AVPlayer also has a status
    // property that can be observed for the playback state
    [playerItem addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:nil];
    // observe the network buffering property
    [playerItem addObserver:self forKeyPath:@"loadedTimeRanges" options:NSKeyValueObservingOptionNew context:nil];
}

- (void)removeObserverFromPlayerItem:(AVPlayerItem *)playerItem {
    [playerItem removeObserver:self forKeyPath:@"status"];
    [playerItem removeObserver:self forKeyPath:@"loadedTimeRanges"];
}

/**
 *  Observe the player state through KVO
 *  @param keyPath observed key path
 *  @param object  observed object
 *  @param change  change dictionary
 *  @param context context
 */
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    AVPlayerItem *playerItem = object;
    if ([keyPath isEqualToString:@"status"]) {
        AVPlayerStatus status = [[change objectForKey:@"new"] intValue];
        if (status == AVPlayerStatusReadyToPlay) {
            NSLog(@"Ready to play; total duration: %.2f", CMTimeGetSeconds(playerItem.duration));
        }
    } else if ([keyPath isEqualToString:@"loadedTimeRanges"]) {
        NSArray *array = playerItem.loadedTimeRanges;
        CMTimeRange timeRange = [array.firstObject CMTimeRangeValue]; // range buffered this time
        float startSeconds = CMTimeGetSeconds(timeRange.start);
        float durationSeconds = CMTimeGetSeconds(timeRange.duration);
        NSTimeInterval totalBuffer = startSeconds + durationSeconds; // total buffered length
        NSLog(@"Buffered in total: %.2f", totalBuffer);
    }
}

#pragma mark - UI events
/**
 *  Play/pause button tapped
 *  @param sender play/pause button
 */
- (IBAction)playClick:(UIButton *)sender {
    if (self.player.rate == 0) { // paused
        [sender setImage:[UIImage imageNamed:@"player_pause"] forState:UIControlStateNormal];
        [self.player play];
    } else if (self.player.rate == 1) { // playing
        [self.player pause];
        [sender setImage:[UIImage imageNamed:@"player_play"] forState:UIControlStateNormal];
    }
}

/**
 *  Switch videos; the button tag is used as the video name
 *  @param sender tapped button
 */
- (IBAction)navigationButtonClick:(UIButton *)sender {
    [self removeNotification];
    [self removeObserverFromPlayerItem:self.player.currentItem];
    AVPlayerItem *playerItem = [self getPlayItem:(int)sender.tag];
    [self addObserverToPlayerItem:playerItem];
    // switch to the new video
    [self.player replaceCurrentItemWithPlayerItem:playerItem];
    [self addNotification];
}

@end

Running result:

Both MPMoviePlayerController and AVPlayer are quite capable for video playback, but they share an unavoidable limitation: they support only a narrow set of video codecs, essentially H.264 and MPEG-4, with container formats (extensions) such as .mp4, .mov, .m4v, .m2v, .3gp and .3g2. On the other hand, both support the great majority of audio codecs, so they are also worth considering if all you need is music playback. How, then, can more video formats be supported? Currently the answer is third-party frameworks; the video encoding/decoding frameworks commonly used on iOS are VLC and ffmpeg, whose usage will not be covered in detail here.

Camera

Taking photos and recording video with UIImagePickerController

Now let's look at how to take photos and record video on iOS. The simplest way is to use UIImagePickerController. UIImagePickerController inherits from UINavigationController; earlier articles used it mainly for picking photos, but it can do more than that: it can also take photos and record video. First, the commonly used properties and methods of this class:

Property

Description

@property (nonatomic) UIImagePickerControllerSourceType sourceType

The picker source type; sourceType is an enum:

UIImagePickerControllerSourceTypePhotoLibrary: the photo library (the default)

UIImagePickerControllerSourceTypeCamera: the camera

UIImagePickerControllerSourceTypeSavedPhotosAlbum: the saved-photos album

@property (nonatomic, copy) NSArray *mediaTypes

The media types. By default the array contains kUTTypeImage, so it need not be set for taking photos; for recording it must be set, to kUTTypeVideo (video without audio) or kUTTypeMovie (video with audio).

@property (nonatomic) NSTimeInterval videoMaximumDuration

The maximum recording duration; defaults to 10 minutes (600 seconds).

@property (nonatomic) UIImagePickerControllerQualityType videoQuality

The video quality, an enum:

UIImagePickerControllerQualityTypeHigh: high quality

UIImagePickerControllerQualityTypeMedium: medium quality, suitable for Wi-Fi transfer

UIImagePickerControllerQualityTypeLow: low quality, suitable for cellular transfer

UIImagePickerControllerQualityType640x480: 640*480

UIImagePickerControllerQualityTypeIFrame1280x720: 1280*720

UIImagePickerControllerQualityTypeIFrame960x540: 960*540

@property (nonatomic) BOOL showsCameraControls

Whether the camera control panel is shown; defaults to YES.

@property (nonatomic, retain) UIView *cameraOverlayView

A view overlaid on the camera preview, usable for building a custom photo or recording UI.

@property (nonatomic) CGAffineTransform cameraViewTransform

A transform applied to the camera preview.

@property (nonatomic) UIImagePickerControllerCameraCaptureMode cameraCaptureMode

The capture mode, an enum:

UIImagePickerControllerCameraCaptureModePhoto: photo mode

UIImagePickerControllerCameraCaptureModeVideo: video recording mode

@property (nonatomic) UIImagePickerControllerCameraDevice cameraDevice

The camera device; cameraDevice is an enum:

UIImagePickerControllerCameraDeviceRear: the rear camera

UIImagePickerControllerCameraDeviceFront: the front camera

@property (nonatomic) UIImagePickerControllerCameraFlashMode cameraFlashMode

The flash mode, an enum:

UIImagePickerControllerCameraFlashModeOff: flash off

UIImagePickerControllerCameraFlashModeAuto: automatic flash

UIImagePickerControllerCameraFlashModeOn: flash on

Class method

Description

+ (BOOL)isSourceTypeAvailable:(UIImagePickerControllerSourceType)sourceType

Whether the given source type is available; sourceType is an enum:

UIImagePickerControllerSourceTypePhotoLibrary: the photo library

UIImagePickerControllerSourceTypeCamera: the camera

UIImagePickerControllerSourceTypeSavedPhotosAlbum: the saved-photos album

+ (NSArray *)availableMediaTypesForSourceType:(UIImagePickerControllerSourceType)sourceType

The media types available on the given source, generally image and video.

+ (BOOL)isCameraDeviceAvailable:(UIImagePickerControllerCameraDevice)cameraDevice

Whether the given camera is available; cameraDevice is an enum:

UIImagePickerControllerCameraDeviceRear: the rear camera

UIImagePickerControllerCameraDeviceFront: the front camera

+ (BOOL)isFlashAvailableForCameraDevice:(UIImagePickerControllerCameraDevice)cameraDevice

Whether the flash of the given camera is available.

+ (NSArray *)availableCaptureModesForCameraDevice:(UIImagePickerControllerCameraDevice)cameraDevice

The capture modes available on the given camera, an enum:

UIImagePickerControllerCameraCaptureModePhoto: photo mode

UIImagePickerControllerCameraCaptureModeVideo: video recording mode
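Before presenting the picker it is worth checking availability with the class methods above (for example, the camera source type is not available on the simulator); a sketch:

```objectivec
if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
    // the device has a camera; check the individual devices and their flash
    if ([UIImagePickerController isCameraDeviceAvailable:UIImagePickerControllerCameraDeviceFront]) {
        NSLog(@"Front camera available.");
    }
    if ([UIImagePickerController isFlashAvailableForCameraDevice:UIImagePickerControllerCameraDeviceRear]) {
        NSLog(@"Rear-camera flash available.");
    }
} else {
    NSLog(@"No camera on this device.");
}
```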

Instance method

Description

- (void)takePicture

Takes a photo programmatically.

- (BOOL)startVideoCapture

Starts video recording programmatically.

- (void)stopVideoCapture

Stops video recording programmatically.

Delegate method

Description

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info

Media picking (or capture) finished.

- (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker

Picking was cancelled.

Extension functions (mainly for saving photos and videos to the album)

Description

UIImageWriteToSavedPhotosAlbum(UIImage *image, id completionTarget, SEL completionSelector, void *contextInfo)

Saves a photo to the album.

UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(NSString *videoPath)

Whether the video can be saved to the album.

void UISaveVideoAtPathToSavedPhotosAlbum(NSString *videoPath, id completionTarget, SEL completionSelector, void *contextInfo)

Saves a video to the album.

Taking a photo or recording a video with UIImagePickerController usually breaks down into these steps:

Create the UIImagePickerController object.

Specify the picker source. When choosing existing photos the source is the photo library or album; here it must be the camera type.

Specify the camera: front or rear.

Set the media type mediaType. This is required for video recording but can be skipped for photos, since mediaType contains kUTTypeImage by default (note that the media types are defined in MobileCoreServices.framework).

Specify the capture mode: photo or video. (For video recording the media type must be set before the capture mode.)

Present the UIImagePickerController (usually modally).

When the photo or recording is done, show or save it in the delegate method.

Of course many details can be configured along the way, such as whether to show the camera controls or to allow editing after capture; the property/method tables above should make these easy to understand. The following example shows how to use UIImagePickerController to take photos and record video: setting _isVideo to YES selects video recording mode, and the recording plays automatically in the main view controller afterwards; setting _isVideo to NO selects photo mode, and the captured photo is displayed in the main view controller:

// ViewController.m
// UIImagePickerController
#import "ViewController.h"
#import <MobileCoreServices/MobileCoreServices.h>
#import <AVFoundation/AVFoundation.h>

@interface ViewController () <UIImagePickerControllerDelegate, UINavigationControllerDelegate>
@property (assign, nonatomic) int isVideo; // 1 = record video, 0 = take photo
@property (strong, nonatomic) UIImagePickerController *imagePicker;
@property (weak, nonatomic) IBOutlet UIImageView *photo; // photo display view
@property (strong, nonatomic) AVPlayer *player; // player used to play the video after recording

@end

@implementation ViewController

#pragma mark - View lifecycle
- (void)viewDidLoad {
    [super viewDidLoad];
    // choose here whether the program takes photos or records video
    _isVideo = YES;
}

#pragma mark - UI events
// capture button tapped
- (IBAction)takeClick:(UIButton *)sender {
    [self presentViewController:self.imagePicker animated:YES completion:nil];
}

#pragma mark - UIImagePickerController delegate methods
// finished
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    if ([mediaType isEqualToString:(NSString *)kUTTypeImage]) { // photo
        UIImage *image;
        // if editing is allowed, use the edited photo, otherwise the original
        if (self.imagePicker.allowsEditing) {
            image = [info objectForKey:UIImagePickerControllerEditedImage]; // edited photo
        } else {
            image = [info objectForKey:UIImagePickerControllerOriginalImage]; // original photo
        }
        [self.photo setImage:image]; // show the photo
        UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil); // save to the album
    } else if ([mediaType isEqualToString:(NSString *)kUTTypeMovie]) { // video
        NSLog(@"video...");
        NSURL *url = [info objectForKey:UIImagePickerControllerMediaURL]; // video path
        NSString *urlStr = [url path];
        if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(urlStr)) {
            // save the video to the album; ALAssetsLibrary could also be used
            UISaveVideoAtPathToSavedPhotosAlbum(urlStr, self, @selector(video:didFinishSavingWithError:contextInfo:), nil);
        }
    }
    [self dismissViewControllerAnimated:YES completion:nil];
}

- (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker {
    NSLog(@"Cancelled");
}

#pragma mark - Private methods
- (UIImagePickerController *)imagePicker {
    if (!_imagePicker) {
        _imagePicker = [[UIImagePickerController alloc] init];
        _imagePicker.sourceType = UIImagePickerControllerSourceTypeCamera; // use the camera as the picker source
        _imagePicker.cameraDevice = UIImagePickerControllerCameraDeviceRear; // use the rear camera
        if (self.isVideo) {
            _imagePicker.mediaTypes = @[(NSString *)kUTTypeMovie];
            _imagePicker.videoQuality = UIImagePickerControllerQualityTypeIFrame1280x720;
            _imagePicker.cameraCaptureMode = UIImagePickerControllerCameraCaptureModeVideo; // capture mode: video
        } else {
            _imagePicker.cameraCaptureMode = UIImagePickerControllerCameraCaptureModePhoto; // capture mode: photo
        }
        _imagePicker.allowsEditing = YES; // allow editing
        _imagePicker.delegate = self; // set the delegate to observe the picker
    }
    return _imagePicker;
}

// callback invoked after the video has been saved
- (void)video:(NSString *)videoPath didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo {
    if (error) {
        NSLog(@"Error saving video: %@", error.localizedDescription);
    } else {
        NSLog(@"Video saved.");
        // play automatically after recording
        NSURL *url = [NSURL fileURLWithPath:videoPath];
        _player = [AVPlayer playerWithURL:url];
        AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:_player];
        playerLayer.frame = self.photo.frame;
        [self.photo.layer addSublayer:playerLayer];
        [_player play];
    }
}
@end

Running result (video recording):
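Before presenting the picker, it is also worth guarding against devices without a camera (the Simulator, for example, has none). This guard is not part of the sample above; it is a minimal sketch using the standard UIImagePickerController availability check, placed for instance at the top of takeClick::

```
// Hypothetical guard (not in the original sample): check camera availability
// before presenting the picker.
if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
    [self presentViewController:self.imagePicker animated:YES completion:nil];
} else {
    NSLog(@"Camera is not available on this device.");
}
```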

Taking photos and recording video with AVFoundation

UIImagePickerController is undeniably powerful, but much like MPMoviePlayerController, its heavy encapsulation makes custom work difficult. Building a camera UI similar to a beauty-camera app, for example, is hard to achieve with it; in such cases AVFoundation is worth considering. AVFoundation ships many ready-made players and recorders, but it also exposes much lower-level building blocks. It abstracts a number of classes that talk directly to the underlying input and output devices, so instead of working with a packaged audio player (AVAudioPlayer), recorder (AVAudioRecorder), or video (and audio) player (AVPlayer), the developer works with input devices (such as the microphone and camera) and outputs (images, video files). First, the classes involved in taking photos and recording video with AVFoundation:

AVCaptureSession: the media (audio/video) capture session, responsible for routing captured audio and video data to the output objects. One AVCaptureSession can have multiple inputs and outputs.

AVCaptureDevice: an input device, such as the microphone or a camera; through this object you can configure physical device properties (for example camera focus and white balance).

AVCaptureDeviceInput: the input-data manager; an AVCaptureDeviceInput is created from an AVCaptureDevice and then added to the AVCaptureSession, which manages it.

AVCaptureOutput: the output-data manager that receives the various kinds of output data. In practice you use one of its subclasses: AVCaptureAudioDataOutput, AVCaptureStillImageOutput, AVCaptureVideoDataOutput, or AVCaptureFileOutput; the object is added to the AVCaptureSession, which manages it. Note: the first few subclasses deliver their output as NSData, while AVCaptureFileOutput writes data to a file. AVCaptureFileOutput itself is likewise not used directly; its subclasses AVCaptureAudioFileOutput and AVCaptureMovieFileOutput are. Once an input or output is added to an AVCaptureSession, the session establishes connections (AVCaptureConnection) between all compatible inputs and outputs.

AVCaptureVideoPreviewLayer: the camera preview layer, a subclass of CALayer, which shows the photo or video capture in real time; it is created with the corresponding AVCaptureSession object.

The general steps for taking photos and recording video with AVFoundation are:

Create an AVCaptureSession object.

Use the static methods of AVCaptureDevice to obtain the required device; for photos and video you need a camera device, for audio recording a microphone.

Initialize an AVCaptureDeviceInput object with the AVCaptureDevice.

Initialize the output-data manager: an AVCaptureStillImageOutput object for photos, or an AVCaptureMovieFileOutput object for video recording.

Add the data input object (AVCaptureDeviceInput) and the data output object (AVCaptureOutput) to the media session manager (AVCaptureSession).

Create an AVCaptureVideoPreviewLayer with the session, add the layer to its container view, and call the session's startRunning method to begin capturing.

Write the captured audio or video data to the target file.
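Put together, the steps above can be sketched as a minimal photo-capture pipeline (error handling omitted; someContainerView is a placeholder for whatever view hosts the preview, the other names are the standard AVFoundation APIs used throughout this article):

```
// Minimal sketch of the capture pipeline described above (photo capture).
AVCaptureSession *session = [[AVCaptureSession alloc] init];                  // 1. create the session
AVCaptureDevice *camera =
    [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];            // 2. obtain the device
AVCaptureDeviceInput *input =
    [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];            // 3. wrap it in an input
AVCaptureStillImageOutput *output = [[AVCaptureStillImageOutput alloc] init]; // 4. create the output
if ([session canAddInput:input])   [session addInput:input];                  // 5. add input and output
if ([session canAddOutput:output]) [session addOutput:output];
AVCaptureVideoPreviewLayer *preview =
    [AVCaptureVideoPreviewLayer layerWithSession:session];                    // 6. preview layer
preview.frame = someContainerView.bounds;
[someContainerView.layer addSublayer:preview];
[session startRunning];                                                       // begin capturing
```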

Taking photos

The following shows how to build a photo app with AVFoundation, implementing camera preview, switching between front and rear cameras, flash settings, tap-to-focus, and saving photos. The app looks roughly like this:

Define the session, input, output, and related objects in the program.

@interface ViewController ()

@property (strong,nonatomic) AVCaptureSession *captureSession; // passes data between the input and output devices
@property (strong,nonatomic) AVCaptureDeviceInput *captureDeviceInput; // obtains input data from the AVCaptureDevice
@property (strong,nonatomic) AVCaptureStillImageOutput *captureStillImageOutput; // still-image output
@property (strong,nonatomic) AVCaptureVideoPreviewLayer *captureVideoPreviewLayer; // camera preview layer
@property (weak, nonatomic) IBOutlet UIView *viewContainer;
@property (weak, nonatomic) IBOutlet UIButton *takeButton; // shutter button
@property (weak, nonatomic) IBOutlet UIButton *flashAutoButton; // auto-flash button
@property (weak, nonatomic) IBOutlet UIButton *flashOnButton; // flash-on button
@property (weak, nonatomic) IBOutlet UIButton *flashOffButton; // flash-off button
@property (weak, nonatomic) IBOutlet UIImageView *focusCursor; // focus cursor
@end

When the controller's view is about to appear, create and initialize the session, camera device, input, output, and preview layer, and add the preview layer to the view. Some additional setup is done here as well, such as adding a tap gesture (tap the screen to focus) and initializing the UI.

-(void)viewWillAppear:(BOOL)animated{
    [super viewWillAppear:animated];
    // initialize the session
    _captureSession=[[AVCaptureSession alloc]init];
    if ([_captureSession canSetSessionPreset:AVCaptureSessionPreset1280x720]) { // set the resolution
        _captureSession.sessionPreset=AVCaptureSessionPreset1280x720;
    }
    // obtain the input device
    AVCaptureDevice *captureDevice=[self getCameraDeviceWithPosition:AVCaptureDevicePositionBack]; // the rear camera
    if (!captureDevice) {
        NSLog(@"Problem obtaining the rear camera.");
        return;
    }

    NSError *error=nil;
    // initialize the device input object with the input device, used to obtain input data
    _captureDeviceInput=[[AVCaptureDeviceInput alloc]initWithDevice:captureDevice error:&error];
    if (error) {
        NSLog(@"Error obtaining the device input object: %@",error.localizedDescription);
        return;
    }
    // initialize the device output object, used to obtain output data
    _captureStillImageOutput=[[AVCaptureStillImageOutput alloc]init];
    NSDictionary *outputSettings = @{AVVideoCodecKey:AVVideoCodecJPEG};
    [_captureStillImageOutput setOutputSettings:outputSettings]; // output settings

    // add the device input to the session
    if ([_captureSession canAddInput:_captureDeviceInput]) {
        [_captureSession addInput:_captureDeviceInput];
    }

    // add the device output to the session
    if ([_captureSession canAddOutput:_captureStillImageOutput]) {
        [_captureSession addOutput:_captureStillImageOutput];
    }

    // create the video preview layer to show the camera feed in real time
    _captureVideoPreviewLayer=[[AVCaptureVideoPreviewLayer alloc]initWithSession:self.captureSession];

    CALayer *layer=self.viewContainer.layer;
    layer.masksToBounds=YES;

    _captureVideoPreviewLayer.frame=layer.bounds;
    _captureVideoPreviewLayer.videoGravity=AVLayerVideoGravityResizeAspectFill; // fill mode
    // add the video preview layer to the UI
    //[layer addSublayer:_captureVideoPreviewLayer];
    [layer insertSublayer:_captureVideoPreviewLayer below:self.focusCursor.layer];

    [self addNotificationToCaptureDevice:captureDevice];
    [self addGenstureRecognizer];
    [self setFlashModeButtonStatus];
}

Start and stop the session when the view appears and disappears.

-(void)viewDidAppear:(BOOL)animated{

   [super viewDidAppear:animated];

   [self.captureSession startRunning];

}

 

-(void)viewDidDisappear:(BOOL)animated{

   [super viewDidDisappear:animated];

   [self.captureSession stopRunning];

}

Define the flash on/off/auto functionality. Note that whether you are setting the flash, white balance, or any other input-device property, you must lock the device for configuration before changing it and unlock it afterwards.

/**
 *  Unified helper for changing device properties
 *
 *  @param propertyChange the property-change operation
 */
-(void)changeDeviceProperty:(PropertyChangeBlock)propertyChange{
    AVCaptureDevice *captureDevice= [self.captureDeviceInput device];
    NSError *error;
    // note: always call lockForConfiguration: before changing device properties, and unlockForConfiguration afterwards
    if ([captureDevice lockForConfiguration:&error]) {
        propertyChange(captureDevice);
        [captureDevice unlockForConfiguration];
    }else{
        NSLog(@"Error while setting device properties: %@",error.localizedDescription);
    }
}

/**
 *  Set the flash mode
 *
 *  @param flashMode the flash mode
 */
-(void)setFlashMode:(AVCaptureFlashMode)flashMode{
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        if ([captureDevice isFlashModeSupported:flashMode]) {
            [captureDevice setFlashMode:flashMode];
        }
    }];
}

Define the camera-switching functionality. Switching cameras means removing the existing input from the session and adding a new one; note that modifying a running session requires beginning a configuration first and committing it once done.

#pragma mark switch between front and rear cameras
- (IBAction)toggleButtonClick:(UIButton *)sender {
    AVCaptureDevice *currentDevice=[self.captureDeviceInput device];
    AVCaptureDevicePosition currentPosition=[currentDevice position];
    [self removeNotificationFromCaptureDevice:currentDevice];
    AVCaptureDevice *toChangeDevice;
    AVCaptureDevicePosition toChangePosition=AVCaptureDevicePositionFront;
    if (currentPosition==AVCaptureDevicePositionUnspecified||currentPosition==AVCaptureDevicePositionFront) {
        toChangePosition=AVCaptureDevicePositionBack;
    }
    toChangeDevice=[self getCameraDeviceWithPosition:toChangePosition];
    [self addNotificationToCaptureDevice:toChangeDevice];
    // obtain the new device input object
    AVCaptureDeviceInput *toChangeDeviceInput=[[AVCaptureDeviceInput alloc]initWithDevice:toChangeDevice error:nil];

    // always begin a configuration before changing the session, and commit the change when done
    [self.captureSession beginConfiguration];
    // remove the existing input
    [self.captureSession removeInput:self.captureDeviceInput];
    // add the new input
    if ([self.captureSession canAddInput:toChangeDeviceInput]) {
        [self.captureSession addInput:toChangeDeviceInput];
        self.captureDeviceInput=toChangeDeviceInput;
    }
    // commit the session configuration
    [self.captureSession commitConfiguration];

    [self setFlashModeButtonStatus];
}

Add the tap gesture: tapping the preview view sets the focus and exposure point.

/**
 *  Set the focus point
 *
 *  @param point the focus point
 */
-(void)focusWithMode:(AVCaptureFocusMode)focusMode exposureMode:(AVCaptureExposureMode)exposureMode atPoint:(CGPoint)point{
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        if ([captureDevice isFocusModeSupported:focusMode]) {
            [captureDevice setFocusMode:focusMode];
        }
        if ([captureDevice isFocusPointOfInterestSupported]) {
            [captureDevice setFocusPointOfInterest:point];
        }
        if ([captureDevice isExposureModeSupported:exposureMode]) {
            [captureDevice setExposureMode:exposureMode];
        }
        if ([captureDevice isExposurePointOfInterestSupported]) {
            [captureDevice setExposurePointOfInterest:point];
        }
    }];
}

/**
 *  Add the tap gesture; tapping focuses the camera
 */
-(void)addGenstureRecognizer{
    UITapGestureRecognizer *tapGesture=[[UITapGestureRecognizer alloc]initWithTarget:self action:@selector(tapScreen:)];
    [self.viewContainer addGestureRecognizer:tapGesture];
}
-(void)tapScreen:(UITapGestureRecognizer *)tapGesture{
    CGPoint point= [tapGesture locationInView:self.viewContainer];
    // convert UI coordinates to camera coordinates
    CGPoint cameraPoint= [self.captureVideoPreviewLayer captureDevicePointOfInterestForPoint:point];
    [self setFocusCursorWithPoint:point];
    [self focusWithMode:AVCaptureFocusModeAutoFocus exposureMode:AVCaptureExposureModeAutoExpose atPoint:cameraPoint];
}

Define the photo-taking functionality: obtain the connection, capture the output data from it, and save the result.

#pragma mark take a photo
- (IBAction)takeButtonClick:(UIButton *)sender {
    // obtain the connection from the device output
    AVCaptureConnection *captureConnection=[self.captureStillImageOutput connectionWithMediaType:AVMediaTypeVideo];
    // capture the device output data through the connection
    [self.captureStillImageOutput captureStillImageAsynchronouslyFromConnection:captureConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (imageDataSampleBuffer) {
            NSData *imageData=[AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            UIImage *image=[UIImage imageWithData:imageData];
            UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
//            ALAssetsLibrary *assetsLibrary=[[ALAssetsLibrary alloc]init];
//            [assetsLibrary writeImageToSavedPhotosAlbum:[image CGImage] orientation:(ALAssetOrientation)[image imageOrientation] completionBlock:nil];
        }
    }];
}

Finally, the complete code:


// AVFoundationCamera

#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>
#import <AssetsLibrary/AssetsLibrary.h>

typedef void(^PropertyChangeBlock)(AVCaptureDevice *captureDevice);

@interface ViewController ()

@property (strong,nonatomic) AVCaptureSession *captureSession; // passes data between the input and output devices
@property (strong,nonatomic) AVCaptureDeviceInput *captureDeviceInput; // obtains input data from the AVCaptureDevice
@property (strong,nonatomic) AVCaptureStillImageOutput *captureStillImageOutput; // still-image output
@property (strong,nonatomic) AVCaptureVideoPreviewLayer *captureVideoPreviewLayer; // camera preview layer
@property (weak, nonatomic) IBOutlet UIView *viewContainer;
@property (weak, nonatomic) IBOutlet UIButton *takeButton; // shutter button
@property (weak, nonatomic) IBOutlet UIButton *flashAutoButton; // auto-flash button
@property (weak, nonatomic) IBOutlet UIButton *flashOnButton; // flash-on button
@property (weak, nonatomic) IBOutlet UIButton *flashOffButton; // flash-off button
@property (weak, nonatomic) IBOutlet UIImageView *focusCursor; // focus cursor

@end

@implementation ViewController

#pragma mark - controller view methods
- (void)viewDidLoad {
    [super viewDidLoad];
}

 

-(void)viewWillAppear:(BOOL)animated{
    [super viewWillAppear:animated];
    // initialize the session
    _captureSession=[[AVCaptureSession alloc]init];
    if ([_captureSession canSetSessionPreset:AVCaptureSessionPreset1280x720]) { // set the resolution
        _captureSession.sessionPreset=AVCaptureSessionPreset1280x720;
    }
    // obtain the input device
    AVCaptureDevice *captureDevice=[self getCameraDeviceWithPosition:AVCaptureDevicePositionBack]; // the rear camera
    if (!captureDevice) {
        NSLog(@"Problem obtaining the rear camera.");
        return;
    }

    NSError *error=nil;
    // initialize the device input object with the input device, used to obtain input data
    _captureDeviceInput=[[AVCaptureDeviceInput alloc]initWithDevice:captureDevice error:&error];
    if (error) {
        NSLog(@"Error obtaining the device input object: %@",error.localizedDescription);
        return;
    }
    // initialize the device output object, used to obtain output data
    _captureStillImageOutput=[[AVCaptureStillImageOutput alloc]init];
    NSDictionary *outputSettings = @{AVVideoCodecKey:AVVideoCodecJPEG};
    [_captureStillImageOutput setOutputSettings:outputSettings]; // output settings

    // add the device input to the session
    if ([_captureSession canAddInput:_captureDeviceInput]) {
        [_captureSession addInput:_captureDeviceInput];
    }

    // add the device output to the session
    if ([_captureSession canAddOutput:_captureStillImageOutput]) {
        [_captureSession addOutput:_captureStillImageOutput];
    }

    // create the video preview layer to show the camera feed in real time
    _captureVideoPreviewLayer=[[AVCaptureVideoPreviewLayer alloc]initWithSession:self.captureSession];

    CALayer *layer=self.viewContainer.layer;
    layer.masksToBounds=YES;

    _captureVideoPreviewLayer.frame=layer.bounds;
    _captureVideoPreviewLayer.videoGravity=AVLayerVideoGravityResizeAspectFill; // fill mode
    // add the video preview layer to the UI
    //[layer addSublayer:_captureVideoPreviewLayer];
    [layer insertSublayer:_captureVideoPreviewLayer below:self.focusCursor.layer];

    [self addNotificationToCaptureDevice:captureDevice];
    [self addGenstureRecognizer];
    [self setFlashModeButtonStatus];
}

-(void)viewDidAppear:(BOOL)animated{
    [super viewDidAppear:animated];
    [self.captureSession startRunning];
}

-(void)viewDidDisappear:(BOOL)animated{
    [super viewDidDisappear:animated];
    [self.captureSession stopRunning];
}

-(void)dealloc{
    [self removeNotification];
}

#pragma mark - UI methods
#pragma mark take a photo
- (IBAction)takeButtonClick:(UIButton *)sender {
    // obtain the connection from the device output
    AVCaptureConnection *captureConnection=[self.captureStillImageOutput connectionWithMediaType:AVMediaTypeVideo];
    // capture the device output data through the connection
    [self.captureStillImageOutput captureStillImageAsynchronouslyFromConnection:captureConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (imageDataSampleBuffer) {
            NSData *imageData=[AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            UIImage *image=[UIImage imageWithData:imageData];
            UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
//            ALAssetsLibrary *assetsLibrary=[[ALAssetsLibrary alloc]init];
//            [assetsLibrary writeImageToSavedPhotosAlbum:[image CGImage] orientation:(ALAssetOrientation)[image imageOrientation] completionBlock:nil];
        }
    }];
}

#pragma mark switch between front and rear cameras
- (IBAction)toggleButtonClick:(UIButton *)sender {
    AVCaptureDevice *currentDevice=[self.captureDeviceInput device];
    AVCaptureDevicePosition currentPosition=[currentDevice position];
    [self removeNotificationFromCaptureDevice:currentDevice];
    AVCaptureDevice *toChangeDevice;
    AVCaptureDevicePosition toChangePosition=AVCaptureDevicePositionFront;
    if (currentPosition==AVCaptureDevicePositionUnspecified||currentPosition==AVCaptureDevicePositionFront) {
        toChangePosition=AVCaptureDevicePositionBack;
    }
    toChangeDevice=[self getCameraDeviceWithPosition:toChangePosition];
    [self addNotificationToCaptureDevice:toChangeDevice];
    // obtain the new device input object
    AVCaptureDeviceInput *toChangeDeviceInput=[[AVCaptureDeviceInput alloc]initWithDevice:toChangeDevice error:nil];

    // always begin a configuration before changing the session, and commit the change when done
    [self.captureSession beginConfiguration];
    // remove the existing input
    [self.captureSession removeInput:self.captureDeviceInput];
    // add the new input
    if ([self.captureSession canAddInput:toChangeDeviceInput]) {
        [self.captureSession addInput:toChangeDeviceInput];
        self.captureDeviceInput=toChangeDeviceInput;
    }
    // commit the session configuration
    [self.captureSession commitConfiguration];

    [self setFlashModeButtonStatus];
}

 

#pragma mark enable auto flash
- (IBAction)flashAutoClick:(UIButton *)sender {
    [self setFlashMode:AVCaptureFlashModeAuto];
    [self setFlashModeButtonStatus];
}
#pragma mark turn on the flash
- (IBAction)flashOnClick:(UIButton *)sender {
    [self setFlashMode:AVCaptureFlashModeOn];
    [self setFlashModeButtonStatus];
}
#pragma mark turn off the flash
- (IBAction)flashOffClick:(UIButton *)sender {
    [self setFlashMode:AVCaptureFlashModeOff];
    [self setFlashModeButtonStatus];
}

 

#pragma mark - notifications
/**
 *  Add notifications for the input device
 */
-(void)addNotificationToCaptureDevice:(AVCaptureDevice *)captureDevice{
    // note: to receive subject-area-change notifications the device must first allow monitoring
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        captureDevice.subjectAreaChangeMonitoringEnabled=YES;
    }];
    NSNotificationCenter *notificationCenter= [NSNotificationCenter defaultCenter];
    // the capture area changed
    [notificationCenter addObserver:self selector:@selector(areaChange:) name:AVCaptureDeviceSubjectAreaDidChangeNotification object:captureDevice];
}
-(void)removeNotificationFromCaptureDevice:(AVCaptureDevice *)captureDevice{
    NSNotificationCenter *notificationCenter= [NSNotificationCenter defaultCenter];
    [notificationCenter removeObserver:self name:AVCaptureDeviceSubjectAreaDidChangeNotification object:captureDevice];
}
/**
 *  Remove all notifications
 */
-(void)removeNotification{
    NSNotificationCenter *notificationCenter= [NSNotificationCenter defaultCenter];
    [notificationCenter removeObserver:self];
}

-(void)addNotificationToCaptureSession:(AVCaptureSession *)captureSession{
    NSNotificationCenter *notificationCenter= [NSNotificationCenter defaultCenter];
    // the session hit a runtime error
    [notificationCenter addObserver:self selector:@selector(sessionRuntimeError:) name:AVCaptureSessionRuntimeErrorNotification object:captureSession];
}

/**
 *  Device connected
 *
 *  @param notification the notification object
 */
-(void)deviceConnected:(NSNotification *)notification{
    NSLog(@"Device connected...");
}
/**
 *  Device disconnected
 *
 *  @param notification the notification object
 */
-(void)deviceDisconnected:(NSNotification *)notification{
    NSLog(@"Device disconnected.");
}
/**
 *  Capture area changed
 *
 *  @param notification the notification object
 */
-(void)areaChange:(NSNotification *)notification{
    NSLog(@"Capture area changed...");
}

/**
 *  Session runtime error
 *
 *  @param notification the notification object
 */
-(void)sessionRuntimeError:(NSNotification *)notification{
    NSLog(@"A session error occurred.");
}

 

#pragma mark - private methods

/**
 *  Obtain the camera at the specified position
 *
 *  @param position the camera position
 *  @return the camera device
 */
-(AVCaptureDevice *)getCameraDeviceWithPosition:(AVCaptureDevicePosition)position{
    NSArray *cameras= [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *camera in cameras) {
        if ([camera position]==position) {
            return camera;
        }
    }
    return nil;
}

/**
 *  Unified helper for changing device properties
 *
 *  @param propertyChange the property-change operation
 */
-(void)changeDeviceProperty:(PropertyChangeBlock)propertyChange{
    AVCaptureDevice *captureDevice= [self.captureDeviceInput device];
    NSError *error;
    // note: always call lockForConfiguration: before changing device properties, and unlockForConfiguration afterwards
    if ([captureDevice lockForConfiguration:&error]) {
        propertyChange(captureDevice);
        [captureDevice unlockForConfiguration];
    }else{
        NSLog(@"Error while setting device properties: %@",error.localizedDescription);
    }
}

/**
 *  Set the flash mode
 *
 *  @param flashMode the flash mode
 */
-(void)setFlashMode:(AVCaptureFlashMode)flashMode{
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        if ([captureDevice isFlashModeSupported:flashMode]) {
            [captureDevice setFlashMode:flashMode];
        }
    }];
}
/**
 *  Set the focus mode
 *
 *  @param focusMode the focus mode
 */
-(void)setFocusMode:(AVCaptureFocusMode)focusMode{
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        if ([captureDevice isFocusModeSupported:focusMode]) {
            [captureDevice setFocusMode:focusMode];
        }
    }];
}
/**
 *  Set the exposure mode
 *
 *  @param exposureMode the exposure mode
 */
-(void)setExposureMode:(AVCaptureExposureMode)exposureMode{
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        if ([captureDevice isExposureModeSupported:exposureMode]) {
            [captureDevice setExposureMode:exposureMode];
        }
    }];
}
/**
 *  Set the focus point
 *
 *  @param point the focus point
 */
-(void)focusWithMode:(AVCaptureFocusMode)focusMode exposureMode:(AVCaptureExposureMode)exposureMode atPoint:(CGPoint)point{
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        if ([captureDevice isFocusModeSupported:focusMode]) {
            [captureDevice setFocusMode:focusMode];
        }
        if ([captureDevice isFocusPointOfInterestSupported]) {
            [captureDevice setFocusPointOfInterest:point];
        }
        if ([captureDevice isExposureModeSupported:exposureMode]) {
            [captureDevice setExposureMode:exposureMode];
        }
        if ([captureDevice isExposurePointOfInterestSupported]) {
            [captureDevice setExposurePointOfInterest:point];
        }
    }];
}

 

/**
 *  Add the tap gesture; tapping focuses the camera
 */
-(void)addGenstureRecognizer{
    UITapGestureRecognizer *tapGesture=[[UITapGestureRecognizer alloc]initWithTarget:self action:@selector(tapScreen:)];
    [self.viewContainer addGestureRecognizer:tapGesture];
}
-(void)tapScreen:(UITapGestureRecognizer *)tapGesture{
    CGPoint point= [tapGesture locationInView:self.viewContainer];
    // convert UI coordinates to camera coordinates
    CGPoint cameraPoint= [self.captureVideoPreviewLayer captureDevicePointOfInterestForPoint:point];
    [self setFocusCursorWithPoint:point];
    [self focusWithMode:AVCaptureFocusModeAutoFocus exposureMode:AVCaptureExposureModeAutoExpose atPoint:cameraPoint];
}

/**
 *  Update the flash-button states
 */
-(void)setFlashModeButtonStatus{
    AVCaptureDevice *captureDevice=[self.captureDeviceInput device];
    AVCaptureFlashMode flashMode=captureDevice.flashMode;
    if ([captureDevice isFlashAvailable]) {
        self.flashAutoButton.hidden=NO;
        self.flashOnButton.hidden=NO;
        self.flashOffButton.hidden=NO;
        self.flashAutoButton.enabled=YES;
        self.flashOnButton.enabled=YES;
        self.flashOffButton.enabled=YES;
        switch (flashMode) {
            case AVCaptureFlashModeAuto:
                self.flashAutoButton.enabled=NO;
                break;
            case AVCaptureFlashModeOn:
                self.flashOnButton.enabled=NO;
                break;
            case AVCaptureFlashModeOff:
                self.flashOffButton.enabled=NO;
                break;
            default:
                break;
        }
    }else{
        self.flashAutoButton.hidden=YES;
        self.flashOnButton.hidden=YES;
        self.flashOffButton.hidden=YES;
    }
}

/**
 *  Position the focus cursor
 *
 *  @param point the cursor position
 */
-(void)setFocusCursorWithPoint:(CGPoint)point{
    self.focusCursor.center=point;
    self.focusCursor.transform=CGAffineTransformMakeScale(1.5, 1.5);
    self.focusCursor.alpha=1.0;
    [UIView animateWithDuration:1.0 animations:^{
        self.focusCursor.transform=CGAffineTransformIdentity;
    } completion:^(BOOL finished) {
        self.focusCursor.alpha=0;
    }];
}
@end

Running result:

 

Video recording

With the photo app in place, adding video recording is not complicated; the program only needs the following changes:

Add an audio input to the session (obtain the input device with [[AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio] firstObject] and create a device input object from it). The photo app already added the video input, so no video input needs to be added here.

Create a movie file output object, AVCaptureMovieFileOutput, to replace the original still-image output object.

Write the captured video data to a temporary file, and save it to the photo album once recording stops (via the AVCaptureMovieFileOutput delegate methods).

Beyond these three points the program is largely the same as the photo app. To make it more complete, the video recorder below also handles details such as rotating the video with the screen, auto layout, and a background save task. The modified program:

//
// ViewController.m
// AVFoundationCamera

#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>
#import <AssetsLibrary/AssetsLibrary.h>

typedef void(^PropertyChangeBlock)(AVCaptureDevice *captureDevice);

@interface ViewController ()<AVCaptureFileOutputRecordingDelegate> // movie file output delegate

@property (strong,nonatomic) AVCaptureSession *captureSession; // passes data between the input and output devices
@property (strong,nonatomic) AVCaptureDeviceInput *captureDeviceInput; // obtains input data from the AVCaptureDevice
@property (strong,nonatomic) AVCaptureMovieFileOutput *captureMovieFileOutput; // movie output
@property (strong,nonatomic) AVCaptureVideoPreviewLayer *captureVideoPreviewLayer; // camera preview layer
@property (assign,nonatomic) BOOL enableRotation; // whether rotation is allowed (rotation is disabled while recording)
@property (assign,nonatomic) CGRect lastBounds; // bounds before rotation
@property (assign,nonatomic) UIBackgroundTaskIdentifier backgroundTaskIdentifier; // background task identifier
@property (weak, nonatomic) IBOutlet UIView *viewContainer;
@property (weak, nonatomic) IBOutlet UIButton *takeButton; // record button
@property (weak, nonatomic) IBOutlet UIImageView *focusCursor; // focus cursor

@end

 

@implementation ViewController

#pragma mark - controller view methods
- (void)viewDidLoad {
    [super viewDidLoad];
}

 

-(void)viewWillAppear:(BOOL)animated{
    [super viewWillAppear:animated];
    // initialize the session
    _captureSession=[[AVCaptureSession alloc]init];
    if ([_captureSession canSetSessionPreset:AVCaptureSessionPreset1280x720]) { // set the resolution
        _captureSession.sessionPreset=AVCaptureSessionPreset1280x720;
    }
    // obtain the input device
    AVCaptureDevice *captureDevice=[self getCameraDeviceWithPosition:AVCaptureDevicePositionBack]; // the rear camera
    if (!captureDevice) {
        NSLog(@"Problem obtaining the rear camera.");
        return;
    }
    // add an audio input device
    AVCaptureDevice *audioCaptureDevice=[[AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio] firstObject];

    NSError *error=nil;
    // initialize the device input object with the input device, used to obtain input data
    _captureDeviceInput=[[AVCaptureDeviceInput alloc]initWithDevice:captureDevice error:&error];
    if (error) {
        NSLog(@"Error obtaining the device input object: %@",error.localizedDescription);
        return;
    }
    AVCaptureDeviceInput *audioCaptureDeviceInput=[[AVCaptureDeviceInput alloc]initWithDevice:audioCaptureDevice error:&error];
    if (error) {
        NSLog(@"Error obtaining the device input object: %@",error.localizedDescription);
        return;
    }
    // initialize the device output object, used to obtain output data
    _captureMovieFileOutput=[[AVCaptureMovieFileOutput alloc]init];

    // add the device inputs to the session
    if ([_captureSession canAddInput:_captureDeviceInput]) {
        [_captureSession addInput:_captureDeviceInput];
        [_captureSession addInput:audioCaptureDeviceInput];
        AVCaptureConnection *captureConnection=[_captureMovieFileOutput connectionWithMediaType:AVMediaTypeVideo];
        if ([captureConnection isVideoStabilizationSupported]) {
            captureConnection.preferredVideoStabilizationMode=AVCaptureVideoStabilizationModeAuto;
        }
    }

    // add the device output to the session
    if ([_captureSession canAddOutput:_captureMovieFileOutput]) {
        [_captureSession addOutput:_captureMovieFileOutput];
    }

    // create the video preview layer to show the camera feed in real time
    _captureVideoPreviewLayer=[[AVCaptureVideoPreviewLayer alloc]initWithSession:self.captureSession];

    CALayer *layer=self.viewContainer.layer;
    layer.masksToBounds=YES;

    _captureVideoPreviewLayer.frame=layer.bounds;
    _captureVideoPreviewLayer.videoGravity=AVLayerVideoGravityResizeAspectFill; // fill mode
    // add the video preview layer to the UI
    //[layer addSublayer:_captureVideoPreviewLayer];
    [layer insertSublayer:_captureVideoPreviewLayer below:self.focusCursor.layer];

    _enableRotation=YES;
    [self addNotificationToCaptureDevice:captureDevice];
    [self addGenstureRecognizer];
}

 

-(void)viewDidAppear:(BOOL)animated{

   [super viewDidAppear:animated];

   [self.captureSession startRunning];

}

 

-(void)viewDidDisappear:(BOOL)animated{

   [super viewDidDisappear:animated];

   [self.captureSession stopRunning];

}

 

-(BOOL)shouldAutorotate{
    return self.enableRotation;
}

 

// Adjust the preview layer's orientation when the screen rotates (size-class variant, left commented out)
//-(void)willTransitionToTraitCollection:(UITraitCollection *)newCollection withTransitionCoordinator:(id<UIViewControllerTransitionCoordinator>)coordinator{
//    [super willTransitionToTraitCollection:newCollection withTransitionCoordinator:coordinator];
//    NSLog(@"%i,%i",newCollection.verticalSizeClass,newCollection.horizontalSizeClass);
//    UIInterfaceOrientation orientation = [[UIApplication sharedApplication] statusBarOrientation];
//    NSLog(@"%i",orientation);
//    AVCaptureConnection *captureConnection=[self.captureVideoPreviewLayer connection];
//    captureConnection.videoOrientation=(AVCaptureVideoOrientation)orientation;
//}

// adjust the preview layer's orientation when the screen rotates
-(void)willRotateToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation duration:(NSTimeInterval)duration{
    AVCaptureConnection *captureConnection=[self.captureVideoPreviewLayer connection];
    captureConnection.videoOrientation=(AVCaptureVideoOrientation)toInterfaceOrientation;
}
// reset the layer size after rotation
-(void)didRotateFromInterfaceOrientation:(UIInterfaceOrientation)fromInterfaceOrientation{
    _captureVideoPreviewLayer.frame=self.viewContainer.bounds;
}

 

-(void)dealloc{

   [self removeNotification];

}

#pragma mark - UI methods
#pragma mark video recording

- (IBAction)takeButtonClick:(UIButton *)sender {
    // obtain the connection from the device output
    AVCaptureConnection *captureConnection=[self.captureMovieFileOutput connectionWithMediaType:AVMediaTypeVideo];
    // capture the device output data through the connection
    if (![self.captureMovieFileOutput isRecording]) {
        self.enableRotation=NO;
        // if multitasking is supported, begin a background task
        if ([[UIDevice currentDevice] isMultitaskingSupported]) {
            self.backgroundTaskIdentifier=[[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:nil];
        }
        // keep the recorded video's orientation in sync with the preview layer
        captureConnection.videoOrientation=[self.captureVideoPreviewLayer connection].videoOrientation;
        NSString *outputFilePath=[NSTemporaryDirectory() stringByAppendingPathComponent:@"myMovie.mov"];
        NSLog(@"save path is: %@",outputFilePath);
        NSURL *fileUrl=[NSURL fileURLWithPath:outputFilePath];
        [self.captureMovieFileOutput startRecordingToOutputFileURL:fileUrl recordingDelegate:self];
    }
    else {
        [self.captureMovieFileOutput stopRecording]; // stop recording
    }
}

#pragma mark switch between front and rear cameras
- (IBAction)toggleButtonClick:(UIButton *)sender {
    AVCaptureDevice *currentDevice=[self.captureDeviceInput device];
    AVCaptureDevicePosition currentPosition=[currentDevice position];
    [self removeNotificationFromCaptureDevice:currentDevice];
    AVCaptureDevice *toChangeDevice;
    AVCaptureDevicePosition toChangePosition=AVCaptureDevicePositionFront;
    if (currentPosition==AVCaptureDevicePositionUnspecified||currentPosition==AVCaptureDevicePositionFront) {
        toChangePosition=AVCaptureDevicePositionBack;
    }
    toChangeDevice=[self getCameraDeviceWithPosition:toChangePosition];
    [self addNotificationToCaptureDevice:toChangeDevice];
    // obtain the new device input object
    AVCaptureDeviceInput *toChangeDeviceInput=[[AVCaptureDeviceInput alloc]initWithDevice:toChangeDevice error:nil];

    // always begin a configuration before changing the session, and commit the change when done
    [self.captureSession beginConfiguration];
    // remove the existing input
    [self.captureSession removeInput:self.captureDeviceInput];
    // add the new input
    if ([self.captureSession canAddInput:toChangeDeviceInput]) {
        [self.captureSession addInput:toChangeDeviceInput];
        self.captureDeviceInput=toChangeDeviceInput;
    }
    // commit the session configuration
    [self.captureSession commitConfiguration];
}

 

#pragma mark - movie file output delegate
-(void)captureOutput:(AVCaptureFileOutput *)captureOutput didStartRecordingToOutputFileAtURL:(NSURL *)fileURL fromConnections:(NSArray *)connections{
    NSLog(@"Recording started...");
}
-(void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error{
    NSLog(@"Recording finished.");
    // once recording is done, save the video to the photo album in the background
    self.enableRotation=YES;
    UIBackgroundTaskIdentifier lastBackgroundTaskIdentifier=self.backgroundTaskIdentifier;
    self.backgroundTaskIdentifier=UIBackgroundTaskInvalid;
    ALAssetsLibrary *assetsLibrary=[[ALAssetsLibrary alloc]init];
    [assetsLibrary writeVideoAtPathToSavedPhotosAlbum:outputFileURL completionBlock:^(NSURL *assetURL, NSError *error) {
        if (error) {
            NSLog(@"Error while saving the video to the album: %@",error.localizedDescription);
        }else{
            NSLog(@"Video saved to the album successfully.");
        }
        if (lastBackgroundTaskIdentifier!=UIBackgroundTaskInvalid) {
            [[UIApplication sharedApplication] endBackgroundTask:lastBackgroundTaskIdentifier];
        }
    }];
}

 

#pragma mark - Notifications

/**
 *  Add notifications for the input device
 */
- (void)addNotificationToCaptureDevice:(AVCaptureDevice *)captureDevice {
    // Note: the device must first be allowed to monitor subject-area changes
    // before subscribing to the subject-area-change notification
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        captureDevice.subjectAreaChangeMonitoringEnabled = YES;
    }];
    NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
    // Subject area changed
    [notificationCenter addObserver:self selector:@selector(areaChange:) name:AVCaptureDeviceSubjectAreaDidChangeNotification object:captureDevice];
}

- (void)removeNotificationFromCaptureDevice:(AVCaptureDevice *)captureDevice {
    NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
    [notificationCenter removeObserver:self name:AVCaptureDeviceSubjectAreaDidChangeNotification object:captureDevice];
}

/**
 *  Remove all notification observers
 */
- (void)removeNotification {
    NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
    [notificationCenter removeObserver:self];
}

- (void)addNotificationToCaptureSession:(AVCaptureSession *)captureSession {
    NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
    // Session runtime error
    [notificationCenter addObserver:self selector:@selector(sessionRuntimeError:) name:AVCaptureSessionRuntimeErrorNotification object:captureSession];
}

 

/**
 *  Device connected
 *  @param notification notification object
 */
- (void)deviceConnected:(NSNotification *)notification {
    NSLog(@"Device connected...");
}

/**
 *  Device disconnected
 *  @param notification notification object
 */
- (void)deviceDisconnected:(NSNotification *)notification {
    NSLog(@"Device disconnected.");
}

/**
 *  Subject area changed
 *  @param notification notification object
 */
- (void)areaChange:(NSNotification *)notification {
    NSLog(@"Subject area changed...");
}

/**
 *  Session runtime error
 *  @param notification notification object
 */
- (void)sessionRuntimeError:(NSNotification *)notification {
    NSLog(@"A session runtime error occurred.");
}

 

#pragma mark - Private methods

/**
 *  Get the camera at the specified position
 *  @param position camera position
 *  @return camera device
 */
- (AVCaptureDevice *)getCameraDeviceWithPosition:(AVCaptureDevicePosition)position {
    NSArray *cameras = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *camera in cameras) {
        if ([camera position] == position) {
            return camera;
        }
    }
    return nil;
}

 

/**
 *  Unified helper for changing device properties
 *  @param propertyChange block that performs the property change
 */
- (void)changeDeviceProperty:(PropertyChangeBlock)propertyChange {
    AVCaptureDevice *captureDevice = [self.captureDeviceInput device];
    NSError *error;
    // Note: always call lockForConfiguration: before changing device properties,
    // and call unlockForConfiguration when finished
    if ([captureDevice lockForConfiguration:&error]) {
        propertyChange(captureDevice);
        [captureDevice unlockForConfiguration];
    } else {
        NSLog(@"Error while setting device properties: %@", error.localizedDescription);
    }
}

 

/**
 *  Set the flash mode
 *  @param flashMode flash mode
 */
- (void)setFlashMode:(AVCaptureFlashMode)flashMode {
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        if ([captureDevice isFlashModeSupported:flashMode]) {
            [captureDevice setFlashMode:flashMode];
        }
    }];
}

/**
 *  Set the focus mode
 *  @param focusMode focus mode
 */
- (void)setFocusMode:(AVCaptureFocusMode)focusMode {
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        if ([captureDevice isFocusModeSupported:focusMode]) {
            [captureDevice setFocusMode:focusMode];
        }
    }];
}

/**
 *  Set the exposure mode
 *  @param exposureMode exposure mode
 */
- (void)setExposureMode:(AVCaptureExposureMode)exposureMode {
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        if ([captureDevice isExposureModeSupported:exposureMode]) {
            [captureDevice setExposureMode:exposureMode];
        }
    }];
}

/**
 *  Set the focus and exposure point
 *  @param point point of interest
 */
- (void)focusWithMode:(AVCaptureFocusMode)focusMode exposureMode:(AVCaptureExposureMode)exposureMode atPoint:(CGPoint)point {
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        if ([captureDevice isFocusModeSupported:focusMode]) {
            [captureDevice setFocusMode:focusMode];
        }
        if ([captureDevice isFocusPointOfInterestSupported]) {
            [captureDevice setFocusPointOfInterest:point];
        }
        if ([captureDevice isExposureModeSupported:exposureMode]) {
            [captureDevice setExposureMode:exposureMode];
        }
        if ([captureDevice isExposurePointOfInterestSupported]) {
            [captureDevice setExposurePointOfInterest:point];
        }
    }];
}

 

/**
 *  Add a tap gesture; tapping focuses at the tap point
 */
- (void)addGenstureRecognizer {
    UITapGestureRecognizer *tapGesture = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapScreen:)];
    [self.viewContainer addGestureRecognizer:tapGesture];
}

- (void)tapScreen:(UITapGestureRecognizer *)tapGesture {
    CGPoint point = [tapGesture locationInView:self.viewContainer];
    // Convert the UI coordinate into a camera coordinate
    CGPoint cameraPoint = [self.captureVideoPreviewLayer captureDevicePointOfInterestForPoint:point];
    [self setFocusCursorWithPoint:point];
    [self focusWithMode:AVCaptureFocusModeAutoFocus exposureMode:AVCaptureExposureModeAutoExpose atPoint:cameraPoint];
}

 

/**
 *  Position the focus cursor
 *  @param point cursor position
 */
- (void)setFocusCursorWithPoint:(CGPoint)point {
    self.focusCursor.center = point;
    self.focusCursor.transform = CGAffineTransformMakeScale(1.5, 1.5);
    self.focusCursor.alpha = 1.0;
    [UIView animateWithDuration:1.0 animations:^{
        self.focusCursor.transform = CGAffineTransformIdentity;
    } completion:^(BOOL finished) {
        self.focusCursor.alpha = 0;
    }];
}

@end

Result:

 

Summary

The previous sections spent considerable space on audio and video playback and recording in iOS. In some places we used the ready-made player and recorder classes directly; in others we called the underlying system services and wrapped them ourselves. As noted at the start of this article, iOS multimedia support is remarkably flexible and complete, so how should you choose during development? The table below briefly compares the strengths and weaknesses of each technology.
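As a point of contrast with the hand-rolled AVFoundation recorder above, the most fully packaged option in that comparison takes only a few lines. This is a minimal sketch, assuming it runs inside a UIViewController, that the project links MediaPlayer.framework, and that the hypothetical `url` points to a playable local or network media file:

```objectivec
#import <MediaPlayer/MediaPlayer.h>

// Minimal sketch: present a self-contained, full-screen player.
// `url` is assumed to be a playable local or network media URL.
- (void)playVideoWithURL:(NSURL *)url {
    MPMoviePlayerViewController *playerViewController =
        [[MPMoviePlayerViewController alloc] initWithContentURL:url];
    // One-line presentation; the player provides its own controls
    // and dismisses itself when playback finishes.
    [self presentMoviePlayerViewControllerAnimated:playerViewController];
}
```

You trade all the configurability of AVFoundation for this brevity: there is no access to the capture pipeline, device properties, or custom UI.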

Tip: from this article onward, later posts may gradually switch to storyboards and xibs, for these reasons: 1. Apple currently recommends storyboards as the primary approach; 2. much of the upcoming material on screen adaptation is done in storyboards (pure code can achieve the same, but its Auto Layout support is less convenient); 3. after the preceding series of articles you should already have solid experience with pure-code UI programming (which genuinely helps beginners understand how a program runs).

 

   
