I've spent the past few days digging into the AVFoundation framework; these are my notes for future reference.
1. Playback
First create an AVAsset for the media resource, then wrap it in an AVPlayerItem for an AVPlayer to play.
To display the video in a view, create an AVPlayerLayer and add it as a sublayer of the view's layer.
// STATUS_KEYPATH and PlayerItemStatusContext are assumed to be defined elsewhere, e.g.:
// #define STATUS_KEYPATH @"status"
// static const NSString *PlayerItemStatusContext;
NSURL *url = [[NSBundle mainBundle] URLForResource:@"video" withExtension:@"m4v"];
_asset = [AVAsset assetWithURL:url];
_playerItem = [AVPlayerItem playerItemWithAsset:_asset];
[_playerItem addObserver:self
              forKeyPath:STATUS_KEYPATH
                 options:0
                 context:&PlayerItemStatusContext];
_player = [AVPlayer playerWithPlayerItem:_playerItem];

AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:_player];
// playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
playerLayer.frame = [UIScreen mainScreen].bounds;
playerLayer.backgroundColor = [[UIColor whiteColor] CGColor];
[self.view.layer addSublayer:playerLayer];
Before starting playback, observe the item's status via KVO and only call play once the item is ready:
- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context {
    if (context == &PlayerItemStatusContext) {
        dispatch_async(dispatch_get_main_queue(), ^{
            [self.playerItem removeObserver:self forKeyPath:STATUS_KEYPATH];
            if (self.playerItem.status == AVPlayerItemStatusReadyToPlay) {
                [self.player play];
            } else {
                UIAlertView *alertView = [[UIAlertView alloc] initWithTitle:@"Error"
                                                                    message:@"Failed to load video"
                                                                   delegate:nil
                                                          cancelButtonTitle:@"OK"
                                                          otherButtonTitles:nil];
                [alertView show];
            }
        });
    }
}
2. Capturing Video Frames
To capture frames from a video, use AVAssetImageGenerator. First build an array of CMTime values representing the timestamps to capture. generateCGImagesAsynchronouslyForTimes: invokes its completion block once per requested time, so here I keep a counter, imageCount, and decrement it on each callback; when it reaches 0, every frame has been captured.
self.imageGenerator = [AVAssetImageGenerator assetImageGeneratorWithAsset:self.asset];
// Constrain the width to 200pt; a height of 0 preserves the aspect ratio
self.imageGenerator.maximumSize = CGSizeMake(200.0f, 0.0f);

CMTime duration = self.asset.duration;
NSMutableArray *times = [NSMutableArray array];
CMTimeValue increment = duration.value / 20;
CMTimeValue currentValue = 2.0 * duration.timescale; // start at the 2-second mark
while (currentValue <= duration.value) {
    CMTime time = CMTimeMake(currentValue, duration.timescale);
    [times addObject:[NSValue valueWithCMTime:time]];
    currentValue += increment;
}

__block NSUInteger imageCount = times.count;
NSMutableArray *images = [NSMutableArray array];

AVAssetImageGeneratorCompletionHandler handler =
    ^(CMTime requestedTime,
      CGImageRef imageRef,
      CMTime actualTime,
      AVAssetImageGeneratorResult result,
      NSError *error) {
        if (result == AVAssetImageGeneratorSucceeded) {
            UIImage *image = [UIImage imageWithCGImage:imageRef];
            // THThumbnail is a simple model object pairing an image with its time
            id thumbnail = [THThumbnail thumbnailWithImage:image time:actualTime];
            [images addObject:thumbnail];
        } else {
            NSLog(@"Error: %@", [error localizedDescription]);
        }
        // When the decremented count hits 0, all captures are done.
        if (--imageCount == 0) {
            dispatch_async(dispatch_get_main_queue(), ^{
                [self setImage:images];
            });
        }
    };
[self.imageGenerator generateCGImagesAsynchronouslyForTimes:times
                                          completionHandler:handler];
If you only need a single frame, such as a cover image, copyCGImageAtTime:actualTime:error: is simpler:
AVAssetImageGenerator *assetImageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
assetImageGenerator.appliesPreferredTrackTransform = YES;
assetImageGenerator.apertureMode = AVAssetImageGeneratorApertureModeEncodedPixels;

NSError *thumbnailImageGenerationError = nil;
CMTime time = CMTimeMakeWithSeconds(0.0, 600);
CGImageRef thumbnailImageRef = [assetImageGenerator copyCGImageAtTime:time
                                                           actualTime:NULL
                                                                error:&thumbnailImageGenerationError];
if (!thumbnailImageRef) {
    NSLog(@"thumbnailImageGenerationError %@", thumbnailImageGenerationError);
}
UIImage *thumbnailImage = thumbnailImageRef ? [[UIImage alloc] initWithCGImage:thumbnailImageRef] : nil;
CGImageRelease(thumbnailImageRef);
3. Trimming and Exporting a Clip
When trimming a video we usually build a trimming UI for selecting the range, and let the video play back only within the selected range. AVPlayerItem's reversePlaybackEndTime and forwardPlaybackEndTime can both constrain playback and record the selected range:
reversePlaybackEndTime
The point at which playback ends when playing in reverse.
forwardPlaybackEndTime
The point at which playback ends when playing forward.
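As a minimal sketch of how the range might be set (the concrete startTime/endTime values below are hypothetical stand-ins for whatever the trimming UI selects):

```objc
// Hypothetical in/out points; in practice these come from the trimming UI.
CMTime startTime = CMTimeMakeWithSeconds(2.0, 600);
CMTime endTime   = CMTimeMakeWithSeconds(8.0, 600);

self.playerItem.reversePlaybackEndTime = startTime; // reverse playback stops here
self.playerItem.forwardPlaybackEndTime = endTime;   // forward playback stops here

// Move the playhead to the start of the range and preview it.
[self.player seekToTime:startTime completionHandler:^(BOOL finished) {
    [self.player play];
}];
```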
When playback reaches the end point (by default, the end of the video), the AVPlayerItemDidPlayToEndTimeNotification notification is posted, so we can observe it to add custom behavior (such as loop playback):
- (void)addItemEndObserverForPlayerItem {
    NSOperationQueue *queue = [NSOperationQueue mainQueue];
    __weak ViewController *weakSelf = self;
    void (^callback)(NSNotification *note) = ^(NSNotification *notification) {
        // Seek back to the start of the selected range, then resume playback
        [weakSelf.playerItem seekToTime:weakSelf.playerItem.reversePlaybackEndTime
                      completionHandler:^(BOOL finished) {
            [weakSelf.player play];
        }];
    };
    // Note: keep the returned token if the observer ever needs to be removed
    [[NSNotificationCenter defaultCenter] addObserverForName:AVPlayerItemDidPlayToEndTimeNotification
                                                      object:self.playerItem
                                                       queue:queue
                                                  usingBlock:callback];
}
When playback reaches the end, seekToTime: moves the playhead back to reversePlaybackEndTime, so the video loops within the selected range.
Finally, export the clip with AVAssetExportSession, using exportAsynchronouslyWithCompletionHandler: to run the export:
[self.player pause]; // pause playback first
self.exportSession = [[AVAssetExportSession alloc] initWithAsset:self.asset
                                                      presetName:AVAssetExportPresetMediumQuality]; // encoding quality
self.exportSession.outputFileType = AVFileTypeMPEG4; // match the .mp4 extension below
self.exportSession.shouldOptimizeForNetworkUse = YES;

CMTime startTime = self.playerItem.reversePlaybackEndTime;
CMTime endTime = self.playerItem.forwardPlaybackEndTime;
// CMTimeRangeMake takes a start and a *duration*, so build the range from two times
// and remember to assign it to the session
self.exportSession.timeRange = CMTimeRangeFromTimeToTime(startTime, endTime);

// Name the file with a timestamp to avoid collisions; alternatively, check whether
// the file already exists and delete it before regenerating.
NSDateFormatter *formater = [[NSDateFormatter alloc] init];
[formater setDateFormat:@"yyyy-MM-dd-HH:mm:ss"];
NSString *documentsDirPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
NSString *resultPath = [documentsDirPath stringByAppendingPathComponent:
    [NSString stringWithFormat:@"output-%@.mp4", [formater stringFromDate:[NSDate date]]]];
NSLog(@"resultPath = %@", resultPath);
self.exportSession.outputURL = [NSURL fileURLWithPath:resultPath]; // output path

__weak ViewController *weakSelf = self;
[self.exportSession exportAsynchronouslyWithCompletionHandler:^{
    AVAssetExportSessionStatus status = weakSelf.exportSession.status;
    if (status == AVAssetExportSessionStatusCompleted) {
        NSLog(@"AVAssetExportSessionStatusCompleted");
    } else {
        NSLog(@"AVAssetExportSessionStatusFailed: %@", weakSelf.exportSession.error);
    }
}];
PS: Metadata
AVAsset exposes many properties describing the resource (duration, title, creation date, and so on). When an asset is created, it is only a reference to the underlying media file; its properties are lazily loaded and not populated until they are requested.
Load metadata with loadValuesAsynchronouslyForKeys:completionHandler:, and check a property's load state with statusOfValueForKey:error:.
For example, to query the asset's title:
__weak ViewController *weakSelf = self;
[self.asset loadValuesAsynchronouslyForKeys:@[@"commonMetadata"] completionHandler:^{
    AVKeyValueStatus status =
        [weakSelf.asset statusOfValueForKey:@"commonMetadata" error:nil];
    if (status == AVKeyValueStatusLoaded) {
        NSArray *items =
            [AVMetadataItem metadataItemsFromArray:weakSelf.asset.commonMetadata
                                           withKey:AVMetadataCommonKeyTitle
                                          keySpace:AVMetadataKeySpaceCommon];
        if (items.count > 0) {
            AVMetadataItem *titleItem = [items firstObject];
            NSLog(@"%@", titleItem.value);
        }
    }
}];