If this page doesn't display well on mobile, please visit my personal blog
https://github.com/imwcl/WCLRecordVideo
The system-provided video recording UI clearly can't satisfy designers and project managers, so custom video recording becomes essential. Below I'll walk you through building your own video recording interface.
Introduction
Custom video recording relies mainly on the AVFoundation and CoreMedia frameworks, covering video input, output, and file reading/writing. Here are the classes we will use:
- AVCaptureSession
- AVCaptureVideoPreviewLayer
- AVCaptureDeviceInput
- AVCaptureConnection
- AVCaptureVideoDataOutput
- AVCaptureAudioDataOutput
- AVAssetWriter
- AVAssetWriterInput
Below is a detailed introduction to each class, along with its implementation.
AVCaptureSession
AVCaptureSession is the central hub of AVFoundation's capture classes, so we start with it. To capture video, a client instantiates an AVCaptureSession and adds the appropriate inputs (such as AVCaptureDeviceInput) and outputs (such as AVCaptureMovieFileOutput). Calling [AVCaptureSession startRunning] starts the flow of data from the inputs to the outputs, and [AVCaptureSession stopRunning] stops it. Clients can customize the recording quality level or output bitrate via the sessionPreset property.
- (AVCaptureSession *)recordSession {
    if (_recordSession == nil) {
        _recordSession = [[AVCaptureSession alloc] init];
        // Add the back camera as the video input
        if ([_recordSession canAddInput:self.backCameraInput]) {
            [_recordSession addInput:self.backCameraInput];
        }
        // Add the microphone as the audio input
        if ([_recordSession canAddInput:self.audioMicInput]) {
            [_recordSession addInput:self.audioMicInput];
        }
        // Add the video data output and record the actual frame size;
        // width and height are swapped because we record in portrait
        if ([_recordSession canAddOutput:self.videoOutput]) {
            [_recordSession addOutput:self.videoOutput];
            NSDictionary *actual = self.videoOutput.videoSettings;
            _cx = [[actual objectForKey:@"Height"] integerValue];
            _cy = [[actual objectForKey:@"Width"] integerValue];
        }
        // Add the audio data output
        if ([_recordSession canAddOutput:self.audioOutput]) {
            [_recordSession addOutput:self.audioOutput];
        }
        // Record in portrait orientation
        self.videoConnection.videoOrientation = AVCaptureVideoOrientationPortrait;
    }
    return _recordSession;
}
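As a usage sketch: once the session is configured, you start and stop the data flow with startRunning/stopRunning. startRunning is a blocking call, so it is best invoked off the main thread. The startCapture/stopCapture method names below are illustrative, not from the demo project:

```objective-c
// Hypothetical start/stop helpers around the recordSession getter above.
- (void)startCapture {
    dispatch_async(dispatch_get_global_queue(QOS_CLASS_USER_INITIATED, 0), ^{
        if (!self.recordSession.isRunning) {
            [self.recordSession startRunning]; // begins the input -> output data flow
        }
    });
}

- (void)stopCapture {
    dispatch_async(dispatch_get_global_queue(QOS_CLASS_USER_INITIATED, 0), ^{
        if (self.recordSession.isRunning) {
            [self.recordSession stopRunning]; // stops the data flow
        }
    });
}
```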
AVCaptureDevice
Each instance of AVCaptureDevice corresponds to a physical device, such as a camera or a microphone. AVCaptureDevice instances cannot be created directly; the available devices are obtained with the class methods devicesWithMediaType: and defaultDeviceWithMediaType:, and a device may provide one or more streams of a given media type. An AVCaptureDevice instance is used to create an AVCaptureDeviceInput, which is then handed to the AVCaptureSession as an input source.
- (AVCaptureDevice *)frontCamera {
    return [self cameraWithPosition:AVCaptureDevicePositionFront];
}

- (AVCaptureDevice *)backCamera {
    return [self cameraWithPosition:AVCaptureDevicePositionBack];
}

// Find the camera at the given position (front or back)
- (AVCaptureDevice *)cameraWithPosition:(AVCaptureDevicePosition)position {
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices) {
        if ([device position] == position) {
            return device;
        }
    }
    return nil;
}

// Turn the torch on; the device must be locked before configuration changes
- (void)openFlashLight {
    AVCaptureDevice *backCamera = [self backCamera];
    if (backCamera.torchMode == AVCaptureTorchModeOff) {
        [backCamera lockForConfiguration:nil];
        backCamera.torchMode = AVCaptureTorchModeOn;
        backCamera.flashMode = AVCaptureFlashModeOn;
        [backCamera unlockForConfiguration];
    }
}

// Turn the torch off
- (void)closeFlashLight {
    AVCaptureDevice *backCamera = [self backCamera];
    if (backCamera.torchMode == AVCaptureTorchModeOn) {
        [backCamera lockForConfiguration:nil];
        backCamera.torchMode = AVCaptureTorchModeOff;
        backCamera.flashMode = AVCaptureFlashModeOff;
        [backCamera unlockForConfiguration];
    }
}
AVCaptureDeviceInput
AVCaptureDeviceInput is an input source for AVCaptureSession that feeds media data from a device into the session. It is created from an AVCaptureDevice instance — in our case the front and back cameras, which are found via [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo].
- (AVCaptureDeviceInput *)backCameraInput {
    if (_backCameraInput == nil) {
        NSError *error;
        _backCameraInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self backCamera] error:&error];
        if (error) {
            [SVProgressHUD showErrorWithStatus:@"Failed to get the back camera"];
        }
    }
    return _backCameraInput;
}

- (AVCaptureDeviceInput *)frontCameraInput {
    if (_frontCameraInput == nil) {
        NSError *error;
        _frontCameraInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self frontCamera] error:&error];
        if (error) {
            [SVProgressHUD showErrorWithStatus:@"Failed to get the front camera"];
        }
    }
    return _frontCameraInput;
}
AVCaptureVideoPreviewLayer
AVCaptureVideoPreviewLayer is a CoreAnimation layer subclass used to preview the video output of an AVCaptureSession — in short, the layer on which the footage being captured is displayed.
- (AVCaptureVideoPreviewLayer *)previewLayer {
    if (_previewLayer == nil) {
        AVCaptureVideoPreviewLayer *preview = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.recordSession];
        preview.videoGravity = AVLayerVideoGravityResizeAspectFill;
        _previewLayer = preview;
    }
    return _previewLayer;
}
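To actually show the preview, the layer is added to a view's layer tree. A minimal sketch, assuming a hypothetical `recorder` property holding an instance of the recorder class that exposes the previewLayer getter above:

```objective-c
// Illustrative only: attach the preview layer in a view controller.
- (void)viewDidLoad {
    [super viewDidLoad];
    AVCaptureVideoPreviewLayer *layer = self.recorder.previewLayer;
    layer.frame = self.view.bounds;
    // Insert at index 0 so record buttons and other controls stay on top
    [self.view.layer insertSublayer:layer atIndex:0];
}
```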
AVCaptureMovieFileOutput
AVCaptureMovieFileOutput is a subclass of AVCaptureFileOutput used to write QuickTime movie files. Because on iPhone this class cannot pause a recording and cannot choose the output file type, it is not used here; instead, the more flexible AVCaptureVideoDataOutput and AVCaptureAudioDataOutput are used to implement the recording.
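For contrast, the simpler route we are deliberately not taking would look roughly like this — a sketch, assuming `session` is a configured AVCaptureSession, `outputURL` is a file URL of your choosing, and `self` conforms to AVCaptureFileOutputRecordingDelegate:

```objective-c
// Sketch of the AVCaptureMovieFileOutput route this post avoids.
AVCaptureMovieFileOutput *movieOutput = [[AVCaptureMovieFileOutput alloc] init];
if ([session canAddOutput:movieOutput]) {
    [session addOutput:movieOutput];
}
// Writes a QuickTime file directly; no pause support, no custom container.
[movieOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];
// ... later:
[movieOutput stopRecording];
```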
AVCaptureVideoDataOutput
AVCaptureVideoDataOutput is a subclass of AVCaptureOutput that outputs the captured video frames, compressed or uncompressed. The frames it produces can be processed by any API suited to video frames; the application receives each frame through the captureOutput:didOutputSampleBuffer:fromConnection: delegate method.
- (AVCaptureVideoDataOutput *)videoOutput {
    if (_videoOutput == nil) {
        _videoOutput = [[AVCaptureVideoDataOutput alloc] init];
        [_videoOutput setSampleBufferDelegate:self queue:self.captureQueue];
        // Ask for biplanar YUV frames (video range)
        NSDictionary *setcapSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                        [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange],
                                        (id)kCVPixelBufferPixelFormatTypeKey,
                                        nil];
        _videoOutput.videoSettings = setcapSettings;
    }
    return _videoOutput;
}
AVCaptureAudioDataOutput
AVCaptureAudioDataOutput is a subclass of AVCaptureOutput that outputs the captured audio samples, compressed or uncompressed. The samples it produces can be processed by any API suited to audio data; the application receives the audio data through the same captureOutput:didOutputSampleBuffer:fromConnection: delegate method.
- (AVCaptureAudioDataOutput *)audioOutput {
    if (_audioOutput == nil) {
        _audioOutput = [[AVCaptureAudioDataOutput alloc] init];
        [_audioOutput setSampleBufferDelegate:self queue:self.captureQueue];
    }
    return _audioOutput;
}
AVCaptureConnection
An AVCaptureConnection represents the connection between one or more AVCaptureInputPorts and a single AVCaptureOutput or AVCaptureVideoPreviewLayer within an AVCaptureSession.
- (AVCaptureConnection *)videoConnection {
    _videoConnection = [self.videoOutput connectionWithMediaType:AVMediaTypeVideo];
    return _videoConnection;
}

- (AVCaptureConnection *)audioConnection {
    if (_audioConnection == nil) {
        _audioConnection = [self.audioOutput connectionWithMediaType:AVMediaTypeAudio];
    }
    return _audioConnection;
}
AVAssetWriter
AVAssetWriter provides services for writing media data to a new file. An AVAssetWriter instance specifies the container format of the output, such as the QuickTime movie format or the MPEG-4 file format. It can interleave multiple parallel tracks of media data — here a video track and an audio track, introduced below. A single AVAssetWriter instance can only be used to write a single file; clients that want to write multiple files must create a new AVAssetWriter instance each time.
- (instancetype)initPath:(NSString *)path Height:(NSInteger)cy width:(NSInteger)cx channels:(int)ch samples:(Float64)rate {
    self = [super init];
    if (self) {
        self.path = path;
        // Remove any stale file at the target path first
        [[NSFileManager defaultManager] removeItemAtPath:self.path error:nil];
        NSURL *url = [NSURL fileURLWithPath:self.path];
        // Write an MPEG-4 file, optimized for network playback
        _writer = [AVAssetWriter assetWriterWithURL:url fileType:AVFileTypeMPEG4 error:nil];
        _writer.shouldOptimizeForNetworkUse = YES;
        [self initVideoInputHeight:cy width:cx];
        // The audio track is only added once the audio format is known
        if (rate != 0 && ch != 0) {
            [self initAudioInputChannels:ch samples:rate];
        }
    }
    return self;
}
AVAssetWriterInput
An AVAssetWriterInput appends media samples, packaged as CMSampleBuffer instances, to a single track of the AVAssetWriter's output file. When there are multiple inputs, the AVAssetWriter tries to write their media data in the interleaving pattern that is ideal for storage and playback efficiency. Whether an input can currently accept media data is indicated by its readyForMoreMediaData property: when it is YES, the input can accept data, and data may only be appended to the input at that time.
// Configure the H.264 video track
- (void)initVideoInputHeight:(NSInteger)cy width:(NSInteger)cx {
    NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
                              AVVideoCodecH264, AVVideoCodecKey,
                              [NSNumber numberWithInteger:cx], AVVideoWidthKey,
                              [NSNumber numberWithInteger:cy], AVVideoHeightKey,
                              nil];
    _videoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:settings];
    _videoInput.expectsMediaDataInRealTime = YES;
    [_writer addInput:_videoInput];
}

// Configure the AAC audio track
- (void)initAudioInputChannels:(int)ch samples:(Float64)rate {
    NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
                              [NSNumber numberWithInt:kAudioFormatMPEG4AAC], AVFormatIDKey,
                              [NSNumber numberWithInt:ch], AVNumberOfChannelsKey,
                              [NSNumber numberWithFloat:rate], AVSampleRateKey,
                              [NSNumber numberWithInt:128000], AVEncoderBitRateKey,
                              nil];
    _audioInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:settings];
    _audioInput.expectsMediaDataInRealTime = YES;
    [_writer addInput:_audioInput];
}
The classes and configuration above are everything needed before recording starts. Next comes how to handle the captured data and how the file writing is done.
Writing the data
#pragma mark - Writing data
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    BOOL isVideo = YES;
    @synchronized(self) {
        if (!self.isCapturing || self.isPaused) {
            return;
        }
        if (captureOutput != self.videoOutput) {
            isVideo = NO;
        }
        // Lazily create the encoder when the first audio buffer arrives,
        // so the audio format (sample rate, channel count) is known
        if ((self.recordEncoder == nil) && !isVideo) {
            CMFormatDescriptionRef fmt = CMSampleBufferGetFormatDescription(sampleBuffer);
            [self setAudioFormat:fmt];
            NSString *videoName = [NSString getUploadFile_type:@"video" fileType:@"mp4"];
            self.videoPath = [[self getVideoCachePath] stringByAppendingPathComponent:videoName];
            self.recordEncoder = [WCLRecordEncoder encoderForPath:self.videoPath Height:_cy width:_cx channels:_channels samples:_samplerate];
        }
        // After resuming from a pause, compute the time offset so the gap is cut out
        if (self.discont) {
            if (isVideo) {
                return; // wait for an audio buffer to re-sync
            }
            self.discont = NO;
            CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
            CMTime last = isVideo ? _lastVideo : _lastAudio;
            if (last.flags & kCMTimeFlags_Valid) {
                if (_timeOffset.flags & kCMTimeFlags_Valid) {
                    pts = CMTimeSubtract(pts, _timeOffset);
                }
                CMTime offset = CMTimeSubtract(pts, last);
                // Accumulate the offsets of successive pauses
                if (_timeOffset.value == 0) {
                    _timeOffset = offset;
                } else {
                    _timeOffset = CMTimeAdd(_timeOffset, offset);
                }
            }
            _lastVideo.flags = 0;
            _lastAudio.flags = 0;
        }
        // Shift the buffer's timestamps back by the accumulated pause offset
        CFRetain(sampleBuffer);
        if (_timeOffset.value > 0) {
            CFRelease(sampleBuffer);
            sampleBuffer = [self adjustTime:sampleBuffer by:_timeOffset];
        }
        // Remember the end time of this buffer
        CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
        CMTime dur = CMSampleBufferGetDuration(sampleBuffer);
        if (dur.value > 0) {
            pts = CMTimeAdd(pts, dur);
        }
        if (isVideo) {
            _lastVideo = pts;
        } else {
            _lastAudio = pts;
        }
    }
    // Track elapsed recording time and report progress to the delegate
    CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    if (self.startTime.value == 0) {
        self.startTime = pts;
    }
    CMTime sub = CMTimeSubtract(pts, self.startTime);
    self.currentRecordTime = CMTimeGetSeconds(sub);
    if (self.currentRecordTime > self.maxRecordTime) {
        // Deliver one final progress callback just past the limit, then drop frames
        if (self.currentRecordTime - self.maxRecordTime < 0.1) {
            if ([self.delegate respondsToSelector:@selector(recordProgress:)]) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    [self.delegate recordProgress:self.currentRecordTime / self.maxRecordTime];
                });
            }
        }
        CFRelease(sampleBuffer); // balance the CFRetain above before dropping the frame
        return;
    }
    if ([self.delegate respondsToSelector:@selector(recordProgress:)]) {
        dispatch_async(dispatch_get_main_queue(), ^{
            [self.delegate recordProgress:self.currentRecordTime / self.maxRecordTime];
        });
    }
    // Hand the buffer to the encoder, then balance the CFRetain above
    [self.recordEncoder encodeFrame:sampleBuffer isVideo:isVideo];
    CFRelease(sampleBuffer);
}
// Read the sample rate and channel count from the audio format description
- (void)setAudioFormat:(CMFormatDescriptionRef)fmt {
    const AudioStreamBasicDescription *asbd = CMAudioFormatDescriptionGetStreamBasicDescription(fmt);
    _samplerate = asbd->mSampleRate;
    _channels = asbd->mChannelsPerFrame;
}

// Return a copy of the sample buffer with its timestamps shifted back by offset
- (CMSampleBufferRef)adjustTime:(CMSampleBufferRef)sample by:(CMTime)offset {
    CMItemCount count;
    CMSampleBufferGetSampleTimingInfoArray(sample, 0, nil, &count);
    CMSampleTimingInfo *pInfo = malloc(sizeof(CMSampleTimingInfo) * count);
    CMSampleBufferGetSampleTimingInfoArray(sample, count, pInfo, &count);
    for (CMItemCount i = 0; i < count; i++) {
        pInfo[i].decodeTimeStamp = CMTimeSubtract(pInfo[i].decodeTimeStamp, offset);
        pInfo[i].presentationTimeStamp = CMTimeSubtract(pInfo[i].presentationTimeStamp, offset);
    }
    CMSampleBufferRef sout;
    CMSampleBufferCreateCopyWithNewTiming(nil, sample, count, pInfo, &sout);
    free(pInfo);
    return sout;
}
// Append a sample buffer to the writer; returns whether it was written
- (BOOL)encodeFrame:(CMSampleBufferRef)sampleBuffer isVideo:(BOOL)isVideo {
    if (CMSampleBufferDataIsReady(sampleBuffer)) {
        // Start the writing session at the first video frame's timestamp
        if (_writer.status == AVAssetWriterStatusUnknown && isVideo) {
            CMTime startTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
            [_writer startWriting];
            [_writer startSessionAtSourceTime:startTime];
        }
        if (_writer.status == AVAssetWriterStatusFailed) {
            NSLog(@"writer error %@", _writer.error.localizedDescription);
            return NO;
        }
        if (isVideo) {
            if (_videoInput.readyForMoreMediaData) {
                [_videoInput appendSampleBuffer:sampleBuffer];
                return YES;
            }
        } else {
            if (_audioInput.readyForMoreMediaData) {
                [_audioInput appendSampleBuffer:sampleBuffer];
                return YES;
            }
        }
    }
    return NO;
}
Finishing the recording and saving to the photo library
- (void)stopCaptureHandler:(void (^)(UIImage *movieImage))handler {
    @synchronized(self) {
        if (self.isCapturing) {
            NSString *path = self.recordEncoder.path;
            NSURL *url = [NSURL fileURLWithPath:path];
            self.isCapturing = NO;
            dispatch_async(_captureQueue, ^{
                [self.recordEncoder finishWithCompletionHandler:^{
                    self.recordEncoder = nil;
                    // Save the finished file to the photo library
                    [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
                        [PHAssetChangeRequest creationRequestForAssetFromVideoAtFileURL:url];
                    } completionHandler:^(BOOL success, NSError * _Nullable error) {
                        if (success) {
                            NSLog(@"saved to the photo library");
                        } else {
                            NSLog(@"save failed: %@", error);
                        }
                    }];
                    [self movieToImageHandler:handler];
                }];
            });
        }
    }
}

// Grab the first frame of the recorded movie as a thumbnail
- (void)movieToImageHandler:(void (^)(UIImage *movieImage))handler {
    NSURL *url = [NSURL fileURLWithPath:self.videoPath];
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];
    AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    generator.appliesPreferredTrackTransform = TRUE;
    CMTime thumbTime = CMTimeMakeWithSeconds(0, 60);
    generator.apertureMode = AVAssetImageGeneratorApertureModeEncodedPixels;
    AVAssetImageGeneratorCompletionHandler generatorHandler =
        ^(CMTime requestedTime, CGImageRef im, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *error) {
            if (result == AVAssetImageGeneratorSucceeded) {
                UIImage *thumbImg = [UIImage imageWithCGImage:im];
                if (handler) {
                    dispatch_async(dispatch_get_main_queue(), ^{
                        handler(thumbImg);
                    });
                }
            }
        };
    [generator generateCGImagesAsynchronouslyForTimes:
        [NSArray arrayWithObject:[NSValue valueWithCMTime:thumbTime]] completionHandler:generatorHandler];
}

// Finish writing and invoke the handler once the file is complete
- (void)finishWithCompletionHandler:(void (^)(void))handler {
    [_writer finishWritingWithCompletionHandler:handler];
}
That's all for this post. If you have any questions, feel free to ask. A demo project is attached — take a look to see exactly how everything is used, and if it helps you, please leave a star. Thanks for reading!