Background
This project is an HTTP-MP4 video-on-demand player built on top of ijkplayer. The business UI layer is Flutter, and video is rendered through Flutter's Texture mechanism rather than a PlatformView, so the player must hand RGBA pixel data to Flutter for rendering.
Problem
In production, some videos played through this ijkplayer-based HTTP-MP4 player showed a deformed (skewed) picture on iOS, while the same videos played correctly on Android with both MediaCodec hardware decoding and FFmpeg software decoding.
Cause
Analysis showed that all of the affected videos have non-standard resolutions. Since Android rendered them correctly, the suspicion was that something goes wrong on iOS between decoding and rendering.
Digging into the iOS source confirmed it: the RGBA data is stored in a CVPixelBufferRef, which is then passed back to Flutter for rendering. That buffer is created with CVPixelBufferCreate, and when the video width passed in is not a multiple of 16, the buffer's bytes-per-row gets padded beyond width * 4; the subsequent single contiguous memcpy ignores that padding, so each row lands at a shifted offset and the picture appears deformed.
// IJKSDL GLView calls this when displaying a frame
- (void)display_pixels:(IJKOverlay *)overlay
{
    if (overlay->pixel_buffer != nil && _cvPBView != nil) {
        [self onSnapshot:overlay->pixel_buffer];
        [_cvPBView display_pixelbuffer:overlay->pixel_buffer];
    } else if (_cvPBView != nil && overlay->format == SDL_FCC_BGRA) {
        CVPixelBufferRef pixelBuffer;
        // CVPixelBufferCreateWithBytes leads to a crash if the player is
        // reset and then setDataSource/play is called again.
        int retval = CVPixelBufferCreate(kCFAllocatorDefault,
                                         (size_t) overlay->w,
                                         (size_t) overlay->h,
                                         kCVPixelFormatType_32BGRA,
                                         _optionsDictionary,
                                         &pixelBuffer);
        if (retval == kCVReturnSuccess) {
            CVPixelBufferLockBaseAddress(pixelBuffer, 0);
            uint8_t *dst = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
            memcpy(dst, overlay->pixels[0], overlay->pitches[0] * overlay->h);
            CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
            [self onSnapshot:pixelBuffer];
            [_cvPBView display_pixelbuffer:pixelBuffer];
            CVPixelBufferRelease(pixelBuffer);
        }
    }
}
Flutter video rendering on iOS
The IJKCVPBViewProtocol interface is defined as follows:
@protocol IJKCVPBViewProtocol <NSObject>
@required
- (void) display_pixelbuffer: (CVPixelBufferRef) pixelbuffer;
@end
The FijkPlayer interface is declared as:
@interface FijkPlayer : NSObject <FlutterStreamHandler, IJKMPEventHandler,
FlutterTexture, IJKCVPBViewProtocol>
@property(atomic, readonly) NSNumber *playerId;
- (instancetype)initWithRegistrar:(id<FlutterPluginRegistrar>)registrar;
- (instancetype)initJustTexture;
- (void)shutdown;
@end
The display_pixelbuffer delegate implementation:
// IJKCVPBViewProtocol delegate
// IJKFFMediaPlayer will invoke this method when a new frame should be displayed
- (void)display_pixelbuffer:(CVPixelBufferRef)pixelbuffer {
    if (_lastBuffer == nil) {
        _lastBuffer = CVPixelBufferRetain(pixelbuffer);
        CFRetain(pixelbuffer);
    } else if (_lastBuffer != pixelbuffer) {
        CVPixelBufferRelease(_lastBuffer);
        _lastBuffer = CVPixelBufferRetain(pixelbuffer);
        CFRetain(pixelbuffer);
    }
    CVPixelBufferRef newBuffer = pixelbuffer;
    CVPixelBufferRef old = _latestPixelBuffer;
    while (!OSAtomicCompareAndSwapPtrBarrier(old, newBuffer,
                                             (void **)&_latestPixelBuffer)) {
        old = _latestPixelBuffer;
    }
    if (old && old != pixelbuffer) {
        CFRelease(old);
    }
    if (_vid >= 0) {
        [_textureRegistry textureFrameAvailable:_vid];
    }
}
After display_pixelbuffer is called back, textureFrameAvailable wakes up Flutter, which then fetches the RGBA data and renders the frame:
// After textureFrameAvailable has been called, the Flutter engine
// calls this to get a new CVPixelBufferRef to render
- (CVPixelBufferRef _Nullable)copyPixelBuffer {
    CVPixelBufferRef pixelBuffer = _latestPixelBuffer;
    while (!OSAtomicCompareAndSwapPtrBarrier(pixelBuffer, nil,
                                             (void **)&_latestPixelBuffer)) {
        pixelBuffer = _latestPixelBuffer;
    }
    return pixelBuffer;
}
Solution
From the analysis above, the deformed picture on iOS is caused by CVPixelBufferCreate expecting a video width that is a multiple of 16, while the affected videos have widths that are not.
There are several possible fixes:
1) scale the video width to a multiple of 16 on the server side;
2) scale the video width to a multiple of 16 in the player;
3) render with a Flutter PlatformView instead of a Texture.
Option 2 was chosen here.
In the ffplay_video_thread decoding thread in ff_ffplay.c, after a frame is successfully decoded, scale the YUV data so its width is a multiple of 16:
#ifdef __APPLE__
    // iOS CVPixelBufferCreate expects the video width to be a multiple
    // of 16; otherwise the picture is deformed.
    const int multiple = 16;
    if (frame->width % multiple) {
        int dst_width = ((frame->width / multiple) + 1) * multiple;
        dst_frame->format = frame->format;
        dst_frame->width = dst_width;
        dst_frame->height = frame->height;
        dst_frame->channels = frame->channels;
        dst_frame->channel_layout = frame->channel_layout;
        dst_frame->nb_samples = frame->nb_samples;
        dst_frame->pts = frame->pts;
        dst_frame->key_frame = frame->key_frame;
        dst_frame->pict_type = frame->pict_type;
        dst_frame->sample_aspect_ratio = frame->sample_aspect_ratio;
        int ret0 = av_frame_get_buffer(dst_frame, 32);
        if (ret0 < 0) {
            av_log(NULL, AV_LOG_INFO, "av_frame_get_buffer ret=%d\n", ret0);
            continue;
        }
        ret0 = av_frame_copy(dst_frame, frame);
        if (ret0 < 0) {
            av_log(NULL, AV_LOG_INFO, "av_frame_copy ret=%d\n", ret0);
            continue;
        }
        struct SwsContext *sws_context =
            sws_getContext(frame->width, frame->height, frame->format,
                           dst_width, dst_frame->height, dst_frame->format,
                           SWS_BICUBIC,
                           NULL, NULL, NULL);
        if (NULL == sws_context) {
            av_log(NULL, AV_LOG_INFO, "ffmpeg get context error NULL\n");
        } else {
            sws_scale(sws_context, (const uint8_t * const *) frame->data,
                      frame->linesize, 0, frame->height,
                      dst_frame->data, dst_frame->linesize);
            sws_freeContext(sws_context);
            duration = (frame_rate.num && frame_rate.den ? av_q2d((AVRational){frame_rate.den, frame_rate.num}) : 0);
            pts = (dst_frame->pts == AV_NOPTS_VALUE) ? NAN : dst_frame->pts * av_q2d(tb);
            ret = queue_picture(ffp, dst_frame, pts, duration, dst_frame->pkt_pos, is->viddec.pkt_serial);
            av_frame_unref(dst_frame);
        }
    } else {
#endif
        duration = (frame_rate.num && frame_rate.den ? av_q2d((AVRational){frame_rate.den, frame_rate.num}) : 0);
        pts = (frame->pts == AV_NOPTS_VALUE) ? NAN : frame->pts * av_q2d(tb);
        ret = queue_picture(ffp, frame, pts, duration, frame->pkt_pos, is->viddec.pkt_serial);
        av_frame_unref(frame);
#ifdef __APPLE__
    }
#endif