This article describes how to take a YUV frame decoded by ffmpeg and display it with SDL2.0, focusing on how to convert ffmpeg's decoded YUV format into a format SDL2.0 can display, and on how SDL does the displaying.
Why display with YUV
When implementing a video player with software decoding, if the display path requires an RGB format, then because ffmpeg decodes to YUV420, a conversion from YUV420 to RGB32 is needed first. That conversion involves a large amount of floating-point arithmetic, so performance is relatively poor and the playback frame rate suffers. If the frame can be displayed directly as YUV, performance should in theory be better than going through RGB.
Convert YUV420 to YUV420P (YV12)
1. YUV formats supported by SDL2.0
From the official SDL2.0 documentation, the YUV formats SDL2.0 supports are:
| YUV Format | Comment |
|---|---|
| SDL_PIXELFORMAT_YV12 | planar mode: Y + V + U (3 planes) |
| SDL_PIXELFORMAT_IYUV | planar mode: Y + U + V (3 planes) |
| SDL_PIXELFORMAT_YUY2 | packed mode: Y0+U0+Y1+V0 (1 plane) |
| SDL_PIXELFORMAT_UYVY | packed mode: U0+Y0+V0+Y1 (1 plane) |
| SDL_PIXELFORMAT_YVYU | packed mode: Y0+V0+Y1+U0 (1 plane) |
2. ffmpeg output YUV format
ffmpeg's decode output format is YUV420.
As the table above shows, ffmpeg's decode output cannot be handed straight to SDL2.0 for display; it first has to be converted into one of the YUV formats SDL2.0 supports. YUV420 and YV12 (YUV420P) are both 4:2:0 formats and differ only in how the data is arranged, so we pick YV12 as the conversion target. One caveat: in a single contiguous buffer, YUV420P stores its planes in Y + U + V order (which is SDL's IYUV), whereas YV12 stores Y + V + U. Since an AVFrame keeps separate plane pointers in data[0..2], the planes can be uploaded explicitly with SDL_UpdateYUVTexture, in which case the in-memory plane order does not matter.
3. Sample code
AVFrame* pFrameYUV;
pFrameYUV = av_frame_alloc();
if( pFrameYUV == NULL )
return -1;
int numBytes = avpicture_get_size( PIX_FMT_YUV420P, pCodecCtx->width, pCodecCtx->height );
uint8_t* buffer = ( uint8_t* )av_malloc( numBytes * sizeof( uint8_t ) );
avpicture_fill( ( AVPicture* )pFrameYUV, buffer, PIX_FMT_YUV420P, pCodecCtx->width, pCodecCtx->height );
struct SwsContext* sws_ctx = NULL;
sws_ctx = sws_getContext( pCodecCtx->width, pCodecCtx->height, pCodecCtx->pix_fmt, pCodecCtx->width, pCodecCtx->height, PIX_FMT_YUV420P, SWS_BILINEAR, NULL, NULL, NULL );
// use av_read_frame & avcodec_decode_video2 to get a complete frame
sws_scale( sws_ctx, ( uint8_t const* const* )pFrame->data, pFrame->linesize, 0, pCodecCtx->height, pFrameYUV->data, pFrameYUV->linesize );
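The comment above stands in for the decode loop. As a sketch of what that loop looks like with the same-era FFmpeg API (pFormatCtx, videoStream and pFrame are assumed to have been set up when the file was opened, as in a typical avformat/avcodec tutorial flow):

```c
AVPacket packet;
int frameFinished = 0;

while (av_read_frame(pFormatCtx, &packet) >= 0) {
    if (packet.stream_index == videoStream) {
        /* decode one packet; frameFinished is set once a full frame is ready */
        avcodec_decode_video2(pCodecCtx, pFrame, &frameFinished, &packet);
        if (frameFinished) {
            /* convert the decoded frame to YUV420P */
            sws_scale(sws_ctx, (uint8_t const* const*)pFrame->data,
                      pFrame->linesize, 0, pCodecCtx->height,
                      pFrameYUV->data, pFrameYUV->linesize);
            /* pFrameYUV now holds one displayable YUV420P frame */
        }
    }
    av_free_packet(&packet);
}
```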
Save YUV420P to a file
The YUV420P layout stores all of the Y data for the whole picture first, followed by all of the U and then all of the V (for AV_PIX_FMT_YUV420P, data[0] is the Y plane, data[1] is U and data[2] is V);
so when saving YUV420P to a file, the data from the AVFrame's data[0], data[1] and data[2] has to be written out in that order. Keep in mind that with 4:2:0 subsampling, the U and V planes each have only height/2 rows.
void SaveYuvFrame( AVFrame* pFrame, int width, int height, int iFrame ) {
FILE* pFile;
char szFilename[32];
int y;
// Open file
sprintf( szFilename, "frame%d.yuv", iFrame );
pFile = fopen( szFilename, "wb" );
if( pFile == NULL ) {
return;
}
// Write the Y plane row by row; linesize[0] may be larger than width
// because of padding, so write only width bytes per row
for( y = 0; y < height; y++ )
fwrite( pFrame->data[0] + y * pFrame->linesize[0], 1, width, pFile );
// The U and V planes are subsampled: height/2 rows of width/2 bytes each
for( y = 0; y < height / 2; y++ )
fwrite( pFrame->data[1] + y * pFrame->linesize[1], 1, width / 2, pFile );
for( y = 0; y < height / 2; y++ )
fwrite( pFrame->data[2] + y * pFrame->linesize[2], 1, width / 2, pFile );
// Close file
fclose( pFile );
}
In the code above, linesize is not saved to the file (linesize holds three values: the number of bytes per row of the Y, U and V planes). For 1920x1080 data, for example, linesize = {1920, 960, 960}.
Load YUV420P from a file and feed it to SDL2.0
FILE* pFile = fopen( "frame0.yuv", "rb" );
if( pFile == NULL )
return -1;
uint8_t* yuvdata[3];
int linesize[3] = {1920, 960, 960};
// The Y plane has height rows; the U and V planes have height/2 rows each
int ySize  = linesize[0] * pCodecCtx->height;
int uvSize = linesize[1] * pCodecCtx->height / 2;
yuvdata[0] = ( uint8_t* )malloc( ySize + 2 * uvSize );
yuvdata[1] = yuvdata[0] + ySize;
yuvdata[2] = yuvdata[1] + uvSize;
int size = fread( yuvdata[0], 1, ySize, pFile );
size = fread( yuvdata[1], 1, uvSize, pFile );
size = fread( yuvdata[2], 1, uvSize, pFile );
fclose( pFile );
In the code above, since the yuv file does not store linesize, a preset linesize is assumed; in real use, the linesize values could be saved as a file header when the yuv data is written out.
SDL2.0 image display flow
// SDL init
SDL_Window *window;
SDL_Renderer *renderer;
SDL_RendererInfo info;
SDL_Rect rect;
SDL_Texture *texture;
SDL_Event event;
SDL_bool done = SDL_FALSE;
if (SDL_Init(SDL_INIT_VIDEO) < 0) {
fprintf(stderr, "Couldn't initialize SDL: %s\n", SDL_GetError());
return 2;
}
/* Create the window and renderer */
window = SDL_CreateWindow("YUV speed test",
SDL_WINDOWPOS_UNDEFINED,
SDL_WINDOWPOS_UNDEFINED,
pCodecCtx->width, pCodecCtx->height,
SDL_WINDOW_SHOWN|SDL_WINDOW_RESIZABLE);
if (!window) {
fprintf(stderr, "Couldn't create window: %s\n", SDL_GetError());
return 5;
}
renderer = SDL_CreateRenderer(window, -1, 0);
if (!renderer) {
fprintf(stderr, "Couldn't create renderer: %s\n", SDL_GetError());
return 6;
}
SDL_GetRendererInfo(renderer, &info);
printf("Using %s rendering\n", info.name);
texture = SDL_CreateTexture(renderer, SDL_PIXELFORMAT_YV12, SDL_TEXTUREACCESS_STREAMING, pCodecCtx->width, pCodecCtx->height);
if (!texture) {
fprintf(stderr, "Couldn't create texture: %s\n", SDL_GetError());
return 7;
}
rect.x = 0;
rect.y = 0;
rect.w = pCodecCtx->width;
rect.h = pCodecCtx->height;
// end
// upload the YUV420P frame data loaded above
// SDL_UpdateTexture with a single pointer would assume the planes sit in the
// texture format's own order (Y+V+U for YV12); our buffer is YUV420P (Y+U+V),
// so pass the three planes explicitly instead
SDL_UpdateYUVTexture( texture, &rect,
                      yuvdata[0], linesize[0],
                      yuvdata[1], linesize[1],
                      yuvdata[2], linesize[2] );
SDL_RenderClear( renderer );
SDL_RenderCopy( renderer, texture, &rect, &rect );
SDL_RenderPresent( renderer );
SDL_Delay( 1000 );
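The SDL_Delay above just holds the picture on screen for one second; a real player would instead pump the SDL event loop so the window stays responsive until the user closes it. A minimal sketch using the event and done variables declared during init:

```c
/* keep the window alive until the user quits */
while (!done) {
    while (SDL_PollEvent(&event)) {
        if (event.type == SDL_QUIT)
            done = SDL_TRUE;
    }
    SDL_Delay(10); /* avoid spinning the CPU */
}

/* tear down in reverse order of creation */
SDL_DestroyTexture(texture);
SDL_DestroyRenderer(renderer);
SDL_DestroyWindow(window);
SDL_Quit();
```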