In an earlier post I covered hardware and software decoding with FFmpeg.
This post shows how to play back the software-decoded frames in a Qt window:
each decoded frame is grabbed as an image and drawn onto the UI in QWidget's built-in paintEvent.
Here is a simple implementation.
First create a QWidget subclass. getFrame is a slot; the QObject::connect call that wires the decoder's signal to it is not shown here.
#include "ponding_video_widget.h"
#include "video_blending/video_blender.h"
#include "customize_files/mymessage.h"
PondingVideoWidget::PondingVideoWidget(QString name, QString path, QWidget *parent)
    : QWidget(parent)
{
    Q_UNUSED(name);
    Q_UNUSED(path);
    setFixedSize(622, 350);
    setAttribute(Qt::WA_StyledBackground, true);
    setAttribute(Qt::WA_TransparentForMouseEvents, true); // let clicks pass through
    setWindowFlags(Qt::FramelessWindowHint);
    setAttribute(Qt::WA_TranslucentBackground);
    setAttribute(Qt::WA_DeleteOnClose);                   // free the widget when it is closed
}
void PondingVideoWidget::paintEvent(QPaintEvent *)
{
    QPainter painter(this);
    painter.setRenderHint(QPainter::Antialiasing);
    painter.setRenderHint(QPainter::TextAntialiasing);
    painter.setRenderHint(QPainter::SmoothPixmapTransform);
    // QPainter::HighQualityAntialiasing is deprecated since Qt 5.13
    // (it is just an alias for Antialiasing), so it is omitted here.

    // Paint the whole widget black so the letterbox bars are black.
    painter.setBrush(Qt::black);
    painter.drawRect(0, 0, width(), height());

    if (mImage.isNull())
        return;

    // Scale the frame to fit the widget, keeping its aspect ratio,
    // then center it.
    QImage img = mImage.scaled(size(), Qt::KeepAspectRatio, Qt::SmoothTransformation);
    int x = (width() - img.width()) / 2;
    int y = (height() - img.height()) / 2;
    painter.drawImage(QPoint(x, y), img);
}
void PondingVideoWidget::getFrame(const VideoFrame &frame)
{
    mImage = frame.img;
    update(); // schedule a repaint; paintEvent will draw the new frame
}
Only part of the decoding code is shown here; see the decoding post for the rest.
Each decoded frame is emitted through a signal, once per frame; the widget receives it and repaints.
if (got_picture)
{
    // Convert the decoded frame (typically YUV420P) to the BGRA buffer.
    sws_scale(imgConvertCtx,
              (uint8_t const * const *)pFrame->data,
              pFrame->linesize, 0, pCodecCtx->height,
              pFrameBGR->data, pFrameBGR->linesize);

    VideoFrame frame;
    frame.width  = pFrame->width;
    frame.height = pFrame->height;
    frame.data   = QByteArray((char *)outBuffer, byteNum * sizeof(uint8_t));

    // Wrap the buffer in a QImage; convertToFormat makes a deep copy,
    // so the image stays valid after outBuffer is overwritten.
    QImage tmpImg((uchar *)outBuffer, pCodecCtx->width, pCodecCtx->height, QImage::Format_RGB32);
    frame.img = tmpImg.convertToFormat(QImage::Format_RGB888, Qt::NoAlpha);

    emit newFrame(frame); // signal name assumed; connect it to the widget's getFrame slot
    frameQueue->push_back(frame);
}
If anything is unclear, feel free to message me.