Related posts
ExoPlayer Player Analysis (1): Entering the World of ExoPlayer
ExoPlayer Player Analysis (2): Writing an ExoPlayer Demo
ExoPlayer Player Analysis (3): Process Analysis — the ExoPlayer Creation Flow from build to prepare
ExoPlayer Player Analysis (4): From renderer.render Down to MediaCodec
ExoPlayer Player Analysis (5): ExoPlayer's Operations on AudioTrack
ExoPlayer Player Analysis (6): ExoPlayer's Synchronization Mechanism
ExoPlayer Player Analysis (7): ExoPlayer's Handling of Audio Timestamps
ExoPlayer Player Extensions (1): An Introduction to DASH and HLS Streams
I. Introduction:
In the previous post we walked through ExoPlayer's flow and saw that its initialization steps are not complicated. But the simpler an API looks on the surface, the more involved the implementation underneath tends to be. In this post we start from doSomeWork, the key function mentioned at the end of the previous post, and see how ExoPlayer wraps down to MediaCodec.
II. Analysis of the doSomeWork function:
First, the key parts of doSomeWork:
private void doSomeWork() throws ExoPlaybackException, IOException {
...
/* 1. Update the audio timestamp */
updatePlaybackPositions();
...
/* 2. Call each type of renderer to process data */
for (int i = 0; i < renderers.length; i++) {
...
/* Core processing method */
renderer.render(rendererPositionUs, rendererPositionElapsedRealtimeUs);
...
}
...
if (finishedRendering && playingPeriodHolder.info.isFinal) {
setState(Player.STATE_ENDED);
stopRenderers();
/* Update the player state to STATE_READY */
} else if (playbackInfo.playbackState == Player.STATE_BUFFERING
&& shouldTransitionToReadyState(renderersAllowPlayback)) {
setState(Player.STATE_READY);
/* Start rendering if playWhenReady is true */
if (shouldPlayWhenReady()) {
startRenderers();
}
} else if (playbackInfo.playbackState == Player.STATE_READY
&& !(enabledRendererCount == 0 ? isTimelineReady() : renderersAllowPlayback)) {
rebuffering = shouldPlayWhenReady();
setState(Player.STATE_BUFFERING);
stopRenderers();
}
...
if ((shouldPlayWhenReady() && playbackInfo.playbackState == Player.STATE_READY)
|| playbackInfo.playbackState == Player.STATE_BUFFERING) {
/* Once rendering has started, this branch is taken: ACTIVE_INTERVAL_MS is 10ms */
maybeScheduleWakeup(operationStartTimeMs, ACTIVE_INTERVAL_MS);
} else if (enabledRendererCount != 0 && playbackInfo.playbackState != Player.STATE_ENDED) {
scheduleNextWork(operationStartTimeMs, IDLE_INTERVAL_MS);
} else {
handler.removeMessages(MSG_DO_SOME_WORK);
}
}
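The scheduling at the end of doSomeWork can be modeled as a small decision function. The sketch below is illustrative only (the class, the method, and the 1000ms idle interval are my assumptions, not ExoPlayer's actual code): it returns the delay before the next MSG_DO_SOME_WORK, or -1 when no further work message should be scheduled.

```java
// Illustrative model of doSomeWork's self-rescheduling logic (not ExoPlayer code).
// The state constants mirror ExoPlayer's Player states; IDLE_INTERVAL_MS is assumed.
public class WorkScheduler {
    static final long ACTIVE_INTERVAL_MS = 10;    // actively playing/buffering: tight loop
    static final long IDLE_INTERVAL_MS = 1000;    // renderers enabled but paused: slow poll
    static final int STATE_BUFFERING = 2, STATE_READY = 3, STATE_ENDED = 4;

    static long nextDelayMs(boolean shouldPlayWhenReady, int playbackState, int enabledRendererCount) {
        if ((shouldPlayWhenReady && playbackState == STATE_READY) || playbackState == STATE_BUFFERING) {
            return ACTIVE_INTERVAL_MS;   // maybeScheduleWakeup(operationStartTimeMs, ACTIVE_INTERVAL_MS)
        } else if (enabledRendererCount != 0 && playbackState != STATE_ENDED) {
            return IDLE_INTERVAL_MS;     // scheduleNextWork(operationStartTimeMs, IDLE_INTERVAL_MS)
        }
        return -1;                       // handler.removeMessages(MSG_DO_SOME_WORK)
    }
}
```

This is why doSomeWork keeps getting called: each pass re-posts itself at a 10ms cadence while rendering, and stops posting once playback ends.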
1. Analysis of the renderer.render function:
First, the code:
----------------------------------------------------------------------------
render@E:\GitHub_Windows\ExoPlayer\library\core\src\main\java\com\google\android\exoplayer2\mediacodec\MediaCodecRenderer.java
----------------------------------------------------------------------------
@Override
public void render(long positionUs, long elapsedRealtimeUs) throws ExoPlaybackException {
/* Handle EOS if one is pending */
if (pendingOutputEndOfStream) {
pendingOutputEndOfStream = false;
processEndOfStream();
}
if (pendingPlaybackException != null) {
ExoPlaybackException playbackException = pendingPlaybackException;
pendingPlaybackException = null;
throw playbackException;
}
try {
if (outputStreamEnded) {
renderToEndOfStream();
return;
}
if (inputFormat == null && !readToFlagsOnlyBuffer(/* requireFormat= */ true)) {
// We still don't have a format and can't make progress without one.
return;
}
// We have a format.
/* Configure the codec */
maybeInitCodecOrBypass();
if (bypassEnabled) {
TraceUtil.beginSection("bypassRender");
while (bypassRender(positionUs, elapsedRealtimeUs)) {}
TraceUtil.endSection();
} else if (codec != null) {
long renderStartTimeMs = SystemClock.elapsedRealtime();
TraceUtil.beginSection("drainAndFeed");
/* Drain the decoded data */
while (drainOutputBuffer(positionUs, elapsedRealtimeUs)
&& shouldContinueRendering(renderStartTimeMs)) {}
/* Feed the source data */
while (feedInputBuffer() && shouldContinueRendering(renderStartTimeMs)) {}
TraceUtil.endSection();
} else {
decoderCounters.skippedInputBufferCount += skipSource(positionUs);
// We need to read any format changes despite not having a codec so that drmSession can be
// updated, and so that we have the most recent format should the codec be initialized. We
// may also reach the end of the stream. Note that readSource will not read a sample into a
// flags-only buffer.
readToFlagsOnlyBuffer(/* requireFormat= */ false);
}
decoderCounters.ensureUpdated();
} catch (IllegalStateException e) {
if (isMediaCodecException(e)) {
throw createRendererException(createDecoderException(e, getCodecInfo()), inputFormat);
}
throw e;
}
}
The render function really does just three things: it calls the subclasses MediaCodecVideoRenderer and
MediaCodecAudioRenderer to configure the codec, drains the decoded data out of MediaCodec, and feeds source data into it.
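The drain-then-feed structure of render can be sketched with a toy model (none of this is ExoPlayer code; the queues here just stand in for MediaCodec's output and input buffers):

```java
import java.util.ArrayDeque;

// Toy model of render()'s loop shape: drain all available decoded output first,
// then feed as much input as the codec will take. Each helper returns true while
// it made progress, matching the while(...){} conditions in the real code.
public class DrainFeedLoop {
    final ArrayDeque<String> input = new ArrayDeque<>();      // pending source samples
    final ArrayDeque<String> codecQueue = new ArrayDeque<>(); // stand-in for the codec's buffers
    final ArrayDeque<String> output = new ArrayDeque<>();     // "rendered" results

    boolean drainOutputBuffer() {
        String decoded = codecQueue.poll();
        if (decoded == null) return false; // nothing decoded yet
        output.add("decoded:" + decoded);
        return true;
    }

    boolean feedInputBuffer() {
        String sample = input.poll();
        if (sample == null) return false;  // no more input available
        codecQueue.add(sample);
        return true;
    }

    void render() {
        while (drainOutputBuffer()) {}
        while (feedInputBuffer()) {}
    }
}
```

The real loop additionally bounds each pass with shouldContinueRendering(renderStartTimeMs) so one render call cannot monopolize the playback thread.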
Let's focus on the codec configuration function, maybeInitCodecOrBypass():
protected final void maybeInitCodecOrBypass() throws ExoPlaybackException
{
...
while (codec == null) {
MediaCodecInfo codecInfo = availableCodecInfos.peekFirst();
if (!shouldInitCodec(codecInfo)) {
return;
}
try {
/* Initialize the codec */
initCodec(codecInfo, crypto);
} catch (Exception e) {
...
}
}
availableCodecInfos = null;
}
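The loop above is a fallback mechanism: candidate decoders are tried in priority order until one initializes. A minimal illustrative model (the names here are mine, not ExoPlayer's):

```java
import java.util.ArrayDeque;
import java.util.function.Predicate;

// Illustrative model (not ExoPlayer code) of maybeInitCodecOrBypass()'s loop:
// peek at the highest-priority candidate, try to initialize it, and on failure
// drop it so the next candidate can be attempted.
public class CodecFallback {
    static String initFirstWorking(ArrayDeque<String> candidates, Predicate<String> initCodec) {
        while (!candidates.isEmpty()) {
            String codecInfo = candidates.peekFirst();
            if (initCodec.test(codecInfo)) {
                return codecInfo;        // codec created successfully
            }
            candidates.removeFirst();    // init failed: fall back to the next candidate
        }
        return null;                     // no decoder could be initialized
    }
}
```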
Compressing the code further, we only look at initCodec (we finally get to see MediaCodec~):
private void initCodec(MediaCodecInfo codecInfo, MediaCrypto crypto) throws Exception {
long codecInitializingTimestamp;
long codecInitializedTimestamp;
MediaCodec codec = null;
String codecName = codecInfo.name;
float codecOperatingRate =
Util.SDK_INT < 23
? CODEC_OPERATING_RATE_UNSET
: getCodecOperatingRateV23(operatingRate, inputFormat, getStreamFormats());
if (codecOperatingRate <= assumedMinimumCodecOperatingRate) {
codecOperatingRate = CODEC_OPERATING_RATE_UNSET;
}
/* A codec adapter is created to manage the codec */
MediaCodecAdapter codecAdapter = null;
try {
codecInitializingTimestamp = SystemClock.elapsedRealtime();
TraceUtil.beginSection("createCodec:" + codecName);
/* Instantiate the MediaCodec */
codec = MediaCodec.createByCodecName(codecName);
if (mediaCodecOperationMode == OPERATION_MODE_ASYNCHRONOUS_DEDICATED_THREAD
&& Util.SDK_INT >= 23) {
codecAdapter = new AsynchronousMediaCodecAdapter(codec, getTrackType());
} else if (mediaCodecOperationMode
== OPERATION_MODE_ASYNCHRONOUS_DEDICATED_THREAD_ASYNCHRONOUS_QUEUEING
&& Util.SDK_INT >= 23) {
codecAdapter =
new AsynchronousMediaCodecAdapter(
codec, /* enableAsynchronousQueueing= */ true, getTrackType());
} else {
/* Synchronous mode: see here */
codecAdapter = new SynchronousMediaCodecAdapter(codec);
}
TraceUtil.endSection();
TraceUtil.beginSection("configureCodec");
/* Configure the codec */
configureCodec(codecInfo, codecAdapter, inputFormat, crypto, codecOperatingRate);
TraceUtil.endSection();
TraceUtil.beginSection("startCodec");
/* Start decoding */
codecAdapter.start();
TraceUtil.endSection();
codecInitializedTimestamp = SystemClock.elapsedRealtime();
/* Get the buffer arrays: only below Android 5.0 */
getCodecBuffers(codec);
} catch (Exception e) {
if (codecAdapter != null) {
codecAdapter.shutdown();
}
if (codec != null) {
resetCodecBuffers();
codec.release();
}
throw e;
}
this.codec = codec;
this.codecAdapter = codecAdapter;
this.codecInfo = codecInfo;
this.codecOperatingRate = codecOperatingRate;
codecInputFormat = inputFormat;
codecAdaptationWorkaroundMode = codecAdaptationWorkaroundMode(codecName);
codecNeedsReconfigureWorkaround = codecNeedsReconfigureWorkaround(codecName);
codecNeedsDiscardToSpsWorkaround =
codecNeedsDiscardToSpsWorkaround(codecName, codecInputFormat);
codecNeedsFlushWorkaround = codecNeedsFlushWorkaround(codecName);
codecNeedsSosFlushWorkaround = codecNeedsSosFlushWorkaround(codecName);
codecNeedsEosFlushWorkaround = codecNeedsEosFlushWorkaround(codecName);
codecNeedsEosOutputExceptionWorkaround = codecNeedsEosOutputExceptionWorkaround(codecName);
codecNeedsMonoChannelCountWorkaround =
codecNeedsMonoChannelCountWorkaround(codecName, codecInputFormat);
codecNeedsEosPropagation =
codecNeedsEosPropagationWorkaround(codecInfo) || getCodecNeedsEosPropagation();
if ("c2.android.mp3.decoder".equals(codecInfo.name)) {
c2Mp3TimestampTracker = new C2Mp3TimestampTracker();
}
if (getState() == STATE_STARTED) {
codecHotswapDeadlineMs = SystemClock.elapsedRealtime() + MAX_CODEC_HOTSWAP_TIME_MS;
}
decoderCounters.decoderInitCount++;
/* Compute how long codec initialization took */
long elapsed = codecInitializedTimestamp - codecInitializingTimestamp;
onCodecInitialized(codecName, codecInitializedTimestamp, elapsed);
}
Peeling back the layers, we finally see MediaCodec. For managing the codec, ExoPlayer introduces the concept of an adapter. Synchronous and asynchronous modes use MediaCodec with quite different logic; normally we work in synchronous mode. configure and start are the standard sequence for operating a MediaCodec, and the final step fetches the input/output buffer arrays (looking at the code, this only happens below Android 5.0), all of which are provided by the underlying MediaCodec. Another interesting detail is that the code measures the total time spent initializing the codec. Continuing the analysis, let's first see what the synchronous mode does:
public SynchronousMediaCodecAdapter(MediaCodec mediaCodec) {
this.codec = mediaCodec;
}
It is merely an assignment of the codec. Next, look at configureCodec (taking audio as the example):
@Override
protected void configureCodec(
MediaCodecInfo codecInfo,
MediaCodecAdapter codecAdapter,
Format format,
@Nullable MediaCrypto crypto,
float codecOperatingRate) {
codecMaxInputSize = getCodecMaxInputSize(codecInfo, format, getStreamFormats());
codecNeedsDiscardChannelsWorkaround = codecNeedsDiscardChannelsWorkaround(codecInfo.name);
codecNeedsEosBufferTimestampWorkaround = codecNeedsEosBufferTimestampWorkaround(codecInfo.name);
MediaFormat mediaFormat =
getMediaFormat(format, codecInfo.codecMimeType, codecMaxInputSize, codecOperatingRate);
/* Call the adapter's configure */
codecAdapter.configure(mediaFormat, /* surface= */ null, crypto, /* flags= */ 0);
// Store the input MIME type if we're only using the codec for decryption.
boolean decryptOnlyCodecEnabled =
MimeTypes.AUDIO_RAW.equals(codecInfo.mimeType)
&& !MimeTypes.AUDIO_RAW.equals(format.sampleMimeType);
decryptOnlyCodecFormat = decryptOnlyCodecEnabled ? format : null;
}
The core is still a call to the adapter's configure. As mentioned earlier, a synchronous adapter is created here, so look at the configure function in SynchronousMediaCodecAdapter:
@Override
public void configure(
@Nullable MediaFormat mediaFormat,
@Nullable Surface surface,
@Nullable MediaCrypto crypto,
int flags) {
codec.configure(mediaFormat, surface, crypto, flags);
}
This is entirely a call into Android's MediaCodec API. The start function is the same:
@Override
public void start() {
codec.start();
}
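Taken together, the adapter idea can be sketched as follows. The Codec interface below is just a stand-in for android.media.MediaCodec and the class names are illustrative, not ExoPlayer's actual classes; the point is that the renderer only talks to the adapter, so synchronous and asynchronous codec modes can be swapped behind one interface:

```java
// Stand-in for android.media.MediaCodec (illustrative only).
interface Codec {
    void configure(String format);
    void start();
}

// The adapter interface the renderer programs against.
interface CodecAdapter {
    void configure(String format);
    void start();
}

// Synchronous mode: every adapter call is a direct pass-through to the codec,
// mirroring what SynchronousMediaCodecAdapter does in ExoPlayer.
public class SyncCodecAdapter implements CodecAdapter {
    private final Codec codec;

    public SyncCodecAdapter(Codec codec) {
        this.codec = codec;
    }

    @Override public void configure(String format) { codec.configure(format); }
    @Override public void start() { codec.start(); }
}
```

An asynchronous implementation could instead forward these calls to a dedicated codec thread, which is exactly the freedom the adapter layer buys.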
2. Analysis of startRenderers:
Let's see what the startRenderers method does:
private void startRenderers() throws ExoPlaybackException {
rebuffering = false;
/* Start the MediaClock */
mediaClock.start();
for (Renderer renderer : renderers) {
/* If the renderer is enabled */
if (isRendererEnabled(renderer)) {
renderer.start();
}
}
}
MediaClock is a class ExoPlayer defines to track the current system time, and it will play a very important role in the synchronization process later. If a renderer is enabled, its start method is called. The renderer here is of type Renderer; let's look at the Renderer interface:
public interface Renderer extends PlayerMessage.Target {
...
}
The classes that inherit from Renderer are:
Does this look familiar? The previous post mentioned that SimpleExoPlayer's constructor creates five or six renderers; these are their implementation classes. The two renderers that matter most for decoding are, of course, MediaCodecAudioRenderer and MediaCodecVideoRenderer.
First, the implementation of start in BaseRenderer:
@Override
public final void start() throws ExoPlaybackException {
/* Check the state is STATE_ENABLED, then move it to STATE_STARTED */
Assertions.checkState(state == STATE_ENABLED);
state = STATE_STARTED;
onStarted();
}
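start() is one transition in a small renderer state machine. A minimal model (the constants mirror the real ones; the class itself is illustrative, not ExoPlayer code):

```java
// Minimal model of the renderer lifecycle: STATE_DISABLED -> STATE_ENABLED -> STATE_STARTED.
// start() is only legal from STATE_ENABLED, which is what Assertions.checkState enforces
// in BaseRenderer.start().
public class RendererStateMachine {
    static final int STATE_DISABLED = 0;
    static final int STATE_ENABLED = 1;
    static final int STATE_STARTED = 2;

    int state = STATE_DISABLED;

    void enable() {
        if (state != STATE_DISABLED) throw new IllegalStateException("enable() from state " + state);
        state = STATE_ENABLED;
    }

    void start() {
        if (state != STATE_ENABLED) throw new IllegalStateException("start() from state " + state);
        state = STATE_STARTED;
    }
}
```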
onStarted() is implemented by each concrete class. First, MediaCodecAudioRenderer:
@Override
protected void onStarted() {
/* Call the parent class's onStarted, which does nothing */
super.onStarted();
audioSink.play();
}
Here audioSink is actually ExoPlayer's wrapper class around AudioTrack; its implementation is in DefaultAudioSink:
----------------------------------------------------------------------------
play@ExoPlayer\library\core\src\main\java\com\google\android\exoplayer2\audio\DefaultAudioSink.java
----------------------------------------------------------------------------
@Override
public void play() {
playing = true;
if (isAudioTrackInitialized()) {
audioTrackPositionTracker.start();
audioTrack.play();
}
}
As you can see, if the AudioTrack has already been created, AudioTrack's play method is called directly to start playing audio data.
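The guard in play() can be modeled like this. Note this is a toy, not ExoPlayer code, and the deferred-start behavior when the track is created later is my assumption about DefaultAudioSink, shown only to explain why the playing flag is set before the track check:

```java
// Toy model of DefaultAudioSink.play(): the playing flag is always set, but the
// underlying track only starts if it already exists. If the track is created later,
// the remembered flag lets playback start then (assumed behavior, for illustration).
public class AudioSinkModel {
    boolean playing;
    boolean trackInitialized;
    boolean trackPlaying; // stands in for the AudioTrack being in a playing state

    void play() {
        playing = true;
        if (trackInitialized) {
            trackPlaying = true; // maps to audioTrack.play()
        }
    }

    void initializeTrack() {
        trackInitialized = true;
        if (playing) {
            trackPlaying = true; // deferred start once the track exists
        }
    }
}
```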
Now go back and look at the implementation of onStarted() in MediaCodecVideoRenderer:
----------------------------------------------------------------------------
onStarted@ExoPlayer\library\core\src\main\java\com\google\android\exoplayer2\video\MediaCodecVideoRenderer.java
----------------------------------------------------------------------------
@Override
protected void onStarted() {
super.onStarted();
/* Reset the dropped-frame count to 0 */
droppedFrames = 0;
droppedFrameAccumulationStartTimeMs = SystemClock.elapsedRealtime();
/* Set the last render time to the elapsed time since system boot, i.e. now */
lastRenderTimeUs = SystemClock.elapsedRealtime() * 1000;
totalVideoFrameProcessingOffsetUs = 0;
videoFrameProcessingOffsetCount = 0;
/* Update the frame rate */
updateSurfaceFrameRate(/* isNewSurface= */ false);
}
updateSurfaceFrameRate returns immediately on versions below Android R (Android 11), so this function does not do much: it mainly resets the dropped-frame count and records the current system time.
Summary:
For audio, startRenderers() starts writing data into the AudioTrack for playback; for video, it records the current system time to prepare for the rendering that follows.
III. Summary:
That completes the analysis of the MediaCodec configuration process. Each iteration of doSomeWork calls into the concrete renderer classes for processing; on the first call, the codec is initialized according to the corresponding MIME type, with the bottom-level adapter calling Android's native MediaCodec API. After that, the loop keeps feeding source data and consuming the decoded data.