1. Audio-Video Synchronization
Audio-video synchronization is a critical piece of work, but it can be achieved in several ways, each differing to suit the needs of the scenario. Synchronization generally follows the Audio Master approach: the human ear is more sensitive to audio glitches than the eye is to video ones, which is why audio is usually the reference. That said, this is not a fixed rule; the approach can be customized for the scenario.
2. Common Synchronization Approaches
[1] Get the audio PositionUs, seek the video first, then seek the audio
With this approach both the picture and the sound will stutter. The reason for seeking twice is that a video stream's GOP structure is unpredictable and keyframe lookup is more complex than for audio, so the video seek may well miss the target; seeking the audio afterwards acts as a fallback.
Pros: simple
Cons: poor experience
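The two-seek idea can be sketched as follows. This is a minimal illustration, not a real player API; the class and method names are hypothetical, and the video seek is modeled as snapping to the nearest preceding keyframe (GOP boundary).

```java
// Sketch of approach [1]: the video seek can only land on a keyframe, so the
// audio is seeked afterwards to wherever the video actually landed.
final class DoubleSeek {

    // Simulate a video seek that snaps back to the preceding GOP boundary.
    static long seekVideoUs(long targetUs, long gopDurationUs) {
        return (targetUs / gopDurationUs) * gopDurationUs;
    }

    // Audio seeks are sample-accurate; align audio to the video's landing point.
    static long seekAudioUs(long videoLandedUs) {
        return videoLandedUs;
    }
}
```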
[2] Get the audio or video PositionUs
Compute the time difference between the two streams; the faster side waits (or pauses) and resumes once the difference has elapsed.
Pros: moderate difficulty
Cons: complex to control
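The core calculation of the wait/resume approach can be sketched like this. The names and the tolerance parameter are hypothetical; a real implementation also has to deal with pausing and resuming the right pipeline stage, which is where the control complexity comes from.

```java
// Sketch of approach [2]: whichever stream is ahead pauses for the time
// difference, then resumes.
final class WaitResumeSync {

    // Returns how long (in microseconds) the faster side should pause;
    // 0 when the streams are already within tolerance.
    static long pauseDurationUs(long audioPositionUs, long videoPositionUs,
                                long toleranceUs) {
        long diffUs = Math.abs(audioPositionUs - videoPositionUs);
        return diffUs <= toleranceUs ? 0 : diffUs;
    }
}
```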
[3] Video frame dropping
This is the approach most commonly used by players: the audio clock is the time reference, and video frames that fall too far behind it are dropped.
Pros: good experience
Cons: complex calculations
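The drop decision reduces to comparing a frame's presentation time against the audio clock. A minimal sketch, with hypothetical names and a hypothetical lateness threshold (ExoPlayer's actual logic appears in section 3.5):

```java
// Sketch of approach [3]: with audio as master, drop a video frame whose
// presentation time is already behind the audio clock by more than a threshold.
final class FrameDropper {

    // earlyUs > 0 means the frame is early; earlyUs < 0 means it is late.
    static boolean shouldDrop(long framePtsUs, long audioClockUs, long maxLateUs) {
        long earlyUs = framePtsUs - audioClockUs;
        return earlyUs < -maxLateUs; // too late to be useful, drop it
    }
}
```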
[4] Speed-adjusted sync
Play the audio at its normal rate and adjust the video playback speed to follow it.
Pros: good experience; when the video runs ahead it is slowed down, and when it falls behind it is sped up
Cons: must correctly handle every player state.
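A minimal sketch of the speed-correction idea, assuming a hypothetical proportional gain and clamp band (both values are illustrative, not taken from any real player):

```java
// Sketch of approach [4]: keep audio at 1.0x and nudge the video speed toward
// the audio clock, clamped so the correction is not visible as judder.
final class SpeedSync {

    // diffUs > 0 means the video is behind the audio -> speed the video up.
    static float videoSpeed(long audioPositionUs, long videoPositionUs) {
        long diffUs = audioPositionUs - videoPositionUs;
        float speed = 1f + (diffUs / 1_000_000f) * 0.1f; // 0.1 gain per second of drift
        return Math.max(0.9f, Math.min(1.1f, speed));    // clamp to [0.9x, 1.1x]
    }
}
```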
3. Audio-Video Sync in ExoPlayer
3.1 Why ExoPlayer is audio-master
As we know, ExoPlayer maintains its own clock. Using a clock avoids two problems with AudioTrack#getPlaybackHeadPosition: first, the reported position can only move forward, never backward; second, on some off-brand devices the implementation of getPlaybackHeadPosition is unreliable and jitters back and forth, which is disastrous for audio-video sync.
In ExoPlayer, com.google.android.exoplayer2.BaseRenderer#getMediaClock returns null by default; the video renderer subclass keeps returning null, and only the audio renderer provides an implementation, in com.google.android.exoplayer2.audio.MediaCodecAudioRenderer#getMediaClock():
@Override
@Nullable
public MediaClock getMediaClock() {
  return this;
}
This confirms that when an audio track is present, audio is the master; if there is no audio track, ExoPlayer falls back to the wall-clock-driven StandaloneMediaClock. Below is the renderer clock selection; video renderers, whose media clock is null, are excluded:
public void onRendererEnabled(Renderer renderer) throws ExoPlaybackException {
  @Nullable MediaClock rendererMediaClock = renderer.getMediaClock();
  // Only the audio renderer returns a non-null clock.
  if (rendererMediaClock != null && rendererMediaClock != rendererClock) {
    if (rendererClock != null) {
      throw ExoPlaybackException.createForUnexpected(
          new IllegalStateException("Multiple renderer media clocks enabled."));
    }
    this.rendererClock = rendererMediaClock;
    this.rendererClockSource = renderer;
    rendererClock.setPlaybackParameters(standaloneClock.getPlaybackParameters());
  }
}
3.2 The role of MediaClock
public interface MediaClock {

  /** Returns the current media position in microseconds. */
  long getPositionUs();

  /**
   * Attempts to set the playback parameters. The media clock may override the speed if changing the
   * playback parameters is not supported.
   *
   * @param playbackParameters The playback parameters to attempt to set.
   */
  void setPlaybackParameters(PlaybackParameters playbackParameters);

  /** Returns the active playback parameters. */
  PlaybackParameters getPlaybackParameters();
}
As its methods show, MediaClock is mainly responsible for reporting the current position and managing the playback speed and pitch.
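To make the contract concrete, here is a minimal clock in the same spirit: it remembers a base media position plus the wall-clock time at that moment, and advances the position by scaled elapsed time. This is a hypothetical class loosely modeled on ExoPlayer's StandaloneMediaClock (time is injected here so the behavior is deterministic):

```java
// Minimal sketch of what a MediaClock implementation has to do: anchor a media
// position to a wall-clock reading and advance it by speed-scaled elapsed time.
final class SimpleMediaClock {
    private long baseUs;        // media position at the last (re)anchor
    private long baseElapsedMs; // wall clock at the last (re)anchor
    private float speed = 1f;

    void start(long positionUs, long elapsedRealtimeMs) {
        baseUs = positionUs;
        baseElapsedMs = elapsedRealtimeMs;
    }

    void setSpeed(float speed, long elapsedRealtimeMs) {
        // Re-anchor first so already-elapsed time is not retroactively rescaled.
        start(getPositionUs(elapsedRealtimeMs), elapsedRealtimeMs);
        this.speed = speed;
    }

    long getPositionUs(long elapsedRealtimeMs) {
        long elapsedUs = (elapsedRealtimeMs - baseElapsedMs) * 1000;
        return baseUs + (long) (elapsedUs * speed);
    }
}
```

The re-anchoring in setSpeed mirrors why real clocks must be careful around speed changes: only time after the change should run at the new rate.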
3.3 How time is synchronized
First, look at where the sync method is called:
private void updatePlaybackPositions() throws ExoPlaybackException {
  MediaPeriodHolder playingPeriodHolder = queue.getPlayingPeriod();
  if (playingPeriodHolder == null) {
    return;
  }

  // Update the playback position.
  long discontinuityPositionUs =
      playingPeriodHolder.prepared
          ? playingPeriodHolder.mediaPeriod.readDiscontinuity()
          : C.TIME_UNSET;
  if (discontinuityPositionUs != C.TIME_UNSET) {
    resetRendererPosition(discontinuityPositionUs);
    // A MediaPeriod may report a discontinuity at the current playback position to ensure the
    // renderers are flushed. Only report the discontinuity externally if the position changed.
    if (discontinuityPositionUs != playbackInfo.positionUs) {
      playbackInfo =
          handlePositionDiscontinuity(
              playbackInfo.periodId,
              discontinuityPositionUs,
              playbackInfo.requestedContentPositionUs);
      playbackInfoUpdate.setPositionDiscontinuity(Player.DISCONTINUITY_REASON_INTERNAL);
    }
  } else {
    rendererPositionUs =
        mediaClock.syncAndGetPositionUs(
            /* isReadingAhead= */ playingPeriodHolder != queue.getReadingPeriod());
    long periodPositionUs = playingPeriodHolder.toPeriodTime(rendererPositionUs);
    maybeTriggerPendingMessages(playbackInfo.positionUs, periodPositionUs);
    playbackInfo.positionUs = periodPositionUs;
  }

  // Update the buffered position and total buffered duration.
  MediaPeriodHolder loadingPeriod = queue.getLoadingPeriod();
  playbackInfo.bufferedPositionUs = loadingPeriod.getBufferedPositionUs();
  playbackInfo.totalBufferedDurationUs = getTotalBufferedDurationUs();
}
Notice that MediaClock itself declares no syncAndGetPositionUs; that method belongs to DefaultMediaClock, an implementation of the MediaClock interface. It returns the current playback position; importantly, it is not synchronizing the video here, but synchronizing against the system clock before returning the audio position.
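Conceptually, DefaultMediaClock's job boils down to the following sketch: prefer the renderer (audio) clock when one is active, and keep the standalone fallback clock caught up with it so playback continues smoothly if the audio clock goes away. This is a hypothetical simplification, not the real class:

```java
// Hedged sketch of the idea behind DefaultMediaClock#syncAndGetPositionUs.
final class DefaultClockSketch {
    private long standalonePositionUs;

    DefaultClockSketch(long initialPositionUs) {
        this.standalonePositionUs = initialPositionUs;
    }

    // rendererPositionUs is null when no audio renderer clock is enabled.
    long syncAndGetPositionUs(Long rendererPositionUs) {
        if (rendererPositionUs != null) {
            // Keep the fallback clock synchronized with the audio clock.
            standalonePositionUs = rendererPositionUs;
            return rendererPositionUs;
        }
        return standalonePositionUs;
    }
}
```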
3.4 How is the audio position propagated to the video?
For this, look at doSomeWork, which ExoPlayer invokes periodically to drive the playback state. The method does a lot; here is a short excerpt:
private void doSomeWork() throws ExoPlaybackException, IOException {
  updatePlaybackPositions(); // update the playback positions
  // TODO: Each renderer should return the maximum delay before which it wishes to be called
  // again. The minimum of these values should then be used as the delay before the next
  // invocation of this method.
  renderer.render(rendererPositionUs, rendererPositionElapsedRealtimeUs); // push the position to each renderer
}
3.5 How the video synchronizes
In MediaCodecVideoRenderer, the position travels along render() -> drainOutputBuffer() -> processOutputBuffer(), and the decision is ultimately made in processOutputBuffer():
boolean treatDroppedBuffersAsSkipped = joiningDeadlineMs != C.TIME_UNSET;
if (shouldDropBuffersToKeyframe(earlyUs, elapsedRealtimeUs, isLastBuffer)
    && maybeDropBuffersToKeyframe(
        codec, bufferIndex, presentationTimeUs, positionUs, treatDroppedBuffersAsSkipped)) {
  return false;
} else if (shouldDropOutputBuffer(earlyUs, elapsedRealtimeUs, isLastBuffer)) {
  if (treatDroppedBuffersAsSkipped) {
    skipOutputBuffer(codec, bufferIndex, presentationTimeUs);
  } else {
    dropOutputBuffer(codec, bufferIndex, presentationTimeUs);
  }
  return true;
}

if (Util.SDK_INT >= 21) {
  // Let the underlying framework time the release.
  if (earlyUs < 50000) {
    notifyFrameMetadataListener(
        presentationTimeUs, adjustedReleaseTimeNs, format, currentMediaFormat);
    renderOutputBufferV21(codec, bufferIndex, presentationTimeUs, adjustedReleaseTimeNs);
    return true;
  }
} else {
  // We need to time the release ourselves.
  if (earlyUs < 30000) {
    if (earlyUs > 11000) {
      // We're a little too early to render the frame. Sleep until the frame can be rendered.
      // Note: The 11ms threshold was chosen fairly arbitrarily.
      try {
        // Subtracting 10000 rather than 11000 ensures the sleep time will be at least 1ms.
        Thread.sleep((earlyUs - 10000) / 1000);
      } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
        return false;
      }
    }
    notifyFrameMetadataListener(
        presentationTimeUs, adjustedReleaseTimeNs, format, currentMediaFormat);
    renderOutputBuffer(codec, bufferIndex, presentationTimeUs);
    return true;
  }
}
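The API < 21 branch above can be summarized as a pure decision function over earlyUs (how far ahead of the playback position the frame's presentation time is). The helper below is hypothetical; the 30 ms and 11 ms thresholds come from the snippet above, while the -30 ms drop threshold is an assumption based on ExoPlayer's default shouldDropOutputBuffer implementation, which is not shown here:

```java
// Classify what the video renderer does with a frame, given earlyUs, on API < 21.
final class ReleaseDecision {

    static String decideApi20(long earlyUs) {
        if (earlyUs < -30_000) return "drop";   // late beyond the (assumed) drop threshold
        if (earlyUs >= 30_000) return "wait";   // too early; retry on the next doSomeWork
        if (earlyUs > 11_000)  return "sleep";  // sleep roughly (earlyUs - 10ms), then render
        return "render";                        // within the window: render now
    }
}
```

Reading the thresholds this way makes the design visible: frames are rendered slightly early on purpose, slept on when moderately early, and only dropped once they are too late for the audio clock to ever catch them.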
4. Summary