Last week my lead asked me to investigate why the feed stream in our project was using so much memory and rendering so slowly on low-end phones. So I walked through how Glide parses WebP resources, and this post is a rough write-up of that process. If anything here is wrong, corrections are welcome.
How to load and display WebP with Glide
Our project uses a third-party open-source library to load WebP, so step one is adding the dependencies:
implementation "com.github.bumptech.glide:glide:4.8.0"
implementation "com.github.bumptech.glide:okhttp3-integration:4.8.0"
implementation "com.zlc.glide:webpdecoder:1.2.4.8.0"
Then the following code is all it takes to load and display a WebP resource:
Glide.with(context)
.load(feedUrl)
.into(holder.img)
One simple chained call loads and displays the WebP. It looks almost too elegant, and it left me baffled. What exactly does Glide do? How is the WebP parsed and decoded? How is it displayed? Why does it loop forever? There's only one way to find out:
read the source code.
The loading flow
Since this article only covers how Glide parses and renders WebP resources, the request and caching logic is out of scope for now; I may write it up in a later post (and there's a flag planted).
The figure above shows the classes involved in the WebP loading flow and how they relate. If it looks confusing, don't panic: I've split the flow into three parts, each explained in detail below, and the diagram should make sense once you've read them.
- How decoders are registered and looked up
- How the decoder decodes WebP
- How WebP playback is driven
How decoders are registered
Thanks to compile-time annotation processing, Glide is designed to be highly extensible and allows all kinds of custom loaders. The WebP library plugs in by extending LibraryGlideModule and registering its own decoders:
@GlideModule
public class WebpGlideLibraryModule extends LibraryGlideModule {
@Override
public void registerComponents(Context context, Glide glide, Registry registry) {
final Resources resources = context.getResources();
final BitmapPool bitmapPool = glide.getBitmapPool();
final ArrayPool arrayPool = glide.getArrayPool();
WebpDownsampler webpDownsampler = new WebpDownsampler(registry.getImageHeaderParsers(),
resources.getDisplayMetrics(), bitmapPool, arrayPool);
ByteBufferBitmapWebpDecoder byteBufferBitmapDecoder = new ByteBufferBitmapWebpDecoder(webpDownsampler);
StreamBitmapWebpDecoder streamBitmapDecoder = new StreamBitmapWebpDecoder(webpDownsampler, arrayPool);
ByteBufferWebpDecoder byteBufferWebpDecoder =
new ByteBufferWebpDecoder(context, arrayPool, bitmapPool);
registry
/* Static image decoding: decode into Bitmap and friends */
.prepend(Registry.BUCKET_BITMAP, ByteBuffer.class, Bitmap.class, byteBufferBitmapDecoder)
.prepend(Registry.BUCKET_BITMAP, InputStream.class, Bitmap.class, streamBitmapDecoder)
/* BitmapDrawables for static webp images */
.prepend(
Registry.BUCKET_BITMAP_DRAWABLE,
ByteBuffer.class,
BitmapDrawable.class,
new BitmapDrawableDecoder<>(resources, byteBufferBitmapDecoder))
.prepend(
Registry.BUCKET_BITMAP_DRAWABLE,
InputStream.class,
BitmapDrawable.class,
new BitmapDrawableDecoder<>(resources, streamBitmapDecoder))
/* Animated image decoding: these ones move */
.prepend(ByteBuffer.class, WebpDrawable.class, byteBufferWebpDecoder)
.prepend(InputStream.class, WebpDrawable.class, new StreamWebpDecoder(byteBufferWebpDecoder, arrayPool))
.prepend(WebpDrawable.class, new WebpDrawableEncoder());
}
}
The code above does one thing: it calls the registry's prepend method to register each type of decoder (Decoder) and encoder (Encoder), ready for later use. The Registry object has many members; the decoders we care about end up in its decoderRegistry member, of type ResourceDecoderRegistry.
public <Data, TResource> Registry prepend(
@NonNull String bucket,
@NonNull Class<Data> dataClass,
@NonNull Class<TResource> resourceClass,
@NonNull ResourceDecoder<Data, TResource> decoder) {
decoderRegistry.prepend(bucket, decoder, dataClass, resourceClass);
return this;
}
ResourceDecoderRegistry is a data store whose core structure is:
private final Map<String, List<Entry<?, ?>>> decoders = new HashMap<>();
Each registered decoder is held by an Entry object and inserted into a list, together with its dataClass and resourceClass. The relationship between the three: the decoder converts data of type dataClass into a resource of type resourceClass. For example, the registration below says that StreamWebpDecoder turns an InputStream into a WebpDrawable resource.
.prepend(InputStream.class, WebpDrawable.class, new StreamWebpDecoder(byteBufferWebpDecoder, arrayPool))
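The bucket-to-entry-list bookkeeping can be modeled with a minimal plain-Java sketch. MiniDecoderRegistry and its Entry are illustrative stand-ins, not Glide's actual classes, and real entries hold ResourceDecoder objects rather than names:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Minimal model of ResourceDecoderRegistry's bucket -> entry-list structure.
class MiniDecoderRegistry {
    static class Entry {
        final Class<?> dataClass;
        final Class<?> resourceClass;
        final String decoderName; // stand-in for the decoder object

        Entry(Class<?> dataClass, Class<?> resourceClass, String decoderName) {
            this.dataClass = dataClass;
            this.resourceClass = resourceClass;
            this.decoderName = decoderName;
        }

        boolean handles(Class<?> data, Class<?> resource) {
            return dataClass.isAssignableFrom(data) && resourceClass.isAssignableFrom(resource);
        }
    }

    private final Map<String, List<Entry>> decoders = new HashMap<>();

    // prepend: custom decoders go to the head of the bucket's list
    void prepend(String bucket, Entry entry) {
        decoders.computeIfAbsent(bucket, k -> new ArrayList<>()).add(0, entry);
    }

    // append: default decoders go to the tail
    void append(String bucket, Entry entry) {
        decoders.computeIfAbsent(bucket, k -> new ArrayList<>()).add(entry);
    }

    // Lookup walks the list from the head, so prepended decoders win.
    String firstHandling(String bucket, Class<?> data, Class<?> resource) {
        for (Entry e : decoders.getOrDefault(bucket, List.of())) {
            if (e.handles(data, resource)) {
                return e.decoderName;
            }
        }
        return null;
    }
}
```

The head-insertion in prepend is what lets the WebP library shadow Glide's built-in decoders without removing them.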
The prepend method inserts the Entry holding the decoder at the head of the bucket's list, and when Glide looks up a decoder it walks that list from the head, processing the data with the first match:
private Resource<ResourceType> decodeResourceWithList(DataRewinder<DataType> rewinder, int width,
        int height, @NonNull Options options, List<Throwable> exceptions) throws GlideException {
    Resource<ResourceType> result = null;
    for (int i = 0, size = decoders.size(); i < size; i++) {
        ResourceDecoder<DataType, ResourceType> decoder = decoders.get(i);
        try {
            DataType data = rewinder.rewindAndGet();
            if (decoder.handles(data, options)) {
                data = rewinder.rewindAndGet();
                result = decoder.decode(data, width, height, options);
            }
            ...
}
This gives custom decoders priority over the defaults. So when Glide needs to turn InputStream data into a WebpDrawable resource, it finds StreamWebpDecoder, which delegates to ByteBufferWebpDecoder:
// StreamWebpDecoder.java
public Resource<WebpDrawable> decode(InputStream inputStream, int width, int height, Options options) throws IOException {
    // drain the stream into a byte array, then hand off to the ByteBuffer-based decoder
    byte[] data = inputStreamToBytes(inputStream);
    ByteBuffer byteBuffer = ByteBuffer.wrap(data);
    return byteBufferDecoder.decode(byteBuffer, width, height, options);
}
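Before the bytes can be wrapped in a ByteBuffer, the decoder has to drain the InputStream into a byte array. A plain-Java sketch of such a helper (the class name StreamBytes is mine, not the library's):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

final class StreamBytes {
    // Drain an InputStream fully into a byte array so it can be
    // wrapped in a ByteBuffer for the ByteBuffer-based decoder.
    static byte[] toBytes(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buffer = new byte[16 * 1024];
        int read;
        while ((read = in.read(buffer)) != -1) {
            out.write(buffer, 0, read);
        }
        return out.toByteArray();
    }
}
```

Note that this copies the whole file into memory, which is one reason large animated WebP feeds can be memory-hungry.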
Next, let's see how ByteBufferWebpDecoder actually parses the WebP.
How the WebP is decoded
Go straight to ByteBufferWebpDecoder's decode method:
public Resource<WebpDrawable> decode(ByteBuffer source, int width, int height, Options options) throws IOException {
int length = source.remaining();
byte[] data = new byte[length];
source.get(data, 0, length);
WebpImage webp = WebpImage.create(data);
int sampleSize = getSampleSize(webp.getWidth(), webp.getHeight(), width, height);
WebpDecoder webpDecoder = new WebpDecoder(mProvider, webp, source, sampleSize);
webpDecoder.advance();
Bitmap firstFrame = webpDecoder.getNextFrame();
if (firstFrame == null) {
return null;
}
Transformation<Bitmap> unitTransformation = UnitTransformation.get();
return new WebpDrawableResource(new WebpDrawable(mContext, webpDecoder, mBitmapPool, unitTransformation, width, height,
firstFrame));
}
The decode method does three things:
- Copies the bytes out of source into an array and creates a WebpImage from them
- Creates a WebpDecoder and decodes the first frame
- Creates a WebpDrawable and returns it wrapped in a WebpDrawableResource
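The getSampleSize call in decode is not shown above. A common way to compute it is to pick the largest power-of-two divisor that keeps the decoded image at least as large as the requested target; this is a sketch of that idea, not necessarily the library's exact code:

```java
final class SampleSize {
    // Pick the largest power-of-two sample size that keeps the decoded
    // image at least as large as the requested target dimensions.
    static int getSampleSize(int srcWidth, int srcHeight, int targetWidth, int targetHeight) {
        if (targetWidth <= 0 || targetHeight <= 0) {
            return 1; // no meaningful target, decode at full size
        }
        int sampleSize = 1;
        while (srcWidth / (sampleSize * 2) >= targetWidth
                && srcHeight / (sampleSize * 2) >= targetHeight) {
            sampleSize *= 2;
        }
        return sampleSize;
    }
}
```

Downsampling like this is the main lever for the low-end-device memory problem: a sample size of 4 cuts pixel memory by a factor of 16.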
Creating the WebpImage
static {
System.loadLibrary("glide-webp");
}
public static WebpImage create(byte[] source) {
Preconditions.checkNotNull(source);
ByteBuffer byteBuffer = ByteBuffer.allocateDirect(source.length);
byteBuffer.put(source);
byteBuffer.rewind();
return nativeCreateFromDirectByteBuffer(byteBuffer);
}
WebpImage relies on the glide-webp native library, which parses the WebP file's byte stream and builds a descriptor object for its contents. Through a WebpImage you can read the .webp file's basic properties:
public class WebpImage {
// Access by native
@Keep
private long mNativePtr;
private int mWidth;            // WebP width
private int mHeigth;           // WebP height
private int mFrameCount;       // number of frames
private int mDurationMs;       // total duration in ms
private int[] mFrameDurations; // per-frame durations in ms
private int mLoopCount;        // loop count
....
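The parsing itself happens in native code, but the container is easy to sanity-check from Java: a WebP file is a RIFF container whose 12-byte header is "RIFF", a 4-byte little-endian size, then the "WEBP" FourCC. An illustrative check, not part of the library:

```java
final class WebpHeader {
    // A WebP file starts with "RIFF", a 4-byte little-endian chunk size,
    // then the "WEBP" FourCC: 12 bytes in total.
    static boolean looksLikeWebp(byte[] data) {
        if (data == null || data.length < 12) {
            return false;
        }
        return data[0] == 'R' && data[1] == 'I' && data[2] == 'F' && data[3] == 'F'
                && data[8] == 'W' && data[9] == 'E' && data[10] == 'B' && data[11] == 'P';
    }
}
```

Glide's ImageHeaderParsers do a check along these lines (plus VP8/VP8L/VP8X chunk inspection) to route the bytes to the right decoder.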
Creating the WebpDecoder and decoding the first frame
After the WebpDecoder is created, its getNextFrame method produces the first frame. This breaks down into three steps:
- Prepare a canvas: obtain a Bitmap from the pool and wrap it in a Canvas
- Render the data: draw the current frame into a scratch Bitmap, then draw that Bitmap onto the canvas from the first step
- Cache the bitmap: copy the Bitmap rendered in step two and put it in the frame cache
@Override
public Bitmap getNextFrame() {
int frameNumber = getCurrentFrameIndex();
// Obtain a bitmap from the pool
Bitmap bitmap = mBitmapProvider.obtain(downsampledWidth, downsampledHeight, Bitmap.Config.ARGB_8888);
Canvas canvas = new Canvas(bitmap);
canvas.drawColor(Color.TRANSPARENT, PorterDuff.Mode.SRC);
int nextIndex;
if (!isKeyFrame(frameNumber)) {
// Not a keyframe: fill with the background and transparent color first
nextIndex = prepareCanvasWithBlending(frameNumber - 1, canvas);
} else {
nextIndex = frameNumber;
}
for (int index = nextIndex; index < frameNumber; index++) {
WebpFrameInfo frameInfo = mFrameInfos[index];
if (!frameInfo.blendPreviousFrame) {
disposeToBackground(canvas, frameInfo);
}
// render the previous frame
renderFrame(index, canvas);
if (frameInfo.disposeBackgroundColor) {
disposeToBackground(canvas, frameInfo);
}
}
WebpFrameInfo frameInfo = mFrameInfos[frameNumber];
if (!frameInfo.blendPreviousFrame) {
disposeToBackground(canvas, frameInfo);
}
// Finally, we render the current frame. We don't dispose it.
renderFrame(frameNumber, canvas);
// Then put the rendered frame into the BitmapCache
cacheFrameBitmap(frameNumber, bitmap);
return bitmap;
}
private void renderFrame(int frameNumber, Canvas canvas) {
WebpFrameInfo frameInfo = mFrameInfos[frameNumber];
int frameWidth = frameInfo.width / sampleSize;
int frameHeight = frameInfo.height / sampleSize;
int xOffset = frameInfo.xOffset / sampleSize;
int yOffset = frameInfo.yOffset / sampleSize;
WebpFrame webpFrame = mWebPImage.getFrame(frameNumber);
try {
Bitmap frameBitmap = mBitmapProvider.obtain(frameWidth, frameHeight, mBitmapConfig);
frameBitmap.eraseColor(Color.TRANSPARENT);
webpFrame.renderFrame(frameWidth, frameHeight, frameBitmap);
canvas.drawBitmap(frameBitmap, xOffset, yOffset, null);
mBitmapProvider.release(frameBitmap);
} finally {
webpFrame.dispose();
}
}
private void cacheFrameBitmap(int frameNumber, Bitmap bitmap) {
// Release the old cached bitmap
mFrameBitmapCache.remove(frameNumber);
// Make a copy of the bitmap and cache it
Bitmap cache = mBitmapProvider.obtain(bitmap.getWidth(), bitmap.getHeight(), bitmap.getConfig());
cache.eraseColor(Color.TRANSPARENT);
Canvas canvas = new Canvas(cache);
canvas.drawBitmap(bitmap, 0, 0, null);
mFrameBitmapCache.put(frameNumber, cache);
}
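The keyframe logic in getNextFrame (walking back to the nearest frame that can be rendered without replaying its predecessors) can be modeled in plain Java. The field names below mirror WebpFrameInfo, but the classes are illustrative and the real keyframe test also considers frame offsets and transparency:

```java
final class KeyFrameWalk {
    // Simplified model: a frame is effectively a keyframe when it does not
    // blend with the previous frame and covers the full canvas.
    static class Frame {
        final boolean blendPreviousFrame;
        final boolean fullCanvas;

        Frame(boolean blendPreviousFrame, boolean fullCanvas) {
            this.blendPreviousFrame = blendPreviousFrame;
            this.fullCanvas = fullCanvas;
        }

        boolean isKeyFrame() {
            return !blendPreviousFrame && fullCanvas;
        }
    }

    // To render frame n we must start from the nearest keyframe at or
    // before n, then replay every frame in between onto the canvas.
    static int nearestKeyFrame(Frame[] frames, int n) {
        for (int i = n; i > 0; i--) {
            if (frames[i].isKeyFrame()) {
                return i;
            }
        }
        return 0; // frame 0 is always a valid starting point
    }
}
```

This replay loop is why seeking into the middle of an animation with few keyframes is expensive: every intermediate frame has to be re-rendered.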
Creating the WebpDrawable
After the WebpDrawable is created and returned wrapped in a WebpDrawableResource, Glide calls the resource's initialize method, and the initialized drawable is then handed to the ImageView for display.
public class WebpDrawableResource extends DrawableResource<WebpDrawable> implements Initializable {
public void initialize() {
// now the drawable has its first frame ready to display
drawable.getFirstFrame().prepareToDraw();
}
}
How WebP playback is driven
After the steps above, the ImageView can already render the first frame, so how does the WebP start moving? The answer is in WebpDrawable: when the drawable becomes visible or its start method is called, startRunning runs, and the frame loader is then asked to load the next frame:
private void startRunning() {
if(state.frameLoader.getFrameCount() == 1) {
invalidateSelf();
} else if(!isRunning) {
isRunning = true;
state.frameLoader.subscribe(this);
invalidateSelf();
}
}
private void start() {
if (!isRunning) {
isRunning = true;
isCleared = false;
loadNextFrame();
}
}
private void loadNextFrame() {
if (isRunning && !isLoadPending) {
if (startFromFirstFrame) {
gifDecoder.resetFrameIndex();
startFromFirstFrame = false;
}
isLoadPending = true;
int delay = gifDecoder.getNextDelay();
long targetTime = SystemClock.uptimeMillis() + (long) delay;
gifDecoder.advance();
next = new DelayTarget(handler, gifDecoder.getCurrentFrameIndex(), targetTime);
requestBuilder.clone().apply(RequestOptions.signatureOf(new FrameSignature())).load(gifDecoder).into(next);
}
}
As the code shows, the load of the next frame is targeted at a DelayTarget object:
static class DelayTarget extends SimpleTarget<Bitmap> {
.....
public void onResourceReady(Bitmap resource, Transition<? super Bitmap> transition) {
this.resource = resource;
// once the next frame is decoded, post a message to the handler
Message msg = handler.obtainMessage(1, this);
handler.sendMessageAtTime(msg, targetTime);
}
}
private class FrameLoaderCallback implements Handler.Callback {
    @Override
    public boolean handleMessage(Message msg) {
        if (msg.what == 1) {
            DelayTarget target = (DelayTarget) msg.obj;
            // the handler callback invokes onFrameReady
            WebpFrameLoader.this.onFrameReady(target);
            return true;
        }
        return false;
    }
}
void onFrameReady(DelayTarget delayTarget) {
if (isCleared) {
handler.obtainMessage(2, delayTarget).sendToTarget();
} else {
    //.. rendering-notification step omitted
    // load the next frame: this is what keeps the animation running
loadNextFrame();
}
}
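The loadNextFrame/onFrameReady cycle above amounts to scheduling each frame's message at now + delay. Assuming decoding is instantaneous, the schedule is simply cumulative; the sketch below models that timing in plain Java (FrameSchedule is an illustrative class, not part of Glide):

```java
final class FrameSchedule {
    // Given per-frame durations and a start time, compute the absolute
    // uptime at which each frame should be posted. This mirrors
    // targetTime = SystemClock.uptimeMillis() + delay in loadNextFrame(),
    // applied once per onFrameReady callback.
    static long[] targetTimes(long startUptimeMs, int[] frameDurationsMs) {
        long[] targets = new long[frameDurationsMs.length];
        long t = startUptimeMs;
        for (int i = 0; i < frameDurationsMs.length; i++) {
            t += frameDurationsMs[i]; // next frame is due after this frame's delay
            targets[i] = t;
        }
        return targets;
    }
}
```

In practice decode time eats into each delay, which is why sendMessageAtTime with an absolute target time is used instead of a fixed postDelayed: slow decodes don't accumulate drift, and once the last frame's message fires the loop starts over, which is exactly the endless looping observed at the beginning of this article.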