From: Tencent Online Education. Tencent Education began adopting Flutter last year. In the first half of this year we rebuilt the Tencent Classroom and Penguin Tutoring iPad apps, with roughly 80% of the code implemented in Flutter. Video on demand (VOD), one of the most important features for education products, likewise needs to be rendered through Flutter.
Two main approaches to this rendering are under investigation: PlatformView and the Texture widget. This article first outlines Flutter's rendering framework and how it works, then compares and analyzes the two approaches.
Flutter's Rendering Framework and Principles
Flutter consists of two main layers, Framework and Engine. Applications are built on the Framework layer, which drives the rendering pipeline's build, layout, and paint phases and produces layers. The Engine layer is a rendering engine written in C++: it composites the layers generated by the Framework into textures and submits the rendering data to the GPU through the OpenGL interface. The Framework's main libraries are:
Flutter: the lowest level of the Framework, providing utility classes and methods.
Painting: wraps the drawing interfaces exposed by the Flutter Engine, providing a more intuitive and convenient API for drawing widgets and other fixed-style graphics.
Animation: animation-related classes.
Gesture: gesture recognition, including touch-event type definitions and a number of built-in gesture recognizers.
Rendering: the rendering library; when Flutter's widget tree is actually displayed, it is converted into a corresponding render object (RenderObject) tree that performs layout and painting.
Rendering Principles
When the GPU emits a Vsync signal, Dart code runs to draw the new UI: executing the Dart code produces a layer tree, the compositor composites it, the Skia engine renders it into GPU data, and the result is finally submitted to the GPU via GL/Vulkan.
In more detail: when the UI needs to be updated, the Framework notifies the Engine, and the Engine notifies the Framework back when the next Vsync signal arrives. The Framework then runs animations, build, layout, compositing, and paint, and finally submits the generated layers to the Engine. The Engine composites the layers, generates textures, and submits the data to the GPU through the OpenGL interface.
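From the application side, this vsync-driven handshake can be sketched with Flutter's SchedulerBinding. This is an illustration of the public scheduling API, not the engine's internal code:

```dart
import 'package:flutter/scheduler.dart';

// Illustrative only: ask the engine to schedule a frame. The registered
// callback fires when the next Vsync signal arrives, at the start of the
// frame; the framework then runs animations, build, layout, compositing,
// and paint, and submits the resulting layers to the engine.
void requestFrame() {
  SchedulerBinding.instance.scheduleFrameCallback((Duration timestamp) {
    // Per-frame work goes here, before build/layout/paint.
  });
}
```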
Next, let's look at the characteristics and usage of each of the two approaches.
PlatformView
PlatformView is a component the Flutter team introduced in version 1.0 so that developers can embed native Android and iOS views as widgets inside Flutter, for example maps or video players. For a team that wants to try Flutter while migrating complex components at low cost, PlatformView is worth considering. The corresponding Dart classes are UiKitView on iOS and AndroidView on Android.
So how should PlatformView be used to implement VOD playback? The structure is shown in the figure below, where ARMPlatformView represents the business view.
Dart layer
1. Create the bridging class
The bridging class is the bridge between the native and Dart sides; its id must match the one obtained on the native side.
class VodPlayerController {
  VodPlayerController._(int id)
      : _channel = MethodChannel('ARMFlutterVodPlayerView_$id');

  final MethodChannel _channel;

  Future<void> play(String url) async {
    return _channel.invokeMethod('play', url);
  }

  Future<void> stop() async {
    return _channel.invokeMethod('stop');
  }
}
2. Create the callback
typedef void VodPlayerViewWidgetCreatedCallback(VodPlayerController controller);
3. Create the widget layout
class VodVideoWidget extends StatefulWidget {
  final VodPlayerViewWidgetCreatedCallback callback;
  final double x;
  final double y;
  final double width;
  final double height;

  VodVideoWidget({
    Key key,
    @required this.callback,
    @required this.x,
    @required this.y,
    @required this.width,
    @required this.height,
  }) : super(key: key);

  @override
  _VodVideoWidgetState createState() => _VodVideoWidgetState();
}

class _VodVideoWidgetState extends State<VodVideoWidget> {
  @override
  Widget build(BuildContext context) {
    return UiKitView(
      viewType: 'ARMFlutterVodPlayerView',
      onPlatformViewCreated: _onPlatformViewCreated,
      creationParams: <String, dynamic>{
        'x': widget.x,
        'y': widget.y,
        'width': widget.width,
        'height': widget.height,
      },
      creationParamsCodec: const StandardMessageCodec(),
    );
  }

  void _onPlatformViewCreated(int id) {
    if (widget.callback == null) {
      return;
    }
    widget.callback(VodPlayerController._(id));
  }
}
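To show how the pieces fit together, here is a hedged usage sketch assuming the VodVideoWidget and VodPlayerController classes above; the video URL is a hypothetical placeholder:

```dart
// Illustrative only: embed the platform view and start playback once the
// native view has been created. The URL is a placeholder, not a real asset.
Widget buildPlayer() {
  return VodVideoWidget(
    x: 0,
    y: 0,
    width: 320,
    height: 180,
    callback: (VodPlayerController controller) {
      // The native view exists at this point, so it is safe to invoke
      // method-channel calls against it.
      controller.play('https://example.com/video.mp4');
    },
  );
}
```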
Native layer
1. Register the ViewFactory
@implementation ARMFlutterVodPlugin

+ (void)registerWithRegistrar:(nonnull NSObject<FlutterPluginRegistrar> *)registrar {
    ARMFlutterVodPlayerFactory *vodFactory =
        [[ARMFlutterVodPlayerFactory alloc] initWithMessenger:registrar.messenger];
    [registrar registerViewFactory:vodFactory withId:@"ARMFlutterVodPlayerView"];
}

@end
2. Register the plugin
+ (void)registerWithRegistry:(NSObject<FlutterPluginRegistry> *)registry {
    [ARMFlutterVodPlugin registerWithRegistrar:[registry registrarForPlugin:@"ARMFlutterVodPlugin"]];
}
3. Implement the ViewFactory
@implementation ARMFlutterVodPlayerFactory

- (instancetype)initWithMessenger:(NSObject<FlutterBinaryMessenger> *)messenger {
    self = [super init];
    if (self) {
        _messenger = messenger;
    }
    return self;
}

- (NSObject<FlutterMessageCodec> *)createArgsCodec {
    return [FlutterStandardMessageCodec sharedInstance];
}

- (nonnull NSObject<FlutterPlatformView> *)createWithFrame:(CGRect)frame
                                            viewIdentifier:(int64_t)viewId
                                                 arguments:(id _Nullable)args {
    return [[ARMFlutterVodPlayerView alloc] initWithWithFrame:frame
                                               viewIdentifier:viewId
                                                    arguments:args
                                              binaryMessenger:self.messenger];
}

@end
4. Implement the view
@implementation ARMFlutterVodPlayerView

- (instancetype)initWithWithFrame:(CGRect)frame
                   viewIdentifier:(int64_t)viewId
                        arguments:(id)args
                  binaryMessenger:(NSObject<FlutterBinaryMessenger> *)messenger {
    if (self = [super init]) {
        NSDictionary *dic = args;
        CGFloat x = [dic[@"x"] floatValue];
        CGFloat y = [dic[@"y"] floatValue];
        CGFloat width = [dic[@"width"] floatValue];
        CGFloat height = [dic[@"height"] floatValue];
        ARMFlutterVodManager.shareInstance.mainView.frame = CGRectMake(x, y, width, height);

        NSString *channelName = [NSString stringWithFormat:@"ARMFlutterVodPlayerView_%lld", viewId];
        _channel = [FlutterMethodChannel methodChannelWithName:channelName binaryMessenger:messenger];
        __weak __typeof__(self) weakSelf = self;
        [_channel setMethodCallHandler:^(FlutterMethodCall *call, FlutterResult result) {
            [weakSelf onMethodCall:call result:result];
        }];
    }
    return self;
}

- (nonnull UIView *)view {
    return ARMFlutterVodManager.shareInstance.mainView;
}

- (void)onMethodCall:(FlutterMethodCall *)call result:(FlutterResult)result {
    if ([[call method] isEqualToString:@"play"]) {
        NSString *url = [call arguments];
        [ARMFlutterVodManager.shareInstance play:url];
        result(nil); // complete the pending Dart future
    } else {
        result(FlutterMethodNotImplemented);
    }
}

@end
Texture Widget
This approach renders video from a texture; Flutter's official video_player plugin is implemented this way. Taking iOS as the example, the native side must supply a CVPixelBufferRef to the Texture widget. The flow is illustrated in the figure below, where ARMTexture is the business object that provides the CVPixelBufferRef. The main steps are:
1. Adopt the FlutterTexture protocol.
2. Get the registry that manages registered textures:
    textures = [self.registrar textures];
3. Obtain a textureId:
    self.textureId = [textures registerTexture:self];
4. Override
    - (CVPixelBufferRef _Nullable)copyPixelBuffer
to return the CVPixelBufferRef.
5. Notify the Texture widget to fetch the CVPixelBufferRef:
    - (void)onDisplayLink {
        [textures textureFrameAvailable:self.textureId];
    }
Dart layer
MethodChannel _globalChannel = MethodChannel("ARMFlutterTextureVodPlayer");

class _ARMPlugin {
  MethodChannel get channel => MethodChannel("ARMFlutterTextureVodPlayer/$textureId");

  int textureId;

  _ARMPlugin(this.textureId);

  Future<void> play() async {
    await channel.invokeMethod("play");
  }

  Future<void> pause() async {
    await channel.invokeMethod("pause");
  }

  Future<void> stop() async {
    await channel.invokeMethod("stop");
  }

  Future<void> setNetworkDataSource(
      {String uri, Map<String, String> headers = const {}}) async {
    await channel.invokeMethod("setNetworkDataSource", <String, dynamic>{
      "uri": uri,
      "headers": headers,
    });
  }
}
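On the Dart side, the textureId obtained from the native registration is what actually drives rendering: it is passed to Flutter's built-in Texture widget, which displays whatever frames the native side publishes via textureFrameAvailable. A minimal sketch, assuming the classes above; the 'create' method name and the URL are hypothetical, since the article does not show how the id crosses the global channel:

```dart
// Illustrative only: fetch a textureId from the native side (the 'create'
// method is an assumed name) and hand it to the built-in Texture widget.
Future<Widget> buildTexturePlayer() async {
  final int textureId = await _globalChannel.invokeMethod('create');
  final plugin = _ARMPlugin(textureId);
  await plugin.setNetworkDataSource(uri: 'https://example.com/video.mp4');
  await plugin.play();
  // The Texture widget renders the pixel buffers the native side supplies.
  return Texture(textureId: textureId);
}
```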
Native layer
1. Register the plugin
@implementation ARMFlutterTextureVodPlugin

- (instancetype)initWithRegistrar:(NSObject<FlutterPluginRegistrar> *)registrar {
    self = [super init];
    if (self) {
        self.registrar = registrar;
    }
    return self;
}

+ (instancetype)pluginWithRegistrar:(NSObject<FlutterPluginRegistrar> *)registrar {
    return [[self alloc] initWithRegistrar:registrar];
}

+ (void)registerWithRegistrar:(NSObject<FlutterPluginRegistrar> *)registrar {
    FlutterMethodChannel *channel = [FlutterMethodChannel
        methodChannelWithName:@"ARMFlutterTextureVodPlayer"
              binaryMessenger:[registrar messenger]];
    ARMFlutterTextureVodPlugin *instance = [ARMFlutterTextureVodPlugin pluginWithRegistrar:registrar];
    [registrar addMethodCallDelegate:instance channel:channel];
}

- (void)handleMethodCall:(FlutterMethodCall *)call result:(FlutterResult)result {
}

@end
2. Manage the texture and obtain the textureId
+ (instancetype)armWithRegistrar:(NSObject<FlutterPluginRegistrar> *)registrar {
    return [[self alloc] initWithRegistrar:registrar];
}

- (instancetype)initWithRegistrar:(NSObject<FlutterPluginRegistrar> *)registrar {
    if (self = [super init]) {
        self.registrar = registrar;
        textures = [self.registrar textures];
        self.textureId = [textures registerTexture:self];
        NSString *channelName = [NSString stringWithFormat:@"ARMFlutterTextureVodPlayer/%lli", self.textureId];
        channel = [FlutterMethodChannel methodChannelWithName:channelName binaryMessenger:[registrar messenger]];
        __weak typeof(&*self) weakSelf = self;
        [channel setMethodCallHandler:^(FlutterMethodCall *call, FlutterResult result) {
            [weakSelf handleMethodCall:call result:result];
        }];
    }
    return self;
}
3. Override copyPixelBuffer
- (CVPixelBufferRef _Nullable)copyPixelBuffer {
    CVPixelBufferRef newBuffer = [self.vodPlayer framePixelbuffer];
    if (newBuffer) {
        CFRetain(newBuffer);
        // Atomically publish the new frame into latestPixelBuffer and hand
        // the previously published frame to the engine, which releases it
        // when it is done with it.
        CVPixelBufferRef pixelBuffer = latestPixelBuffer;
        while (!OSAtomicCompareAndSwapPtrBarrier(pixelBuffer, newBuffer, (void **)&latestPixelBuffer)) {
            pixelBuffer = latestPixelBuffer;
        }
        return pixelBuffer;
    }
    return NULL;
}
4. Call textureFrameAvailable
This must be called explicitly to tell the TextureRegistry to refresh the picture:
    displayLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(onDisplayLink)];
    displayLink.frameInterval = 1;
    [displayLink addToRunLoop:[NSRunLoop currentRunLoop] forMode:NSRunLoopCommonModes];

- (void)onDisplayLink {
    [textures textureFrameAvailable:self.textureId];
}
Performance Comparison
Playing the same MP4 video with both PlatformView and the Texture widget, the Texture widget performed somewhat worse. The main reason is that when a CVPixelBufferRef is supplied to Flutter as a texture, the data goes through a GPU -> CPU -> GPU copy on its way from native to Flutter. (Version 1.0 comparison data: table omitted.)
Problems Encountered
The Flutter player integrated into the Classroom iPad app uses the Texture approach. While implementing the PlatformView and Texture widget approaches, we mainly ran into the following problems.
PlatformView memory growth
After the Classroom app played several videos in a row, memory usage spiked. The main cause: every OpenGL operation requires [EAGLContext setCurrentContext:context_], but the IOSGLRenderTarget destructor did not set the context before cleaning up.
Black screen on the previous Flutter page after leaving a live class
After exiting a live class and returning to the previous Flutter page, the page went black. Live video rendering also uses OpenGL; on exit it is not enough to set the context: the framebuffer must also be cleared and the textures reset. The teardown code:

EAGLContext *prevContext = [EAGLContext currentContext];
[EAGLContext setCurrentContext:_context];
_renderer = nil;
glBindTexture(GL_TEXTURE_2D, 0);
glBindBuffer(GL_ARRAY_BUFFER, 0);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);
if (_framebuffer) {
    glDeleteFramebuffers(1, &_framebuffer);
    _framebuffer = 0;
}
if (_renderbuffer) {
    glDeleteRenderbuffers(1, &_renderbuffer);
    _renderbuffer = 0;
}
if (_program) {
    glDeleteProgram(_program);
    _program = 0;
}
_context = nil;
[EAGLContext setCurrentContext:prevContext];
Texture widget performance initially worse than PlatformView
The main reason, again, is the GPU -> CPU -> GPU copy incurred when a CVPixelBufferRef is handed to Flutter as a texture. We changed the pipeline from "native generates a texture ID -> copy into a PixelBuffer -> generate a new texture ID" to "native generates a texture ID -> render directly", removing the repeated copies and the memory problems they caused. After this optimization, the overall performance of the Texture widget is better than PlatformView. (Version 2.0 comparison data: table omitted.)
Summary
The Flutter player plugin, built on ARMPlayer (the education team's in-house player), is now used in Tencent Classroom for iPad with the optimized Texture widget approach. The Texture widget is the officially recommended approach: it works for video as well as images and is easy to extend, and because frames are attached to the layer tree as textures it remains platform-independent and reusable across platforms. After optimization its performance also beats PlatformView. The Flutter player plugin is a new experiment in combining the education client middle platform (big front end plus VOD).