Tencent Education App: Flutter Cross-Platform VOD Component in Practice
From: Tencent Online Education. Tencent Education began adopting Flutter last year. In the first half of this year we rebuilt the Tencent Classroom and Penguin Tutoring iPad apps, implementing 80% of the code in Flutter. VOD (video-on-demand) playback, the most important feature for education, likewise had to be migrated to render in Flutter.
Two rendering approaches are currently under evaluation: PlatformView and the Texture Widget. This article first outlines Flutter's rendering framework and principles, then compares and analyzes the two approaches.
Flutter's Rendering Framework and Principles
Flutter consists of two main layers: the Framework and the Engine. Applications are built on the Framework layer, which handles the build, layout, and paint phases of rendering and produces layers. The Engine layer is a rendering engine implemented in C++; it composites the layers produced by the Framework, generates textures, and submits the render data to the GPU through the OpenGL interface.
The Framework layer's main libraries:

- Flutter (foundation): the bottom layer of the Framework, providing utility classes and methods.
- Painting: wraps the drawing interfaces provided by the Flutter Engine, mainly to offer more intuitive and convenient APIs for drawing fixed-style graphics such as controls.
- Animation: animation-related classes.
- Gesture: gesture recognition, including touch-event type definitions and a number of built-in gesture recognizers.
- Rendering: the rendering library; when Flutter's widget tree is actually displayed, it is converted into a corresponding render-object (RenderObject) tree that performs layout and painting.
Rendering Principles
When a Vsync signal arrives, the Dart code runs to draw the new UI and produces a Layer Tree; the Compositor composites it, the Skia engine rasterizes it into GPU data, and the data is finally submitted to the GPU through GL/Vulkan.

In more detail: when the UI needs updating, the Framework notifies the Engine, and the Engine waits for the next Vsync signal before notifying the Framework back. The Framework then runs animations, build, layout, compositing, and paint, and finally produces layers that it hands to the Engine. The Engine composites the layers, generates textures, and submits the data to the GPU through the OpenGL interface.
The next sections analyze the characteristics and usage of each of the two approaches in turn.
PlatformView
PlatformView is a widget that Flutter officially introduced in version 1.0 so that developers can embed native Android and iOS views in Flutter, for example maps or video players. For a team that wants to try Flutter but also wants to migrate complex components at low cost, PlatformView is worth considering. The Dart class maps to UIKitView on iOS and AndroidView on Android.
那么PlatformView在點播功能中應該怎么實現,如下圖所示:
其中的ARMPlatformView代表業務View。
Dart Layer
1. Create the bridge class
The bridge class connects the native and Dart sides; its id must match the view id obtained on the native side.
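To preview how the pieces built in the following steps fit together, here is a hypothetical usage sketch. The page class, URL, and geometry values are placeholders for illustration, not part of the original component:

```dart
import 'package:flutter/material.dart';

// Hypothetical usage sketch of the VodVideoWidget / VodPlayerController
// pair defined in the steps below. The URL and sizes are placeholders.
class PlayerPage extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return SizedBox(
      width: 640,
      height: 360,
      child: VodVideoWidget(
        x: 0,
        y: 0,
        width: 640,
        height: 360,
        // The callback fires once the native platform view exists, handing
        // back a controller bound to that view's MethodChannel.
        callback: (VodPlayerController controller) {
          controller.play('https://example.com/demo.mp4');
        },
      ),
    );
  }
}
```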
```dart
class VodPlayerController {
  VodPlayerController._(int id)
      : _channel = MethodChannel('ARMFlutterVodPlayerView_$id');

  final MethodChannel _channel;

  Future<void> play(String url) async {
    return _channel.invokeMethod('play', url);
  }

  Future<void> stop() async {
    return _channel.invokeMethod('stop');
  }
}
```

2. Create the callback
```dart
typedef void VodPlayerViewWidgetCreatedCallback(VodPlayerController controller);
```

3. Create the widget layout
```dart
class VodVideoWidget extends StatefulWidget {
  final VodPlayerViewWidgetCreatedCallback callback;
  final x;
  final y;
  final width;
  final height;

  VodVideoWidget({
    Key key,
    @required this.callback,
    @required this.x,
    @required this.y,
    @required this.width,
    @required this.height,
  });

  @override
  _VodVideoWidgetState createState() => _VodVideoWidgetState();
}

class _VodVideoWidgetState extends State<VodVideoWidget> {
  @override
  Widget build(BuildContext context) {
    return UiKitView(
      viewType: 'ARMFlutterVodPlayerView',
      onPlatformViewCreated: _onPlatformViewCreated,
      creationParams: <String, dynamic>{
        'x': widget.x,
        'y': widget.y,
        'width': widget.width,
        'height': widget.height,
      },
      creationParamsCodec: new StandardMessageCodec(),
    );
  }

  void _onPlatformViewCreated(int id) {
    if (widget.callback == null) {
      return;
    }
    widget.callback(VodPlayerController._(id));
  }
}
```

Native Layer
1. Register the ViewFactory
```objectivec
@implementation ARMFlutterVodPlugin

+ (void)registerWithRegistrar:(nonnull NSObject<FlutterPluginRegistrar> *)registrar {
    ARMFlutterVodPlayerFactory *vodFactory =
        [[ARMFlutterVodPlayerFactory alloc] initWithMessenger:registrar.messenger];
    [registrar registerViewFactory:vodFactory withId:@"ARMFlutterVodPlayerView"];
}

@end
```

2. Register the plugin
```objectivec
+ (void)registerWithRegistry:(NSObject<FlutterPluginRegistry> *)registry {
    [ARMFlutterVodPlugin registerWithRegistrar:
        [registry registrarForPlugin:@"ARMFlutterVodPlugin"]];
}
```

3. Implement the ViewFactory
```objectivec
@implementation ARMFlutterVodPlayerFactory

- (instancetype)initWithMessenger:(NSObject<FlutterBinaryMessenger> *)messager {
    self = [super init];
    if (self) {
        _messenger = messager;
    }
    return self;
}

- (NSObject<FlutterMessageCodec> *)createArgsCodec {
    return [FlutterStandardMessageCodec sharedInstance];
}

- (nonnull NSObject<FlutterPlatformView> *)createWithFrame:(CGRect)frame
                                            viewIdentifier:(int64_t)viewId
                                                 arguments:(id _Nullable)args {
    return [[ARMFlutterVodPlayerView alloc] initWithWithFrame:frame
                                               viewIdentifier:viewId
                                                    arguments:args
                                              binaryMessenger:self.messenger];
}

@end
```

4. Implement the view
```objectivec
@implementation ARMFlutterVodPlayerView

- (instancetype)initWithWithFrame:(CGRect)frame
                   viewIdentifier:(int64_t)viewId
                        arguments:(id)args
                  binaryMessenger:(NSObject<FlutterBinaryMessenger> *)messenger {
    if (self = [super init]) {
        NSDictionary *dic = args;
        CGFloat x = [dic[@"x"] floatValue];
        CGFloat y = [dic[@"y"] floatValue];
        CGFloat width = [dic[@"width"] floatValue];
        CGFloat height = [dic[@"height"] floatValue];
        ARMFlutterVodManager.shareInstance.mainView.frame = CGRectMake(x, y, width, height);

        NSString *channelName =
            [NSString stringWithFormat:@"ARMFlutterVodPlayerView_%lld", viewId];
        _channel = [FlutterMethodChannel methodChannelWithName:channelName
                                               binaryMessenger:messenger];
        __weak __typeof__(self) weakSelf = self;
        [_channel setMethodCallHandler:^(FlutterMethodCall *call, FlutterResult result) {
            [weakSelf onMethodCall:call result:result];
        }];
    }
    return self;
}

- (nonnull UIView *)view {
    return ARMFlutterVodManager.shareInstance.mainView;
}

- (void)onMethodCall:(FlutterMethodCall *)call result:(FlutterResult)result {
    if ([[call method] isEqualToString:@"play"]) {
        NSString *url = [call arguments];
        [ARMFlutterVodManager.shareInstance play:url];
    } else {
        result(FlutterMethodNotImplemented);
    }
}

@end
```

Texture Widget
This approach renders video via textures; Flutter's official video_player plugin is implemented this way. On iOS, the native side must supply a CVPixelBufferRef to the Texture Widget. The flow is shown in the figure below, where ARMTexture is the business object that provides the CVPixelBufferRef. The main steps:

1. Conform to the FlutterTexture protocol.
2. Obtain the registry that manages registered textures: `textures = [self.registrar textures];`
3. Obtain a textureId: `self.textureId = [textures registerTexture:self];`
4. Override `- (CVPixelBufferRef _Nullable)copyPixelBuffer` to return a CVPixelBufferRef.
5. Notify the engine to fetch a new CVPixelBufferRef: `- (void)onDisplayLink { [textures textureFrameAvailable:self.textureId]; }`

Dart Layer
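On the Dart side the texture approach is comparatively thin: once the native side hands back a textureId, frames are displayed with Flutter's built-in Texture widget. A minimal hedged sketch follows; the `create` method name on the global channel is an assumption for illustration, not part of the original code:

```dart
import 'package:flutter/material.dart';
import 'package:flutter/services.dart';

// Hypothetical sketch: ask the native plugin for a textureId over the
// global channel, then let Flutter's Texture widget paint whatever the
// registered native texture renders. "create" is an assumed method name.
Future<Widget> buildTexturePlayer(String url) async {
  const MethodChannel globalChannel =
      MethodChannel('ARMFlutterTextureVodPlayer');
  final int textureId = await globalChannel.invokeMethod('create', url);
  // Texture is platform-independent on the Dart side: it simply samples
  // the GPU texture registered under this id each frame.
  return Texture(textureId: textureId);
}
```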
```dart
MethodChannel _globalChannel = MethodChannel("ARMFlutterTextureVodPlayer");

class _ARMPlugin {
  MethodChannel get channel =>
      MethodChannel("ARMFlutterTextureVodPlayer/$textureId");

  int textureId;

  _ARMPlugin(this.textureId);

  Future<void> play() async {
    await channel.invokeMethod("play");
  }

  Future<void> pause() async {
    await channel.invokeMethod("pause");
  }

  Future<void> stop() async {
    await channel.invokeMethod("stop");
  }

  Future<void> setNetworkDataSource(
      {String uri, Map<String, String> headers = const {}}) async {
    await channel.invokeMethod("setNetworkDataSource", <String, dynamic>{
      "uri": uri,
      "headers": headers,
    });
  }
}
```

Native Layer
1. Register the plugin
```objectivec
@implementation ARMFlutterTextureVodPlugin

- (instancetype)initWithRegistrar:(NSObject<FlutterPluginRegistrar> *)registrar {
    self = [super init];
    if (self) {
        self.registrar = registrar;
    }
    return self;
}

+ (instancetype)pluginWithRegistrar:(NSObject<FlutterPluginRegistrar> *)registrar {
    return [[self alloc] initWithRegistrar:registrar];
}

+ (void)registerWithRegistrar:(NSObject<FlutterPluginRegistrar> *)registrar {
    FlutterMethodChannel *channel =
        [FlutterMethodChannel methodChannelWithName:@"ARMFlutterTextureVodPlayer"
                                    binaryMessenger:[registrar messenger]];
    ARMFlutterTextureVodPlugin *instance =
        [ARMFlutterTextureVodPlugin pluginWithRegistrar:registrar];
    [registrar addMethodCallDelegate:instance channel:channel];
}

- (void)handleMethodCall:(FlutterMethodCall *)call result:(FlutterResult)result {
}

@end
```

2. Manage the texture and obtain the textureId
```objectivec
+ (instancetype)armWithRegistrar:(NSObject<FlutterPluginRegistrar> *)registrar {
    return [[self alloc] initWithRegistrar:registrar];
}

- (instancetype)initWithRegistrar:(NSObject<FlutterPluginRegistrar> *)registrar {
    if (self = [super init]) {
        self.registrar = registrar;
        textures = [self.registrar textures];
        self.textureId = [textures registerTexture:self];
        NSString *channelName =
            [NSString stringWithFormat:@"ARMFlutterTextureVodPlayer/%lli", self.textureId];
        channel = [FlutterMethodChannel methodChannelWithName:channelName
                                              binaryMessenger:[registrar messenger]];
        __weak typeof(&*self) weakSelf = self;
        [channel setMethodCallHandler:^(FlutterMethodCall *call, FlutterResult result) {
            [weakSelf handleMethodCall:call result:result];
        }];
    }
    return self;
}
```

3. Override copyPixelBuffer
```objectivec
- (CVPixelBufferRef _Nullable)copyPixelBuffer {
    CVPixelBufferRef newBuffer = [self.vodPlayer framePixelbuffer];
    if (newBuffer) {
        CFRetain(newBuffer);
        // Atomically swap in the newest frame and hand back the previous one.
        CVPixelBufferRef pixelBuffer = latestPixelBuffer;
        while (!OSAtomicCompareAndSwapPtrBarrier(pixelBuffer, newBuffer,
                                                 (void **)&latestPixelBuffer)) {
            pixelBuffer = latestPixelBuffer;
        }
        return pixelBuffer;
    }
    return NULL;
}
```

4. Call textureFrameAvailable
這里是需要主動調用的,告訴TextureRegistry更新畫面。displayLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(onDisplayLink)]; displayLink.frameInterval = 1; [displayLink addToRunLoop:[NSRunLoop currentRunLoop] forMode:NSRunLoopCommonModes];- (void)onDisplayLink { [textures textureFrameAvailable:self.textureId]; }性能對比
Comparing PlatformView and the Texture Widget playing the same MP4 video, the Texture Widget performed somewhat worse. The main reason is that the CVPixelBufferRef supplied to Flutter's Texture goes through a GPU -> CPU -> GPU copy on its way from native to Flutter. The 1.0 comparison data is shown below:

Problems Encountered
The classroom iPad app integrates the Flutter player using the Texture approach. While implementing both the PlatformView and Texture Widget approaches, we ran into the following problems.

PlatformView memory growth
課堂在連續播放視頻之后,出現內存暴增問題,主要原因是OpenGL操作都需要設置[EAGLContext setCurrentContext:context_],在IOSGLRenderTarget析構的時候,沒有設置context上下文。課堂直播場景退出之后,前面Flutter頁面出現黑屏
課堂直播課退出之后回到上一個Flutter頁面出現頁面黑屏,直播視頻渲染也是采用OpenGL,當直播退出的時候不僅需要設置context,還需要清空幀緩沖區,重置紋理,銷毀代碼如下:EAGLContext *prevContext = [EAGLContext currentContext]; [EAGLContext setCurrentContext:_context]; _renderer = nil; glBindTexture(GL_TEXTURE_2D, 0); glBindBuffer(GL_ARRAY_BUFFER, 0); glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0); if (_framebuffer) { glDeleteFramebuffers(1, &_framebuffer); _framebuffer = 0; } if (_renderbuffer) { glDeleteRenderbuffers(1, &_renderbuffer); _renderbuffer = 0; } if (_program) { glDeleteProgram(_program); _program = 0; } _context = nil; [EAGLContext setCurrentContext:prevContext];Texture Widget內存相比PlatformView性能相對差一些
Again, the main reason is that the CVPixelBufferRef supplied to Flutter's Texture incurs a GPU -> CPU -> GPU copy from native to Flutter. We therefore changed the pipeline from "native generates a texture ID -> copy into a PixelBuffer -> generate a new texture ID" to "native generates a texture ID -> render directly", eliminating the memory cost of the extra copies. After this optimization, the overall performance of the Texture Widget is better than PlatformView. The 2.0 post-optimization comparison data is shown below:

Summary
The Flutter player plugin, built on ARMPlayer (the education team's in-house player), is now used in the Tencent Classroom iPad app with the optimized Texture Widget approach. The Texture Widget is the officially recommended approach: both video and images can go through Texture, it is easy to extend, and because frames are attached to the LayerTree as textures it stays platform-independent and reusable across platforms. After optimization its performance also exceeds PlatformView. The Flutter player plugin is a new experiment in combining the education client middle platform (big front end and VOD).