Android Video Playback: From Framework to Driver

I will analyze the local video playback process in Android 4.1 on the PandaBoard. Hardware acceleration for HD video playback will also be discussed.

1. Video File

A video file usually contains two parts: the file header and the coded video frames. Accordingly, there are two steps in video playback: parsing the file header and decoding the video frames. The media player must implement both. A media player consists of a media player engine and a media codec plugin: the former parses the file header, and the latter decodes the video frames.


[Figure: a general video file]
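As a toy illustration of this split (hypothetical types only, not Android's real classes): the engine owns the parsing loop and hands each coded frame to whatever decoder was plugged in.

    // Toy sketch only; all names here are hypothetical.
    #include <cstdint>
    #include <vector>

    struct CodedFrame { std::vector<uint8_t> bytes; };  // compressed data
    struct Picture    { int width = 0, height = 0; };   // raw, displayable

    // Role of the media codec plugin: coded frame in, raw picture out.
    struct Decoder {
        virtual ~Decoder() {}
        virtual Picture decode(const CodedFrame &frame) = 0;
    };

    // Role of the media player engine: parse the file header, then pull
    // coded frames and feed them to the plugged-in decoder.
    struct Engine {
        explicit Engine(Decoder *d) : dec(d) {}
        void play() {
            // 1) parse header: container type, codec id, frame index, ...
            // 2) loop: read the next coded frame, decode, render
            CodedFrame frame;                  // read from the file
            Picture pic = dec->decode(frame);  // codec plugin does the work
            (void)pic;                         // hand pic to the display
        }
        Decoder *dec;
    };

    struct NullDecoder : Decoder {
        Picture decode(const CodedFrame &) override { return Picture{1280, 720}; }
    };

    int main() {
        NullDecoder dec;
        Engine engine(&dec);
        engine.play();
    }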

2. Architecture Overview

The diagram shows the layers from the media player application down to the DSP driver and firmware. Please keep this picture in mind.


3. Multimedia Framework Evolution

The media player engine in Android keeps evolving. There are many articles on the Android media framework, each based on a different Android version; if you don't know about this evolution, the articles may confuse you. Both the engine and the directory structure have changed.

  • Engine

OpenCore was the player engine in Android 1.6. Stagefright and NuPlayer gradually replaced OpenCore: Stagefright plays local media files, while NuPlayer handles online streams.

  • Directory

In 4.1 (JB), the native code of the media framework (libeffects, libmedia, libmediaplayerservice, libstagefright, mediaserver) was moved from AOSP/frameworks/base/media to AOSP/frameworks/av/media.

4. Jellybean MM Architecture

I got the diagram from the internet. It shows the structure of the main classes and files in JellyBean's media framework.


5. Local Video Playback Process

The diagram shows a simple, clear local video playback process at the Java API layer. In the next sections, I will introduce the details beneath these APIs.
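Under the Java API sits the native client class android::MediaPlayer (frameworks/av/media/libmedia/mediaplayer.cpp), which mirrors the Java API step for step. A minimal local playback sequence through it looks roughly like this fragment (error checks omitted; fd, length, and texture are assumed to be an open file descriptor for the video, its size, and a surface to render into):

    // Rough sketch of the native playback sequence.
    sp<MediaPlayer> player = new MediaPlayer();
    player->setDataSource(fd, 0, length);     // select engine, create extractor
    player->setVideoSurfaceTexture(texture);  // where decoded frames will land
    player->prepare();                        // parse header, set up the codec
    player->start();                          // run the decode/render loop
    // ... playback ...
    player->stop();
    player->reset();                          // tear down engine and codec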


6. Select Media Player Engine

MediaPlayer::setDataSource() is a key function that is used to:

  1. Select the proper media player engine for the given media file type
  2. Create a MediaExtractor to parse the metadata of the media file

I got the sequence diagram from the internet.


We can follow the diagram to trace the code. Let's see how the correct player is selected.
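In 4.1 the selection logic lives in MediaPlayerFactory (frameworks/av/media/libmediaplayerservice/MediaPlayerFactory.cpp): every registered factory scores the data source, and the factory with the best score provides the engine. Below is a standalone sketch of that idea; the scoring functions and values are made up for illustration, only the pattern is real.

    // Standalone illustration of score-based engine selection.
    #include <iostream>
    #include <string>

    enum PlayerType { STAGEFRIGHT_PLAYER, NU_PLAYER };

    static bool hasPrefix(const std::string &s, const char *p) {
        return s.rfind(p, 0) == 0;
    }

    // Stagefright: strong candidate for local files.
    static float scoreStagefright(const std::string &src) {
        return hasPrefix(src, "http") || hasPrefix(src, "rtsp") ? 0.0f : 0.8f;
    }

    // NuPlayer: takes streaming sources such as HTTP live streams or RTSP.
    static float scoreNuPlayer(const std::string &src) {
        return hasPrefix(src, "http") || hasPrefix(src, "rtsp") ? 0.9f : 0.0f;
    }

    PlayerType getPlayerType(const std::string &src) {
        PlayerType best = STAGEFRIGHT_PLAYER;   // default engine
        float bestScore = scoreStagefright(src);
        if (scoreNuPlayer(src) > bestScore) best = NU_PLAYER;
        return best;
    }

    int main() {
        std::cout << getPlayerType("/sdcard/movie.mp4") << "\n";  // 0: Stagefright
        std::cout << getPlayerType("rtsp://host/stream") << "\n"; // 1: NuPlayer
    }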


The rightmost slot in the diagram is MediaPlayerService, an important service in Android. MediaPlayerService is added to the Service Manager by the media server, which is started during the Android boot process.
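The registration is condensed below from AOSP 4.1 (frameworks/av/media/mediaserver/main_mediaserver.cpp and MediaPlayerService.cpp); other services and error handling are trimmed, so treat it as a sketch rather than verbatim code.

    // mediaserver's main(): publish the media services, then serve Binder.
    int main(int argc, char **argv) {
        sp<ProcessState> proc(ProcessState::self());
        sp<IServiceManager> sm = defaultServiceManager();
        AudioFlinger::instantiate();
        MediaPlayerService::instantiate();   // <-- registers the player service
        CameraService::instantiate();
        ProcessState::self()->startThreadPool();
        IPCThreadState::self()->joinThreadPool();
    }

    // Which boils down to one addService() call:
    void MediaPlayerService::instantiate() {
        defaultServiceManager()->addService(
                String16("media.player"), new MediaPlayerService());
    }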



7. Load Media Codec Plugin

The correct media player engine has been created, but it cannot decode video frames by itself. We need to load a media codec plugin, which will be connected to the media player engine to decode the video frames.


There are two kinds of plugins: hardware and software. AOSP implements a software OMX plugin, while the chipset vendor usually implements a hardware OMX plugin. Let's see how the hardware OMX plugin is created and added. The code for these two steps depends on the hardware platform; I give the analysis for the PandaBoard. (A condensed code sketch follows the two steps below.)

    1. Create OMX Plugin.

    The vendor's hardware OMX plugin is loaded dynamically from a .so file.

    In the OMX plugin's initialization function, a table of the supported media formats is created.



    2. Add OMX Plugin

    Now we can add the plugin to the media player engine's plugin list. OMXMaster keeps this list, and the media player engine holds a pointer to OMXMaster. This is how the media player engine is connected to the media codec.
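Both steps happen inside OMXMaster. The following is condensed from frameworks/av/media/libstagefright/omx/OMXMaster.cpp in 4.1, with logging and component enumeration trimmed:

    // OMXMaster: builds the plugin list the engine will search for codecs.
    OMXMaster::OMXMaster()
        : mVendorLibHandle(NULL) {
        addVendorPlugin();             // the hardware plugin, if any
        addPlugin(new SoftOMXPlugin);  // software codecs shipped with AOSP
    }

    void OMXMaster::addVendorPlugin() {
        addPlugin("libstagefrighthw.so");   // the vendor's library
    }

    void OMXMaster::addPlugin(const char *libname) {
        mVendorLibHandle = dlopen(libname, RTLD_NOW);   // step 1: load .so
        if (mVendorLibHandle == NULL) {
            return;   // no hardware plugin on this build
        }
        typedef OMXPluginBase *(*CreateOMXPluginFunc)();
        CreateOMXPluginFunc createOMXPlugin =
            (CreateOMXPluginFunc)dlsym(mVendorLibHandle, "createOMXPlugin");
        if (createOMXPlugin) {
            addPlugin((*createOMXPlugin)());   // step 2: add to the list
        }
    }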


8. Match and Init a Codec

9. TI OpenMAX IL Architecture

10. Load DSP Bridge Driver

11. Boot kernel with DSP


(To be continued)


