Reposted from [yanbixing123]: "Android MultiMedia Framework Explained - From Boot to the MediaServer Registration Process"
1. Overview
MediaPlayer is Android's multimedia playback class; it controls the playback of audio/video streams and local audio/video resources. A typical playback flow looks like this:
MediaPlayer mediaPlayer = new MediaPlayer();
mediaPlayer.setOnCompletionListener(new OnCompletionListener() {
    @Override
    public void onCompletion(MediaPlayer mp) {
        mp.release();
    }
});
mediaPlayer.setDataSource("abc.mp3");
mediaPlayer.setDisplay(surfaceHolder); // surfaceHolder: the SurfaceHolder of the target SurfaceView (needed for video)
mediaPlayer.prepare();
mediaPlayer.start();
The rest of this article follows this flow and analyzes the whole playback process step by step.
2. How MediaServer Starts
Android is built on the Linux kernel. On Linux, the first process to start is init; every other process is a descendant of init. During startup, init parses the init.rc configuration script. Based on its contents, init mounts Android's file systems, creates system directories, and launches Android's essential daemons, including the USB daemon, the adb daemon, and the vold daemon.
init also starts important services such as MediaServer (the multimedia service) and ServiceManager (the Binder service registry). In addition, init forks the Zygote process, the first Java process in the system and the parent of all subsequent Java processes, as shown in the figure:
(1) In Android 10.0, the startup script that references the media service is system/core/rootdir/init.zygote64.rc, which defines the zygote service:
service zygote /system/bin/app_process64 -Xzygote /system/bin --zygote --start-system-server
    class main
    priority -20
    user root
    group root readproc reserved_disk
    socket zygote stream 660 root system
    socket usap_pool_primary stream 660 root system
    onrestart write /sys/android_power/request_state wake
    onrestart write /sys/power/state on
    onrestart restart audioserver
    onrestart restart cameraserver
    onrestart restart media
    onrestart restart netd
    onrestart restart wificond
    writepid /dev/cpuset/foreground/tasks
(2) The directive `onrestart restart media` restarts the media service whenever zygote restarts. The media service itself is defined by the startup script in frameworks/av/media/mediaserver/mediaserver.rc:
service media /system/bin/mediaserver
    class main
    user media
    group audio camera inet net_bt net_bt_admin net_bw_acct drmrpc mediadrm
    ioprio rt 4
    writepid /dev/cpuset/foreground/tasks /dev/stune/foreground/tasks
(3) Once mediaserver starts, it registers a number of media-related services with ServiceManager, among them MediaPlayerService and ResourceManagerService. You can think of mediaserver as the server for everything media-related; it provides these services to apps.
Here is where MediaPlayerService sits in the Android framework:
The core file of mediaserver is frameworks/av/media/mediaserver/main_mediaserver.cpp:
int main(int argc __unused, char **argv __unused)
{
    signal(SIGPIPE, SIG_IGN);

    sp<ProcessState> proc(ProcessState::self());
    sp<IServiceManager> sm(defaultServiceManager());
    ALOGI("ServiceManager: %p", sm.get());
    AIcu_initializeIcuOrDie();
    MediaPlayerService::instantiate();
    ResourceManagerService::instantiate();
    registerExtensions();
    ProcessState::self()->startThreadPool();
    IPCThreadState::self()->joinThreadPool();
}
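The last two calls are what keep mediaserver alive: startThreadPool() spawns binder worker threads, and joinThreadPool() parks the main thread in the same request-handling loop, so main() never returns. The sketch below is a toy Java model of that pattern, with all names invented for illustration; a "quit" poison-pill message is added so the demo can terminate, which the real loop never does.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Rough analogue of startThreadPool() + joinThreadPool(): extra worker
// threads drain incoming transactions, and the main thread then joins the
// same loop instead of returning from main().
public class BinderLoopDemo {
    static final BlockingQueue<String> sTransactions = new LinkedBlockingQueue<>();

    static void joinThreadPool() throws InterruptedException {
        for (;;) {
            String t = sTransactions.take(); // block until a transaction arrives
            if ("quit".equals(t)) return;    // poison pill, demo only
            System.out.println(Thread.currentThread().getName() + " handled " + t);
        }
    }

    public static void main(String[] args) throws InterruptedException {
        // startThreadPool(): spawn one extra worker thread
        Thread worker = new Thread(() -> {
            try { joinThreadPool(); } catch (InterruptedException ignored) {}
        }, "Binder:worker");
        worker.start();

        sTransactions.put("setDataSource");
        sTransactions.put("quit"); // stop the worker
        worker.join();

        sTransactions.put("prepare");
        sTransactions.put("quit"); // stop ourselves
        joinThreadPool();          // the main thread joins the pool, like mediaserver
    }
}
```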
MediaPlayerService::instantiate() is the initialization entry point of MediaPlayerService; it lives in frameworks/av/media/libmediaplayerservice/MediaPlayerService.cpp:
void MediaPlayerService::instantiate() {
    defaultServiceManager()->addService(
            String16("media.player"), new MediaPlayerService());
}
This registers a named Binder, media.player, with ServiceManager.
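Conceptually, addService() publishes the service under a well-known name so that any client can later retrieve it by that same name. The toy Java sketch below models this name-to-service registry; all class names are stand-ins, not the real Binder classes.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Toy stand-ins for the Binder service registry: ServiceManager keeps a
// name -> IBinder table; mediaserver publishes "media.player" into it, and
// clients later look the service up by the same name.
interface IBinder {}

class MediaPlayerService implements IBinder {}

class ServiceManager {
    private static final Map<String, IBinder> sServices = new ConcurrentHashMap<>();

    static void addService(String name, IBinder service) {
        sServices.put(name, service);
    }

    static IBinder getService(String name) {
        return sServices.get(name);
    }
}

public class RegistryDemo {
    public static void main(String[] args) {
        // What MediaPlayerService::instantiate() does, in spirit:
        ServiceManager.addService("media.player", new MediaPlayerService());
        System.out.println(ServiceManager.getService("media.player") != null); // true
    }
}
```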
3. How a MediaPlayer Is Created
In the app, we executed: MediaPlayer mediaPlayer = new MediaPlayer();
Let's trace that call, starting in frameworks/base/media/java/android/media/MediaPlayer.java:
public MediaPlayer() {
    super(new AudioAttributes.Builder().build(),
            AudioPlaybackConfiguration.PLAYER_TYPE_JAM_MEDIAPLAYER);

    Looper looper;
    if ((looper = Looper.myLooper()) != null) {
        mEventHandler = new EventHandler(this, looper);
    } else if ((looper = Looper.getMainLooper()) != null) {
        mEventHandler = new EventHandler(this, looper);
    } else {
        mEventHandler = null;
    }

    mTimeProvider = new TimeProvider(this);
    mOpenSubtitleSources = new Vector<InputStream>();

    /* Native setup requires a weak reference to our object.
     * It's easier to create it here than in C++.
     */
    native_setup(new WeakReference<MediaPlayer>(this));

    baseRegisterPlayer();
}
The constructor passes the Java-side MediaPlayer to the JNI layer through a weak reference. Before the constructor ever runs, a static initializer in the MediaPlayer class loads the media_jni.so library and performs the JNI-side initialization:
static {
    System.loadLibrary("media_jni");
    native_init();
}

private static native final void native_init();
This calls the native method native_init(), implemented in frameworks/base/media/jni/android_media_MediaPlayer.cpp. You cannot find native_init() there by name alone, though: the file maps Java method names to C++ functions through an array of JNINativeMethod structs:
static const JNINativeMethod gMethods[] = {
    {
        "nativeSetDataSource",
        "(Landroid/os/IBinder;Ljava/lang/String;[Ljava/lang/String;"
        "[Ljava/lang/String;)V",
        (void *)android_media_MediaPlayer_setDataSourceAndHeaders
    },

    {"_setDataSource",   "(Ljava/io/FileDescriptor;JJ)V",      (void *)android_media_MediaPlayer_setDataSourceFD},
    {"_setDataSource",   "(Landroid/media/MediaDataSource;)V", (void *)android_media_MediaPlayer_setDataSourceCallback},
    {"_setVideoSurface", "(Landroid/view/Surface;)V",          (void *)android_media_MediaPlayer_setVideoSurface},
    ...
    {"native_init",      "()V",                                (void *)android_media_MediaPlayer_native_init},
    {"native_setup",     "(Ljava/lang/Object;)V",              (void *)android_media_MediaPlayer_native_setup},
    ...
};
This array maps nearly every native method of the MediaPlayer class; whenever you meet a similar method later, you can look it up here. Now let's continue with native_init():
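The table is handed to JNI's RegisterNatives, which binds each (Java method name, signature) pair to a C++ function pointer. The toy Java model below mimics that dispatch with a plain map; all names here are invented for illustration.

```java
import java.util.HashMap;
import java.util.Map;

// A toy model of the gMethods[] table: RegisterNatives binds a
// (method name, signature) key to a native function pointer. Here a
// map from "name + signature" to a Runnable plays the same role.
public class NativeMethodTable {
    private static final Map<String, Runnable> sMethods = new HashMap<>();

    static void register(String name, String signature, Runnable impl) {
        sMethods.put(name + signature, impl);
    }

    static void invoke(String name, String signature) {
        Runnable impl = sMethods.get(name + signature);
        if (impl == null) throw new UnsatisfiedLinkError(name + signature);
        impl.run();
    }

    public static void main(String[] args) {
        // Spirit of {"native_init", "()V", (void *)android_media_MediaPlayer_native_init}
        register("native_init", "()V", () -> System.out.println("native_init called"));
        invoke("native_init", "()V"); // dispatches through the table, as JNI does
    }
}
```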
static void
android_media_MediaPlayer_native_init(JNIEnv *env)
{
    jclass clazz;

    clazz = env->FindClass("android/media/MediaPlayer");
    if (clazz == NULL) {
        return;
    }

    fields.context = env->GetFieldID(clazz, "mNativeContext", "J");
    if (fields.context == NULL) {
        return;
    }

    fields.post_event = env->GetStaticMethodID(clazz, "postEventFromNative",
                                               "(Ljava/lang/Object;IIILjava/lang/Object;)V");
    if (fields.post_event == NULL) {
        return;
    }

    fields.surface_texture = env->GetFieldID(clazz, "mNativeSurfaceTexture", "J");
    if (fields.surface_texture == NULL) {
        return;
    }

    env->DeleteLocalRef(clazz);

    clazz = env->FindClass("android/net/ProxyInfo");
    if (clazz == NULL) {
        return;
    }

    fields.proxyConfigGetHost =
            env->GetMethodID(clazz, "getHost", "()Ljava/lang/String;");

    fields.proxyConfigGetPort =
            env->GetMethodID(clazz, "getPort", "()I");

    fields.proxyConfigGetExclusionList =
            env->GetMethodID(clazz, "getExclusionListAsString", "()Ljava/lang/String;");

    env->DeleteLocalRef(clazz);

    ......
}
That completes native_init(); it caches the IDs of some Java-side fields and methods for later use. Next comes the native_setup() call from the MediaPlayer constructor, implemented in the same file:
static void
android_media_MediaPlayer_native_setup(JNIEnv *env, jobject thiz, jobject weak_this)
{
    ALOGV("native_setup");
    sp<MediaPlayer> mp = new MediaPlayer();
    if (mp == NULL) {
        jniThrowException(env, "java/lang/RuntimeException", "Out of memory");
        return;
    }

    // create new listener and give it to MediaPlayer
    sp<JNIMediaPlayerListener> listener = new JNIMediaPlayerListener(env, thiz, weak_this);
    mp->setListener(listener);

    // Stow our new C++ MediaPlayer in an opaque field in the Java object.
    setMediaPlayer(env, thiz, mp);
}
This creates a C++-layer MediaPlayer and installs a listener for callbacks back into Java. The pattern is much like Android's Looper mechanism, where a Looper exists at the Java layer and a counterpart exists at the C++ layer.
So creating a MediaPlayer from Java ultimately creates a corresponding MediaPlayer object at the C++ layer. With that, construction of the MediaPlayer is complete.
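Note that native_setup() receives a WeakReference rather than the MediaPlayer itself: the native side must be able to call back into Java without keeping the Java object alive forever. A minimal sketch of that handshake, with Player and NativeListener as hypothetical stand-ins for the real classes:

```java
import java.lang.ref.WeakReference;

// Sketch of the weak-reference handshake in native_setup(): the native layer
// keeps only a WeakReference to the Java player, so the Java object can
// still be garbage-collected; callbacks first check whether it is alive.
class Player {
    void onNativeEvent(String what) { System.out.println("event: " + what); }
}

class NativeListener {
    private final WeakReference<Player> mOwner;

    NativeListener(Player owner) { mOwner = new WeakReference<>(owner); }

    void notifyFromNative(String what) {
        Player p = mOwner.get();
        if (p != null) {          // owner may already have been collected
            p.onNativeEvent(what);
        }
    }
}

public class WeakRefDemo {
    public static void main(String[] args) {
        Player player = new Player();
        NativeListener listener = new NativeListener(player);
        listener.notifyFromNative("prepared"); // prints "event: prepared"
    }
}
```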
4. How setDataSource Works
In the app: mediaPlayer.setDataSource("abc.mp3");
Using the mapping array in frameworks/base/media/jni/android_media_MediaPlayer.cpp again (taking local file playback as our example), the corresponding function is:
static void
android_media_MediaPlayer_setDataSourceFD(JNIEnv *env, jobject thiz, jobject fileDescriptor, jlong offset, jlong length)
{
    sp<MediaPlayer> mp = getMediaPlayer(env, thiz);
    if (mp == NULL ) {
        jniThrowException(env, "java/lang/IllegalStateException", NULL);
        return;
    }

    if (fileDescriptor == NULL) {
        jniThrowException(env, "java/lang/IllegalArgumentException", NULL);
        return;
    }

    int fd = jniGetFDFromFileDescriptor(env, fileDescriptor);
    ALOGV("setDataSourceFD: fd %d", fd);
    process_media_player_call( env, thiz, mp->setDataSource(fd, offset, length), "java/io/IOException", "setDataSourceFD failed." );
}
It first fetches the C++ MediaPlayer created earlier, then uses process_media_player_call() to invoke MediaPlayer's setDataSource() and check the returned status. (We won't expand process_media_player_call() here; read it yourself if interested. It mainly performs error and exception checks and then notifies the corresponding error state.)
What MediaPlayer's setDataSource() actually does is the subject of the next article. For now, it is enough to know that the Java-layer setDataSource() ultimately calls the C++ MediaPlayer's setDataSource().
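The idea behind process_media_player_call() can be sketched as a status-to-exception translator: native calls return an integer status, and non-OK codes are converted into Java exceptions. The mapping below is illustrative only; the constant names and values are assumptions, not the real media_jni table.

```java
// Hedged sketch of the status-check pattern: translate a native status_t
// into a Java exception. Values here are illustrative, not the real codes.
public class StatusCheck {
    static final int OK = 0;
    static final int INVALID_OPERATION = -38; // assumed value, for illustration

    static void throwIfError(int status, String message) {
        if (status == OK) return;
        if (status == INVALID_OPERATION) throw new IllegalStateException(message);
        throw new RuntimeException(message + " (status " + status + ")");
    }

    public static void main(String[] args) {
        throwIfError(OK, "setDataSourceFD failed."); // no-op on success
        try {
            throwIfError(INVALID_OPERATION, "setDataSourceFD failed.");
        } catch (IllegalStateException e) {
            System.out.println("caught: " + e.getMessage());
        }
    }
}
```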
5. How setDisplay Works
In the app: mediaPlayer.setDisplay(surfaceHolder);
In frameworks/base/media/java/android/media/MediaPlayer.java:
public void setDisplay(SurfaceHolder sh) {
    mSurfaceHolder = sh;
    Surface surface;
    if (sh != null) {
        surface = sh.getSurface();
    } else {
        surface = null;
    }
    _setVideoSurface(surface);
    updateSurfaceScreenOn();
}

private native void _setVideoSurface(Surface surface);
This ends up in the JNI function android_media_MediaPlayer_setVideoSurface(), again in frameworks/base/media/jni/android_media_MediaPlayer.cpp:
static void
android_media_MediaPlayer_setVideoSurface(JNIEnv *env, jobject thiz, jobject jsurface)
{
    setVideoSurface(env, thiz, jsurface, true /* mediaPlayerMustBeAlive */);
}

static void
setVideoSurface(JNIEnv *env, jobject thiz, jobject jsurface, jboolean mediaPlayerMustBeAlive)
{
    sp<MediaPlayer> mp = getMediaPlayer(env, thiz);
    if (mp == NULL) {
        if (mediaPlayerMustBeAlive) {
            jniThrowException(env, "java/lang/IllegalStateException", NULL);
        }
        return;
    }

    decVideoSurfaceRef(env, thiz);

    sp<IGraphicBufferProducer> new_st;
    if (jsurface) { // fetch the Java-layer surface
        sp<Surface> surface(android_view_Surface_getSurface(env, jsurface));
        if (surface != NULL) {
            new_st = surface->getIGraphicBufferProducer(); // obtain the IGraphicBufferProducer
            if (new_st == NULL) {
                jniThrowException(env, "java/lang/IllegalArgumentException",
                        "The surface does not have a binding SurfaceTexture!");
                return;
            }
            new_st->incStrong((void*)decVideoSurfaceRef);
        } else {
            jniThrowException(env, "java/lang/IllegalArgumentException",
                    "The surface has been released");
            return;
        }
    }

    env->SetLongField(thiz, fields.surface_texture, (jlong)new_st.get());

    // This will fail if the media player has not been initialized yet. This
    // can be the case if setDisplay() on MediaPlayer.java has been called
    // before setDataSource(). The redundant call to setVideoSurfaceTexture()
    // in prepare/prepareAsync covers for this case.
    mp->setVideoSurfaceTexture(new_st);
}
This code saves the surface used for video display: it drops the strong reference on the old IGraphicBufferProducer, obtains the new one from the Surface, and finally passes it down through the C++ MediaPlayer's setVideoSurfaceTexture().
IGraphicBufferProducer belongs to SurfaceFlinger's domain. SurfaceFlinger plays a central role in getting a UI onto the display, and its job is exactly what "Flinger" suggests: compositing the final drawing results of all applications and rendering them to the physical screen. BufferQueue participates in this drawing process as each application's one-on-one tutor, guiding the UI program through details such as "requesting a canvas" and "the drawing procedure". BufferQueue is also the server side of IGraphicBufferProducer; the app talks to BufferQueue through the IGraphicBufferProducer interface.
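The producer/consumer handshake described above can be sketched with two queues: the app (producer side) dequeues a free buffer, draws into it, and queues it; the consumer acquires it, composites, and releases it back. Everything below is a toy model, not the real BufferQueue API.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Toy model of the BufferQueue handshake: the app, via the producer
// interface, dequeues a free buffer, draws into it and queues it back;
// SurfaceFlinger, on the consumer side, acquires and releases it.
public class BufferQueueDemo {
    static class Buffer { String contents; }

    static final int COUNT = 2;
    static final BlockingQueue<Buffer> sFree = new ArrayBlockingQueue<>(COUNT);
    static final BlockingQueue<Buffer> sQueued = new ArrayBlockingQueue<>(COUNT);

    public static void main(String[] args) throws InterruptedException {
        for (int i = 0; i < COUNT; i++) sFree.put(new Buffer());

        // Producer side: dequeueBuffer -> draw -> queueBuffer
        Buffer b = sFree.take();
        b.contents = "frame 1";
        sQueued.put(b);

        // Consumer side: acquireBuffer -> composite -> releaseBuffer
        Buffer shown = sQueued.take();
        System.out.println("composited: " + shown.contents);
        sFree.put(shown); // buffer returns to the free list
    }
}
```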
6. How prepare Works
In the app: mediaPlayer.prepare();
In frameworks/base/media/java/android/media/MediaPlayer.java:
public void prepare() throws IOException, IllegalStateException {
    _prepare();
    scanInternalSubtitleTracks();

    // DrmInfo, if any, has been resolved by now.
    synchronized (mDrmLock) {
        mDrmInfoResolved = true;
    }
}

private native void _prepare() throws IOException, IllegalStateException;
This ends up in the JNI function android_media_MediaPlayer_prepare(), again in frameworks/base/media/jni/android_media_MediaPlayer.cpp:
static void
android_media_MediaPlayer_prepare(JNIEnv *env, jobject thiz)
{
    sp<MediaPlayer> mp = getMediaPlayer(env, thiz);
    if (mp == NULL ) {
        jniThrowException(env, "java/lang/IllegalStateException", NULL);
        return;
    }

    // Handle the case where the display surface was set before the mp was
    // initialized. We try again to make it stick.
    sp<IGraphicBufferProducer> st = getVideoSurfaceTexture(env, thiz);
    mp->setVideoSurfaceTexture(st);

    process_media_player_call( env, thiz, mp->prepare(), "java/io/IOException", "Prepare failed." );
}
This finally calls the C++ MediaPlayer's prepare().
7. How start Works
In the app: mediaPlayer.start();
In frameworks/base/media/java/android/media/MediaPlayer.java:
public void start() throws IllegalStateException {
    //FIXME use lambda to pass startImpl to superclass
    final int delay = getStartDelayMs();
    if (delay == 0) {
        startImpl();
    } else {
        new Thread() {
            public void run() {
                try {
                    Thread.sleep(delay);
                } catch (InterruptedException e) {
                    e.printStackTrace();
                }
                baseSetStartDelayMs(0);
                try {
                    startImpl();
                } catch (IllegalStateException e) {
                    // fail silently for a state exception when it is happening after
                    // a delayed start, as the player state could have changed between the
                    // call to start() and the execution of startImpl()
                }
            }
        }.start();
    }
}

private void startImpl() {
    baseStart();
    stayAwake(true);
    _start();
}

private native void _start() throws IllegalStateException;
This ends up in the JNI function android_media_MediaPlayer_start(), again in frameworks/base/media/jni/android_media_MediaPlayer.cpp:
static void
android_media_MediaPlayer_start(JNIEnv *env, jobject thiz)
{
    ALOGV("start");
    sp<MediaPlayer> mp = getMediaPlayer(env, thiz);
    if (mp == NULL ) {
        jniThrowException(env, "java/lang/IllegalStateException", NULL);
        return;
    }
    process_media_player_call( env, thiz, mp->start(), NULL, NULL );
}
This finally calls the C++ MediaPlayer's start().
8. Summary
This article only covered the construction path from the app layer down to the C++ MediaPlayer, to give you a general picture of the playback flow.