
A Read-Through of the Android MediaPlayer Source Code


Introduction:

Android exposes a complete set of APIs through the MediaPlayer class so that clients can play audio and video. But everything is worth tracing back to its roots, so today we will look at how Android's MediaPlayer framework (based on Android 8.0) is actually implemented. The framework layer is written entirely in C/C++, so some background there helps, but even without it you should be able to follow along.

Overview:

First, here is the MediaPlayer state diagram (grabbed from the web); the methods it shows are essentially all of MediaPlayer's commonly used APIs.

(MediaPlayer state diagram)

Decoding, demuxing and rendering audio and video require a lot of computation, so Google implements these operations in native C/C++ code. But how does Java call into C++? That is where JNI (Java Native Interface) comes in. JNI allows Java code to interoperate with code written in other languages. It was originally designed for natively compiled languages, C and C++ in particular, but nothing stops you from using other languages as long as the calling convention is supported. Note that mixing Java with natively compiled code usually costs you platform portability.

From this we can draw the layered architecture of the MediaPlayer framework:

(MediaPlayer framework layered-architecture diagram)

Code analysis:

Any Java program that plays audio or video is bound to use MediaPlayer, and the typical call sequence looks roughly like this:

mediaPlayer = MediaPlayer.create(MainActivity.this, R.raw.test);
mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
mediaPlayer.setDisplay(surfaceHolder);
mediaPlayer.setLooping(true);
mediaPlayer.start();

Looks familiar, right? Create a mediaPlayer with create(), set the audio stream type, set the display surface, then call start(). It is usually the same few fixed steps, but how does it all work underneath? Let's analyze it, starting with create() and following it down.

create()

Naturally, the first place to look is MediaPlayer.java.

Path: xref: /frameworks/base/media/java/android/media/MediaPlayer.java

// Delegates downward
public static MediaPlayer create(Context context, int resid) {
    int s = AudioSystem.newAudioSessionId();
    return create(context, resid, null, s > 0 ? s : 0);
}

// Still creates a MediaPlayer itself first, then calls setDataSource()
public static MediaPlayer create(Context context, int resid,
        AudioAttributes audioAttributes, int audioSessionId) {
    try {
        AssetFileDescriptor afd = context.getResources().openRawResourceFd(resid);
        if (afd == null) return null;

        MediaPlayer mp = new MediaPlayer();

        final AudioAttributes aa = audioAttributes != null ? audioAttributes :
            new AudioAttributes.Builder().build();
        mp.setAudioAttributes(aa);
        mp.setAudioSessionId(audioSessionId);

        mp.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
        afd.close();
        mp.prepare();
        return mp;
    } catch (IOException ex) {
        Log.d(TAG, "create failed:", ex);
        // fall through
    } catch (IllegalArgumentException ex) {
        Log.d(TAG, "create failed:", ex);
        // fall through
    } catch (SecurityException ex) {
        Log.d(TAG, "create failed:", ex);
        // fall through
    }
    return null;
}

public void setDataSource(FileDescriptor fd, long offset, long length)
        throws IOException, IllegalArgumentException, IllegalStateException {
    _setDataSource(fd, offset, length);
}

private native void _setDataSource(FileDescriptor fd, long offset, long length)
        throws IOException, IllegalArgumentException, IllegalStateException;

I have strung these related functions together; they call one another layer by layer. Java offers many overloads to the caller, but underneath they all funnel into one or two key functions. I remember, when first learning MediaPlayer, reading online that you could either new a MediaPlayer yourself or skip the new and call create() directly, which puzzled me; now it all makes sense. create() news a MediaPlayer internally to do the work and then calls its setDataSource() to set the playback source. setDataSource() is the trailblazer of our framework study and arguably its most important function. Note the native keyword on the last declaration: it marks a function implemented in the native layer, so you will not find its definition in this directory. That is JNI.

Before looking at the native code, notice the static block that sits above MediaPlayer.java's constructor:

static {
    System.loadLibrary("media_jni");
    native_init();
}

private static native final void native_init();

Hm? It calls the native-layer native_init() ahead of time. And that is not all: the constructor of MediaPlayer.java also contains a native call:

native_setup(new WeakReference<MediaPlayer>(this));

private native final void native_setup(Object mediaplayer_this);

These two native functions run as soon as a MediaPlayer is newed. Don't let them slip by; they matter, and we will look at them shortly.

JNI:

JNI stands for Java Native Interface: the bridge through which Java and C/C++ talk to each other, letting multiple languages coexist in one project. So when do you need JNI?

1. To use platform-dependent features that the Java language itself does not support;

2. To integrate existing systems written in non-Java languages, such as C and C++;

3. To save run time by dropping down to lower-level languages.

For details on using JNI, see: How to use JNI

MediaPlayer.java calls down into JNI, and by JNI's naming convention we should look for android_media_MediaPlayer.cpp. We saw above that the Java layer calls the native functions native_init(), native_setup() and _setDataSource(), yet when we eagerly search android_media_MediaPlayer.cpp we find no definitions with those names. What is going on?

No panic: a careful reader will notice that the JNI file contains a block of code that remaps the function names!

xref: /frameworks/base/media/jni/android_media_MediaPlayer.cpp

static const JNINativeMethod gMethods[] = {
    {
        "nativeSetDataSource",
        "(Landroid/os/IBinder;Ljava/lang/String;[Ljava/lang/String;"
        "[Ljava/lang/String;)V",
        (void *)android_media_MediaPlayer_setDataSourceAndHeaders
    },
    {"_setDataSource",       "(Ljava/io/FileDescriptor;JJ)V",          (void *)android_media_MediaPlayer_setDataSourceFD},
    {"_setDataSource",       "(Landroid/media/MediaDataSource;)V",     (void *)android_media_MediaPlayer_setDataSourceCallback},
    {"_setVideoSurface",     "(Landroid/view/Surface;)V",              (void *)android_media_MediaPlayer_setVideoSurface},
    {"getDefaultBufferingParams", "()Landroid/media/BufferingParams;", (void *)android_media_MediaPlayer_getDefaultBufferingParams},
    {"getBufferingParams",   "()Landroid/media/BufferingParams;",      (void *)android_media_MediaPlayer_getBufferingParams},
    {"setBufferingParams",   "(Landroid/media/BufferingParams;)V",     (void *)android_media_MediaPlayer_setBufferingParams},
    {"_prepare",             "()V",                                    (void *)android_media_MediaPlayer_prepare},
    {"prepareAsync",         "()V",                                    (void *)android_media_MediaPlayer_prepareAsync},
    {"_start",               "()V",                                    (void *)android_media_MediaPlayer_start},
    {"_stop",                "()V",                                    (void *)android_media_MediaPlayer_stop},
    {"native_init",          "()V",                                    (void *)android_media_MediaPlayer_native_init},
    {"native_setup",         "(Ljava/lang/Object;)V",                  (void *)android_media_MediaPlayer_native_setup},
};
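For reference, a table like gMethods only takes effect once it is registered with the VM. Below is a minimal, hypothetical sketch of dynamic registration through RegisterNatives; the function bodies and the helper name are placeholders, not the actual AOSP implementation (which routes this through AndroidRuntime's registration helper when libmedia_jni is loaded):

// Hypothetical sketch of dynamic JNI registration; names and bodies are
// placeholders, not the actual AOSP code.
#include <jni.h>

static void native_init(JNIEnv* env, jclass clazz)  { /* cache field/method IDs... */ }
static void native_setup(JNIEnv* env, jobject thiz, jobject weak_this) { /* ... */ }

static const JNINativeMethod kMethods[] = {
    // Java method name   JNI signature             native function pointer
    { "native_init",      "()V",                    (void*)native_init  },
    { "native_setup",     "(Ljava/lang/Object;)V",  (void*)native_setup },
};

// Typically called from JNI_OnLoad(): binds the Java "native" methods of
// android.media.MediaPlayer to the C/C++ functions listed above.
static int registerMediaPlayerMethods(JNIEnv* env) {
    jclass clazz = env->FindClass("android/media/MediaPlayer");
    if (clazz == nullptr) {
        return JNI_ERR;
    }
    if (env->RegisterNatives(clazz, kMethods,
                             sizeof(kMethods) / sizeof(kMethods[0])) != JNI_OK) {
        return JNI_ERR;
    }
    return JNI_OK;
}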

So the upper-layer names are uniformly remapped here (if the JNI syntax is unfamiliar, look it up; we don't need to dig into it). Mystery solved. android_media_MediaPlayer_native_init() only sets up some JNI-level bookkeeping, so check it yourself if you are curious; let's look at android_media_MediaPlayer_native_setup():

// xref: /frameworks/base/media/jni/android_media_MediaPlayer.cpp
static void
android_media_MediaPlayer_native_setup(JNIEnv *env, jobject thiz, jobject weak_this)
{
    ALOGV("native_setup");
    sp<MediaPlayer> mp = new MediaPlayer();
    if (mp == NULL) {
        jniThrowException(env, "java/lang/RuntimeException", "Out of memory");
        return;
    }

    // create new listener and give it to MediaPlayer
    sp<JNIMediaPlayerListener> listener = new JNIMediaPlayerListener(env, thiz, weak_this);
    mp->setListener(listener);

    // Stow our new C++ MediaPlayer in an opaque field in the Java object.
    setMediaPlayer(env, thiz, mp);
}

It first creates a MediaPlayer, but this one is the C++ version, which will talk to MediaPlayerService. It then calls setMediaPlayer() to tie this mp to the upper Java layer (remember this spot, it comes up again later).
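For context, setMediaPlayer() stores the native player in a long field of the Java MediaPlayer object (the field ID is cached during native_init()), so later JNI entry points such as getMediaPlayer() can recover it from thiz. Roughly, and slightly simplified from android_media_MediaPlayer.cpp:

// Roughly what setMediaPlayer() does (simplified): take a strong reference on the
// new native MediaPlayer, drop the reference on the old one, and stash the raw
// pointer into a jlong field of the Java object.
static sp<MediaPlayer> setMediaPlayer(JNIEnv* env, jobject thiz, const sp<MediaPlayer>& player)
{
    Mutex::Autolock l(sLock);
    sp<MediaPlayer> old = (MediaPlayer*)env->GetLongField(thiz, fields.context);
    if (player.get()) {
        player->incStrong((void*)setMediaPlayer);   // keep the new player alive
    }
    if (old != 0) {
        old->decStrong((void*)setMediaPlayer);      // release the previous player
    }
    env->SetLongField(thiz, fields.context, (jlong)player.get());
    return old;
}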

Good, now let's keep going and find android_media_MediaPlayer_setDataSourceFD(); here is the code:

static void
android_media_MediaPlayer_setDataSourceFD(JNIEnv *env, jobject thiz, jobject fileDescriptor, jlong offset, jlong length)
{
    sp<MediaPlayer> mp = getMediaPlayer(env, thiz);
    if (mp == NULL ) {
        jniThrowException(env, "java/lang/IllegalStateException", NULL);
        return;
    }

    if (fileDescriptor == NULL) {
        jniThrowException(env, "java/lang/IllegalArgumentException", NULL);
        return;
    }

    int fd = jniGetFDFromFileDescriptor(env, fileDescriptor);
    ALOGV("setDataSourceFD: fd %d", fd);
    process_media_player_call( env, thiz, mp->setDataSource(fd, offset, length),
            "java/io/IOException", "setDataSourceFD failed." );
}

A few things are worth explaining here. JNIEnv is the JNI environment pointer used to call back into the Java side, and jobject is the Java object; the remaining three parameters speak for themselves. And what is this sp? Relax: sp is "strong pointer", and there is also wp, "weak pointer". Both are C++ template classes that wrap a raw pointer, so we can basically read them as pointers; AOSP uses them everywhere. An sp keeps the object alive and can be dereferenced directly, while a wp does not own the object and must be promoted to an sp before it can be used.
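Purely as an illustration (this is not AOSP code), here is a tiny sketch of how sp and wp behave with a RefBase-derived class:

// Illustrative sketch only: sp<> holds a strong reference and can be dereferenced
// directly; wp<> does not keep the object alive and must be promoted first.
#include <utils/RefBase.h>

using android::sp;
using android::wp;
using android::RefBase;

struct Foo : public RefBase {
    void hello() {}
};

void example() {
    sp<Foo> strong = new Foo();         // strong reference: Foo stays alive
    wp<Foo> weak = strong;              // weak reference: does not extend lifetime

    strong->hello();                    // an sp is used like a raw pointer

    sp<Foo> promoted = weak.promote();  // a wp must be promoted to an sp before use
    if (promoted != nullptr) {
        promoted->hello();              // the object was still alive
    }

    strong.clear();                     // drop the strong reference; Foo may be destroyed
    // From here on, weak.promote() would return nullptr.
}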

Back to the function: it calls getMediaPlayer() to fetch the C++ MediaPlayer we newed earlier, then calls that MediaPlayer's setDataSource(), and finally process_media_player_call() reports the result back up to Java. At this point we officially enter the MediaPlayer framework layer; here is the class diagram:

(MediaPlayer framework class diagram)

The MediaPlayer framework uses a client/server architecture: the two sides live in different processes and communicate through the Binder IPC mechanism. Most native-layer classes follow the IXXX, BpXXX, BnXXX pattern. In the MediaPlayer framework the three pillars IMediaPlayer, IMediaPlayerService and IMediaPlayerClient form the skeleton, glued together by IBinder, BBinder (BnBinder would arguably be a more accurate name) and BpBinder. The IXXX classes contain only pure virtual functions with no implementation; BpXXX and BnXXX inherit from them, with BpXXX acting as the client-side proxy that issues requests, while the actual service implementation lives behind BnXXX.
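To make the pattern concrete, here is a minimal illustrative sketch built around a hypothetical IMyService (not a real AOSP interface). It shows where the pure virtuals, the client-side proxy and the server-side stub each live; the concrete service then derives from BnMyService and implements doWork():

// Illustrative sketch of the I/Bp/Bn pattern with a hypothetical IMyService.
// IXXX declares the business interface, BpXXX is the client-side proxy that packs
// calls into transact(), BnXXX unpacks them in onTransact() on the server side.
#include <binder/IInterface.h>
#include <binder/Parcel.h>

using namespace android;

class IMyService : public IInterface {
public:
    DECLARE_META_INTERFACE(MyService);
    enum { DO_WORK = IBinder::FIRST_CALL_TRANSACTION };
    virtual status_t doWork(int32_t arg) = 0;   // pure virtual: no implementation here
};

class BpMyService : public BpInterface<IMyService> {            // client-side proxy
public:
    explicit BpMyService(const sp<IBinder>& impl) : BpInterface<IMyService>(impl) {}

    virtual status_t doWork(int32_t arg) {
        Parcel data, reply;
        data.writeInterfaceToken(IMyService::getInterfaceDescriptor());
        data.writeInt32(arg);
        remote()->transact(DO_WORK, data, &reply);               // ship the call across Binder
        return reply.readInt32();
    }
};

IMPLEMENT_META_INTERFACE(MyService, "android.example.IMyService");

class BnMyService : public BnInterface<IMyService> {             // server-side stub
public:
    virtual status_t onTransact(uint32_t code, const Parcel& data,
                                Parcel* reply, uint32_t flags = 0) {
        switch (code) {
            case DO_WORK: {
                CHECK_INTERFACE(IMyService, data, reply);
                reply->writeInt32(doWork(data.readInt32()));     // call the real implementation
                return NO_ERROR;
            }
        }
        return BBinder::onTransact(code, data, reply, flags);
    }
};
// The real service (e.g. class MyService : public BnMyService) implements doWork().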

MediaPlayer.cpp: setDataSource():

Continuing from above, let's look at MediaPlayer's setDataSource(); code below:

// xref: /frameworks/av/media/libmedia/mediaplayer.cpp
status_t MediaPlayer::setDataSource(int fd, int64_t offset, int64_t length)
{
    ALOGV("setDataSource(%d, %" PRId64 ", %" PRId64 ")", fd, offset, length);
    status_t err = UNKNOWN_ERROR;
    const sp<IMediaPlayerService> service(getMediaPlayerService());
    if (service != 0) {
        sp<IMediaPlayer> player(service->create(this, mAudioSessionId));
        if ((NO_ERROR != doSetRetransmitEndpoint(player)) ||
            (NO_ERROR != player->setDataSource(fd, offset, length))) {
            player.clear();
        }
        err = attachNewPlayer(player);
    }
    return err;
}

getMediaPlayerService(): one glance tells us this is requesting a service. mediaplayer.cpp does not implement it, so we go to its base class IMediaDeathNotifier, and sure enough, there it is!

// xref: /frameworks/av/media/libmedia/IMediaDeathNotifier.cpp
/*static*/ const sp<IMediaPlayerService>
IMediaDeathNotifier::getMediaPlayerService()
{
    ALOGV("getMediaPlayerService");
    Mutex::Autolock _l(sServiceLock);
    if (sMediaPlayerService == 0) {
        sp<IServiceManager> sm = defaultServiceManager();
        sp<IBinder> binder;
        do {
            binder = sm->getService(String16("media.player"));
            if (binder != 0) {
                break;
            }
            ALOGW("Media player service not published, waiting...");
            usleep(500000); // 0.5 s
        } while (true);

        if (sDeathNotifier == NULL) {
            sDeathNotifier = new DeathNotifier();
        }
        binder->linkToDeath(sDeathNotifier);
        sMediaPlayerService = interface_cast<IMediaPlayerService>(binder);
    }
    ALOGE_IF(sMediaPlayerService == 0, "no media player service!?");
    return sMediaPlayerService;
}

This is the client requesting the service: defaultServiceManager() returns the IServiceManager, whose getService() checks whether "media.player" has been registered; if so it returns the corresponding IBinder for the client to communicate through. interface_cast then converts that IBinder into a pointer to the server-side IMediaPlayerService. But what is this interface_cast? A plain type cast? No, no, that would be missing the forest for the trees; let's look at its definition:

// xref: frameworks/native/include/binder/IInterface.h
template<typename INTERFACE>
inline sp<INTERFACE> interface_cast(const sp<IBinder>& obj)
{
    return INTERFACE::asInterface(obj);
}

Well, it simply returns INTERFACE::asInterface(obj), i.e. IMediaPlayerService::asInterface(). Keep chasing it (I won't paste the code) and you will find that IMediaPlayerService contains no definition of this function. What is going on? Look in the parent class, and a comparison reveals the trick:

/frameworks/native/include/binder/IInterface.h

// ----------------------------------------------------------------------

#define DECLARE_META_INTERFACE(INTERFACE)                               \
    static const ::android::String16 descriptor;                        \
    static ::android::sp<I##INTERFACE> asInterface(                     \
            const ::android::sp<::android::IBinder>& obj);              \
    virtual const ::android::String16& getInterfaceDescriptor() const;  \
    I##INTERFACE();                                                     \
    virtual ~I##INTERFACE();                                            \

#define IMPLEMENT_META_INTERFACE(INTERFACE, NAME)                       \
    const ::android::String16 I##INTERFACE::descriptor(NAME);           \
    const ::android::String16&                                          \
            I##INTERFACE::getInterfaceDescriptor() const {              \
        return I##INTERFACE::descriptor;                                \
    }                                                                   \
    ::android::sp<I##INTERFACE> I##INTERFACE::asInterface(              \
            const ::android::sp<::android::IBinder>& obj)               \
    {                                                                   \
        ::android::sp<I##INTERFACE> intr;                               \
        if (obj != NULL) {                                              \
            intr = static_cast<I##INTERFACE*>(                          \
                obj->queryLocalInterface(                               \
                        I##INTERFACE::descriptor).get());               \
            if (intr == NULL) {                                         \
                intr = new Bp##INTERFACE(obj);                          \
            }                                                           \
        }                                                               \
        return intr;                                                    \
    }                                                                   \
    I##INTERFACE::I##INTERFACE() { }                                    \
    I##INTERFACE::~I##INTERFACE() { }                                   \

#define CHECK_INTERFACE(interface, data, reply)                         \
    if (!(data).checkInterface(this)) { return PERMISSION_DENIED; }     \

IInterface contains this odd-looking block, and a closer look shows it is a pair of macros, one declaration and one implementation, and IMediaPlayerService happens to invoke both of them! So the picture is clear: substituting IMediaPlayerService into the macros yields the actual IBinder-to-IMediaPlayerService conversion, roughly as sketched below.
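Expanding IMPLEMENT_META_INTERFACE by hand for IMediaPlayerService gives roughly the following (a mechanical expansion of the macro above; the descriptor string comes from IMediaPlayerService.cpp, so treat this as a sketch rather than the literal source):

// Hand-expanded sketch of IMPLEMENT_META_INTERFACE(MediaPlayerService, ...).
const ::android::String16 IMediaPlayerService::descriptor("android.media.IMediaPlayerService");

const ::android::String16& IMediaPlayerService::getInterfaceDescriptor() const {
    return IMediaPlayerService::descriptor;
}

::android::sp<IMediaPlayerService> IMediaPlayerService::asInterface(
        const ::android::sp<::android::IBinder>& obj)
{
    ::android::sp<IMediaPlayerService> intr;
    if (obj != NULL) {
        // Same process: queryLocalInterface() returns the BnMediaPlayerService itself,
        // so no proxy is needed.
        intr = static_cast<IMediaPlayerService*>(
                obj->queryLocalInterface(IMediaPlayerService::descriptor).get());
        if (intr == NULL) {
            // Different process: wrap the BpBinder in a client-side proxy.
            intr = new BpMediaPlayerService(obj);
        }
    }
    return intr;
}

IMediaPlayerService::IMediaPlayerService() { }
IMediaPlayerService::~IMediaPlayerService() { }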

Anyway, we have digressed. Through defaultServiceManager()/getService() we obtained the service registered under the name "media.player", and interface_cast turned it into an IMediaPlayerService pointer. Let's keep reading:

sp<IMediaPlayer> player(service->create(this, mAudioSessionId));

So it calls IMediaPlayerService's create() (on the Bp proxy side); let's take a look:

// xref: /frameworks/av/media/libmedia/IMediaPlayerService.cpp
virtual sp<IMediaPlayer> create(
        const sp<IMediaPlayerClient>& client, audio_session_t audioSessionId)
{
    Parcel data, reply;
    data.writeInterfaceToken(IMediaPlayerService::getInterfaceDescriptor());
    data.writeStrongBinder(IInterface::asBinder(client));
    data.writeInt32(audioSessionId);

    remote()->transact(CREATE, data, &reply);
    return interface_cast<IMediaPlayer>(reply.readStrongBinder());
}

asBinder() converts the client directly into a Binder interface without going through ServiceManager, which means this is an anonymous Binder, usable only between these two processes. Take a look:

// static
sp<IBinder> IInterface::asBinder(const sp<IInterface>& iface)
{
    if (iface == NULL) return NULL;
    return iface->onAsBinder();
}

template<typename INTERFACE>
inline IBinder* BpInterface<INTERFACE>::onAsBinder()
{
    return remote();
}

remote() returns the remote BpBinder, and remote()->transact() deserves a proper explanation:

1. BpBinder, BBinder and IBinder are the abstractions of Android's Binder mechanism; note that BpBinder is not part of the IXXX/BpXXX/BnXXX interface hierarchy discussed above.

2. remote() is provided by BpRefBase (the base class of BpInterface) and returns exactly that BpBinder.

3. BpBinder's transact() implementation simply calls IPCThreadState::self()->transact() to send the data.

4. When the server side receives the client's request through IPCThreadState, it first calls BBinder's transact().

5. BBinder's transact() in turn calls the virtual method onTransact(), which is implemented in BnXXXService (a simplified sketch of steps 3-5 follows).
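As a reference, a simplified version of steps 3-5 looks roughly like this (trimmed and lightly paraphrased from BpBinder.cpp and Binder.cpp; error handling omitted):

// Client side: BpBinder simply forwards to IPCThreadState, which talks to the
// binder driver (simplified).
status_t BpBinder::transact(uint32_t code, const Parcel& data, Parcel* reply, uint32_t flags)
{
    if (mAlive) {
        return IPCThreadState::self()->transact(mHandle, code, data, reply, flags);
    }
    return DEAD_OBJECT;
}

// Server side: the binder thread delivers the request to BBinder::transact(),
// which dispatches to the virtual onTransact() implemented by BnXXX (simplified).
status_t BBinder::transact(uint32_t code, const Parcel& data, Parcel* reply, uint32_t flags)
{
    status_t err = NO_ERROR;
    switch (code) {
        case PING_TRANSACTION:
            reply->writeInt32(pingBinder());
            break;
        default:
            err = onTransact(code, data, reply, flags);   // e.g. BnMediaPlayerService::onTransact()
            break;
    }
    return err;
}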

So we go straight to BnMediaPlayerService and look for the CREATE branch of onTransact():

xref: /frameworks/av/media/libmedia/IMediaPlayerService.cpp

status_t BnMediaPlayerService::onTransact(
    uint32_t code, const Parcel& data, Parcel* reply, uint32_t flags)
{
    switch (code) {
        case CREATE: {
            CHECK_INTERFACE(IMediaPlayerService, data, reply);
            sp<IMediaPlayerClient> client =
                interface_cast<IMediaPlayerClient>(data.readStrongBinder());
            audio_session_t audioSessionId = (audio_session_t) data.readInt32();
            sp<IMediaPlayer> player = create(client, audioSessionId);
            reply->writeStrongBinder(IInterface::asBinder(player));
            return NO_ERROR;
        } break;
        ...
    }
}

It first converts the BpBinder back into an sp<IMediaPlayerClient> and then calls create(). But BnMediaPlayerService only implements onTransact(), so we have to look for this create() in its subclass, and sure enough it is in MediaPlayerService:

xref: /frameworks/av/media/libmediaplayerservice/MediaPlayerService.cpp

sp<IMediaPlayer> MediaPlayerService::create(const sp<IMediaPlayerClient>& client,
        audio_session_t audioSessionId)
{
    pid_t pid = IPCThreadState::self()->getCallingPid();
    int32_t connId = android_atomic_inc(&mNextConnId);

    sp<Client> c = new Client(
            this, pid, connId, client, audioSessionId,
            IPCThreadState::self()->getCallingUid());

    ALOGD("Create new client(%d) from pid %d, uid %d, ", connId, pid,
         IPCThreadState::self()->getCallingUid());

    wp<Client> w = c;
    {
        Mutex::Autolock lock(mLock);
        mClients.add(w);
    }
    return c;
}

The code is straightforward: it creates an instance of its own inner class Client and returns the pointer for the remote side to call; this Client exposes most of the interfaces used by the upper Java layer. Good, back to where we started:

// xref: /frameworks/av/media/libmedia/mediaplayer.cpp
status_t MediaPlayer::setDataSource(int fd, int64_t offset, int64_t length)
{
    ALOGV("setDataSource(%d, %" PRId64 ", %" PRId64 ")", fd, offset, length);
    status_t err = UNKNOWN_ERROR;
    const sp<IMediaPlayerService> service(getMediaPlayerService());   // obtain the remote service via IPC
    if (service != 0) {
        sp<IMediaPlayer> player(service->create(this, mAudioSessionId));   // MediaPlayerService creates a Client
        if ((NO_ERROR != doSetRetransmitEndpoint(player)) ||
            (NO_ERROR != player->setDataSource(fd, offset, length))) {      // call the Client's setDataSource()
            player.clear();
        }
        err = attachNewPlayer(player);
    }
    return err;
}

All of this touches the Binder mechanism and took us quite far afield, but we have found our way back.

The rest is much simpler: it directly calls the Client's setDataSource(); let's have a look:

MediaPlayerService::Client::setDataSource():

// xref: /frameworks/av/media/libmediaplayerservice/MediaPlayerService.cpp
status_t MediaPlayerService::Client::setDataSource(int fd, int64_t offset, int64_t length)
{
    struct stat sb;
    int ret = fstat(fd, &sb);
    if (ret != 0) {
        ALOGE("fstat(%d) failed: %d, %s", fd, ret, strerror(errno));
        return UNKNOWN_ERROR;
    }

    if (offset >= sb.st_size) {
        ALOGE("offset error");
        return UNKNOWN_ERROR;
    }
    if (offset + length > sb.st_size) {
        length = sb.st_size - offset;
        ALOGV("calculated length = %lld", (long long)length);
    }

    player_type playerType = MediaPlayerFactory::getPlayerType(this, fd, offset, length);
    sp<MediaPlayerBase> p = setDataSource_pre(playerType);
    if (p == NULL) {
        return NO_INIT;
    }

    // now set data source
    mStatus = setDataSource_post(p, p->setDataSource(fd, offset, length));
    ALOGD("[%d] %s done", mConnId, __FUNCTION__);
    return mStatus;
}

It first calls MediaPlayerFactory::getPlayerType() to determine the player type; here is the implementation:

// xref: /frameworks/av/media/libmediaplayerservice/MediaPlayerFactory.cpp
#define GET_PLAYER_TYPE_IMPL(a...)                      \
    Mutex::Autolock lock_(&sLock);                      \
                                                        \
    player_type ret = STAGEFRIGHT_PLAYER;               \
    float bestScore = 0.0;                              \
                                                        \
    for (size_t i = 0; i < sFactoryMap.size(); ++i) {   \
                                                        \
        IFactory* v = sFactoryMap.valueAt(i);           \
        float thisScore;                                \
        CHECK(v != NULL);                               \
                                                        \
        thisScore = v->scoreFactory(a, bestScore);      \
        if (thisScore > bestScore) {                    \
            ret = sFactoryMap.keyAt(i);                 \
            bestScore = thisScore;                      \
        }                                               \
    }                                                   \
                                                        \
    if (0.0 == bestScore) {                             \
        ret = getDefaultPlayerType();                   \
    }                                                   \
                                                        \
    return ret;

player_type MediaPlayerFactory::getPlayerType(const sp<IMediaPlayer>& client,
                                              int fd,
                                              int64_t offset,
                                              int64_t length) {
    GET_PLAYER_TYPE_IMPL(client, fd, offset, length);
}

This is a scoring mechanism: sFactoryMap stores the factory class for each player type, scoreFactory() selects the factory with the highest score for this source, and that type is returned. For network streams and local resources it is usually NuPlayerFactory that wins; here is its scoring logic:

virtual float scoreFactory(const sp<IMediaPlayer>& /*client*/,
                           int fd,
                           int64_t offset,
                           int64_t length,
                           float /*curScore*/) {
    char buf[20];
    lseek(fd, offset, SEEK_SET);
    read(fd, buf, sizeof(buf));
    lseek(fd, offset, SEEK_SET);

    uint32_t ident = *((uint32_t*)buf);

    // Ogg vorbis?
    if (ident == 0x5367674f) // 'OggS': use NuPlayer to play OGG clips
        return 1.0;

    if (ident == 0xc450fbff && length == 40541)
        // FIXME: for CTS testGapless1 test work around
        return 1.0;

    if (!memcmp(buf+3, " ftypM4A ", 9) && (length == 46180 || length == 83892))
        // FIXME: for CTS testGapless2&3 test work around
        return 1.0;

    // FIXME: for AN7 CTS closed caption tests work around
    //        testChangeSubtitleTrack
    //        testDeselectTrackForSubtitleTracks
    //        testGetTrackInfoForVideoWithSubtitleTracks
    if (length == 210936)
        return 1.0;

    // FIXME: for AN7 CTS NativeDecoderTest#testMuxerVp8
    if (length == 2829176)
        return 1.0;

    // FIXME: for AN8 CTS android.media.cts.MediaPlayerDrmTest
    if (length == 3012095) {
        return 1.0;
    }

    // FIXME: for AN8 GTS MediaPlayerTest
    if (length == 1481047 || length == 8127587)
        return 1.0;

    return 0.0;
}

It assigns a score for each recognizable type and returns accordingly; I won't go through them one by one, because some of them I don't understand either ^_^. Once we have the player type, the next step is to create the player, which is done by calling:

sp<MediaPlayerBase> p = setDataSource_pre(playerType)

That call does the job; let's look at the definition of setDataSource_pre():

sp<MediaPlayerBase> MediaPlayerService::Client::setDataSource_pre(
        player_type playerType)
{
    ALOGV("player type = %d", playerType);

    // MStar Android Patch Begin
    if (playerType == MST_PLAYER) {
        mIsMstplayer = true;
    }
    // MStar Android Patch End

    // create the right type of player
    sp<MediaPlayerBase> p = createPlayer(playerType);
    if (p == NULL) {
        return p;
    }

    sp<IServiceManager> sm = defaultServiceManager();
    sp<IBinder> binder = sm->getService(String16("media.extractor"));
    if (binder == NULL) {
        ALOGE("extractor service not available");
        return NULL;
    }
    sp<ServiceDeathNotifier> extractorDeathListener =
            new ServiceDeathNotifier(binder, p, MEDIAEXTRACTOR_PROCESS_DEATH);
    binder->linkToDeath(extractorDeathListener);

    sp<ServiceDeathNotifier> codecDeathListener;
    if (property_get_bool("persist.media.treble_omx", true)) {
        // Treble IOmx
        sp<IOmx> omx = IOmx::getService();
        if (omx == nullptr) {
            ALOGE("Treble IOmx not available");
            return NULL;
        }
        codecDeathListener = new ServiceDeathNotifier(omx, p, MEDIACODEC_PROCESS_DEATH);
        omx->linkToDeath(codecDeathListener, 0);
    } else {
        // Legacy IOMX
        binder = sm->getService(String16("media.codec"));
        if (binder == NULL) {
            ALOGE("codec service not available");
            return NULL;
        }
        codecDeathListener = new ServiceDeathNotifier(binder, p, MEDIACODEC_PROCESS_DEATH);
        binder->linkToDeath(codecDeathListener);
    }

    Mutex::Autolock lock(mLock);
    clearDeathNotifiers_l();
    mExtractorDeathListener = extractorDeathListener;
    mCodecDeathListener = codecDeathListener;

    if (!p->hardwareOutput()) {
        mAudioOutput = new AudioOutput(mAudioSessionId, IPCThreadState::self()->getCallingUid(),
                mPid, mAudioAttributes);
        static_cast<MediaPlayerInterface*>(p.get())->setAudioSink(mAudioOutput);
    }

    return p;
}

That is a lot of code, so let's focus on the essentials: what is the returned p? sp<MediaPlayerBase> p = createPlayer(playerType). I see, let's jump straight there:

sp<MediaPlayerBase> MediaPlayerService::Client::createPlayer(player_type playerType)
{
    // determine if we have the right player type
    sp<MediaPlayerBase> p = mPlayer;
    if ((p != NULL) && (p->playerType() != playerType)) {
        ALOGV("delete player");
        p.clear();
    }
    if (p == NULL) {
        p = MediaPlayerFactory::createPlayer(playerType, this, notify, mPid);
    }

    if (p != NULL) {
        p->setUID(mUid);
    }

    return p;
}

mPlayer is the "player" used for the previous playback; the key line is p = MediaPlayerFactory::createPlayer(playerType, this, notify, mPid), which again calls into MediaPlayerFactory. Does it create a player from the player type? Let's see:

sp<MediaPlayerBase> MediaPlayerFactory::createPlayer(
        player_type playerType,
        void* cookie,
        notify_callback_f notifyFunc,
        pid_t pid) {
    sp<MediaPlayerBase> p;
    IFactory* factory;
    status_t init_result;
    Mutex::Autolock lock_(&sLock);

    factory = sFactoryMap.valueFor(playerType);
    CHECK(NULL != factory);
    p = factory->createPlayer(pid);
    //....
    return p;
}

Simple enough: it looks up the matching player factory in the factory map and has the factory create a "player". Let's look at the factory types:

typedef KeyedVector<player_type, IFactory*> tFactoryMap;

void MediaPlayerFactory::registerBuiltinFactories() {
    Mutex::Autolock lock_(&sLock);

    if (sInitComplete)
        return;

    registerFactory_l(new NuPlayerFactory(), NU_PLAYER);
    registerFactory_l(new TestPlayerFactory(), TEST_PLAYER);
    // MStar Android Patch Begin
#ifdef BUILD_WITH_MSTAR_MM
    registerFactory_l(new MstPlayerFactory(), MST_PLAYER);
#ifdef SUPPORT_IMAGEPLAYER
    registerFactory_l(new ImagePlayerFactory(), IMG_PLAYER);
#endif
#ifdef SUPPORT_MSTARPLAYER
    registerFactory_l(new MstarPlayerFactory(), MSTAR_PLAYER);
#endif
#endif
    // MStar Android Patch End

    sInitComplete = true;
}

So up to five players are registered here (some of them behind vendor build flags), but the one used most is NuPlayer. Continuing with the code, NuPlayerFactory's createPlayer() is called:

virtual sp<MediaPlayerBase> createPlayer(pid_t pid) {
    ALOGV(" create NuPlayer");
    return new NuPlayerDriver(pid);
}

Rather perfunctory: it just news a NuPlayerDriver. Where is our NuPlayer? No worries, look at NuPlayerDriver's constructor:

NuPlayerDriver::NuPlayerDriver(pid_t pid)
    : mState(STATE_IDLE),
      mLooper(new ALooper),
      mPlayer(new NuPlayer(pid)),
      mAutoLoop(false) {
}

So the player itself is newed inside the constructor. That clears things up: setDataSource_pre() returns a NuPlayerDriver. Now back to the Client's setDataSource():

// xref: /frameworks/av/media/libmediaplayerservice/MediaPlayerService.cpp
status_t MediaPlayerService::Client::setDataSource(int fd, int64_t offset, int64_t length)
{
    //...
    player_type playerType = MediaPlayerFactory::getPlayerType(this, fd, offset, length);
    sp<MediaPlayerBase> p = setDataSource_pre(playerType);
    if (p == NULL) {
        return NO_INIT;
    }

    // now set data source
    mStatus = setDataSource_post(p, p->setDataSource(fd, offset, length));
    ALOGD("[%d] %s done", mConnId, __FUNCTION__);
    return mStatus;
}

It calls NuPlayerDriver's setDataSource(), which means we are about to enter Android's built-in player:

// xref: /frameworks/av/media/libmediaplayerservice/nuplayer/NuPlayerDriver.cpp
status_t NuPlayerDriver::setDataSource(int fd, int64_t offset, int64_t length) {
    ALOGV("setDataSource(%p) file(%d)", this, fd);
    Mutex::Autolock autoLock(mLock);

    if (mState != STATE_IDLE) {
        return INVALID_OPERATION;
    }

    mState = STATE_SET_DATASOURCE_PENDING;

    mPlayer->setDataSourceAsync(fd, offset, length);

    while (mState == STATE_SET_DATASOURCE_PENDING) {
        mCondition.wait(mLock);
    }

    return mAsyncResult;
}

As we saw earlier, mPlayer is a NuPlayer, so NuPlayerDriver calls NuPlayer's setDataSourceAsync(). With that we have entered the built-in player. The other MediaPlayer calls follow much the same route, though none are as involved as setDataSource; hold on to this one thread and the call relationships of the framework fall into place.
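For completeness, the wait loop above is released by NuPlayer once it finishes handling the set-data-source message on its own ALooper thread: it calls back into the driver, which updates the state machine and signals the condition. Roughly (lightly trimmed from NuPlayerDriver.cpp):

// Called back by NuPlayer when the async setDataSource work has finished;
// wakes up the thread blocked in NuPlayerDriver::setDataSource().
void NuPlayerDriver::notifySetDataSourceCompleted(status_t err) {
    Mutex::Autolock autoLock(mLock);

    CHECK_EQ(mState, STATE_SET_DATASOURCE_PENDING);

    mAsyncResult = err;
    mState = (err == OK) ? STATE_UNPREPARED : STATE_IDLE;
    mCondition.broadcast();   // releases the while-loop in setDataSource()
}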

NuPlayer:

