1. Android Streaming Media Types
To see exactly which streaming media types Android supports, we can consult Google's API documentation:
http://developer.android.com/guide/appendix/media-formats.html
The relevant part of the description is excerpted below:
Network Protocols
The following network protocols are supported for audio and video playback:
- RTSP (RTP, SDP)
- HTTP/HTTPS progressive streaming
- HTTP/HTTPS live streaming draft protocol:
- MPEG-2 TS media files only
- Protocol version 3 (Android 4.0 and above)
- Protocol version 2 (Android 3.x)
- Not supported before Android 3.0
Note: HTTPS is not supported before Android 3.1.
As the excerpt shows, Android supports three kinds of streaming: RTSP, HTTP/HTTPS progressive streaming, and HTTP live streaming; RTSP and plain HTTP are currently the most common.
2. Streaming Playback APIs
Playing a stream is essentially the same as playing a local video: in both cases MediaPlayer is used to set the video source, except that a local file is specified by path while a stream is specified by URI.
MediaPlayer has two key responsibilities: connecting to the decoder, and controlling playback state.
1. After audio is decoded, it travels through MediaPlayerService and AudioFlinger down to AudioHardware in the HAL layer, and from there to the audio driver, which produces the sound.
2. Video needs a SurfaceHolder for display, which is handed to MediaPlayer through its setDisplay interface.
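The MediaPlayer route described above can be sketched as follows. This is a minimal, hypothetical example (the stream URL is a placeholder and the SurfaceHolder would come from a SurfaceView in a real activity), not framework code:

```java
import android.media.MediaPlayer;
import android.view.SurfaceHolder;

// Minimal sketch: play a network stream directly with MediaPlayer.
// The RTSP URL is a placeholder, not a real stream.
public class StreamPlayback {
    public static MediaPlayer play(SurfaceHolder holder) throws Exception {
        MediaPlayer player = new MediaPlayer();
        player.setDataSource("rtsp://example.com/live.sdp"); // a URI instead of a local path
        player.setDisplay(holder);                           // video output goes to the SurfaceHolder
        player.setOnPreparedListener(MediaPlayer::start);    // start once buffering is ready
        player.prepareAsync();                               // asynchronous prepare for network sources
        return player;
    }
}
```

Because preparing a network source can block, prepareAsync() plus an OnPreparedListener is preferred over a synchronous prepare() for streams.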
This approach is relatively cumbersome, so Google provides its own wrapper, the VideoView class, which makes video playback development simpler and more direct.
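With VideoView, the same playback collapses to a few lines; again, the URL is a placeholder and the VideoView would come from the activity's layout:

```java
import android.net.Uri;
import android.widget.MediaController;
import android.widget.VideoView;

// Sketch: VideoView hides the MediaPlayer/SurfaceHolder wiring.
public class VideoViewPlayback {
    public static void playStream(VideoView videoView) {
        videoView.setVideoURI(Uri.parse("http://example.com/demo.mp4")); // placeholder URL
        videoView.setMediaController(new MediaController(videoView.getContext()));
        videoView.setOnPreparedListener(mp -> mp.start()); // start when ready
    }
}
```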
3. Streaming Playback Call Flow
1. setDataSource call flow
setDataSource mainly instantiates the player that matches the given source: for HTTP streaming it creates StagefrightPlayer, while RTSP and HTTP live streaming map to NuPlayer.
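That mapping can be pictured as pure dispatch logic. The sketch below is a simplification under the assumption that only the URL scheme and extension matter; the real framework code also inspects headers and file content, so treat the names and rules here as illustrative:

```java
// Simplified sketch of choosing a player for a data source URL.
// Not the actual framework code; the rules are illustrative only.
public class PlayerDispatch {
    public static String pickPlayer(String url) {
        if (url.startsWith("rtsp://")) {
            return "NuPlayer";              // RTSP streaming
        }
        if (url.endsWith(".m3u8")) {
            return "NuPlayer";              // HTTP live streaming playlist
        }
        // plain HTTP/HTTPS progressive streaming, and local files
        return "StagefrightPlayer";
    }
}
```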
Note: to keep the flow simple, the diagram above omits JNI and Binder call details.
A few points in the call flow deserve attention but do not appear in the diagram; they are briefly covered here:
1. The MediaPlayerService service
MediaPlayer.cpp obtains the MediaPlayerService like this:
// establish binder interface to MediaPlayerService
/*static*/const sp<IMediaPlayerService>&
IMediaDeathNotifier::getMediaPlayerService()
{
    ALOGV("getMediaPlayerService");
    Mutex::Autolock _l(sServiceLock);
    if (sMediaPlayerService == 0) {
        sp<IServiceManager> sm = defaultServiceManager();
        sp<IBinder> binder;
        do {
            binder = sm->getService(String16("media.player"));
            if (binder != 0) {
                break;
            }
            ALOGW("Media player service not published, waiting...");
            usleep(500000); // 0.5 s
        } while (true);

        if (sDeathNotifier == NULL) {
            sDeathNotifier = new DeathNotifier();
        }
        binder->linkToDeath(sDeathNotifier);
        sMediaPlayerService = interface_cast<IMediaPlayerService>(binder);
    }
    ALOGE_IF(sMediaPlayerService == 0, "no media player service!?");
    return sMediaPlayerService;
}
MediaPlayerService is registered under the service name media.player, so it can be obtained through the ServiceManager.
MediaPlayerService belongs to mediaserver and is instantiated when mediaserver starts up.
See main_mediaserver.cpp:
int main(int argc, char** argv)
{
    sp<ProcessState> proc(ProcessState::self());
    sp<IServiceManager> sm = defaultServiceManager();
    ALOGI("ServiceManager: %p", sm.get());
    AudioFlinger::instantiate();
    MediaPlayerService::instantiate();
    CameraService::instantiate();
    AudioPolicyService::instantiate();
    ProcessState::self()->startThreadPool();
    IPCThreadState::self()->joinThreadPool();
}
mediaserver itself is started at boot from init.rc, as follows:
service media /system/bin/mediaserver
    class main
    user media
    group audio camera inet net_bt net_bt_admin net_bw_acct drmrpc
    ioprio rt 4
2. Audio output setup
See MediaPlayerService::Client::setDataSource_pre:
sp<MediaPlayerBase> MediaPlayerService::Client::setDataSource_pre(
        player_type playerType)
{
    ALOGV("player type = %d", playerType);

    // create the right type of player
    sp<MediaPlayerBase> p = createPlayer(playerType);
    if (p == NULL) {
        return p;
    }

    if (!p->hardwareOutput()) {
        mAudioOutput = new AudioOutput(mAudioSessionId);
        static_cast<MediaPlayerInterface*>(p.get())->setAudioSink(mAudioOutput);
    }

    return p;
}
AudioOutput configures an AudioTrack with the necessary parameters, and actually instantiates the AudioTrack when the start function is called.
The AudioSink receives the data coming from the decoder and passes it on to the AudioTrack.
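The AudioSink-to-AudioTrack hand-off can be pictured at the application level with Android's public AudioTrack API. This is a sketch under illustrative assumptions (sample rate, channel layout, and the caller that supplies decoded buffers are all placeholders), not what AudioOutput literally does inside the framework:

```java
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;

// Sketch: decoded PCM buffers are streamed into an AudioTrack,
// the same role AudioOutput plays inside the framework.
public class PcmSink {
    private final AudioTrack track;

    public PcmSink(int sampleRate) {
        int minBuf = AudioTrack.getMinBufferSize(sampleRate,
                AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT);
        track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
                AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT,
                minBuf, AudioTrack.MODE_STREAM);
        track.play();
    }

    // Called once per decoded buffer, analogous to AudioOutput's write path.
    public void write(byte[] pcm) {
        track.write(pcm, 0, pcm.length);
    }
}
```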