Android libstreaming Flow Analysis

1. Introduction

libstreaming is an API that allows you, with only a few lines of code, to stream the camera and/or microphone of an Android-powered device using RTP over UDP.


GitHub repository


2. Quick Start

2.1 Add the permissions
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.CAMERA" />
2.2 Register the RTSP service
<service android:name="net.majorkernelpanic.streaming.rtsp.RtspServer"/>
2.3 Start the server
		// Set the RTSP server's listening port
		Editor editor = PreferenceManager.getDefaultSharedPreferences(this).edit();
		editor.putString(RtspServer.KEY_PORT, String.valueOf(8899));
		editor.commit();

		// Set the default session parameters
		SessionBuilder.getInstance()
		.setSurfaceView(mSurfaceView)
		.setPreviewOrientation(90)
		.setContext(getApplicationContext())
		.setAudioEncoder(SessionBuilder.AUDIO_NONE) // no audio track
		.setVideoEncoder(SessionBuilder.VIDEO_H264);// stream H.264 video
		
		// Starts the RTSP server
		this.startService(new Intent(this,RtspServer.class)); // start the RTSP service
2.4 Check the result

Open it as a network stream (for example rtsp://<device-ip>:8899 in a player such as VLC) and the live stream plays.
The quick start above tells you almost nothing, though; the flow analysis below is what matters.

3. Flow Analysis

3.1 What happens when RtspServer is started?
@Override
	public void onCreate() {

		// Let's restore the state of the service 
		mSharedPreferences = PreferenceManager.getDefaultSharedPreferences(this);
		mPort = Integer.parseInt(mSharedPreferences.getString(KEY_PORT, String.valueOf(mPort)));
		mEnabled = mSharedPreferences.getBoolean(KEY_ENABLED, mEnabled);

		// If the configuration is modified, the server will adjust
		mSharedPreferences.registerOnSharedPreferenceChangeListener(mOnSharedPreferenceChangeListener);

		start();
	}

At a glance, onCreate() just restores the port number and the enabled flag; the real work happens in the start() method.

3.2 The RtspServer.start() method
public void start() {
		if (!mEnabled || mRestart) stop();
		if (mEnabled && mListenerThread == null) {
			try {
				mListenerThread = new RequestListener();
			} catch (Exception e) {
				mListenerThread = null;
			}
		}
		mRestart = false;
	}

RequestListener is the thread that listens for client connections, and its constructor calls Thread's start() itself.

The RequestListener() constructor:
public RequestListener() throws IOException {
			try {
				mServer = new ServerSocket(mPort);
				start();
			} catch (BindException e) {
				Log.e(TAG,"Port already in use !");
				postError(e, ERROR_BIND_FAILED);
				throw e;
			}
		}
The RequestListener.run() method:
public void run() {
			Log.i(TAG,"RTSP server listening on port "+mServer.getLocalPort());
			while (!Thread.interrupted()) {
				try {
					new WorkerThread(mServer.accept()).start();
				} catch (SocketException e) {
					break;
				} catch (IOException e) {
					Log.e(TAG,e.getMessage());
					continue;
				}
			}
			Log.i(TAG,"RTSP server stopped !");
		}

run() creates a WorkerThread for every client that connects and starts that worker thread.

The WorkerThread() constructor:
public WorkerThread(final Socket client) throws IOException {
			mInput = new BufferedReader(new InputStreamReader(client.getInputStream()));
			mOutput = client.getOutputStream();
			mClient = client;
			mSession = new Session();
		}

It wraps the client socket's input/output streams and creates a Session.

The important part is the WorkerThread.run() method.

It is too long to paste in full, so only the key points are shown.



Key point 1
request = Request.parseRequest(mInput);

This parses the information carried by the standard RTSP protocol. RTSP involves several request/response round trips; for details see: RTSP interaction flow.
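As a rough sketch of those round trips (client C, server S; the address and the exact header set are illustrative, and a client may also send TEARDOWN at the end):

```
C -> S: OPTIONS rtsp://192.168.1.10:8899 RTSP/1.0
S -> C: RTSP/1.0 200 OK          (Public: DESCRIBE, SETUP, TEARDOWN, PLAY, PAUSE)
C -> S: DESCRIBE rtsp://192.168.1.10:8899 RTSP/1.0
S -> C: RTSP/1.0 200 OK          (body: SDP session description)
C -> S: SETUP rtsp://192.168.1.10:8899/trackID=1 RTSP/1.0   (Transport: client RTP/RTCP ports)
S -> C: RTSP/1.0 200 OK          (Transport: server ports; Session id)
C -> S: PLAY rtsp://192.168.1.10:8899 RTSP/1.0
S -> C: RTSP/1.0 200 OK          (RTP packets then start flowing over UDP)
```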

public static Request parseRequest(BufferedReader input) throws IOException, IllegalStateException, SocketException {
			Request request = new Request();
			String line;
			Matcher matcher;

			// Parsing request method & uri
			if ((line = input.readLine())==null) throw new SocketException("Client disconnected");
			matcher = regexMethod.matcher(line);
			matcher.find();
			request.method = matcher.group(1); // key field: the request method (OPTIONS, DESCRIBE, SETUP, PLAY, ...)
			request.uri = matcher.group(2);    // key field: the requested URI

			// Parsing headers of the request
			while ( (line = input.readLine()) != null && line.length()>3 ) {
				matcher = rexegHeader.matcher(line);
				matcher.find();
				request.headers.put(matcher.group(1).toLowerCase(Locale.US),matcher.group(2));
			}
			if (line==null) throw new SocketException("Client disconnected");

			// It's not an error, it's just easier to follow what's happening in logcat with the request in red
			Log.e(TAG,request.method+" "+request.uri);

			return request;
		}

Nothing special here: it just extracts the client's command from the request.
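To make the regex handling concrete, here is a self-contained sketch modeled on regexMethod/rexegHeader (the exact patterns in libstreaming may differ slightly, and the sample request line is made up):

```java
import java.util.Locale;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Standalone sketch of the parsing done by Request.parseRequest().
public class RtspParseDemo {

    static final Pattern METHOD = Pattern.compile("(\\w+) (\\S+) RTSP", Pattern.CASE_INSENSITIVE);
    static final Pattern HEADER = Pattern.compile("(\\S+):(.+)", Pattern.CASE_INSENSITIVE);

    // Returns {method, uri} from an RTSP request line, or null if it does not match.
    static String[] parseRequestLine(String line) {
        Matcher m = METHOD.matcher(line);
        if (!m.find()) return null;
        return new String[] { m.group(1), m.group(2) };
    }

    // Returns {lower-cased name, trimmed value} from a header line, or null.
    static String[] parseHeader(String line) {
        Matcher m = HEADER.matcher(line);
        if (!m.find()) return null;
        return new String[] { m.group(1).toLowerCase(Locale.US), m.group(2).trim() };
    }

    public static void main(String[] args) {
        String[] req = parseRequestLine("DESCRIBE rtsp://192.168.1.10:8899 RTSP/1.0");
        System.out.println(req[0] + " " + req[1]); // DESCRIBE rtsp://192.168.1.10:8899
        String[] hdr = parseHeader("CSeq: 2");
        System.out.println(hdr[0] + "=" + hdr[1]); // cseq=2
    }
}
```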

Key point 2
response = processRequest(request);

This part is more involved; it helps a lot to understand the RTSP protocol flow before reading the source.
From connection to playback, RTSP needs several handshakes before anything plays; processRequest() builds the response for each of them.

(Figure: RTSP session establishment)

Let's look at the DESCRIBE and SETUP handshakes.

DESCRIBE
DESCRIBE key point 1 -> mSession = handleRequest(request.uri, mClient);
// Parse the requested URI and configure the session
          mSession = handleRequest(request.uri, mClient);
          mSessions.put(mSession, null);
          mSession.syncConfigure();

          String requestContent = mSession.getSessionDescription();
          String requestAttributes =
                  "Content-Base: " + mClient.getLocalAddress().getHostAddress() + ":" + mClient.getLocalPort() + "/\r\n" +
                          "Content-Type: application/sdp\r\n";

          response.attributes = requestAttributes;
          response.content = requestContent;

          // If no exception has been thrown, we reply with OK
          response.status = Response.STATUS_OK;

mSession = handleRequest(request.uri, mClient);
Step inside and you will find that it instantiates a new Session and, for that session, the audio and video streams — those streams are the core of the library.

protected Session handleRequest(String uri, Socket client) throws IllegalStateException, IOException {
		Session session = UriParser.parse(uri);
		session.setOrigin(client.getLocalAddress().getHostAddress());
		if (session.getDestination()==null) {
			session.setDestination(client.getInetAddress().getHostAddress());
		}
		return session;
	}

Session session = UriParser.parse(uri);

        public static Session parse(String uri) throws IllegalStateException, IOException {
			SessionBuilder builder = SessionBuilder.getInstance().clone();
			byte audioApi = 0, videoApi = 0;

			String query = URI.create(uri).getQuery();
			String[] queryParams = query == null ? new String[0] : query.split("&");
			ContentValues params = new ContentValues();
			for (String param : queryParams) {
				... // parse the connection parameters
			}
			... // configure the Session from the parameters
			Session session = builder.build();
			...
			return session;
		}
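A minimal standalone sketch of that query-string handling (the parameter names camera/flash mirror libstreaming's URI options, but the value formats and the helper name are illustrative):

```java
import java.net.URI;
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of the query parsing inside UriParser.parse().
public class UriParamsDemo {

    static Map<String, String> parseParams(String uri) {
        Map<String, String> params = new LinkedHashMap<>();
        String query = URI.create(uri).getQuery();
        if (query == null) return params; // no "?key=value" part in the URI
        for (String param : query.split("&")) {
            String[] kv = param.split("=", 2);
            params.put(kv[0].toLowerCase(), kv.length > 1 ? kv[1] : "");
        }
        return params;
    }

    public static void main(String[] args) {
        Map<String, String> p = parseParams("rtsp://192.168.1.10:8899?camera=front&flash=on");
        System.out.println(p); // {camera=front, flash=on}
    }
}
```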

Session session = builder.build();

public Session build() {
		Session session;

		session = new Session();
		session.setOrigin(mOrigin);
		session.setDestination(mDestination);
		session.setTimeToLive(mTimeToLive);
		session.setCallback(mCallback);

		switch (mAudioEncoder) {
		case AUDIO_AAC:
			AACStream stream = new AACStream();
			session.addAudioTrack(stream);
			if (mContext!=null) 
				stream.setPreferences(PreferenceManager.getDefaultSharedPreferences(mContext));
			break;
		case AUDIO_AMRNB:
			session.addAudioTrack(new AMRNBStream());
			break;
		}

		switch (mVideoEncoder) {
		case VIDEO_H263:
		case VIDEO_H264:
			H264Stream stream = new H264Stream(mContext);
			if (mContext!=null) 
				stream.setPreferences(PreferenceManager.getDefaultSharedPreferences(mContext));
			if(pushServer != null){
				stream.setPushServer(pushServer);
			}
			session.addVideoTrack(stream);
			break;
		}
		...
		return session;
	}

This is where AudioStream and VideoStream are finally instantiated, which matters for everything that follows.
Note: I modified the source for my project's needs (for example the setPushServer() call above).

DESCRIBE key point 2 -> mSession.syncConfigure();
for (int id=0;id<2;id++) {
			Stream stream = id==0 ? mAudioStream : mVideoStream;
			if (stream!=null && !stream.isStreaming()) {
				try {
					stream.configure();
				} ...
			}
		}

This simply calls configure() on the audio stream and the video stream in turn; I will follow the video path.

Rather than pasting more code, here is just the call chain:

mSession.syncConfigure() -> VideoStream.configure() -> super.configure() // note: super is MediaStream
-> MediaStream.configure()

That more or less completes the DESCRIBE handling.

MediaStream.configure()
if (mStreaming) throw new IllegalStateException("Can't be called while streaming.");
		if (mPacketizer != null) {
			mPacketizer.setDestination(mDestination, mRtpPort, mRtcpPort);
			mPacketizer.getRtpSocket().setOutputStream(mOutputStream, mChannelIdentifier);
		}

mPacketizer is initialized in the constructor of H264Stream, which was instantiated during the Session setup.

This is also where the camera preview is normally started; I removed that part for my project's needs.
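For context on what the packetizer produces: every packet it sends begins with the fixed 12-byte RTP header defined by RFC 3550. A minimal sketch (the payload type 96, SSRC, and other field values are hypothetical sample values, and this is not libstreaming's actual RtpSocket code):

```java
import java.nio.ByteBuffer;

// Builds the fixed 12-byte RTP header (RFC 3550).
public class RtpHeaderDemo {

    static byte[] buildHeader(int payloadType, int seq, long timestamp, int ssrc, boolean marker) {
        ByteBuffer b = ByteBuffer.allocate(12);
        b.put((byte) 0x80);                                   // V=2, P=0, X=0, CC=0
        b.put((byte) ((marker ? 0x80 : 0x00) | (payloadType & 0x7F))); // M bit + payload type
        b.putShort((short) seq);                              // sequence number
        b.putInt((int) timestamp);                            // RTP timestamp
        b.putInt(ssrc);                                       // stream identifier
        return b.array();
    }

    public static void main(String[] args) {
        byte[] h = buildHeader(96, 1, 90000L, 0x12345678, true);
        System.out.printf("version=%d marker=%d pt=%d%n",
                (h[0] >> 6) & 0x3, (h[1] >> 7) & 0x1, h[1] & 0x7F);
        // prints: version=2 marker=1 pt=96
    }
}
```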

SETUP

The most important line in the SETUP handling is: mSession.syncStart(trackId);
This is where the encoder is started. Trace the rest yourself from here; I have already written quite a lot.
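As an illustration, a SETUP exchange looks roughly like this (the address, ports, and session id are made-up sample values):

```
SETUP rtsp://192.168.1.10:8899/trackID=1 RTSP/1.0
CSeq: 3
Transport: RTP/AVP/UDP;unicast;client_port=5006-5007

RTSP/1.0 200 OK
CSeq: 3
Transport: RTP/AVP/UDP;unicast;client_port=5006-5007;server_port=5000-5001
Session: 1185d20035702ca
```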

Handling the encoded data

Encoding is done in VideoStream.java; the configuration decides which encode method is called. With my default configuration only encodeWithMediaCodecMethod1() is used.

Pushing the data

The data is pushed in H264Packetizer.java: once encoding has started, H264Packetizer is started to packetize the NAL units and send them out.

try {
			while (!Thread.interrupted()) {

				oldtime = System.nanoTime();
				// We read a NAL units from the input stream and we send them
				send();
				// We measure how long it took to receive NAL units from the phone
				duration = System.nanoTime() - oldtime;

				stats.push(duration);
				// Computes the average duration of a NAL unit
				delay = stats.average();
				//Log.d(TAG,"duration: "+duration/1000000+" delay: "+delay/1000000);

			}
		} catch (IOException e) {
		} catch (InterruptedException e) {}
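send() reads one NAL unit at a time from the encoder's output stream and fragments it into RTP packets. As an illustration of the Annex-B framing involved (libstreaming's actual reader works incrementally on a stream; this sketch only handles 4-byte start codes, and the sample bytes are made up):

```java
import java.util.ArrayList;
import java.util.List;

// Locates NAL units in an H.264 Annex-B buffer by scanning for
// 0x00 0x00 0x00 0x01 start codes.
public class NalSplitDemo {

    static List<byte[]> split(byte[] annexB) {
        List<Integer> starts = new ArrayList<>();
        for (int i = 0; i + 3 < annexB.length; i++) {
            if (annexB[i] == 0 && annexB[i + 1] == 0
                    && annexB[i + 2] == 0 && annexB[i + 3] == 1) {
                starts.add(i + 4); // first byte after the start code
            }
        }
        List<byte[]> nals = new ArrayList<>();
        for (int n = 0; n < starts.size(); n++) {
            int from = starts.get(n);
            // A NAL unit ends where the next start code begins (or at end of buffer).
            int to = (n + 1 < starts.size()) ? starts.get(n + 1) - 4 : annexB.length;
            byte[] nal = new byte[to - from];
            System.arraycopy(annexB, from, nal, 0, nal.length);
            nals.add(nal);
        }
        return nals;
    }

    public static void main(String[] args) {
        // Two tiny fake NAL units; the low 5 bits of the first byte are the NAL type.
        byte[] stream = {0, 0, 0, 1, 0x67, 0x42, 0, 0, 0, 1, 0x68, (byte) 0xCE};
        List<byte[]> nals = split(stream);
        System.out.println(nals.size() + " NAL units, first type = " + (nals.get(0)[0] & 0x1F));
        // prints: 2 NAL units, first type = 7   (type 7 = SPS)
    }
}
```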

I will stop tracing here — forgive me, copying code is all I am good at.
