FFMPEG Study: Recording Video from the /dev/video0 Device on Android with H.264 Encoding, Saved as MP4

Blogger QQ: 1356438802

QQ group: 473383394 (UVC&OpenCV47)





This article should have been published long ago. Last year I spent a long time studying video recording with FFmpeg, verifying it first on Ubuntu and then on Android. In 《FFMPEG研究: ubuntu下录制/dev/video0/设备视频保存为mp4格式》 the Ubuntu recording had already taken shape, so I handed the FFmpeg Android library over to a colleague who is better at Java for debugging. He ported and integrated my Ubuntu code into a previewing-and-recording APP: UVC_FFMPEG.


Download: http://download.csdn.net/detail/luoyouren/9594950

Source code structure


record_camera.cpp in the JNI layer is essentially the same as record_camera.c on Ubuntu; ImageProc.cpp provides the call interface for the Java layer; the rest is the FFmpeg libraries. Pay special attention to how Android.mk is written; a sketch follows.
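For reference, here is a minimal Android.mk sketch for this kind of setup, assuming prebuilt FFmpeg .so files under jni/ffmpeg; the paths and module names are assumptions, not the actual file from the download:

# Hypothetical sketch: wrap the prebuilt FFmpeg libraries, then build
# the JNI module against them. Paths and names are assumptions.
LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)
LOCAL_MODULE := libavformat
LOCAL_SRC_FILES := ffmpeg/lib/libavformat.so
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := libavcodec
LOCAL_SRC_FILES := ffmpeg/lib/libavcodec.so
include $(PREBUILT_SHARED_LIBRARY)

# Repeat the block above for libavutil, libswscale, etc. as needed.

include $(CLEAR_VARS)
LOCAL_MODULE    := ImageProc
LOCAL_SRC_FILES := ImageProc.cpp record_camera.cpp
LOCAL_C_INCLUDES := $(LOCAL_PATH)/ffmpeg/include
LOCAL_SHARED_LIBRARIES := libavformat libavcodec
LOCAL_LDLIBS := -llog
include $(BUILD_SHARED_LIBRARY)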


The key MainActivity.java:

package cc.along.uvcsimple;

import java.nio.ByteBuffer;

import cc.along.ui.UVCPreViewView;
import android.support.v7.app.ActionBarActivity;
import android.support.v4.app.Fragment;
import android.os.Bundle;
import android.os.Handler;
import android.os.Message;
import android.util.Log;
import android.view.LayoutInflater;
import android.view.View;
import android.view.View.OnClickListener;
import android.view.ViewGroup;
import android.widget.Button;
import android.widget.Toast;

public class MainActivity extends ActionBarActivity {

	private static final String TAG = "UVCSimple_main";
	protected static final int MSG_START_RECORDING = 0;
	protected static final int MSG_STOP_RECORDING = 1;
	protected static final int MSG_SHOW_INFO = 2;

	@Override
	protected void onCreate(Bundle savedInstanceState) {
		super.onCreate(savedInstanceState);
		setContentView(R.layout.activity_main);

		if (savedInstanceState == null) {
			getSupportFragmentManager().beginTransaction()
					.add(R.id.container, new PlaceholderFragment()).commit();
		}
	}

	/**
	 * A placeholder fragment containing a simple view.
	 */
	public static class PlaceholderFragment extends Fragment implements
			OnClickListener {

		// Set to true only after the camera has been successfully prepared.
		private boolean isCameraExist = false;
		// volatile: written by the UI thread, read by the recording thread.
		private volatile boolean isRecording = false;
		private Button mStartBt;
		private Button mStopBt;
		private int mImageBufSize;
		private ByteBuffer mPreViewBuffer;
		private UVCPreViewView mPreViewSurface;

		public PlaceholderFragment() {
		}

		@Override
		public View onCreateView(LayoutInflater inflater, ViewGroup container,
				Bundle savedInstanceState) {
			int ret;

			View rootView = inflater.inflate(R.layout.fragment_main, container,
					false);

			mPreViewSurface = (UVCPreViewView) rootView.findViewById(R.id.preview_id);
			mStartBt = (Button) rootView.findViewById(R.id.bt_start);
			mStopBt = (Button) rootView.findViewById(R.id.bt_stop);
			mStartBt.setOnClickListener(this);
			mStopBt.setOnClickListener(this);

			if (UVCJni.DEBUG) {
				UVCUtil.listDevFile(getActivity());
			}
			ret = UVCJni.prepareCamera(0);
			if (ret > 0) {
				// Camera opened: share a direct buffer with the native
				// layer so preview frames can be read without extra copies.
				mImageBufSize = UVCJni.getBufferSize();
				mPreViewBuffer = ByteBuffer.allocateDirect(mImageBufSize);
				UVCJni.setDirectBuffer(mPreViewBuffer, mImageBufSize);
				isCameraExist = true;
			} else {
				Log.e(TAG, "prepareCamera failed");
				showMsgInfo("prepareCamera failed");
			}

			return rootView;
		}

		@Override
		public void onClick(View v) {
			switch (v.getId()) {
			case R.id.bt_start:

				sendMsg(MSG_START_RECORDING, "");
				break;
			case R.id.bt_stop:
				Log.i(TAG, "Record stop entry");
				sendMsg(MSG_STOP_RECORDING, "");
				Log.i(TAG, "Record stop exit");
				break;
			}

		}

		private void showMsgInfo(String info) {
			sendMsg(MSG_SHOW_INFO, info);
		}

		private void sendMsg(int msgID, Object obj) {
			Message msg = handler.obtainMessage();
			msg.what = msgID;
			msg.obj = obj;
			handler.sendMessage(msg);
		}

		Handler handler = new Handler() {

			@Override
			public void handleMessage(Message msg) {
				super.handleMessage(msg);

				switch (msg.what) {
				case MSG_START_RECORDING:
					

					if (isCameraExist && !isRecording) {
						isRecording = true;
						new Thread() {
							int bufferLength;

							public void run() {

								Log.i(TAG, "Record start entry");
								showMsgInfo("Record start entry");

								// Grab-encode-preview loop: each pass pulls one frame
								// through the native layer and refreshes the preview.
								while (isRecording) {
									UVCJni.startRecording();
									bufferLength = UVCJni.getRealImageSize();
									Log.i(TAG, "getRealImageSize " + bufferLength);
									mPreViewSurface.updatePreview(mPreViewBuffer.array(), bufferLength);
								}

								UVCJni.stopRecording();
								isRecording = false;
								Log.i(TAG, "Record start exit");
								showMsgInfo("Record start exit");
							}
						}.start();
					} else {
						showMsgInfo("Camera not exist");
					}

					break;

				case MSG_STOP_RECORDING:
					isRecording = false;
					break;

				case MSG_SHOW_INFO:
					String info = (String) msg.obj;
					Toast.makeText(getActivity(), info, Toast.LENGTH_SHORT).show();
					break;

				default:
					break;
				}
			}
		};
	}

}


The Java layer is straightforward: prepareCamera() first, then the recording loop inside a Thread, and finally stopRecording(). The UVCJni entry points it calls look roughly like the sketch below.
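For reference, a sketch of the native wrapper reconstructed from the call sites above; the return types and library name are assumptions, since the real declarations live in the project's JNI sources:

import java.nio.ByteBuffer;

public class UVCJni {
	public static final boolean DEBUG = true;

	static {
		// Library name is an assumption; it must match the module
		// name built by Android.mk.
		System.loadLibrary("ImageProc");
	}

	// Open and configure /dev/video<id>; a negative return means failure.
	public static native int prepareCamera(int videoId);
	// Byte size of one captured frame buffer.
	public static native int getBufferSize();
	// Hand the native layer a direct buffer it fills with preview frames.
	public static native int setDirectBuffer(ByteBuffer buffer, int size);
	// Capture and encode one frame; called repeatedly while recording.
	public static native int startRecording();
	// Actual byte count of the frame just written into the buffer.
	public static native int getRealImageSize();
	// Flush the encoder and finalize the MP4 file.
	public static native int stopRecording();
}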


But when the APP runs, the preview becomes extremely laggy once recording starts, roughly three seconds behind, and the recorded video is abnormal as well.


Thinking it through: H.264 encoding is resource-hungry and slow, and my frame capture, preview, and recording all run in one thread. So capture slows down and the preview lags, and at the same time the raw frames fed to the H.264 encoder are no longer consecutive, which is why the recorded video comes out broken!


In short, for good results the work should be split across threads: one for preview, one for recording, roughly as sketched below. Still, if H.264 software encoding is simply too slow on the Android device, even that is wasted effort!
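A minimal sketch of that split, assuming hypothetical grabFrame/encodeFrame native calls and a placeholder frame size; the point is the bounded queue decoupling capture from encoding:

import java.util.Arrays;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.TimeUnit;

public class RecordPipeline {
	private volatile boolean isRecording = true;
	// Bounded queue between the two threads: if the encoder falls
	// behind, offer() drops frames instead of stalling the preview.
	private final BlockingQueue<byte[]> frameQueue =
			new ArrayBlockingQueue<byte[]>(8);

	// Thread 1: capture + preview; never blocks on encoding.
	private final Thread captureThread = new Thread() {
		public void run() {
			byte[] frame = new byte[640 * 480 * 2]; // placeholder size
			while (isRecording) {
				int len = grabFrame(frame);         // hypothetical native call
				updatePreview(frame, len);
				// Copy before handing off so the capture buffer can be
				// reused immediately for the next frame.
				frameQueue.offer(Arrays.copyOf(frame, len));
			}
		}
	};

	// Thread 2: H.264 encoding, consuming frames at its own pace.
	private final Thread encodeThread = new Thread() {
		public void run() {
			try {
				while (isRecording || !frameQueue.isEmpty()) {
					byte[] frame = frameQueue.poll(100, TimeUnit.MILLISECONDS);
					if (frame != null) {
						encodeFrame(frame, frame.length); // hypothetical
					}
				}
				finishFile(); // hypothetical: write trailer, close MP4
			} catch (InterruptedException ignored) {
			}
		}
	};

	public void start() {
		captureThread.start();
		encodeThread.start();
	}

	public void stop() {
		isRecording = false;
	}

	// Placeholders standing in for the real JNI entry points.
	private native int grabFrame(byte[] out);
	private native void encodeFrame(byte[] frame, int len);
	private native void finishFile();

	private void updatePreview(byte[] frame, int len) {
		// Draw to the preview SurfaceView (omitted).
	}
}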


PS: Under multithreading, when protecting a shared resource whose processing is expensive, that processing should not sit inside the critical section. The best approach is to copy the resource inside the critical section and process the copy after leaving it, as with encoding or decoding work. A classic case of trading space for time!
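A sketch of that pattern, with a hypothetical latestFrame buffer shared between the capture thread and the encoder; only the cheap copy happens under the lock:

import java.util.Arrays;

public class FrameHolder {
	private final Object frameLock = new Object();
	private byte[] latestFrame = new byte[0]; // written by the capture thread
	private int latestLength;

	// Called by the capture thread when a new frame arrives.
	public void publish(byte[] frame, int len) {
		synchronized (frameLock) {
			latestFrame = Arrays.copyOf(frame, len);
			latestLength = len;
		}
	}

	// Called by the encoder thread.
	public void encodeLatest() {
		byte[] localCopy;
		int localLength;

		// Critical section: only the cheap copy happens under the lock.
		synchronized (frameLock) {
			localCopy = Arrays.copyOf(latestFrame, latestLength);
			localLength = latestLength;
		}

		// The expensive encode runs outside the lock, so the capture
		// thread is never blocked for its duration.
		encodeFrame(localCopy, localLength); // hypothetical encoder call
	}

	private void encodeFrame(byte[] frame, int len) {
		// H.264 encode (omitted).
	}
}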





