Android Multimedia Development (3): Building havlenapetr-FFMpeg-7c27aa2 with the Android NDK

/********************************************************************************************
* author:conowen@大钟
* E-mail:conowen@hotmail.com
* http://blog.csdn.net/conowen
* Note: This is an original article, intended for learning and exchange only. Please credit the author and source when reposting.

********************************************************************************************/


1. Why start from an existing port

If you build the official, unmodified FFmpeg with the NDK, you still have to implement the JNI layer and the Java layer yourself, which is a lot of work. So when porting FFmpeg to Android, it is easier to start from an open-source project that already provides the JNI and Java layers; after all, software development has always been about standing on the shoulders of giants.


2. Porting havlenapetr/FFMpeg


havlenapetr's open-source project is one of the better-known FFmpeg ports, and many Android multimedia projects are built on top of it.


Download address: https://github.com/havlenapetr/FFMpeg

You can download the ZIP package directly: https://github.com/havlenapetr/FFMpeg/zipball/debug

Or fetch it via git: create a directory and run the following in a Linux terminal (you need to have Git and its related tools installed beforehand):

git clone https://github.com/havlenapetr/FFMpeg.git

3. Building the .so libraries with the NDK


After downloading, run the following directly in the top-level havlenapetr-FFMpeg-7c27aa2 directory:

$ndk/ndk-build

It compiles through cleanly, without reporting any errors.

For how to build with the NDK in general, see my earlier post: http://blog.csdn.net/conowen/article/details/7518870


4. Importing the Java project and playing video

Next, import the havlenapetr-FFMpeg-7c27aa2 project into Eclipse, and you can play video.


4.1. Note that this version of the havlenapetr FFmpeg project runs only on Android 2.2, because havlenapetr outputs audio and video directly from the JNI layer. Under the havlenapetr-FFMpeg-7c27aa2 directory there is a prebuilt directory containing the Android 2.2 builds of the two libraries libjniaudio.so and libjnivideo.so.


4.2. Playback failures caused by Android version differences:

The havlenapetr FFmpeg project outputs audio and video as follows:

Audio: output through Android's native AudioTrack.

Video: after FFmpeg decodes a frame, the resulting YUV data is converted to RGB and finally rendered through Android's native Surface.
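For reference, the YUV-to-RGB step is normally done with FFmpeg's libswscale. The fragment below is only a minimal sketch of that conversion, not code taken from havlenapetr's sources; it assumes the old (2011/2012-era) FFmpeg API, where the pixel-format constants are named PIX_FMT_* (newer releases prefix them with AV_). The resulting RGB565 buffer is what would then be copied into the locked Surface.

// Minimal sketch: convert a decoded YUV frame to RGB565 with libswscale.
#include <libavcodec/avcodec.h>
#include <libswscale/swscale.h>

static struct SwsContext *sws;
static AVFrame *rgbFrame;
static uint8_t *rgbBuffer;

int init_converter(AVCodecContext *codec) {
    // scale/convert from the decoder's pixel format to RGB565 at the same size
    sws = sws_getContext(codec->width, codec->height, codec->pix_fmt,
                         codec->width, codec->height, PIX_FMT_RGB565,
                         SWS_FAST_BILINEAR, NULL, NULL, NULL);
    if (!sws)
        return -1;
    rgbFrame  = avcodec_alloc_frame();
    rgbBuffer = (uint8_t *) av_malloc(
            avpicture_get_size(PIX_FMT_RGB565, codec->width, codec->height));
    avpicture_fill((AVPicture *) rgbFrame, rgbBuffer,
                   PIX_FMT_RGB565, codec->width, codec->height);
    return 0;
}

// Called for every decoded frame; rgbFrame->data[0] is what gets copied
// into the bits pointer of the locked Surface.
void yuv_to_rgb(AVCodecContext *codec, AVFrame *yuvFrame) {
    sws_scale(sws, yuvFrame->data, yuvFrame->linesize,
              0, codec->height, rgbFrame->data, rgbFrame->linesize);
}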


Tip: you can also port the open-source SDL library for audio/video output. SDL renders video through OpenGL, which makes it compatible with every Android version.


The problem is that the framework differs from one Android version to the next. To output audio and video through Android's native AudioTrack and Surface from native code, you therefore have to rebuild the two libraries libjniaudio.so and libjnivideo.so inside the source tree of the corresponding Android version.


5. Building libjniaudio.so and libjnivideo.so of the havlenapetr FFmpeg project for Android 2.3

First, understand that building the official Android source does not produce libjniaudio.so and libjnivideo.so. You have to add audiotrack.cpp, surface.cpp and their Android.mk files to the Android source tree and build them there. (Every time you build libjniaudio.so and libjnivideo.so you have to rebuild against the Android source, which takes quite a while.)


5.1. Downloading the audio and video folders

You can get audiotrack.cpp, surface.cpp and the Android.mk files from https://github.com/havlenapetr/android_frameworks_base; be sure to pick the correct branch:

froyo---->Android 2.2

gingerbread---->Android 2.3

ICS---->Android 4.0


5.2. Building the Android system source

After downloading, locate the native folder inside it and copy its audio and video folders into the frameworks/base/native directory of the Android source tree.

One thing to note:

The gingerbread branch does not contain the audio and video folders (so downloading gingerbread is of little use on its own), but the froyo versions of those folders can be used instead.

With froyo's audio and video folders the Android source builds successfully and ndk-build also succeeds, but the Android Java project then fails at runtime with the following error:

java.lang.NoSuchFieldError: no field with name='mSurface' signature='I' in class Landroid/view/Surface;

That is, when the library is loaded, the mSurface field cannot be found in the Surface class.

The fix: change mSurface to mNativeSurface in surface.cpp and rebuild. Alternatively, use the surface.cpp from the ICS branch, which does not have this problem.
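Concretely, the fix amounts to changing the field name that getNativeSurface() in surface.cpp passes to GetFieldID (the full file is listed further below, where the lookup is already made version-dependent); a short sketch of the relevant line:

// In video/jni/surface.cpp: on Gingerbread and later the Java Surface class
// stores its native pointer in mNativeSurface instead of mSurface, so look up
// that field. SDK_VERSION_FROYO (= 8) is the constant the file already defines.
jfieldID field_surface = env->GetFieldID(clazz,
        sdkVersion > SDK_VERSION_FROYO ? "mNativeSurface" : "mSurface", "I");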


Building the Android 4.0 versions of libjniaudio.so and libjnivideo.so for the havlenapetr FFmpeg project follows roughly the same steps.


/************************************************************************/

Below are the audio and video files I used (taken from havlenapetr's project).

video/jni/surface.cpp (note the directory structure)

/*
 * Copyright (C) 2012 Havlena Petr
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

#define LOG_TAG "ASurface"

#include <utils/Log.h>
#include <surfaceflinger/Surface.h>

#include <SkCanvas.h>
#include <SkBitmap.h>
#include <SkMatrix.h>
#include <SkRect.h>

#include "surface.h"

#define CHECK(val) \
    if(!val) { \
        LOGE("%s [%i]: NULL pointer exception!", __func__, __LINE__); \
        return -1; \
    }

#define SDK_VERSION_FROYO 8

using namespace android;

typedef struct ASurface {
    /* our private members here */
    Surface* surface;
    SkCanvas canvas;
} ASurface;

static Surface* getNativeSurface(JNIEnv* env, jobject jsurface, int sdkVersion) {
    /* we know jsurface is a valid local ref, so use it */
    jclass clazz = env->GetObjectClass(jsurface);
    if(clazz == NULL) {
        LOGE("Can't find surface class!");
        return NULL;
    }
    jfieldID field_surface = env->GetFieldID(clazz,
            sdkVersion > SDK_VERSION_FROYO ? "mNativeSurface" : "mSurface", "I");
    if(field_surface == NULL) {
        LOGE("Can't find native surface field!");
        return NULL;
    }
    return (Surface *) env->GetIntField(jsurface, field_surface);
}

int ASurface_init(JNIEnv* env, jobject jsurface, int sdkVersion, ASurface** aSurface) {
    if(!env || jsurface == NULL) {
        LOGE("JNIEnv or jsurface obj is NULL!");
        return -1;
    }
    Surface* surface = getNativeSurface(env, jsurface, sdkVersion);
    if(!surface) {
        LOGE("Can't obtain native surface!");
        return -1;
    }
    *aSurface = (ASurface *) malloc(sizeof(ASurface));
    (*aSurface)->surface = surface;
    return 0;
}

void ASurface_deinit(ASurface** aSurface) {
    free(*aSurface);
    *aSurface = NULL;
}

int ASurface_lock(ASurface* aSurface, AndroidSurfaceInfo* info) {
    static Surface::SurfaceInfo surfaceInfo;

    CHECK(aSurface);
    CHECK(aSurface->surface);

    Surface* surface = aSurface->surface;
    if (!surface->isValid()) {
        LOGE("Native surface isn't valid!");
        return -1;
    }

    int res = surface->lock(&surfaceInfo);
    if(res < 0) {
        LOGE("Can't lock native surface!");
        return res;
    }

    info->w = surfaceInfo.w;
    info->h = surfaceInfo.h;
    info->s = surfaceInfo.s;
    info->usage = surfaceInfo.usage;
    info->format = surfaceInfo.format;
    info->bits = surfaceInfo.bits;
    return 0;
}

static SkBitmap::Config convertPixelFormat(APixelFormat format) {
    switch(format) {
        case ANDROID_PIXEL_FORMAT_RGBX_8888:
        case ANDROID_PIXEL_FORMAT_RGBA_8888:
            return SkBitmap::kARGB_8888_Config;
        case ANDROID_PIXEL_FORMAT_RGB_565:
            return SkBitmap::kRGB_565_Config;
    }
    return SkBitmap::kNo_Config;
}

static void initBitmap(SkBitmap& bitmap, AndroidSurfaceInfo* info) {
    bitmap.setConfig(convertPixelFormat(info->format), info->w, info->h);
    if (info->format == ANDROID_PIXEL_FORMAT_RGBX_8888) {
        bitmap.setIsOpaque(true);
    }
    if (info->w > 0 && info->h > 0) {
        bitmap.setPixels(info->bits);
    } else {
        // be safe with an empty bitmap.
        bitmap.setPixels(NULL);
    }
}

void ASurface_scaleToFullScreen(ASurface* aSurface, AndroidSurfaceInfo* src, AndroidSurfaceInfo* dst) {
    SkBitmap srcBitmap;
    SkBitmap dstBitmap;
    SkMatrix matrix;

    initBitmap(srcBitmap, src);
    initBitmap(dstBitmap, dst);

    matrix.setRectToRect(SkRect::MakeWH(srcBitmap.width(), srcBitmap.height()),
                         SkRect::MakeWH(dstBitmap.width(), dstBitmap.height()),
                         SkMatrix::kFill_ScaleToFit);

    aSurface->canvas.setBitmapDevice(dstBitmap);
    aSurface->canvas.drawBitmapMatrix(srcBitmap, matrix);
}

int ASurface_rotate(ASurface* aSurface, AndroidSurfaceInfo* src, uint32_t degrees) {
    SkBitmap bitmap;

    CHECK(aSurface);
    CHECK(src);

    initBitmap(bitmap, src);
    aSurface->canvas.setBitmapDevice(bitmap);
    return aSurface->canvas.rotate(SkScalar(degrees)) ? 0 : -1;
}

int ASurface_unlockAndPost(ASurface* aSurface) {
    CHECK(aSurface);
    CHECK(aSurface->surface);

    return aSurface->surface->unlockAndPost();
}
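For orientation, here is a hedged usage sketch of the ASurface_* wrapper above. It is not taken from havlenapetr's player code; have_frames() and write_rgb_frame() are hypothetical placeholders, jsurface is the android.view.Surface object handed down from Java, and sdkVersion would be Build.VERSION.SDK_INT passed through JNI. The RGB data would typically come from a libswscale conversion like the one sketched in section 4.2.

// Hypothetical render loop built on the ASurface_* functions defined above.
ASurface* aSurface = NULL;
AndroidSurfaceInfo info;   // struct from the wrapper's surface.h header

if (ASurface_init(env, jsurface, sdkVersion, &aSurface) < 0)
    return;

while (have_frames()) {                                  // hypothetical frame source
    if (ASurface_lock(aSurface, &info) < 0)
        break;
    // copy one RGB frame into the locked buffer; info.s is the surface stride
    write_rgb_frame(info.bits, info.w, info.h, info.s);  // hypothetical helper
    ASurface_unlockAndPost(aSurface);
}

ASurface_deinit(&aSurface);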
video/jni/Android.mk (note the directory structure)

LOCAL_PATH:= $(call my-dir)

include $(CLEAR_VARS)

# our source files
#
LOCAL_SRC_FILES:= \
    surface.cpp

LOCAL_SHARED_LIBRARIES := \
    libskia \
    libsurfaceflinger_client \
    libutils \
    liblog

LOCAL_C_INCLUDES += \
    $(JNI_H_INCLUDE) \
    external/skia/src/core \
    external/skia/include/core \
    frameworks/base/include \
    frameworks/base/native/include

# Optional tag would mean it doesn't get installed by default
LOCAL_MODULE_TAGS := optional

LOCAL_PRELINK_MODULE := false

LOCAL_MODULE:= libjnivideo

include $(BUILD_SHARED_LIBRARY)
audio/jni/audiotrack.cpp (note the directory structure)

/*
 * Copyright (C) 2009 The Android Open Source Project
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

#include <android/audiotrack.h>

#include <utils/Log.h>
#include <media/AudioTrack.h>
#include <media/AudioSystem.h>
#include <utils/Errors.h>
#include <binder/MemoryHeapBase.h>
#include <binder/MemoryBase.h>

#define TAG "AudioTrackWrapper"

using namespace android;

//struct audiotrack_fields_t {
static AudioTrack* track;
//sp<MemoryHeapBase> memHeap;
//sp<MemoryBase> memBase;
//};
//static struct audiotrack_fields_t audio;

static AudioTrack* getNativeAudioTrack(JNIEnv* env, jobject jaudioTrack) {
    jclass clazz = env->FindClass("android/media/AudioTrack");
    jfieldID field_track = env->GetFieldID(clazz, "mNativeTrackInJavaObj", "I");
    if(field_track == NULL) {
        return NULL;
    }
    return (AudioTrack *) env->GetIntField(jaudioTrack, field_track);
}

/*
static bool allocSharedMem(int sizeInBytes) {
    memHeap = new MemoryHeapBase(sizeInBytes);
    if (memHeap->getHeapID() < 0) {
        return false;
    }
    memBase = new MemoryBase(memHeap, 0, sizeInBytes);
    return true;
}
*/

int AndroidAudioTrack_register() {
    __android_log_print(ANDROID_LOG_INFO, TAG, "registering audio track");
    track = new AudioTrack();
    if(track == NULL) {
        return ANDROID_AUDIOTRACK_RESULT_JNI_EXCEPTION;
    }
    __android_log_print(ANDROID_LOG_INFO, TAG, "registered");
    return ANDROID_AUDIOTRACK_RESULT_SUCCESS;
}

int AndroidAudioTrack_start() {
    //__android_log_print(ANDROID_LOG_INFO, TAG, "starting audio track");
    if(track == NULL) {
        return ANDROID_AUDIOTRACK_RESULT_ALLOCATION_FAILED;
    }
    track->start();
    return ANDROID_AUDIOTRACK_RESULT_SUCCESS;
}

int AndroidAudioTrack_set(int streamType,
                          uint32_t sampleRate,
                          int format,
                          int channels) {
    if(track == NULL) {
        return ANDROID_AUDIOTRACK_RESULT_ALLOCATION_FAILED;
    }
    __android_log_print(ANDROID_LOG_INFO, TAG, "setting audio track");
    status_t ret = track->set(streamType, sampleRate, format, channels,
                              0, 0, 0, 0, false);
    if (ret != NO_ERROR) {
        return ANDROID_AUDIOTRACK_RESULT_ERRNO;
    }
    return ANDROID_AUDIOTRACK_RESULT_SUCCESS;
}

int AndroidAudioTrack_flush() {
    if(track == NULL) {
        return ANDROID_AUDIOTRACK_RESULT_ALLOCATION_FAILED;
    }
    track->flush();
    return ANDROID_AUDIOTRACK_RESULT_SUCCESS;
}

int AndroidAudioTrack_stop() {
    if(track == NULL) {
        return ANDROID_AUDIOTRACK_RESULT_ALLOCATION_FAILED;
    }
    track->stop();
    return ANDROID_AUDIOTRACK_RESULT_SUCCESS;
}

int AndroidAudioTrack_reload() {
    if(track == NULL) {
        return ANDROID_AUDIOTRACK_RESULT_ALLOCATION_FAILED;
    }
    if(track->reload() != NO_ERROR) {
        return ANDROID_AUDIOTRACK_RESULT_ERRNO;
    }
    return ANDROID_AUDIOTRACK_RESULT_SUCCESS;
}

int AndroidAudioTrack_unregister() {
    __android_log_print(ANDROID_LOG_INFO, TAG, "unregistering audio track");
    if(!track->stopped()) {
        track->stop();
    }
    //memBase.clear();
    //memHeap.clear();
    free(track);
    //track = NULL;
    __android_log_print(ANDROID_LOG_INFO, TAG, "unregistered");
    return ANDROID_AUDIOTRACK_RESULT_SUCCESS;
}

int AndroidAudioTrack_write(void *buffer, int buffer_size) {
    // give the data to the native AudioTrack object (the data starts at the offset)
    ssize_t written = 0;
    // regular write() or copy the data to the AudioTrack's shared memory?
    if (track->sharedBuffer() == 0) {
        written = track->write(buffer, buffer_size);
    } else {
        // writing to shared memory, check for capacity
        if ((size_t)buffer_size > track->sharedBuffer()->size()) {
            __android_log_print(ANDROID_LOG_INFO, TAG, "buffer size was too small");
            buffer_size = track->sharedBuffer()->size();
        }
        memcpy(track->sharedBuffer()->pointer(), buffer, buffer_size);
        written = buffer_size;
    }
    return written;
}
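And a similarly hedged usage sketch of the AndroidAudioTrack_* wrapper above. It is not taken from havlenapetr's player code; decode_audio() is a hypothetical placeholder, and the numeric streamType/format/channel values mirror the AudioSystem constants on Froyo/Gingerbread (MUSIC = 3, PCM_16_BIT = 1, CHANNEL_OUT_STEREO = 0x0C), which should be verified against the platform headers you build against.

// Hypothetical playback path built on the AndroidAudioTrack_* functions above.
AndroidAudioTrack_register();

// stream type, sample rate, sample format, channel mask (see lead-in note)
AndroidAudioTrack_set(3 /* MUSIC */, 44100, 1 /* PCM_16_BIT */, 0x0C /* stereo */);
AndroidAudioTrack_start();

int16_t pcm[4096];
int size;
while ((size = decode_audio(pcm, sizeof(pcm))) > 0) {   // hypothetical decoder
    AndroidAudioTrack_write(pcm, size);                  // size in bytes
}

AndroidAudioTrack_stop();
AndroidAudioTrack_unregister();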
audio/jni/Android.mk (note the directory structure)

LOCAL_PATH:= $(call my-dir)

include $(CLEAR_VARS)

# our source files
#
LOCAL_SRC_FILES:= \
    audiotrack.cpp

LOCAL_SHARED_LIBRARIES := \
    libbinder \
    libmedia \
    libutils \
    liblog

LOCAL_C_INCLUDES += \
    $(JNI_H_INCLUDE) \
    frameworks/base/include \
    frameworks/base/native/include

# Optional tag would mean it doesn't get installed by default
LOCAL_MODULE_TAGS := optional

LOCAL_PRELINK_MODULE := false

LOCAL_MODULE:= libjniaudio

include $(BUILD_SHARED_LIBRARY)




