A Brief Look at the Java, JNI and Native Layers in AudioRecord
Note: this analysis is based on Android 8.1.
Qidi 2020.07.06 (Markdown & Haroopad)
In Android, JNI is the bridge between the Java and native layers. From a design standpoint it can be viewed as a form of the Proxy pattern: every component that uses JNI, such as `AudioRecord` and `AudioSystem`, must follow the same contract, namely populating a `JNINativeMethod gMethods[]` array in its own JNI source file that spells out the mapping for each function call.
Each component's JNI source file lives under `frameworks/base/core/jni/`, with file names of the form `android_*.cpp`. Taking `AudioRecord` as an example, its JNI file is:

`frameworks/base/core/jni/android_media_AudioRecord.cpp`
The corresponding `gMethods[]` is defined as:
```cpp
static const JNINativeMethod gMethods[] = {
    // name,               signature,  funcPtr
    {"native_start", "(II)I", (void *)android_media_AudioRecord_start},
    {"native_stop", "()V", (void *)android_media_AudioRecord_stop},
    {"native_setup", "(Ljava/lang/Object;Ljava/lang/Object;[IIIII[ILjava/lang/String;J)I",
                     (void *)android_media_AudioRecord_setup},
    {"native_finalize", "()V", (void *)android_media_AudioRecord_finalize},
    {"native_release", "()V", (void *)android_media_AudioRecord_release},
    {"native_read_in_byte_array", "([BIIZ)I",
                     (void *)android_media_AudioRecord_readInArray<jbyteArray>},
    {"native_read_in_short_array", "([SIIZ)I",
                     (void *)android_media_AudioRecord_readInArray<jshortArray>},
    {"native_read_in_float_array", "([FIIZ)I",
                     (void *)android_media_AudioRecord_readInArray<jfloatArray>},
    {"native_read_in_direct_buffer", "(Ljava/lang/Object;IZ)I",
                     (void *)android_media_AudioRecord_readInDirectBuffer},
    {"native_get_buffer_size_in_frames", "()I",
                     (void *)android_media_AudioRecord_get_buffer_size_in_frames},
    {"native_set_marker_pos", "(I)I", (void *)android_media_AudioRecord_set_marker_pos},
    {"native_get_marker_pos", "()I", (void *)android_media_AudioRecord_get_marker_pos},
    {"native_set_pos_update_period", "(I)I",
                     (void *)android_media_AudioRecord_set_pos_update_period},
    {"native_get_pos_update_period", "()I",
                     (void *)android_media_AudioRecord_get_pos_update_period},
    {"native_get_min_buff_size", "(III)I",
                     (void *)android_media_AudioRecord_get_min_buff_size},
    {"native_setInputDevice", "(I)Z", (void *)android_media_AudioRecord_setInputDevice},
    {"native_getRoutedDeviceId", "()I", (void *)android_media_AudioRecord_getRoutedDeviceId},
    {"native_enableDeviceCallback", "()V",
                     (void *)android_media_AudioRecord_enableDeviceCallback},
    {"native_disableDeviceCallback", "()V",
                     (void *)android_media_AudioRecord_disableDeviceCallback},
    {"native_get_timestamp", "(Landroid/media/AudioTimestamp;I)I",
                     (void *)android_media_AudioRecord_get_timestamp},
};
```
The middle field of each `JNINativeMethod` entry in `gMethods[]` is the JNI signature, which encodes the parameter types and return type of the function being called. For example, `{"native_start", "(II)I", (void *)android_media_AudioRecord_start}` means that the function pointed to by `android_media_AudioRecord_start` takes two `int` parameters and also returns an `int`. For the full notation, see "JNI Types and Data Structures".
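To make the signature notation concrete, here is a small illustrative sketch in Java. The class and method names are hypothetical (not part of AOSP), and it only handles primitive type codes such as those in `(II)I` or `()V`; real signatures also encode arrays (`[B`) and object types (`Ljava/lang/String;`), as seen in the `native_setup` entry above.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative only: decodes JNI signatures that contain primitive types.
class JniSignatureDemo {
    // Maps one JNI type code to its Java type name.
    static String typeName(char c) {
        switch (c) {
            case 'Z': return "boolean";
            case 'B': return "byte";
            case 'S': return "short";
            case 'I': return "int";
            case 'J': return "long";
            case 'F': return "float";
            case 'D': return "double";
            case 'V': return "void";
            default:  return "?";
        }
    }

    // Describes a primitive-only signature, e.g. "(II)I" -> "(int, int) -> int".
    static String describe(String sig) {
        int close = sig.indexOf(')');
        List<String> params = new ArrayList<>();
        for (int i = 1; i < close; i++) {
            params.add(typeName(sig.charAt(i)));
        }
        String ret = typeName(sig.charAt(close + 1));
        return "(" + String.join(", ", params) + ") -> " + ret;
    }

    public static void main(String[] args) {
        System.out.println(describe("(II)I")); // native_start: (int, int) -> int
        System.out.println(describe("()V"));   // native_stop:  () -> void
    }
}
```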
If a component's JNI source file implements the `JNI_OnLoad()` method, it is built into a standalone `.so` library of its own; otherwise, its sources are linked into some other standalone `.so` library.
When the Android runtime (ART) starts, it loads each JNI `.so` file in turn, calls its `JNI_OnLoad()` method, and registers the function mappings declared in `gMethods[]`, so that Java-to-native calls can later be resolved automatically. For the details of how ART registers JNI methods and looks up these mappings, read the source file `art/runtime/java_vm_ext.cc`.
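Conceptually, this registration step just builds a lookup table from method name (plus signature) to function pointer. The sketch below is a toy analogy in plain Java, with all names hypothetical; ART's actual implementation lives in `art/runtime/java_vm_ext.cc` and dispatches to real native function pointers rather than Java lambdas.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.IntBinaryOperator;

// Toy analogy of JNI method registration: each entry mirrors one
// {name, signature, funcPtr} triple from a gMethods[] table.
class RegistrationDemo {
    static final Map<String, IntBinaryOperator> sTable = new HashMap<>();

    // Analogue of RegisterNatives(): record the name -> implementation mapping.
    static void registerNative(String name, String signature, IntBinaryOperator impl) {
        sTable.put(name, impl);
    }

    // Analogue of the runtime resolving a declared Java native method by name.
    static int invoke(String name, int a, int b) {
        IntBinaryOperator impl = sTable.get(name);
        if (impl == null) throw new UnsatisfiedLinkError(name);
        return impl.applyAsInt(a, b);
    }

    public static void main(String[] args) {
        // Mirrors {"native_start", "(II)I", (void *)android_media_AudioRecord_start};
        // the stub implementation here just returns 0 for "success".
        registerNative("native_start", "(II)I", (syncEvent, sessionId) -> 0);
        System.out.println(invoke("native_start", 0, 0));
    }
}
```

An unregistered name throws `UnsatisfiedLinkError`, which is also what a real Java caller sees when a native method has no registered implementation.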
From then on, as long as the Java layer declares the matching native methods, calls are routed through JNI into the native layer automatically.

Again taking `AudioRecord` as an example, the native methods declared on the Java side, corresponding one-to-one with the entries in `gMethods[]`, are as follows (in `frameworks/base/media/java/android/media/AudioRecord.java`):
```java
//---------------------------------------------------------
// Native methods called from the Java side
//--------------------

private native final int native_setup(Object audiorecord_this,
        Object /*AudioAttributes*/ attributes,
        int[] sampleRate, int channelMask, int channelIndexMask, int audioFormat,
        int buffSizeInBytes, int[] sessionId, String opPackageName,
        long nativeRecordInJavaObj);

// TODO remove: implementation calls directly into implementation of native_release()
private native final void native_finalize();

/**
 * @hide
 */
public native final void native_release();

private native final int native_start(int syncEvent, int sessionId);

private native final void native_stop();

private native final int native_read_in_byte_array(byte[] audioData,
        int offsetInBytes, int sizeInBytes, boolean isBlocking);

private native final int native_read_in_short_array(short[] audioData,
        int offsetInShorts, int sizeInShorts, boolean isBlocking);

private native final int native_read_in_float_array(float[] audioData,
        int offsetInFloats, int sizeInFloats, boolean isBlocking);

private native final int native_read_in_direct_buffer(Object jBuffer,
        int sizeInBytes, boolean isBlocking);

private native final int native_get_buffer_size_in_frames();

private native final int native_set_marker_pos(int marker);

private native final int native_get_marker_pos();

private native final int native_set_pos_update_period(int updatePeriod);

private native final int native_get_pos_update_period();

static private native final int native_get_min_buff_size(
        int sampleRateInHz, int channelCount, int audioFormat);

private native final boolean native_setInputDevice(int deviceId);

private native final int native_getRoutedDeviceId();

private native final void native_enableDeviceCallback();

private native final void native_disableDeviceCallback();

private native final int native_get_timestamp(@NonNull AudioTimestamp outTimestamp,
        @AudioTimestamp.Timebase int timebase);
```
Still taking `AudioRecord` as an example, the flow of creating a RecordTrack and starting recording is shown in the sequence diagrams below.

Sequence diagram from the app down to JNI:

Sequence diagram from JNI into the native layer: