I have an application (Qt + Android) that creates a live stream from Android's Camera (AVC) + AudioRecorder (AAC) and sends the encoded data to an RTMP server using the librtmp library (v2.4).
Main AVC MediaCodec function:
public void videoEncode(byte[] data) {
    // Video buffers
    videoCodecInputBuffers = videoMediaCodec.getInputBuffers();
    videoCodecOutputBuffers = videoMediaCodec.getOutputBuffers();
    int inputBufferIndex = videoMediaCodec.dequeueInputBuffer(-1);
    if (inputBufferIndex >= 0) {
        videoInputBuffer = videoCodecInputBuffers[inputBufferIndex];
        videoCodecInputData = YV12toYUV420Planar(data, encWidth * encHeight);
        videoInputBuffer.clear();
        videoInputBuffer.put(videoCodecInputData);
        videoMediaCodec.queueInputBuffer(inputBufferIndex, 0, videoCodecInputData.length, 0, 0);
    }

    // Get AVC/H.264 frame
    int outputBufferIndex = videoMediaCodec.dequeueOutputBuffer(videoBufferInfo, 0);
    while (outputBufferIndex >= 0) {
        videoOutputBuffer = videoCodecOutputBuffers[outputBufferIndex];
        videoOutputBuffer.get(videoCodecOutputData, 0, videoBufferInfo.size);

        // H.264 / AVC start code
        if (videoCodecOutputData[0] == 0x00 && videoCodecOutputData[1] == 0x00
                && videoCodecOutputData[2] == 0x00 && videoCodecOutputData[3] == 0x01) {
            // I-frame
            boolean keyFrame = false;
            if ((videoBufferInfo.flags & MediaCodec.BUFFER_FLAG_SYNC_FRAME) == MediaCodec.BUFFER_FLAG_SYNC_FRAME) {
                resetTimestamp();
                keyFrame = true;
            }
            int currentTimestamp = cameraAndroid.calcTimestamp();
            if (prevTimestamp == currentTimestamp) currentTimestamp++;
            sendVideoData(videoCodecOutputData, videoBufferInfo.size, currentTimestamp, keyFrame); // Native C func
            prevTimestamp = currentTimestamp;
            // SPS / PPS sent
            spsPpsFrame = true;
        }

        videoMediaCodec.releaseOutputBuffer(outputBufferIndex, false);
        outputBufferIndex = videoMediaCodec.dequeueOutputBuffer(videoBufferInfo, 0);
    }
}
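For reference, YV12toYUV420Planar is not shown in the post; below is a minimal standalone sketch of what such a helper usually does (the class and method names here are hypothetical). YV12 lays the planes out as Y, V, U, while COLOR_FormatYUV420Planar (I420) expects Y, U, V, so only the two quarter-size chroma planes need to be swapped:

```java
// Hypothetical sketch of a YV12 -> I420 conversion. Assumes a tightly
// packed frame: ySize luma bytes, then ySize/4 bytes of V, then ySize/4
// bytes of U (YV12 order); I420 wants U before V.
public class Yv12ToI420 {
    public static byte[] convert(byte[] yv12, int ySize) {
        int chromaSize = ySize / 4;                                          // each chroma plane is (W/2) x (H/2)
        byte[] i420 = new byte[yv12.length];
        System.arraycopy(yv12, 0, i420, 0, ySize);                           // Y plane unchanged
        System.arraycopy(yv12, ySize + chromaSize, i420, ySize, chromaSize); // U (stored after V in YV12)
        System.arraycopy(yv12, ySize, i420, ySize + chromaSize, chromaSize); // V
        return i420;
    }
}
```

Note that real camera buffers may have row and chroma stride padding, which this sketch ignores.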
Main AAC MediaCodec function:
public void audioEncode(byte[] data) {
    // Audio buffers
    audioCodecInputBuffers = audioMediaCodec.getInputBuffers();
    audioCodecOutputBuffers = audioMediaCodec.getOutputBuffers();

    // Add raw chunk into buffer
    int inputBufferIndex = audioMediaCodec.dequeueInputBuffer(-1);
    if (inputBufferIndex >= 0) {
        audioInputBuffer = audioCodecInputBuffers[inputBufferIndex];
        audioInputBuffer.clear();
        audioInputBuffer.put(data);
        audioMediaCodec.queueInputBuffer(inputBufferIndex, 0, data.length, 0, 0);
    }

    // Encode AAC
    int outputBufferIndex = audioMediaCodec.dequeueOutputBuffer(audioBufferInfo, 0);
    while (outputBufferIndex >= 0) {
        audioOutputBuffer = audioCodecOutputBuffers[outputBufferIndex];
        audioOutputBuffer.get(audioCodecOutputData, 0, audioBufferInfo.size);
        if (spsPpsFrame || esdsChunk) {
            int currentTimestamp = cameraAndroid.calcTimestamp();
            if (prevTimestamp == currentTimestamp) currentTimestamp++;
            sendAudioData(audioCodecOutputData, audioBufferInfo.size, currentTimestamp); // Native C func
            prevTimestamp = currentTimestamp;
            esdsChunk = false;
        }
        // Next chunk
        audioMediaCodec.releaseOutputBuffer(outputBufferIndex, false);
        outputBufferIndex = audioMediaCodec.dequeueOutputBuffer(audioBufferInfo, 0);
    }
}
Camera frames are encoded in setPreviewCallbackWithBuffer, and AudioRecorder chunks in a separate thread:
audioThread = new Thread(new Runnable() {
    public void run() {
        audioBufferSize = AudioRecord.getMinBufferSize(44100, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        while (!Thread.currentThread().isInterrupted()) {
            int ret = mic.read(audioCodecInputData, 0, audioBufferSize);
            if (ret >= 0)
                cameraAndroid.audioEncode(audioCodecInputData);
        }
    }
});
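One detail worth checking in the loop above: AudioRecord.read() may return fewer bytes than requested (and 0 is a valid result), yet the whole buffer is handed to audioEncode(). A sketch of forwarding only what was actually captured (the helper name is hypothetical):

```java
import java.util.Arrays;

// Hypothetical helper: give the encoder only the bytes mic.read()
// actually produced, instead of the whole (possibly stale) buffer.
public class AudioChunks {
    public static byte[] trim(byte[] buf, int bytesRead) {
        if (bytesRead <= 0) return new byte[0];  // read error or empty read
        if (bytesRead == buf.length) return buf; // full buffer, no copy needed
        return Arrays.copyOf(buf, bytesRead);    // partial read: copy the prefix
    }
}
```

In the thread above this would become `cameraAndroid.audioEncode(AudioChunks.trim(audioCodecInputData, ret))`.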
sendVideoData and sendAudioData are native C functions (librtmp functions + JNI):
public synchronized native void sendVideoData(byte[] buf, int size, int timestamp, boolean keyFrame);
public synchronized native void sendAudioData(byte[] buf, int size, int timestamp);
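As an aside on the timestamps fed into these two functions: the `prevTimestamp == currentTimestamp` bump only guards consecutive calls, and audio and video share `prevTimestamp` across threads. A sketch of one shared, strictly increasing millisecond clock that both paths could draw from (class and method names are hypothetical, not from the post):

```java
// Hypothetical shared timestamp source: every RTMP tag, audio or
// video, gets a strictly larger millisecond timestamp than the one
// before it, so interleaved tags can never step backwards in time.
public class MonotonicClock {
    private final long startNs = System.nanoTime();
    private long lastMs = -1;

    public synchronized int nextTimestampMs() {
        long ms = (System.nanoTime() - startNs) / 1_000_000L;
        if (ms <= lastMs) ms = lastMs + 1; // force strict monotonicity
        lastMs = ms;
        return (int) ms;
    }
}
```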
The main thing I can't understand is: why is the live stream completely unstable when I play it in Adobe Flash Player? The first 1-2 seconds of the stream are fine, but then I only see I-frames every 2 seconds (videoMediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 2)) and get a very broken audio stream that is audible for a few milliseconds around each I-frame and then cuts out.
Can someone show me the correct way to create a stable live stream, please? Where am I wrong?
I also post the AVC/AAC MediaCodec settings here (maybe something is wrong in them?):
// H.264/AVC (advanced video coding) format
MediaFormat videoMediaFormat = MediaFormat.createVideoFormat("video/avc", encWidth, encHeight);
videoMediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);
videoMediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, encWidth * encHeight * 4); // bits per second
videoMediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, fps); // FPS
videoMediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, iFrameInterval); // seconds between I-frames
videoMediaCodec = MediaCodec.createEncoderByType("video/avc");
videoMediaCodec.configure(videoMediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
// AAC (advanced audio coding) format
MediaFormat audioMediaFormat = MediaFormat.createAudioFormat("audio/mp4a-latm", 44100, 1); // mime-type, sample rate, channel count
audioMediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 64 * 1000); // bits per second (64 kbps)
audioMediaFormat.setInteger(MediaFormat.KEY_AAC_PROFILE, MediaCodecInfo.CodecProfileLevel.AACObjectLC);
audioMediaFormat.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, audioBufferSize); // 4096 (default) / 4736 * 1 (min audio buffer size)
audioMediaCodec = MediaCodec.createEncoderByType("audio/mp4a-latm");
audioMediaCodec.configure(audioMediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
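Related to this configuration: the "esds chunk" mentioned in audioEncode() is the 2-byte AudioSpecificConfig that must reach the RTMP server (inside the AAC sequence header) before any raw AAC frames. It is pure bit packing; a sketch for the format configured above, where profile 2 = AAC LC, sampling frequency index 4 = 44100 Hz, and channel count = 1 (class name hypothetical):

```java
// Sketch: 2-byte AudioSpecificConfig. Layout: 5 bits audio object
// type, 4 bits sampling frequency index, 4 bits channel config,
// 3 bits of trailing zero padding.
public class AacConfig {
    public static byte[] audioSpecificConfig(int profile, int freqIdx, int channels) {
        int asc = (profile << 11) | (freqIdx << 7) | (channels << 3);
        return new byte[] { (byte) (asc >> 8), (byte) (asc & 0xFF) };
    }
}
```

For AAC-LC 44100 Hz mono this yields the bytes 0x12 0x08.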
Update: I tried to play the stream with ffmpeg (thanks @Robert Rowntree), and this is what I constantly see in the console:
Non-monotonous DTS in output stream 0:1; previous: 95054, current: 46136; changing to 95056. This may result in incorrect timestamps in the output file.
So I checked the output from the Android app, but I can't find any wrong lines (a - encoded AAC chunk, v - encoded AVC frame, integer value - timestamp in milliseconds): output.txt
Are these timestamps correct?
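One way to answer that question mechanically: assuming output.txt has the format described above (a stream letter followed by a millisecond timestamp per line), a small check like this would find the exact line where one stream's timestamps run backwards, which is what ffmpeg's non-monotonous DTS warning points at (class and method names hypothetical):

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Sketch: return the index of the first log line whose timestamp is
// smaller than the previous timestamp of the same stream ("a" or "v"),
// or -1 if every stream is monotonic on its own.
public class TimestampCheck {
    public static int firstRegression(List<String> lines) {
        Map<String, Integer> last = new HashMap<>();
        for (int i = 0; i < lines.size(); i++) {
            String[] parts = lines.get(i).trim().split("\\s+");
            int ts = Integer.parseInt(parts[1]);
            Integer prev = last.put(parts[0], ts); // remember latest per stream
            if (prev != null && ts < prev) return i;
        }
        return -1;
    }
}
```

Note that RTMP also cares about the timestamps of the *interleaved* sequence, so even a log that passes this per-stream check can still trip the muxer if audio and video drift far apart.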