Android's MediaCodec (API 16): AAC + AVC / H.264 live stream is unstable

http://stackoverflow.com/questions/36313646/androids-mediacodec-api-16-aac-avc-h-264-live-stream-is-unstable


I have an application (Qt + Android) that creates a live stream from Android's Camera (AVC) + AudioRecorder (AAC) and then sends the encoded data to an RTMP server using the librtmp library (v2.4).

The main AVC MediaCodec function:

public void videoEncode(byte[] data) {
    // Video buffers
    videoCodecInputBuffers = videoMediaCodec.getInputBuffers();
    videoCodecOutputBuffers = videoMediaCodec.getOutputBuffers();

    int inputBufferIndex = videoMediaCodec.dequeueInputBuffer(-1);
    if (inputBufferIndex >= 0) {
        videoInputBuffer = videoCodecInputBuffers[inputBufferIndex];
        videoCodecInputData = YV12toYUV420Planar(data, encWidth * encHeight);
        videoInputBuffer.clear();
        videoInputBuffer.put(videoCodecInputData);
        videoMediaCodec.queueInputBuffer(inputBufferIndex, 0, videoCodecInputData.length, 0, 0); // presentationTimeUs is always passed as 0 here
    }

    // Get AVC/H.264 frame
    int outputBufferIndex = videoMediaCodec.dequeueOutputBuffer(videoBufferInfo, 0);
    while(outputBufferIndex >= 0) {
        videoOutputBuffer = videoCodecOutputBuffers[outputBufferIndex];
        videoOutputBuffer.get(videoCodecOutputData, 0, videoBufferInfo.size);

        // H.264 / AVC header
        if(videoCodecOutputData[0] == 0x00 && videoCodecOutputData[1] == 0x00 && videoCodecOutputData[2] == 0x00 && videoCodecOutputData[3] == 0x01) {

            // I-frame
            boolean keyFrame = false;
            if((videoBufferInfo.flags & MediaCodec.BUFFER_FLAG_SYNC_FRAME) == MediaCodec.BUFFER_FLAG_SYNC_FRAME) {
                resetTimestamp();
                keyFrame = true;
            }

            int currentTimestamp = cameraAndroid.calcTimestamp();
            if(prevTimestamp == currentTimestamp) currentTimestamp++;
            sendVideoData(videoCodecOutputData, videoBufferInfo.size, currentTimestamp, keyFrame); // Native C func; fourth argument is the boolean from the declaration below
            prevTimestamp = currentTimestamp;

            // SPS / PPS sent
            spsPpsFrame = true;
        }

        videoMediaCodec.releaseOutputBuffer(outputBufferIndex, false);
        outputBufferIndex = videoMediaCodec.dequeueOutputBuffer(videoBufferInfo, 0);
    }
}
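
One thing worth noting here: queueInputBuffer is called with presentationTimeUs = 0 for every frame, so the encoder never sees a real clock, and the RTMP timestamps are reconstructed by hand afterwards. As a minimal sketch (not the code above; StreamClock and startTimeNs are my names), both encoders could share a single monotonic time base:

// Sketch only: one clock shared by the audio and video paths, so both
// queueInputBuffer() calls and both RTMP sends use the same time base.
class StreamClock {
    private final long startTimeNs = System.nanoTime(); // assumed: created when streaming starts

    // Microseconds since start, for MediaCodec.queueInputBuffer(..., presentationTimeUs, ...)
    long nowUs() {
        return (System.nanoTime() - startTimeNs) / 1000L;
    }

    // Milliseconds since start, for the RTMP packet timestamp
    int nowMs() {
        return (int) (nowUs() / 1000L);
    }
}

With something like this, videoBufferInfo.presentationTimeUs (which MediaCodec propagates from the input side) could be divided by 1000 and used directly as the RTMP timestamp, instead of the calcTimestamp()/prevTimestamp bookkeeping above.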

The main AAC MediaCodec function:

public void audioEncode(byte[] data) {

    // Audio buffers
    audioCodecInputBuffers = audioMediaCodec.getInputBuffers();
    audioCodecOutputBuffers = audioMediaCodec.getOutputBuffers();

    // Add raw chunk into buffer
    int inputBufferIndex = audioMediaCodec.dequeueInputBuffer(-1);
    if (inputBufferIndex >= 0) {
        audioInputBuffer = audioCodecInputBuffers[inputBufferIndex];
        audioInputBuffer.clear();
        audioInputBuffer.put(data);
        audioMediaCodec.queueInputBuffer(inputBufferIndex, 0, data.length, 0, 0);
    }

    // Encode AAC
    int outputBufferIndex = audioMediaCodec.dequeueOutputBuffer(audioBufferInfo, 0);
    while(outputBufferIndex >= 0) {
        audioOutputBuffer = audioCodecOutputBuffers[outputBufferIndex];
        audioOutputBuffer.get(audioCodecOutputData, 0, audioBufferInfo.size);

        if(spsPpsFrame || esdsChunk) {
            int currentTimestamp = cameraAndroid.calcTimestamp();
            if(prevTimestamp == currentTimestamp) currentTimestamp++;
            sendAudioData(audioCodecOutputData, audioBufferInfo.size, currentTimestamp); // Native C func
            prevTimestamp = currentTimestamp;
            esdsChunk = false;
        }

        // Next chunk
        audioMediaCodec.releaseOutputBuffer(outputBufferIndex, false);
        outputBufferIndex = audioMediaCodec.dequeueOutputBuffer(audioBufferInfo, 0);
    }
}
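
For audio in particular, a wall-clock timestamp tends to drift against the amount of PCM actually delivered. A common alternative, shown here only as a sketch under the 44100 Hz / mono / 16-bit assumptions of the format configured below (totalPcmBytes and audioPtsUs are my names), is to derive the timestamp from the sample count:

// Sketch, not the original code: audio PTS derived from the PCM byte count.
// Assumes 44100 Hz, mono, 16-bit PCM (2 bytes per sample).
private long totalPcmBytes = 0;

long audioPtsUs(int chunkBytes) {
    long ptsUs = (totalPcmBytes * 1000000L) / (44100L * 2L); // bytes -> samples -> microseconds
    totalPcmBytes += chunkBytes;
    return ptsUs;
}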

Camera frames are encoded in setPreviewCallbackWithBuffer, and AudioRecorder's chunks in a separate thread:

audioThread = new Thread(new Runnable() {
    public void run() {
        audioBufferSize = AudioRecord.getMinBufferSize(44100, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        while (!Thread.currentThread().isInterrupted()) { // interrupted() is static and clears the flag
            int ret = mic.read(audioCodecInputData, 0, audioBufferSize);
            if (ret > 0) // read() returns the number of bytes read, or a negative error code
                cameraAndroid.audioEncode(audioCodecInputData);
        }
    }
});
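
A small detail in this thread: read() may return fewer bytes than audioBufferSize, yet the whole array is handed to audioEncode(), so stale bytes from the previous chunk can be re-encoded. A sketch of trimming to the bytes actually read (Arrays.copyOf is just one way to do it):

// Sketch: queue only the bytes that read() actually returned.
int ret = mic.read(audioCodecInputData, 0, audioBufferSize);
if (ret > 0)
    cameraAndroid.audioEncode(java.util.Arrays.copyOf(audioCodecInputData, ret));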

sendVideoData and sendAudioData are native C functions (librtmp functions + JNI):

public synchronized native void sendVideoData(byte[] buf, int size, int timestamp, boolean keyFrame);
public synchronized native void sendAudioData(byte[] buf, int size, int timestamp);
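
Since videoEncode() runs on the camera callback and audioEncode() on the audio thread, the prevTimestamp bookkeeping above is mutable state shared between two threads; even though the native methods are synchronized, the timestamp computation happens outside that lock. A hedged sketch (muxLock, lastSentMs and the *Safe wrappers are my names, not part of the code) of serializing both send paths on one lock:

// Sketch only: one lock around "compute timestamp + send", so interleaved
// audio/video packets leave with non-decreasing timestamps.
private final Object muxLock = new Object();
private int lastSentMs = 0;

void sendVideoSafe(byte[] buf, int size, int ts, boolean keyFrame) {
    synchronized (muxLock) {
        if (ts <= lastSentMs) ts = lastSentMs + 1; // keep timestamps strictly increasing
        sendVideoData(buf, size, ts, keyFrame);
        lastSentMs = ts;
    }
}

void sendAudioSafe(byte[] buf, int size, int ts) {
    synchronized (muxLock) {
        if (ts <= lastSentMs) ts = lastSentMs + 1;
        sendAudioData(buf, size, ts);
        lastSentMs = ts;
    }
}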

The main thing I can't understand is why the live stream is completely unstable when I play it in Adobe Flash Player. The first 1-2 seconds of the stream are correct, but then all I see is the I-frame every 2 seconds (videoMediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 2)), and the sound stream is very bad: I can hear it for a few milliseconds at each I-frame interval and then it cuts out.

Can someone show me the correct way to create a stable live stream, please? Where am I wrong?

Also, here are the AVC/AAC MediaCodec settings (maybe something is wrong here?):

// H.264/AVC (advanced video coding) format
MediaFormat videoMediaFormat = MediaFormat.createVideoFormat("video/avc", encWidth, encHeight);
videoMediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);
videoMediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, encWidth * encHeight * 4);                        // bits per second
videoMediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, fps);                                           // FPS
videoMediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, iFrameInterval);                          // seconds between I-frames
videoMediaCodec = MediaCodec.createEncoderByType("video/avc");
videoMediaCodec.configure(videoMediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);

// AAC (advanced audio coding) format
MediaFormat audioMediaFormat = MediaFormat.createAudioFormat("audio/mp4a-latm", 44100, 1);              // mime-type, sample rate, channel count
audioMediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 64 * 1000);                                       // bits per second (64 kbps)
audioMediaFormat.setInteger(MediaFormat.KEY_AAC_PROFILE, MediaCodecInfo.CodecProfileLevel.AACObjectLC);
audioMediaFormat.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, audioBufferSize);                           // 4096 (default) / 4736 * 1 (min audio buffer size)
audioMediaCodec = MediaCodec.createEncoderByType("audio/mp4a-latm");
audioMediaCodec.configure(audioMediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
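
Regarding the SPS/PPS detection by scanning for the 00 00 00 01 start code in videoEncode(): MediaCodec marks the codec-specific data explicitly, which also covers the AAC side (the AudioSpecificConfig / ESDS chunk). A sketch of checking the flag instead, inside the dequeue loop (variable names follow the video loop above):

// Sketch: the first output buffer of each encoder carries codec-specific data
// (SPS/PPS for AVC, AudioSpecificConfig for AAC), marked with this flag; it
// should be sent once as a sequence header, before any real frames.
if ((videoBufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
    byte[] config = new byte[videoBufferInfo.size];
    videoOutputBuffer.get(config, 0, videoBufferInfo.size);
    // send as AVC sequence header with timestamp 0, then skip the normal send
}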

Update: I tried to play the stream with ffmpeg (thanks @Robert Rowntree), and this is what I see in the console constantly:

Non-monotonous DTS in output stream 0:1; previous: 95054, current: 46136; changing to 95056. This may result in incorrect timestamps in the output file.

So I checked the output from the Android app, but I can't find any wrong lines (a = encoded AAC chunk, v = encoded AVC frame, integer value = timestamp in milliseconds): output.txt

Are those timestamps correct?
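
For reference, the check ffmpeg performs is per stream: within the audio stream and within the video stream separately, timestamps must never decrease. A tiny sketch of that check over log entries like the ones in output.txt (the entries array here is a hypothetical sample, not real data):

// Sketch: verify per-stream monotonic timestamps in a log of ("a"/"v", ms) pairs.
String[][] entries = { {"v", "0"}, {"a", "23"}, {"v", "40"}, {"a", "12"} }; // hypothetical sample
int lastA = Integer.MIN_VALUE, lastV = Integer.MIN_VALUE;
for (String[] e : entries) {
    int ts = Integer.parseInt(e[1]);
    if (e[0].equals("a")) {
        if (ts < lastA) System.out.println("audio timestamp went backwards at " + ts);
        lastA = ts;
    } else {
        if (ts < lastV) System.out.println("video timestamp went backwards at " + ts);
        lastV = ts;
    }
}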

Can you play your stream with other clients? VLC, say? – Robert Rowntree, Mar 30 '16 at 18:46
 
Yeah, as I wrote above, the stream is playable but unstable. If I disable audio before streaming, everything is OK. – 0x0000dead, Mar 30 '16 at 20:09
