Recording and playing PCM wave files with AudioTrack and AudioRecord



  http://blog.csdn.net/brooknew/article/details/8274594

The android.media package contains two classes for audio capture and playback: AudioRecord for recording and AudioTrack for playback. The recording flow is roughly the first figure below, the playback flow the second.

[Figure 1: recording flow. Figure 2: playback flow.]

First, the AudioRecord constructor: public AudioRecord(int audioSource, int sampleRateInHz, int channelConfig, int audioFormat, int bufferSizeInBytes)

The first parameter is the audio source. It can be the microphone (MediaRecorder.AudioSource.MIC) or call audio (MediaRecorder.AudioSource.VOICE_CALL for both parties, MediaRecorder.AudioSource.VOICE_DOWNLINK for the remote party, MediaRecorder.AudioSource.VOICE_UPLINK for the local party).

The second parameter is the desired recording sample rate, e.g. 8000, 16000, or 44100 Hz.

The third parameter is the desired channel configuration: AudioFormat.CHANNEL_IN_MONO or AudioFormat.CHANNEL_IN_STEREO.

The fourth parameter is the desired sample encoding: AudioFormat.ENCODING_PCM_16BIT or AudioFormat.ENCODING_PCM_8BIT.

The fifth parameter is the size of the buffer the system should provide for recording. It must be no smaller than the value returned by AudioRecord.getMinBufferSize(). The larger this buffer, the longer the buffering time, i.e. AudioRecord.read() can be called at longer intervals without the underlying buffer overflowing.

A note on using the call-audio sources for the first parameter: some phones support it and some do not; where it fails, it is because the platform has not implemented the feature. In my tests it did not work on a Nexus S but did work on an HTC Desire HD, which matches the results I got with the app 通话录音CallRecorder_1.29.apk.

After calling startRecording(), keep calling read() to fetch the audio data. read() is blocking: if the lower layers do not yet have enough data, the call waits inside it.

In general it is best to run this capture loop on a dedicated thread; a minimal sketch follows.
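As a rough, self-contained sketch (using the same 8000 Hz / mono / 16-bit parameters as the code later in this post; the recording flag and the data sink are placeholders, not part of the original code):

// Sketch: capture 8000 Hz, mono, 16-bit PCM in a loop meant to run on a worker thread.
int sampleRate = 8000;
int minBuf = AudioRecord.getMinBufferSize(sampleRate,
        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
        sampleRate, AudioFormat.CHANNEL_IN_MONO,
        AudioFormat.ENCODING_PCM_16BIT, minBuf * 4);   // must be >= getMinBufferSize()

byte[] buf = new byte[minBuf];
boolean recording = true;          // in real code: a volatile field cleared by a stop method
recorder.startRecording();
while (recording) {
    int n = recorder.read(buf, 0, buf.length);         // blocks until data is available
    if (n > 0) {
        // hand buf[0..n) to a file, an encoder, the network, ...
    }
}
recorder.stop();
recorder.release();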

 

Next, the AudioTrack constructor:

AudioTrack(int streamType, int sampleRateInHz, int channelConfig, int audioFormat, int bufferSizeInBytes, int mode)

The first parameter selects the stream type; for playback through the speaker, AudioManager.STREAM_MUSIC is the usual choice.

The last parameter selects whether the data is handed to the AudioTrack in one shot (MODE_STATIC) or pushed continuously through write() (MODE_STREAM). I use MODE_STREAM here.
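For instance, a minimal sketch of creating a streaming track for 8000 Hz mono 16-bit PCM (the same format used in the code below) looks like this:

// Sketch: create a streaming AudioTrack and feed it PCM through write().
int sampleRate = 8000;
int minBuf = AudioTrack.getMinBufferSize(sampleRate,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
        minBuf * 4, AudioTrack.MODE_STREAM);
track.play();
// then, in a loop: track.write(pcmBuffer, 0, bytesRead);   // blocks when the buffer is full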

A few other methods deserve a mention here.

write() is also blocking: if the underlying buffer is full, the call waits inside write() until space frees up.

getPlaybackHeadPosition() returns how many frames have been played so far (a frame is one sampling instant, so at an 8000 Hz sample rate one second is 8000 frames).
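Converting that frame count to milliseconds is just a division by the sample rate (a sketch, reusing the track and sampleRate names from the snippet above):

int frames = track.getPlaybackHeadPosition();      // frames played so far
long playedMs = frames * 1000L / sampleRate;       // at 8000 Hz, 8 frames per millisecond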

AudioTrack.OnPlaybackPositionUpdateListener declares two methods:

onMarkerReached(AudioTrack track )

onPeriodicNotification(AudioTrack track )

Both methods are implemented by the client, and the implementation is registered with setPlaybackPositionUpdateListener().

setNotificationMarkerPosition() sets a marker position (in frames); when playback reaches that position, onMarkerReached() is invoked.

setPositionNotificationPeriod() sets a notification period (also in frames); each time that many frames have been played, onPeriodicNotification() is invoked.

getPlaybackHeadPosition(), onMarkerReached(), and onPeriodicNotification() make it easy to track how far playback has progressed, which can be used to keep video or subtitles in sync with the audio.
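A sketch of how the pieces fit together (assuming the 8000 Hz track from the snippet above; the period of 200 frames and the 200 ms marker match the values used in the code below):

// Sketch: periodic callbacks every 200 frames (25 ms) plus a one-shot marker at 200 ms.
track.setPositionNotificationPeriod(200);          // in frames
track.setNotificationMarkerPosition(1600);         // 1600 frames = 200 ms at 8000 Hz
track.setPlaybackPositionUpdateListener(new AudioTrack.OnPlaybackPositionUpdateListener() {
    public void onPeriodicNotification(AudioTrack t) {
        long ms = t.getPlaybackHeadPosition() * 1000L / 8000;
        Log.v("PlaybackPos", "played about " + ms + " ms");
    }
    public void onMarkerReached(AudioTrack t) {
        Log.v("PlaybackPos", "marker (200 ms) reached");
    }
});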

The code:

/avccodecDemo/src/yzriver/avc/avccodec/AudioRecThread.java

 

package yzriver.avc.avccodec;


import java.lang.Thread;

 

import java.lang.Runnable;
import android.os.Handler;
import android.os.Message;
import android.os.PowerManager ;
import android.util.Log;
import android.content.Context ;
import java.io.FileInputStream ;
import java.io.FileOutputStream ;
import java.io.IOException;
import java.io.RandomAccessFile;
import java.io.FileNotFoundException;

import java.lang.System;
import android.widget.TextView;
import android.media.AudioRecord ;
import android.media.AudioFormat ;
import android.media.MediaRecorder.AudioSource;
import android.media.AudioManager;
import android.media.AudioTrack ;
import android.media.AudioTrack.OnPlaybackPositionUpdateListener ;
import java.util.Arrays;
 
public class AudioRecThread implements android.media.AudioTrack.OnPlaybackPositionUpdateListener {
 final static String TAG = "AudioRecThread" ;
 static public  final int MSG_AUDIO_REC_FINISH = 1;
 static public  final int MSG_AUDIO_PLAY_FINISH = 2;
 private boolean audioRecContinue = false ;
 private boolean audioPlayContinue = false ;
 private PowerManager.WakeLock  wl ;
 private Thread thread;
 private long startPlayMilli = 0 ;
 private long playUpdatePeriodAmass = 0 ;
 private long timeMarkAttendModified = 0 ;
 private int  markPos = 0 ;
 private int markMill = 200 ;
 
 private final Handler myHandler = new MainHandler() ;

 private int updateTimes = 0 ;
 private class MainHandler extends Handler{
  @Override
     public void handleMessage(Message msg) {
        /* switch (msg.what) {
             case AvcThread.MSG_UPDATE_YUVVIEW: {
              updateTimes ++ ; 
              Log.v("GraphicsDraw", "AvcThread.MSG_UPDATE_YUVVIEW") ;
              graphicsView.update( argbArray ,yuv_width , yuv_height  );
                 break;
             }
             case AvcThread.MSG_UPDATE_FRAMERATE: {
   String str ;
   if( frameRatePreText == null )
    str = "Enc/Dec frame rate is " ;
   else
    str = frameRatePreText ;
              str +=  mFrameRate ;
              frameRateTextView.setText( str  ) ;
              frameRateTextView.bringToFront();  
              break ;
             }
         }*/
     }
    }
  public void onPeriodicNotification(AudioTrack track){
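   // setPositionNotificationPeriod(200) in startAudioPlay(): 200 frames at 8000 Hz = 25 ms per callback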
   playUpdatePeriodAmass += 25 ;
   long c = System.currentTimeMillis() ;
   c -= startPlayMilli ;
   c = track.getPlaybackHeadPosition() ;
   c /= 8;
   Log.v(TAG+" onPeriodicNotification", "real ms " + c + " period total " +  playUpdatePeriodAmass ) ;
 }
 public void onMarkerReached(AudioTrack track) {
  timeMarkAttendModified = System.currentTimeMillis() ;
  timeMarkAttendModified -= markMill ;
  Log.v(TAG+" period currpos", "mark:"+markMill ) ;
 }
 
 AudioRecThread( PowerManager.WakeLock wl1){
  Log.v(TAG, "AvcThread created constructor1");
  wl = wl1 ;  
 }
 
 void writeShort( RandomAccessFile fo , short in ) throws IOException {
  byte tmp[] = {0,0} ;
  tmp[0] =(byte) (in&0xff) ;
  tmp[1] =(byte) ((in >> 8) & 0xff) ;
  try {
   fo.write(tmp);
  }catch( IOException e ){
   throw e ;
  } 
 }
 
 void writeInt( RandomAccessFile fo , int in ) throws IOException {
  byte tmp[] = {0,0,0,0} ;
  tmp[0] =(byte) (in&0xff) ;
  tmp[1] =(byte) ((in >> 8) & 0xff) ;
  tmp[2] =(byte) ((in >> 16) & 0xff) ;
  tmp[3] =(byte) ((in >> 24) & 0xff) ;  
  try {
   fo.write(tmp);
  }catch( IOException e ){
   throw e ;
  } 
 } 
 
 void startAudioRec(final String audioFname, final int  audioSource, final Handler handler )
 {
 
  final Runnable audioRecRun = new Runnable() {       
       public void run() {

       int channel =AudioFormat.CHANNEL_IN_MONO ;
       short  channels = 1 ;
       int format = AudioFormat.ENCODING_PCM_16BIT ;
   int bitsPerSample = 16 ;
       int sampleRate = 8000 ;
       int bufferSizeInBytes = AudioRecord.getMinBufferSize ( sampleRate, channel , format  ) *4 ;

       AudioRecord audioRecord = new AudioRecord( audioSource,sampleRate, channel,format,  bufferSizeInBytes);
       if( audioRecord == null )
     return ;
       int bufReadLenInByte = bufferSizeInBytes/2 ;
       byte[] bufRead = new byte[bufReadLenInByte] ;
        
    RandomAccessFile  fo ;
    wl.acquire() ;
        try {
         fo = new RandomAccessFile  ( audioFname , "rw" ) ;
        }catch ( FileNotFoundException e){
         return ;
        }catch ( IllegalArgumentException e ) {
     return ;
    }catch( SecurityException e ) {
     return ;
    }
        
/*
typedef struct tWAVEFORMATEX
{
    WORD        wFormatTag;       
    WORD        nChannels;        
    DWORD       nSamplesPerSec;
    DWORD       nAvgBytesPerSec; 
    WORD        nBlockAlign;       
    WORD        wBitsPerSample; 
    WORD        cbSize;            
} WAVEFORMATEX
*/    
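       // Write a standard 44-byte PCM WAV header: "RIFF" chunk, 16-byte "fmt " chunk, "data" chunk.
       // Multi-byte fields are little-endian (hence the writeShort/writeInt helpers above);
       // the two length fields are written as 0 for now and patched once recording stops.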
       int datalength = 0 ;
       
       try {
        byte[]  tmp = {0,0,0,0} ;
        fo.writeBytes( "RIFF" ) ;
     fo.writeInt(0) ; // Riff file length
     fo.writeBytes("WAVE"); // 4
     fo.writeBytes("fmt ") ; // 4     
     writeInt(fo , 16 ) ; //4 , followed by 16 bytes     
     writeShort(fo, (short)1 ) ;//pcm wave tag
     writeShort(fo, (short) channels ) ;
     writeInt(fo, sampleRate ) ;     
     writeInt(fo, bitsPerSample*sampleRate*channels/8) ;
     writeShort(fo, (short)(channels*bitsPerSample/8) ) ;
     writeShort(fo,(short)bitsPerSample) ;
     fo.writeBytes("data"); // 4
     fo.writeInt( 0 ) ; // 4 ,data length
       }catch( IOException  e) {
        
       }
    

    audioRecContinue = true ;
    try {
     audioRecord.startRecording()  ;
    } catch( IllegalStateException  e ) {

    }

        while( audioRecContinue )
        {
         //Log.v(TAG+"-ENC","In run AudioRecThread.encodeContinue="+audioRecContinue ) ;
         try
         {
           int read = audioRecord.read ( bufRead, 0 , bufReadLenInByte)  ;
           if( read > 0  ) {
           fo.write((byte[]) bufRead , 0, read ) ;
           datalength += read ;
          }
         }
         catch (IOException e)
         {
          
         }        
        }
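        // Recording finished: patch the RIFF chunk size (36 + data bytes) at offset 4
        // and the data chunk size at offset 40.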
        try {
         fo.seek( 4 ) ;
         writeInt(fo, 36 +datalength ) ;
         fo.seek( 40 ) ;
         writeInt(fo, datalength ) ;        
         fo.close() ;
        }
        catch (IOException e)
        {
          
        }
    try {
     audioRecord.stop()  ;
    } catch( IllegalStateException  e ) {

    }
    audioRecord.release() ;

        wl.release() ;
        handler.sendEmptyMessage( MSG_AUDIO_REC_FINISH ) ;
        Log.v(TAG,"AvcThread Finish encoding " ) ;
   }
  } ;  
  startLowPriorityNewThread(audioRecRun);  
 }
 public void stopAudioRec() {
  audioRecContinue = false ;
  if( thread != null  ) {
   try {
      thread.join() ;
   }catch( InterruptedException e  ) {
    
   }  
  }

  thread = null; 
  //Log.v(TAG,"in stopAvcEnc AvcThread.encodeContinue="+encodeContinue ) ;
 }
 
 
 private short readShort( FileInputStream fi ) throws IOException {
  byte tmp[] = {0,0} ;
  try {
   fi.read(tmp);
  }catch( IOException e ){
   throw e ;
  }
  // assemble little-endian; mask with 0xff so negative bytes are not sign-extended
  int r = (tmp[1] & 0xff) ;
  r <<= 8 ;
  r |= (tmp[0] & 0xff) ;
  return (short) r ;
 }
 
 private int readInt( FileInputStream fi ) throws IOException {
  byte tmp[] = {0,0,0,0} ;

  try {
   fi.read(tmp);
  }catch( IOException e ){
   throw e ;
  } 
  // assemble little-endian; mask with 0xff so negative bytes are not sign-extended
  int r = (tmp[3] & 0xff) ;
  r <<= 8 ;
  r |= (tmp[2] & 0xff) ;
  r <<= 8 ;
  r |= (tmp[1] & 0xff) ;
  r <<= 8 ;
  r |= (tmp[0] & 0xff) ;
  
  return r ;  
 } 


 void startAudioPlay(final String audioFname,  final Handler handler )
 {
 
  final Runnable audioRecRun = new Runnable() {       
       public void run() {
       FileInputStream fi ;
       try {
        fi = new FileInputStream( audioFname );
       }catch (FileNotFoundException e ) {
        return ;
       }
       
       short  channels = 1 ;
       int blockAlign ;
       int bitsPerSample ;
       int sampleRate  ;
        
    wl.acquire() ;
  
       int datalength = 0 ;
       
       try {
        byte[]  tmp = {0,0,0,0} ;
        fi.read(tmp) ;
        boolean check = Arrays.equals( tmp, "RIFF".getBytes() ) ;
        if( check!= true )
         return ;
     fi.read(tmp) ; // Riff file length
     fi.read( tmp ) ;
     check = Arrays.equals( tmp, "WAVE".getBytes()); // 4
     if( check!= true )
         return ;
     fi.read( tmp ) ;
     check = Arrays.equals( tmp ,  "fmt ".getBytes()) ; // 4
     if( check!= true )
         return ;
     int headlen = readInt(fi ) ;
     if( headlen != 16 )
      return ; //4 , followed by 16 bytes     
     short s = readShort(fi ) ;
     if( s != 1 )
      return ;//pcm wave tag
     channels = readShort(fi ) ;
     sampleRate = readInt(fi )  ;     
     readInt(fi) ; // bitsPerSample*sampleRate*channels/8) ;
     blockAlign = readShort(fi ) ;//, (short)(channels*bitsPerSample/8) ) ;
     bitsPerSample = readShort(fi ) ;//,(short)bitsPerSample) ;
     fi.read( tmp ) ; //"data"); // 4
     fi.read( tmp ) ; // 4 ,data length
       }catch( IOException  e) {
        return ;
       }
    
       
       int format  ;//= AudioFormat.ENCODING_PCM_16BIT ;
       int channel  ;//=AudioFormat.CHANNEL_CONFIGURATION_MONO ;
       
       if( channels == 2 )
        channel = AudioFormat.CHANNEL_OUT_STEREO ;
       else if( channels == 1 )
        channel = AudioFormat.CHANNEL_OUT_MONO ;
       else
        channel = -1 ;
       
       if( bitsPerSample == 16 )
        format = AudioFormat.ENCODING_PCM_16BIT ;
       else if( bitsPerSample == 8 )
        format = AudioFormat.ENCODING_PCM_8BIT ;
       else
        format = -1 ;
       
       int bufferSizeInBytes = AudioTrack.getMinBufferSize(sampleRate,channel,format) *4 ;
       // make sure the buffer can hold at least 2 seconds of audio
       if( bufferSizeInBytes < channels*bitsPerSample*2/8*sampleRate )
        bufferSizeInBytes = channels*bitsPerSample*2/8*sampleRate ;
       Log.v(TAG+" write", "bufferSizeInBytes="+bufferSizeInBytes ) ;
       
       AudioTrack audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,channel, format, bufferSizeInBytes, AudioTrack.MODE_STREAM ) ; 
       if( audioTrack == null )
     return ;
       int bufReadLenInByte = bufferSizeInBytes ;
       byte[] bufRead = new byte[bufReadLenInByte] ;
    audioPlayContinue = true ;
    
    audioTrack.setPositionNotificationPeriod(200) ;
    
    markPos = sampleRate*markMill/1000 ;
    int rr = audioTrack.setNotificationMarkerPosition(markPos) ;
    Log.v(TAG+" Period currpos", "setNotificationMarkerPosition return "+rr) ;
    audioTrack.setPlaybackPositionUpdateListener( AudioRecThread.this ) ;
    //int rr = audioTrack.setPlaybackHeadPosition(8000*5);
    //Log.v(TAG+" Period", "setPlaybackHeadPosition return "+rr) ;
    startPlayMilli = System.currentTimeMillis() ;
    Log.v(TAG+" Period currpos", "startPlayMilli= "+startPlayMilli) ;
    playUpdatePeriodAmass = 0 ;
    rr = audioTrack.getPlaybackHeadPosition();
    Log.v(TAG+" Period", "getPlaybackHeadPosition return "+rr) ;
    
       while( audioPlayContinue )
        {
         //Log.v(TAG+"-ENC","In run AudioRecThread.encodeContinue="+audioRecContinue ) ;
         try
         {
           int read = fi.read ( bufRead, 0 , bufReadLenInByte)  ;
           if( read > 0  ) {
           Log.v(TAG+" write", "before write to AudioTrack" )  ;
           read = audioTrack.write((byte[]) bufRead , 0, read ) ;
           Log.v(TAG+" write","after write to AudioTrack, write " + read + " bytes" )  ;
           if( datalength == 0 ) {
            try {
             Log.v(TAG+" Period currpos", "startPlay : "+System.currentTimeMillis()) ;
             audioTrack.play()  ;
             bufReadLenInByte /= 10 ;
            } catch( IllegalStateException  e ) {

            }     
           }
           datalength += read ;
          }else {
           break;
          }          
         }
         catch (IOException e)
         {
          
         }
     
         long cu = System.currentTimeMillis() ;
         long cu1 = cu-startPlayMilli ;
         long cu2 = cu - timeMarkAttendModified ;
      rr = audioTrack.getPlaybackHeadPosition();
      Log.v(TAG+" Period currpos", "curr: "+cu1+ " modified mill:"+ cu2 + " pos:" + rr*1000/8000 ) ;
         

        }
        try {
         fi.close() ;
        }
        catch (IOException e)
        {
          
        };
    try {
     Log.v (TAG+" Period" , "dataLength="+datalength );
     audioTrack.flush();
     for( int i = 50 ; i >0 ; i -- ) {
         long cu = System.currentTimeMillis() ;
         long cu1 = cu-startPlayMilli ;
         long cu2 = cu - timeMarkAttendModified ;
      rr = audioTrack.getPlaybackHeadPosition();
      Log.v(TAG+" Period currpos", "curr: "+cu1+ " modified mill:"+ cu2 + " pos:" + rr*1000/8000 ) ;
      
      Thread.sleep(100) ;
     }
     audioTrack.stop()  ;
    } catch( IllegalStateException  e ) {

    }catch ( InterruptedException e) {
     
    }
    audioTrack.release() ;

        wl.release() ;
        handler.sendEmptyMessage( MSG_AUDIO_PLAY_FINISH ) ;
        Log.v(TAG,"AvcThread Finish encoding " ) ;
   }
  } ;  
  startLowPriorityNewThread(audioRecRun);  
 }

 public void stopAudioPlay() {
  audioPlayContinue = false ;
  if( thread != null  ) {
   try {
      thread.join() ;
   }catch( InterruptedException e  ) {
    
   }  
  }

  thread = null; 
  //Log.v(TAG,"in stopAvcEnc AvcThread.encodeContinue="+encodeContinue ) ;
 }

 
 private void startLowPriorityNewThread(Runnable run) {
  thread = new Thread( run ) ;
  int pr = thread.getPriority() ;
  Log.v( TAG, "Original Thread priority is " + pr ) ;
  thread.setPriority( pr - 1 ) ;
  pr = thread.getPriority() ;
  Log.v( TAG, "new Thread priority is " + pr ) ;
  thread.start();  
 }
 
 
 
}
