On iOS, Core Audio provides a set of software interfaces for working with audio; it supports recording, playback, sound effects, format conversion, file and stream parsing, and more. A common scenario today is playing back an audio stream received over the network, which Core Audio supports via Audio Queue Services or OpenAL.
An audio queue can record or play audio. Its main jobs are:
·Connecting to the audio hardware
·Managing memory
·Using codecs for compressed audio formats, when needed
·Mediating recording and playback
Now let's focus on how an audio queue handles an audio stream:
The diagram looks complicated, but it boils down to two steps:
1. Feed audio data into a buffer.
2. Once a buffer has been played, refill it with new data.
Think of a transport fleet with three trucks (the three buffers): each truck is loaded with goods, drives to the destination, unloads, and returns to be loaded again. Working with an audio queue is mostly about these audio buffers: you pack data into a buffer (an AudioQueueBufferRef) in the agreed format, and once the buffer is enqueued the sound plays automatically.
With that picture in mind, the rest is much easier to follow.
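These two steps can be sketched as a tiny buffer pool in plain C. The names here (pool, enqueue_frame, on_buffer_played) are illustrative, not part of the Audio Queue API; the real player later in this post does the same bookkeeping with AudioQueueBufferRef and a used-flag array.

```c
#include <stdbool.h>
#include <stddef.h>
#include <string.h>

#define NUM_BUFFERS 3          /* the three "trucks" */
#define BUFFER_CAPACITY 2048   /* bytes each buffer can carry */

typedef struct {
    unsigned char data[BUFFER_CAPACITY];
    size_t size;
    bool in_use;               /* loaded and enqueued, not yet played back */
} AudioBuffer;

AudioBuffer pool[NUM_BUFFERS];

/* Step 1: grab an idle buffer, copy one frame into it, mark it enqueued. */
AudioBuffer* enqueue_frame(const unsigned char* frame, size_t len)
{
    if (len > BUFFER_CAPACITY)
        return NULL;                 /* frame too big for a buffer */
    for (int i = 0; i < NUM_BUFFERS; i++) {
        if (!pool[i].in_use) {
            memcpy(pool[i].data, frame, len);
            pool[i].size = len;
            pool[i].in_use = true;   /* the truck drives off loaded */
            return &pool[i];
        }
    }
    return NULL;                     /* all buffers busy: caller must wait */
}

/* Step 2: the playback callback hands a drained buffer back to the pool. */
void on_buffer_played(AudioBuffer* b)
{
    b->in_use = false;               /* the truck is back, ready to reload */
}
```

With three buffers, a fourth frame can only be enqueued after the playback callback has returned one of them to the pool, which is exactly why the real player below waits in a loop for a free buffer.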
A detailed introduction: http://blog.csdn.net/sqc3375177/article/details/38532207
Open-source audio-queue code (an audio/video player): https://github.com/durfu/DFURTSPPlayer
In this example, the app receives G.711 μ-law audio data from a third-party library (the TUTK platform), converts it to PCM, and plays it through an audio queue.
1. Format conversion: G.711 μ-law to PCM
/* Standard G.711 u-law constants (from Sun's reference g711.c). */
#define SIGN_BIT   (0x80) /* sign bit of a u-law byte */
#define QUANT_MASK (0x0F) /* quantization field mask */
#define SEG_MASK   (0x70) /* segment field mask */
#define SEG_SHIFT  (4)    /* left shift for the segment number */
#define BIAS       (0x84) /* bias added to the linear code */

/* Decode one u-law byte into a linear PCM sample. */
int ulaw2linear(unsigned char u_val)
{
    int t;
    /* Complement to obtain normal u-law value. */
    u_val = ~u_val;
    /*
     * Extract and bias the quantization bits. Then
     * shift up by the segment number and subtract out the bias.
     */
    t = ((u_val & QUANT_MASK) << 3) + BIAS;
    t <<= ((unsigned)u_val & SEG_MASK) >> SEG_SHIFT;
    return ((u_val & SIGN_BIT) ? (BIAS - t) : (t - BIAS));
}

/* Decode nBufferSize u-law bytes from pBuffer into 16-bit PCM in pRawData.
 * Returns the number of output bytes (two per input byte).
 * Reference: http://bbs.csdn.net/topics/360024000 */
int G711Decode(char* pRawData, const unsigned char* pBuffer, int nBufferSize)
{
    short* out_data = (short*)pRawData;
    for (int i = 0; i < nBufferSize; i++) {
        int v = ulaw2linear(pBuffer[i]);
        /* Clamp into the 16-bit range before storing. */
        out_data[i] = v < -32768 ? -32768 : v > 32767 ? 32767 : v;
    }
    return nBufferSize * 2;
}
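As a quick sanity check, the decoder can be exercised on a few well-known μ-law code points: 0xFF and 0x7F are the positive and negative zero codes, while 0x00 and 0x80 decode to the extreme samples ∓32124. The snippet below repeats the decoder (with the standard constants from Sun's reference g711.c) so it stands alone:

```c
/* Standard G.711 u-law constants (from Sun's reference g711.c). */
#define SIGN_BIT   (0x80)
#define QUANT_MASK (0x0F)
#define SEG_MASK   (0x70)
#define SEG_SHIFT  (4)
#define BIAS       (0x84)

/* Same decoder as above, repeated so this snippet stands alone. */
int ulaw2linear(unsigned char u_val)
{
    u_val = ~u_val;
    int t = ((u_val & QUANT_MASK) << 3) + BIAS;
    t <<= ((unsigned)u_val & SEG_MASK) >> SEG_SHIFT;
    return (u_val & SIGN_BIT) ? (BIAS - t) : (t - BIAS);
}

/* Landmark code points:
 *   ulaw2linear(0xFF) ==      0   (positive zero)
 *   ulaw2linear(0x7F) ==      0   (negative zero)
 *   ulaw2linear(0x00) == -32124   (most negative sample)
 *   ulaw2linear(0x80) == +32124   (most positive sample)
 */
```

Note the full range is ±32124 rather than ±32767, so every decoded value already fits in a 16-bit sample; the clamp in G711Decode is purely defensive.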
2. Audio queue playback: call - (void)play:(void *)pcmData length:(unsigned int)length with the data and its length
#import "PCMDataPlayer.h"

//How the audio queue runs:
//1. Create the queue with AudioQueueNewOutput, register the callback that fires when a buffer has been consumed, and allocate the audio queue buffers (AudioQueueAllocateBuffer).
//2. When a frame of audio arrives, find an idle buffer, copy the data into it, then enqueue it on the audio queue.

@implementation PCMDataPlayer

- (id)init
{
    self = [super init];
    if (self) {
        [self reset];
    }
    return self;
}

- (void)dealloc
{
    [self stop];
    sysnLock = nil;
    NSLog(@"PCMDataPlayer dealloc...");
}

static void AudioPlayerAQInputCallback(void* inUserData, AudioQueueRef outQ, AudioQueueBufferRef outQB)
{
    PCMDataPlayer* player = (__bridge PCMDataPlayer*)inUserData;
    [player playerCallback:outQB];
}

- (void)reset
{
    [self stop];
    sysnLock = [[NSLock alloc] init];
    ///Set up the audio format
    audioDescription.mSampleRate = 8000; //sample rate
    audioDescription.mFormatID = kAudioFormatLinearPCM;
    audioDescription.mFormatFlags = kLinearPCMFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
    audioDescription.mChannelsPerFrame = 1; ///mono
    audioDescription.mFramesPerPacket = 1; //one frame per packet
    audioDescription.mBitsPerChannel = 16; //16-bit samples
    audioDescription.mBytesPerFrame = (audioDescription.mBitsPerChannel / 8) * audioDescription.mChannelsPerFrame;
    audioDescription.mBytesPerPacket = audioDescription.mBytesPerFrame;
    //Passing NULL for the run loop makes the callbacks run on the queue's own internal thread
    AudioQueueNewOutput(&audioDescription, AudioPlayerAQInputCallback, (__bridge void*)self, NULL, NULL, 0, &audioQueue);
    //Initialize the audio buffers
    for (int i = 0; i < QUEUE_BUFFER_SIZE; i++) {
        //MIN_SIZE_PER_FRAME is each buffer's capacity; it must be at least as large as the biggest chunk ever written into a buffer
        int result = AudioQueueAllocateBuffer(audioQueue, MIN_SIZE_PER_FRAME, &audioQueueBuffers[i]);
        // NSLog(@"AudioQueueAllocateBuffer i = %d,result = %d", i, result);
        audioQueueUsed[i] = NO; //clear any stale flag from a previous queue
    }
    NSLog(@"PCMDataPlayer reset");
}

- (void)stop
{
    if (audioQueue != NULL) {
        AudioQueueStop(audioQueue, true);
        AudioQueueDispose(audioQueue, true); //also frees the queue's buffers; reset allocates fresh ones
    }
    audioQueue = NULL;
}

- (void)play:(void*)pcmData length:(unsigned int)length
{
    if (audioQueue == NULL || ![self checkBufferHasUsed]) {
        [self reset];
        AudioQueueStart(audioQueue, NULL);
    }
    [sysnLock lock];
    AudioQueueBufferRef audioQueueBuffer = NULL;
    while (true) {
        audioQueueBuffer = [self getNotUsedBuffer];
        if (audioQueueBuffer != NULL) {
            break;
        }
        usleep(1000); //all buffers are enqueued; wait for the callback to free one
    }
    //length must not exceed MIN_SIZE_PER_FRAME, the buffer's allocated capacity
    audioQueueBuffer->mAudioDataByteSize = length;
    memcpy(audioQueueBuffer->mAudioData, pcmData, length);
    AudioQueueEnqueueBuffer(audioQueue, audioQueueBuffer, 0, NULL);
    // NSLog(@"PCMDataPlayer play dataSize:%d", length);
    [sysnLock unlock];
}

- (BOOL)checkBufferHasUsed
{
    for (int i = 0; i < QUEUE_BUFFER_SIZE; i++) {
        if (YES == audioQueueUsed[i]) {
            return YES;
        }
    }
    // NSLog(@"PCMDataPlayer playback stalled............");
    return NO;
}

- (AudioQueueBufferRef)getNotUsedBuffer
{
    for (int i = 0; i < QUEUE_BUFFER_SIZE; i++) {
        if (NO == audioQueueUsed[i]) {
            audioQueueUsed[i] = YES;
            // NSLog(@"PCMDataPlayer play buffer index:%d", i);
            return audioQueueBuffers[i];
        }
    }
    return NULL;
}

- (void)playerCallback:(AudioQueueBufferRef)outQB
{
    for (int i = 0; i < QUEUE_BUFFER_SIZE; i++) {
        if (outQB == audioQueueBuffers[i]) { //the queue has finished playing this buffer, so mark it free for reuse
            audioQueueUsed[i] = NO;
        }
    }
}

@end
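Putting the two parts together, the per-frame path from the network to the player can be sketched in C. Everything here is illustrative: handle_ulaw_frame and MAX_ULAW_FRAME are made-up names, and in the app the decoded PCM would be handed to the player's play:length: method rather than returned to the caller. The decoder is repeated so the sketch is self-contained.

```c
#include <stddef.h>

/* Standard G.711 u-law constants (from Sun's reference g711.c). */
#define SIGN_BIT   (0x80)
#define QUANT_MASK (0x0F)
#define SEG_MASK   (0x70)
#define SEG_SHIFT  (4)
#define BIAS       (0x84)

/* Assumed cap on one frame: 320 bytes = 40 ms of 8 kHz u-law audio. */
#define MAX_ULAW_FRAME 320

static int ulaw2linear(unsigned char u_val)
{
    u_val = ~u_val;
    int t = ((u_val & QUANT_MASK) << 3) + BIAS;
    t <<= ((unsigned)u_val & SEG_MASK) >> SEG_SHIFT;
    return (u_val & SIGN_BIT) ? (BIAS - t) : (t - BIAS);
}

/* Decode one received u-law frame into pcm_out (must hold 2*len bytes).
 * Returns the PCM byte count, or 0 for a malformed frame. In the app,
 * the caller would then pass pcm_out to [player play:pcm_out length:n]. */
int handle_ulaw_frame(const unsigned char* ulaw, int len, char* pcm_out)
{
    if (len <= 0 || len > MAX_ULAW_FRAME)
        return 0;                          /* drop malformed frames */
    short* out = (short*)pcm_out;
    for (int i = 0; i < len; i++)
        out[i] = (short)ulaw2linear(ulaw[i]);
    return len * 2;
}
```

The size check matters because the player copies each frame into a fixed-capacity audio queue buffer, so a frame larger than MIN_SIZE_PER_FRAME would overflow it.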
Related code: the audio decoding and playback source.
This example targets the TUTK platform. Developers working on TUTK smart cameras are welcome to join the QQ group for discussion: 331753091