Android Audio Visualization with Visualizer

Demo

(demo screenshot of the visualizer)

Visualizer

From the official documentation:

The Visualizer class enables applications to retrieve part of the currently playing audio for visualization purposes. It is not an audio recording interface and only returns partial and low-quality audio content. However, to protect the privacy of certain audio data (e.g. voice mail), the use of the visualizer requires the permission android.permission.RECORD_AUDIO.

The audio session ID passed to the constructor indicates which audio content should be visualized:

  • If the session is 0, the audio output mix is visualized
  • If the session is not 0, the audio from a particular MediaPlayer or AudioTrack using this audio session is visualized

Two types of representation of audio content can be captured:

  • Waveform data: consecutive 8-bit (unsigned) mono samples, by using the getWaveForm(byte[]) method
  • Frequency data: 8-bit magnitude FFT, by using the getFft(byte[]) method

The length of the capture can be retrieved or specified by calling getCaptureSize() and setCaptureSize(int) respectively. The capture size must be a power of 2 in the range returned by getCaptureSizeRange().

In addition to the polling capture mode described above with the getWaveForm(byte[]) and getFft(byte[]) methods, a callback mode is also available by installing a listener with setDataCaptureListener(android.media.audiofx.Visualizer.OnDataCaptureListener, int, boolean, boolean). The rate at which the listener capture method is called, as well as the type of data returned, is specified there.

Before capturing data, the Visualizer must be enabled by calling setEnabled(boolean). When data capture is no longer needed, the Visualizer should be disabled.

It is good practice to call release() when the Visualizer is no longer used, to free up the native resources associated with the Visualizer instance.

Creating a Visualizer on the output mix (audio session 0) requires the permission Manifest.permission.MODIFY_AUDIO_SETTINGS.

The Visualizer class can also be used to perform measurements on the audio being played back. The measurements to perform are defined by setting a mask of the requested measurement modes with setMeasurementMode(int). Supported values are MEASUREMENT_MODE_NONE to cancel any measurement, and MEASUREMENT_MODE_PEAK_RMS for peak and RMS monitoring. Measurements can be retrieved through getMeasurementPeakRms(android.media.audiofx.Visualizer.MeasurementPeakRms).
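
The measurement API is not used in the demo below, but a minimal sketch of peak/RMS monitoring might look like this (assuming audioSessionId comes from a playing MediaPlayer; the log tag is arbitrary):

    Visualizer v = new Visualizer(audioSessionId);
    v.setMeasurementMode(Visualizer.MEASUREMENT_MODE_PEAK_RMS);
    v.setEnabled(true);

    Visualizer.MeasurementPeakRms measurement = new Visualizer.MeasurementPeakRms();
    if (v.getMeasurementPeakRms(measurement) == Visualizer.SUCCESS) {
        // mPeak and mRms are reported in millibels (1/100 dB) relative to full scale
        Log.d("VisualizerDemo", "peak=" + measurement.mPeak + " mB, rms=" + measurement.mRms + " mB");
    }
    v.setEnabled(false);
    v.release();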

Implementation Overview

  1. Play the audio with MediaPlayer.
  2. Create a Visualizer object; its constructor takes an audioSessionId, obtained via MediaPlayer.getAudioSessionId().
  3. As described in the official documentation above, register a callback with setDataCaptureListener to capture waveform or frequency (FFT) data.
  4. Iterate over the captured data and draw the graphics (a compact sketch of the whole flow is shown below).
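
A compact sketch of this flow; here listener stands for the OnDataCaptureListener built in step 5, and R.raw.daoxiang is the demo audio resource used throughout:

    MediaPlayer player = MediaPlayer.create(context, R.raw.daoxiang);
    Visualizer visualizer = new Visualizer(player.getAudioSessionId());
    visualizer.setCaptureSize(Visualizer.getCaptureSizeRange()[1]);   // maximum capture size (1024)
    visualizer.setDataCaptureListener(listener,
            Visualizer.getMaxCaptureRate() / 2,                       // capture rate: half the maximum
            false,                                                    // waveform data not needed
            true);                                                    // capture FFT data
    visualizer.setEnabled(true);
    player.start();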

Code Walkthrough

  1. First, declare and request the required permissions

    <uses-permission android:name="android.permission.RECORD_AUDIO" />
    <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
    <uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
    
    RxPermissions rxPermissions = new RxPermissions(this);
    rxPermissions.requestEach(Manifest.permission.RECORD_AUDIO,
            Manifest.permission.WRITE_EXTERNAL_STORAGE, Manifest.permission.READ_EXTERNAL_STORAGE)
            .subscribe(new Consumer<Permission>() {
                @Override
                public void accept(Permission permission) throws Exception {
                    if (permission.granted) {
                        Log.d(TAG, "accept: true");
                    } else if (permission.shouldShowRequestPermissionRationale) {
                        finish(); // denied once, exit
                    } else {
                        finish(); // denied with "don't ask again", exit
                    }
                }
            });
    

    RxPermissions dependency:

    implementation 'com.tbruyelle.rxpermissions2:rxpermissions:0.9.5'
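
    If you prefer not to add the RxPermissions dependency, a plain-framework sketch (androidx ContextCompat/ActivityCompat, called from an Activity; the request code is arbitrary) would be roughly:

    if (ContextCompat.checkSelfPermission(this, Manifest.permission.RECORD_AUDIO)
            != PackageManager.PERMISSION_GRANTED) {
        // The result is delivered to onRequestPermissionsResult(); 1 is an arbitrary request code.
        ActivityCompat.requestPermissions(this,
                new String[]{Manifest.permission.RECORD_AUDIO}, 1);
    }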
    
  2. Play the audio file with MediaPlayer

    mMediaPlayer = MediaPlayer.create(this, R.raw.daoxiang);
    mMediaPlayer.setOnErrorListener(null);
    mMediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
        @Override
        public void onPrepared(MediaPlayer mediaPlayer) {
            mediaPlayer.setLooping(true); // loop playback
        }
    });
    mMediaPlayer.start();
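
    Note that MediaPlayer.create() already prepares the player, so an OnPreparedListener attached afterwards may never be invoked. If the Visualizer setup is done inside onPrepared (as in the full example later), a sketch of the setDataSource()/prepareAsync() flow, assuming the same R.raw.daoxiang resource, is:

    try {
        mMediaPlayer = new MediaPlayer();
        AssetFileDescriptor afd = getResources().openRawResourceFd(R.raw.daoxiang);
        mMediaPlayer.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
        afd.close();
        mMediaPlayer.setOnPreparedListener(mp -> {
            mp.setLooping(true); // loop playback
            mp.start();          // start only once the player is actually prepared
            // create and enable the Visualizer here (steps 3-6 below)
        });
        mMediaPlayer.prepareAsync();
    } catch (IOException e) {
        Log.e(TAG, "failed to open audio resource", e);
    }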
    
  3. Get the audioSessionId

    int audioSessionId = mediaPlayer.getAudioSessionId();
    
  4. Create the Visualizer object

    visualizer = new Visualizer(audioSessionId);
    // After creating the Visualizer instance, set the capture size. The valid range is
    // Visualizer.getCaptureSizeRange()[0] ~ Visualizer.getCaptureSizeRange()[1]; here the maximum is used:
    visualizer.setCaptureSize(Visualizer.getCaptureSizeRange()[1]);
    
  5. Register a data capture callback on the Visualizer via setDataCaptureListener

    visualizer.setDataCaptureListener(new Visualizer.OnDataCaptureListener() {
        @Override
        public void onWaveFormDataCapture(Visualizer visualizer, byte[] bytes, int samplingRate) {
        }
    
        @Override
        public void onFftDataCapture(Visualizer visualizer, byte[] fft, int samplingRate) {
            float[] model = new float[fft.length / 2 + 1];
            // fft[0] is the real part of the DC bin and fft[1] the real part of the
            // Nyquist bin; the byte values may be negative, so take absolute values.
            model[0] = Math.abs(fft[0]);
            model[fft.length / 2] = Math.abs(fft[1]);
            int j = 1;

            // The remaining bins are (real, imaginary) pairs; take their magnitude.
            for (int i = 2; i < fft.length; i += 2) {
                model[j] = (float) Math.hypot(fft[i], fft[i + 1]);
                j++;
            }
            // model is the final data used for drawing
        }
    }, Visualizer.getMaxCaptureRate() / 2, false, true);
    

    The parameters of setDataCaptureListener are:

    listener: the callback object
    rate: the capture rate, in the range 0 ~ Visualizer.getMaxCaptureRate(); here half the maximum is used.
    waveform: whether to capture waveform data
    fft: whether to capture FFT (fast Fourier transform) data

    The two callback methods of OnDataCaptureListener are:

    onWaveFormDataCapture: waveform data callback
    onFftDataCapture: FFT data callback, i.e. frequency-domain data

    Here we use the FFT data for drawing. The byte array delivered in onFftDataCapture is the raw FFT output, but it still needs some processing:

    Since the capture size was set to Visualizer.getCaptureSizeRange()[1], i.e. 1024 samples, each batch of 1024 real samples is run through the FFT, producing 1024 complex points. Because of symmetry the last 512 points mirror the first 512, so only the first 513 points (including bin 0) are kept.

    Bin 0 (DC) and bin 512 (Nyquist) are purely real; the 511 bins in between are complex.

    onFftDataCapture(Visualizer visualizer, byte[] bytes, int samplingRate)
    

    The FFT data is delivered as bytes in a byte[1024]. It contains 1 + 1 + (1024 - 2)/2 = 513 usable values: the DC bin and the bin at n/2 each occupy one byte, while every other bin occupies two bytes (real part + imaginary part).


    The resulting frequency range is 0 ~ sampleRate / 2, i.e. 0 ~ 22.05 kHz.

    That is, the 513 bins are spread over [0 Hz, 22.05 kHz].

    Adjacent bins are spaced sampleRate / 1024 = 44,100 / 1024 ≈ 43.07 Hz apart. That spacing is the frequency resolution; differences smaller than one bin cannot be resolved.

    Sampling rate: the number of audio samples captured per second.

    frequencyEach = samplingRate / visualizer.getCaptureSize();  // ≈ 43066 mHz per bin; samplingRate = 44,100,000 mHz, getCaptureSize() = 1024
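
    To map a bin index to its center frequency, a small sketch (samplingRate is the milliHertz value delivered to onFftDataCapture, captureSize = visualizer.getCaptureSize(), and i is the bin index 0…512):

    // bin spacing = sampleRate / captureSize; samplingRate arrives in mHz, hence the / 1000
    float binHz = (samplingRate / 1000f) * i / captureSize;   // bin 1 ≈ 43.07 Hz for 44.1 kHz / 1024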
    
    float[] model = new float[fft.length / 2 + 1];
    // The returned byte values may be negative, so take absolute values.
    // fft[0] is the real part of the DC bin, fft[1] the real part of the Nyquist bin.
    model[0] = Math.abs(fft[0]);
    model[fft.length / 2] = Math.abs(fft[1]);
    int j = 1;

    // The remaining bins are (real, imaginary) pairs; take their magnitude.
    for (int i = 2; i < fft.length; i += 2) {
        model[j] = (float) Math.hypot(fft[i], fft[i + 1]);
        j++;
    }
    // model is the final data used for drawing
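
    Optionally, the linear magnitudes can be compressed to a decibel-like scale before drawing, which tends to even out the bars; a rough sketch (the + 1 avoids log of zero, the factor 20 is the usual amplitude-to-dB scale):

    for (int k = 0; k < model.length; k++) {
        // 20 * log10(magnitude + 1): zero stays zero, large magnitudes are compressed
        model[k] = (float) (20 * Math.log10(model[k] + 1));
    }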
    
  6. Enable the Visualizer

    visualizer.setEnabled(true);
    
  7. Draw the visualization:

    public class VisualizeView extends View {
    
        private static final String TAG = "SingleVisualizeView";
    
        /**
         * number of spectrum bars
         */
        protected int mSpectrumCount = 60;
        /**
         * width of each spectrum bar
         */
        protected float mStrokeWidth;
        /**
         * color used to draw the spectrum
         */
        protected int mColor;
        /**
         * audio magnitudes (FFT data transformed by hypot)
         */
        protected float[] mRawAudioBytes;
        /**
         * spacing between adjacent spectrum bars
         */
        protected float mItemMargin = 12;
    
        protected float mSpectrumRatio = 2;
    
        protected RectF mRect;
        protected Paint mPaint;
        protected Path mPath;
        protected float centerX, centerY;
        private int mode;
        public static final int SINGLE = 0;
        public static final int CIRCLE = 1;
        public static final int NET = 2;
        public static final int REFLECT = 3;
        public static final int WAVE = 4;
        public static final int GRAIN = 5;
        float radius = 150;
    
        public VisualizeView(Context context) {
            super(context);
            init();
        }
    
        public VisualizeView(Context context, @Nullable AttributeSet attrs) {
            super(context, attrs);
            init();
        }
    
        protected void init() {
            mStrokeWidth = 5;
    
            mPaint = new Paint();
            mPaint.setStrokeWidth(mStrokeWidth);
            mPaint.setColor(getResources().getColor(R.color.black));
            mPaint.setStrokeCap(Paint.Cap.ROUND);
            mPaint.setAntiAlias(true);
            mPaint.setMaskFilter(new BlurMaskFilter(5, BlurMaskFilter.Blur.SOLID));
    
            mRect = new RectF();
            mPath = new Path();
        }
    
        @Override
        protected void onMeasure(int widthMeasureSpec, int heightMeasureSpec) {
            super.onMeasure(widthMeasureSpec, heightMeasureSpec);
            int finallyWidth;
            int finallyHeight;
            int wSpecMode = MeasureSpec.getMode(widthMeasureSpec);
            int wSpecSize = MeasureSpec.getSize(widthMeasureSpec);
            int hSpecMode = MeasureSpec.getMode(heightMeasureSpec);
            int hSpecSize = MeasureSpec.getSize(heightMeasureSpec);
            if (wSpecMode == MeasureSpec.EXACTLY) {
                finallyWidth = wSpecSize;
            } else {
                finallyWidth = 500;
            }
    
            if (hSpecMode == MeasureSpec.EXACTLY) {
                finallyHeight = hSpecSize;
            } else {
                finallyHeight = 500;
            }
    
            setMeasuredDimension(finallyWidth, finallyHeight);
        }
    
        @Override
        protected void onLayout(boolean changed, int left, int top, int right, int bottom) {
            super.onLayout(changed, left, top, right, bottom);
            mRect.set(0, 0, getWidth(), getHeight() - 50);
            centerX = mRect.width() / 2;
            centerY = mRect.height() / 2;
        }
    
        @Override
        protected void onDraw(Canvas canvas) {
            super.onDraw(canvas);
            if (mRawAudioBytes == null) {
                Log.d(TAG, "onDraw: ");
                return;
            }
            drawChild(canvas);
        }
    
        protected void drawChild(Canvas canvas) {
            mStrokeWidth = (mRect.width() - (mSpectrumCount - 1) * mItemMargin) / mSpectrumCount * 1.0f;
            mPaint.setStrokeWidth(mStrokeWidth);
            mPaint.setStyle(Paint.Style.FILL);
    
            switch (mode) {
                case SINGLE:
                    for (int i = 0; i < mSpectrumCount; i++) {
                        canvas.drawLine(mRect.width() * i / mSpectrumCount, mRect.height() / 2,
                            mRect.width() * i / mSpectrumCount, 2 + mRect.height() / 2 - mRawAudioBytes[i], mPaint);
                    }
                    break;
                case CIRCLE:
                    mStrokeWidth = (float) ((Math.PI * 2 * radius - (mSpectrumCount - 1) * mItemMargin) / mSpectrumCount * 1.0f);
                    mPaint.setStyle(Paint.Style.STROKE);
                    mPaint.setStrokeWidth(2);
                    canvas.drawCircle(centerX, centerY, radius, mPaint);
                    mPaint.setStrokeWidth(mStrokeWidth);
                    mPaint.setStyle(Paint.Style.FILL);
                    mPath.moveTo(0, centerY);
                    for (int i = 0; i < mSpectrumCount; i++) {
                        double angle = (360d / mSpectrumCount) * (i + 1);
                        double startX = centerX + (radius + mStrokeWidth / 2) * Math.sin(Math.toRadians(angle));
                        double startY = centerY + (radius + mStrokeWidth / 2) * Math.cos(Math.toRadians(angle));
                        double stopX = centerX + (radius + mStrokeWidth / 2 + mSpectrumRatio * mRawAudioBytes[i]) * Math.sin(Math.toRadians(angle));
                        double stopY = centerY + (radius + mStrokeWidth / 2 + mSpectrumRatio * mRawAudioBytes[i]) * Math.cos(Math.toRadians(angle));
                        canvas.drawLine((float) startX, (float) startY, (float) stopX, (float) stopY, mPaint);
                    }
                    break;
                case NET:
                    mStrokeWidth = (float) ((Math.PI * 2 * radius - (mSpectrumCount - 1) * mItemMargin) / mSpectrumCount * 1.0f);
                    mPaint.setStyle(Paint.Style.STROKE);
                    mPaint.setStrokeWidth(2);
                    canvas.drawCircle(centerX, centerY, radius, mPaint);
    
                    mPaint.setStrokeWidth(mStrokeWidth);
                    mPaint.setStyle(Paint.Style.FILL);
                    mPath.moveTo(0, centerY);
                    for (int i = 0; i < mSpectrumCount; i++) {
                        double angle = (360d / mSpectrumCount) * (i + 1);
                        double startX = centerX + (radius + mStrokeWidth / 2) * Math.sin(Math.toRadians(angle));
                        double startY = centerY + (radius + mStrokeWidth / 2) * Math.cos(Math.toRadians(angle));
                        double stopX = centerX + (radius + mStrokeWidth / 2 + mSpectrumRatio * mRawAudioBytes[i]) * Math.sin(Math.toRadians(angle));
                        double stopY = centerY + (radius + mStrokeWidth / 2 + mSpectrumRatio * mRawAudioBytes[i]) * Math.cos(Math.toRadians(angle));
                        canvas.drawLine((float) startX, (float) startY, (float) stopX, (float) stopY, mPaint);
                        if (i == 0) {
                            mPath.moveTo((float) startX, (float) startY);
                        }
                        mPath.lineTo((float) stopX, (float) stopY);
                    }
                    mPaint.setStyle(Paint.Style.STROKE);
                    canvas.drawPath(mPath, mPaint);
                    mPath.reset();
                    break;
                case REFLECT:
                    mStrokeWidth = (mRect.width() - (mSpectrumCount - 1) * mItemMargin) / mSpectrumCount * 1.0f;
                    mPaint.setStrokeWidth(mStrokeWidth);
                    mPaint.setStyle(Paint.Style.FILL);
                    for (int i = 0; i < mSpectrumCount; i++) {
                        canvas.drawLine(mRect.width() * i / mSpectrumCount, mRect.height() / 2, mRect.width() * i / mSpectrumCount, 2 + mRect.height() / 2 - mSpectrumRatio * mRawAudioBytes[i], mPaint);
                        canvas.drawLine(mRect.width() * i / mSpectrumCount, mRect.height() / 2, mRect.width() * i / mSpectrumCount, 2 + mRect.height() / 2 + mSpectrumRatio * mRawAudioBytes[i], mPaint);
                    }
                    break;
                case WAVE:
                    mStrokeWidth = (mRect.width() - (mSpectrumCount - 1) * mItemMargin) / mSpectrumCount * 1.0f;
                    mPaint.setStrokeWidth(mStrokeWidth);
                    mPaint.setStyle(Paint.Style.FILL);
                    mPath.moveTo(0, centerY);
    
                    for (int i = 0; i < mSpectrumCount; i++) {
                        mPath.lineTo(mRect.width() * i / mSpectrumCount, 2 + mRect.height() / 2 + mRawAudioBytes[i]);
                    }
                    mPath.lineTo(mRect.width(), centerY);
                    mPath.close();
                    canvas.drawPath(mPath, mPaint);
                    mPath.reset();
                    break;
                case GRAIN:
                    mStrokeWidth = (mRect.width() - (mSpectrumCount - 1) * mItemMargin) / mSpectrumCount * 1.0f;
                    mPaint.setStrokeWidth(mStrokeWidth);
                    mPaint.setStyle(Paint.Style.FILL);
                    for (int i = 0; i < mSpectrumCount; i++) {
                        canvas.drawPoint(mRect.width() * i / mSpectrumCount, 2 + mRect.height() / 2 - mRawAudioBytes[i], mPaint);
                        canvas.drawPoint(mRect.width() * i / mSpectrumCount, mRect.height() / 4 + 2 + (mRect.height() / 2 - mRawAudioBytes[i]) / 2, mPaint);
                    }
                    break;
                default:
                    break;
            }
    
        }
    
        public void setMode(int mode) {
            this.mode = mode;
            if (mRawAudioBytes != null) {
                invalidate();
            }
        }
    
        public void setData(float[] parseData) {
            mRawAudioBytes = parseData;
            invalidate();
        }
    }
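
    Note that the view only reads the first mSpectrumCount (60) entries of the array passed to setData(), i.e. the lowest bins. One option, sketched below, is to average the 513 magnitudes down to mSpectrumCount bands before calling setData(), so the bars cover the whole spectrum:

    float[] bands = new float[60];                 // one value per spectrum bar (mSpectrumCount)
    int binsPerBand = model.length / bands.length; // 513 / 60 = 8 bins averaged into each band
    for (int b = 0; b < bands.length; b++) {
        float sum = 0;
        for (int k = 0; k < binsPerBand; k++) {
            sum += model[b * binsPerBand + k];
        }
        bands[b] = sum / binsPerBand;
    }
    mBinding.visualizerView.setData(bands);        // instead of setData(model)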
    
  8. Release resources when exiting:

    	@Override
        protected void onDestroy() {
            super.onDestroy();
            if (mMediaPlayer != null) {
                mMediaPlayer.stop();
                mMediaPlayer.reset();
                mMediaPlayer.release();
                mMediaPlayer = null;
            }
            if (visualizer != null) {
                visualizer.setEnabled(false);
                visualizer.release();
            }
        }
    

Tip: using a Spinner

  1. Create an arrays resource XML file under the values directory and configure it:

    <?xml version="1.0" encoding="utf-8"?>
    <resources>
        <string-array name="view_type">
            <item>SINGLE</item>
            <item>CIRCLE</item>
            <item>NET</item>
            <item>REFLECT</item>
            <item>WAVE</item>
            <item>GRAIN</item>
        </string-array>
    </resources>
    
  2. Use the Spinner:

            <Spinner
                android:id="@+id/spinner_view"
                android:layout_width="200px"
                android:layout_height="wrap_content"
                android:entries="@array/view_type"
                app:layout_constraintEnd_toEndOf="parent"
                app:layout_constraintTop_toTopOf="parent" />
    
    mBinding.spinnerView.setOnItemSelectedListener(new AdapterView.OnItemSelectedListener() {
                @Override
                public void onItemSelected(AdapterView<?> parent, View view, int position, long id) {
                    mBinding.visualizerView.setMode(position);
                }
    
                @Override
                public void onNothingSelected(AdapterView<?> parent) {
    
                }
            });
    

Full Code

  1. MainActivity

    public class MainActivity extends AppCompatActivity {
    
        private static final String TAG = "MainActivity";
        Visualizer visualizer;
        int mCount = 60;
        ActivityMainBinding mBinding;
        private MediaPlayer mMediaPlayer;
    
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            mBinding = DataBindingUtil.setContentView(this, R.layout.activity_main);
            RxPermissions rxPermissions = new RxPermissions(this);
            rxPermissions.requestEach(Manifest.permission.RECORD_AUDIO,
                Manifest.permission.WRITE_EXTERNAL_STORAGE, Manifest.permission.READ_EXTERNAL_STORAGE)
                .subscribe(new Consumer<Permission>() {
                    @Override
                    public void accept(Permission permission) throws Exception {
                        if (permission.granted) {
                            Log.d(TAG, "accept: true");
                        } else if (permission.shouldShowRequestPermissionRationale) {
                            finish();
                        } else {
                            finish();
                        }
                    }
                });
            mMediaPlayer = MediaPlayer.create(this, R.raw.daoxiang);
            if (mMediaPlayer == null) {
                Log.d(TAG, "mediaPlayer is null");
                return;
            }
    
            mMediaPlayer.setOnErrorListener(null);
            mMediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
                @Override
                public void onPrepared(MediaPlayer mediaPlayer) {
                    mediaPlayer.setLooping(true); // loop playback
                    int audioSessionId = mediaPlayer.getAudioSessionId();
                    visualizer = new Visualizer(audioSessionId);
                    visualizer.setCaptureSize(Visualizer.getCaptureSizeRange()[1]);
                    visualizer.setDataCaptureListener(new Visualizer.OnDataCaptureListener() {
                        @Override
                        public void onWaveFormDataCapture(Visualizer visualizer, byte[] bytes, int samplingRate) {
                        }
    
                        @Override
                        public void onFftDataCapture(Visualizer visualizer, byte[] fft, int samplingRate) {
                            Log.d(TAG, "onFftDataCapture: fft " + fft.length);
                            float[] model = new float[fft.length / 2 + 1];
                            // fft[0] is the real part of the DC bin and fft[1] the real part of
                            // the Nyquist bin; byte values may be negative, so take abs().
                            model[0] = Math.abs(fft[0]);
                            model[fft.length / 2] = Math.abs(fft[1]);
                            int j = 1;

                            // The remaining bins are (real, imaginary) pairs; take their magnitude.
                            for (int i = 2; i < fft.length; i += 2) {
                                model[j] = (float) Math.hypot(fft[i], fft[i + 1]);
                                j++;
                            }
                            // model is the final data used for drawing
                            mBinding.visualizerView.setData(model);
                        }
                    }, Visualizer.getMaxCaptureRate() / 2, false, true);
                    visualizer.setEnabled(true);
                }
            });
            mMediaPlayer.start();
    
            mBinding.spinnerView.setOnItemSelectedListener(new AdapterView.OnItemSelectedListener() {
                @Override
                public void onItemSelected(AdapterView<?> parent, View view, int position, long id) {
                    mBinding.visualizerView.setMode(position);
                }
    
                @Override
                public void onNothingSelected(AdapterView<?> parent) {
    
                }
            });
        }
    
        @Override
        protected void onDestroy() {
            super.onDestroy();
            if (mMediaPlayer != null) {
                mMediaPlayer.stop();
                mMediaPlayer.reset();
                mMediaPlayer.release();
                mMediaPlayer = null;
            }
            if (visualizer != null) {
                visualizer.setEnabled(false);
                visualizer.release();
            }
        }
    }
    
  2. Layout file

    <?xml version="1.0" encoding="utf-8"?>
    <layout xmlns:android="http://schemas.android.com/apk/res/android"
        xmlns:app="http://schemas.android.com/apk/res-auto"
        xmlns:tools="http://schemas.android.com/tools">
    
        <data>
    
        </data>
    
        <androidx.constraintlayout.widget.ConstraintLayout
            android:layout_width="match_parent"
            android:layout_height="match_parent"
            tools:context=".MainActivity">
    
            <Spinner
                android:id="@+id/spinner_view"
                android:layout_width="200px"
                android:layout_height="wrap_content"
                android:entries="@array/view_type"
                app:layout_constraintEnd_toEndOf="parent"
                app:layout_constraintTop_toTopOf="parent" />
    
            <com.learn.visualizer.VisualizeView
                android:id="@+id/visualizer_view"
                android:layout_width="match_parent"
                android:layout_height="match_parent"
                app:layout_constraintBottom_toBottomOf="parent"
                app:layout_constraintLeft_toLeftOf="parent"
                app:layout_constraintRight_toRightOf="parent"
                app:layout_constraintTop_toTopOf="parent" />
    
        </androidx.constraintlayout.widget.ConstraintLayout>
    </layout>
    
  3. Custom View

    package com.learn.visualizer;
    
    import android.content.Context;
    import android.graphics.BlurMaskFilter;
    import android.graphics.Canvas;
    import android.graphics.Paint;
    import android.graphics.Path;
    import android.graphics.RectF;
    import android.util.AttributeSet;
    import android.util.Log;
    import android.view.View;
    
    import androidx.annotation.Nullable;
    
    public class VisualizeView extends View {
    
        private static final String TAG = "SingleVisualizeView";
    
        /**
         * number of spectrum bars
         */
        protected int mSpectrumCount = 60;
        /**
         * width of each spectrum bar
         */
        protected float mStrokeWidth;
        /**
         * color used to draw the spectrum
         */
        protected int mColor;
        /**
         * audio magnitudes (FFT data transformed by hypot)
         */
        protected float[] mRawAudioBytes;
        /**
         * spacing between adjacent spectrum bars
         */
        protected float mItemMargin = 12;
    
        protected float mSpectrumRatio = 2;
    
        protected RectF mRect;
        protected Paint mPaint;
        protected Path mPath;
        protected float centerX, centerY;
        private int mode;
        public static final int SINGLE = 0;
        public static final int CIRCLE = 1;
        public static final int NET = 2;
        public static final int REFLECT = 3;
        public static final int WAVE = 4;
        public static final int GRAIN = 5;
        float radius = 150;
    
        public VisualizeView(Context context) {
            super(context);
            init();
        }
    
        public VisualizeView(Context context, @Nullable AttributeSet attrs) {
            super(context, attrs);
            init();
        }
    
        protected void init() {
            mStrokeWidth = 5;
    
            mPaint = new Paint();
            mPaint.setStrokeWidth(mStrokeWidth);
            mPaint.setColor(getResources().getColor(R.color.black));
            mPaint.setStrokeCap(Paint.Cap.ROUND);
            mPaint.setAntiAlias(true);
            mPaint.setMaskFilter(new BlurMaskFilter(5, BlurMaskFilter.Blur.SOLID));
    
            mRect = new RectF();
            mPath = new Path();
        }
    
        @Override
        protected void onMeasure(int widthMeasureSpec, int heightMeasureSpec) {
            super.onMeasure(widthMeasureSpec, heightMeasureSpec);
            int finallyWidth;
            int finallyHeight;
            int wSpecMode = MeasureSpec.getMode(widthMeasureSpec);
            int wSpecSize = MeasureSpec.getSize(widthMeasureSpec);
            int hSpecMode = MeasureSpec.getMode(heightMeasureSpec);
            int hSpecSize = MeasureSpec.getSize(heightMeasureSpec);
            if (wSpecMode == MeasureSpec.EXACTLY) {
                finallyWidth = wSpecSize;
            } else {
                finallyWidth = 500;
            }
    
            if (hSpecMode == MeasureSpec.EXACTLY) {
                finallyHeight = hSpecSize;
            } else {
                finallyHeight = 500;
            }
    
            setMeasuredDimension(finallyWidth, finallyHeight);
        }
    
        @Override
        protected void onLayout(boolean changed, int left, int top, int right, int bottom) {
            super.onLayout(changed, left, top, right, bottom);
            mRect.set(0, 0, getWidth(), getHeight() - 50);
            centerX = mRect.width() / 2;
            centerY = mRect.height() / 2;
        }
    
        @Override
        protected void onDraw(Canvas canvas) {
            super.onDraw(canvas);
            if (mRawAudioBytes == null) {
                Log.d(TAG, "onDraw: ");
                return;
            }
            drawChild(canvas);
        }
    
        protected void drawChild(Canvas canvas) {
            mStrokeWidth = (mRect.width() - (mSpectrumCount - 1) * mItemMargin) / mSpectrumCount * 1.0f;
            mPaint.setStrokeWidth(mStrokeWidth);
            mPaint.setStyle(Paint.Style.FILL);
    
            switch (mode) {
                case SINGLE:
                    for (int i = 0; i < mSpectrumCount; i++) {
                        canvas.drawLine(mRect.width() * i / mSpectrumCount, mRect.height() / 2,
                            mRect.width() * i / mSpectrumCount, 2 + mRect.height() / 2 - mRawAudioBytes[i], mPaint);
                    }
                    break;
                case CIRCLE:
                    mStrokeWidth = (float) ((Math.PI * 2 * radius - (mSpectrumCount - 1) * mItemMargin) / mSpectrumCount * 1.0f);
                    mPaint.setStyle(Paint.Style.STROKE);
                    mPaint.setStrokeWidth(2);
                    canvas.drawCircle(centerX, centerY, radius, mPaint);
                    mPaint.setStrokeWidth(mStrokeWidth);
                    mPaint.setStyle(Paint.Style.FILL);
                    mPath.moveTo(0, centerY);
                    for (int i = 0; i < mSpectrumCount; i++) {
                        double angle = (360d / mSpectrumCount) * (i + 1);
                        double startX = centerX + (radius + mStrokeWidth / 2) * Math.sin(Math.toRadians(angle));
                        double startY = centerY + (radius + mStrokeWidth / 2) * Math.cos(Math.toRadians(angle));
                        double stopX = centerX + (radius + mStrokeWidth / 2 + mSpectrumRatio * mRawAudioBytes[i]) * Math.sin(Math.toRadians(angle));
                        double stopY = centerY + (radius + mStrokeWidth / 2 + mSpectrumRatio * mRawAudioBytes[i]) * Math.cos(Math.toRadians(angle));
                        canvas.drawLine((float) startX, (float) startY, (float) stopX, (float) stopY, mPaint);
                    }
                    break;
                case NET:
                    mStrokeWidth = (float) ((Math.PI * 2 * radius - (mSpectrumCount - 1) * mItemMargin) / mSpectrumCount * 1.0f);
                    mPaint.setStyle(Paint.Style.STROKE);
                    mPaint.setStrokeWidth(2);
                    canvas.drawCircle(centerX, centerY, radius, mPaint);
    
                    mPaint.setStrokeWidth(mStrokeWidth);
                    mPaint.setStyle(Paint.Style.FILL);
                    mPath.moveTo(0, centerY);
                    for (int i = 0; i < mSpectrumCount; i++) {
                        double angle = (360d / mSpectrumCount) * (i + 1);
                        double startX = centerX + (radius + mStrokeWidth / 2) * Math.sin(Math.toRadians(angle));
                        double startY = centerY + (radius + mStrokeWidth / 2) * Math.cos(Math.toRadians(angle));
                        double stopX = centerX + (radius + mStrokeWidth / 2 + mSpectrumRatio * mRawAudioBytes[i]) * Math.sin(Math.toRadians(angle));
                        double stopY = centerY + (radius + mStrokeWidth / 2 + mSpectrumRatio * mRawAudioBytes[i]) * Math.cos(Math.toRadians(angle));
                        canvas.drawLine((float) startX, (float) startY, (float) stopX, (float) stopY, mPaint);
                        if (i == 0) {
                            mPath.moveTo((float) startX, (float) startY);
                        }
                        mPath.lineTo((float) stopX, (float) stopY);
                    }
                    mPaint.setStyle(Paint.Style.STROKE);
                    canvas.drawPath(mPath, mPaint);
                    mPath.reset();
                    break;
                case REFLECT:
                    mStrokeWidth = (mRect.width() - (mSpectrumCount - 1) * mItemMargin) / mSpectrumCount * 1.0f;
                    mPaint.setStrokeWidth(mStrokeWidth);
                    mPaint.setStyle(Paint.Style.FILL);
                    for (int i = 0; i < mSpectrumCount; i++) {
                        canvas.drawLine(mRect.width() * i / mSpectrumCount, mRect.height() / 2, mRect.width() * i / mSpectrumCount, 2 + mRect.height() / 2 - mSpectrumRatio * mRawAudioBytes[i], mPaint);
                        canvas.drawLine(mRect.width() * i / mSpectrumCount, mRect.height() / 2, mRect.width() * i / mSpectrumCount, 2 + mRect.height() / 2 + mSpectrumRatio * mRawAudioBytes[i], mPaint);
                    }
                    break;
                case WAVE:
                    mStrokeWidth = (mRect.width() - (mSpectrumCount - 1) * mItemMargin) / mSpectrumCount * 1.0f;
                    mPaint.setStrokeWidth(mStrokeWidth);
                    mPaint.setStyle(Paint.Style.FILL);
                    mPath.moveTo(0, centerY);
    
                    for (int i = 0; i < mSpectrumCount; i++) {
                        mPath.lineTo(mRect.width() * i / mSpectrumCount, 2 + mRect.height() / 2 + mRawAudioBytes[i]);
                    }
                    mPath.lineTo(mRect.width(), centerY);
                    mPath.close();
                    canvas.drawPath(mPath, mPaint);
                    mPath.reset();
                    break;
                case GRAIN:
                    mStrokeWidth = (mRect.width() - (mSpectrumCount - 1) * mItemMargin) / mSpectrumCount * 1.0f;
                    mPaint.setStrokeWidth(mStrokeWidth);
                    mPaint.setStyle(Paint.Style.FILL);
                    for (int i = 0; i < mSpectrumCount; i++) {
                        canvas.drawPoint(mRect.width() * i / mSpectrumCount, 2 + mRect.height() / 2 - mRawAudioBytes[i], mPaint);
                        canvas.drawPoint(mRect.width() * i / mSpectrumCount, mRect.height() / 4 + 2 + (mRect.height() / 2 - mRawAudioBytes[i]) / 2, mPaint);
                    }
                    break;
                default:
                    break;
            }
    
        }
    
        public void setMode(int mode) {
            this.mode = mode;
            if (mRawAudioBytes != null) {
                invalidate();
            }
        }
    
        public void setData(float[] parseData) {
            mRawAudioBytes = parseData;
            invalidate();
        }
    }
    

References

  1. https://www.jianshu.com/p/c95bb166fb28
  2. https://blog.csdn.net/gkw421178132/article/details/71081628
  3. https://developer.android.google.cn/reference/android/media/audiofx/Visualizer#setDataCaptureListener(android.media.audiofx.Visualizer.OnDataCaptureListener,%20int,%20boolean,%20boolean)