Capturing and saving camera images on Android

To save frames from the camera we need an ImageReader.

An ImageReader exposes a Surface; we attach this Surface to both the CaptureRequest and the CameraCaptureSession.

Creating the ImageReader

 public static @NonNull ImageReader newInstance(
            @IntRange(from = 1) int width,
            @IntRange(from = 1) int height,
            @Format             int format,
            @IntRange(from = 1) int maxImages)

The parameters needed to create an ImageReader:

width: the width of the images to read

height: the height of the images to read

format: the format of the images to read. Note in particular that ImageFormat.NV21 is not supported, as the source shows:

  protected ImageReader(int width, int height, int format, int maxImages, long usage) {
       
         .......
        if (format == ImageFormat.NV21) {
            throw new IllegalArgumentException(
                    "NV21 format is not supported");
        }
    .......
}

maxImages: the maximum number of images that can be acquired from the ImageReader at the same time. The more images you hold open, the more buffers are required, so keep this value small.

A camera supports many output resolutions, so we query the camera for a size it actually supports, which gives us the following code.

Querying the camera for a supported size:

CameraCharacteristics characteristics = cameraManager.getCameraCharacteristics(backCameraId);
            StreamConfigurationMap map = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
            Size largest = Collections.max(Arrays.asList(map.getOutputSizes(ImageFormat.YUV_420_888)), new CompareSizesByArea());

Creating the ImageReader:

 mImageReader = ImageReader.newInstance(largest.getWidth(), largest.getHeight(),
                    ImageFormat.YUV_420_888, 2);
            width = largest.getWidth();
            height = largest.getHeight();

Having created the ImageReader, we add its Surface to the CameraCaptureSession:

 OutputConfiguration outputConfiguration = new OutputConfiguration(binding.surfaceView.getHolder().getSurface());
            OutputConfiguration imageReaderOutputConfiguration = new OutputConfiguration(mImageReader.getSurface());
            List<OutputConfiguration> outputs = new ArrayList<>();
            outputs.add(outputConfiguration);
            outputs.add(imageReaderOutputConfiguration);
            SessionConfiguration sessionConfiguration
                    = new SessionConfiguration(SessionConfiguration.SESSION_REGULAR,
                    outputs,
                    mExecutorService,
                    new CameraCaptureSession.StateCallback() {
                        @Override
                        public void onConfigured(@NonNull CameraCaptureSession session) {
                            
                        }

                        @Override
                        public void onConfigureFailed(@NonNull CameraCaptureSession session) {

                        }
                    }
            );

            mCamera.createCaptureSession(sessionConfiguration);

Then we add the ImageReader's Surface to the CaptureRequest:

    private void record() {
        try {
            Log.e("CameraActivity", "record");
            i = 0;
            File file = new File(savePicPath);
            DirUtil.deleteDir(file);
            final CaptureRequest.Builder builder
                    = mCamera.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
            builder.addTarget(binding.surfaceView.getHolder().getSurface());
            builder.addTarget(mImageReader.getSurface());
            OutputConfiguration outputConfiguration = new OutputConfiguration(binding.surfaceView.getHolder().getSurface());
            OutputConfiguration imageReaderOutputConfiguration = new OutputConfiguration(mImageReader.getSurface());
            List<OutputConfiguration> outputs = new ArrayList<>();
            outputs.add(outputConfiguration);
            outputs.add(imageReaderOutputConfiguration);
            SessionConfiguration sessionConfiguration
                    = new SessionConfiguration(SessionConfiguration.SESSION_REGULAR,
                    outputs,
                    mExecutorService,
                    new CameraCaptureSession.StateCallback() {
                        @Override
                        public void onConfigured(@NonNull CameraCaptureSession session) {

                        }

                        @Override
                        public void onConfigureFailed(@NonNull CameraCaptureSession session) {

                        }
                    }
            );

            mCamera.createCaptureSession(sessionConfiguration);

        } catch (Exception e) {
            e.printStackTrace();
        }

    }

With the ImageReader bound to the CaptureRequest, once we call

mCameraRecordCaptureSession.setRepeatingRequest

we receive YUV data in the ImageReader's callback:

          mImageReader.setOnImageAvailableListener(new ImageReader.OnImageAvailableListener() {
                @Override
                public void onImageAvailable(ImageReader reader) {
                    
                }
            },subHandler);

In YUV420 the sample ratio is Y : U : V = 4 : 1 : 1.
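As a quick sanity check, the 4:1:1 ratio follows from the fact that every 2x2 block of pixels shares a single U and a single V sample. A standalone sketch (not part of the original code) of the expected plane sizes:

```java
public class Yuv420Sizes {
    // Each pixel has its own Y sample.
    static int ySize(int width, int height) {
        return width * height;
    }

    // Each 2x2 pixel block shares one U and one V sample.
    static int chromaSize(int width, int height) {
        return (width / 2) * (height / 2);
    }

    public static void main(String[] args) {
        int w = 640, h = 480;
        System.out.println(ySize(w, h));      // 307200
        System.out.println(chromaSize(w, h)); // 76800, so Y:U:V = 4:1:1
    }
}
```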

But when we actually grab the YUV buffers, we find something different:

Image image = mImageReader.acquireLatestImage();
Image.Plane[] planes = image.getPlanes();
/** Y */
ByteBuffer bufferY = planes[0].getBuffer();
/** U(Cb) */
ByteBuffer bufferU = planes[1].getBuffer();
/** V(Cr) */
ByteBuffer bufferV = planes[2].getBuffer();

the buffer sizes come out as bufferY : bufferU : bufferV = 4 : 2 : 2.

Why is that? The answer is in Image.Plane:

/**
         * <p>The distance between adjacent pixel samples, in bytes.</p>
         *
         * <p>This is the distance between two consecutive pixel values in a row
         * of pixels. It may be larger than the size of a single pixel to
         * account for interleaved image data or padded formats.
         * Note that pixel stride is undefined for some formats such as
         * {@link android.graphics.ImageFormat#RAW_PRIVATE RAW_PRIVATE},
         * and calling getPixelStride on images of these formats will
         * cause an UnsupportedOperationException being thrown.
         * For formats where pixel stride is well defined, the pixel stride
         * is always greater than 0.</p>
         */
        public abstract int getPixelStride();

getPixelStride() returns the distance, in bytes, between consecutive valid samples in the buffer.

Here it returns 2, which means the valid samples in bufferU and bufferV sit at offsets 0, 2, 4, 6, ...

So the YUV420 ratio Y : U : V = 4 : 1 : 1 holds after all.

To save the YUV data as a JPG we need YuvImage:

public YuvImage(byte[] yuv, int format, int width, int height, int[] strides) {
        if (format != ImageFormat.NV21 &&
                format != ImageFormat.YUY2) {
            throw new IllegalArgumentException(
                    "only support ImageFormat.NV21 " +
                    "and ImageFormat.YUY2 for now");
        }

The YuvImage constructor tells us it only supports NV21 and YUY2.

NV21 is a semi-planar storage format for YUV420: where YUV_420_888 delivers the data in three separate planes, NV21 stores everything in a single plane laid out as YYYYYYYYVUVU, that is, all of Y first, then V and U interleaved.

Since YuvImage only supports NV21 and YUY2, we must first convert the YUV_420_888 data to NV21, which leads to the conversion below.
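The packing step can be sketched on plain arrays. This is a simplified, hypothetical version that assumes tightly packed planes with pixelStride 1; the article's yuv420ToNV21 below instead handles the pixelStride == 2 buffers the camera actually delivers:

```java
public class Nv21Pack {
    // Pack planar YUV420 (separate Y, U, V arrays) into NV21:
    // all Y samples first, then V and U interleaved (VUVU...).
    static byte[] toNv21(byte[] y, byte[] u, byte[] v) {
        byte[] nv21 = new byte[y.length + u.length + v.length];
        System.arraycopy(y, 0, nv21, 0, y.length);
        for (int i = 0; i < u.length; i++) {
            nv21[y.length + 2 * i] = v[i];     // V comes first in NV21
            nv21[y.length + 2 * i + 1] = u[i]; // then U
        }
        return nv21;
    }

    public static void main(String[] args) {
        byte[] y = {1, 2, 3, 4}; // a 2x2 frame: 4 Y samples
        byte[] u = {50};         // one U sample
        byte[] v = {60};         // one V sample
        byte[] nv21 = toNv21(y, u, v);
        // Layout: Y Y Y Y V U
        System.out.println(java.util.Arrays.toString(nv21)); // [1, 2, 3, 4, 60, 50]
    }
}
```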

The complete code:

package com.yuanxuzhen.androidmedia.video;

import android.Manifest;
import android.content.pm.PackageManager;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.ImageFormat;
import android.graphics.Rect;
import android.graphics.YuvImage;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.CaptureFailure;
import android.hardware.camera2.CaptureRequest;
import android.hardware.camera2.CaptureResult;
import android.hardware.camera2.TotalCaptureResult;
import android.hardware.camera2.params.OutputConfiguration;
import android.hardware.camera2.params.SessionConfiguration;
import android.hardware.camera2.params.StreamConfigurationMap;
import android.media.Image;
import android.media.ImageReader;
import android.os.Build;
import android.os.Bundle;
import android.os.Handler;
import android.os.HandlerThread;
import android.util.Log;
import android.util.Size;
import android.view.Surface;
import android.view.SurfaceHolder;
import android.view.View;
import android.widget.Toast;

import androidx.annotation.NonNull;
import androidx.annotation.Nullable;
import androidx.annotation.RequiresApi;
import androidx.appcompat.app.AppCompatActivity;
import androidx.core.app.ActivityCompat;

import com.yuanxuzhen.androidmedia.DirUtil;
import com.yuanxuzhen.androidmedia.databinding.ActivityCameraLayoutBinding;

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

import permissions.dispatcher.NeedsPermission;
import permissions.dispatcher.OnNeverAskAgain;
import permissions.dispatcher.OnPermissionDenied;
import permissions.dispatcher.RuntimePermissions;

@RuntimePermissions
public class CameraActivity extends AppCompatActivity {
    ActivityCameraLayoutBinding binding;
    ExecutorService mExecutorService;
    CameraManager cameraManager;
    CameraDevice mCamera;
    private String frontCameraId = "";
    private String backCameraId = "";
    CameraCaptureSession mCameraPreviewCaptureSession;
    CameraCaptureSession mCameraRecordCaptureSession;
    ImageReader mImageReader;
    HandlerThread subHandlerThread;
    Handler subHandler;
    HandlerThread sub1HandlerThread;

    Handler sub1Handler;

    int width, height;
    private String savePicPath = null;

    @Override
    protected void onCreate(@Nullable Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        savePicPath = DirUtil.getCacheDir() + File.separator + "save";
        cameraManager = (CameraManager) getSystemService(CAMERA_SERVICE);
        mExecutorService = Executors.newCachedThreadPool();
        binding = ActivityCameraLayoutBinding.inflate(getLayoutInflater());
        setContentView(binding.getRoot());
        subHandlerThread = new HandlerThread("sub");
        subHandlerThread.start();
        subHandler = new Handler(subHandlerThread.getLooper());

        sub1HandlerThread = new HandlerThread("sub1");
        sub1HandlerThread.start();
        sub1Handler = new Handler(sub1HandlerThread.getLooper());
        binding.startRecord.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                record();
            }
        });
        binding.stopRecord.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                preview();
            }
        });

        binding.surfaceView.getHolder().addCallback(new SurfaceHolder.Callback() {
            @Override
            public void surfaceCreated(@NonNull SurfaceHolder holder) {
                CameraActivityPermissionsDispatcher.startCameraWithPermissionCheck(CameraActivity.this);
            }

            @Override
            public void surfaceChanged(@NonNull SurfaceHolder holder, int format, int width, int height) {

            }

            @Override
            public void surfaceDestroyed(@NonNull SurfaceHolder holder) {

            }
        });

    }

    @NeedsPermission({Manifest.permission.RECORD_AUDIO, Manifest.permission.CAMERA})
    public void startCamera() {
        Log.e("CameraActivity", "startCamera");
        mExecutorService.execute(new Runnable() {
            @Override
            public void run() {
                try {
                    if (ActivityCompat.checkSelfPermission(CameraActivity.this, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
                        // TODO: Consider calling
                        //    ActivityCompat#requestPermissions
                        // here to request the missing permissions, and then overriding
                        //   public void onRequestPermissionsResult(int requestCode, String[] permissions,
                        //                                          int[] grantResults)
                        // to handle the case where the user grants the permission. See the documentation
                        // for ActivityCompat#requestPermissions for more details.
                        return;
                    }
                    String[] cameraIdArray = cameraManager.getCameraIdList();
                    for (String ele : cameraIdArray) {
                        CameraCharacteristics cameraCharacteristics = cameraManager.getCameraCharacteristics(ele);
                        if (cameraCharacteristics.get(CameraCharacteristics.LENS_FACING) == CameraCharacteristics.LENS_FACING_FRONT) {
                            frontCameraId = ele;
                        } else if (cameraCharacteristics.get(CameraCharacteristics.LENS_FACING) == CameraCharacteristics.LENS_FACING_BACK) {
                            backCameraId = ele;
                        }
                    }
                    calculateCameraParameters();

                    cameraManager.openCamera(backCameraId,
                            mExecutorService,
                            new CameraDevice.StateCallback() {
                                @Override
                                public void onOpened(@NonNull CameraDevice camera) {
                                    Log.e("CameraActivity", "onOpened");
                                    mCamera = camera;
                                    preview();
                                }

                                @Override
                                public void onDisconnected(@NonNull CameraDevice camera) {
                                    Log.e("CameraActivity", "onDisconnected");

                                }

                                @Override
                                public void onError(@NonNull CameraDevice camera, int error) {
                                    Log.e("CameraActivity", "onError");

                                }
                            }

                    );
                } catch (Exception e) {
                    e.printStackTrace();
                }

            }
        });

    }

    public void stopCamera() {
        Log.e("CameraActivity", "stopCamera");
        closeRecordiew();
        closePreview();
        if (mCamera != null) {
            mCamera.close();
            mCamera = null;
        }

    }

    @OnPermissionDenied(Manifest.permission.RECORD_AUDIO)
    public void onDeniedAudio() {
        Toast.makeText(this, "Audio recording permission denied", Toast.LENGTH_SHORT).show();
    }

    @OnNeverAskAgain(Manifest.permission.RECORD_AUDIO)
    public void onNeverAskAgainAudio() {
        Toast.makeText(this, "Audio recording permission set to never ask again", Toast.LENGTH_SHORT).show();
    }

    @OnPermissionDenied(Manifest.permission.CAMERA)
    public void onDeniedCamera() {
        Toast.makeText(this, "Camera permission denied", Toast.LENGTH_SHORT).show();
    }

    @OnNeverAskAgain(Manifest.permission.CAMERA)
    public void onNeverAskAgainCamera() {
        Toast.makeText(this, "Camera permission set to never ask again", Toast.LENGTH_SHORT).show();
    }

    @Override
    public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions,
                                           @NonNull int[] grantResults) {
        super.onRequestPermissionsResult(requestCode, permissions, grantResults);
        CameraActivityPermissionsDispatcher.onRequestPermissionsResult(this, requestCode, grantResults);
    }


    /**
     * Check if this device has a camera
     */
    private boolean checkCameraHardware() {
        if (getPackageManager().hasSystemFeature(PackageManager.FEATURE_CAMERA)) {
            // this device has a camera
            return true;
        } else {
            // no camera on this device
            return false;
        }
    }

    @Override
    protected void onDestroy() {
        stopCamera();
        super.onDestroy();
    }

    @RequiresApi(api = Build.VERSION_CODES.N)
    private void preview() {
        try {
            closeRecordiew();
            final CaptureRequest.Builder previewBuilder
                    = mCamera.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
            previewBuilder.addTarget(binding.surfaceView.getHolder().getSurface());
            OutputConfiguration outputConfiguration = new OutputConfiguration(binding.surfaceView.getHolder().getSurface());
            List<OutputConfiguration> outputs = new ArrayList<>();
            outputs.add(outputConfiguration);
            SessionConfiguration sessionConfiguration
                    = new SessionConfiguration(SessionConfiguration.SESSION_REGULAR,
                    outputs,
                    mExecutorService,
                    new CameraCaptureSession.StateCallback() {
                        @Override
                        public void onConfigured(@NonNull CameraCaptureSession session) {
                            mCameraPreviewCaptureSession = session;
                            previewBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_VIDEO);
                            CaptureRequest previewRequest = previewBuilder.build();
                            try {
                                mCameraPreviewCaptureSession.setRepeatingRequest(previewRequest, new CameraCaptureSession.CaptureCallback() {
                                    @Override
                                    public void onCaptureStarted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, long timestamp, long frameNumber) {
                                        super.onCaptureStarted(session, request, timestamp, frameNumber);
                                    }

                                    @Override
                                    public void onCaptureProgressed(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull CaptureResult partialResult) {
                                        super.onCaptureProgressed(session, request, partialResult);
                                    }

                                    @Override
                                    public void onCaptureCompleted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull TotalCaptureResult result) {
                                        super.onCaptureCompleted(session, request, result);
                                    }

                                    @Override
                                    public void onCaptureFailed(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull CaptureFailure failure) {
                                        super.onCaptureFailed(session, request, failure);
                                    }

                                    @Override
                                    public void onCaptureSequenceCompleted(@NonNull CameraCaptureSession session, int sequenceId, long frameNumber) {
                                        super.onCaptureSequenceCompleted(session, sequenceId, frameNumber);
                                    }

                                    @Override
                                    public void onCaptureSequenceAborted(@NonNull CameraCaptureSession session, int sequenceId) {
                                        super.onCaptureSequenceAborted(session, sequenceId);
                                    }

                                    @Override
                                    public void onCaptureBufferLost(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull Surface target, long frameNumber) {
                                        super.onCaptureBufferLost(session, request, target, frameNumber);
                                    }
                                }, sub1Handler);
                            } catch (CameraAccessException e) {
                                e.printStackTrace();
                            }
                        }

                        @Override
                        public void onConfigureFailed(@NonNull CameraCaptureSession session) {

                        }
                    }
            );

            mCamera.createCaptureSession(sessionConfiguration);

        } catch (Exception e) {
            e.printStackTrace();
        }

    }


    private void closePreview() {
        if (mCameraPreviewCaptureSession != null) {
            mCameraPreviewCaptureSession.close();
            mCameraPreviewCaptureSession = null;
        }
    }

    private void closeRecordiew() {
        if (mCameraRecordCaptureSession != null) {
            mCameraRecordCaptureSession.close();
            mCameraRecordCaptureSession = null;
        }
    }

    int i = 0;

    @RequiresApi(api = Build.VERSION_CODES.N)
    private void record() {
        try {
            Log.e("CameraActivity", "record");
            i = 0;
            File file = new File(savePicPath);
            DirUtil.deleteDir(file);
            final CaptureRequest.Builder builder
                    = mCamera.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
            builder.addTarget(binding.surfaceView.getHolder().getSurface());
            builder.addTarget(mImageReader.getSurface());
            OutputConfiguration outputConfiguration = new OutputConfiguration(binding.surfaceView.getHolder().getSurface());
            OutputConfiguration imageReaderOutputConfiguration = new OutputConfiguration(mImageReader.getSurface());
            List<OutputConfiguration> outputs = new ArrayList<>();
            outputs.add(outputConfiguration);
            outputs.add(imageReaderOutputConfiguration);
            SessionConfiguration sessionConfiguration
                    = new SessionConfiguration(SessionConfiguration.SESSION_REGULAR,
                    outputs,
                    mExecutorService,
                    new CameraCaptureSession.StateCallback() {
                        @Override
                        public void onConfigured(@NonNull CameraCaptureSession session) {
                            mCameraRecordCaptureSession = session;
                            builder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_VIDEO);
                            CaptureRequest request = builder.build();
                            try {
                                mCameraRecordCaptureSession.setRepeatingRequest(request, new CameraCaptureSession.CaptureCallback() {
                                    @Override
                                    public void onCaptureStarted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, long timestamp, long frameNumber) {
                                        super.onCaptureStarted(session, request, timestamp, frameNumber);
                                    }

                                    @Override
                                    public void onCaptureProgressed(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull CaptureResult partialResult) {
                                        super.onCaptureProgressed(session, request, partialResult);
                                    }

                                    @Override
                                    public void onCaptureCompleted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull TotalCaptureResult result) {
                                        super.onCaptureCompleted(session, request, result);
                                    }

                                    @Override
                                    public void onCaptureFailed(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull CaptureFailure failure) {
                                        super.onCaptureFailed(session, request, failure);
                                    }

                                    @Override
                                    public void onCaptureSequenceCompleted(@NonNull CameraCaptureSession session, int sequenceId, long frameNumber) {
                                        super.onCaptureSequenceCompleted(session, sequenceId, frameNumber);
                                    }

                                    @Override
                                    public void onCaptureSequenceAborted(@NonNull CameraCaptureSession session, int sequenceId) {
                                        super.onCaptureSequenceAborted(session, sequenceId);
                                    }

                                    @Override
                                    public void onCaptureBufferLost(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull Surface target, long frameNumber) {
                                        super.onCaptureBufferLost(session, request, target, frameNumber);
                                    }
                                }, sub1Handler);
                            } catch (CameraAccessException e) {
                                e.printStackTrace();
                            }
                        }

                        @Override
                        public void onConfigureFailed(@NonNull CameraCaptureSession session) {

                        }
                    }
            );

            mCamera.createCaptureSession(sessionConfiguration);

        } catch (Exception e) {
            e.printStackTrace();
        }

    }

    /**
     * Compute the parameters needed for the current camera
     */
    private void calculateCameraParameters() {
        try {
            CameraCharacteristics characteristics = cameraManager.getCameraCharacteristics(backCameraId);
            StreamConfigurationMap map = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
            Size largest = Collections.max(Arrays.asList(map.getOutputSizes(ImageFormat.YUV_420_888)), new CompareSizesByArea());
            Log.e("CameraActivity", "calculateCameraParameters width=" + largest.getWidth() + " height=" + largest.getHeight());
            mImageReader = ImageReader.newInstance(largest.getWidth(), largest.getHeight(),
                    ImageFormat.YUV_420_888, 2);
            width = largest.getWidth();
            height = largest.getHeight();
            mImageReader.setOnImageAvailableListener(new ImageReader.OnImageAvailableListener() {
                @Override
                public void onImageAvailable(ImageReader reader) {
                    ++i;

                    Log.e("CameraActivity", "onImageAvailable");
                    Image image = mImageReader.acquireLatestImage();
                    if (image != null && i < 2) {
                        Image.Plane[] planes = image.getPlanes();
                        /** Y */
                        ByteBuffer bufferY = planes[0].getBuffer();
                        /** U(Cb) */
                        ByteBuffer bufferU = planes[1].getBuffer();
                        /** V(Cr) */
                        ByteBuffer bufferV = planes[2].getBuffer();
                        /** NV21 data assembled from the three planes */
                        byte[] yuvData =  yuv420ToNV21(bufferY, bufferU, bufferV);
                        mExecutorService.execute(new Runnable() {
                            @Override
                            public void run() {
                                try {
                                    saveYUVtoPicture(yuvData, width, height);
                                } catch (IOException e) {
                                    e.printStackTrace();
                                }
                            }
                        });
                    } else {
                    }
                    if(image != null){
                        image.close();
                    }
                }
            }, subHandler);
        } catch (Exception e) {
            e.printStackTrace();
        }

    }

    // Converts the three YUV_420_888 planes to NV21. This relies on the device
    // delivering chroma planes with pixelStride == 2 (U and V interleaved with
    // each other), which is what we observed above.
    private byte[] yuv420ToNV21(ByteBuffer y, ByteBuffer u, ByteBuffer v) {
        ByteBuffer bufferY = y;
        int bufferYSize = bufferY.remaining();
        /** U(Cb) */
        ByteBuffer bufferU = u;
        int bufferUSize = bufferU.remaining();
        /** V(Cr) */
        ByteBuffer bufferV = v;
        int bufferVSize = bufferV.remaining();
        Log.e("CameraActivity", "bufferYSize=" + bufferYSize + " bufferUSize=" + bufferUSize + " bufferVSize=" + bufferVSize);
        Log.e("CameraActivity", "yuv420ToNV21 size=" + (bufferYSize + bufferUSize + bufferVSize));
        byte[] yuvData = new byte[bufferYSize + bufferUSize];
        byte[] uData = new byte[bufferUSize];
        byte[] vData = new byte[bufferVSize];

        bufferY.get(yuvData, 0, bufferYSize);
        bufferU.get(uData, 0, bufferUSize);
        bufferV.get(vData, 0, bufferVSize);
        // With pixelStride == 2 the valid samples sit at even offsets;
        // NV21 wants them interleaved as VUVU... after the Y plane.
        for (int i = 0; i < (bufferUSize + 1) / 2; i++) {
            yuvData[bufferYSize + 2 * i] = vData[i * 2];
            yuvData[bufferYSize + 2 * i + 1] = uData[i * 2];
        }
        return yuvData;
    }




    public void saveYUVtoPicture(byte[] data, int width, int height) throws IOException {
        File dir = new File(savePicPath);
        if (!dir.exists()) {
            dir.mkdirs();
        }
        FileOutputStream outStream = null;
        try {
            YuvImage yuvimage = new YuvImage(data, ImageFormat.NV21, width, height, null);
            ByteArrayOutputStream baos = new ByteArrayOutputStream();
            // compressToJpeg encodes the NV21 frame directly to JPEG
            yuvimage.compressToJpeg(new Rect(0, 0, width, height), 80, baos);
            outStream = new FileOutputStream(
                    String.format(savePicPath + File.separator + "%d_%s_%s.jpg",
                            System.currentTimeMillis(), String.valueOf(width), String.valueOf(height)));
            // Write the JPEG bytes once; the original code also decoded the JPEG
            // to a Bitmap and re-compressed it into the same stream, which wrote
            // two JPEGs into one file and corrupted it.
            outStream.write(baos.toByteArray());
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        } finally {
            if (outStream != null) {
                outStream.close();
            }
        }
    }




}

Gitee repository:

https://gitee.com/creat151/android-media.git
