WebRTC Android Video Capture and Preview Process

Taking a break over New Year's Day, I'm summarizing the WebRTC Android video capture call flow.

Part 1: Video source capture call flow in the WebRTC Android AppRTC demo

1. In the demo, CallActivity.onCreate() calls startCall() to set up the connection and the room.


2. WebSocketClient.connectToRoom() sends the room request to the server.


3. The response is delivered to onConnectedToRoom(), implemented in CallActivity, and connection setup begins.


4. peerConnectionClient.createPeerConnection(rootEglBase.getEglBaseContext(), localRender, remoteRenderers, videoCapturer, signalingParameters) is called, passing in the required EGL context, the local and remote renderers, and videoCapturer, the local video capturer.
(This videoCapturer is the key player in video source capture.)


5. Looking inside PeerConnectionClient.createPeerConnection(), it mainly creates the PeerConnection object and the local audio and video tracks via createAudioTrack() and createVideoTrack(videoCapturer).


6. The remote video track remoteVideoTrack is obtained through pcObserver, the PeerConnection.Observer implementation passed in when the PeerConnection is created; the lower layer delivers the remote stream in the onAddStream() callback.
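
A minimal sketch of how the remote track can be picked up in that observer, assuming the older MediaStream/VideoRenderer API this post is based on; remoteRender stands for whatever VideoRenderer.Callbacks (e.g. a SurfaceViewRenderer) the client holds, and the real demo additionally posts this work to its own executor thread:

@Override
public void onAddStream(final MediaStream stream) {
    // Sketch only: take the first remote video track and wire it to the remote renderer.
    if (stream.videoTracks.size() == 1) {
        remoteVideoTrack = stream.videoTracks.get(0);
        remoteVideoTrack.setEnabled(true);
        remoteVideoTrack.addRenderer(new VideoRenderer(remoteRender));
    }
}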


7. createVideoTrack(videoCapturer) calls videoCapturer.startCapture(videoWidth, videoHeight, videoFps) with the desired capture width, height, and frame rate.
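
Putting steps 5 and 7 together, here is a minimal sketch of what createAudioTrack() and createVideoTrack(videoCapturer) roughly look like in PeerConnectionClient, assuming the older PeerConnectionFactory API (createVideoSource(VideoCapturer), VideoTrack.addRenderer()); factory, localRender and the capture parameters are assumed class members, and the track IDs are illustrative:

// Sketch only, based on the older WebRTC Android API used in this post.
private AudioTrack createAudioTrack() {
    audioSource = factory.createAudioSource(new MediaConstraints());
    localAudioTrack = factory.createAudioTrack("ARDAMSa0", audioSource);
    localAudioTrack.setEnabled(true);
    return localAudioTrack;
}

private VideoTrack createVideoTrack(VideoCapturer capturer) {
    videoSource = factory.createVideoSource(capturer);           // wrap the capturer as a source
    capturer.startCapture(videoWidth, videoHeight, videoFps);    // step 7: start the camera
    localVideoTrack = factory.createVideoTrack("ARDAMSv0", videoSource);
    localVideoTrack.setEnabled(true);
    localVideoTrack.addRenderer(new VideoRenderer(localRender)); // local preview
    return localVideoTrack;
}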


At this point, after rendering the local and remote video tracks with videoTrack.addRenderer(), you can see both the local and the remote video.
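
For completeness, the localRender/remoteRenderers passed in at step 4 are SurfaceViewRenderer views in the demo. A minimal sketch of how they are typically initialized before being wrapped in VideoRenderer (the R.id.* names here are illustrative):

// Sketch only: SurfaceViewRenderer setup in CallActivity with a shared EGL context.
rootEglBase = EglBase.create();
SurfaceViewRenderer localRender = (SurfaceViewRenderer) findViewById(R.id.local_video_view);
SurfaceViewRenderer remoteRender = (SurfaceViewRenderer) findViewById(R.id.remote_video_view);
localRender.init(rootEglBase.getEglBaseContext(), null /* rendererEvents */);
remoteRender.init(rootEglBase.getEglBaseContext(), null);
localRender.setMirror(true);             // mirror the front-camera preview
localRender.setZOrderMediaOverlay(true); // draw the small local view on top
// Later, once the tracks exist:
// localVideoTrack.addRenderer(new VideoRenderer(localRender));
// remoteVideoTrack.addRenderer(new VideoRenderer(remoteRender));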


Part 2: The VideoCapturer video source capture process

1. In CallActivity, createVideoCapturer() creates the VideoCapturer through a CameraEnumerator.


2. CameraEnumerator is an interface; through it you can get the device names, check whether a camera is front- or rear-facing, and create a camera video capturer (taking Camera1Capturer as the example here).


3. Camera1Enumerator.createCapturer() returns a Camera1Capturer object.
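
A minimal sketch of how the capturer can be created through the enumerator; the front-camera-first preference mirrors the demo, while the captureToTexture flag value here is just an example (Camera2Enumerator(context) is used the same way on devices that support the Camera2 API):

// Sketch only: enumerate the devices and create a CameraVideoCapturer.
private VideoCapturer createVideoCapturer() {
    CameraEnumerator enumerator = new Camera1Enumerator(true /* captureToTexture */);
    // Prefer a front-facing camera.
    for (String deviceName : enumerator.getDeviceNames()) {
        if (enumerator.isFrontFacing(deviceName)) {
            return enumerator.createCapturer(deviceName, null /* eventsHandler */);
        }
    }
    // Otherwise fall back to any remaining camera.
    for (String deviceName : enumerator.getDeviceNames()) {
        if (!enumerator.isFrontFacing(deviceName)) {
            return enumerator.createCapturer(deviceName, null);
        }
    }
    return null;
}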


4. Camera1Capturer adds little of its own; the startCapture() method lives in its parent class CameraCapturer, and it mainly ends up calling createSessionInternal().


5. As seen above when the CameraCapturer is created, different camera APIs use Camera1Capturer or Camera2Capturer. The session creation that CameraCapturer triggers through createSessionInternal() ends in the abstract createCameraSession(), which is implemented in Camera1Capturer and Camera2Capturer.
Camera1Capturer code:


public class Camera1Capturer extends CameraCapturer {
    private final boolean captureToTexture;
    public Camera1Capturer(
            String cameraName, CameraEventsHandler eventsHandler, boolean captureToTexture) {
        super(cameraName, eventsHandler, new Camera1Enumerator(captureToTexture));
        this.captureToTexture = captureToTexture;
    }

    @Override
    protected void createCameraSession(CameraSession.CreateSessionCallback createSessionCallback,
                                       CameraSession.Events events, Context applicationContext,
                                       SurfaceTextureHelper surfaceTextureHelper, String cameraName, int width, int height,
                                       int framerate, int extraRotation) {
        Camera1Session.create(createSessionCallback, events, captureToTexture, applicationContext,
                surfaceTextureHelper, Camera1Enumerator.getCameraIndex(cameraName), width, height,
                framerate, extraRotation);
    }
}

Camera2Capturer code:

public class Camera2Capturer extends CameraCapturer {
  private final Context context;
  private final CameraManager cameraManager;

  public Camera2Capturer(Context context, String cameraName, CameraEventsHandler eventsHandler) {
    super(cameraName, eventsHandler, new Camera2Enumerator(context));

    this.context = context;
    cameraManager = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
  }

  @Override
  protected void createCameraSession(CameraSession.CreateSessionCallback createSessionCallback,
      CameraSession.Events events, Context applicationContext,
      SurfaceTextureHelper surfaceTextureHelper, String cameraName, int width, int height,
      int framerate, int extraRotation) {
    Camera2Session.create(createSessionCallback, events, applicationContext, cameraManager,
        surfaceTextureHelper, cameraName, width, height, framerate, extraRotation);
  }
}

As you can see, what Camera1Capturer and Camera2Capturer actually do is create their respective CameraSessions; the camera capture logic itself is encapsulated in CameraCapturer.
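
For reference, that shared logic in CameraCapturer has roughly the following shape. This is a heavily simplified sketch, not the actual source: the real class runs everything on a dedicated camera thread, retries failed opens, and handles camera switching:

// Simplified sketch of CameraCapturer: startCapture() ends in the abstract
// createCameraSession(), which Camera1Capturer / Camera2Capturer override as shown above.
abstract class CameraCapturerSketch {
    protected CameraSession.CreateSessionCallback createSessionCallback;
    protected CameraSession.Events cameraSessionEventsHandler;
    protected Context applicationContext;
    protected SurfaceTextureHelper surfaceTextureHelper;
    protected String cameraName;
    private int width;
    private int height;
    private int framerate;

    public void startCapture(int width, int height, int framerate) {
        this.width = width;
        this.height = height;
        this.framerate = framerate;
        // The real class posts this to its camera thread and retries on failure.
        createSessionInternal();
    }

    private void createSessionInternal() {
        createCameraSession(createSessionCallback, cameraSessionEventsHandler,
                applicationContext, surfaceTextureHelper, cameraName,
                width, height, framerate, 0 /* extraRotation */);
    }

    // Camera1Capturer delegates to Camera1Session.create(), Camera2Capturer to Camera2Session.create().
    protected abstract void createCameraSession(
            CameraSession.CreateSessionCallback createSessionCallback,
            CameraSession.Events events, Context applicationContext,
            SurfaceTextureHelper surfaceTextureHelper, String cameraName,
            int width, int height, int framerate, int extraRotation);
}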


6. Next, CameraSession. It turns out to be an interface containing two callback interfaces, CreateSessionCallback and Events, plus a stop() method that stops capturing.
CreateSessionCallback: session created successfully, or creation failed.
Events: camera opening, open failure, disconnection, camera closed, and captured frame data.
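
Reconstructed from the calls visible in the Camera1Session code below, the CameraSession interface looks roughly like this (exact signatures may differ slightly between WebRTC versions):

public interface CameraSession {
    // Result callback for Camera1Session.create() / Camera2Session.create().
    interface CreateSessionCallback {
        void onDone(CameraSession session);
        void onFailure(String error);
    }

    // Camera lifecycle and captured-frame callbacks.
    interface Events {
        void onCameraOpening();
        void onCameraError(CameraSession session, String error);
        void onCameraDisconnected(CameraSession session);
        void onCameraClosed(CameraSession session);
        void onByteBufferFrameCaptured(CameraSession session, byte[] data, int width,
                int height, int rotation, long timestampNs);
        void onTextureFrameCaptured(CameraSession session, int width, int height,
                int oesTextureId, float[] transformMatrix, int rotation, long timestampNs);
    }

    // Stop capturing and release the camera.
    void stop();
}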


7. Now let's look at the concrete implementation, Camera1Session:


public class Camera1Session implements CameraSession {
    private static final String TAG = "Camera1Session";
    private static final int NUMBER_OF_CAPTURE_BUFFERS = 3;

    private static final Histogram camera1StartTimeMsHistogram =
            Histogram.createCounts("WebRTC.Android.Camera1.StartTimeMs", 1, 10000, 50);
    private static final Histogram camera1StopTimeMsHistogram =
            Histogram.createCounts("WebRTC.Android.Camera1.StopTimeMs", 1, 10000, 50);
    private static final Histogram camera1ResolutionHistogram = Histogram.createEnumeration(
            "WebRTC.Android.Camera1.Resolution", CameraEnumerationAndroid.COMMON_RESOLUTIONS.size());

    private static enum SessionState {RUNNING, STOPPED}

    private final Handler cameraThreadHandler;
    private final Events events;
    private final boolean captureToTexture;
    private final Context applicationContext;
    private final SurfaceTextureHelper surfaceTextureHelper;
    private final int cameraId;
    private final int width;
    private final int height;
    private final int framerate;
    public static int extraRotation;
    private final android.hardware.Camera camera;
    private final android.hardware.Camera.CameraInfo info;
    private final CaptureFormat captureFormat;
    // Used only for stats. Only used on the camera thread.
    private final long constructionTimeNs; // Construction time of this class.

    private SessionState state;
    private boolean firstFrameReported = false;

    public static void create(final CreateSessionCallback callback, final Events events,
                              final boolean captureToTexture, final Context applicationContext,
                              final SurfaceTextureHelper surfaceTextureHelper, final int cameraId, final int width,
                              final int height, final int framerate, final int extraRotation) {
        final long constructionTimeNs = System.nanoTime();
        Logging.d(TAG, "Open camera " + cameraId);
        events.onCameraOpening();

        final android.hardware.Camera camera;
        try {
            camera = android.hardware.Camera.open(cameraId);
        } catch (RuntimeException e) {
            callback.onFailure(e.getMessage());
            return;
        }

        try {
            camera.setPreviewTexture(surfaceTextureHelper.getSurfaceTexture());
        } catch (IOException e) {
            camera.release();
            callback.onFailure(e.getMessage());
            return;
        }

        final android.hardware.Camera.CameraInfo info = new android.hardware.Camera.CameraInfo();
        android.hardware.Camera.getCameraInfo(cameraId, info);

        final android.hardware.Camera.Parameters parameters = camera.getParameters();
        final CaptureFormat captureFormat =
                findClosestCaptureFormat(parameters, width, height, framerate);
        final Size pictureSize = findClosestPictureSize(parameters, width, height);

        updateCameraParameters(camera, parameters, captureFormat, pictureSize, captureToTexture);

        // Initialize the capture buffers.
        if (!captureToTexture) {
            final int frameSize = captureFormat.frameSize();
            for (int i = 0; i < NUMBER_OF_CAPTURE_BUFFERS; ++i) {
                final ByteBuffer buffer = ByteBuffer.allocateDirect(frameSize);
                camera.addCallbackBuffer(buffer.array());
            }
        }

        // Calculate orientation manually and send it as CVO instead.
        camera.setDisplayOrientation(0 /* degrees */);

        callback.onDone(
                new Camera1Session(events, captureToTexture, applicationContext, surfaceTextureHelper,
                        cameraId, width, height, framerate, extraRotation, camera, info, captureFormat, constructionTimeNs));
    }

    ...

    private Camera1Session(Events events, boolean captureToTexture, Context applicationContext,
                           SurfaceTextureHelper surfaceTextureHelper, int cameraId, int width, int height, int framerate,
                           int extraRotation, android.hardware.Camera camera, android.hardware.Camera.CameraInfo info,
                           CaptureFormat captureFormat, long constructionTimeNs) {
        Logging.d(TAG, "Create new camera1 session on camera " + cameraId);

        this.cameraThreadHandler = new Handler();
        this.events = events;
        //  this.captureToTexture = captureToTexture;
        this.captureToTexture = false;
        this.applicationContext = applicationContext;
        this.surfaceTextureHelper = surfaceTextureHelper;
        this.cameraId = cameraId;
        this.width = width;
        this.height = height;
        this.framerate = framerate;
        this.extraRotation = extraRotation;
        this.camera = camera;
        this.info = info;
        this.captureFormat = captureFormat;
        this.constructionTimeNs = constructionTimeNs;

        startCapturing();
    }

   ...

    private void startCapturing() {
        Logging.d(TAG, "Start capturing");
        checkIsOnCameraThread();

        state = SessionState.RUNNING;

        camera.setErrorCallback(new android.hardware.Camera.ErrorCallback() {
            @Override
            public void onError(int error, android.hardware.Camera camera) {
                String errorMessage;
                if (error == android.hardware.Camera.CAMERA_ERROR_SERVER_DIED) {
                    errorMessage = "Camera server died!";
                } else {
                    errorMessage = "Camera error: " + error;
                }
                Logging.e(TAG, errorMessage);
                state = SessionState.STOPPED;
                stopInternal();
                if (error == android.hardware.Camera.CAMERA_ERROR_EVICTED) {
                    events.onCameraDisconnected(Camera1Session.this);
                } else {
                    events.onCameraError(Camera1Session.this, errorMessage);
                }
            }
        });

        if (captureToTexture) {
            listenForTextureFrames();
        } else {
            listenForBytebufferFrames();
        }
        try {
            camera.startPreview();
        } catch (RuntimeException e) {
            state = SessionState.STOPPED;
            stopInternal();
            events.onCameraError(this, e.getMessage());
        }
    }

   ...

    private void listenForTextureFrames() {
        surfaceTextureHelper.startListening(new SurfaceTextureHelper.OnTextureFrameAvailableListener() {
            @Override
            public void onTextureFrameAvailable(
                    int oesTextureId, float[] transformMatrix, long timestampNs) {
                checkIsOnCameraThread();

                if (state != SessionState.RUNNING) {
                    Logging.d(TAG, "Texture frame captured but camera is no longer running.");
                    surfaceTextureHelper.returnTextureFrame();
                    return;
                }

                if (!firstFrameReported) {
                    final int startTimeMs =
                            (int) TimeUnit.NANOSECONDS.toMillis(System.nanoTime() - constructionTimeNs);
                    camera1StartTimeMsHistogram.addSample(startTimeMs);
                    firstFrameReported = true;
                }

                int rotation = getFrameOrientation();
                if (info.facing == android.hardware.Camera.CameraInfo.CAMERA_FACING_FRONT) {
                    // Undo the mirror that the OS "helps" us with.
                    // http://developer.android.com/reference/android/hardware/Camera.html#setDisplayOrientation(int)
                    transformMatrix = RendererCommon.multiplyMatrices(
                            transformMatrix, RendererCommon.horizontalFlipMatrix());
                }
                events.onTextureFrameCaptured(Camera1Session.this, captureFormat.width,
                        captureFormat.height, oesTextureId, transformMatrix, rotation, timestampNs);
            }
        });
    }

    private void listenForBytebufferFrames() {
        camera.setPreviewCallbackWithBuffer(new android.hardware.Camera.PreviewCallback() {
            @Override
            public void onPreviewFrame(byte[] data, android.hardware.Camera callbackCamera) {
                checkIsOnCameraThread();

                if (callbackCamera != camera) {
                    Logging.e(TAG, "Callback from a different camera. This should never happen.");
                    return;
                }

                if (state != SessionState.RUNNING) {
                    Logging.d(TAG, "Bytebuffer frame captured but camera is no longer running.");
                    return;
                }

                final long captureTimeNs = TimeUnit.MILLISECONDS.toNanos(SystemClock.elapsedRealtime());

                if (!firstFrameReported) {
                    final int startTimeMs =
                            (int) TimeUnit.NANOSECONDS.toMillis(System.nanoTime() - constructionTimeNs);
                    camera1StartTimeMsHistogram.addSample(startTimeMs);
                    firstFrameReported = true;
                }

                events.onByteBufferFrameCaptured(Camera1Session.this, data, captureFormat.width,
                        captureFormat.height, getFrameOrientation(), captureTimeNs);
                camera.addCallbackBuffer(data);
            }
        });
    }
  ...
}

Camera1Capturer calls Camera1Session.create(), and as shown above its implementation does the real work: opening the camera, setting up local preview of the video source, configuring the camera parameters, and delivering captured frames through the callbacks listenForTextureFrames() and listenForBytebufferFrames().
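
To close the loop: the frames delivered to those Events callbacks are assumed to be forwarded by CameraCapturer to the CapturerObserver it received when the VideoSource was created, which is where the native WebRTC pipeline (and every renderer added with addRenderer()) picks them up. A heavily simplified sketch, not the actual source:

// Sketch only: how CameraCapturer's internal Events handler is assumed to forward frames.
// capturerObserver is the VideoCapturer.CapturerObserver handed to the capturer; the real
// class also updates state, statistics and error handling in these callbacks.
private final CameraSession.Events cameraSessionEventsHandler = new CameraSession.Events() {
    @Override
    public void onCameraOpening() { /* notify the app-level CameraEventsHandler, if any */ }

    @Override
    public void onCameraError(CameraSession session, String error) { /* report and stop */ }

    @Override
    public void onCameraDisconnected(CameraSession session) { /* stop the session */ }

    @Override
    public void onCameraClosed(CameraSession session) { /* notify closed */ }

    @Override
    public void onByteBufferFrameCaptured(CameraSession session, byte[] data,
            int width, int height, int rotation, long timestampNs) {
        // NV21 path (captureToTexture == false).
        capturerObserver.onByteBufferFrameCaptured(data, width, height, rotation, timestampNs);
    }

    @Override
    public void onTextureFrameCaptured(CameraSession session, int width, int height,
            int oesTextureId, float[] transformMatrix, int rotation, long timestampNs) {
        // OES texture path (captureToTexture == true).
        capturerObserver.onTextureFrameCaptured(
                width, height, oesTextureId, transformMatrix, rotation, timestampNs);
    }
};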

That wraps up the whole video source capture and preview flow. (I haven't pulled the latest WebRTC source yet, so there may be some differences from the newest version.)


Happy New Year, and keep pushing in the year ahead!

