Android WebRTC: extracting frame images from the video stream (VideoFrame to Bitmap) for image recognition

Since we need OpenCV (or a similar library) to recognize the picture inside a WebRTC call, we have to capture frames from the WebRTC camera stream.

The approach: in the EglRenderer class (which implements VideoSink), add a saveImgBitmap(frame) call inside onFrame to grab the image. The code is as follows:
```
// VideoSink interface.
@Override
public void onFrame(VideoFrame frame) {
  // Convert org.webrtc.VideoFrame to a Bitmap and save it.
  saveImgBitmap(frame);

  synchronized (statisticsLock) {
    ++framesReceived;
  }
  final boolean dropOldFrame;
  synchronized (handlerLock) {
    if (renderThreadHandler == null) {
      logD("Dropping frame - Not initialized or already released.");
      return;
    }
    synchronized (frameLock) {
      dropOldFrame = (pendingFrame != null);
      if (dropOldFrame) {
        pendingFrame.release();
      }
      pendingFrame = frame;
      pendingFrame.retain();
      renderThreadHandler.post(this::renderFrameOnRenderThread);
    }
  }
  if (dropOldFrame) {
    synchronized (statisticsLock) {
      ++framesDropped;
    }
  }
}
```
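Note that onFrame fires for every frame (often 30 fps), so encoding and writing a JPEG each time will stall the pipeline and fill storage quickly. A minimal sketch of interval-based throttling (FrameThrottle is a hypothetical helper, not part of WebRTC's EglRenderer):

```java
// Hypothetical helper: lets a frame through at most once per `intervalMs`.
class FrameThrottle {
  private final long intervalMs;
  private long lastCaptureMs = Long.MIN_VALUE;

  FrameThrottle(long intervalMs) {
    this.intervalMs = intervalMs;
  }

  /** Returns true if a frame arriving at nowMs should be captured. */
  synchronized boolean shouldCapture(long nowMs) {
    if (lastCaptureMs == Long.MIN_VALUE || nowMs - lastCaptureMs >= intervalMs) {
      lastCaptureMs = nowMs;
      return true;
    }
    return false;
  }
}
```

In onFrame this would replace the unconditional call: `if (throttle.shouldCapture(System.currentTimeMillis())) saveImgBitmap(frame);`.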

```
class EglRenderer implements VideoSink {

  ......

  private final Matrix drawMatrix = new Matrix();
  // Used for bitmap capturing.
  private final GlTextureFrameBuffer bitmapTextureFramebuffer =
      new GlTextureFrameBuffer(GLES20.GL_RGBA);

  ......

  public void saveImgBitmap(VideoFrame frame) {
    drawMatrix.reset();
    drawMatrix.preTranslate(0.5f, 0.5f);
    drawMatrix.preScale(mirrorHorizontally ? -1f : 1f, mirrorVertically ? -1f : 1f);
    drawMatrix.preScale(1f, -1f); // We want the output to be upside down for Bitmap.
    drawMatrix.preTranslate(-0.5f, -0.5f);

    // Scale factor 1: keep the frame's rotated size unchanged.
    final int scaledWidth = frame.getRotatedWidth();
    final int scaledHeight = frame.getRotatedHeight();

    bitmapTextureFramebuffer.setSize(scaledWidth, scaledHeight);

    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, bitmapTextureFramebuffer.getFrameBufferId());
    GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0,
        GLES20.GL_TEXTURE_2D, bitmapTextureFramebuffer.getTextureId(), 0);

    GLES20.glClearColor(0 /* red */, 0 /* green */, 0 /* blue */, 0 /* alpha */);
    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
    frameDrawer.drawFrame(frame, drawer, drawMatrix, 0 /* viewportX */,
        0 /* viewportY */, scaledWidth, scaledHeight);

    final ByteBuffer bitmapBuffer = ByteBuffer.allocateDirect(scaledWidth * scaledHeight * 4);
    GLES20.glViewport(0, 0, scaledWidth, scaledHeight);
    GLES20.glReadPixels(
        0, 0, scaledWidth, scaledHeight, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, bitmapBuffer);

    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
    GlUtil.checkNoGLES2Error("EglRenderer.saveImgBitmap");

    final Bitmap bitmap = Bitmap.createBitmap(scaledWidth, scaledHeight, Bitmap.Config.ARGB_8888);
    bitmap.copyPixelsFromBuffer(bitmapBuffer);

    // Write the bitmap as a JPEG; try-with-resources closes the stream.
    final String path = Environment.getExternalStorageDirectory().getAbsolutePath()
        + "/atest/" + System.currentTimeMillis() + ".jpg";
    try (OutputStream outputStream = new FileOutputStream(path)) {
      bitmap.compress(Bitmap.CompressFormat.JPEG, 100, outputStream);
    } catch (IOException e) {
      e.printStackTrace();
    }
  }

}
```
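The GL readback above has to run on the render thread with a current EGL context. If you only need raw pixels for OpenCV, an alternative is to convert the frame's I420 buffer on the CPU instead. Below is a minimal sketch of the per-pixel full-range BT.601 YUV-to-ARGB math that such a conversion uses (illustrative only; in production, convert whole planes with libyuv or `org.webrtc.YuvHelper` rather than per-pixel Java):

```java
// Illustrative sketch: convert one I420 sample (Y, U, V each in 0..255) to a
// packed ARGB int, using full-range BT.601 coefficients. Real code should
// convert entire planes with libyuv / org.webrtc.YuvHelper for speed.
final class YuvToArgb {
  static int yuvToArgb(int y, int u, int v) {
    float yf = y;
    float uf = u - 128;
    float vf = v - 128;
    int r = clamp(Math.round(yf + 1.402f * vf));
    int g = clamp(Math.round(yf - 0.344136f * uf - 0.714136f * vf));
    int b = clamp(Math.round(yf + 1.772f * uf));
    return 0xFF000000 | (r << 16) | (g << 8) | b;
  }

  private static int clamp(int c) {
    return Math.max(0, Math.min(255, c));
  }
}
```

A quick sanity check of the formula: neutral chroma (U = V = 128) must map Y straight to gray, so yuvToArgb(128, 128, 128) yields 0xFF808080.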

To open a WebRTC video stream on Android, you need the WebRTC library plus some native Android glue code. The following example can get you started:

1. Import the WebRTC library. In your Android project, add this dependency to app/build.gradle:

```
implementation 'org.webrtc:google-webrtc:1.0.+'
```

2. Create the PeerConnectionFactory:

```
PeerConnectionFactory.initialize(
    PeerConnectionFactory.InitializationOptions.builder(context).createInitializationOptions());
PeerConnectionFactory peerConnectionFactory =
    PeerConnectionFactory.builder().createPeerConnectionFactory();
```

3. Create a VideoCapturer:

```
VideoCapturer videoCapturer = createCameraCapturer(new Camera1Enumerator(false));
```

4. Create a VideoSource and start capturing:

```
SurfaceTextureHelper surfaceTextureHelper = SurfaceTextureHelper.create("CaptureThread", null);
VideoSource videoSource = peerConnectionFactory.createVideoSource(videoCapturer.isScreencast());
videoCapturer.initialize(surfaceTextureHelper, context, videoSource.getCapturerObserver());
videoCapturer.startCapture(1280, 720, 30);
```

5. Create a VideoTrack:

```
VideoTrack videoTrack = peerConnectionFactory.createVideoTrack("video", videoSource);
```

6. Create a PeerConnection:

```
PeerConnection.RTCConfiguration configuration = new PeerConnection.RTCConfiguration(
    Arrays.asList(new PeerConnection.IceServer("stun:stun.l.google.com:19302")));
PeerConnection peerConnection =
    peerConnectionFactory.createPeerConnection(configuration, new CustomPeerConnectionObserver());
```

7. Add the VideoTrack to the PeerConnection:

```
peerConnection.addTrack(videoTrack);
```

8. Render the local video (recent WebRTC versions removed VideoRenderer; attach the view as a sink instead):

```
EglBase eglBase = EglBase.create();
SurfaceViewRenderer localView = findViewById(R.id.local_view);
localView.init(eglBase.getEglBaseContext(), null /* rendererEvents */);
localView.setMirror(true);
videoTrack.addSink(localView);
```

9. Create an Offer:

```
peerConnection.createOffer(new CustomSdpObserver() {
  @Override
  public void onCreateSuccess(SessionDescription sessionDescription) {
    peerConnection.setLocalDescription(new CustomSdpObserver(), sessionDescription);
    // Send the offer to the remote peer over your signaling channel.
  }
}, new MediaConstraints());
```

This is a minimal outline of a WebRTC video application; adapt it to your own signaling and requirements.
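The CustomSdpObserver and CustomPeerConnectionObserver classes referenced in the steps above are not shown; a common pattern is an adapter class with empty implementations of every callback, so call sites only override what they need. Below is a self-contained sketch of that pattern. Note the SdpObserver interface is stubbed locally (with a String instead of SessionDescription) just so the snippet compiles on its own; in a real project, implement org.webrtc.SdpObserver directly:

```java
// Local stub standing in for org.webrtc.SdpObserver, only so this sketch is
// self-contained. Delete it in a real project and implement the library's
// interface (whose onCreateSuccess takes a SessionDescription, not a String).
interface SdpObserver {
  void onCreateSuccess(String sdp);
  void onSetSuccess();
  void onCreateFailure(String error);
  void onSetFailure(String error);
}

// Adapter with no-op defaults: subclasses override only the callback they need.
class CustomSdpObserver implements SdpObserver {
  @Override public void onCreateSuccess(String sdp) {}
  @Override public void onSetSuccess() {}
  @Override public void onCreateFailure(String error) {}
  @Override public void onSetFailure(String error) {}
}
```

This mirrors how the offer-creation step subclasses CustomSdpObserver inline and overrides only onCreateSuccess.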