Camera Data Transmission Based on the Android RPC Network Framework

2. Feature Description

2.1 Feature Overview

1) Feature name: Remote Control (multi camera);

2) Feature summary: this feature is fairly complex. It controls multiple camera devices over the Wi-Fi network and involves NFC, Wi-Fi, and RPC.

2.2 Feature Analysis

1) Required background:

NFC, Wi-Fi AP, Wi-Fi P2P (Wi-Fi Direct), RPC;

NFC: used to verify the connection and to carry the device's basic information;

Wi-Fi: frame data is transmitted over the Wi-Fi network;

DI camera device: involves Wi-Fi AP and RPC; the device is driven via the RPC (remote procedure call) protocol to obtain frame data;

Xperia: sends and receives frame data via the Wi-Fi P2P protocol.

2) The flow for handling preview frame data over Wi-Fi P2P is as follows:

Server side:

Preview frames captured by the camera are JPEG-encoded and sent out to the Wi-Fi network through a socket;

Client side:

Frames are received from the Wi-Fi network, decoded, and displayed on screen (a minimal end-to-end sketch follows).
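Before walking through the real code in section 3, the path can be pictured with a minimal sketch: the server encodes one preview frame to JPEG and pushes it onto a UDP socket, and the client receives and decodes it. All helper names below are illustrative; the real feature additionally fragments each frame, as shown later in submitJpegBytes().

// Minimal sketch of the preview path, assuming one complete JPEG frame per UDP datagram.
// sendOneFrame()/receiveOneFrame() are illustrative names, not from the real code.
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.ImageFormat;
import android.graphics.Rect;
import android.graphics.YuvImage;
import java.io.ByteArrayOutputStream;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetSocketAddress;

class PreviewPathSketch {
    // Server side: encode an NV21 preview frame to JPEG and send it over UDP.
    static void sendOneFrame(byte[] nv21, int w, int h, String ip, int port) throws Exception {
        YuvImage yuv = new YuvImage(nv21, ImageFormat.NV21, w, h, null);
        ByteArrayOutputStream os = new ByteArrayOutputStream();
        yuv.compressToJpeg(new Rect(0, 0, w, h), 70, os);
        byte[] jpeg = os.toByteArray();
        DatagramSocket socket = new DatagramSocket();
        socket.send(new DatagramPacket(jpeg, jpeg.length, new InetSocketAddress(ip, port)));
        socket.close();
    }

    // Client side: receive one JPEG datagram and decode it for display.
    static Bitmap receiveOneFrame(int port, int maxSize) throws Exception {
        DatagramSocket socket = new DatagramSocket(port);
        DatagramPacket packet = new DatagramPacket(new byte[maxSize], maxSize);
        socket.receive(packet);
        socket.close();
        return BitmapFactory.decodeByteArray(packet.getData(), packet.getOffset(), packet.getLength());
    }
}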

More details will be added in a follow-up.

3. Feature Basic Flow

3.1 Multi Camera feature function call flow

3.1.1 connectToDI camera

Receiving data from the network over the Wi-Fi AP:

vim vendor/semc/packages/apps/camera-addons/OnlineRemoteCamera/src/com/sonymobile/android/addoncamera/onlineremote/OnlineRemoteCameraActivity.java

private void connectToDI(NdefMessage ndefMessage) {

    DINdefMessage diNdefMessage = new DINdefMessage(ndefMessage);

    mStateMachine.setIsConnectionCanceled(false);

    mStateMachine.requestConnectToRemoteDeviceAccessPoint(

            diNdefMessage.getSsid(),

            diNdefMessage.getPasswd());

}
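DINdefMessage parsing is not shown here. As a rough illustration of how an SSID and passphrase could be pulled out of an NdefMessage with the standard android.nfc API (the real record layout used by the DI camera is not documented in this article):

// Hypothetical sketch only: assumes the first two NDEF records carry the SSID and the
// passphrase as UTF-8 payloads. The actual DINdefMessage format is device specific.
import android.nfc.NdefMessage;
import android.nfc.NdefRecord;
import java.nio.charset.StandardCharsets;

class NdefHandoverSketch {
    static String[] extractSsidAndPasswd(NdefMessage message) {
        NdefRecord[] records = message.getRecords();
        String ssid = new String(records[0].getPayload(), StandardCharsets.UTF_8);
        String passwd = new String(records[1].getPayload(), StandardCharsets.UTF_8);
        return new String[] { ssid, passwd };
    }
}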

vim vendor/semc/packages/apps/camera-addons/OnlineRemoteCamera/src/com/sonymobile/android/addoncamera/onlineremote/controller/StateMachineController.java

public void requestConnectToRemoteDeviceAccessPoint(String ssid, String passwd) {

    // disable Xperia related service

    stopWifiP2pRemoteCameraEnvironment();

    mNfcEnv.disable();

    mWifiEnv.requestConnect(ssid, passwd, new WifiConnectionCallbackImpl());

    mUiThreadHandler.post(new Runnable(){

        @Override

        public void run() {

            sendEvent(TransitterEvent.EVENT_CONNECTION_START,

                    ConnectionTarget.REMOTE_DEVICE);

        }

    });

}

private class WifiConnectionCallbackImpl implements WifiConnectionRequestCallback {

    @Override

    public void onConnected() {

        if (IS_WIFIP2P_DEBUG) logDebug("onConnected()");

        startRemoteCameraPreview();

        // Update camera status. Set below parameter.

        // - RemovableCameraClients

        //   This includes connected DI camera without XPERIA camera and built-in camera.

        int removableCameraClients = 1;

        new GlobalCameraStatusPublisher(mActivity)

                .put(new RemovableCameraClients(removableCameraClients))

                .publish();

    }

    ……

}

private void startRemoteCameraPreview() {

    RemoteDeviceScanner.request(  // 1. First obtain the matching device via request()

            new RemoteScanCallback(), // 2. The callback then processes the returned data

            mBackWorker,

            0);

}

1. First, obtain the matching device via request()

vim vendor/semc/packages/apps/camera-addons/common-components/src/com/sonymobile/cameracommon/remotedevice/RemoteDeviceScanner.java

public static void request(

        OnRemoteDeviceScannedCallback callback,

        ExecutorService callbackExecutor,

        int retryCount) {

    RequestOnBackTask task = new RequestOnBackTask(callback, callbackExecutor, retryCount);

    if (retryCount == 0) {

        // no wait in first time.

        mInternalExecutor.schedule(task, 0, TimeUnit.SECONDS);

    } else {

        mInternalExecutor.schedule(task, 1, TimeUnit.SECONDS);

    }

}

private static class RequestOnBackTask implements Runnable {

    private OnRemoteDeviceScannedCallback mCallback = null;

    private ExecutorService mCallbackExecutor = null;

    private final int mRetryCount;

    /**

     * CONSTRUCTOR.

     */

    public RequestOnBackTask(

            OnRemoteDeviceScannedCallback callback,

            ExecutorService callbackExecutor,

            int retryCount) {

        mCallback = callback;

        mCallbackExecutor = callbackExecutor;

        mRetryCount = retryCount;

    }

    @Override

    public void run() {

        if (IS_DEBUG) Log.logHttp(TAG, "RequestOnBackTask.run(): " + mRetryCount);

        requestOnBack(mCallback, mCallbackExecutor, mRetryCount);

    }

}

private static void requestOnBack(

        OnRemoteDeviceScannedCallback callback,

        ExecutorService callbackExecutor,

        int retryCount) {

    if (IS_DEBUG) Log.logHttp(TAG, "requestOnBack() : E");

    DatagramSocket udpSocket = null;

    InetSocketAddress isAddress = null;

    DatagramPacket sendPacket = null;

    DatagramPacket receivePacket = null;

    // UDP socket.

    try {

        udpSocket = new DatagramSocket();

    } catch (SocketException e) {

      ……

    }

    isAddress = new InetSocketAddress(SSDP_ADDRESS, SSDP_PORT);

    // Receive datagram.

    byte[] receiveBuf = new byte[SSDP_PACKET_BUFFER_SIZE];

    receivePacket = new DatagramPacket(receiveBuf, receiveBuf.length);

    // Send datagram.

    final byte[] sendBuf = getSsdpRequest().getBytes();

    try {

        sendPacket = new DatagramPacket(

                sendBuf,

                sendBuf.length,

                isAddress);

    } catch (SocketException e) {

       ……

    }

    // Broadcast.

    if (IS_DEBUG) Log.logDebug(TAG, "Do SSDP broadcast.");

    try {

        udpSocket.send(sendPacket);

    } catch (IOException e) {

       ……

    }

    // Receive 1 response.

    try {

        udpSocket.setSoTimeout(SSDP_UDP_SOCKET_TIMEOUT);

    } catch (SocketException e) {

       ……

    }

    try {

        if (IS_DEBUG) Log.logHttp(TAG, "Do receive response packet.");

        udpSocket.receive(receivePacket);

    } catch (IOException e) {

       ……

    }

    String ssdpResMessage = new String(

            receivePacket.getData(),

            0,

            receivePacket.getLength());

    if (IS_DEBUG) Log.logHttp(TAG, "RES = \n" + ssdpResMessage);

    // Create RemoteDevice.

    List<RemoteDevice> remoteDevices = new ArrayList<RemoteDevice>();

    String deviceUuid = findValueFromKeyInHttpReqRes(

            ssdpResMessage,

            SSDP_HEADER_KEY_USN);

    String deviceLocation = findValueFromKeyInHttpReqRes(

            ssdpResMessage,

            SSDP_HEADER_KEY_LOCATION);

    if (deviceUuid != null && deviceLocation != null) {

        RemoteDevice remoteDevice = RemoteDevice.load(

                deviceUuid,

                deviceLocation);

        if (remoteDevice == null) {

            // Failed to load  RemoteDevice.

            if (udpSocket != null && !udpSocket.isClosed()) {

                udpSocket.close();

            }

            // Callback.

            notifyCallback(callback, callbackExecutor, ResultStatus.NETWORK_ERROR, null,

                    retryCount);

            return;

        }

        remoteDevices.add(remoteDevice);

    }

    // Callback.

    notifyCallback(callback, callbackExecutor, ResultStatus.NO_ERROR, remoteDevices,

            retryCount);

    if (udpSocket != null && !udpSocket.isClosed()) {

        udpSocket.close();

    }

    if (IS_DEBUG) Log.logHttp(TAG, "request() : X");

}
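getSsdpRequest() itself is not listed above; for reference, a standard SSDP M-SEARCH discovery message, which is what such a helper builds, looks roughly like this (the ST search target below is an assumption, not taken from the source):

// Sketch of a typical SSDP M-SEARCH request body. The concrete ST value used by the real
// getSsdpRequest() is not shown in the source; the one below is only an assumption.
private static String getSsdpRequestSketch() {
    return "M-SEARCH * HTTP/1.1\r\n"
            + "HOST: 239.255.255.250:1900\r\n"
            + "MAN: \"ssdp:discover\"\r\n"
            + "MX: 1\r\n"
            + "ST: urn:schemas-sony-com:service:ScalarWebAPI:1\r\n"
            + "\r\n";
}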

private static void notifyCallback(

        final OnRemoteDeviceScannedCallback callback,

        final ExecutorService callbackExecutor,

        final ResultStatus resultStatus,

        final List<RemoteDevice> remoteDevices,

        final int retryCount) {

    callbackExecutor.execute(new Runnable() {

        @Override

        public void run() { // At this point the callback passed in at the start receives the remoteDevices

            callback.onRemoteDeviceScanned(resultStatus, remoteDevices, retryCount);

        }

    });

}

2. Processing the returned data in the callback

vim vendor/semc/packages/apps/camera-addons/OnlineRemoteCamera/src/com/sonymobile/android/addoncamera/onlineremote/controller/StateMachineController.java

private class RemoteScanCallback implements RemoteDeviceScanner.OnRemoteDeviceScannedCallback {

    @Override

    public void onRemoteDeviceScanned(

            RemoteDeviceScanner.ResultStatus status,

            List<RemoteDevice> remoteDevices,

            int retryCount) {

        if (mWifiEnv == null) {

            if (IS_WIFIP2P_DEBUG) logDebug("onRemoteDeviceScanned() already Wifi released.");

            return;

        }

        if (status == RemoteDeviceScanner.ResultStatus.NO_ERROR) {

            if (remoteDevices.size() == 0) {

                ……

                cancelConnect();

                return;

            }

            // Create RemoteDeviceHandler.

            mRemoteDeviceHandler = new RemoteDeviceHandler(

                    remoteDevices.get(0), // Top device.

                    mBackWorker);

            mRemoteDeviceHandler.openEvf(new RemoteEvfCallback());  // Obtain the EVF stream data

            mRemoteDeviceHandler.startMonitoring();

        } else if (retryCount <= MAX_DEVICE_SCAN_RETRY_COUNT) {

            // Retry request.

            ……

            RemoteDeviceScanner.request(

                    new RemoteScanCallback(),

                    mBackWorker,

                    retryCount + 1);

        } else {

            // Retry time out.

            cancelConnect();

        }

    }

}

vi vendor/semc/packages/apps/camera-addons/common-components/src/com/sonymobile/cameracommon/remotedevice/RemoteDeviceHandler.java

public RemoteDeviceHandler(RemoteDevice target, ExecutorService callbackExecutor) {

    mRemoteDevice = target;

    mCallbackExecutor = callbackExecutor;

    // Event observer.

    mEventObserver = new EventObserverController(

            mRemoteDevice,

            mCallbackExecutor);

    // TakePicture controller.

    mTakePictureController = new TakePictureController(mRemoteDevice, mCallbackExecutor);

    // MovieRec controller.

    mMovieRecController = new MovieRecController(mRemoteDevice, mCallbackExecutor);

    // Register callbacks.

    mEventObserver.addCallback(mRemoteDevice.getEventObserverCallback());

    mEventObserver.addCallback(mTakePictureController.getEventObserverCallback());

    mEventObserver.addCallback(mMovieRecController.getEventObserverCallback());

}

public void openEvf(EvfStreamCallback callback) {

    if (IS_DEBUG) Log.logDebug(TAG, "openEvf() : E");

    mIsOnEvfStreaming = true;

    // Open stream slicer.

    mEvfStreamSlicer = new EvfStreamController(mRemoteDevice, callback, mCallbackExecutor);

    mEvfStreamSlicer.open();

    if (IS_DEBUG) Log.logDebug(TAG, "openEvf() : X");

}

vim vendor/semc/packages/apps/camera-addons/common-components/src/com/sonymobile/cameracommon/remotedevice/stream/EvfStreamController.java

public void open() {

    ……

    mIsEvfLoading = true;

    mInternalExecutor.execute(new Runnable() {

        @Override

        public void run() {

            if (doOpen()) {

                doStartEvfLoading();

            }

        }

    });

}

private void doStartEvfLoading() {

    if (!mInternalExecutor.isShutdown()) {

        mInternalExecutor.execute(new LoadEvfFrameTask());

    }

}
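doOpen() is elided above. In a Sony Camera Remote API style of RPC, opening the EVF stream usually means calling the camera's startLiveview method over HTTP/JSON and opening the returned liveview URL as an InputStream; the endpoint and JSON layout below are assumptions, not taken from the source:

// Rough sketch of what an EVF open step can look like, assuming a JSON-RPC style
// "startLiveview" call that returns a stream URL. Names and endpoint are illustrative.
import java.io.BufferedReader;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import org.json.JSONArray;
import org.json.JSONObject;

class EvfOpenSketch {
    static InputStream openLiveviewStream(String cameraServiceUrl) throws Exception {
        // 1. Ask the camera to start liveview via a JSON-RPC style HTTP POST.
        JSONObject request = new JSONObject()
                .put("method", "startLiveview")
                .put("params", new JSONArray())
                .put("id", 1)
                .put("version", "1.0");
        HttpURLConnection conn = (HttpURLConnection) new URL(cameraServiceUrl).openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        OutputStream os = conn.getOutputStream();
        os.write(request.toString().getBytes("UTF-8"));
        os.close();
        // 2. The reply carries the liveview URL; open it as the EVF byte stream.
        BufferedReader reader = new BufferedReader(new InputStreamReader(conn.getInputStream(), "UTF-8"));
        StringBuilder response = new StringBuilder();
        for (String line; (line = reader.readLine()) != null; ) {
            response.append(line);
        }
        reader.close();
        String liveviewUrl = new JSONObject(response.toString()).getJSONArray("result").getString(0);
        return new URL(liveviewUrl).openStream();
    }
}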

private class LoadEvfFrameTask implements Runnable {  // Fetch frame data via the RPC (remote procedure call) protocol

    @Override

    public void run() {

        while (mIsEvfLoading) {

            if (IS_DEBUG) Log.logDebug(TAG, "Load 1 frame : START");

            // Load common header.

            int commonHeaderSize

                    = PACKET_HEADER_SIZE_START_BYTE

                    + PACKET_HEADER_SIZE_PAYLOAD_TYPE

                    + PACKET_HEADER_SIZE_SEQUENCE_NUMBER

                    + PACKET_HEADER_SIZE_TIMESTAMP;

            byte[] commonHeader = Util.readInputStreamAsByteArray(

                    TAG,

                    mEvfStream,

                    commonHeaderSize);

            if (commonHeader == null || commonHeader.length != commonHeaderSize) {

                Log.logError(TAG, "Failed to read Common Header.");

                if (mCallbackExecutor != null) {

                    mCallbackExecutor.execute(new NotifyEvfFrameTask(null, mCallback));

                }

                return;

            }

            if (commonHeader[0] != PACKET_HEADER_START_BYTE) {

                Log.logError(TAG, "Invalid common header. (start byte)");

                if (mCallbackExecutor != null) {

                    mCallbackExecutor.execute(new NotifyEvfFrameTask(null, mCallback));

                }

                return;

            }

            if (commonHeader[1] != PACKET_HEADER_PAYLOAD_TYPE_LIVEVIEW) {

                Log.logError(TAG, "Invalid common header. (payload type)");

                if (mCallbackExecutor != null) {

                    mCallbackExecutor.execute(new NotifyEvfFrameTask(null, mCallback));

                }

                return;

            }

            // Load payload header.

            int payloadHeaderSize

                    = PAYLOAD_HEADER_SIZE_START_BYTE

                    + PAYLOAD_HEADER_SIZE_JPEG_DATA_SIZE

                    + PAYLOAD_HEADER_SIZE_PADDING_SIZE

                    + PAYLOAD_HEADER_SIZE_RESERVED_0

                    + PAYLOAD_HEADER_SIZE_FLAG

                    + PAYLOAD_HEADER_SIZE_RESERVED_1;

            byte[] payloadHeader = Util.readInputStreamAsByteArray(

                    TAG,

                    mEvfStream,

                    payloadHeaderSize);

            if (payloadHeader == null || payloadHeader.length != payloadHeaderSize) {

                Log.logError(TAG, "Failed to read Payload Header.");

                if (mCallbackExecutor != null) {

                    mCallbackExecutor.execute(new NotifyEvfFrameTask(null, mCallback));

                }

                return;

            }

            if (payloadHeader[0] != PAYLOAD_HEADER_START_BYTE_0

                    || payloadHeader[1] != PAYLOAD_HEADER_START_BYTE_1

                    || payloadHeader[2] != PAYLOAD_HEADER_START_BYTE_2

                    || payloadHeader[3] != PAYLOAD_HEADER_START_BYTE_3) {

                Log.logError(TAG, "Invalid payload header. (start byte)");

                if (mCallbackExecutor != null) {

                    mCallbackExecutor.execute(new NotifyEvfFrameTask(null, mCallback));

                }

                return;

            }

            // Load JPEG data.

            int jpegSize = Util.byteArray2Integer(

                    payloadHeader,

                    PAYLOAD_HEADER_SIZE_START_BYTE,

                    PAYLOAD_HEADER_SIZE_JPEG_DATA_SIZE);

            byte[] jpegData = Util.readInputStreamAsByteArray(

                    TAG,

                    mEvfStream,

                    jpegSize);

            // Load padding data.

            int paddingSize = Util.byteArray2Integer(

                    payloadHeader,

                    PAYLOAD_HEADER_SIZE_START_BYTE + PAYLOAD_HEADER_SIZE_JPEG_DATA_SIZE,

                    PAYLOAD_HEADER_SIZE_PADDING_SIZE);

            byte[] paddingData = Util.readInputStreamAsByteArray(

                    TAG,

                    mEvfStream,

                    paddingSize);

            if (paddingData != null) {

                if (IS_DEBUG) Log.logDebug(TAG, "Padding size = " + paddingData.length);

            } else {

                if (IS_DEBUG) Log.logDebug(TAG, "Cannot get Padding data.");

            }

            if (mCallbackExecutor != null) {

                mCallbackExecutor.execute(new NotifyEvfFrameTask(jpegData, mCallback));

            }

            if (IS_DEBUG) Log.logDebug(TAG, "Load 1 frame : END");

        }

    }

}
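Putting the constants used by LoadEvfFrameTask together, each liveview packet has the following layout. The field order is taken from the reads above; the byte widths in parentheses are typical Sony liveview values and are only an assumption here:

// Liveview packet layout as consumed by LoadEvfFrameTask.
//
// Common header:
//   [ start byte (1) | payload type (1) | sequence number (2) | timestamp (4) ]
//   - the start byte must equal PACKET_HEADER_START_BYTE
//   - the payload type must equal PACKET_HEADER_PAYLOAD_TYPE_LIVEVIEW
//
// Payload header:
//   [ start bytes (4) | JPEG data size (3) | padding size (1) | reserved | flag | reserved ]
//   - the four start bytes are checked against PAYLOAD_HEADER_START_BYTE_0..3
//   - "JPEG data size" and "padding size" drive the two reads that follow
//
// Payload:
//   [ JPEG data (jpegSize bytes) | padding (paddingSize bytes) ]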

private static class NotifyEvfFrameTask implements Runnable {

    /** Callback. */

    private final EvfStreamCallback mCallback;

    /** Frame. */

    private final byte[] mFrame;

    /**

     * CONSTRUCTOR.

     *

     * @param frame

     * @param callback

     */

    public NotifyEvfFrameTask(byte[] frame, EvfStreamCallback callback) {

        mFrame = frame;

        mCallback = callback;

    }

    @Override

    public void run() {

        if (mCallback != null) {

            mCallback.onEvfFrame(mFrame);   // The callback passed down receives the frame data

        }

    }

}

vim vendor/semc/packages/apps/camera-addons/OnlineRemoteCamera/src/com/sonymobile/android/addoncamera/onlineremote/controller/StateMachineController.java

private class RemoteEvfCallback implements EvfStreamCallback {

    private FrameConversionTask mFrameConversionTask = null;

    private Queue<byte[]> mFrameStack = new ConcurrentLinkedQueue<byte[]>();

    private static final int MAX_FRAME_STACK_COUNT = 6;

    private ScheduledExecutorService mLiveViewFinderWorker = Executors.newScheduledThreadPool(

            1,

            new LiveViewFinderWorkerThreadFactory());

    private class LiveViewFinderWorkerThreadFactory implements ThreadFactory {

        @Override

        public Thread newThread(Runnable runnable) {

            Thread thread = new Thread(runnable, "evf-worker-th");

            thread.setPriority(Thread.MAX_PRIORITY);

            return thread;

        }

    }

    @Override

    public void onEvfOpened(boolean isSuccess) {

        ……

        if (!isSuccess) {

            if (IS_WIFIP2P_DEBUG) logDebug("onEvfOpened(): false");

            // Cancel connection.

            stopRemoteCameraPreview();

            cancelConnectToRemoteDevice();

            return;

        }

        mFrameStack.clear();

        mFrameConversionTask = new FrameConversionTask();

        mUiThreadHandler.post(new Runnable() {

            @Override

            public void run() {

                if (mActivity.getThisNode() != null) {

                    sendEvent(

                            TransitterEvent.EVENT_CONNECTION_REMOTE_DEVICE_READY,

                            Util.getLayoutNodeId(

                                    mActivity.getThisNode().getMacAddress(),

                                    Util.STREAM_ID_REMOTE_DEVICE));

                }

            }

        });

    }

    @Override

    public void onEvfFrame(byte[] frame) {

        ……

        // Cache.

        while(MAX_FRAME_STACK_COUNT < mFrameStack.size()) {

            mFrameStack.remove();

        }

        mFrameStack.offer(frame);  // Cache the frame data

        // Initialize. Call this after set frame.

        if (!mFrameConversionTask.isAlreadyInitialized()) {

            mFrameConversionTask.initialize();

            mLiveViewFinderWorker.scheduleAtFixedRate(

                    mFrameConversionTask,

                    1000 / 15,

                    1000 / 15,

                    TimeUnit.MILLISECONDS);

        }

    }

}
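FrameConversionTask itself is not listed in this article. From the way it is driven (frames cached in mFrameStack and a fixed-rate schedule of 1000/15 ms, i.e. roughly 15 fps), its job is to take the newest cached JPEG frame, decode it, and hand it to the renderer. A hypothetical sketch:

// Illustrative sketch only; the real FrameConversionTask is not shown in this article.
// It is scheduled at a fixed rate (~15 fps) and drains mFrameStack filled by onEvfFrame().
private class FrameConversionTaskSketch implements Runnable {
    @Override
    public void run() {
        byte[] jpeg = null;
        // Keep only the newest frame; older cached frames are dropped.
        for (byte[] next = mFrameStack.poll(); next != null; next = mFrameStack.poll()) {
            jpeg = next;
        }
        if (jpeg == null) {
            return; // Nothing new since the last tick.
        }
        android.graphics.Bitmap frame =
                android.graphics.BitmapFactory.decodeByteArray(jpeg, 0, jpeg.length);
        if (frame != null) {
            // Hand the decoded frame to the renderer; the exact call is an assumption here.
            // mGLRendererAccessor.requestFrame(..., frame);
        }
    }
}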

3.1.2 connectToXperia camera

1) Server side: send the preview data stream (delivered to the app via callback) out to the network

Wi-Fi P2P sends the data to the network.

Register the preview callback:

vim vendor/semc/packages/apps/camera-addons/OnlineRemoteCamera/src/com/sonymobile/android/addoncamera/onlineremote/device/CameraDeviceHandler.java

@Override

public void preparePreviewCallbackWithBuffer() {

    // Get camera instance.

    Camera camera = getCameraInstance();

    if (camera == null) {

        return;

    }

    // Add callback.

    camera.setPreviewCallbackWithBuffer(mOnPreviewFrameCallback);

}

class OnPreviewFrameCallback implements Camera.PreviewCallback {

    @Override

    public void onPreviewFrame(byte[] frame, Camera camera) {

        ……

        if (mStateMachine != null) {

            mStateMachine.onPreviewFrameUpdated(frame);

        }

        ……

    }

}
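Since setPreviewCallbackWithBuffer() only delivers frames into buffers the application has registered, a buffer has to be queued up front and re-queued after each frame is consumed (which is what requestNextFrame() in the next snippet is expected to do). A minimal sketch of that round trip on the legacy android.hardware.Camera API; the helper names are illustrative:

// Minimal sketch of the preview-with-buffer round trip on the legacy Camera API.
// allocateAndQueueBuffer()/requeueBuffer() are illustrative names, not from the real code.
private void allocateAndQueueBuffer(android.hardware.Camera camera) {
    android.hardware.Camera.Size size = camera.getParameters().getPreviewSize();
    // NV21 preview frames need width * height * 3 / 2 bytes.
    byte[] buffer = new byte[size.width * size.height * 3 / 2];
    camera.addCallbackBuffer(buffer);
}

private void requeueBuffer(android.hardware.Camera camera, byte[] usedFrame) {
    // Give the just-consumed buffer back so the next onPreviewFrame() can reuse it.
    camera.addCallbackBuffer(usedFrame);
}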

vim vendor/semc/packages/apps/camera-addons/OnlineRemoteCamera/src/com/sonymobile/android/addoncamera/onlineremote/controller/StateMachineController.java

@Override

public void onPreviewFrameUpdated(byte[] frame) {

    // Rendering culled out.

    ++mThisFrameUpdatedCount;

    if (mThisFrameUpdatedCount % mFrameRenderingRequestStride == 0) {

        // Current.

        FrameData frameData = new FrameData(

                FrameData.ImageFormat.YVU420_SEMIPLANAR,

                mCameraDeviceHandler.getPreviewRect().width(),

                mCameraDeviceHandler.getPreviewRect().height(),

                frame);

        // Render.

        mGLRendererAccessor.setMirrored(MultiCameraFrameRender.NODE_ID_THIS_DEVICE,

                mSettingPreferences.getCurrentFacing().isPreviewShowMirrored());

        mGLRendererAccessor.requestFrame(MultiCameraFrameRender.NODE_ID_THIS_DEVICE, frameData);

        // Streaming.

        int frameStreamId = mSettingPreferences.getCurrentFacing().getStreamId();

        requestViewFinderStreaming(frameStreamId, frame);

    }

    // Next.

    requestNextFrame();

}

private void requestViewFinderStreaming(int frameStreamId, byte[] frame) {

    ……

    // Copy.

    mPreResizedFrameBufferRing.increment();

    System.arraycopy(

            frame,

            0,

            mPreResizedFrameBufferRing.getCurrent().array(),

            mPreResizedFrameBufferRing.getCurrent().arrayOffset(),

            frame.length);

    // Back task.

    RequestEncodeAndSubmitTask task = new RequestEncodeAndSubmitTask(

            mCameraDeviceHandler.getPreviewRect(),

            mPreResizedFrameBufferRing.getCurrent(),

            frameStreamId,

            mJpegEncParams,

            mPostResizedFrameBuffer);

    mSubmitQueueTaskSet.add(task);

    mSubmitQueueWorker.execute(task);

    if (IS_WIFIP2P_DEBUG) logFps("requestViewFinderStreaming() : X");

}
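mPreResizedFrameBufferRing is a small pool of reusable ByteBuffers, so the copy above does not allocate a new buffer on every preview frame. The real ByteBufferRing class is not listed here; a simplified sketch of the increment()/getCurrent() behaviour it exposes (the meaning of the third constructor flag seen later is not shown in the source and is omitted):

// Simplified sketch of a ByteBuffer ring; illustrates only the usage seen above.
import java.nio.ByteBuffer;

class ByteBufferRingSketch {
    private final ByteBuffer[] mBuffers;
    private int mIndex = 0;

    ByteBufferRingSketch(int count, int eachSize) {
        mBuffers = new ByteBuffer[count];
        for (int i = 0; i < count; ++i) {
            mBuffers[i] = ByteBuffer.allocate(eachSize); // array-backed, so array() works
        }
    }

    ByteBuffer getCurrent() {
        return mBuffers[mIndex];
    }

    void increment() {
        mIndex = (mIndex + 1) % mBuffers.length;
    }
}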

public RequestEncodeAndSubmitTask(

        Rect frameRect,

        ByteBuffer frameByteBuffer,

        int frameStreamId,

        JpegEncoder.Parameters jpegEncParams,

        ByteBuffer postResizedFrameByteBuffer) {

    mFrameRect = frameRect;

    mFrameByteBuffer = frameByteBuffer;

    mFrameStreamId = frameStreamId;

    mJpegEncParamsLocal = jpegEncParams;

    mPostResizedFrameByteBuffer = postResizedFrameByteBuffer;

}

/**

 * Release.

 */

public void release() {

    mIsReleased = true;

}

@Override

public void run() {

    ……

    ImageConvertor.shrinkYvu420Sp(

            mFrameByteBuffer,

            mFrameRect.width(),

            mFrameRect.height(),

            mPostResizedFrameByteBuffer,

            SHRINK_RATIO_FOR_STREAMING);

    ……

    byte[] jpegBytes = null;

    final int resizedFrameWidth

            = mFrameRect.width() / SHRINK_RATIO_FOR_STREAMING.shrinkSize;

    final int resizedFrameHeight

            = mFrameRect.height() / SHRINK_RATIO_FOR_STREAMING.shrinkSize;

    byte[] postResizedFrameBytes = ByteBufferUtil.array(mPostResizedFrameByteBuffer);

    if (ClassDefinitionChecker.isJpegEncoderSupported()) {

        JpegEncoder.Result result = mJpegEnc.process(

                postResizedFrameBytes,

                mJpegEncParamsLocal);

        jpegBytes = result.imageBuffer;

    } else {

        // YVU to JPEG encode.

        YuvImage yuvImage = new YuvImage(

                postResizedFrameBytes,

                ImageFormat.NV21,

                resizedFrameWidth,

                resizedFrameHeight,

                null);

        ByteArrayOutputStream os = new ByteArrayOutputStream();

        yuvImage.compressToJpeg(

                new Rect(0, 0, resizedFrameWidth, resizedFrameHeight),

                mTrafficJpegQuality,

                os);

        try {

            os.flush();

            jpegBytes = os.toByteArray();

            os.close();

        } catch (IOException e) {

            CameraLogger.e(TAG, "Stream release failed: ", e);

            return;

        }

    }

    if (IS_WIFIP2P_DEBUG) logFps("Encode YVU420SP->JPEG : OUT");

    // Submit to network.

    submitJpegBytes(

            jpegBytes,

            resizedFrameWidth,

            resizedFrameHeight,

            mFrameStreamId,

            new ViewFinderStreamSubmitCallback());

    ++mFrameId;

    mSubmitQueueTaskSet.remove(this);

    if (IS_WIFIP2P_DEBUG) logFps("RequestEncodeAndSubmitTask.run() : X");

}

private void submitJpegBytes(

        byte[] jpegBytes,

        int frameWidth,

        int frameHeight,

        int frameStreamId,

        MessageSubmittedCallback callback) {

    // Payload.

    byte[] payloadBytes = jpegBytes;

    int packetCount = payloadBytes.length / MAX_PAYLOAD_SIZE_OCTET;

    if (payloadBytes.length % MAX_PAYLOAD_SIZE_OCTET != 0) {

        ++packetCount;

    }

    byte[][] packetList = new byte[packetCount][];

    // Copy.

    for (int i = 0; i < payloadBytes.length; i += MAX_PAYLOAD_SIZE_OCTET) {

        int count = i / MAX_PAYLOAD_SIZE_OCTET;

        // Payload size.

        int remainSize = payloadBytes.length - i;

        int thisFragSize;

        if (remainSize < MAX_PAYLOAD_SIZE_OCTET) {

            thisFragSize = remainSize;

        } else {

            thisFragSize = MAX_PAYLOAD_SIZE_OCTET;

        }

        // Write payload.

        byte[] thisFrag = new byte[thisFragSize];

        System.arraycopy(payloadBytes, i, thisFrag, 0, thisFragSize);

        packetList[count] = thisFrag;

    }

    if (IS_WIFIP2P_DEBUG) logFps("Generate packet payloads : DONE");

    // Callback object.

    MultipleMessageSubmittedCallbackHandler callbackHandler

            = new MultipleMessageSubmittedCallbackHandler(callback);

    // Request submit.

    Iterator<NetworkNode> groupedItr = mGroupedNodeSet.iterator();

    while (groupedItr.hasNext()) {

        NetworkNode eachNode = groupedItr.next();

        if (eachNode.getIpAddress() != null && eachNode.getUdpPort() != 0) {

            // Check required.

            boolean isRequired = false;

            Set<NetworkNode.Stream> required = eachNode.getRequiredStreams();

            if (required == null) {

                // NOP. There is no required stream.

                break;

            }

            for (NetworkNode.Stream eachStream : eachNode.getRequiredStreams()) {

                if (eachStream.macAddress.equals(mActivity.getThisNode().getMacAddress())

                        && eachStream.id == frameStreamId) {

                    // OK.

                    isRequired = true;

                    break;

                }

            }

            if (isRequired) {

                // The Sony RemoteControl API uses 4 bytes for the timestamp and we follow that format,

                // so an int is used for the timestamp.

                int timestamp = (int) mActivity.getApplicationUptimeMilis();

                for (int i = 0; i < packetCount; ++i) {

                    FragmentFrame frag = FragmentFrame.generate(

                            mFrameId,

                            timestamp,

                            i,

                            packetCount,

                            frameStreamId,

                            Constants.FrameColorFormat.JPEG,

                            frameWidth,

                            frameHeight,

                            packetList[i]);

                    callbackHandler.countSubmit();

                    mViewFinderStream.requestSubmitMessage(

                            frag.getDatagramBuffer(),

                            eachNode.getIpAddress(),

                            eachNode.getUdpPort(),

                            callbackHandler);

                }

                ……

            }

        }

    }

    // Ready to callback.

    callbackHandler.setReadyToCallback();

}
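On the receiving side these UDP fragments have to be reassembled before the JPEG can be decoded, which is the job of FragmentFrameCache used later in DecodeFrameAndRequestRenderTask. Its implementation is not shown in this article; conceptually it only needs to collect fragments per frame ID until all packetCount pieces have arrived, roughly as in this sketch:

// Illustrative reassembly sketch; the real FragmentFrameCache is not listed in this
// article. It groups fragments by frame ID and concatenates them once all have arrived.
import java.io.ByteArrayOutputStream;
import java.util.HashMap;
import java.util.Map;

class FragmentReassemblySketch {
    private final Map<Integer, byte[][]> mPending = new HashMap<Integer, byte[][]>();

    // Returns the complete JPEG once the last fragment of a frame arrives, otherwise null.
    byte[] cache(int frameId, int fragmentIndex, int fragmentCount, byte[] payload) {
        byte[][] parts = mPending.get(frameId);
        if (parts == null) {
            parts = new byte[fragmentCount][];
            mPending.put(frameId, parts);
        }
        parts[fragmentIndex] = payload;
        for (byte[] part : parts) {
            if (part == null) {
                return null; // Still waiting for more fragments of this frame.
            }
        }
        mPending.remove(frameId);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        for (byte[] part : parts) {
            out.write(part, 0, part.length);
        }
        return out.toByteArray();
    }
}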

vim vendor/semc/packages/apps/camera-addons/common-components/src/com/sonymobile/cameracommon/wifip2pcontroller/communication/UniCastStream.java

public void requestSubmitMessage(

        byte[] message,

        String targetIpAddress,

        int targetPort,

        MessageSubmittedCallback callback) {

    executeSubmitTask(new SubmitTask(message, targetIpAddress, targetPort, callback));

}

public SubmitTask(

        byte[] message,

        String targetIpAddress,

        int targetPort,

        MessageSubmittedCallback callback) {

    ……

    mMessage = message;

    mTargetIpAddress = targetIpAddress;

    mTargetPort = targetPort;

    mCallback = callback;

}

@Override

public void run() {

    // Target Address + port.

    InetSocketAddress targetIpPort = new InetSocketAddress(

            mTargetIpAddress,

            mTargetPort);

    DatagramSocket datagramSocket = null;

    DatagramPacket datagramPacket = null;

    boolean isSuccess = false;

    // Total exception barrier.

    try {

        // UDP socket.

        datagramSocket = new DatagramSocket(0);

        // UDP packet. The socket sends the data out to the network.

        datagramPacket = new DatagramPacket(

                mMessage,

                mMessage.length,

                targetIpPort);

        // Send.

        datagramSocket.send(datagramPacket);

        // Check.

        isSuccess = true;

    } catch (IOException e) {

        Log.logError(TAG, "UniCastStream.SubmitTask.", e);

    } finally {

        if (datagramSocket != null) {

            datagramSocket.close();

            datagramSocket = null;

        }

    }

    // Notify callback.

    if (mCallback != null) {

        if (isSuccess) {

            mCallback.onSubmitSucceeded();

        } else {

            mCallback.onSubmitFailed();

        }

    }

}

2) Client side: receive the image data from the network

vim vendor/semc/packages/apps/camera-addons/OnlineRemoteCamera/src/com/sonymobile/android/addoncamera/onlineremote/controller/StateMachineController.java

private void startWifiP2pRemoteCameraEnvironment() {

    // Environment.

    mWifiP2pEnv = new WifiP2pNetworkEnvironment();

    mWifiP2pEnv.initialize(

            mActivity,

            mActivity.getAppVersionCode());

    mWifiP2pEnv.setPassKeyManager(mActivity.getPassKeyManager());

    // Image processor.

    if (ClassDefinitionChecker.isJpegEncoderSupported()) {

        mJpegEnc = JpegEncoder.create();

    }

    createJpegEncoderParams();

    // Buffer.

    changeBufferSize();

    mAvailableStreamSet.clear();

    // Frame ID related data.

    mFrameIdRelatedDataMap.clear();

    // Queue.

    mSubmitQueueTaskSet.clear();

    mSubmitQueueWorker = Executors.newSingleThreadExecutor(

            new QueueWorkerThreadFactory("encode-submit"));

    mRenderingWorker = Executors.newScheduledThreadPool(

            1,

            new QueueWorkerThreadFactory("request-render"));

    mRenderingWorker.scheduleAtFixedRate(

            mRequestRenderTask,

            1000 / OnlineRemoteCameraActivity.TOTAL_REQUIRED_FPS,

            1000 / OnlineRemoteCameraActivity.TOTAL_REQUIRED_FPS,

            TimeUnit.MILLISECONDS);

    // Stream.

    mViewFinderStream = new UniCastStream();

    // Set callbacks.

    mWifiP2pEnv.setCallbacks(new NetworkStateCallbackImpl(), new NodeStateCallbackImpl());

    // DEBUG.

    if (IS_DEBUG_DISPLAY_ON) {

        mDebugDisplay = new DebugDisplayRefreshTask();

        mUiThreadHandler.post(mDebugDisplay);

    }

}

public void onThisNodeChanged(NetworkNode thisNode) {  // Method of the NodeStateCallbackImpl class

    if (IS_WIFIP2P_DEBUG) logDebug("onThisNodeChanged() :" + thisNode.toString());

    // Update.

    mActivity.setThisNode(thisNode);

    // Update current this node stream.

    updateStreamByCurrentFacing();

    if (!mIsAlreadyStreamReceiving && thisNode.getUdpPort() != 0 &&

            mViewFinderStream != null) {

        mIsAlreadyStreamReceiving = true;

        mViewFinderStream.startReceiveMessage(

                thisNode.getUdpPort(),

                UDP_MSG_MAX_SIZE_OCTET,

                new ViewFinderStreamReceiveCallback());

    }

}

vim vendor/semc/packages/apps/camera-addons/common-components/src/com/sonymobile/cameracommon/wifip2pcontroller/communication/UniCastStream.java

public void startReceiveMessage(

        int targetPort,

        int dataSize,

        MessageReceivedCallback callback) {

    startReceiveMessageImpl(

            callback,

            new ReceiveTaskImpl(targetPort, dataSize));

}

private class ReceiveTaskImpl implements ReceiveTask {

    /** This task is alive or not.*/

    volatile private boolean mIsAlive = true;

    /** Target port.*/

    private final int mTargetPort;

    /** Receive data packet byte size.*/

    private final int mReceivedDataSize;

    //

    ReceiveTaskImpl(int targetPort, int dataSize) {

        mTargetPort = targetPort;

        mReceivedDataSize = dataSize;

    }

    /**

     * Kill task.

     */

    @Override

    public void release() {

        mIsAlive = false;

    }

    @Override

    public void run() {

        // Receive data packet.

        byte[] buffer = new byte[mReceivedDataSize];

        DatagramPacket receivePacket = new DatagramPacket(buffer, buffer.length);

        DatagramSocket datagramSocket = null;

        // Fail safe loop.

        while (mIsAlive) {

            // Total exception barrier.

            try {

                // Socket.

                datagramSocket = new DatagramSocket(mTargetPort);

                datagramSocket.setReuseAddress(true);

                datagramSocket.setSoTimeout(SOCKET_SO_TIMEOUT);

                // Loop.

                while (mIsAlive) {

                    // Wait for receive.

                    try {

                        datagramSocket.receive(receivePacket);

                        // Packet buffer.

                        byte[] receiveBuffer = receivePacket.getData();

                        int length = receivePacket.getLength();

                        int offset = receivePacket.getOffset();

                        // Address.

                        InetAddress remoteIp = receivePacket.getAddress();

                        String hostIp = remoteIp.getHostAddress();

                        // Notify callback.

                        MessageReceivedCallback callback = mMessageReceivedCallback;

                        if (callback != null) {

                            byte[] message = Arrays.copyOfRange(receiveBuffer, offset, length);

                            callback.onMessageReceived(message, hostIp);

                        }

                    } catch (SocketTimeoutException e) {

                        ……

                    }

                }

            } catch (IOException e) {

                Log.logError(TAG, "UniCastStream.ReceiveTask.", e);

            } finally {

                if (datagramSocket != null) {

                    datagramSocket.close();

                    datagramSocket = null;

                }

            }

        }

    }

}

vim vendor/semc/packages/apps/camera-addons/OnlineRemoteCamera/src/com/sonymobile/android/addoncamera/onlineremote/controller/StateMachineController.java

The callback that receives the image data:

private class ViewFinderStreamReceiveCallback implements MessageReceivedCallback {

    @Override

    public void onMessageReceived(byte[] message, String ipAddress) {

        if (IS_WIFIP2P_DEBUG) logFps("onMessageReceived() : E");

        FragmentFrame fragment = FragmentFrame.generate(message);

        if (fragment == null) {

            if (IS_WIFIP2P_DEBUG) logDebug("Fragment is NULL");

            return;

        }

        if (IS_WIFIP2P_DEBUG) logDebug(fragment.toString());

        // Render queue.

        FragmentFrameCache frameCache = null;

        // Executor.

        ExecutorService backWorker = null;

        // Buffer ring.

        ByteBufferRing bufRing = null;

        // FrameData queue.

        Queue<FrameData> frameDataStack = null;

        // Frame ID.

        String frameId = createFrameId(ipAddress, fragment.frameStreamId);

        if (frameId == null) {

            if (IS_WIFIP2P_DEBUG) logDebug("FrameID is NULL");

            return;

        }

        // Host node.

        NetworkNode hostNode = com.sonymobile.cameracommon.wifip2pcontroller.util.Util

                .getNetworkNodeWithIpAddress(mGroupedNodeSet, ipAddress);

        ……

        // Check and create render queue.

        if (mFrameIdRelatedDataMap.containsKey(frameId)) {

            // Already exists.

            frameCache = mFrameIdRelatedDataMap.get(frameId).getFragmentFrameCache();

            backWorker = mFrameIdRelatedDataMap.get(frameId).getPostReceiveExecutor();

            bufRing = mFrameIdRelatedDataMap.get(frameId).getBufferRing();

            final int expectedYuvSize =

                    fragment.framePixelWidth * fragment.framePixelHeight * 3 / 2;

            if (expectedYuvSize > bufRing.getCurrent().capacity()) {

                bufRing = new ByteBufferRing(

                        MAX_RECEIVED_FRAME_STACK_BUFFER_RING_SIZE,

                        fragment.framePixelWidth * fragment.framePixelHeight *

                                3 / 2, // YVU420SP

                        true);

                // Replace buffer ring.

                FrameIdRelatedDataContainer container =

                        new FrameIdRelatedDataContainer(

                                mFrameIdRelatedDataMap.get(frameId).getFragmentFrameCache(),

                                mFrameIdRelatedDataMap.get(frameId).getPostReceiveExecutor(),

                                bufRing,

                                mFrameIdRelatedDataMap.get(frameId).getReceivedFrameDataStack(),

                                mFrameIdRelatedDataMap.get(frameId).getStreamId());

                mFrameIdRelatedDataMap.put(frameId, container);

            }

            frameDataStack = mFrameIdRelatedDataMap.get(frameId)

                    .getReceivedFrameDataStack();

        } else {

            // New node.

            frameCache = new FragmentFrameCache();

            backWorker = Executors.newSingleThreadExecutor(

                    new QueueWorkerThreadFactory("decode"));

            bufRing = new ByteBufferRing(

                    MAX_RECEIVED_FRAME_STACK_BUFFER_RING_SIZE,

                    fragment.framePixelWidth * fragment.framePixelHeight *

                            3 / 2, // YVU420SP

                    true);

            frameDataStack = new ConcurrentLinkedQueue<FrameData>();

            // Create frame ID related data.

            FrameIdRelatedDataContainer container =

                    new FrameIdRelatedDataContainer(

                            frameCache,

                            backWorker,

                            bufRing,

                            frameDataStack,

                            fragment.frameStreamId);

            mFrameIdRelatedDataMap.put(frameId, container);

        }

        // Update streamId.

        mFrameIdRelatedDataMap.get(frameId).setStreamId(fragment.frameStreamId);

        ……

        // Back task.

        DecodeFrameAndRequestRenderTask task = new DecodeFrameAndRequestRenderTask(

                hostNode,

                frameCache,

                fragment,

                bufRing,

                frameDataStack);

        backWorker.execute(task);

        if (IS_WIFIP2P_DEBUG) logFps("onMessageReceived() : X");

    }

}

DecodeFrameAndRequestRenderTask(

        NetworkNode node,

        FragmentFrameCache cache,

        FragmentFrame fragment,

        ByteBufferRing bufferRing,

        Queue<FrameData> frameDataStack) {

    mNode = node;

    mFrameCache = cache;

    mFragment = fragment;

    mBufferRing = bufferRing;

    mFrameDataStack = frameDataStack;

}

@Override

public void run() {

    // Check and create re-constructor.

    mFrameCache.cache(mFragment);

    // Check hit.

    byte[] completedFrame = mFrameCache.completedFrame();

    if (IS_WIFIP2P_DEBUG) logFps("generate completed Frame : DONE");

    if (completedFrame != null && mGLRendererAccessor != null) {

        int frameWidth = 0;

        int frameHeight = 0;

        // JPEG decode.

        if (IS_WIFIP2P_DEBUG) logFps("Decode JPEG->YVU420SP : IN");

        byte[] yvuBytes = null;

        // Decode JPEG.

        try {

            ImageConvertor.decodeJpegToYvu420Sp(

                    completedFrame,

                    mFragment.framePixelWidth,

                    mFragment.framePixelHeight,

                    mBufferRing.getCurrent().array(),

                    false);

            frameWidth = mFragment.framePixelWidth;

            frameHeight = mFragment.framePixelHeight;

        } catch (ImageConvertorException e) {

            if (CameraLogger.DEBUG) CameraLogger.d(TAG,

                    "ImageConvertorException: " + e);

            // Get ARGB8888 buffer.

            Bitmap remoteFrameBmp = BitmapFactory.decodeByteArray(

                    completedFrame,

                    0,

                    completedFrame.length);

            if (mPostDecodeFrameBuffer == null) {

                int length = remoteFrameBmp.getWidth() * remoteFrameBmp.getHeight();

                mPostDecodeFrameBuffer = new int[length];

            }

            remoteFrameBmp.getPixels(

                    mPostDecodeFrameBuffer,

                    0,

                    remoteFrameBmp.getWidth(),

                    0,

                    0,

                    remoteFrameBmp.getWidth(),

                    remoteFrameBmp.getHeight());

            // Convert ARGB8888 to YVU420SP.

            ImageConvertor.convertArgb8888ToYvu420Sp(

                    remoteFrameBmp.getWidth(),

                    remoteFrameBmp.getHeight(),

                    mPostDecodeFrameBuffer,

                    mBufferRing.getCurrent().array());

            frameWidth = remoteFrameBmp.getWidth();

            frameHeight = remoteFrameBmp.getHeight();

        }

        yvuBytes = mBufferRing.getCurrent().array();

        mBufferRing.increment();

        if (IS_WIFIP2P_DEBUG) logFps("Decode JPEG->YVU420SP : OUT");

        // Frame.

        FrameData frameData = new FrameData(

                FrameData.ImageFormat.YVU420_SEMIPLANAR,

                frameWidth,

                frameHeight,

                yvuBytes);

        // Stack. Store the frame data in the frame queue.

        mFrameDataStack.offer(frameData);

        // Clean up.

        if (MAX_RECEIVED_FRAME_STACK_COUNT < mFrameDataStack.size()) {

            // Too old. Remove oldest one.

            mFrameDataStack.poll();

        }

    }

    ……

}
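The decoded FrameData objects queued above are consumed by mRequestRenderTask, which startWifiP2pRemoteCameraEnvironment() schedules at TOTAL_REQUIRED_FPS. That task is not listed in this article; its essential behaviour is to poll the frame queue and render only the newest frame, along these hypothetical lines (self-contained sketch, not the real class):

// Hypothetical sketch of the periodic render pump; the real mRequestRenderTask is not
// shown in this article. It drains a per-stream queue and renders only the newest frame.
import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

class RenderPumpSketch {
    interface Renderer {
        void render(byte[] yvuFrame, int width, int height);
    }

    private final Queue<byte[]> mDecodedFrames = new ConcurrentLinkedQueue<byte[]>();

    void offerDecodedFrame(byte[] yvuFrame) {
        mDecodedFrames.offer(yvuFrame);
    }

    void start(final Renderer renderer, final int width, final int height, int fps) {
        ScheduledExecutorService worker = Executors.newScheduledThreadPool(1);
        worker.scheduleAtFixedRate(new Runnable() {
            @Override
            public void run() {
                byte[] newest = null;
                // Drop everything but the most recent decoded frame.
                for (byte[] next = mDecodedFrames.poll(); next != null; next = mDecodedFrames.poll()) {
                    newest = next;
                }
                if (newest != null) {
                    renderer.render(newest, width, height);
                }
            }
        }, 1000 / fps, 1000 / fps, TimeUnit.MILLISECONDS);
    }
}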

 
