Can a Service create a UI (addView)?


Can an app that has only a Service and no Activity add a visible window via WindowManager.addView()?

Yes, it can, but only a small set of window types (WindowManager.LayoutParams.type) will work, and most of them require android.permission.INTERNAL_SYSTEM_WINDOW, a permission that can only be granted to system apps.

Creating and displaying a View from a process that has no Activity raises two problems:

  1. Threading: Android only allows the UI to be touched from the main thread.
  2. Token: when an Activity is started, an ActivityRecord is instantiated, which creates a token (ActivityRecord extends WindowToken).

The first problem is easy to solve. The Service lifecycle callback onStartCommand() runs on the main thread, so a Handler can be defined there; worker threads then send messages through that Handler, and the UI work is carried out back on the main thread.

    WindowManager mWindowManager;
    WindowManager.LayoutParams params;
    Handler handler;

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        mWindowManager = (WindowManager) getSystemService(WINDOW_SERVICE);
        params = new WindowManager.LayoutParams();
        // Bind the Handler to the main looper explicitly (the no-arg Handler() is deprecated).
        handler = new Handler(Looper.getMainLooper()) {
            @Override
            public void handleMessage(Message msg) {
                super.handleMessage(msg);
                switch (msg.what) {
                    case 1:
                        // A worker thread triggers this via handler.sendEmptyMessage(1)
                        CursorLocationView cursorLocationView = new CursorLocationView(BleService.this);
                        params.type = 2015;  // WindowManager.LayoutParams.TYPE_SECURE_SYSTEM_OVERLAY
                        mWindowManager.addView(cursorLocationView, params);
                        break;
                }
            }
        };
        return START_STICKY;
    }

The second problem is the token. In WindowManagerService.addWindow(), when token == null is detected, unprivilegedAppCanCreateTokenWith() inspects WindowManager.LayoutParams.type and returns false if the type is not one of the privileged ones:

//WindowManagerService.java
	private boolean unprivilegedAppCanCreateTokenWith(WindowState parentWindow,
            int callingUid, int type, int rootType, IBinder tokenForLog, String packageName) {
        if (rootType >= FIRST_APPLICATION_WINDOW && rootType <= LAST_APPLICATION_WINDOW) {
            ProtoLog.w(WM_ERROR, "Attempted to add application window with unknown token "
                    + "%s.  Aborting.", tokenForLog);
            return false;
        }
        if (rootType == TYPE_INPUT_METHOD) {
            ProtoLog.w(WM_ERROR, "Attempted to add input method window with unknown token "
                    + "%s.  Aborting.", tokenForLog);
            return false;
        }
        if (rootType == TYPE_VOICE_INTERACTION) {
            ProtoLog.w(WM_ERROR,
                    "Attempted to add voice interaction window with unknown token "
                            + "%s.  Aborting.", tokenForLog);
            return false;
        }
        if (rootType == TYPE_WALLPAPER) {
            ProtoLog.w(WM_ERROR, "Attempted to add wallpaper window with unknown token "
                    + "%s.  Aborting.", tokenForLog);
            return false;
        }
        if (rootType == TYPE_QS_DIALOG) {
            ProtoLog.w(WM_ERROR, "Attempted to add QS dialog window with unknown token "
                    + "%s.  Aborting.", tokenForLog);
            return false;
        }
        if (rootType == TYPE_ACCESSIBILITY_OVERLAY) {
            ProtoLog.w(WM_ERROR,
                    "Attempted to add Accessibility overlay window with unknown token "
                            + "%s.  Aborting.", tokenForLog);
            return false;
        }
        if (type == TYPE_TOAST) {
            // Apps targeting SDK above N MR1 cannot arbitrary add toast windows.
            if (doesAddToastWindowRequireToken(packageName, callingUid, parentWindow)) {
                ProtoLog.w(WM_ERROR, "Attempted to add a toast window with unknown token "
                        + "%s.  Aborting.", tokenForLog);
                return false;
            }
        }
        return true;
    }

As the checks show, the following types require a token; all of them are strongly tied to direct user interaction:

  • FIRST_APPLICATION_WINDOW~LAST_APPLICATION_WINDOW
  • TYPE_INPUT_METHOD
  • TYPE_VOICE_INTERACTION
  • TYPE_WALLPAPER
  • TYPE_QS_DIALOG
  • TYPE_ACCESSIBILITY_OVERLAY
  • TYPE_TOAST (exempted when appInfo.targetSdkVersion < Build.VERSION_CODES.O)
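The gist of unprivilegedAppCanCreateTokenWith() can be condensed into a plain-Java sketch. The constant values below are copied from android.view.WindowManager.LayoutParams; the toast special case is simplified into a single boolean flag, and the class/method names are illustrative, not the framework's:

```java
class TokenCheckDemo {
    // Values from android.view.WindowManager.LayoutParams.
    static final int FIRST_APPLICATION_WINDOW = 1;
    static final int LAST_APPLICATION_WINDOW = 99;
    static final int TYPE_TOAST = 2005;
    static final int TYPE_INPUT_METHOD = 2011;
    static final int TYPE_WALLPAPER = 2013;
    static final int TYPE_SECURE_SYSTEM_OVERLAY = 2015;
    static final int TYPE_VOICE_INTERACTION = 2031;
    static final int TYPE_ACCESSIBILITY_OVERLAY = 2032;
    static final int TYPE_QS_DIALOG = 2035;
    static final int TYPE_APPLICATION_OVERLAY = 2038;

    /** true means addWindow() would abort with ADD_BAD_APP_TOKEN for a token-less caller. */
    static boolean requiresToken(int rootType, boolean toastNeedsToken) {
        if (rootType >= FIRST_APPLICATION_WINDOW && rootType <= LAST_APPLICATION_WINDOW) {
            return true;  // application windows always need an existing token
        }
        switch (rootType) {
            case TYPE_INPUT_METHOD:
            case TYPE_VOICE_INTERACTION:
            case TYPE_WALLPAPER:
            case TYPE_QS_DIALOG:
            case TYPE_ACCESSIBILITY_OVERLAY:
                return true;
            case TYPE_TOAST:
                return toastNeedsToken;  // targetSdk > N_MR1 without a parent window
            default:
                return false;            // privileged types: WMS mints a WindowToken itself
        }
    }

    public static void main(String[] args) {
        System.out.println(requiresToken(FIRST_APPLICATION_WINDOW, true));   // true
        System.out.println(requiresToken(TYPE_SECURE_SYSTEM_OVERLAY, true)); // false
        System.out.println(requiresToken(TYPE_APPLICATION_OVERLAY, true));   // false
    }
}
```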

For the remaining, privileged types, when token == null the service simply creates a new WindowToken:

//WindowManagerService.java
public int addWindow(Session session, IWindow client, int seq,
            LayoutParams attrs, int viewVisibility, int displayId, Rect outFrame,
            Rect outContentInsets, Rect outStableInsets,
            DisplayCutout.ParcelableWrapper outDisplayCutout, InputChannel outInputChannel,
            InsetsState outInsetsState, InsetsSourceControl[] outActiveControls,
            int requestUserId) {
    		...
            if (token == null) {
                if (!unprivilegedAppCanCreateTokenWith(parentWindow, callingUid, type,
                        rootType, attrs.token, attrs.packageName)) {
                    return WindowManagerGlobal.ADD_BAD_APP_TOKEN;
                }
                
                // Privileged types reach this point: WMS creates the token itself
                if (hasParent) {
                    // Use existing parent window token for child windows.
                    token = parentWindow.mToken;
                } else {
                    final IBinder binder = attrs.token != null ? attrs.token : client.asBinder();
                    token = new WindowToken(this, binder, type, false, displayContent,
                            session.mCanAddInternalSystemWindow, isRoundedCornerOverlay);
                }
            }
}
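The token-selection branch above reduces to a small decision: child windows reuse the parent's token, otherwise the caller-supplied binder (or, failing that, the client window's own binder) seeds the new WindowToken. A plain-Java sketch of that branch (names are illustrative, not the framework's):

```java
class TokenSourceDemo {
    /** Mirrors the branch in addWindow(): which IBinder seeds the new WindowToken. */
    static String pickTokenSource(boolean hasParent, boolean attrsTokenSet) {
        if (hasParent) {
            return "parentWindow.mToken";           // child windows reuse the parent's token
        }
        return attrsTokenSet ? "attrs.token"        // caller supplied a binder
                             : "client.asBinder()"; // fall back to the client window's binder
    }

    public static void main(String[] args) {
        System.out.println(pickTokenSource(true, false));
        System.out.println(pickTokenSource(false, true));
        System.out.println(pickTokenSource(false, false));
    }
}
```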

A few window types and the permissions they require:

| Window Type | int | Permission (* grantable to system apps only) | Effect |
| --- | --- | --- | --- |
| TYPE_SECURE_SYSTEM_OVERLAY | 2015 | *android.permission.INTERNAL_SYSTEM_WINDOW | Drawn over all windows |
| TYPE_PHONE | 2002 | *android.permission.INTERNAL_SYSTEM_WINDOW | Drawn over all applications, but not over the status bar |
| TYPE_APPLICATION_OVERLAY | 2038 | android.permission.SYSTEM_ALERT_WINDOW | Drawn over all activity windows (types between FIRST_APPLICATION_WINDOW and LAST_APPLICATION_WINDOW) |
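For a normal (non-system) app, the only realistic choice from the table is TYPE_APPLICATION_OVERLAY, gated by SYSTEM_ALERT_WINDOW, which the user grants through the "display over other apps" settings screen. A minimal sketch of such a Service (the class name and the bare View are illustrative; an Activity is still needed once to send the user to ACTION_MANAGE_OVERLAY_PERMISSION):

```java
// Sketch only: assumes <uses-permission android:name="android.permission.SYSTEM_ALERT_WINDOW"/>
// in the manifest and that the user has already granted the overlay permission.
public class OverlayService extends Service {
    private WindowManager wm;
    private View overlay;

    @Override
    public void onCreate() {
        super.onCreate();
        if (!Settings.canDrawOverlays(this)) {
            stopSelf();  // permission not granted; an Activity must request it first
            return;
        }
        wm = (WindowManager) getSystemService(WINDOW_SERVICE);
        overlay = new View(this);  // any View would do; illustrative placeholder
        WindowManager.LayoutParams lp = new WindowManager.LayoutParams(
                WindowManager.LayoutParams.WRAP_CONTENT,
                WindowManager.LayoutParams.WRAP_CONTENT,
                WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY,  // 2038
                WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE,
                PixelFormat.TRANSLUCENT);
        wm.addView(overlay, lp);  // no token needed: WMS mints a WindowToken for this type
    }

    @Override
    public void onDestroy() {
        if (overlay != null) wm.removeView(overlay);
        super.onDestroy();
    }

    @Override
    public IBinder onBind(Intent intent) { return null; }
}
```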