Detecting and translating on-device text from the camera stream using Huawei ML Kit in Android [Navigation Components, MVVM]

Introduction

In this article, we will learn how to integrate Huawei ML Kit camera-stream text recognition into the Android application KnowMyBoard. The app also uses Account Kit, which provides seamless sign-in for applications with large user bases.

The text recognition service can extract text from images of receipts, business cards, and documents. It is useful in industries such as printing, education, and logistics, and you can use it to build applications that handle data entry and verification tasks.

The text recognition service can recognize text in both static images and dynamic camera streams through a set of APIs that you can call synchronously or asynchronously to build text-recognition-enabled applications.

The on-device language detection service detects the language of text even when no internet connection is available. ML Kit detects the language of the text and returns the language codes (compliant with ISO 639-1) together with their confidence values, or just the language code with the highest confidence. Currently, 56 languages can be detected.
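The detector returns a bare ISO 639-1 code such as "ko" or "en". For display purposes it can be mapped to a readable name with the plain JDK, independent of the SDK; `LangCodeUtil` below is a hypothetical helper, not part of ML Kit:

```java
import java.util.Locale;

// Hypothetical helper: turn an ISO 639-1 code returned by the language
// detector into a human-readable name for the UI.
public class LangCodeUtil {
    public static String displayName(String iso6391) {
        if (iso6391 == null || iso6391.isEmpty()) {
            return "Unknown";
        }
        String name = new Locale(iso6391).getDisplayLanguage(Locale.ENGLISH);
        // Locale echoes the raw code back when it cannot resolve a name.
        return (name.isEmpty() || name.equals(iso6391)) ? "Unknown" : name;
    }
}
```

A fragment observing the detection result could call `LangCodeUtil.displayName(languageDetected)` before rendering it in a TextView.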

Like the real-time translation service, the on-device translation service can be widely used in scenarios that require cross-language translation. For example, a travel app can integrate this service to translate road signs and menus into the tourist's native language, providing a more considerate experience. Unlike real-time translation, on-device translation does not require an internet connection; the service remains available even when the network is down.

Development Overview

You need to install the Android Studio IDE, and I assume you have prior knowledge of Android application development.

Hardware Requirements

A computer (desktop or laptop) running Windows 10.
An Android phone (with a USB cable) for debugging.

Software Requirements

Java JDK 1.8 or later.
Android Studio (or Visual Studio Code) installed.
HMS Core (APK) 4.X or later.

Integration Steps

Step 1: Create a Huawei developer account and complete identity verification on the Huawei Developers website. For details, see Registering a HUAWEI ID.

Step 2: Create a project in AppGallery Connect.

Step 3: Add the HMS Core SDK.
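With the project created and agconnect-services.json in place, the ML Kit and Account Kit dependencies go into the app-level build.gradle. The artifact names below match the services used in this article, but the version numbers are indicative; check the current HMS documentation before copying:

```groovy
dependencies {
    // HUAWEI Account Kit (seamless sign-in)
    implementation 'com.huawei.hms:hwid:6.4.0.301'
    // ML Kit: text recognition (OCR)
    implementation 'com.huawei.hms:ml-computer-vision-ocr:3.7.0.301'
    // ML Kit: on-device language detection
    implementation 'com.huawei.hms:ml-computer-language-detection:3.7.0.301'
    // ML Kit: on-device translation
    implementation 'com.huawei.hms:ml-computer-translate:3.7.0.301'
}
```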

Let us start coding.

navigation_graph.xml

<?xml version="1.0" encoding="utf-8"?>
<navigation xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:id="@+id/navigation_graph"
    app:startDestination="@id/loginFragment">
    <fragment
        android:id="@+id/loginFragment"
        android:name="com.huawei.hms.knowmyboard.dtse.activity.fragments.LoginFragment"
        android:label="LoginFragment"/>
    <fragment
        android:id="@+id/mainFragment"
        android:name="com.huawei.hms.knowmyboard.dtse.activity.fragments.MainFragment"
        android:label="MainFragment"/>
    <fragment
        android:id="@+id/searchFragment"
        android:name="com.huawei.hms.knowmyboard.dtse.activity.fragments.SearchFragment"
        android:label="fragment_search"
        tools:layout="@layout/fragment_search" />
</navigation>
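The graph above needs a host. Below is a minimal sketch of the activity layout, assuming the view IDs `nav_host_fragment` and `bottom_navigation` referenced later in MainActivity; the data-binding `<layout>` root matches the `ActivityMainBinding` usage there, and the menu resource name is hypothetical:

```xml
<?xml version="1.0" encoding="utf-8"?>
<layout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto">

    <LinearLayout
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        android:orientation="vertical">

        <!-- Hosts the destinations declared in navigation_graph.xml -->
        <androidx.fragment.app.FragmentContainerView
            android:id="@+id/nav_host_fragment"
            android:name="androidx.navigation.fragment.NavHostFragment"
            android:layout_width="match_parent"
            android:layout_height="0dp"
            android:layout_weight="1"
            app:defaultNavHost="true"
            app:navGraph="@navigation/navigation_graph" />

        <com.google.android.material.bottomnavigation.BottomNavigationView
            android:id="@+id/bottom_navigation"
            android:layout_width="match_parent"
            android:layout_height="wrap_content"
            app:menu="@menu/bottom_nav_menu" />
    </LinearLayout>
</layout>
```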

TextRecognitionActivity.java

public final class TextRecognitionActivity extends BaseActivity
        implements OnRequestPermissionsResultCallback, View.OnClickListener {
    private static final String TAG = "TextRecognitionActivity";
    private LensEngine lensEngine = null;
    private LensEnginePreview preview;
    private GraphicOverlay graphicOverlay;
    private ImageButton takePicture;
    private ImageButton imageSwitch;
    private RelativeLayout zoomImageLayout;
    private ZoomImageView zoomImageView;
    private ImageButton zoomImageClose;
    CameraConfiguration cameraConfiguration = null;
    private int facing = CameraConfiguration.CAMERA_FACING_BACK;
    private Camera mCamera;
    private boolean isLandScape;
    private Bitmap bitmap;
    private Bitmap bitmapCopy;
    private LocalTextTransactor localTextTransactor;
    private Handler mHandler = new MsgHandler(this);
    private Dialog languageDialog;
    private AddPictureDialog addPictureDialog;
    private TextView textCN;
    private TextView textEN;
    private TextView textJN;
    private TextView textKN;
    private TextView textLN;
    private TextView tv_language,tv_translated_txt;

    private String textType = Constant.POSITION_CN;
    private boolean isInitialization = false;
    MLTextAnalyzer analyzer;
    private static class MsgHandler extends Handler {
        WeakReference<TextRecognitionActivity> mMainActivityWeakReference;

        public MsgHandler(TextRecognitionActivity mainActivity) {
            this.mMainActivityWeakReference = new WeakReference<>(mainActivity);
        }

        @Override
        public void handleMessage(Message msg) {
            super.handleMessage(msg);
            TextRecognitionActivity mainActivity = this.mMainActivityWeakReference.get();
            if (mainActivity == null) {

                return;
            }

            if (msg.what == Constant.SHOW_TAKE_PHOTO_BUTTON) {
                mainActivity.setVisible();

            } else if (msg.what == Constant.HIDE_TAKE_PHOTO_BUTTON) {
                mainActivity.setGone();

            }
        }
    }

    private void setVisible() {
        if (this.takePicture.getVisibility() == View.GONE) {
            this.takePicture.setVisibility(View.VISIBLE);
        }
    }

    private void setGone() {
        if (this.takePicture.getVisibility() == View.VISIBLE) {
            this.takePicture.setVisibility(View.GONE);
        }
    }

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        this.setContentView(R.layout.activity_text_recognition);
        if (savedInstanceState != null) {
            this.facing = savedInstanceState.getInt(Constant.CAMERA_FACING);
        }
        this.tv_language = this.findViewById(R.id.tv_lang);
        this.tv_translated_txt = this.findViewById(R.id.tv_translated_txt);
        this.preview = this.findViewById(R.id.live_preview);
        this.graphicOverlay = this.findViewById(R.id.live_overlay);
        this.cameraConfiguration = new CameraConfiguration();
        this.cameraConfiguration.setCameraFacing(this.facing);
        this.initViews();
        this.isLandScape = (this.getResources().getConfiguration().orientation == Configuration.ORIENTATION_LANDSCAPE);
        this.createLensEngine();
        this.setStatusBar();
    }

    private void initViews() {
        this.takePicture = this.findViewById(R.id.takePicture);
        this.takePicture.setOnClickListener(this);
        this.imageSwitch = this.findViewById(R.id.text_imageSwitch);
        this.imageSwitch.setOnClickListener(this);
        this.zoomImageLayout = this.findViewById(R.id.zoomImageLayout);
        this.zoomImageView = this.findViewById(R.id.take_picture_overlay);
        this.zoomImageClose = this.findViewById(R.id.zoomImageClose);
        this.zoomImageClose.setOnClickListener(this);
        this.findViewById(R.id.back).setOnClickListener(this);
        this.findViewById(R.id.language_setting).setOnClickListener(this);
        this.createLanguageDialog();
        this.createAddPictureDialog();
    }

    @Override
    public void onClick(View view) {
        if (view.getId() == R.id.takePicture) {
            this.takePicture();
        } else if (view.getId() == R.id.zoomImageClose) {
            this.zoomImageLayout.setVisibility(View.GONE);
            this.recycleBitmap();
        } else if (view.getId() == R.id.text_imageSwitch) {
            this.showAddPictureDialog();
        } else if (view.getId() == R.id.language_setting) {
            this.showLanguageDialog();
        } else if (view.getId() == R.id.simple_cn) {
            SharedPreferencesUtil.getInstance(this)
                    .putStringValue(Constant.POSITION_KEY, Constant.POSITION_CN);
            this.languageDialog.dismiss();
            this.restartLensEngine(Constant.POSITION_CN);
        } else if (view.getId() == R.id.english) {
            SharedPreferencesUtil.getInstance(this)
                    .putStringValue(Constant.POSITION_KEY, Constant.POSITION_EN);
            this.languageDialog.dismiss();
            this.preview.release();
            this.restartLensEngine(Constant.POSITION_EN);
        } else if (view.getId() == R.id.japanese) {
            SharedPreferencesUtil.getInstance(this)
                    .putStringValue(Constant.POSITION_KEY, Constant.POSITION_JA);
            this.languageDialog.dismiss();
            this.preview.release();
            this.restartLensEngine(Constant.POSITION_JA);
        } else if (view.getId() == R.id.korean) {
            SharedPreferencesUtil.getInstance(this)
                    .putStringValue(Constant.POSITION_KEY, Constant.POSITION_KO);
            this.languageDialog.dismiss();
            this.preview.release();
            this.restartLensEngine(Constant.POSITION_KO);
        } else if (view.getId() == R.id.latin) {
            SharedPreferencesUtil.getInstance(this)
                    .putStringValue(Constant.POSITION_KEY, Constant.POSITION_LA);
            this.languageDialog.dismiss();
            this.preview.release();
            this.restartLensEngine(Constant.POSITION_LA);
        } else if (view.getId() == R.id.back) {
            releaseLensEngine();
            this.finish();
        }
    }

    private void restartLensEngine(String type) {
        if (this.textType.equals(type)) {
            return;
        }
        this.lensEngine.release();
        this.lensEngine = null;
        this.createLensEngine();
        this.startLensEngine();
        if (this.lensEngine == null || this.lensEngine.getCamera() == null) {
            return;
        }
        this.mCamera = this.lensEngine.getCamera();
        try {
            this.mCamera.setPreviewDisplay(this.preview.getSurfaceHolder());
        } catch (IOException e) {
            Log.d(TextRecognitionActivity.TAG, "restartLensEngine IOException");
        }
    }

    @Override
    public void onBackPressed() {
        if (this.zoomImageLayout.getVisibility() == View.VISIBLE) {
            this.zoomImageLayout.setVisibility(View.GONE);
            this.recycleBitmap();
        } else {
            super.onBackPressed();
            releaseLensEngine();
        }
    }

    private void createLanguageDialog() {
        this.languageDialog = new Dialog(this, R.style.MyDialogStyle);
        View view = View.inflate(this, R.layout.dialog_language_setting, null);
        // Set up a custom layout
        this.languageDialog.setContentView(view);
        this.textCN = view.findViewById(R.id.simple_cn);
        this.textCN.setOnClickListener(this);
        this.textEN = view.findViewById(R.id.english);
        this.textEN.setOnClickListener(this);
        this.textJN = view.findViewById(R.id.japanese);
        this.textJN.setOnClickListener(this);
        this.textKN = view.findViewById(R.id.korean);
        this.textKN.setOnClickListener(this);
        this.textLN = view.findViewById(R.id.latin);
        this.textLN.setOnClickListener(this);
        this.languageDialog.setCanceledOnTouchOutside(true);
        // Set the size of the dialog
        Window dialogWindow = this.languageDialog.getWindow();
        WindowManager.LayoutParams layoutParams = dialogWindow.getAttributes();
        layoutParams.width = WindowManager.LayoutParams.MATCH_PARENT;
        layoutParams.height = WindowManager.LayoutParams.WRAP_CONTENT;
        layoutParams.gravity = Gravity.BOTTOM;
        dialogWindow.setAttributes(layoutParams);
    }

    private void showLanguageDialog() {
        this.initDialogViews();
        this.languageDialog.show();
    }

    private void createAddPictureDialog() {
        this.addPictureDialog = new AddPictureDialog(this, AddPictureDialog.TYPE_NORMAL);
        final Intent intent = new Intent(TextRecognitionActivity.this, RemoteDetectionActivity.class);
        intent.putExtra(Constant.MODEL_TYPE, Constant.CLOUD_TEXT_DETECTION);
        this.addPictureDialog.setClickListener(new AddPictureDialog.ClickListener() {
            @Override
            public void takePicture() {
                lensEngine.release();
                isInitialization = false;
                intent.putExtra(Constant.ADD_PICTURE_TYPE, Constant.TYPE_TAKE_PHOTO);
                TextRecognitionActivity.this.startActivity(intent);
            }

            @Override
            public void selectImage() {
                intent.putExtra(Constant.ADD_PICTURE_TYPE, Constant.TYPE_SELECT_IMAGE);
                TextRecognitionActivity.this.startActivity(intent);
            }

            @Override
            public void doExtend() {

            }
        });
    }

    private void showAddPictureDialog() {
        this.addPictureDialog.show();
    }

    private void initDialogViews() {
        String position = SharedPreferencesUtil.getInstance(this).getStringValue(Constant.POSITION_KEY);
        this.textType = position;
        this.textCN.setSelected(false);
        this.textEN.setSelected(false);
        this.textJN.setSelected(false);
        this.textLN.setSelected(false);
        this.textKN.setSelected(false);
        switch (position) {
            case Constant.POSITION_CN:
                this.textCN.setSelected(true);
                break;
            case Constant.POSITION_EN:
                this.textEN.setSelected(true);
                break;
            case Constant.POSITION_LA:
                this.textLN.setSelected(true);
                break;
            case Constant.POSITION_JA:
                this.textJN.setSelected(true);
                break;
            case Constant.POSITION_KO:
                this.textKN.setSelected(true);
                break;
            default:
        }
    }

    @Override
    protected void onSaveInstanceState(Bundle outState) {
        outState.putInt(Constant.CAMERA_FACING, this.facing);
        super.onSaveInstanceState(outState);
    }


    private void createLensEngine() {
        MLLocalTextSetting setting = new MLLocalTextSetting.Factory()
                .setOCRMode(MLLocalTextSetting.OCR_DETECT_MODE)
                // Specify languages that can be recognized.
                .setLanguage("ko")
                .create();
        analyzer = MLAnalyzerFactory.getInstance().getLocalTextAnalyzer(setting);

        if (this.lensEngine == null) {
            this.lensEngine = new LensEngine(this, this.cameraConfiguration, this.graphicOverlay);

        }
        try {
            this.localTextTransactor = new LocalTextTransactor(this.mHandler, this);
            this.lensEngine.setMachineLearningFrameTransactor(this.localTextTransactor);
            isInitialization = true;
        } catch (Exception e) {
            Toast.makeText(
                    this,
                    "Can not create image transactor: " + e.getMessage(),
                    Toast.LENGTH_LONG)
                    .show();
        }
    }

    private void startLensEngine() {
        if (this.lensEngine != null) {
            try {
                this.preview.start(this.lensEngine, false);
            } catch (IOException e) {
                Log.e(TextRecognitionActivity.TAG, "Unable to start lensEngine.", e);
                this.lensEngine.release();
                this.lensEngine = null;
            }
        }
    }

    @Override
    public void onResume() {
        super.onResume();
        if (!isInitialization){
           createLensEngine();
        }
        this.startLensEngine();
    }

    @Override
    protected void onStop() {
        super.onStop();
        this.preview.stop();
    }

    private void releaseLensEngine() {
        if (this.lensEngine != null) {
            this.lensEngine.release();
            this.lensEngine = null;
        }
        recycleBitmap();
    }

    @Override
    protected void onDestroy() {
        super.onDestroy();
        releaseLensEngine();
        if (analyzer != null) {
            try {
                analyzer.stop();
            } catch (IOException e) {
                // Exception handling.
                Log.e(TAG, "Error while stopping analyzer", e);
            }
        }
    }

    private void recycleBitmap() {
        if (this.bitmap != null && !this.bitmap.isRecycled()) {
            this.bitmap.recycle();
            this.bitmap = null;
        }
        if (this.bitmapCopy != null && !this.bitmapCopy.isRecycled()) {
            this.bitmapCopy.recycle();
            this.bitmapCopy = null;
        }
    }

    private void takePicture() {

        this.zoomImageLayout.setVisibility(View.VISIBLE);
        LocalDataProcessor localDataProcessor = new LocalDataProcessor();
        localDataProcessor.setLandScape(this.isLandScape);
        this.bitmap = BitmapUtils.getBitmap(this.localTextTransactor.getTransactingImage(), this.localTextTransactor.getTransactingMetaData());

        float previewWidth = localDataProcessor.getMaxWidthOfImage(this.localTextTransactor.getTransactingMetaData());
        float previewHeight = localDataProcessor.getMaxHeightOfImage(this.localTextTransactor.getTransactingMetaData());
        if (this.isLandScape) {
            previewWidth = localDataProcessor.getMaxHeightOfImage(this.localTextTransactor.getTransactingMetaData());
            previewHeight = localDataProcessor.getMaxWidthOfImage(this.localTextTransactor.getTransactingMetaData());
        }
        this.bitmapCopy = Bitmap.createBitmap(this.bitmap).copy(Bitmap.Config.ARGB_8888, true);

        Canvas canvas = new Canvas(this.bitmapCopy);
        float min = Math.min(previewWidth, previewHeight);
        float max = Math.max(previewWidth, previewHeight);

        if (this.getResources().getConfiguration().orientation == Configuration.ORIENTATION_PORTRAIT) {
            localDataProcessor.setCameraInfo(this.graphicOverlay, canvas, min, max);
        } else {
            localDataProcessor.setCameraInfo(this.graphicOverlay, canvas, max, min);
        }
        localDataProcessor.drawHmsMLVisionText(canvas, this.localTextTransactor.getLastResults().getBlocks());
        this.zoomImageView.setImageBitmap(this.bitmapCopy);
        // Create an MLFrame object using the bitmap, which is the image data in bitmap format.
        MLFrame frame = MLFrame.fromBitmap(bitmap);
        Task<MLText> task = analyzer.asyncAnalyseFrame(frame);
        task.addOnSuccessListener(new OnSuccessListener<MLText>() {
            @Override
            public void onSuccess(MLText text) {
                String detectText = text.getStringValue();
                // Processing for successful recognition.
                // Create a local language detector.
                MLLangDetectorFactory factory = MLLangDetectorFactory.getInstance();
                MLLocalLangDetectorSetting setting = new MLLocalLangDetectorSetting.Factory()
                        // Set the minimum confidence threshold for language detection.
                        .setTrustedThreshold(0.01f)
                        .create();
                MLLocalLangDetector myLocalLangDetector = factory.getLocalLangDetector(setting);
                Task<String> firstBestDetectTask = myLocalLangDetector.firstBestDetect(detectText);
                firstBestDetectTask.addOnSuccessListener(new OnSuccessListener<String>() {
                    @Override
                    public void onSuccess(String languageDetected) {
                        // Processing logic for detection success.
                        Log.d(TAG, "Language detected: " + languageDetected);
                        Log.d(TAG, "Detected text: " + detectText);

                        translate(languageDetected,detectText);
                    }
                }).addOnFailureListener(new OnFailureListener() {
                    @Override
                    public void onFailure(Exception e) {
                        // Processing logic for detection failure.
                        Log.e(TAG, "Language detection error: " + e.getMessage());
                    }
                });
            }
        }).addOnFailureListener(new OnFailureListener() {
            @Override
            public void onFailure(Exception e) {
                // Processing logic for recognition failure.
                Log.e(TAG, "Text recognition failed: " + e.getMessage());
            }
        });
    }

    private void translate(String languageDetected, String detectText) {
        MLApplication.initialize(getApplication());
        // Set the API key obtained from AppGallery Connect (the api_key field in agconnect-services.json).
        MLApplication.getInstance().setApiKey("your_api_key");

        // Create an offline translator.
        MLLocalTranslateSetting setting = new MLLocalTranslateSetting.Factory()
                // Set the source language code (ISO 639-1). This parameter is mandatory.
                .setSourceLangCode(languageDetected)
                // Set the target language code (ISO 639-1). This parameter is mandatory.
                .setTargetLangCode("en")
                .create();
        MLLocalTranslator myLocalTranslator = MLTranslatorFactory.getInstance().getLocalTranslator(setting);
        // Set the model download policy.
        MLModelDownloadStrategy downloadStrategy = new MLModelDownloadStrategy.Factory()
                .needWifi() // It is recommended that you download the model over Wi-Fi.
                .create();
        // Create a download progress listener.
        MLModelDownloadListener modelDownloadListener = new MLModelDownloadListener() {
            @Override
            public void onProcess(long alreadyDownLength, long totalLength) {
                runOnUiThread(new Runnable() {
                    @Override
                    public void run() {
                        // Display the download progress or perform other operations.
                    }
                });
            }
        };

        // Before translation, ensure that the model has been successfully downloaded.
        myLocalTranslator.preparedModel(downloadStrategy, modelDownloadListener)
                .addOnSuccessListener(new OnSuccessListener<Void>() {
                    @Override
                    public void onSuccess(Void aVoid) {
                        // Called when the model package is successfully downloaded.
                        // The input must be a string of fewer than 5000 characters.
                        Task<String> task = myLocalTranslator.asyncTranslate(detectText);
                        task.addOnSuccessListener(new OnSuccessListener<String>() {
                            @Override
                            public void onSuccess(String translated) {
                                // Processing logic for translation success.
                                Log.d(TAG, "Translated text: " + translated);
                                tv_translated_txt.setText(translated);
                            }
                        }).addOnFailureListener(new OnFailureListener() {
                            @Override
                            public void onFailure(Exception e) {
                                // Processing logic for translation failure.
                                Log.e(TAG, "Translation failed: " + e.getMessage());
                                Toast.makeText(TextRecognitionActivity.this, "Please check your internet connection.", Toast.LENGTH_SHORT).show();
                            }
                        });
                    }
                }).addOnFailureListener(new OnFailureListener() {
                    @Override
                    public void onFailure(Exception e) {
                        // Called when the model package fails to be downloaded.
                        Log.e(TAG, "Model download failed: " + e.getMessage());
                    }
                });
    }

}

MainFragment.java

public class MainFragment extends Fragment {


    static String TAG = "TAG";
    FragmentMainFragmentBinding binding;
    LoginViewModel loginViewModel;

    public MainFragment() {
        // Required empty public constructor
    }

    @Override
    public void onCreate(@Nullable Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setHasOptionsMenu(true);
    }

    @Override
    public View onCreateView(LayoutInflater inflater, ViewGroup container,
                             Bundle savedInstanceState) {
        // Inflate the layout for this fragment
        binding = DataBindingUtil.inflate(inflater, R.layout.fragment_main_fragment, container, false);
        loginViewModel = new ViewModelProvider(getActivity()).get(LoginViewModel.class);
        binding.setLoginViewModel(loginViewModel);

        binding.buttonScan.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                dialog();
            }
        });
        loginViewModel.getImagePath().observeForever(new Observer<Bitmap>() {
            @Override
            public void onChanged(Bitmap bitmap) {
                try{
                      binding.imageView.setImageBitmap(bitmap);
                }catch (Exception e){
                    e.printStackTrace();
                    Log.e("TAG","Error : "+e.getMessage());
                }
            }
        });
        loginViewModel.getTextRecognized().observeForever(new Observer<ArrayList<String>>() {
            @Override
            public void onChanged(ArrayList<String> res) {

                  binding.textLanguage.setText("Language : "+getStringResourceByName(res.get(0)));
                  binding.textDetected.setText("Detected text : "+res.get(1));
                  binding.textTranslated.setText("Translated text : "+res.get(2));
            }
        });


        return binding.getRoot();

    }

    private void openCamera() {
        Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
        getActivity().startActivityForResult(intent, OPEN_CAMERA);
    }

    private String getStringResourceByName(String aString) {
        try{
            String packageName = getActivity().getPackageName();
            int resId = getResources()
                    .getIdentifier(aString, "string", packageName);
            if (resId == 0) {
                return aString;
            } else {
                return getString(resId);
            }
        }catch (Exception e){
            e.printStackTrace();
            return aString;
        }

    }
    private void scan() {

        Intent intent = new Intent(Intent.ACTION_GET_CONTENT);
        intent.setType("image/*");
        String[] mimeTypes = {"image/jpeg", "image/png"};
        intent.putExtra(Intent.EXTRA_MIME_TYPES,mimeTypes);
        getActivity().startActivityForResult(intent, OPEN_GALLERY);
    }

    public void dialog()
    {
        final Dialog dialog = new Dialog(getActivity(), R.style.AppTheme);
        dialog.setTitle("Choose");
        dialog.setContentView(R.layout.dialog_pop_up);

        TextView txt_gallry=(TextView)dialog.findViewById(R.id.textView_gallery);
        TextView txt_camera=(TextView)dialog.findViewById(R.id.textView_camera);

        txt_gallry.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                dialog.dismiss();
               scan();
            }
        });
        txt_camera.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                dialog.dismiss();
                openCamera();
            }
        });
        dialog.show();
    }

    @Override
    public void onCreateOptionsMenu(@NonNull Menu menu, @NonNull MenuInflater inflater) {
        menu.clear();
        super.onCreateOptionsMenu(menu, inflater);
        inflater.inflate(R.menu.main_fragment_menu, menu);


    }

    @SuppressLint("NonConstantResourceId")
    @Override
    public boolean onOptionsItemSelected(MenuItem item) {
        switch (item.getItemId()) {

            case R.id.menu_camera:

                getActivity().startActivityForResult(new Intent(getActivity(),  TextRecognitionActivity.class),1234);
                break;

        }
        return super.onOptionsItemSelected(item);
    }

}

MainActivity.java

public class MainActivity extends AppCompatActivity {

    LoginViewModel loginViewModel;
    private MLTextAnalyzer mTextAnalyzer;
    public Uri imagePath;
    Bitmap bitmap;
    static String TAG = "TAG";
    ArrayList<String> result = new ArrayList<>();
    MLLocalLangDetector myLocalLangDetector;
    MLLocalTranslator myLocalTranslator;
    String textRecognized;
    ProgressDialog progressDialog;
    NavController navController;
    ActivityMainBinding activityMainBinding;
    BottomNavigationView bottomNavigationView;
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        activityMainBinding = DataBindingUtil.setContentView(MainActivity.this,R.layout.activity_main);
        loginViewModel = new ViewModelProvider(MainActivity.this).get(LoginViewModel.class);

        navController = Navigation.findNavController(MainActivity.this, R.id.nav_host_fragment);
        MyApplication.setActivity(this);
        progressDialog = new ProgressDialog(this);
        progressDialog.setCancelable(false);
        bottomNavigationView = activityMainBinding.bottomNavigation;
        NavigationUI.setupWithNavController(bottomNavigationView, navController);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, @Nullable Intent data) {
        // Process the authorization result to obtain the authorization code from AuthAccount.
        super.onActivityResult(requestCode, resultCode, data);

        if (requestCode == 8888) {

            Task<AuthAccount> authAccountTask = AccountAuthManager.parseAuthResultFromIntent(data);
            if (authAccountTask.isSuccessful()) {
                // The sign-in is successful, and the user's ID information and authorization code are obtained.
                AuthAccount authAccount = authAccountTask.getResult();
                UserData userData = new UserData();
                userData.setAccessToken(authAccount.getAccessToken());
                userData.setCountryCode(authAccount.getCountryCode());
                userData.setDisplayName(authAccount.getDisplayName());
                userData.setEmail(authAccount.getEmail());
                userData.setFamilyName(authAccount.getFamilyName());
                userData.setGivenName(authAccount.getGivenName());
                userData.setIdToken(authAccount.getIdToken());
                userData.setOpenId(authAccount.getOpenId());
                userData.setUid(authAccount.getUid());
                userData.setPhotoUriString(authAccount.getAvatarUri().toString());
                userData.setUnionId(authAccount.getUnionId());

                loginViewModel = new ViewModelProvider(MainActivity.this).get(LoginViewModel.class);
                loginViewModel.sendData(authAccount.getDisplayName());

            } else {
                // The sign-in failed.
                Log.e("TAG", "sign in failed:" + ((ApiException) authAccountTask.getException()).getStatusCode());

            }
        }
        if (requestCode == 2323 && resultCode == RESULT_OK && data != null) {
            progressDialog.setMessage("Initializing text detection..");
            progressDialog.show();

            imagePath = data.getData();
            try {
                bitmap = MediaStore.Images.Media.getBitmap(this.getContentResolver(), imagePath);
                asyncAnalyzeText(bitmap);
            } catch (IOException e) {
                e.printStackTrace();
                Log.e("TAG", " BITMAP ERROR");
            }

        }
        if (requestCode == 2424 && resultCode == RESULT_OK && data != null) {
            progressDialog.setMessage("Initializing text detection..");
            progressDialog.show();
            try {
                bitmap =  (Bitmap) data.getExtras().get("data");
                asyncAnalyzeText(bitmap);
            } catch (Exception e) {
                e.printStackTrace();
                Log.e("TAG", " BITMAP ERROR");
            }
        }
    }

    private void asyncAnalyzeText(Bitmap bitmap) {

        if (mTextAnalyzer == null) {
            createMLTextAnalyzer();
        }

        MLFrame frame = MLFrame.fromBitmap(bitmap);

        Task<MLText> task = mTextAnalyzer.asyncAnalyseFrame(frame);
        task.addOnSuccessListener(new OnSuccessListener<MLText>() {
            @Override
            public void onSuccess(MLText text) {
                progressDialog.setMessage("Initializing language detection...");
                textRecognized = text.getStringValue().trim();
                if(!textRecognized.isEmpty()){
                    // Create a local language detector.
                    MLLangDetectorFactory factory = MLLangDetectorFactory.getInstance();
                    MLLocalLangDetectorSetting setting = new MLLocalLangDetectorSetting.Factory()
                            // Set the minimum confidence threshold for language detection.
                            .setTrustedThreshold(0.01f)
                            .create();
                    myLocalLangDetector = factory.getLocalLangDetector(setting);

                    Task<String> firstBestDetectTask = myLocalLangDetector.firstBestDetect(textRecognized);

                    firstBestDetectTask.addOnSuccessListener(new OnSuccessListener<String>() {
                        @Override
                        public void onSuccess(String languageDetected) {
                            progressDialog.setMessage("Initializing text translation..");
                            // Processing logic for detection success.
                            textTranslate(languageDetected, textRecognized, bitmap);
                        }
                    }).addOnFailureListener(new OnFailureListener() {
                        @Override
                        public void onFailure(Exception e) {
                            // Processing logic for detection failure.
                            Log.e(TAG, "Language detection error: " + e.getMessage());
                        }
                    });
                }else{
                    progressDialog.dismiss();
                    showErrorDialog("Failed to recognize text.");
                }

            }
        }).addOnFailureListener(new OnFailureListener() {
            @Override
            public void onFailure(Exception e) {
                Log.e(TAG, "Text recognition error: " + e.getMessage());
            }
        });
    }

    private void showErrorDialog(String msg) {
        AlertDialog alertDialog = new AlertDialog.Builder(this).create();
        alertDialog.setTitle("Error");
        alertDialog.setMessage(msg);

        alertDialog.setButton(AlertDialog.BUTTON_POSITIVE, "OK", new DialogInterface.OnClickListener() {
            public void onClick(DialogInterface dialog, int which) {
                dialog.dismiss();
            }
        });

        alertDialog.show();
    }

    private void textTranslate(String languageDetected, String textRecognized, Bitmap uri) {
        MLApplication.initialize(getApplication());
        MLApplication.getInstance().setApiKey(Constants.API_KEY);
        Log.d(TAG, "Language detected: " + languageDetected);
        Log.d(TAG, "Text recognized: " + textRecognized);
        // Create an offline translator.
        MLLocalTranslateSetting setting = new MLLocalTranslateSetting.Factory()
                // Set the source language code. The ISO 639-1 standard is used. This parameter is mandatory. If this parameter is not set, an error may occur.
                .setSourceLangCode(languageDetected)
                // Set the target language code. The ISO 639-1 standard is used. This parameter is mandatory. If this parameter is not set, an error may occur.
                .setTargetLangCode("en")
                .create();

        myLocalTranslator = MLTranslatorFactory.getInstance().getLocalTranslator(setting);
        // Set the model download policy.
        MLModelDownloadStrategy downloadStrategy = new MLModelDownloadStrategy.Factory()
                .needWifi() // It is recommended that you download the model package in a Wi-Fi environment.
                .create();
        // Create a download progress listener.
        MLModelDownloadListener modelDownloadListener = new MLModelDownloadListener() {
            @Override
            public void onProcess(long alreadyDownLength, long totalLength) {
                runOnUiThread(new Runnable() {
                    @Override
                    public void run() {
                        // Display the download progress or perform other operations.
                    }
                });
            }
        };

        myLocalTranslator.preparedModel(downloadStrategy, modelDownloadListener)
                .addOnSuccessListener(new OnSuccessListener<Void>() {
                    @Override
                    public void onSuccess(Void aVoid) {
                        // Called when the model package is successfully downloaded.
                        // input is a string of less than 5000 characters.
                        final Task<String> task = myLocalTranslator.asyncTranslate(textRecognized);
                        // Before translation, ensure that the models have been successfully downloaded.
                        task.addOnSuccessListener(new OnSuccessListener<String>() {
                            @Override
                            public void onSuccess(String translated) {
                                // Processing logic for detection success.
                                result.clear();
                                result.add(languageDetected.trim());
                                result.add(textRecognized.trim());
                                result.add(translated.trim());
                                loginViewModel.setImage(uri);

                                loginViewModel.setTextRecognized(result);
                                progressDialog.dismiss();
                            }
                        }).addOnFailureListener(new OnFailureListener() {
                            @Override
                            public void onFailure(Exception e) {
                                // Processing logic for detection failure.
                                progressDialog.dismiss();
                            }
                        });
                    }
                }).addOnFailureListener(new OnFailureListener() {
                    @Override
                    public void onFailure(Exception e) {
                        // Called when the model package fails to be downloaded.
                        progressDialog.dismiss();
                    }
                });

    }

    private void createMLTextAnalyzer() {
        MLLocalTextSetting setting = new MLLocalTextSetting.Factory()
                .setOCRMode(MLLocalTextSetting.OCR_DETECT_MODE)
                .create();
        mTextAnalyzer = MLAnalyzerFactory.getInstance().getLocalTextAnalyzer(setting);

    }

    @Override
    protected void onStop() {
        if (myLocalLangDetector != null) {
            myLocalLangDetector.stop();
        }
        if (myLocalTranslator!= null) {
            myLocalTranslator.stop();
        }
        if(progressDialog!= null){
            progressDialog.dismiss();
        }
        super.onStop();
    }
}
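Two details in the code above are worth a quick illustration: firstBestDetect returns an ISO 639-1 language code, and asyncTranslate expects input of fewer than 5,000 characters. Both can be handled with plain JDK helpers. The sketch below is standalone illustration code (the class and method names are my own, not part of the ML Kit API):

```java
import java.util.Locale;

public class MlKitHelpers {
    // ML Kit's local translator accepts strings of fewer than 5,000 characters
    // (see the comment in textTranslate), so clamp long OCR output first.
    static final int MAX_TRANSLATE_CHARS = 5000;

    static String clampForTranslation(String text) {
        return text.length() < MAX_TRANSLATE_CHARS
                ? text
                : text.substring(0, MAX_TRANSLATE_CHARS - 1);
    }

    // Turn the ISO 639-1 code returned by firstBestDetect into a readable
    // name for logging or UI, using only java.util.Locale.
    static String displayName(String iso639_1) {
        return new Locale(iso639_1).getDisplayLanguage(Locale.ENGLISH);
    }

    public static void main(String[] args) {
        System.out.println(displayName("ja")); // Japanese
        System.out.println(clampForTranslation("hello")); // hello (short input is untouched)
    }
}
```

Short input passes through unchanged; anything at or above the limit is truncated before being handed to the translator.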

Result

The GIF below shows the sample application performing text recognition, language detection, and text translation: Japanese text is recognized, its language is detected, and it is translated into English. You can use the ML services in your own application in the same way.

Tips and Tricks

  • Make sure the agconnect-services.json file is added to the project.
  • Make sure the required dependencies are added.
  • Make sure the required services are enabled in AppGallery Connect.
  • Enable data binding in the build.gradle file.
  • Make sure the bottom navigation item IDs match the fragment IDs in the navigation graph.
  • Make sure the API key is set before calling the services.
  • Make sure you have added the required module from the link below, and change its build.gradle from the application plugin to the library plugin.
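For the data-binding and dependency tips above, the app-level build.gradle typically looks like the sketch below. The version numbers are placeholders (x.y.z) and the exact set of artifacts depends on your project; check the HMS Core release notes for current versions:

```groovy
android {
    // Required because the layouts use <layout> data-binding expressions.
    dataBinding {
        enabled = true
    }
}

dependencies {
    // Huawei Account Kit and the ML Kit services used in this article:
    // text recognition, language detection, and on-device translation.
    implementation 'com.huawei.hms:hwid:x.y.z'
    implementation 'com.huawei.hms:ml-computer-vision-ocr:x.y.z'
    implementation 'com.huawei.hms:ml-computer-language-detection:x.y.z'
    implementation 'com.huawei.hms:ml-computer-translate:x.y.z'
}
```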

Conclusion

In this article, we learned how to integrate Huawei ML Kit's camera stream capability into the Android application KnowMyBoard to extract text from the device camera stream, detect its language, and translate it into English. You can check the expected output in the Result section, and you can also browse the previous articles in this series. I hope the capabilities of Huawei ML Kit help you as well; as this sample shows, you can apply them to your own requirements.

Thank you very much for reading. I hope this article helps you understand how to integrate Huawei ML Kit into the Android application KnowMyBoard.
