Image Classification on Android with MACE

Copyright notice: this is an original article by the author; reproduction without permission is prohibited. https://blog.csdn.net/qq_33200967/article/details/81901068


Preface

In an earlier post, "Image Classification on Android with PaddleMobile", I used Baidu's open-source PaddleMobile framework. In this article, I will show how to run image classification on an Android phone using MACE, Xiaomi's open-source mobile deep learning framework.

MACE on GitHub: https://github.com/XiaoMi/mace

Building the MACE Library and Model

There are two ways to build the MACE library and model: locally on Ubuntu, or inside Docker. Both are described below.

Building on Ubuntu

The inconvenient part of building from source on Ubuntu is that you have to set up the environment yourself, so let's do that first. The official dependency list is:

Required dependencies

| Software | Installation command | Tested version |
|----------|----------------------|----------------|
| Python   |                      | 2.7 |
| Bazel    | bazel installation guide | 0.13.0 |
| CMake    | apt-get install cmake | >= 3.11.3 |
| Jinja2   | pip install -I jinja2==2.10 | 2.10 |
| PyYaml   | pip install -I pyyaml==3.12 | 3.12.0 |
| sh       | pip install -I sh==1.12.14 | 1.12.14 |
| Numpy    | pip install -I numpy==1.14.0 | Required by model validation |
| six      | pip install -I six==1.11.0 | Required for Python 2 and 3 compatibility |

Optional dependencies

| Software | Installation command | Remark |
|----------|----------------------|--------|
| Android NDK | NDK installation guide | Required by Android build, r15b, r15c, r16b, r17b |
| ADB | apt-get install android-tools-adb | Required by Android run, >= 1.0.32 |
| TensorFlow | pip install -I tensorflow==1.6.0 | Required by TensorFlow model |
| Docker | docker installation guide | Required by docker mode for Caffe model |
| Scipy | pip install -I scipy==1.0.0 | Required by model validation |
| FileLock | pip install -I filelock==3.0.0 | Required by run on Android |

Installing the dependencies

  • Install Bazel:
export BAZEL_VERSION=0.13.1
mkdir /bazel && \
    cd /bazel && \
    wget https://github.com/bazelbuild/bazel/releases/download/$BAZEL_VERSION/bazel-$BAZEL_VERSION-installer-linux-x86_64.sh && \
    chmod +x bazel-*.sh && \
    ./bazel-$BAZEL_VERSION-installer-linux-x86_64.sh && \
    cd / && \
    rm -f /bazel/bazel-$BAZEL_VERSION-installer-linux-x86_64.sh
   
   
  • Install the Android NDK:
# Download NDK r15c
cd /opt/ && \
    wget -q https://dl.google.com/android/repository/android-ndk-r15c-linux-x86_64.zip && \
    unzip -q android-ndk-r15c-linux-x86_64.zip && \
    rm -f android-ndk-r15c-linux-x86_64.zip

export ANDROID_NDK_VERSION=r15c
export ANDROID_NDK=/opt/android-ndk-${ANDROID_NDK_VERSION}
export ANDROID_NDK_HOME=${ANDROID_NDK}

# add to PATH
export PATH=${PATH}:${ANDROID_NDK_HOME}
   
   
  • Install other tools:
apt-get install -y --no-install-recommends \
    cmake \
    android-tools-adb
pip install -i http://pypi.douban.com/simple/ --trusted-host pypi.douban.com setuptools
pip install -i http://pypi.douban.com/simple/ --trusted-host pypi.douban.com \
    "numpy>=1.14.0" \
    scipy \
    jinja2 \
    pyyaml \
    sh==1.12.14 \
    pycodestyle==2.4.0 \
    filelock
   
   
  • Install TensorFlow:
pip install -i http://pypi.douban.com/simple/ --trusted-host pypi.douban.com tensorflow==1.6.0
   
   

Building the library and model

  • Clone the MACE source:
git clone https://github.com/XiaoMi/mace.git
   
   
  • Change into the official Android demo directory:
cd mace/mace/examples/android/
   
   
  • Modify build.sh in that directory as follows:
#!/usr/bin/env bash

set -e -u -o pipefail

pushd ../../../

TARGET_ABI=armeabi-v7a
LIBRARY_DIR=mace/examples/android/macelibrary/src/main/cpp/
INCLUDE_DIR=$LIBRARY_DIR/include/mace/public/
LIBMACE_DIR=$LIBRARY_DIR/lib/$TARGET_ABI/

rm -rf $LIBRARY_DIR/include/
mkdir -p $INCLUDE_DIR

rm -rf $LIBRARY_DIR/lib/
mkdir -p $LIBMACE_DIR

rm -rf $LIBRARY_DIR/model/

python tools/converter.py convert --config=mace/examples/android/mobilenet.yml --target_abis=$TARGET_ABI
cp -rf builds/mobilenet/include/mace/public/*.h $INCLUDE_DIR
cp -rf builds/mobilenet/model $LIBRARY_DIR

bazel build --config android --config optimization mace/libmace:libmace_static --define neon=true --define openmp=true --define opencl=true --cpu=$TARGET_ABI
cp -rf mace/public/*.h $INCLUDE_DIR
cp -rf bazel-genfiles/mace/libmace/libmace.a $LIBMACE_DIR

popd
   
   
  • Modify the model configuration file mobilenet.yml as follows. The available options are described in the official documentation, and configurations for individual models can be found in the Model Zoo. The example below uses MobileNet V2:
library_name: mobilenet
target_abis: [armeabi-v7a]
model_graph_format: code
model_data_format: code
models:
  mobilenet_v2:
    platform: tensorflow
    model_file_path: https://cnbj1.fds.api.xiaomi.com/mace/miai-models/mobilenet-v2/mobilenet-v2-1.0.pb
    model_sha256_checksum: 369f9a5f38f3c15b4311c1c84c032ce868da9f371b5f78c13d3ea3c537389bb4
    subgraphs:
      - input_tensors:
          - input
        input_shapes:
          - 1,224,224,3
        output_tensors:
          - MobilenetV2/Predictions/Reshape_1
        output_shapes:
          - 1,1001
    runtime: cpu+gpu
    limit_opencl_kernel_time: 0
    nnlib_graph_mode: 0
    obfuscate: 0
    winograd: 0
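The shapes above fix the sizes of the arrays exchanged with MACE: the input is an NHWC tensor of 1×224×224×3 floats, and the output holds 1001 class probabilities. A minimal sketch of those sizes in plain Java (TensorSizeDemo is an illustrative name, not part of the project):

```java
public class TensorSizeDemo {
    // input_shapes: 1,224,224,3 (NHWC) -> length of the float array fed to the model
    static int inputLength() { return 1 * 224 * 224 * 3; }

    // output_shapes: 1,1001 -> length of the probability array the model returns
    static int outputLength() { return 1 * 1001; }

    public static void main(String[] args) {
        System.out.println(inputLength() + " " + outputLength()); // prints "150528 1001"
    }
}
```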
   
   
  • Start the build:
./build.sh
   
   
  • After the build completes, three new items appear under mace/mace/examples/android/macelibrary/src/main/cpp/:
    1. include holds the header files for the MACE API and the model configuration
    2. lib holds the compiled MACE library
    3. model holds the compiled model, e.g. the MobileNet V2 model built here

Building with Docker

  • First install Docker:
apt-get install docker.io
   
   
  • Pull the MACE image:
docker pull registry.cn-hangzhou.aliyuncs.com/xiaomimace/mace-dev
   
   
  • Get the MACE source, and modify build.sh and mobilenet.yml under mace/mace/examples/android/ as described in the previous section:
git clone https://github.com/XiaoMi/mace.git
   
   
  • From the MACE root directory, run:
docker run -it -v $PWD:/mace registry.cn-hangzhou.aliyuncs.com/xiaomimace/mace-dev
   
   
  • Then run:
cd mace/mace/examples/android/
./build.sh
   
   

This produces the same files as the previous approach. Docker is much simpler, since it skips most of the dependency setup.

Developing the Android Project

  • Create an Android project

When creating the project, enable C++ support.
(screenshot)

Since MACE supports Android 5.0 as its minimum version, choose Android 5.0 here.
(screenshot)

MACE uses C++11.
(screenshot)

  • Copy the C++ files. Delete the auto-generated C++ files in the cpp directory, then copy the three directories produced by the build, together with the two existing C++ files, into the project's cpp directory, as shown below:
    (screenshot)

  • Modify the CMakeLists.txt build file as follows, so that it builds the C++ files copied in the previous step:

# For more information about using CMake with Android Studio, read the
# documentation: https://d.android.com/studio/projects/add-native-code.html

# Sets the minimum version of CMake required to build the native library.
cmake_minimum_required(VERSION 3.4.1)

# set(CMAKE_LIBRARY_OUTPUT_DIRECTORY ${PROJECT_SOURCE_DIR}/../app/libs/${ANDROID_ABI})

include_directories(${CMAKE_SOURCE_DIR}/)
include_directories(${CMAKE_SOURCE_DIR}/src/main/cpp/include)

set(mace_lib ${CMAKE_SOURCE_DIR}/src/main/cpp/lib/armeabi-v7a/libmace.a)
set(mobilenet_lib ${CMAKE_SOURCE_DIR}/src/main/cpp/model/armeabi-v7a/mobilenet.a)

# Import the prebuilt MACE and model static libraries.
add_library(mace_lib STATIC IMPORTED)
set_target_properties(mace_lib PROPERTIES IMPORTED_LOCATION ${mace_lib})

add_library(mobilenet_lib STATIC IMPORTED)
set_target_properties(mobilenet_lib PROPERTIES IMPORTED_LOCATION ${mobilenet_lib})

# Build the JNI shared library from our source file.
add_library( # Sets the name of the library.
        mace_mobile_jni
        # Sets the library as a shared library.
        SHARED
        # Provides a relative path to your source file(s).
        src/main/cpp/image_classify.cc)

# Searches for the prebuilt NDK log library and stores its path in log-lib.
find_library( # Sets the name of the path variable.
        log-lib
        # Specifies the name of the NDK library that you want CMake to locate.
        log)

# Specifies libraries CMake should link to your target library.
target_link_libraries( # Specifies the target library.
        mace_mobile_jni
        mace_lib
        mobilenet_lib
        # Links the target library to the log library included in the NDK.
        ${log-lib})


  • Modify build.gradle under the app directory as follows:

Change the original

externalNativeBuild {
            cmake {
                cppFlags "-std=c++11"
            }
        }
 
 

to the following, since we only built armeabi-v7a support:

externalNativeBuild {
            cmake {
                cppFlags "-std=c++11 -fopenmp"
                abiFilters "armeabi-v7a"
            }
        }
 
 

Inside the android block, add:

    sourceSets {
        main {
            jniLibs.srcDirs = ["src/main/jniLibs"]
            jni.srcDirs = ['src/cpp']
        }
    }
 
 
  • Set the project's NDK version to r15c. The library was built with r15c, so the Android project must use the same version:
    (screenshot)

  • Create a com.xiaomi.mace package and copy JniMaceUtils.java from the official demo into it. This class is the JNI interface to MACE:

package com.xiaomi.mace;

public class JniMaceUtils {

    static {
        System.loadLibrary("mace_mobile_jni");
    }
    // set model attributes
    public static native int maceMobilenetSetAttrs(int ompNumThreads, int cpuAffinityPolicy, int gpuPerfHint, int gpuPriorityHint, String kernelPath);
    // load the model and choose GPU or CPU
    public static native int maceMobilenetCreateEngine(String model, String device);
    // classify an image
    public static native float[] maceMobilenetClassify(float[] input);
}
 
 
  • Create an InitData.java class in the project's package. It holds the MACE configuration, such as whether to run prediction on the CPU or GPU and which model to load:
package com.example.myapplication;

import android.os.Environment;

import java.io.File;

public class InitData {

    public static final String[] DEVICES = new String[]{"CPU", "GPU"};
    public static final String[] MODELS = new String[]{"mobilenet_v1", "mobilenet_v2"};

    private String model;
    private String device = "";
    private int ompNumThreads;
    private int cpuAffinityPolicy;
    private int gpuPerfHint;
    private int gpuPriorityHint;
    private String kernelPath = "";

    public InitData() {
        model = MODELS[1];
        ompNumThreads = 4;
        cpuAffinityPolicy = 0;
        gpuPerfHint = 3;
        gpuPriorityHint = 3;
        device = DEVICES[0];
        kernelPath = Environment.getExternalStorageDirectory().getAbsolutePath() + File.separator + "mace";
        File file = new File(kernelPath);
        if (!file.exists()) {
            file.mkdir();
        }

    }

    public String getModel() {
        return model;
    }

    public void setModel(String model) {
        this.model = model;
    }

    public String getDevice() {
        return device;
    }

    public void setDevice(String device) {
        this.device = device;
    }

    public int getOmpNumThreads() {
        return ompNumThreads;
    }

    public void setOmpNumThreads(int ompNumThreads) {
        this.ompNumThreads = ompNumThreads;
    }

    public int getCpuAffinityPolicy() {
        return cpuAffinityPolicy;
    }

    public void setCpuAffinityPolicy(int cpuAffinityPolicy) {
        this.cpuAffinityPolicy = cpuAffinityPolicy;
    }

    public int getGpuPerfHint() {
        return gpuPerfHint;
    }

    public void setGpuPerfHint(int gpuPerfHint) {
        this.gpuPerfHint = gpuPerfHint;
    }

    public int getGpuPriorityHint() {
        return gpuPriorityHint;
    }

    public void setGpuPriorityHint(int gpuPriorityHint) {
        this.gpuPriorityHint = gpuPriorityHint;
    }

    public String getKernelPath() {
        return kernelPath;
    }

    public void setKernelPath(String kernelPath) {
        this.kernelPath = kernelPath;
    }
}
 
 
  • Also in the project's package, create PhotoUtil.java. This utility class launches the camera and returns the captured image's absolute path, and converts an image into the float array that MACE takes as prediction input:
package com.example.myapplication;

import android.app.Activity;
import android.content.Context;
import android.content.Intent;
import android.database.Cursor;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.net.Uri;
import android.os.Build;
import android.provider.MediaStore;
import android.support.v4.content.FileProvider;

import java.io.File;
import java.io.IOException;
import java.nio.FloatBuffer;


public class PhotoUtil {

    // start camera
    public static Uri start_camera(Activity activity, int requestCode) {
        Uri imageUri;
        // save image in cache path
        File outputImage = new File(activity.getExternalCacheDir(), "out_image.jpg");
        try {
            if (outputImage.exists()) {
                outputImage.delete();
            }
            outputImage.createNewFile();
        } catch (IOException e) {
            e.printStackTrace();
        }
        if (Build.VERSION.SDK_INT >= 24) {
            // compatible with Android 7.0 or over
            imageUri = FileProvider.getUriForFile(activity,
                    "com.example.myapplication", outputImage);
        } else {
            imageUri = Uri.fromFile(outputImage);
        }
        // set system camera Action
        Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
        // set save photo path
        intent.putExtra(MediaStore.EXTRA_OUTPUT, imageUri);
        // set photo quality, min is 0, max is 1
        intent.putExtra(MediaStore.EXTRA_VIDEO_QUALITY, 0);
        activity.startActivityForResult(intent, requestCode);
        return imageUri;
    }

    // get picture in photo
    public static void use_photo(Activity activity, int requestCode){
        Intent intent = new Intent(Intent.ACTION_PICK);
        intent.setType("image/*");
        activity.startActivityForResult(intent, requestCode);
    }

    // get photo from Uri
    public static String get_path_from_URI(Context context, Uri uri) {
        String result;
        Cursor cursor = context.getContentResolver().query(uri, null, null, null, null);
        if (cursor == null) {
            result = uri.getPath();
        } else {
            cursor.moveToFirst();
            int idx = cursor.getColumnIndex(MediaStore.Images.ImageColumns.DATA);
            result = cursor.getString(idx);
            cursor.close();
        }
        return result;
    }

    // Compress the image to the size of the training image
    public static float[] getScaledMatrix(Bitmap bitmap, int desWidth,
                                          int desHeight) {
        // create data buffer
        float[] floatValues = new float[desWidth * desHeight * 3];
        FloatBuffer floatBuffer = FloatBuffer.wrap(floatValues, 0, desWidth * desHeight * 3);
        floatBuffer.rewind();
        // get image pixel
        int[] pixels = new int[desWidth * desHeight];
        Bitmap bm = Bitmap.createScaledBitmap(bitmap, desWidth, desHeight, false);
        bm.getPixels(pixels, 0, bm.getWidth(), 0, 0, desWidth, desHeight);
        // pixel to data
        for (int clr : pixels) {
            floatBuffer.put((((clr >> 16) & 0xFF) - 128f) / 128f);
            floatBuffer.put((((clr >> 8) & 0xFF) - 128f) / 128f);
            floatBuffer.put(((clr & 0xFF) - 128f) / 128f);
        }
        if (!bm.isRecycled()) {
            bm.recycle();
        }
        return floatBuffer.array();
    }

    // compress picture
    public static Bitmap getScaleBitmap(String filePath) {
        BitmapFactory.Options opt = new BitmapFactory.Options();
        opt.inJustDecodeBounds = true;
        BitmapFactory.decodeFile(filePath, opt);

        int bmpWidth = opt.outWidth;
        int bmpHeight = opt.outHeight;

        int maxSize = 500;

        // compress picture with inSampleSize
        opt.inSampleSize = 1;
        while (true) {
            if (bmpWidth / opt.inSampleSize < maxSize || bmpHeight / opt.inSampleSize < maxSize) {
                break;
            }
            opt.inSampleSize *= 2;
        }
        opt.inJustDecodeBounds = false;
        return BitmapFactory.decodeFile(filePath, opt);
    }
}
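The core of getScaledMatrix is the per-channel normalization: each 8-bit RGB channel value c is mapped to (c - 128) / 128, which scales it into roughly [-1, 1). A standalone sketch of just that step (NormalizeDemo is an illustrative name, not part of the project):

```java
public class NormalizeDemo {
    // Same mapping as in getScaledMatrix: one ARGB pixel to three normalized floats.
    static float[] normalize(int pixel) {
        float r = (((pixel >> 16) & 0xFF) - 128f) / 128f;
        float g = (((pixel >> 8) & 0xFF) - 128f) / 128f;
        float b = ((pixel & 0xFF) - 128f) / 128f;
        return new float[]{r, g, b};
    }

    public static void main(String[] args) {
        // R=0xFF, G=0x80, B=0x40
        float[] v = normalize(0xFF8040);
        System.out.println(v[0] + " " + v[1] + " " + v[2]); // prints "0.9921875 0.0 -0.5"
    }
}
```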
 
 
  • Modify MainActivity.java as follows. It offers two functions: picking an image from the gallery for prediction, and launching the camera to take a photo for prediction. When the app starts, init_model() configures MACE and loads the model; load_model(String model) can be called to switch models. predict_image(String image_path) runs prediction on an image and displays the result:
package com.example.myapplication;

import android.Manifest;
import android.app.Activity;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.content.res.AssetManager;
import android.graphics.Bitmap;
import android.net.Uri;
import android.os.Bundle;
import android.support.annotation.NonNull;
import android.support.annotation.Nullable;
import android.support.v4.app.ActivityCompat;
import android.support.v4.content.ContextCompat;
import android.support.v7.app.AppCompatActivity;
import android.text.method.ScrollingMovementMethod;
import android.util.Log;
import android.view.View;
import android.widget.Button;
import android.widget.ImageView;
import android.widget.TextView;
import android.widget.Toast;

import com.bumptech.glide.Glide;
import com.bumptech.glide.load.engine.DiskCacheStrategy;
import com.bumptech.glide.request.RequestOptions;
import com.xiaomi.mace.JniMaceUtils;

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class MainActivity extends AppCompatActivity {
    private static final String TAG = MainActivity.class.getName();
    private static final int USE_PHOTO = 1001;
    private static final int START_CAMERA = 1002;
    private Uri camera_image_path;
    private ImageView show_image;
    private TextView result_text;
    private boolean load_result = false;
    private int[] ddims = {1, 3, 224, 224};
    private int model_index = 1;
    private InitData initData = new InitData();
    private List<String> resultLabel = new ArrayList<>();

    private static final String[] PADDLE_MODEL = {
            "mobilenet_v1",
            "mobilenet_v2"
    };


    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        init_view();
        init_model();
        readCacheLabelFromLocalFile();
    }

    // initialize view
    private void init_view() {
        request_permissions();
        show_image = (ImageView) findViewById(R.id.show_image);
        result_text = (TextView) findViewById(R.id.result_text);
        result_text.setMovementMethod(ScrollingMovementMethod.getInstance());
        Button use_photo = (Button) findViewById(R.id.use_photo);
        Button start_photo = (Button) findViewById(R.id.start_camera);


        // use photo click
        use_photo.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                if (!load_result) {
                    Toast.makeText(MainActivity.this, "never load model", Toast.LENGTH_SHORT).show();
                    return;
                }
                PhotoUtil.use_photo(MainActivity.this, USE_PHOTO);
            }
        });

        // start camera click
        start_photo.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                if (!load_result) {
                    Toast.makeText(MainActivity.this, "never load model", Toast.LENGTH_SHORT).show();
                    return;
                }
                camera_image_path = PhotoUtil.start_camera(MainActivity.this, START_CAMERA);
            }
        });
    }

    // init mace environment
    private void init_model() {
        int result = JniMaceUtils.maceMobilenetSetAttrs(
                initData.getOmpNumThreads(), initData.getCpuAffinityPolicy(),
                initData.getGpuPerfHint(), initData.getGpuPriorityHint(),
                initData.getKernelPath());
        Log.i(TAG, "maceMobilenetSetAttrs result = " + result);

        load_model(PADDLE_MODEL[model_index]);
    }

    // load infer model
    private void load_model(String model) {
        // set will load model name
        initData.setModel(model);
        // load model
        int result = JniMaceUtils.maceMobilenetCreateEngine(initData.getModel(), initData.getDevice());
        Log.i(TAG, "maceMobilenetCreateEngine result = " + result);
        // set load model result
        load_result = result == 0;
        if (load_result) {
            Toast.makeText(MainActivity.this, model + " model load success", Toast.LENGTH_SHORT).show();
            Log.d(TAG, model + " model load success");
        } else {
            Toast.makeText(MainActivity.this, model + " model load fail", Toast.LENGTH_SHORT).show();
            Log.d(TAG, model + " model load fail");
        }
    }


    private void readCacheLabelFromLocalFile() {
        try {
            AssetManager assetManager = getApplicationContext().getAssets();
            BufferedReader reader = new BufferedReader(new InputStreamReader(assetManager.open("cacheLabel.txt")));
            String readLine = null;
            while ((readLine = reader.readLine()) != null) {
                resultLabel.add(readLine);
            }
            reader.close();
        } catch (Exception e) {
            Log.e("labelCache", "error " + e);
        }
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, @Nullable Intent data) {
        String image_path;
        RequestOptions options = new RequestOptions().skipMemoryCache(true).diskCacheStrategy(DiskCacheStrategy.NONE);
        if (resultCode == Activity.RESULT_OK) {
            switch (requestCode) {
                case USE_PHOTO:
                    if (data == null) {
                        Log.w(TAG, "user photo data is null");
                        return;
                    }
                    Uri image_uri = data.getData();
                    Glide.with(MainActivity.this).load(image_uri).apply(options).into(show_image);
                    // get image path from uri
                    image_path = PhotoUtil.get_path_from_URI(MainActivity.this, image_uri);
                    // predict image
                    predict_image(image_path);
                    break;
                case START_CAMERA:
                    // show photo
                    Glide.with(MainActivity.this).load(camera_image_path).apply(options).into(show_image);
                    image_path = PhotoUtil.get_path_from_URI(MainActivity.this, camera_image_path);
                    // predict image
                    predict_image(image_path);
                    break;
            }
        }
    }

    //  predict image
    private void predict_image(String image_path) {
        // picture to float array
        Bitmap bmp = PhotoUtil.getScaleBitmap(image_path);
        float[] inputData = PhotoUtil.getScaledMatrix(bmp, ddims[2], ddims[3]);
        try {
            // Data format conversion takes too long
            // Log.d("inputData", Arrays.toString(inputData));
            long start = System.currentTimeMillis();
            // get predict result
            float[] result = JniMaceUtils.maceMobilenetClassify(inputData);
            long end = System.currentTimeMillis();
            Log.d(TAG, "origin predict result:" + Arrays.toString(result));
            long time = end - start;
            Log.d("result length", String.valueOf(result.length));
            // show predict result and time
            int r = get_max_result(result);
            String show_text = "result:" + r + "\nname:" + resultLabel.get(r) + "\nprobability:" + result[r] + "\ntime:" + time + "ms";
            result_text.setText(show_text);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    // get max probability label
    private int get_max_result(float[] result) {
        float probability = result[0];
        int r = 0;
        for (int i = 0; i < result.length; i++) {
            if (probability < result[i]) {
                probability = result[i];
                r = i;
            }
        }
        return r;
    }

    // request permissions
    private void request_permissions() {

        List<String> permissionList = new ArrayList<>();
        if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
            permissionList.add(Manifest.permission.CAMERA);
        }

        if (ContextCompat.checkSelfPermission(this, Manifest.permission.WRITE_EXTERNAL_STORAGE) != PackageManager.PERMISSION_GRANTED) {
            permissionList.add(Manifest.permission.WRITE_EXTERNAL_STORAGE);
        }

        if (ContextCompat.checkSelfPermission(this, Manifest.permission.READ_EXTERNAL_STORAGE) != PackageManager.PERMISSION_GRANTED) {
            permissionList.add(Manifest.permission.READ_EXTERNAL_STORAGE);
        }

        // if list is not empty will request permissions
        if (!permissionList.isEmpty()) {
            ActivityCompat.requestPermissions(this, permissionList.toArray(new String[permissionList.size()]), 1);
        }
    }

    @Override
    public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
        super.onRequestPermissionsResult(requestCode, permissions, grantResults);
        switch (requestCode) {
            case 1:
                if (grantResults.length > 0) {
                    for (int i = 0; i < grantResults.length; i++) {

                        int grantResult = grantResults[i];
                        if (grantResult == PackageManager.PERMISSION_DENIED) {
                            String s = permissions[i];
                            Toast.makeText(this, s + " permission was denied", Toast.LENGTH_SHORT).show();
                        }
                    }
                }
                break;
        }
    }
}
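The displayed result comes from get_max_result, a plain argmax over the probability array returned by maceMobilenetClassify. Isolated as a runnable sketch (ArgMaxDemo is an illustrative name, not part of the project):

```java
public class ArgMaxDemo {
    // Same logic as get_max_result in MainActivity: index of the largest value.
    static int argMax(float[] result) {
        float best = result[0];
        int r = 0;
        for (int i = 1; i < result.length; i++) {
            if (result[i] > best) {
                best = result[i];
                r = i;
            }
        }
        return r;
    }

    public static void main(String[] args) {
        System.out.println(argMax(new float[]{0.02f, 0.91f, 0.07f})); // prints "1"
    }
}
```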
 
 
  • Create an assets directory under main and add the cacheLabel.txt label file to it.

  • Finally, don't forget to add the permissions to AndroidManifest.xml:

<uses-permission android:name="android.permission.CAMERA"/>
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE"/>
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
 
 

The final result looks like this:
(screenshot)

Note: this project's camera handling is not fully compatible with Android 7.0.
All of the code is above; readers who want an easier start can download the complete project directly.

References


  1. https://github.com/XiaoMi/mace
  2. https://mace.readthedocs.io/en/latest/
