Fishing Bite Detection App

Building a fishing bite detection app:

1. Introduction

As a fishing enthusiast, a slow day with no bites is especially frustrating: you have to keep staring at the float, sometimes for hours on end, wasting a great deal of time. To solve this problem, this article shows how to build a bite detection app. Simply point the phone camera at the float, pinch with two fingers to zoom the float to a suitable size, and fix the phone in place. When a bite occurs, the phone automatically sounds an alarm, so you no longer have to sit and watch the float.

2. Preview

2.1 PC demo

Fishing helper


Video source: https://www.bilibili.com/video/BV1ea411j7QS/

2.2 Mobile demo

Since I am currently at school and cannot do a field test, I could only point the phone camera at a computer screen for detection. The results seem decent; I will try it outdoors for a real test during the holidays.


Video source: https://www.bilibili.com/video/BV1YY4y1x71b/

2.3 App download

You are welcome to download and try it out. App download link: https://www.pgyer.com/tUCJ

Dataset

1. Dataset overview

Most of the images in my dataset are screenshots captured from short videos on the Kuaishou platform. The dataset contains just over 270 images, all of which have been annotated. Some of the images are shown below:
(sample dataset images)

2. Dataset download

Download link (天翼云盘 cloud drive): https://cloud.189.cn/t/BvEbI3Mjaui2 (access code: rhv2)

Training the model

The network I used is yolov5n, the smallest model in the YOLO series. For deployment to mobile, the model cannot be too large, or real-time detection becomes impossible.
You can download the yolov5 source code and train it yourself; training is straightforward, so I will not elaborate here.
yolov5 source code: https://github.com/ultralytics/yolov5

If you do not want to retrain from scratch, you can download my already-trained model, which I have converted to ncnn format. Download link:

https://cloud.189.cn/t/yieInaRjuEFz (访问码:lds2)

Detecting a bite

Detecting a bite is also straightforward. Once the float is detected, we have the height h of its bounding box and the x coordinate of the box's top-left corner, as shown in the figure below. If h changes by more than a set threshold, we conclude a fish is biting; likewise, if x changes by more than its own threshold, we also conclude a fish is biting, and the phone sounds the alarm. Finally, the reference values of x and h are refreshed every few seconds.

(figure: the detected float's bounding box, with height h and top-left x)
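The rule above can be sketched as a small self-contained helper. This is an illustrative sketch only, not the app's actual code (the app implements the same idea inside MainActivity); the class and member names here are my own, and the thresholds are expressed as relative change, matching how the app compares values against the stored baseline.

```cpp
#include <cassert>
#include <cmath>

// Illustrative sketch of the bite rule (names are mine, not from the app).
// A baseline height/x is stored and refreshed every few seconds; a bite is
// flagged when either value drifts past its relative-change threshold.
struct BiteDetector {
    float baseline_h = 0.f;   // reference float height
    float baseline_x = 0.f;   // reference top-left x
    float h_thresh;           // relative-change threshold for h
    float x_thresh;           // relative-change threshold for x

    BiteDetector(float ht, float xt) : h_thresh(ht), x_thresh(xt) {}

    // Store new reference values (the app does this every few seconds).
    void refresh(float h, float x) { baseline_h = h; baseline_x = x; }

    // True when h or x has moved past its threshold relative to the baseline.
    bool bite(float h, float x) const {
        if (baseline_h == 0.f) return false;  // no baseline yet
        float dh = std::fabs(h / baseline_h - 1.f);
        float dx = (baseline_x == 0.f) ? 0.f
                                       : std::fabs(x / baseline_x - 1.f);
        return dh > h_thresh || dx > x_thresh;
    }
};
```

With a 20% threshold, for example, a box whose height shrinks or grows by more than a fifth relative to the last refresh would trigger the alarm.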

Model deployment

1. Model conversion

After training, to run ncnn inference on mobile, the model must first be converted to ONNX format and then further converted to ncnn format. For the specific conversion steps, see: https://blog.csdn.net/weixin_41693877/article/details/120720939

2. Building the app

During model inference, JNI is used to call the C++ code from Android. The inference code is given below:

#include <jni.h>
#include <string>
#include <android/asset_manager_jni.h>
#include <android/bitmap.h>
#include <android/log.h>

#include <cmath>
#include <cfloat>     // FLT_MAX
#include <algorithm>  // std::max, std::min

#include <vector>

// ncnn
#include "layer.h"
#include "net.h"
#include "benchmark.h"

static ncnn::UnlockedPoolAllocator g_blob_pool_allocator;
static ncnn::PoolAllocator g_workspace_pool_allocator;
static ncnn::Net SSD;

struct Object
{
    float x;
    float y;
    float w;
    float h;
    int label;
    float prob;
};

static float soft_sum(std::vector<float>& v){
    float sum=0;
    float len=0;

    for(float f:v){
        sum+=f;
    }
    for (int i = 0; i < v.size(); i++)
    {
        float a=v[i]/sum*i;
//        v[i]=a;
        len+=a;

    }
    return len;


}

static inline float intersection_area(const Object& a, const Object& b)
{
    // top-left corner of the intersection
    float inter_x0 = std::max(a.x - 0.5f * a.w, b.x - 0.5f * b.w);
    float inter_y0 = std::max(a.y - 0.5f * a.h, b.y - 0.5f * b.h);

    // bottom-right corner of the intersection
    float inter_x1 = std::min(a.x + 0.5f * a.w, b.x + 0.5f * b.w);
    float inter_y1 = std::min(a.y + 0.5f * a.h, b.y + 0.5f * b.h);

    float inter_width = inter_x1 - inter_x0;
    float inter_height = inter_y1 - inter_y0;

    if(inter_height<=0 || inter_width<=0){
        return 0.f;
    }

    return inter_width * inter_height;
}

static void qsort_descent_inplace(std::vector<Object>& faceobjects, int left, int right)
{
    int i = left;
    int j = right;
    float p = faceobjects[(left + right) / 2].prob;

    while (i <= j)
    {
        while (faceobjects[i].prob > p)
            i++;

        while (faceobjects[j].prob < p)
            j--;

        if (i <= j)
        {
            // swap
            std::swap(faceobjects[i], faceobjects[j]);

            i++;
            j--;
        }
    }

#pragma omp parallel sections
    {
#pragma omp section
        {
            if (left < j) qsort_descent_inplace(faceobjects, left, j);
        }
#pragma omp section
        {
            if (i < right) qsort_descent_inplace(faceobjects, i, right);
        }
    }
}
static void qsort_descent_inplace(std::vector<Object>& faceobjects)
{
    if (faceobjects.empty())
        return;

    qsort_descent_inplace(faceobjects, 0, faceobjects.size() - 1);
}
static void nms_sorted_bboxes(const std::vector<Object>& faceobjects, std::vector<int>& picked, float nms_threshold)
{
    picked.clear();

    const int n = faceobjects.size();

    std::vector<float> areas(n);
    for (int i = 0; i < n; i++)
    {
        areas[i] = faceobjects[i].w * faceobjects[i].h;
    }

    for (int i = 0; i < n; i++)
    {
        const Object& a = faceobjects[i];

        int keep = 1;
        for (int j = 0; j < (int)picked.size(); j++)
        {
            const Object& b = faceobjects[picked[j]];

            // intersection over union
            float inter_area = intersection_area(a, b);
            float union_area = areas[i] + areas[picked[j]] - inter_area;
            // float IoU = inter_area / union_area
            if (inter_area / union_area > nms_threshold)
                keep = 0;
        }

        if (keep)
            picked.push_back(i);
    }
}

static inline float sigmoid(float x)
{
    return static_cast<float>(1.f / (1.f + exp(-x)));
}
static void generate_proposals(const ncnn::Mat& anchors, int stride, const ncnn::Mat& in_pad, const ncnn::Mat& feat_blob, float prob_threshold, std::vector<Object>& objects)
{
    const int num_grid = feat_blob.h;

    int num_grid_x;
    int num_grid_y;
    if (in_pad.w > in_pad.h)
    {
        num_grid_x = in_pad.w / stride;
        num_grid_y = num_grid / num_grid_x;
    }
    else
    {
        num_grid_y = in_pad.h / stride;
        num_grid_x = num_grid / num_grid_y;
    }

    const int num_class = feat_blob.w - 5;

    const int num_anchors = anchors.w / 2;

    for (int q = 0; q < num_anchors; q++)
    {
        const float anchor_w = anchors[q * 2];
        const float anchor_h = anchors[q * 2 + 1];

        const ncnn::Mat feat = feat_blob.channel(q);

        for (int i = 0; i < num_grid_y; i++)
        {
            for (int j = 0; j < num_grid_x; j++)
            {
                //get this grid cell's row of predictions
                const float* featptr = feat.row(i * num_grid_x + j);

                // find class index with max class score
                int class_index = 0;
                float class_score = -FLT_MAX;

                //find the maximum class score and its index
                for (int k = 0; k < num_class; k++)
                {
                    float score = featptr[5 + k];
                    if (score > class_score)
                    {
                        class_index = k;
                        class_score = score;
                    }
                }

                //objectness score of the box
                float box_score = featptr[4];
                //overall confidence = sigmoid(objectness) * sigmoid(class score)
                float confidence = sigmoid(box_score) * sigmoid(class_score);

                if (confidence >= prob_threshold)
                {
                    // yolov5/models/yolo.py Detect forward
                    // y = x[i].sigmoid()
                    // y[..., 0:2] = (y[..., 0:2] * 2. - 0.5 + self.grid[i].to(x[i].device)) * self.stride[i]  # xy
                    // y[..., 2:4] = (y[..., 2:4] * 2) ** 2 * self.anchor_grid[i]  # wh

                    float dx = sigmoid(featptr[0]);
                    float dy = sigmoid(featptr[1]);
                    float dw = sigmoid(featptr[2]);
                    float dh = sigmoid(featptr[3]);

                    float pb_cx = (dx * 2.f - 0.5f + j) * stride;
                    float pb_cy = (dy * 2.f - 0.5f + i) * stride;

                    float pb_w = pow(dw * 2.f, 2) * anchor_w;
                    float pb_h = pow(dh * 2.f, 2) * anchor_h;

                    float x0 = pb_cx - pb_w * 0.5f;
                    float y0 = pb_cy - pb_h * 0.5f;
                    float x1 = pb_cx + pb_w * 0.5f;
                    float y1 = pb_cy + pb_h * 0.5f;

                    Object obj;
                    obj.x = x0;
                    obj.y = y0;
                    obj.w = x1 - x0;
                    obj.h = y1 - y0;
                    obj.label = class_index;
                    obj.prob = confidence;

                    objects.push_back(obj);
                }
            }
        }
    }
}


// FIXME DeleteGlobalRef is missing for objCls
static jclass objCls = NULL;
static jmethodID constructortorId;
static jfieldID xId;
static jfieldID yId;
static jfieldID wId;
static jfieldID hId;
static jfieldID labelId;
static jfieldID probId;

//initialization function
extern "C" JNIEXPORT jboolean JNICALL

Java_com_myapp_dyu2_SSd_Init(JNIEnv *env, jobject thiz, jobject assetManager) {
    // TODO: implement Init()
    ncnn::Option opt;
    opt.lightmode = true;
    opt.num_threads = 4;
    opt.blob_allocator = &g_blob_pool_allocator;
    opt.workspace_allocator = &g_workspace_pool_allocator;
    opt.use_packing_layout = true;


    // use vulkan compute
    if (ncnn::get_gpu_count() != 0)
        opt.use_vulkan_compute = true;

    AAssetManager* mgr = AAssetManager_fromJava(env, assetManager);

    SSD.opt = opt;
    // init param
    {
        int ret = SSD.load_param(mgr, "model.param");
        if (ret != 0)
        {
            __android_log_print(ANDROID_LOG_DEBUG, "aa", "load_param failed");
            return JNI_FALSE;
        }
    }

    // init bin
    {
        int ret = SSD.load_model(mgr, "model.bin");
        if (ret != 0)
        {
            __android_log_print(ANDROID_LOG_DEBUG, "aa", "load_model failed");
            return JNI_FALSE;
        }
    }
    // init jni glue
    //look up the corresponding Java class
    jclass localObjCls = env->FindClass("com/myapp/dyu2/SSd$Obj");
    objCls = reinterpret_cast<jclass>(env->NewGlobalRef(localObjCls));


    constructortorId = env->GetMethodID(objCls, "<init>", "(Lcom/myapp/dyu2/SSd;)V");

    xId = env->GetFieldID(objCls, "x", "F");
    yId = env->GetFieldID(objCls, "y", "F");
    wId = env->GetFieldID(objCls, "w", "F");
    hId = env->GetFieldID(objCls, "h", "F");
    labelId = env->GetFieldID(objCls, "label", "Ljava/lang/String;");
    probId = env->GetFieldID(objCls, "prob", "F");



    return JNI_TRUE;

}

static int max(int a,int b){
    if (a>b){
        return a;
    } else{return b;}
}
static int min(int a,int b){
    if (a>b){
        return b;
    } else{return a;}
}

extern "C"
JNIEXPORT jobjectArray JNICALL
Java_com_myapp_dyu2_SSd_Detect(JNIEnv *env, jobject thiz, jobject bitmap, jboolean use_gpu,jfloat prob_threshold ) {

    // TODO: implement Detect()
    if (use_gpu == JNI_TRUE && ncnn::get_gpu_count() == 0)
    {
        return NULL;
        //return env->NewStringUTF("no vulkan capable gpu");
    }
    //record the current time
    double start_time = ncnn::get_current_time();

    AndroidBitmapInfo info;
    AndroidBitmap_getInfo(env, bitmap, &info);
    //width and height of the original image
    const int width = info.width;
    const int height = info.height;
    if (info.format != ANDROID_BITMAP_FORMAT_RGBA_8888)
        return NULL;
    // ncnn from bitmap
    const int target_size = 640;

    ncnn::Mat in = ncnn::Mat::from_android_bitmap_resize(env, bitmap, ncnn::Mat::PIXEL_RGB, target_size, target_size);


    const float norm_vals[3] = {1 / 255.f, 1 / 255.f, 1 / 255.f};
    in.substract_mean_normalize(0, norm_vals);

    ncnn::Extractor ex = SSD.create_extractor();

    ex.set_vulkan_compute(use_gpu);

    ex.input("images", in);


//    prob_threshold = 0.25f;
    const float nms_threshold = 0.45f;

    //boxes that pass the confidence threshold are collected here
    std::vector<Object> objects;


    std::vector<Object> proposals;
    // stride 8
    {
        ncnn::Mat out;
        ex.extract("output", out);

        ncnn::Mat anchors(6);
        anchors[0] = 10.f;
        anchors[1] = 13.f;
        anchors[2] = 16.f;
        anchors[3] = 30.f;
        anchors[4] = 33.f;
        anchors[5] = 23.f;

        std::vector<Object> objects8;
        generate_proposals(anchors, 8, in, out, prob_threshold, objects8);

        proposals.insert(proposals.end(), objects8.begin(), objects8.end());
    }
    // stride 16
    {
        ncnn::Mat out;
        ex.extract("365", out);

        ncnn::Mat anchors(6);
        anchors[0] = 30.f;
        anchors[1] = 61.f;
        anchors[2] = 62.f;
        anchors[3] = 45.f;
        anchors[4] = 59.f;
        anchors[5] = 119.f;

        std::vector<Object> objects16;
        generate_proposals(anchors, 16, in, out, prob_threshold, objects16);

        proposals.insert(proposals.end(), objects16.begin(), objects16.end());
    }

    // stride 32
    {
        ncnn::Mat out;
        ex.extract("385", out);

        ncnn::Mat anchors(6);
        anchors[0] = 116.f;
        anchors[1] = 90.f;
        anchors[2] = 156.f;
        anchors[3] = 198.f;
        anchors[4] = 373.f;
        anchors[5] = 326.f;

        std::vector<Object> objects32;
        generate_proposals(anchors, 32, in, out, prob_threshold, objects32);

        proposals.insert(proposals.end(), objects32.begin(), objects32.end());
    }


    // sort all proposals by score from highest to lowest
    qsort_descent_inplace(proposals);

    // apply nms with nms_threshold
    std::vector<int> picked;
    nms_sorted_bboxes(proposals, picked, nms_threshold);

    int count = picked.size();


    //objects size 0=>2
    objects.resize(count);
    for (int i = 0; i < count; i++)
    {
        objects[i] = proposals[picked[i]];

        // adjust offset to original unpadded
        float wpad=0;
        float hpad=0;
        float scalex=1;//640.f/(float)width;
        float scaley=1;//640.f/(float)height;

        float x0 = (objects[i].x - (wpad / 2)) / scalex;
        float y0 = (objects[i].y - (hpad / 2)) / scaley;
        float x1 = (objects[i].x + objects[i].w - (wpad / 2)) / scalex;
        float y1 = (objects[i].y + objects[i].h - (hpad / 2)) / scaley;

//        __android_log_print(ANDROID_LOG_DEBUG, "aa", x0);

        // clip
//        x0 = std::max(std::min(x0, (float)(width - 1)), 0.f);
//        y0 = std::max(std::min(y0, (float)(height - 1)), 0.f);
//        x1 = std::max(std::min(x1, (float)(width - 1)), 0.f);
//        y1 = std::max(std::min(y1, (float)(height - 1)), 0.f);
        x0 = std::max(std::min(x0/640.f, 1.f), 0.f);
        y0 = std::max(std::min(y0/640.f, 1.f), 0.f);
        x1 = std::max(std::min(x1/640.f, 1.f), 0.f);
        y1 = std::max(std::min(y1/640.f, 1.f), 0.f);

        objects[i].x = x0;
        objects[i].y = y0;
        objects[i].w = x1 - x0;
        objects[i].h = y1 - y0;

    }


    static const char* class_names[]={"浮漂"}; // the single class: "浮漂" (fishing float)

    jobjectArray jObjArray = env->NewObjectArray(objects.size(), objCls, NULL);
    for (size_t i=0; i<objects.size(); i++)
    {
        jobject jObj = env->NewObject(objCls, constructortorId, thiz);

        env->SetFloatField(jObj, xId, objects[i].x);
        env->SetFloatField(jObj, yId, objects[i].y);
        env->SetFloatField(jObj, wId, objects[i].w);
        env->SetFloatField(jObj, hId, objects[i].h);
        env->SetObjectField(jObj, labelId, env->NewStringUTF(class_names[objects[i].label]));
        env->SetFloatField(jObj, probId, objects[i].prob);

        env->SetObjectArrayElement(jObjArray, i, jObj);
    }

    return jObjArray;


}

The MainActivity.java code is given below:

package com.myapp.dyu2;

import androidx.annotation.NonNull;
import androidx.appcompat.app.AppCompatActivity;
import androidx.camera.core.Camera;
import androidx.camera.core.CameraControl;
import androidx.camera.core.CameraSelector;
import androidx.camera.core.ImageAnalysis;
import androidx.camera.core.ImageCapture;
import androidx.camera.core.ImageProxy;
import androidx.camera.core.Preview;
import androidx.camera.extensions.HdrImageCaptureExtender;
import androidx.camera.lifecycle.ProcessCameraProvider;
import androidx.camera.view.PreviewView;
import androidx.core.app.ActivityCompat;
import androidx.core.content.ContextCompat;
import androidx.lifecycle.LifecycleOwner;

import android.annotation.SuppressLint;
import android.content.pm.PackageManager;
import android.content.res.Configuration;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.Canvas;
import android.graphics.ImageFormat;
import android.graphics.Matrix;
import android.graphics.Paint;
import android.graphics.PixelFormat;
import android.graphics.Rect;
import android.graphics.YuvImage;
import android.media.MediaPlayer;
import android.os.Bundle;
import android.os.CountDownTimer;
import android.util.DisplayMetrics;
import android.util.Log;
import android.view.MotionEvent;
import android.view.View;
import android.widget.AdapterView;
import android.widget.Button;
import android.widget.ImageButton;
import android.widget.ImageView;
import android.widget.RelativeLayout;
import android.widget.Spinner;
import android.widget.TextView;

import com.google.common.util.concurrent.ListenableFuture;

import java.io.ByteArrayOutputStream;
import java.nio.ByteBuffer;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.Executor;
import java.util.concurrent.Executors;

public class MainActivity extends AppCompatActivity {

    // Used to load the 'native-lib' library on application startup.
//    static {
//        System.loadLibrary("native-lib");
//    }

    private SSd sSdnet=new SSd();
//    private ImageView imageView;
    private Mypreview mPreviewView;
    private TextView textView;
    private Myview myview;


    int screenWidth;
    int screenHeight;

    private int REQUEST_CODE_PERMISSIONS = 1001;
    private Executor executor = Executors.newSingleThreadExecutor();

    private Spinner spinner;
    private Spinner spinner2;
    private Spinner spinner3;
    private Spinner spinner4;
    private ImageButton imageButton;
    private RelativeLayout relativeLayout;
    private Button button;

    private boolean start=false;
    private boolean us_gpu=false;
    float conf_thred=0.5f;
    float music_thred=0.2f;
    float time_thred=3.f;

    long toch_time=0;
    @SuppressLint("ClickableViewAccessibility")
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        DisplayMetrics dm = new DisplayMetrics();
        getWindowManager().getDefaultDisplay().getMetrics(dm);
        screenWidth = dm.widthPixels;
        screenHeight = dm.heightPixels;

        //get the current configuration (used to read screen orientation)
        mConfiguration = this.getResources().getConfiguration();


        boolean init=sSdnet.Init(getAssets());
        Log.i("aa",init+"");

        imageButton=findViewById(R.id.image_btn);
        imageButton.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                start=!start;
                if (start){
                    imageButton.setImageBitmap(BitmapFactory.decodeResource(getResources(),R.mipmap.open));
                }else {
                    imageButton.setImageBitmap(BitmapFactory.decodeResource(getResources(),R.mipmap.close));
                }
            }
        });

        spinner=findViewById(R.id.spin);
        spinner.setSelection(0);
        spinner.setOnItemSelectedListener(new AdapterView.OnItemSelectedListener() {
            @Override
            public void onItemSelected(AdapterView<?> parent, View view, int position, long id) {
                String string=parent.getSelectedItem().toString();
                if (string.equals("GPU")){
                    us_gpu=true;
                }else {
                    us_gpu=false;
                }
                Log.i("aa",""+us_gpu);
            }

            @Override
            public void onNothingSelected(AdapterView<?> parent) {

            }
        });

        spinner2=findViewById(R.id.spin2);
        spinner2.setSelection(5);
        spinner2.setOnItemSelectedListener(new AdapterView.OnItemSelectedListener() {
            @Override
            public void onItemSelected(AdapterView<?> parent, View view, int position, long id) {
//                String string=parent.getSelectedItem().toString();
                conf_thred=0.25f+0.05f*(float)position;
//                Log.i("aa",""+conf_thred);
            }

            @Override
            public void onNothingSelected(AdapterView<?> parent) {

            }
        });

        spinner3=findViewById(R.id.spin3);
        spinner3.setSelection(2);
        spinner3.setOnItemSelectedListener(new AdapterView.OnItemSelectedListener() {
            @Override
            public void onItemSelected(AdapterView<?> parent, View view, int position, long id) {
                music_thred=0.1f+0.05f*(float)position;
                Log.i("aa",""+music_thred);
            }

            @Override
            public void onNothingSelected(AdapterView<?> parent) {

            }
        });

        spinner4=findViewById(R.id.spin4);
        spinner4.setSelection(5);
        spinner4.setOnItemSelectedListener(new AdapterView.OnItemSelectedListener() {
            @Override
            public void onItemSelected(AdapterView<?> parent, View view, int position, long id) {
                time_thred=0.5f+0.5f*(float)position;
                Log.i("aa",""+time_thred);
            }

            @Override
            public void onNothingSelected(AdapterView<?> parent) {

            }
        });

        button=findViewById(R.id.btn1);
        button.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                relativeLayout.setVisibility(View.GONE);
            }
        });


        mPreviewView=findViewById(R.id.mypreview);

        textView=findViewById(R.id.txt);
//        imageView=findViewById(R.id.image);

        relativeLayout=findViewById(R.id.relate);
        myview=findViewById(R.id.myview);

        myview.setOnTouchListener(new View.OnTouchListener() {
            @Override
            public boolean onTouch(View v, MotionEvent event) {
                toch_time=System.currentTimeMillis();
                if (relativeLayout.getVisibility()!=View.INVISIBLE){
                    relativeLayout.setVisibility(View.VISIBLE);
                }
                return false;
            }
        });


        //request permissions
        if(allPermissionsGranted()){
            startCamera(); //start camera if permission has been granted by user
        } else{
            ActivityCompat.requestPermissions(this, REQUIRED_PERMISSIONS, REQUEST_CODE_PERMISSIONS);
        }

        mediaPlayer= MediaPlayer.create(getApplicationContext(),R.raw.m1);
//        imageView=findViewById(R.id.image);
//        Bitmap bmp2= BitmapFactory.decodeResource(getResources(),R.mipmap.a1);
//        imageView.setImageBitmap(bmp2);
//
//        SSd.Obj[] outcome=sSdnet.Detect(bmp2,false);
//
//        mysurface.outcom=outcome;
//        mysurface.bitmapWidth=bmp2.getWidth();
//        mysurface.bitmapHeight=bmp2.getHeight();

//        show(outcome,bmp2);

    }


    public void show(SSd.Obj[] outcome,Bitmap bmp){
        if (outcome==null){
            return;
        }

        Paint paint=new Paint();

        Bitmap rgba = bmp.copy(Bitmap.Config.ARGB_8888, true);
        Canvas canvas=new Canvas(rgba);

        paint.setStyle(Paint.Style.STROKE);
        paint.setStrokeWidth(8);


        for (int i=0;i<outcome.length;i++){
            Log.i("aa",outcome[i].x+" "+outcome[i].y+" ");
            Log.i("aa",bmp.getWidth()+" "+bmp.getHeight()+" ");
            Log.i("aa",screenWidth+" "+screenHeight+" ");

            canvas.drawRect(outcome[i].x, outcome[i].y, outcome[i].x + outcome[i].w, outcome[i].y + outcome[i].h, paint);

        }

//        imageView.setImageBitmap(rgba);

    }
    /**
     * A native method that is implemented by the 'native-lib' native library,
     * which is packaged with this application.
     */
    private ProcessCameraProvider cameraProvider;
    private void startCamera() {

        final ListenableFuture<ProcessCameraProvider> cameraProviderFuture = ProcessCameraProvider.getInstance(this);

        cameraProviderFuture.addListener(new Runnable() {
            @Override
            public void run() {
                try {
                    cameraProvider = cameraProviderFuture.get();
                    bindPreview(cameraProvider);


                } catch (ExecutionException | InterruptedException e) {
                    // No errors need to be handled for this Future.
                    // This should never be reached.
                }
            }
        }, ContextCompat.getMainExecutor(this));
    }

    long t1=0;
    long t2=0;
    long t3=0;
    private int camea_id=1;
    private Bitmap bmp;
    private CameraControl cameraControl;
    private boolean su;
    private boolean heng;

    private Configuration mConfiguration;

    float now_h=0.f;
    float now_x=0.f;

    boolean music=false;
    boolean isPlaying=false;
    private MediaPlayer mediaPlayer;
    void bindPreview(@NonNull ProcessCameraProvider cameraProvider) {

        Preview preview = new Preview.Builder()
                .build();

        @SuppressLint("WrongConstant") CameraSelector cameraSelector = new CameraSelector.Builder()
                .requireLensFacing(camea_id)
                .build();

        ImageAnalysis imageAnalysis = new ImageAnalysis.Builder()
                .build();
        //
        //imageAnalysis.setAnalyzer(cameraExecutor, new MyAnalyzer());
        imageAnalysis.setAnalyzer(executor, new ImageAnalysis.Analyzer() {
            @Override
            public void analyze(@NonNull ImageProxy image) {

                runOnUiThread(() ->{
                    //get the current screen orientation
                    int ori = mConfiguration.orientation;
                    if (ori == mConfiguration.ORIENTATION_LANDSCAPE) {
                        //landscape
                        heng=true;
                        su=false;
                    } else if (ori == mConfiguration.ORIENTATION_PORTRAIT) {
                        //portrait
                        su=true;
                        heng=false;
                    }

                    t1=t2;
                    t2= System.currentTimeMillis();
                    long fps=1000/(t2-t1);
                    textView.setText("FPS= "+fps);


                    if (!start){
                        image.close();
                        return;
                    }

                    //convert the YUV image data to a bitmap
                    ImageProxy.PlaneProxy[] planes = image.getPlanes();

                    //CameraX delivers the frame as YUV planes
                    ByteBuffer yBuffer = planes[0].getBuffer();
                    ByteBuffer uBuffer = planes[1].getBuffer();
                    ByteBuffer vBuffer = planes[2].getBuffer();

                    int ySize = yBuffer.remaining();
                    int uSize = uBuffer.remaining();
                    int vSize = vBuffer.remaining();

                    byte[] nv21 = new byte[ySize + uSize + vSize];

                    yBuffer.get(nv21, 0, ySize);
                    vBuffer.get(nv21, ySize, vSize);
                    uBuffer.get(nv21, ySize + vSize, uSize);
                    //wrap the NV21 data in a YuvImage
                    YuvImage yuvImage = new YuvImage(nv21, ImageFormat.NV21, image.getWidth(), image.getHeight(), null);
                    //output stream
                    ByteArrayOutputStream out = new ByteArrayOutputStream();
                    //compress to JPEG into out
                    yuvImage.compressToJpeg(new Rect(0, 0, yuvImage.getWidth(), yuvImage.getHeight()), 50, out);
                    //to a byte array
                    byte[] imageBytes = out.toByteArray();
                    //decode into a bitmap
                    Bitmap bmp = BitmapFactory.decodeByteArray(imageBytes, 0, imageBytes.length);

                    //rotate the bitmap to match the display orientation
                    Bitmap rotateBitmap=null;
                    if (camea_id==1 && su){
                        rotateBitmap = rotateBitmap(bmp, 90);
                    }else if(camea_id==0 && su){
                        rotateBitmap = rotateBitmap(bmp, 270);
                    }else if(camea_id==1 && heng){
                        rotateBitmap=bmp;
                    }else {
                        rotateBitmap=rotateBitmap(bmp, 0);
                    }


                    Bitmap bmp2=rotateBitmap.copy(Bitmap.Config.ARGB_8888, true);

                    SSd.Obj[] outcome=sSdnet.Detect(bmp2,us_gpu,conf_thred);


                    //Log.i("aa","bitmap_w="+bmp2.getWidth()+"  bitmap_h="+bmp2.getHeight());
//                    imageView.setImageBitmap(bmp2);

                    float aa=0;
                    float bb=0;

                    SSd.Obj[] pick_outcom=new SSd.Obj[1];
                    float maxconf=0;
                    int index=0;
                    if(outcome.length>0){

                        for (int i=0;i<outcome.length;i++){
                            if (i==0){
                                maxconf=outcome[i].prob;
                                index=i;
                            }else {
                                if(outcome[i].prob>maxconf){
                                    maxconf=outcome[i].prob;
                                    index=i;
                                }
                            }
                        }

                        pick_outcom[0]=outcome[index];

                    }



                    if (outcome.length>0){
                        //guard against dividing by an uninitialized baseline
                        if (now_h==0){
                            aa=0;
                        }else {
                            aa=Math.abs(pick_outcom[0].h*bmp2.getHeight()/now_h-1.f);
                        }
                        if (now_x==0){
                            bb=0;
                        }else {
                            bb=Math.abs(pick_outcom[0].x*bmp2.getHeight()/now_x-1.f);
                        }
                        music= aa > music_thred || bb > music_thred;


                        if (music && !mediaPlayer.isPlaying()){

                            mediaPlayer.start();

                        }
                        Log.i("aa","isplaying="+isPlaying);


//                        Log.i("aa"," aa="+aa+"   bb="+bb);

                        if (t2-t3>3000){
                            t3=t2;
                            now_h=pick_outcom[0].h*bmp2.getHeight();
                            now_x=pick_outcom[0].x*bmp2.getHeight();
                        }

                    }

                    new Thread(new Runnable() { // anonymous Runnable
                        @Override
                        public void run() {
                            myview.draws(outcome,bmp2.getWidth(),bmp2.getHeight(),heng,mediaPlayer.isPlaying());
                        }
                    }).start();


                    //release the frame
                    image.close();

                });

            }
        });

        ImageCapture.Builder builder = new ImageCapture.Builder();

        //Vendor-Extensions (The CameraX extensions dependency in build.gradle)
        HdrImageCaptureExtender hdrImageCaptureExtender = HdrImageCaptureExtender.create(builder);

        // Query if extension is available (optional).
        if (hdrImageCaptureExtender.isExtensionAvailable(cameraSelector)) {
            // Enable the extension if available.
            hdrImageCaptureExtender.enableExtension(cameraSelector);
        }

        final ImageCapture imageCapture = builder
                .setTargetRotation(this.getWindowManager().getDefaultDisplay().getRotation())
                .build();

        preview.setSurfaceProvider(mPreviewView.createSurfaceProvider());

        try {
            cameraProvider.unbindAll();
            Camera camera = cameraProvider.bindToLifecycle((LifecycleOwner)this, cameraSelector, preview, imageAnalysis, imageCapture);
            cameraControl=camera.getCameraControl();
            mPreviewView.cameraControl=cameraControl;
//            cameraControl.setLinearZoom(1f);


        } catch (Exception e) {
            e.printStackTrace();
        }


    }
    private Bitmap rotateBitmap(Bitmap origin, float alpha) {
        if (origin == null) {
            return null;
        }
        int width = origin.getWidth();
        int height = origin.getHeight();
        Matrix matrix = new Matrix();
        matrix.setRotate(alpha);
        if (camea_id==0){
            matrix.postScale(-1,1);
        }
        // rotate in place
        Bitmap newBM = Bitmap.createBitmap(origin, 0, 0, width, height, matrix, false);
        if (newBM.equals(origin)) {
            return newBM;
        }
        origin.recycle();
        return newBM;
    }

    private final String[] REQUIRED_PERMISSIONS = new String[]{"android.permission.CAMERA", "android.permission.WRITE_EXTERNAL_STORAGE"};
    //check whether all permissions are granted
    private boolean allPermissionsGranted(){
        for(String permission : REQUIRED_PERMISSIONS){
            if(ContextCompat.checkSelfPermission(this, permission) != PackageManager.PERMISSION_GRANTED){
                return false;
            }
        }
        return true;
    }

}

GitHub code download

The code given in this article covers only the key parts. To understand the full process, I suggest downloading the complete code from GitHub. A follow would also be appreciated, thanks.
GitHub download link: https://github.com/qwhh11/dyu2
