PaddleOCR Usage Notes
On Linux
Installing and using the PaddleOCR develop branch (second attempt, successful)
On-device deployment: converting models with opt (third attempt, successful)
cd /home/lexiaoyuan/projects/PaddleOCR_dev/PaddleOCR
cp -R ./inference /home/lexiaoyuan/Paddle-Lite/
cd /home/lexiaoyuan/Paddle-Lite/build.opt/lite/api
# Convert the detection model
./opt --model_file=/home/lexiaoyuan/Paddle-Lite/inference/det_mv3_east/model --param_file=/home/lexiaoyuan/Paddle-Lite/inference/det_mv3_east/params --optimize_out_type=naive_buffer --optimize_out=./det_mv3_east_opt --valid_targets=arm
# Convert the recognition models
./opt --model_file=/home/lexiaoyuan/Paddle-Lite/inference/rec_en_number_lite_crown/model --param_file=/home/lexiaoyuan/Paddle-Lite/inference/rec_en_number_lite_crown/params --optimize_out_type=naive_buffer --optimize_out=./rec_en_number_lite_crown_opt --valid_targets=arm
./opt --model_file=/home/lexiaoyuan/Paddle-Lite/inference/rec_aug_en_number_lite_crown/model --param_file=/home/lexiaoyuan/Paddle-Lite/inference/rec_aug_en_number_lite_crown/params --optimize_out_type=naive_buffer --optimize_out=./rec_aug_en_number_lite_crown_opt --valid_targets=arm
# Convert the direction classifier model
./opt --model_file=/home/lexiaoyuan/Paddle-Lite/inference/cls_crown/model --param_file=/home/lexiaoyuan/Paddle-Lite/inference/cls_crown/params --optimize_out_type=naive_buffer --optimize_out=./cls_crown_opt --valid_targets=arm
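The four conversions above differ only in the model directory name, so the opt invocation can be generated by a small helper. This is a sketch, not part of the original workflow: `opt_cmd` and `MODEL_ROOT` are hypothetical names, and `MODEL_ROOT` assumes the directory the inference models were copied to in the cp step above.

```shell
# Sketch: build the opt command line for one inference model directory.
# MODEL_ROOT is an assumption matching the cp destination used earlier.
MODEL_ROOT=/home/lexiaoyuan/Paddle-Lite/inference

opt_cmd() {
  # $1 = model directory name, e.g. det_mv3_east
  printf './opt --model_file=%s/%s/model --param_file=%s/%s/params --optimize_out_type=naive_buffer --optimize_out=./%s_opt --valid_targets=arm\n' \
    "$MODEL_ROOT" "$1" "$MODEL_ROOT" "$1" "$1"
}

# Print the command for every model (pipe into sh to actually execute):
for m in det_mv3_east rec_en_number_lite_crown rec_aug_en_number_lite_crown cls_crown; do
  opt_cmd "$m"
done
```

Keeping the conversion parametric like this also makes it easy to add or rename a model without copy-pasting the long flag list.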
# Push the debug folder to the phone
adb push debug /data/local/tmp/
adb shell
cd /data/local/tmp/debug
export LD_LIBRARY_PATH=${PWD}:$LD_LIBRARY_PATH
chmod 777 *
# Usage: ./ocr_db_crnn <detection model> <direction classifier model> <recognition model> <test image path> <dictionary file path>
./ocr_db_crnn det_mv3_east_opt.nb rec_en_number_lite_crown_opt.nb ./20160517_101050_A8Y0930108BZ_Z.jpg en_dict.txt
./ocr_db_crnn det_mv3_east_opt.nb ch_ppocr_mobile_v1.1_cls_quant_opt.nb rec_en_number_lite_crown_opt.nb ./20160517_101050_A8Y0930108BZ_Z.jpg en_dict.txt
./ocr_db_crnn ch_ppocr_mobile_v1.1_det_prune_opt.nb ch_ppocr_mobile_v1.1_rec_quant_opt.nb ch_ppocr_mobile_v1.1_cls_quant_opt.nb ./1.jpg ppocr_keys_v1.txt
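To avoid retyping the on-device steps for every run, the cd/export/run sequence can be captured in a script that is itself pushed to the phone. This is a sketch: `run_ocr.sh` is a hypothetical name, and the model and file names are simply the ones from the V1.1 command above.

```shell
# Sketch: write the on-device commands to a script, then push and run it.
# run_ocr.sh is a hypothetical name; arguments match the V1.1 run above.
cat > run_ocr.sh <<'EOF'
#!/system/bin/sh
cd /data/local/tmp/debug
export LD_LIBRARY_PATH=${PWD}:$LD_LIBRARY_PATH
./ocr_db_crnn ch_ppocr_mobile_v1.1_det_prune_opt.nb ch_ppocr_mobile_v1.1_rec_quant_opt.nb ch_ppocr_mobile_v1.1_cls_quant_opt.nb ./1.jpg ppocr_keys_v1.txt
EOF
# Push and execute (requires a connected device):
# adb push run_ocr.sh /data/local/tmp/debug/
# adb shell sh /data/local/tmp/debug/run_ocr.sh
```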
# [Recommended] Download the PaddleOCR V1.1 Chinese/English inference models; V1.1 is more accurate than 1.0 and the models are smaller
wget https://paddleocr.bj.bcebos.com/20-09-22/mobile-slim/det/ch_ppocr_mobile_v1.1_det_prune_infer.tar && tar xf ch_ppocr_mobile_v1.1_det_prune_infer.tar
wget https://paddleocr.bj.bcebos.com/20-09-22/mobile-slim/rec/ch_ppocr_mobile_v1.1_rec_quant_infer.tar && tar xf ch_ppocr_mobile_v1.1_rec_quant_infer.tar
wget https://paddleocr.bj.bcebos.com/20-09-22/mobile-slim/cls/ch_ppocr_mobile_v1.1_cls_quant_infer.tar && tar xf ch_ppocr_mobile_v1.1_cls_quant_infer.tar
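The three downloads share one base URL and differ only in the subdirectory (det/rec/cls) and model name, so they can be fetched in a loop. A sketch, with `url_for` as a hypothetical helper; the `DOWNLOAD=1` guard is an assumption added so the loop does not touch the network unless asked.

```shell
# Sketch: the V1.1 slim models share one base URL; build each tar URL
# from the subdirectory and model name listed above.
BASE=https://paddleocr.bj.bcebos.com/20-09-22/mobile-slim

url_for() { printf '%s/%s.tar\n' "$BASE" "$1"; }

# Set DOWNLOAD=1 to actually fetch and unpack:
if [ "${DOWNLOAD:-0}" = 1 ]; then
  for t in det/ch_ppocr_mobile_v1.1_det_prune_infer \
           rec/ch_ppocr_mobile_v1.1_rec_quant_infer \
           cls/ch_ppocr_mobile_v1.1_cls_quant_infer; do
    wget "$(url_for "$t")" && tar xf "$(basename "$t").tar"
  done
fi
```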
# Convert the V1.1 detection model
./opt --model_file=./ch_ppocr_mobile_v1.1_det_prune_infer/model --param_file=./ch_ppocr_mobile_v1.1_det_prune_infer/params --optimize_out=./ch_ppocr_mobile_v1.1_det_prune_opt --valid_targets=arm
# Convert the V1.1 recognition model
./opt --model_file=./ch_ppocr_mobile_v1.1_rec_quant_infer/model --param_file=./ch_ppocr_mobile_v1.1_rec_quant_infer/params --optimize_out=./ch_ppocr_mobile_v1.1_rec_quant_opt --valid_targets=arm
# Convert the V1.1 classifier model
./opt --model_file=./ch_ppocr_mobile_v1.1_cls_quant_infer/model --param_file=./ch_ppocr_mobile_v1.1_cls_quant_infer/params --optimize_out=./ch_ppocr_mobile_v1.1_cls_quant_opt --valid_targets=arm
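Before pushing anything to the device, it is worth confirming that every expected .nb file was actually produced, since opt can fail silently on a bad path. A sketch; `check_converted` is a hypothetical helper, and the file names are the optimize_out names from the three commands above.

```shell
# Sketch: report any expected .nb output that opt failed to produce.
check_converted() {
  missing=0
  for f in "$@"; do
    [ -e "$f" ] || { echo "not converted: $f" >&2; missing=1; }
  done
  return "$missing"
}

# Usage after the three opt runs above:
# check_converted ch_ppocr_mobile_v1.1_det_prune_opt.nb \
#                 ch_ppocr_mobile_v1.1_rec_quant_opt.nb \
#                 ch_ppocr_mobile_v1.1_cls_quant_opt.nb && echo "ready to push"
```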
The model conversion finally succeeded; the resulting .nb models are now ready to use on mobile devices.
Writing this up took real effort; if you found it helpful, please follow and give it a like. ❤ Thank you! ❤