For a school project I wanted to make the client a bit fancier, so I needed QR codes and started digging in. For the image-decoding part I originally used the qrcode project from sourceforge.jp (http://sourceforge.jp/projects/qrcode/), which I had seen in a book. Once the demo was done the results were terrible: decoding was very slow, it was extremely picky about the captured image, and it could not even read a sample that was upside down. Later I stumbled on Google's ZXing, which works very well. I wrote this program with my half-baked understanding of Android and the results are decent, so I cleaned it up to share. Feedback is welcome.
Cleaning it up was painful...
The demo consists of two Activities. mainActivity has a single button; tapping it calls startActivityForResult to launch a second Activity, which does the scanning, passes the result back to mainActivity, and the result is shown on mainActivity's button.
Project source: http://download.csdn.net/detail/sparkstrike/7707563 (the full source is also listed below, so only click if you have download points to spare).
So the interesting part is the second Activity, QERdecodeActivity in the project. I wrote it aiming for minimal code and a reasonable UI. All decoding runs on the main thread; watching the Log during debugging, each decode takes about 0.2 seconds. I originally wanted a tween animation to spin the ring in the middle, but with the main thread busy decoding it stuttered badly, so instead the View is simply redrawn once before each decode, giving roughly 5 frames per second, which is just enough to make the ring spin.
The overall flow:
QERdecodeActivity holds a SurfaceView whose SurfaceHolder is handed to the Camera object to display the camera preview. On top of the SurfaceView sits a custom View that draws the viewfinder frame, the circle in the middle, and the shading around them. QERdecodeActivity also implements the PreviewCallback interface, through which the camera hands out a copy of each preview frame (a continuous stream of images); QR decoding works on the data delivered by that callback.
The key points:
1. Camera configuration
// Configure the camera
Camera.Parameters params = camera.getParameters();
params.setPreviewSize(scrH, scrW); // preview size (width/height swapped because the display is rotated 90°)
// params.setPreviewFpsRange(2, 2); // min/max preview frames per second
this.camera.setParameters(params);
this.camera.setDisplayOrientation(90); // rotate the camera preview
The camera is configured through a Camera.Parameters object. When calling setPreviewSize(int, int) you must use a size the camera actually supports, otherwise setParameters throws an error. You can first call camera.getParameters().getSupportedPreviewSizes(), which returns a List&lt;Size&gt; of every preview size the camera supports. I simply set it to the screen resolution; on the few phones I tried, none complained.
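Rather than assuming the screen resolution is supported, a safer pattern is to pick the supported size closest to it. A minimal sketch, under my own naming (the helper class and the {width, height} int[]-pair representation are assumptions; in the Activity you would feed it the dimensions of each Camera.Size from getSupportedPreviewSizes()):

```java
// Sketch: pick the supported preview size closest to the requested one.
// Each entry of `supported` is a {width, height} pair.
class PreviewSizePicker {
    static int[] closest(int[][] supported, int wantW, int wantH) {
        int[] best = supported[0];
        long bestDiff = Long.MAX_VALUE;
        for (int[] s : supported) {
            // city-block distance between the candidate and the requested size
            long diff = Math.abs((long) s[0] - wantW) + Math.abs((long) s[1] - wantH);
            if (diff < bestDiff) {
                bestDiff = diff;
                best = s;
            }
        }
        return best;
    }
}
```

The chosen pair can then be passed to setPreviewSize instead of the raw screen resolution.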
2. Set a callback on the surfaceView's SurfaceHolder
// Set the surfaceView's callback
this.surfaceView.getHolder().addCallback(new SurfaceHolder.Callback() {
    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        // release the camera (deferred to onDestroy(), see below)
    }
    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        // attach the camera preview to this surface
        try {
            camera.setPreviewDisplay(holder);
        } catch (IOException e) {
        }
        // start the preview
        camera.startPreview();
        camera.setOneShotPreviewCallback(QERdecodeActivity.this);
        camera.autoFocus(null);
    }
    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
    }
});
The SurfaceHolder callback is how the object feeding the SurfaceView (here, the camera) learns when the surface is created, when its size changes, and when it is destroyed, so it can react accordingly.
Here surfaceCreated(SurfaceHolder holder) starts the camera preview; surfaceChanged(SurfaceHolder holder, int format, int width, int height) is left empty since the program is simple; surfaceDestroyed(SurfaceHolder holder) is where the camera should be released, but I deferred that to the Activity's onDestroy().
3. Drawing the scan frame
/**
* A View whose central square is fully transparent (the scan window)
* @author Administrator
* _________________
* | |
* |h |
* |_w___cw_____w__|
* | | | |
* |cw | |cw |
* |___ |______|___|
* | |
* |h |
* |_______________|
*/
class DrawCapture extends View {
    private float cw;
    private float h;
    private float w;
    private float rectWidth = 0;
    private float startAngle = 0;
    private RectF rectF = null;
    private float arcWidth = 0;
    public DrawCapture(Context context, float per) {
        super(context);
        // parameters of the square
        this.cw = (int) scrW * per;
        this.h = (int) (scrH - cw) / 2;
        this.w = (int) (scrW - cw) / 2;
        this.rectWidth = 0.0083f * scrW;
        // parameters of the circle
        /**
         * _________________
         * |       |       |
         * |       |       |
         * |       |y      |
         * |      r/2      |
         * |--x- circle -x-|
         * |       |       |
         * |       |y      |
         * |_______|_______|
         *
         */
        int r = (int) (scrW * per * 0.9f);
        int x = (scrW - r) / 2;
        int y = (scrH - r) / 2;
        this.arcWidth = scrW * 0.025f;
        this.rectF = new RectF(x, y, x + r, y + r);
    }
    @Override
    protected void onDraw(Canvas canvas) {
        Paint paint = new Paint();
        // shading: the translucent area around the square
        paint.setARGB(100, 99, 99, 99);
        paint.setStyle(Style.FILL);
        canvas.drawRect(0, 0, scrW, h, paint);
        canvas.drawRect(0, h, w, h + cw, paint);
        canvas.drawRect(w + cw, h, scrW, h + cw, paint);
        canvas.drawRect(0, h + cw, scrW, scrH, paint);
        // frame: the square in the middle
        paint.setARGB(200, 99, 99, 99);
        paint.setStrokeWidth(this.rectWidth);
        paint.setStyle(Paint.Style.STROKE);
        canvas.drawRect(w, h, cw + w, h + cw, paint);
        // the circle in the middle
        // anti-aliasing
        paint.setAntiAlias(true);
        paint.setStrokeWidth(this.arcWidth);
        paint.setStyle(Paint.Style.STROKE);
        // the less transparent quarters
        paint.setARGB(150, 117, 179, 21);
        canvas.drawArc(rectF, this.startAngle + 0, 90, false, paint);
        paint.setARGB(150, 189, 153, 75);
        canvas.drawArc(rectF, this.startAngle + 180, 90, false, paint);
        // the more transparent quarters
        paint.setARGB(150, 136, 129, 121);
        canvas.drawArc(rectF, this.startAngle + 90, 90, false, paint);
        paint.setARGB(150, 6, 78, 128);
        canvas.drawArc(rectF, this.startAngle + 270, 90, false, paint);
        this.startAngle = this.startAngle + 1;
        if (this.startAngle > 360) {
            this.startAngle = 0;
        }
    }
}
The scan frame takes a single parameter, float per: the fraction of the screen the frame occupies. With per = 0.6, for example, the frame covers 60% of the screen.
The view consists of three parts: the square in the middle, the translucent shading around it, and the circle in the center. The circle is made of four quarter-arcs in different colors; their starting angle is the variable startAngle, which increases by 1 on every redraw, so the circle spins.
All of the frame's dimensions are derived from the screen size, obtained with:
Display mDisplay = getWindowManager().getDefaultDisplay();
this.scrW = mDisplay.getWidth(); // width
this.scrH = mDisplay.getHeight(); // height
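Plugging these screen dimensions into DrawCapture's layout formulas is plain arithmetic; a standalone sketch of the same math (the class and method names are mine, not from the project):

```java
// Sketch of DrawCapture's geometry: for a screen of scrW x scrH and a
// fraction `per`, compute {cw, w, h} — the edge length of the centered
// square and its left/top margins, mirroring the constructor's formulas.
class FrameGeometry {
    static int[] compute(int scrW, int scrH, float per) {
        int cw = (int) (scrW * per); // edge of the centered square
        int w = (scrW - cw) / 2;     // margin left and right of the square
        int h = (scrH - cw) / 2;     // margin above and below the square
        return new int[]{cw, w, h};
    }
}
```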
The Activity then adds this View:
// create the capture frame
this.captureView = new DrawCapture(this, this.capturePer);
this.addContentView(this.captureView, new LayoutParams(LayoutParams.WRAP_CONTENT, LayoutParams.WRAP_CONTENT));
4. Autofocus and grabbing preview frames
public void onPreviewFrame(final byte[] data, final Camera camera) {
    String str = null;
    try {
        this.autoFouce++;
        if (this.autoFouce == 6) {
            camera.autoFocus(null);
            this.autoFouce = 0;
        }
        this.captureView.invalidate();
        Size size = camera.getParameters().getPreviewSize();
        YuvImage image = new YuvImage(data, ImageFormat.NV21, size.width,
                size.height, null);
        if (image != null) {
            ByteArrayOutputStream stream = new ByteArrayOutputStream();
            image.compressToJpeg(new Rect(0, 0, size.width, size.height),
                    80, stream);
            Bitmap bmp = BitmapFactory.decodeByteArray(
                    stream.toByteArray(), 0, stream.size());
            stream.close();
            // crop the image
            bmp = cutBitmap(bmp, capturePer);
            str = decodeImage(bmp);
        }
    } catch (Exception ex) {
    }
    if (str != null) {
        // Toast.makeText(this, str, Toast.LENGTH_SHORT).show();
        Vibrate(300);
        this.returnStr(str);
    } else {
        camera.setOneShotPreviewCallback(QERdecodeActivity.this);
    }
}
Both autofocus and frame capture happen in the PreviewCallback's onPreviewFrame(byte[] data, Camera camera). camera.setOneShotPreviewCallback(QERdecodeActivity.this) delivers only a single preview frame, but since onPreviewFrame ends by calling setOneShotPreviewCallback again whenever decoding fails, preview frames keep flowing to the object implementing PreviewCallback.
For every 6 frames delivered, i.e. every 6 calls to onPreviewFrame, the camera refocuses once. Since decoding also runs inside onPreviewFrame, 6 frames take a bit over a second, so the camera focuses roughly once per second:
this.autoFouce++;
if (this.autoFouce == 6) {
    camera.autoFocus(null);
    this.autoFouce = 0;
}
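The counter above can be pulled out into a tiny helper to make the "focus every sixth frame" rule explicit (FocusThrottle is a hypothetical name, not part of the project):

```java
// Sketch: returns true on every 6th call, mirroring the autoFouce counter
// that triggers camera.autoFocus(null) once per six preview frames.
class FocusThrottle {
    private int count = 0;
    boolean shouldFocus() {
        count++;
        if (count == 6) {
            count = 0;
            return true;
        }
        return false;
    }
}
```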
The data delivered to onPreviewFrame is in an unusual format (NV21 YUV) and has to be converted to a Bitmap:
Size size = camera.getParameters().getPreviewSize();
YuvImage image = new YuvImage(data, ImageFormat.NV21, size.width,
size.height, null);
if (image != null) {
ByteArrayOutputStream stream = new ByteArrayOutputStream();
image.compressToJpeg(new Rect(0, 0, size.width, size.height),
80, stream);
Bitmap bmp = BitmapFactory.decodeByteArray(
stream.toByteArray(), 0, stream.size());
stream.close();
5. Cropping the image to the scan frame
/**
 * Crop the image: extract the centered square from the bitmap
 * @param bmp
 * @param per
 * @return
 */
private Bitmap cutBitmap(Bitmap bmp, float per) {
    int bmpW = bmp.getWidth();
    int bmpH = bmp.getHeight();
    int w = (int) ((bmpW < bmpH ? bmpW : bmpH) * per);
    int x = (bmpW - w) / 2;
    int y = (bmpH - w) / 2;
    // Toast.makeText(this, "bmpW:"+bmpW+" bmpH:"+bmpH+" w:"+w+" x:"+x+" y:"+y, Toast.LENGTH_LONG).show();
    return Bitmap.createBitmap(bmp, x, y, w, w);
}
Because the preview size was set to the screen resolution, the converted Bitmap has the screen's dimensions, and we only want the part inside the scan frame. The frame was defined earlier by a float fraction of the screen, so the crop function takes the same float fraction and extracts the matching centered square from the Bitmap; the two values must agree.
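The crop rectangle itself is just arithmetic on the bitmap size; isolated as a sketch (the wrapper class name is mine), it returns the {x, y, w} that cutBitmap passes to Bitmap.createBitmap(bmp, x, y, w, w):

```java
// Sketch of cutBitmap's math: the centered square whose edge is `per`
// of the shorter bitmap side.
class CropMath {
    static int[] cropRect(int bmpW, int bmpH, float per) {
        int w = (int) (Math.min(bmpW, bmpH) * per); // square edge
        int x = (bmpW - w) / 2;                     // left offset
        int y = (bmpH - w) / 2;                     // top offset
        return new int[]{x, y, w};
    }
}
```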
6. The key step: decoding
Decoding uses ZXing: a Bitmap is first converted to a com.google.zxing.BinaryBitmap, which is then decoded with Result result = reader.decode(binaryBitmap), where reader is a com.google.zxing.qrcode.QRCodeReader object; finally String str = result.getText() yields the decoded string.
/**
 * Decode the bitmap
 * @param bmp
 * @return
 */
private String decodeImage(Bitmap bmp) {
    String str = null;
    RGBLuminanceSource source = new RGBLuminanceSource(bmp);
    BinaryBitmap binBmp = new BinaryBitmap(new HybridBinarizer(source));
    try {
        Result result = this.reader.decode(binBmp);
        str = result.getText();
    } catch (Exception e) {
    }
    return str;
}
Converting the Bitmap to a com.google.zxing.BinaryBitmap was the fiddly part; I copied RGBLuminanceSource straight from the web. It extends com.google.zxing.LuminanceSource and, judging by the code, converts the Bitmap to a greyscale array and exposes row and matrix accessors:
RGBLuminanceSource source = new RGBLuminanceSource(bmp);
BinaryBitmap binBmp = new BinaryBitmap(new HybridBinarizer(source));
Source of the RGBLuminanceSource class:
class RGBLuminanceSource extends LuminanceSource {
private final byte[] luminances;
public RGBLuminanceSource(Bitmap bitmap) {
super(bitmap.getWidth(), bitmap.getHeight());
int width = bitmap.getWidth();
int height = bitmap.getHeight();
int[] pixels = new int[width * height];
bitmap.getPixels(pixels, 0, width, 0, 0, width, height);
// In order to measure pure decoding speed, we convert the entire
// image to a greyscale array
// up front, which is the same as the Y channel of the
// YUVLuminanceSource in the real app.
luminances = new byte[width * height];
for (int y = 0; y < height; y++) {
int offset = y * width;
for (int x = 0; x < width; x++) {
int pixel = pixels[offset + x];
int r = (pixel >> 16) & 0xff;
int g = (pixel >> 8) & 0xff;
int b = pixel & 0xff;
if (r == g && g == b) {
// Image is already greyscale, so pick any channel.
luminances[offset + x] = (byte) r;
} else {
// Calculate luminance cheaply, favoring green.
luminances[offset + x] = (byte) ((r + g + g + b) >> 2);
}
}
}
}
@Override
public byte[] getRow(int y, byte[] row) {
if (y < 0 || y >= getHeight()) {
throw new IllegalArgumentException(
"Requested row is outside the image: " + y);
}
int width = getWidth();
if (row == null || row.length < width) {
row = new byte[width];
}
System.arraycopy(luminances, y * width, row, 0, width);
return row;
}
// Since this class does not support cropping, the underlying byte array
// already contains
// exactly what the caller is asking for, so give it to them without a
// copy.
@Override
public byte[] getMatrix() {
return luminances;
}
}
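The per-pixel conversion in that class boils down to one formula, which can be checked in isolation (the wrapper class is mine; the arithmetic is exactly the constructor's):

```java
// Sketch of RGBLuminanceSource's per-pixel greyscale conversion:
// grey pixels keep their channel value; others get the cheap weighted
// average (r + 2g + b) / 4, favoring green.
class Luma {
    static int of(int pixel) {
        int r = (pixel >> 16) & 0xff;
        int g = (pixel >> 8) & 0xff;
        int b = pixel & 0xff;
        if (r == g && g == b) {
            return r; // already grey: any channel works
        }
        return (r + g + g + b) >> 2; // fast luminance, green weighted double
    }
}
```

Doubling green approximates the eye's higher sensitivity to it while keeping the division a cheap bit shift.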
7. Finally, don't forget the permissions
<!-- vibration permission -->
<uses-permission android:name="android.permission.VIBRATE" />
<!-- camera permission -->
<uses-permission android:name="android.permission.CAMERA" />
<uses-feature android:name="android.hardware.camera" />
<uses-feature android:name="android.hardware.camera.autofocus" />
Here is the full source of the Activity:
package com.demo;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import com.google.zxing.BinaryBitmap;
import com.google.zxing.LuminanceSource;
import com.google.zxing.Result;
import com.google.zxing.common.HybridBinarizer;
import com.google.zxing.qrcode.QRCodeReader;
import android.app.Activity;
import android.app.Service;
import android.content.Context;
import android.content.Intent;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.Canvas;
import android.graphics.ImageFormat;
import android.graphics.Paint;
import android.graphics.Paint.Style;
import android.graphics.Rect;
import android.graphics.RectF;
import android.graphics.YuvImage;
import android.hardware.Camera;
import android.hardware.Camera.PreviewCallback;
import android.hardware.Camera.Size;
import android.os.Bundle;
import android.os.Vibrator;
import android.util.Log;
import android.view.Display;
import android.view.MotionEvent;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.view.ViewGroup.LayoutParams;
import android.view.Window;
import android.view.WindowManager;
import android.widget.Toast;
public class QERdecodeActivity extends Activity implements PreviewCallback{
private SurfaceView surfaceView = null;
private Camera camera = null;
private DrawCapture captureView = null;
private int scrW = 0;
private int scrH = 0;
private final float capturePer = 0.6f;
private QRCodeReader reader = null;
private int autoFouce = 0;
private String str = null;
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    // full screen
    getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN, WindowManager.LayoutParams.FLAG_FULLSCREEN);
    // no title bar
    requestWindowFeature(Window.FEATURE_NO_TITLE);
    reader = new QRCodeReader();
    // screen resolution
    Display mDisplay = getWindowManager().getDefaultDisplay();
    this.scrW = mDisplay.getWidth(); // width
    this.scrH = mDisplay.getHeight(); // height
    this.setContentView(R.layout.activity_qer_decode);
    // create the capture frame
    this.captureView = new DrawCapture(this, this.capturePer);
    this.addContentView(this.captureView, new LayoutParams(LayoutParams.WRAP_CONTENT, LayoutParams.WRAP_CONTENT));
    // initialize the surfaceView
    this.surfaceView = (SurfaceView) this.findViewById(R.id.activity_qer_decode_surfaceview);
    // the surface does not maintain its own buffers
    this.surfaceView.getHolder().setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
    // open the camera
    this.camera = Camera.open();
    // configure the camera
    Camera.Parameters params = camera.getParameters();
    params.setPreviewSize(scrH, scrW); // preview size (width/height swapped: display rotated 90°)
    // params.setPreviewFpsRange(2, 2); // min/max preview frames per second
    this.camera.setParameters(params);
    this.camera.setDisplayOrientation(90); // rotate the camera preview
    // set the surfaceView's callback
    this.surfaceView.getHolder().addCallback(new SurfaceHolder.Callback() {
        @Override
        public void surfaceDestroyed(SurfaceHolder holder) {
            // release the camera (deferred to onDestroy())
        }
        @Override
        public void surfaceCreated(SurfaceHolder holder) {
            // attach the camera preview to this surface
            try {
                camera.setPreviewDisplay(holder);
            } catch (IOException e) {
            }
            // start the preview
            camera.startPreview();
            camera.setOneShotPreviewCallback(QERdecodeActivity.this);
            camera.autoFocus(null);
        }
        @Override
        public void surfaceChanged(SurfaceHolder holder, int format, int width,
                int height) {
        }
    });
}
/**
 * A View whose central square is fully transparent (the scan window)
 * @author Administrator
 * _________________
 * |               |
 * |h              |
 * |_w___cw_____w__|
 * |    |      |   |
 * |cw  |      |cw |
 * |___ |______|___|
 * |               |
 * |h              |
 * |_______________|
 */
class DrawCapture extends View {
    private float cw;
    private float h;
    private float w;
    private float rectWidth = 0;
    private float startAngle = 0;
    private RectF rectF = null;
    private float arcWidth = 0;
    public DrawCapture(Context context, float per) {
        super(context);
        // parameters of the square
        this.cw = (int) scrW * per;
        this.h = (int) (scrH - cw) / 2;
        this.w = (int) (scrW - cw) / 2;
        this.rectWidth = 0.0083f * scrW;
        // parameters of the circle
        /**
         * _________________
         * |       |       |
         * |       |       |
         * |       |y      |
         * |      r/2      |
         * |--x- circle -x-|
         * |       |       |
         * |       |y      |
         * |_______|_______|
         *
         */
        int r = (int) (scrW * per * 0.9f);
        int x = (scrW - r) / 2;
        int y = (scrH - r) / 2;
        this.arcWidth = scrW * 0.025f;
        this.rectF = new RectF(x, y, x + r, y + r);
    }
    @Override
    protected void onDraw(Canvas canvas) {
        Paint paint = new Paint();
        // shading: the translucent area around the square
        paint.setARGB(100, 99, 99, 99);
        paint.setStyle(Style.FILL);
        canvas.drawRect(0, 0, scrW, h, paint);
        canvas.drawRect(0, h, w, h + cw, paint);
        canvas.drawRect(w + cw, h, scrW, h + cw, paint);
        canvas.drawRect(0, h + cw, scrW, scrH, paint);
        // frame: the square in the middle
        paint.setARGB(200, 99, 99, 99);
        paint.setStrokeWidth(this.rectWidth);
        paint.setStyle(Paint.Style.STROKE);
        canvas.drawRect(w, h, cw + w, h + cw, paint);
        // the circle in the middle
        // anti-aliasing
        paint.setAntiAlias(true);
        paint.setStrokeWidth(this.arcWidth);
        paint.setStyle(Paint.Style.STROKE);
        // the less transparent quarters
        paint.setARGB(150, 117, 179, 21);
        canvas.drawArc(rectF, this.startAngle + 0, 90, false, paint);
        paint.setARGB(150, 189, 153, 75);
        canvas.drawArc(rectF, this.startAngle + 180, 90, false, paint);
        // the more transparent quarters
        paint.setARGB(150, 136, 129, 121);
        canvas.drawArc(rectF, this.startAngle + 90, 90, false, paint);
        paint.setARGB(150, 6, 78, 128);
        canvas.drawArc(rectF, this.startAngle + 270, 90, false, paint);
        this.startAngle = this.startAngle + 1;
        if (this.startAngle > 360) {
            this.startAngle = 0;
        }
    }
}
@Override
public boolean onTouchEvent(MotionEvent event) {
    this.captureView.invalidate();
    return super.onTouchEvent(event);
}
/**
 * Return the string to mainActivity
 * @param str
 */
private void returnStr(String str) {
    Intent intent = new Intent();
    intent.putExtra("back", str); // the return value
    setResult(Activity.RESULT_OK, intent); // the result flag
    finish(); // close this Activity
}
/**
 * Vibrate the phone
 * @param milliseconds
 */
private void Vibrate(long milliseconds) {
    Vibrator vib = (Vibrator) this.getSystemService(Service.VIBRATOR_SERVICE);
    vib.vibrate(milliseconds);
}
@Override
protected void onDestroy() {
    // release the camera
    if (camera != null) {
        camera.stopPreview();
        camera.release();
    }
    this.reader = null;
    super.onDestroy();
}
@Override
public void onPreviewFrame(final byte[] data, final Camera camera) {
    String str = null;
    try {
        this.autoFouce++;
        if (this.autoFouce == 6) {
            camera.autoFocus(null);
            this.autoFouce = 0;
        }
        this.captureView.invalidate();
        Size size = camera.getParameters().getPreviewSize();
        YuvImage image = new YuvImage(data, ImageFormat.NV21, size.width,
                size.height, null);
        if (image != null) {
            ByteArrayOutputStream stream = new ByteArrayOutputStream();
            image.compressToJpeg(new Rect(0, 0, size.width, size.height),
                    80, stream);
            Bitmap bmp = BitmapFactory.decodeByteArray(
                    stream.toByteArray(), 0, stream.size());
            stream.close();
            // crop the image
            bmp = cutBitmap(bmp, capturePer);
            str = decodeImage(bmp);
        }
    } catch (Exception ex) {
    }
    if (str != null) {
        // Toast.makeText(this, str, Toast.LENGTH_SHORT).show();
        Vibrate(300);
        this.returnStr(str);
    } else {
        camera.setOneShotPreviewCallback(QERdecodeActivity.this);
    }
}
/**
 * Crop the image: extract the centered square from the bitmap
 * @param bmp
 * @param per
 * @return
 */
private Bitmap cutBitmap(Bitmap bmp, float per) {
    int bmpW = bmp.getWidth();
    int bmpH = bmp.getHeight();
    int w = (int) ((bmpW < bmpH ? bmpW : bmpH) * per);
    int x = (bmpW - w) / 2;
    int y = (bmpH - w) / 2;
    // Toast.makeText(this, "bmpW:"+bmpW+" bmpH:"+bmpH+" w:"+w+" x:"+x+" y:"+y, Toast.LENGTH_LONG).show();
    return Bitmap.createBitmap(bmp, x, y, w, w);
}
/**
 * Decode the bitmap
 * @param bmp
 * @return
 */
private String decodeImage(Bitmap bmp) {
    String str = null;
    RGBLuminanceSource source = new RGBLuminanceSource(bmp);
    BinaryBitmap binBmp = new BinaryBitmap(new HybridBinarizer(source));
    try {
        Result result = this.reader.decode(binBmp);
        str = result.getText();
    } catch (Exception e) {
    }
    return str;
}
class RGBLuminanceSource extends LuminanceSource {
private final byte[] luminances;
public RGBLuminanceSource(Bitmap bitmap) {
super(bitmap.getWidth(), bitmap.getHeight());
int width = bitmap.getWidth();
int height = bitmap.getHeight();
int[] pixels = new int[width * height];
bitmap.getPixels(pixels, 0, width, 0, 0, width, height);
// In order to measure pure decoding speed, we convert the entire
// image to a greyscale array
// up front, which is the same as the Y channel of the
// YUVLuminanceSource in the real app.
luminances = new byte[width * height];
for (int y = 0; y < height; y++) {
int offset = y * width;
for (int x = 0; x < width; x++) {
int pixel = pixels[offset + x];
int r = (pixel >> 16) & 0xff;
int g = (pixel >> 8) & 0xff;
int b = pixel & 0xff;
if (r == g && g == b) {
// Image is already greyscale, so pick any channel.
luminances[offset + x] = (byte) r;
} else {
// Calculate luminance cheaply, favoring green.
luminances[offset + x] = (byte) ((r + g + g + b) >> 2);
}
}
}
}
@Override
public byte[] getRow(int y, byte[] row) {
if (y < 0 || y >= getHeight()) {
throw new IllegalArgumentException(
"Requested row is outside the image: " + y);
}
int width = getWidth();
if (row == null || row.length < width) {
row = new byte[width];
}
System.arraycopy(luminances, y * width, row, 0, width);
return row;
}
// Since this class does not support cropping, the underlying byte array
// already contains
// exactly what the caller is asking for, so give it to them without a
// copy.
@Override
public byte[] getMatrix() {
return luminances;
}
}
}
The layout is just a SurfaceView:
<?xml version="1.0" encoding="utf-8"?>
<SurfaceView xmlns:android="http://schemas.android.com/apk/res/android"
android:id="@+id/activity_qer_decode_surfaceview"
android:layout_width="match_parent"
android:layout_height="match_parent" >
</SurfaceView>