Making Sense of Multitouch

[This post is by Adam Powell, one of our more touchy-feely Android engineers. — Tim Bray]

The word “multitouch” gets thrown around quite a bit and it’s not always clear what people are referring to. For some it’s about hardware capability, for others it refers to specific gesture support in software. Whatever you decide to call it, today we’re going to look at how to make your apps and views behave nicely with multiple fingers on the screen.

This post is going to be heavy on code examples. It will cover creating a custom View that responds to touch events and allows the user to manipulate an object drawn within it. To get the most out of the examples you should be familiar with setting up an Activity and the basics of the Android UI system. Full project source will be linked at the end.

We’ll begin with a new View class that draws an object (our application icon) at a given position:

public class TouchExampleView extends View {
    private Drawable mIcon;
    private float mPosX;
    private float mPosY;
    
    private float mLastTouchX;
    private float mLastTouchY;
    
    public TouchExampleView(Context context) {
        this(context, null, 0);
    }
    
    public TouchExampleView(Context context, AttributeSet attrs) {
        this(context, attrs, 0);
    }
    
    public TouchExampleView(Context context, AttributeSet attrs, int defStyle) {
        super(context, attrs, defStyle);
        mIcon = context.getResources().getDrawable(R.drawable.icon);
        mIcon.setBounds(0, 0, mIcon.getIntrinsicWidth(), mIcon.getIntrinsicHeight());
    }

    @Override
    public void onDraw(Canvas canvas) {
        super.onDraw(canvas);
        
        canvas.save();
        canvas.translate(mPosX, mPosY);
        mIcon.draw(canvas);
        canvas.restore();
    }

    @Override
    public boolean onTouchEvent(MotionEvent ev) {
        // More to come here later...
        return true;
    }
}

MotionEvent

The Android framework’s primary point of access for touch data is the android.view.MotionEvent class. Passed to your views through the onTouchEvent and onInterceptTouchEvent methods, MotionEvent contains data about “pointers,” or active touch points on the device’s screen. Through a MotionEvent you can obtain X/Y coordinates as well as size and pressure for each pointer. MotionEvent.getAction() returns a value describing what kind of motion event occurred.

One of the more common uses of touch input is letting the user drag an object around the screen. We can accomplish this in our View class from above by implementing onTouchEvent as follows:

@Override
public boolean onTouchEvent(MotionEvent ev) {
    final int action = ev.getAction();
    switch (action) {
    case MotionEvent.ACTION_DOWN: {
        final float x = ev.getX();
        final float y = ev.getY();
        
        // Remember where we started
        mLastTouchX = x;
        mLastTouchY = y;
        break;
    }
        
    case MotionEvent.ACTION_MOVE: {
        final float x = ev.getX();
        final float y = ev.getY();
        
        // Calculate the distance moved
        final float dx = x - mLastTouchX;
        final float dy = y - mLastTouchY;
        
        // Move the object
        mPosX += dx;
        mPosY += dy;
        
        // Remember this touch position for the next move event
        mLastTouchX = x;
        mLastTouchY = y;
        
        // Invalidate to request a redraw
        invalidate();
        break;
    }
    }
    
    return true;
}

The code above has a bug on devices that support multiple pointers. While dragging the image around the screen, place a second finger on the touchscreen, then lift the first finger. The image jumps! What’s happening? We’re calculating the distance to move the object based on the last known position of the default pointer. When the first finger is lifted, the second finger becomes the default pointer, and we have a large delta between pointer positions which our code dutifully applies to the object’s location.

If all you want is info about a single pointer’s location, the methods MotionEvent.getX() and MotionEvent.getY() are all you need. MotionEvent was extended in Android 2.0 (Eclair) to report data about multiple pointers, and new actions were added to describe multitouch events. MotionEvent.getPointerCount() returns the number of active pointers. getX and getY now accept an index to specify which pointer’s data to retrieve.
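As a sketch of that indexed access, suppose you copy each active pointer’s coordinates out of the event with ev.getX(i) and ev.getY(i) for every index up to getPointerCount(). The centroid of those points is a common building block for multi-finger gestures. The helper below is hypothetical and operates on plain arrays rather than a live MotionEvent:

```java
public class PointerMath {
    // xs[i] and ys[i] hold the coordinates read from ev.getX(i) / ev.getY(i).
    // Returns {centroidX, centroidY} of all active pointers.
    static float[] centroid(float[] xs, float[] ys) {
        float sumX = 0f, sumY = 0f;
        for (int i = 0; i < xs.length; i++) {
            sumX += xs[i];
            sumY += ys[i];
        }
        return new float[] { sumX / xs.length, sumY / ys.length };
    }
}
```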

Index vs. ID

At a higher level, touchscreen data from a snapshot in time may not be immediately useful, since touch gestures involve motion over time spanning many motion events. A pointer index does not necessarily match up across complex events; it only indicates the data’s position within the MotionEvent. However, this is not work that your app has to do itself. Each pointer also has an ID mapping that stays persistent across touch events. You can retrieve this ID for each pointer using MotionEvent.getPointerId(index) and find an index for a pointer ID using MotionEvent.findPointerIndex(id).
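To see why IDs matter, consider a toy model of the framework’s bookkeeping (a simplification for illustration, not real framework code): each event reports its pointers packed by index, and each pointer carries a stable id. When a pointer lifts, the indices of the remaining pointers shift down, but their ids do not change, which is exactly what findPointerIndex relies on:

```java
public class PointerIdModel {
    // ids[index] = the stable id of the pointer stored at that index,
    // mimicking MotionEvent.getPointerId(index).
    static int findPointerIndex(int[] ids, int id) {
        for (int i = 0; i < ids.length; i++) {
            if (ids[i] == id) return i;
        }
        return -1; // that pointer is not present in this event
    }
}
```

With two fingers down, the second finger’s id 1 lives at index 1; after the first finger lifts, the same id 1 is found at index 0.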

Feeling Better?

Let’s fix the example above by taking pointer IDs into account.

private static final int INVALID_POINTER_ID = -1;

// The 'active pointer' is the one currently moving our object.
private int mActivePointerId = INVALID_POINTER_ID;

// Existing code ...

@Override
public boolean onTouchEvent(MotionEvent ev) {
    final int action = ev.getAction();
    switch (action & MotionEvent.ACTION_MASK) {
    case MotionEvent.ACTION_DOWN: {
        final float x = ev.getX();
        final float y = ev.getY();
        
        mLastTouchX = x;
        mLastTouchY = y;

        // Save the ID of this pointer
        mActivePointerId = ev.getPointerId(0);
        break;
    }
        
    case MotionEvent.ACTION_MOVE: {
        // Find the index of the active pointer and fetch its position
        final int pointerIndex = ev.findPointerIndex(mActivePointerId);
        final float x = ev.getX(pointerIndex);
        final float y = ev.getY(pointerIndex);
        
        final float dx = x - mLastTouchX;
        final float dy = y - mLastTouchY;
        
        mPosX += dx;
        mPosY += dy;
        
        mLastTouchX = x;
        mLastTouchY = y;
        
        invalidate();
        break;
    }
        
    case MotionEvent.ACTION_UP: {
        mActivePointerId = INVALID_POINTER_ID;
        break;
    }
        
    case MotionEvent.ACTION_CANCEL: {
        mActivePointerId = INVALID_POINTER_ID;
        break;
    }
    
    case MotionEvent.ACTION_POINTER_UP: {
        // Extract the index of the pointer that left the touch sensor
        final int pointerIndex = (action & MotionEvent.ACTION_POINTER_INDEX_MASK) 
                >> MotionEvent.ACTION_POINTER_INDEX_SHIFT;
        final int pointerId = ev.getPointerId(pointerIndex);
        if (pointerId == mActivePointerId) {
            // This was our active pointer going up. Choose a new
            // active pointer and adjust accordingly.
            final int newPointerIndex = pointerIndex == 0 ? 1 : 0;
            mLastTouchX = ev.getX(newPointerIndex);
            mLastTouchY = ev.getY(newPointerIndex);
            mActivePointerId = ev.getPointerId(newPointerIndex);
        }
        break;
    }
    }
    
    return true;
}

There are a few new elements at work here. We’re switching on action & MotionEvent.ACTION_MASK now rather than just action itself, and we’re using a new MotionEvent action constant, MotionEvent.ACTION_POINTER_UP. ACTION_POINTER_DOWN and ACTION_POINTER_UP are fired whenever a secondary pointer goes down or up. If there is already a pointer on the screen and a new one goes down, you will receive ACTION_POINTER_DOWN instead of ACTION_DOWN. If a pointer goes up but there is still at least one touching the screen, you will receive ACTION_POINTER_UP instead of ACTION_UP.

The ACTION_POINTER_DOWN and ACTION_POINTER_UP events encode extra information in the action value. ANDing it with MotionEvent.ACTION_MASK gives us the action constant, while ANDing it with ACTION_POINTER_INDEX_MASK gives us the index of the pointer that went up or down. In the ACTION_POINTER_UP case our example extracts this index and checks whether our active pointer ID refers to a pointer that is no longer touching the screen. If it does, we select a different pointer to be active and save its current X and Y position. Since this saved position is used in the ACTION_MOVE case to calculate the distance to move the onscreen object, we will always calculate the distance to move using data from the correct pointer.
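The masking can be sketched with plain integers. The literal values below match the MotionEvent constants (ACTION_MASK is 0x00ff, ACTION_POINTER_INDEX_MASK is 0xff00 with a shift of 8, and ACTION_POINTER_UP is 6); in real code you would of course use the named constants rather than these literals:

```java
public class ActionDecode {
    // Literal values of the corresponding MotionEvent constants.
    static final int ACTION_MASK = 0x00ff;
    static final int ACTION_POINTER_INDEX_MASK = 0xff00;
    static final int ACTION_POINTER_INDEX_SHIFT = 8;
    static final int ACTION_POINTER_UP = 6;

    // The low byte holds the action constant...
    static int unmaskedAction(int action) {
        return action & ACTION_MASK;
    }

    // ...and the next byte holds the index of the pointer that went up or down.
    static int pointerIndex(int action) {
        return (action & ACTION_POINTER_INDEX_MASK) >> ACTION_POINTER_INDEX_SHIFT;
    }
}
```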

This is all the data that you need to process any sort of gesture your app may require. However, dealing with this low-level data can be cumbersome when working with more complex gestures. Enter GestureDetectors.

GestureDetectors

Since apps can have vastly different needs, Android does not spend time cooking touch data into higher-level events unless you specifically request it. GestureDetectors are small filter objects that consume MotionEvents and dispatch higher-level gesture events to listeners specified during their construction. The Android framework provides two GestureDetectors out of the box, but you should also feel free to use them as examples for implementing your own if needed. GestureDetectors are a pattern, not a prepackaged solution. They’re not just for complex gestures such as drawing a star while standing on your head; they can even make simple gestures like fling or double tap easier to work with.

android.view.GestureDetector generates gesture events for several common single-pointer gestures used by Android, including scrolling, flinging, and long press. For Android 2.2 (Froyo) we’ve also added android.view.ScaleGestureDetector for processing the most commonly requested two-finger gesture: pinch zooming.

Gesture detectors follow the pattern of providing a method public boolean onTouchEvent(MotionEvent). This method, like its namesake in android.view.View, returns true if it handles the event and false if it does not. In the context of a gesture detector, a return value of true implies that there is an appropriate gesture currently in progress. GestureDetector and ScaleGestureDetector can be used together when you want a view to recognize multiple gestures.

To report detected gesture events, gesture detectors use listener objects passed to their constructors. ScaleGestureDetector uses ScaleGestureDetector.OnScaleGestureListener. ScaleGestureDetector.SimpleOnScaleGestureListener is offered as a helper class that you can extend if you don’t care about all of the reported events.

Since we are already supporting dragging in our example, let’s add support for scaling. The updated example code is shown below:

private ScaleGestureDetector mScaleDetector;
private float mScaleFactor = 1.f;

// Existing code ...

public TouchExampleView(Context context, AttributeSet attrs, int defStyle) {
    super(context, attrs, defStyle);
    mIcon = context.getResources().getDrawable(R.drawable.icon);
    mIcon.setBounds(0, 0, mIcon.getIntrinsicWidth(), mIcon.getIntrinsicHeight());
    
    // Create our ScaleGestureDetector
    mScaleDetector = new ScaleGestureDetector(context, new ScaleListener());
}

@Override
public boolean onTouchEvent(MotionEvent ev) {
    // Let the ScaleGestureDetector inspect all events.
    mScaleDetector.onTouchEvent(ev);
    
    final int action = ev.getAction();
    switch (action & MotionEvent.ACTION_MASK) {
    case MotionEvent.ACTION_DOWN: {
        final float x = ev.getX();
        final float y = ev.getY();
        
        mLastTouchX = x;
        mLastTouchY = y;
        mActivePointerId = ev.getPointerId(0);
        break;
    }
        
    case MotionEvent.ACTION_MOVE: {
        final int pointerIndex = ev.findPointerIndex(mActivePointerId);
        final float x = ev.getX(pointerIndex);
        final float y = ev.getY(pointerIndex);

        // Only move if the ScaleGestureDetector isn't processing a gesture.
        if (!mScaleDetector.isInProgress()) {
            final float dx = x - mLastTouchX;
            final float dy = y - mLastTouchY;

            mPosX += dx;
            mPosY += dy;

            invalidate();
        }

        mLastTouchX = x;
        mLastTouchY = y;

        break;
    }
        
    case MotionEvent.ACTION_UP: {
        mActivePointerId = INVALID_POINTER_ID;
        break;
    }
        
    case MotionEvent.ACTION_CANCEL: {
        mActivePointerId = INVALID_POINTER_ID;
        break;
    }
    
    case MotionEvent.ACTION_POINTER_UP: {
        final int pointerIndex = (ev.getAction() & MotionEvent.ACTION_POINTER_INDEX_MASK) 
                >> MotionEvent.ACTION_POINTER_INDEX_SHIFT;
        final int pointerId = ev.getPointerId(pointerIndex);
        if (pointerId == mActivePointerId) {
            // This was our active pointer going up. Choose a new
            // active pointer and adjust accordingly.
            final int newPointerIndex = pointerIndex == 0 ? 1 : 0;
            mLastTouchX = ev.getX(newPointerIndex);
            mLastTouchY = ev.getY(newPointerIndex);
            mActivePointerId = ev.getPointerId(newPointerIndex);
        }
        break;
    }
    }
    
    return true;
}

@Override
public void onDraw(Canvas canvas) {
    super.onDraw(canvas);
    
    canvas.save();
    canvas.translate(mPosX, mPosY);
    canvas.scale(mScaleFactor, mScaleFactor);
    mIcon.draw(canvas);
    canvas.restore();
}

private class ScaleListener extends ScaleGestureDetector.SimpleOnScaleGestureListener {
    @Override
    public boolean onScale(ScaleGestureDetector detector) {
        mScaleFactor *= detector.getScaleFactor();
        
        // Don't let the object get too small or too large.
        mScaleFactor = Math.max(0.1f, Math.min(mScaleFactor, 5.0f));

        invalidate();
        return true;
    }
}

This example merely scratches the surface of what ScaleGestureDetector offers. The listener methods receive a reference to the detector itself as a parameter that can be queried for extended information about the gesture in progress. See the ScaleGestureDetector API documentation for more details.

Now our example app allows a user to drag with one finger, scale with two, and it correctly handles passing active pointer focus between fingers as they contact and leave the screen. You can download the final sample project at http://code.google.com/p/android-touchexample/. It requires the Android 2.2 SDK (API level 8) to build and a 2.2 (Froyo) powered device to run.

From Example to Application

In a real app you would want to tweak the details about how zooming behaves. When zooming, users will expect content to zoom about the focal point of the gesture as reported by ScaleGestureDetector.getFocusX() and getFocusY(). The specifics of this will vary depending on how your app represents and draws its content.
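One way to sketch the math, assuming the view draws its content at a translation with a scale applied (as in the onDraw above): the content coordinate under the focal point is (focus − pos) / oldScale, and to keep that point stationary on screen you recompute the translation at the new scale. The helper below is hypothetical, per axis:

```java
public class FocalZoom {
    // Returns the new translation (for one axis) that keeps the content point
    // under `focus` fixed on screen when scale changes from oldScale to newScale.
    static float adjustPos(float pos, float focus, float oldScale, float newScale) {
        // Content coordinate currently under the focal point:
        //   content = (focus - pos) / oldScale
        // Keep it under the focal point at the new scale by solving
        //   focus = newPos + content * newScale  for newPos:
        return focus - (focus - pos) * (newScale / oldScale);
    }
}
```

In the onScale callback you would apply this once for X with getFocusX() and once for Y with getFocusY() before invalidating.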

Different touchscreen hardware may have different capabilities; some panels may only support a single pointer, others may support two pointers but with position data unsuitable for complex gestures, and others may support precise positioning data for two pointers and beyond. You can query what type of touchscreen a device has at runtime using PackageManager.hasSystemFeature().
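For illustration, the relevant feature strings (the values behind PackageManager.FEATURE_TOUCHSCREEN, FEATURE_TOUCHSCREEN_MULTITOUCH, and FEATURE_TOUCHSCREEN_MULTITOUCH_DISTINCT) can be folded into a simple capability tier. The helper below is hypothetical and takes the features the device reported; in a real app you would call pm.hasSystemFeature(...) directly:

```java
public class TouchCapability {
    // Values of the corresponding PackageManager.FEATURE_TOUCHSCREEN* constants.
    static final String TOUCHSCREEN = "android.hardware.touchscreen";
    static final String MULTITOUCH = "android.hardware.touchscreen.multitouch";
    static final String MULTITOUCH_DISTINCT = "android.hardware.touchscreen.multitouch.distinct";

    static boolean has(String[] features, String feature) {
        for (String f : features) {
            if (f.equals(feature)) return true;
        }
        return false;
    }

    // 0 = no touchscreen, 1 = single pointer, 2 = basic two-point,
    // 3 = independently tracked pointers (suitable for complex gestures).
    static int tier(String[] reportedFeatures) {
        if (has(reportedFeatures, MULTITOUCH_DISTINCT)) return 3;
        if (has(reportedFeatures, MULTITOUCH)) return 2;
        if (has(reportedFeatures, TOUCHSCREEN)) return 1;
        return 0;
    }
}
```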

As you design your user interface, keep in mind that people use their mobile devices in many different ways, and not all Android devices are created equal. Some apps might be used one-handed, making multiple-finger gestures awkward. Some users prefer using directional pads or trackballs to navigate. Well-designed gesture support can put complex functionality at your users’ fingertips, but also consider designing alternate means of accessing application functionality that can coexist with gestures.
