4.2. Multi-Touch Event Handling

The multi-touch support on the iPhone provides a series of touch events consisting of smaller, individual parts of a single multi-touch gesture. For example, placing your finger on the screen generates one event, placing a second finger on the screen generates another, and moving either finger generates yet another. All of these are handled through the UITouch and UIEvent APIs. These objects provide information about which gesture events have been made and the screen coordinates where they occur. The location of each touch is reported relative to a particular window and view, so you'll also learn how to tell in which object a touch occurred.

Because events are relative to the object in which they occur, the coordinates returned will not actually be screen coordinates, but rather relative coordinates. For example, if a view window is drawn halfway down the screen, at (0, 240), and your user touches the upper-left corner of that view, the coordinates reported to your application will be (0, 0), not (0, 240). The (0, 0) coordinates represent the upper-left corner of the view object that the user tapped.

4.2.1. UITouch Notifications

UITouch is the class used to convey a single action within a series of gestures. A UITouch object might include notifications for one or more fingers down, a finger that has moved, or the termination of a gesture. Several UITouch notifications can occur over the period of a single gesture.

The UITouch object includes various properties that identify an event:



timestamp

The time at which the touch occurred, or was last updated, expressed using the foundation type NSTimeInterval.



phase

The type of touch event occurring. This informs the application about what the user is actually doing with his finger, and can be one of the following enumerated values:



UITouchPhaseBegan

Sent when one or more fingers touch the screen surface.



UITouchPhaseMoved

Sent when a finger is moved along the screen. If the user is making a gesture or dragging an item, this type of notification will be sent several times, updating the application with the coordinates of the move.



UITouchPhaseEnded

Sent when a finger is removed from the screen.



UITouchPhaseStationary

Sent when a finger is touching the screen, but hasn't moved since the previous notification.



UITouchPhaseCancelled

Sent when the system cancels the gesture. This can happen during events that would suspend your application, such as an incoming phone call, or when there is so little contact on the screen surface that the iPhone no longer believes a finger gesture is being made. Apple humorously gives the example of sticking the iPhone on your face to generate this notification, which may be useful if you're writing a "stick an iPhone on your face" application to complement your flashlight application.



tapCount

Identifies the number of taps (including the current tap). For example, the second tap of a double tap would have this property set to 2.



window

A pointer to the UIWindow object in which this touch event occurred.



view

A pointer to the UIView object in which this event occurred.

In addition to its properties, the UITouch class contains the following methods that you can use to identify the screen coordinates at which the event took place. Remember, these coordinates are going to be relative to the view objects in which they occurred, and do not represent actual screen coordinates:



locationInView

- (CGPoint) locationInView: (UIView *) view;

Returns a CGPoint structure containing the coordinates (relative to the UIView object) at which this event occurred.



previousLocationInView

- (CGPoint) previousLocationInView: (UIView *) view;

Returns a CGPoint structure containing the coordinates (relative to the UIView object) from which this event originated. For example, if this event described a UITouchPhaseMoved event, it will return the coordinates from which the finger was moved.

4.2.2. UIEvent

A UIEvent object aggregates a series of UITouch objects into a single, portable object that you can manage easily. A UIEvent provides methods to look up the touches that have occurred in a single window, view, or even across the entire application.

Event objects are sent as a gesture is made, containing all of the touches for that gesture. The foundation class NSSet is used to deliver the collection of UITouch objects. Each UITouch in the set carries its own timestamp, phase, and location, and the UIEvent object has a timestamp property of its own as well.

The UIEvent object supports the following methods:



allTouches

- (NSSet *) allTouches;

Returns a set of all touches occurring within the application.



touchesForWindow

- (NSSet *) touchesForWindow: (UIWindow *) window;

Returns a set of all touches occurring only within the specified UIWindow object.



touchesForView

- (NSSet *) touchesForView: (UIView *) view;

Returns a set of all touches occurring only within the specified UIView object.

4.2.3. Event Handling

When the user makes a gesture, the application notifies the key window and supplies a UIEvent object containing the touches that are occurring. The key window relays this information to the first responder for the window; this is typically the view in which the actual event occurred. Once a gesture has been associated with a given view, all subsequent events related to the gesture will be reported to that view. The UIApplication and UIWindow objects each contain a method named sendEvent:

- (void) sendEvent: (UIEvent *)event;

This method is called as part of the event dispatch process and is responsible for receiving and dispatching events to their correct destinations. In general, you won't need to override it unless you want to monitor all incoming events.

When the key window receives an event, it polls each of the views it presides over to determine in which one the event originated, and then dispatches the event to it. The view's responder then dispatches the event to its view controller, if the view is assigned to one, and then to the view's superview. The event then makes its way back up the responder chain to the window, and finally to the application.

To receive multi-touch events, you must override one or more of the event-handler methods below in your UIView-derived object. Your UIWindow and UIApplication classes may also override these methods to receive events, but there is rarely a need to do so. Apple has specified the following prototypes to receive multi-touch events. Each method is associated with one of the touch phases described in the previous section:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event;

Two arguments are provided for each method. The NSSet contains only the touches that are new or have changed since the last event. A UIEvent is also provided, serving two purposes: it allows you to filter the individual touch events for a particular view or window, and it provides a means of accessing previous touches belonging to the current gesture.

You'll only need to implement the notification methods for events you're interested in. If you're interested only in receiving taps as "mouse clicks," you'll need to use only the touchesBegan and touchesEnded methods. If you're using dragging, such as in a slider control, you'll need the touchesMoved method. The touchesCancelled method is optional, but Apple recommends using it to clean up objects in persistent classes.

4.2.4. Example: Tap Counter

In this example, you'll override the touchesBegan method to detect single, double, and triple taps. For our purposes here, we'll send output to the console only, but in your own application, you'll relay this information to its UI components.

To build this example, create a sample view-based application in Xcode named TouchDemo, and add the code below to your TouchDemoViewController.m class:

- (void) touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [ touches anyObject ];
    CGPoint location = [ touch locationInView: self.view ];
    NSUInteger taps = [ touch tapCount ];

    NSLog(@"%s tap at %f, %f tap count: %lu",
        (taps == 1) ? "Single" :
            (taps == 2) ? "Double" : "Triple+",
        location.x, location.y, (unsigned long) taps);
}

Compile the application and run it in the simulator. In Xcode, choose Console from the Run menu to open the display console. Now experiment with single, double, and triple taps on the simulator's screen. You should see output similar to the following:

TouchDemo[4229:20b] Single tap at 161.000000, 113.953613 tap count: 1
TouchDemo[4229:20b] Double tap at 161.000000, 113.953613 tap count: 2
TouchDemo[4229:20b] Triple+ tap at 161.000000, 113.953613 tap count: 3
TouchDemo[4229:20b] Triple+ tap at 161.000000, 113.953613 tap count: 4

NOTE


The tap count will continue to increase indefinitely, meaning you could technically track quadruple taps and even higher tap counts.

4.2.5. Example: Tap and Drag

To track objects being dragged, you'll need to override three methods: touchesBegan, touchesMoved, and touchesEnded. The touchesMoved method will be called frequently as the finger is dragged across the screen. When the user releases her finger, the touchesEnded method will notify you of this.

To build this example, create a sample view-based application in Xcode named DragDemo, and add the code below to your DragDemoViewController.m class:


- (void) touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [ touches anyObject ];
    CGPoint location = [ touch locationInView: self.view ];
    NSUInteger taps = [ touch tapCount ];

    [ super touchesBegan: touches withEvent: event ];

    NSLog(@"Tap BEGIN at %f, %f Tap count: %lu", location.x, location.y,
        (unsigned long) taps);
}

- (void) touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [ touches anyObject ];
    CGPoint oldLocation = [ touch previousLocationInView: self.view ];
    CGPoint location = [ touch locationInView: self.view ];

    [ super touchesMoved: touches withEvent: event ];
    NSLog(@"Finger MOVED from %f, %f to %f, %f",
        oldLocation.x, oldLocation.y, location.x, location.y);
}

- (void) touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [ touches anyObject ];
    CGPoint location = [ touch locationInView: self.view ];

    [ super touchesEnded: touches withEvent: event ];
    NSLog(@"Tap ENDED at %f, %f", location.x, location.y);
}


When you run this application, you'll get notifications whenever the user places his finger down or lifts it up, and you'll also receive many more as the user moves his finger across the screen:

Tap BEGIN at 101.000000, 117.953613 Tap count: 1
Finger MOVED from 101.000000, 117.953613 to 102.000000, 117.953613
Finger MOVED from 102.000000, 117.953613 to 104.000000, 117.953613
Finger MOVED from 104.000000, 117.953613 to 105.000000, 117.953613
Finger MOVED from 105.000000, 117.953613 to 107.000000, 117.953613
Finger MOVED from 107.000000, 117.953613 to 109.000000, 116.953613
Finger MOVED from 109.000000, 116.953613 to 113.000000, 115.953613
Finger MOVED from 113.000000, 115.953613 to 116.000000, 115.953613
Finger MOVED from 116.000000, 115.953613 to 120.000000, 114.953613
Finger MOVED from 120.000000, 114.953613 to 122.000000, 114.953613
...
Tap ENDED at 126.000000, 144.953613

Because you'll receive many notifications as the user's finger moves, it's important to make sure your application doesn't perform any resource-intensive work until after the user's finger is released, or at externally timed intervals. For example, if your application is a game, the touchesMoved method may not be the ideal place to write character movements. Instead, use this method to queue up the movement, and use a separate mechanism, such as an NSTimer, to execute the actual movements. Otherwise, longer drags will slow your application down because it will be trying to handle so many events in a short period of time.

4.2.6. Processing Multi-Touch

By default, a view is configured to receive only single-touch events. To enable a view to process multi-touch gestures, set its multipleTouchEnabled property to YES. In the previous example, overriding the view controller's viewDidLoad method, as shown below, would do this:

- (void)viewDidLoad {
    self.view.multipleTouchEnabled = YES;
    [ super viewDidLoad ];
}

Once enabled, both single-finger events and multiple-finger gestures will be sent to the touch notification methods. You can determine the number of touches on the screen by counting the UITouch objects in the NSSet passed as the first argument. During a two-finger gesture, each of the touch methods will be provided with two UITouch objects, one for each finger. You can determine this by performing a simple count on the NSSet object provided:

int fingerCount = [ touches count ];

To determine the number of fingers used in the gesture as a whole, query the UIEvent. This allows you to determine when the last finger in a gesture has been lifted:

int fingersInGesture = [ [ event touchesForView: self.view ] count ];
if (fingerCount == fingersInGesture) {
    NSLog(@"All fingers present in the event");
}

4.2.7. PinchMe: Pinch Tracking

This example is similar to the previous examples in this chapter, but instead of tracking a single finger movement, both fingers are tracked to determine a scaling factor for a pinch operation. You can then easily apply the scaling factor to resize an image or perform other similar operations using a pinch. You'll override three methods in your view controller, as before: touchesBegan, touchesMoved, and touchesEnded. You'll also override the viewDidLoad method to enable multi-touch gestures. The touchesMoved method will be called frequently as the fingers are dragged across the screen, and this is where the scaling factor will be calculated. When the user releases her fingers, the touchesEnded method will notify you of this.

To build this example, create a sample view-based application in Xcode named PinchMe, and add the code from Example 4-1 to your PinchMeViewController.m class.

Example 4-1. Code supplement for PinchMeViewController.m


- (void)viewDidLoad {
    [super viewDidLoad];
    [ self.view setMultipleTouchEnabled: YES ];
}

- (void) touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [ touches anyObject ];
    CGPoint location = [ touch locationInView: self.view ];
    NSUInteger taps = [ touch tapCount ];

    [ super touchesBegan: touches withEvent: event ];

    NSLog(@"Tap BEGIN at %f, %f Tap count: %lu", location.x, location.y,
        (unsigned long) taps);
}

- (void) touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    int finger = 0;
    NSEnumerator *enumerator = [ touches objectEnumerator ];
    UITouch *touch;
    CGPoint location[2];
    while ((touch = [ enumerator nextObject ]) && finger < 2)
    {
        location[finger] = [ touch locationInView: self.view ];
        NSLog(@"Finger %d moved: %fx%f",
        finger+1, location[finger].x, location[finger].y);
        finger++;
    }

    if (finger == 2) {
        CGPoint scale;
        scale.x = fabs(location[0].x - location[1].x);
        scale.y = fabs(location[0].y - location[1].y);
        NSLog(@"Scaling: %.0f x %.0f", scale.x, scale.y);
    }
    [ super touchesMoved: touches withEvent: event ];
}

- (void) touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [ touches anyObject ];
    CGPoint location = [ touch locationInView: self.view ];

    [ super touchesEnded: touches withEvent: event ];
    NSLog(@"Tap ENDED at %f, %f", location.x, location.y);
}

Open up a console while you run the application. Holding down the Option key, you'll be able to simulate multi-touch pinch gestures in the iPhone Simulator. As you move the mouse around, you'll see reports of both finger coordinates and of the x, y delta between the two:

PinchMe[5039:20b] Finger 1 moved: 110.000000x293.046387
PinchMe[5039:20b] Finger 2 moved: 210.000000x146.953613
PinchMe[5039:20b] Scaling: 100 x 146
PinchMe[5039:20b] Finger 1 moved: 111.000000x289.046387
PinchMe[5039:20b] Finger 2 moved: 209.000000x150.953613
PinchMe[5039:20b] Scaling: 98 x 138
PinchMe[5039:20b] Finger 1 moved: 112.000000x285.046387
PinchMe[5039:20b] Finger 2 moved: 208.000000x154.953613
PinchMe[5039:20b] Scaling: 96 x 130
PinchMe[5039:20b] Finger 1 moved: 113.000000x282.046387
PinchMe[5039:20b] Finger 2 moved: 207.000000x157.953613
PinchMe[5039:20b] Scaling: 94 x 124
PinchMe[5039:20b] Finger 1 moved: 113.000000x281.046387
PinchMe[5039:20b] Finger 2 moved: 207.000000x158.953613
PinchMe[5039:20b] Scaling: 94 x 122

4.2.8. TouchDemo: Multi-Touch Icon Tracking

For a fuller example of the touches API, TouchDemo illustrates how you can use the touches API to track individual finger movements on the screen. Four demo images are downloaded automatically from a website and displayed on the screen. The user may use one or more fingers to simultaneously drag the icons to a new position. The example tracks each touch individually; it repositions each icon as the user drags. See Figure 4-1.

Figure 4-1. TouchDemo example


You can compile this application, shown in Examples 4-2 through 4-6, with the SDK by creating a view-based application project named TouchDemo. Be sure to pull out the Interface Builder code so you can see how these objects are created from scratch.

Example 4-2. TouchDemo application delegate prototypes (TouchDemoAppDelegate.h)


#import <UIKit/UIKit.h>

@class TouchDemoViewController;

@interface TouchDemoAppDelegate : NSObject <UIApplicationDelegate> {
    UIWindow *window;
    TouchDemoViewController *viewController;
}

@property (nonatomic, retain) IBOutlet UIWindow *window;
@property (nonatomic, retain) IBOutlet TouchDemoViewController *viewController;

@end


Example 4-3. TouchDemo application delegate (TouchDemoAppDelegate.m)
#import "TouchDemoAppDelegate.h"
#import "TouchDemoViewController.h"

@implementation TouchDemoAppDelegate

@synthesize window;
@synthesize viewController;

- (void)applicationDidFinishLaunching:(UIApplication *)application {
    CGRect screenBounds = [ [ UIScreen mainScreen ] applicationFrame ];

    self.window = [ [ [ UIWindow alloc ] initWithFrame: screenBounds ]
        autorelease
    ];

    viewController = [ [ TouchDemoViewController alloc ] init ];

    [ window addSubview: viewController.view ];
    [ window makeKeyAndVisible ];
}

- (void)dealloc {
    [ viewController release ];
    [ window release ];
    [ super dealloc ];
}


@end

Example 4-4. TouchDemo view controller prototypes (TouchDemoViewController.h)
#import <UIKit/UIKit.h>

@interface TouchView : UIView {
    UIImage *images[4];
    CGPoint offsets[4];
    int tracking[4];
}

@end

@interface TouchDemoViewController : UIViewController {
    TouchView *touchView;
}

@end

Example 4-5. TouchDemo view controller (TouchDemoViewController.m)


#import "TouchDemoViewController.h"

@implementation TouchView

- (id)initWithFrame:(CGRect)frame {
    frame.origin.y = 0.0;
    self = [ super initWithFrame: frame ];
    if (self != nil) {
        self.multipleTouchEnabled = YES;
        for(int i=0; i<4; i++) {
            NSURL *url = [ NSURL URLWithString: [
                NSString stringWithFormat:
                    @"http://www.zdziarski.com/demo/%d.png", i+1 ]
            ];
            images[i] = [ [ UIImage alloc ]
                initWithData: [ NSData dataWithContentsOfURL: url ]
            ];
            offsets[i] = CGPointMake(0.0, 0.0);
        }

        offsets[0] = CGPointMake(0.0, 0.0);
        offsets[1] = CGPointMake(self.frame.size.width
            - images[1].size.width, 0.0);
        offsets[2] = CGPointMake(0.0, self.frame.size.height
            - images[2].size.height);
        offsets[3] = CGPointMake(self.frame.size.width
            - images[3].size.width, self.frame.size.height
            - images[3].size.height);
    }

    return self;
}

- (void)drawRect:(CGRect)rect {
    for(int i=0; i<4; i++ ) {
        [ images[i] drawInRect: CGRectMake(offsets[i].x, offsets[i].y,
            images[i].size.width, images[i].size.height)
        ];
    }
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch;
    int touchId = 0;

    NSEnumerator *enumerator = [ touches objectEnumerator ];
    while ((touch = (UITouch *) [ enumerator nextObject ])) {
        tracking[touchId] = -1;
        CGPoint location = [ touch locationInView: self ];
        for(int i=3;i>=0;i--) {
            CGRect rect = CGRectMake(offsets[i].x, offsets[i].y,
                images[i].size.width, images[i].size.height);
            if (CGRectContainsPoint(rect, location)) {
                NSLog(@"Begin Touch ID %d Tracking with image %d\n", touchId, i);
                tracking[touchId] = i;
                break;
            }
        }

        touchId++;
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch;
    int touchId = 0;

    NSEnumerator *enumerator = [ touches objectEnumerator ];
    while ((touch = (UITouch *) [ enumerator nextObject ])) {
        if (tracking[touchId] != -1) {
            NSLog(@"Touch ID %d Tracking with image %d\n",
               touchId, tracking[touchId]);

            CGPoint location = [ touch locationInView: self ];
            CGPoint oldLocation = [ touch previousLocationInView: self ];
            offsets[tracking[touchId]].x += (location.x - oldLocation.x);
            offsets[tracking[touchId]].y += (location.y - oldLocation.y);

            if (offsets[tracking[touchId]].x < 0.0)
                offsets[tracking[touchId]].x = 0.0;
            if (offsets[tracking[touchId]].x
                + images[tracking[touchId]].size.width > self.frame.size.width)
            {
                offsets[tracking[touchId]].x = self.frame.size.width
                    - images[tracking[touchId]].size.width;
            }

            if (offsets[tracking[touchId]].y < 0.0)
                offsets[tracking[touchId]].y = 0.0;
            if (offsets[tracking[touchId]].y
                + images[tracking[touchId]].size.height > self.frame.size.height)
            {
                offsets[tracking[touchId]].y = self.frame.size.height
                    - images[tracking[touchId]].size.height;
            }
        }
        touchId++;
    }

    [ self setNeedsDisplay ];
}
@end

@implementation TouchDemoViewController

- (void)loadView {
    [ super loadView ];

    touchView = [ [ TouchView alloc ] initWithFrame: [
        [ UIScreen mainScreen ] applicationFrame ]
    ];
    self.view = touchView;
}

- (void)didReceiveMemoryWarning {
    [ super didReceiveMemoryWarning ];
}

- (void)dealloc {
    [ super dealloc ];
}

@end


Example 4-6. TouchDemo main (main.m)


#import <UIKit/UIKit.h>

int main(int argc, char *argv[]) {

    NSAutoreleasePool * pool = [ [ NSAutoreleasePool alloc ] init ];
    int retVal = UIApplicationMain(argc, argv, nil, @"TouchDemoAppDelegate");
    [ pool release ];
    return retVal;
}


4.2.9. What's Going On

The TouchDemo example works as follows:

  1. When the application starts, it instantiates a new window and view controller. A subclass of UIView, named TouchView, is created and attached to the view controller.

  2. When the user presses one or more fingers, the touchesBegan method is notified. It compares the location of each touch with each image's offset and size to determine which image, if any, the user touched, and records that image's index in the tracking array.

  3. When the user moves a finger, the touchesMoved method is notified. This determines which images the user originally touched and adjusts their offsets on the screen by the correct amount. It then calls the setNeedsDisplay method, which invokes the drawRect method to redraw the view's contents.

4.2.10. Further Study

  • Check out the following prototypes in your SDK's header files: UITouch.h and UIEvent.h. You'll find these deep within /Developer/Platforms/iPhoneOS.platform, inside the UIKit framework's Headers directory.