Touch events, the accelerometer, and scroll views
The iPhone broke the mold for mobile devices by removing almost all physical buttons in favor of a giant touch screen. To replace behaviors that had been performed by dedicated buttons, the iPhone allows for sophisticated touch gestures involving multiple fingers. For example, it is now common knowledge that pinching and expanding two fingers on the screen will zoom in and out of a map, image, web page, etc., something that startled the crowd at the unveiling of the iPhone.
In addition to the large touch screen, the iPhone supports a second means of interaction through onboard accelerometers, which can sense movement of the device.
Touch events
Almost all of your interaction with the iPhone will come through touching items on its screen. Most of these touches will be handled by native Cocoa Touch views and their controllers, which do all the processing for you. For example, when dealing with a UIButton, you don't need to manually respond to touches. Instead, you set the action to be taken by the UIButton on various touch conditions.
However, there are times when you will want to handle touch events manually, such as when drawing, moving items with your finger, or implementing custom touch gestures. In these cases, you will need to override a series of methods in your UIView subclass, or in the UIViewController managing that view (touches a view does not handle travel up the responder chain to its controller). These methods are
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
This is called when touches start. Note that this may simply indicate that a second, third, etc. finger has now started touching the screen in addition to ones that were already touching.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
This is called when one or more of the fingers touching the screen have moved.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
This is called when one or more of the fingers touching the screen have lifted up. Again, other fingers may still be on the screen.
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
This is called when a finger travels outside of the area of the view receiving touch events, or when an encompassing view, like a UIScrollView, cancels touch events so that it can use them for its own processing.
Each of these methods will take in an NSSet of UITouch objects. Note that NSSets have no ordering to them, so it is up to you to keep track of which touches correspond to which fingers. Fortunately, a given finger's UITouch object persists for the lifetime of that touch, so you can track fingers by object identity.
In order to get the location of a touch event in the coordinate space of the UIView in question, you will need to use code like the following:
CGPoint point1 = [currentTouch locationInView:self.view];
If a touch is moving, you can use code like the following to obtain the location of this touch as of the last time it was measured:
CGPoint previousLocationOfTouch1 = [currentTouch previousLocationInView:self.view];
You can detect double-taps on the screen by examining the tapCount property of a UITouch object. This is probably best done in the -touchesEnded:withEvent: method.
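As a sketch of how these pieces fit together, the following UIView subclass methods track the distance between two fingers during a pinch and detect a double-tap. The logging is purely illustrative; a real implementation would compare the current distance against one stored when the touches began.

```objc
// In a UIView subclass that handles its own touches
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSArray *allTouches = [[event allTouches] allObjects];
    if ([allTouches count] == 2)
    {
        CGPoint first = [[allTouches objectAtIndex:0] locationInView:self];
        CGPoint second = [[allTouches objectAtIndex:1] locationInView:self];
        // Distance between the two fingers; comparing this against the
        // distance when the touches began tells you pinch in vs. pinch out
        CGFloat distance = sqrtf(powf(second.x - first.x, 2.0f) +
                                 powf(second.y - first.y, 2.0f));
        NSLog(@"Current pinch distance: %f", distance);
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    if ([[touches anyObject] tapCount] == 2)
    {
        NSLog(@"Double-tap detected");
    }
}
```

Note the use of [event allTouches] in -touchesMoved:withEvent:, since the touches argument only contains the touches that changed, not necessarily both fingers.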
To perform multitouch gestures within the iPhone Simulator, you simply hold Option, which will bring up a second finger (as a translucent pad). It's easy to do pinch gestures using this by clicking and dragging the mouse while holding Option. To move both fingers at once, hold down Option and Shift while moving the mouse.
Unfortunately, you really can't get the full effect of multitouch while working in the Simulator, so you will need to start your testing on the device early when handling this kind of input in your application. In addition to the Simulator not supporting more complex gestures than pinches and two fingers moving together, in real life your fingers will almost never touch the surface at the same moment, or lift off in unison. They will shake and move independently of one another, something you'll need to take into account.
UIScrollView
If you don't want to handle common touch gestures yourself, Apple provides UIScrollView, a container view that abstracts away much of this work. Unfortunately, UIScrollView can be a little tricky to get a handle on.
By default, UIScrollView does what its name describes: it lets you scroll through the contents of a larger view. You're probably familiar with its use as the superclass of UITableView, where it lets you scroll through many more rows of information than can fit onscreen at once.
To enable scrolling for a view, simply make it a subview of a UIScrollView, set the contentSize property of the UIScrollView to the size of your subview, and set the UIScrollView's scrollEnabled property to YES. If you like, you can create an inset for your view, so that there's a buffer around its edges, by setting the contentInset property to a UIEdgeInsets struct (which lets you specify top, bottom, left, and right insets).
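Putting this together, basic scrolling setup might look like the following sketch, where largeContentView is an assumed existing view larger than the screen:

```objc
// Wrap a large content view in a scroll view (iPhone OS era, manual retain/release)
UIScrollView *scrollView = [[UIScrollView alloc] initWithFrame:CGRectMake(0.0f, 0.0f, 320.0f, 480.0f)];
scrollView.scrollEnabled = YES;
// Tell the scroll view how large the scrollable area is
scrollView.contentSize = largeContentView.bounds.size;
// Leave a 10-point buffer around every edge of the content
scrollView.contentInset = UIEdgeInsetsMake(10.0f, 10.0f, 10.0f, 10.0f);
[scrollView addSubview:largeContentView];
[self.view addSubview:scrollView];
[scrollView release];
```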
If you have views within the scroll view that you'd like to have respond to touch events, two UIScrollView properties control how those touches are delivered. By default, there is a slight delay between a touch starting and it being passed on to your subview; the scroll view uses this delay to determine whether the user is simply touching your subview or initiating a scrolling motion. This delay can be disabled by setting the delaysContentTouches property of the scroll view to NO. Additionally, the canCancelContentTouches property (YES by default) lets the scroll view cancel touches it has already delivered to your subview once it recognizes a scrolling motion; set it to NO if your subview should keep tracking those touches itself.
You can determine the origin of the currently displayed rectangular portion of your scroll view by querying the contentOffset property of the scroll view. You can even cause your view to be scrolled to a particular position using -setContentOffset:animated: with the latter argument being YES.
Several aspects of the scrolling process can be tweaked, such as the decelerationRate (acceptable values include UIScrollViewDecelerationRateNormal and UIScrollViewDecelerationRateFast), whether or not the scrolled view bounces back when it hits an edge (using the bounces property), the indicatorStyle (values include UIScrollViewIndicatorStyleDefault, UIScrollViewIndicatorStyleBlack, and UIScrollViewIndicatorStyleWhite), and the insets for the scroll indicators (using the scrollIndicatorInsets property). If you want to let the user know that the view in front of them is a scroll view, you may wish to use -flashScrollIndicators to subtly bring up the scroll bars on the first appearance of a view.
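As a brief sketch, these behavior and appearance tweaks might be applied as follows (the scrollView variable is an assumed existing instance):

```objc
// Tweaking scrolling behavior and indicator appearance on an existing UIScrollView
scrollView.decelerationRate = UIScrollViewDecelerationRateFast;
scrollView.bounces = YES;
scrollView.indicatorStyle = UIScrollViewIndicatorStyleWhite;
scrollView.scrollIndicatorInsets = UIEdgeInsetsMake(5.0f, 0.0f, 5.0f, 0.0f);
// Briefly show the indicators as a hint that this view scrolls
[scrollView flashScrollIndicators];
```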
However, a UIScrollView can do more than just scroll: it can also handle pinch-zooming of your content. To enable zooming, you will first need to set the contentSize of the scroll view to the size of the subview, along with a maximumZoomScale and a minimumZoomScale. Next, you will need to set a delegate for the UIScrollView and have that delegate conform to the UIScrollViewDelegate protocol. The two delegate methods that need to be implemented are
- (UIView *)viewForZoomingInScrollView:(UIScrollView *)scrollView;
which should return your subview that will be zoomed, and
- (void)scrollViewDidEndZooming:(UIScrollView *)scrollView withView:(UIView *)view atScale:(float)scale;
which is where you will do any cleanup or scaling of the views needed after the pinch gesture is finished.
In actuality, you don't need this second method to get zooming to work. By default, the UIScrollView will apply a scaling transform to the subview being zoomed, which is a very fast operation that can be done on the GPU.
One thing you will notice if you just use the default zoom scaling is that your content will get blurry as it zooms in and out. This is because it is just being transformed, not crisply rerendered at the new zoom factor. Making sure that your view is properly rerendered is a little trickier.
During a pinch-zooming event, UIScrollView will take the view you specified in the -viewForZoomingInScrollView: delegate method and apply a transform to it. This transform provides a smooth scaling of the view without having to redraw it each frame. At the end of the zoom operation, your delegate method -scrollViewDidEndZooming:withView:atScale: will be called and give you a chance to do a more high-quality rendering of your view at the new scale factor. Generally, it's suggested that you reset the transform on your view to be CGAffineTransformIdentity (a scale factor of 1.0) and then have your view manually redraw itself at the new size scale.
However, this causes a problem because UIScrollView doesn't monitor the content view transform, so on the next zoom operation it sets the transform of the content view to whatever the overall scale factor is. Since you've manually redrawn your view at the last scale factor, it compounds the scaling, which will cause the view to jump around.
As a workaround, you can use a UIView subclass for the content view with the following methods defined:
- (void)setTransformWithoutScaling:(CGAffineTransform)newTransform
{
[super setTransform:newTransform];
}
- (void)setTransform:(CGAffineTransform)newValue
{
[super setTransform:CGAffineTransformScale(newValue, 1.0f / previousScale, 1.0f / previousScale)];
}
where previousScale is a float instance variable of the view (which should be initialized to 1.0). You can then implement the zooming delegate method as follows:
- (void)scrollViewDidEndZooming:(UIScrollView *)scrollView withView:(UIView *)view atScale:(float)scale
{
[contentView setTransformWithoutScaling:CGAffineTransformIdentity];
// Code to manually redraw view at new scale here
contentView.previousScale = scale;
scrollView.contentSize = contentView.frame.size;
}
By doing this, the transforms sent to the content view are adjusted based on the scale at which the view was last redrawn. When the pinch-zooming is done, the transform is reset to a scale of 1.0 by bypassing the adjustment in the normal -setTransform: method. This seems to provide the correct scaling behavior while letting you draw a crisp view at the completion of a zoom.
As of iPhone OS 3.0, you can programmatically set the zoom scale through -setZoomScale:animated: or -zoomToRect:animated:.
Accelerometer
Every iPhone OS device comes with a built-in three-axis accelerometer. This accelerometer provides another means of interaction with the device, beyond simply touching the screen. Many games use this as a primary form of control, but there are practical applications for accelerometer readings, including simulating a bubble level, acting as a pedometer, performance testing a car, etc.
As its name suggests, an accelerometer measures acceleration of the device in a certain direction. It's important to note that this acceleration also includes gravitational acceleration, which you either must take into account or use to determine device orientation. The iPhone has accelerometers that measure in three axes, giving you feedback on the three-dimensional forces acting on the device.
Handling rotation
At the highest level, the iPhone will detect when the device has been rotated to a different orientation. Methods on UIViewController are used to determine what to do in this case. If you override
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation
you can return YES or NO to indicate whether your view controller supports rotation to the new orientation of the device. Possible values include UIInterfaceOrientationPortrait, UIInterfaceOrientationPortraitUpsideDown, UIInterfaceOrientationLandscapeLeft, and UIInterfaceOrientationLandscapeRight. If you return NO for a particular orientation, your view controller will prevent autorotation to that orientation. In a hierarchy of visible view controllers, if any one of them returns NO from this method, no rotation will take place within the entire interface.
Two helpful macros are provided for you, UIInterfaceOrientationIsPortrait() and UIInterfaceOrientationIsLandscape(), to help you filter on the two groupings of rotation positions.
When rotation is about to begin, the following method will be called on your UIViewController:
- (void)willRotateToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation duration:(NSTimeInterval)duration
You can override this to disable something that might not work well with the rotation animation.
When the rotation has finished, the following method will be called:
- (void)didRotateFromInterfaceOrientation:(UIInterfaceOrientation)fromInterfaceOrientation
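For example, you might pause an ongoing animation while the rotation animation runs and resume it afterward. A sketch, where -pauseAnimations and -resumeAnimations are assumed helper methods on your view controller:

```objc
// Suspend work that would fight with the rotation animation
- (void)willRotateToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation duration:(NSTimeInterval)duration
{
    [self pauseAnimations]; // assumed helper
}

// Pick the work back up once the interface has settled
- (void)didRotateFromInterfaceOrientation:(UIInterfaceOrientation)fromInterfaceOrientation
{
    [self resumeAnimations]; // assumed helper
}
```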
iPhone OS 3.0 introduced a means of overriding the autorotation animation by letting you override
- (void)willAnimateRotationToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation duration:(NSTimeInterval)duration
Additionally, you can return custom views from
- (UIView *)rotatingFooterView
- (UIView *)rotatingHeaderView
to provide views that will slide in and out from the top and bottom of the screen during autorotation.
You can simulate rotation changes within the iPhone Simulator by choosing the Hardware | Rotate Left and Hardware | Rotate Right menu items, or by using the Command-Left Arrow and Command-Right Arrow keyboard shortcuts.
If you want to force your application to start in landscape and stay there, you need to do a little extra setup. First, within your Info.plist, you'll want to add lines like the following:
<key>UIInterfaceOrientation</key>
<string>UIInterfaceOrientationLandscapeRight</string>
Then, within your application delegate, you'll want to implement the following delegate method:
- (void)application:(UIApplication *)application willChangeStatusBarOrientation:(UIInterfaceOrientation)newStatusBarOrientation duration:(NSTimeInterval)duration
{
if ((newStatusBarOrientation == UIInterfaceOrientationPortrait) || (newStatusBarOrientation == UIInterfaceOrientationPortraitUpsideDown))
[application setStatusBarOrientation:UIInterfaceOrientationLandscapeRight animated:NO];
}
I've found this to be needed in some cases to override a rotation glitch with the iPhone Simulator.
Finally, you'll want to override the correct method to allow autorotation only in the landscape orientations:
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation
{
return UIInterfaceOrientationIsLandscape(interfaceOrientation);
}
Motion events
At a slightly higher level, iPhone OS 3.0 added support for motion events, which are gestures triggered by motion detected by the accelerometer. If your view or view controller is the first responder (something we discussed when dealing with undo / redo and copy / paste), it will get sent the following messages when motion events occur:
- (void)motionBegan:(UIEventSubtype)motion withEvent:(UIEvent *)event
- (void)motionEnded:(UIEventSubtype)motion withEvent:(UIEvent *)event
- (void)motionCancelled:(UIEventSubtype)motion withEvent:(UIEvent *)event
where the UIEventSubtype can be UIEventSubtypeMotionNone for a general movement event or UIEventSubtypeMotionShake for a detected shake gesture.
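For example, a view controller that wants to respond to a shake (say, to trigger an undo) could become first responder and implement -motionEnded:withEvent: as in this sketch:

```objc
// The view controller must be first responder to receive motion events
- (BOOL)canBecomeFirstResponder
{
    return YES;
}

- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];
    [self becomeFirstResponder];
}

- (void)motionEnded:(UIEventSubtype)motion withEvent:(UIEvent *)event
{
    if (motion == UIEventSubtypeMotionShake)
    {
        NSLog(@"Shake detected");
    }
}
```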
Raw accelerometer data
Beyond basic interface interaction, you may want to read the raw data from the accelerometers to determine motion of the device or its current orientation with respect to the ground. Reading data from the accelerometer is simple: you just need to grab the shared UIAccelerometer instance, set its sample rate, and set a delegate to handle acceleration callbacks:
UIAccelerometer* theAccelerometer = [UIAccelerometer sharedAccelerometer];
theAccelerometer.updateInterval = 1.0 / 50.0;
theAccelerometer.delegate = self;
In this case, we're setting a sample rate of 50 Hz (50 samples per second). Apple recommends using the lowest rate that suits your needs: roughly 10 to 20 Hz is enough for determining orientation, 30 to 60 Hz for games and other real-time input, and 70 to 100 Hz for detecting high-frequency motion.
The one delegate method that needs to be implemented to handle accelerometer events is
- (void)accelerometer:(UIAccelerometer *)accelerometer didAccelerate:(UIAcceleration *)acceleration
This passes in a UIAcceleration instance, which has four properties: x, y, z, and timestamp. The first three are the accelerations measured along the device's three axes, in units of g, and timestamp is the relative time of the measurement.
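A minimal delegate implementation might just log the readings; with the device lying face up at rest, you would expect to see roughly x = 0, y = 0, z = -1 (gravity alone):

```objc
// Called at the updateInterval frequency set on the shared UIAccelerometer
- (void)accelerometer:(UIAccelerometer *)accelerometer didAccelerate:(UIAcceleration *)acceleration
{
    // Values are in units of g; a real application would store or filter
    // these rather than just logging them
    NSLog(@"X: %f, Y: %f, Z: %f at %f", acceleration.x, acceleration.y,
          acceleration.z, acceleration.timestamp);
}
```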
You can use these values directly, but you'll notice that the sensors are fairly noisy. To deal with this, a common solution is to use low-pass or high-pass filters to determine constant acceleration (the gravitational acceleration vector) or instantaneous acceleration (the movement of the device). These terms are commonly found in electrical circuitry, where they refer to circuits that allow only low-frequency or high-frequency signals through, and filter out the rest.
A sample low-pass filter is as follows:
UIAccelerationValue lowPassFilteredXAcceleration = (currentXAcceleration * kLowPassFilteringFactor) + (previousLowPassFilteredXAcceleration * (1.0 - kLowPassFilteringFactor));
UIAccelerationValue lowPassFilteredYAcceleration = (currentYAcceleration * kLowPassFilteringFactor) + (previousLowPassFilteredYAcceleration * (1.0 - kLowPassFilteringFactor));
UIAccelerationValue lowPassFilteredZAcceleration = (currentZAcceleration * kLowPassFilteringFactor) + (previousLowPassFilteredZAcceleration * (1.0 - kLowPassFilteringFactor));
This tries to smooth out the measured acceleration by making it part of a rolling average with the previously measured accelerations. The filtering factor can be tuned to the needs of your particular application.
Note that UIAccelerationValue is simply a typedef for double.
Similarly, a high-pass filter can pick out instantaneous changes in the acceleration simply by subtracting the low-pass-filtered values:
UIAccelerationValue highPassFilteredXAcceleration = currentXAcceleration - lowPassFilteredXAcceleration;
UIAccelerationValue highPassFilteredYAcceleration = currentYAcceleration - lowPassFilteredYAcceleration;
UIAccelerationValue highPassFilteredZAcceleration = currentZAcceleration - lowPassFilteredZAcceleration;
These are crude filtering approaches, but they can be effective for many applications. Several other filtering algorithms have been applied to this problem, of varying complexity.