This article is based on iOS 4 in Action, to be published in June 2011, and is reproduced here by permission from Manning Publications. Manning publishes MEAP (Manning Early Access Program) eBooks and pBooks. MEAPs are sold exclusively through Manning.com. All pBook purchases include free PDF, mobi, and epub formats; when mobile formats become available, all customers will be contacted and upgraded. Visit Manning.com for more information.
Recognizing Simple Accelerometer Movement
Introduction
If you want to write programs that use acceleration gestures, we suggest that you download the Accelerometer Graph program from Apple's developer site. It's a nice, simple example of accelerometer use; more important, it gives you a clear display of what the accelerometers report as you make different gestures. Make sure you enable the high-pass filter to get the clearest results.
Figure 1 shows what the Accelerometer Graph looks like in use (but without movement occurring). As you move the device around, you’ll quickly come to see how the accelerometers respond.
Figure 1 The Accelerometer Graph shows movement in all three directions.

Here are some details you'll notice about how the accelerometers report information when you look at the Accelerometer Graph:
- Most gestures cause all three accelerometers to report force; the largest force should usually be on the axis of main movement.
- Even though there’s usually a compensating stop force, the start force is typically larger and shows the direction of main movement.
- Casual movement usually results in forces of .1 g to .5 g.
- Slightly forceful movement usually tops out at 1 g.
- A shake or other more forceful action usually results in a 2 g force. (A rough way to turn these magnitudes into code is sketched after this list.)
- The accelerometers can show things other than simple movement. For example, when you’re walking with an iPhone or iPad, you can see the rhythm of your pace in the accelerometers.
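If you want to translate those rough magnitudes into code, a simple classifier might look like the following sketch. It assumes the inputs have already been high-pass filtered (as listing 1 below does), so gravity isn't counted as movement; the function name and the exact cutoffs are ours, not Apple's or the book's.

#import <Foundation/Foundation.h>
#include <math.h>

// Hypothetical helper: classify an already-filtered acceleration vector
// using the approximate magnitudes observed in the Accelerometer Graph.
static NSString *RoughForceDescription(double moveX, double moveY, double moveZ) {
    double magnitude = sqrt(moveX * moveX + moveY * moveY + moveZ * moveZ);
    if (magnitude >= 2.0) return @"Shake or other forceful action";
    if (magnitude >= 1.0) return @"Slightly forceful movement";
    if (magnitude >= 0.1) return @"Casual movement";
    return @"Little or no movement";
}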
All of this suggests a simple methodology for detecting basic accelerometer movement: monitor the accelerometer over the course of a movement, saving the largest acceleration you see. When the movement has ended, report the axis (and sign) of that largest acceleration as the direction of movement.
Listing 1 puts these lessons together in a program that could easily be used to report the direction of the device’s movement (which you could then use to take some action).
Listing 1 Movement reporter that could be applied as a program controller
- (void)accelerometer:(UIAccelerometer *)accelerometer
    didAccelerate:(UIAcceleration *)acceleration {

    accelX = ((acceleration.x * kFilteringFactor)              #1
        + (accelX * (1 - kFilteringFactor)));                  #1
    accelY = ((acceleration.y * kFilteringFactor)              #1
        + (accelY * (1 - kFilteringFactor)));                  #1
    accelZ = ((acceleration.z * kFilteringFactor)              #1
        + (accelZ * (1 - kFilteringFactor)));                  #1

    float moveX = acceleration.x - accelX;                     #2
    float moveY = acceleration.y - accelY;                     #2
    float moveZ = acceleration.z - accelZ;                     #2

    if (!starttime) {                                          #3
        starttime = acceleration.timestamp;                    #3
    }                                                          #3

    if (acceleration.timestamp > starttime + 1
        && (fabs(moveX) >= .3 || fabs(moveY) >= .3
            || fabs(moveZ) >= .3)) {

        if (fabs(moveX) > fabs(moveVector)) {
            moveVector = moveX;                                #4
            moveDir = (moveVector > 0 ? @"Right" : @"Left");
        }
        if (fabs(moveY) > fabs(moveVector)) {
            moveVector = moveY;                                #4
            moveDir = (moveVector > 0 ? @"Up" : @"Down");
        }
        if (fabs(moveZ) > fabs(moveVector)) {
            moveVector = moveZ;                                #4
            moveDir = (moveVector > 0 ? @"Forward" : @"Back");
        }
        lasttime = acceleration.timestamp;

    } else if (moveVector && acceleration.timestamp > lasttime + .1) {
        myReport.text = [moveDir stringByAppendingFormat:
            @": %f.", moveVector];
        moveDir = [NSString string];
        moveVector = 0;
    }
}

#1 Gathers filtered info
#2 Measures movement
#3 Marks start time
#4 Saves largest movements
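Listing 1 shows only the delegate method; it assumes a handful of instance variables, a kFilteringFactor constant, and a class that has registered itself as the accelerometer's delegate. Here is a minimal sketch of that scaffolding under those assumptions; the class name, the outlet, and the specific constant values are ours, not the book's.

#import <UIKit/UIKit.h>

#define kFilteringFactor 0.1   // typical filtering factor; adjust to taste

@interface MovementViewController : UIViewController <UIAccelerometerDelegate> {
    UIAccelerationValue accelX, accelY, accelZ;   // running low-pass values, one per axis
    UIAccelerationValue moveVector;               // largest movement seen so far
    NSString *moveDir;                            // direction of that largest movement
    NSTimeInterval starttime;                     // when updates began arriving
    NSTimeInterval lasttime;                      // last time movement was above the threshold
    IBOutlet UILabel *myReport;                   // label that displays the result
}
@end

@implementation MovementViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // Ask for updates many times a second and route them to the
    // accelerometer:didAccelerate: method shown in listing 1.
    UIAccelerometer *accelerometer = [UIAccelerometer sharedAccelerometer];
    accelerometer.updateInterval = 0.05;
    accelerometer.delegate = self;
}

// ... accelerometer:didAccelerate: from listing 1 goes here ...

@end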
You start by running each axis through a low-pass filter (#1) and then subtracting the filtered value from the raw reading (#2), which strips out gravity and leaves relatively clean movement data. Because the data can be a little dirty at the start, you don't accept any acceleration data sent in the first second (#3); you could cut this down to a mere fraction of a second.
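If it helps to see the filtering step on its own, here is the same arithmetic as a small standalone function. The name and the alpha parameter are ours; alpha plays the role of kFilteringFactor.

// The low-pass value tracks slow changes such as gravity; subtracting it
// from the raw reading leaves only quick movement (a high-pass filter).
static double HighPassFilter(double rawValue, double *lowPassed, double alpha) {
    *lowPassed = (rawValue * alpha) + (*lowPassed * (1 - alpha));
    return rawValue - *lowPassed;
}

// Usage, with accelX holding the running low-pass value for the x axis:
//   float moveX = HighPassFilter(acceleration.x, &accelX, kFilteringFactor);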
You start looking for movement whenever one of the accelerometers goes above .3 g. When that occurs, you save the direction of highest movement (#4) and keep measuring it until movement drops below .3 g. Afterwards, you make sure that at least a tenth of a second has passed so that you know you’re not in a lull during a movement.
Finally, you do whatever you want to do with your movement data. This example reports the information in a label, but you’d doubtless do something much more intricate in a live program. Cleanup is required to get the next iteration of movement reporting going.
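For instance, in place of the myReport line in listing 1, you might hand the result to an app-specific handler like the hypothetical one sketched below; the method name and the actions are ours, not the chapter's.

// Called from the else-if branch of listing 1:
//   [self handleMovementInDirection:moveDir withForce:moveVector];
- (void)handleMovementInDirection:(NSString *)direction withForce:(float)force {
    if ([direction isEqualToString:@"Right"]) {
        // e.g., advance to the next page, photo, or track in your app
    } else if ([direction isEqualToString:@"Left"]) {
        // e.g., step back to the previous one
    }
    // Cleanup: reset the tracking variables so the next gesture starts fresh.
    moveDir = [NSString string];
    moveVector = 0;
}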
This sample program works well unless the movement is very subtle. In those cases, it occasionally reports the opposite direction, because the compensating force when the device stops its motion can be larger than the starting force. If that kind of subtlety matters for your application, you need to compare the start and stop forces more carefully; when they're similar in magnitude, you'll usually want to trust the first force measured, not necessarily the biggest one (a rough sketch of that refinement follows). For the majority of cases, though, the code in listing 1 is sufficient: you now have an application that can accurately report (and take action based on) the direction of movement.
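A hedged sketch of that refinement might look like this. It isn't the book's code: firstVector and firstDir are extra instance variables you'd have to add, and the 0.75 ratio is an arbitrary guess at what counts as "similar in magnitude."

// Inside the block that already tracks the biggest movement (#4), also
// remember the first reading that crossed the threshold.
if (!firstVector) {
    firstVector = moveVector;
    firstDir = moveDir;
}

// Then, when reporting, prefer the first direction if the first and
// biggest forces are close in magnitude.
NSString *reportDir = moveDir;
if (fabs(firstVector) > 0.75 * fabs(moveVector)) {
    reportDir = firstDir;
}
myReport.text = [reportDir stringByAppendingFormat:@": %f.", moveVector];
firstDir = moveDir = [NSString string];
firstVector = moveVector = 0;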
Together, gravity and force measurement represent the most obvious things that you can do with the accelerometers, but they’re by no means the only things. We suspect that using the accelerometers to measure three-dimensional gestures will be one of their best (and most frequent) uses as the platform matures.
Summary
Accelerometers can give you access to a variety of information about where a device exists in space. By sensing gravity, you can easily discover precise orientation. By measuring movement, you can see how the device is being guided through space.