Problem

Mobile devices should react to user behaviors, movements and the relationship of the device to the user in a natural and understandable manner.

Solution

Kinesthetics is the ability to detect movement of the body; while the term usually applies to self-awareness of one's own movement, here it refers to the mobile device using sensors to detect and react to proximity, action and orientation.

The most common sensor is the accelerometer, which, as the name suggests, measures acceleration along a single axis. As used in modern mobile devices, accelerometers are very compact, integrated micro-electro-mechanical systems (MEMS), usually with detectors for all three axes mounted on a single frame.
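
Three-axis readings can be combined into simple derived signals. A minimal sketch, assuming readings in m/s² and the common smartphone axis convention in which +z points out of the screen (so a device lying screen-down at rest reads roughly (0, 0, -9.81)); the tolerance value is illustrative:

```python
import math

GRAVITY = 9.81  # m/s^2

def magnitude(x, y, z):
    """Total acceleration across the three axes."""
    return math.sqrt(x * x + y * y + z * z)

def is_face_down(x, y, z, tolerance=1.5):
    """True when the device is roughly at rest and gravity pulls along
    the negative z axis, i.e. the screen faces the table."""
    at_rest = abs(magnitude(x, y, z) - GRAVITY) < tolerance
    return at_rest and z < -(GRAVITY - tolerance)
```

A reading of (0.1, -0.2, -9.7) would be classified as face-down; one of (0, 0, 9.81) would not.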

In certain circles, the term "accelerometer" is beginning to conflate hardware with behavior, much as "GPS" has become shorthand for location. Other sensors can and should be used to detect kinesthetic gestures, including cameras, proximity sensors, magnetometers (compasses), audio, and close-range radios such as NFC, RFID and Bluetooth. All these sensors should be used in coordination with each other.

Kinesthetic gesturing is largely a matter of detecting incidental or natural movements and reacting in appropriate or expected manners. Unlike on-screen gestures -- which form a language and can become abstracted very easily -- kinesthetic gesturing is about context. Is the device moving? In what manner? In relation to what? How close to the user, or to other devices?

Design of mobile devices should consider what various types of movement, proximity and orientation mean, and behave appropriately.

Specific subsets of device movement, such as Orientation, have specialized behaviors and are covered separately. Location is also covered separately. Subsidiary senses of position, such as those used in augmented reality and at close range (e.g. within-building location), may overlap somewhat, but are not yet established enough for patterns to emerge.

The use of a secondary device for communicating gesture, such as a game controller, is covered under the Remote Gestures pattern.

Variations

Only some of the variations of this pattern are explicitly kinesthetic, sensing user movement. Others primarily detect position, such as proximity to another device...

Proximity to signals or other environmental conditions more than a few inches away is not widely used as yet. When radio, audio or visual cues become commonly available, they are likely to be used in a contextually related manner.

These methods of gesturing can initiate actions in three categories:

- State change -- incidental movement switches the device between modes, such as locking the keypad when the device is raised to the ear during a call.
- Process initiation -- a deliberate gesture starts a process, such as tapping an NFC reader to begin a payment.
- Control -- continuous movement drives the interface directly, such as tilting to scroll a page.

Interaction Details

The core of designing Kinesthetic Gesture interactions is determining which gestures and sensors are to be used -- not a single sensor, but the combination that can most accurately and reliably detect the expected condition. For example, it is a good idea to lock the keypad from input when the device is placed to the ear during a call. Any one sensor alone would make this unreliable or unusable, but by combining accelerometers (the device moves toward the head) with proximity sensors or cameras (at the appropriate point in the movement, an object of head size approaches), this can be made extremely reliable.
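
The ear-lock example above can be sketched as a fusion rule: lock only when several independent signals agree, so that no single noisy sensor triggers (or misses) the lock. The field names here are illustrative placeholders for derived sensor signals, not a real device API:

```python
from dataclasses import dataclass

@dataclass
class SensorSnapshot:
    # Hypothetical derived signals; names are illustrative only.
    moving_toward_head: bool   # inferred from the accelerometer trajectory
    proximity_near: bool       # proximity sensor reports a close object
    in_call: bool              # call state from the telephony layer

def should_lock_keypad(s: SensorSnapshot) -> bool:
    """Lock the keypad only when all independent signals agree."""
    return s.in_call and s.moving_toward_head and s.proximity_near
```

Requiring agreement between sensors trades a little sensitivity for a large gain in reliability, which is the right trade for a state change the user never explicitly requested.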

Kinesthetic Gestures reinforce other contextual use of devices. Do not forget to integrate them with other known information from external sources, location services and other observable behaviors. A meeting can be surmised from travel to a meeting location, at the time scheduled, then being placed on a table. The device could then behave appropriately, and switch to a quieter mode even if not placed face-down.

Context must be used to load correct conditions and as much relevant information as possible. When proximity to another device opens a Bluetooth exchange application, it must also automatically request to connect to the other device (not simply load a list of cryptic names from which the user must choose), and open the service type being requested.

All State Change gestures should be reversible, usually with a directly opposite gesture. Removing the phone from the ear must also be sensed, so reliably and quickly that the device unlocks immediately and there is no delay in its use.
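
The reversibility requirement can be sketched as a small state holder in which the opposite gesture directly undoes the state, and a non-gestural fallback does the same. This is an illustrative sketch, not a real platform API:

```python
class KeypadLockState:
    """Reversible state-change gesture: the gesture that undoes a
    state is the direct opposite of the one that set it."""

    def __init__(self):
        self.locked = False

    def on_raised_to_ear(self):
        # Gesture sets the state.
        self.locked = True

    def on_removed_from_ear(self):
        # The opposite gesture reverses it immediately.
        self.locked = False

    def on_unlock_button(self):
        # Non-gestural fallback in case the sensors miss the removal.
        self.locked = False
```

Note that the fallback path reaches exactly the same state as the gesture, so the gesture remains a shortcut rather than the only route.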

Some indication of the gesture must be presented visually. Very often, this is not explicit (such as an icon or overlay) but switches modes at a higher level. When a device is locked due to being placed face-down, or near the ear, the screen is also blanked. Aside from saving power when there is no way to read the screen anyway, this serves to signal that input is impossible in the event that it does not unlock immediately.

Non-gestural methods must be available to remove or reverse state-changing gestures. For example, if sensors do not unlock the device in the examples above, the conventional unlock method (e.g. power/lock button) must be enabled. Generally, this means that Kinesthetic Gestures should simply be shortcuts to existing features.

For Process Initiation, usually used with transactional features such as NFC payment, the gesture toward the reader should be treated the same as the user selecting an on-screen shortcut; it is simply the first step of the process. Additional explicit actions must be taken to confirm the remainder of the transaction.
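
That two-step structure -- gesture opens, explicit action confirms -- can be sketched as a tiny state machine. The class and state names are hypothetical, not a real payment API:

```python
class PaymentFlow:
    """Sketch of Process Initiation: the tap gesture only opens the
    transaction; an explicit confirmation completes it."""

    def __init__(self):
        self.state = "idle"

    def on_tap_reader(self):
        # The gesture acts like selecting an on-screen shortcut:
        # it merely starts the process.
        if self.state == "idle":
            self.state = "awaiting_confirmation"

    def confirm(self):
        # Only a deliberate, explicit action completes the transaction.
        if self.state == "awaiting_confirmation":
            self.state = "completed"

    def cancel(self):
        self.state = "idle"
```

Because `confirm()` does nothing from the idle state, the transaction cannot complete on the strength of the gesture alone.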

Kinesthetic gestures are not as ubiquitous as On-Screen Gestures, so they may be unexpected to some users. Settings should be provided to disable (or enable) certain behaviors.

Presentation Details

When reacting to another device, such as an NFC reader, the positional relationship between the devices should be implied (or explicitly communicated) by position of elements on the screen.

When gesture initiates a process, Tones, Voice Notifications and Haptic Output should be used, along with on-screen displays, to notify the user that the application has opened. Especially for functions with financial liability, this will reassure the user that the device cannot take action without them.

When a sensor's location is obvious, an LED may be placed on or adjacent to the apparent location of the sensor. For the NFC reader example, an icon will often accompany the sensor and can additionally be illuminated when it is active. This works in much the same manner as a recording light on a camera, so that surreptitious use of the sensor is avoided.

Use of Kinesthetic Gestures for control, such as scrolling a page, should include an on-screen indication that the equivalent of a scroll input has been made. This reinforces the reason the device is scrolling and gives the user something to interact with. Especially if the indicator shows the degree of the gesture, the user can gauge not just the output (scroll speed) but the gesture itself, and so has a direct way to control it.
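
A tilt-to-scroll mapping of this kind typically needs a dead zone, so incidental movement does not scroll, and a cap on useful tilt. A minimal sketch; the threshold values are illustrative, not drawn from any particular device:

```python
def tilt_to_scroll_speed(tilt_degrees, dead_zone=5.0,
                         max_tilt=45.0, max_speed=10.0):
    """Map device tilt to a scroll speed (e.g. lines per second).
    The sign gives direction; tilts inside the dead zone do nothing."""
    if abs(tilt_degrees) < dead_zone:
        return 0.0
    sign = 1.0 if tilt_degrees > 0 else -1.0
    # Scale the usable range (dead_zone..max_tilt) onto 0..max_speed.
    usable = min(abs(tilt_degrees), max_tilt) - dead_zone
    return sign * max_speed * usable / (max_tilt - dead_zone)
```

Exposing the same value to an on-screen indicator lets the user see the degree of the gesture as well as its effect, which is what makes the control learnable.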

Kinesthetic Gestures are not expected in mobile devices, so users will have to be educated as to their purpose, function and interaction. This can be done with:

Antipatterns

Whole-device Kinesthetic Gestures can also be used as a pointer or as a method of scrolling. These are not yet consistently applied, so they require instruction, icons and overlays to explain and indicate when the effect is in use, and controls to disable them when not desired. Their use in games (think of rolling-a-ball-into-a-pocket games) trades on the difficulty of control; this should indicate that using gestural pointing as a matter of course can be troublesome.

Avoid using graphics which indicate position when they cannot be made to reflect the actual relationship. For example, do not use a graphic that shows data transferring between devices next to each other, if the remote device can only connect through a radio clearly located on the bottom of the device.

Use caution when designing these gestures. Actions and their gestures must be carefully mapped to be sensible, and must not conflict with other gestures. "Shake to send," for example, makes no sense, but a "flick away" gesture to send data would work: like casting a fishing line, it implies that something departs the user's device and is sent into the distance. Since information is often visualized as an object, extending the object metaphor tends to work.

Examples