Problem

Mobile devices should react to natural user behaviors and to the context in which the device is used, sensing and responding to movement, proximity and orientation rather than relying on deliberate input alone.

Solution

Kinesthetics is the ability to detect movement of the body; while usually applied in a self-aware sense, here it refers to the mobile device using sensors to detect and react to proximity, action and orientation.

The most common sensor is the accelerometer, which, as the name implies, measures acceleration along a single axis. As used in modern mobile devices, accelerometers are very compact, integrated microelectromechanical systems (MEMS), usually with detectors for all three axes mounted on a single frame.
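
To make this concrete, here is a minimal sketch of reading the three-axis accelerometer on Android via the standard SensorManager API. The class name and the reaction logic are placeholders for illustration; only the platform calls are as-documented.

    import android.content.Context
    import android.hardware.Sensor
    import android.hardware.SensorEvent
    import android.hardware.SensorEventListener
    import android.hardware.SensorManager

    // Minimal accelerometer reader: one listener, all three axes.
    class AccelerometerReader(context: Context) : SensorEventListener {

        private val sensorManager =
            context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
        private val accelerometer: Sensor? =
            sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)

        fun start() {
            accelerometer?.let {
                sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_NORMAL)
            }
        }

        fun stop() = sensorManager.unregisterListener(this)

        override fun onSensorChanged(event: SensorEvent) {
            // values[0..2] are acceleration along x, y and z in m/s^2, gravity
            // included; a device lying flat reports roughly 9.8 on the z axis.
            val x = event.values[0]
            val y = event.values[1]
            val z = event.values[2]
            // React to movement here.
        }

        override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) { /* unused */ }
    }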

In certain circles, the term "accelerometer" is coming to conflate hardware with behavior, much as "GPS" has become shorthand for location sensing in general. Other sensors can and should be used to detect kinesthetic gestures, including cameras, proximity sensors, magnetometers (compasses), audio, and close-range radios such as NFC, RFID and Bluetooth. All of these sensors should be used in coordination with one another.
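
On Android, a quick way to see which of these inputs a particular device actually offers is to enumerate its sensors; the list varies widely by hardware. A brief sketch:

    import android.content.Context
    import android.hardware.Sensor
    import android.hardware.SensorManager

    // Print every sensor the device exposes: accelerometers, proximity,
    // magnetometers and more, depending on hardware.
    fun listSensors(context: Context) {
        val sensorManager =
            context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
        for (sensor in sensorManager.getSensorList(Sensor.TYPE_ALL)) {
            println("${sensor.name} (type ${sensor.type})")
        }
    }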

Kinesthetic gesturing is largely a matter of detecting incidental or natural movements and reacting in appropriate or expected ways. Unlike on-screen gestures -- which form a language and can easily become abstracted -- kinesthetic gesturing is about context. Is the device moving? In what manner? In relation to what? How close is it to the user, or to other devices?

Design of mobile devices should consider what various types of movement, proximity and orientation mean, and behave appropriately.

Specific subsets of device movement such as Orientation have specialized behaviors and are covered separately. Location is also covered separately. Subsidiary senses of position, such as those used in augmented reality and at close range (e.g. within-building location), may have some overlap, but are not yet established enough for patterns to emerge.

The use of a secondary device for communicating gesture, such as a game controller, is covered under the Remote Gestures pattern.

Variations

Only some of the variations of this pattern are explicitly kinesthetic and sense user movement; others primarily detect position, such as proximity to other devices. Movement variations rely on natural gestures, but may require some arbitrary mapping to behaviors, which users may have to learn. An example is the relatively common behavior of shaking to clear, reset or delete items.
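
A common way to implement shake detection is to watch for total acceleration well above gravity, and to debounce so that one physical shake fires only once. The threshold and timing below are illustrative assumptions, not canonical values; the listener registers against the accelerometer exactly as in the earlier sketch.

    import android.hardware.Sensor
    import android.hardware.SensorEvent
    import android.hardware.SensorEventListener
    import android.hardware.SensorManager
    import kotlin.math.sqrt

    // Fires onShake() when acceleration spikes well above 1 g.
    class ShakeDetector(private val onShake: () -> Unit) : SensorEventListener {
        private var lastShakeMs = 0L

        override fun onSensorChanged(event: SensorEvent) {
            val x = event.values[0]
            val y = event.values[1]
            val z = event.values[2]
            // Total acceleration in units of g (1 g = device at rest).
            val gForce = sqrt(x * x + y * y + z * z) / SensorManager.GRAVITY_EARTH
            val now = System.currentTimeMillis()
            if (gForce > 2.5f && now - lastShakeMs > 1000) { // assumed threshold/debounce
                lastShakeMs = now
                onShake() // e.g. offer to clear, reset or delete
            }
        }

        override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) { /* unused */ }
    }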

Proximity to signals or other environmental conditions more than a few inches away is not yet widely used. When radio, audio or visual cues become commonly available, they are likely to be used in a similarly contextual manner.

Interaction Details

The core of designing Kinesthetic Gesture interactions lies in determining which gestures and sensors are to be used -- not a single gesture in isolation, but the combination of inputs that together detect the intended behavior.

Almost all kinesthetic gestures are, in practice, combination gestures; see the discussion of device orientation under "Variations." Combinations may also draw on sensors outside this pattern, such as GPS: when the user walks to a car, the accelerometer detects walking, and GPS then determines that the device is moving at vehicle speeds. Such combinations are evaluated not momentarily but over time, which means the device must monitor its sensors more or less full time; this continuous monitoring raises privacy concerns that must be addressed.
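
Here is a sketch of such a combination evaluated over time, under stated assumptions: accelerometer processing (not shown) decides whether the user appears to be moving at all, and only then is location speed consulted to distinguish walking from vehicle travel. ActivityState and the 6 m/s cutoff are illustrative, not platform constants.

    import android.location.Location

    enum class ActivityState { STILL, WALKING, DRIVING }

    class MovementClassifier {
        private var accelerometerSaysMoving = false
        var state = ActivityState.STILL
            private set

        // Fed from accelerometer processing, e.g. the reader shown earlier.
        fun onMovementDetected(moving: Boolean) {
            accelerometerSaysMoving = moving
            if (!moving) state = ActivityState.STILL
        }

        // Fed from a LocationListener; Location.getSpeed() is meters/second.
        fun onLocation(location: Location) {
            if (!accelerometerSaysMoving) return
            state = if (location.hasSpeed() && location.speed > 6f) {
                ActivityState.DRIVING // well above walking pace (assumption)
            } else {
                ActivityState.WALKING
            }
        }
    }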

User movement (swinging the device up, for example) and user proximity (to the user's head) have to work together; otherwise the device might lock its screen whenever it is waved past a table. Combinations add intelligence, making actions more contextually accurate. Sensors can also be activated based on other sensors, to reduce processing and battery consumption. Proximity detectors, for example, can be activated only when an appropriate movement type is detected.
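
A sketch of that gating, assuming the raise-gesture analysis exists elsewhere: the always-running accelerometer decides when the proximity sensor is worth registering at all. The hook looksLikeRaiseGesture is a hypothetical placeholder for real movement logic.

    import android.content.Context
    import android.hardware.Sensor
    import android.hardware.SensorEvent
    import android.hardware.SensorEventListener
    import android.hardware.SensorManager

    // Registers the proximity sensor only while a raise gesture seems to be
    // in progress, so it draws no power the rest of the time.
    class GatedProximity(context: Context) : SensorEventListener {
        private val sensorManager =
            context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
        private val proximity: Sensor? =
            sensorManager.getDefaultSensor(Sensor.TYPE_PROXIMITY)
        private var active = false

        // Called from accelerometer processing elsewhere (hypothetical hook).
        fun onMovement(looksLikeRaiseGesture: Boolean) {
            if (looksLikeRaiseGesture && !active && proximity != null) {
                sensorManager.registerListener(
                    this, proximity, SensorManager.SENSOR_DELAY_NORMAL)
                active = true
            } else if (!looksLikeRaiseGesture && active) {
                sensorManager.unregisterListener(this) // save power when idle
                active = false
            }
        }

        override fun onSensorChanged(event: SensorEvent) {
            val nearHead = event.values[0] < event.sensor.maximumRange
            if (nearHead) {
                // Movement and proximity agree: safe to blank the screen.
            }
        }

        override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) { /* unused */ }
    }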

Kinesthetic gestures are not as ubiquitous as On-Screen Gestures, so they may be unexpected by some users. Settings should be provided to disable (or enable) specific behaviors.
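
A minimal sketch of honoring such a setting, with an assumed preference key; each gesture handler checks its toggle before acting.

    import android.content.Context

    // "shake_to_clear_enabled" is an assumed key; default the gesture off
    // so users opt in rather than being surprised by it.
    fun handleShake(context: Context, clearForm: () -> Unit) {
        val prefs = context.getSharedPreferences("gesture_settings", Context.MODE_PRIVATE)
        if (prefs.getBoolean("shake_to_clear_enabled", false)) {
            clearForm()
        }
    }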

Presentation Details

When reacting to a proximate device, present the relationship on screen whenever possible. When the relative position is known, display related items on the side of the screen adjacent to the neighboring device.

Antipatterns

Examples