Think Wii, Kinect. These sit somewhere in between (many gestures are wrist-level, so the device is effectively used as a pointing device) and the effect is different. E.g., the system needs to have an affector (made that word up): either a cursor, or an on-screen element such as a character in a game, that is clearly under direct control of the input mechanism... etc. Easy to write an interaction pattern, I think.
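The affector idea above can be sketched as a mapping from the device's pointing direction to a screen position, so the cursor visibly tracks the remote. This is a minimal illustrative sketch, not any real Wii or Kinect API; the angle names, field-of-view range, and screen size are all assumptions.

```python
import math

# Assumed screen and pointing range -- illustrative values only.
SCREEN_W, SCREEN_H = 1920, 1080
FOV_X, FOV_Y = math.radians(40), math.radians(25)

def angles_to_cursor(yaw, pitch):
    """Map pointing angles (radians; 0,0 = aimed at screen center) to pixels.

    The cursor is the "affector": its position is a direct, continuous
    function of where the remote points, so control is clearly visible.
    """
    x = SCREEN_W / 2 + (yaw / (FOV_X / 2)) * (SCREEN_W / 2)
    y = SCREEN_H / 2 - (pitch / (FOV_Y / 2)) * (SCREEN_H / 2)
    # Clamp so the affector never leaves the screen and the user never
    # "loses" the cursor while gesturing in free space.
    return (min(max(x, 0), SCREEN_W - 1), min(max(y, 0), SCREEN_H - 1))
```

Pointing dead ahead lands the cursor at screen center; angles beyond the assumed range pin the cursor to the nearest edge instead of letting it vanish.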

Also "touchless gestures" which can be very similar to on-screen gestures, but don't make contact...

Review how the on-screen keyboard (KB) on the Wii works, for the virtual cursor and highlight functions.

Problem

Solution

The remote interaction may be with another device, or with a physical object, e.g. for AR or things like bar code scanning... maybe?

Variations

There are two facets of variation: the use and the method.

Methods divide simply between device pointing and non-device (free-space) gestures.

Interaction Details

Presentation Details

Antipatterns

Be aware of the problem of pilot-induced oscillation (PIO) and take steps to detect and alleviate it. PIO arises when the frequency of the interactive system's feedback loop matches the frequency of the user's corrective responses. Similar behaviors can arise in other input/feedback environments, but there they tend to produce single-point errors; in aircraft, the movement of the vehicle (especially in the vertical plane) lets the oscillation build, possibly to the point of loss of control. The key facet of many remote gesturing systems -- being maneuvered in three dimensions, in free space -- can lead to the same issue. PIO generally cannot be detected during design, so it must be tested for. Alleviate it by reducing control lag, or by reducing the gain of the input system. How unsuitable a given response is varies by context; games will tolerate much "twitchier" controls than pointing for entry on a virtual keyboard.

Examples

PIO Article: http://dtrs.dfrc.nasa.gov/archive/00001004/01/210389v2.pdf