[[http://www.amazon.com/gp/product/1449394639/ref=as_li_tf_tl?ie=UTF8&tag=4ourthmobile-20&linkCode=as2&camp=217145&creative=399373&creativeASIN=1449394639|{{attachment:wiki-banner-book.png|Click here to buy from Amazon.|align="right"}}]]
The varying ways in which people prefer to interact with their devices highly depend upon their natural tendencies, their comfort levels, and the context of use. As designers and developers, we need to understand these influences and offer user interfaces that appeal to these needs.
User preferences may range from inputting data using physical keys, natural handwriting, or other gestural behaviors. Some users may prefer to receive information with an eyes-off-screen approach, instead relying on haptics or audible notifications.
This part of the book will discuss in detail the different mobile methods and controls users can interact with to access and receive information.
The types of input and output we will discuss are subdivided into the following chapters:
 * Chapter 9, [[Text and Character Input]]
 * Chapter 10, [[General Interactive Controls]]
 * Chapter 11, [[Input and Selection]]
 * Chapter 12, [[Audio and Vibration]]
 * Chapter 13, [[Screens, Lights, and Sensors]]

== Types of Input & Output ==
=== Text and Character Input ===
Whether they are sending an email, sending an SMS message, searching, or filling out forms, users require ways to input both text and characters. Such methods may be through keyboards and keypads, as well as pen control. Regardless, these methods must perform efficiently while limiting input errors.

=== General Interactive Controls ===
Functions on the device and in the interface are influenced by a series of controls. They may be keys arrayed around the periphery of the device, or they may be controlled by gestural behaviors. Users must be able to find, understand, and easily learn these control types.

=== Input and Selection ===
Users require methods to enter and remove text and other character-based information without restriction. Many times users are filling out forms or selecting information from lists. At any time, they may also need to make quick, easy changes to remove contents from these fields or from entire forms.

=== Audio and Vibration ===
Our mobile devices are not always in plain sight. They may be across the room, or placed deep in our pockets. When important notifications occur, users need to be alerted. Using audio and vibration as notifiers and forms of feedback can be very effective.

=== Screens, Lights, and Sensors ===
Mobile devices today are equipped with a range of technologies meant to improve our interactive experiences. These devices may be equipped with advanced display technology to improve viewability while offering better battery life, and incorporate location-based services integrated within other applications.

== Getting Started ==
You now have a general sense of the types of input and output we will discuss in this part of the book. The following chapters will provide you with specific information on theory and tactics, and will illustrate examples of appropriate design patterns you can apply to specific situations in the mobile space.
-------
Next: '''[[Text and Character Input]]'''
-------
= Discuss & Add =
Please do not change content above this line, as it's a perfect match with the printed book. Everything else you want to add goes down here.
== General Touch Interaction Guidelines ==
The minimum area for touch activation, to address the general population, is a square 3/8” on each side (10 mm). When possible, use larger target areas. Important targets should be larger than others.

There is no distinct preference for vertical or horizontal finger touch areas. All touch can be assumed to be a circle, though the actual input item may be shaped as needed to fit the space, or express a preconceived notion (e.g. button).

{{attachment:GICintro-Sizec.png|The caption is up to you!}}

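As a rough, illustrative sketch (not from the book; the class name and the 160 dpi example density are made up), this is one way to turn the 10 mm minimum into pixels for a given screen density:
{{{
// Illustrative only: converts the 10 mm (3/8 in) minimum touch target
// into device pixels for a given screen density.
public final class TouchTargets {

    static final double MIN_TARGET_MM = 10.0;  // minimum side of the touch square
    static final double MM_PER_INCH = 25.4;

    // Pixels needed for the 10 mm minimum on a screen of the given density.
    static int minTargetPx(double dotsPerInch) {
        double inches = MIN_TARGET_MM / MM_PER_INCH;  // 10 mm is roughly 0.39 in
        return (int) Math.ceil(inches * dotsPerInch);
    }

    public static void main(String[] args) {
        // Assumed example: a 160 dpi screen needs a target of at least ~63 px per side.
        System.out.println(minTargetPx(160.0));
    }
}
}}}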

=== Targets ===
The visual target is not always the same as the touch area. However, the touch area may never be smaller than the visual target. When practical (i.e., there is no adjacent interactive item), the touch area should be notably larger than the visual target.

See the example to the right; the orange dotted line is the touch area. It is notably larger than the visual target, so a missed touch (as shown) still functions as expected.

{{attachment:GICintro-Target.png|The caption is up to you!}}

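A minimal sketch of this idea (illustrative only; the class, fields, and slop value are made up): the touch area is the visual bounds grown by a margin on every side, so a near-miss like the one shown still activates the control.
{{{
import java.awt.Rectangle;

// Illustrative only: a control whose touch area is larger than its visual target.
public class ExpandedTarget {

    private final Rectangle visualBounds;  // what the user sees
    private final int slopPx;              // extra touch margin on every side

    public ExpandedTarget(Rectangle visualBounds, int slopPx) {
        this.visualBounds = visualBounds;
        this.slopPx = slopPx;
    }

    // The touch area is the visual target grown by the slop margin;
    // it is never smaller than the visual target itself.
    public boolean hit(int touchX, int touchY) {
        Rectangle touchArea = new Rectangle(
                visualBounds.x - slopPx,
                visualBounds.y - slopPx,
                visualBounds.width + 2 * slopPx,
                visualBounds.height + 2 * slopPx);
        return touchArea.contains(touchX, touchY);
    }
}
}}}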

=== Touch area and the centroid of contact ===
The point activated by a touch (on capacitive touch devices) is the centroid of the touched area: the area where the user’s finger is flat against the screen.

The centroid is the center of the area: the point whose coordinates are the average (arithmetic mean) of the coordinates of all the points of the shape. This may be sensed directly (the highest change in local capacitance for projected-capacitive screens) or calculated (the center of the obscured area for beam sensors).
Due to parallax, users will typically perceive a larger area as being touched (advanced users may become aware of the centroid phenomenon and come to expect it).

{{attachment:GICintro-Centroid.png|The caption is up to you!}}

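A small sketch of the calculation described above (illustrative only): the centroid is simply the arithmetic mean of the coordinates of all the sensed contact points.
{{{
// Illustrative only: the centroid of a contact patch is the arithmetic mean
// of the coordinates of all the sensed points in the touched area.
public final class Centroid {

    static double[] of(double[][] contactPoints) {
        double sumX = 0.0, sumY = 0.0;
        for (double[] p : contactPoints) {
            sumX += p[0];
            sumY += p[1];
        }
        int n = contactPoints.length;
        return new double[] { sumX / n, sumY / n };
    }

    public static void main(String[] args) {
        // Four corners of a 10 x 10 contact patch; the centroid is (5, 5).
        double[][] patch = { {0, 0}, {10, 0}, {10, 10}, {0, 10} };
        double[] c = of(patch);
        System.out.println(c[0] + ", " + c[1]);
    }
}
}}}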

=== Bezels, edges and size cheats ===
Buttons on the edges of screens with flat bezels may take advantage of the bezel to use smaller target sizes. The user may place their finger so that part of the touch is on the bezel (off the sensing area of the screen). This effectively reduces the size of the finger's contact area, allowing smaller input areas.

This effective size reduction can only go to about 60% of the normal size (so no smaller than 0.225 in or 6 mm), and only in the dimension with the edge condition. In practice, this is most useful for giving high-priority items a large target size without increasing the apparent or on-screen size of the target or touch area.

{{attachment:GICintro-Bezel.png|The caption is up to you!}}
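The size arithmetic as a tiny sketch (illustrative only; the class name is made up): 60% of the 10 mm minimum gives the 6 mm floor, and only the dimension that meets the edge may shrink.
{{{
// Illustrative only: along a flat bezel edge the target may shrink to about
// 60% of the normal minimum, but only in the dimension that meets the edge.
public final class EdgeTargets {

    static final double MIN_TARGET_MM = 10.0;
    static final double EDGE_FACTOR = 0.6;  // roughly 60% of normal

    // Minimum size in the dimension perpendicular to the screen edge.
    static double minEdgeDimensionMm() {
        return MIN_TARGET_MM * EDGE_FACTOR;  // 10 mm * 0.6 = 6 mm
    }

    public static void main(String[] args) {
        // The other dimension keeps the full 10 mm minimum.
        System.out.println(minEdgeDimensionMm() + " mm x " + MIN_TARGET_MM + " mm");
    }
}
}}}
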
== Examples ==
If you want to add examples (and we occasionally do also) add them here.

== Make a new section ==
Just like this. If, for example, you want to argue about the differences between, say, Tidwell's Vertical Stack, and our general concept of the List, then add a section to discuss. If we're successful, we'll get to make a new edition and will take all these discussions into account.
