26 April 2011

Paper Reading #12: Pen + Touch = New Tools

Commentary

See what I have to say about ___'s and ___'s work.

References

Hinckley, K., et al. (2010). Pen + touch = new tools. Proceedings of the ACM Conference on User Interface Software and Technology (UIST). New York: ACM. http://www.acm.org/uist/uist2010/.

Article Summary

In this paper, the researchers examine the capabilities of a multimodal interface with touch- and pen-based interaction. They believe that interactions can be separated into unimodal and multimodal categories based on both the input device and the intended action.

In the first phase of their project, the researchers conducted an initial study in which subjects used pen, paper, and various tools to organize objects in a notebook. They noted the common actions that all subjects performed and the affordances a fully manipulable environment granted them. Examples that helped the researchers categorize unimodal and multimodal commands include subjects tucking the pen between their fingers to manipulate objects, using only their fingers to hold down or reposition objects, and holding an object in place with one hand while drawing with the other.

Using their observations from the first phase, the researchers designed an interface on the Microsoft Surface system that incorporates as many of those natural affordances as possible into a multimodal pen-and-touch interface. Touch and pen input can each be used unimodally, with their own affordances; when the inputs are combined, i.e. used multimodally, the context of the interaction changes and a new set of affordances becomes available. They summarize the division of roles as follows: "...the pen writes, and touch manipulates, period." The combined, i.e. multimodal, interactions included: holding objects together and tapping with the pen to "staple" them; holding an object steady and using the pen as an X-acto knife; holding an object steady and creating a "carbon copy" by dragging a copy off of it with the pen; and holding an object steady and using it as a straightedge along which to draw with the pen.

Using an object in the scene as a straightedge.
Image courtesy of the above-cited paper.
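The division of labor described above ("the pen writes, touch manipulates, and the combination creates new tools") can be sketched as a simple input dispatcher. This is purely my own illustrative sketch, not the authors' implementation; the event fields and mode names are hypothetical:

```python
from dataclasses import dataclass

# Hypothetical sketch of the paper's division of roles: pen alone inks,
# touch alone manipulates, and touch + pen together switches the pen
# into a tool-like mode. Not the authors' actual code.

@dataclass
class InputEvent:
    pen_down: bool        # pen tip is touching the surface
    touch_held: bool      # a finger is holding an object on the surface
    objects_held: int = 0 # number of objects currently held by touch

def interpret(event: InputEvent) -> str:
    """Return the interaction mode implied by the current input state."""
    if event.pen_down and event.touch_held:
        # Multimodal: holding an object re-contextualizes the pen.
        if event.objects_held >= 2:
            return "staple"   # tap the pen on a held stack of objects
        return "tool"         # e.g. X-acto cut, carbon copy, straightedge
    if event.pen_down:
        return "write"        # unimodal pen: ink on the page
    if event.touch_held:
        return "manipulate"   # unimodal touch: move and arrange objects
    return "idle"
```

For example, `interpret(InputEvent(pen_down=True, touch_held=True, objects_held=2))` yields `"staple"`, while the pen alone yields `"write"` — the same input device producing different actions depending on what the other hand is doing.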

Discussion

This interface is even more intuitive than the last one I reviewed! And I love it! I appreciate the work that the researchers put into observing natural interactions with the type of environment they wanted to create. This seems to be the smartest way to make an interface parallel interaction in the real world, and indeed, to allow an interface to achieve its maximum usability potential. The way they separated out the roles of touch and pen was ingenious. All in all, this is one of the best designs for a new interface I have seen throughout these papers. I'm almost as excited about this interface as I am about the Minority Report-style interface :)
