02 April 2011

Paper Reading #18: Evaluating the Design of Inclusive Interfaces by Simulation

Commentary

See what I have to say about Steven's and Evin's work.

References

Biswas, P., and Robinson, P. (2010). Evaluating the design of inclusive interfaces by simulation. Proceedings of the ACM Conference on Intelligent User Interfaces (IUI '10), Hong Kong. http://www.iuiconf.org/

Article Summary

In this paper, Biswas and Robinson discuss their development of a simulator that evaluates usage scenarios for different assistive interfaces. Assistive interfaces are interfaces designed to accommodate users with physical impairments.

[Image: The Samsung Jitterbug, an example (sort of) of an assistive interface. Image courtesy of My Vision Aid, Inc.]

Their study compared the simulator's predictions of how long tasks would take for users with various impairments against the times actually measured for those users. The researchers identify both text-search and icon-search tasks but focus specifically on icon search, which they break into two subtasks: searching for an icon, then pointing at and clicking on it. They varied the spacing between icons and the font size of the icon captions. Their participants included able-bodied individuals, individuals with visual impairments, and individuals with motor impairments. Comparing predicted against measured completion times, they found that the simulator's predictions were accurate to a statistically significant degree.
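To make the pointing-and-clicking subtask concrete, here is a minimal sketch of the kind of model a simulator like this could build on. To be clear, this is not Biswas and Robinson's actual model (their simulator combines much more detailed perception and motor components); it is just a standard Fitts's-law estimate with a made-up impairment multiplier.

```python
import math

def fitts_pointing_time(distance_px, target_width_px, a=0.1, b=0.15):
    """Shannon formulation of Fitts's law: predicted movement time in
    seconds. The constants a and b are placeholders, not values fitted
    to the paper's data."""
    index_of_difficulty = math.log2(distance_px / target_width_px + 1)
    return a + b * index_of_difficulty

def predicted_time(distance_px, target_width_px, motor_impairment=1.0):
    """Hypothetical impairment handling: scale the able-bodied estimate
    by a single multiplier. The real simulator models impairments in
    far more detail than one scalar."""
    return fitts_pointing_time(distance_px, target_width_px) * motor_impairment

# Compare two impairment settings for the same icon layout.
print(predicted_time(400, 48))                        # able-bodied estimate
print(predicted_time(400, 48, motor_impairment=2.5))  # motor-impaired estimate
```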

Discussion

I am sure that other systems similar to this have been developed, but this is the first I have heard of such a system. It seems like a natural extension of unit testing, where in this case the unit under test is the usability of the interface rather than its functionality. The obvious gain is that, given the efficacy of this model in predicting performance, one need not spend the time and money on an actual user study: just input the impairment parameters and run the interface through the simulator. Of course, the interface would have to be codified to match the simulator's capabilities, i.e., one would need to know the font size and the distance between icons, but those seem like pretty important parts of inclusive interface design anyway. In other words, it doesn't seem like a big stretch to set up an interface for testing by this simulator (a sketch of what that might look like follows below). Here's an idea for future work: extend the simulator from processing a flat-screen interface to processing three-dimensional data. Why not simulate movement through an environment, say, a home? I'm a fan of optimization.
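If the unit-testing analogy holds, a usability regression test might look something like the following sketch. Every name here (IconSearchSpec, simulate_task_time, the impairment parameters) is hypothetical; the paper describes no such API, and the timing formula is a crude stand-in for the real simulator.

```python
import unittest

class IconSearchSpec:
    """Codified interface parameters of the kind the simulator needs:
    icon spacing and caption font size."""
    def __init__(self, icon_spacing_px, caption_font_pt):
        self.icon_spacing_px = icon_spacing_px
        self.caption_font_pt = caption_font_pt

def simulate_task_time(spec, visual_acuity=1.0, motor_impairment=1.0):
    """Crude stand-in for the simulator: smaller captions and tighter
    spacing cost extra seconds, and impairments scale the total."""
    base = 2.0
    base += 0.05 * max(0, 12 - spec.caption_font_pt)  # small captions slow search
    base += 0.01 * max(0, 64 - spec.icon_spacing_px)  # crowding slows pointing
    return base * (1.0 / visual_acuity) * motor_impairment

class TestInclusiveLayout(unittest.TestCase):
    def test_layout_within_budget_for_low_vision_user(self):
        spec = IconSearchSpec(icon_spacing_px=80, caption_font_pt=14)
        t = simulate_task_time(spec, visual_acuity=0.5)
        self.assertLess(t, 6.0, "low-vision users should finish within 6 s")

if __name__ == "__main__":
    unittest.main()
```

The appeal is that a layout change that hurts impaired users would fail the build immediately, long before anyone schedules a user study.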
