(Comment left on William Hodge's blog.)
Youn-kyung Lim, Sang-Su Lee, Kwang-young Lee
Department of Industrial Design, KAIST
Republic of Korea
Video demonstration link:
The primary goal of this research was to define a set of attributes that allow designers to describe what they called "the shape of interactivity" of an interactive artifact.
In other words, they were setting out to describe interactive objects/interfaces.
With that knowledge, they believed that designers could more effectively create good interfaces. They believed that the kind of interaction given to an interface could be described as concretely as physical materials are described.
More than that, this team set out to identify the kinds of emotional reactions people had to these kinds of interactions.
For example, they created a set of interactive Flash modules that users manipulated with a mouse. These modules demonstrated several kinds/shapes of interaction:
Concurrency (concurrent/sequential) - when clicked, the marbles all move together or one at a time
Continuity (continuous/discrete) - a marble moves around a circle smoothly or in discrete steps
Expectedness (expected/unexpected) - marbles move to their corresponding slots or to random slots
Movement Range (narrow/wide) - marbles shuffle around only when the mouse is near, or even when it is far away
Movement Speed (fast/slow) - marbles move quickly or slowly when the mouse is near
Proximity (precise/proximate) - the user adjusts an area with exact measurements or by approximation
Response Speed (prompt/delayed) - the user clicks a marble and it moves instantly or after a delay
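The attribute pairs above amount to a small descriptive vocabulary, and they can be sketched as a data structure. The following is a minimal sketch in Python; the class name, field names, and example values are my own illustration, not taken from the paper:

```python
from dataclasses import dataclass

# Each attribute from the paper is a binary dimension; together they
# describe the "shape" of an interaction, much like a set of material
# properties describes a physical material.
@dataclass(frozen=True)
class InteractivityShape:
    concurrency: str     # "concurrent" or "sequential"
    continuity: str      # "continuous" or "discrete"
    expectedness: str    # "expected" or "unexpected"
    movement_range: str  # "narrow" or "wide"
    movement_speed: str  # "fast" or "slow"
    proximity: str       # "precise" or "proximate"
    response_speed: str  # "prompt" or "delayed"

# Example: describing a snappy, predictable interface in this vocabulary.
snappy = InteractivityShape(
    concurrency="concurrent",
    continuity="continuous",
    expectedness="expected",
    movement_range="narrow",
    movement_speed="fast",
    proximity="precise",
    response_speed="prompt",
)
```

The point of the sketch is only that the description is compact and comparable: two interfaces with the same field values have, in the paper's terms, the same shape of interactivity.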
After the users played with a particular Flash module, the researchers asked whether the interface felt the way it was described and what kind of emotions the user felt.
They found that certain kinds of interfaces produced a range of emotional responses, and they believe that when designers create interfaces, they should consciously choose the kind of interaction to produce the desired emotional response.
(For the kinds of emotions that are attached to which interactivity modules, please refer to the paper.)
This paper was interesting in that it sought to measure the emotions people felt toward varying kinds of interfaces. It would be useful if designers of systems and interfaces had a kind of chart where they could look up what kind of interface to use to evoke pleasure, or introspection, or sadness, or whatever the designer may choose.
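Such a chart could be as simple as a lookup table. The sketch below is purely hypothetical: the emotion labels and attribute pairings are placeholders of my own, not findings from the paper (for the actual pairings, refer to the paper):

```python
# Hypothetical "designer's chart": which attribute settings to pick when
# aiming for a target feeling. Every pairing below is a placeholder for
# illustration only -- not a result reported by the authors.
emotion_chart = {
    "calm":    {"movement_speed": "slow", "response_speed": "prompt"},
    "playful": {"expectedness": "unexpected", "concurrency": "concurrent"},
}

def attributes_for(target_emotion):
    """Look up the attribute settings suggested for a target emotion."""
    return emotion_chart.get(target_emotion, {})
```

A designer would consult the table before building, e.g. `attributes_for("calm")`, and an unknown emotion simply returns no suggestions.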
That said, I really don't think much was accomplished by this research. We already have ways of describing how objects react to certain actions and how objects move. To some extent, we also know how people generally feel about certain reactions. We know that for ease of use, we generally want prompt response speeds, precise measurements, fast (but trackable) movements, and reactions we expect.
As for future work, the research team itself noted that they didn't test different styles of the same interactions or how users felt about those differences. For all we know, the emotional responses may be attributable to the particular Flash modules they made rather than to the interactions themselves.