What is interaction?

Bret Victor’s “A Brief Rant on the Future of Interaction Design” is a great starting point for the Physical Computing class, as it proposes the Need-Tool-Capability framework for looking at interactivity as an extension of human faculties such as vision and motor skills. Many references cross my mind as I write this post: the apes and the monolith in the movie 2001: A Space Odyssey, a range of interfaces and machines from The Matrix, and even the Ikea assembly instructions that came with a DIY furniture kit I recently received.

Roughly, interaction means any set of steps an entity follows in order to communicate with another entity. It is essentially a dialog, involving a series of responses from both parties.

In the context of Physical Computing, I can describe interaction more precisely by proposing a few constraints:

First, one of the two entities is a human! (Or maybe a sentient being like a cat.) Let’s not look at machine-to-machine dialog, as it is either a dull information exchange or a delightful installation that goes on without the active participation of humans.
Second, the knowledge of performing these actions is something a person discovers accidentally, learns deliberately through trial and error or by following instructions, copies from others, or develops over time as it becomes routine. In short, this knowledge isn’t directly genetically coded. Humans are simply curious about the environment that surrounds them, and therefore there is huge and wonderful scope to design entities that humans can communicate with.

A good example of delightful interaction is the Nest thermostat, and I strongly feel that its interactions are useful and meaningful even if Timo Arnall argues that marketing the Nest as an invisible and smart entity is a pathetic attempt. It presents a more legible and human way to communicate a user’s needs to an HVAC system than the standards set by its predecessors. This departure from established practice makes the user curious about the interface and hence initiates a dialog.

Examples of non-interactive systems: automated systems that work on their own, such as a coffee machine that, when it runs out of ingredients, automatically draws coffee beans and milk from containers. The process involves sensing a need, using tools, and providing feedback to the user; however, it never initiates a dialog with the human.

This leads my contemplation to a series of thoughts that I would like to discuss at the next class: human-machine communications certainly involve the need-tool-capability model, may involve pleasant micro-interactions, and may lead to a series of signals from humans to machines and vice versa. Hence they can be termed interactive. However, over time such dialogs may become commonplace. A laptop turning on or off at the flip of a switch leaves no ripple of delight, as it does not surprise an experienced user. Using the Nest thermostat feels amazing while it is still an altogether new way to communicate your needs, but not so much after using it daily. Communications that once looked interactive might seem merely reactive as they become part of the routine. Does this limit the scope of designing interactive entities?

One thought on “What is interaction?”

  1. A couple of responses:

    The need-tool-capability framework is the strongest thing about the Victor rant, in my mind.

    Have you used a Nest thermostat first-hand? If so, what would you say are its weaknesses and what were its strengths? How long did it take before you understood its patterns?

    Re: interactions becoming commonplace: are there times when this is appropriate? When should a tool slip from the center of our attention to the periphery, so that we can focus on the task at hand? When does delight get in the way? And how, as designers, do we deal with this?
