Physical Interface and Embodied Cognition

At Kicker Studio, we’re interested in the idea of physical interface. We want to discover how we can take technology off the screen and turn it into an additive that can be embedded into any product to make that product stronger, smarter, and more powerful.

Technology needs to respond in ways we understand. Technology needs to learn to speak human, not the other way around. We’re physical. Products are physical. The feedback between user and technology needs to be natural. We’re just now training technology to respond in ways that feel comfortable. Our experience with technology should always be personable, easy, and intuitive. User-friendly is no longer good enough.

The divide between technology and human started like this:

René Descartes, in the early 1600s, more than 100 years before the Industrial Revolution, envisioned the brain as a pump that moved “animating fluid” through the body. The mind, Descartes argued, had no body or form. Instead, it was an abstract entity that interacted with the body through the pineal gland. The modern computer, developed after WWII, embraced this mental model. The computer itself acted like the brain, as housing for the software that served as the computer’s mind.

Eventually, we combined two decidedly mechanical products to create a new technology: the typewriter and the television. People needed a simple way of introducing commands to this new machine. Only our minds mattered in this new relationship because the computer’s mind operated on language. And in turn, we only needed to control the “mind” of the computer, not the “body”. The “body”, in true Cartesian terms, was merely housing for the “mind”.

It was an easy relationship, and the mental model made sense. Almost everyone could understand the concept. That world in there, where anything was possible (TV), was controlled through these specific language-based means (typewriter). We stayed on the outside, at bent-arm’s length, controlling its mind with a few keystrokes. Yet, in so doing, we made the computer, and subsequent technology, an “other”. We quarantined it. We put it in a box and used specialized tools to control it. And we felt it was so powerful it needed to be dominated. Our precarious relationship with technology was fodder for science fiction.

In the Digital Age, our control over technology is rapidly changing, and we’re experiencing sci-fi realities. Perhaps the most fundamental shift, though, is that we are developing a more physical relationship with it. We touch it and move it. We carry it in our pocket and sleep with it next to our bed. We’re letting it out of its box, its “body”, and creating new ways to engage with it. Technology is now in our personal space, and we can communicate with it through touch, voice, and gesture just as we do with everything else in our physical world. We’re finally allowing ourselves to engage and connect with it.

Embodied Cognition

This evolution in how we control technology comes at a time when our understanding of how the brain functions is also rapidly shifting. In the 1980s, scholars began to rethink the way we “think”. A growing body of new research suggests that we think not just with our brains, but also with our bodies. This is called embodied cognition.

In a recent study at the ad firm TBWA, researchers asked participants to draw where they experience certain emotions on a diagram of the body. Then the researchers overlaid all of the sketches. As the overlay shows, people experience emotion all over the body, even though emotion is something most of us associate with the mind.

Sketches from a recent study by TBWA showing where participants experience emotion.

In 1995, a team of scientists in Italy made a major discovery. They found something called “mirror neurons”. These neurons respond in a similar way, whether we see someone perform an action, hear it described, or do the action ourselves. Because they play a role in both acting and thinking, mirror neurons suggested that the mind and body might not be so separate after all.

Several recent studies, the latest published in November 2008, support this idea of embodied cognition. One showed that children solve math problems better if they are told to use their hands while thinking. Another suggested that stage actors remember their lines better when they are moving. And in one study published in 2007, subjects asked to move their eyes in a specific pattern while puzzling through a brainteaser were twice as likely to solve it. These studies suggest that involving the body in thought actually helps cognition. Our mental experience is more than just the brain; it is physical as well.

Pattie Maes’ and Pranav Mistry’s Sixth Sense

A series of emerging technologies capitalizes on this evolving mind/body concept. Technologies like gesture, touch, haptics, and a multitude of physical sensors mean that we can confront and control technology as we do everything else in our world: with our embodied minds. We’ve lured technology out of the incubator and into the world. We can touch it. Move it. We can finally be physical with technology. This means there are new opportunities for innate control of technology, but first we need to establish new mental models to help people understand how to interact with it in this way.

Many of these emerging technologies are still in their infancy and show up mainly in gaming or secondary communication. We haven’t yet closed the logic loop of interacting with technology in this new bodily way. We have finally figured out, technically, how to detect these interactions, but we have forgotten to train the technology to respond in a way that we would expect. We are relying primarily on video or audio feedback, while our bodies innately expect multi-sensory feedback. So despite the fact that technology is increasingly present in our world, and we are increasingly dependent on it, we’re still trying to contain it as if we’re afraid it will take over.

The physicality we are now experiencing with technology changes our expectations of how the technology should respond. There are behaviors we anticipate, even intuit, when interacting fully with body and mind. When those expected behaviors are absent, the interaction feels foreign and unfamiliar, and it requires too much thought.

Working with emerging interface technologies requires understanding how people absorb information and create meaning. At Kicker we’re investigating how people communicate in order to create a more natural interface with technology. We are focused on how people expect these different technologies to communicate, and we’re intrigued by the challenge of designing a bridge across that gap. Stay tuned for insights and guidelines for designing with emerging technologies in ways that feel comfortable, approachable, and compelling to people.