Artificial Muscle: Designing for Haptics

  • Artificial Muscle productivity tool demo.

  • Haptic map for two-thumb touchscreen keyboard.

How can haptics (vibrotactile feedback) improve our experience of touchscreen products?

That’s what we set out to explore when Artificial Muscle, Inc. approached us to play with their ViviTouch™ Technology. They wondered how their haptics could be used on a mobile phone. Kicker Studio wanted to know if haptics could improve the mobile user experience. We decided to look at the common everyday activity of making a conference call on a mobile phone and see where tactile feedback would be the most effective.


Developers of a new technology are experts in that technology. They are deeply interested in the how of it all. Their primary question is “how does it work?”, and they know the answer better than anyone; answering it is how they created the technology in the first place.

To begin with, Kicker Studio needed to know the parameters (the how) of AMI’s haptic solution, including what was possible and what was not. AMI gave us a thorough understanding of how their haptics worked and how their solution differed from others. They explained how to manipulate sound waves to make the responses feel different.
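To give a flavor of what “manipulating sound waves” means here, the sketch below synthesizes a short damped sine wave of the kind an audio-driven actuator could play as a tactile click. All parameter values are illustrative assumptions, not AMI’s actual tuning; the point is only that changing frequency and decay changes how the same event feels.

```python
import math

def haptic_click(freq_hz=150.0, decay=40.0, duration_s=0.03, sample_rate=8000):
    """Synthesize a damped sine wave for an audio-driven haptic actuator.

    Higher frequency feels sharper; faster decay feels crisper.
    (Values are illustrative placeholders, not AMI's parameters.)
    """
    n = int(duration_s * sample_rate)
    return [math.sin(2 * math.pi * freq_hz * t / sample_rate)
            * math.exp(-decay * t / sample_rate)
            for t in range(n)]

# Two buttons can feel different just by reshaping the wave:
sharp = haptic_click(freq_hz=250.0, decay=80.0)  # crisp, 'hard' button
soft = haptic_click(freq_hz=120.0, decay=25.0)   # rounder, 'soft' button
```

The exponential envelope is what keeps the pulse feeling like a discrete click rather than a continuous buzz.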

However, as designers, we are interested not only in “how?” but also “why?” Not necessarily “why does x_part move y_part over here”, but “why use this technology?” Why do people need this and in what context? What does this technology do for the user? What if you combine it with that technology over there? In short, what is it good for? By asking these questions, we can develop real world applications for emerging technologies that are useful to real people.

So, armed with the knowledge we gained from AMI, we sat down to think about haptics in the context of our Kicker Conference Phone. What if it were a mobile application? What tasks would be most important in those scenarios? Where would haptics help the functionality of the application? And most importantly, what was haptics best at communicating?

To find out, we made some calls on our own touchscreen phones. We noticed it would be great if we could feel pages changing or tell which button we were pressing by the way it felt to press it. This would help complete the visual metaphor of the screen graphics—the ‘button’ could feel like a button, a page would have a sense of physical mass. It would also allow users to operate the phone even if they could not look at the screen, as the user would know in an instant which key they were touching. In the context of a mobile phone call, where the user is either holding their phone or resting it on a surface, vibrotactile feedback could be invaluable.

We also decided it would be great to know what kind of alert was happening just by the vibrotactile feedback on the phone. For example, in the context of the Kicker Phone, if someone was poking you, it would feel one way. If they were raising their hand, it would feel another. Alternatively, imagine if while on a call you could tell the difference between a text message alert and a calendar event alert just by the feel of the vibration.
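One simple way to sketch this idea is a lookup of distinct pulse patterns, one per alert type, so each alert has a recognizable rhythm. The alert names and timings below are hypothetical examples, not the patterns we shipped.

```python
# Each alert type maps to a pulse pattern: a list of (on_ms, off_ms) pairs.
# Names and timings are illustrative assumptions only.
ALERT_PATTERNS = {
    "text_message": [(40, 60), (40, 0)],            # two quick taps
    "calendar":     [(200, 0)],                     # one long buzz
    "hand_raised":  [(30, 30), (30, 30), (30, 0)],  # three light taps
    "poke":         [(80, 0)],                      # single firm tap
}

def pattern_duration_ms(name):
    """Total duration of an alert's vibration pattern in milliseconds."""
    return sum(on + off for on, off in ALERT_PATTERNS[name])
```

Keeping patterns short and rhythmically distinct is what lets a user identify an alert without looking at the screen.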

Haptics would also be a good way to indicate that a user had performed a gesture-triggered non-touch event. For example, much like in the removable pods of the original Kicker Phone, the mobile conference user could shake their phone to indicate a ‘raised hand’ on the conference line, and haptic feedback would alert the current speaker that someone had performed that action. This would allow for more types of gestural inputs and outputs than are currently possible, because the device could respond ‘eyes free’ even when the sound was turned off.



We set about quickly mapping out how the Kicker Conference Phone could become a mobile phone application. We picked the iPhone because Artificial Muscle had the API ready for that platform. We modified the process, keeping in mind the context of use of a mobile phone. Priority was put on joining and participating on a call. We figured the majority of users would set up a call elsewhere, but made sure the process was quick and simple for that last minute need to set up a call. Then we worked out a visual design for the new application.

We created a matrix of possible haptic solutions for each button and action. As we manipulated the sound waves, we tried to match the feel of the button with what we imagined it should feel like to push it. We also imagined what it might feel like to slide one of the cards across the screen. Right away, we noticed that it was very difficult to imagine how the haptic would feel in context—it had to be prototyped and felt. The challenge was that we could only experience the vibrations through a reactive tablet hooked up to a PC (to generate the necessary sound triggers). The graphics were on an entirely different screen. Thus, it was hard to put the three separate pieces (the visual, the haptic response, and the actual experience of holding the phone) together to imagine their combined effect.
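The matrix described above can be pictured as a table keyed by UI element and action, with waveform parameters as the values. The element names and parameter values here are hypothetical placeholders standing in for the real entries.

```python
# Sketch of the haptic matrix: (element, action) -> waveform parameters
# (frequency in Hz, decay rate). All entries are illustrative assumptions.
HAPTIC_MATRIX = {
    ("join_button", "press"):   {"freq_hz": 220, "decay": 70},
    ("join_button", "release"): {"freq_hz": 180, "decay": 90},
    ("card", "slide"):          {"freq_hz": 60,  "decay": 10},
    ("keypad_digit", "press"):  {"freq_hz": 250, "decay": 80},
}

def lookup(element, action):
    """Return waveform parameters for a UI event, or None if unmapped."""
    return HAPTIC_MATRIX.get((element, action))
```

A table like this makes it easy to audition and retune each button’s feel independently once a working prototype exists.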

We worked with Fortified Studio to create a working prototype on an iPod Touch. Finally, we could experience the separate elements working together. We discovered that the motion graphics had to be timed just right with the haptic in order for the response to make sense. We also discovered that some of the haptics we thought would be great in reality felt buzzy and not believable. We were able to adjust those issues rather quickly thanks to the prototype, and we landed on a great first edition of the Kicker Conference Mobile Phone App.

Going forward, there are several things we would like to add to the first edition. Unfortunately, the prototype couldn’t accommodate some of the more impactful solutions, such as the haptic smart keyboard, due to time and budget. Also, we would like to add sound effects to accompany each haptic. We found that haptic + sound effect = believability. The same haptic with a different sound effect feels different. A deep click sound will make the haptic feel heavy, while a light click makes it feel shallow. We think this would help to heighten the impact of certain haptics.

Overall, the results were exciting. Artificial Muscle is currently sharing the prototype with many of its partners and we look forward to seeing what this work will inspire for OEMs.


