Kicker Tactile Touchscreen Reader

Now that technology is increasingly accessed through touch, isn’t it odd that it all feels like glass? Technology is at our fingertips. How can we use the sense of touch to control it?

Several recent studies, the latest published in June 2011 in Science, have shown that we as humans take in information with our whole body. We see this as an opportunity to develop alternative channels for interfacing with technology beyond just visual and audio. At Kicker Studio, we have been working with haptics (vibrotactile feedback) for a few years now, and have become very interested in touch as a method of communication. Touch is one of the first senses we develop and therefore, it carries a lot of significance in our understanding of the world. At Kicker, we decided to investigate and develop a baseline vernacular for tactile interface for digital devices.



We started by talking to the blind. We realized the blind spend a lot of time communicating with the world through touch, and would likely be articulate about which types of tactile systems aid their comprehension of the world.

Our conversations led us to investigate Braille as a tactile system. We aren’t suggesting that we teach the world how to read Braille. Even among the visually impaired, this is a specialized skill set that requires time and study. But for tactile encoding of large amounts of information, Braille is extremely successful. We suspected there were many conventions in the system of raised print that we could use to develop a multi-purpose tactile grid.

We also discovered a series of MRI studies, first done by Sadato et al. in 1996, showing that the blind use the same visual cortex to process Braille that sighted people use when reading printed text. In other words, both the blind and the sighted use some sort of spatial cognition when reading. Therein lay our mission: design a tactile interface that would benefit both the blind and the sighted.



First things first, we took a look at visual reading. In visual reading, the eye moves around the page in a series of expected ways to digest information. The underlying grid enables reading activities including scanning, skimming, and searching.




Braille as a method of communication may have a limited audience, but it is helpful to examine as a successful example of tactile interface for large amounts of information. Entire libraries are printed and read through this method of encoding by thousands of people every day.

We learned from our neuroscience friend, Dr. Alan Rorie, that several neurological factors come together to enable someone to read Braille. The key contribution is from the Merkel cells, which are stimulated by angles and points, enabling the reader to detect the raised Braille dots. Additional neurological contributors are the Meissner corpuscles. These nerves are “rapidly adapting”: they quickly notice frequencies and, in essence, go numb to them. This is why the finger must move over the texture, rather than the texture being pressed against a stationary finger.

A Grade 1 Braille character is a single cell made up of 6 dots, each of which is either raised or flat.
The consistency of this grid enables the encoding of language.
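The six-dot cell maps cleanly onto a bitmask, which is exactly how the Unicode Braille Patterns block (U+2800 onward) encodes it. As a minimal sketch (the `LETTER_DOTS` table and function name are ours, not part of any standard API):

```python
# Sketch: encoding six-dot Braille cells as bitmasks, using the Unicode
# Braille Patterns block (U+2800 + dot bits). Dot numbers 1-6 map to
# bits 0-5, so a cell is fully described by a 6-bit integer.

# Dot patterns for a few Grade 1 English Braille letters.
LETTER_DOTS = {
    "a": (1,),      # dot 1
    "b": (1, 2),    # dots 1 and 2
    "c": (1, 4),    # dots 1 and 4
}

def cell_to_char(dots):
    """Convert a tuple of raised dot numbers (1-6) to a Unicode Braille char."""
    mask = 0
    for d in dots:
        mask |= 1 << (d - 1)   # dot n -> bit n-1
    return chr(0x2800 + mask)

print("".join(cell_to_char(LETTER_DOTS[c]) for c in "abc"))  # ⠁⠃⠉
```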
Just like with visual reading, reading Braille relies on scanning, skimming, and searching. The Braille “interface” is similar to printed text. In Western culture, text is printed left to right and top to bottom, in ordered rows.
The hands work in a method similar to the eyes in visual reading in order to develop spatial context. One hand reads specific characters while the other gathers spatial information, such as word and sentence length.


This much is probably clear from looking at a page of Braille. We learned something very interesting from Noel Runyon, who has been working on interfaces for the blind since the early days of IBM. He is, as I believe he phrased it, “coincidentally also blind.” He explained that the one thing sighted people always miss is the importance of the negative spaces to a Braille reader. They are just as important as the positive spaces, because those absences define the edges of content. The gullies where there is no printed text help the hands keep track of the direction and location of the Braille, and help the reader establish where the cells start and stop. The resulting grid ultimately provides direct spatial manipulation of the text: the reader can skim, scan, and search just like a sighted reader does with printed text.



Modern accessibility tools for visual interfaces are problematic. They provide a very narrow window into digital content, making it nearly impossible to develop the spatial cognition that is so essential to reading. Basically, imagine reading War and Peace one letter at a time. It would be painstaking and nearly impossible. And that book is nowhere near as vast as the amount of digital information being generated every single day, all of which is increasingly delivered through visual interfaces on touchscreen devices.




Here is the Kicker tactile touchscreen reader. We believe we can create a better spatial understanding of information on touchscreen devices by creating a tactile grid developed with high-fidelity, multi-channel haptics — which will soon be widely available in mobile devices. Here’s how it works:

A basic underlying grid helps the user feel where a column of information lives. We call these grid lines “sight lines”. They act as the negative spaces (or gullies) between lines of information.
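In code, a sight-line grid reduces to a hit test: given a touch position, decide whether the finger is on content or in a gully. A minimal sketch, with entirely hypothetical column and gully widths (Kicker’s actual layout is not published here):

```python
# Sketch: deciding whether a touch point sits on content or in a "gully"
# (the negative space between columns). The widths are placeholder values.
COLUMN_WIDTH = 80   # px of content per column (assumed)
GULLY_WIDTH = 20    # px of negative space between columns (assumed)

def hit_test(x):
    """Return (column_index, 'content' or 'gully') for horizontal position x."""
    period = COLUMN_WIDTH + GULLY_WIDTH
    col, offset = divmod(x, period)
    return col, "content" if offset < COLUMN_WIDTH else "gully"

print(hit_test(50))    # (0, 'content')
print(hit_test(90))    # (0, 'gully')  -> play the haptic sight line here
print(hit_test(110))   # (1, 'content')
```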

Speed of drag provides dynamic control of verbosity settings, with a proportional relationship between speed of drag and amount of detail provided.
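One way to sketch that mapping in code, assuming slower, deliberate drags request more detail while fast skims request less (the thresholds and level names are placeholders to be tuned in user testing, not Kicker’s actual values):

```python
# Sketch: mapping drag speed to a verbosity level. Thresholds and level
# names are illustrative placeholders, not measured values.
def verbosity_for_speed(px_per_sec):
    """Return a feedback verbosity level for a given drag speed."""
    if px_per_sec > 400:
        return "headline"   # fast skim: structural markers only
    if px_per_sec > 150:
        return "summary"    # medium speed: word-level feedback
    return "full"           # slow drag: character-level detail
```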

Gestural control enables easy modality control. With a one-fingered drag, the reader receives audio feedback; with a double-fingered drag, the reader receives V-Braille time-encoded feedback.
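The two gestures above amount to a simple dispatch on finger count. A minimal sketch (the mode labels are illustrative, not a real API):

```python
# Sketch: dispatching the feedback modality by finger count, following the
# two gestures described above. Mode names are illustrative labels only.
def feedback_mode(finger_count):
    """One-finger drag -> audio feedback; two-finger drag -> V-Braille."""
    modes = {1: "audio", 2: "v-braille"}
    return modes.get(finger_count, "none")
```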

The resulting tactile interface will restore for visually impaired people a cognitive sense of space essential to reading, unlike any accessibility tool available today. And because it is software that can be added to any tablet, the entire catalog of digital content instantly becomes available to people with visual impairments.

But it also enables keyboards like this one, which can easily translate into simple grids for navigating all kinds of screens and surfaces, eyes-free. Modality controls can instead drive menus; perhaps a double swipe across the keys provides letters, and another gesture provides numbers. There are all kinds of possibilities.

Kicker Tactile Touchscreen




We’re currently working on prototypes for a series of user tests to find just the right haptic frequencies for our purposes. We look forward to telling you about the results very soon.

In the meantime, we are continuing to develop an understanding of the etiquette and vernacular of touch as a method of communication. We are examining how such a language could be used, and what it might be used to transmit. There are endless possibilities. Stay tuned.

To see how we can help you on your next project, contact us »
