Is Industrial Design the New Interface Design?

I read with some interest Carla Diana’s Core77 report on this year’s CHI conference. (It’s always better to get a summary of CHI than to actually go.) Buried in the article is this line: “Industrial design is the new interface design” was the mantra of the week.

I found this interesting because, viewed from certain angles, it is decidedly true. With tangible computing, objects can embody a single piece of functionality; Siftables is perhaps the clearest recent example. With robots too, the “interface” is the physical form of the object. Many sensor-based products have no visible digital interface at all (“Wave hand to dispense towel”).

But viewed from another angle, industrial design has frequently been about interface design since before it was even a profession. If we define interface design as the visible controls for an invisible system, industrial designers have been doing interface design for at least a century, ever since the first “push button” mechanisms of the 1890s, like the first Kodak cameras, and probably even earlier. Pushing a physical button triggers an internal system. How is that fundamentally different from tapping a button onscreen, which triggers a digital response (which could in turn trigger an analog one)?

In fact, as products begin to dematerialize (see the gestural interface we designed for a TV, which replaces the physical remote control) and touchscreens replace physical buttons, the opposite argument can also be made: interface design is the new industrial design. So which is it?

At Kicker, we feel that a blend is usually necessary: for some functions, a physical control works best; for others, a digital one. We do functional cartographies, mapping each function to the control that suits it, to figure this out.

What is clear is that industrial design, and the engineering behind it, is becoming deeply entwined with digital behavior. Those touchscreens and the forms around them don’t design themselves. Via sensors, we can control digital behavior through analog means like never before, and sometimes those means are embedded within a physical object (or a physical space). Think of the Wiimote, for example.

Start from the inside out (the behavior), and then figure out what should control it: the physical form, UI elements on a screen, or even gestures in space. For users, the interface is the system; they don’t care which discipline(s) designed it, only that it looks good and works well.