The smell of user interface design

By Peter Coffee  |  Posted 2005-03-21

I found myself thinking that people with this kind of sensory cross-linkage—synesthesia, the neurologists call it—should be getting involved in user interface design. As users are increasingly tasked to look for patterns or find similarities and differences—in data sets of increasing volatility and complexity—it seems as if people who already have different styles of perception might do a better job of thinking outside the box about data presentation.

I'm not just talking about variations on haptics, the field of force-feedback interactive design.
The game designers and their compatriots in controller devices are already doing some interesting things with that kind of sensory fusion. Companies such as Logitech and AVB offer joysticks and other devices with actuators to produce appropriate effects linking on-screen actions to controller-force sensations. I'm happy to have them and their customers funding the R&D to make that hardware cheap and reliable.

To be sure, I'd like to see haptics go beyond adding bumps and clicks to conventional knobs and levers. Today's typical touch-screen devices are too hard to read in sunlight and give insufficient feedback to confirm that input has registered. I'd like to see a head-up display combined with a force-feedback touch pad; that display could then project a view of any kind of control panel over my view of that general-purpose pad, with haptic feedback so that I actually felt as if I were typing or sliding a control knob or whatever.

But that's not nearly enough. We need much more original thinking in helping network operators, financial analysts, software developers and other people deal with large systems that combine high complexity with low intrinsic difference among data items. One line of code looks much like another, but in running my finger across a source code view, I might literally feel a hot spot or rough spot where a line or a module is creating a bottleneck or failing tests.
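To make the idea concrete, here is a minimal sketch of how such a tactile code view might decide what the finger feels. All names here are hypothetical illustrations, not any real editor's API: it simply normalizes per-line profiler sample counts into a "heat" intensity, and forces lines implicated in failing tests to feel rough.

```python
# Hypothetical sketch: map per-line profiler hit counts and test failures
# to a haptic intensity in [0, 1] that an actuator could render.
# "haptic_heat" and its inputs are illustrative, not a real tool's API.

def haptic_heat(hit_counts, failing_lines):
    """Return {line_no: intensity} for one source file.

    hit_counts: {line_no: profiler sample count for that line}
    failing_lines: set of line numbers implicated in failing tests
    """
    peak = max(hit_counts.values(), default=0)
    heat = {}
    for line, hits in hit_counts.items():
        # Hot spots: intensity proportional to share of profiler samples.
        intensity = hits / peak if peak else 0.0
        # Rough spots: failing code always feels distinctly rough.
        if line in failing_lines:
            intensity = max(intensity, 0.8)
        heat[line] = round(intensity, 2)
    return heat

profile = {10: 5, 11: 500, 12: 20}
print(haptic_heat(profile, failing_lines={12}))
# → {10: 0.01, 11: 1.0, 12: 0.8}  (line 11 is hot; line 12 feels rough)
```

The interesting design questions all live outside this sketch: how intensity becomes vibration amplitude or surface friction, and whether "hot" and "rough" stay perceptually distinct under a moving fingertip.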

Would this just be a stupid user interface trick, compared with using color-coded editors and other approaches already being pursued? I don't know, but my brain is probably too plain-vanilla to think of many things that I'd find useful if someone else devised them.

I'm sure we need more ways to let someone detect or even diagnose a problem without staring at a screen all day. Bring on the synesthetes.

Technology Editor Peter Coffee can be reached at

Peter Coffee is Director of Platform Research at, where he serves as a liaison with the developer community to define the opportunity and clarify developers' technical requirements on the company's evolving Apex Platform. Peter previously spent 18 years with eWEEK (formerly PC Week), the national news magazine of enterprise technology practice, where he reviewed software development tools and methods and wrote regular columns on emerging technologies and professional community issues.

Before he began writing full-time in 1989, Peter spent eleven years in technical and management positions at Exxon and The Aerospace Corporation, including management of the latter company's first desktop computing planning team and applied research in applications of artificial intelligence techniques. He holds an engineering degree from MIT and an MBA from Pepperdine University; he has held teaching appointments in computer science, business analytics and information systems management at Pepperdine, UCLA, and Chapman College.
