Simply Put, Few High-Tech Devices Are Designed for Ease of the User
I recently got a vivid reminder of what makes a technology product great.
Representatives of 3Com Corp., makers of the Palm hand-held computer, offered to “beam” some documents to my own Palm--that is, send them via the wireless communications port. I started to say that my Palm doesn’t do beaming, then remembered that the wireless feature is a key selling point--one of the many features I had never used, and therefore had forgotten.
It reminded me of why I’m a big fan of the Palm: It’s an understated marvel. It does the few things I need with intuitive ease, without cluttering up my experience with a host of unwanted “necessities.” Yet in this rare case when I needed to beam, I learned how in seconds. No wonder the Palm dominates its market.
Why is simple elegance so rare in the world of software-driven devices?
Design guru Alan Cooper has a few answers in his new book, “The Inmates Are Running the Asylum: Why High-Tech Products Drive Us Crazy and How to Restore the Sanity.”
Fortunately, the book is more readable than the overdrawn title. Unfortunately, Cooper’s answers seem a bit discouraging. The industry’s design problems are hard-wired into computing’s culture of creativity.
Keeping up with Internet time in Silicon Valley (and a few other high-tech enclaves) requires obsessive focus that promotes social isolation, Cooper argues. That contributes to an illusion on the part of engineers that the stuff they make is actually intuitive.
Software engineers work so hard to meet crushing deadlines; who can fault them for becoming a little self-absorbed? But consumers pay the price: Products resemble their creators--designed for function rather than ease of use. Says Cooper: “Programmers trade simplicity for control.”
For example, long ago it became a cliche for a product to be marketed as “the Swiss Army Knife” of its category. If only it were true.
“In physical objects, like my Swiss Army Knife, there is a natural brake on the proliferation of marginal features,” Cooper points out. “Each proposed new feature must pass a gantlet of justifications before it makes it into a shipping product.”
A knife with 100 blades would be impractical and dangerous. But unlike with a knife, the monetary cost of adding a software feature is low, so the temptation to pad a product with yet another small selling point seems irresistible. The result: clutter and complexity.
Consider the modern cell phone. It now includes games--hopelessly lame ones. Those games and similar nonsense create layers of cumbersome menus that bury practical features, such as entering speed-dial numbers.
Digital watches bristle with so many knobs and buttons that analog models seem to be making a comeback. Minimalist designs without numbers on the dial finesse the interface question with subtlety and style--they correctly assume people know where the digits would be.
*
The endless tendency toward feature bloat in a period of increasingly rapid product cycles means that software and Web sites have developed a pernicious dependence on the consumer as guinea pig and on tech support as an extension of product research. Only in high tech do consumers expect most products to have trouble operating reliably; almost any dreck thrown on the market survives long enough for version 3.1 to “engineer out” the most egregious stupidities.
Neal Stephenson, best-selling author of “cypherpunk” novels--geek thrillers--says the problems run even deeper. In an essay posted on his latest book’s Web site (https://www.cryptonomicon.com/beginning.html), he argues that today’s PCs mediate our experience via fundamentally confusing terms: the “massively promiscuous metaphor-mixing” that populates the computing “desktop.”
“Learning to use [those metaphors] is essentially a word game, a process of learning new definitions of words like ‘window’ and ‘document’ and ‘save’ that are different from, and in many cases almost diametrically opposed to, the old,” he argues. “What we’re buying into is the underlying assumption that metaphors are a good way to deal with the world.”
So what’s the fix? Cooper says it’s conceptually simple: Do fewer things better and always consider the user’s experience--keep “interaction design” in mind from start to finish. This would produce what Cooper calls “polite” software that doesn’t make a user feel stupid or incompetent, instead serving reliably, cheerfully and easily.
Jef Raskin, an original designer of Apple Computer’s Macintosh, has long advocated a more radical solution: ditching operating systems such as Windows and the Mac OS altogether in favor of computers that anticipate your needs. The act of drawing would launch a drawing program; typing would launch a text editor--all without the interference of a slow, complex kibitzer.
*
Unfortunately, computers as we know them may be too entrenched for such a profound rehab. For all the talk about the pace of change in high tech, consider this: The graphical user interface, or GUI, popularized by the Macintosh in 1984, formed the blueprint for all PC “interaction design” since. Sure, today’s GUIs are loaded with new features, but their underlying approach and system of metaphors have endured for more than 15 years.
Stephenson imagined the end of clunky metaphors in his novel “Snow Crash.” An ultra-realistic, 3D virtual-reality image acts as a librarian and guide to the vast expanse of online data. The character looks like a personification of Jeeves, the butler from the popular Ask Jeeves Web search engine: “A pleasant, fiftyish, silver-haired, bearded man with bright blue eyes, wearing a V-neck sweater over a work shirt, with a coarsely woven, tweedy-looking tie,” Stephenson wrote. Fast and competent, capable of carrying on a real conversation, the librarian “is eager without being obnoxiously chipper; he clasps his hands behind his back, rocks forward slightly on the balls of his feet, raises his eyebrows expectantly over his half-glasses.”
Of course, the VR gear, databases, processing power and high-speed networks needed to create such a vision mean this kind of experience won’t be common any time soon.
But in the meantime, teams of engineers are building a future of “ubiquitous” computers embedded in everyday objects, from dishwashers to walls to clothing, anticipating and serving our needs seamlessly and silently. In that vision, PCs would ultimately recede into the background as specialized terminals for running spreadsheets and word processors.
For those of us who have lives beyond the screen, it’s an appealing notion.
*
Times staff writer Charles Piller can be reached at [email protected].