FEATURES

Computer Hell

Why can't someone build a personal computer that's as easy to use as a microwave oven? A veteran high-tech journalist searches for answers.

September/October 1997



They're supposed to make us richer, smarter, even happier. But if personal computers are so great, I have a question: Why did my laptop just crash--for the fifth time today?

And Stanford grad and veteran computer journalist that I am, why can't I figure out how to move a picture from the top of my screen to the center?

I'm just asking.

It turns out I'm not alone in my frustration. The world is starting to notice that computers are cranky, complicated and decidedly user-unfriendly. In fact, so many people complain about these piles of silicon, wire, metal and glass that computer vendors are now ranked No. 7 on the Better Business Bureau's nationwide complaint list, just behind used-car dealerships and home remodeling contractors. Even brand-new computers fail at an alarming rate. More than half of the respondents to a recent Computerworld magazine survey found serious flaws with their new machines, right out of the box.

"Nobody would tolerate telephones that crash every other day," says Paul Strassmann, a former chief information officer for Xerox and the Department of Defense who is now a computer columnist. He's even harsher in print: "Software can easily rate among the most poorly constructed, unreliable and least maintainable technological artifacts invented by man--with perhaps the exception of Icarus's wings."

But industry experts at Stanford and throughout Silicon Valley point out that consumers are still gobbling up computers--so they can't be that bad. "A lot of people complain about what they want once they get it," says Avron Barr, a research director at Stanford's Computer Industry Project (see sidebar).

Computers are improving so fast, industry gurus say, that in 50 years we will achieve immortality by loading the sum of our experiences onto memory chips. "I hope to address an audience like this in 2047," Microsoft's chief of technology Nathan Myhrvold told a gathering of computer scientists last spring. "But I hope that I won't talk about software--but that I will be software."

That's something to ponder while you're waiting on hold for tech support. I've been doing a lot of that lately, and it got me wondering: What are the real reasons computers are such a pain? Why can't they build a PC that's as easy to operate as a microwave? Here's what I discovered:

A Young Industry

Many computer pros readily admit that their creations are flawed and confusing. But Stanford senior lecturer Joe Corn says that's normal for an innovation that only recently became an indispensable part of modern life.

Corn, who is an expert in the history of technology and is working on a book about the early days of the automobile, says the computer is a little like the Model T Ford, which didn't even come with a speedometer, a windshield wiper or a gas gauge. Like today's computer users, he says, early motorists had to cull maintenance and operating instructions from thick, nearly indecipherable manuals. And people read automobile magazines with gusto, marveling that something with 10,000 parts could run at all.

But the problem with computers is more acute. After all, automobiles never went through a period of such breakneck innovation, which itself is responsible for many of the glitches computer users experience. No manufacturer, no matter how thorough, can test its products with every compatible piece of software and hardware, says Don Norman, a former vice president of research at Apple and the author of the design classic, Things That Make Us Smart. "It's impossible to check all the combinations," he says. "So nobody can guarantee there won't be a problem."

The Evils of "Bloatware"

The brutal realities of the software marketplace are probably more responsible for our frustration than the industry's growing pains. Unlike cars and other consumer products, software doesn't wear out. Sell people a functional word processing package today, and they can use it for the rest of their lives. So, to continue making money, software vendors are caught in a never-ending quest to add features and functions they hope will entice consumers to buy upgrades.

As a result, today's fully loaded applications are more complicated, require more electronic memory and are more likely to crash than their forebears. For example, the first version of Microsoft Word, released in 1984, used a mere 27,000 lines of code. Today's version has 100 times as many lines and three times as many commands. Included are capabilities that not long ago could have passed for distinct applications in their own right: a drawing program, a desktop publishing program and a web page authoring program.

Philippe Kahn, a software executive who helped develop the Micral, one of the first personal computers in the 1970s, calls it "bloatware." Fierce competition accelerates this virtual obesity. When one vendor adds 10 more features, its competitors are forced to follow suit or lose market share.

Consumer experts insist buyers want bigger, better hardware and software. Just like fast cars and designer clothes, the latest computer gadget makes buyers feel powerful and cool, says Terry Winograd, a Stanford computer science professor who heads the program in human-computer interaction. "Most of it has to do with self-image and status."

When it comes to software, less is definitely not more, as Microsoft found out when it tried to sell a version of the spreadsheet program Excel with shorter, simpler menus. Nobody wanted to buy it, says Ken Dye, who manages the company's desktop products test labs. "People want the features because they say to themselves, 'One day I may use them.' Just like they prefer to buy a whole shed full of tools rather than the one wrench they need today."

The pressure to rush sexy new products to market is so great, in fact, that quality assurance often falls by the wayside, according to a survey of product managers in 11 software companies that was released last year by the Stanford Computer Industry Project.

A majority of the companies surveyed allowed engineers to add and remove features right up to the product release date--even if that meant that the manuals were no longer accurate. Squeezed by their own marketing departments, product managers poured out their frustration over slipshod quality control in post-survey interviews. "It almost brought some of them to tears," recalls Barr, who managed the study.

Facelift Needed

For an industry that prides itself on innovation, not much has changed lately about the face--or interface--computers show the world. The state of the art remains the design launched by Apple in 1984. Called a "graphical user interface," it enshrined the "desktop" and "folders" metaphor found on just about every personal computer sold today.

But that interface was designed to work on the primordial Macintosh, a computer equipped with the memory of a housefly and software as simple as a cookie recipe. It wasn't built to navigate the hundreds of folders and thousands of files in today's personal computer.

Bruce Tognazzini, who helped design the early Macintosh interface, now likens it to the electronic equivalent of caveman grunts. "You have to point, 'ugh,' to the thing you want to copy," Tognazzini says, gesturing with his hand at an imaginary screen. "Then, 'ugh,' to paste it. If you look at the screen, it's covered with grunt points."

In 1995, Microsoft launched a new interface that was supposed to make computing easy for neophytes and computerphobes. "Microsoft Bob" was built around the metaphor of a cozy room with a fireplace. Instead of applications, folders and files, there was a writing desk with a calendar, a checkbook, a Rolodex and a notepad. Clicking on these objects launched simple applications for scheduling appointments, balancing a checkbook and writing letters.

Newbies could select a personalized cartoon buddy to help them along. Hopper the rabbit would say: "I think you might want to save this file before quitting." Ruby the parrot would instead squawk, "Hey! You forgot to save that."

The idea for this facelift grew out of research by Byron Reeves and Clifford Nass, two Stanford communication professors whose experiments suggest that people use the rules of social interaction when dealing with computers. Nass and Reeves's vision, spelled out in their recent book, The Media Equation: How People Treat Computers, Television and New Media Like Real People and Places, was to create an interface that would play into that tendency. Microsoft Bob's different personalities were supposed to have resonance with their owners.

The trouble was that after the novelty of working with a computer buddy wore off and the rookies learned their way around, many found Bob's in-your-face alter egos just plain annoying. Microsoft recently shelved Bob's characters, but the company hasn't given up the notion that cartoon buddies who smile and wink can make computers more palatable. To help you find your way around Microsoft Office '97, for example, you can select one of nine different characters, ranging from Einstein and Shakespeare to a UFO, Mother Nature and Clip-It, a contortionist paper clip.

The search for a better interface is still on. Some experts predict that displays with maps, buttons and on-screen slider controls will someday take the place of the desktop and folders. Or maybe, as Macintosh pioneer Tognazzini foresees, a combination of voice commands and in-the-air gesturing will eventually make computers easier to operate. Others are betting that, in the future, we'll run our machines by donning special goggles and gloves and working in a simulated three-dimensional world.

Infantile Executives

Computers are also hard to use because super-competitive industry honchos won't agree on standards that would allow machines to work together smoothly and consistently. Mac files don't work in PCs, and PC files don't work very well on Macs. Word processing programs can't always talk to each other. And e-mail that reads fine when it leaves one screen can look like gibberish when it arrives on another.

There are international bodies established to set standards, but the industry views their members as stuffed shirts somewhere in Washington or Geneva plotting to stifle innovation. And by the time such bickering bodies make a decision, the argument goes, the industry has already surpassed them by two years. After all, the techies argue, a standard such as HTML--the computer language that made the World Wide Web possible--has been adopted voluntarily without any order from governments or industry bodies.

Some economists agree. Stanford's Paul David, a professor of economics, points out that governments aren't nimble enough to figure out what's best for fast-growing technologies like the Internet. And when they do impose standards, he says, they run the risk of locking the country into an inferior technology. That's exactly what happened when the government chose the current U.S. broadcast standard for television.

Many in the industry agree on the need for standards but maintain they should not be imposed from the outside. Our lives could be a lot easier if the industry had more self-discipline, says Norman, the former Apple exec who is currently a fellow with the company. "The computer industry is greedy, immature and childish. Everyone is saying, 'I want mine. I want it this way.' It has to grow up or be replaced by some other industry because this model is fundamentally wrong."

Geeks Rule

The roots of computers reach back to the days when only techies had to mess with software and vacuum tubes. Engineers created computers for other engineers, and companies with large and clunky mainframes employed whole technical staffs to keep the basement monsters humming. Upstairs, regular workers pecked at their terminals and left the plumbing to the geeks.

Now, the descendants of those geeks have come to rule the world--Bill Gates is the globe's richest man--so should it be any surprise that the ability to download the latest web browser and hold forth on the merits of megahertz and RAM has replaced the capacity to wax poetic about the engine of a Camaro or Mustang?

Being a techie has even become sexy. Dilbert's creator, Scott Adams, receives tons of e-mail from women smitten with his dumpy, nerdy creation. Adams has developed a Darwinian explanation: "In the old days, it was important to spear an antelope," he writes. "Today, all that matters is if you can install your own Ethernet card without having to call tech support."

Besides, it turns out that the technically demanding option is sometimes the more efficient one. Consider something as ordinary as a supermarket cash register. Checkers must be trained to type in a product code for every kind of fresh produce a store sells. This requires more know-how than, say, clicking on a picture of a ripe tomato. But it's also a lot faster when there are five different kinds of tomatoes, says Winograd, the Stanford computer science professor. As long as the complicated is more efficient than the simple, computers will remain complicated.

Glimmers of Hope

The industry is acutely aware of all this frustration and is hard at work searching for ways to make computers easier to use. For example, the ballyhooed "network computer"--a pet project of Microsoft's twin nemeses Sun Microsystems and Oracle--would revive the mainframe on a grand scale: Techies would do all the work of installing and maintaining fancy applications on a remote server, while users worked on a stripped-down PC.

Another idea is to break down the enormous software packages into smaller components. Instead of buying a multipurpose word processor that can be used to write correspondence, create web pages, produce sales presentations and lay out books, you'd buy only the software components you need. That's what a group of small companies called the Component 100 is trying to do for the Macintosh.

Norman, the former Apple executive, has doubts that either of these ideas will succeed in making computers any easier to use, especially for people with computers at home or in small businesses without in-house technical support. His proposal is to get rid of the "computer" altogether and replace it with a bunch of information "appliances," each good at a specific task and yet all communicating with one another so you don't have to type everything twice. There might be a datebook-sized computer where you keep your appointments, another device to use as a letter-writing pad and a third machine for accessing the Internet from your desk.

In some ways, Norman's vision is already becoming a reality. Consumer electronics companies churn out game machines, electronic address books and personal information managers like the Newton. Then there is WebTV for accessing the Internet, and new information products from companies such as Diba Inc. in Sunnyvale, which has built prototypes of electronic yellow pages, an electronic banker and an appliance for e-mailing and faxing.

The most hopeful sign on the horizon may be a growing corporate reluctance to buy every new high-tech release that comes off the Silicon Valley assembly line. Until recently, companies were afraid that if they didn't upgrade, they'd be left behind. Not anymore, says Christopher Germann of the Gartner Group, a technology consulting firm. He says that about one-third of his clients have started "asset management initiatives" to counter the reflex to upgrade at any cost. "It's a matter of corporations becoming savvy," Germann says.

Savvy or not, the current state of techno-confusion is likely to be with us for at least another decade, according to Stanford's Barr and others. So we will continue to cram our limited brains with the minutiae of acronyms, file names and product variations that become obsolete as soon as we learn them.

Oops, gotta go. Tech support is on the line.


Jiri Weiss, '78, lives in Berkeley and writes about the high-tech industry.
