DISCUSSION: Imagining the future of computers.

It is well known that computer user interfaces have never been ideal, but new research projects may rewrite the way we communicate with our machines.

Let's look at the current situation. Although the personal computer is supposed to greatly improve our productivity, it remains genuinely hard to use. This becomes obvious when we compare it to other machines in our daily lives. Does a TV require us to use a keyboard to change channels? No. Does it take five minutes to start a car? No. But the personal computer still retains the user-unfriendly features of its early days -- the keyboard, the quirky commands, the complicated procedures for entering information. We need a machine that lets us communicate on our own terms, not one that forces us to communicate on the machine's terms.

That's why DARPA (the Defense Advanced Research Projects Agency) wants to develop a whole new user interface for personal computers. According to DARPA, right now we spend too much time feeding information into computers, telling them how to accomplish specific tasks, and dealing with requests that distract us from real work. It would make far more sense, the agency argues, if computers could gather information themselves and adapt to user needs without being told how to do process-oriented work.

So DARPA is contributing $10 million (and is about to announce a second round of $40 million) to fund several universities -- Carnegie Mellon University, the Georgia Institute of Technology, MIT, the University of California at Berkeley, and the University of Washington -- to work on computer systems that can see, hear, and use artificial intelligence to respond to our needs.

Professor Daniel Siewiorek, director of Carnegie Mellon's Human-Computer Interaction Institute, said, "The most valuable resource in a computer system is no longer the processor, the memory, the disk, or the network; it's the user's attention." The solution, according to Siewiorek, is invisible computing (also known as pervasive or ubiquitous computing). The idea is to replace computers that require every command to be typed with computers that use microphones to listen, cameras to see, and speakers to deliver information to the user. By combining speech recognition, vision, and voice synthesis with artificial intelligence, these new systems can collect data and anticipate user needs on their own.
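The sense-infer-respond loop Siewiorek describes can be sketched in a few lines. This is purely illustrative -- the class names, intents, and trigger phrases below are invented for the sketch and are not part of any real Aura software:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Observation:
    source: str  # e.g. "microphone" or "camera"
    data: str    # recognized speech or a vision-system label

@dataclass
class InvisibleComputer:
    history: List[Observation] = field(default_factory=list)

    def infer_intent(self, obs: Observation) -> Optional[str]:
        """Stand-in for speech/vision recognition plus AI inference."""
        self.history.append(obs)
        if obs.source == "microphone" and "read my mail" in obs.data:
            return "open_inbox"
        if obs.source == "camera" and obs.data == "user_at_desk":
            return "show_morning_briefing"  # anticipated, not requested
        return None

    def respond(self, obs: Observation) -> str:
        # Deliver the result by voice rather than on a keyboard-bound screen.
        intent = self.infer_intent(obs)
        return f"speaker: {intent}" if intent else "speaker: (silent)"
```

The point of the sketch is the second rule: the camera observation triggers an action the user never asked for, which is what separates invisible computing from ordinary voice control.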

Of course, companies like IBM and Xerox have invested heavily in user interfaces for decades, and we are still waiting. However, some of the basic building blocks -- speech recognition software, artificial intelligence, and wireless networks -- are now mature enough to put the goal within reach.

Project Aura

Siewiorek and his colleagues call the DARPA-funded project Aura, and he says the end result will be a voice-controlled system whose user interface is made up of monitors, microphones, and speakers spread throughout the workplace. It should work pretty much like this: Harry is the lead designer at a virtual product design consulting firm. Every morning when he arrives at the office, he needs to respond to voicemails and emails, read articles related to his field, and see how the London office is doing. Today, Harry goes straight to his desk and spends an hour browsing the Web and typing. With Aura, the computer system will instead create a digital ambience for Harry that contains the news he wants to read, his voicemail and email inboxes, and the people he regularly contacts. This digital ambience won't exist on a single PC or laptop; instead, it will live throughout the network and follow Harry as he moves around the office. Harry could enter the system through sensor authentication, then use the microphones scattered throughout the building to control the computers.

That way, Harry could check his voicemail and e-mail from anywhere in the building, send news to the nearest display, and even send messages, such as a video of a prototype of his design, from any location. Carnegie Mellon plans to develop Aura using technology available now.
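The core idea -- session state that lives in the network and reattaches to whatever device the user is near -- can be sketched as follows. The names (`Ambience`, `AmbienceNetwork`, `follow`) are hypothetical, chosen to mirror the article's vocabulary, and the sensor-based authentication is reduced to a lookup:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Ambience:
    """One user's working context: news, inboxes, regular contacts."""
    user: str
    news: List[str] = field(default_factory=list)
    inbox: List[str] = field(default_factory=list)
    current_device: str = "none"

class AmbienceNetwork:
    def __init__(self) -> None:
        self._sessions: Dict[str, Ambience] = {}

    def authenticate(self, user: str, sensor_id: str) -> Ambience:
        # A badge or biometric sensor (sensor_id) identifies the user;
        # the network creates or retrieves that user's single ambience.
        return self._sessions.setdefault(user, Ambience(user=user))

    def follow(self, user: str, device: str) -> Ambience:
        # The same session state reappears on the nearest display or speaker.
        amb = self._sessions[user]
        amb.current_device = device
        return amb
```

Because `authenticate` always hands back the same `Ambience` object, a voicemail added at Harry's desk is already there when his session follows him to a hallway display.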

Project Oxygen

MIT is taking a somewhat different approach. Over the next four years, MIT and its partners -- DARPA, Acer, Delta Electronics, Hewlett-Packard, NTT, Nokia, and Philips Electronics -- will invest $50 million in an attempt to create an entirely new system, including hardware, software, and networking components. Like Aura, it aims to ease the complexity of using computers.

MIT's system, called the Oxygen project, will run on two devices. The Enviro 21 is a workstation-like computer that will serve as the command and control center for video cameras, telephones, microphones, and speakers throughout the home or office. The user interface is the Handy 21, a handheld device with a camera, microphone, and color screen that is controlled primarily by voice commands. At home or in the office, it will serve as a general-purpose remote control, and on the road it can serve as a personal digital assistant that maintains an uninterrupted connection with the Enviro 21.

So an aeronautical engineer could use the Handy 21 to capture photos of damage to a Boeing 747 on the ground, then project them onto the nearest monitor when he gets back to the office. He could then tell the chief engineer what needs to be repaired and have the Handy 21 ask the Internet-connected Enviro 21 to locate and purchase the needed parts.
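The division of labor in this scenario -- a handheld that captures and forwards requests, and a stationary, Internet-connected unit that does the heavy lifting -- can be sketched like this. The catalog, method names, and data shapes are invented for illustration; Oxygen's actual protocol is not public in this form:

```python
class Enviro21:
    """Stationary unit: has the network connection and does the searching."""
    def __init__(self, catalog):
        self.catalog = catalog  # part name -> (supplier, price)

    def find_part(self, name):
        supplier, price = self.catalog[name]
        return {"part": name, "supplier": supplier, "price": price}

class Handy21:
    """Handheld unit: captures input and delegates over its home link."""
    def __init__(self, home_unit):
        self.home_unit = home_unit  # the uninterrupted link back to base
        self.photos = []

    def capture_photo(self, label):
        self.photos.append(label)

    def request_parts(self, names):
        # Delegate the search to the Enviro 21 rather than doing it locally.
        return [self.home_unit.find_part(n) for n in names]
```

The handheld stays simple and battery-friendly because everything expensive happens on the Enviro 21.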

Project Portolano

Gaetano Borriello, a professor in the computer science department at the University of Washington, said, "The problem with most cell biology labs is that there's no way to accurately record the experimental process, so most of the work that's accomplished is in the biologist's head or in his laptop."

So the University of Washington will equip every piece of equipment in the new cell biology lab with radio-frequency markers to capture and record data. When a biologist places a sample under a microscope, a computerized control system will know what material is being examined and what the settings are. Pipettes containing chemical reagents also relay how much reagent is left in them and how much is used in each experiment. The system also uses voice and video technology to record the biologist's notes as he or she works.
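The record-keeping this enables can be sketched as a tagged instrument plus a control system that logs every reading automatically, so the experiment record no longer lives only in the biologist's head. Class names and the log format below are illustrative assumptions, not Portolano's design:

```python
class TaggedPipette:
    """A pipette carrying an RF tag that reports its own reagent level."""
    def __init__(self, tag_id, reagent, volume_ml):
        self.tag_id = tag_id
        self.reagent = reagent
        self.volume_ml = volume_ml

    def dispense(self, ml):
        self.volume_ml -= ml
        return ml

class LabRecorder:
    """The lab's control system: listens to tagged equipment and keeps the log."""
    def __init__(self):
        self.log = []

    def record_dispense(self, pipette, ml):
        used = pipette.dispense(ml)
        # The system knows which tag, which reagent, and how much remains,
        # without the biologist writing anything down.
        self.log.append(
            f"{pipette.tag_id}: dispensed {used} ml of "
            f"{pipette.reagent}, {pipette.volume_ml} ml remaining"
        )
```

Every manipulation leaves a timestamped-style trail in `recorder.log`, which is exactly the "accurate record of the experimental process" Borriello says today's labs lack.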

It will be five to ten years before these systems become mainstream. But we can already assemble a pervasive computing environment from PDAs, cell phones, digital video cameras, and MP3 players. What we need is an upper-layer architecture that integrates all of these devices, and that is what MIT and those companies are trying to build.