The virtual apartment in which volunteers have been turning appliances on and off is modeled after Bayliss' own. Such a simple virtual world is the first step toward developing a way to accurately control the real world. Once Bayliss has perfected the computer's ability to determine what a person is looking at in the virtual room, the next hurdle will be to devise a system that can tell what object a person is looking at in the real world. BCI groups are also close to surmounting another obstacle: attaching the sensors to the head. Right now dozens of electrodes must be attached to the scalp one at a time with a gooey gel, but Bayliss says dry sensors are just around the corner, and simple slip-on head caps should not be far behind.
"One place such an interface may be very useful is in wearable computers," Ballard says. "With the roving eye as a mouse and the P300 wave as a mouse-click, small computers that you wear as glasses may be more promising than ever."
BCIs are divided into two categories: biofeedback and stimulus-response. Bayliss uses the latter approach, which simply measures the brain's response to an event. Biofeedback is a method in which a person learns to control some aspect of his or her physiology, such as relaxation, producing a change in brain activity that can be detected. Though many BCI groups use this approach, Bayliss decided against it because people must be trained, sometimes for a year or more, and not everyone can learn to control their thought patterns accurately.
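To make the stimulus-response idea concrete, here is a minimal sketch of one common way such a response is measured: short segments of EEG are time-locked to the stimulus and averaged, and a positive peak around 300 milliseconds after the event (the P300) is read as the brain's reaction. The sampling rate, window, and threshold below are illustrative assumptions, not values from Bayliss' experiments.

```python
import numpy as np

# Illustrative values only -- the article does not give the lab's parameters.
FS = 250                # EEG sampling rate in Hz (assumed)
WINDOW = (0.25, 0.45)   # seconds after the stimulus where a P300 peak is expected
THRESHOLD_UV = 5.0      # amplitude threshold in microvolts (assumed)

def epochs_from(eeg, onsets, fs=FS, length=0.8):
    """Cut a fixed-length epoch from one EEG channel at each stimulus onset."""
    n = int(length * fs)
    return np.array([eeg[s:s + n] for s in onsets if s + n <= len(eeg)])

def p300_detected(eeg, onsets, fs=FS):
    """Average the epochs time-locked to the stimulus and test for a P300.

    Averaging cancels the ongoing background EEG, leaving the event-related
    response; a positive deflection roughly 300 ms after the stimulus is
    taken as the brain registering the event.
    """
    avg = epochs_from(eeg, onsets, fs).mean(axis=0)
    lo, hi = int(WINDOW[0] * fs), int(WINDOW[1] * fs)
    return bool(avg[lo:hi].max() >= THRESHOLD_UV)
```

A detector along these lines could supply the p300_detected signal in the loop sketched earlier, and it shows why the approach needs no training: nothing in it depends on the user having learned to produce a particular brain state.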
Bayliss and Ballard work in the University's National Resource Laboratory for the Study of Brain and Behavior, which brings together computer scientists, cognitive scientists, and other researchers.
Contact: Jonathan Sherwood
University of Rochester