Rethinking the human factor
NSF project to develop systems that put people at ease with working together online
A computer system operated with a glove, eye movements and spoken commands may not be much of a fashion statement. But Rutgers University researchers believe such a system offers a more natural way for people to communicate electronically.
Rutgers has completed a three-year, $780,000 research project funded by the National Science Foundation to develop systems that will replace the keyboard and mouse with devices that track the user's gaze, respond to voice commands and enable the user to move icons on the screen with a gloved hand that registers motion.
During the project, the New Jersey Army National Guard demonstrated that military planners could collaborate on planning procedures more easily and accurately using the system, called the Speech, Text, Image and Multimedia Advanced Technology Effort (Stimulate).
Stimulate is a joint initiative of NSF's Computer and Information Science and Engineering Directorate, the National Security Agency's Office of Research and Signals Intelligence Technology, the CIA's Office of Research and Development, and the Defense Advanced Research Projects Agency's Information Technology Office.
Rutgers researchers used a combination of prototype technologies to study how people interact with computers through systems that offer multiple modes of communication: sight, sound and touch, said Edward Devinney, senior associate director of Rutgers' Center for Advanced Information Processing (www.caip.rutgers.edu), where the Stimulate research was conducted. The research team plans to deliver its final report to NSF within a few weeks.
"We've found it is faster to use multimodal input," Devinney said. If someone who's never used a computer is told to grab an object using a glove, the tendency is to just grab in a natural way, he said, while it takes some time to become comfortable with a mouse.
The system uses special software called Fusion Agent to teach the computer to interpret what the user wants and to prioritize the activities, he said. For instance, in a situation where a user looks at one thing but says another, the computer may be programmed to always follow the voice command.
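Devinney's example suggests a simple rule-based arbitration scheme. The sketch below illustrates that idea in Python; the event structure, priority table and function names are all invented for illustration and do not come from the actual Fusion Agent software.

```python
# Hypothetical sketch of multimodal input fusion with a fixed priority rule:
# when gaze and voice disagree about the target, the voice command wins.
# None of these names come from the actual Fusion Agent software.

from dataclasses import dataclass
from typing import Optional

@dataclass
class InputEvent:
    modality: str   # "voice", "gaze" or "glove"
    target: str     # the object the user indicated
    timestamp: float

# Lower number = higher priority; voice overrides gaze, as in the example above.
PRIORITY = {"voice": 0, "glove": 1, "gaze": 2}

def resolve(events: list[InputEvent]) -> Optional[InputEvent]:
    """Pick the event the system should act on when modalities conflict."""
    if not events:
        return None
    return min(events, key=lambda e: PRIORITY.get(e.modality, 99))

# A user looks at the supply depot but says "select the airfield":
conflict = [
    InputEvent("gaze", "supply depot", 1.00),
    InputEvent("voice", "airfield", 1.02),
]
print(resolve(conflict).target)  # -> "airfield"
```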
A key part of the system is a force-feedback glove, patented by Rutgers, which reads gestures by detecting fingertip positions relative to the palm. When a user points to or picks up an object on the screen, the user receives feedback in the form of pressure from the glove. The glove weighs less than three ounces.
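As described, the glove's gesture reading comes down to comparing fingertip positions against the palm. Here is a minimal sketch of that approach; the thresholds, sensor layout and gesture labels are assumptions, not details of the patented Rutgers design.

```python
# Hypothetical sketch of gesture detection from fingertip-to-palm distances.
# Thresholds, coordinate conventions and gesture labels are invented.

import math

def distance(a, b):
    """Euclidean distance between two 3-D points."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify_gesture(palm, fingertips, grab_threshold=0.04):
    """Label the hand pose from fingertip positions relative to the palm.

    palm, fingertips: 3-D coordinates in meters from the glove's sensors.
    If every fingertip sits within grab_threshold of the palm, call it a
    grab; if only the index finger is extended, call it a point.
    """
    dists = [distance(palm, tip) for tip in fingertips]
    if all(d < grab_threshold for d in dists):
        return "grab"
    index_extended = dists[1] >= grab_threshold  # order: thumb=0, index=1, ...
    others_curled = all(d < grab_threshold
                        for i, d in enumerate(dists) if i != 1)
    if index_extended and others_curled:
        return "point"
    return "open"

# A hand with only the index finger away from the palm reads as a point:
palm = (0.0, 0.0, 0.0)
tips = [(0.01, 0.02, 0.0), (0.02, 0.09, 0.0),
        (0.01, 0.03, 0.0), (0.02, 0.02, 0.01), (0.01, 0.02, 0.02)]
print(classify_gesture(palm, tips))  # -> "point"
```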
The system also employs a gaze tracker. Rather than a headpiece, the device is a unit mounted on the desktop that rotates to detect where the user is looking. The user can direct a cursor just by looking at the computer screen. Voice recognition and voice synthesizing software can understand simple commands and respond audibly.
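Steering a cursor by gaze amounts to mapping the tracker's reported gaze angles onto screen pixels. A rough sketch of that mapping follows; the field-of-view figures and function are invented for illustration, not taken from the Rutgers device.

```python
# Hypothetical sketch: map a gaze direction reported by a desk-mounted tracker
# onto pixel coordinates, clamped to the screen edges. Geometry is invented.

def gaze_to_cursor(yaw_deg, pitch_deg,
                   screen_w=1024, screen_h=768,
                   fov_x=40.0, fov_y=30.0):
    """Convert gaze angles (degrees off screen center) to a pixel position.

    fov_x/fov_y: the angular span of the screen as seen from the user's eye.
    """
    # Normalize each angle to [-0.5, 0.5] across the screen's angular span.
    nx = max(-0.5, min(0.5, yaw_deg / fov_x))
    ny = max(-0.5, min(0.5, pitch_deg / fov_y))
    x = int((nx + 0.5) * (screen_w - 1))
    y = int((0.5 - ny) * (screen_h - 1))  # screen y grows downward
    return x, y

print(gaze_to_cursor(0.0, 0.0))    # looking at the center -> (511, 383)
print(gaze_to_cursor(20.0, 15.0))  # upper-right corner -> (1023, 0)
```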
For instance, if a military user tells the computer to create a camp, it will place the new camp where the user's eyes are pointed or where the user points with the glove and then respond that it has created the camp. "We've got quite a playground here," Devinney said.
Users can work together from different locations through a standard Internet connection or another type of network. They launch Rutgers' object-oriented groupware, called the Distributed System for Collaborative Information Processing and Learning (Disciple), which integrates the input from the various devices on each user's end into logical commands.
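The article does not describe Disciple's internals, but the idea of turning fused device inputs into logical commands and sharing them over a standard network connection can be sketched as follows; the command format and socket-based transport here are hypothetical, not the actual Disciple protocol.

```python
# Hypothetical sketch: package a fused multimodal input into a logical command
# and share it with collaborators as JSON over sockets. This illustrates the
# groupware idea only; it is not the actual Disciple protocol or API.

import json
import socket

def make_command(action, target, location):
    """Build a device-independent command from already-fused inputs."""
    return {"action": action, "target": target, "location": location}

def broadcast(command, peers):
    """Send the command to each collaborator's (host, port) address."""
    payload = json.dumps(command).encode("utf-8")
    for host, port in peers:
        with socket.create_connection((host, port), timeout=5) as conn:
            conn.sendall(payload)

# A voice command plus a glove point become one logical command, which then
# goes out to every connected planner:
cmd = make_command("create", "camp", {"x": 412, "y": 96})
# broadcast(cmd, [("192.0.2.10", 9000), ("192.0.2.11", 9000)])  # example peers
print(json.dumps(cmd))
```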
Devinney said potential applications include battlefield management, telemedicine and other types of collaborative decision-making during which users are mobile. The Rutgers team has suggested to the Defense Advanced Research Projects Agency that it employ an enhanced version of the system for designing the command post of the future. According to that proposal, users would collaborate around a three-dimensional situational map, moving assets around or identifying units and using the glove to retrieve information about a unit's readiness and size.
Seven participants from the Training Technology Battle Lab at Fort Dix, N.J., spent several months field-testing disaster-relief scenarios in which military planners could collaborate remotely to decide how to deploy resources and assess the terrain on a map-based display.
"The advantage is the ability to be able to see the map and the activity on the map," said Brig. Gen. William Marshall, deputy State Area Command commander for the New Jersey Army National Guard.
Marshall said the Rutgers Multimodal Input Manager led to the concept of the digital battlefield, in which planners can test a possible movement and see its impact.
Because moving an icon that represents a military unit automatically generates a wide variety of technical information about the action, such as the position or number of personnel in the unit, the Rutgers system saves considerable time and eliminates redundant data entry, Marshall said.
"If you move a flag on a flat map, you have to record the data in two or three places. It's one entry vs. three with grease pencils," he said.
Devinney said the system has potential for helping people with disabilities access computers. For example, the glove is being used elsewhere at Rutgers as a physical therapy tool for people with hand injuries.
But he said brand-new, specialized devices are not the ideal basis for such applications.
"The best chance of getting technology into the hands of the handicapped is if you adapt something made in large quantities," he said.