Future Astronaut Suits Could Feature Augmented Reality
NASA is eyeing the technology to address communication delays with Earth and help astronauts with decision making.
NASA's Johnson Space Center in Texas and Glenn Research Center in Ohio are exploring adding augmented reality to spacesuits in an effort to increase astronaut autonomy when real-time communication between space and Earth is limited.
According to last week’s request for information, NASA is looking for potential sources or providers for a spacesuit augmented reality display system for its Joint Augmented Reality Visual Informatics System, or Joint AR, project. As noted in the request, NASA “aims to realize a spacesuit compatible AR system that is comprised of a display, control and compute subsystems.”
Future suited crew operations are expected to be more self-reliant because “communication time delays constrain the ability for crew to interact with mission control support from Earth in a real-time capacity.” The requested AR system is a potential solution that would allow astronauts to carry out mission tasks during extravehicular activity.
The request stated that the AR system will provide a new communication medium between the crew and mission control on Earth by adding dynamic visual cueing, a short-term benefit. In the long run, it could enable interplanetary human exploration by supplementing or replacing mission support from Earth to help the crew make informed decisions. The cross-agency project includes the Glenn Research Center but will be primarily led out of NASA's Johnson Space Center, according to a NASA official.
Specifically, the request noted that an important element for the success of Joint AR is “to comfortably display information to the suited crew member via a minimally intrusive see-through display.” NASA does not want head-worn, near-eye display configurations because of operation and system integration challenges. Instead, the agency is looking for display configurations that are not attached to the user’s head, such as a heads-up display, although the constraints of the suit rule out traditional HUD designs. NASA provided sample renderings of what this could look like, as well as optical requirements for the AR system configuration.
NASA added that it would like the Joint AR displays to be full-color and binocular, though the first generation of these displays can be monochrome and monocular. The display elements may be located inside or outside the suit helmet bubble, but must be low-profile and minimally invasive to the bubble mold-line so they do not interfere with crew actions. If the elements are inside the bubble, they will need to operate in a 100% oxygen environment, and powered elements in that environment must address flammability and other safety concerns. Additionally, system elements will be subject to vacuum, dust, radiation and extreme thermal conditions, the RFI stated.
As noted in the RFI, the Joint AR display should mitigate vergence-accommodation conflict, a mismatch between where the eyes converge and where they focus that makes it difficult to view nearby objects comfortably. According to NASA, field tests demonstrate that, during suited crew operations, astronauts will need information displayed while looking at objects at varying distances, such as something on the distant horizon and then something up close, so technologies that can automatically adjust for this effect are beneficial, particularly for usability. NASA added that crew members will use the AR system periodically for up to eight hours.
NASA provided several questions and areas for respondents to address, such as how current AR systems could be modified or scaled to meet the agency’s requirements, how technologies under development could satisfy these requirements and the feasibility of the request, among other things.
Responses are due to the NASA contracting officer by 5 p.m. ET on March 17, 2023.
Editor's note: This story has been updated to reflect new information from NASA.