Mission Information


Enabling Autonomous Crew Task Performance with Multimodal Electronic Procedure Countermeasures (80NSSC19K0657)
Research Area:
Human factors
Technology development
Species Studied
Scientific Name: Homo sapiens
Species: Human

Long-duration exploration missions (LDEM) in the near future will involve numerous challenges, including limited communication between the flight crew and ground-based support. Currently, all missions rely on regular, consistent communication between the crew and the Mission Control Center (MCC) to ensure optimal safety and performance. Beyond low Earth orbit, however, factors such as distance, radiation interference, and lunar or planetary orbits are likely to make regular communication with MCC difficult. Broadly speaking, ideal countermeasures would incorporate programs and tools that require little to no regular contact with ground-based support and can be used by any crewmember regardless of specialized training.

Multimodal technologies provide system support by combining auditory, visual, and tactile information and feedback. These inputs and outputs, when used with augmented reality (AR) displays, provide the adaptive, integrated assistance that will be crucial for LDEMs. AR describes any display that augments the real world with virtual sensory signals; it can be direct (such as head-mounted) or indirect (such as through a monitor), and typically provides at least visual and auditory feedback. The integration of multimodal technologies, electronic procedures, and system state, referred to as "enhanced electronic procedures" (EEP), will help reduce reliance on mission control and enable more autonomous operations. The EEP would also serve as a network for smart chips embedded in objects for location and identification; this sort of linked system is often called the Internet of Things (IoT). IoT can provide multiple types of two-way feedback and will be used to assess a multimodal interface with its input/output capabilities. Current technology is available via tablet and may feature step-by-step and/or animation features for performing a task, but it requires that these steps be executed correctly and offers little feedback if they are not. Multimodal EEP would provide not only different forms of feedback (visual, audio, and touch), but also interact with the three-dimensional environment through spatial recognition in a way tablet-based instructions cannot. This is especially useful in situations such as locating a tool or part in microgravity and verifying that the correct part or step has been used.
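The feedback loop described above can be illustrated with a minimal sketch: each procedure step expects a specific tagged part (identified via an IoT-style smart tag), and the system issues coordinated visual, auditory, and tactile cues depending on whether the crewmember scanned the correct part. All class names, tag IDs, and cue strings here are hypothetical illustrations, not details of the experiment's actual software.

```python
# Hypothetical sketch of an EEP step-verification loop.
# ProcedureStep, MultimodalFeedback, and the tag IDs below are
# illustrative assumptions, not part of the experiment itself.
from dataclasses import dataclass


@dataclass
class ProcedureStep:
    instruction: str
    expected_tag: str  # smart-tag ID of the required tool or part


@dataclass
class MultimodalFeedback:
    """Cues an AR display could render for one procedure step."""
    visual: str
    audio: str
    haptic: str


def check_step(step: ProcedureStep, scanned_tag: str) -> MultimodalFeedback:
    """Compare a scanned smart tag against the step's expected part."""
    if scanned_tag == step.expected_tag:
        return MultimodalFeedback(
            visual="highlight step complete (green)",
            audio="confirmation chime",
            haptic="single short pulse",
        )
    # Wrong part: give corrective feedback in all three channels,
    # rather than silently letting the error propagate.
    return MultimodalFeedback(
        visual=f"flag wrong part; expected tag {step.expected_tag}",
        audio="error tone",
        haptic="double warning pulse",
    )


step = ProcedureStep("Install filter cartridge", expected_tag="FILTER-A7")
print(check_step(step, "FILTER-A7").audio)   # confirmation chime
print(check_step(step, "WRENCH-03").visual)  # flag wrong part; expected tag FILTER-A7
```

The point of the sketch is the contrast with tablet-based instructions: because each step carries a machine-checkable expectation, an incorrect action produces immediate corrective feedback across several sensory channels instead of going unnoticed.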

This experiment seeks to address four distinct aims:

  1. Develop multimodal EEP countermeasures for autonomous crew performance.
  2. Evaluate crew performance using EEP.
  3. Validate the effects of multimodal countermeasures in an analog flight environment.
  4. Report findings and make guideline recommendations to NASA for multimodal electronic procedures.



Data Information
Data Availability
Archiving in progress. Data is not yet available for this experiment.

Auditory feedback
Errors, number
Frequency of errors

Mission/Study Information
Mission: HERA Campaign 6
Launch/Start Date: 10/01/2021
Duration: In progress

Additional Information
Managing NASA Center
Johnson Space Center (JSC)
Responsible NASA Representative
Johnson Space Center LSDA Office
Project Manager: Jessica Keune
Institutional Support
National Aeronautics and Space Administration (NASA)
Proposal Date
Proposal Source
2017 HERO 80JSC017N0001-Crew Health and Performance