Multimodal technologies provide system support through a combination of auditory, visual, and tactile information and feedback. When paired with augmented reality (AR) displays, these inputs and outputs provide the adaptive, integrated assistance that will be crucial for long-duration exploration missions (LDEMs). AR refers to any display that augments the real world by introducing virtual signals into the sensory system; it can be direct (such as head-mounted) or indirect (such as through a monitor), and typically provides at least visual and auditory feedback. The integration of multimodal technologies, electronic procedures, and system state, referred to as "enhanced electronic procedures" (EEP), will help reduce reliance on mission control and enable more autonomous operations. The EEP would also serve as a network for smart chips embedded in objects for location and identification; this sort of linked system is often called the Internet of Things (IoT). IoT can provide multiple types of two-way feedback and will be used to assess a multimodal interface and its input/output capabilities. Current tablet-based technology may offer step-by-step instructions and/or animations for performing a task, but it requires that those steps be executed correctly and gives little feedback when they are not. A multimodal EEP would not only provide feedback across modalities (visual, audio, and touch) but also interact with the three-dimensional environment through spatial recognition in a way tablet-based instructions cannot. This is especially useful in situations such as locating a tool or part in microgravity and confirming that the correct part or step has been used.
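The step-verification loop described above, in which an IoT tag scan confirms that the correct tool or part is in use and the EEP responds across modalities, can be illustrated with a minimal sketch. All names here (tag IDs, feedback channel labels, the `verify_step` helper) are hypothetical and chosen for illustration only, not taken from the EEP prototype:

```python
from dataclasses import dataclass


@dataclass
class ProcedureStep:
    """One step in an electronic procedure, keyed to an IoT-tagged item."""
    description: str
    expected_tag: str  # hypothetical RFID/BLE tag ID of the required tool or part


def verify_step(step: ProcedureStep, scanned_tag: str) -> dict:
    """Compare a scanned tag against the step's expected item and return
    feedback for each modality (visual, audio, haptic)."""
    if scanned_tag == step.expected_tag:
        return {
            "visual": f"OK: correct item for '{step.description}'",
            "audio": "chime_ok",       # confirmation tone
            "haptic": "single_pulse",  # brief confirmation pulse
        }
    return {
        "visual": f"ERROR: wrong item; step requires tag {step.expected_tag}",
        "audio": "tone_error",         # error tone
        "haptic": "double_pulse",      # attention-getting pulse
    }


# Example: confirming a tool during a maintenance step
step = ProcedureStep("Torque panel bolts to spec", expected_tag="TOOL-0042")
print(verify_step(step, "TOOL-0042")["visual"])
```

Unlike a tablet PDF, which cannot tell whether the wrong wrench was picked up, this kind of two-way loop lets the system flag the error at the moment it occurs.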
This experiment seeks to address four distinct aims:
To achieve this, the experiment design will develop the EEP tool and integrate it into the experimental procedures outlined below. First, after consulting with NASA HRP, the team will either upgrade a previous version of the EEP prototype or design a completely new one, incorporating a NASA-provided electronic procedure system. Focus will be placed on the interactions that are most effective at displaying information in a way that can be formally evaluated. The research team will then evaluate crew performance, trust, and situational awareness with the multimodal EEP tool, using quantitative measures of subject performance versus a baseline electronic procedure tool (such as a digital PDF). Task performance measures will include the number of errors and situational awareness of the task procedures and their execution. Ten subjects per research group are anticipated to be recruited, each with a novice level of expertise in the tasks. This ground-based experiment will be conducted three times, once annually, in preparation for the flight analog validation experiment: the first will compare the tablet-only EEP against a PDF file, the second will compare multimodal EEP with tablet AR against head-mounted display (HMD) AR, and the third will test a hybrid AR condition (tablet or HMD). Following the ground-based experiments, the goal of the analog mission experiments will be to validate, in an operational environment, the effects measured in the ground-based studies. This process could potentially be consolidated with other VNSCOR experiments if resource conservation becomes necessary.
The study is still in progress; results will be made available at a later date.
| Mission | Launch/Start Date | Landing/End Date | Duration |
| --- | --- | --- | --- |
| HERA Campaign 6 | 10/01/2021 | In progress | |