Voice Interaction Design for Astronauts based on LLM and AR

NASA SUITS Challenge

Overview

Ursa is an operating system designed for astronauts on lunar missions, featuring an LLM-powered voice assistant, real-environment navigation, and a non-intrusive user interface.

Category:

UX Design | XR

Duration:

8 Months

My role

As part of the UC Berkeley team selected for NASA’s SUITS (Spacesuit User Interface Technologies for Students) Design Challenge, I joined the HMD team to design and implement the system in Unity. My primary focus was designing a safe, non-intrusive interface that doesn't interfere with multitasking.

Me working onsite | Team presentation

We created an AR voice-control interface to assist astronauts on the lunar surface, especially when their hands are occupied. The system uses a lightweight, locally run LLM to enable tasks like viewing maps, collecting geological data, and locating rovers, all without disrupting internal communication systems.
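To make the voice pipeline concrete, below is a minimal sketch of one way a spoken transcript could be mapped onto a small, closed set of commands. It is written in Python for readability rather than the Unity/C# used in the actual build, and every name in it (query_local_llm, the intent labels, the handlers) is an illustrative assumption, not the project's real API.

```python
# Hypothetical sketch: routing a voice transcript to one of a few fixed intents.
# query_local_llm stands in for the lightweight on-device model; here it is a
# trivial keyword heuristic so the example runs on its own.

from dataclasses import dataclass
from typing import Callable, Dict

INTENTS = ["open_map", "log_geology_sample", "locate_rover", "none"]

def query_local_llm(prompt: str) -> str:
    """Stand-in for the on-device LLM call (assumption, not the real interface)."""
    text = prompt.lower()
    if "map" in text:
        return "open_map"
    if "sample" in text or "rock" in text:
        return "log_geology_sample"
    if "rover" in text:
        return "locate_rover"
    return "none"

@dataclass
class VoiceAssistant:
    handlers: Dict[str, Callable[[], str]]

    def handle(self, transcript: str) -> str:
        # Constrain the model to a closed set of intents so a misheard phrase
        # can never trigger an unsupported (and potentially unsafe) action.
        intent = query_local_llm(f"Classify into one of {INTENTS}: {transcript}")
        if intent not in self.handlers:
            return "No action taken."
        return self.handlers[intent]()

assistant = VoiceAssistant(handlers={
    "open_map": lambda: "Showing terrain map.",
    "log_geology_sample": lambda: "Sample logged with current coordinates.",
    "locate_rover": lambda: "Rover marker pinned to the HUD.",
})

print(assistant.handle("Where did I park the rover?"))  # -> Rover marker pinned to the HUD.
```

Keeping the command set closed reflects the safety goal above: the assistant only ever performs a handful of vetted actions, so voice control stays predictable even when recognition is imperfect.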

Design Context

Two main questions were asked at the start of the design stage:

1. Who is our user?

Astronauts

2. What are some challenges they are facing?

Astronauts often carry heavy equipment, which restricts their ability to operate system controls by hand. The harsh lunar environment also requires continuous monitoring of suits and surroundings, making cognitive load a critical consideration.