NASA Suit Challenge


Category:

Augmented Reality | UX System

Duration:

6 Months

Overview

As part of the Design team selected for NASA’s SUITS (Spacesuit User Interface Technologies for Students) Challenge, I collaborated with the HMD Engineering team to design a system in Figma and Unity. My primary focus was designing a safe, non-intrusive system that doesn't interfere with multitasking.

We created an AR voice-control interface to assist astronauts on the lunar surface, especially when their hands are occupied. The system uses a lightweight, locally run LLM to enable tasks like viewing maps, collecting geological data, and locating rovers, all without disrupting internal communication systems.
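As a rough illustration of this flow (a sketch only, not the project's actual implementation), the C# snippet below shows how a recognized transcript might be mapped to one of the task intents by a locally run model; the delegate-based model call and the intent labels are assumptions.

```csharp
using System;

// Illustrative sketch only: map a recognized voice transcript to a task intent.
// The on-device model is abstracted as a Func<string, string> so this compiles
// without assuming any particular local inference API; nothing here touches the
// suit's internal communication loop.
public enum UrsaIntent { Navigation, Egress, GeoSampling, ControlRover, LmccMessage, Unknown }

public static class IntentClassifier
{
    public static UrsaIntent Classify(string transcript, Func<string, string> localLlm)
    {
        string prompt =
            "Pick the single best label for the astronaut's request: " +
            "Navigation, Egress, GeoSampling, ControlRover, LmccMessage.\n" +
            $"Request: \"{transcript}\"";

        // Ask the locally running model for a one-word label.
        string label = localLlm(prompt).Trim();

        // Fall back to Unknown if the model answers with anything unexpected.
        return Enum.TryParse(label, true, out UrsaIntent intent)
            ? intent
            : UrsaIntent.Unknown;
    }
}
```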

Design Context

I started with two main questions at the beginning of the design stage:

1. Who is our user?

Astronauts

2. What are some challenges they are facing?

Astronauts often carry heavy equipment, restricting their use of hands for system controls. The harsh lunar environment requires continuous monitoring of suits and surroundings, making cognitive load a critical consideration.

  1. Physical Limitations 
    Restricted movement due to heavy equipment and spacesuits.

  2. Complex Environment 
    Multitasking is difficult in the complex lunar environment, and safety requires constant monitoring.

  3. Limited Visibility 
    Harsh lighting conditions (with areas of intense brightness or extreme darkness) on the moon restrict visibility.

Opportunities

How might we help astronauts explore the moon under physical and cognitive limitations using AI and AR technology?

Design Highlights

To facilitate these tasks, I designed a system that displays information on the astronaut's helmet when the astronaut communicates with the LMCC (Lunar Mission Control Center, where another astronaut inside the station provides guidance). When instructions are received, the UI appears and the voice assistant, Ursa, guides the astronaut through completing the tasks listed below (a rough dispatch sketch follows the task list).

  • T1 - Navigation

  • T2 - Egress

  • T3 - Geo Sampling

  • T4 - Control Rover

  • T5 - Communicate between LMCC and Astronaut
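The sketch below shows one way an incoming instruction could be routed to the matching task panel in Unity; the panel fields, the Dispatch method, and the idea of hiding inactive panels are placeholders for illustration, not the project's actual components.

```csharp
using UnityEngine;

// Illustrative sketch only: surface the panel that matches the requested task
// so only the active task occupies the astronaut's display.
public class TaskDispatcher : MonoBehaviour
{
    public GameObject navigationPanel;   // T1 - Navigation
    public GameObject egressChecklist;   // T2 - Egress
    public GameObject geoSamplingPanel;  // T3 - Geo Sampling
    public GameObject roverControlPanel; // T4 - Control Rover
    public GameObject lmccCommsPanel;    // T5 - LMCC communication

    // Called once the voice pipeline has resolved the astronaut's request.
    public void Dispatch(string task)
    {
        // Hide everything first, then enable the requested panel.
        foreach (var panel in new[] { navigationPanel, egressChecklist,
                 geoSamplingPanel, roverControlPanel, lmccCommsPanel })
            panel.SetActive(false);

        switch (task)
        {
            case "Navigation":   navigationPanel.SetActive(true);   break;
            case "Egress":       egressChecklist.SetActive(true);   break;
            case "GeoSampling":  geoSamplingPanel.SetActive(true);  break;
            case "ControlRover": roverControlPanel.SetActive(true); break;
            default:             lmccCommsPanel.SetActive(true);    break;
        }
    }
}
```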


LLM-Powered Responsive Voice Assistant

Ursa's voice assistant is enhanced with a Large Language Model (LLM), offering astronauts tailored instructions and feedback.

Non-intrusive Interfaces

Ursa features non-intrusive interfaces strategically placed in the user's peripheral vision, ensuring essential information is accessible without obstructing the view of the lunar environment.

These 2D prototypes show how the requirements would function on a digital screen.

Design Thinking

Design Thinking Example 1

Provide actionable insights, not just raw data.

Don't

👤User: “Ursa, what’s my heart rate?”

🐻Ursa: “Your heart rate is 90 bpm.”

This provides only the number without any interpretation or guidance.

Do

👤User: “Ursa, what’s my heart rate?”

🐻Ursa: “Your heart rate is 90 bpm {heart_rate}, and it’s within the nominal {nominal} range.”

Here, Ursa gives the heart rate and informs the astronaut that it’s within a safe range.
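A minimal sketch of how such a templated, interpreted response could be assembled is shown below; the nominal range (60 to 100 bpm) and the advisory wording are assumptions for illustration only.

```csharp
// Illustrative sketch only: fill the {heart_rate} and {nominal} slots of the
// response template so the astronaut hears an interpretation, not just a number.
// The nominal range below is a placeholder, not a mission value.
public static class HeartRateResponder
{
    const float NominalLow = 60f;
    const float NominalHigh = 100f;

    public static string Build(float heartRate)
    {
        bool nominal = heartRate >= NominalLow && heartRate <= NominalHigh;

        return nominal
            ? $"Your heart rate is {heartRate:0} bpm, and it's within the nominal range."
            : $"Your heart rate is {heartRate:0} bpm, which is outside the nominal range. Consider slowing your pace.";
    }
}
```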

Design Thinking Example 2

Simplifying Geographic Information Display

This project involved displaying complex geographic data clearly by removing unnecessary details. It featured two parts: a 2D map for locations like airlocks and rovers, and 3D navigation markers that appeared when astronauts needed guidance.

2D Map UI
3D Navigation Markers

The UX design should maximize spatial awareness and interactivity. Before generating routes, we introduced a brief scanning phase, helping users understand which part of their environment was being scanned. This also provided time for real-time updates of object positions.

For the 2D map, we iterated from a blank map to a multicolored geographic display, eventually refining it by removing excess colors and borders. We used HoloLens to sample (X, Y, Z) coordinates in Unity, creating real-time routes that highlighted critical information.
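The sketch below shows one way head-pose (X, Y, Z) coordinates might be sampled and rendered as a route line in Unity; the sampling interval and distance threshold are placeholder values rather than the project's actual parameters.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Illustrative sketch only: sample the headset's position at a fixed interval
// and feed the points to a LineRenderer so the walked route updates in real time.
public class RouteSampler : MonoBehaviour
{
    public LineRenderer routeLine;       // assigned in the Unity inspector
    public float sampleInterval = 1f;    // seconds between samples (placeholder)
    public float minStepDistance = 0.5f; // ignore head jitter smaller than this

    readonly List<Vector3> samples = new List<Vector3>();
    float nextSampleTime;

    void Update()
    {
        if (Time.time < nextSampleTime) return;
        nextSampleTime = Time.time + sampleInterval;

        // On HoloLens, the main camera follows the wearer's head pose.
        Vector3 head = Camera.main.transform.position;

        if (samples.Count == 0 ||
            Vector3.Distance(samples[samples.Count - 1], head) > minStepDistance)
        {
            samples.Add(head);
            routeLine.positionCount = samples.Count;
            routeLine.SetPositions(samples.ToArray());
        }
    }
}
```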

Design Thinking Example 3

Use visual feedback to assist communication

We designed a voice-controlled AR interface to help astronauts perform tasks without disrupting internal communications. However, voice distortion in space due to air density changes made word recognition difficult.

My focus was on improving Ursa's feedback to ensure astronauts knew whether their commands were accurately processed, setting clear expectations and enabling smooth task progression.
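The sketch below illustrates one possible form of that feedback: echoing the recognized phrase on the HUD and asking the astronaut to repeat low-confidence commands. The HUD text field and the confidence threshold are assumptions for illustration.

```csharp
using UnityEngine;

// Illustrative sketch only: confirm what was heard before acting on it, so the
// astronaut (and LMCC) can tell whether a command was recognized correctly.
public class RecognitionFeedback : MonoBehaviour
{
    public TextMesh hudText;                 // simple HUD label for the echo
    public float confidenceThreshold = 0.7f; // placeholder value

    // Called by the speech pipeline with the recognized phrase and a 0-1 score.
    public bool Confirm(string phrase, float confidence)
    {
        if (confidence >= confidenceThreshold)
        {
            hudText.text = $"Heard: \"{phrase}\"";  // visual confirmation
            return true;                            // safe to start the task
        }

        hudText.text = "Didn't catch that. Please repeat the command.";
        return false;
    }
}
```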

Learnings & Reflections

This project, spanning over six months, required collaboration between the design and front-end development teams.

One of my key insights came from designing the voice recognition system. Early research revealed that changes in air density in space can alter astronauts' voices, making word recognition difficult. This technical limitation impacted the reliability of voice commands in critical tasks. Our solution was to incorporate a large language model (LLM) to capture and correct these issues, though the core challenges ultimately lay in back-end development.

From a design standpoint, my focus was on creating a UI system that seamlessly communicated recognition results. Providing astronauts with real-time feedback ensured they could verify whether their voice commands were accurately understood.

This was not only essential for task completion but also helped astronauts in the Lunar Mission Control Center (LMCC) provide timely assistance, since they could see whether commands had been recognized correctly. Clear, natural UI responses became vital for enhancing overall communication between astronauts and the system.

My involvement in both design and development gave me a broader perspective on the project. This proved especially valuable in designing the map and geo-location system, which assisted astronauts in navigating back to base. I realized that this was not merely a 2D/UI design problem but a challenge of interaction design and visual guidance within a spatial context. Adopting flexible design thinking was crucial for developing a solution that worked effectively in this complex environment.