OASIcs, Volume 130

Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)




Event

SpaceCHI 2025, June 23-24, 2025, European Astronaut Centre (EAC), Cologne, Germany

Editors

Leonie Bensch
  • MIT Media Lab, Cambridge, MA, USA
Tommy Nilsson
  • Fraunhofer FIT, Sankt Augustin, Germany
Martin Nisser
  • University of Washington, Seattle, WA, USA
Pat Pataranutaporn
  • MIT Media Lab, Cambridge, MA, USA
Albrecht Schmidt
  • LMU Munich, Munich, Germany
Valentina Sumini
  • MIT Media Lab, Cambridge, MA, USA

Publication Details

  • published at: 2025-09-21
  • Publisher: Schloss Dagstuhl – Leibniz-Zentrum für Informatik
  • ISBN: 978-3-95977-384-3

Documents
Document
Complete Volume
OASIcs, Volume 130, SpaceCHI 2025, Complete Volume

Authors: Leonie Bensch, Tommy Nilsson, Martin Nisser, Pat Pataranutaporn, Albrecht Schmidt, and Valentina Sumini


Abstract
OASIcs, Volume 130, SpaceCHI 2025, Complete Volume

Cite as

Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 1-496, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@Proceedings{bensch_et_al:OASIcs.SpaceCHI.2025,
  title =	{{OASIcs, Volume 130, SpaceCHI 2025, Complete Volume}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{1--496},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025},
  URN =		{urn:nbn:de:0030-drops-247792},
  doi =		{10.4230/OASIcs.SpaceCHI.2025},
  annote =	{Keywords: OASIcs, Volume 130, SpaceCHI 2025, Complete Volume}
}
Document
Front Matter
Front Matter, Table of Contents, Preface, Conference Organization

Authors: Leonie Bensch, Tommy Nilsson, Martin Nisser, Pat Pataranutaporn, Albrecht Schmidt, and Valentina Sumini


Abstract
Front Matter, Table of Contents, Preface, Conference Organization

Cite as

Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 0:i-0:xvi, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{bensch_et_al:OASIcs.SpaceCHI.2025.0,
  author =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  title =	{{Front Matter, Table of Contents, Preface, Conference Organization}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{0:i--0:xvi},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.0},
  URN =		{urn:nbn:de:0030-drops-247780},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.0},
  annote =	{Keywords: Front Matter, Table of Contents, Preface, Conference Organization}
}
Document
Human-AI Interaction in Space: Insights from a Mars Analog Mission with the Harmony Large Language Model

Authors: Hippolyte Hilgers, Jean Vanderdonckt, and Radu-Daniel Vatavu


Abstract
The operational complexities of space missions require reliable, context-aware technical assistance for astronauts, especially when technical expertise is not available onboard and communication with Earth is delayed or limited. In this context, Large Language Models present a promising opportunity to augment human capabilities. To this end, we present Harmony, a model designed to provide astronauts with real-time technical assistance, fostering human-AI collaboration during analog missions. We report empirical results from an experiment involving seven analog astronauts that evaluated their user experience with Harmony in both a conventional environment and an isolated, confined, and extreme physical setting at the Mars Desert Research Station over four sessions, and discuss how the Mars analog environment impacted their experience. Our findings reveal the extent to which human-AI interactions evolve across various user experience dimensions and suggest how Harmony can be further adapted to suit extreme environments, with a focus on SpaceCHI.

Cite as

Hippolyte Hilgers, Jean Vanderdonckt, and Radu-Daniel Vatavu. Human-AI Interaction in Space: Insights from a Mars Analog Mission with the Harmony Large Language Model. In Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 1:1-1:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{hilgers_et_al:OASIcs.SpaceCHI.2025.1,
  author =	{Hilgers, Hippolyte and Vanderdonckt, Jean and Vatavu, Radu-Daniel},
  title =	{{Human-AI Interaction in Space: Insights from a Mars Analog Mission with the Harmony Large Language Model}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{1:1--1:20},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.1},
  URN =		{urn:nbn:de:0030-drops-239912},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.1},
  annote =	{Keywords: Extreme user experience, Human-AI interaction, Isolated-confined-extreme environment, Interaction design, Large Language Models, Mars Desert Research Station, Space mission, Technical assistance, Technical documentation, User experience}
}
Document
Best Practices in CubeSat Control Software Development: A Case Study of the SAGE Mission

Authors: Andrin Benz, Sebastian Oes, and Johannes Schöning


Abstract
The increasing complexity of CubeSat missions necessitates mission control software that is both efficient and user-friendly. This paper describes the design and implementation of a multi-mission CubeSat user interface for the SAGE Mission, a student-led CubeSat program. The system includes a web-based interface that provides telemetry visualisation, automated job scheduling, and real-time monitoring. Its architecture is designed to be modular, scalable, and accessible, allowing integration with multiple ground stations and support for various mission configurations. Input from stakeholders played a crucial role in shaping the interface through user evaluations, expert feedback, and digital twin simulations. Our findings highlight the significance of user-centred design in space mission software, particularly in educational and resource-constrained settings. The CubeSat SAGE mission control software enhances the accessibility of multi-mission operations and offers valuable insights into the future of space system interfaces.
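
As an illustration of the limit-checking that such real-time monitoring typically involves, the following minimal Python sketch flags out-of-range housekeeping values; the field names and limits are hypothetical and are not taken from the SAGE mission control software.

from dataclasses import dataclass

# Illustrative telemetry limit check; fields and limits are made up for this sketch.
@dataclass
class TelemetryFrame:
    timestamp: float        # seconds since epoch
    battery_voltage: float  # volts
    obc_temperature: float  # degrees Celsius

LIMITS = {
    "battery_voltage": (6.5, 8.4),    # assumed safe operating range
    "obc_temperature": (-20.0, 60.0),
}

def check_frame(frame: TelemetryFrame) -> list[str]:
    """Return a list of out-of-limit warnings for one telemetry frame."""
    alerts = []
    for field, (lo, hi) in LIMITS.items():
        value = getattr(frame, field)
        if not lo <= value <= hi:
            alerts.append(f"{field}={value} outside [{lo}, {hi}]")
    return alerts

if __name__ == "__main__":
    frame = TelemetryFrame(timestamp=0.0, battery_voltage=6.1, obc_temperature=25.0)
    print(check_frame(frame))  # -> ['battery_voltage=6.1 outside [6.5, 8.4]']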

Cite as

Andrin Benz, Sebastian Oes, and Johannes Schöning. Best Practices in CubeSat Control Software Development: A Case Study of the SAGE Mission. In Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 2:1-2:11, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{benz_et_al:OASIcs.SpaceCHI.2025.2,
  author =	{Benz, Andrin and Oes, Sebastian and Sch\"{o}ning, Johannes},
  title =	{{Best Practices in CubeSat Control Software Development: A Case Study of the SAGE Mission}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{2:1--2:11},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.2},
  URN =		{urn:nbn:de:0030-drops-239923},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.2},
  annote =	{Keywords: Automated mission operations, CubeSat operations, Ground segment software, Mission control software (MCS), Multi-mission operations, Satellite mission planning, SpaceCHI Telemetry data visualisation, User interface design}
}
Document
Multidimensional Usability Assessment in Spaceflight Analog Missions

Authors: Shivang Shelat, Katherine E. Homer, John A. Karasinski, and Jessica J. Marquez


Abstract
Software planning tools enable the self-scheduling of operational timelines during spaceflight, reducing reliance on ground support crews. Here, we assess analog crew perceptions of a planning tool’s usability across two space mission analogs with two validated questionnaires: the unidimensional System Usability Scale and the multidimensional User Experience Questionnaire. Critically, half the missions had assistive countermeasures integrated into the planning software interface whereas the other half did not. Correlation tests revealed high convergence between usability measures in the spaceflight analog setting. Group comparisons showed that the interface countermeasures enhanced several dimensions of usability, particularly for perceptions of the tool’s efficiency and dependability. These findings highlight the utility of a multidimensional approach to characterizing usability in order to capture fine-grained shifts in human-software dynamics, especially in spaceflight environments.
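
For reference, the System Usability Scale referred to above is scored on a 0-100 scale from ten 1-5 Likert items. The sketch below applies the standard scoring rule to made-up responses (not study data).

def sus_score(responses):
    """Compute the System Usability Scale score (0-100) from ten 1-5 Likert responses.

    Odd-numbered items are positively worded (contribution = response - 1),
    even-numbered items are negatively worded (contribution = 5 - response);
    the summed contributions are scaled by 2.5.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS expects ten responses on a 1-5 scale")
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Made-up responses from one hypothetical crew member (not data from this study).
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # 85.0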

Cite as

Shivang Shelat, Katherine E. Homer, John A. Karasinski, and Jessica J. Marquez. Multidimensional Usability Assessment in Spaceflight Analog Missions. In Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 3:1-3:9, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{shelat_et_al:OASIcs.SpaceCHI.2025.3,
  author =	{Shelat, Shivang and Homer, Katherine E. and Karasinski, John A. and Marquez, Jessica J.},
  title =	{{Multidimensional Usability Assessment in Spaceflight Analog Missions}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{3:1--3:9},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.3},
  URN =		{urn:nbn:de:0030-drops-239934},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.3},
  annote =	{Keywords: space usability, crew autonomy, self-scheduling software}
}
Document
Designing for Usability in Modular Space Habitats: A Space Syntax Perspective

Authors: Nick Dalton, Ruth Conroy Dalton, Sam McElhinney, and Christoph Hölscher


Abstract
As human ambitions turn toward establishing settlements on the Moon and Mars, the architectural and spatial configuration of these habitats has received comparatively little attention relative to their engineering systems. This paper explores the spatial configuration and architectural usability of modular extraterrestrial settlements, focusing on their potential growth, vulnerability, and navigability. Drawing from architectural theory and Space Syntax methods, we propose a novel framework for evaluating habitat layouts based on two key metrics: intelligibility and vulnerability. Using a combination of analytic tools and simulations - including adapted versions of the "beady-ring" growth model - we assess both small-scale configurations and larger aggregated settlements. We introduce a new vulnerability metric based on spatial types and configurational redundancy, allowing us to quantify how module failure can fragment a habitat system. Our findings confirm five core hypotheses, including the inverse relationship between structural resilience and spatial intelligibility as modular settlements scale, and the applicability of Space Syntax theories to non-terrestrial environments. We argue that without deliberate planning, accretive modular growth leads to declining usability, and we advocate for intentional and informed planning interventions, to sustain human-centered design at larger scales. This work provides a foundational methodology for evaluating and guiding the spatial evolution of off-world settlements.
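
To make the intelligibility metric concrete, the toy sketch below correlates each module's connectivity with a simple global integration proxy (the reciprocal of mean topological depth) on a hypothetical six-module layout. The paper's actual analysis uses Space Syntax integration measures and the adapted beady-ring growth model, not this simplification.

from collections import deque

# Hypothetical module adjacency for a small habitat layout (node = module, edge = hatch).
HABITAT = {
    "A": ["B"], "B": ["A", "C", "E"], "C": ["B", "D"],
    "D": ["C"], "E": ["B", "F"], "F": ["E"],
}

def mean_depth(graph, start):
    """Mean topological (step) depth from one module to all others, via BFS."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for nb in graph[node]:
            if nb not in depth:
                depth[nb] = depth[node] + 1
                queue.append(nb)
    others = [d for n, d in depth.items() if n != start]
    return sum(others) / len(others)

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

connectivity = [len(HABITAT[n]) for n in HABITAT]
integration = [1.0 / mean_depth(HABITAT, n) for n in HABITAT]  # crude proxy, not RA-normalised
print(round(pearson(connectivity, integration), 3))  # "intelligibility" of this toy layout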

Cite as

Nick Dalton, Ruth Conroy Dalton, Sam McElhinney, and Christoph Hölscher. Designing for Usability in Modular Space Habitats: A Space Syntax Perspective. In Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 4:1-4:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{dalton_et_al:OASIcs.SpaceCHI.2025.4,
  author =	{Dalton, Nick and Dalton, Ruth Conroy and McElhinney, Sam and H\"{o}lscher, Christoph},
  title =	{{Designing for Usability in Modular Space Habitats: A Space Syntax Perspective}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{4:1--4:16},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.4},
  URN =		{urn:nbn:de:0030-drops-239949},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.4},
  annote =	{Keywords: Space Syntax, architectural usability, intelligibility, lunar habitats, Mars habitats, modular design, human-centered computer simulation}
}
Document
VibroLink: A Wireless Vibro-Auditory Transmission System to Improve Situational Awareness During EVA

Authors: Gabriela Vega and Paul Strohmeier


Abstract
On earth, technicians rely on auditory or haptic cues, such as engine sounds and vibration, for a tacit understanding of complex machinery and its status. However, such vibrational cues are absent in space, potentially leaving astronauts unaware of safety-critical information about environmental changes during extravehicular activities (EVAs). This work-in-progress paper presents vibroLink, a concept for a standalone system designed to enhance situational awareness in spacewalks by wirelessly transmitting audio and vibration cues from machinery to the astronaut. Our approach employs a modular, two-component system: a transmitter (sensing) unit equipped with a piezo sensor that detects vibrations from machinery or other critical sources and a receiver unit with a vibrotactile actuator that can be attached, for example, to the astronaut’s helmet to replicate the detected vibrations. A preliminary evaluation with a proof-of-concept prototype shows that our concept successfully transmits basic tactile cues, and naive users can leverage their tacit understanding of actions and materiality to identify how the cues originated.
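
A minimal sketch of the signal path such a system implies, assuming a sampled piezo signal whose moving RMS envelope is mapped to 8-bit drive levels for a vibrotactile actuator; the sample rate, window size, and mapping are illustrative and are not taken from the vibroLink firmware.

import math

def rms_envelope(samples, window=64):
    """Moving RMS envelope of a sampled piezo signal (arbitrary units)."""
    env = []
    for i in range(0, len(samples), window):
        chunk = samples[i:i + window]
        env.append(math.sqrt(sum(s * s for s in chunk) / len(chunk)))
    return env

def to_pwm(envelope, full_scale=1.0):
    """Map envelope values to 8-bit PWM duty cycles for a vibrotactile actuator."""
    return [min(255, int(255 * e / full_scale)) for e in envelope]

# Synthetic 'machinery vibration' burst: a 200 Hz tone sampled at 8 kHz (made-up numbers).
signal = [0.6 * math.sin(2 * math.pi * 200 * n / 8000) for n in range(512)]
print(to_pwm(rms_envelope(signal)))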

Cite as

Gabriela Vega and Paul Strohmeier. VibroLink: A Wireless Vibro-Auditory Transmission System to Improve Situational Awareness During EVA. In Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 5:1-5:9, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{vega_et_al:OASIcs.SpaceCHI.2025.5,
  author =	{Vega, Gabriela and Strohmeier, Paul},
  title =	{{VibroLink: A Wireless Vibro-Auditory Transmission System to Improve Situational Awareness During EVA}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{5:1--5:9},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.5},
  URN =		{urn:nbn:de:0030-drops-239955},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.5},
  annote =	{Keywords: haptics, extravehicular activity, situational awareness}
}
Document
Toward an Earth-Independent System for EVA Mission Planning: Integrating Physical Models, Domain Knowledge, and Agentic RAG to Provide Explainable LLM-Based Decision Support

Authors: Kaisheng Li and Richard S. Whittle


Abstract
We propose a unified framework for an Earth‑independent AI system that provides explainable, context‑aware decision support for EVA mission planning by integrating six core components: a fine‑tuned EVA domain LLM, a retrieval‑augmented knowledge base, a short-term memory store, physical simulation models, an agentic orchestration layer, and a multimodal user interface. To ground our design, we analyze the current roles and substitution potential of the Mission Control Center - identifying which procedural and analytical functions can be automated onboard while preserving human oversight for experiential and strategic tasks. Building on this framework, we introduce RASAGE (Retrieval & Simulation Augmented Guidance Agent for Exploration), a proof‑of‑concept toolset that combines Microsoft Phi‑4‑mini‑instruct with a FAISS (Facebook AI Similarity Search)‑powered EVA knowledge base and custom A* path planning and hypogravity metabolic models to generate grounded, traceable EVA plans. We outline a staged validation strategy to evaluate improvements in route efficiency, metabolic prediction accuracy, anomaly response effectiveness, and crew trust under realistic communication delays. Our findings demonstrate the feasibility of replicating key Mission Control functions onboard, enhancing crew autonomy, reducing cognitive load, and improving safety for deep‑space exploration missions.
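
As a minimal sketch of the path-planning component, the following A* search runs over a made-up traverse-cost grid that stands in for the paper's terrain and hypogravity metabolic models; cell values and the Manhattan heuristic are illustrative assumptions.

import heapq

# Toy EVA route planner: A* over a small traverse-cost grid (higher = more costly terrain).
GRID = [
    [1, 1, 2, 9],
    [1, 3, 2, 9],
    [1, 1, 1, 1],
    [9, 9, 2, 1],
]

def neighbours(r, c, rows, cols):
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nr, nc = r + dr, c + dc
        if 0 <= nr < rows and 0 <= nc < cols:
            yield nr, nc

def astar(grid, start, goal):
    rows, cols = len(grid), len(grid[0])
    frontier = [(0, start, [start])]   # (estimated total cost, node, path so far)
    best = {start: 0}
    while frontier:
        _, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        for nb in neighbours(*node, rows, cols):
            cost = best[node] + grid[nb[0]][nb[1]]   # cost of entering the neighbouring cell
            if cost < best.get(nb, float("inf")):
                best[nb] = cost
                h = abs(goal[0] - nb[0]) + abs(goal[1] - nb[1])   # admissible Manhattan heuristic
                heapq.heappush(frontier, (cost + h, nb, path + [nb]))
    return None

print(astar(GRID, (0, 0), (3, 3)))   # list of grid cells along the planned traverse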

Cite as

Kaisheng Li and Richard S. Whittle. Toward an Earth-Independent System for EVA Mission Planning: Integrating Physical Models, Domain Knowledge, and Agentic RAG to Provide Explainable LLM-Based Decision Support. In Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 6:1-6:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{li_et_al:OASIcs.SpaceCHI.2025.6,
  author =	{Li, Kaisheng and Whittle, Richard S.},
  title =	{{Toward an Earth-Independent System for EVA Mission Planning: Integrating Physical Models, Domain Knowledge, and Agentic RAG to Provide Explainable LLM-Based Decision Support}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{6:1--6:17},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.6},
  URN =		{urn:nbn:de:0030-drops-239967},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.6},
  annote =	{Keywords: Human-AI Interaction for Space Exploration, Extravehicular Activities, Cognitive load and Human Performance Issues, Human Systems Exploration, Lunar Exploration, LLM}
}
Document
Unbound Human-Machine Interfaces for Interaction in Weightless Environments

Authors: Jessica R. Cauchard


Abstract
User interfaces are subject to the rules of physics (e.g., Newton's and Archimedes' laws) relevant to the environment they are in. As such, most interfaces and interaction techniques have been designed for the Earth's surface. However, when interacting with technology in weightless environments, such as in space, both human and machine are subject to different physical constraints. For instance, underwater or in space, people can experience spatial disorientation, which in turn affects how they use a system. This position paper conceptualizes unbound Human-Machine Interfaces (HMIs) as interfaces where either the human or the machine, or both, are located beyond the Earth's surface. In particular, it describes how traditional HCI needs to be rethought for interaction in weightless environments and how theoretical models such as joint cognition can support future developments of unbound interfaces.

Cite as

Jessica R. Cauchard. Unbound Human-Machine Interfaces for Interaction in Weightless Environments. In Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 7:1-7:8, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{cauchard:OASIcs.SpaceCHI.2025.7,
  author =	{Cauchard, Jessica R.},
  title =	{{Unbound Human-Machine Interfaces for Interaction in Weightless Environments}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{7:1--7:8},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.7},
  URN =		{urn:nbn:de:0030-drops-239970},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.7},
  annote =	{Keywords: human-robot interaction, gravity, space, interaction technique}
}
Document
CASIMAR, a Collaborative BVSR Project

Authors: Rico Nerger, Ian Luca Benecken, Paul Droste, Kristina Remić, Hanjo Schnellbächer, Stefan Ursu, Nils Hensch, Gregor Mokansky, Malte Eckermann, Simon Hestermann, Alexander Kerth, Talin Aswad, and Azalia Azzahra


Abstract
CASIMAR is a pioneering collaboration project within the Bundesverband studentischer Raumfahrt e.V. (BVSR), uniting student groups across Germany in designing, developing, and testing a modular lunar rover and specialized EVA (extravehicular activity) tools for human-robot interaction experiments at the LUNA Analog Facility in Cologne, Germany. This student-driven initiative aims to establish a lasting framework for integrating BVSR projects into professional lunar exploration research environments. The rover development focuses on three mission scenarios: EVA support, search and rescue, and stand-alone operations, all aimed at improving human-robot interaction for astronaut safety and efficiency. In parallel, students are developing EVA tools that provide essential hands-on experience and will complement the rover's proposed capabilities. These tools also enable early testing campaigns within the LUNA Analog Facility before the rover's completion. To enhance design, training, and operational planning, CASIMAR will integrate a virtual reality (VR) environment serving as a digital twin of the rover and the lunar setting, supporting interactive simulations, early-stage validation, and astronaut training. CASIMAR will not only contribute innovative technological solutions but also foster interdisciplinary collaboration and practical experience for future aerospace professionals.

Cite as

Rico Nerger, Ian Luca Benecken, Paul Droste, Kristina Remić, Hanjo Schnellbächer, Stefan Ursu, Nils Hensch, Gregor Mokansky, Malte Eckermann, Simon Hestermann, Alexander Kerth, Talin Aswad, and Azalia Azzahra. CASIMAR, a Collaborative BVSR Project. In Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 8:1-8:8, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{nerger_et_al:OASIcs.SpaceCHI.2025.8,
  author =	{Nerger, Rico and Benecken, Ian Luca and Droste, Paul and Remi\'{c}, Kristina and Schnellb\"{a}cher, Hanjo and Ursu, Stefan and Hensch, Nils and Mokansky, Gregor and Eckermann, Malte and Hestermann, Simon and Kerth, Alexander and Aswad, Talin and Azzahra, Azalia},
  title =	{{CASIMAR, a Collaborative BVSR Project}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{8:1--8:8},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.8},
  URN =		{urn:nbn:de:0030-drops-239989},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.8},
  annote =	{Keywords: CASIMAR, BVSR, LUNA Analog Facility, rover, EVA, tools, students, collaboration, virtual reality, VR, digital twin, HRI, HCI}
}
Document
Design as an Astronaut: An XR/VR Experience of the Argonaut Habitat Unit

Authors: Valentina Sumini, Cody Paige, Tommy Nilsson, Joseph Paradiso, Marta Rossi, Leonie Bensch, Ardacan Özvanlıgil, Deniz Gemici, Dava Newman, Gui Trotti, Aidan Cowley, and Lionel Ferra


Abstract
This research explores the conceptual design of a lunar habitat integrated with the Argonaut lander, an autonomous lunar landing vehicle currently under development by an international consortium led by the European Space Agency (ESA). As Europe’s first lunar lander, Argonaut was conceived to provide ESA and relevant European stakeholders with independent access to the Moon. Although the lander is primarily designed to transport various types of cargo to the lunar surface, this study proposes its adaptation as a platform for future human habitation: the Argonaut Habitat Unit. The project is the result of an international collaboration between ESA, the MIT Media Lab, and Politecnico di Milano. Drawing on a wide range of methodological approaches, this paper reflects on key aspects of the concept, including its synergy with the existing Argonaut project, algorithmic modeling of a lunar habitat, consideration of technical requirements, and interior design development. The project addresses the spatial, material, and environmental constraints of lunar habitation through a combination of three-dimensional modeling software, computational design tools, and virtual reality (VR) development environments. The integration of VR offers an immersive understanding of the proposed habitat, enabling a first-hand experience of its spatial qualities. This approach supports both the evaluation and refinement of the design, enhancing its livability and practical feasibility.

Cite as

Valentina Sumini, Cody Paige, Tommy Nilsson, Joseph Paradiso, Marta Rossi, Leonie Bensch, Ardacan Özvanlıgil, Deniz Gemici, Dava Newman, Gui Trotti, Aidan Cowley, and Lionel Ferra. Design as an Astronaut: An XR/VR Experience of the Argonaut Habitat Unit. In Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 9:1-9:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{sumini_et_al:OASIcs.SpaceCHI.2025.9,
  author =	{Sumini, Valentina and Paige, Cody and Nilsson, Tommy and Paradiso, Joseph and Rossi, Marta and Bensch, Leonie and \"{O}zvanl{\i}gil, Ardacan and Gemici, Deniz and Newman, Dava and Trotti, Gui and Cowley, Aidan and Ferra, Lionel},
  title =	{{Design as an Astronaut: An XR/VR Experience of the Argonaut Habitat Unit}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{9:1--9:14},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.9},
  URN =		{urn:nbn:de:0030-drops-239996},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.9},
  annote =	{Keywords: Argonaut, Lunar Habitat, Virtual Reality, Extended Reality Computational Design}
}
Document
A Postcard from Mars: Exploring Interplanetary Communications in Virtual Reality

Authors: Adalberto L. Simeone


Abstract
In this paper we present an Immersive Speculative Enactment (ISE) focused on the theme of interplanetary communications. ISEs are a novel approach extending conventional Speculative Enactments to Virtual Reality. We created a narrative-based scenario in which participants played the role of human colonists on either Mars or the Moon, to explore a possible future in which interplanetary communication becomes a necessity. To enact this scenario, we created a VR interactive experience to elicit feedback on the idea of communicating across planets. Through an exploratory qualitative analysis of this immersive enactment, we found that while the envisioned future was seen as too distant to prompt realistic behaviour from all participants, the enactment helped us and the participants reflect on the experience. We discuss these findings, drawing potential implications for improving the feeling of "really being there" even in implausible situations, and further contribute reflections on the role of ISEs in space-related scenarios.

Cite as

Adalberto L. Simeone. A Postcard from Mars: Exploring Interplanetary Communications in Virtual Reality. In Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 10:1-10:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{simeone:OASIcs.SpaceCHI.2025.10,
  author =	{Simeone, Adalberto L.},
  title =	{{A Postcard from Mars: Exploring Interplanetary Communications in Virtual Reality}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{10:1--10:16},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.10},
  URN =		{urn:nbn:de:0030-drops-240002},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.10},
  annote =	{Keywords: Immersive Speculative Enactments, Interplanetary Communications, Virtual Reality}
}
Document
Building Trustworthy Cognitive Monitoring for Safety-Critical Human Tasks: A Phased Methodological Approach

Authors: Maciej Grzeszczuk, Grzegorz Pochwatko, Barbara Karpowicz, Stanisław Knapiński, and Wiesław Kopeć


Abstract
Operators performing high-stakes, safety-critical tasks - such as air traffic controllers, surgeons, or mission control personnel - must maintain exceptional cognitive performance under variable and often stressful conditions. This paper presents a phased methodological approach to building cognitive monitoring systems for such environments. By integrating insights from human factors research, simulation-based training, sensor technologies, and fundamental psychological principles, the proposed framework supports real-time performance assessment with minimal intrusion. The approach begins with simplified simulations and evolves towards operational contexts. Key challenges addressed include variability in workload and the effects of fatigue and stress, and thus the need for adaptive monitoring and early-warning support mechanisms. The methodology aims to improve situational awareness, reduce human error, and support decision-making without undermining operator autonomy. Ultimately, the work contributes to the development of resilient and transparent systems in domains where human performance is critical to safety.

Cite as

Maciej Grzeszczuk, Grzegorz Pochwatko, Barbara Karpowicz, Stanisław Knapiński, and Wiesław Kopeć. Building Trustworthy Cognitive Monitoring for Safety-Critical Human Tasks: A Phased Methodological Approach. In Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 11:1-11:11, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{grzeszczuk_et_al:OASIcs.SpaceCHI.2025.11,
  author =	{Grzeszczuk, Maciej and Pochwatko, Grzegorz and Karpowicz, Barbara and Knapi\'{n}ski, Stanis{\l}aw and Kope\'{c}, Wies{\l}aw},
  title =	{{Building Trustworthy Cognitive Monitoring for Safety-Critical Human Tasks: A Phased Methodological Approach}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{11:1--11:11},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.11},
  URN =		{urn:nbn:de:0030-drops-240013},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.11},
  annote =	{Keywords: cognitive load, safety-critical systems, human performance, simulation environments, human factors, air traffic control, aviation}
}
Document
Conceptual Design, Manufacturing, and Assembly of a Tall Lunar Tower

Authors: Marina Konstantatou, Aran Sena, George Cann, Irene Gallou, Salvador Navarro Perez, Orla Punch, Marc Guberman, David Goodloe, and Platt Boyd


Abstract
This research paper presents the results of a NASA-funded collaborative project between Foster + Partners and Branch Technology that developed the design, manufacturing, and assembly of an optimised 50 m tall lunar tower for solar energy generation at the Moon's South Pole. The tower's structure is characterised by a helical diagrid geometry with integrated spiralling rails, designed to enable crane-free manufacturing, solar array deployment, and maintenance. The tower's unique geometry is designed to be compatible with a freeform 3D printing and cellular fabrication strategy, creating opportunities for in-situ resource utilisation and load path optimisation. Particular attention was paid to developing a site-specific design that takes into consideration the unique environmental and lighting conditions of the Lunar South Pole. Two demonstrators were fabricated by the consortium: a 1:50 scale functional prototype with a robotically deployed rotating solar array and a 1:1 scale 5 m section of the tower. Both were showcased in March 2025 during the "Earth to Space" exhibition at the Kennedy Center in Washington, DC. This project contributes to efforts to develop supporting infrastructure, such as power and communications networks, which will enable lunar exploration and a sustained human presence on the Moon and beyond.
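
For intuition about the spiralling-rail geometry, the sketch below samples points along a simple circular helix of the stated 50 m height; the radius and number of turns are assumptions, and the actual tower is an optimised helical diagrid rather than a single helix.

import math

def helix_points(height=50.0, radius=2.5, turns=4, samples=200):
    """Sample (x, y, z) points along one spiralling rail of a tower of the given height.

    Radius and turn count are illustrative placeholders, not the optimised tower geometry.
    """
    pts = []
    for i in range(samples + 1):
        t = i / samples
        angle = 2 * math.pi * turns * t
        pts.append((radius * math.cos(angle), radius * math.sin(angle), height * t))
    return pts

rail = helix_points()
print(rail[0], rail[-1])   # base and top points of the sampled rail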

Cite as

Marina Konstantatou, Aran Sena, George Cann, Irene Gallou, Salvador Navarro Perez, Orla Punch, Marc Guberman, David Goodloe, and Platt Boyd. Conceptual Design, Manufacturing, and Assembly of a Tall Lunar Tower. In Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 12:1-12:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{konstantatou_et_al:OASIcs.SpaceCHI.2025.12,
  author =	{Konstantatou, Marina and Sena, Aran and Cann, George and Gallou, Irene and Perez, Salvador Navarro and Punch, Orla and Guberman, Marc and Goodloe, David and Boyd, Platt},
  title =	{{Conceptual Design, Manufacturing, and Assembly of a Tall Lunar Tower}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{12:1--12:13},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.12},
  URN =		{urn:nbn:de:0030-drops-240027},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.12},
  annote =	{Keywords: Space Architecture, Lunar Infrastructure, Lunar Tower, Additive Manufacturing, In Situ Resource Utilisation (ISRU), Solar Energy}
}
Document
Exploring the Symbiotic Collaboration Paradigm in Virtual Reality and Its Potential Applications to Human Spaceflight

Authors: Florian Dufresne, Geoffrey Gorisse, and Olivier Christmann


Abstract
As the quest to go back to the Moon and beyond continues, preparation for such critical missions relies in part on the use of immersive technologies. In particular, the unique affordances of Virtual Reality (VR) make it possible to simulate scenarios in a convincing, digitally recreated space. But the potential of VR is not limited to emulating real-world environments. Indeed, work in the Human-Computer Interaction (HCI) community has explored new ways to collaborate virtually by inhabiting the same virtual representation, namely an avatar. Taking this paradigm further, one could offer new ways to collaborate between an immersed VR user and an external supervisor granted access to the virtual environment through non-immersive devices such as a computer or a smartphone. The non-immersed user could, for instance, inhabit some body parts of the VR user's avatar to benefit from unique viewpoints and leverage mutual spatial awareness as well as social interactions, akin to a symbiotic relationship that benefits both actors. This paper therefore introduces our ongoing research project exploring this new paradigm of symbiotic co-embodiment as a tool for leveraging social presence during supervised embodied sessions in VR. It discusses in particular how this paradigm could benefit human spaceflight, both in mission preparation and during spaceflight.

Cite as

Florian Dufresne, Geoffrey Gorisse, and Olivier Christmann. Exploring the Symbiotic Collaboration Paradigm in Virtual Reality and Its Potential Applications to Human Spaceflight. In Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 13:1-13:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{dufresne_et_al:OASIcs.SpaceCHI.2025.13,
  author =	{Dufresne, Florian and Gorisse, Geoffrey and Christmann, Olivier},
  title =	{{Exploring the Symbiotic Collaboration Paradigm in Virtual Reality and Its Potential Applications to Human Spaceflight}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{13:1--13:13},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.13},
  URN =		{urn:nbn:de:0030-drops-240034},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.13},
  annote =	{Keywords: Virtual Reality, Co-Embodiment, Human Spaceflight, Supervised Training, On-field Activities}
}
Document
Human Factors and Behavioral Performance Evaluation Framework for IntraVehicular Activities (IVAs) Under Simulated Lunar Gravity: Focus on the Lunar Agriculture Module (LAM)

Authors: Kyunghwan Kim, Daniel Schubert, Gisela Detrell, and Aidan Cowley


Abstract
The Planetary Infrastructure Research Group at the German Aerospace Center (DLR) is developing a Lunar Agriculture Module (LAM) to support sustainable food production and provide Bioregenerative Life Support System (BLSS) functions for long-duration lunar missions. Despite various ongoing research efforts on BLSS development and lunar surface human activities, a critical knowledge gap remains regarding how reduced gravity (0.16g) impacts human factors and behavioral performance (HFBP) during intravehicular activities (IVAs) in a lunar module. To fill the existing research gap, DLR is constructing the Lunar Agriculture Module - Reduced Gravity Simulator (LAM-RGS). The LAM-RGS integrates a Mixed Reality (MR) environment - combining Physical Reality (PR) mockups and Virtual Reality (VR) systems - with a gravity offloading system and multimodal data acquisition tools. This simulator will assess task performance, workload, and biomechanics under simulated lunar gravity conditions to optimize the internal system and rack design of the LAM, minimize ergonomic risks, and improve human-system interaction. To achieve these goals, this paper presents the experimental design and architecture of the LAM-RGS, introducing a four-pillar research framework consisting of: (1) simulator system development and experimental design, (2) system integration and validation, (3) human factors and performance assessment, and (4) data-driven design optimization. The proposed methodology provides a foundation for systematically evaluating human performance in lunar IVA operations and supports the evidence-based design of future lunar habitat systems.
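
As a back-of-the-envelope illustration of gravity offloading, the sketch below computes the constant upward force a suspension system must apply for a subject to feel lunar gravity on Earth; the constants are textbook values, not LAM-RGS specifications.

# Gravity-offload sizing sketch; values are standard physical constants, not facility data.
G_EARTH = 9.81   # m/s^2
G_MOON = 1.62    # m/s^2 (about 0.165 g)

def offload_force(mass_kg: float) -> float:
    """Upward force (N) needed to offload an Earth-bound subject to lunar effective gravity."""
    return mass_kg * (G_EARTH - G_MOON)

print(round(offload_force(80.0), 1))  # e.g. an 80 kg subject -> 655.2 N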

Cite as

Kyunghwan Kim, Daniel Schubert, Gisela Detrell, and Aidan Cowley. Human Factors and Behavioral Performance Evaluation Framework for IntraVehicular Activities (IVAs) Under Simulated Lunar Gravity: Focus on the Lunar Agriculture Module (LAM). In Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 14:1-14:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{kim_et_al:OASIcs.SpaceCHI.2025.14,
  author =	{Kim, Kyunghwan and Schubert, Daniel and Detrell, Gisela and Cowley, Aidan},
  title =	{{Human Factors and Behavioral Performance Evaluation Framework for IntraVehicular Activities(IVAs) Under Simulated Lunar Gravity: Focus on the Lunar Agriculture Module (LAM)}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{14:1--14:15},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.14},
  URN =		{urn:nbn:de:0030-drops-240044},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.14},
  annote =	{Keywords: Bioregenerative Life Support Systems (BLSS), Human Factors and Behavioral Performance (HFBP) Evaluation, Human-In-The-Loop (HITL), Intravehicular Activity (IVA), Lunar Agriculture Module (LAM), Mixed Reality (MR), Module Design Optimization, Simulated Lunar Gravity}
}
Document
Integrating Human-In-The-Loop AI to Tackle Space Communication Delay Challenges

Authors: Nikos Mavrakis, Effie Lai-Chong Law, and Hubert P. H. Shum


Abstract
Deep space missions face significant communication delays that disrupt both operational workflows and psychological support for crew members. Unlike low Earth orbit operations, delays ranging from several minutes to nearly an hour make real-time communication with mission control infeasible, forcing crews to act with greater independence under uncertain conditions. This position paper examines how human-in-the-loop AI, digital twins, and edge AI can be integrated to mitigate these delays while maintaining astronaut autonomy and engagement. We argue that human-in-the-loop AI enables decision-making processes that are responsive to local context while remaining adaptable to changing mission demands. Digital twins offer real-time simulation and predictive modelling capabilities, allowing astronauts to explore options and troubleshoot without waiting for ground input. Edge AI brings computation closer to data sources, enabling low-latency inference onboard spacecraft for time-critical decisions. These ideas are explored through two use cases: using deepfakes to support emotionally resonant communication with loved ones, and applying visual-language models for onboard fault diagnosis and adaptive task replanning. We conclude with reflections on system design challenges under constrained and high-stakes conditions.
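
The delay figures quoted above follow directly from light travel time; the sketch below computes the one-way delay for approximate Earth-Mars distances (textbook ranges, not mission figures).

C_KM_S = 299_792.458      # speed of light, km/s
AU_KM = 149_597_870.7     # one astronomical unit, km

def one_way_delay_minutes(distance_au: float) -> float:
    """One-way signal delay in minutes for a given Earth-spacecraft distance in AU."""
    return distance_au * AU_KM / C_KM_S / 60.0

for label, au in [("Mars at closest approach", 0.38), ("Mars near conjunction", 2.67)]:
    print(f"{label}: ~{one_way_delay_minutes(au):.1f} min one way")
    # roughly 3 to 22 minutes one way, i.e. up to ~44 minutes round trip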

Cite as

Nikos Mavrakis, Effie Lai-Chong Law, and Hubert P. H. Shum. Integrating Human-In-The-Loop AI to Tackle Space Communication Delay Challenges. In Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 15:1-15:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{mavrakis_et_al:OASIcs.SpaceCHI.2025.15,
  author =	{Mavrakis, Nikos and Law, Effie Lai-Chong and Shum, Hubert P. H.},
  title =	{{Integrating Human-In-The-Loop AI to Tackle Space Communication Delay Challenges}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{15:1--15:16},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.15},
  URN =		{urn:nbn:de:0030-drops-240051},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.15},
  annote =	{Keywords: Human-in-the-loop AI, communication delays, human spaceflight}
}
Document
Lighting Scenarios and Human Well-Being in Extreme Environments: Insights from St. Kliment Ohridski Antarctic Base

Authors: Christina Balomenaki, Ismene Chrysochoou, and Konstantinos-Alketas Oungrinis


Abstract
Lighting is a critical factor in human well-being, cognitive function, and performance - particularly in extreme and confined environments such as space habitats and Antarctic research stations. Despite its importance, limited research exists on how individuals adapt lighting to meet personal and contextual needs in these settings. This study investigates lighting use at the Bulgarian Antarctic base, focusing on how inhabitants adjusted natural and artificial light to support mood, comfort, and daily routines. Data was collected through on-site observations, qualitative interviews, and structured questionnaires. Results reveal key challenges in current lighting systems, including a lack of flexibility, user control, and emotional responsiveness. Findings suggest that human-centered lighting - especially systems that are adaptable, automated, and user-driven - can significantly improve well-being and operational efficiency in isolated environments. These insights inform future design strategies for space analogues and extraterrestrial habitats, contributing to the advancement of human-computer interaction in space exploration through responsive and personalized lighting solutions.

Cite as

Christina Balomenaki, Ismene Chrysochoou, and Konstantinos-Alketas Oungrinis. Lighting Scenarios and Human Well-Being in Extreme Environments: Insights from St. Kliment Ohridski Antarctic Base. In Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 16:1-16:25, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{balomenaki_et_al:OASIcs.SpaceCHI.2025.16,
  author =	{Balomenaki, Christina and Chrysochoou, Ismene and Oungrinis, Konstantinos-Alketas},
  title =	{{Lighting Scenarios and Human Well-Being in Extreme Environments: Insights from St. Kliment Ohridski Antarctic Base}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{16:1--16:25},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.16},
  URN =		{urn:nbn:de:0030-drops-240060},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.16},
  annote =	{Keywords: sensponsive design, human-centric lighting, circadian light, colorful lighting}
}
Document
MUSE: Designing Immersive Virtual Realities for Spaceflight UX Research

Authors: Noora Archer, Pasquale Castellano, and Aidan Cowley


Abstract
Virtual reality (VR) provides unique opportunities for assessing early spacecraft design and usability by employing human-centered narrative and scenario-driven design methods. This paper details a narrative-focused VR simulation of a speculative spaceflight scenario, emphasizing narrative techniques for enhancing user immersion, and user testing for evaluating operational usability aspects inside a spacecraft capsule. We designed a Modular User-centric Spaceflight Experience (MUSE), including a spacecraft capsule design and a virtual mission scenario based on the findings and suggestions of the Human Inspirator Co-Engineering (HICE) study. Results from user testing with MUSE underline the effectiveness and opportunities of narrative scenarios in early UX evaluations for improving experience flow, operational understanding, and user engagement. At the same time, several questions remain about the best methodology for measuring users' insight and action motivation arising from narrative immersion in the VR experience.

Cite as

Noora Archer, Pasquale Castellano, and Aidan Cowley. MUSE: Designing Immersive Virtual Realities for Spaceflight UX Research. In Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 17:1-17:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{archer_et_al:OASIcs.SpaceCHI.2025.17,
  author =	{Archer, Noora and Castellano, Pasquale and Cowley, Aidan},
  title =	{{MUSE: Designing Immersive Virtual Realities for Spaceflight UX Research}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{17:1--17:13},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.17},
  URN =		{urn:nbn:de:0030-drops-240079},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.17},
  annote =	{Keywords: Virtual Reality, Spaceflight Simulation, Narrative Design, Game Design, Scenario Design, Immersive Experience}
}
Document
Advancing Intelligent Personal Assistants for Human Spaceflight

Authors: Leonie Bensch, Oliver Bensch, and Tommy Nilsson


Abstract
The Artemis program and upcoming missions to Mars mark a new era of human space exploration that will require new tools to support astronaut autonomy in the absence of real-time communication with Earth. This paper investigates the role of voice-based intelligent personal assistants (IPAs) in future crewed space missions. Through semi-structured interviews with astronauts (n=3) and spaceflight experts (n=12), we identify key user-centered design requirements for IPAs in this uniquely constrained and safety-critical environment. Our thematic analysis reveals core requirements for flexibility, reliability, offline capability, and multimodal interaction. Drawing on these findings, we outline design guidelines for next-generation IPAs and discuss how technologies such as retrieval-augmented generation (RAG), knowledge graphs, and augmented reality should be combined to support flexible, reliable, and multimodal IPAs for future human spaceflight missions.
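
As a minimal, self-contained illustration of the retrieval-augmented generation pattern discussed here, the sketch below scores a tiny hypothetical procedure store by word overlap and builds a grounded prompt; a real onboard assistant would use vector embeddings, knowledge graphs, and an LLM rather than this toy retrieval.

# Illustrative RAG prompt assembly; the procedures and scoring are made up for this sketch.
PROCEDURES = {
    "water-recovery": "Procedure 4.2: if the water recovery assembly alarms, check the pump inlet filter.",
    "co2-scrubber": "Procedure 2.7: replace the CO2 scrubber cartridge when ppCO2 exceeds the caution limit.",
    "comm-loss": "Procedure 9.1: on loss of S-band comm, switch to the backup transponder and re-point the antenna.",
}

def retrieve(question: str, k: int = 1) -> list[str]:
    """Return the k documents with the largest word overlap with the question."""
    words = set(question.lower().split())
    scored = sorted(
        PROCEDURES.values(),
        key=lambda doc: len(words & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(question: str) -> str:
    """Prepend retrieved context so the model answers from documentation, not memory."""
    context = "\n".join(retrieve(question))
    return f"Context:\n{context}\n\nQuestion: {question}\nAnswer using only the context above."

print(build_prompt("What should I do if the water recovery assembly alarms?"))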

Cite as

Leonie Bensch, Oliver Bensch, and Tommy Nilsson. Advancing Intelligent Personal Assistants for Human Spaceflight. In Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 18:1-18:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{bensch_et_al:OASIcs.SpaceCHI.2025.18,
  author =	{Bensch, Leonie and Bensch, Oliver and Nilsson, Tommy},
  title =	{{Advancing Intelligent Personal Assistants for Human Spaceflight}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{18:1--18:18},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.18},
  URN =		{urn:nbn:de:0030-drops-240082},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.18},
  annote =	{Keywords: Conversational Assistant, Intelligent Personal Assistant, Artificial Intelligence, Astronaut, Human Spaceflight, Generative Pre-Trained Transformer (GPT), Retrieval Augmented Generation (RAG), Knowledge Graphs, Augmented Reality, Voice Assistant, Long Duration Spaceflight}
}
Document
Multi-Axis, Multi-Material Additive Fabrication of Multi-Layer Conformal SMD Circuitry to Support In-Space Mission Resilience

Authors: Ivan Revenga Riesco, Borut Lampret, Connor Myant, and David Boyle


Abstract
This work presents the development and evaluation of multi-material, multi-axis Material Extrusion (MEX) additive manufacturing combined with electroplating for the fabrication of complex conformal circuitry. The proposed approach enables the direct printing of functional electronics onto conformal surfaces, while offering a lower-cost and lower-complexity alternative to conventional PCB manufacturing and other in-space electronics fabrication methods. A key contribution of this work is the introduction of small multi-material bridges as a lightweight and scalable solution to miniaturisation challenges in 3D-printed electronics. The printed circuits' physical dimensions were analysed and compared among samples, and their electrical performance was benchmarked against traditional FR4 PCBs. Lastly, the role of such a system is evaluated in the context of a space exploration mission. While the printed circuits exhibited increased noise and reduced reliability, they successfully demonstrated the ability to regulate and deliver current. The results highlight MEX-based additive manufacturing as a lower-cost alternative to proposed in-space additive electronics manufacturing processes.
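
A rough R = ρL/A comparison illustrates why electroplating the printed traces matters; the resistivity assumed for the conductive print material is an order-of-magnitude guess typical of carbon-loaded filaments, not a value measured in the paper.

# Trace-resistance sketch for a 50 mm long, 1 mm x 0.2 mm cross-section trace.
RHO_COPPER = 1.68e-8          # ohm*m, textbook value for copper
RHO_CONDUCTIVE_PRINT = 1e-2   # ohm*m, assumed order of magnitude for a conductive filament

def trace_resistance(rho, length_m=0.05, width_m=1e-3, thickness_m=0.2e-3):
    """Resistance of a rectangular trace, R = rho * L / A."""
    return rho * length_m / (width_m * thickness_m)

print(f"electroplated copper: {trace_resistance(RHO_COPPER) * 1e3:.2f} mOhm")
print(f"conductive print:     {trace_resistance(RHO_CONDUCTIVE_PRINT):.1f} Ohm")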

Cite as

Ivan Revenga Riesco, Borut Lampret, Connor Myant, and David Boyle. Multi-Axis, Multi-Material Additive Fabrication of Multi-Layer Conformal SMD Circuitry to Support In-Space Mission Resilience. In Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 19:1-19:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


Copy BibTex To Clipboard

@InProceedings{revengariesco_et_al:OASIcs.SpaceCHI.2025.19,
  author =	{Revenga Riesco, Ivan and Lampret, Borut and Myant, Connor and Boyle, David},
  title =	{{Multi-Axis, Multi-Material Additive Fabrication of Multi-Layer Conformal SMD Circuitry to Support In-Space Mission Resilience}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{19:1--19:17},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.19},
  URN =		{urn:nbn:de:0030-drops-240093},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.19},
  annote =	{Keywords: Space Digital Fabrication, Additive Manufactured Electronics Systems, 3D printed electronics, In-space manufacturing}
}
Document
Navigating Exoplanetary Systems in Augmented Reality: Preliminary Insights on ExoAR

Authors: Bryson Lawton, Frank Maurer, and Daniel Zielasko


Abstract
With thousands of exoplanets now confirmed by space missions such as NASA’s Kepler and TESS, scientific interest and public curiosity about these distant worlds continue to grow. However, current visualization tools for exploring exoplanetary systems often lack sufficient scientific accuracy or interactive features, limiting their educational effectiveness and analytical utility. To help address this gap, we developed ExoAR, an augmented reality tool designed to offer immersive, scientifically sound visualizations of all known exoplanetary systems using data directly sourced from NASA’s Exoplanet Archive. By leveraging augmented reality’s strengths, ExoAR enables users to immerse themselves in interactive, dynamic 3D models of these planetary systems with data-driven representations of planets and their host stars. The application also allows users to adjust various visualization scales independently, a capability designed to aid comprehension of comparative astronomical properties such as orbital mechanics, planetary sizes, and stellar classifications. To begin assessing ExoAR’s potential as an educational and analytical tool and inform future iterations, a pilot user study was conducted. Its findings indicate that participants found ExoAR improved user engagement and spatial understanding compared to NASA’s Eyes on Exoplanets application, a non-immersive exoplanetary system visualization tool. This work-in-progress paper presents these early insights, acknowledges current system limitations, and outlines future directions for more rigorously evaluating and further improving ExoAR’s capabilities for both educational and scientific communities.
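
To illustrate the independent scale adjustment described in this abstract (orbital distances and planetary sizes scaled separately so both stay legible in one scene), here is a minimal Python sketch. The Planet fields, scale factors, and example values are invented for illustration and are not taken from ExoAR.

# Illustrative sketch of scaling orbital distances and planetary radii
# independently before placing objects in an AR scene.
from dataclasses import dataclass

@dataclass
class Planet:
    name: str
    semi_major_axis_au: float   # orbital distance in astronomical units
    radius_earth: float         # planetary radius in Earth radii

def to_scene_units(p: Planet, distance_scale: float, size_scale: float) -> tuple[float, float]:
    """Map physical quantities to scene units with independent scale factors."""
    return p.semi_major_axis_au * distance_scale, p.radius_earth * size_scale

system = [Planet("b", 0.05, 1.1), Planet("c", 0.21, 2.4)]
for planet in system:
    d, r = to_scene_units(planet, distance_scale=10.0, size_scale=0.5)
    print(f"{planet.name}: orbit radius {d:.2f} m, sphere radius {r:.2f} m")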

Cite as

Bryson Lawton, Frank Maurer, and Daniel Zielasko. Navigating Exoplanetary Systems in Augmented Reality: Preliminary Insights on ExoAR. In Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 20:1-20:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


Copy BibTex To Clipboard

@InProceedings{lawton_et_al:OASIcs.SpaceCHI.2025.20,
  author =	{Lawton, Bryson and Maurer, Frank and Zielasko, Daniel},
  title =	{{Navigating Exoplanetary Systems in Augmented Reality: Preliminary Insights on ExoAR}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{20:1--20:13},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.20},
  URN =		{urn:nbn:de:0030-drops-240106},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.20},
  annote =	{Keywords: Immersive Analytics, Data Visualization, Astronomy, Astrophysics, Exoplanet, Augmented Reality, AR}
}
Document
Towards Passively Actuated Short-Range Telehaptics for Astronauts

Authors: Matvey Boguslavskiy, Digby Chappell, and Thrishantha Nanayakkara


Abstract
Human extra-vehicular activity (EVA) plays a vital role in current and near future space exploration for two reasons: the superior dexterity exhibited by human astronauts, and their flexible problem-solving and decision-making capabilities. However, the dexterity of astronauts during EVA is limited by the flexibility and tactility of their EVA suit gloves, which are primarily designed to provide thermal insulation and pressure for the hand. This creates a compromise between utility and protection. To address this compromise, a Passively Actuated Short-range Telehaptic (PAST) device is proposed. The PAST device couples the motion of fingers between a robotic hand and a human hand through a hydraulically actuated linkage. It also transfers tactile information, including pressure, direction of motion, and position of contact, via a taxel array. Results demonstrate that the proposed prototype PAST device surpasses an unpressurised benchmark heavy work glove (HWG) in tasks involving tactile position and motion direction identification. This provides evidence supporting the feasibility of enhancing astronaut dexterity during EVA through the use of PAST devices as an alternative paradigm to gloves.

Cite as

Matvey Boguslavskiy, Digby Chappell, and Thrishantha Nanayakkara. Towards Passively Actuated Short-Range Telehaptics for Astronauts. In Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 21:1-21:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


Copy BibTex To Clipboard

@InProceedings{boguslavskiy_et_al:OASIcs.SpaceCHI.2025.21,
  author =	{Boguslavskiy, Matvey and Chappell, Digby and Nanayakkara, Thrishantha},
  title =	{{Towards Passively Actuated Short-Range Telehaptics for Astronauts}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{21:1--21:16},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.21},
  URN =		{urn:nbn:de:0030-drops-240115},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.21},
  annote =	{Keywords: Extra-vehicular Activity, Mechanism Design, Haptics, Dexterity, Tactility}
}
Document
Mixed-Initiative Dynamic Autonomy Through Variable Levels of Immersion and Control (MIDA-VIC): A New Paradigm for Collaborative Robotic Teleoperation in Space Exploration

Authors: Hans-Christian Jetter, Leon Raule, Jens Gerken, and Sören Pirk


Abstract
In this position paper, we propose the new control paradigm and conceptual framework MIDA-VIC for collaborative robotic teleoperation in space exploration and beyond. Such teleoperation is a complex and demanding team effort with distributed responsibilities that require both efficient human-robot and human-human collaboration. To address these challenges, we propose a new paradigm of mixed-initiative dynamic autonomy for robotic teleoperation. It exploits recent advances in human-computer interaction (HCI), human-robot interaction (HRI), augmented and virtual reality (AR/VR), and artificial intelligence (AI) research. By integrating methods from multiple fields, our paradigm allows human operators to choose their preferred level of immersion, from traditional 2D graphical user interfaces (GUIs) to fully immersive AR/VR environments. It also supports a dynamic adjustment of the level of control, ranging from direct motor commands (e.g., using a joystick) to high-level task delegation using AI (e.g., instructing the robot via natural language to select a path or explore autonomously). In addition, we propose a mixed-initiative paradigm in which a robot can also take the initiative, request human assistance, and propose the specific level of immersion and control to the human operator that it currently considers useful for effective and efficient collaboration.
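
As a schematic reading of the paradigm sketched in this abstract, the short Python snippet below models operator-selectable immersion and control levels plus a robot-initiated proposal. The enum values and the robot_requests_assistance behaviour are our own illustrative assumptions, not the MIDA-VIC authors' design.

# Schematic sketch of variable immersion/control levels with mixed initiative.
from enum import Enum, auto
from dataclasses import dataclass

class Immersion(Enum):
    GUI_2D = auto()
    AR = auto()
    VR = auto()

class Control(Enum):
    DIRECT_TELEOPERATION = auto()   # e.g. joystick motor commands
    WAYPOINT = auto()
    NATURAL_LANGUAGE_TASK = auto()  # high-level delegation to onboard AI

@dataclass
class Session:
    immersion: Immersion = Immersion.GUI_2D
    control: Control = Control.WAYPOINT

    def robot_requests_assistance(self, reason: str) -> tuple[Immersion, Control]:
        # Mixed initiative: the robot proposes the configuration it currently
        # considers most useful; the operator may accept or override it.
        proposal = (Immersion.VR, Control.DIRECT_TELEOPERATION)
        print(f"Robot requests help ({reason}); proposes {proposal}")
        return proposal

session = Session()
session.immersion, session.control = session.robot_requests_assistance("wheel slip on loose regolith")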

Cite as

Hans-Christian Jetter, Leon Raule, Jens Gerken, and Sören Pirk. Mixed-Initiative Dynamic Autonomy Through Variable Levels of Immersion and Control (MIDA-VIC): A New Paradigm for Collaborative Robotic Teleoperation in Space Exploration. In Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 22:1-22:10, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


Copy BibTex To Clipboard

@InProceedings{jetter_et_al:OASIcs.SpaceCHI.2025.22,
  author =	{Jetter, Hans-Christian and Raule, Leon and Gerken, Jens and Pirk, S\"{o}ren},
  title =	{{Mixed-Initiative Dynamic Autonomy Through Variable Levels of Immersion and Control (MIDA-VIC): A New Paradigm for Collaborative Robotic Teleoperation in Space Exploration}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{22:1--22:10},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.22},
  URN =		{urn:nbn:de:0030-drops-240122},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.22},
  annote =	{Keywords: Collaboration, Teleoperation, Robot, Space Exploration}
}
Document
(Can't) Fly Me to the Moon or Mars? Context of Use Analysis Approaches for Space Exploration

Authors: Tilo Mentler


Abstract
Contexts of use are a central concept of research and development in human-computer interaction (HCI). An in-depth understanding of them is key to usable and acceptable computer-aided solutions and a particular challenge in connection with space exploration. It is necessary to examine which of the established approaches can be implemented here and where methodological adjustments are necessary. This article provides a systematic consideration of three perspectives to understand space contexts of use: theory and literature, imparted experiential knowledge, and personal experience. Potentials and risks are evaluated. Findings from HCI research in safety-critical contexts and under COVID-19 conditions that can be transferred to space HCI are also taken up.

Cite as

Tilo Mentler. (Can't) Fly Me to the Moon or Mars? Context of Use Analysis Approaches for Space Exploration. In Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 23:1-23:7, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


Copy BibTex To Clipboard

@InProceedings{mentler:OASIcs.SpaceCHI.2025.23,
  author =	{Mentler, Tilo},
  title =	{{(Can't) Fly Me to the Moon or Mars? Context of Use Analysis Approaches for Space Exploration}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{23:1--23:7},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.23},
  URN =		{urn:nbn:de:0030-drops-240135},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.23},
  annote =	{Keywords: Context of Use, Experiential Knowledge, Remote Contextual Inquiry}
}
Document
Understanding Time in Space: Improving Timeline Understandability for Uncrewed Space Systems

Authors: Elizabeth Sloan and Kristin Yvonne Rozier


Abstract
Timelines are critical in space exploration. Timelines facilitate planning, resource management, and automation of uncrewed missions. As NASA and other space agencies increasingly rely on timelines for autonomous spacecraft operations, ensuring their understandability and verifiability is essential for mission success. However, interdisciplinary design teams face challenges in interpreting timelines due to variations in cultural and educational backgrounds, leading to communication barriers and potential system mismatches. This work-in-progress research explores time-oriented data visualizations to improve timeline comprehension in space systems. We contribute (1) a survey of visualization techniques, identifying patterns and gaps in historic time-oriented data visualizations and industry tools, (2) a focus group pilot study analyzing user interpretations of timeline visualizations, and (3) a novel method for visualizing aggregate runs of a timeline on a complex system, including identification of key features for usability of aggregate-data visuals. Our findings inform future visualization strategies for debugging and verifying timelines in uncrewed systems. While focused on space, this research has broader implications for aerospace, robotics, and emergency response systems.
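
As one plausible data shape for the aggregate-run idea mentioned in this abstract, the sketch below summarises several simulated runs of a timeline into earliest start, latest end, and mean start per activity. The run data and the aggregation choices are illustrative assumptions, not the authors' method.

# Hedged sketch of aggregating many runs of a timeline.
from statistics import mean

# Each run maps activity name -> (start, end) in mission elapsed minutes.
runs = [
    {"warmup": (0, 5), "downlink": (5, 12)},
    {"warmup": (0, 6), "downlink": (7, 15)},
    {"warmup": (1, 5), "downlink": (6, 13)},
]

def aggregate(runs):
    summary = {}
    for activity in runs[0]:
        starts = [r[activity][0] for r in runs]
        ends = [r[activity][1] for r in runs]
        summary[activity] = {
            "earliest_start": min(starts),
            "latest_end": max(ends),
            "mean_start": mean(starts),
        }
    return summary

for activity, stats in aggregate(runs).items():
    print(activity, stats)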

Cite as

Elizabeth Sloan and Kristin Yvonne Rozier. Understanding Time in Space: Improving Timeline Understandability for Uncrewed Space Systems. In Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 24:1-24:12, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


Copy BibTex To Clipboard

@InProceedings{sloan_et_al:OASIcs.SpaceCHI.2025.24,
  author =	{Sloan, Elizabeth and Rozier, Kristin Yvonne},
  title =	{{Understanding Time in Space: Improving Timeline Understandability for Uncrewed Space Systems}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{24:1--24:12},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.24},
  URN =		{urn:nbn:de:0030-drops-240143},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.24},
  annote =	{Keywords: Human-Centered Design, Time-Oriented Data Visualization, Uncrewed Spacecraft Operations, Formal Methods}
}
Document
NEREUS: An Assistive Decision Support System for Real-Time, Adaptive Route Guidance in Extravehicular Navigation Activities on the Lunar Surface

Authors: Jasmine Q. Wu, Andrew J. Hwang, and Matthew J. Bietz


Abstract
Extravehicular Activity (EVA) is one of the most complex operational endeavors during human lunar exploration. A key aspect of successful operations involves adapting procedures to address unexpected hazards on the lunar surface. Current route mapping systems rely heavily on static navigation planning around craters, high elevations, and extreme weather conditions to accomplish pre-defined mission objectives. However, the high-resolution data necessary for reliable route mapping is often unavailable. To address this challenge, we have designed NEREUS, a Decision Support System (DSS) that helps EVA operators on the ground respond to anomalies faster by simulating multiple alternative routes in parallel and visualizing trade-offs in consumable resources, speed, and safety as well as impact on overall mission timeline. The system offloads computationally intensive tasks like calculating the impact of evolving hazard data, allowing operators to focus on higher-level decision-making.
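
To make the trade-off evaluation described here concrete, the following minimal Python sketch scores candidate routes on duration, consumables, and hazard exposure and evaluates them in parallel. The route data, cost weights, and function names are invented for illustration; they are not NEREUS code.

# Illustrative scoring of alternative EVA routes on time, oxygen, and hazard.
from concurrent.futures import ThreadPoolExecutor
from dataclasses import dataclass

@dataclass
class Route:
    name: str
    duration_min: float
    oxygen_used_pct: float
    hazard_score: float  # 0 (benign) .. 1 (severe)

def cost(route: Route, w_time=1.0, w_o2=2.0, w_hazard=50.0) -> float:
    return w_time * route.duration_min + w_o2 * route.oxygen_used_pct + w_hazard * route.hazard_score

candidates = [
    Route("direct", 42, 18, 0.6),
    Route("crater_bypass", 55, 22, 0.2),
    Route("ridge_line", 49, 20, 0.35),
]

with ThreadPoolExecutor() as pool:          # stand-in for simulating routes in parallel
    scores = list(pool.map(cost, candidates))

best = min(zip(candidates, scores), key=lambda cs: cs[1])[0]
print("Recommended route:", best.name)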

Cite as

Jasmine Q. Wu, Andrew J. Hwang, and Matthew J. Bietz. NEREUS: An Assistive Decision Support System for Real-Time, Adaptive Route Guidance in Extravehicular Navigation Activities on the Lunar Surface. In Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 25:1-25:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


Copy BibTex To Clipboard

@InProceedings{wu_et_al:OASIcs.SpaceCHI.2025.25,
  author =	{Wu, Jasmine Q. and Hwang, Andrew J. and Bietz, Matthew J.},
  title =	{{NEREUS: An Assistive Decision Support System for Real-Time, Adaptive Route Guidance in Extravehicular Navigation Activities on the Lunar Surface}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{25:1--25:14},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.25},
  URN =		{urn:nbn:de:0030-drops-240158},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.25},
  annote =	{Keywords: Human Computer Interaction (HCI), Adaptive Navigation, Decision Support, Cognitive Load Analysis, Decision Support System, Extravehicular Activity}
}
Document
Movement in Low Gravity (MoLo) – LUNA: Biomechanical Modelling to Mitigate Lunar Surface Operation Risks

Authors: David Andrew Green


Abstract
The Artemis programme seeks to develop and test concepts, hardware and approaches to support long-term habitation of the Lunar surface and future missions to Mars. Preparation for the Artemis missions requires determining which tasks are to be performed, the functional requirements of those tasks, and, as mission duration extends, whether physiological deconditioning becomes functionally significant, compromising the crew member's ability to perform critical tasks on the surface and/or upon return to Earth. MoLo-LUNA, which leverages the MoLo programme and several related activities, could become a key supporting activity for LUNA, including validation of the Puppeteer offloading system itself through the creation of a complementary MoLo-LUNA-LAB. Furthermore, the MoLo-LUNA programme could become a key facilitator of simulator suit instrumentation and definition, broader astronaut training activities and mission architecture development, including Artemis mission simulations. By employing a Puppeteer system external to the LUNA chamber hall, it would optimise the utilisation and cost-effectiveness of LUNA and as such represents a critical service to future LUNA stakeholders. MoLo-LUNA would also generate a unique data set that can be leveraged to predict deconditioning on the Lunar surface, thereby optimising functionality and minimising mission risk, including informing the need for, and prescription of, exercise countermeasures on the Lunar surface and in transit. Thus, MoLo-LUNA offers a unique opportunity to position LUNA and ESA as key ongoing providers of evidence to define, optimise and support crewed Artemis surface missions.

Cite as

David Andrew Green. Movement in Low Gravity (MoLo) – LUNA: Biomechanical Modelling to Mitigate Lunar Surface Operation Risks. In Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 26:1-26:11, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


Copy BibTex To Clipboard

@InProceedings{green:OASIcs.SpaceCHI.2025.26,
  author =	{Green, David Andrew},
  title =	{{Movement in Low Gravity (MoLo) – LUNA: Biomechanical Modelling to Mitigate Lunar Surface Operation Risks}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{26:1--26:11},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.26},
  URN =		{urn:nbn:de:0030-drops-240166},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.26},
  annote =	{Keywords: Locomotion, hypogravity, modelling, Lunar}
}
Document
Assessing the Use of Mixed Reality as a Valid Tool for Human-Robot Interaction Studies in the Context of Space Exploration

Authors: Enrico Guerra, Sebastian Thomas Büttner, Alper Beşer, and Michael Prilla


Abstract
Mixed Reality (MR) is a technology with strong potential for advancing research in Human-Robot Interaction (HRI) for space exploration. Apart from the efficiency and high flexibility MR can offer, we argue that its benefits for HRI research in space contexts lie particularly in its ability to aid human-in-the-loop development, offer realistic hybrid simulations, and foster broader participation in HRI research in the space exploration context. However, we believe that this is only plausible if MR-based simulations can yield comparable results to fully physical approaches in human-centred studies. In this position paper, we highlight several arguments in favour of MR as a tool for space HRI research, while emphasising the importance of the open question regarding its scientific validity. We believe MR could become a central tool for preparing for future human-robotic space exploration missions and significantly diversify research in this domain.

Cite as

Enrico Guerra, Sebastian Thomas Büttner, Alper Beşer, and Michael Prilla. Assessing the Use of Mixed Reality as a Valid Tool for Human-Robot Interaction Studies in the Context of Space Exploration. In Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 27:1-27:11, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


Copy BibTex To Clipboard

@InProceedings{guerra_et_al:OASIcs.SpaceCHI.2025.27,
  author =	{Guerra, Enrico and B\"{u}ttner, Sebastian Thomas and Be\c{s}er, Alper and Prilla, Michael},
  title =	{{Assessing the Use of Mixed Reality as a Valid Tool for Human-Robot Interaction Studies in the Context of Space Exploration}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{27:1--27:11},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.27},
  URN =		{urn:nbn:de:0030-drops-240175},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.27},
  annote =	{Keywords: Mixed Reality, Augmented Reality, Human-Robot Interaction, Space Exploration, Validity}
}
Document
Underwater VR for Astronaut Training

Authors: Sven Jörissen, David L. Hilbert, Michael Bleier, Dorit Borrmann, Helge A. Lauterbach, and Andreas Nüchter


Abstract
Pools are excellent places for testing many nautical technologies, as well as training divers or astronauts in simulated weightlessness. However, for extensive astronaut training underwater, a large pool is necessary. The Neutral Buoyancy Laboratory (NBL), an astronaut training facility located at the Sonny Carter Training Facility near the Johnson Space Center in Houston, Texas, contains 23 million liters of water. In Europe, Blue Abyss Ltd. is currently building the world's largest and deepest indoor pool in Cornwall, also having space applications in mind. We believe that a VR solution can overcome the need for large pools in astronaut training, as space equipment can be simulated well in virtual reality. To this end, we combined a full-face diving mask with a custom-built VR headset for simulating a space environment. Besides constructing a water-tight VR headset, a precise tracking system to determine position and orientation in space plays an important role. We use an outside-in tracking system consisting of four cameras in watertight housings, mounted on aluminium rails and covering a 2×3.5 meter experimental area, which enables us to track reference markers placed on the underwater VR diving mask. To calibrate this system, a rectangular cuboidal structure with reference markers is placed in the experimental area, which additionally serves as a handrail to perform basic Extra Vehicular Activity (EVA) tasks. The position tracking of the underwater headset and mirroring of physical objects in VR enables the user to move physically in the virtual environment as well as interact with the physical objects, such as the handrail. Due to the underwater environment, refraction at different media needs to be taken into account for both calibration and tracking.
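
The refraction correction mentioned at the end of this abstract can be illustrated with Snell's law: a ray observed from inside an air-filled housing through a flat port is bent at the air/water interface, so the apparent angle in air must be mapped back to the true angle in water. The sketch below uses textbook refractive indices, not the authors' calibration.

# Mapping the apparent ray angle inside the housing to the true angle in water.
from math import asin, sin, radians, degrees

N_AIR, N_WATER = 1.000, 1.333

def true_angle_in_water(apparent_angle_air_deg: float) -> float:
    """Snell's law: n_air * sin(theta_air) = n_water * sin(theta_water)."""
    return degrees(asin(N_AIR / N_WATER * sin(radians(apparent_angle_air_deg))))

for a in (5, 15, 30):
    print(f"{a} deg in air  ->  {true_angle_in_water(a):.2f} deg in water")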

Cite as

Sven Jörissen, David L. Hilbert, Michael Bleier, Dorit Borrmann, Helge A. Lauterbach, and Andreas Nüchter. Underwater VR for Astronaut Training. In Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 28:1-28:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


Copy BibTex To Clipboard

@InProceedings{jorissen_et_al:OASIcs.SpaceCHI.2025.28,
  author =	{J\"{o}rissen, Sven and Hilbert, David L. and Bleier, Michael and Borrmann, Dorit and Lauterbach, Helge A. and N\"{u}chter, Andreas},
  title =	{{Underwater VR for Astronaut Training}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{28:1--28:16},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.28},
  URN =		{urn:nbn:de:0030-drops-240187},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.28},
  annote =	{Keywords: Head Mounted Display, VR Glasses, Underwater, Motion Tracking}
}
Document
Gaze Beyond Limits: Integrating Eye-Tracking and Augmented Reality for Next-Generation Spacesuit Interaction

Authors: Jiayu He, Yifan Li, Oliver R. Runswick, Peter D. Hodkinson, Jarle Steinberg, Felix Gorbatsevich, and Yang Gao


Abstract
Extravehicular activities (EVAs) are increasingly frequent in human spaceflight, particularly in spacecraft maintenance, scientific research, and planetary exploration. Spacesuits are essential for sustaining astronauts in the harsh environment of space, making their design a key factor in the success of EVA missions. The development of spacesuit technology has traditionally been driven by highly engineered solutions focused on life support, mission adaptability and operational efficiency. Modern spacesuits prioritize maintaining optimal internal temperature, humidity and pressure, as well as withstanding extreme temperature fluctuations and providing robust protection against micrometeoroid impacts and space debris. However, their bulkiness and rigidity impose significant physical strain on astronauts, reducing mobility and dexterity, particularly in tasks requiring fine motor control. The restricted field of view further complicates situational awareness, increasing the cognitive load during high-precision operations. While traditional spacesuits support basic EVA tasks, the shift of future space exploration toward long-duration lunar and Martian surface missions demands more adaptive, intelligent, and astronaut-centric designs to overcome current constraints. To explore a next-generation spacesuit, this paper proposes an in-process eye-tracking embedded Augmented Reality (AR) Spacesuit System to enhance astronaut-environment interactions. By leveraging Segment-Anything Models (SAM) and Vision-Language Models (VLMs), we demonstrate a four-step approach that enables top-down gaze detection to minimize erroneous fixation data, gaze-based segmentation of objects of interest, real-time contextual assistance via AR overlays, and hands-free operation within the spacesuit. This approach enhances real-time situational awareness and improves EVA task efficiency. We conclude with an exploration of the AR Helmet System's potential to revolutionize human-space interaction paradigms for future long-duration deep-space missions and discuss the further optimization of eye-tracking interactions using VLMs to predict astronaut intent and highlight relevant objects preemptively.
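
The four-step approach outlined in this abstract (gaze filtering, gaze-based segmentation, VLM description, AR overlay) can be sketched as a simple pipeline. The snippet below uses placeholder stubs for the segmentation and language-model calls; it does not use the real SAM or VLM APIs and is not the authors' code.

# Schematic four-step gaze-to-overlay pipeline with placeholder model calls.
from dataclasses import dataclass

@dataclass
class GazeSample:
    x: float  # normalized image coordinates, 0..1
    y: float
    confidence: float

def filter_fixations(samples: list[GazeSample], min_conf: float = 0.8) -> list[GazeSample]:
    """Step 1: discard low-confidence gaze data to minimise erroneous fixations."""
    return [s for s in samples if s.confidence >= min_conf]

def segment_at(gaze: GazeSample) -> dict:
    """Step 2: placeholder for prompting a segmentation model with the gaze point."""
    return {"mask_id": 42, "prompt": (gaze.x, gaze.y)}

def describe(segment: dict) -> str:
    """Step 3: placeholder for a vision-language model describing the object."""
    return f"Object under mask {segment['mask_id']}: (description from VLM)"

def render_overlay(text: str) -> None:
    """Step 4: hand the annotation to the AR helmet display (stub)."""
    print("AR overlay:", text)

gaze = filter_fixations([GazeSample(0.52, 0.47, 0.93)])[0]
render_overlay(describe(segment_at(gaze)))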

Cite as

Jiayu He, Yifan Li, Oliver R. Runswick, Peter D. Hodkinson, Jarle Steinberg, Felix Gorbatsevich, and Yang Gao. Gaze Beyond Limits: Integrating Eye-Tracking and Augmented Reality for Next-Generation Spacesuit Interaction. In Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 29:1-29:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


Copy BibTex To Clipboard

@InProceedings{he_et_al:OASIcs.SpaceCHI.2025.29,
  author =	{He, Jiayu and Li, Yifan and Runswick, Oliver R. and Hodkinson, Peter D. and Steinberg, Jarle and Gorbatsevich, Felix and Gao, Yang},
  title =	{{Gaze Beyond Limits: Integrating Eye-Tracking and Augmented Reality for Next-Generation Spacesuit Interaction}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{29:1--29:15},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.29},
  URN =		{urn:nbn:de:0030-drops-240197},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.29},
  annote =	{Keywords: Augmented Reality (AR), Eye-Tracking, Cognitive Load/Workload, Segment Anything Model (SAM), Visual Language Models (VLMs)}
}
Document
A Research Framework to Develop a Real-Time Synchrony Index to Monitor Team Cohesion and Performance in Long-Duration Space Exploration

Authors: Federico Nemmi, Emma Chabani, Laure Boyer, Charlie Madier, and Daniel Lewkowicz


Abstract
As humanity prepares for long-distance space exploration, optimizing group performance, the ability of a group to achieve its goals efficiently, is critical. Astronaut crews will endure isolation, confinement, and operational stress, making group synchrony - the alignment of behaviors, emotions, and physiological states - a key factor in mission success. Synchrony influences team cohesion, performance, and resilience, necessitating effective crew management strategies. This paper proposes a framework for a real-time, unobtrusive index of group synchrony to support astronauts and mission control. Research indicates that team cohesion fluctuates in isolated environments, with reduced communication and interpersonal conflicts emerging over time. A system tracking synchrony could mitigate these issues, providing proactive support and improving remote management. Additionally, it could serve as a cognitive and physiological feedback tool for astronauts and a decision-making aid for mission control, enhancing well-being and efficiency. Our approach integrates behavioral and physiological synchrony measures to assess team cohesion and performance. We propose a multi-modal synchrony index combining movement coordination, communication patterns, and physiological signals such as heart rate, electrodermal activity, and EEG. This index will be validated across different tasks to ensure applicability across diverse mission scenarios. By developing a robust synchrony index, we address a fundamental challenge in space missions: sustaining team effectiveness under extreme conditions. Beyond space exploration, our findings could benefit high-risk, high-isolation teams in submarine crews, polar expeditions, and remote research groups. Our collaboration with the Centre National d'Etudes Spatiales, the Institut de Médecine et de Physiologie Spatiales, and the Toulouse University Hospital marks the first step, with experimental data collection starting this year. Ultimately, this research fosters more adaptive, responsive, and resilient teams for future space missions.
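
One possible ingredient of the multi-modal synchrony index proposed here is pairwise correlation of crew members' physiological signals, averaged into a single score. The sketch below uses heart rate only, with invented values and equal weighting; signal choice and weights are illustrative assumptions, not the authors' validated index.

# Pairwise physiological synchrony averaged over all crew pairs.
from itertools import combinations
from statistics import correlation, mean  # statistics.correlation requires Python 3.10+

heart_rate = {
    "crew_a": [62, 64, 70, 75, 73, 68],
    "crew_b": [60, 63, 69, 74, 71, 66],
    "crew_c": [80, 78, 77, 79, 81, 80],
}

def physiological_synchrony(signals: dict[str, list[float]]) -> float:
    pair_scores = [correlation(signals[a], signals[b]) for a, b in combinations(signals, 2)]
    return mean(pair_scores)

print(f"Group heart-rate synchrony: {physiological_synchrony(heart_rate):.2f}")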

Cite as

Federico Nemmi, Emma Chabani, Laure Boyer, Charlie Madier, and Daniel Lewkowicz. A Research Framework to Develop a Real-Time Synchrony Index to Monitor Team Cohesion and Performance in Long-Duration Space Exploration. In Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 30:1-30:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


Copy BibTex To Clipboard

@InProceedings{nemmi_et_al:OASIcs.SpaceCHI.2025.30,
  author =	{Nemmi, Federico and Chabani, Emma and Boyer, Laure and Madier, Charlie and Lewkowicz, Daniel},
  title =	{{A Research Framework to Develop a Real-Time Synchrony Index to Monitor Team Cohesion and Performance in Long-Duration Space Exploration}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{30:1--30:16},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.30},
  URN =		{urn:nbn:de:0030-drops-240200},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.30},
  annote =	{Keywords: Performance, Synchrony, Crew monitoring, Cohesion}

}
Document
Monitoring the Structural Health of Space Habitats Through Immersive Data Art Visualization

Authors: Ze Gao, Yuan Zhuang, Kunqi Wang, and Mengyao Guo


Abstract
As humanity advances toward long-term space habitation, traditional structural health monitoring (SHM) systems, reliant on abstract data representations, struggle to support rapid decision-making in extreme environments. This study addresses this critical gap by introducing an engineering-art-human factors framework that transforms SHM through immersive data-art visualization. By integrating sensor networks and machine learning, structural data (stress, vibration, deformation) is converted into intuitive visual languages: dynamic color gradients and biomimetic morphologies leverage perceptual laws (e.g., Weber-Fechner) to amplify critical signals. Multimodal interfaces (AR, haptic feedback) and natural elements mitigate cognitive load and psychological stress in confined habitats. Our contribution lies in redefining SHM as a synergy of precision and intuition, enabling "at-a-glance" assessments while balancing functionality and human-centric design. The urgency of this research stems from the inadequacy of conventional systems in extreme space conditions and the growing demand for astronaut safety and operational efficiency. This framework not only pioneers a sustainable monitoring paradigm for space habitats but also extends to terrestrial high-risk infrastructure, demonstrating the necessity of interdisciplinary innovation in extreme environments.
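
The Weber-Fechner idea invoked in this abstract (perceived intensity grows roughly with the logarithm of the stimulus) suggests driving a colour channel logarithmically rather than linearly with structural stress. The toy sketch below does exactly that; the nominal and limit thresholds are invented for illustration.

# Logarithmic (Weber-Fechner style) mapping from stress to colour intensity.
from math import log

def stress_to_intensity(stress: float, nominal: float = 10.0, limit: float = 100.0) -> float:
    """Map a stress reading (arbitrary units) to a 0..1 colour intensity."""
    stress = max(stress, nominal)
    return min(log(stress / nominal) / log(limit / nominal), 1.0)

for s in (10, 20, 50, 100):
    print(f"stress {s:5.1f}  ->  intensity {stress_to_intensity(s):.2f}")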

Cite as

Ze Gao, Yuan Zhuang, Kunqi Wang, and Mengyao Guo. Monitoring the Structural Health of Space Habitats Through Immersive Data Art Visualization. In Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 31:1-31:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


Copy BibTex To Clipboard

@InProceedings{gao_et_al:OASIcs.SpaceCHI.2025.31,
  author =	{Gao, Ze and Zhuang, Yuan and Wang, Kunqi and Guo, Mengyao},
  title =	{{Monitoring the Structural Health of Space Habitats Through Immersive Data Art Visualization}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{31:1--31:18},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.31},
  URN =		{urn:nbn:de:0030-drops-240217},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.31},
  annote =	{Keywords: Structural health monitoring, space habitats, immersive visualization, human-centered design, interdisciplinary innovation}
}
Document
Virtual Reality Prototyping Environment for Concurrent Design, Training and Rover Operations

Authors: Pinar Dogru, Hanjo Schnellbächer, Tarek Can Battikh, and Kristina Remić


Abstract
As part of the CASIMAR (Collaborative Astronaut Supporting Interregional Moon Analog Rover) project, initiated by the BVSR e.V. (Bundesverband Studentischer Raumfahrt), the TUDSaT (TU Darmstadt Space Technology e.V.) team is developing a Virtual Reality (VR) prototype environment to support the interdisciplinary design process of lunar exploration technologies. Given the complexity of collaboration among eight organizations, this tool aims to streamline design integration and enhance mission planning. The primary objective is to create a comprehensive 3D model of the rover, complete with predefined procedures and activities, to simulate astronaut-robot interaction. By leveraging VR technology, astronauts can familiarize themselves with the rover and its EVA (Extravehicular Activity) tools before actual deployment, improving operational safety and efficiency. Beyond training applications, this virtual environment serves as a critical platform for designing, testing, and benchmarking rover functionalities and EVA procedures. Ultimately, our work contributes to optimizing human-robotic interaction, ensuring that lunar exploration missions are both effective and well-prepared before reaching the Moon.

Cite as

Pinar Dogru, Hanjo Schnellbächer, Tarek Can Battikh, and Kristina Remić. Virtual Reality Prototyping Environment for Concurrent Design, Training and Rover Operations. In Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 32:1-32:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


Copy BibTex To Clipboard

@InProceedings{dogru_et_al:OASIcs.SpaceCHI.2025.32,
  author =	{Dogru, Pinar and Schnellb\"{a}cher, Hanjo and Battikh, Tarek Can and Remi\'{c}, Kristina},
  title =	{{Virtual Reality Prototyping Environment for Concurrent Design, Training and Rover Operations}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{32:1--32:13},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.32},
  URN =		{urn:nbn:de:0030-drops-240226},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.32},
  annote =	{Keywords: virtual reality (VR), digital twin, human-robot-interaction (HRI), LUNA analog facility, rover, extravehicular activities (EVA), gamification, simulation, user-centered design (UCD), concurrent engineering (CE), space system engineering}
}
Document
Digital Health for Space: Towards Prevention, Training, Empowerment, and Autonomy

Authors: Mario A. Cypko, Ulrich Straube, Russell J. Andrews, and Oliver Amft


Abstract
Future long-duration and deep-space missions will rely on digital health technologies to ensure the health and safety of the crew, as well as to enable the required mission autonomy. This position paper redefines the current paradigms of digital health by emphasizing prevention, self-management, and individual empowerment for health as central challenges for both space and terrestrial medicine. We focus on future mission scenarios and highlight the potential of co-evolving digital health and related technologies, particularly sensing, artificial intelligence (AI), and human-computer interaction (HCI), across the continuum of space medicine: from astronaut selection and training to prevention, diagnostics, therapy, rehabilitation, and long-term care. Future digital health technologies can respond to pressing needs arising from limited medical infrastructure, rising care costs, and increasing demands on healthcare systems in space and on Earth. To structure research and development needs, we introduce a framework with four autonomy levels based on mission distance and communication latency (Earth orbit, Lunar Gateway and Moon vicinity, Mars, and deep space) that illustrate how mission context constrains medical support and dictates system requirements. Using the Lunar Orbital Platform-Gateway as a near-future reference, we discuss how growing communication delays demand greater onboard autonomy and new telemedical strategies. Within the proposed framework, we integrate solutions built around AI-supported decision making, multimodal monitoring, and adaptive HCI, which should be co-designed through human-centered methods to form a cohesive health management ecosystem. The framework opens up synergies for proactive and trustworthy health support under isolation and limited ground contact. The paper consolidates current technological readiness and strategic challenges, offering guidance for space health research and policy, with clear translational benefits for terrestrial care delivery.
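
A back-of-the-envelope calculation makes the latency-based framing in this abstract concrete: one-way light time grows with distance, and the required onboard medical autonomy grows with it. In the sketch below the distances are approximate and the level names and thresholds are our illustrative stand-ins, not the paper's framework.

# One-way communication delay by scenario and an illustrative autonomy mapping.
SPEED_OF_LIGHT_KM_S = 299_792.458

SCENARIOS = {                       # approximate one-way distances in km
    "Low Earth orbit": 400,
    "Moon / Gateway": 384_400,
    "Mars (near opposition)": 78_000_000,
}

def one_way_delay_s(distance_km: float) -> float:
    return distance_km / SPEED_OF_LIGHT_KM_S

def autonomy_level(delay_s: float) -> str:
    if delay_s < 1:
        return "ground-led telemedicine"
    if delay_s < 10:
        return "ground-supported, onboard triage"
    return "crew-autonomous with onboard decision support"

for name, d in SCENARIOS.items():
    t = one_way_delay_s(d)
    print(f"{name}: {t:,.1f} s one-way  ->  {autonomy_level(t)}")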

Cite as

Mario A. Cypko, Ulrich Straube, Russell J. Andrews, and Oliver Amft. Digital Health for Space: Towards Prevention, Training, Empowerment, and Autonomy. In Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 33:1-33:12, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


Copy BibTex To Clipboard

@InProceedings{cypko_et_al:OASIcs.SpaceCHI.2025.33,
  author =	{Cypko, Mario A. and Straube, Ulrich and Andrews, Russell J. and Amft, Oliver},
  title =	{{Digital Health for Space: Towards Prevention, Training, Empowerment, and Autonomy}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{33:1--33:12},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.33},
  URN =		{urn:nbn:de:0030-drops-240236},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.33},
  annote =	{Keywords: Digital Health in Space, AI-based Decision Support, Wearable Health Monitoring, Human-Computer Interaction (HCI), Autonomous Medical Systems}
}
Document
XR/UX for Virtual and Mixed Mock-Ups Utilization in Space Habitat Design Development: Use Cases and Lessons Learnt

Authors: Marinella Ferrino and Jan Persson


Abstract
In the frame of the Artemis/Gateway programs, TASI developed in 2019 a 3D tool to support the verification of the Habitat Module design from the early phases of the design process. The aim is to improve the use of virtual mock-ups to encourage flexible decision-making and knowledge sharing, and to reinforce fruitful interaction between the interdisciplinary teams involved in the projects. Since the virtual environment is currently not considered applicable for human-in-the-loop (HITL) verification of developmental design requirements, the lessons learnt from the feedback of the final users (astronauts Samantha Cristoforetti, Luca Parmitano and Alex Gerst) during IHAB design review sessions in an immersive environment (2019-2021) have been taken as guidelines for proposing a mixed reality environment able to overcome the usability issues of the 3D tool. This paper briefly discusses the application of the VR/UX process in the frame of the IHAB crew systems design review, where HSI requirements (e.g. anthropometrics, range of motion, orientation, clearance) and task operations (e.g. a galley area for dining together with a crew of four at the 99th percentile) were evaluated. The design guidelines that emerged from the HITL developmental tests in a low-fidelity physical mock-up, performed in Turin in May 2024, are compared with the crew feedback collected in 2021 during VR sessions in an immersive environment using avatars, reviewing the same crew systems. This comparison provides additional support for developing virtual environments and maximising their use in future Space Habitat Design work. The opportunity to reduce cost and time by providing suitable verification tools, such as a mixed mock-up composed of simplified physical parts combined with virtual reality scenarios, is being pursued in the frame of the ERM Esprit module, which will feature six windows allowing the crew to observe the Moon surface and take photos from the Gateway station. Testbed scenarios, e.g. window scratch-pane removal task procedures, are under evaluation to test future tools combining physical infrastructure with digital set extensions supported by the innovative XR technologies under development. The main objective is to contribute to innovating processes and tools in line with the ongoing digital transformation, in support of future habitat design development, including a lunar surface habitat in partial gravity.

Cite as

Marinella Ferrino and Jan Persson. XR/UX for Virtual and Mixed Mock-Ups Utilization in Space Habitat Design Development: Use Cases and Lessons Learnt. In Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 34:1-34:10, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


Copy BibTex To Clipboard

@InProceedings{ferrino_et_al:OASIcs.SpaceCHI.2025.34,
  author =	{Ferrino, Marinella and Persson, Jan},
  title =	{{XR/UX for Virtual and Mixed Mock-Ups Utilization in Space Habitat Design Development: Use Cases and Lessons Learnt}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{34:1--34:10},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.34},
  URN =		{urn:nbn:de:0030-drops-240242},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.34},
  annote =	{Keywords: Crew Systems, User eXperience, Virtual Reality, Human System Integration, Human Centered Design, Mixed Reality, Human in the Loop}
}
