Search Results

Documents authored by Nüchter, Andreas


Artifact
Audiovisual
Underwater VR for Astronaut Training

Authors: Sven Jörissen, David L. Hilbert, Michael Bleier, Dorit Borrmann, Helge A. Lauterbach, and Andreas Nüchter


Cite as

Sven Jörissen, David L. Hilbert, Michael Bleier, Dorit Borrmann, Helge A. Lauterbach, and Andreas Nüchter. Underwater VR for Astronaut Training (Audiovisual, video demonstrating the approach). Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@misc{dagstuhl-artifact-24343,
   title = {{Underwater VR for Astronaut Training}}, 
   author = {J\"{o}rissen, Sven and Hilbert, David L. and Bleier, Michael and Borrmann, Dorit and Lauterbach, Helge A. and N\"{u}chter, Andreas},
   note = {Audiovisual (visited on 2025-09-21)},
   url = {https://youtu.be/rKG1XqJKrDw},
   doi = {10.4230/artifacts.24343},
}
Document
Underwater VR for Astronaut Training

Authors: Sven Jörissen, David L. Hilbert, Michael Bleier, Dorit Borrmann, Helge A. Lauterbach, and Andreas Nüchter

Published in: OASIcs, Volume 130, Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)


Abstract
Pools are excellent places for testing many nautical technologies, as well as for training divers or astronauts in simulated weightlessness. However, extensive astronaut training underwater requires a large pool. The Neutral Buoyancy Laboratory (NBL), an astronaut training facility located at the Sonny Carter Training Facility near the Johnson Space Center in Houston, Texas, contains 23 million liters of water. In Europe, Blue Abyss Ltd. is currently building the world’s largest and deepest indoor pool in Cornwall, also with space applications in mind. We believe that a VR solution can overcome the need for large pools in astronaut training, since space equipment can be simulated well in virtual reality. To this end, we combined a full-face diving mask with a custom-built VR headset for simulating a space environment. Besides constructing a watertight VR headset, a precise tracking system for determining position and orientation in space plays an important role. We use an outside-in tracking system consisting of four cameras in watertight housings, mounted on aluminium rails and covering a 2 × 3.5 meter experimental area, which enables us to track reference markers placed on the underwater VR diving mask. To calibrate this system, a rectangular cuboid structure with reference markers is placed in the experimental area; it additionally serves as a handrail for performing basic Extra Vehicular Activity (EVA) tasks. The position tracking of the underwater headset and the mirroring of physical objects in VR enable the user to move physically in the virtual environment and to interact with the physical objects, such as the handrail. Due to the underwater environment, refraction at the different media needs to be taken into account for both calibration and tracking.
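
The refraction mentioned in the last sentence can be handled with the vector form of Snell's law at the flat housing port. The following is a minimal Python sketch of that correction only, not the authors' implementation; the thin-port assumption and the refractive indices (about 1.0 for air, 1.33 for water) are assumptions made for illustration.

import numpy as np

# Assumed refractive indices: air inside the camera housing, water outside.
N_AIR, N_WATER = 1.0, 1.33

def refract(d, n, n1=N_AIR, n2=N_WATER):
    """Refract unit ray direction d at a flat interface with unit normal n
    (n points back towards the incident medium), vector form of Snell's law."""
    d, n = d / np.linalg.norm(d), n / np.linalg.norm(n)
    eta = n1 / n2
    cos_i = -np.dot(n, d)
    sin2_t = eta ** 2 * (1.0 - cos_i ** 2)
    if sin2_t > 1.0:                        # total internal reflection, no transmitted ray
        return None
    cos_t = np.sqrt(1.0 - sin2_t)
    return eta * d + (eta * cos_i - cos_t) * n

# Example: a camera ray leaving the port at 30 degrees from the port normal
# bends towards the normal to roughly 22 degrees once it enters the water.
d_in = np.array([np.sin(np.radians(30.0)), 0.0, np.cos(np.radians(30.0))])
port_normal = np.array([0.0, 0.0, -1.0])    # points back towards the camera
d_out = refract(d_in, port_normal)
print(np.degrees(np.arccos(d_out[2])))      # ~22.1

In a full calibration pipeline such refracted rays would replace the straight pinhole rays when triangulating the reference markers; a thick glass port would need a second refraction step at the glass-water boundary.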

Cite as

Sven Jörissen, David L. Hilbert, Michael Bleier, Dorit Borrmann, Helge A. Lauterbach, and Andreas Nüchter. Underwater VR for Astronaut Training. In Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 28:1-28:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{jorissen_et_al:OASIcs.SpaceCHI.2025.28,
  author =	{J\"{o}rissen, Sven and Hilbert, David L. and Bleier, Michael and Borrmann, Dorit and Lauterbach, Helge A. and N\"{u}chter, Andreas},
  title =	{{Underwater VR for Astronaut Training}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{28:1--28:16},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.28},
  URN =		{urn:nbn:de:0030-drops-240187},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.28},
  annote =	{Keywords: Head Mounted Display, VR Glasses, Underwater, Motion Tracking}
}
Document
6D SLAM with Cached kd-tree Search

Authors: Andreas Nüchter, Kai Lingemann, and Joachim Hertzberg

Published in: Dagstuhl Seminar Proceedings, Volume 6421, Robot Navigation (2007)


Abstract
6D SLAM (Simultaneous Localization and Mapping), or 6D Concurrent Localization and Mapping of mobile robots, considers six degrees of freedom for the robot pose, namely the x, y, and z coordinates and the roll, yaw, and pitch angles. In previous work we presented our scan-matching-based 6D SLAM approach, where scan matching is based on the well-known iterative closest point (ICP) algorithm [Besl 1992]. Efficient implementations of this algorithm rely on a fast computation of closest points. The usual approach, i.e., using kd-trees, is extended in this paper. We describe a novel search strategy that leads to significant speed-ups. Our mapping system is real-time capable, i.e., 3D maps are computed using the resources of the Kurt3D robot’s own hardware.
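
The role of the kd-tree inside ICP can be sketched briefly. The following minimal Python example (using NumPy and SciPy, not the paper's implementation) shows one ICP iteration with a kd-tree closest-point query and an SVD-based pose update; it does not reproduce the cached search strategy that is the paper's actual contribution, and the toy data at the end is purely hypothetical.

import numpy as np
from scipy.spatial import cKDTree

def icp_step(model, scan):
    """One ICP iteration: kd-tree closest-point search, then rigid SVD alignment.
    model and scan are (N, 3) arrays of 3D points."""
    tree = cKDTree(model)               # in practice the tree is built once and reused
    _, idx = tree.query(scan)           # closest model point for every scan point
    matched = model[idx]
    # Kabsch/SVD solution of the point-to-point alignment.
    mu_s, mu_m = scan.mean(axis=0), matched.mean(axis=0)
    H = (scan - mu_s).T @ (matched - mu_m)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:            # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_m - R @ mu_s
    return scan @ R.T + t

# Toy usage: pull a slightly rotated and shifted copy of a point cloud back onto it.
rng = np.random.default_rng(0)
model = rng.random((500, 3))
a = np.radians(5.0)
Rz = np.array([[np.cos(a), -np.sin(a), 0.0],
               [np.sin(a),  np.cos(a), 0.0],
               [0.0, 0.0, 1.0]])
scan = model @ Rz.T + 0.01
for _ in range(20):
    scan = icp_step(model, scan)

The cached variant described in the paper avoids repeating the full tree search from the root for every query in every iteration; roughly speaking, it exploits the fact that query points move only slightly between consecutive ICP iterations.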

Cite as

Andreas Nüchter, Kai Lingemann, and Joachim Hertzberg. 6D SLAM with Cached kd-tree Search. In Robot Navigation. Dagstuhl Seminar Proceedings, Volume 6421, pp. 1-12, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2007)


BibTeX

@InProceedings{nuchter_et_al:DagSemProc.06421.3,
  author =	{N\"{u}chter, Andreas and Lingemann, Kai and Hertzberg, Joachim},
  title =	{{6D SLAM with Cached kd-tree Search}},
  booktitle =	{Robot Navigation},
  pages =	{1--12},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2007},
  volume =	{6421},
  editor =	{Fekete, S\'{a}ndor and Fleischer, Rudolf and Klein, Rolf and Lopez-Ortiz, Alejandro},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagSemProc.06421.3},
  URN =		{urn:nbn:de:0030-drops-8705},
  doi =		{10.4230/DagSemProc.06421.3},
  annote =	{Keywords: SLAM, kd tree search}
}
Document
Searching with an Autonomous Robot

Authors: Sándor Fekete, Rolf Klein, and Andreas Nüchter

Published in: Dagstuhl Seminar Proceedings, Volume 5031, Algorithms for Optimization with Incomplete Information (2005)


Abstract
We discuss online strategies for visibility-based searching for an object hidden behind a corner, using Kurt3D, a real autonomous mobile robot. This task is closely related to a number of well-studied problems. Our robot uses a three-dimensional laser scanner in a stop-scan-plan-go fashion to build a virtual three-dimensional environment. Besides planning trajectories and avoiding obstacles, Kurt3D is capable of identifying objects such as a chair. We derive a practically useful and asymptotically optimal strategy that guarantees a competitive ratio of 2, which differs remarkably from the well-studied scenario in which no stops are needed to survey the environment. Our strategy is used by Kurt3D, as documented in a separate video.
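
The guarantee can be read as follows: for every possible position of the hidden object, the distance the robot travels before it sees the object is at most twice the length of a shortest path from the start to a point from which the object is visible. A minimal Python sketch of that definition, with purely hypothetical numbers not taken from the paper:

def competitive_ratio(online_path_length: float, optimal_path_length: float) -> float:
    """Ratio of the online searcher's travelled distance to the shortest possible
    search path; a c-competitive strategy keeps this <= c for every input."""
    return online_path_length / optimal_path_length

# Hypothetical example: the robot travelled 7.2 m before spotting the object,
# while an omniscient planner could have reached a viewpoint in 4.0 m.
print(competitive_ratio(7.2, 4.0))   # 1.8, consistent with a 2-competitive strategy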

Cite as

Sándor Fekete, Rolf Klein, and Andreas Nüchter. Searching with an Autonomous Robot. In Algorithms for Optimization with Incomplete Information. Dagstuhl Seminar Proceedings, Volume 5031, pp. 1-2, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2005)


BibTeX

@InProceedings{fekete_et_al:DagSemProc.05031.27,
  author =	{Fekete, S\'{a}ndor and Klein, Rolf and N\"{u}chter, Andreas},
  title =	{{Searching with an Autonomous Robot}},
  booktitle =	{Algorithms for Optimization with Incomplete Information},
  pages =	{1--2},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2005},
  volume =	{5031},
  editor =	{Albers, Susanne and M\"{o}hring, Rolf H. and Pflug, Georg Ch. and Schultz, R\"{u}diger},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagSemProc.05031.27},
  URN =		{urn:nbn:de:0030-drops-1919},
  doi =		{10.4230/DagSemProc.05031.27},
  annote =	{Keywords: Searching, visibility problems, watchman problems, online searching, competitive strategies, autonomous mobile robots, three-dimensional laser scanning, Kurt3D}
}