Wayfinding Stages: The Role of Familiarity, Gaze Events, and Visual Attention

Authors: Negar Alinaghi, Ioannis Giannopoulos




File

LIPIcs.COSIT.2024.1.pdf
  • Filesize: 5.36 MB
  • 21 pages


Author Details

Negar Alinaghi
  • Geoinformation, TU Wien, Austria
Ioannis Giannopoulos
  • Geoinformation, TU Wien, Austria

Cite As

Negar Alinaghi and Ioannis Giannopoulos. Wayfinding Stages: The Role of Familiarity, Gaze Events, and Visual Attention. In 16th International Conference on Spatial Information Theory (COSIT 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 315, pp. 1:1-1:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)
https://doi.org/10.4230/LIPIcs.COSIT.2024.1

Abstract

Understanding the cognitive processes involved in wayfinding is crucial both for theoretical advances and for practical applications in the development of navigation systems. This study explores how gaze behavior and visual attention contribute to our understanding of cognitive states during wayfinding. Based on the model proposed by Downs and Stea, which segments wayfinding into four distinct stages: self-localization, route planning, monitoring, and goal recognition, we conducted an outdoor wayfinding experiment with 56 participants. Given the significant role of spatial familiarity in wayfinding behavior, each participant navigated six different routes in both familiar and unfamiliar environments while their eye movements were recorded. We provide a detailed examination of participants' gaze behavior and the actual objects of focus. Our findings reveal distinct patterns of gaze behavior and visual attention that differentiate the wayfinding stages and highlight the impact of spatial familiarity. This examination of visual engagement during wayfinding sheds light on adaptive cognitive processes and demonstrates how familiarity shapes navigation strategies. The results enhance our theoretical understanding of wayfinding and offer practical insights for developing navigation aids capable of predicting different wayfinding stages.
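The abstract refers to gaze events (fixations and saccades) computed from mobile eye-tracking recordings. The paper page does not include code; the following minimal Python sketch shows one common way such events are derived from raw gaze samples, namely dispersion-threshold identification (I-DT) in the spirit of Salvucci and Goldberg (reference 44 below). The sample format, thresholds, and function names here are illustrative assumptions, not the authors' actual pipeline.

    # Minimal sketch of dispersion-threshold fixation detection (I-DT),
    # in the spirit of Salvucci and Goldberg (2000). This is NOT the
    # authors' pipeline; sample format, thresholds, and function names
    # are illustrative assumptions.

    from typing import List, Tuple

    # A gaze sample: (timestamp in ms, x in pixels, y in pixels)
    GazeSample = Tuple[float, float, float]


    def _dispersion(window: List[GazeSample]) -> float:
        """Spread of a window: (max_x - min_x) + (max_y - min_y)."""
        xs = [s[1] for s in window]
        ys = [s[2] for s in window]
        return (max(xs) - min(xs)) + (max(ys) - min(ys))


    def detect_fixations(
        samples: List[GazeSample],
        dispersion_threshold: float = 30.0,   # max spread in pixels (assumed)
        min_duration_ms: float = 100.0,       # min fixation duration (assumed)
    ) -> List[Tuple[float, float, float, float]]:
        """Return fixations as (start_ms, end_ms, centroid_x, centroid_y)."""
        fixations = []
        start = 0
        n = len(samples)

        while start < n:
            # Grow the window until it spans at least the minimum duration.
            end = start
            while end < n and samples[end][0] - samples[start][0] < min_duration_ms:
                end += 1
            if end >= n:
                break

            if _dispersion(samples[start:end + 1]) <= dispersion_threshold:
                # Expand the window while the spread stays below the threshold.
                while end + 1 < n and _dispersion(samples[start:end + 2]) <= dispersion_threshold:
                    end += 1
                xs = [s[1] for s in samples[start:end + 1]]
                ys = [s[2] for s in samples[start:end + 1]]
                fixations.append((
                    samples[start][0],
                    samples[end][0],
                    sum(xs) / len(xs),
                    sum(ys) / len(ys),
                ))
                start = end + 1
            else:
                start += 1

        return fixations

In a setup like the one described in the abstract, the resulting fixations would typically be mapped onto the scene video (for example via semantic segmentation models such as those in references 6, 8, and 41 below) to determine which object classes received visual attention; in mobile settings, head movement additionally needs to be accounted for when computing saccades (reference 2).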

Subject Classification

ACM Subject Classification
  • Applied computing → Psychology
  • General and reference → Empirical studies
  • Computing methodologies → Interest point and salient region detections
  • Computing methodologies → Video segmentation
Keywords
  • Eye-tracking
  • Wayfinding
  • Spatial Familiarity
  • Visual Attention
  • Gaze Behavior

References

  1. N. Alinaghi, M. Kattenbeck, A. Golab, and I. Giannopoulos. Will you take this turn? gaze-based turning activity recognition during navigation. In Proc. of the 11th Intl. Conf. on Geographic Information Science (GIScience 2021)-Part II, 2021. Google Scholar
  2. Negar Alinaghi and Ioannis Giannopoulos. Consider the head movements! saccade computation in mobile eye-tracking. In 2022 Symposium on Eye Tracking Research and Applications, pages 1-7, 2022. Google Scholar
  3. Negar Alinaghi, Samuel Hollendonner, and Ioannis Giannopoulos. Myfix: Automated fixation annotation of eye-tracking videos. Sensors, 24(9), 2024. URL: https://doi.org/10.3390/s24092666.
  4. Negar Alinaghi, Markus Kattenbeck, and Ioannis Giannopoulos. I can tell by your eyes! continuous gaze-based turn-activity prediction reveals spatial familiarity. In 15th International Conference on Spatial Information Theory (COSIT 2022). Schloss Dagstuhl - Leibniz-Zentrum für Informatik, 2022. Google Scholar
  5. Guy Thomas Buswell. How people look at pictures: a study of the psychology and perception in art. Univ. Chicago Press, 1935. Google Scholar
  6. Bowen Cheng, Ishan Misra, Alexander G Schwing, Alexander Kirillov, and Rohit Girdhar. Masked-attention mask transformer for universal image segmentation. In Proceedings of the IEEE/CVF conference on computer vision and pattern recognition, pages 1290-1299, 2022. Google Scholar
  7. Eric Chown, Stephen Kaplan, and David Kortenkamp. Prototypes, location, and associative networks (plan): Towards a unified theory of cognitive mapping. Cognitive Science, 19(1):1-51, 1995. Google Scholar
  8. Marius Cordts, Mohamed Omran, Sebastian Ramos, Timo Rehfeld, Markus Enzweiler, Rodrigo Benenson, Uwe Franke, Stefan Roth, and Bernt Schiele. The cityscapes dataset for semantic urban scene understanding. In Proceedings of the IEEE conference on computer vision and pattern recognition, pages 3213-3223, 2016. Google Scholar
  9. Ruth Conroy Dalton. The secret is to follow your nose: Route path selection and angularity. Environment and Behavior, 35(1):107-131, 2003. Google Scholar
  10. Weihua Dong, Hua Liao, Bing Liu, Zhicheng Zhan, Huiping Liu, Liqiu Meng, and Yu Liu. Comparing pedestrians’ gaze behavior in desktop and in real environments. Cartography and Geographic Information Science, 47(5):432-451, 2020. Google Scholar
  11. Weihua Dong, Tong Qin, Tianyu Yang, Hua Liao, Bing Liu, Liqiu Meng, and Yu Liu. Wayfinding behavior and spatial knowledge acquisition: Are they the same in virtual reality and in real-world environments? Annals of the American Association of Geographers, 112(1):226-246, 2022. Google Scholar
  12. M. R. Downs and D. Stea. The World In The Head. Harper & Row Series in Geography. Harper & Row, 1977. Google Scholar
  13. Karen M Evans, Robert A Jacobs, John A Tarduno, and Jeff B Pelz. Collecting and analyzing eye tracking data in outdoor environments. Journal of Eye Movement Research, 5(2):6, 2012. Google Scholar
  14. Anna Charisse Farr, Tristan Kleinschmidt, Prasad Yarlagadda, and Kerrie Mengersen. Wayfinding: A simple concept, a complex process. Transport Reviews, 32(6):715-743, 2012. Google Scholar
  15. Tom Foulsham and Alan Kingstone. Asymmetries in the direction of saccades during perception of scenes and fractals: Effects of image type and image features. Vision research, 50(8):779-795, 2010. Google Scholar
  16. Andreas Gegenfurtner, Erno Lehtinen, and Roger Säljö. Expertise differences in the comprehension of visualizations: A meta-analysis of eye-tracking research in professional domains. Educational psychology review, 23:523-552, 2011. Google Scholar
  17. I. Giannopoulos, P. Kiefer, and M. Raubal. Gazenav: gaze-based pedestrian navigation. In Proc of MobileHCI 2015, pages 337-346, 2015. Google Scholar
  18. Ioannis Giannopoulos. Supporting Wayfinding Through Mobile Gaze-Based Interaction. PhD thesis, ETH Zurich, 2016. Google Scholar
  19. Joseph H Goldberg, Mark J Stimson, Marion Lewenstein, Neil Scott, and Anna M Wichansky. Eye tracking in web search tasks: design implications. In Proceedings of the 2002 symposium on Eye tracking research & applications, pages 51-58, 2002. Google Scholar
  20. Reginald G Golledge. Human wayfinding and cognitive maps. Colonization of unfamiliar landscapes: the archaeology of adaptation, 25, 2003. Google Scholar
  21. Mary Hegarty, Chuanxiuyue He, Alexander P Boone, Shuying Yu, Emily G Jacobs, and Elizabeth R Chrastil. Understanding differences in wayfinding strategies. Topics in Cognitive Science, 15(1):102-119, 2023. Google Scholar
  22. Mary Hegarty, Anthony E Richardson, Daniel R Montello, Kristin Lovelace, and Ilavanil Subbiah. Development of a self-report measure of environmental spatial ability. Intelligence, 30(5):425-447, 2002. Google Scholar
  23. Qiaosong Hei, Weihua Dong, and Bowen Shi. Detecting dynamic visual attention in augmented reality aided navigation environment based on a multi-feature integration fully convolutional network. Cartography and Geographic Information Science, 50(1):63-78, 2023. Google Scholar
  24. Nadja Herten, Tobias Otto, and Oliver T Wolf. The role of eye fixation in memory enhancement under stress-an eye tracking study. Neurobiology of learning and memory, 140:134-144, 2017. Google Scholar
  25. Kenneth Holmqvist, Marcus Nyström, Richard Andersson, Richard Dewhurst, Halszka Jarodzka, and Joost Van de Weijer. Eye tracking: A comprehensive guide to methods and measures. OUP Oxford, 2011. Google Scholar
  26. David E Irwin. Fixation location and fixation duration as indices of cognitive processing. In The interface of language, vision, and action, pages 105-133. Psychology Press, 2013. Google Scholar
  27. Robert JK Jacob and Keith S Karn. Eye tracking in human-computer interaction and usability research: Ready to deliver the promises. In The mind’s eye, pages 573-605. Elsevier, 2003. Google Scholar
  28. Peter Kiefer, Ioannis Giannopoulos, and Martin Raubal. Where am i? investigating map matching during self-localization with mobile eye tracking in an urban environment. Transactions in GIS, 18(5):660-686, 2014. Google Scholar
  29. Alexander Klippel. Wayfinding choremes. In Spatial Information Theory. Foundations of Geographic Information Science: International Conference, COSIT 2003, Kartause Ittingen, Switzerland, September 24-28, 2003. Proceedings 6, pages 301-315. Springer, 2003. Google Scholar
  30. H. Liao, W. Zhao, C. Zhang, W. Dong, and H. Huang. Detecting individuals' spatial familiarity with urban environments using eye movement data. Computers, Environment and Urban Systems, 93:101758, 2022. Google Scholar
  31. Tsung-Yi Lin, Michael Maire, Serge Belongie, Lubomir Bourdev, Ross Girshick, James Hays, Pietro Perona, Deva Ramanan, C. Lawrence Zitnick, and Piotr Dollár. Microsoft coco: Common objects in context, 2015. URL: https://arxiv.org/abs/1405.0312.
  32. Kevin Lynch. The image of the city. MIT press, 1964. Google Scholar
  33. Henry B Mann and Donald R Whitney. On a test of whether one of two random variables is stochastically larger than the other. The annals of mathematical statistics, pages 50-60, 1947. Google Scholar
  34. David Marr. Vision: A computational investigation into the human representation and processing of visual information. MIT press, 2010. Google Scholar
  35. Daniel R. Montello. Spatial cognition. In Neil J. Smelser and Paul B. Baltes, editors, International Encyclopedia of the Social & Behavioral Sciences, pages 14771-14775. Elsevier, Oxford, 2001. Google Scholar
  36. Daniel R Montello. Navigation. In Priti Shah and Akira Miyake, editors, The Cambridge Handbook of Visuospatial Thinking, pages 257-294. Cambridge University Press, 2005. URL: https://doi.org/10.1017/CBO9780511610448.008.
  37. Minoru Nakayama, Koji Takahashi, and Yasutaka Shimizu. The act of task difficulty and eye-movement frequency for the 'oculo-motor indices'. In Proceedings of the 2002 symposium on Eye tracking research & applications, pages 37-42, 2002. Google Scholar
  38. Romedi Passini. Wayfinding: A conceptual framework. Urban Ecology, 5(1):17-31, 1981. Google Scholar
  39. Beatrice Rammstedt and Oliver P John. Measuring personality in one minute or less: A 10-item short version of the big five inventory in english and german. Journal of research in Personality, 41(1):203-212, 2007. Google Scholar
  40. Keith Rayner and Alexander Pollatsek. Eye movements and scene perception. Canadian Journal of Psychology/Revue canadienne de psychologie, 46(3):342, 1992. Google Scholar
  41. Joseph Redmon, Santosh Divvala, Ross Girshick, and Ali Farhadi. You only look once: Unified, real-time object detection. In Proceedings of the IEEE conference on computer vision and pattern recognition, pages 779-788, 2016. Google Scholar
  42. Ferdinand Reimer, Ulrich Kral, Emre Can Sönmez, Friedrich Hauer, Severin Hohensinner, Hannah Wolfinger, Klara Stuppacher, Andreas Danzinger, Ingeborg Hengl, Lupina Prospero, et al. Data description of “building age map, vienna, around 1920”. Data in Brief, 41:107864, 2022. Google Scholar
  43. Kai-Florian Richter and Stephan Winter. Landmarks: GIScience for Intelligent Services. Springer, Cham, 2014. Google Scholar
  44. Dario D Salvucci and Joseph H Goldberg. Identifying fixations and saccades in eye-tracking protocols. In Proceedings of the 2000 symposium on Eye tracking research & applications, pages 71-78, 2000. Google Scholar
  45. Norma Saiph Savage, Wendy Chun, Norma Elva Chavez, and Tobias Höllerer. Seems familiar: An algorithm for inferring spatial familiarity automatically. Computer Science Department, University of California, Santa Barbara, accessed Feb 5, 2013. Google Scholar
  46. Robert Schleicher, Niels Galley, Susanne Briest, and Lars Galley. Blinks and saccades as indicators of fatigue in sleepiness warnings: looking tired? Ergonomics, 51(7):982-1010, 2008. Google Scholar
  47. Charlotte Schwedes and Dirk Wentura. Through the eyes to memory: Fixation durations as an early indirect index of concealed knowledge. Memory & cognition, 44:1244-1258, 2016. Google Scholar
  48. Benjamin W Tatler, Iain D Gilchrist, and Michael F Land. Visual memory for objects in natural scenes: From fixations to object files. The Quarterly Journal of Experimental Psychology Section A, 58(5):931-960, 2005. Google Scholar
  49. Aldert Vrij, João Oliveira, Annie Hammond, and Howard Ehrlichman. Saccadic eye movement rate as a cue to deceit. Journal of Applied Research in Memory and Cognition, 4(1):15-19, 2015. Google Scholar
  50. J. M Wiener, S. J Büchner, and C. Hölscher. Taxonomy of human wayfinding tasks: A knowledge-based approach. Spatial Cognition & Computation, 9(2):152-165, 2009. Google Scholar
  51. Jan M Wiener, Alexander Schnee, and Hanspeter A Mallot. Use and interaction of navigation strategies in regionalized environments. Journal of Environmental Psychology, 24(4):475-493, 2004. Google Scholar
  52. Liu Xin, Zheng Bin, Duan Xiaoqin, He Wenjing, Li Yuandong, Zhao Jinyu, Zhao Chen, and Wang Lin. Detecting task difficulty of learners in colonoscopy: Evidence from eye-tracking. Journal of Eye Movement Research, 14(2), 2021. Google Scholar
  53. Fan Yang, Zhixiang Fang, and Fangli Guan. What do we actually need during self-localization in an augmented environment? In International symposium on web and wireless geographical information systems, pages 24-32. Springer, 2020. Google Scholar
  54. Z. Zhou, R. Weibel, and H. Huang. Familiarity-dependent computational modelling of indoor landmark selection for route communication: a ranking approach. International Journal of Geographical Information Science, pages 1-33, 2021. Google Scholar