2 Search Results for "Nuhn, Eva"


Document
Vision Paper
Are Psychological Variables Relevant to Evaluating Geoinformatics Applications? The Case of Landmarks (Vision Paper)

Authors: Jakub Krukar and Angela Schwering

Published in: LIPIcs, Volume 240, 15th International Conference on Spatial Information Theory (COSIT 2022)


Abstract
Interdisciplinary integration of spatial cognition and spatial computation promises to create better spatial technology based on findings from cognitive psychology experiments. Using the example of psychological studies and computational modelling of landmarks, this paper argues that the core evaluation criteria of both disciplines are not well aligned with the goal of evaluating landmark-enhanced navigation support systems that assist users in everyday wayfinding. The paper raises two points. First, it reviews evaluation criteria used in the interdisciplinary field of landmark research. It is argued that when considering the role of landmark-enhanced navigation support systems in the everyday life of their users, different evaluation criteria are needed. If strictly psychological or strictly computational criteria continue to be prioritised by the community, we risk undervaluing significant technological contributions. Second, the paper proposes one such potential criterion: testing whether the cognitive task has changed as a result of equipping users with the new technology. This goal might be achieved at the expense of criteria typical of strictly psychological studies (such as spatial memory of landmarks along the travelled route) or strictly computational studies (such as the efficiency and accuracy of a landmark-selection algorithm). Thus, promoting and implementing alternative evaluation criteria comes with methodological risks. To mitigate them, we propose a process based on the pre-registration of "postdiction" studies and hope to stimulate further debate on a consensus-based approach in the community.

Cite as

Jakub Krukar and Angela Schwering. Are Psychological Variables Relevant to Evaluating Geoinformatics Applications? The Case of Landmarks (Vision Paper). In 15th International Conference on Spatial Information Theory (COSIT 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 240, pp. 10:1-10:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{krukar_et_al:LIPIcs.COSIT.2022.10,
  author =	{Krukar, Jakub and Schwering, Angela},
  title =	{{Are Psychological Variables Relevant to Evaluating Geoinformatics Applications? The Case of Landmarks (Vision Paper)}},
  booktitle =	{15th International Conference on Spatial Information Theory (COSIT 2022)},
  pages =	{10:1--10:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-257-0},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{240},
  editor =	{Ishikawa, Toru and Fabrikant, Sara Irina and Winter, Stephan},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.COSIT.2022.10},
  URN =		{urn:nbn:de:0030-drops-168956},
  doi =		{10.4230/LIPIcs.COSIT.2022.10},
  annote =	{Keywords: wayfinding, navigation support systems, cognitive geoengineering, landmarks}
}
Document
Is Salience Robust? A Heterogeneity Analysis of Survey Ratings

Authors: Markus Kattenbeck, Eva Nuhn, and Sabine Timpf

Published in: LIPIcs, Volume 114, 10th International Conference on Geographic Information Science (GIScience 2018)


Abstract
Differing weights for salience subdimensions (e.g. visual or structural salience) have been suggested since the early days of salience models in GIScience. Until now, however, it has remained unclear whether the weights found in studies are robust across environments, objects, and observers. In this study we examine the robustness of a survey-based salience model. Based on ratings of N_o = 720 objects by N_p = 250 different participants, collected in situ in two different European cities (Regensburg and Augsburg), we conduct a heterogeneity analysis that takes into account environment and sense of direction stratified by gender. We find, first, empirical evidence that our model is invariant across environments, i.e. the strength of the relationships between the subdimensions of salience does not differ significantly. The structural model coefficients found can hence be used to calculate overall salience values across different environments. Second, we provide empirical evidence that measurement invariance of our model does not fully hold with respect to both gender and sense of direction. These compositional invariance problems are a strong indicator that personal aspects play an important role.
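
As an illustration (not part of the paper), the following minimal Python sketch shows how a weighted salience model of this kind could combine subdimension ratings into an overall salience score. The subdimension names, rating scale, and weights are hypothetical placeholders standing in for the structural model coefficients estimated in the study.

# Minimal sketch (assumed, not the authors' implementation): combining
# hypothetical subdimension ratings into an overall salience score via a
# weighted sum, as in weighted salience models discussed in GIScience.
from dataclasses import dataclass

@dataclass
class SalienceRatings:
    visual: float      # survey rating rescaled to [0, 1] (hypothetical scale)
    structural: float  # survey rating rescaled to [0, 1] (hypothetical scale)

# Hypothetical weights standing in for estimated structural model coefficients.
WEIGHTS = {"visual": 0.6, "structural": 0.4}

def overall_salience(r: SalienceRatings) -> float:
    """Weighted combination of subdimension ratings."""
    return WEIGHTS["visual"] * r.visual + WEIGHTS["structural"] * r.structural

# Example: a landmark rated high on visual and medium on structural salience.
landmark = SalienceRatings(visual=0.8, structural=0.5)
print(f"Overall salience: {overall_salience(landmark):.2f}")  # 0.68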

Cite as

Markus Kattenbeck, Eva Nuhn, and Sabine Timpf. Is Salience Robust? A Heterogeneity Analysis of Survey Ratings. In 10th International Conference on Geographic Information Science (GIScience 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 114, pp. 7:1-7:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{kattenbeck_et_al:LIPIcs.GISCIENCE.2018.7,
  author =	{Kattenbeck, Markus and Nuhn, Eva and Timpf, Sabine},
  title =	{{Is Salience Robust? A Heterogeneity Analysis of Survey Ratings}},
  booktitle =	{10th International Conference on Geographic Information Science (GIScience 2018)},
  pages =	{7:1--7:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-083-5},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{114},
  editor =	{Winter, Stephan and Griffin, Amy and Sester, Monika},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.GISCIENCE.2018.7},
  URN =		{urn:nbn:de:0030-drops-93353},
  doi =		{10.4230/LIPIcs.GISCIENCE.2018.7},
  annote =	{Keywords: Salience Model, Measurement Invariance, Heterogeneity Analysis, PLS Path Modeling, Structural Equation Models}
}
  • Refine by Author
  • 1 Kattenbeck, Markus
  • 1 Krukar, Jakub
  • 1 Nuhn, Eva
  • 1 Schwering, Angela
  • 1 Timpf, Sabine

  • Refine by Classification
  • 2 Human-centered computing → Empirical studies in ubiquitous and mobile computing
  • 1 Applied computing → Psychology
  • 1 Human-centered computing → Personal digital assistants
  • 1 Mathematics of computing → Multivariate statistics

  • Refine by Keyword
  • 1 Heterogeneity Analysis
  • 1 Measurement Invariance
  • 1 PLS Path Modeling
  • 1 Salience Model
  • 1 Structural Equation Models

  • Refine by Type
  • 2 document

  • Refine by Publication Year
  • 1 2018
  • 1 2022
