Augmented Reality for Enhanced Spatial Cognition

Aaron Gardony (pictured here) uses hand gestures to interact with a 3-D model of a city. Gardony is a research psychologist on the Cognitive Science Team at the Natick Soldier Research, Development and Engineering Center. NSRDEC is investigating how augmented reality, or AR, may help Soldiers improve their mission-planning skills.
My current research investigates the cognitive impacts of augmented reality (AR) on spatial cognition. Individuals learn novel environments from different spatial perspectives, including bird's-eye survey perspectives and ground-level route perspectives. AR permits users to interact with rich 3-D representations of environments, allowing them to tailor their learning experiences to their spatial preferences. You can read more about it here.

AR Environmental Learning Demo

Neuroscientific Investigations into Strategy Use in Cognitive Tasks

A mixture of cognitive processes contributes to parity judgments in the mental rotation task (MRT), including motoric processing, working memory (WM) maintenance, and visuospatial representation (Zacks, 2008). I use time-frequency analysis of ICA-decomposed EEG to infer real-time trade-offs between these cognitive processes and their underlying neural networks. The ability to flexibly shift between mental simulation and more analytic forms of thinking may be a domain-general principle that underlies spatial intelligence across a variety of cognitive domains.
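As a minimal illustration of the time-frequency approach (a sketch, not the actual analysis pipeline), the Python example below estimates how alpha-band power in a single simulated ICA component changes over time; the signal, sampling rate, and band limits are all assumptions chosen for the example.

```python
import numpy as np
from scipy.signal import stft

# Hypothetical single ICA-component time series: 10 s sampled at 250 Hz.
fs = 250
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
# Simulated component: a 10 Hz alpha oscillation that begins at t = 4 s,
# superimposed on background noise.
component = np.sin(2 * np.pi * 10 * t) * (t > 4) + 0.5 * rng.standard_normal(t.size)

# Short-time Fourier transform: power at each frequency over time (1 s windows).
freqs, times, Z = stft(component, fs=fs, nperseg=fs)
power = np.abs(Z) ** 2

# Mean alpha-band (8-12 Hz) power before vs. after the oscillation onset.
alpha = (freqs >= 8) & (freqs <= 12)
alpha_power = power[alpha].mean(axis=0)
early = alpha_power[times < 4].mean()
late = alpha_power[times > 4].mean()
print(late > early)  # alpha power rises once the oscillation appears
```

In a real analysis each ICA component would come from decomposed multichannel EEG, and band power would be compared across task conditions rather than simulated onsets.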

Mental and Physical Rotation

Mental Rotation Demo

In their seminal study on mental rotation, Shepard & Metzler (1971) had participants compare pairs of rotated block figures and make same/different judgments. They demonstrated the angular disparity effect: a strong positive linear relationship between the angular disparity of the two figures and response time. They interpreted this finding phenomenologically, as participants mentally rotating an image of the figures until aligned, thus supporting analog mental imagery.
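The angular disparity effect amounts to a linear fit of response time against rotation angle. The sketch below shows the idea with illustrative numbers (not data from the original study), whose slope is set to roughly match the ~60 deg/s rotation rate Shepard & Metzler reported:

```python
import numpy as np

# Illustrative (made-up) data: response time grows roughly linearly
# with the angular disparity between the two figures.
angles = np.array([0, 40, 80, 120, 160])       # degrees
rts = np.array([1.0, 1.7, 2.3, 3.0, 3.7])      # seconds

# Least-squares linear fit: RT = slope * angle + intercept.
slope, intercept = np.polyfit(angles, rts, 1)

# The inverse of the slope is an implied mental rotation rate (deg/s).
print(f"slope = {slope * 1000:.1f} ms/deg, rate = {1 / slope:.0f} deg/s")
```

The slope of such a fit is the standard summary statistic in mental rotation studies: steeper slopes imply slower mental rotation.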

Central to their interpretation was the assumption that mental rotation is akin to motor rotation in the mind: that when we rotate mental imagery, we employ cognitive processes and strategies similar to those we use when physically rotating objects. Research comparing mental and physical rotation is important for the insights it can offer about mental representation in general. Surprisingly, little research has focused on this comparison. This line of research takes an important step toward addressing that gap in the literature.

In my experiments, participants mentally and physically rotate 3-D models of Shepard & Metzler-style figures using a handheld rotational sensor. Real-time rotational data allow us to examine the precise rotational strategies participants employ during the tasks and permit direct comparison of response time and error rate between physical and mental rotation. Read about some of our findings here.
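One way to quantify rotational strategy from such a sensor stream is to compare the total rotation path traveled against the minimal rotation required. The sketch below does this for a simulated quaternion stream; the sensor format, sampling, and the 90-degree target rotation are assumptions for the example, not the actual apparatus.

```python
import numpy as np

def quat_angle(q1, q2):
    """Smallest rotation angle (radians) between two unit quaternions."""
    dot = abs(np.dot(q1, q2))  # abs() handles the double cover (q and -q)
    return 2 * np.arccos(np.clip(dot, -1.0, 1.0))

# Hypothetical orientation stream from a handheld sensor: 50 unit-quaternion
# samples as a participant rotates a figure 90 degrees about the z-axis.
angles = np.linspace(0, np.pi / 2, 50)
stream = np.stack([[np.cos(a / 2), 0, 0, np.sin(a / 2)] for a in angles])

# Total rotation path length: sum of angles between successive samples.
path = sum(quat_angle(stream[i], stream[i + 1]) for i in range(len(stream) - 1))

# A direct strategy yields a path near the minimal 90 deg; meandering
# or overshooting strategies produce a longer rotational path.
print(np.degrees(path))
```

Comparing this path-length measure across participants and trials is one simple way to distinguish direct from exploratory rotation strategies.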

Physical Rotation Task Demo

Virtual Environment Navigation

Virtual Environment Screenshot
I am broadly interested in navigation in virtual environments. Virtual environments provide tight experimental control and allow for direct manipulation of environmental factors. I have modified a consumer video game (Unreal Tournament 2004) for use in my experiments, enabling output of position and orientation data at fine temporal resolution. I analyze and plot participant navigation using a variety of software tools I have programmed. Some of these tools are available here.
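As a toy example of the kind of analysis such tools perform (the log layout and field order below are assumptions for illustration, not the actual output of the modified game), this sketch computes path length and route efficiency from a short position log:

```python
import numpy as np

# Hypothetical position log: one "time x y z yaw" sample per line.
log = """\
0.0 100.0 200.0 50.0 0.0
0.1 101.0 200.5 50.0 5.0
0.2 102.5 201.0 50.0 10.0
0.3 104.0 202.0 50.0 12.0
"""

samples = np.array([[float(v) for v in line.split()] for line in log.splitlines()])
xyz = samples[:, 1:4]  # position columns

# Path length: summed Euclidean distance between successive positions.
steps = np.linalg.norm(np.diff(xyz, axis=0), axis=1)
path_length = steps.sum()

# Route efficiency: straight-line distance divided by distance traveled
# (1.0 means a perfectly direct route).
straight = np.linalg.norm(xyz[-1] - xyz[0])
efficiency = straight / path_length
print(f"path = {path_length:.2f} units, efficiency = {efficiency:.2f}")
```

Metrics like these, computed over whole trajectories, make it possible to compare navigation performance across conditions such as perspective, emotional state, or navigational-aid use.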

Using virtual environments as a jumping-off point, I explore a variety of questions in spatial cognition.

Some of these questions include:

- How shifting spatial perspectives influences navigation performance and spatial memory
- How emotional state influences cue utilization
- How navigational aids (e.g. GPS) influence navigation performance and spatial memory
- How human wayfinding heuristics can be integrated into navigational aids