Our everyday senses, such as vision, hearing, and self-motion, help us navigate everyday tasks like crossing the street. For example, we combine the sound of cars zooming past with the visual cues of car lights to ensure that it is safe to cross. But what happens when one of those senses is taken away? Are we less accurate in our navigation? Our research study compares vision and self-motion cues and examines what happens when one of those cues is removed, leaving participants with only the other to navigate. It also compares accuracy and reliability within and across participants by having them return for a retest. This research is important because it shows how stable our metric of cue combination is, and it strengthens the foundation of the field of multisensory integration. Once we determine whether this metric is stable, we can ask further questions, such as how people combine, or weight, their senses during an orientation or navigation task.
University / Institution: University of Utah
Format: In Person
SESSION A (9:00-10:30AM)
Area of Research: Social Sciences
Faculty Mentor: Sarah Creem-Regehr