DSI: Sensory Integration


Sensory Integration in Laboratory Tests
In a controlled environment, we present simple visual, auditory, or bimodal (i.e., combined visual and auditory) stimuli from various spatial locations and ask participants to locate the target. We compute the bias (i.e., the offset of responses from the correct location) and the variability (i.e., the consistency of responses when locating a target multiple times). Using the classic Maximum-Likelihood Estimation (MLE) model, we ask whether participants optimally integrate vision and hearing, which predicts that bimodal localization should be more reliable than with either sense alone, or at least as reliable as with the better sense.
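The MLE prediction above can be sketched numerically. In this model, each cue is weighted by its relative reliability (inverse variance), and the combined variance is the product of the single-cue variances divided by their sum, which can never exceed the variance of the better cue. The standard deviations below are illustrative values, not measurements from the study.

```python
import math

def mle_prediction(sigma_v, sigma_a):
    """MLE cue-combination prediction for a visual and an auditory cue.

    Returns the weight on vision, the weight on audition, and the
    predicted standard deviation of the bimodal estimate:
        sigma_va^2 = (sigma_v^2 * sigma_a^2) / (sigma_v^2 + sigma_a^2)
    """
    var_v, var_a = sigma_v ** 2, sigma_a ** 2
    w_v = var_a / (var_v + var_a)   # more reliable cue gets more weight
    w_a = var_v / (var_v + var_a)
    sigma_va = math.sqrt(var_v * var_a / (var_v + var_a))
    return w_v, w_a, sigma_va

# Hypothetical example: vision (2 deg) more reliable than hearing (6 deg).
w_v, w_a, sigma_va = mle_prediction(2.0, 6.0)
# The predicted bimodal sigma is below 2 deg, i.e. better than either
# single sense, which is the signature of optimal integration.
```

Comparing the measured bimodal variability against `sigma_va` is one way to test whether a participant integrates optimally.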


Sensory Integration in Real-Life Spatial Localization
Whether locating an approaching person in a hallway, finding a friend in a cafeteria, or tracking traffic at an intersection, the visual and auditory information we use in real-life tasks is complex. How do people with DSI integrate their senses in these real-life situations? We carefully selected 11 real-life tasks, including 5 tasks in static environments and 6 tasks in dynamic environments, and asked participants to locate objects, people, and traffic using their vision, hearing, and both senses combined. Using a latent-trait signal detection theory analysis, we introduce the concept of “functional reserve” to model sensory integration in these real-life tasks.
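The basic quantity underlying a signal detection theory analysis is sensitivity (d'), computed from hit and false-alarm rates. The sketch below shows only this elementary step, not the full latent-trait model; the rates are hypothetical.

```python
from statistics import NormalDist

def d_prime(hit_rate, false_alarm_rate):
    """Sensitivity: d' = z(hit rate) - z(false-alarm rate)."""
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(hit_rate) - z(false_alarm_rate)

# Hypothetical example for one localization task: the bimodal condition
# yields more hits and fewer false alarms than hearing alone, hence
# higher sensitivity.
d_hearing = d_prime(0.70, 0.30)   # hearing alone
d_bimodal = d_prime(0.90, 0.10)   # vision + hearing
```

Comparing single-sense and bimodal sensitivities in this way is the starting point for asking how much "reserve" the second sense contributes.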


Sensory Integration in Speech Perception
Speech perception is a primary emphasis of hearing rehabilitation because of its importance in social interactions. How could vision facilitate speech perception in real-life tasks? Imagine you are at a crowded party and someone seems to be walking toward you. Is this a person you know? Are they happy or angry? Are they speaking to you? We proposed measuring “critical viewing distances” for identifying faces, recognizing emotions, and detecting visual speech. Vision might facilitate speech perception by allowing speech to be detected from a greater distance than with hearing alone, and visual skill training in vision rehabilitation might amplify this facilitation effect.


Sensory Integration from the Patient Perspective
Patient-centered care is key in sensory rehabilitation, and patients' perspectives on their own abilities determine how confident they feel about completing everyday tasks. Can “sensory integration” be studied from the patient perspective? We developed the Dual Sensory Spatial Localization Questionnaire (DS-SLQ), a first survey instrument that allows assessment and direct comparison of vision and hearing abilities within an individual. The first version of the DS-SLQ includes 35 everyday tasks that require localizing common objects. Participants verbally report their perceived difficulty completing each task with vision alone and with hearing alone, and which sense (vision or hearing) they would primarily rely on to complete the task. Using Rasch analysis, a psychometric method, the DS-SLQ provides a system for comparing visual and auditory spatial abilities within an individual.
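Rasch analysis places person abilities and item difficulties on a common logit scale, which is what makes within-person comparison of the two senses possible. A minimal sketch of the dichotomous Rasch model follows; the ability and difficulty values are illustrative, not DS-SLQ estimates.

```python
import math

def rasch_p_success(theta, b):
    """Dichotomous Rasch model: probability that a person with ability
    theta completes an item of difficulty b (both in logits):
        P = exp(theta - b) / (1 + exp(theta - b))
    """
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# Hypothetical person: visual ability 1.5 logits, auditory ability -0.5
# logits, evaluated on the same item (difficulty 0.0). Because both
# abilities sit on one scale, the probabilities are directly comparable.
theta_vision, theta_hearing, item_b = 1.5, -0.5, 0.0
p_vision = rasch_p_success(theta_vision, item_b)
p_hearing = rasch_p_success(theta_hearing, item_b)
```

When ability equals difficulty, the predicted success probability is exactly 0.5, which anchors the interpretation of the logit scale.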