At the LAB Lab, we suggest that language is supported by more than the sounds we hear and a few static brain regions. Our research begins with the proposal that, to understand the organization of language in the brain, we must include context, both internal (for example, our memories of past experiences of hearing words, reading, signing, and listening to music) and external (like the mouth movements, facial expressions, gestures, and postures that accompany face-to-face communication).
Our work has led to the surprising suggestion that, far from being merely ‘non-verbal’ appendages to language, these contextual cues are actively heard by the brain: the whole brain dynamically (re-)organizes to use whatever contextual information is available during natural language use to predict associated and forthcoming speech. This mechanism makes the brain more efficient, and without it, we would argue, speech perception and language comprehension cannot occur.
The Captain of the LAB Lab.
The Naturalistic Neuroimaging Database v2.0 is out. Still N=86 people watching movies during #fMRI but we have significantly improved derivative preprocessing. See: https://t.co/rEWWd7S5Nz— Language, Consciousness & Psychedelics Lab 🏳️🌈 (@thelablab) April 20, 2021
The NNDb v2.0 is also now available on the amazeballs Neuroscout: https://t.co/KZxMb2vnJX pic.twitter.com/U8cpDosxCo
The NOLB Model
Multisensory sensory substitution