At the LAB Lab, we suggest that language is supported by more than the sounds we hear and a few static brain regions. Our research begins with the proposal that, to understand the organization of language in the brain, we must include context, both internal (for example, our memories of past experiences of hearing words, reading, using sign language, and listening to music) and external (like the mouth movements, facial expressions, gestures, and postures that accompany face-to-face communication).
Our work has led to the surprising suggestion that, far from simply being 'non-verbal' appendages to language, the brain actively 'hears' context: the whole brain dynamically (re-)organizes to use whatever contextual information is available during natural language use to predict associated and forthcoming speech. The result of this mechanism is a more efficient brain and, we argue, without it speech perception and language comprehension could not occur.
The Captain of the LAB Lab.
TWO POSTDOCS AVAILABLE! The @UCLanguageLab and @thelablab have post-doctoral fellowships in human neuroimaging (https://t.co/Upv0AfEBqW) and multimodal communication (https://t.co/OgGgOQD7tz). Please apply or DM me for more information. — The LAB Lab 🏳️🌈 (@thelablab) March 10, 2020
DEADLINE 18 MARCH for a possible #epsrc funded #phdstudentship @thelablab to use #machinelearning approaches to predict who will benefit from treatment with #psychedelic #drugs like #DMT using ecological #neuroimaging networks. DM or email me @ email@example.com — The LAB Lab 🏳️🌈 (@thelablab) February 3, 2020
DEADLINE 7 FEB for a #phdstudentship @thelablab to work on the effects of #psychedelic #drugs like #DMT on short- and long-term #neuroplasticity of real-world, task-driven #fmri #brain networks. Apply @ecologicalbrain DTP @UCL: https://t.co/Pjy4VWcJA8 — The LAB Lab 🏳️🌈 (@thelablab) January 24, 2020