At the LAB Lab, we suggest that language is supported by more than the sounds we hear and a few static brain regions. Our research begins with the proposal that, to understand the organization of language in the brain, we must include context, both internal (for example, our memories of past experiences hearing words, reading, using sign language, and listening to music) and external (like the mouth movements, facial expressions, gestures, and postures that accompany face-to-face communication).
Our work has led to the surprising suggestion that, far from being 'non-verbal' appendages to language, context is something the brain actively hears: the whole brain dynamically (re-)organizes to use whatever contextual information is available during natural language use to predict associated and forthcoming speech. The result of this mechanism is a more efficient brain; without it, we would argue, speech perception and language comprehension cannot occur.
The Captain of the LAB Lab.
The NOLB Model
Multisensory sensory substitution