At the LAB Lab, we suggest that language is supported by more than the sounds we hear and a few static brain regions. Our research begins with the proposal that, to understand how language is organized in the brain, we must include context, both internal – for example, our memories of past experiences of hearing words, reading, using sign language, and listening to music – and external – like the mouth movements, facial expressions, gestures, and postures that accompany face-to-face communication.
Our work has led to the surprising suggestion that, far from being merely ‘non-verbal’ appendages to language, context is something the brain actively hears: the whole brain dynamically (re-)organizes to use whatever contextual information is available during natural language use to predict associated and forthcoming speech. The result of this mechanism is a more efficient brain; without it, we would argue, speech perception and language comprehension cannot occur.
The Captain of the LAB Lab.
This year @UCLPALS will be making an ethical appeal to get our university @UCL to FINALLY divest ALL investment in fossil fuels. Our own (and the world's) empirical research tells us that it is necessary. We educate too many people to do otherwise. Who is with us? @FossilFreeUCL?
— Jeremy I Skipper (@thelablab) September 27, 2019
The NOLB Model
Multisensory sensory substitution