Mapping Timescales of Cortical Language Processing

Monday, September 20, 2021 - 11:30am to 12:30pm

Speaker
Alex Huth

University of Texas at Austin


Natural language contains information that must be integrated over multiple timescales. To understand how the human brain represents this information, one approach is to build encoding models that predict fMRI responses to natural language using representations extracted from neural network language models (LMs). However, these LM-derived representations do not explicitly separate information at different timescales, making it difficult to interpret the encoding models. Here I will discuss how a language model can be engineered to explicitly represent different timescales, and how this model can be used to map representations in the human cortex.
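The encoding-model approach described above can be sketched numerically: regress fMRI responses onto stimulus features derived from a language model, typically with ridge regression fit per voxel. The code below is an illustrative toy with simulated data and assumed array sizes, not the speaker's actual pipeline.

```python
import numpy as np

# Toy sketch of a voxelwise encoding model (assumed setup, simulated data):
# X holds LM-derived features for each fMRI timepoint (TR),
# Y holds the measured response of each voxel at that TR.
rng = np.random.default_rng(0)
n_trs, n_features, n_voxels = 200, 16, 5          # illustrative sizes
X = rng.standard_normal((n_trs, n_features))       # stimulus features
true_w = rng.standard_normal((n_features, n_voxels))
Y = X @ true_w + 0.1 * rng.standard_normal((n_trs, n_voxels))  # simulated voxels

def fit_ridge(X, Y, alpha=1.0):
    """Closed-form ridge regression: W = (X'X + aI)^-1 X'Y.
    Returns one weight vector per voxel (column of W)."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ Y)

W = fit_ridge(X, Y, alpha=1.0)
pred = X @ W
# Evaluate per voxel: correlation between predicted and actual responses.
r = [np.corrcoef(pred[:, v], Y[:, v])[0, 1] for v in range(n_voxels)]
```

In practice such models are fit on held-out data with cross-validated regularization, and the learned weights are interpreted as a map of what stimulus information each voxel represents.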


Alex Huth is an Assistant Professor at The University of Texas at Austin in the departments of neuroscience and computer science. His lab uses natural language stimuli and fMRI to study language processing in the human cortex, in work funded by the Burroughs Wellcome Fund, the Sloan Foundation, the Whitehall Foundation, the NIH, and others. Before joining UT, Alex did his undergraduate and master's work at Caltech under Christof Koch, then his Ph.D. and postdoctoral work in Jack Gallant's laboratory at UC Berkeley, where he developed novel methods for mapping semantic representations of visual and linguistic stimuli in the human cortex.