E.A.R.S. seminar – Sam Norman-Haignere & Ross Williamson


Online offering

This event will be held via Zoom. For the link, please contact us at pennmindcore@sas.upenn.edu.

Sam Norman-Haignere
Department of Biostatistics and Computational Biology
Department of Neuroscience
University of Rochester

Neural integration in the human auditory cortex
To derive meaning from sound, the brain must integrate information across many timescales spanning tens (e.g., phonemes) to hundreds (e.g., words) of milliseconds. Yet, identifying the integration timescales of auditory neural populations has been challenging, in part due to their complex, nonlinear tuning for natural sound structure. In this talk, I will describe a new method to estimate neural integration windows using natural stimuli (the temporal context invariance paradigm). Our method is conceptually simple and general, and thus applicable to virtually any brain region, sensory domain, or temporally precise recording modality. By applying this method to intracranial recordings from human neurosurgical patients, we have found that the human auditory cortex integrates hierarchically across diverse timescales spanning approximately 50 to 400 ms, with substantially longer integration windows in non-primary regions, bilaterally. Moreover, we have found that neural populations with short and long integration windows exhibit distinct functional properties: short-integration electrodes (less than ~200 ms) show prominent spectrotemporal modulation selectivity, while long-integration electrodes (greater than ~200 ms) show prominent category selectivity. These findings reveal how multiscale integration organizes auditory computation in the human brain.


Ross Williamson
Department of Otolaryngology
University of Pittsburgh

Brain-wide neural circuits for sensory-guided behavior

Auditory-guided behavior is ubiquitous in everyday life, occurring whenever auditory information is used to guide the decisions we make and the actions we take. One such behavior is auditory categorization: the ability to transform bottom-up sensory stimuli into discrete perceptual categories and to use those categories to drive a subsequent action. Although this process is well documented at the behavioral and cognitive levels, surprisingly little is known about the explicit neural circuit mechanisms that underlie categorical computation, or about how the result of this computation drives behavioral outcomes. We believe that the transformation of auditory information into an appropriate behavioral response is necessarily a brain-wide endeavor. The deep layers of the auditory cortex give rise to several massive projection systems that exert influence over many downstream brain areas. Here, I will discuss our efforts toward understanding how these distinct systems differentially contribute to auditory-guided behavior.