The Role of Thalamic Population Synchrony in the Emergence of Cortical Feature Selectivity

Sean T. Kelly 1, Jens Kremkow 2, Jianzhong Jin 2, Yushi Wang 2, Qi Wang 3, Jose-Manuel Alonso 2, Garrett B. Stanley 1*

1 Coulter Dept. of Biomedical Engineering, Georgia Institute of Technology and Emory University, Atlanta, Georgia, United States of America
2 Department of Biological Sciences, State University of New York, College of Optometry, New York, New York, United States of America
3 Department of Biomedical Engineering, Columbia University, New York, New York, United States of America

Abstract

In a wide range of studies, the emergence of orientation selectivity in primary visual cortex has been attributed to a complex interaction between feed-forward thalamic input and inhibitory mechanisms at the level of cortex. Although it is well known that layer 4 cortical neurons are highly sensitive to the timing of thalamic inputs, the role of the stimulus-driven timing of thalamic inputs in cortical orientation selectivity is not well understood. Here we show that the synchronization of thalamic firing contributes directly to the orientation-tuned responses of primary visual cortex in a way that optimizes the stimulus information per cortical spike. From the recorded responses of geniculate X-cells in the anesthetized cat, we synthesized thalamic sub-populations that would likely serve as the synaptic input to a common layer 4 cortical neuron, based on anatomical constraints. We used this synchronized input as the driving input to an integrate-and-fire model of cortical responses and demonstrated that the resulting tuning properties closely match those measured in primary visual cortex.
By modulating the overall level of synchronization at the preferred orientation, we show that the efficiency of information transmission in the cortex is maximized for levels of synchronization that match those reported in thalamic recordings in response to naturalistic stimuli, a property that is relatively invariant to the orientation tuning width. These findings provide evidence for a more prominent role of the feed-forward thalamic input in cortical feature selectivity, mediated by thalamic synchronization.

Citation: Kelly ST, Kremkow J, Jin J, Wang Y, Wang Q, et al. (2014) The Role of Thalamic Population Synchrony in the Emergence of Cortical Feature Selectivity. PLoS Comput Biol 10(1): e1003418. doi:10.1371/journal.pcbi.1003418

Editor: Lyle J. Graham, Université Paris Descartes, Centre National de la Recherche Scientifique, France

Received June 17, 2013; Accepted November 17, 2013; Published January 9, 2014

Copyright: © 2014 Kelly et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Funding: Funding by NIH EY005253, NIH NS48285, NSF CRCNS IIS-0904630 (http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=5147) and a DFG Research Fellowship (JK). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Competing Interests: The authors have declared that no competing interests exist.

* E-mail: garrett.stanley@bme.gatech.edu

Introduction

Sensory systems serve the purpose of allowing us to extract perceptually relevant features from the environment. Although there are certainly examples of sensory features whose coding originates in the sensory periphery (e.g.
auditory frequency, visual color, etc.), the more intriguing and less well understood phenomena involve the emergence of feature selectivity in more central brain structures that do not simply inherit their selectivity from the periphery. Perhaps the most thoroughly studied of these phenomena is orientation selectivity in primary visual cortex (V1), where many if not most neurons in the mammalian primary visual cortex exhibit differential firing activity for visual stimuli at different orientations, despite the fact that the neurons projecting from the lateral geniculate nucleus (LGN) that serve as input to V1 exhibit little to no orientation preference of their own [1] (see [2] for a review). This implies that the thalamocortical link is a transformative stage at which stimuli come to be represented as collections of particular features rather than raw samples (i.e., it does far more than simply relay luminance values to the cortex). This transformation can serve as a general model for how sensory systems build increasing feature selectivity as information moves to higher-order brain areas. How do these convergent thalamic projections drive cortical feature selectivity, and what role does population activity play?

The mechanistic origin of orientation tuning in V1 has been vigorously explored in the literature [1–5]. In their seminal work, Hubel and Wiesel outlined a conceptual model in which LGN neurons aligned along a particular axis of orientation project to a common cortical target [1], the core connectivity of which was subsequently confirmed in recordings from connected pairs of neurons in LGN and V1 [6–8].
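The intuition behind this convergence scheme can be illustrated with a toy simulation in the spirit of the paper's approach: spike trains for a small thalamic sub-population are synthesized by jittering a shared set of stimulus event times (tight jitter standing in for an optimally oriented edge, broad jitter for an orthogonal one), and the pooled trains drive a leaky integrate-and-fire cortical neuron. This is a minimal sketch under assumed illustrative parameters (jitter widths, synaptic weight, membrane time constant), not the fitted model used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)
T_MS, DT = 1000.0, 0.1  # simulation length and time step (ms)

def thalamic_population(n_cells, event_times, jitter_ms):
    """Spike trains for a thalamic sub-population: each cell fires once per
    shared stimulus event, with Gaussian jitter setting the synchrony."""
    return [np.sort(event_times + rng.normal(0.0, jitter_ms, len(event_times)))
            for _ in range(n_cells)]

def lif_spike_count(trains, tau_m=10.0, v_th=1.0, w=0.12):
    """Leaky integrate-and-fire neuron with delta-current synapses of
    weight w; returns the number of output (cortical) spikes."""
    steps = int(T_MS / DT)
    drive = np.zeros(steps)
    for train in trains:
        idx = (np.asarray(train) / DT).astype(int)
        np.add.at(drive, idx[(idx >= 0) & (idx < steps)], w)
    v, n_out = 0.0, 0
    for t in range(steps):
        v += -v * DT / tau_m + drive[t]   # leak plus synaptic input
        if v >= v_th:                      # threshold crossing: spike and reset
            v, n_out = 0.0, n_out + 1
    return n_out

events = np.arange(50.0, T_MS, 100.0)                    # 10 shared "edge" events
tight = thalamic_population(20, events, jitter_ms=2.0)   # near-coincident input
loose = thalamic_population(20, events, jitter_ms=50.0)  # dispersed input
print(lif_spike_count(tight), lif_spike_count(loose))
```

Both populations deliver the same total number of input spikes; only the tightly synchronized one reliably sums past threshold within a membrane time constant, so in this caricature the timing of thalamic input, not its rate, carries the orientation signal to the cortical neuron.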
Although the relative roles of this feedforward architecture versus cortico-cortical connectivity in sharpening and refining orientation selectivity in such phenomena as contrast invariance and cross-orientation suppression have been intensely debated [2,9], the thalamic basis for the origin of the basic selectivity is not in dispute, and by its nature implies a role for the timing of thalamic inputs to the cortical target. That is, the decades-old proposal by Hubel and Wiesel conceptually suggests that an edge activating the subset of thalamic neurons projecting to a common cortical target at the same time would naturally drive the cortical neuron more strongly than when the thalamic inputs are activated at different times, establishing the orientation tuning of the cortical neuron. However, the precise role of the timing of thalamic inputs in downstream cortical orientation selectivity is not known. In the context of the natural visual environment, it has been shown that LGN neurons (individually and across pairs) are temporally precise to a time scale of 10–20 ms, a level that is matched to what is necessary to