B0739
Title: NeuroGen: A tool for activity modulation in human visual networks
Authors: Amy Kuceyeski - Weill Cornell Medicine (United States) [presenting]
Abstract: Recording human visual system activity in response to image viewing has classically been a way to understand how the human brain processes incoming information. We present NeuroGen, a machine learning framework that couples an encoding model of human vision with a deep generative network to synthesize images predicted to achieve a target pattern of macro-scale brain activation. We demonstrate that the noise reduction provided by the encoding model, coupled with the generative network's ability to produce high-fidelity images, results in a robust discovery architecture for visual neuroscience. We begin by validating NeuroGen against known image-brain response relationships, i.e., the face-, body-, word-, and place-selective areas of the visual system. We then demonstrate NeuroGen's use in a discovery context by showing that only a few synthetic images can capture novel individual- and region-level variations in activation responses to dog faces compared to human faces. We further demonstrate that NeuroGen can create synthetic images predicted to achieve regional response patterns not achievable by the best-matching natural images. Finally, we present prospective functional MRI results in which we recorded brain activity in individuals viewing NeuroGen's synthetic images. We propose that NeuroGen can enable discoveries in vision neuroscience and may allow the activity of regions within the human visual system (and beyond!) to be modulated in a controlled way.
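
To make the coupling of encoding model and generative network concrete, here is a minimal sketch of how such an image-synthesis loop could look. This is not the authors' implementation: the toy generator and toy encoding model below are hypothetical stand-ins for the pretrained networks, and all names and parameters are illustrative. The core idea shown is optimizing a latent code so that the encoding model's predicted regional activation for the generated image approaches a chosen target pattern.

```python
# Hypothetical NeuroGen-style optimization loop (illustrative only).
import torch
import torch.nn as nn

class ToyGenerator(nn.Module):
    """Stand-in for a pretrained deep generative network (e.g. a GAN generator)."""
    def __init__(self, latent_dim=128, image_size=64):
        super().__init__()
        self.image_size = image_size
        self.net = nn.Sequential(
            nn.Linear(latent_dim, 3 * image_size * image_size),
            nn.Tanh(),
        )

    def forward(self, z):
        return self.net(z).view(-1, 3, self.image_size, self.image_size)

class ToyEncodingModel(nn.Module):
    """Stand-in for an image-to-brain encoding model predicting activation in n_regions."""
    def __init__(self, image_size=64, n_regions=8):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(3 * image_size * image_size, n_regions),
        )

    def forward(self, images):
        return self.net(images)

def synthesize(generator, encoder, target, latent_dim=128, steps=200, lr=0.05):
    """Optimize a latent code so predicted activation matches the target pattern."""
    generator.eval()
    encoder.eval()
    # Both networks stay frozen; only the latent code is optimized.
    for p in generator.parameters():
        p.requires_grad_(False)
    for p in encoder.parameters():
        p.requires_grad_(False)

    z = torch.randn(1, latent_dim, requires_grad=True)
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        pred = encoder(generator(z))                     # predicted regional activation
        loss = torch.nn.functional.mse_loss(pred, target)  # distance to target pattern
        loss.backward()
        opt.step()
    return generator(z).detach(), z.detach()

if __name__ == "__main__":
    torch.manual_seed(0)
    gen, enc = ToyGenerator(), ToyEncodingModel()
    # Example target: drive one region high while keeping the others near zero.
    target = torch.zeros(1, 8)
    target[0, 0] = 1.0
    image, z = synthesize(gen, enc, target)
    print("synthetic image shape:", tuple(image.shape))
```

In practice, the generator and encoding model would be replaced by the pretrained networks the abstract describes; the sketch only illustrates the closed-loop idea of steering image synthesis toward a target activation pattern.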