Localization of Stimuli Based on Neural Activity in Early Visual Areas

Researcher(s)

  • Zoe Cronin, Neuroscience, University of Delaware

Faculty Mentor(s)

  • Timothy Vickery, Psychological and Brain Sciences, University of Delaware

Abstract

Visual illusions experienced by humans are often spatial in nature. For example, an illusion of spatial distortion occurs when two point-like stimuli placed within an object's boundaries appear farther apart than identically spaced stimuli presented outside an object ("object-based warping," Vickery & Chun, 2010). To assess the neural mechanisms that produce such illusions, we would like to estimate point positions using functional magnetic resonance imaging (fMRI) of the visual system in the presence and absence of objects. However, it is currently unknown how precisely the positions of individual point-like stimuli can be estimated. In this study, we will use retinotopic mapping and population receptive field (pRF) techniques to delineate visual regions and then estimate the precision with which we can recover the locations of point-like stimuli from brain activity alone. pRF mapping involves estimating, for each voxel, the coordinates and size of the region of the visual field within which stimuli evoke a response. These pRF estimates can then be inverted to reconstruct, from brain activity, what the subject was viewing during stimulation.
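To make the pRF forward model concrete, the sketch below (Python with NumPy) shows the standard 2D Gaussian formulation: each voxel's predicted response is the overlap between the stimulus aperture and a Gaussian receptive field centered at (x0, y0) with size sigma. The grid resolution, field-of-view extent, and the point-stimulus example are illustrative assumptions, not parameters of this study.

```python
# Minimal sketch of a 2D Gaussian pRF forward model (illustrative only).
import numpy as np

def gaussian_prf(x0, y0, sigma, extent=10.0, n=101):
    """2D Gaussian receptive field on an n x n grid spanning +/- extent degrees."""
    coords = np.linspace(-extent, extent, n)
    xx, yy = np.meshgrid(coords, coords)
    rf = np.exp(-((xx - x0) ** 2 + (yy - y0) ** 2) / (2 * sigma ** 2))
    return rf / rf.sum()  # normalize so responses are comparable across pRF sizes

def predicted_response(stimulus_aperture, x0, y0, sigma):
    """Predicted (pre-hemodynamic) response: overlap of stimulus aperture and pRF."""
    rf = gaussian_prf(x0, y0, sigma, n=stimulus_aperture.shape[0])
    return float((stimulus_aperture * rf).sum())

# Example: a 0.5-degree dot at (2, 0) degrees drives a voxel whose pRF is
# centered nearby far more than one centered in the opposite hemifield.
coords = np.linspace(-10, 10, 101)
xx, yy = np.meshgrid(coords, coords)
point_stim = ((xx - 2) ** 2 + yy ** 2) < 0.5 ** 2

print(predicted_response(point_stim, x0=2.0, y0=0.0, sigma=1.0))   # large overlap
print(predicted_response(point_stim, x0=-5.0, y0=0.0, sigma=1.0))  # near zero
```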

We will examine how precisely this approach can localize small point-like stimuli in the visual field from fMRI signals. Some trials will present expanding/contracting rings and rotating wedges that flicker between various textures, while other trials will present flickering dot stimuli. We hypothesize that, at least in early visual areas (V1-V3), the model will accurately estimate stimulus positions. These precision estimates will allow us to design an additional experiment in which we quantify the distortion caused by the presence of an object and attempt to isolate regions of the visual system that exhibit hallmarks of object-based warping.
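The inversion step can likewise be sketched under strong simplifying assumptions: weight each voxel's fitted pRF by its measured activity, sum the weighted fields into a visual-field reconstruction, and take the peak as the decoded stimulus location. The simulated voxel parameters, noise level, and peak-readout rule below are hypothetical and stand in for the study's actual fitted pRFs and decoding procedure.

```python
# Minimal sketch of pRF inversion: decode a point stimulus's location from
# simulated voxel activity, then report the localization error (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
extent, n = 10.0, 101
coords = np.linspace(-extent, extent, n)
xx, yy = np.meshgrid(coords, coords)

# Simulated "fitted" pRF parameters for 500 voxels tiling the visual field.
x0s = rng.uniform(-8, 8, 500)
y0s = rng.uniform(-8, 8, 500)
sigmas = rng.uniform(0.5, 2.0, 500)
prfs = np.stack([
    np.exp(-((xx - x) ** 2 + (yy - y) ** 2) / (2 * s ** 2))
    for x, y, s in zip(x0s, y0s, sigmas)
])

# Simulated noisy responses to a point stimulus at (2, -3) degrees.
true_pos = np.array([2.0, -3.0])
activity = (
    np.exp(-((x0s - true_pos[0]) ** 2 + (y0s - true_pos[1]) ** 2) / (2 * sigmas ** 2))
    + 0.1 * rng.standard_normal(500)
)

# Inversion: activity-weighted sum of pRFs; the peak is the decoded location.
recon = np.tensordot(activity, prfs, axes=1)
peak = np.unravel_index(np.argmax(recon), recon.shape)
decoded = np.array([xx[peak], yy[peak]])
print("decoded:", decoded, "error (deg):", np.linalg.norm(decoded - true_pos))
```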