Spatiotemporal regularity and infants' visual predictions: an ERP study

Tummeltshammer, K.1, Pomiechowska, B.2,1, Gliga, T.1 & Kirkham, N.1

1 Birkbeck, University of London
2 Central European University, Budapest

Young infants are sensitive to the probabilistic structure of their visual environment and make use of statistical regularities when distributing attention (Kidd et al., 2012; Tummeltshammer & Kirkham, 2013). Infants differentiate among visual events of varying likelihoods, looking longer and deploying attention more rapidly to predictable, rather than unpredictable, events. Studies using event-related potentials (ERPs) have found differences in cortical signals associated with predictive and reactive saccades (Csibra et al., 2001). However, no studies to date have investigated the neural bases of probabilistic visual predictions in infants. In a visual expectancy paradigm, we presented 8-month-olds (n=22) with a 2x2 grid of four animations that appeared one at a time with probabilities of 100%, 75%, or 50%. Horizontal eye movements were coded from video and EOG. Infants looked away from the screen less often and made faster saccades to the two predictable (100% and 75%) events than to the two unpredictable (50%) events (looks away: F(3,54)=3.76, p=0.02; saccadic latencies: F(3,54)=3.92, p=0.01). Scalp EEG was recorded during viewing, and both stimulus-locked and saccade-locked ERPs were analyzed. Preliminary results indicate differences in the neural processing of probabilistic information, which may guide infants' visual attention and the rapid formation of visuospatial predictions.