Heart rate as a physiological marker of infants' attention in a statistical learning task

Abu-Zhaya, R. 1, Tonnsen, B. 2, Francis, A. 2, & Seidl, A. 2

1 The Hebrew University of Jerusalem
2 Purdue University

Human infants receive multimodal input that is replete with probabilistic information. For example, touch occurs systematically with infant-directed speech (Abu-Zhaya et al., 2017), as do visual cues such as facial expressions (Nomikou & Rohlfing, 2011) and object motion (e.g., Gogate et al., 2000). Infants learn from such systematicity in the input, treating high-probability cross-modal patterns differently from low-probability ones for both audio-tactile (Seidl et al., 2015) and audio-visual inputs (Smith et al., 2014). However, it remains unclear what role attention plays in such implicit learning tasks (Perruchet & Pacton, 2006; Saffran et al., 1997). The current study addresses this issue by investigating the role of attention in infants' use of probabilistic cues in a cross-modal audio-tactile task.
Using the same cross-modal touch-speech input stream as in Seidl et al. (2015), we tested 21 infants on their sensitivity to high- vs. low-probability cross-modal sequences and explored whether heart-rate-defined sustained attention underlies infants' ability to track probabilistic audio-tactile patterns. Results showed that infants behaviorally differentiated high- from low-probability items. Further, audio-tactile events triggered sustained attention more than audio-only events, regardless of event probability. These findings shed light on the role of attentional processes in statistical learning.
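As an illustration of what a heart-rate-defined measure of sustained attention can look like, the sketch below implements a common deceleration-based criterion (in the tradition of Richards' heart-rate-defined attention phases): sustained attention is scored when beat-by-beat heart rate drops a fixed amount below a prestimulus baseline and stays there for several consecutive beats. The function name, thresholds, and example data are hypothetical and are not taken from the present study's pipeline.

```python
import numpy as np

def sustained_attention_epochs(hr_bpm, baseline_bpm, drop_bpm=5, min_beats=5):
    """Return (onset, offset) index pairs where heart rate stays at least
    `drop_bpm` below the prestimulus baseline for `min_beats` consecutive
    beats. Illustrative thresholds only; not the authors' exact criteria."""
    below = hr_bpm <= (baseline_bpm - drop_bpm)
    epochs, start = [], None
    for i, flag in enumerate(below):
        if flag and start is None:
            start = i                      # deceleration begins
        elif not flag and start is not None:
            if i - start >= min_beats:     # long enough to count as sustained
                epochs.append((start, i))
            start = None
    if start is not None and len(below) - start >= min_beats:
        epochs.append((start, len(below)))
    return epochs

# Hypothetical beat-by-beat heart rate (bpm) around a stimulus onset
hr = np.array([132, 131, 133, 130, 126, 125, 124, 124, 125, 126, 130, 132, 133])
baseline = np.median(hr[:4])               # prestimulus baseline
print(sustained_attention_epochs(hr, baseline))   # -> [(4, 10)]
```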