What is Statistical Learning? Insights from attempting to model the processes that give rise to sensitivity to statistical structure.

Thiessen, E.

Carnegie Mellon University

Statistical learning is described as the ability to adapt to statistical structure in the input. Indeed, across a variety of tasks, stimuli, and modalities, learners demonstrate sensitivity to a wide range of statistical structures: they are sensitive to both sequential and simultaneous co-occurrence statistics, as well as to statistical information about the distribution of elements in the input, such as frequency and variability. However, this very breadth of sensitivity poses problems for theoretical accounts of statistical learning, because it is not clear that the same underlying process could explain sensitivity to so many kinds of statistical structure, across so many tasks. In this talk, I will present a theoretical framework - the Extraction and Integration Framework - that attempts to provide a unified account of statistical learning. The Extraction and Integration Framework proposes that statistical learning can be explained as the interplay between two coordinated processes: Extraction, which chunks elements of the input into discrete representations, and Integration, which compares across these chunks to identify common patterns. In addition, I will present a computational model of the process of Integration (iMinerva), derived in part from exemplar memory models that simulate more general processes of human memory. The model makes a set of novel predictions about language acquisition, and about learning more generally, that I will briefly discuss. Finally, I will explain how this computational framework makes it possible to simulate both conditional and distributional statistical learning within the same computational architecture.
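To make the exemplar-memory lineage concrete, the following is a minimal sketch of a MINERVA 2-style exemplar model, the family from which the abstract says iMinerva is derived. All names, parameter choices, and data here are illustrative assumptions, not details of the talk's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

def similarity(probe, trace):
    # MINERVA 2-style similarity: dot product normalized by the number of
    # features that are nonzero in either the probe or the trace.
    relevant = np.count_nonzero((probe != 0) | (trace != 0))
    return probe @ trace / relevant if relevant else 0.0

def echo(probe, memory):
    # Activation is similarity cubed, so close matches dominate retrieval;
    # the echo is the activation-weighted sum of all stored traces.
    activations = np.array([similarity(probe, t) ** 3 for t in memory])
    return activations @ memory

# Store 50 traces: feature vectors over {-1, 0, +1} (hypothetical data).
memory = rng.choice([-1, 0, 1], size=(50, 20))

# Probe with a degraded copy of one trace (half its features zeroed out).
probe = memory[0].copy()
probe[rng.choice(20, size=10, replace=False)] = 0

# The echo's sign pattern largely restores the original trace where it had
# nonzero features, illustrating how integrating across stored exemplars
# can fill in a partial pattern.
recovered = np.sign(echo(probe, memory))
```

This kind of pattern completion from partial cues is one way an Integration-like comparison across stored chunks can surface common structure without any explicit rule extraction.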