2026
Stekelenburg, J.J., Baart, M., & Vroomen, J. (2026). A behavioral and electrophysiological investigation of the effect of horizontal head viewing angle on audiovisual speech integration. Brain and Language, 273. Doi:10.1016/j.bandl.2025.105688
2025
Stekelenburg, J.J., Limpens, J., Baart, M., & Vroomen, J. (2025). Electrophysiological correlates of selective speech adaptation. Brain and Language, 263. Doi:10.1016/j.bandl.2025.105545
2024
Pourhashemi, F., Baart, M., & Vroomen, J. (2024). Perceptual Adaptation to Noise-Vocoded Speech by Lip-Read Information: No Difference between Dyslexic and Typical Readers. Multisensory Research, 37(3), 243-259. Doi:10.1163/22134808-bja10125
Syed, I., Baart, M., & Vroomen, J. (2024). The Multimodal Trust Effects of Face, Voice, and Sentence Content. Multisensory Research, 37(2), 125-141. Doi:10.1163/22134808-bja10119
2023
Bertels, J., Niesen, M., Destoky, F., Coolen, T., Vander Ghinst, M., Wens, V., Rovai, A., Trotta, N., Baart, M., Molinaro, N., De Tiège, X., & Bourguignon, M. (2023). Neurodevelopmental oscillatory basis of speech processing in noise. Developmental Cognitive Neuroscience, 59. Doi:10.1016/j.dcn.2022.101181
2022
López Zunini, R.A., Baart, M., Samuel, A.G., & Armstrong, B.C. (2022). Lexico-semantic access and audiovisual integration in the aging brain: Insights from mixed-effects regression analyses of event-related potentials. Neuropsychologia, 165. Doi:10.1016/j.neuropsychologia.2021.108107
Pourhashemi, F., Baart, M., van Laarhoven, T., & Vroomen, J. (2022). Want to quickly adapt to distorted speech and become a better listener? Read lips, not text. PLoS ONE, 17(12). Doi:10.1371/journal.pone.0278986
2021
Guediche, S., de Bruin, A., Caballero-Gaudes, C., Baart, M., & Samuel, A.G. (2021). Second-language word recognition in noise: Interdependent neuromodulatory effects of semantic context and crosslinguistic interactions driven by word form similarity. NeuroImage, 237. Doi:10.1016/j.neuroimage.2021.118168
2020
Bourguignon, M., Baart, M., Kapnoula, E.C., & Molinaro, N. (2020). Lip-reading enables the brain to synthesize auditory features of unknown silent speech. Journal of Neuroscience, 40(5), 1053-1065. Doi:10.1523/JNEUROSCI.1101-19.2019
Burgering, M.A., van Laarhoven, T., Baart, M., & Vroomen, J. (2020). Fluidity in the perception of auditory speech: Cross-modal recalibration of voice gender and vowel identity by a talking face. Quarterly Journal of Experimental Psychology, 73(6), 957-967. Doi:10.1177/1747021819900884
Guediche, S., Baart, M., & Samuel, A.G. (2020). Semantic priming effects can be modulated by crosslinguistic interactions during second-language auditory word recognition. Bilingualism, 23(5), 1082-1092. Doi:10.1017/S1366728920000164
López Zunini, R.A., Baart, M., Samuel, A.G., & Armstrong, B.C. (2020). Lexical access versus lexical decision processes for auditory, visual, and audiovisual items: Insights from behavioral and neural measures. Neuropsychologia, 137. Doi:10.1016/j.neuropsychologia.2019.107305
McLean, M.A., Van den Bergh, B.R.H., Baart, M., Vroomen, J., & van den Heuvel, M.I. (2020). The late positive potential (LPP): A neural marker of internalizing problems in early childhood. International Journal of Psychophysiology, 155, 78-86. Doi:10.1016/j.ijpsycho.2020.06.005
2019
Barraza, P., Dumas, G., Liu, H., Blanco-Gomez, G., van den Heuvel, M.I., Baart, M., & Pérez, A. (2019). Implementing EEG hyperscanning setups. MethodsX, 6, 428-436. Doi:10.1016/j.mex.2019.02.021
Lindborg, A., Baart, M., Stekelenburg, J.J., Vroomen, J., & Andersen, T.S. (2019). Speech-specific audiovisual integration modulates induced theta-band oscillations. PLoS ONE, 14(7). Doi:10.1371/journal.pone.0219744
Modelska, M., Pourquié, M., & Baart, M. (2019). No "self" advantage for audiovisual speech aftereffects. Frontiers in Psychology, 10. Doi:10.3389/fpsyg.2019.00658
2018
Baart, M., & Vroomen, J. (2018). Recalibration of vocal affect by a dynamic face. Experimental Brain Research, 236(7), 1911-1918. Doi:10.1007/s00221-018-5270-y
2017
Baart, M., Armstrong, B.C., Martin, C.D., Frost, R., & Carreiras, M. (2017). Cross-modal noise compensation in audiovisual words. Scientific Reports, 7. Doi:10.1038/srep42055
Baart, M., Lindborg, A., & Andersen, T.S. (2017). Electrophysiological evidence for differences between fusion and combination illusions in audiovisual speech perception. European Journal of Neuroscience, 46(10), 2578-2583. Doi:10.1111/ejn.13734
2016
Baart, M. (2016). Quantifying lip-read-induced suppression and facilitation of the auditory N1 and P2 reveals peak enhancements and delays. Psychophysiology, 53(9), 1295-1306. Doi:10.1111/psyp.12683
2015
Baart, M., Bortfeld, H., & Vroomen, J. (2015). Phonetic matching of auditory and visual speech develops during childhood: Evidence from sine-wave speech. Journal of Experimental Child Psychology, 129, 157-164. Doi:10.1016/j.jecp.2014.08.002
Baart, M., & Samuel, A.G. (2015). Early processing of auditory lexical predictions revealed by ERPs. Neuroscience Letters, 585, 98-102. Doi:10.1016/j.neulet.2014.11.044
Baart, M., & Samuel, A.G. (2015). Turning a blind eye to the lexicon: ERPs show no cross-talk between lip-read and lexical context during speech sound processing. Journal of Memory and Language, 85, 42-59. Doi:10.1016/j.jml.2015.06.008
Shaw, K., Baart, M., Depowski, N., & Bortfeld, H. (2015). Infants' preference for native audiovisual speech dissociated from congruency preference. PLoS ONE, 10(4). Doi:10.1371/journal.pone.0126059
2014
Baart, M., Stekelenburg, J.J., & Vroomen, J. (2014). Electrophysiological evidence for speech-specific audiovisual integration. Neuropsychologia, 53(1), 115-121. Doi:10.1016/j.neuropsychologia.2013.11.011
Baart, M., Vroomen, J., Shaw, K., & Bortfeld, H. (2014). Degrading phonetic information affects matching of audiovisual speech in adults, but not in infants. Cognition, 130(1), 31-43. Doi:10.1016/j.cognition.2013.09.006
2013
Baart, M., Vroomen, J., Shaw, K.E., & Bortfeld, H. (2013). Phonetic information in audiovisual speech is more important for adults than for infants; preliminary findings. Auditory-Visual Speech Processing (AVSP) 2013, 61-64.
Zouridakis, G., Baart, M., Stekelenburg, J.J., & Vroomen, J. (2013). Speech perception: Single trial analysis of the N1/P2 complex of unimodal and audiovisual evoked responses. 13th IEEE International Conference on Bioinformatics and Bioengineering (IEEE BIBE 2013). Doi:10.1109/BIBE.2013.6701590
2012
Baart, M., De Boer-Schellekens, L., & Vroomen, J. (2012). Lipread-induced phonetic recalibration in dyslexia. Acta Psychologica, 140(1), 91-95. Doi:10.1016/j.actpsy.2012.03.003
2010
Baart, M., & Vroomen, J. (2010). Phonetic recalibration does not depend on working memory. Experimental Brain Research, 203(3), 575-582. Doi:10.1007/s00221-010-2264-9
Baart, M., & Vroomen, J. (2010). Do you see what you are hearing? Cross-modal effects of speech sounds on lipreading. Neuroscience Letters, 471(2), 100-103. Doi:10.1016/j.neulet.2010.01.019
2009
Vroomen, J., & Baart, M. (2009). Phonetic recalibration only occurs in speech mode. Cognition, 110(2), 254-259. Doi:10.1016/j.cognition.2008.10.015
Vroomen, J., & Baart, M. (2009). Recalibration of Phonetic Categories by Lipread Speech: Measuring Aftereffects after a 24-hour Delay. Language and Speech, 52(2-3), 341-350. Doi:10.1177/0023830909103178
2007
Vroomen, J., van Linden, S., & Baart, M. (2007). Lipread Aftereffects in Auditory Speech Perception: Measuring Aftereffects After a Twenty-Four Hours Delay. Auditory-Visual Speech Processing (AVSP) 2007.