Investigating the Mechanisms Underlying Infant Selective Attention to Multisensory Speech

Date Issued: 2015
Summary:
From syllables to fluent speech, infants must quickly learn to decipher linguistic information. To do so, they must use not only auditory but also visual perception to understand speech and language as a coherent multisensory event. Previous research by Lewkowicz and Hansen-Tift (2012) demonstrated that, over the course of development, infants shift their visual attention from the eyes to the mouth of a speaker's face as they become interested in speech production. This project examined how infants from 4 to 14 months of age allocate visual attention to increasingly complex speech tasks. In Experiment 1, infants were presented with upright and inverted faces vocalizing syllables. In response to the upright faces, 4-month-olds attended to the eyes, and 8- and 10-month-olds attended equally to the eyes and mouth; in response to the inverted faces, both the 4- and 10-month-olds attended equally to the eyes and mouth, but the 8-month-olds attended to the eyes. In Experiment 2, infants were presented with a phoneme matching task (Patterson & Werker, 1999, 2002, 2003). The 4-month-olds successfully matched the voice to the corresponding face, but older infants did not. Measures of selective attention during this task showed that the 4-month-olds attended more to the eyes, not to the redundant speech information available at the mouth, whereas the older infants attended equally to the eyes and mouth even though they did not match the voice to the face. Experiment 3 presented infants with a fluent speech matching task (Lewkowicz et al., 2015): although the 12- to 14-month-olds did not systematically match the voice to the corresponding face, they attended more to the mouth region, which would have provided them with the necessary redundant information.
Overall, these studies demonstrate that there are developmental changes in how infants distribute their visual attention to faces as they learn about speech and that the complexity of the speech is a critical factor in how they allocate their visual attention.
Name(s): Tift, Amy H., author
Bjorklund, David F., thesis advisor
Florida Atlantic University, degree grantor
Charles E. Schmidt College of Science
Department of Psychology
Type of Resource: text
Genre: Electronic Thesis Or Dissertation
Date Created: 2015
Publisher: Florida Atlantic University
Place of Publication: Boca Raton, Fla.
Physical Form: application/pdf
Extent: 103 p.
Language(s): English
Identifier: FA00004551 (IID)
Degree granted: Dissertation (Ph.D.)--Florida Atlantic University, 2015.
Collection: FAU Electronic Theses and Dissertations Collection
Note(s): Includes bibliography.
Subject(s): Child development
Cognition in infants
Interpersonal communication in infants
Language acquisition
Visual perception in infants
Held by: Florida Atlantic University Libraries
Sublocation: Digital Library
Persistent Link to This Record: http://purl.flvc.org/fau/fd/FA00004551
Use and Reproduction: Copyright © is held by the author, with permission granted to Florida Atlantic University to digitize, archive and distribute this item for non-profit research and educational purposes. Any reuse of this item in excess of fair use or other copyright exemptions requires permission of the copyright holder.
Use and Reproduction: http://rightsstatements.org/vocab/InC/1.0/
Host Institution: FAU
Is Part of Series: Florida Atlantic University Digital Library Collections.