The role of visual preferences in later language learning
This project focuses on how children use semantic cues such as animacy and causality to develop linguistic concepts.
Although we know quite a lot about infants’ understanding of intentions and causality, there is a gap in our knowledge of how perceptual and cognitive biases and abilities in infancy link to later language comprehension and production. For example, how does what infants prefer to look at relate to the kinds of sentence structures they find easiest to learn? How does the animacy of the participants in a scene influence infant preferences? In language-learning children, how are these preferences influenced by different kinds of sentence structures?
We are exploring these issues by investigating 9-month-old children’s sensitivity to causality and animacy, and how these semantic features are distributed across various sentence structures in child-directed speech. We will also look at how toddlers process different sentence structures as a function of their semantic features.
Eye-tracking and electroencephalography (EEG) methods will be used to measure infant attention. In addition, eye-tracking and behavioural measures will be used to test both visual scene inspection and sentence interpretation strategies in toddlers.
To supplement these behavioural measures, we will analyse naturalistic language data collected from children and their caregivers to determine the distribution of various semantic cues in the speech addressed to young children. This will help us to better understand how language-specific measures might interact with early non-linguistic visual preferences in shaping children’s sentence interpretation strategies.
Start date: 1 March 2015
Duration: 3 years
(Work Package 2)