Visual Attention to Dynamic Spatial Relations in Infants and Adults
Uppsala University, Disciplinary Domain of Humanities and Social Sciences, Faculty of Social Sciences, Department of Psychology.
2016 (English). In: Infancy, ISSN 1525-0008, E-ISSN 1532-7078, Vol. 21, no. 1, pp. 90-103. Article in journal (Refereed). Published.
Resource type
Text
Abstract [en]

Previous research has found that kinematic features of interactions, such as spatial proximity, capture adult visual attention. The current research uses online measures of gaze behavior to determine attentional capture by objects with reduced interobject spacing in adults as well as in infants at 5 and 12 months. The three age groups observed three identical geometrical shapes that moved randomly. Relative distance between the objects was mapped, and intervals of high and low spatial proximity were identified. Findings demonstrate that only adults and 12-month-olds look significantly more at the objects that are close during instances of high spatial proximity, while 5-month-olds look at chance levels. The findings speak for a developmental trend in oculomotor processes, in which a bias to look at objects with high spatial proximity develops within the first year of life.

Place, publisher, year, edition, pages
2016. Vol. 21, no. 1, pp. 90-103.
National Category
Psychology
Identifiers
URN: urn:nbn:se:uu:diva-268915
DOI: 10.1111/infa.12091
ISI: 000368719000005
OAI: oai:DiVA.org:uu-268915
DiVA: diva2:881618
Funder
Swedish Research Council, VR-2011-1528
Available from: 2015-12-11. Created: 2015-12-11. Last updated: 2017-12-01. Bibliographically approved.
In thesis
1. Social causality in motion: Visual bias and categorization of social interactions during the observation of chasing in infancy
2017 (English). Doctoral thesis, comprehensive summary (Other academic).
Abstract [en]

Since the seminal work of Fritz Heider and Marianne Simmel (1944), the study of animacy perception, that is, the perception and attribution of life from the motion of simple geometrical shapes, has intrigued researchers. The intrigue for psychologists and vision scientists, then as now, has centered on the stark disconnect between the simplicity of the visual input and the universal richness of the resulting percept.

Infant research in this domain has become critical for examining the ontogenetic processes behind the formation of animate percepts. To date, little is known about how infants process these kinds of stimuli. While numerous habituation studies have shown sensitivity to animate motion in general, none so far has examined whether infants actually perceive animate displays as social interactions.

The overarching goal of the present thesis is to answer this question and to further augment knowledge about the mechanisms behind the formation of animate percepts in infancy. My collaborators and I do so in three ways, in three separate studies. First, we examined visual attention during online observation of randomly moving geometrical shapes in adults and infants (Study I, using eye tracking). Second, we examined the distribution of visual attention in infancy during online observation of non-contact causal interactions, focusing on the most ubiquitous and fitness-relevant of interactions, chasing (Study II, using eye tracking). Third, we addressed whether infants perceive social content in chasing displays by measuring the neural correlates of the response to chasing (Study III, using EEG).

The collective contribution of the present work is also threefold. First, it demonstrates that, starting at the end of the first year of life, the human visual system is sensitive to cues that efficiently predict an interaction. Second, by 5 months, infants begin allocating attention differently across agents within interactions. Finally, attention to specific objects is due not to low-level saliency but to the social nature of the interaction. On this basis, I present the case that the perception of social agents is fast and direct, and that it reflects the workings of a specialized learning mechanism whose function is the detection of heat-seeking animates in motion.

Place, publisher, year, edition, pages
Uppsala: Acta Universitatis Upsaliensis, 2017. 101 p.
Series
Digital Comprehensive Summaries of Uppsala Dissertations from the Faculty of Social Sciences, ISSN 1652-9030 ; 144
Keyword
social causality, motion, animacy perception, chasing, goal-directed motion, heat-seeking, EEG, P400, Nc, spatial proximity, non-contact causality, functional specialization, specialized perception, evolution
National Category
Psychology (excluding Applied Psychology)
Research subject
Psychology
Identifiers
URN: urn:nbn:se:uu:diva-321904
ISBN: 978-91-554-9943-3
Public defence
2017-08-31, Auditorium Minus, Gustavianum, Akademigatan 3, Uppsala, 10:15 (English)
Available from: 2017-06-07 Created: 2017-05-12 Last updated: 2017-09-20

Open Access in DiVA: No full text

Other links: Publisher's full text
Authority records: Galazka, Martyna; Nyström, Pär
