Perceptual development


Visual preferences

Fantz (1961) demonstrated that infants only a week old show visual preferences for:

  • bullseye figures over striped figures
  • checkerboard figures over plain square figures
  • schematic faces over almost any other stimulus

Infants only 30 minutes old show a preference for tracking faces compared with scrambled figures and blank stimuli (Johnson et al., 1991). They also show a preference for direct eye-gaze at birth (Farroni et al., 2003).

Fantz (1964) showed that infants demonstrated a decline in visual preference after repeated exposure to a stimulus (habituation) and a subsequent recovery of interest (dishabituation) to novel stimuli.
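The habituation logic described above can be sketched computationally. In modern infant-controlled habituation designs, a common (though lab-specific) criterion is that looking time averaged over a sliding window of trials falls below some fraction, often 50%, of looking time on the first trials. The function name, window size, and threshold below are illustrative assumptions, not a description of Fantz's actual procedure.

```python
def habituation_trial(looking_times, window=3, threshold=0.5):
    """Return the 1-based trial on which habituation is declared, or
    None if the criterion is never met.

    Habituation criterion (illustrative): the mean looking time over a
    sliding window of `window` trials drops below `threshold` times the
    mean of the first `window` trials.
    """
    if len(looking_times) < 2 * window:
        return None  # too few trials to compare baseline vs. recent
    baseline = sum(looking_times[:window]) / window
    for i in range(window, len(looking_times) - window + 1):
        recent = sum(looking_times[i:i + window]) / window
        if recent < threshold * baseline:
            return i + window  # trial at which the window closes
    return None

# Example: looking times (seconds) decline with repeated exposure
times = [20.0, 18.0, 19.0, 12.0, 9.0, 7.0, 6.0, 5.0]
print(habituation_trial(times))  # habituation declared by trial 6
```

Dishabituation would then show up as a recovery of looking time when a novel stimulus is introduced after the criterion is met.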


Perceiving constant objects

Slater, Morison & Rose (1983): newborn infants can discriminate shapes, but they could be doing this in several ways: configural shape, contour density, or the orientation of lines. The way infants discriminate forms changes over the first months of life (Cohen & Younger, 1983).

Alan Slater used fixed-trial familiarisation to demonstrate that newborns can perceive object constancy in the first days of life. Fixed-trial familiarisation permits desensitising infants to certain aspects of forms. Both size constancy (Slater, Mattock & Brown, 1990) and shape constancy (Slater & Morison, 1985) were demonstrated.


Speech perception

Even foetuses can process auditory information.

  • DeCasper & Fifer (1980) used the non-nutritive sucking technique to show that newborns (3 days old) could differentiate their mother's voice from that of an unfamiliar mother
  • Hepper (2005) "foetal soap addiction": newborns could recognise the theme song of their mother's favourite soap
  • Spence & DeCasper (1986) showed that newborns whose mothers had read them The Cat in the Hat before birth preferred this passage to a different passage when both were read by a stranger

Phoneme discrimination: there is evidence that infants can distinguish many speech sounds by 1 to 2 months of age. Some phoneme distinctions exist only in certain languages. Werker & Tees (1984) showed that 6-month-olds initially appear able to discriminate between a wide range of phonemes, but that by 12 months they lose the ability to discriminate speech sounds which aren't differentiated in their own language; this is known as perceptual narrowing.


Multisensory perception

Humans are equipped with various highly specialised sensory systems that give us access to numerous types of information about the surrounding environment. Each sensory modality gives us a unique outlook on the world. The brain integrates multisensory information to provide a complete and coherent representation of what is being perceived, allowing appropriate behavioural responses to be generated.

According to the Early Integration Approach, the nervous system is multisensory right from its earliest developmental stages, possessing the capacity to detect redundant aspects of the surrounding environment (Robinson & Sloutsky, 2010). In support of this approach, Bower, Broughton & Moore (1970) observed that infants are able to move their hands toward visual targets as early as 6 days after birth, indicating that hand-eye coordination emerges very early in life.

Bahrick & Lickliter (2000) proposed the intersensory redundancy hypothesis to explain how infants perceive coherent, unified multimodal objects and events through different sensory modalities. Intersensory redundancy refers to the presentation of the same information, spatially coordinated and temporally synchronous, across two or more modalities, and is only possible for amodal properties that aren't specific to a single sense modality (e.g. shape, rhythm, duration, intensity).


Evidence for multisensory perception

Studies on the cross-modal transfer of information from touch to vision revealed that neonates are able to process and encode shape information about manually experienced objects, and can discriminate between subsequently presented visual objects (Streri, 2003).

Newborns can visually recognise a texture that they previously felt, and tactually recognise a texture that they previously saw (Sann & Streri, 2007). One-month-old infants can benefit from the tactile-oral properties of an object during visual recognition, showing a clear visual preference for objects with which they had been familiarised through oral presentation (Meltzoff & Borton, 1979).

The ability to perceive audio-visual relations emerges early in human development. Lewkowicz & Turkewitz (1980) showed that 3-week-old infants responded to the equivalence of different levels of auditory loudness and visual brightness on the basis of intensity.

On the basis of synchrony, newborn infants are able to associate objects and linguistic stimuli (Slater, Quinn, Brown & Hayes, 1999), and infants only a few hours old can learn sight-sound pairings (Morrongiello, Fenwick & Nutley, 1998). Bahrick (2001) found that as early as 4 weeks after birth, infants are sensitive to and able to learn arbitrary relations between audiovisual inputs. Four-month-old infants can connect and bind visual objects to the specific sounds those objects produce (Spelke, 1979). These are all signs of integration.


Failures in body perception

G. Stanley Hall (1898): "sometimes the hand would be stared at steadily, perhaps with growing intensity, until interest reached such a pitch that a grasping movement followed as if the infant tried by automatic action of the motor hand to grasp the visual hand, and it was switched out of the centre of vision and lost, as if it had magically vanished".

It may be that an even more basic multisensory task involves locating touch in external visual space. 6.5-month-olds made more incorrect manual responses to tactile stimuli when their hands were crossed than when they were uncrossed, indicating that touch is mapped to an external space.

Comparing 4- and 6-month-olds' foot responses to tactile stimuli in crossed and uncrossed feet postures: 4-month-olds made accurate responses and were unaffected by posture, while 6-month-olds made accurate responses with uncrossed feet but were at chance with crossed feet. Coding touches in external (visual) coordinates therefore develops between 4 and 6 months.

