
🤔Cognitive Psychology Unit 4 Review


4.2 Auditory Perception

Written by the Fiveable Content Team • Last updated August 2025

Auditory System and Sound Processing

Your ears convert vibrations in the air into electrical signals the brain can interpret. This process involves a chain of structures, each transforming the signal from one form to another. Understanding how sound moves from the outer ear to the auditory cortex is essential for grasping both normal hearing and what goes wrong when hearing is impaired.

Structures of the Auditory System

Sound passes through three main regions before reaching the brain:

Outer ear

  • The pinna (the visible part of your ear) collects sound waves and funnels them inward. Its folds and ridges also help you determine where sounds are coming from.
  • The ear canal (about 2–3 cm long) channels those waves toward the eardrum.

Middle ear

  • The tympanic membrane (eardrum) is a thin membrane (roughly 0.1 mm thick) that vibrates when sound waves hit it.
  • The ossicles are three tiny bones: the malleus, incus, and stapes. They amplify the eardrum's vibrations and transmit them to the inner ear. This amplification compensates for the impedance mismatch between air and the cochlear fluid: without it, most of the sound energy would reflect off the fluid boundary instead of entering the inner ear.
  • The Eustachian tube connects the middle ear to the throat and equalizes air pressure on both sides of the eardrum (it opens when you swallow or yawn).

Inner ear

  • The cochlea is a snail-shaped, fluid-filled structure about 35 mm long. It contains hair cells, which are the actual sensory receptors for hearing. These hair cells sit on the basilar membrane and perform transduction: converting mechanical vibrations into electrical signals.
  • The auditory nerve (containing roughly 30,000 nerve fibers) carries those electrical signals from the cochlea to the brain.

Auditory cortex

  • Located in the temporal lobe, the auditory cortex is where the brain processes and interprets sound, turning raw neural signals into what you experience as speech, music, or noise.
[Figure: Structures of the auditory system. Source: Hearing and Vestibular Sensation, Biology for Majors II]

Process of Auditory Transduction

Here's the step-by-step pathway from sound wave to brain signal:

  1. Sound waves enter the ear canal and strike the eardrum.
  2. The eardrum vibrates in response to the pressure changes.
  3. The ossicles amplify these vibrations and pass them to the oval window of the cochlea.
  4. Fluid inside the cochlea begins to move, creating a traveling wave along the basilar membrane.
  5. The movement bends tiny projections called stereocilia on top of the hair cells. This bending opens ion channels, allowing charged particles to flow in.
  6. The influx of ions triggers the hair cells to release neurotransmitters onto auditory nerve fibers.
  7. Those neurotransmitters generate action potentials in the auditory nerve.
  8. The auditory nerve transmits these signals to the auditory cortex for processing.

The key transformation here is at step 5: mechanical energy (vibration) becomes electrochemical energy (neural signals). That's transduction.

[Figure: Auditory pathways to the brain. Source: Auditory Pathways to the Brain, Introduction to Sensation and Perception]

Auditory Perception and Attention

Once signals reach the brain, the auditory system has to figure out what you're hearing, how loud it is, and where it's coming from. On top of that, attention determines which sounds you actually become aware of.

Principles of Sound Perception

Pitch perception

Two theories explain how the brain encodes pitch, and both are partially correct:

  • Place theory: Different frequencies activate different locations along the basilar membrane. High-frequency sounds vibrate the base of the cochlea (near the oval window), while low-frequency sounds vibrate the apex (the far tip). The brain reads pitch based on which region is most active. This works well for high-frequency sounds.
  • Temporal theory: For lower frequencies, auditory nerve fibers fire in sync with the peaks of the sound wave, a process called phase locking. The brain reads pitch based on the timing pattern of neural firing. This works well up to about 5,000 Hz.

For most everyday sounds, the brain likely uses both mechanisms together.
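The tonotopic mapping behind place theory can be sketched with the Greenwood function, a standard empirical fit relating position along the basilar membrane to characteristic frequency. The human constants below are the commonly cited values; treat this as an illustrative model rather than a precise anatomical claim:

```python
import math

def greenwood_cf(x: float) -> float:
    """Greenwood function: characteristic frequency (Hz) at a point on the
    basilar membrane, where x is the fractional distance from the apex (0.0)
    to the base (1.0). Human constants A=165.4, a=2.1, k=0.88 are the
    commonly cited values."""
    A, a, k = 165.4, 2.1, 0.88
    return A * (10 ** (a * x) - k)

# Place theory in action: the base (near the oval window) responds to
# high frequencies, while the apex responds to low frequencies.
for x in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"x = {x:.2f} -> ~{greenwood_cf(x):7.0f} Hz")
```

The full human hearing range (roughly 20 Hz to 20 kHz) falls out of the fit, with frequency rising steeply toward the base, which is part of why place coding is most informative for high frequencies.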

Loudness perception

  • The amplitude (height) of a sound wave corresponds to perceived loudness, measured in decibels (dB).
  • Weber's law states that the just-noticeable difference in intensity is a constant proportion of the starting intensity: ΔI / I = k. Fechner's extension (the Weber-Fechner law) is that perceived loudness grows logarithmically with physical intensity. In practical terms, going from 40 dB to 50 dB feels like a similar jump as going from 70 dB to 80 dB, even though the absolute energy increase is vastly different.
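The logarithmic decibel scale makes this concrete. A quick sketch, using the standard reference intensity of 10⁻¹² W/m² for sound in air:

```python
import math

I0 = 1e-12  # standard reference intensity for sound in air, W/m^2

def to_db(intensity: float) -> float:
    """Sound intensity level in decibels relative to I0."""
    return 10 * math.log10(intensity / I0)

def intensity_ratio(db_change: float) -> float:
    """Factor by which physical intensity changes for a given dB change."""
    return 10 ** (db_change / 10)

# Both 40 -> 50 dB and 70 -> 80 dB are a 10x jump in physical intensity,
# which is why they feel like similar increments in loudness.
print(intensity_ratio(50 - 40))  # 10.0
print(intensity_ratio(80 - 70))  # 10.0

# But the absolute energy added is very different: the 70 -> 80 dB jump
# adds 1000x more energy than the 40 -> 50 dB jump.
i40, i70 = 1e-8, 1e-5  # intensities at 40 dB and 70 dB, W/m^2
print((i70 * 9) / (i40 * 9))  # ratio of absolute intensity increases
```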

Sound localization

Your brain uses three main cues to figure out where a sound is coming from:

  • Interaural time difference (ITD): A sound coming from your left reaches your left ear slightly before your right ear (up to about 0.7 ms difference). The brain detects this tiny delay. ITDs are most useful for localizing low-frequency sounds.
  • Interaural level difference (ILD): Your head casts an "acoustic shadow," making a sound slightly quieter in the far ear. This cue is most useful for high-frequency sounds, because high frequencies are more easily blocked by the head.
  • Head-related transfer function (HRTF): The shape of your pinna, head, and shoulders filters sound in unique ways depending on the direction it comes from. These spectral cues are especially important for distinguishing whether a sound is above or below you, or in front versus behind.
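The ITD cue can be approximated with Woodworth's spherical-head formula, ITD ≈ (r/c)(sin θ + θ), where r is head radius, c the speed of sound, and θ the source azimuth. The head radius below (8.75 cm) is a typical textbook value, not a measurement:

```python
import math

HEAD_RADIUS = 0.0875    # typical adult head radius in meters (assumed)
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 C

def woodworth_itd(azimuth_deg: float) -> float:
    """Interaural time difference (seconds) for a distant source at the
    given azimuth (0 = straight ahead, 90 = directly to one side),
    using Woodworth's spherical-head approximation."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (math.sin(theta) + theta)

for az in (0, 30, 60, 90):
    print(f"{az:3d} deg -> ITD = {woodworth_itd(az) * 1e3:.3f} ms")
```

With these values, a source directly to one side yields an ITD of roughly 0.66 ms, consistent with the ~0.7 ms upper bound mentioned above.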

Role of Auditory Attention

Even though your ears pick up everything around you, you don't consciously process all of it. Attention acts as a filter, determining which sounds reach awareness.

  • Selective attention is the ability to focus on one auditory source while ignoring others. Researchers study this using dichotic listening tasks, where different messages are played to each ear and participants must attend to only one.
  • The cocktail party effect is a classic example: you can follow a single conversation in a loud, crowded room. You use a combination of spatial cues, voice characteristics, and even visual cues (like watching someone's lips) to separate that one voice from the background.
  • Bottom-up attention is involuntary. A sudden loud noise or someone shouting your name will grab your attention automatically, regardless of what you were focusing on.
  • Top-down attention is voluntary and goal-driven. If you're listening for a friend's voice in a crowd, you're actively directing your attention based on what you expect to hear.

Auditory scene analysis

Your brain constantly groups and separates the sounds around you into distinct auditory streams. It does this based on cues like pitch, timbre, spatial location, and timing. For example, you can distinguish a guitar from a voice in a song even though both sounds hit your ears simultaneously. This process, studied extensively by Albert Bregman, is sometimes called the auditory system's version of perceptual organization.

Effects of attention on processing

  • Attended sounds receive enhanced neural processing: neurons in the auditory cortex respond more strongly to sounds you're paying attention to.
  • Unattended sounds receive reduced processing, consistent with the idea of an attentional filter that dampens irrelevant input.
  • Inattentional deafness occurs when you completely fail to notice an auditory stimulus because your attention is focused elsewhere. This is the auditory equivalent of inattentional blindness (the famous "gorilla experiment" demonstrated the visual version, and analogous studies show the same thing happens with sound).