Intro to Cognitive Science Unit 4 Review


4.1 Linguistic theory and language structure


Written by the Fiveable Content Team • Last updated August 2025

Language structure is the foundation of human communication. From the smallest sound units to complex sentences, it's a hierarchical system that lets you express an infinite number of ideas using a finite set of rules. Understanding these components helps explain how we process and produce language.

Linguistic theories attempt to explain how language works in our minds. Whether language relies on innate grammar or emerges from general cognition, these theories address universal patterns across languages and how we acquire them.

Language Structure and Linguistic Theory

Components of language structure

Language can be broken down into distinct levels, each with its own rules. Think of it like zooming in: sentences are made of phrases, phrases of words, words of meaningful parts, and those parts of individual sounds.

  • Phonology studies the sound system of a language.
    • Phonemes are the smallest units of sound that distinguish meaning. For example, swapping /p/ for /b/ turns "pat" into "bat," so /p/ and /b/ are separate phonemes in English.
    • Phonological rules govern how phonemes can be combined and pronounced. English allows "str" at the start of a word ("string") but not "tsr."
  • Morphology examines the internal structure of words.
    • Morphemes are the smallest meaningful units of language.
      • Free morphemes can stand alone as words ("cat," "run").
      • Bound morphemes must attach to other morphemes ("-s" for plural, "-ed" for past tense, "un-" for negation).
    • Inflectional morphology modifies a word's form to express grammatical categories like tense or number ("walk" → "walked," "dog" → "dogs"). The word's core meaning and part of speech stay the same.
    • Derivational morphology creates entirely new words by adding affixes ("happy" โ†’ "unhappy," "teach" โ†’ "teacher"). These often change the word's part of speech or core meaning.
  • Syntax investigates the rules governing how words combine into phrases and sentences.
    • Phrase structure refers to the hierarchical grouping of words into constituents (noun phrases, verb phrases, etc.).
    • Grammatical categories are parts of speech: nouns, verbs, adjectives, and so on.
    • Syntactic rules determine whether a sentence is well-formed. "The dog chased the cat" follows English syntax; "Dog the cat the chased" does not.
  • Semantics explores meaning in language.
    • Lexical semantics focuses on the meaning of individual words.
    • Compositional semantics examines how meaning is built up from combining words. "The dog bit the man" means something different from "The man bit the dog," even though the words are identical.
    • Pragmatics considers how context influences interpretation. If someone says "Can you pass the salt?" they're making a request, not asking about your physical ability.
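These levels can be made concrete with a short sketch. The toy Python code below (not a real linguistic analyzer; the onset list and affix lists are illustrative samples, not exhaustive inventories) checks a word's initial consonant cluster against a few permitted English onsets, and peels known bound morphemes off a word to expose its free-morpheme stem:

```python
# Toy phonotactics: English permits the onset cluster "str" but not "tsr".
# The set below is a small illustrative sample of legal English onsets;
# "" covers vowel-initial words like "apple".
ALLOWED_ONSETS = {"", "str", "spr", "spl", "st", "sp", "tr", "bl",
                  "b", "p", "s", "t", "d", "k", "m", "n"}

def legal_onset(word: str) -> bool:
    """Check whether the word's initial consonant cluster is a permitted onset."""
    vowels = "aeiou"
    onset = ""
    for ch in word:
        if ch in vowels:
            break
        onset += ch
    return onset in ALLOWED_ONSETS

# Toy morphology: strip a few bound morphemes (affixes) from a word.
PREFIXES = ["un"]
SUFFIXES = ["ness", "ed", "er", "s"]

def segment(word: str) -> list:
    """Split a word into a prefix list, a stem, and a suffix list."""
    prefixes = []
    for p in PREFIXES:
        if word.startswith(p):
            prefixes.append(p + "-")
            word = word[len(p):]
    suffixes = []
    stripped = True
    while stripped:
        stripped = False
        for s in SUFFIXES:
            # Length guard keeps us from stripping a "suffix" out of a short stem.
            if word.endswith(s) and len(word) > len(s) + 2:
                suffixes.insert(0, "-" + s)
                word = word[:-len(s)]
                stripped = True
                break
    return prefixes + [word] + suffixes

print(legal_onset("string"))   # True  ("str" is a legal onset)
print(legal_onset("tsring"))   # False ("tsr" is not)
print(segment("dogs"))         # ['dog', '-s']
print(segment("unhappiness"))  # ['un-', 'happi', '-ness']
# Note the spelling alternation: the stem of "unhappiness" surfaces as
# "happi" rather than "happy" -- real morphological analysis must handle
# such orthographic and phonological changes, which this toy ignores.
```

The prefix and suffix lists here are hypothetical; a serious analyzer would need a full affix inventory plus rules for spelling changes like "happy" → "happi-".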

Hierarchical organization of language

Language isn't just a flat string of sounds. It's organized in layers, with smaller units nesting inside larger ones:

  1. Phonemes combine to form morphemes
  2. Morphemes combine to form words
  3. Words combine to form phrases
  4. Phrases combine to form sentences

This hierarchical structure enables two powerful properties:

  • Recursion allows phrases to be embedded within other phrases. You can say "The cat sat on the mat," but you can also say "The cat that the dog chased sat on the mat that was near the door that..." and keep going. This is what makes it possible to generate an infinite number of sentences from a finite grammar.
  • Productivity is the ability to create and understand sentences you've never encountered before. You've probably never read this exact sentence, yet you understand it perfectly.
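Recursion and productivity can be demonstrated with a miniature grammar. The sketch below (a deliberately tiny, made-up context-free grammar; real grammars are vastly larger) has one recursive rule, NP → "the" N RC, yet that single rule already lets the grammar produce unboundedly many distinct sentences:

```python
import random

# A miniature context-free grammar as a dict from symbols to lists of
# possible expansions. "RC" (relative clause) makes NP recursive:
# an NP can contain an RC, which contains a VP, which contains an NP...
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"], ["the", "N", "RC"]],   # second option is recursive
    "RC": [["that", "VP"]],
    "VP": [["V", "NP"], ["V"]],
    "N":  [["cat"], ["dog"], ["mat"]],
    "V":  [["chased"], ["saw"], ["sat"]],
}

def generate(symbol="S", depth=0, max_depth=4):
    """Randomly expand a symbol into a list of words.

    The depth cap only exists so this demo terminates; the grammar itself
    licenses sentences of any length.
    """
    if symbol not in GRAMMAR:
        return [symbol]                  # terminal: an actual word
    options = GRAMMAR[symbol]
    if depth >= max_depth:
        options = [options[0]]           # force the non-recursive NP expansion
    words = []
    for sym in random.choice(options):
        words.extend(generate(sym, depth + 1, max_depth))
    return words

print(" ".join(generate()))
# e.g. "the cat that saw the dog chased the mat" -- a sentence the
# grammar-writer never listed anywhere: finite rules, infinite output.
```

Raising `max_depth` allows ever deeper embedding ("the cat that the dog that... chased sat"), which is exactly the recursion described above.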

For cognitive processing, hierarchy matters in several ways:

  • Chunking groups linguistic elements into larger units so your working memory isn't overwhelmed. You don't process a sentence phoneme by phoneme; you group sounds into words and words into phrases.
  • Parsing is the incremental, real-time analysis of linguistic input to figure out its structure and meaning as you hear or read it.
  • Ambiguity resolution uses context and world knowledge to choose among multiple possible interpretations. "I saw the man with the telescope" could mean you used a telescope to see him, or that he was holding one. Your brain resolves this quickly, often without you noticing.
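The telescope sentence is a classic case of structural ambiguity: one word string, two parse trees. The sketch below writes the two readings as nested tuples (hypothetical bracketings for illustration, using standard phrase-structure labels S, NP, VP, PP) and shows that both trees flatten to the identical surface string:

```python
# Reading 1: the PP "with the telescope" attaches to the VP,
# describing the instrument of seeing.
instrument = ("S", ("NP", "I"),
                   ("VP", ("V", "saw"),
                          ("NP", "the man"),
                          ("PP", "with the telescope")))

# Reading 2: the PP attaches inside the NP,
# describing the man who has the telescope.
modifier = ("S", ("NP", "I"),
                 ("VP", ("V", "saw"),
                        ("NP", ("NP", "the man"),
                               ("PP", "with the telescope"))))

def leaves(tree):
    """Flatten a tree back to its surface word string (ignoring labels)."""
    if isinstance(tree, str):
        return [tree]
    return [word for subtree in tree[1:] for word in leaves(subtree)]

# Same surface string, different hierarchical structures:
print(leaves(instrument))            # ['I', 'saw', 'the man', 'with the telescope']
print(leaves(instrument) == leaves(modifier))  # True
print(instrument == modifier)                  # False
```

A parser confronted with this string must choose between the two structures; human comprehenders do so using context and world knowledge, usually without conscious effort.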

Theories in linguistics

Two major theoretical frameworks offer competing views of how language relates to the rest of cognition.

Generative Grammar (Chomsky)

  • Language is an innate, domain-specific faculty, meaning it's a dedicated mental module separate from other cognitive abilities.
  • Universal Grammar (UG) is a proposed set of built-in principles and adjustable parameters that constrain all human languages. Children don't learn language from scratch; they're born with UG and just need to "set the switches" based on the language they hear.
  • This approach focuses on the formal, structural properties of language, especially syntax.
  • Syntax is treated as autonomous, operating independently from meaning and context.

Cognitive Linguistics

  • Language is not a separate module but an integral part of general cognitive processes like memory, attention, and categorization.
  • Meaning, context, and experience play central roles in shaping grammatical structure. Grammar isn't arbitrary; it's motivated by how we think.
  • Conceptual factors like metaphor and embodiment drive grammatical patterns. For example, we talk about time using spatial language ("looking forward to the future") because our understanding of time is grounded in bodily experience of space.
  • This framework rejects the idea of an autonomous syntax. Structure and meaning can't be cleanly separated.

Role of linguistic universals

Linguistic universals are properties shared by all (or nearly all) human languages. They come in two types:

  • Substantive universals are specific elements found across languages. All known languages have vowels and consonants, and all distinguish nouns from verbs.
  • Formal universals are abstract structural principles. For instance, all languages have a way to distinguish subjects from objects, even though they use different strategies (word order, case marking, etc.).

These universals play a role in shaping language structure. They constrain the possible forms languages can take, and they likely reflect underlying cognitive and communicative pressures that have shaped how languages evolve.

They also matter for language acquisition. The poverty of the stimulus argument is central here: children acquire complex grammar rapidly and with remarkable consistency, despite hearing limited and often messy input. Proponents of Universal Grammar argue this gap between input and output suggests children bring innate linguistic knowledge to the task.

That said, the concept of linguistic universals faces real challenges:

  • Linguistic diversity is vast. Languages vary enormously in their sound systems, word orders, and grammatical structures. Some proposed universals turn out to have exceptions (e.g., Pirahã, an Amazonian language, has been argued to lack recursion).
  • Language change means languages evolve over time, and features once thought to be universal can shift or disappear in particular languages, complicating claims about fixed universals.