Syntactic structures reveal how your brain organizes and produces language in real time. These aren't just abstract grammar rules. When you study them, you're exploring the architecture that allows humans to generate an infinite number of sentences from a finite set of rules. This connects to bigger questions you'll encounter throughout linguistics: What do we innately know about language? How do meaning and form interact? What makes human language unique?
Understanding these concepts will help you analyze sentences systematically, recognize patterns across languages, and grasp why certain constructions feel "right" or "wrong" to native speakers. Don't just memorize definitions. Know what principle each concept demonstrates about how language works. Whether you're diagramming a sentence or explaining why recursion matters, you're being tested on your ability to connect structure to meaning and theory to evidence.
Every sentence follows an invisible blueprint. These concepts explain the hierarchical organization underlying all human language. Words don't just string together in a line. They nest inside increasingly complex structures, with smaller units grouping into larger ones.
Phrase structure rules are the formal rules that specify how constituents combine to form grammatical sentences. A rule like S → NP VP says that a sentence consists of a noun phrase followed by a verb phrase.
A constituent is a group of words that functions as a single grammatical unit. In "the tall woman laughed," the words the tall woman form one constituent (a noun phrase) that acts as the subject.
Syntactic trees are visual diagrams of hierarchical structure. They show exactly how phrases nest within phrases, making relationships between words explicit: what modifies what, what's the head, what's the complement.
Compare: Phrase structure rules vs. syntactic trees: rules generate structures while trees represent them visually. Think of rules as the recipe and trees as the photograph of the finished dish. On exams, you may need to write rules that produce a given tree or draw trees that reflect specific rules.
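The recipe-versus-photograph contrast can be made concrete with a short sketch. The mini-grammar below (the `RULES` and `LEXICON` entries are invented for illustration, nowhere near a real English grammar) uses phrase structure rules to *generate* a sentence, while the nested tuples it builds are the *tree* that represents the result:

```python
import random

# Toy phrase structure rules (illustrative only, not a real English grammar).
# Each left-hand symbol rewrites as one of the listed right-hand sequences.
RULES = {
    "S":  [["NP", "VP"]],
    "NP": [["Det", "N"]],
    "VP": [["V", "NP"], ["V"]],
}
LEXICON = {
    "Det": ["the", "a"],
    "N":   ["woman", "ball"],
    "V":   ["laughed", "hit"],
}

def expand(symbol):
    """Apply the rules top-down, returning a nested (label, children) tree."""
    if symbol in LEXICON:
        return (symbol, random.choice(LEXICON[symbol]))
    rhs = random.choice(RULES[symbol])
    return (symbol, [expand(child) for child in rhs])

def leaves(tree):
    """Read the words off the tree's leaves, left to right."""
    label, content = tree
    if isinstance(content, str):
        return [content]
    return [word for child in content for word in leaves(child)]

tree = expand("S")
print(" ".join(leaves(tree)))  # e.g. "the woman hit a ball"
```

The same rules keep producing new grammatical sentences on every run, which is the finite-rules, unlimited-output point in miniature.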
Noam Chomsky transformed linguistics by proposing that sentences have two levels of representation: an abstract meaning level and a concrete output level. These concepts explain how the same underlying idea can appear in different forms.
Deep structure is the abstract representation where semantic relationships (who did what to whom) are directly encoded. Surface structure is the actual word order and form you hear or read after transformational rules have applied.
Chomsky introduced transformational grammar to explain how related sentences share underlying structure even when they look different on the surface. "John hit the ball" and "The ball was hit by John" express the same basic meaning through different surface arrangements.
Compare: Deep structure vs. surface structure: deep structure is what you mean, surface structure is what you say. A classic example: "The chicken is ready to eat" has one surface structure but two possible deep structures (the chicken is going to eat something, or someone is going to eat the chicken).
These concepts address the nature of linguistic knowledge itself: what speakers implicitly know, how that knowledge is organized, and what's universal across all human languages.
A generative grammar is a formal model of the unconscious rules that allow speakers to produce and understand sentences they've never heard before. The key insight is that a finite set of rules can generate an unlimited number of grammatical sentences.
Grammaticality judgments are native speaker intuitions about whether a sentence is acceptable. You know instantly that "The dog bit the man" is fine and that *"Bit dog the man the" is not, even if you can't explain the rule you're applying.
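One way to see the yes/no character of these judgments is to model them as a membership test against a grammar. The sketch below uses a tiny invented grammar (the word lists and the two allowed patterns are mine, vastly smaller than real English), accepting the well-formed string and rejecting the scrambled one:

```python
# Toy grammar (illustrative): a sentence is either Det N V or Det N V Det N.
DETS, NOUNS, VERBS = ["the", "a"], ["dog", "man"], ["bit", "saw"]

def grammatical(sentence):
    """Yes/no judgment: does the word string fit the toy grammar's patterns?"""
    words = sentence.lower().rstrip(".").split()
    intransitive = (len(words) == 3 and words[0] in DETS
                    and words[1] in NOUNS and words[2] in VERBS)
    transitive = (len(words) == 5 and words[0] in DETS and words[1] in NOUNS
                  and words[2] in VERBS and words[3] in DETS
                  and words[4] in NOUNS)
    return intransitive or transitive

print(grammatical("The dog bit the man"))   # True
print(grammatical("Bit dog the man the"))   # False
```

Note what the code mirrors about the intuition: the judgment is binary and immediate, and the speaker (like the function's caller) needs no access to the rules that produced it.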
Universal Grammar (UG) is the hypothesis that all human languages share deep structural properties, and that these properties reflect an innate language faculty. Despite enormous surface differences, languages consistently show features like noun phrases, verb phrases, and hierarchical structure.
Compare: Generative grammar vs. Universal Grammar: generative grammar describes how rules generate sentences in a specific language, while Universal Grammar proposes what's shared across all languages. UG is the innate toolkit; generative grammar is the language-specific implementation built on top of it.
These concepts zoom in on the internal architecture of phrases and what makes the system's infinite productivity possible.
X-bar theory proposes that all phrases, regardless of category (NP, VP, PP, etc.), share a common internal template. Every phrase has a head (the word that determines the phrase's category), a complement (the head's closest dependent, completing its meaning), and a specifier (an element in a higher position, like a determiner in a noun phrase).
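Purely as an illustration, the shared template can be sketched as one data structure reused for every category. The field names below are hypothetical labels for the three X-bar positions, and the example phrase "the student of physics" is a standard textbook case (N heads the NP, "the" sits in the specifier, and the PP "of physics" is the noun's complement):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class XBar:
    """One template for every phrase type: NP, VP, PP, and so on."""
    category: str                 # head's category; the phrase is category + "P"
    head: str                     # the word that determines the phrase's category
    specifier: Optional[str] = None       # higher position, e.g. a determiner
    complement: Optional["XBar"] = None   # the head's closest dependent

# "the student of physics"
pp = XBar("P", "of", complement=XBar("N", "physics"))
np = XBar("N", "student", specifier="the", complement=pp)
print(np.category + "P")  # NP
```

The point of the single class is the theory's point: NP, VP, and PP are not three different shapes but one shape with different heads plugged in.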
Recursion is the property that allows a phrase to contain another phrase of the same type, with no built-in limit. You can keep embedding: the cat that chased the rat that ate the cheese that was in the house that...
Compare: X-bar theory vs. recursion: X-bar theory explains the internal structure of individual phrases, while recursion explains how phrases can contain other phrases of the same type. X-bar gives you the blueprint for one floor; recursion lets you stack infinite floors.
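The floor-stacking metaphor fits in a few lines of code: an NP rule that optionally contains another NP reproduces the cat/rat/cheese example at any depth. The function below is a sketch using only the words from the example above (cycled when the depth exceeds the lists):

```python
def embed(depth, i=0):
    """NP -> 'the' N ('that' V NP)?  The optional NP inside an NP is the recursion."""
    nouns = ["cat", "rat", "cheese", "house"]
    verbs = ["chased", "ate", "was in"]
    head = f"the {nouns[i % 4]}"
    if depth == 0:
        return head
    return f"{head} that {verbs[i % 3]} " + embed(depth - 1, i + 1)

print(embed(3))
# the cat that chased the rat that ate the cheese that was in the house
```

Because the rule calls itself, there is no largest noun phrase the grammar can build, which is exactly the "no built-in limit" property.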
| Concept | Best Examples |
|---|---|
| Hierarchical organization | Phrase structure rules, Constituent structure, Syntactic trees |
| Meaning-form relationship | Deep structure, Surface structure, Transformational grammar |
| Implicit linguistic knowledge | Generative grammar, Grammaticality judgments |
| Innateness and universals | Universal Grammar, Recursion |
| Phrase-internal structure | X-bar theory, Constituent structure |
| Infinite generativity | Recursion, Generative grammar, Phrase structure rules |
Which two concepts both address the hierarchical organization of sentences but differ in whether they generate structures or represent them visually?
Explain how deep structure and surface structure account for the ambiguity in a sentence like "Visiting relatives can be annoying."
Compare generative grammar and Universal Grammar: What question does each concept primarily answer about human language?
If a linguist asks native speakers whether "The was dog happy" is acceptable, which concept are they using as their primary evidence, and what does this reveal about linguistic knowledge?
How does recursion relate to the claim that human language is fundamentally different from animal communication? Give an example of an embedded structure that demonstrates this property.