Syntactic structures aren't just abstract grammar rules—they reveal how your brain organizes and produces language in real time. When you study these concepts, you're exploring the fundamental architecture that allows humans to generate an infinite number of sentences from a finite set of rules. This connects directly to bigger questions you'll encounter throughout linguistics: What do we innately know about language? How do meaning and form interact? What makes human language unique?
Understanding these concepts will help you analyze sentences systematically, recognize patterns across different languages, and grasp why certain constructions feel "right" or "wrong" to native speakers. Don't just memorize definitions—know what principle each concept demonstrates about how language works. Whether you're diagramming a sentence or explaining why recursion matters, you're being tested on your ability to connect structure to meaning and theory to evidence.
Every sentence follows an invisible blueprint. These concepts explain the hierarchical organization that underlies all human language—the idea that words don't just string together linearly but nest inside increasingly complex structures.
Compare: Phrase structure rules vs. syntactic trees—rules generate structures while trees represent them visually. Think of rules as the recipe and trees as the photograph of the finished dish. On exams, you may need to write rules that produce a given tree or draw trees that reflect specific rules.
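The recipe-versus-photograph distinction can be sketched in a few lines of Python. This is a toy illustration only: the mini-grammar, the words, and the function names are invented for the example, and real phrase structure grammars are far richer.

```python
# Toy phrase structure rules: each category maps to an expansion.
# (Invented mini-grammar for illustration, e.g. S -> NP VP.)
RULES = {
    "S":  ["NP", "VP"],
    "NP": ["Det", "N"],
    "VP": ["V", "NP"],
}
WORDS = {"Det": "the", "N": "dog", "V": "chased"}  # toy lexicon

def build_tree(category):
    """Apply the rules top-down to produce a nested tree structure."""
    if category in WORDS:                      # lexical category: insert a word
        return (category, WORDS[category])
    return (category, [build_tree(c) for c in RULES[category]])

tree = build_tree("S")
print(tree)
```

Here `RULES` plays the role of the phrase structure rules (the recipe), while the nested tuple returned by `build_tree` is the syntactic tree (the photograph): the same hierarchical structure a linguist would draw, just encoded as data.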
Noam Chomsky transformed linguistics by proposing that sentences have two levels of representation—an abstract meaning level and a concrete output level. These concepts explain how the same underlying idea can appear in different forms.
Compare: Deep structure vs. surface structure—deep structure is what you mean, surface structure is what you say. A classic example: "The chicken is ready to eat" has one surface structure but two possible deep structures (the chicken will eat, or someone will eat the chicken).
These concepts address the nature of linguistic knowledge itself—what speakers implicitly know, how that knowledge is organized, and what's universal across all human languages.
Compare: Generative grammar vs. Universal Grammar—generative grammar describes how rules generate sentences in a specific language, while Universal Grammar proposes what's shared across all languages. UG is the innate toolkit; generative grammar is the language-specific implementation.
These concepts zoom in on the internal architecture of phrases—how individual phrases are organized and what makes recursion possible.
Compare: X-bar theory vs. recursion—X-bar theory explains the internal structure of individual phrases, while recursion explains how phrases can contain other phrases of the same type. X-bar gives you the blueprint for one floor; recursion lets you stack infinite floors.
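The "stack infinite floors" idea can be made concrete with a short sketch: because a sentence (S) can contain another S, one finite rule yields unboundedly many sentences. The example sentences and the `embed` helper are invented for illustration.

```python
# Toy illustration of recursion: a clause embedded inside another clause.
# One finite rule (S -> NP V that S) licenses unbounded nesting.
def embed(clause, depth):
    """Wrap a clause inside 'Mary thinks that ...' `depth` times."""
    for _ in range(depth):
        clause = "Mary thinks that " + clause
    return clause

print(embed("the dog barked", 0))  # the dog barked
print(embed("the dog barked", 2))  # Mary thinks that Mary thinks that the dog barked
```

No matter how large `depth` gets, the result is still a grammatical sentence, which is the sense in which a finite grammar generates an infinite set of sentences.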
| Concept | Best Examples |
|---|---|
| Hierarchical organization | Phrase structure rules, Constituent structure, Syntactic trees |
| Meaning-form relationship | Deep structure, Surface structure, Transformational grammar |
| Implicit linguistic knowledge | Generative grammar, Grammaticality judgments |
| Innateness and universals | Universal Grammar, Recursion |
| Phrase-internal structure | X-bar theory, Constituent structure |
| Infinite generativity | Recursion, Generative grammar, Phrase structure rules |
Which two concepts both address the hierarchical organization of sentences but differ in whether they generate structures or represent them visually?
Explain how deep structure and surface structure account for the ambiguity in a sentence like "Visiting relatives can be annoying."
Compare generative grammar and Universal Grammar: What question does each concept primarily answer about human language?
If a linguist asks native speakers whether "The was dog happy" is acceptable, which concept are they using as their primary evidence, and what does this reveal about linguistic knowledge?
How does recursion relate to the claim that human language is fundamentally different from animal communication? Give an example of an embedded structure that demonstrates this property.