Privacy concerns sit at the heart of every major debate about social media's role in society. You're being tested on your ability to analyze how platforms collect, use, and sometimes exploit user data—and what that means for individuals, communities, and democratic institutions. These concepts connect directly to larger themes you'll encounter throughout the course: platform economics, digital citizenship, algorithmic influence, and the tension between personalization and manipulation.
Don't just memorize a list of privacy problems. Instead, understand why each concern exists (hint: it usually traces back to business models), how platforms and users interact around these issues, and what trade-offs are involved. When you can explain the underlying mechanisms—not just name the problems—you'll be ready for any question the exam throws at you.
Social media platforms offer "free" services, but the real transaction involves your personal data. Under the surveillance capitalism model, user information becomes the product sold to advertisers and partners.
Compare: Data collection vs. targeted advertising—both rely on surveillance, but collection is the input while advertising is the output. FRQ tip: If asked about platform business models, connect these two concepts to explain why "free" services aren't actually free.
Beyond what you type and click, platforms increasingly capture biometric and physical-world data about who you are and where you go. These technologies raise distinct consent and safety concerns.
Compare: Facial recognition vs. location tracking—both capture data about your physical self, but facial recognition identifies who you are while location data reveals where you are. Together, they create comprehensive surveillance profiles that follow you offline.
Even when platforms intend to protect data, technical and human vulnerabilities create opportunities for malicious actors to access personal information.
Compare: Data breaches vs. identity theft—breaches are the cause (platform failure), while identity theft is often the consequence (individual harm). Understanding this chain helps you analyze who bears responsibility for privacy failures.
Privacy violations don't just affect data—they create real harm to users' well-being, relationships, and sense of safety online.
Compare: Cyberbullying vs. permanence concerns—both involve harm from content, but cyberbullying is about malicious intent while permanence problems can arise from innocent posts taken out of context. Both demonstrate why privacy extends beyond just data protection.
The gap between the privacy protections platforms make available and how users actually manage them reveals how platform design choices shape privacy outcomes.
Compare: Privacy settings complexity vs. third-party sharing—one represents the illusion of user control while the other shows what happens behind the scenes regardless of settings. This tension is central to debates about meaningful consent in digital environments.
| Concept | Best Examples |
|---|---|
| Surveillance capitalism | Data collection, third-party sharing, targeted advertising |
| Biometric privacy | Facial recognition, location tracking |
| Security failures | Data breaches, identity theft |
| Consent problems | Privacy settings complexity, terms of service, default settings |
| Physical safety risks | Location services, doxxing, stalking |
| Psychological harm | Cyberbullying, permanence of content, context collapse |
| Platform accountability | Breach disclosure, harassment policies, dark patterns |
Which two privacy concerns both rely on the surveillance capitalism business model, and how do they function as input and output in that system?
Compare facial recognition and location tracking: What type of data does each capture, and how might they combine to create comprehensive surveillance?
If an FRQ asks you to analyze who bears responsibility for privacy failures, which concepts would you use to argue for platform accountability versus individual user responsibility?
Which privacy concerns demonstrate the gap between user control and actual data practices? Explain why having privacy settings doesn't guarantee privacy protection.
A user deletes an embarrassing post immediately after publishing it. Using your knowledge of digital permanence and context collapse, explain why this action might not fully protect their privacy.