🎠Social Psychology Unit 13 Review


13.4 Media and Technology in Social Influence


Written by the Fiveable Content Team • Last updated August 2025

Online Social Influence

Social Media's Impact on Persuasion

Social media platforms don't just connect people; they actively shape how we think, what we buy, and which ideas we adopt. The persuasion principles covered earlier in this unit (social proof, authority, liking) all show up in amplified form on these platforms.

Algorithms and personalized content are the foundation. Platforms track what you click, watch, and linger on, then serve you more of the same. This isn't neutral; it creates a feedback loop where certain messages get reinforced while others disappear from your feed entirely.
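The feedback loop described above can be sketched as a toy simulation. All of the numbers here (click probabilities, the reinforcement step) are invented for illustration; this is not any platform's actual algorithm, just the general mechanism of engagement-weighted sampling:

```python
import random

def simulate_feed(steps=1000, boost=0.05, seed=42):
    """Toy feedback loop: clicking a topic raises the odds it is shown again."""
    random.seed(seed)
    topics = ["politics", "sports", "cooking"]
    weights = {t: 1.0 for t in topics}  # start with a balanced feed
    for _ in range(steps):
        # the platform samples a topic proportionally to its current weight
        total = sum(weights.values())
        r, cum = random.uniform(0, total), 0.0
        for t in topics:
            cum += weights[t]
            if r <= cum:
                shown = t
                break
        # pretend this user clicks "politics" a bit more often than the rest
        clicked = random.random() < (0.6 if shown == "politics" else 0.3)
        if clicked:
            weights[shown] += boost  # reinforcement: engagement begets exposure
    return weights

final = simulate_feed()
# a modest click-rate difference compounds into a lopsided feed
print(final)
```

Even a small initial preference, reinforced on every click, ends up dominating the feed, which is the narrowing effect described above.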

Social proof goes digital through likes, shares, and comments. A post with 50,000 likes feels more credible than one with 12, even if the content is identical. This is the same social proof mechanism from offline persuasion, just made visible and quantifiable.

Social comparison intensifies online. Users constantly encounter curated highlight reels of other people's lives, which shifts self-perception and drives consumer behavior. Research consistently links heavy social media use with increased upward social comparison.

Viral content exploits network structure. When something gets shared enough, it reaches people far beyond the original audience, amplifying messages at a speed traditional media never could.
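The amplification above can be modeled as a simple branching process: if each person who sees the content shares it with r new people on average, total reach after g generations is the geometric sum of r^i. This is a deliberately simplified sketch, ignoring overlapping audiences and network structure:

```python
def viral_reach(r, generations):
    """Total people reached if each sharer passes content to r new people."""
    return sum(r ** g for g in range(generations + 1))

# r > 1: geometric blow-up; r < 1: the cascade fizzles out
print(viral_reach(2, 10))    # 2^0 + 2^1 + ... + 2^10 = 2047
print(viral_reach(0.5, 10))  # stays below 2 no matter how many generations
```

The threshold at r = 1 is why platforms and marketers care so much about share rates: a small nudge in how shareable content is separates a dead post from a cascade.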

Influencer marketing applies the liking and authority principles directly:

  • Micro-influencers (smaller, niche followings) tend to have higher engagement rates because their audiences see them as relatable and trustworthy
  • Macro-influencers (large followings) offer broader reach but often feel less personal, which can reduce perceived authenticity

Digital Marketing Strategies

Online persuasion adapts classic techniques to digital environments, often making them more targeted and harder to recognize.

  • Call-to-action buttons use strategic placement, color, and wording to guide behavior (e.g., "Start Your Free Trial" is more effective than "Submit")
  • A/B testing lets marketers compare two versions of a webpage or ad to see which one converts more users, essentially fine-tuning persuasion in real time
  • Retargeting ads follow you across platforms after you visit a product page, using the mere exposure effect to keep the product in your mind
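The A/B testing described above reduces to comparing conversion rates between two randomly assigned groups. A minimal sketch with invented numbers, using a standard two-proportion z-test (a textbook method, not any particular marketing tool's implementation):

```python
import math

def ab_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate reliably higher?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under "no difference"
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return p_a, p_b, z

# hypothetical experiment: "Submit" button vs "Start Your Free Trial"
p_a, p_b, z = ab_test(conv_a=120, n_a=2000, conv_b=168, n_b=2000)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  z = {z:.2f}")
```

A z-score above roughly 1.96 means the difference is unlikely to be chance at the conventional 5% level, which is the signal marketers use to pick the winning variant and move on to the next test.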

Viral marketing is designed from the start to be shared. Two triggers that reliably drive sharing:

  • Emotional appeal: Content that evokes strong emotions (heartwarming stories, outrage) gets shared far more than neutral content
  • Humor and entertainment: Memes spread because they're fun to pass along, and the brand message rides along with them

Influencer collaborations work because they disguise marketing as personal recommendation. Product reviews, unboxing videos, and behind-the-scenes content all build trust by making the promotion feel organic rather than scripted.


Information Bubbles

Echo Chambers and Filter Bubbles

These are related but distinct concepts, and the distinction matters.

An echo chamber forms when people actively surround themselves with like-minded voices. You follow accounts you agree with, join groups that share your views, and gradually stop encountering opposing perspectives. Confirmation bias drives this: we naturally seek out information that supports what we already believe. Over time, group polarization kicks in, meaning the group's views become more extreme than any individual member's original position.

A filter bubble is created for you by algorithms. Even if you didn't deliberately seek out one-sided content, the platform's personalization system notices what you engage with and feeds you more of it. You may not even realize your information diet is being narrowed.

Both phenomena lead to the same outcomes: increased political polarization, reduced understanding of other viewpoints, and decreased empathy for people who disagree with you. Social media platforms intensify both because their business model rewards engagement, and content that confirms your existing beliefs keeps you scrolling.


Algorithmic Persuasion and User Manipulation

Platforms don't just filter content passively. They actively shape your behavior through several mechanisms:

  • Recommendation systems suggest content based on what similar users engaged with, gradually steering your preferences in directions you didn't consciously choose
  • Machine learning algorithms analyze massive amounts of user data (clicks, watch time, search history) to predict what will keep you on the platform longest
  • Personalized advertising targets you based on browsing history, demographics, location, and even inferred personality traits
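The "similar users" logic in the first bullet can be sketched with a toy collaborative filter: measure overlap between users' histories, then suggest what the most similar user engaged with. The users and items here are invented, and real systems use far richer signals, but the steering mechanism is the same:

```python
def jaccard(a, b):
    """Overlap between two users' liked-item sets (0 = disjoint, 1 = identical)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def recommend(target_likes, other_users):
    """Suggest items the most similar other user liked that the target hasn't seen."""
    best = max(other_users, key=lambda u: jaccard(target_likes, other_users[u]))
    return sorted(other_users[best] - target_likes)

# invented viewing histories
users = {
    "ana":  {"cooking", "fitness", "politics"},
    "ben":  {"gaming", "music"},
    "cara": {"cooking", "gardening"},
}
print(recommend({"cooking", "fitness"}, users))  # → ['politics']
```

Note that the target user never asked for political content; it arrives purely because a similar user engaged with it. That is the "steering your preferences in directions you didn't consciously choose" described above.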

Platforms also use design features that exploit known psychological tendencies:

  • Gamification (streaks on Snapchat, badges on Reddit) taps into variable reward schedules to keep you coming back
  • Infinite scrolling removes natural stopping points, so there's no moment where you decide "that's enough"
  • Autoplay on video platforms exploits inertia; it's easier to keep watching than to actively stop

These aren't accidental design choices. They're built on behavioral psychology research to maximize the time you spend on the platform.

Manipulative Digital Content

Digital Propaganda Techniques

Digital propaganda uses many of the same principles as historical propaganda but operates faster and at greater scale. Key techniques to recognize:

  • Astroturfing: Creating the appearance of a grassroots movement that doesn't actually exist. Fake social media accounts post coordinated messages to make a fringe opinion look mainstream.
  • Coordinated inauthentic behavior: Networks of fake accounts and bots amplify specific messages, making them appear more popular or widespread than they are.
  • Deepfakes: AI-generated audio or video that makes it look like someone said or did something they never did. The technology is improving rapidly, making detection harder.
  • Selective fact presentation: Choosing real but cherry-picked facts to frame an issue in a misleading way. This is harder to counter than outright lies because the individual facts may be true.
  • Emotional manipulation: Content designed to trigger fear, anger, or hope bypasses critical thinking and drives impulsive sharing.
  • Cross-platform repetition: The same message appears across multiple platforms and accounts, creating an illusion of consensus through sheer volume.

Misinformation and Disinformation Spread

The difference between these two terms is about intent:

Misinformation is false or misleading content shared without the intent to deceive. The person sharing it genuinely believes it's true. Common examples include misinterpreted scientific studies that lead to health scares, or images and videos shared out of their original context to support a false narrative.

Disinformation is deliberately created and spread to deceive. State-sponsored operations targeting foreign elections and political groups fabricating stories to discredit opponents both fall into this category. The creators know the content is false.

Social media amplifies both types because sharing is frictionless. A false claim can reach millions before anyone fact-checks it. Research shows that false news stories spread faster on social media than true ones, partly because they tend to be more novel and emotionally provocative.

Efforts to combat this problem include:

  • Fact-checking organizations that verify claims, though they consistently struggle to keep pace with the volume of false content
  • Digital literacy education that teaches users to evaluate sources, check claims, and recognize manipulation tactics
  • Platform content moderation policies that flag or remove harmful content, though these raise ongoing debates about effectiveness and free expression

The most reliable defense is individual critical thinking: checking sources, questioning emotional reactions to content, and seeking out multiple perspectives before forming conclusions.