Fiveable

🤔Cognitive Psychology Unit 18 Review

18.1 Common Cognitive Biases

Written by the Fiveable Content Team • Last updated August 2025

Understanding Cognitive Biases

Cognitive biases are systematic patterns of deviation from rational judgment. They arise because our brains rely on mental shortcuts (called heuristics) to process information quickly. These shortcuts are often useful, but they can also produce predictable errors in how we reason, interpret evidence, and make decisions.

Understanding these biases matters for cognitive psychology because they reveal how the mind actually works, not how it should work according to logic. They show up everywhere: personal finance, medical decisions, politics, and everyday social interactions.

Cognitive Biases in Decision Making

A cognitive bias isn't a random mistake. It's a systematic error, meaning it follows a predictable pattern across people and situations. Biases typically stem from heuristics, which are mental rules of thumb the brain uses to save time and effort.

Here's how biases affect thinking at different levels:

  • They distort how we process information. Instead of weighing all evidence equally, biases cause us to filter, emphasize, or ignore certain inputs. The sunk cost fallacy is a clear example: you keep investing in a failing project because you've already spent so much on it, even though past spending shouldn't affect future decisions.
  • They skew our judgments about reality. The overconfidence effect leads people to overestimate the accuracy of their own knowledge or predictions. Studies consistently show that when people say they're "99% sure" of an answer, they're wrong about 20–40% of the time.
  • They create self-reinforcing patterns. Once a bias takes hold, it can shape what information you seek out next, which deepens the distortion over time.
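The sunk cost point can be made concrete with a toy calculation (all figures below are hypothetical, chosen only to illustrate the logic): money already spent is identical under every option, so it cancels out of any rational comparison of what to do next.

```python
# Toy illustration of why sunk costs are irrelevant to a rational choice.
# All dollar figures are hypothetical. A rational decision compares only
# *future* costs and benefits; money already spent is the same under every
# option, so it drops out of the comparison.

def net_future_value(future_benefit, future_cost):
    """Value of an option looking forward only (sunk costs excluded)."""
    return future_benefit - future_cost

sunk = 50_000  # already spent: identical under both options, so irrelevant

continue_project = net_future_value(future_benefit=30_000, future_cost=40_000)
abandon_project = net_future_value(future_benefit=0, future_cost=0)

# Continuing loses another $10,000 no matter what was spent before, so
# abandoning is the better choice even though it feels like "wasting" $50k.
print(continue_project, abandon_project)  # -10000 0
```

The sunk cost fallacy is exactly the failure to make this cancellation: the $50,000 already spent gets (wrongly) counted against abandoning the project.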
[Image: Cognitive biases in decision making (Raconteur)]

Common Types of Cognitive Biases

Confirmation bias is the tendency to seek out, notice, and remember information that supports what you already believe, while downplaying or ignoring evidence that contradicts it. For example, someone with a strong political opinion might read only news sources that align with their views and dismiss opposing coverage as unreliable. This isn't deliberate dishonesty; it happens automatically.

Anchoring bias occurs when an initial piece of information disproportionately influences subsequent judgments. In a classic study, participants who were first asked whether Gandhi died before or after age 140 gave significantly higher estimates of his actual age at death than participants anchored at age 9. In everyday life, this shows up in negotiations: the first price mentioned in a car deal sets a mental anchor that pulls all counteroffers toward it.

The availability heuristic leads people to estimate the probability of an event based on how easily examples come to mind, rather than on actual statistics. After extensive news coverage of a plane crash, many people overestimate the danger of flying, even though driving is statistically far more dangerous per mile traveled. Events that are vivid, recent, or emotionally charged are easier to recall, so they feel more likely than they actually are.

[Image: Cognitive biases in decision making (Sensemaking Resources, Education, and Community)]

Applications and Implications of Cognitive Biases

Real-World Impact of Cognitive Biases

Cognitive biases don't just show up in lab experiments. They shape real outcomes across multiple domains:

  • Individual decision making. The framing effect shows that how a choice is worded changes what people decide. Consumers are more likely to buy meat labeled "90% lean" than meat labeled "10% fat," even though the two descriptions are identical. Loss aversion, the tendency to feel losses roughly twice as strongly as equivalent gains, helps explain why investors hold onto losing stocks too long, hoping to avoid locking in a loss.
  • Group decision making. Groupthink occurs when a group prioritizes consensus over critical evaluation, leading to poor decisions. The 1986 Challenger disaster is a frequently cited case: engineers had concerns about the O-ring seals, but group pressure suppressed dissent. On social media, group polarization pushes people toward more extreme positions as they interact mainly with like-minded users in echo chambers.
  • Political and social contexts. The bandwagon effect drives people to support candidates or policies partly because others already do, which can influence election outcomes independent of policy substance. Selective exposure, where people choose media that confirms their existing views, creates filter bubbles that make it harder to encounter diverse perspectives.
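Loss aversion is commonly formalized with the prospect-theory value function of Tversky and Kahneman (1992). The sketch below uses their published median parameter estimates (alpha = 0.88, lam = 2.25); the specific dollar amounts are just illustrative.

```python
# Prospect-theory value function (Tversky & Kahneman, 1992).
# alpha = 0.88 and lam = 2.25 are their median parameter estimates;
# the $100 example amount is arbitrary.

def prospect_value(x, alpha=0.88, lam=2.25):
    """Subjective (felt) value of a gain or loss of objective size x."""
    if x >= 0:
        return x ** alpha            # gains are valued concavely
    return -lam * (-x) ** alpha      # losses are weighted by lam > 1

gain = prospect_value(100)    # felt value of winning $100
loss = prospect_value(-100)   # felt value of losing $100

# Because gain and loss have equal magnitude, the felt loss/gain ratio
# equals lam: the $100 loss "hurts" about 2.25 times as much.
print(gain, loss)
```

This asymmetry is also why framing matters: describing the same option as a loss ("10% fat") rather than a gain ("90% lean") pushes it into the steeper, loss side of the value function.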

Adaptive vs. Maladaptive Cognitive Biases

Not all biases are harmful. Many evolved because they provided real survival advantages.

  • Adaptive aspects. Heuristics allow rapid decision making when time and cognitive resources are limited. A quick "gut feeling" about danger, rooted in the availability of threat-related memories, can trigger a fight-or-flight response faster than careful deliberation would allow. In environments where speed matters more than precision, these shortcuts are genuinely useful.
  • Maladaptive aspects. The same shortcuts become problematic in contexts that require careful analysis. Confirmation bias can reinforce stereotypes and contribute to workplace discrimination. Anchoring can lead to unfair sentencing in courtrooms. When biases operate outside the environments they evolved for, they tend to produce errors rather than advantages.
  • Cultural and evolutionary context. Some biases, like in-group favoritism (preferring members of your own group), likely have evolutionary roots in tribal cooperation. But cultural factors also shape which biases are strongest and how they manifest. A bias that's adaptive in one context can be deeply maladaptive in another.

Mitigation strategies focus on two main approaches:

  1. Awareness and education. Simply learning about biases can reduce their influence, though it doesn't eliminate them. Knowing about anchoring, for instance, helps you consciously adjust away from an initial number.
  2. Structured decision-making frameworks. Techniques like requiring people to actively argue against their preferred conclusion (a "devil's advocate" approach), using checklists, or introducing delays before finalizing decisions can all reduce bias impact. These are sometimes grouped under the term cognitive debiasing.

The key takeaway: biases aren't signs of stupidity. They're built into how human cognition works. The goal isn't to eliminate them entirely but to recognize when they're likely to lead you astray and apply corrective strategies.