Basic elements of propositional logic
Propositional logic gives you a precise system for deciding whether complex statements are true or false. It's the backbone of mathematical reasoning: before you can write proofs or evaluate arguments, you need a way to break statements apart, combine them, and track their truth values. This section covers the building blocks: propositions, connectives, and truth tables.
Propositions and statements
A proposition is a declarative sentence that is either true or false, with no middle ground. "7 is a prime number" is a proposition (it's true). "Is 7 prime?" is not a proposition because questions don't have truth values.
- Atomic propositions are the simplest kind: single statements that can't be broken down further. Example: "It is raining."
- Compound propositions combine atomic propositions using logical connectives. Example: "It is raining and the ground is wet."
- In formal notation, propositions are denoted by lowercase letters like p, q, and r.
Logical connectives
Connectives are the glue that joins simple propositions into compound ones. There are five you need to know:
- AND (∧): conjunction
- OR (∨): disjunction
- NOT (¬): negation
- IF-THEN (→): conditional
- IF AND ONLY IF (↔): biconditional
Each connective has specific rules that determine the truth value of the compound proposition based on the truth values of its parts. Those rules are captured in truth tables.
Truth tables
A truth table lists every possible combination of truth values for the atomic propositions in an expression, then shows the resulting truth value of the whole expression.
For example, with two propositions p and q, there are four possible combinations (TT, TF, FT, FF). Each row of the table evaluates one combination. Here's the truth table for p ∧ q:
| p | q | p ∧ q |
| --- | --- | --- |
| T | T | T |
| T | F | F |
| F | T | F |
| F | F | F |
Truth tables are your go-to tool for verifying logical equivalences, checking argument validity, and understanding how connectives behave.
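Truth tables are mechanical enough to build by machine. Here's a minimal Python sketch (the `truth_table` helper is an illustrative name, not a standard library function) that enumerates every assignment and evaluates the expression on each row:

```python
from itertools import product

def truth_table(expr, variables):
    """Return one row per assignment of True/False to the variables,
    ending with the value of the expression on that assignment."""
    rows = []
    for values in product([True, False], repeat=len(variables)):
        env = dict(zip(variables, values))
        rows.append((*values, expr(**env)))
    return rows

# Truth table for p AND q, matching the table above.
for row in truth_table(lambda p, q: p and q, ["p", "q"]):
    print(row)
```

Each `(p, q, result)` tuple corresponds to one row of the table, in the same TT, TF, FT, FF order.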
Logical equivalence
Two propositions are logically equivalent if they produce the same truth value in every possible scenario. This is denoted by p ≡ q.
You can verify equivalence by building truth tables for both expressions and checking that their output columns match row by row. Alternatively, you can transform one expression into the other using logical laws (like De Morgan's laws, covered below). Logical equivalence is what lets you simplify complex expressions into simpler ones without changing their meaning.
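The table-comparison method can be sketched in a few lines of Python (the column names are mine); here the column for ¬p ∨ q is checked against the column for p → q, a classic equivalence:

```python
from itertools import product

# Build both output columns over the same four assignments,
# then check that they match row by row.
assignments = list(product([True, False], repeat=2))
col_not_p_or_q = [(not p) or q for p, q in assignments]   # ¬p ∨ q
col_p_implies_q = [q if p else True for p, q in assignments]  # p → q

print(col_not_p_or_q == col_p_implies_q)  # True: p → q ≡ ¬p ∨ q
```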
Logical operators
These operators are the core toolkit for building and manipulating logical expressions. Each one has a precise definition, and understanding exactly when each produces true or false is essential for everything that follows.
Negation and conjunction
Negation (¬) simply flips the truth value of a proposition.
- ¬p is true when p is false, and false when p is true.
- Think of it as "It is not the case that p."
Conjunction (∧) is the logical AND.
- p ∧ q is true only when both p and q are true.
- If either one is false, the whole conjunction is false.
- Example: "It is sunny AND warm" is only true if both conditions hold.
Disjunction and exclusive or
Disjunction (∨) is the logical OR (inclusive).
- p ∨ q is true when at least one of p or q is true.
- It's also true when both are true. This trips people up because everyday English "or" often implies "one or the other but not both." In logic, inclusive OR includes the "both" case.
Exclusive OR (⊕) is true when exactly one operand is true.
- p ⊕ q is true when p and q have different truth values.
- Example: "You can have soup or salad" (at a restaurant where you pick one) is exclusive or.
Conditional and biconditional
Conditional (→) represents "if p, then q."
- p → q is false in only one case: when p is true and q is false.
- This is the trickiest connective. When p is false, the conditional is always true regardless of q. This is called being "vacuously true." For instance, "If pigs fly, then the moon is made of cheese" is technically true in propositional logic because the hypothesis is false.
Biconditional (↔) means "if and only if."
- p ↔ q is true when p and q have the same truth value (both true or both false).
- It expresses that each proposition is necessary and sufficient for the other.
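A short Python sketch (the `implies` and `iff` helper names are made up for illustration) makes the vacuous-truth rule concrete:

```python
def implies(p, q):
    """p -> q: false only when p is true and q is false."""
    return (not p) or q

def iff(p, q):
    """p <-> q: true when p and q have the same truth value."""
    return p == q

# Vacuous truth: a false hypothesis makes the conditional true.
print(implies(False, False))  # True  ("if pigs fly, then ...")
print(implies(True, False))   # False (the only falsifying case)
print(iff(False, False))      # True  (both false counts as agreement)
```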
Logical laws and properties
These laws let you rewrite logical expressions into equivalent forms, which is critical for simplifying proofs and working with complex statements. They play the same role that algebraic rules (like distributing or factoring) play in arithmetic.
Commutative and associative laws
Commutative law: The order of operands doesn't matter for AND and OR: p ∧ q ≡ q ∧ p and p ∨ q ≡ q ∨ p.
Associative law: You can regroup operands without changing the result: (p ∧ q) ∧ r ≡ p ∧ (q ∧ r) and (p ∨ q) ∨ r ≡ p ∨ (q ∨ r).
These may seem obvious, but they're important to state explicitly because not all operations are commutative or associative. The conditional (→), for instance, is not commutative: p → q is not the same as q → p.

Distributive and De Morgan's laws
Distributive law: AND distributes over OR, and OR distributes over AND: p ∧ (q ∨ r) ≡ (p ∧ q) ∨ (p ∧ r) and p ∨ (q ∧ r) ≡ (p ∨ q) ∧ (p ∨ r).
De Morgan's laws: These tell you how negation interacts with AND and OR, and they're used constantly: ¬(p ∧ q) ≡ ¬p ∨ ¬q and ¬(p ∨ q) ≡ ¬p ∧ ¬q.
The pattern: when you push a negation through a conjunction or disjunction, the connective flips (AND becomes OR, OR becomes AND) and each component gets negated.
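Because there are only four assignments, De Morgan's laws can be confirmed exhaustively. A quick Python check (the `agree` helper name is mine):

```python
from itertools import product

def agree(f, g):
    """True if f and g give the same value on every 2-variable input."""
    return all(f(p, q) == g(p, q)
               for p, q in product([True, False], repeat=2))

# De Morgan: negation flips the connective and negates each part.
print(agree(lambda p, q: not (p and q),
            lambda p, q: (not p) or (not q)))   # True
print(agree(lambda p, q: not (p or q),
            lambda p, q: (not p) and (not q)))  # True
```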
Law of excluded middle
This law states that for any proposition p, either p is true or ¬p is true. There's no third option.
Formally: p ∨ ¬p is always true (a tautology).
This principle underpins proof by contradiction. If assuming ¬p leads to a contradiction, then p must be true because one of the two has to hold. Worth noting: some branches of mathematics (intuitionistic logic) reject this law, but in standard logic and most math courses, it's accepted.
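Since p has only two possible values, the tautology can be checked directly:

```python
# p ∨ ¬p is true under both assignments of p, so it is a tautology.
tautology = all((p or not p) for p in [True, False])
print(tautology)  # True
```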
Argument forms and validity
An argument in logic is a sequence of propositions (premises) followed by a conclusion. An argument is valid if the conclusion must be true whenever all the premises are true. Recognizing standard valid argument forms helps you both construct and evaluate proofs.
Modus ponens and modus tollens
Modus ponens (affirming the antecedent):
- You know p → q (if p, then q).
- You know p is true.
- Therefore, q must be true.
Formally: ((p → q) ∧ p) → q.
This is the most common form of reasoning in direct proofs.
Modus tollens (denying the consequent):
- You know p → q.
- You know q is false (¬q).
- Therefore, p must be false (¬p).
Formally: ((p → q) ∧ ¬q) → ¬p.
This is the logical engine behind proof by contradiction.
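Both forms can be machine-checked: an argument form is valid exactly when no assignment makes every premise true and the conclusion false. A Python sketch (the `valid` helper is my own naming), which also shows that the fallacy of affirming the consequent fails the test:

```python
from itertools import product

def valid(premises, conclusion, n):
    """Valid iff no assignment of the n variables makes every premise
    true while the conclusion is false."""
    return not any(all(prem(*v) for prem in premises) and not conclusion(*v)
                   for v in product([True, False], repeat=n))

implies = lambda p, q: (not p) or q

# Modus ponens: p → q, p ⊢ q
print(valid([lambda p, q: implies(p, q), lambda p, q: p],
            lambda p, q: q, 2))                  # True

# Modus tollens: p → q, ¬q ⊢ ¬p
print(valid([lambda p, q: implies(p, q), lambda p, q: not q],
            lambda p, q: not p, 2))              # True

# Affirming the consequent (p → q, q ⊢ p) is NOT valid.
print(valid([lambda p, q: implies(p, q), lambda p, q: q],
            lambda p, q: p, 2))                  # False
```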
Hypothetical syllogism
This lets you chain conditionals together:
- You know p → q.
- You know q → r.
- Therefore, p → r.
Formally: ((p → q) ∧ (q → r)) → (p → r).
If being a square implies being a rectangle, and being a rectangle implies having four sides, then being a square implies having four sides.
Disjunctive syllogism
This works by elimination:
- You know p ∨ q (at least one is true).
- You know ¬p (p is false).
- Therefore, q must be true.
Formally: ((p ∨ q) ∧ ¬p) → q.
This form shows up in proofs by cases and whenever you're narrowing down possibilities.
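Both syllogisms pass the same kind of exhaustive validity check; a Python sketch (the `valid` and `imp` helper names are illustrative):

```python
from itertools import product

def valid(premises, conclusion, n):
    """Valid iff no assignment makes all premises true and the conclusion false."""
    return not any(all(prem(*v) for prem in premises) and not conclusion(*v)
                   for v in product([True, False], repeat=n))

imp = lambda a, b: (not a) or b

# Hypothetical syllogism: p → q, q → r ⊢ p → r  (three variables)
print(valid([lambda p, q, r: imp(p, q), lambda p, q, r: imp(q, r)],
            lambda p, q, r: imp(p, r), 3))       # True

# Disjunctive syllogism: p ∨ q, ¬p ⊢ q
print(valid([lambda p, q: p or q, lambda p, q: not p],
            lambda p, q: q, 2))                  # True
```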
Proof techniques
These are the methods you'll actually use to prove mathematical statements. Each technique relies on the logical principles covered above, and knowing which technique to reach for is a skill that develops with practice.
Direct proof
A direct proof starts from the hypothesis and works forward to the conclusion using definitions, known results, and valid argument forms.
- Assume the hypothesis (the "if" part) is true.
- Apply definitions, theorems, and logical steps.
- Arrive at the conclusion (the "then" part).
Example structure: To prove "if n is even, then n² is even," you'd start by assuming n is even (so n = 2k for some integer k), then compute n² = (2k)² = 4k² = 2(2k²), which is even by definition.
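A proof covers all integers at once, but a quick numeric spot-check of the algebra can catch mistakes before you write it up:

```python
# Not a proof, just a sanity check of the algebra:
# n = 2k  ⇒  n² = (2k)² = 4k² = 2·(2k²), so n² is even.
for k in range(-5, 6):
    n = 2 * k
    assert n * n == 2 * (2 * k * k)
    assert (n * n) % 2 == 0
print("square of every sampled even n is even")
```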
Proof by contradiction
Here you assume the opposite of what you want to prove, then show that assumption leads to something impossible.
- Assume ¬p (the negation of your statement).
- Use logical reasoning to derive a contradiction (something of the form q ∧ ¬q).
- Conclude that ¬p must be false, so p is true.
This technique relies on the law of excluded middle. It's especially useful when direct proof seems difficult or when the statement is about nonexistence.
Proof by cases
When a statement covers multiple scenarios, you can prove each one separately.
- Identify all possible cases (they must be exhaustive and mutually exclusive).
- Prove the statement holds in each case.
- Since the cases cover all possibilities, the statement holds in general.
Example: To prove something about all integers, you might split into "case 1: n is even" and "case 2: n is odd," then prove the result for each.
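The parity split can be sanity-checked numerically. Here the sample claim "n² + n is even" (my own example) is verified over a small range, case by case:

```python
# Case check for the claim "n² + n is even":
# case 1: n = 2k    ⇒  n² + n = 2(2k² + k)
# case 2: n = 2k+1  ⇒  n² + n = 2(2k² + 3k + 1)
for n in range(-10, 11):
    if n % 2 == 0:               # case 1: n even
        k = n // 2
        assert n * n + n == 2 * (2 * k * k + k)
    else:                        # case 2: n odd
        k = (n - 1) // 2
        assert n * n + n == 2 * (2 * k * k + 3 * k + 1)
print("the claim holds in both cases")
```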

Applications of propositional logic
Propositional logic isn't just abstract theory. The same structures that underpin mathematical proofs also drive the design of digital hardware, the logic of programming languages, and the analysis of databases.
Boolean algebra
Boolean algebra is an algebraic system with two values (typically 0 and 1) and operations that correspond directly to logical connectives. AND maps to multiplication, OR maps to addition (with 1 + 1 = 1), and NOT maps to complementation.
It's used to simplify logical expressions, design efficient circuits, write database queries, and work with sets. If you understand propositional logic, you already understand the core of Boolean algebra.
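A minimal Python sketch of the correspondence (the operator names are mine):

```python
# Boolean algebra over {0, 1}: AND as multiplication, OR as
# addition capped at 1, NOT as complement (1 - x).
AND = lambda a, b: a * b
OR  = lambda a, b: min(a + b, 1)   # 1 + 1 = 1 in Boolean algebra
NOT = lambda a: 1 - a

print(AND(1, 1), OR(1, 1), NOT(0))  # 1 1 1
```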
Circuit design
Digital circuits are physical implementations of Boolean functions. Logic gates in hardware correspond to connectives:
- AND gate (∧)
- OR gate (∨)
- NOT gate (¬)
By simplifying a logical expression (using laws like De Morgan's or distribution), engineers can reduce the number of gates needed, making circuits cheaper and faster.
Computer programming
Conditional statements (if, else, while) in programming are built on propositional logic. Boolean expressions control the flow of a program.
- `&&` in many languages corresponds to ∧
- `||` corresponds to ∨
- `!` corresponds to ¬
Understanding propositional logic also matters for formal program verification, where you prove that code behaves correctly under all possible inputs.
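As a tiny illustration (the variable names are made up), a compound condition in code is just a compound proposition; Python spells `&&` as `and` and `!` as `not`:

```python
# "raining AND NOT have_umbrella" decides which branch runs.
raining = True
have_umbrella = False
if raining and not have_umbrella:
    message = "get wet"
else:
    message = "stay dry"
print(message)  # get wet
```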
Limitations of propositional logic
Propositional logic is powerful but not expressive enough for everything mathematicians need. Recognizing its boundaries helps you understand why more advanced systems exist.
Expressiveness vs predicate logic
Propositional logic treats entire statements as atomic units. It can't talk about objects or express ideas like "for all" or "there exists."
For instance, "All prime numbers greater than 2 are odd" requires predicate logic, which adds variables, predicates (properties of objects), and quantifiers (∀ for "for all," ∃ for "there exists"). Propositional logic simply can't represent this kind of statement. Most of the math you'll encounter needs predicate logic, which builds directly on the propositional foundation covered here.
Paradoxes in propositional logic
Some statements resist consistent truth value assignment. The classic example is the Liar's Paradox: "This sentence is false." If it's true, then it's false; if it's false, then it's true.
The issue is self-reference. Propositional logic assumes every proposition has a definite truth value, but self-referential statements can violate that assumption. These paradoxes motivated the development of more careful logical frameworks and are a reminder that even formal systems have boundaries.
Advanced topics
These topics extend propositional logic toward computation and automation. They're especially relevant if you'll be working with algorithms, formal verification, or computer science applications.
Normal forms
A normal form is a standardized way of writing a logical formula, which makes systematic analysis possible.
- Conjunctive Normal Form (CNF): a conjunction (AND) of clauses, where each clause is a disjunction (OR) of literals. Example: (p ∨ ¬q) ∧ (¬p ∨ r)
- Disjunctive Normal Form (DNF): a disjunction (OR) of clauses, where each clause is a conjunction (AND) of literals. Example: (p ∧ ¬q) ∨ (¬p ∧ r)
Any propositional formula can be converted into CNF or DNF. CNF is particularly important for satisfiability (SAT) problems.
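One common machine representation, sketched here in Python (the clause encoding is my own choice), stores a CNF formula as a list of clauses, each clause a list of literals:

```python
from itertools import product

def eval_cnf(clauses, values):
    """Evaluate a CNF formula. Each clause is a list of
    (variable_index, polarity) literals; polarity False means negated.
    A clause is satisfied if any literal matches; the formula is
    satisfied if every clause is."""
    return all(any(values[i] == pol for i, pol in clause)
               for clause in clauses)

# (p ∨ ¬q) ∧ (¬p ∨ q) — true exactly when p and q agree
cnf = [[(0, True), (1, False)], [(0, False), (1, True)]]
for p, q in product([True, False], repeat=2):
    print(p, q, eval_cnf(cnf, [p, q]))
```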
Resolution principle
Resolution is an inference rule used in automated theorem proving. It works on formulas in CNF.
The idea: if you have two clauses where one contains a literal (say p) and the other contains its negation (¬p), you can combine them into a new clause that drops p and ¬p but keeps everything else. For example, from p ∨ q and ¬p ∨ r, resolution produces q ∨ r.
By applying resolution repeatedly, you can derive new consequences or show that a set of clauses is unsatisfiable (leads to a contradiction).
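A single resolution step is easy to sketch in Python (the `resolve` helper and the literal encoding are my own):

```python
def resolve(c1, c2, var):
    """Resolve clauses c1 and c2 on var: drop var from c1 and ¬var
    from c2, then union what remains. Clauses are frozensets of
    (name, polarity) literals, e.g. ('p', True) for p, ('p', False) for ¬p."""
    pos, neg = (var, True), (var, False)
    assert pos in c1 and neg in c2
    return (c1 - {pos}) | (c2 - {neg})

c1 = frozenset({('p', True), ('q', True)})    # p ∨ q
c2 = frozenset({('p', False), ('r', True)})   # ¬p ∨ r
print(sorted(resolve(c1, c2, 'p')))  # [('q', True), ('r', True)], i.e. q ∨ r
```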
Automated theorem proving
Automated theorem provers are programs that use techniques like resolution and normal forms to prove (or disprove) logical statements without human intervention. They're used in:
- Verifying that hardware designs are correct
- Checking software for bugs
- Solving SAT problems (which have applications across optimization and AI)
The main challenge is managing the enormous search space that arises as formulas grow in complexity.