Alternating Series
An alternating series has terms that flip between positive and negative. What makes these series interesting is that the cancellation between positive and negative terms can cause a series to converge even when the absolute values of its terms form a divergent series. That distinction between absolute and conditional convergence is one of the central ideas in this section.
Alternating Series Test (Leibniz Test)
The Alternating Series Test gives you a straightforward way to confirm convergence for series whose terms alternate in sign. An alternating series takes one of these forms, where $b_n > 0$ for all $n$:

$$\sum_{n=1}^{\infty} (-1)^{n} b_n \qquad \text{or} \qquad \sum_{n=1}^{\infty} (-1)^{n+1} b_n$$
To apply the test, check three conditions:
- The terms alternate in sign. The series contains the factor $(-1)^n$ or $(-1)^{n+1}$.
- The sequence $b_n$ is eventually decreasing. That is, $b_{n+1} \le b_n$ for all $n$ beyond some point. The magnitudes of the terms shrink (or at least don't grow).
- $\lim_{n \to \infty} b_n = 0$. The terms approach zero.
If all three conditions hold, the series converges.
Note that this test only tells you the series converges. It doesn't tell you what it converges to. Sometimes you can find the sum (for example, $\sum_{n=1}^{\infty} \frac{(-1)^{n+1}}{n} = \ln 2$), but often there's no neat closed form.
When is this test useful? It's your go-to when the ratio and root tests are inconclusive. For instance, $\sum_{n=1}^{\infty} \frac{(-1)^{n+1}}{n}$ can't be handled by the ratio test (the limit of the ratios is 1, which is inconclusive), but the Alternating Series Test confirms convergence quickly: $b_n = \frac{1}{n}$ is decreasing and approaches 0.
Common mistake: If $\lim_{n \to \infty} b_n \neq 0$, the series diverges by the Divergence Test. But if $\lim_{n \to \infty} b_n = 0$ and the sequence $b_n$ is not eventually decreasing, the Alternating Series Test simply doesn't apply. That doesn't automatically mean the series diverges; you'd need another method.
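As a sketch, the three conditions can be sanity-checked numerically for a candidate $b_n$. A finite check like this is supporting evidence, not a proof, and `check_ast_conditions` is an illustrative name, not a library routine:

```python
# Sanity-check the three Alternating Series Test conditions for a
# candidate b_n over a finite range. Evidence only, not a proof.

def check_ast_conditions(b, n_max=10_000, tol=1e-3):
    """Check that b(n) > 0, b is non-increasing over n = 1..n_max,
    and b(n_max) is small (suggesting b_n -> 0)."""
    values = [b(n) for n in range(1, n_max + 1)]
    positive = all(v > 0 for v in values)
    decreasing = all(values[i + 1] <= values[i] for i in range(len(values) - 1))
    tends_to_zero = values[-1] < tol
    return positive and decreasing and tends_to_zero

print(check_ast_conditions(lambda n: 1 / n))      # True: the test applies
print(check_ast_conditions(lambda n: 1 + 1 / n))  # False: terms don't -> 0
```

The second call fails only the third condition: $1 + \frac{1}{n}$ is positive and decreasing, but its limit is 1, not 0.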

Sum Estimation and Error Bounds
One of the best features of convergent alternating series is that you get a built-in error bound when approximating the sum with a partial sum.
The partial sum $S_N$ adds up the first $N$ terms:

$$S_N = \sum_{n=1}^{N} (-1)^{n+1} b_n$$

The Alternating Series Estimation Theorem says the error from stopping at $S_N$ is bounded by the absolute value of the very next term:

$$|S - S_N| \le b_{N+1}$$
This is remarkably useful. Here's how to apply it in practice:
- Decide how much error you can tolerate (say, error < 0.01).
- Find the smallest $N$ such that $b_{N+1} < 0.01$.
- Compute $S_N$. Your approximation is guaranteed to be within 0.01 of the true sum.
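The recipe above can be sketched in Python for the alternating harmonic series $\sum_{n=1}^{\infty} \frac{(-1)^{n+1}}{n}$, whose true sum is $\ln 2$; the names `b`, `tol`, `N`, and `S_N` are illustrative choices:

```python
# Three-step recipe for approximating the alternating harmonic series
# (true sum ln 2) to within a chosen tolerance.
import math

def b(n):
    return 1 / n

tol = 0.01

# Step 2: find the smallest N with b(N+1) < tol.
N = 1
while b(N + 1) >= tol:
    N += 1

# Step 3: compute the partial sum S_N.
S_N = sum((-1) ** (n + 1) * b(n) for n in range(1, N + 1))

print(N)                             # 100, since b(101) = 1/101 < 0.01
print(abs(math.log(2) - S_N) < tol)  # True: guaranteed by the theorem
```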
Example: For $\sum_{n=1}^{\infty} \frac{(-1)^{n+1}}{n}$, the partial sum $S_{10}$ has error at most $b_{11} = \frac{1}{11} \approx 0.09$. If you need better accuracy, take more terms: $S_{100}$ has error at most $b_{101} = \frac{1}{101} < 0.01$.
There's another useful geometric fact: the true sum always lies between any two consecutive partial sums $S_N$ and $S_{N+1}$. The partial sums oscillate around the true value, alternately overshooting and undershooting it.
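This bracketing is easy to verify numerically, again using the alternating harmonic series (a sketch; the series choice and function name are mine):

```python
# Check that consecutive partial sums of the alternating harmonic
# series bracket the true sum ln 2.
import math

def partial_sum(N):
    return sum((-1) ** (n + 1) / n for n in range(1, N + 1))

S = math.log(2)
for N in range(1, 6):
    lo, hi = sorted((partial_sum(N), partial_sum(N + 1)))
    print(lo <= S <= hi)  # True each time
```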

Absolute vs. Conditional Convergence
This distinction matters a lot, and it comes up frequently on exams.
Absolute convergence: A series $\sum a_n$ converges absolutely if $\sum |a_n|$ also converges. You're stripping away the alternating signs and asking whether the series still converges.
- Example: $\sum_{n=1}^{\infty} \frac{(-1)^n}{n^2}$ converges absolutely because $\sum_{n=1}^{\infty} \frac{1}{n^2}$ is a convergent p-series ($p = 2 > 1$).
Conditional convergence: A series $\sum a_n$ converges conditionally if it converges, but $\sum |a_n|$ diverges. The alternating signs are doing all the work.
- Example: $\sum_{n=1}^{\infty} \frac{(-1)^{n+1}}{n}$ converges by the Alternating Series Test, but $\sum_{n=1}^{\infty} \frac{1}{n}$ is the harmonic series, which diverges. So this series converges only conditionally.
Testing an alternating series for convergence type
- Apply the Alternating Series Test. If the series doesn't converge, you're done.
- If it converges, take absolute values and test $\sum |a_n|$ using any appropriate test (p-series, comparison, ratio, etc.).
- If $\sum |a_n|$ converges, the original series is absolutely convergent.
- If $\sum |a_n|$ diverges, the original series is conditionally convergent.
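The distinction can be illustrated numerically (an illustration only; finite sums can't prove convergence or divergence) by comparing partial sums of the alternating harmonic series with partial sums of its absolute values:

```python
# For sum (-1)^{n+1}/n: the signed partial sums settle near ln 2,
# while the absolute-value partial sums (the harmonic series) keep
# growing like ln N. Illustration only, not a proof.
import math

def signed_sum(N):
    return sum((-1) ** (n + 1) / n for n in range(1, N + 1))

def absolute_sum(N):
    return sum(1 / n for n in range(1, N + 1))

for N in (10, 100, 1000, 10000):
    print(N, round(signed_sum(N), 4), round(absolute_sum(N), 4))
# signed sums approach ln 2 ~ 0.6931; absolute sums grow without bound
```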
Why does this matter? Absolute convergence is the stronger condition. An absolutely convergent series can have its terms rearranged in any order without changing the sum. A conditionally convergent series is fragile: the Riemann Rearrangement Theorem says you can rearrange its terms to converge to any real number, or even to diverge. That's a striking difference.
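The greedy procedure behind the Riemann Rearrangement Theorem is easy to sketch: add positive terms until the running sum passes the target, then negative terms until it drops back below, and repeat. Here is a minimal Python version for the alternating harmonic series; the function name and targets are illustrative:

```python
# Greedy rearrangement of the conditionally convergent series
# sum (-1)^{n+1}/n: steer the running sum toward any chosen target.

def rearranged_partial_sum(target, num_terms):
    """Greedily interleave positive terms 1/1, 1/3, 1/5, ... and
    negative terms -1/2, -1/4, ... so the running sum chases target."""
    pos = 1    # next positive term is 1/pos (odd denominators)
    neg = 2    # next negative term is -1/neg (even denominators)
    total = 0.0
    for _ in range(num_terms):
        if total <= target:
            total += 1 / pos
            pos += 2
        else:
            total -= 1 / neg
            neg += 2
    return total

# The same terms, rearranged, approach whatever target you pick:
print(rearranged_partial_sum(1.5, 100_000))    # close to 1.5
print(rearranged_partial_sum(-0.25, 100_000))  # close to -0.25
```

This works precisely because the convergence is conditional: the positive terms alone and the negative terms alone each diverge, so the running sum can always reach any target, while the terms shrinking to zero forces the oscillation around the target to die out.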