Definition of the Infinitesimal Generator Matrix
The infinitesimal generator matrix, denoted $Q$, encodes all the information about how a continuous-time Markov chain (CTMC) evolves. Where a discrete-time chain uses a transition probability matrix to describe jumps at fixed time steps, the generator matrix captures instantaneous rates of transition between states.
Each entry $q_{ij}$ (with $i \neq j$) represents the rate at which the process transitions from state $i$ to state $j$ over an infinitesimal time interval. More precisely, for small $\Delta t$:

$$P(X(t + \Delta t) = j \mid X(t) = i) = q_{ij}\,\Delta t + o(\Delta t)$$
The diagonal entries are defined so that each row sums to zero, which means

$$q_{ii} = -\sum_{j \neq i} q_{ij}$$

This makes $-q_{ii}$ the total rate at which the process leaves state $i$.
Properties of the Infinitesimal Generator Matrix
Non-negative off-diagonal elements
Off-diagonal entries $q_{ij}$ (for $i \neq j$) are always non-negative. They represent transition rates, and a negative rate has no physical meaning. If $q_{ij} = 0$, direct transitions from $i$ to $j$ simply don't occur.
Negative diagonal elements
Each diagonal entry $q_{ii}$ is non-positive (negative or zero). Its absolute value $|q_{ii}|$ gives the total departure rate from state $i$. The holding time in state $i$ is exponentially distributed with parameter $|q_{ii}|$, so the expected time spent in state $i$ before jumping is $1/|q_{ii}|$.
A diagonal entry of zero means the state is absorbing: once the process enters, it never leaves.
Row sums equal to zero
Every row of $Q$ sums to zero:

$$\sum_j q_{ij} = 0 \quad \text{for every } i$$
This is the continuous-time analogue of rows summing to one in a discrete-time transition matrix. It enforces conservation of probability: probability flowing out of state $i$ (captured by $q_{ii}$) exactly balances the total probability flowing into other states (captured by the off-diagonal entries $q_{ij}$).
Relationship to the Transition Rate Matrix
Some texts define a separate transition rate matrix $R$ whose entries $r_{ij}$ (for $i \neq j$) are the same transition rates, but whose diagonal entries are zero (or defined differently). The generator $Q$ and $R$ share the same off-diagonal entries:

$$q_{ij} = r_{ij} \quad \text{for } i \neq j$$
The diagonal of $Q$ is then constructed from $R$:

$$q_{ii} = -\sum_{j \neq i} r_{ij}$$
So $Q$ is fully determined by the off-diagonal rates. If you're given $R$, building $Q$ just means filling in the diagonal to make each row sum to zero.
Role in Continuous-Time Markov Chains
Evolution of state probabilities
Let $\pi(t)$ be the row vector of state probabilities at time $t$. The generator matrix governs how $\pi(t)$ changes over time through the relationship:

$$\frac{d\pi(t)}{dt} = \pi(t)\,Q$$
This single matrix equation encodes the entire dynamics of the CTMC. It also leads to the matrix exponential solution $\pi(t) = \pi(0)\,e^{Qt}$, where $P(t) = e^{Qt}$ is the transition probability matrix at time $t$.
Kolmogorov forward equations
The forward equations (also called master equations) describe how transition probabilities evolve forward in time:

$$\frac{dP(t)}{dt} = P(t)\,Q, \qquad P(0) = I$$
Given an initial distribution $\pi(0)$, you solve this system to find $\pi(t) = \pi(0)\,P(t)$. These equations are the workhorse for computing state probabilities at any future time.
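As a quick sketch of this computation (assuming NumPy and SciPy are available; the two-state generator and rates below are hypothetical), the forward solution $\pi(t) = \pi(0)\,e^{Qt}$ can be evaluated with a matrix exponential:

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical 2-state generator: leave state 0 at rate 2, state 1 at rate 1
Q = np.array([[-2.0,  2.0],
              [ 1.0, -1.0]])

pi0 = np.array([1.0, 0.0])   # start in state 0

# Forward solution: pi(t) = pi(0) @ expm(Q t)
t = 0.5
pi_t = pi0 @ expm(Q * t)

assert np.isclose(pi_t.sum(), 1.0)   # probability is conserved
```

`expm` computes the full transition matrix $P(t)$, so the same call also answers "what is the probability of being in state $j$ at time $t$, starting from state $i$" for every pair at once.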
Kolmogorov backward equations
The backward equations take a different perspective, conditioning on the starting state:

$$\frac{dP(t)}{dt} = Q\,P(t), \qquad P(0) = I$$
Notice $Q$ multiplies from the left here, not the right. The backward equations are particularly useful for computing hitting times and absorption probabilities, where you want to ask: "Starting from state $i$, what's the probability of reaching state $j$ by time $t$?"
Computation of the Infinitesimal Generator Matrix
From the transition rate matrix
- Write down all off-diagonal rates $q_{ij} = r_{ij}$ directly from the rate matrix $R$.
- For each row $i$, sum the off-diagonal entries: $\sum_{j \neq i} q_{ij}$.
- Set the diagonal entry: $q_{ii} = -\sum_{j \neq i} q_{ij}$.
- Verify every row sums to zero.
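The steps above can be sketched in code (assuming NumPy; the rate matrix values below are hypothetical):

```python
import numpy as np

def generator_from_rates(R):
    """Build the generator Q from a rate matrix R of off-diagonal rates."""
    Q = np.asarray(R, dtype=float).copy()
    np.fill_diagonal(Q, 0.0)             # ignore any diagonal entries in R
    np.fill_diagonal(Q, -Q.sum(axis=1))  # q_ii = -(sum of row's off-diagonal rates)
    return Q

# Hypothetical 3-state rate matrix
R = [[0, 2, 1],
     [0, 0, 3],
     [4, 0, 0]]
Q = generator_from_rates(R)

assert np.allclose(Q.sum(axis=1), 0.0)   # every row sums to zero
```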
From a state transition diagram
- Label each state as a node. Each directed arrow from state $i$ to state $j$ carries a rate $q_{ij}$.
- Read off the arrow labels to fill in the off-diagonal entries of $Q$.
- For each state $i$, sum all outgoing arrow rates and negate to get $q_{ii}$.
Example: Suppose you have three states with four arrows: $1 \to 2$ at rate $a$, $2 \to 3$ at rate $b$, $3 \to 1$ at rate $c$, and $1 \to 3$ at rate $d$. The generator matrix is:

$$Q = \begin{pmatrix} -(a+d) & a & d \\ 0 & -b & b \\ c & 0 & -c \end{pmatrix}$$
Check: each row sums to zero.
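As a numerical check, here is a small three-state generator built from four arrows with hypothetical rate values (assuming NumPy):

```python
import numpy as np

# Hypothetical rates for four arrows: 1->2, 2->3, 3->1, 1->3
a, b, c, d = 1.0, 2.0, 3.0, 0.5

Q = np.array([[-(a + d), a,   d  ],
              [ 0.0,    -b,   b  ],
              [ c,       0.0, -c ]])

assert np.allclose(Q.sum(axis=1), 0.0)   # each row sums to zero
```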

Eigenvalues and Eigenvectors
Interpretation of eigenvalues
The eigenvalues of reveal how quickly the chain approaches equilibrium.
- One eigenvalue is always $0$. This corresponds to the stationary distribution.
- All other eigenvalues have non-positive real parts (for an irreducible, finite-state chain, they have strictly negative real parts).
- The magnitude of the real part of the nonzero eigenvalue closest to zero is called the spectral gap. It controls the rate of convergence to stationarity: a larger spectral gap means faster convergence.
Since $P(t) = e^{Qt}$, each eigenvalue $\lambda$ of $Q$ contributes a term proportional to $e^{\lambda t}$. The zero eigenvalue gives the steady-state component, while the eigenvalues with negative real part produce transient terms that decay exponentially.
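A minimal sketch of extracting the spectral gap numerically (assuming NumPy; the two-state generator here is hypothetical):

```python
import numpy as np

# Hypothetical 2-state generator
Q = np.array([[-2.0,  2.0],
              [ 1.0, -1.0]])

eigvals = sorted(np.linalg.eigvals(Q), key=abs)

assert np.isclose(eigvals[0], 0.0)   # one eigenvalue is always zero

# Spectral gap: magnitude of the real part of the slowest nonzero mode
spectral_gap = -max(ev.real for ev in eigvals if not np.isclose(ev, 0.0))
print(spectral_gap)   # prints 3.0 for this generator
```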
Stationary distribution and eigenvectors
The stationary distribution is the row vector $\pi$ satisfying:

$$\pi Q = 0, \qquad \sum_i \pi_i = 1$$
This makes $\pi$ the left eigenvector of $Q$ corresponding to eigenvalue zero. Meanwhile, the right eigenvector for eigenvalue zero is $\mathbf{1}$ (a column vector of ones): $Q\mathbf{1} = 0$ is just a restatement of the row-sum-equals-zero property.
For an irreducible, positive-recurrent CTMC, this stationary distribution exists, is unique, and represents the long-run fraction of time spent in each state.
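One common way to compute $\pi$ numerically is to take the null space of $Q^\top$ and normalize (a sketch assuming SciPy; the generator below is hypothetical):

```python
import numpy as np
from scipy.linalg import null_space

# Hypothetical 2-state generator
Q = np.array([[-2.0,  2.0],
              [ 1.0, -1.0]])

# pi Q = 0 is equivalent to Q^T pi^T = 0, so pi spans the null space of Q^T
v = null_space(Q.T)[:, 0]
pi = v / v.sum()          # normalize to a probability vector

assert np.allclose(pi @ Q, 0.0)   # pi is stationary
```

Dividing by `v.sum()` both normalizes the vector and fixes the arbitrary sign returned by the null-space routine.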
Applications of the Infinitesimal Generator Matrix
Queueing systems
In an M/M/1 queue with arrival rate $\lambda$ and service rate $\mu$, the states represent the number of customers in the system. The generator matrix has $\lambda$ on the superdiagonal (arrivals) and $\mu$ on the subdiagonal (departures). Solving $\pi Q = 0$ yields performance measures like average queue length and utilization $\rho = \lambda/\mu$.
Birth-death processes
Birth-death processes are CTMCs where transitions only occur between neighboring states ($i \to i+1$ or $i \to i-1$). The generator matrix has a tridiagonal structure: birth rates $\lambda_i$ on the superdiagonal, death rates $\mu_i$ on the subdiagonal, and $-(\lambda_i + \mu_i)$ on the diagonal. This structure makes the stationary distribution particularly tractable, with a product-form solution.
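A sketch comparing the product-form solution $\pi_n \propto \prod_{k=1}^{n} \lambda_{k-1}/\mu_k$ against a direct numerical computation (assuming NumPy/SciPy; the birth and death rates below are hypothetical):

```python
import numpy as np
from scipy.linalg import null_space

# Hypothetical birth-death chain on states 0..4
birth = np.array([3.0, 3.0, 3.0, 3.0])   # lambda_0 .. lambda_3
death = np.array([5.0, 5.0, 5.0, 5.0])   # mu_1 .. mu_4

n = len(birth) + 1
Q = np.zeros((n, n))
for i in range(n - 1):
    Q[i, i + 1] = birth[i]   # births on the superdiagonal
    Q[i + 1, i] = death[i]   # deaths on the subdiagonal
np.fill_diagonal(Q, -Q.sum(axis=1))

# Product-form stationary distribution
w = np.cumprod(np.concatenate(([1.0], birth / death)))
pi_product = w / w.sum()

# Numerical stationary distribution from the null space of Q^T
v = null_space(Q.T)[:, 0]
pi_numeric = v / v.sum()

assert np.allclose(pi_numeric, pi_product)
```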
Reliability analysis
System components can be modeled as occupying operational or failed states, with failure rates and repair rates as the transition rates. The generator matrix captures all possible failure/repair transitions. From it, you can compute the mean time to failure (MTTF), system availability (long-run fraction of time the system is operational), and the probability of being in any particular degraded state.
Numerical Example
Consider a two-state CTMC (states 0 and 1) with transition rate $\alpha$ from 0 to 1 and rate $\beta$ from 1 to 0.
Step 1: Construct $Q$:

$$Q = \begin{pmatrix} -\alpha & \alpha \\ \beta & -\beta \end{pmatrix}$$

Step 2: Find the stationary distribution by solving $\pi Q = 0$ with $\pi_0 + \pi_1 = 1$:

$$\pi = \left( \frac{\beta}{\alpha + \beta},\; \frac{\alpha}{\alpha + \beta} \right)$$

Step 3: The eigenvalues are $0$ and $-(\alpha + \beta)$. The transient behavior decays at rate $\alpha + \beta$, so the chain reaches stationarity faster when both rates are large.
Step 4: The transition probability matrix is:

$$P(t) = \frac{1}{\alpha + \beta} \begin{pmatrix} \beta + \alpha e^{-(\alpha+\beta)t} & \alpha\left(1 - e^{-(\alpha+\beta)t}\right) \\ \beta\left(1 - e^{-(\alpha+\beta)t}\right) & \alpha + \beta e^{-(\alpha+\beta)t} \end{pmatrix}$$

As $t \to \infty$, the exponential terms vanish and each row converges to $\left( \frac{\beta}{\alpha+\beta},\; \frac{\alpha}{\alpha+\beta} \right)$.
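The closed-form $P(t)$ for a two-state chain can be checked against a numerical matrix exponential (a sketch assuming SciPy; the rate values are hypothetical):

```python
import numpy as np
from scipy.linalg import expm

alpha, beta = 2.0, 3.0   # hypothetical rates: 0 -> 1 at alpha, 1 -> 0 at beta
Q = np.array([[-alpha, alpha],
              [ beta, -beta]])

t = 0.7
s = alpha + beta
decay = np.exp(-s * t)

# Closed-form P(t) for the two-state chain
P_closed = np.array([[beta + alpha * decay, alpha * (1 - decay)],
                     [beta * (1 - decay),   alpha + beta * decay]]) / s

assert np.allclose(P_closed, expm(Q * t))   # matches the matrix exponential
```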
Comparison to Discrete-Time Markov Chains
| Feature | DTMC | CTMC |
|---|---|---|
| Key matrix | Transition probability matrix $P$ | Generator matrix $Q$ |
| Entry meaning | $p_{ij}$ = probability of jumping to $j$ | $q_{ij}$ = rate of transitioning to $j$ |
| Row sums | Sum to 1 | Sum to 0 |
| Entry constraints | All entries in $[0, 1]$ | Off-diagonal $\geq 0$, diagonal $\leq 0$ |
| Stationary condition | $\pi P = \pi$ | $\pi Q = 0$ |
| $n$-step/time-$t$ evolution | $P^n$ (matrix power) | $e^{Qt}$ (matrix exponential) |

The connection between them: if you uniformize a CTMC (sample it at a rate $\Lambda$ at least as large as $\max_i |q_{ii}|$), you obtain a DTMC with transition matrix $P = I + Q/\Lambda$; conversely, the generator is recovered from the transition probabilities as $Q = \lim_{\Delta t \to 0} \frac{P(\Delta t) - I}{\Delta t}$.
Extensions and Generalizations
Time-inhomogeneous Markov chains
When transition rates change over time, the generator becomes a function $Q(t)$ with time-dependent entries $q_{ij}(t)$. The Kolmogorov equations still hold but with $Q(t)$ replacing $Q$:

$$\frac{\partial P(s, t)}{\partial t} = P(s, t)\,Q(t), \qquad P(s, s) = I$$
The solution is no longer a simple matrix exponential; instead it involves a time-ordered exponential (or product integral). This extension is useful for modeling systems with seasonal variation, aging components, or externally driven rate changes.
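One common numerical approach is to approximate the time-ordered exponential by a product of short-step matrix exponentials (a sketch assuming SciPy; the time-dependent generator below is hypothetical):

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical time-dependent generator: the 0 -> 1 rate grows linearly in t
def Q(t):
    return np.array([[-(1 + t), 1 + t],
                     [ 2.0,    -2.0 ]])

# Approximate P(0, T) by a time-ordered product of short-step exponentials
T, steps = 1.0, 1000
dt = T / steps
P = np.eye(2)
for k in range(steps):
    P = P @ expm(Q(k * dt) * dt)   # order matters: earlier factors on the left

assert np.allclose(P.sum(axis=1), 1.0)   # rows of P(0, T) still sum to 1
```

Each factor is a valid stochastic matrix over a short interval, so the product stays stochastic; refining `steps` improves the approximation.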
Absorbing states and absorption probabilities
An absorbing state $k$ has $q_{kj} = 0$ for all $j \neq k$ (and therefore $q_{kk} = 0$). To analyze absorption, partition the state space into transient states and absorbing states, and write $Q$ in block form:

$$Q = \begin{pmatrix} T & R \\ 0 & 0 \end{pmatrix}$$
Here $T$ is the submatrix of rates among transient states and $R$ holds the rates from transient into absorbing states. Two key quantities follow:
- Mean time to absorption from each transient state: $\mathbf{m} = (-T)^{-1}\mathbf{1}$
- Absorption probabilities (probability of being absorbed into each absorbing state): $B = (-T)^{-1}R$
These formulas are direct analogues of the discrete-time results using the fundamental matrix.
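These quantities reduce to two linear solves (a sketch assuming NumPy; the three-state chain with one absorbing state below is hypothetical):

```python
import numpy as np

# Hypothetical chain: transient states {0, 1}, absorbing state {2}
Q = np.array([[-3.0,  2.0, 1.0],
              [ 1.0, -2.0, 1.0],
              [ 0.0,  0.0, 0.0]])

T = Q[:2, :2]   # rates among transient states
R = Q[:2, 2:]   # rates from transient into the absorbing state

m = np.linalg.solve(-T, np.ones(2))   # mean time to absorption: (-T)^{-1} 1
B = np.linalg.solve(-T, R)            # absorption probabilities: (-T)^{-1} R

assert np.allclose(B.sum(axis=1), 1.0)   # absorption is certain from each transient state
```

Using `solve` rather than forming the inverse explicitly is the standard numerically stable choice.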