Ergodicity

In probability theory, an ergodic dynamical system is one that, broadly speaking, has the same behavior averaged over time as averaged over the space of all the system's states in its phase space. In physics the term implies that a system satisfies the ergodic hypothesis of thermodynamics.
A random process is ergodic if its time average is the same as its average over the probability space, known in the field of thermodynamics as its ensemble average. The state of an ergodic process after a long time is nearly independent of its initial state.^{[1]}
The term "ergodic" was derived from the Greek words έργον (ergon: "work") and οδός (odos: "path," "way"). It was chosen by Ludwig Boltzmann while he was working on a problem in statistical mechanics.^{[2]}
Formal definition
Let (X, Σ, μ) be a probability space, and let T : X → X be a measure-preserving transformation. We say that T is ergodic with respect to μ (or alternatively that μ is ergodic with respect to T) if one of the following equivalent statements is true:^{[3]}
 for every E ∈ Σ with T^{−1}(E) = E, either μ(E) = 0 or μ(E) = 1.
 for every E ∈ Σ with μ(T^{−1}(E) △ E) = 0 we have μ(E) = 0 or μ(E) = 1 (where △ denotes the symmetric difference).
 for every E ∈ Σ with positive measure we have μ(⋃_{n=1}^{∞} T^{−n}(E)) = 1.
 for every two sets E and H of positive measure, there exists an n > 0 such that μ(T^{−n}(E) ∩ H) > 0.
 Every measurable function f : X → R with f ∘ T = f almost surely is almost surely constant.
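As a worked illustration of the last criterion (a standard example added here, not part of the original text), the irrational rotation of the circle is ergodic:

```latex
Let $X = \mathbb{R}/\mathbb{Z}$ with Lebesgue measure $\mu$, and let
$T_\alpha(x) = x + \alpha \pmod 1$ for an irrational $\alpha$.
Suppose $f \in L^2(\mu)$ satisfies $f \circ T_\alpha = f$ almost surely,
and expand it in a Fourier series $f(x) = \sum_n c_n e^{2\pi i n x}$.
Invariance gives $c_n e^{2\pi i n \alpha} = c_n$ for every $n$; since
$\alpha$ is irrational, $e^{2\pi i n \alpha} \neq 1$ for $n \neq 0$,
so $c_n = 0$ for all $n \neq 0$ and $f$ is almost surely constant.
Hence $T_\alpha$ is ergodic.
```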
Measurable flows
These definitions have natural analogues for the case of measurable flows and, more generally, measure-preserving semigroup actions. Let {T^{t}} be a measurable flow on (X, Σ, μ). An element A of Σ is invariant mod 0 under {T^{t}} if
μ(T^{t}(A) △ A) = 0
for each t ∈ R. Measurable sets invariant mod 0 under a flow or a semigroup action form the invariant subalgebra of Σ, and the corresponding measure-preserving dynamical system is ergodic if the invariant subalgebra is the trivial σ-algebra consisting of the sets of measure 0 and their complements in X.
Unique ergodicity
A discrete dynamical system (X, T), where X is a topological space and T a continuous map, is said to be uniquely ergodic if there exists a unique T-invariant Borel probability measure on X. The invariant measure is then necessarily ergodic for T (otherwise it could be decomposed as a barycenter of two invariant probability measures with disjoint support).
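A standard pair of examples (added here for illustration, not from the original text) shows that ergodicity alone does not force uniqueness of the invariant measure:

```latex
For irrational $\alpha$, the rotation $T_\alpha(x) = x + \alpha \pmod 1$
on $X = \mathbb{R}/\mathbb{Z}$ is uniquely ergodic: Lebesgue measure is
the only $T_\alpha$-invariant Borel probability measure.
By contrast, the doubling map $S(x) = 2x \pmod 1$ is ergodic with respect
to Lebesgue measure but not uniquely ergodic: the point mass $\delta_0$
at the fixed point $0$ is another $S$-invariant probability measure.
```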
Markov chains
In a Markov chain with a finite state space, a state is said to be ergodic if it is aperiodic and positive recurrent (a state is recurrent if there is a nonzero probability of exiting the state and the probability of an eventual return to it is 1; if the former condition is not true the state is "absorbing"). If all states in a Markov chain are ergodic, the chain is said to be ergodic.
Markov's theorem: a Markov chain is ergodic if there is a positive probability to pass from any state to any other state in one step.
For a Markov chain, a simple test for ergodicity uses the eigenvalues of its transition matrix (which are always ≤ 1 in absolute value). The number 1 is always an eigenvalue. If all other eigenvalues are strictly less than 1 in absolute value, then the Markov chain is ergodic. This follows from the spectral decomposition of a non-symmetric matrix.
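The eigenvalue test above can be sketched numerically. This is a minimal illustration: the helper name `is_ergodic` and the two example matrices are assumptions made for the sketch, not part of the article.

```python
import numpy as np

def is_ergodic(P, tol=1e-10):
    """Eigenvalue test: the chain is ergodic if 1 is a simple dominant
    eigenvalue, i.e. every other eigenvalue of the transition matrix P
    has absolute value strictly less than 1."""
    mags = np.sort(np.abs(np.linalg.eigvals(P)))
    # The largest magnitude is always 1 for a stochastic matrix;
    # check the second largest against 1.
    return bool(mags[-2] < 1 - tol)

# Positive probability of moving between any two states in one step
# (so Markov's theorem applies): this chain is ergodic.
P_ergodic = np.array([[0.9, 0.1],
                      [0.5, 0.5]])   # eigenvalues 1 and 0.4

# A chain that alternates deterministically between two states is
# periodic: its eigenvalues are 1 and -1, so it is not ergodic.
P_periodic = np.array([[0.0, 1.0],
                       [1.0, 0.0]])

print(is_ergodic(P_ergodic))   # True
print(is_ergodic(P_periodic))  # False
```

The second matrix fails the test even though it is irreducible, because periodicity keeps the time average from settling to the stationary distribution.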
Examples in electronics
Ergodicity means the ensemble average equals the time average. Each resistor has an associated thermal noise that depends on its temperature. Take N resistors (N should be very large) and record the voltage across each of them for a long period; this yields one waveform per resistor, and the collection of N waveforms is known as an ensemble. Averaging a single waveform over its whole duration gives that resistor's time average. Averaging the voltages of all N waveforms at one fixed instant of time gives the ensemble average. If the ensemble average and the time average are the same, the process is ergodic.
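The thought experiment can be sketched numerically. As a simplifying assumption (not a claim about real thermal-noise statistics), each resistor's voltage is modeled as zero-mean Gaussian white noise:

```python
import numpy as np

rng = np.random.default_rng(seed=0)
N = 1000       # number of "resistors" (ensemble members)
T = 10_000     # number of time samples per waveform

# One row per resistor: N noise waveforms observed over T time steps.
ensemble = rng.normal(loc=0.0, scale=1.0, size=(N, T))

# Time average: average one waveform over its whole duration
# (one value per resistor).
time_averages = ensemble.mean(axis=1)

# Ensemble average: average across all resistors at one fixed instant.
ensemble_average = ensemble[:, T // 2].mean()

# For an ergodic process both estimates converge to the same mean
# (0 in this model) as N and T grow.
print(time_averages.mean(), ensemble_average)
```

Both printed values are close to zero; a process with, say, a per-resistor random offset that never averages out over time would make the two estimates disagree, and would not be ergodic.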
Ergodic decomposition
Conceptually, ergodicity of a dynamical system is a certain irreducibility property, akin to the notions of irreducibility in the theory of Markov chains, irreducible representation in algebra and prime number in arithmetic. A general measurepreserving transformation or flow on a Lebesgue space admits a canonical decomposition into its ergodic components, each of which is ergodic.
Notes
References
 Walters, Peter (1982). An Introduction to Ergodic Theory. Springer. ISBN 0387951520.
 Brin, Michael; Stuck, Garrett (2002). Introduction to Dynamical Systems. Cambridge University Press. ISBN 0521808413.
 Birkhoff, G. D. (1931). "Proof of the ergodic theorem". Proceedings of the National Academy of Sciences of the United States of America. 17 (12): 656–660. doi:10.1073/pnas.17.12.656.
 Alaoglu, L.; Birkhoff, G. (1940). "General ergodic theorems". The Annals of Mathematics. 41 (2): 293–309.
External links
 Steven Arthur Kalikow, "Outline of Ergodic Theory"
 Karma Dajani and Sjoerd Dirksin, "A Simple Introduction to Ergodic Theory"