In mathematics, an information source is a sequence of random variables ranging over a finite alphabet Γ, having a stationary distribution.
The uncertainty, or entropy rate, of an information source is defined as

$$H(\mathbf{X}) = \lim_{n\to\infty} H(X_n \mid X_0, X_1, \ldots, X_{n-1}),$$
where

$$X_0, X_1, \ldots, X_n$$
is the sequence of random variables defining the information source, and

$$H(X_n \mid X_0, X_1, \ldots, X_{n-1})$$
is the conditional information entropy of the sequence of random variables. Equivalently, one has

$$H(\mathbf{X}) = \lim_{n\to\infty} \frac{H(X_0, X_1, \ldots, X_{n-1}, X_n)}{n+1}.$$
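As a concrete illustration (a minimal sketch, not part of the original definition), the Python example below computes the entropy rate of a stationary two-state Markov source. For a first-order stationary Markov chain the conditional entropy $H(X_n \mid X_0, \ldots, X_{n-1})$ reduces to $H(X_n \mid X_{n-1})$, so the limit equals the stationary average of the per-state transition entropies; the transition matrix and function names are illustrative assumptions.

```python
import numpy as np

# Illustrative two-state Markov source over the alphabet {0, 1}.
# P[i, j] = Pr(X_{n+1} = j | X_n = i); the numerical values are assumptions for this example.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

def stationary_distribution(P):
    """Return the stationary distribution pi satisfying pi P = pi, normalised to sum to 1."""
    eigvals, eigvecs = np.linalg.eig(P.T)
    v = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
    return v / v.sum()

def entropy_rate(P):
    """Entropy rate H(X) = sum_i pi_i * H(X_{n+1} | X_n = i), in bits per symbol.

    Assumes all transition probabilities are strictly positive.
    """
    pi = stationary_distribution(P)
    # H(X_{n+1} | X_n = i): entropy of each row of the transition matrix.
    row_entropies = -np.sum(P * np.log2(P), axis=1)
    return float(pi @ row_entropies)

print(f"entropy rate: {entropy_rate(P):.4f} bits/symbol")  # about 0.57 for the matrix above
```

For an i.i.d. source the same definition collapses to $H(X_0)$, since conditioning on the past does not change the distribution of the next symbol.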