A measure of the amount of information that is output by a source, or throughput by a channel, or received by an observer (per symbol or per second). Following Shannon (1948) and later writers, the entropy of a discrete memoryless source with alphabet A = {a_i} of size n, and output X at time t, is

$$H(X) = -\sum_{i=1}^{n} p_i \log_b p_i$$

where

$$p_i = \Pr(X = a_i)$$

The logarithmic base b is chosen to give a convenient scale factor. Usually, b = 2, b = e, or b = 10. Entropy is then measured in bits, in natural units or nats, or in Hartleys, respectively. When the source has memory, account has to be taken of the dependence between successive symbols output by the source.
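As a minimal illustrative sketch (the function name and example probabilities below are chosen only for illustration), the entropy of a memoryless source can be computed directly from its symbol probabilities:

```python
import math

def entropy(probs, base=2):
    """Shannon entropy H(X) = -sum(p_i * log_b(p_i)) of a discrete
    memoryless source with symbol probabilities `probs`."""
    # Terms with p_i = 0 contribute nothing (lim p log p = 0), so skip them.
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin (two equiprobable symbols) yields 1 bit per symbol.
print(entropy([0.5, 0.5]))           # 1.0 bit
print(entropy([0.5, 0.5], math.e))   # ~0.693 nats
print(entropy([0.9, 0.1]))           # ~0.469 bits: a biased source carries less information
```

The choice of base only rescales the result, matching the bits/nats/Hartleys distinction above.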
The term arises by analogy with entropy in thermodynamics, where the defining expression has the same form but with a physical scale factor k (Boltzmann constant) and with the sign changed. The word negentropy is therefore sometimes used for the measure of information, as is uncertainty or simply ‘information’.