Generally, information is whatever is capable of causing a human mind to change its opinion about the current state of the real world. Formally, and especially in science and engineering, information is whatever contributes to a reduction in the uncertainty of the state of a system; in this case the uncertainty is usually expressed in an objectively measurable form, commonly by means of Shannon’s entropy. However, the entropy formula involves probabilities, and these may well have to be subjective. If that is so, the formal measurement must be qualified as depending on subjective probabilities, and ‘uncertainty’ must be replaced by ‘opinion, or personal estimate, of uncertainty’.
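As an illustration, for a discrete random variable X that takes one of n possible values with probabilities p_1, …, p_n, Shannon’s entropy (measured in bits) is

H(X) = -\sum_{i=1}^{n} p_i \log_2 p_i

For example, a fair coin toss has two equally likely outcomes and hence an entropy of 1 bit, so learning the outcome removes exactly one bit of uncertainty.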
Information must be distinguished from any medium that is capable of carrying it. A physical medium (such as a magnetic disk) may carry a logical medium (data, such as binary or text symbols). The information content of physical objects or of logical data cannot be measured or discussed until it is known what range of possibilities existed before and after they were received: the information lies in the reduction in uncertainty that results from their receipt, not in the size or complexity of the objects or data themselves. Questions of the form, function, and semantic import of data are relevant to information only inasmuch as they contribute to the reduction of uncertainty. If an identical memorandum is received twice, it does not convey twice the information that its first occurrence conveyed: the second occurrence conveys no information at all, unless, by prior agreement, the number of occurrences is itself to be regarded as significant.
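As a minimal sketch of this point (the distribution and the helper function below are purely illustrative, not drawn from any particular library), the information conveyed can be computed as the uncertainty before receipt minus the uncertainty remaining afterwards; a second, identical receipt leaves the remaining uncertainty unchanged and therefore conveys zero bits:

```python
import math

def entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Prior: eight equally likely possibilities -> 3 bits of uncertainty.
prior = [1 / 8] * 8
# After the memorandum is read, only one possibility remains -> 0 bits.
posterior = [1.0]

info_first = entropy(prior) - entropy(posterior)       # 3.0 bits conveyed
# Receiving the identical memorandum again: uncertainty is already 0,
# so the reduction (the information conveyed) is 0 bits.
info_second = entropy(posterior) - entropy(posterior)  # 0.0 bits conveyed

print(info_first, info_second)
```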
Information has ramifications in security, politics, culture, and the economy, as well as in science and engineering. The extent to which information is used as an economic commodity is one of the defining characteristics of the ‘post-industrial’ society, hence the phrase ‘the information society’.