A method of writing real numbers, used in computing, in which a number is written as a × 10^n, where 0.1 ≤ a < 1 and n is an integer. The number a is called the significand (or mantissa), and n is the exponent, which gives the order of magnitude. Thus, 634.8 and 0.00234 are written as 0.6348 × 10^3 and 0.234 × 10^−2. (There is also a base-2 version, similar to the base-10 version just described.)
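The normalization above can be sketched in a few lines of Python; this is an illustrative decomposition into significand and exponent (the function name `normalize` is chosen here for illustration), not how floating-point hardware actually stores numbers:

```python
import math

def normalize(x: float) -> tuple[float, int]:
    """Split x into (a, n) such that x == a * 10**n and 0.1 <= abs(a) < 1."""
    if x == 0:
        return 0.0, 0
    # floor(log10(|x|)) + 1 gives the order of magnitude n
    n = math.floor(math.log10(abs(x))) + 1
    a = x / 10**n
    return a, n

# normalize(634.8)   gives a significand near 0.6348 and exponent 3
# normalize(0.00234) gives a significand near 0.234 and exponent -2
```

Because 634.8 has no exact binary representation, the returned significand may differ from 0.6348 in the last few bits, which is why the comments say "near" rather than an exact value.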
This is in contrast to fixed-point notation, in which every number is written with a fixed total number of digits, a fixed number of which come after the decimal point. For example, with 8 digits of which four follow the decimal point, the two numbers above would be written (approximately) as 0634.8000 and 0000.0023. Compare scientific notation.
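The fixed-point rendering in the example can be reproduced with Python's string formatting; this is a sketch assuming the zero-padded, four-decimal layout used above (the helper name `fixed_point` is hypothetical):

```python
def fixed_point(x: float, total_digits: int = 8, frac_digits: int = 4) -> str:
    """Render x with a fixed digit budget: frac_digits after the
    decimal point, the remainder zero-padded before it."""
    # Field width is total_digits + 1 to account for the decimal point itself
    return f"{x:0{total_digits + 1}.{frac_digits}f}"

fixed_point(634.8)    # '0634.8000'
fixed_point(0.00234)  # '0000.0023'
```

Note how 0.00234 loses its final digit: fixed-point notation sacrifices precision for small numbers, which is exactly what floating-point notation avoids.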