A measure of the speed of a computer, usually applied to computers used for scientific work rather than to general-purpose computers, that counts floating-point operations rather than instruction executions. (‘Flops’ is an abbreviation of floating-point operations per second.) See also mega-; floating-point representation.
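
The definition amounts to a simple rate: flops = (number of floating-point operations) / (elapsed seconds). A minimal sketch of that arithmetic follows; it is illustrative only (the function name and operation count are hypothetical, and a pure-Python loop understates real hardware speeds by orders of magnitude), not part of the original entry.

```python
import time

def estimate_flops(n_ops: int = 10_000_000) -> float:
    """Estimate flops by timing a known number of floating-point operations."""
    x = 1.0000001
    acc = 0.0
    start = time.perf_counter()
    for _ in range(n_ops):
        acc = acc * x + 1.0  # one multiply + one add = 2 floating-point operations
    elapsed = time.perf_counter() - start
    # flops = floating-point operations performed / seconds elapsed
    return (2 * n_ops) / elapsed

if __name__ == "__main__":
    print(f"Estimated speed: {estimate_flops():.3e} flops")
```

Rates are usually quoted with SI-style prefixes (megaflops, gigaflops, and so on), which is why the entry cross-references mega-.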