A quantity that characterizes the effect of lattice vibrations on the scattering intensity in X-ray diffraction by crystals. The existence of the Debye–Waller factor was predicted and calculated by Peter Debye in 1913–14 and by Ivar Waller (1898–1991) in 1923–27. Because the amplitude of lattice vibrations increases with temperature, it was thought in the early days of X-ray diffraction studies that the diffraction pattern would disappear entirely at high temperatures. The work of Debye and Waller showed that lattice vibrations at higher temperatures reduce the intensities of the diffracted beams but do not destroy the diffraction pattern altogether.
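The attenuation described above is conventionally written as a multiplicative factor on the diffracted intensity; the following is the standard textbook form (not stated in the entry itself), quoted here in its isotropic approximation:

```latex
I = I_0 \, e^{-2W}, \qquad
2W = \left\langle (\mathbf{q}\cdot\mathbf{u})^2 \right\rangle
\approx \tfrac{1}{3}\, q^2 \left\langle u^2 \right\rangle ,
```

where $\mathbf{q}$ is the scattering vector and $\langle u^2 \rangle$ is the mean-square atomic displacement, which grows with temperature. Since $e^{-2W}$ is positive for every finite $\langle u^2 \rangle$, the Bragg peaks are weakened by thermal motion but never eliminated, which is the content of Debye's and Waller's result.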