A method, originated by Legendre, for estimating the unknown parameters of a model by minimizing the sum of squared differences between the observed values of a random variable and the values predicted by the model. If every observation is given equal weight then this is ordinary least squares (OLS).
For example, with n pairs of observations (x1, y1), …, (xn, yn) and the linear regression model
$$Y_j = \alpha + \beta x_j + \varepsilon_j, \qquad \mathrm{E}(Y_j) = \alpha + \beta x_j, \qquad j = 1, \dots, n,$$
where α and β are unknown parameters, εj is an error of observation, and E(Yj) denotes the expected value of Yj, the ordinary least squares estimates are the values for α and β that minimize
$$\sum_{j=1}^{n} \left( y_j - \alpha - \beta x_j \right)^2.$$
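As a minimal sketch (not part of the original entry), the OLS estimates for the simple linear model above have the well-known closed form β̂ = Σ(xj − x̄)(yj − ȳ) / Σ(xj − x̄)² and α̂ = ȳ − β̂x̄, which can be computed directly; the function name `ols_fit` is illustrative.

```python
def ols_fit(xs, ys):
    """Closed-form ordinary least squares for y = alpha + beta*x + error."""
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    # Sums of centred cross-products and squares.
    sxy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    sxx = sum((x - xbar) ** 2 for x in xs)
    beta = sxy / sxx          # slope estimate
    alpha = ybar - beta * xbar  # intercept estimate
    return alpha, beta

alpha, beta = ols_fit([1, 2, 3, 4], [2, 4, 6, 8])
# These points lie exactly on y = 2x, so alpha = 0 and beta = 2.
```

Minimizing the sum of squared residuals directly (e.g. by calculus) yields exactly these formulas, so no iterative optimization is needed in the simple linear case.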
See also generalized least squares; weighted least squares.