<P> In statistics, ordinary least squares (OLS) or linear least squares is a method for estimating the unknown parameters in a linear regression model. OLS chooses the parameters of a linear function of a set of explanatory variables by minimizing the sum of the squares of the differences between the observed dependent variable (the values of the variable being predicted) in the given dataset and the values predicted by the linear function. Geometrically, this is the sum of the squared distances, parallel to the axis of the dependent variable, between each data point in the set and the corresponding point on the regression surface; the smaller the differences, the better the model fits the data. The resulting estimator can be expressed by a simple formula, especially in the case of a single regressor on the right-hand side. </P> <P> The OLS estimator is consistent when the regressors are exogenous, and optimal in the class of linear unbiased estimators when the errors are homoscedastic and serially uncorrelated. Under these conditions, OLS provides minimum-variance mean-unbiased estimation when the errors have finite variances. Under the additional assumption that the errors are normally distributed, OLS is the maximum likelihood estimator. </P>
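As a minimal sketch of the "simple formula" for the single-regressor case, the snippet below computes the OLS slope and intercept from the closed-form expressions, then cross-checks them against a general least-squares solve via NumPy. The data values here are made up purely for illustration.

```python
import numpy as np

# Hypothetical data: one explanatory variable x and one dependent variable y.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.0, 6.2, 7.9, 10.1])

# Closed-form OLS for a single regressor:
#   slope = sum((x - xbar)(y - ybar)) / sum((x - xbar)^2)
#   intercept = ybar - slope * xbar
slope = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
intercept = y.mean() - slope * x.mean()

# Cross-check with the general least-squares solution to X beta = y,
# where X has a column of ones for the intercept.
X = np.column_stack([np.ones_like(x), x])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

print(slope, intercept)        # coefficients minimizing the sum of squared residuals
print(coef)                    # [intercept, slope] from the matrix solve
```

Both routes minimize the same sum of squared residuals, so the coefficients agree; the closed form is just the general normal-equations solution specialized to one regressor.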

For what kinds of models can we use OLS?