S = \sum_{i=1}^{n} W_{ii} r_i^2, \qquad W_{ii} = \frac{1}{\sigma_i^2}

The gradient equations for this sum of squares are

-2 \sum_{i} W_{ii} \frac{\partial f(x_i, \boldsymbol{\beta})}{\partial \beta_j} r_i = 0, \qquad j = 1, \ldots, m

where m is the number of parameters in \boldsymbol{\beta}.
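For a model that is linear in the parameters, setting these gradient equations to zero reduces to solving the weighted normal equations. The following is a minimal numerical sketch, assuming a hypothetical straight-line model f(x, β) = β₀ + β₁x and made-up data; the design matrix X, the data arrays, and the per-point standard deviations sigma are all illustrative, not from the original text.

```python
import numpy as np

# Hypothetical data: observations y_i at points x_i, each with
# standard deviation sigma_i (assumed values for illustration).
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])
sigma = np.array([0.2, 0.2, 0.5, 0.2, 0.3])

# Weight matrix with W_ii = 1 / sigma_i^2 on the diagonal.
W = np.diag(1.0 / sigma**2)

# For the linear model f(x, beta) = beta_0 + beta_1 * x, the columns
# of the design matrix are the partial derivatives of f with respect
# to beta_0 and beta_1, i.e. [1, x].
X = np.column_stack([np.ones_like(x), x])

# Setting the weighted gradient equations to zero gives the normal
# equations (X^T W X) beta = X^T W y.
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

# Weighted sum of squared residuals S = sum_i W_ii * r_i^2.
r = y - X @ beta
S = r @ W @ r
print(beta, S)
```

At the solution, the weighted gradient X^T W r is zero to machine precision, which is exactly the condition stated by the gradient equations above.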

The ordinary least squares method for determining the best fit minimises