dot-RSS: Residual Sum of Squares (from \boldsymbol{\hat{\varepsilon}})


Description

Calculates the residual sum of squares \left( \mathrm{RSS} \right) using

\mathrm{RSS} = \sum_{i = 1}^{n} \left( Y_i - \hat{Y}_i \right)^2 \\ = \sum_{i = 1}^{n} \left( Y_i - \left[ \hat{\beta}_{1} + \hat{\beta}_{2} X_{2i} + \hat{\beta}_{3} X_{3i} + \cdots + \hat{\beta}_{k} X_{ki} \right] \right)^2 \\ = \sum_{i = 1}^{n} \left( Y_i - \hat{\beta}_{1} - \hat{\beta}_{2} X_{2i} - \hat{\beta}_{3} X_{3i} - \cdots - \hat{\beta}_{k} X_{ki} \right)^2 .
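As a numeric illustration of the scalar formula above, here is a sketch in Python with NumPy (the package itself is written in R; the data and variable names below are invented for the example):

```python
import numpy as np

# Toy data: n = 5 observations, k = 3 regressors
# (the first column is the constant regressor equal to 1).
X = np.array([
    [1.0, 2.0, 1.0],
    [1.0, 4.0, 3.0],
    [1.0, 6.0, 2.0],
    [1.0, 8.0, 5.0],
    [1.0, 10.0, 4.0],
])
y = np.array([3.0, 7.0, 8.0, 14.0, 13.0])

# OLS estimates: betahat = (X'X)^{-1} X'y
betahat = np.linalg.solve(X.T @ X, X.T @ y)

# Scalar form: RSS = sum_{i=1}^{n} (Y_i - Yhat_i)^2
yhat = X @ betahat
rss = float(np.sum((y - yhat) ** 2))
```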

In matrix form

\mathrm{RSS} = \sum_{i = 1}^{n} \left( \mathbf{y} - \mathbf{\hat{y}} \right)^{2} \\ = \sum_{i = 1}^{n} \left( \mathbf{y} - \mathbf{X} \boldsymbol{\hat{\beta}} \right)^{2} \\ = \left( \mathbf{y} - \mathbf{X} \boldsymbol{\hat{\beta}} \right)^{\prime} \left( \mathbf{y} - \mathbf{X} \boldsymbol{\hat{\beta}} \right) .

Or simply

\mathrm{RSS} = \sum_{i = 1}^{n} \hat{\varepsilon}_{i}^{2} = \boldsymbol{\hat{\varepsilon}}^{\prime} \boldsymbol{\hat{\varepsilon}}

where \boldsymbol{\hat{\varepsilon}} is the n \times 1 vector of residuals, that is, the difference between the observed and predicted values of \mathbf{y} \left( \boldsymbol{\hat{\varepsilon}} = \mathbf{y} - \mathbf{\hat{y}} \right). An equivalent computational matrix formula is

\mathrm{RSS} = \mathbf{y}^{\prime} \mathbf{y} - 2 \boldsymbol{\hat{\beta}}^{\prime} \mathbf{X}^{\prime} \mathbf{y} + \boldsymbol{\hat{\beta}}^{\prime} \mathbf{X}^{\prime} \mathbf{X} \boldsymbol{\hat{\beta}} .
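The equivalence of the residual form \boldsymbol{\hat{\varepsilon}}^{\prime} \boldsymbol{\hat{\varepsilon}} and the computational formula can be checked numerically. A Python/NumPy sketch with invented data:

```python
import numpy as np

# Invented data: n = 20 observations, k = 3 regressors with a constant column.
rng = np.random.default_rng(0)
n = 20
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(scale=0.4, size=n)

# OLS estimates
betahat = np.linalg.solve(X.T @ X, X.T @ y)

# Residual form: RSS = epsilonhat' epsilonhat
epsilonhat = y - X @ betahat
rss_residual = float(epsilonhat @ epsilonhat)

# Computational form: RSS = y'y - 2 betahat' X'y + betahat' X'X betahat
rss_computational = float(y @ y - 2.0 * betahat @ (X.T @ y)
                          + betahat @ (X.T @ X) @ betahat)
```

Both expressions give the same value up to floating-point rounding; the computational form avoids materializing the residual vector.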

Note that the total sum of squares \left( \mathrm{TSS} \right) decomposes into the explained sum of squares \left( \mathrm{ESS} \right) and \mathrm{RSS}:

\mathrm{TSS} = \mathrm{ESS} + \mathrm{RSS}.
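This decomposition can also be verified numerically; it holds when the model includes the constant regressor. A Python/NumPy sketch with invented data:

```python
import numpy as np

# Invented data: simple regression with an intercept.
rng = np.random.default_rng(42)
n = 30
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([2.0, 0.5]) + rng.normal(scale=0.3, size=n)

betahat = np.linalg.solve(X.T @ X, X.T @ y)
yhat = X @ betahat
ybar = y.mean()

tss = float(np.sum((y - ybar) ** 2))     # total sum of squares
ess = float(np.sum((yhat - ybar) ** 2))  # explained sum of squares
rss = float(np.sum((y - yhat) ** 2))     # residual sum of squares
```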

Usage

.RSS(epsilonhat = NULL, X, y, betahat = NULL)

Arguments

epsilonhat

Numeric vector of length n or n by 1 numeric matrix. The n \times 1 vector of residuals \boldsymbol{\hat{\varepsilon}}.

X

n by k numeric matrix. The data matrix \mathbf{X} (also known as the design matrix, model matrix, or regressor matrix) is an n \times k matrix of n observations of k regressors, whose first column contains a regressor equal to 1 for every observation.

y

Numeric vector of length n or n by 1 matrix. The vector \mathbf{y} is an n \times 1 vector of observations on the regressand variable.

betahat

Numeric vector of length k or k by 1 matrix. The vector \boldsymbol{\hat{β}} is a k \times 1 vector of estimates of k unknown regression coefficients.

Details

If epsilonhat = NULL, \mathrm{RSS} is computed with X and y as required arguments and betahat as an optional argument.
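The dispatch described above can be sketched as follows. This is a hypothetical Python analogue of the R function, not the package's actual source; in particular, estimating betahat by ordinary least squares when it is not supplied is an assumption based on the wording of this section:

```python
import numpy as np

def rss(epsilonhat=None, X=None, y=None, betahat=None):
    """Residual sum of squares, mirroring the documented argument logic."""
    if epsilonhat is not None:
        # RSS = epsilonhat' epsilonhat
        e = np.asarray(epsilonhat, dtype=float).ravel()
        return float(e @ e)
    # epsilonhat is NULL: X and y are required
    if X is None or y is None:
        raise ValueError("X and y are required when epsilonhat is not supplied")
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float).ravel()
    if betahat is None:
        # betahat is optional: estimate it by OLS if absent (assumption)
        betahat = np.linalg.solve(X.T @ X, X.T @ y)
    e = y - X @ np.asarray(betahat, dtype=float).ravel()
    return float(e @ e)
```

For example, `rss(epsilonhat=[1.0, 2.0])` returns 5.0, and `rss(X=X, y=y)` falls through to the OLS residuals.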

Value

Returns the residual sum of squares \left( \mathrm{RSS} \right).

Author(s)

Ivan Jacob Agaloos Pesigan

References

Wikipedia: Residual Sum of Squares

Wikipedia: Explained Sum of Squares

Wikipedia: Total Sum of Squares

Wikipedia: Coefficient of Determination

See Also

Other sum of squares functions: .ESS(), ESS(), RSS(), TSS()


jeksterslabds/jeksterslabRlinreg documentation built on Jan. 7, 2021, 8:30 a.m.