dot-Py: y-hat \left( \mathbf{\hat{y}} = \mathbf{P} \mathbf{y} \right)


Description

Calculates y-hat \left( \mathbf{\hat{y}} \right), that is, the predicted value of \mathbf{y} given \mathbf{X} using

\mathbf{\hat{y}} = \mathbf{P} \mathbf{y}

where

\mathbf{P} = \mathbf{X} \left( \mathbf{X}^{T} \mathbf{X} \right)^{-1} \mathbf{X}^{T} .
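The following base-R sketch (simulated data; not the package's internal code) illustrates these two formulas directly. The names X, y, P, and yhat mirror the notation above; the simulated values are arbitrary.

# Minimal base-R sketch of the formulas above, with simulated data.
set.seed(42)
n <- 10
X <- cbind(1, rnorm(n))                  # n by k design matrix with a constant regressor
y <- X %*% c(2, 0.5) + rnorm(n)          # n by 1 vector of observations on the regressand
P <- X %*% solve(crossprod(X)) %*% t(X)  # P = X (X^T X)^{-1} X^T
yhat <- P %*% y                          # y-hat = P y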

Usage

.Py(y, P = NULL, X = NULL)

Arguments

y

Numeric vector of length n or n by 1 matrix. The vector \mathbf{y} is an n \times 1 vector of observations on the regressand variable.

P

n by n numeric matrix. The n \times n projection matrix \left( \mathbf{P} \right).

X

n by k numeric matrix. The data matrix \mathbf{X} (also known as design matrix, model matrix or regressor matrix) is an n \times k matrix of n observations of k regressors, which includes a regressor whose value is 1 for each observation on the first column.

Details

If P = NULL, the projection matrix \mathbf{P} is computed using P() with X as its argument. If P is provided, X is not needed.
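The sketch below shows both call paths, assuming the jeksterslabRlinreg package is installed; the ::: operator is used in case .Py() is not exported from the package namespace.

# Usage sketch; assumes jeksterslabRlinreg is installed.
# ::: is used in case .Py() is not exported from the package namespace.
set.seed(42)
n <- 10
X <- cbind(1, rnorm(n))                  # design matrix with a constant regressor
y <- X %*% c(2, 0.5) + rnorm(n)          # simulated regressand
P <- X %*% solve(crossprod(X)) %*% t(X)  # projection matrix
yhat_from_X <- jeksterslabRlinreg:::.Py(y = y, X = X)  # P computed internally via P()
yhat_from_P <- jeksterslabRlinreg:::.Py(y = y, P = P)  # X not needed when P is supplied
all.equal(yhat_from_X, yhat_from_P)                    # both call paths should agree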

Value

Returns y-hat \left( \mathbf{\hat{y}} \right).

Author(s)

Ivan Jacob Agaloos Pesigan

References

Wikipedia: Linear Regression

Wikipedia: Ordinary Least Squares

See Also

Other y-hat functions: .Xbetahat(), Py(), Xbetahat(), yhat()

