Description
Estimates coefficients of a linear regression model using the singular value decomposition. The data matrix \mathbf{X} is decomposed into
\mathbf{X} = \mathbf{U} \boldsymbol{\Sigma} \mathbf{V}^{T} .
Estimates are found by solving
\boldsymbol{\hat{\beta}} = \mathbf{V} \boldsymbol{\Sigma}^{+} \mathbf{U}^{T} \mathbf{y}
where the superscript + indicates the pseudoinverse.
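The same computation can be sketched in a few lines of base R. The helper name ols_svd and the tolerance used to zero out near-zero singular values are illustrative assumptions, not the package's actual implementation.

# Illustrative sketch only, not the package's implementation.
ols_svd <- function(X, y, tol = 1e-8) {
  s <- svd(X)                                         # X = U diag(d) V'
  d_plus <- ifelse(s$d > tol * max(s$d), 1 / s$d, 0)  # pseudoinverse of Sigma
  s$v %*% (d_plus * crossprod(s$u, y))                # V Sigma^+ U' y
}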
Usage
.betahatsvd(X, y)
Arguments
X: The data matrix, that is, an n \times k numeric matrix of n observations on the k regressors.
y: Numeric vector of length n, that is, an n \times 1 vector of observations on the regressand variable.
Value
Returns \boldsymbol{\hat{\beta}}, that is, a k \times 1 vector of ordinary least squares estimates of the k unknown regression coefficients.
Author(s)
Ivan Jacob Agaloos Pesigan
References
Wikipedia: Ordinary least squares
Wikipedia: Singular value decomposition
Wikipedia: Orthogonal decomposition methods
See Also
Other beta-hat functions: .betahatnorm(), .betahatqr(), .intercepthat(), .slopeshatprime(), .slopeshat(), betahat(), intercepthat(), slopeshatprime(), slopeshat()
Examples
# Simple regression------------------------------------------------
X <- jeksterslabRdatarepo::wages.matrix[["X"]]
X <- X[, c(1, ncol(X))]
y <- jeksterslabRdatarepo::wages.matrix[["y"]]
.betahatsvd(X = X, y = y)
# Multiple regression----------------------------------------------
X <- jeksterslabRdatarepo::wages.matrix[["X"]]
# age is removed
X <- X[, -ncol(X)]
.betahatsvd(X = X, y = y)
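As an added sanity check (not part of the original examples), the SVD-based estimates should agree, up to numerical precision, with base R's least squares fit when X has full column rank:

# Added check (assumption): compare against stats::lm.fit on the same X and y.
all.equal(
  as.vector(.betahatsvd(X = X, y = y)),
  as.vector(stats::lm.fit(x = X, y = y)$coefficients)
)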