dot-slopes: Regression Slopes \boldsymbol{β}_{2, \cdots, k}


Description

Derives the slopes \boldsymbol{β}_{2, \cdots, k} of a linear regression model (\boldsymbol{β} minus the intercept) as a function of covariances.

Usage

.slopes(SigmaX = NULL, sigmayX = NULL, X, y)

Arguments

SigmaX

p by p numeric matrix. The p \times p matrix of variances and covariances between the regressor variables {X}_{2}, {X}_{3}, \cdots, {X}_{k} \left( \boldsymbol{Σ}_{\mathbf{X}} \right).

sigmayX

Numeric vector of length p or p by 1 matrix. The p \times 1 vector of covariances between the regressand variable y and the regressor variables X_2, X_3, \cdots, X_k \left( \boldsymbol{σ}_{\mathbf{y}, \mathbf{X}} = \left\{ σ_{y, X_2}, σ_{y, X_3}, \cdots, σ_{y, X_k} \right\}^{T} \right).

X

n by k numeric matrix. The data matrix \mathbf{X} (also known as the design matrix, model matrix, or regressor matrix) is an n \times k matrix of n observations of k regressors, whose first column is a regressor equal to 1 for every observation.

y

Numeric vector of length n or n by 1 matrix. The vector \mathbf{y} is an n \times 1 vector of observations on the regressand variable.

Details

The linear regression slopes are calculated using

\boldsymbol{β}_{2, \cdots, k} = \boldsymbol{Σ}_{\mathbf{X}}^{-1} \boldsymbol{σ}_{\mathbf{y}, \mathbf{X}}

where \boldsymbol{Σ}_{\mathbf{X}} is the p \times p covariance matrix of the regressors X_2, X_3, \cdots, X_k and \boldsymbol{σ}_{\mathbf{y}, \mathbf{X}} is the p \times 1 vector of covariances between the regressand y and the regressors.
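As a minimal sketch of this relationship (not the package's actual internals), the slopes can be recovered in base R by solving the covariance system directly; the variable names below are illustrative:

```r
# Simulate data from a known model: y = 1 + 2*X2 - 0.5*X3 + e
set.seed(42)
n <- 1000
X2 <- rnorm(n)
X3 <- rnorm(n)
y <- 1 + 2 * X2 - 0.5 * X3 + rnorm(n)

SigmaX <- cov(cbind(X2, X3))      # p x p covariance matrix of the regressors
sigmayX <- cov(cbind(X2, X3), y)  # p x 1 vector of covariances with y

# beta_{2, ..., k} = SigmaX^{-1} %*% sigmayX
slopes <- solve(SigmaX, sigmayX)
drop(slopes)  # close to the population slopes c(2, -0.5)
```

With `X` and `y` supplied instead of the covariances, the same quantities can be computed from the data first (e.g. `cov()` on the non-constant columns of `X`) and then solved as above.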

Value

Returns the slopes \boldsymbol{β}_{2, \cdots, k} of a linear regression model derived from the variance-covariance matrix.

Author(s)

Ivan Jacob Agaloos Pesigan

See Also

Other parameter functions: .intercept(), .slopesprime(), intercept(), sigma2epsilon(), slopesprime(), slopes()


jeksterslabds/jeksterslabRlinreg documentation built on Jan. 7, 2021, 8:30 a.m.