svdlm: Least-squares regression via SVD

View source: R/helpers.R


Least-squares regression via SVD

Description

Least-squares regression via SVD

Usage

svdlm(x, y, rel.tol = 1e-09, abs.tol = 1e-100)

Arguments

x

Model matrix.

y

Response vector.

rel.tol

Relative zero tolerance for generalised inverse via SVD.

abs.tol

Absolute zero tolerance for generalised inverse via SVD.

Details

Newton steps for many empirical likelihoods are of least-squares type. Let x^+ denote the generalised inverse of x. If the SVD algorithm fails, it sometimes helps to call svd(t(x)) and translate the result back; before doing so, check that x contains no NaN, Inf, or -Inf values.

The tolerances determine which singular values are treated as zero: entries of the singular-value vector d that are less than max(rel.tol * max(d), abs.tol) are set to zero before inversion.
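For intuition, the generalised-inverse solve with this tolerance rule can be sketched in a few lines of base R. This is only an illustration of the idea described above, not the exact body of svdlm(); the argument names simply mirror the function's interface.

# Minimal sketch of an SVD-based least-squares solve (illustration only)
svdlm_sketch <- function(x, y, rel.tol = 1e-9, abs.tol = 1e-100) {
  s    <- svd(x)                                    # x = U D V'
  keep <- s$d >= max(rel.tol * max(s$d), abs.tol)   # tolerance rule from above
  d.inv <- ifelse(keep, 1 / s$d, 0)                 # invert only retained singular values
  # x^+ y = V D^+ U' y -- the least-squares (minimum-norm) solution
  drop(s$v %*% (d.inv * crossprod(s$u, y)))
}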

Value

A vector of coefficients.

Examples

b.svd <- svdlm(x = cbind(1, as.matrix(mtcars[, -1])), y = mtcars[, 1])
b.lm  <- coef(lm(mpg ~ ., data = mtcars))
b.lm - b.svd  # Negligible differences
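A further hypothetical illustration, assuming svdlm() returns the minimum-norm solution x^+ y: with an exactly collinear column, lm() reports an NA coefficient while the generalised-inverse solve stays well-defined.

set.seed(1)
x.col <- cbind(1, a = rnorm(20), b = rnorm(20))
x.col <- cbind(x.col, b2 = x.col[, "b"])   # exact collinearity
y.col <- rnorm(20)
svdlm(x = x.col, y = y.col)                # minimum-norm generalised-inverse solution
coef(lm(y.col ~ x.col[, -1]))              # lm() marks the aliased column as NA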
