ls.project: Least squares projection


View source: R/ls.project.R

Description

Least squares projection (H) of samples (A) onto a feature model (W)

Usage

ls.project(
  samples,
  W,
  n.threads = 0,
  k = ncol(W),
  mask.zeros = FALSE,
  H.nonneg = TRUE,
  H.L1 = 0,
  H.L2 = 0,
  H.angular = 0,
  inner.rel.tol = 1e-08,
  inner.max.iter = 100
)

Arguments

samples

dgCMatrix of samples (columns) by features (rows) to be projected onto "W"

W

factor model of features (rows) by factors (columns) of class "matrix"

n.threads

number of threads/CPUs to use; if 0, all available threads as decided by OpenMP

k

rank of projection. By default, k = ncol(W).

mask.zeros

treat zeros as missing values

H.nonneg

constrain the mapping (H) to non-negative values

H.L1

lasso regularization

H.L2

ridge regularization

H.angular

angular regularization

inner.rel.tol

stopping criterion for the sequential coordinate descent least squares solver, measured as the relative tolerance between two successive iterations. The default value should satisfy most use cases.

inner.max.iter

maximum number of iterations permitted for the sequential coordinate descent least squares solver if inner.rel.tol is not met. The default value should satisfy most use cases.
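
The arguments above can be combined in a single call. A hedged sketch follows, assuming the moca7k dataset and a factor model fit as in the Examples section; the parameter values shown are illustrative, not recommendations:

```r
# Project all samples onto a previously learned feature model W,
# with non-negative coefficients and a small lasso penalty.
H <- ls.project(
  samples = moca7k,       # dgCMatrix: features (rows) x samples (columns)
  W = model$W,            # feature model from a prior factorization (e.g. lsmf)
  n.threads = 0,          # 0 = all available threads as decided by OpenMP
  mask.zeros = FALSE,     # set TRUE to treat zeros as missing values
  H.nonneg = TRUE,        # constrain H to non-negative values
  H.L1 = 0.01,            # lasso regularization for sparser coefficients
  inner.rel.tol = 1e-8,
  inner.max.iter = 100
)
dim(H)  # factor coefficients (rows) x samples (columns)
```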

Value

a sample embeddings matrix of samples (columns) by factor coefficients (rows)

Examples

## Not run: 
data(moca7k)
# calculate a model for 1000 cells and then project all 7500 onto that model
model <- lsmf(moca7k[,1:1000], k = 20)
H.all <- ls.project(moca7k, model$W)

# compare projection to the original weights for the first 1000 cells
plot(H.all[,1:1000], model$H)
# just about perfect!

## End(Not run)

zdebruine/LSMF documentation built on Jan. 1, 2021, 1:50 p.m.