Run_lasso: Post-processing


View source: R/Run_lasso.R

Description

Solve the lasso-like problem for post-processing

Usage

Run_lasso(data, par, var, g, lambda = c(8, 4), max = 1000)

Arguments

data

The CNV dataset prepared for lasso (see data_lasso)

par

The parameter prepared for lasso (see par_lasso)

var

The initialized variables for lasso (see var_lasso)

g

Integer-valued ploidy

lambda

The L1 penalties for recurrent CNVs

max

The maximum number of iterations to run

Details

This step makes the solution more biologically interpretable, and the lasso-like L1 regularizer makes the result sparser, which is crucial for identifying regions of interest.

This function uses a proximal gradient descent method to solve the problem. It stops when the improvement is sufficiently small and the dual variable is close to feasible, or when it reaches the maximum number of iterations specified by the user.
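The package's own solver is not reproduced here, but the proximal gradient idea it describes can be sketched for a standard lasso objective. This is an illustrative sketch only: the function names, the least-squares objective, and the stopping rule below are assumptions, not the package's code.

```r
# Soft-thresholding: the proximal operator of the L1 penalty.
soft_threshold <- function(z, t) sign(z) * pmax(abs(z) - t, 0)

# Minimal proximal gradient (ISTA) sketch for
#   min_b 0.5 * ||y - X b||^2 + lambda * ||b||_1
# Stops when the change in b is sufficiently small, or after max_iter steps,
# mirroring the stopping behavior described above.
prox_grad_lasso <- function(X, y, lambda, max_iter = 1000, tol = 1e-8) {
  b <- rep(0, ncol(X))
  # Step size 1/L, where L is the Lipschitz constant of the gradient.
  step <- 1 / max(eigen(crossprod(X), only.values = TRUE)$values)
  loss <- numeric(0)
  for (k in seq_len(max_iter)) {
    grad  <- crossprod(X, X %*% b - y)               # gradient of smooth part
    b_new <- soft_threshold(b - step * grad, step * lambda)
    loss[k] <- 0.5 * sum((y - X %*% b_new)^2) + lambda * sum(abs(b_new))
    if (sum(abs(b_new - b)) < tol) { b <- b_new; break }
    b <- b_new
  }
  list(beta = b, loss = loss)                        # variables + loss array
}
```

Like the description above, the sketch returns both the optimal variables and the array of loss values, so convergence can be inspected after the fact.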

In post-processing, we do not have separate values for the minor and major copies; the result is therefore an intermediate between the changes for the minor and major copy.

We find this easier to interpret, but a user who does want separate values for the minor and major copy can copy the major-copy data file over the minor-copy file (or vice versa) and re-load the dataset. That is, change the major- and minor-copy data files so that both contain either the minor or the major data, and run the lasso.
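As a concrete sketch of that workaround (the file paths below are hypothetical placeholders; in practice they are the major- and minor-copy CNV data files that the data-preparation step was originally pointed at):

```r
# Hypothetical file paths standing in for the real major/minor-copy files.
major <- tempfile(fileext = ".txt")
minor <- tempfile(fileext = ".txt")
writeLines("example major-copy data", major)   # placeholder content

# Make the minor-copy file a duplicate of the major-copy file; afterwards,
# re-run the data preparation (see data_lasso) and Run_lasso as usual.
file.copy(major, minor, overwrite = TRUE)
```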

Value

A list containing the optimal values of the variables and the array of loss values

Examples

Lasso_res <- Run_lasso(wkdata, par, var, g_int)

yun-feng/WGDAP documentation built on Nov. 5, 2019, 1:22 p.m.