subClass_train: Sub-Class Training (new and improved version) with merged rand category

View source: R/subClass_train.R

subClass_train R Documentation

Sub-Class Training (new and improved version) with merged rand category

Description

Trains a sub-class classifier within a broad category.

Usage

subClass_train(
  cnProc_broad,
  stTrain,
  expTrain,
  colName_broadCat,
  colName_subClass,
  name_broadCat,
  colName_samp = "row.names",
  nTopGenes = 20,
  nTopGenePairs = 50,
  nRand = 40,
  nTrees = 1000,
  stratify = FALSE,
  sampsize = 40,
  weightedDown_total = 5e+05,
  weightedDown_dThresh = 0.25,
  transprop_xFact = 1e+05,
  weight_broadClass = 1,
  quickPairs = FALSE,
  coreProportion = 0
)

Arguments

cnProc_broad

the broad cnProc from broadClass_train

stTrain

a data frame mapping each sample to its broad category and sub-class

expTrain

the expression matrix

colName_broadCat

the name of the column in the sample table that contains the broad categories

colName_subClass

the name of the column in the sample table that contains the sub-classes

name_broadCat

the name of the broad category whose sub-classes are being trained

colName_samp

the name of the column that contains sample names

nTopGenes

the number of classification genes per category

nTopGenePairs

the number of top gene pairs per category

nRand

the number of random profiles generated for training

nTrees

the number of trees for the random forest classifier

weightedDown_dThresh

the threshold below which values are set to 0 in the weighted_down function

transprop_xFact

the scaling factor for transprop

weight_broadClass

the weight given to the broad classification results when they are used as features for the sub-classifier

coreProportion

the proportion of logical cores to use when finding classification genes and top-scoring gene pairs; enter 0 to disable parallel processing

weightedDown_total

numeric; the post-transformation sum of read counts for the weighted_down function

Value

a list containing the normalized expression data, the classification gene list, and the sub-class cnProc
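
Examples

A minimal sketch of a typical call. The objects (expTrain, stTrain, cnProc_broad) and the column/category names are hypothetical placeholders; broadClass_train is assumed to have been run first on the same training data.

```r
library(cancerCellNet)

# cnProc_broad is assumed to come from a prior broadClass_train() call
subClass_return <- subClass_train(
  cnProc_broad = cnProc_broad,
  stTrain = stTrain,                    # sample table with broad-category and sub-class columns
  expTrain = expTrain,                  # expression matrix (genes x samples)
  colName_broadCat = "broad_category",  # hypothetical column name
  colName_subClass = "sub_class",       # hypothetical column name
  name_broadCat = "lung",               # hypothetical broad category to subdivide
  nTopGenes = 20,
  nTopGenePairs = 50,
  nRand = 40,
  nTrees = 1000,
  coreProportion = 0                    # 0 disables parallel processing
)

# The returned list contains the normalized expression data,
# the classification gene list, and the sub-class cnProc.
```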


pcahan1/cancerCellNet documentation built on July 16, 2022, 12:12 a.m.