TReNA-package: Inferring Transcriptional Regulation with TReNA


Description

TReNA provides a framework for using gene expression data to infer relationships between a target gene and a set of transcription factors. It does so using several classes and their associated methods, briefly documented below.

Details

TReNA Class Objects

The TReNA class is the central piece of the package. It houses the matrix of gene expression data as well as the details of the solver chosen for feature selection. Its main method is solve, which performs the feature selection and returns the resulting coefficients.
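As a minimal sketch (assuming the constructor and solve signatures shown in the package vignette; mtx stands in for the user's own expression matrix), a typical run looks like this:

    library(TReNA)
    # mtx: numeric matrix of expression values, genes as rows, samples as columns
    trena <- TReNA(mtx.assay = mtx, solver = "lasso")
    target.gene <- "MEF2C"
    tfs <- setdiff(rownames(mtx), target.gene)  # all other genes as candidates
    tbl <- solve(trena, target.gene, tfs)       # coefficients for each candidate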

Solver Class Objects

The Solver class is a base class used within a TReNA object. A particular Solver object contains the name of the selected solver and dispatches the correct feature selection method when solve is called on the TReNA object. It is inherited by all the following subclasses, representing the different feature selection methods: BayesSpikeSolver, EnsembleSolver, LassoPVSolver, LassoSolver, PearsonSolver, RandomForestSolver, RidgeSolver, SpearmanSolver, SqrtLassoSolver.
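The solver string passed to the TReNA constructor determines which subclass is dispatched. A brief sketch (the solver names below are assumptions inferred from the subclass names; see the TReNA class documentation for the accepted values):

    # "lasso" is assumed to dispatch LassoSolver, "randomForest" RandomForestSolver, etc.
    trena.rf    <- TReNA(mtx.assay = mtx, solver = "randomForest")
    trena.ridge <- TReNA(mtx.assay = mtx, solver = "ridge")
    tbl.rf <- solve(trena.rf, target.gene, tfs)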

CandidateFilter Class Objects

The CandidateFilter class is separate from the aforementioned classes. It is a base class that contains a gene expression matrix and is used to filter the transcription factors in that matrix. The filtering method depends on the chosen filter type; there are currently three subclasses: FootprintFilter, NullFilter, and VarianceFilter. The filters are applied by calling the getCandidates method on a given CandidateFilter object.
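For instance, a VarianceFilter can be used to retain only highly variable transcription factors. This is a sketch only; the extraArgs names below are illustrative assumptions, so consult the getCandidates documentation for the exact arguments each filter expects:

    variance.filter <- VarianceFilter(mtx.assay = mtx)
    # target.gene and var.fraction are assumed argument names, for illustration
    tf.candidates <- getCandidates(variance.filter,
                                   extraArgs = list(target.gene = "MEF2C",
                                                    var.fraction = 0.5))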

FootprintFinder Class Objects

The FootprintFinder class is designed to allow extraction of gene footprinting information from existing PostgreSQL or SQLite databases. In standard use of the TReNA package, it is used solely by the getCandidates method of a FootprintFilter object. However, a FootprintFinder object also offers many other methods for extracting footprint information more flexibly.
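A sketch of direct use follows; the database URIs are hypothetical placeholders, and getFootprintsForGene and closeDatabaseConnections are among the methods documented for the class:

    # hypothetical PostgreSQL URIs for a genome and a footprint project database
    genome.db.uri  <- "postgres://host/hg38"
    project.db.uri <- "postgres://host/brain_hint"
    fp <- FootprintFinder(genome.db.uri, project.db.uri)
    tbl.fp <- getFootprintsForGene(fp, gene = "MEF2C")
    closeDatabaseConnections(fp)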

See Also

TReNA

