Description Usage Arguments Details Value Slots Methods Author(s) See Also Examples
Represents the posterior distribution used to infer copy number from targeted deep sequencing.
makeCNVPosterior(obs, prior)
## S4 method for signature 'CNVPosterior,missing'
plot(x, place="topright", lwd=1, ...)
## S4 method for signature 'CNVPosterior'
hist(x, place="topright", lwd=1, ...)
## S4 method for signature 'CNVPosterior'
summary(object, ...)
## S4 method for signature 'CNVPosteriorSummary'
as.data.frame(x, row.names = NULL, optional = FALSE, ...)
obs
:Observed read-count data
prior
:Prior distribution, represented using the CNVPrior class
object
:An object of the CNVPosterior class
x
:An object of the CNVPosterior class (for plot and hist) or of the CNVPosteriorSummary class (for as.data.frame)
place
:Character string; where to place the figure legend
lwd
:Line width parameter
row.names
:See as.data.frame
optional
:See as.data.frame
...
:Extra arguments for generic or plotting routines
The DeepCNV
class is used to fit a Bayesian model to targeted
sequencing data from one or a few genes in order to draw inferences
about possible copy number changes. Basically, we assume that the
observed data consists of a list of triples (K, N, V), one for each
variant in a gene. Here K is the number of variant reads, N is the
total number of reads, and V is the type of each variant (either a
known SNP or a somatic mutation). We model (K, N) using a binomial
distribution, where the 'success' parameter depends (in a
deterministic way) on the unknown parameters of interest: the fraction
ν of normal cells in the sample and the copy number state (Normal,
Deleted, or Gained).
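The binomial model just described can be sketched in a few lines of R. The mapping from ν and the copy number state to the success parameter is an illustrative assumption here, not the package's exact formula; the phi and loglik names are invented for the sketch.

```r
# Sketch of the binomial model for one heterozygous SNP (illustrative;
# the mapping from (nu, state) to phi below is an assumption, not the
# package's formula). Normal cells contribute allele fraction 1/2.
phi <- function(nu, state) {
  # nu: fraction of normal cells in the sample
  tumorFrac <- switch(state,
                      Normal  = 1/2,  # two copies, one carries the variant
                      Deleted = 0,    # assume the variant copy is lost
                      Gained  = 2/3)  # assume the variant copy is gained
  nu * 1/2 + (1 - nu) * tumorFrac
}
# Binomial log-likelihood for K variant reads out of N total reads
loglik <- function(K, N, nu, state) {
  dbinom(K, size = N, prob = phi(nu, state), log = TRUE)
}
loglik(40, 100, nu = 0.3, state = "Gained")
```

With 30% normal cells, a gained variant copy pushes the expected variant allele fraction above one-half, so 40 variant reads out of 100 are less likely under "Gained" than under "Normal".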
The prior distribution consists of a continuous (by default,
Beta(α, β)) distribution on ν and a discrete
distribution on the copy number state S, which are stored in an object
of the CNVPrior
class. The CNVariant
computes the
success parameter φ as a function of the observed data
(K, N, V). Then φ and the observed data are passed on to the
cnvLikelihood
function, which computes the binomial
log-likelihood.
We compute the posterior distribution essentially by brute force, in
the function makeCNVPosterior
. We choose a regularly spaced
grid of points on the interval (0, 1) and evaluate the log-likelihood at
every grid point for each conceivable copy number state. These
likelihoods are combined with the prior distribution by the usual
application of Bayes' rule.
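The brute-force grid computation can be sketched as follows. This is a toy version, not the package internals: the per-state mapping to the binomial success parameter, the grid resolution, and the prior values are all assumptions made for the example.

```r
# Toy sketch of the grid-based posterior (assumed model, not the
# package internals). Evaluate the log-likelihood on a grid over nu for
# each copy number state, add the log-prior, and normalize.
K <- 40; N <- 100                          # observed variant/total reads
nu <- seq(0.005, 0.995, by = 0.005)        # grid on the interval (0, 1)
# assumed mapping from (nu, state) to the binomial success parameter
phiGrid <- cbind(Normal  = 0.5,
                 Deleted = nu * 0.5,
                 Gained  = nu * 0.5 + (1 - nu) * 2/3)
loglike <- apply(phiGrid, 2, function(p) dbinom(K, N, prob = p, log = TRUE))
# prior: Beta(1.2, 4.8) on nu and a discrete prior on the state
logprior <- outer(dbeta(nu, 1.2, 4.8, log = TRUE),
                  log(c(Normal = 0.4, Deleted = 0.3, Gained = 0.3)), "+")
logpost <- loglike + logprior              # grid-by-state matrix
post <- exp(logpost - max(logpost))        # stabilize before normalizing
post <- post / sum(post)                   # Bayes' rule on the grid
colSums(post)                              # marginal posterior of the state
```

Because the grid is uniform, the spacing cancels in the normalization; summing the normalized matrix over rows gives the marginal posterior of the copy number state, and summing over columns gives the posterior on ν.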
One complication arises from the variant type V, which can be observed with error. Additional complications arise because the variant (whether SNP or mutation) can turn out to be either the major or the minor allele. These difficulties are mostly hidden from the user, and are resolved by replacing an overabundance of states with the maximum a posteriori selection at each variant. This design decision affects the structure of the class, which computes multiple forms of the likelihood:
hiddenloglike
, for all hidden states
snploglike
, for each variant separately, collapsing
hidden states
loglike
, for the unified gene
and posterior distribution:
snppost
, for each variant separately
posterior
, for the unified gene
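The maximum a posteriori collapse of hidden states can be illustrated on a toy matrix of per-variant log-likelihoods. The state labels and values below are invented for the example; they only show the mechanics of picking the best-scoring hidden state per variant.

```r
# Toy illustration of collapsing hidden states by maximum a posteriori
# selection (labels and values invented for the example).
# Rows are variants; columns are hidden variant-type states.
hiddenloglike <- rbind(v1 = c(SNP.major = -3.1, SNP.minor = -9.0, Mut = -12.4),
                       v2 = c(SNP.major = -8.7, SNP.minor = -2.2, Mut = -6.5))
# For each variant, keep only the best-scoring hidden state
calls <- colnames(hiddenloglike)[apply(hiddenloglike, 1, which.max)]
names(calls) <- rownames(hiddenloglike)
calls   # MAP variant type for each variant
```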
The makeCNVPosterior
constructor returns a valid object of the
CNVPosterior class.
hiddenloglike
:Log-likelihood using the hidden expanded discrete model
snploglike
:Log-likelihood for each SNP analyzed one at a time
snppost
:Posterior distribution of parameters for each SNP analyzed one at a time
loglike
:Log-likelihood merging data from all SNPs
posterior
:Posterior distribution merging data from all SNPs
calls
:List of maximum a posteriori variant types used to collapse the hidden model
observed
:Observed read data
prior
:Prior distribution
Prints all the information in the object
Prints all the information in the object
Writes out a summary of the object
Kevin R. Coombes krc@silicovore.com
prior <- setCNVPrior(alpha=1.2, beta=4.8, pAbnormal=0.6)
dataset <- simReads(2, 7, 0.17, "Normal")
posterior <- makeCNVPosterior(dataset, prior)
s <- summary(posterior)
s
as.data.frame(s)
plot(posterior)
hist(posterior)