Description

Assess the prior improvement using Kullback-Leibler divergence.

View source: R/annotation_analysis.R

Usage

get_kl_divergences(mpra_data, cond_prior, marg_prior, n_cores, verbose = TRUE)
Arguments

mpra_data   a data frame of MPRA data
cond_prior  a conditional prior (see fit_cond_prior())
marg_prior  a marginal prior (see fit_marg_prior())
n_cores     number of cores for parallelization
verbose     logical indicating whether to print messages
Value

A data frame giving the KL divergence between the observed normalized counts and each of the two priors, for each allele of each variant_id.
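A hedged usage sketch, assuming the inputs have already been prepared as described above (the object names mpra_example, cond_prior, and marg_prior are illustrative, not from this page):

```r
# Hypothetical call; mpra_example, cond_prior, and marg_prior are assumed
# to come from upstream data preparation, fit_cond_prior(), and
# fit_marg_prior() respectively.
kl_res <- get_kl_divergences(mpra_data  = mpra_example,
                             cond_prior = cond_prior,
                             marg_prior = marg_prior,
                             n_cores    = 2,
                             verbose    = TRUE)

# One row per allele per variant_id; lower KL indicates the prior sits
# closer to the observed normalized counts.
head(kl_res)
```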
Note

The KL divergences are approximations because gamma kernel density estimation is used to obtain the empirical distribution of the normalized counts.
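To make the approximation concrete, here is a minimal sketch of a discretized KL divergence between two densities evaluated on a common grid. The gamma densities below are stand-ins for the estimated empirical distribution and a prior; they are not the package's internal computation:

```r
# Sketch: Riemann-sum approximation of KL(p || q) on a grid.
x  <- seq(0.01, 10, length.out = 512)
p  <- dgamma(x, shape = 2, rate = 1)  # stand-in for the empirical density
q  <- dgamma(x, shape = 3, rate = 1)  # stand-in for a prior density
dx <- x[2] - x[1]
kl <- sum(p * log(p / q)) * dx        # approximate KL(p || q), in nats
```

Because the empirical density itself comes from kernel density estimation, both the density values and the resulting KL are approximate.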