get_kl_divergences: Get Kullback-Leibler Divergences


View source: R/annotation_analysis.R

Description

Assess how much the conditional prior improves on the marginal prior using Kullback-Leibler divergence.

Usage

get_kl_divergences(mpra_data, cond_prior, marg_prior, n_cores, verbose = TRUE)

Arguments

mpra_data

a data frame of MPRA data

cond_prior

a conditional prior (see fit_cond_prior())

marg_prior

a marginal prior (see fit_marg_prior())

n_cores

number of cores for parallelization

verbose

logical indicating whether to print messages

Value

a data frame giving the KL divergence between the observed normalized counts and each of the two priors, for each allele of each variant_id

Note

The KL divergences are approximations: gamma kernel density estimation is used to obtain the empirical distribution of the normalized counts.
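A sketch of typical usage, assuming priors fit with fit_cond_prior() and fit_marg_prior() as described in Arguments (mpra_data here is a placeholder for your own MPRA count data frame; the fitting calls are abbreviated, see each function's help page for its required inputs):

```r
library(malacoda)

# Fit the two priors to compare (arguments abbreviated; see
# ?fit_marg_prior and ?fit_cond_prior for full signatures)
marg_prior <- fit_marg_prior(mpra_data)
cond_prior <- fit_cond_prior(mpra_data, ...)

kl_res <- get_kl_divergences(mpra_data,
                             cond_prior = cond_prior,
                             marg_prior = marg_prior,
                             n_cores = 2,
                             verbose = TRUE)
```

For a given allele, a lower KL divergence means that prior sits closer to the empirical distribution of the observed normalized counts, so comparing the two columns shows where the conditional prior improves on the marginal one.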


andrewGhazi/malacoda documentation built on Aug. 2, 2020, 12:54 a.m.