View source: R/helper_functions.R
estimate_memory_posterior    R Documentation

Estimate the memory required to store the posterior samples from sample_MegaLMM
Description:

  A call to sample_MegaLMM(MegaLMM_state, n_iter) will run n_iter
  iterations of the Gibbs sampler. If nrun > burn, a posterior sample
  of all variables is stored in MegaLMM_state$Posterior every thin
  iterations. If you are doing a long run and storing a large number
  of parameters, this will take a lot of memory. This function
  estimates the memory requirements.
Usage:

  estimate_memory_posterior(MegaLMM_state, n_iter)
Arguments:

  MegaLMM_state: The model after calling

  n_iter: number of iterations of the Gibbs sampler
Details:

  Note 1: The estimated value assumes all iterations are post-burn-in.

  Note 2: sample_MegaLMM() will instantiate all arrays that hold the
  posterior samples before running the iterations, so memory
  requirements will not increase much during sampling.

  Note 3: It is generally not necessary to run
  sample_MegaLMM(MegaLMM_state, n_iter) with a large n_iter. Instead,
  run the function many times, each with a small n_iter, calling
  save_posterior_chunk between runs. This gives you the ability to
  diagnose problems during the run and keeps the memory requirements
  low. You can always reload the posterior samples from the on-disk
  database using reload_Posterior or load_posterior_param.
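The chunked workflow described in Note 3 might be sketched as follows. This is a hedged illustration, not part of this help page: the chunk count and per-chunk n_iter are arbitrary, and it assumes MegaLMM_state has already been fully set up.

```r
# Run the sampler in many short chunks, writing each chunk of
# posterior samples to disk so memory stays bounded.
n_iter <- 100                # a small number of iterations per chunk
for (chunk in 1:10) {
  MegaLMM_state <- sample_MegaLMM(MegaLMM_state, n_iter)
  # save the collected samples to the on-disk database and
  # free them from memory before the next chunk
  MegaLMM_state <- save_posterior_chunk(MegaLMM_state)
}
# afterwards, the full set of samples can be pulled back in
MegaLMM_state$Posterior <- reload_Posterior(MegaLMM_state)
```

Inspecting diagnostics between chunks (rather than after one long run) is what makes problems visible early.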
Value:

  The estimated memory size in bytes
See Also:

  estimate_memory_initialization_MegaLMM, save_posterior_chunk,
  reload_Posterior, load_posterior_param
Examples:

  estimate_memory_posterior(MegaLMM_state, 100)
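Since the return value is a byte count, it can be converted to more readable units before deciding on a run length. A hedged sketch, assuming MegaLMM_state is an already-initialized model and the return value is a plain numeric:

```r
# Estimate posterior storage for 1000 post-burn-in samples,
# then express the byte count in gibibytes
est_bytes <- estimate_memory_posterior(MegaLMM_state, 1000)
est_bytes / 2^30  # approximate size in GiB
```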