```r
knitr::opts_chunk$set(
  collapse = TRUE,
  comment = "#>",
  fig.width = 6,
  fig.height = 4
)
old_opts <- options(digits = 4)
```
The likelihood.model package is part of an ecosystem of three interoperable
packages:

- algebraic.dist -- probability distribution objects with a common interface
  (params, sampler, expectation, etc.)
- algebraic.mle -- maximum likelihood estimation infrastructure, defining
  the mle class and generics like params, nparams, observed_fim, rmap,
  marginal, and expectation
- likelihood.model -- likelihood model specification and Fisherian inference

When you fit a model with likelihood.model, the result is a fisher_mle
object that inherits from algebraic.mle::mle. This means all of the
algebraic.mle generics work on it directly:
```r
library(likelihood.model)
library(algebraic.mle)
```
## params, nparams, observed_fim

Let's fit an exponential model and explore the MLE through the algebraic.mle
interface:
```r
set.seed(42)
true_lambda <- 2.0
n <- 200
df <- data.frame(t = rexp(n, rate = true_lambda))

model <- exponential_lifetime("t")
mle_result <- fit(model)(df)

# algebraic.mle generics -- these work because fisher_mle inherits from mle
params(mle_result)
nparams(mle_result)
```
The MLE estimate is close to the true value (2.0), demonstrating good recovery
with $n = 200$ observations. The params() and nparams() generics work
identically to how they work on native algebraic.mle::mle objects -- because
fisher_mle inherits from that class, the entire algebraic.mle interface is
available without any adapter code.
The observed_fim() function returns the observed Fisher information matrix,
computed as the negative Hessian of the log-likelihood at the MLE:
```r
observed_fim(mle_result)
```
For the exponential model, this is a $1 \times 1$ matrix equal to $n/\hat\lambda^2$. It should be positive at the MLE:
```r
cat("Observed FIM:", observed_fim(mle_result)[1, 1], "\n")
cat("Positive:", observed_fim(mle_result)[1, 1] > 0, "\n")
```
For comparison, vcov() is the inverse of the observed FIM:
```r
vcov(mle_result)
1 / observed_fim(mle_result)
```
The numerical equality confirms that vcov() is computed as
solve(observed_fim()), the classical relationship
$\widehat{\text{Var}}(\hat\theta) = I(\hat\theta)^{-1}$.
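This relationship is easy to verify without any package machinery. The exponential log-likelihood is $\ell(\lambda) = n \log \lambda - \lambda \sum t_i$, whose second derivative at the MLE gives the observed FIM $n/\hat\lambda^2$ in closed form; a finite-difference Hessian from base R's `stats::optimHess` should agree. A self-contained sketch (fresh simulated data, not the fitted object above):

```r
set.seed(42)
t <- rexp(200, rate = 2)
n <- length(t)

# Exponential log-likelihood; the MLE has a closed form: 1 / mean(t)
loglik <- function(lambda) n * log(lambda) - lambda * sum(t)
lambda_hat <- 1 / mean(t)

# Observed FIM (negative Hessian at the MLE), two ways
fim_analytic <- n / lambda_hat^2
fim_numeric  <- -optimHess(lambda_hat, loglik)

all.equal(fim_analytic, drop(fim_numeric), tolerance = 1e-4)

# The asymptotic variance is the inverse of the observed FIM
1 / fim_analytic
```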
## rmap

One powerful feature of algebraic.mle is the rmap() function, which
transforms MLE parameters using the delta method. This propagates uncertainty
through nonlinear transformations.
For an Exponential($\lambda$) distribution, the mean lifetime is $1/\lambda$. We can transform our rate MLE to get an MLE for the mean lifetime:
```r
# Transform rate -> mean lifetime
mean_life_mle <- rmap(mle_result, function(p) {
  c(mean_lifetime = 1 / p[1])
})
params(mean_life_mle)
se(mean_life_mle)
```
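Under the hood, the scalar delta method is just a derivative and a multiplication: $\widehat{\text{Var}}(g(\hat\lambda)) \approx g'(\hat\lambda)^2 \, \widehat{\text{Var}}(\hat\lambda)$. A base-R sketch of the same transformation, recomputing the exponential MLE from fresh simulated data so the snippet stands alone:

```r
set.seed(42)
t <- rexp(200, rate = 2)
lambda_hat <- 1 / mean(t)          # closed-form exponential MLE
var_hat    <- lambda_hat^2 / 200   # asymptotic variance: lambda^2 / n

# Delta method for g(lambda) = 1/lambda (the mean lifetime):
# Var(g(lambda_hat)) ~= g'(lambda_hat)^2 * Var(lambda_hat)
g_prime <- -1 / lambda_hat^2
se_mean_life <- sqrt(g_prime^2 * var_hat)

c(mean_lifetime = 1 / lambda_hat, se = se_mean_life)
```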
Compare to the true mean lifetime:
```r
true_mean <- 1 / true_lambda
cat("True mean lifetime:", true_mean, "\n")
cat("Estimated mean lifetime:", params(mean_life_mle), "\n")
cat("95% CI:", confint(mean_life_mle), "\n")
```
We can also transform to multiple derived quantities at once:
```r
# Derive mean, variance, and median of the exponential distribution
derived_mle <- rmap(mle_result, function(p) {
  lam <- p[1]
  c(
    mean = 1 / lam,
    var = 1 / lam^2,
    median = log(2) / lam
  )
})
params(derived_mle)
se(derived_mle)
```
Notice that the variance has the largest SE relative to its estimate -- second moments are inherently harder to estimate than first moments.
The expectation() function computes Monte Carlo expectations over the MLE's
asymptotic sampling distribution.
```r
set.seed(123)

# E[lambda^2] under the asymptotic distribution
e_lam_sq <- expectation(mle_result, function(p) p[1]^2,
                        control = list(n = 10000L))
cat("E[lambda^2]:", e_lam_sq, "\n")
cat("lambda^2 at MLE:", params(mle_result)[1]^2, "\n")

# Probability that lambda > 1.5 (under the asymptotic distribution)
pr_lam_gt <- expectation(mle_result, function(p) as.numeric(p[1] > 1.5),
                         control = list(n = 10000L))
cat("P(lambda > 1.5):", pr_lam_gt, "\n")
```
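The same computation can be sketched directly in base R: draw parameter values from the asymptotic normal $N(\hat\lambda, \widehat{\text{se}}^2)$ and average the function of interest over the draws. The MLE and its standard error are recomputed from fresh simulated data here, so the numbers will differ slightly from the chunk above:

```r
set.seed(123)
t <- rexp(200, rate = 2)
lambda_hat <- 1 / mean(t)
se_hat     <- lambda_hat / sqrt(200)  # sqrt of lambda^2 / n

# Draw from the MLE's asymptotic sampling distribution ...
draws <- rnorm(10000, mean = lambda_hat, sd = se_hat)

# ... and average the function of interest over the draws
mean(draws^2)      # Monte Carlo E[lambda^2]
mean(draws > 1.5)  # Monte Carlo P(lambda > 1.5)
```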
Under standard MLE asymptotics, bias is zero and MSE equals the variance-covariance matrix:
```r
mse(mle_result)
all.equal(mse(mle_result), vcov(mle_result))
```
The TRUE result confirms that the MSE equals the variance-covariance matrix
under the assumption of zero asymptotic bias.
The sampler() generic creates a bootstrap sampling distribution. The resulting
fisher_boot object also satisfies the mle interface:
```r
set.seed(42)
boot_sampler <- sampler(model, df = df, par = c(lambda = 2))
boot_result <- boot_sampler(n = 200)

# The same algebraic.mle generics work
params(boot_result)
nparams(boot_result)
se(boot_result)
bias(boot_result)
```
The bootstrap bias is negligible compared to the standard error, consistent with the MLE being asymptotically unbiased.
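The nonparametric bootstrap behind these numbers is simple to sketch by hand: resample the data with replacement, recompute the closed-form exponential MLE $\hat\lambda^* = 1/\bar{t}^*$ on each resample, and summarize the draws. A base-R version, independent of the fitted objects above:

```r
set.seed(42)
t <- rexp(200, rate = 2)
lambda_hat <- 1 / mean(t)

boot_lambda <- replicate(2000, {
  t_star <- sample(t, replace = TRUE)  # resample with replacement
  1 / mean(t_star)                     # refit the closed-form MLE
})

c(estimate  = lambda_hat,
  boot_se   = sd(boot_lambda),
  boot_bias = mean(boot_lambda) - lambda_hat)

# Percentile bootstrap 95% interval
quantile(boot_lambda, c(0.025, 0.975))
```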
Compare bootstrap and asymptotic confidence intervals:
```r
cat("Asymptotic 95% CI:\n")
confint(mle_result)
cat("\nBootstrap percentile 95% CI:\n")
confint(boot_result, type = "perc")
```
## algebraic.dist Distributions

```r
has_dist <- requireNamespace("algebraic.dist", quietly = TRUE)
library(algebraic.dist)
```
The algebraic.dist package provides distribution objects with a consistent
interface. We can compare the MLE's asymptotic sampling distribution to
theoretical distribution objects.
For example, the MLE of an exponential rate parameter $\hat\lambda$ is asymptotically normal with variance $\lambda^2/n$. We can create this theoretical distribution:
```r
set.seed(42)
cat("MLE:", params(mle_result), "\n")
cat("SE:", se(mle_result), "\n")

# Theoretical asymptotic distribution of the MLE
asymp_var <- true_lambda^2 / n
asymp_dist <- normal(mu = true_lambda, var = asymp_var)
cat("\nTheoretical asymptotic distribution:\n")
cat("  Mean:", params(asymp_dist)[1], "\n")
cat("  Variance:", params(asymp_dist)[2], "\n")

# Compare: sample from the MLE's estimated distribution
mle_sampler <- sampler(mle_result)
set.seed(1)
mle_samples <- mle_sampler(5000)

# vs. sample from the theoretical distribution
dist_sampler <- sampler(asymp_dist)
set.seed(1)
dist_samples <- dist_sampler(5000)

cat("\nMLE sampler mean:", mean(mle_samples), "\n")
cat("Theoretical sampler mean:", mean(dist_samples), "\n")
cat("MLE sampler sd:", sd(mle_samples), "\n")
cat("Theoretical sampler sd:", sd(dist_samples), "\n")
```
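The claim that $\hat\lambda$ is approximately $N(\lambda, \lambda^2/n)$ can also be checked by brute-force simulation: refit the closed-form MLE on many datasets drawn at the true rate and compare the empirical standard deviation of $\hat\lambda$ to $\lambda/\sqrt{n}$. A base-R sketch:

```r
set.seed(7)
true_lambda <- 2
n <- 200

# Refit the closed-form exponential MLE on many simulated datasets
lambda_hats <- replicate(5000, 1 / mean(rexp(n, rate = true_lambda)))

c(empirical_sd  = sd(lambda_hats),
  asymptotic_sd = true_lambda / sqrt(n))
```

The two standard deviations agree closely at $n = 200$; the small remaining gap is the finite-sample correction that the asymptotic approximation ignores.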
The fisher_mle objects returned by likelihood.model are fully compatible
with the algebraic.mle interface:
| Generic | What it does on fisher_mle |
|---------|------------------------------|
| params() | Parameter estimates (same as coef()) |
| nparams() | Number of parameters |
| se() | Standard errors |
| vcov() | Variance-covariance matrix |
| observed_fim() | Observed Fisher information ($-H$) |
| bias() | Asymptotic bias (zero) |
| mse() | Mean squared error (equals vcov asymptotically) |
| AIC() | Akaike information criterion |
| confint() | Wald confidence intervals |
| rmap() | Delta method parameter transformations |
| marginal() | Marginal sampling distributions |
| expectation() | Monte Carlo expectations over sampling distribution |
| sampler() | Asymptotic or bootstrap sampling function |
```r
options(old_opts)
```