hsmc R Documentation
Description

Computes the analytic 3x3 Hessian matrix (matrix of second partial
derivatives) of the negative log-likelihood function for the McDonald (Mc)
distribution (also known as Beta Power) with parameters gamma (\gamma),
delta (\delta), and lambda (\lambda). This distribution is the special case
of the Generalized Kumaraswamy (GKw) distribution where \alpha = 1 and
\beta = 1. The Hessian is useful for estimating standard errors and in
optimization algorithms.
Usage

hsmc(par, data)
Arguments

par: A numeric vector of length 3 containing the distribution parameters in
the order: gamma (\gamma), delta (\delta), lambda (\lambda).

data: A numeric vector of observations. All values must lie strictly in the
open interval (0, 1).
Details

This function calculates the analytic second partial derivatives of the
negative log-likelihood function (-\ell(\theta|\mathbf{x})). The components
are based on the second derivatives of the log-likelihood \ell (derived from
the PDF in dmc).

Note: The formulas below represent the second derivatives of the positive
log-likelihood (\ell); the function returns the negative of these values.
Users should verify these formulas independently if using them for critical
applications.
\frac{\partial^2 \ell}{\partial \gamma^2} = -n[\psi'(\gamma) - \psi'(\gamma+\delta+1)]

\frac{\partial^2 \ell}{\partial \gamma \partial \delta} = n\psi'(\gamma+\delta+1)

\frac{\partial^2 \ell}{\partial \gamma \partial \lambda} = \sum_{i=1}^{n}\ln(x_i)

\frac{\partial^2 \ell}{\partial \delta^2} = -n[\psi'(\delta+1) - \psi'(\gamma+\delta+1)]

\frac{\partial^2 \ell}{\partial \delta \partial \lambda} = -\sum_{i=1}^{n}\frac{x_i^{\lambda}\ln(x_i)}{1-x_i^{\lambda}}

\frac{\partial^2 \ell}{\partial \lambda^2} = -\frac{n}{\lambda^2} - \delta\sum_{i=1}^{n}\frac{x_i^{\lambda}[\ln(x_i)]^2}{(1-x_i^{\lambda})^2}

where \psi'(\cdot) is the trigamma function (trigamma).
(Note: The formula for \partial^2 \ell / \partial \lambda^2 given in the
source comment differed and may relate to the expected (Fisher) information
matrix; the formula shown here is derived from the gradient given earlier.
Verification is recommended.)
The returned matrix is symmetric, with rows and columns corresponding to
\gamma, \delta, \lambda.
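As a numeric cross-check of the formulas above, the sketch below (Python, for illustration only; it is not part of the package) builds the negative log-likelihood directly from the Mc density f(x) = \lambda x^{\gamma\lambda-1}(1-x^{\lambda})^{\delta} / B(\gamma, \delta+1), fills in the analytic Hessian entries listed above (approximating the trigamma function by a second difference of lgamma), and compares the result against a finite-difference Hessian. All function names here are local helpers, not package functions.

```python
import math
import random

def nll_mc(par, x):
    # Negative log-likelihood of the Mc (McDonald / Beta Power) density:
    # f(x) = lambda * x^(gamma*lambda - 1) * (1 - x^lambda)^delta / B(gamma, delta + 1)
    g, d, l = par
    ln_beta = math.lgamma(g) + math.lgamma(d + 1) - math.lgamma(g + d + 1)
    n = len(x)
    ll = n * math.log(l) - n * ln_beta
    ll += sum((g * l - 1) * math.log(xi) + d * math.log(1 - xi ** l) for xi in x)
    return -ll

def trigamma(z, h=1e-3):
    # psi'(z) approximated as a second central difference of lgamma
    return (math.lgamma(z + h) - 2 * math.lgamma(z) + math.lgamma(z - h)) / h ** 2

def analytic_hessian_nll(par, x):
    # Entries of the Hessian of ell from the documented formulas; the
    # gamma-delta cross term is +n*psi'(gamma+delta+1), which comes from
    # differentiating -n*ln B(gamma, delta+1) once in each parameter.
    g, d, l = par
    n = len(x)
    lx = [math.log(xi) for xi in x]
    t = [xi ** l for xi in x]
    H = [[0.0] * 3 for _ in range(3)]
    H[0][0] = -n * (trigamma(g) - trigamma(g + d + 1))
    H[0][1] = H[1][0] = n * trigamma(g + d + 1)
    H[0][2] = H[2][0] = sum(lx)
    H[1][1] = -n * (trigamma(d + 1) - trigamma(g + d + 1))
    H[1][2] = H[2][1] = -sum(ti * li / (1 - ti) for ti, li in zip(t, lx))
    H[2][2] = -n / l ** 2 - d * sum(ti * li ** 2 / (1 - ti) ** 2
                                    for ti, li in zip(t, lx))
    # The function documented here returns the NEGATIVE of these values:
    return [[-v for v in row] for row in H]

def numeric_hessian(f, par, x, h=1e-4):
    # Central finite-difference Hessian of f(par, x)
    H = [[0.0] * 3 for _ in range(3)]
    for i in range(3):
        for j in range(3):
            def at(si, sj):
                p = list(par)
                p[i] += si * h
                p[j] += sj * h
                return f(p, x)
            H[i][j] = (at(1, 1) - at(1, -1) - at(-1, 1) + at(-1, -1)) / (4 * h * h)
    return H

random.seed(1)
x = [random.betavariate(2, 3) for _ in range(200)]  # stand-in data on (0, 1)
par = (2.0, 3.0, 0.5)
Ha = analytic_hessian_nll(par, x)
Hn = numeric_hessian(nll_mc, par, x)
err = max(abs(Ha[i][j] - Hn[i][j]) for i in range(3) for j in range(3))
print("max |analytic - numeric| =", err)
```

Agreement of the two matrices to several decimal places supports the signs of the cross-partial terms in particular.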
Value

Returns a 3x3 numeric matrix representing the Hessian matrix of the negative
log-likelihood function, -\partial^2 \ell / (\partial \theta_i \partial \theta_j),
where \theta = (\gamma, \delta, \lambda). Returns a 3x3 matrix populated with
NaN if any parameter value violates its constraints, or if any value in data
is not in the interval (0, 1).
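Evaluated at the MLE, the returned matrix is the observed information, so approximate standard errors are the square roots of the diagonal of its inverse. A minimal sketch (Python, illustrative only; the 3x3 matrix below is a made-up stand-in for a value this function would return, not real output):

```python
import math

def inv3(M):
    # Inverse of a 3x3 matrix via the adjugate (cofactor) formula.
    a, b, c = M[0]
    d, e, f = M[1]
    g, h, i = M[2]
    cof = [
        [e * i - f * h, -(d * i - f * g), d * h - e * g],
        [-(b * i - c * h), a * i - c * g, -(a * h - b * g)],
        [b * f - c * e, -(a * f - c * d), a * e - b * d],
    ]
    det = a * cof[0][0] + b * cof[0][1] + c * cof[0][2]
    # inverse = transpose of the cofactor matrix, divided by the determinant
    return [[cof[col][row] / det for col in range(3)] for row in range(3)]

# Made-up symmetric positive-definite "observed information" standing in for
# a Hessian of the negative log-likelihood at the MLE (illustrative values).
H_obs = [[50.0, 10.0, -30.0],
         [10.0, 40.0,   5.0],
         [-30.0, 5.0,  80.0]]

cov = inv3(H_obs)  # approximate covariance matrix of the MLE
se = [math.sqrt(cov[k][k]) for k in range(3)]
print("standard errors (gamma, delta, lambda):", se)
```

In R the same step is `sqrt(diag(solve(H)))`, as in the commented block at the end of the Examples.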
Author(s)

Lopes, J. E.
References

McDonald, J. B. (1984). Some generalized functions for the size distribution
of income. Econometrica, 52(3), 647-663.

Cordeiro, G. M., & de Castro, M. (2011). A new family of generalized
distributions. Journal of Statistical Computation and Simulation, 81(7),
883-898.

(Note: Specific Hessian formulas may be derived from or sourced in additional
references.)
See Also

hsgkw (parent distribution Hessian), llmc (negative log-likelihood for Mc),
grmc (gradient for Mc, if available), dmc (density for Mc), optim, hessian
(for numerical Hessian comparison), trigamma.
Examples

# Assuming existence of rmc, llmc, grmc, hsmc functions for Mc distribution

# Generate sample data
set.seed(123)
true_par_mc <- c(gamma = 2, delta = 3, lambda = 0.5)
sample_data_mc <- rmc(100, gamma = true_par_mc[1], delta = true_par_mc[2],
                      lambda = true_par_mc[3])
hist(sample_data_mc, breaks = 20, main = "Mc(2, 3, 0.5) Sample")

# --- Find MLE estimates ---
start_par_mc <- c(1.5, 2.5, 0.8)
mle_result_mc <- stats::optim(par = start_par_mc,
                              fn = llmc,
                              gr = if (exists("grmc")) grmc else NULL,
                              method = "BFGS",
                              hessian = TRUE, # ask optim for a numerical Hessian
                              data = sample_data_mc)
# --- Compare analytical Hessian to numerical Hessian ---
if (mle_result_mc$convergence == 0 &&
requireNamespace("numDeriv", quietly = TRUE) &&
exists("hsmc")) {
mle_par_mc <- mle_result_mc$par
cat("\nComparing Hessians for Mc at MLE estimates:\n")
# Numerical Hessian of llmc
num_hess_mc <- numDeriv::hessian(func = llmc, x = mle_par_mc, data = sample_data_mc)
# Analytical Hessian from hsmc
ana_hess_mc <- hsmc(par = mle_par_mc, data = sample_data_mc)
cat("Numerical Hessian (Mc):\n")
print(round(num_hess_mc, 4))
cat("Analytical Hessian (Mc):\n")
print(round(ana_hess_mc, 4))
# Check differences (monitor sign consistency)
cat("Max absolute difference between Mc Hessians:\n")
print(max(abs(num_hess_mc - ana_hess_mc)))
# Optional: Use analytical Hessian for Standard Errors
# tryCatch({
# cov_matrix_mc <- solve(ana_hess_mc) # ana_hess_mc is already Hessian of negative LL
# std_errors_mc <- sqrt(diag(cov_matrix_mc))
# cat("Std. Errors from Analytical Mc Hessian:\n")
# print(std_errors_mc)
# }, error = function(e) {
# warning("Could not invert analytical Mc Hessian: ", e$message)
# })
} else {
cat("\nSkipping Mc Hessian comparison.\n")
cat("Requires convergence, 'numDeriv' package, and function 'hsmc'.\n")
}