R/MMAD.R


#' Minorization-Maximization Algorithm via the Assembly--Decomposition Technology
#'
#' This function maximizes the target function \eqn{f(\boldsymbol{\theta})} using the MM algorithm with the assembly--decomposition (AD) technology.
#'
#' @param Function_obj An R list describing the target function \eqn{f(\boldsymbol{\theta})}.
#' @param init The initial value \eqn{\boldsymbol{\theta}^{(0)}} for iterative optimization.
#' @param tol The tolerance for convergence detection (default: \eqn{1\times 10^{-4}}).
#'
#' @return
#' The maximizer \deqn{\widehat{\boldsymbol{\theta}}=\arg\max_{\boldsymbol{\theta} \in\Theta }f(\boldsymbol{\theta}).}
#'
#' @export
MMAD <- function(Function_obj, init, tol = 1e-4)
{
  value_0 <- -Inf        # objective value at the previous iterate
  argg_0  <- init - 1    # previous iterate (offset so the first test passes)
  argg_1  <- init        # current iterate
  value_1 <- Function_evaluation(Function_obj, argg_1)$Value

  # Iterate until the mean absolute parameter change falls below `tol`
  # and the objective value stops increasing
  while (mean(abs(argg_1 - argg_0)) > tol || (value_1 - value_0) > 0)
  {
    value_0 <- value_1
    argg_0  <- argg_1

    # Minorize the target at the current iterate; the surrogate separates
    # into one univariate component per coordinate plus a constant
    Surrogate_function <- Function_minorization(Function_obj, argg_1)

    Surrogate_all <- list(Value    = Surrogate_function$Constant,
                          Gradient = rep(0, Function_obj$dimension),
                          Hessian  = rep(0, Function_obj$dimension))
    for (i in 1:Function_obj$dimension)
    {
      # Evaluate the i-th univariate surrogate component at the i-th coordinate
      GH_i <- Function_evaluation(Surrogate_function[[i]], argg_1[i])
      Surrogate_all$Value       <- Surrogate_all$Value + GH_i$Value
      Surrogate_all$Gradient[i] <- Surrogate_all$Gradient[i] + GH_i$Gradient
      Surrogate_all$Hessian[i]  <- Surrogate_all$Hessian[i] + GH_i$Hessian
    }

    # One Newton step on the separable surrogate (coordinate-wise update)
    argg_1  <- argg_1 - Surrogate_all$Gradient / Surrogate_all$Hessian
    value_1 <- Function_evaluation(Function_obj, argg_1)$Value
  }

  return(argg_1)
}
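
# The helpers Function_evaluation() and Function_minorization() are defined
# elsewhere in the package. The following toy check of the MMAD update rule is
# a minimal sketch, assuming only the interface implied by the loop above; the
# stub gradient/Hessian functions below are hypothetical, not the package's own.

```r
# Toy target: f(theta) = -sum((theta - center)^2), maximized at theta = center.
# Because f is separable and concave, each coordinate's surrogate is the
# univariate quadratic -(theta_i - c_i)^2, i.e. the function minorizes itself.
center <- c(2, -1, 0.5)            # maximizer of the toy target

grad_i <- function(x, c_i) -2 * (x - c_i)   # surrogate gradient, coordinate-wise
hess_i <- function(x) rep(-2, length(x))    # surrogate Hessian (diagonal)

theta <- c(0, 0, 0)                # initial value theta^(0)
for (iter in 1:5) {
  g <- grad_i(theta, center)       # assembled surrogate gradient
  h <- hess_i(theta)               # assembled surrogate Hessian
  theta <- theta - g / h           # the same Newton step MMAD takes
}
stopifnot(max(abs(theta - center)) < 1e-8)
print(theta)                       # recovers `center`
```

# For this quadratic target the Newton step on the surrogate lands exactly on
# the maximizer after one iteration, which is the behavior the MMAD loop's
# stopping rule (small parameter change, non-increasing objective) detects.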


MMAD documentation built on March 12, 2026, 5:07 p.m.