Package `autodiffr` provides an R wrapper for the Julia packages ForwardDiff.jl and ReverseDiff.jl through `JuliaCall` to do automatic differentiation for native R functions.
Julia is needed to use `autodiffr`. You can download a generic Julia binary from https://julialang.org/downloads/ and add it to the path.
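For example, on Linux or macOS you could add the unpacked Julia's `bin` directory to `PATH` along these lines (the `~/julia` location below is only an illustration; use wherever you actually unpacked the binary):

```shell
# Hypothetical install location: adjust to where the julia binary actually lives
export PATH="$HOME/julia/bin:$PATH"

# Check whether the shell can now find julia
command -v julia || echo "julia not found; check the path above"
```

To make the change permanent, put the `export` line in your shell profile (e.g. `~/.bashrc`).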
Package `autodiffr` is not on CRAN yet. You can get the development version of `autodiffr` by:

```r
devtools::install_github("Non-Contradiction/autodiffr")
```
Important: note that currently Julia v0.6.x, v0.7.0 and v1.0 are all supported by `autodiffr`, but to use `autodiffr` with Julia v0.7/1.0, you need to get the development version of `JuliaCall` by:

```r
devtools::install_github("Non-Contradiction/JuliaCall")
```
```r
library(autodiffr)

## Do initial setup
ad_setup()

## If you want to use a julia at a specific location, you could do the following:
## ad_setup(JULIA_HOME = "the folder that contains julia binary"),
## or you can set JULIA_HOME in command line environment or use `options(...)`

## Define a target function with vector input and scalar output
f <- function(x) sum(x^2L)

## Calculate gradient of f at [2, 3] by
ad_grad(f, c(2, 3)) ## deriv(f, c(2, 3))

## Get a gradient function g
g <- makeGradFunc(f)

## Evaluate the gradient function g at [2, 3]
g(c(2, 3))

## Calculate hessian of f at [2, 3] by
ad_hessian(f, c(2, 3))

## Get a hessian function h
h <- makeHessianFunc(f)

## Evaluate the hessian function h at [2, 3]
h(c(2, 3))

## Define a target function with vector input and vector output
f <- function(x) x^2

## Calculate jacobian of f at [2, 3] by
ad_jacobian(f, c(2, 3))

## Get a jacobian function j
j <- makeJacobianFunc(f)

## Evaluate the jacobian function j at [2, 3]
j(c(2, 3))
```
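As a quick sanity check, the values in the example above can be derived by hand. For the scalar-output function f(x) = x1^2 + x2^2:

```latex
\nabla f(x) = (2x_1,\; 2x_2), \qquad \nabla f(2, 3) = (4,\; 6)

H_f(x) = \begin{pmatrix} 2 & 0 \\ 0 & 2 \end{pmatrix} \quad \text{(constant in } x \text{)}
```

and for the vector-output function f(x) = (x1^2, x2^2), the Jacobian is diag(2x1, 2x2), i.e. diag(4, 6) at (2, 3).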
```r
## Define a target function with multiple arguments
f <- function(a = 1, b = 2, c = 3) a * b ^ 2 * c ^ 3

## Calculate gradient/derivative of f at a = 2, when b = c = 1 by
ad_grad(f, 2, b = 1, c = 1) ## deriv(f, 2, b = 1, c = 1)

## Get a gradient/derivative function g w.r.t a when b = c = 1 by
g <- makeGradFunc(f, b = 1, c = 1)

## Evaluate the gradient/derivative function g at a = 2
g(2)

## Calculate gradient/derivative of f at a = 2, b = 3, when c = 1 by
ad_grad(f, list(a = 2, b = 3), c = 1)

## Get a gradient/derivative function g w.r.t a and b when c = 1 by
g <- makeGradFunc(f, c = 1)

## Evaluate the gradient/derivative function g at a = 2, b = 3
g(list(a = 2, b = 3))
```
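The multiple-argument example can also be verified by hand. For f(a, b, c) = a * b^2 * c^3:

```latex
\frac{\partial f}{\partial a} = b^2 c^3 = 1 \quad \text{at } b = c = 1

\left( \frac{\partial f}{\partial a},\; \frac{\partial f}{\partial b} \right)
  = \left( b^2 c^3,\; 2 a b c^3 \right) = (9,\; 12)
  \quad \text{at } a = 2,\ b = 3,\ c = 1
```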
Make sure the Julia installation is correct. `autodiffr` is able to find Julia on PATH automatically, and there are three ways for `autodiffr` to find a Julia that is not on PATH:

* `ad_setup(JULIA_HOME = "the folder that contains julia binary")`;
* `options(JULIA_HOME = "the folder that contains julia binary")`;
* setting `JULIA_HOME` in the command line environment.

The GitHub Pages for this repository host the documentation for the development version of `autodiffr`: https://non-contradiction.github.io/autodiffr/. And you are more than welcome to contact me about `autodiffr` at lch34677@gmail.com or cxl508@psu.edu.
`autodiffr` is under active development now. Any suggestion or issue reporting is welcome! You may report it using the link https://github.com/Non-Contradiction/autodiffr/issues/new, or email me at lch34677@gmail.com or cxl508@psu.edu.
The project `autodiffr` was a Google Summer of Code (GSoC) 2018 project for the "R Project for statistical computing", with mentors John Nash and Hans W Borchers. Thanks a lot!