
View source: R/ospProbDesign.R

osp.prob.design {mlOSP}   R Documentation

Longstaff-Schwartz RMC algorithm with a variety of regression methods.

Description

RMC using a probabilistic design: backward induction along a fixed set of paths (a la Longstaff-Schwartz). All designs are kept in memory. By default produces only an in-sample estimate; use in conjunction with forward.sim.policy to generate out-of-sample price estimates.

Usage

osp.prob.design(N, model, subset = 1:N, method = "lm")

Arguments

N

is the number of paths

model

defines the simulator and reward model. The two main model hooks are the option payoff payoff.func (plus its parameters) and the stochastic simulator sim.func (plus its parameters). The initial condition is model$x0, which can be either a vector of length model$dim or a vector of length model$dim*N.

subset

To obtain out-of-sample paths, specify subset (e.g., 1:1000) as the indices of the paths used for fitting; the remaining paths are held out for testing. By default (subset = 1:N) everything is in-sample.

method

a string specifying the regression method to use; the available choices are listed below, and a non-default call is sketched after the list

  • spline: smoothing splines via smooth.spline from base R. Only works in 1D. Requires the number of knots nk; if nk is omitted, cross-validation is used.

  • randomforest: random forests from the randomForest package. Requires the rf.maxnode and rf.ntree (number of trees) model parameters.

  • loess: local polynomial regression. Only works in 1D or 2D; requires the lo.span model parameter.

  • earth: multivariate adaptive regression splines (MARS) using the earth package. Requires the earth.deg (interaction degree), earth.nk (maximum number of terms to keep) and earth.thresh parameters.

  • rvm: relevance vector machine from the kernlab package. The optional rvm.kernel model parameter selects the kernel family; the default kernel is rbfdot.

  • npreg: kernel regression using the np package. Can optionally provide np.kertype (default "gaussian"), np.regtype (default "lc") and np.kerorder (default 2).

  • nnet: neural network using the nnet package. This is a single-hidden-layer neural net; specify a scalar nn.nodes giving the number of nodes in the hidden layer.

  • lagp: local approximate Gaussian process regression using the laGP package. Can optionally provide lagp.type (default "alcray", the fastest; other choices are "alc" and "mspe"), which determines how the local design is constructed, and lagp.end, which determines how many inputs go into that local design.

  • dynatree: dynamic trees using the dynaTree package. Requires the dt.type ("constant" or "linear" fits at the leaves), dt.Npart (number of trees), dt.minp (minimum size of each partition) and dt.ab (tree prior parameter) model parameters.

  • lm [default]: global linear regression using the basis functions in model$bases (required), plus a constant; model$bases is a function pointer.
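For instance, the smoothing-spline regressor can be applied to a 1D put driven by the same GBM simulator as in the Examples below. This is a minimal sketch: the model1d entries and the knot count nk=20 are illustrative choices, not package defaults.

library(mlOSP)  # provides sim.gbm, put.payoff and osp.prob.design
# illustrative 1D model; nk is the knot count required by method="spline"
model1d <- list(look.ahead=1, K=40, x0=40, sigma=0.2, r=0.06, div=0,
                T=1, dt=0.04, dim=1, nk=20,
                sim.func=sim.gbm, payoff.func=put.payoff)
prob.sp <- osp.prob.design(10000, model1d, method="spline")
prob.sp$p  # with the default subset = 1:N, everything is in-sample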

Details

Works with a probabilistic design, which requires storing all paths in memory. Specifying subset allows an out-of-sample estimate of the value function to be computed alongside the original backward induction.

Calls model$payoff.func, so the latter must be set prior to calling. Also needs model$dt and model$T for the simulation time grid, and model$r for discounting.

Calls model$sim.func to generate the forward paths.

The emulator is trained only on paths where the payoff is strictly positive.
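
To illustrate the payoff hook, the sketch below defines a hypothetical digital put following the (x, model) calling convention of the built-in payoffs such as put.payoff; the function name and the exact convention are assumptions, so compare with the built-in payoffs before relying on it.

# hypothetical payoff hook (assumed convention): x is a matrix of states
# with one row per path and model$dim columns; the return value is a
# length-nrow(x) vector of rewards
digital.put <- function(x, model) {
  as.numeric(x[,1] < model$K)  # pays 1 when in-the-money, else 0
}
model1d$payoff.func <- digital.put  # plug into the model list as usual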

Value

a list containing

  • fit: a list containing all the emulators generated at each time step; fit[[1]] is the emulator at t=\Delta t and the last one, fit[[M-1]], is the emulator at t=T-\Delta t.

  • val: the in-sample pathwise rewards

  • test: the out-of-sample pathwise rewards

  • p: the final price (2-vector for in/out-of-sample)

  • timeElapsed: total running time in seconds, based on Sys.time
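
Using the prob.lm object from the Examples below (T=1, dt=0.04, hence M=25 steps and M-1=24 emulators), the pieces can be inspected directly:

length(prob.lm$fit)   # 24 emulators, for t = dt, 2*dt, ..., T - dt
prob.lm$p             # c(in-sample price, out-of-sample price)
summary(prob.lm$val)  # in-sample pathwise rewards
prob.lm$timeElapsed   # wall-clock seconds, via Sys.time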

Examples

set.seed(1)
library(mlOSP)  # provides sim.gbm, put.payoff and osp.prob.design
model2d <- list(look.ahead=1, K=40, x0=rep(40,2), sigma=rep(0.2,2), r=0.06,
                div=0, T=1, dt=0.04, dim=2,
                sim.func=sim.gbm, payoff.func=put.payoff)
# five polynomial basis functions for the default lm regression
bas22 <- function(x) return(cbind(x[,1], x[,1]^2, x[,2], x[,2]^2, x[,1]*x[,2]))
model2d$bases <- bas22
# 30,000 paths, split via subset into in-sample and out-of-sample halves
prob.lm <- osp.prob.design(30000, model2d, method="lm", subset=1:15000)
prob.lm$p
# yields [1] 1.440918 1.482422
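
A genuine out-of-sample check can then be run with forward.sim.policy, as suggested in the Description. The argument order below (initial states, number of steps, list of fitted emulators, model) and the $payoff field are assumptions; consult the forward.sim.policy help page for the exact interface.

nsim <- 1000
x.init <- matrix(rep(model2d$x0, nsim), nrow=nsim, byrow=TRUE)
# T/dt = 25 steps; prob.lm$fit holds the fitted emulators (assumed call)
oos <- forward.sim.policy(x.init, model2d$T/model2d$dt, prob.lm$fit, model2d)
mean(oos$payoff)  # out-of-sample price estimate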
