This package implements the Schumaker spline for one-dimensional interpolation. It is the first publicly available R package to provide a shape-constrained spline without requiring any optimisation. It also has significant speed advantages over other shape-constrained splines. It is intended for use in dynamic programming problems and in other settings where a simple shape-constrained spline is useful.

This document first illustrates that the base splines can deliver non-concave splines from concave data. It then describes the options the Schumaker spline provides, before setting out a consumption-smoothing problem and illustrating why the Schumaker spline performs significantly better than existing splines for this kind of problem.

Finally, the speed of this spline is compared with that of other splines.

The Schumaker spline and base splines

First we show that the base splines are not shape preserving. Here we fit and plot the monotonic base spline (you can experiment with the other methods for a similar result):

x = seq(1,10)
y = log(x)

xarray = seq(1,10,0.01)

BaseSpline = splinefun(x,y, method = "monoH.FC")
Base0 = BaseSpline(xarray)
DerivBaseSpline = splinefun(xarray, numDeriv::grad(BaseSpline, xarray))
Base1 = DerivBaseSpline(xarray)
Deriv2BaseSpline = splinefun(xarray, numDeriv::grad(DerivBaseSpline, xarray))
Base2 = Deriv2BaseSpline(xarray)

plot(xarray, Base0, type = "l", col = 4, ylim = c(-1,3), main = "Base Spline and first two derivatives",
     ylab = "Spline and derivatives", xlab = "x")
lines(xarray, Base1, col = 2)
lines(xarray, Base2, col = 3)
abline(h = 0, col = 1)
text(x=rep(8,3), y=c(2, 0.5,-0.2), pos=4, labels=c('Spline', 'First Derivative', 'Second Derivative'))

Here you can see that the first derivative is always positive - the spline is monotonic. However, the second derivative moves above and below 0, so the spline is not globally concave.

Now we can show the Schumaker spline:

library(schumaker)
SchumSpline = schumaker::Schumaker(x,y)
Schum0 = SchumSpline$Spline(xarray)
Schum1 = SchumSpline$DerivativeSpline(xarray)
Schum2 = SchumSpline$SecondDerivativeSpline(xarray)

plot(xarray, Schum0, type = "l", col = 4, ylim = c(-1,3), main = "Schumaker Spline and first two derivatives",
     ylab = "Spline and derivatives", xlab = "x")
lines(xarray, Schum1, col = 2)
lines(xarray, Schum2, col = 3)
abline(h = 0, col = 1)
text(x=rep(8,3), y=c(2, 0.5,-0.2), pos=4, labels=c('Spline', 'First Derivative', 'Second Derivative'))

Here the second derivative is always negative - the spline is globally concave as well as monotonic.

Optional Settings

There are three optional settings when creating a spline. Firstly, the gradients at each of the (x,y) points can be supplied to give more accuracy. If not supplied, these are estimated from the points.
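As a sketch of the first option, the gradients of y = log(x) are known exactly (1/x), so they can be supplied directly. The argument name gradients used here is an assumption based on the package documentation rather than something shown elsewhere in this document.

# Sketch: supplying analytic gradients (argument name "gradients" assumed).
# For y = log(x) the exact gradient at each point is 1/x.
SchumWithGradients = schumaker::Schumaker(x, y, gradients = 1/x)
SchumWithGradients$Spline(2.5)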

Secondly, if Vectorised = TRUE then arrays can be input to the spline. If arrays will never be input to the spline, you can set Vectorised = FALSE for a small speed improvement (about 30%).
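A minimal sketch of the scalar-only setting, using the Vectorise spelling that also appears in the benchmark code later in this document:

# Sketch: a scalar-only spline (no arrays allowed), as described above.
ScalarSpline = schumaker::Schumaker(x, y, Vectorise = FALSE)$Spline
ScalarSpline(2.5)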

Finally there are three options for out of sample prediction.

The three out-of-sample options are shown below. In black the curve is extended out, in green the ends are extrapolated linearly, and in red the extrapolation is constant. Note that there is no difference between the three within sample.

x = seq(1,10)
y = log(x)
xarray = seq(-5,15,0.01)

SchumSplineCurve    = Schumaker(x,y, Extrapolation = "Curve"   )$Spline
SchumSplineConstant = Schumaker(x,y, Extrapolation = "Constant")$Spline
SchumSplineLinear   = Schumaker(x,y, Extrapolation = "Linear"  )$Spline

SchumSplineCurveVals    = SchumSplineCurve(xarray)
SchumSplineConstantVals = SchumSplineConstant(xarray)
SchumSplineLinearVals   = SchumSplineLinear(xarray)

plot(xarray, SchumSplineCurveVals, type = "l", col = 1, ylim = c(-5,5),
     main = "Ways of predicting outside of sample", ylab = "Spline value", xlab = "x")
lines(xarray, SchumSplineConstantVals, col = 2)
lines(xarray, SchumSplineLinearVals, col = 3)

Example: A simple consumption smoothing problem

Consider a consumer that has a budget of $B_t$ at time $t$ and a periodic income of $1$. They have a periodic utility function given by:

$u_t = \epsilon_t x_t^{0.2}$

where $x_t$ is spending in period $t$ and $\epsilon_t$ is the shock in period $t$ drawn from some stationary nonnegative shock process with pdf $f(\epsilon)$.

The problem for the consumer in period $t$ is:

$V(B_t | \epsilon_{t}) = \max_{0 < x_t < B_t} \hspace{0.5cm} \epsilon_t x_t^{0.2} + \beta E_t[ V(B_{t+1})]$

where $\beta$ is a discount factor and $B_{t+1} = 1 + B_t - x_t$.

Algorithm

We can first note that, due to the shock process, it is not possible to obtain analytical expressions describing the paths of spending and the budget over the long term. We can, however, obtain a numerical solution. The key step is to find an expression for the expected value function as a function of $B_{t+1}$. With this we can run simulations with random shocks to see the long-term distributions of $x_t$ and $B_t$. The algorithm we will use is (a simplified R sketch follows the list):

  1. We discretize the budget statespace.
  2. We make an initial guess of the expected future value function $E[ V(B_{t+1})]$ at every value of $B_{t+1}$ in our discretized statespace. A deterministic approximation of the problem (assuming shocks will equal $E[\epsilon_{t}]$ forever) is often a good starting point.
  3. We use the Schumaker spline to join our estimates at each point in the discretized statespace into an interpolation function.
  4. At every point in the statespace we create updated estimates:
  5. We use a cubature package to estimate $\int^\infty_{0} V(B_t | \epsilon_t ) f(\epsilon_t) d\epsilon_t$.
  6. Inside this integral we use a one-dimensional optimiser together with the Bellman equation above to evaluate $V(B_t | \epsilon_t )$.
  7. We check a convergence criterion: are the new $V(B_t | \epsilon_t )$ values very close to the old values?
  8. If they are, we end the algorithm.
  9. If they are not, we go back to step 3.
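The following is a highly simplified sketch of this loop, not code from the package. It assumes a small discrete shock distribution in place of a cubature package, a discount factor of 0.95, and an arbitrary budget grid and initial guess; a proper implementation would integrate over the continuous shock density as described in step 5.

library(schumaker)

beta       = 0.95                                     # assumed discount factor
Bgrid      = seq(0.1, 5, length.out = 40)             # 1. discretized budget statespace
shocks     = c(0.8, 1.0, 1.2)                         # hypothetical discrete shock values
shockprobs = c(0.25, 0.5, 0.25)                       # and their probabilities

EV = log(1 + Bgrid)                                   # 2. initial guess of E[V(B)]

for (iter in 1:500) {
  # 3. shape-preserving interpolation of the current guess; curve extrapolation
  #    covers next-period budgets just above the grid
  EVfun = Schumaker(Bgrid, EV, Extrapolation = "Curve")$Spline
  NewEV = sapply(Bgrid, function(B) {                 # 4. update at every grid point
    ValueGivenShock = sapply(shocks, function(eps) {
      # 6. one-dimensional optimisation of the Bellman equation for this shock
      optimise(function(x) eps * x^0.2 + beta * EVfun(1 + B - x),
               interval = c(1e-10, B), maximum = TRUE)$objective
    })
    sum(shockprobs * ValueGivenShock)                 # 5. expectation over the shock
  })
  if (max(abs(NewEV - EV)) < 1e-6) break               # 7./8. convergence check
  EV = NewEV                                           # 9. otherwise iterate again
}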

This strategy relies on the consumption problem being a contraction mapping, which means that this algorithm will converge to a fixed-point function. The turboEM package (squarem method) can be useful here in accelerating the convergence.
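As a sketch of that acceleration, one sweep of step 4 can be wrapped as a fixed-point map and handed to turboEM. The turboem() call below assumes that package's fixptfn/method interface; check the turboEM documentation before relying on it.

library(turboEM)

# One sweep of the update from the sketch above, written as a fixed-point map.
UpdateEV = function(EV) {
  EVfun = Schumaker(Bgrid, EV, Extrapolation = "Curve")$Spline
  sapply(Bgrid, function(B) {
    ValueGivenShock = sapply(shocks, function(eps) {
      optimise(function(x) eps * x^0.2 + beta * EVfun(1 + B - x),
               interval = c(1e-10, B), maximum = TRUE)$objective
    })
    sum(shockprobs * ValueGivenShock)
  })
}

# Assumed interface: par is the starting guess, fixptfn the map to accelerate.
AcceleratedRun = turboem(par = log(1 + Bgrid), fixptfn = UpdateEV, method = "squarem")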

Why a Schumaker spline is necessary

There are a few reasons why we need the spline to be shape preserving and to require no optimisation for this approach to work.

Speed

The microbenchmarks below compare the Schumaker spline with the base spline and with the shape-constrained splines from the cobs and scam packages, covering spline construction, vectorised evaluation and scalar evaluation:

library(microbenchmark)
library(cobs)
library(scam)

x = seq(1,10)
y = log(x)
dat = data.frame(x = x, y = y)
xarray = seq(0,15,0.01)

microbenchmark(
  Schumaker(x,y),
  splinefun(x,y,"monoH.FC"),
  scam(y~s(x,k=4,bs="mdcx",m=1),data=dat),
  cobs(x , y, constraint = c("decrease", "convex"), print.mesg = FALSE),
  unit = "relative"
)

BaseSp =   splinefun(x,y,"monoH.FC")
SchuSp =   Schumaker(x,y)$Spline
ScamSp =   scam(y~s(x,k=4,bs="mdcx",m=1),data=dat)
CobsSp =   cobs(x , y, constraint = c("decrease", "convex"), print.mesg = FALSE)

ScamPr = function(x){  predict.scam(ScamSp,data.frame(x = x))}
CobsPr = function(x){  predict(CobsSp, x)[,2] }

microbenchmark(
  SchuSp(xarray),
  BaseSp(xarray),
  ScamPr(xarray),
  CobsPr(xarray),
  unit = "relative"
)

SchuSp =   Schumaker(x,y, Vectorise = FALSE)$Spline
microbenchmark(
  SchuSp(runif(1)),
  BaseSp(runif(1)),
  ScamPr(runif(1)),
  CobsPr(runif(1)),
  unit = "relative"
)

Reference

The key reference for the Schumaker spline is Judd's Numerical Methods in Economics (1998). This presents precisely how to create the spline, and it is straightforward to reconcile the code with the equations in this book. The book also gives more detail on solving dynamic control problems. A further reference which advocates the use of the Schumaker spline is Ljungqvist and Sargent's Recursive Macroeconomic Theory.


