vagam: Fitting generalized additive models (GAMs) using variational approximations (VA)


View source: R/vagam.R

Description

Follows the variational approximation approach of Hui et al. (2018) for fitting generalized additive models. In this package, the term GAM is taken to mean a generalized linear mixed model; specifically, the nonparametric component is modeled using P-splines, i.e., cubic B-splines with a first-order difference penalty. Because the penalty can be written as a quadratic form in the smoothing coefficients, it is treated as a (degenerate) multivariate normal random effects distribution, and a marginal log-likelihood for the resulting mixed model can be constructed.
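As a minimal illustration of this quadratic-form representation (a sketch, not the package's internal code; the number of coefficients K is chosen arbitrarily):

K <- 10                              # number of B-spline coefficients (assumed for illustration)
D <- diff(diag(K), differences = 1)  # (K-1) x K first-order difference matrix
S <- crossprod(D)                    # penalty matrix, so b' S b = sum of squared first differences
b <- rnorm(K)                        # arbitrary spline coefficients
c(quad.form = drop(t(b) %*% S %*% b),
  sum.sq.diff = sum(diff(b)^2))      # the two quantities coincide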

The VA framework is then utilized to provide a fully, or at least close to fully, tractable lower bound approximation to the marginal likelihood of a GAM. In doing so, the VA framework aims to offer both the stability and natural inference tools available in the mixed model approach to GAMs, while achieving computation times comparable to those of the penalized likelihood approach to GAMs.

Usage

vagam(y, smooth.X, para.X = NULL, lambda = NULL, int.knots, family = gaussian(), 
A.struct = c("unstructured", "block"), offset = NULL, save.data = FALSE, 
para.se = FALSE, doIC = FALSE, 
control = list(eps = 0.001, maxit = 1000, trace = TRUE, seed.number = 123, 
mc.samps = 4000, pois.step.size = 0.01))

Arguments

y

A response vector.

smooth.X

A matrix of covariates, each of which is to be entered as an additive smooth term in the GAM.

para.X

An optional matrix of covariates, each of which is to be entered as a parametric term in the GAM. Please note that NO intercept term needs to be included, as one is included by default.

lambda

An optional vector of length ncol(smooth.X), where each element corresponds to the smoothing parameter to be applied to the respective covariate in smooth.X. If supplied, then the GAM is fitted with the smoothing parameters held fixed at these values. If lambda=NULL, then the smoothing parameters for all covariates to be smoothed are updated automatically as part of the VA algorithm.
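For example (a hedged sketch, assuming y and smooth.X are already defined in the workspace), the smoothing parameters for two smooth covariates could be held fixed as:

fit_fixed <- vagam(y = y, smooth.X = smooth.X[, 1:2], lambda = c(10, 0.5),
                   int.knots = 8, family = binomial())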

int.knots

Either a single number or a vector of length ncol(smooth.X), corresponding to the number of interior knots to be used for the respective covariate in smooth.X. This argument is passed to the function smooth.construct from the mgcv package (Wood, 2017) in order to construct the P-spline bases. Equally spaced knots based on quantiles are used.
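One simple way to choose int.knots (a sketch of the rule of thumb used in Example 2 below, not a package recommendation) is:

n <- length(y)                  # assuming the response y is available
choose_k <- 5 * ceiling(n^0.2)  # e.g. 15 interior knots when n = 50
# then pass int.knots = choose_k to vagam()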

family

Currently only the gaussian(link = "identity"), poisson(link = "log"), and binomial(link = "logit") families, the latter corresponding to Bernoulli responses, are available.

A.struct

The assumed structure of the covariance matrix in the variational distribution of the smoothing coefficients. Currently, the two options are A.struct = "unstructured", corresponding to a fully unstructured covariance matrix, and A.struct = "block", which assumes a block-diagonal structure in which all covariances between different smoothing covariates are set to zero (but the covariance submatrix remains unstructured within the spline basis functions for a given smoothing covariate). The latter is sub-optimal in the sense that the most appropriate variational distribution uses a completely unstructured covariance matrix, but it MAY (although this is not guaranteed) save computation time, especially when the number of smoothing covariates and/or the number of interior knots is very large.
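For instance (a hedged sketch, assuming y and smooth.X are in scope), the block-diagonal structure can be requested via:

fit_block <- vagam(y = y, smooth.X = smooth.X, int.knots = 8,
                   family = poisson(), A.struct = "block")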

offset

This can be used to specify an a-priori known component to be included in the linear predictor during fitting. This should be NULL or a numeric vector of length equal to length(y).

save.data

If save.data=TRUE, then the returned vagam object will also include y, smooth.X, para.X, and the full matrix of P-spline basis functions.

para.se

If para.se=TRUE, then standard errors based on the VA approach are returned for any covariates in para.X that are included as parametric terms. Note that if para.se=FALSE, then a standard error for the intercept term will not be returned, even though an intercept term is included by default.

doIC

If doIC=TRUE, then the AIC and BIC are returned, where the AIC is calculated as AIC = -2\times variational log-likelihood + 2\times trace(H), with trace(H) being a measure of the degrees of freedom of the model based on the hat matrix arising from iteratively reweighted least squares, and the BIC replaces the 2 with log(length(y)) in the model complexity penalty; please see Wood (2017) for more details. Note however that this output is largely moot, as the VA approach provides an automatic method of selecting the smoothing parameters, meaning an external approach such as information criteria is not required.
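As a worked sketch of these formulas (the numbers below are placeholders for illustration, not values produced by the package):

logLik_va <- -246.85   # maximized variational log-likelihood (placeholder)
trace_H   <- 10.3      # trace of the IRLS hat matrix, i.e. model degrees of freedom (placeholder)
n         <- 534       # length(y) (placeholder)
AIC <- -2 * logLik_va + 2 * trace_H
BIC <- -2 * logLik_va + log(n) * trace_H
c(AIC = AIC, BIC = BIC)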

control

A list controlling the finer details of the VA approach for fitting GAMs (an example call overriding some of these defaults is sketched after this list). These include:

  • mc.samps: the number of Monte Carlo samples used to calculate the variational observed information matrix via Louis' method.

  • seed.number: the seed used for generating starting values for the fitting algorithm.

  • pois.step.size: the step size for the penalized iteratively reweighted least squares (P-IRLS) portion of the VA approach when family = poisson(). Smaller step sizes may be needed, as this part of the approach can be a tad unstable, especially if there is possible overdispersion.

The remaining entries eps, maxit, and trace set the convergence tolerance, the maximum number of iterations, and whether progress is printed at each iteration, respectively.
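A hedged example of supplying a modified control list (y and smooth.X are assumed to be in scope; the full list is given since it is not documented whether partial lists are merged with the defaults):

fit_ctrl <- vagam(y = y, smooth.X = smooth.X, int.knots = 8, family = poisson(),
                  control = list(eps = 1e-4, maxit = 2000, trace = FALSE,
                                 seed.number = 123, mc.samps = 4000,
                                 pois.step.size = 0.005))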

Details

Please note that the package is still in its early days, and only a very basic form of GAMs with purely additive terms and P-splines is fitted. The function borrows heavily from the excellent software available in the mgcv package (Wood, 2017), in the sense that it uses the smooth.construct function with bs = "ps" to set up the matrix of P-spline bases (so cubic B-splines with a first-order difference penalty matrix), along with imposing the necessary centering constraints. With these ingredients, it then maximizes the variational log-likelihood by iteratively updating the model and variational parameters.

The variational log-likelihood is obtained by proposing a variational distribution for the smoothing coefficients (in this case, a multivariate normal distribution with unknown mean vector and covariance matrix), and then minimizing the Kullback-Leibler divergence between this variational distribution and the true posterior distribution of the smoothing coefficients. In turn, this is designed to be a (close to) fully tractable lower bound approximation to the true marginal log-likelihood for the GAM, which for non-normal responses does not possess a tractable form. Note that in contrast to the marginal log-likelihood and many approximations to it, such as the Laplace approximation and adaptive quadrature, the variational approximation typically presents a tractable form that is relatively straightforward to maximize. At the same time, because it views the GAM as a mixed model, it also possesses nice inference tools, such as an approximate posterior distribution of the smoothing coefficients available immediately from maximizing the VA log-likelihood, and automatic choice of the smoothing parameters. We refer readers to Wood (2017) and Ruppert et al. (2003) for detailed introductions to GAMs and how many of them can be set up as mixed models; Eilers and Marx (1996) for the seminal article on P-splines; and Hui et al. (2018) for the article on which this package is based.
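In generic notation (a sketch of the standard variational lower bound, not necessarily the exact notation of Hui et al., 2018): writing a for the vector of smoothing coefficients and q(a) for the variational density (here a multivariate normal with mean \mu and covariance A), the variational log-likelihood is the lower bound

\log p(y) = \log \int p(y \mid a)\, p(a)\, da \ \ge\ E_q[\log p(y \mid a)] + E_q[\log p(a)] - E_q[\log q(a)],

and the gap between the two sides equals the Kullback-Leibler divergence between q(a) and the posterior p(a \mid y), which is why minimizing this divergence and maximizing the lower bound are equivalent.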

Value

An object of class "vagam" containing one or more of the following elements:

Author(s)

Han Lin Shang [aut, cre, cph] (<https://orcid.org/0000-0003-1769-6430>), Francis K.C. Hui [aut] (<https://orcid.org/0000-0003-0765-3533>)

References

See Also

summary.vagam for a basic summary of the fitted model; plot.vagam for basic plots of the component smooths; predict.vagam for basic prediction.

Examples

## Example 1: Application to wage data
data(wage_data)

south_code <- gender_code <- race_code <- union_code <- vector("numeric", nrow(wage_data))
union_code[wage_data$union == "member"] <- 1
south_code[wage_data$south == "yes"] <- 1
gender_code[wage_data$gender == "female"] <- 1
race_code[wage_data$race == "White"] <- 1
para.X <- data.frame(south = south_code, gender = gender_code, race = race_code)

fit_va <- vagam(y = union_code, smooth.X = wage_data[,c("education", "wage", "age")],
                       para.X = para.X,
                       int.knots = 8, save.data = TRUE, 
                       family = binomial(), 
                       para.se = TRUE)
summary(fit_va)

a <- 1
par(mfrow = c(1, 3), las = 1, cex = a, cex.lab = a+0.2, cex.main = a+0.5, mar = c(5,5,3,2))
plot(fit_va, ylim = c(-2.7, 2.7), select = 1, 
                        xlab = "Education", ylab = "Smooth of Education", lwd = 3)
plot(fit_va, ylim = c(-2.7, 2.7), select = 2, 
                        xlab = "Wage", ylab = "Smooth of Wage", main = "Plots from VA-GAM", lwd = 3)
plot(fit_va, ylim = c(-2.7, 2.7), select = 3, 
                        xlab = "Age", ylab = "Smooth of Age", lwd = 3)

                      
## Not run: 
## Example 2: Simulated data with size = 50 and compare how GAMs can be fitted 
## in VA and mgcv (which uses penalized quasi-likelihood)
choose_k <- 5 * ceiling(50^0.2)
true_beta <- c(-1, 0.5)

poisson_dat <- gamsim(n = 50, dist = "poisson", extra.X = data.frame(intercept = rep(1,50), 
                        treatment = rep(c(0,1), each = 50/2)), beta = true_beta)

## GAM using VA
fit_va <- vagam(y = poisson_dat$y, smooth.X = poisson_dat[,2:5], 
                para.X = data.frame(treatment = poisson_dat$treatment), 
                int.knots = choose_k, save.data = TRUE, family = poisson(), 
                para.se = TRUE)
summary(fit_va)
                       
## GAM using mgcv with default options
fit_mgcv1 <- gam(y ~ treatment + s(x0) + s(x1) + s(x2) + s(x3), 
                             data = poisson_dat, family = poisson())


## GAM using mgcv with P-splines and preset knots; 
## this is equivalent to VA in terms of the splines bases functions 
fit_mgcv2 <- gam(y ~ treatment + s(x0, bs = "ps", k = round(choose_k/2) + 2, m  = c(2,1)) + 
                             s(x1, bs = "ps", k = round(choose_k/2) + 2, m  = c(2,1)) + 
                             s(x2, bs = "ps", k = round(choose_k/2) + 2, m  = c(2,1)) + 
                             s(x3, bs = "ps", k = round(choose_k/2) + 2, m  = c(2,1)), 
                             data = poisson_dat, family = poisson())                     

## End(Not run)

Example output

Loading required package: mgcv
Loading required package: nlme
This is mgcv 1.8-33. For overview type 'help("mgcv-package")'.
Loading required package: gamm4
Loading required package: Matrix
Loading required package: lme4

Attaching package: ‘lme4’

The following object is masked from ‘package:nlme’:

    lmList

This is gamm4 0.2-6

Loading required package: mvtnorm
Loading required package: truncnorm
Lambda updated as part of VA estimation. Yeah baby!
Iteration: 1 	 Current VA logL: -Inf  | New VA logL: -251.9948  | Difference: Inf 
Iteration: 2 	 Current VA logL: -251.9948  | New VA logL: -249.6027  | Difference: 2.392096 
Iteration: 3 	 Current VA logL: -249.6027  | New VA logL: -248.813  | Difference: 0.789726 
Iteration: 4 	 Current VA logL: -248.813  | New VA logL: -248.2817  | Difference: 0.5312606 
Iteration: 5 	 Current VA logL: -248.2817  | New VA logL: -247.9144  | Difference: 0.3673779 
Iteration: 6 	 Current VA logL: -247.9144  | New VA logL: -247.6557  | Difference: 0.2587032 
Iteration: 7 	 Current VA logL: -247.6557  | New VA logL: -247.4697  | Difference: 0.1859226 
Iteration: 8 	 Current VA logL: -247.4697  | New VA logL: -247.3333  | Difference: 0.1364035 
Iteration: 9 	 Current VA logL: -247.3333  | New VA logL: -247.2313  | Difference: 0.1020059 
Iteration: 10 	 Current VA logL: -247.2313  | New VA logL: -247.1537  | Difference: 0.07758634 
Iteration: 11 	 Current VA logL: -247.1537  | New VA logL: -247.0939  | Difference: 0.05988457 
Iteration: 12 	 Current VA logL: -247.0939  | New VA logL: -247.047  | Difference: 0.04680679 
Iteration: 13 	 Current VA logL: -247.047  | New VA logL: -247.0101  | Difference: 0.03698225 
Iteration: 14 	 Current VA logL: -247.0101  | New VA logL: -246.9806  | Difference: 0.02949385 
Iteration: 15 	 Current VA logL: -246.9806  | New VA logL: -246.9569  | Difference: 0.02371432 
Iteration: 16 	 Current VA logL: -246.9569  | New VA logL: -246.9377  | Difference: 0.01920535 
Iteration: 17 	 Current VA logL: -246.9377  | New VA logL: -246.922  | Difference: 0.01565459 
Iteration: 18 	 Current VA logL: -246.922  | New VA logL: -246.9092  | Difference: 0.01283546 
Iteration: 19 	 Current VA logL: -246.9092  | New VA logL: -246.8986  | Difference: 0.01058095 
Iteration: 20 	 Current VA logL: -246.8986  | New VA logL: -246.8898  | Difference: 0.008766283 
Iteration: 21 	 Current VA logL: -246.8898  | New VA logL: -246.8825  | Difference: 0.007297091 
Iteration: 22 	 Current VA logL: -246.8825  | New VA logL: -246.8764  | Difference: 0.006101228 
Iteration: 23 	 Current VA logL: -246.8764  | New VA logL: -246.8713  | Difference: 0.005123037 
Iteration: 24 	 Current VA logL: -246.8713  | New VA logL: -246.867  | Difference: 0.00431922 
Iteration: 25 	 Current VA logL: -246.867  | New VA logL: -246.8633  | Difference: 0.003655846 
Iteration: 26 	 Current VA logL: -246.8633  | New VA logL: -246.8602  | Difference: 0.003106149 
Iteration: 27 	 Current VA logL: -246.8602  | New VA logL: -246.8576  | Difference: 0.002648886 
Iteration: 28 	 Current VA logL: -246.8576  | New VA logL: -246.8553  | Difference: 0.002267108 
Iteration: 29 	 Current VA logL: -246.8553  | New VA logL: -246.8533  | Difference: 0.001947219 
Iteration: 30 	 Current VA logL: -246.8533  | New VA logL: -246.8517  | Difference: 0.001678265 
Iteration: 31 	 Current VA logL: -246.8517  | New VA logL: -246.8502  | Difference: 0.00145138 
Iteration: 32 	 Current VA logL: -246.8502  | New VA logL: -246.849  | Difference: 0.001259362 
Iteration: 33 	 Current VA logL: -246.849  | New VA logL: -246.8479  | Difference: 0.001096333 
Iteration: 34 	 Current VA logL: -246.8479  | New VA logL: -246.8469  | Difference: 0.0009574839 
Calculating information matrix for model parameters...
Variational approximation for GAMs

Call: vagam(y = union_code, smooth.X = wage_data[, c("education", "wage", "age")], para.X = para.X, int.knots = 8, family = binomial(), save.data = TRUE, para.se = TRUE) 

Estimated regression coefficients for parametric component: -0.715371 -0.4979345 -0.7094556 -0.7259837
Estimated smoothing coefficients for nonparametric component: 0.2024977 0.2371355 0.3083756 0.3620955 0.3070319 -0.06640864 -0.271355 -0.3181048 -0.292883 -0.274131 -1.269723 1.122673 2.295864 0.7431544 0.3187937 0.2337385 -0.08007407 -0.3819107 -0.6290628 -0.6814481 -0.1070558 -0.05720291 -0.01358479 0.03131925 0.08092134 0.142305 0.1878005 0.2265736 0.2248735 0.2117996
Estimated smoothing parameters (or fixed if lambda was supplied): 10.0231 0.5683342 46.01943
Number of interior knots used: 8 8 8
Maximized value of the variational log-likelihood: -246.8469

Summary statistics for nonparametric component:
               education     wage    age
Wald Statistic   5.24257 46.14354 1.9520
p-value          0.87440  0.00000 0.9967

Summary statistics for parametric component (if para.se = TRUE):
               Intercept    south   gender     race
Estimate        -0.71537 -0.49793 -0.70946 -0.72598
Std. Error       0.29415  0.29397  0.26386  0.29681
Wald Statistic  -2.43202 -1.69383 -2.68874 -2.44594
p-value          0.01501  0.09030  0.00717  0.01445

$call
vagam(y = union_code, smooth.X = wage_data[, c("education", "wage", 
    "age")], para.X = para.X, int.knots = 8, family = binomial(), 
    save.data = TRUE, para.se = TRUE)

$para.coeff
 Intercept      south     gender       race 
-0.7153710 -0.4979345 -0.7094556 -0.7259837 

$smooth.coeff
 [1]  0.20249774  0.23713555  0.30837560  0.36209547  0.30703185 -0.06640864
 [7] -0.27135505 -0.31810480 -0.29288301 -0.27413098 -1.26972325  1.12267312
[13]  2.29586353  0.74315437  0.31879374  0.23373847 -0.08007407 -0.38191068
[19] -0.62906283 -0.68144811 -0.10705576 -0.05720291 -0.01358479  0.03131925
[25]  0.08092134  0.14230505  0.18780048  0.22657360  0.22487350  0.21179957

$smooth.param
[1] 10.0231020  0.5683342 46.0194276

$phi
[1] 1

$logLik
[1] -246.8469

$family

Family: binomial 
Link function: logit 


$smooth.stat
               education     wage    age
Wald Statistic   5.24257 46.14354 1.9520
p-value          0.87440  0.00000 0.9967

$para.stat
               Intercept    south   gender     race
Estimate        -0.71537 -0.49793 -0.70946 -0.72598
Std. Error       0.29415  0.29397  0.26386  0.29681
Wald Statistic  -2.43202 -1.69383 -2.68874 -2.44594
p-value          0.01501  0.09030  0.00717  0.01445

attr(,"class")
[1] "summary.vagam"
Lambda updated as part of VA estimation. Yeah baby!
Iteration: 1 	 Current VA logL: -Inf  | New VA logL: -427.9932  | Difference: Inf 
Iteration: 2 	 Current VA logL: -427.9932  | New VA logL: -330.7948  | Difference: 97.19841 
Iteration: 3 	 Current VA logL: -330.7948  | New VA logL: -283.5286  | Difference: 47.26625 
Iteration: 4 	 Current VA logL: -283.5286  | New VA logL: -253.0125  | Difference: 30.51608 
Iteration: 5 	 Current VA logL: -253.0125  | New VA logL: -231.6056  | Difference: 21.40688 
Iteration: 6 	 Current VA logL: -231.6056  | New VA logL: -215.7583  | Difference: 15.84734 
Iteration: 7 	 Current VA logL: -215.7583  | New VA logL: -203.6006  | Difference: 12.15766 
Iteration: 8 	 Current VA logL: -203.6006  | New VA logL: -194.0998  | Difference: 9.50088 
Iteration: 9 	 Current VA logL: -194.0998  | New VA logL: -186.4754  | Difference: 7.62439 
Iteration: 10 	 Current VA logL: -186.4754  | New VA logL: -180.2011  | Difference: 6.274265 
Iteration: 11 	 Current VA logL: -180.2011  | New VA logL: -175.0458  | Difference: 5.155307 
Iteration: 12 	 Current VA logL: -175.0458  | New VA logL: -170.6973  | Difference: 4.348472 
Iteration: 13 	 Current VA logL: -170.6973  | New VA logL: -167.0002  | Difference: 3.697071 
Iteration: 14 	 Current VA logL: -167.0002  | New VA logL: -163.8246  | Difference: 3.175687 
Iteration: 15 	 Current VA logL: -163.8246  | New VA logL: -161.0074  | Difference: 2.817127 
Iteration: 16 	 Current VA logL: -161.0074  | New VA logL: -158.5621  | Difference: 2.445371 
Iteration: 17 	 Current VA logL: -158.5621  | New VA logL: -156.3654  | Difference: 2.196669 
Iteration: 18 	 Current VA logL: -156.3654  | New VA logL: -154.4471  | Difference: 1.918283 
Iteration: 19 	 Current VA logL: -154.4471  | New VA logL: -152.6919  | Difference: 1.755182 
Iteration: 20 	 Current VA logL: -152.6919  | New VA logL: -151.0954  | Difference: 1.596569 
Iteration: 21 	 Current VA logL: -151.0954  | New VA logL: -149.6949  | Difference: 1.400426 
Iteration: 22 	 Current VA logL: -149.6949  | New VA logL: -148.3965  | Difference: 1.298447 
Iteration: 23 	 Current VA logL: -148.3965  | New VA logL: -147.1911  | Difference: 1.205428 
Iteration: 24 	 Current VA logL: -147.1911  | New VA logL: -146.0708  | Difference: 1.120304 
Iteration: 25 	 Current VA logL: -146.0708  | New VA logL: -145.0389  | Difference: 1.031826 
Iteration: 26 	 Current VA logL: -145.0389  | New VA logL: -144.1319  | Difference: 0.9070442 
Iteration: 27 	 Current VA logL: -144.1319  | New VA logL: -143.2792  | Difference: 0.8526548 
Iteration: 28 	 Current VA logL: -143.2792  | New VA logL: -142.4771  | Difference: 0.802131 
Iteration: 29 	 Current VA logL: -142.4771  | New VA logL: -141.7221  | Difference: 0.7550214 
Iteration: 30 	 Current VA logL: -141.7221  | New VA logL: -141.011  | Difference: 0.7110918 
Iteration: 31 	 Current VA logL: -141.011  | New VA logL: -140.3409  | Difference: 0.6701192 
Iteration: 32 	 Current VA logL: -140.3409  | New VA logL: -139.709  | Difference: 0.6318943 
Iteration: 33 	 Current VA logL: -139.709  | New VA logL: -139.1127  | Difference: 0.5962212 
Iteration: 34 	 Current VA logL: -139.1127  | New VA logL: -138.5563  | Difference: 0.5564972 
Iteration: 35 	 Current VA logL: -138.5563  | New VA logL: -138.0674  | Difference: 0.4888532 
Iteration: 36 	 Current VA logL: -138.0674  | New VA logL: -137.6014  | Difference: 0.4659985 
Iteration: 37 	 Current VA logL: -137.6014  | New VA logL: -137.157  | Difference: 0.4444216 
Iteration: 38 	 Current VA logL: -137.157  | New VA logL: -136.733  | Difference: 0.4239319 
Iteration: 39 	 Current VA logL: -136.733  | New VA logL: -136.3286  | Difference: 0.4044801 
Iteration: 40 	 Current VA logL: -136.3286  | New VA logL: -135.9426  | Difference: 0.3860163 
Iteration: 41 	 Current VA logL: -135.9426  | New VA logL: -135.5741  | Difference: 0.3684918 
Iteration: 42 	 Current VA logL: -135.5741  | New VA logL: -135.2222  | Difference: 0.3518596 
Iteration: 43 	 Current VA logL: -135.2222  | New VA logL: -134.8861  | Difference: 0.3360743 
Iteration: 44 	 Current VA logL: -134.8861  | New VA logL: -134.565  | Difference: 0.3210922 
Iteration: 45 	 Current VA logL: -134.565  | New VA logL: -134.2582  | Difference: 0.3068714 
Iteration: 46 	 Current VA logL: -134.2582  | New VA logL: -133.9648  | Difference: 0.293372 
Iteration: 47 	 Current VA logL: -133.9648  | New VA logL: -133.6842  | Difference: 0.2805557 
Iteration: 48 	 Current VA logL: -133.6842  | New VA logL: -133.4158  | Difference: 0.2683862 
Iteration: 49 	 Current VA logL: -133.4158  | New VA logL: -133.159  | Difference: 0.256829 
Iteration: 50 	 Current VA logL: -133.159  | New VA logL: -132.9132  | Difference: 0.2458511 
Iteration: 51 	 Current VA logL: -132.9132  | New VA logL: -132.6777  | Difference: 0.2354214 
Iteration: 52 	 Current VA logL: -132.6777  | New VA logL: -132.4543  | Difference: 0.2234448 
Iteration: 53 	 Current VA logL: -132.4543  | New VA logL: -132.2578  | Difference: 0.196546 
Iteration: 54 	 Current VA logL: -132.2578  | New VA logL: -132.0678  | Difference: 0.1899236 
Iteration: 55 	 Current VA logL: -132.0678  | New VA logL: -131.8842  | Difference: 0.1835867 
Iteration: 56 	 Current VA logL: -131.8842  | New VA logL: -131.7068  | Difference: 0.1774629 
Iteration: 57 	 Current VA logL: -131.7068  | New VA logL: -131.5352  | Difference: 0.1715473 
Iteration: 58 	 Current VA logL: -131.5352  | New VA logL: -131.3694  | Difference: 0.1658343 
Iteration: 59 	 Current VA logL: -131.3694  | New VA logL: -131.2091  | Difference: 0.1603182 
Iteration: 60 	 Current VA logL: -131.2091  | New VA logL: -131.0541  | Difference: 0.1549932 
Iteration: 61 	 Current VA logL: -131.0541  | New VA logL: -130.9042  | Difference: 0.1498535 
Iteration: 62 	 Current VA logL: -130.9042  | New VA logL: -130.7593  | Difference: 0.1448935 
Iteration: 63 	 Current VA logL: -130.7593  | New VA logL: -130.6192  | Difference: 0.1401075 
Iteration: 64 	 Current VA logL: -130.6192  | New VA logL: -130.4837  | Difference: 0.1354898 
Iteration: 65 	 Current VA logL: -130.4837  | New VA logL: -130.3527  | Difference: 0.131035 
Iteration: 66 	 Current VA logL: -130.3527  | New VA logL: -130.226  | Difference: 0.1267376 
Iteration: 67 	 Current VA logL: -130.226  | New VA logL: -130.1034  | Difference: 0.1225923 
Iteration: 68 	 Current VA logL: -130.1034  | New VA logL: -129.9848  | Difference: 0.1185939 
Iteration: 69 	 Current VA logL: -129.9848  | New VA logL: -129.87  | Difference: 0.1147374 
Iteration: 70 	 Current VA logL: -129.87  | New VA logL: -129.759  | Difference: 0.1110177 
Iteration: 71 	 Current VA logL: -129.759  | New VA logL: -129.6516  | Difference: 0.1074301 
Iteration: 72 	 Current VA logL: -129.6516  | New VA logL: -129.5476  | Difference: 0.1039699 
Iteration: 73 	 Current VA logL: -129.5476  | New VA logL: -129.447  | Difference: 0.1006325 
Iteration: 74 	 Current VA logL: -129.447  | New VA logL: -129.3496  | Difference: 0.09741343 
Iteration: 75 	 Current VA logL: -129.3496  | New VA logL: -129.2553  | Difference: 0.0943085 
Iteration: 76 	 Current VA logL: -129.2553  | New VA logL: -129.164  | Difference: 0.09131353 
Iteration: 77 	 Current VA logL: -129.164  | New VA logL: -129.0755  | Difference: 0.08842448 
Iteration: 78 	 Current VA logL: -129.0755  | New VA logL: -128.9899  | Difference: 0.08563749 
Iteration: 79 	 Current VA logL: -128.9899  | New VA logL: -128.907  | Difference: 0.08294879 
Iteration: 80 	 Current VA logL: -128.907  | New VA logL: -128.8266  | Difference: 0.08035476 
Iteration: 81 	 Current VA logL: -128.8266  | New VA logL: -128.7487  | Difference: 0.0778519 
Iteration: 82 	 Current VA logL: -128.7487  | New VA logL: -128.6733  | Difference: 0.07543681 
Iteration: 83 	 Current VA logL: -128.6733  | New VA logL: -128.6002  | Difference: 0.07310626 
Iteration: 84 	 Current VA logL: -128.6002  | New VA logL: -128.5293  | Difference: 0.07085708 
Iteration: 85 	 Current VA logL: -128.5293  | New VA logL: -128.4607  | Difference: 0.06868625 
Iteration: 86 	 Current VA logL: -128.4607  | New VA logL: -128.3941  | Difference: 0.06659085 
Iteration: 87 	 Current VA logL: -128.3941  | New VA logL: -128.3295  | Difference: 0.06456807 
Iteration: 88 	 Current VA logL: -128.3295  | New VA logL: -128.2669  | Difference: 0.06261519 
Iteration: 89 	 Current VA logL: -128.2669  | New VA logL: -128.2062  | Difference: 0.06072961 
Iteration: 90 	 Current VA logL: -128.2062  | New VA logL: -128.1472  | Difference: 0.05890881 
Iteration: 91 	 Current VA logL: -128.1472  | New VA logL: -128.0901  | Difference: 0.05715038 
Iteration: 92 	 Current VA logL: -128.0901  | New VA logL: -128.0346  | Difference: 0.05545923 
Iteration: 93 	 Current VA logL: -128.0346  | New VA logL: -127.9808  | Difference: 0.05383573 
Iteration: 94 	 Current VA logL: -127.9808  | New VA logL: -127.9285  | Difference: 0.05227656 
Iteration: 95 	 Current VA logL: -127.9285  | New VA logL: -127.8777  | Difference: 0.05077858 
Iteration: 96 	 Current VA logL: -127.8777  | New VA logL: -127.8284  | Difference: 0.04933883 
Iteration: 97 	 Current VA logL: -127.8284  | New VA logL: -127.7805  | Difference: 0.04795453 
Iteration: 98 	 Current VA logL: -127.7805  | New VA logL: -127.7338  | Difference: 0.04662304 
Iteration: 99 	 Current VA logL: -127.7338  | New VA logL: -127.6885  | Difference: 0.04534191 
Iteration: 100 	 Current VA logL: -127.6885  | New VA logL: -127.6444  | Difference: 0.04410878 
Iteration: 101 	 Current VA logL: -127.6444  | New VA logL: -127.6015  | Difference: 0.04292144 
Iteration: 102 	 Current VA logL: -127.6015  | New VA logL: -127.5597  | Difference: 0.04177781 
Iteration: 103 	 Current VA logL: -127.5597  | New VA logL: -127.519  | Difference: 0.04067592 
Iteration: 104 	 Current VA logL: -127.519  | New VA logL: -127.4794  | Difference: 0.03961388 
Iteration: 105 	 Current VA logL: -127.4794  | New VA logL: -127.4408  | Difference: 0.03858994 
Iteration: 106 	 Current VA logL: -127.4408  | New VA logL: -127.4032  | Difference: 0.03760241 
Iteration: 107 	 Current VA logL: -127.4032  | New VA logL: -127.3665  | Difference: 0.0366497 
Iteration: 108 	 Current VA logL: -127.3665  | New VA logL: -127.3308  | Difference: 0.0357303 
Iteration: 109 	 Current VA logL: -127.3308  | New VA logL: -127.296  | Difference: 0.03484278 
Iteration: 110 	 Current VA logL: -127.296  | New VA logL: -127.262  | Difference: 0.03398578 
Iteration: 111 	 Current VA logL: -127.262  | New VA logL: -127.2288  | Difference: 0.033158 
Iteration: 112 	 Current VA logL: -127.2288  | New VA logL: -127.1965  | Difference: 0.03235822 
Iteration: 113 	 Current VA logL: -127.1965  | New VA logL: -127.1649  | Difference: 0.03158526 
Iteration: 114 	 Current VA logL: -127.1649  | New VA logL: -127.134  | Difference: 0.03083801 
Iteration: 115 	 Current VA logL: -127.134  | New VA logL: -127.1039  | Difference: 0.03011541 
Iteration: 116 	 Current VA logL: -127.1039  | New VA logL: -127.0745  | Difference: 0.02941644 
Iteration: 117 	 Current VA logL: -127.0745  | New VA logL: -127.0458  | Difference: 0.02874015 
Iteration: 118 	 Current VA logL: -127.0458  | New VA logL: -127.0177  | Difference: 0.02808562 
Iteration: 119 	 Current VA logL: -127.0177  | New VA logL: -126.9902  | Difference: 0.02745196 
Iteration: 120 	 Current VA logL: -126.9902  | New VA logL: -126.9634  | Difference: 0.02683833 
Iteration: 121 	 Current VA logL: -126.9634  | New VA logL: -126.9372  | Difference: 0.02624395 
Iteration: 122 	 Current VA logL: -126.9372  | New VA logL: -126.9115  | Difference: 0.02566803 
Iteration: 123 	 Current VA logL: -126.9115  | New VA logL: -126.8864  | Difference: 0.02510986 
Iteration: 124 	 Current VA logL: -126.8864  | New VA logL: -126.8618  | Difference: 0.02456873 
Iteration: 125 	 Current VA logL: -126.8618  | New VA logL: -126.8378  | Difference: 0.02404396 
Iteration: 126 	 Current VA logL: -126.8378  | New VA logL: -126.8142  | Difference: 0.02353493 
Iteration: 127 	 Current VA logL: -126.8142  | New VA logL: -126.7912  | Difference: 0.02304101 
Iteration: 128 	 Current VA logL: -126.7912  | New VA logL: -126.7686  | Difference: 0.02256161 
Iteration: 129 	 Current VA logL: -126.7686  | New VA logL: -126.7465  | Difference: 0.02209617 
Iteration: 130 	 Current VA logL: -126.7465  | New VA logL: -126.7249  | Difference: 0.02164416 
Iteration: 131 	 Current VA logL: -126.7249  | New VA logL: -126.7037  | Difference: 0.02120505 
Iteration: 132 	 Current VA logL: -126.7037  | New VA logL: -126.6829  | Difference: 0.02077834 
Iteration: 133 	 Current VA logL: -126.6829  | New VA logL: -126.6625  | Difference: 0.02036355 
Iteration: 134 	 Current VA logL: -126.6625  | New VA logL: -126.6426  | Difference: 0.01996023 
Iteration: 135 	 Current VA logL: -126.6426  | New VA logL: -126.623  | Difference: 0.01956794 
Iteration: 136 	 Current VA logL: -126.623  | New VA logL: -126.6038  | Difference: 0.01918626 
Iteration: 137 	 Current VA logL: -126.6038  | New VA logL: -126.585  | Difference: 0.01881477 
Iteration: 138 	 Current VA logL: -126.585  | New VA logL: -126.5666  | Difference: 0.01845309 
Iteration: 139 	 Current VA logL: -126.5666  | New VA logL: -126.5485  | Difference: 0.01810085 
Iteration: 140 	 Current VA logL: -126.5485  | New VA logL: -126.5307  | Difference: 0.01775768 
Iteration: 141 	 Current VA logL: -126.5307  | New VA logL: -126.5133  | Difference: 0.01742324 
Iteration: 142 	 Current VA logL: -126.5133  | New VA logL: -126.4962  | Difference: 0.01709721 
Iteration: 143 	 Current VA logL: -126.4962  | New VA logL: -126.4794  | Difference: 0.01677926 
Iteration: 144 	 Current VA logL: -126.4794  | New VA logL: -126.4629  | Difference: 0.01646908 
Iteration: 145 	 Current VA logL: -126.4629  | New VA logL: -126.4468  | Difference: 0.0161664 
Iteration: 146 	 Current VA logL: -126.4468  | New VA logL: -126.4309  | Difference: 0.01587091 
Iteration: 147 	 Current VA logL: -126.4309  | New VA logL: -126.4153  | Difference: 0.01558237 
Iteration: 148 	 Current VA logL: -126.4153  | New VA logL: -126.4  | Difference: 0.01530051 
Iteration: 149 	 Current VA logL: -126.4  | New VA logL: -126.385  | Difference: 0.01502508 
Iteration: 150 	 Current VA logL: -126.385  | New VA logL: -126.3702  | Difference: 0.01475584 
Iteration: 151 	 Current VA logL: -126.3702  | New VA logL: -126.3557  | Difference: 0.01449258 
Iteration: 152 	 Current VA logL: -126.3557  | New VA logL: -126.3415  | Difference: 0.01423508 
Iteration: 153 	 Current VA logL: -126.3415  | New VA logL: -126.3275  | Difference: 0.01398313 
Iteration: 154 	 Current VA logL: -126.3275  | New VA logL: -126.3138  | Difference: 0.01373654 
Iteration: 155 	 Current VA logL: -126.3138  | New VA logL: -126.3003  | Difference: 0.01349511 
Iteration: 156 	 Current VA logL: -126.3003  | New VA logL: -126.287  | Difference: 0.01325867 
Iteration: 157 	 Current VA logL: -126.287  | New VA logL: -126.274  | Difference: 0.01302706 
Iteration: 158 	 Current VA logL: -126.274  | New VA logL: -126.2612  | Difference: 0.0128001 
Iteration: 159 	 Current VA logL: -126.2612  | New VA logL: -126.2486  | Difference: 0.01257765 
Iteration: 160 	 Current VA logL: -126.2486  | New VA logL: -126.2363  | Difference: 0.01235955 
Iteration: 161 	 Current VA logL: -126.2363  | New VA logL: -126.2241  | Difference: 0.01214568 
Iteration: 162 	 Current VA logL: -126.2241  | New VA logL: -126.2122  | Difference: 0.0119359 
Iteration: 163 	 Current VA logL: -126.2122  | New VA logL: -126.2005  | Difference: 0.01173009 
Iteration: 164 	 Current VA logL: -126.2005  | New VA logL: -126.1889  | Difference: 0.01152813 
Iteration: 165 	 Current VA logL: -126.1889  | New VA logL: -126.1776  | Difference: 0.0113299 
Iteration: 166 	 Current VA logL: -126.1776  | New VA logL: -126.1665  | Difference: 0.01113532 
Iteration: 167 	 Current VA logL: -126.1665  | New VA logL: -126.1555  | Difference: 0.01094427 
Iteration: 168 	 Current VA logL: -126.1555  | New VA logL: -126.1448  | Difference: 0.01075667 
Iteration: 169 	 Current VA logL: -126.1448  | New VA logL: -126.1342  | Difference: 0.01057242 
Iteration: 170 	 Current VA logL: -126.1342  | New VA logL: -126.1238  | Difference: 0.01039145 
Iteration: 171 	 Current VA logL: -126.1238  | New VA logL: -126.1136  | Difference: 0.01021368 
Iteration: 172 	 Current VA logL: -126.1136  | New VA logL: -126.1035  | Difference: 0.01003903 
Iteration: 173 	 Current VA logL: -126.1035  | New VA logL: -126.0937  | Difference: 0.009867437 
Iteration: 174 	 Current VA logL: -126.0937  | New VA logL: -126.084  | Difference: 0.009698838 
Iteration: 175 	 Current VA logL: -126.084  | New VA logL: -126.0744  | Difference: 0.00953317 
Iteration: 176 	 Current VA logL: -126.0744  | New VA logL: -126.0651  | Difference: 0.009370377 
Iteration: 177 	 Current VA logL: -126.0651  | New VA logL: -126.0559  | Difference: 0.009210403 
Iteration: 178 	 Current VA logL: -126.0559  | New VA logL: -126.0468  | Difference: 0.009053199 
Iteration: 179 	 Current VA logL: -126.0468  | New VA logL: -126.0379  | Difference: 0.008898715 
Iteration: 180 	 Current VA logL: -126.0379  | New VA logL: -126.0292  | Difference: 0.008746906 
Iteration: 181 	 Current VA logL: -126.0292  | New VA logL: -126.0206  | Difference: 0.008597728 
Iteration: 182 	 Current VA logL: -126.0206  | New VA logL: -126.0121  | Difference: 0.008451139 
Iteration: 183 	 Current VA logL: -126.0121  | New VA logL: -126.0038  | Difference: 0.008307099 
Iteration: 184 	 Current VA logL: -126.0038  | New VA logL: -125.9956  | Difference: 0.008165571 
Iteration: 185 	 Current VA logL: -125.9956  | New VA logL: -125.9876  | Difference: 0.008026517 
Iteration: 186 	 Current VA logL: -125.9876  | New VA logL: -125.9797  | Difference: 0.007889902 
Iteration: 187 	 Current VA logL: -125.9797  | New VA logL: -125.972  | Difference: 0.007755691 
Iteration: 188 	 Current VA logL: -125.972  | New VA logL: -125.9643  | Difference: 0.007623852 
Iteration: 189 	 Current VA logL: -125.9643  | New VA logL: -125.9569  | Difference: 0.00749435 
Iteration: 190 	 Current VA logL: -125.9569  | New VA logL: -125.9495  | Difference: 0.007367154 
Iteration: 191 	 Current VA logL: -125.9495  | New VA logL: -125.9422  | Difference: 0.007242232 
Iteration: 192 	 Current VA logL: -125.9422  | New VA logL: -125.9351  | Difference: 0.007119553 
Iteration: 193 	 Current VA logL: -125.9351  | New VA logL: -125.9281  | Difference: 0.006999086 
Iteration: 194 	 Current VA logL: -125.9281  | New VA logL: -125.9212  | Difference: 0.0068808 
Iteration: 195 	 Current VA logL: -125.9212  | New VA logL: -125.9145  | Difference: 0.006764666 
Iteration: 196 	 Current VA logL: -125.9145  | New VA logL: -125.9078  | Difference: 0.006650652 
Iteration: 197 	 Current VA logL: -125.9078  | New VA logL: -125.9013  | Difference: 0.006538728 
Iteration: 198 	 Current VA logL: -125.9013  | New VA logL: -125.8949  | Difference: 0.006428864 
Iteration: 199 	 Current VA logL: -125.8949  | New VA logL: -125.8885  | Difference: 0.006321029 
Iteration: 200 	 Current VA logL: -125.8885  | New VA logL: -125.8823  | Difference: 0.006215194 
Iteration: 201 	 Current VA logL: -125.8823  | New VA logL: -125.8762  | Difference: 0.006111327 
Iteration: 202 	 Current VA logL: -125.8762  | New VA logL: -125.8702  | Difference: 0.006009399 
Iteration: 203 	 Current VA logL: -125.8702  | New VA logL: -125.8643  | Difference: 0.005909378 
Iteration: 204 	 Current VA logL: -125.8643  | New VA logL: -125.8585  | Difference: 0.005811235 
Iteration: 205 	 Current VA logL: -125.8585  | New VA logL: -125.8528  | Difference: 0.005714938 
Iteration: 206 	 Current VA logL: -125.8528  | New VA logL: -125.8471  | Difference: 0.005620457 
Iteration: 207 	 Current VA logL: -125.8471  | New VA logL: -125.8416  | Difference: 0.005527761 
Iteration: 208 	 Current VA logL: -125.8416  | New VA logL: -125.8362  | Difference: 0.005436819 
Iteration: 209 	 Current VA logL: -125.8362  | New VA logL: -125.8308  | Difference: 0.0053476 
Iteration: 210 	 Current VA logL: -125.8308  | New VA logL: -125.8256  | Difference: 0.005260075 
Iteration: 211 	 Current VA logL: -125.8256  | New VA logL: -125.8204  | Difference: 0.005174211 
Iteration: 212 	 Current VA logL: -125.8204  | New VA logL: -125.8153  | Difference: 0.005089979 
Iteration: 213 	 Current VA logL: -125.8153  | New VA logL: -125.8103  | Difference: 0.005007349 
Iteration: 214 	 Current VA logL: -125.8103  | New VA logL: -125.8054  | Difference: 0.004926289 
Iteration: 215 	 Current VA logL: -125.8054  | New VA logL: -125.8005  | Difference: 0.004846771 
Iteration: 216 	 Current VA logL: -125.8005  | New VA logL: -125.7958  | Difference: 0.004768763 
Iteration: 217 	 Current VA logL: -125.7958  | New VA logL: -125.7911  | Difference: 0.004692237 
Iteration: 218 	 Current VA logL: -125.7911  | New VA logL: -125.7865  | Difference: 0.004617162 
Iteration: 219 	 Current VA logL: -125.7865  | New VA logL: -125.7819  | Difference: 0.004543511 
Iteration: 220 	 Current VA logL: -125.7819  | New VA logL: -125.7774  | Difference: 0.004471254 
Iteration: 221 	 Current VA logL: -125.7774  | New VA logL: -125.773  | Difference: 0.004400362 
Iteration: 222 	 Current VA logL: -125.773  | New VA logL: -125.7687  | Difference: 0.004330808 
Iteration: 223 	 Current VA logL: -125.7687  | New VA logL: -125.7644  | Difference: 0.004262563 
Iteration: 224 	 Current VA logL: -125.7644  | New VA logL: -125.7602  | Difference: 0.004195601 
Iteration: 225 	 Current VA logL: -125.7602  | New VA logL: -125.7561  | Difference: 0.004129895 
Iteration: 226 	 Current VA logL: -125.7561  | New VA logL: -125.7521  | Difference: 0.004065417 
Iteration: 227 	 Current VA logL: -125.7521  | New VA logL: -125.7481  | Difference: 0.004002148 
Iteration: 228 	 Current VA logL: -125.7481  | New VA logL: -125.7441  | Difference: 0.003943301 
Iteration: 229 	 Current VA logL: -125.7441  | New VA logL: -125.7402  | Difference: 0.003882249 
Iteration: 230 	 Current VA logL: -125.7402  | New VA logL: -125.7364  | Difference: 0.003822315 
Iteration: 231 	 Current VA logL: -125.7364  | New VA logL: -125.7326  | Difference: 0.003763488 
Iteration: 232 	 Current VA logL: -125.7326  | New VA logL: -125.7289  | Difference: 0.003705743 
Iteration: 233 	 Current VA logL: -125.7289  | New VA logL: -125.7253  | Difference: 0.003649058 
Iteration: 234 	 Current VA logL: -125.7253  | New VA logL: -125.7217  | Difference: 0.003593408 
Iteration: 235 	 Current VA logL: -125.7217  | New VA logL: -125.7182  | Difference: 0.003538772 
Iteration: 236 	 Current VA logL: -125.7182  | New VA logL: -125.7147  | Difference: 0.003485127 
Iteration: 237 	 Current VA logL: -125.7147  | New VA logL: -125.7112  | Difference: 0.003432453 
Iteration: 238 	 Current VA logL: -125.7112  | New VA logL: -125.7079  | Difference: 0.003380727 
Iteration: 239 	 Current VA logL: -125.7079  | New VA logL: -125.7045  | Difference: 0.00332993 
Iteration: 240 	 Current VA logL: -125.7045  | New VA logL: -125.7012  | Difference: 0.003280042 
Iteration: 241 	 Current VA logL: -125.7012  | New VA logL: -125.698  | Difference: 0.003231042 
Iteration: 242 	 Current VA logL: -125.698  | New VA logL: -125.6948  | Difference: 0.003182912 
Iteration: 243 	 Current VA logL: -125.6948  | New VA logL: -125.6917  | Difference: 0.003135633 
Iteration: 244 	 Current VA logL: -125.6917  | New VA logL: -125.6886  | Difference: 0.003089186 
Iteration: 245 	 Current VA logL: -125.6886  | New VA logL: -125.6856  | Difference: 0.003043554 
Iteration: 246 	 Current VA logL: -125.6856  | New VA logL: -125.6826  | Difference: 0.00299872 
Iteration: 247 	 Current VA logL: -125.6826  | New VA logL: -125.6796  | Difference: 0.002954666 
Iteration: 248 	 Current VA logL: -125.6796  | New VA logL: -125.6767  | Difference: 0.002911376 
Iteration: 249 	 Current VA logL: -125.6767  | New VA logL: -125.6738  | Difference: 0.002868834 
Iteration: 250 	 Current VA logL: -125.6738  | New VA logL: -125.671  | Difference: 0.002827024 
Iteration: 251 	 Current VA logL: -125.671  | New VA logL: -125.6682  | Difference: 0.002785931 
Iteration: 252 	 Current VA logL: -125.6682  | New VA logL: -125.6655  | Difference: 0.002745539 
Iteration: 253 	 Current VA logL: -125.6655  | New VA logL: -125.6628  | Difference: 0.002705834 
Iteration: 254 	 Current VA logL: -125.6628  | New VA logL: -125.6601  | Difference: 0.002666803 
Iteration: 255 	 Current VA logL: -125.6601  | New VA logL: -125.6575  | Difference: 0.00262843 
Iteration: 256 	 Current VA logL: -125.6575  | New VA logL: -125.6549  | Difference: 0.002590703 
Iteration: 257 	 Current VA logL: -125.6549  | New VA logL: -125.6523  | Difference: 0.002553608 
Iteration: 258 	 Current VA logL: -125.6523  | New VA logL: -125.6498  | Difference: 0.002517132 
Iteration: 259 	 Current VA logL: -125.6498  | New VA logL: -125.6473  | Difference: 0.002481264 
Iteration: 260 	 Current VA logL: -125.6473  | New VA logL: -125.6449  | Difference: 0.00244599 
Iteration: 261 	 Current VA logL: -125.6449  | New VA logL: -125.6425  | Difference: 0.002411298 
Iteration: 262 	 Current VA logL: -125.6425  | New VA logL: -125.6401  | Difference: 0.002377178 
Iteration: 263 	 Current VA logL: -125.6401  | New VA logL: -125.6377  | Difference: 0.002343617 
Iteration: 264 	 Current VA logL: -125.6377  | New VA logL: -125.6354  | Difference: 0.002310605 
Iteration: 265 	 Current VA logL: -125.6354  | New VA logL: -125.6332  | Difference: 0.002278131 
Iteration: 266 	 Current VA logL: -125.6332  | New VA logL: -125.6309  | Difference: 0.002246184 
Iteration: 267 	 Current VA logL: -125.6309  | New VA logL: -125.6287  | Difference: 0.002214754 
Iteration: 268 	 Current VA logL: -125.6287  | New VA logL: -125.6265  | Difference: 0.002183831 
Iteration: 269 	 Current VA logL: -125.6265  | New VA logL: -125.6244  | Difference: 0.002153406 
Iteration: 270 	 Current VA logL: -125.6244  | New VA logL: -125.6222  | Difference: 0.002123467 
Iteration: 271 	 Current VA logL: -125.6222  | New VA logL: -125.6201  | Difference: 0.002094007 
Iteration: 272 	 Current VA logL: -125.6201  | New VA logL: -125.6181  | Difference: 0.002065016 
Iteration: 273 	 Current VA logL: -125.6181  | New VA logL: -125.616  | Difference: 0.002036485 
Iteration: 274 	 Current VA logL: -125.616  | New VA logL: -125.614  | Difference: 0.002008406 
Iteration: 275 	 Current VA logL: -125.614  | New VA logL: -125.6121  | Difference: 0.00198077 
Iteration: 276 	 Current VA logL: -125.6121  | New VA logL: -125.6101  | Difference: 0.001953569 
Iteration: 277 	 Current VA logL: -125.6101  | New VA logL: -125.6082  | Difference: 0.001926794 
Iteration: 278 	 Current VA logL: -125.6082  | New VA logL: -125.6063  | Difference: 0.001900438 
Iteration: 279 	 Current VA logL: -125.6063  | New VA logL: -125.6044  | Difference: 0.001874493 
Iteration: 280 	 Current VA logL: -125.6044  | New VA logL: -125.6025  | Difference: 0.001848951 
Iteration: 281 	 Current VA logL: -125.6025  | New VA logL: -125.6007  | Difference: 0.001823806 
Iteration: 282 	 Current VA logL: -125.6007  | New VA logL: -125.5989  | Difference: 0.001799049 
Iteration: 283 	 Current VA logL: -125.5989  | New VA logL: -125.5972  | Difference: 0.001774674 
Iteration: 284 	 Current VA logL: -125.5972  | New VA logL: -125.5954  | Difference: 0.001750674 
Iteration: 285 	 Current VA logL: -125.5954  | New VA logL: -125.5937  | Difference: 0.001727042 
Iteration: 286 	 Current VA logL: -125.5937  | New VA logL: -125.592  | Difference: 0.001703771 
Iteration: 287 	 Current VA logL: -125.592  | New VA logL: -125.5903  | Difference: 0.001680855 
Iteration: 288 	 Current VA logL: -125.5903  | New VA logL: -125.5886  | Difference: 0.001658288 
Iteration: 289 	 Current VA logL: -125.5886  | New VA logL: -125.587  | Difference: 0.001636064 
Iteration: 290 	 Current VA logL: -125.587  | New VA logL: -125.5854  | Difference: 0.001614176 
Iteration: 291 	 Current VA logL: -125.5854  | New VA logL: -125.5838  | Difference: 0.001592619 
Iteration: 292 	 Current VA logL: -125.5838  | New VA logL: -125.5822  | Difference: 0.001571386 
Iteration: 293 	 Current VA logL: -125.5822  | New VA logL: -125.5807  | Difference: 0.001550472 
Iteration: 294 	 Current VA logL: -125.5807  | New VA logL: -125.5791  | Difference: 0.001529872 
Iteration: 295 	 Current VA logL: -125.5791  | New VA logL: -125.5776  | Difference: 0.00150958 
Iteration: 296 	 Current VA logL: -125.5776  | New VA logL: -125.5761  | Difference: 0.00148959 
Iteration: 297 	 Current VA logL: -125.5761  | New VA logL: -125.5747  | Difference: 0.001469899 
Iteration: 298 	 Current VA logL: -125.5747  | New VA logL: -125.5732  | Difference: 0.001450499 
Iteration: 299 	 Current VA logL: -125.5732  | New VA logL: -125.5718  | Difference: 0.001431387 
Iteration: 300 	 Current VA logL: -125.5718  | New VA logL: -125.5704  | Difference: 0.001412557 
Iteration: 301 	 Current VA logL: -125.5704  | New VA logL: -125.569  | Difference: 0.001394005 
Iteration: 302 	 Current VA logL: -125.569  | New VA logL: -125.5676  | Difference: 0.001375726 
Iteration: 303 	 Current VA logL: -125.5676  | New VA logL: -125.5662  | Difference: 0.001357715 
Iteration: 304 	 Current VA logL: -125.5662  | New VA logL: -125.5649  | Difference: 0.001339968 
Iteration: 305 	 Current VA logL: -125.5649  | New VA logL: -125.5636  | Difference: 0.001322481 
Iteration: 306 	 Current VA logL: -125.5636  | New VA logL: -125.5623  | Difference: 0.001305248 
Iteration: 307 	 Current VA logL: -125.5623  | New VA logL: -125.561  | Difference: 0.001288267 
Iteration: 308 	 Current VA logL: -125.561  | New VA logL: -125.5597  | Difference: 0.001271532 
Iteration: 309 	 Current VA logL: -125.5597  | New VA logL: -125.5585  | Difference: 0.001255039 
Iteration: 310 	 Current VA logL: -125.5585  | New VA logL: -125.5572  | Difference: 0.001238785 
Iteration: 311 	 Current VA logL: -125.5572  | New VA logL: -125.556  | Difference: 0.001222765 
Iteration: 312 	 Current VA logL: -125.556  | New VA logL: -125.5548  | Difference: 0.001206976 
Iteration: 313 	 Current VA logL: -125.5548  | New VA logL: -125.5536  | Difference: 0.001191414 
Iteration: 314 	 Current VA logL: -125.5536  | New VA logL: -125.5524  | Difference: 0.001176076 
Iteration: 315 	 Current VA logL: -125.5524  | New VA logL: -125.5513  | Difference: 0.001160956 
Iteration: 316 	 Current VA logL: -125.5513  | New VA logL: -125.5501  | Difference: 0.001146053 
Iteration: 317 	 Current VA logL: -125.5501  | New VA logL: -125.549  | Difference: 0.001131362 
Iteration: 318 	 Current VA logL: -125.549  | New VA logL: -125.5479  | Difference: 0.001116881 
Iteration: 319 	 Current VA logL: -125.5479  | New VA logL: -125.5468  | Difference: 0.001102605 
Iteration: 320 	 Current VA logL: -125.5468  | New VA logL: -125.5457  | Difference: 0.001088531 
Iteration: 321 	 Current VA logL: -125.5457  | New VA logL: -125.5446  | Difference: 0.001074656 
Iteration: 322 	 Current VA logL: -125.5446  | New VA logL: -125.5435  | Difference: 0.001060977 
Iteration: 323 	 Current VA logL: -125.5435  | New VA logL: -125.5425  | Difference: 0.001047491 
Iteration: 324 	 Current VA logL: -125.5425  | New VA logL: -125.5415  | Difference: 0.001034195 
Iteration: 325 	 Current VA logL: -125.5415  | New VA logL: -125.5404  | Difference: 0.001021085 
Iteration: 326 	 Current VA logL: -125.5404  | New VA logL: -125.5394  | Difference: 0.001008159 
Iteration: 327 	 Current VA logL: -125.5394  | New VA logL: -125.5384  | Difference: 0.000995414 
Calculating information matrix for model parameters...
Redoing...
Warning message:
In sqrt(diag(solve(obs_info))) : NaNs produced
Variational approximation for GAMs

Call: vagam(y = poisson_dat$y, smooth.X = poisson_dat[, 2:5], para.X = data.frame(treatment = poisson_dat$treatment), int.knots = choose_k, family = poisson(), save.data = TRUE, para.se = TRUE) 

Estimated regression coefficients for parametric component: -1.040643 0.6451899
Estimated smoothing coefficients for nonparametric component: -0.2733702 -0.2461039 -0.09914876 0.03284408 0.1684629 0.3366869 0.4947952 0.5851139 0.656716 0.5336587 0.3721285 0.2277044 -0.02135385 -0.2695109 -0.5710075 -0.7604779 -0.8174288 -1.712253 -1.563433 -1.369698 -0.9899007 -0.8411191 -0.4226654 -0.07139302 -0.2168335 0.0007790999 0.5339774 0.8247894 1.282654 2.006611 1.880903 1.258986 1.21398 1.148197 -0.751521 0.4847275 3.133529 6.330736 4.998549 3.434801 1.618585 -0.6540657 -1.360546 -1.018068 -0.4136508 -1.939511 -2.64722 -2.147782 -1.521794 -2.187518 -2.458522 -1.607145 -1.329687 0.005428171 1.0108 0.4098627 0.7475761 1.115838 0.7188687 0.7304209 1.060003 0.8690969 0.613725 0.4260507 0.4167565 0.5912033 0.3957223 0.02593618
Estimated smoothing parameters (or fixed if lambda was supplied): 6.680485 3.832892 0.2155243 1.027374
Number of interior knots used: 15 15 15 15
Maximized value of the variational log-likelihood: -125.5384

Summary statistics for nonparametric component:
                    x0       x1       x2       x3
Wald Statistic 358.331 840.2793 31902.05 79.84659
p-value          0.000   0.0000     0.00  0.00000

Summary statistics for parametric component (if para.se = TRUE):
               Intercept treatment
Estimate        -1.04064   0.64519
Std. Error           NaN   0.16658
Wald Statistic       NaN   3.87310
p-value              NaN   0.00011

$call
vagam(y = poisson_dat$y, smooth.X = poisson_dat[, 2:5], para.X = data.frame(treatment = poisson_dat$treatment), 
    int.knots = choose_k, family = poisson(), save.data = TRUE, 
    para.se = TRUE)

$para.coeff
 Intercept  treatment 
-1.0406428  0.6451899 

$smooth.coeff
 [1] -0.2733701882 -0.2461038544 -0.0991487572  0.0328440844  0.1684629451
 [6]  0.3366869456  0.4947951958  0.5851139119  0.6567160275  0.5336586669
[11]  0.3721285236  0.2277043874 -0.0213538489 -0.2695109221 -0.5710074632
[16] -0.7604779036 -0.8174288063 -1.7122534105 -1.5634333694 -1.3696982975
[21] -0.9899006792 -0.8411191491 -0.4226653909 -0.0713930171 -0.2168334916
[26]  0.0007790999  0.5339773671  0.8247894171  1.2826538456  2.0066107547
[31]  1.8809033147  1.2589859418  1.2139796076  1.1481968223 -0.7515209731
[36]  0.4847274793  3.1335287755  6.3307363410  4.9985492274  3.4348007502
[41]  1.6185847776 -0.6540656573 -1.3605461824 -1.0180681483 -0.4136507929
[46] -1.9395112438 -2.6472198102 -2.1477815240 -1.5217936307 -2.1875175588
[51] -2.4585215323 -1.6071447536 -1.3296868389  0.0054281713  1.0108003112
[56]  0.4098626899  0.7475760842  1.1158378269  0.7188686895  0.7304209105
[61]  1.0600027739  0.8690969307  0.6137250079  0.4260507231  0.4167565447
[66]  0.5912032995  0.3957222731  0.0259361775

$smooth.param
[1] 6.6804854 3.8328919 0.2155243 1.0273736

$phi
[1] 1

$logLik
[1] -125.5384

$family

Family: poisson 
Link function: log 


$smooth.stat
                    x0       x1       x2       x3
Wald Statistic 358.331 840.2793 31902.05 79.84659
p-value          0.000   0.0000     0.00  0.00000

$para.stat
               Intercept treatment
Estimate        -1.04064   0.64519
Std. Error           NaN   0.16658
Wald Statistic       NaN   3.87310
p-value              NaN   0.00011

attr(,"class")
[1] "summary.vagam"
