getrmse | R Documentation
Description

getrmse calculates the root mean square error (RMSE) of the input invar series (defaulting to 'cpue') against a loess smooth fitted across the input 'year' time-series. It is primarily designed to generate an alternative estimate of the intrinsic variability of a CPUE time-series to the estimate that may be obtained from a CPUE standardization.
Usage

getrmse(indat, invar = "cpue", inyr = "year")
Arguments

indat    the matrix, spmdat object, or data.frame containing both a 'year' column and an invar column (defaults to 'cpue')

invar    the column name of the variable whose RMSE is wanted; defaults to 'cpue'

inyr     the column name holding the years; defaults to 'year'
Value

a list containing the rmse and the loess-predicted values of the invar series for each year in the time-series
Examples

year <- 1986:1994
cpue <- c(1.2006,1.3547,1.0585,1.0846,0.9738,1.0437,0.7759,1.0532,1.284)
dat <- as.matrix(cbind(year,cpue))   # a matrix with 'year' and 'cpue' columns
getrmse(dat,invar="cpue")            # should give an rmse of 0.08265127
getrmse(dat,invar="cpue")$rmse       # extract just the rmse value
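The residual-RMSE idea behind the function can be sketched by hand. The snippet below is a minimal illustration, not the package's implementation: it assumes getrmse fits a loess smoother of the invar column against year and measures the scatter of the observations around that smooth. The loess span used here is R's default (0.75), which may differ from whatever getrmse uses internally, so the resulting value need not match 0.08265127 exactly.

```r
# Hand-rolled sketch of an RMSE against a loess smooth (assumed approach,
# not necessarily identical to getrmse's internal settings).
year <- 1986:1994
cpue <- c(1.2006,1.3547,1.0585,1.0846,0.9738,1.0437,0.7759,1.0532,1.284)

model  <- loess(cpue ~ year)                  # default span; getrmse's may differ
resids <- residuals(model)                    # observed minus smoothed values
rmse   <- sqrt(sum(resids^2) / length(resids))  # root mean square error
preds  <- predict(model)                      # the smoothed cpue for each year

list(rmse = rmse, predictedCE = preds)        # mirrors the documented return shape
```

The return value mimics the documented list structure (an rmse plus the per-year predicted values); the element name predictedCE is a placeholder, not a name confirmed by the source.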