View source: R/calculate_lhsOpt.R
calculate_lhsOpt    R Documentation
Population-level analysis of metric raster data to determine the optimal Latin hypercube sample size

Usage:
calculate_lhsOpt(
mats,
PCA = TRUE,
quant = TRUE,
KLdiv = TRUE,
minSamp = 10,
maxSamp = 100,
step = 10,
rep = 10,
iter = 10000
)
Arguments:

mats: List. Output from calculate_pop().

PCA: Logical. Calculate principal component loadings of the population for PCA similarity factor testing.

quant: Logical. Perform quantile comparison testing.

KLdiv: Logical. Perform Kullback–Leibler divergence testing.

minSamp: Numeric. Minimum sample size to test.

maxSamp: Numeric. Maximum sample size to test.

step: Numeric. Increment between successive sample sizes tested.

rep: Numeric. Number of internal repetitions for each sample size.

iter: Positive numeric. Number of iterations for the Metropolis–Hastings annealing process. Defaults to 10000.
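The minSamp, maxSamp, and step arguments define the sequence of candidate sample sizes, each of which is then evaluated rep times. A minimal base-R sketch of the sequence implied by the defaults (an illustration only, not the package's internal code):

```r
# Candidate sample sizes implied by the default arguments
# (illustration only; not sgsR's internal code).
minSamp <- 10
maxSamp <- 100
step <- 10

sizes <- seq(from = minSamp, to = maxSamp, by = step)
sizes
#> [1]  10  20  30  40  50  60  70  80  90 100
```

With the defaults, ten sample sizes are tested, each repeated rep = 10 times.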
Value:

A data.frame with summary statistics.
Author(s):

Tristan R.H. Goodbody

Special thanks to Dr. Brendan Malone for the original implementation of this algorithm.
References:

Malone BP, Minasny B, Brungard C. 2019. Some methods to improve the utility of conditioned Latin hypercube sampling. PeerJ 7:e6451. DOI: 10.7717/peerj.6451
Examples:

## Not run:
#--- Load raster and access files ---#
r <- system.file("extdata", "mraster.tif", package = "sgsR")
mr <- terra::rast(r)
#--- calculate lhsPop details ---#
mats <- calculate_pop(mraster = mr)
calculate_lhsOpt(mats = mats)
calculate_lhsOpt(
mats = mats,
PCA = FALSE,
iter = 200
)
## End(Not run)
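The KLdiv test compares the distribution of the population against that of each candidate sample. The underlying discrete Kullback–Leibler divergence can be sketched in base R as follows (an illustration of the general formula only, not sgsR's internal implementation; the vectors p and q are hypothetical):

```r
# Discrete Kullback–Leibler divergence between two probability
# vectors p and q (illustration only; not sgsR's internal code).
kl_div <- function(p, q) {
  stopifnot(all(p > 0), all(q > 0),
            abs(sum(p) - 1) < 1e-8, abs(sum(q) - 1) < 1e-8)
  sum(p * log(p / q))
}

p <- c(0.2, 0.5, 0.3)    # hypothetical population distribution
q <- c(0.25, 0.45, 0.3)  # hypothetical sample distribution
kl_div(p, q)  # 0 when q matches p exactly; grows as q diverges
```

Lower divergence at a given sample size indicates the sample reproduces the population distribution more closely, which is how the test ranks candidate sizes.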