robust.quadcount: Optimal granularity for quadrat counting

View source: R/robust_mapping.R

Description

Given a point set, finds an optimal granularity for quadrat counting that balances uniformity and robustness, as described in Ramos et al. (2021).

Usage

robust.quadcount(
  point_set,
  random_samples = TRUE,
  nsamples = 200,
  signif = 0.99,
  tradeoff_crit = c("product", "sum"),
  uniformity_method = c("Quadratcount", "Nearest-neighbor"),
  robustness_method = c("Poisson", "Binomial", "Resampling"),
  robustness_k = -3,
  verbose = FALSE,
  my_scales = NULL,
  W = NULL,
  grid_crs = NULL
)

Arguments

point_set

the point set for which to find the optimal quadrat size. Must be provided as a data frame whose first column is the easting (e.g. x, longitude) and whose second column is the northing (e.g. y, latitude).

nsamples

number of samples taken if random_samples == TRUE. In practice, the default value works well.

signif

significance level for the Complete Spatial Randomness (CSR) test applied to the samples at each granularity considered.

tradeoff_crit

dictates how the balance between uniformity and robustness is struck when choosing an optimal quadrat size. 'sum' means that the granularity with the greatest sum of uniformity and robustness is picked; 'product' means that the granularity yielding the greatest product is picked.

uniformity_method

whether CSR is tested via the quadrat-count method or the nearest-neighbor method. In practice, the results should not differ much.

robustness_method

how the robustness of each granularity is estimated (via a Poisson model, a Binomial model, or by resampling the original point set). In practice, all options yield similar results.

robustness_k

the robustness of a cell is calculated by taking the cell's estimated coefficient of variation, call it x, and applying the function exp(k*x). This parameter specifies which k is used. In practice, the final result is not very sensitive to the specific value of k.

verbose

whether to print progress messages while the function runs.

my_scales

the granularities to be tested. If not provided, a set is generated automatically. If provided, each granularity should be given as the side length of a cell, in the same unit as the coordinates of point_set.

W

the window of interest to consider when analyzing point_set. If not provided, W is computed as the minimum bounding rectangle of point_set.

grid_crs

coordinate reference system of the points; it will be ascribed to the resulting grid.

random_samples

whether uniformity and robustness are estimated from random samples (TRUE) or by generating a regular grid at each granularity being tested (FALSE). Using random samples generally takes less time and yields similar results.
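The interplay between robustness_k and tradeoff_crit described above can be sketched in a few lines of R. This is an illustrative sketch only, not the package's internal implementation; the function names robustness_score and pick_granularity, and the toy scores below, are invented for illustration.

```r
# Robustness of a cell per the robustness_k description: exp(k * x),
# where x is the cell's estimated coefficient of variation and k is
# negative (default -3), so more variable cells score lower.
robustness_score <- function(cv, k = -3) {
  exp(k * cv)
}

# Choosing the optimal granularity per tradeoff_crit: pick the scale
# whose uniformity and robustness combine best, by product or by sum.
pick_granularity <- function(scales, uniformity, robustness,
                             tradeoff_crit = c("product", "sum")) {
  tradeoff_crit <- match.arg(tradeoff_crit)
  score <- switch(tradeoff_crit,
                  product = uniformity * robustness,
                  sum     = uniformity + robustness)
  scales[which.max(score)]
}

# Toy example with hypothetical scores for three candidate cell sizes:
# products are 0.18, 0.42, 0.27, so the 200-unit cell wins.
pick_granularity(scales     = c(100, 200, 400),
                 uniformity = c(0.2, 0.6, 0.9),
                 robustness = c(0.9, 0.7, 0.3),
                 tradeoff_crit = "product")
```

Note how the criteria can disagree: a sum rewards one very high score, while a product penalizes any score near zero.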

Author(s)

Rafael G. Ramos (main proponent and coder), Marcos Prates (contributor)

References

Ramos, R. G., Silva, B. F., Clarke, K. C., & Prates, M. (2021). Too Fine to be Good? Issues of Granularity, Uniformity and Error in Spatial Crime Analysis. Journal of Quantitative Criminology, 1-25.

Examples


library(robustmap)
# Loading point data
file1 <- system.file("extdata/burglary.shp", package = "robustmap")
burglary <- sf::read_sf(file1, layer = "burglary")

burglary <- data.frame(x=burglary$lon_m,
                       y=burglary$lat_m)

# Estimating optimal granularity using robust.quadcount
burglary_map <- robust.quadcount(burglary, verbose = TRUE)

# Retrieving the estimated granularity
burglary_map$opt_granularity

# Plotting resulting map
terra::plot(burglary_map$counts)


rafaelgramos/robustmap documentation built on April 22, 2024, 8:22 a.m.