rWeights: Calculation of rarity weights at a single or at multiple scales

rWeights    R Documentation

Calculation of rarity weights at a single or at multiple scales

Description

Calculate rarity weights for a single scale or for multiple scales on the basis of the selected weighting function(s).

Usage

rWeights(occData, Qmax = max(occData), Qmin = min(occData), 
         wMethods = "W", rCutoff = "Gaston", normalised = T, 
         assemblages, extended = F, rounding = 3)

Arguments

occData

vector, matrix or data.frame. Occurrence data for a single scale (vector) or several scales (matrix or data.frame).

Qmax

integer. Maximum occurrence (see details). By default, the maximum occurrence of the dataset is used (i.e., the maximum occurrence among the provided set of species); however, it can be changed to another value, e.g., to provide the number of possible sites.

Qmin

integer. Minimum occurrence (see details). By default, the minimum occurrence of the dataset is used (i.e., minimum occurrence among the provided set of species).

wMethods

"W", "invQ" or "oldW". The function chosen to weight species occurrence (see details).

rCutoff

a decimal or a vector of values between 0 and 1, or "Gaston" or "Leroy". Indicates the rarity cut-off(s), or the method used to calculate the rarity cut-off(s). By default, the rarity cut-off is calculated as a percentage of the maximum occurrence (see details).

normalised

TRUE or FALSE. If TRUE, then weights are normalised between 0 and 1.

assemblages

matrix or data.frame. Set of assemblages of species used to calculate the rarity cut-off point(s) with the Leroy method (optional).

extended

TRUE or FALSE. Only relevant for multiple scales. If TRUE, weights are given for every input scale in addition to the multiscale weights. If FALSE, only multiscale weights are provided.

rounding

An integer or FALSE. If an integer is provided, weights are rounded to that number of digits. If FALSE, weights are not rounded.

Details

To calculate single-scale weights, simply provide a vector with species occurrences. To calculate multiscale rarity weights, provide either a matrix or a data.frame where species are in rows, and each column provides occurrence for a particular scale.

The minimum and maximum occurrences can be set manually, or automatically calculated with the default parameters. Default parameters: if occData is a vector, Qmin = min(occData) and Qmax = max(occData). If occData is a matrix or a data.frame, Qmin = apply(occData, 2, min) and Qmax = apply(occData, 2, max).
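
The default Qmin/Qmax computation above can be sketched directly in R. This is only an illustration with a fictive two-scale occurrence matrix, not package data:

```r
# Fictive two-scale occurrence matrix: species in rows, scales in columns
occData <- cbind(regional = c(1, 5, 20), national = c(2, 8, 50))

# Per-scale minima and maxima, as computed by the default parameters
Qmin <- apply(occData, 2, min)
Qmax <- apply(occData, 2, max)
```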

Three weighting methods are available (more will become available later):

  1. W: This is the method described in Leroy et al. (2013). We recommend using this method for both single and multiscale weight calculations.

    \exp(-(\frac{Q_i - Q_{min}}{r \times Q_{max} - Q_{min}} \times 0.97 + 1.05)^2)

    where Qi is the occurrence of species i, Qmin and Qmax are respectively the minimum and maximum occurrences in the species pool, and r is the chosen rarity cut-off point (as a percentage of the maximum occurrence).

  2. invQ: This is the inverse of occurrence

    \frac{1}{Q_i}

    where Qi is the occurrence of the ith species. The inverse of the occurrence should be avoided as a weighting procedure because it cannot be adjusted to the considered species pool, and it does not attribute 0 weights to common species (see discussion in Leroy et al. (2012)).

  3. oldW: This is the original method described in Leroy et al. (2012). As this method was improved in Leroy et al. (2013), we recommend using W instead. Formula:

    \exp(-(\frac{Q_i}{Q_{min}} \times n + 1)^2)

    where Qi is the occurrence of species i, Qmin is the minimum occurrence in the species pool, and n is an adjustment coefficient, numerically approximated to fit the chosen rarity cut-off point.
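
As an illustrative sketch, the W and invQ formulas above can be written directly in R. Function and argument names here are hypothetical, and this is not the package's internal implementation:

```r
# W (Leroy et al. 2013): weights decrease from ~1 (rarest) towards 0 (common)
w_weight <- function(q, qmin, qmax, r) {
  exp(-((q - qmin) / (r * qmax - qmin) * 0.97 + 1.05)^2)
}

# invQ: inverse occurrence; note it never reaches 0 for common species,
# which is one reason this weighting is discouraged
inv_q <- function(q) 1 / q

occ <- c(1, 5, 20, 100)  # fictive occurrence values
w_weight(occ, qmin = 1, qmax = 100, r = 0.25)
inv_q(occ)
```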

For methods W and oldW, a rarity cutoff point is required. The rarity cutoff point can either be entered manually (a single value for a single scale, a vector of values for multiple scales), or the methods of Gaston or Leroy can be used (see references):

- Gaston method: the rarity cut-off point is the first quartile of species occurrences, i.e., rare species are the 25 percent of species with the lowest occurrences.

- Leroy method: the rarity cut-off point is the occurrence at which the average proportion of rare species in local assemblages is 25 percent. This method requires the assemblages argument to calculate the average proportion of rare species in assemblages.
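
The Gaston cut-off can be sketched as the first quartile of the occurrence vector. The occurrence values below are fictive, and the package computes this internally:

```r
# Fictive species occurrences
occ <- c(1, 2, 2, 3, 5, 8, 13, 40, 90, 120)

# Gaston rarity cut-off: first quartile of occurrences
gaston.cutoff <- quantile(occ, probs = 0.25, na.rm = TRUE)

# Species at or below the cut-off are flagged as rare
rare <- occ <= gaston.cutoff
</imports>
```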

NA values are properly handled by the function.

Value

A data.frame containing the results: species occurrences, rarity statuses, rarity weights and the rarity cut-offs used.

- If occData is a vector (single-scale weights): a data.frame with 4 columns: Q (species occurrence), R (species rarity status), W (species rarity weights) and cut.off (rarity cut-off used for weight calculation).

- If occData is a matrix or a data.frame (multiscale rarity weights): a data.frame with n columns Q (species occurrences), n columns R (species rarity statuses), one (if extended = FALSE) or n + 1 (if extended = TRUE) columns W (species rarity weights), and n columns cut.off (rarity cut-offs used for weight calculation), where n is the number of scales (number of columns of occData).

By default, weights are rounded to 3 digits, which should be sufficient in most cases. A different number of digits can be chosen, or rounding can be disabled entirely by setting rounding = FALSE.

Author(s)

Boris Leroy leroy.boris@gmail.com

References

Leroy B., Petillon J., Gallon R., Canard A., & Ysnel F. (2012) Improving occurrence-based rarity metrics in conservation studies by including multiple rarity cut-off points. Insect Conservation and Diversity, 5, 159-168.

Leroy B., Canard A., & Ysnel F. (2013) Integrating multiple scales in rarity assessments of invertebrate taxa. Diversity and Distributions, 19, 794-803.

See Also

Irr, Isr

Examples

# 1. Single scale rarity weights
data(spid.occ)
head(spid.occ)

regional.occ <- spid.occ$occurMA
names(regional.occ) <- rownames(spid.occ)
head(regional.occ)

# Calculation of rarity weights at a single scale:
rWeights(regional.occ, rCutoff = "Gaston")
rWeights(regional.occ, rCutoff = 0.1)
rWeights(regional.occ, wMethods = "invQ")
rWeights(regional.occ, wMethods = c("W", "invQ"))

# Calculation of rarity weights with the method of Leroy
# Creating a fictitious matrix of 5 assemblages
# Warning: this is only an example of how the function works!
# Correct use of this method requires a matrix of actually sampled species.
assemblages.matrix <- cbind(assemblage.1 = sample(c(0, 1), 708, replace = TRUE),
                            assemblage.2 = sample(c(0, 1), 708, replace = TRUE),
                            assemblage.3 = sample(c(0, 1), 708, replace = TRUE),
                            assemblage.4 = sample(c(0, 1), 708, replace = TRUE),
                            assemblage.5 = sample(c(0, 1), 708, replace = TRUE))
rownames(assemblages.matrix) <- names(regional.occ) # Row names of assemblages.matrix
                                                    # must match the occurrence names
head(assemblages.matrix)
                                          
rWeights(regional.occ, wMethods = "W", rCutoff = "Leroy", assemblages = assemblages.matrix)

# 2. Multiscale rarity weights
data(spid.occ)
head(spid.occ)

rWeights(spid.occ, wMethods = "W", rCutoff = "Gaston")
rWeights(spid.occ, wMethods = "W", rCutoff = "Gaston", extended = TRUE)
rWeights(spid.occ, wMethods = c("W", "invQ"), rCutoff = "Gaston", extended = TRUE)
rWeights(spid.occ, wMethods = c("W", "invQ"), rCutoff = "Leroy", 
         assemblages = assemblages.matrix, extended = TRUE) # Provided that you have 
                                             # created "assemblages.matrix" as above

Rarity documentation built on Aug. 21, 2023, 5:12 p.m.