#' @title Betweenness Centrality metrics using Conefor command line
#'
#' @description
#' Use this function to calculate the BC, BCIIC and BCPC indices under one or several distance thresholds using the Conefor command-line interface.
#' @param nodes Object containing nodes (e.g., habitat patches or fragments) information. It can be of the following classes:\cr
#' - \code{data.frame} with at least two columns: the first for node IDs and the second for attributes. \cr
#' - Spatial data of type vector (class \code{sf, SpatVector, SpatialPolygonsDataFrame}). It must be in a projected coordinate system.\cr
#' - Raster (class \code{RasterLayer, SpatRaster}). It must be in a projected coordinate system. The values must be integers representing the ID of each habitat patch or node, with non-habitat areas represented by NA values (see \link[raster]{clump} or \link[terra]{patches}).
#' @param id \code{character}. If \code{nodes} is a spatial vector, you can specify the name of the column containing the node IDs.
#' If \code{NULL}, a new ID will be generated.
#' @param attribute \code{character} or \code{vector}. If \code{NULL} the area of the nodes will be used as the node attribute. The unit of area can be selected using the \code{area_unit} parameter. To use an alternative attribute, consider the class type of the object in the \code{nodes} parameter: \cr
#' - If \code{nodes} is a spatial vector or data.frame, specify \bold{the name of the column} containing the attribute for the nodes. \cr
#' - If \code{nodes} is a raster layer then it must be a numeric vector with the node's attribute. The length of the vector must be equal to the number of nodes.
#' @param area_unit \code{character}. (\emph{optional, default = } \code{"ha"}) \cr A \code{character} indicating the area unit to use when \code{attribute} is \code{NULL}. Some options are "ha" (the default), "m2", "km2", or "cm2"; see \link[Makurhini]{unit_convert} for details.
#' @param distance A \code{matrix} or \code{list} specifying the distance between each pair of nodes. Distances between nodes may be Euclidean distances (straight-line distances) or effective distances (cost distances) that consider the landscape resistance to species movements. If it is a matrix, the number of columns and rows must be equal to the number of nodes. This distance matrix can be generated by the \link[Makurhini]{distancefile} function. If it is a list, it must contain the distance parameters needed to calculate the distance between nodes, for example \code{"type"} and \code{"resistance"}. For \code{"type"}, choose one of the distances: "centroid" (faster), "edge", "least-cost" or "commute-time". If the type is "least-cost" or "commute-time", you must also use the \code{"resistance"} argument. For more arguments, see the \link[Makurhini]{distancefile} function.
#' @param metric A \code{character} indicating the Betweenness Centrality metric to use: \code{"BC"} (the classical metric), \code{"BCIIC"}, which considers binary indices and topological distances, or \code{"BCPC"} (recommended), which considers the maximum product probabilities.
#' @param distance_thresholds A \code{numeric} indicating the dispersal distance or distances (meters) of the considered species. If \code{NULL} then distance is estimated as the median dispersal distance between nodes. Alternatively, the \link[Makurhini]{dispersal_distance} function can be used to estimate the dispersal distance using the species home range.
#' @param probability A \code{numeric} value indicating the probability that corresponds to the distance specified in \code{distance_thresholds}. For example, if \code{distance_thresholds} is a median dispersal distance, use a probability of 0.5 (50\%); if it is a maximum dispersal distance, use a probability of 0.05 (5\%) or 0.01 (1\%). Required when using the \code{"BCPC"} metric. If \code{probability = NULL}, a probability of 0.5 will be used.
#' @param LA \code{numeric}. (\emph{optional, default = } \code{NULL}). The maximum landscape attribute: the attribute value that would correspond to a hypothetical habitat patch covering all the landscape with the best possible habitat, in which IIC and PC would be equal to 1. For example, if the node attribute corresponds to node area, then LA equals the total landscape area. If the node attribute corresponds to a quality-weighted area and the quality factor ranges from 0 to 100, LA equals 100 multiplied by the total landscape area. The value of LA does not affect the importance of the nodes at all and is only used to calculate the overall landscape connectivity. If no LA value is entered (default) and \code{overall = TRUE} or \code{onlyoverall = TRUE}, the function will only calculate the numerator of the global connectivity indices and the equivalent connected area (ECA) or equivalent connectivity (EC) index.
#' @param coneforpath \code{character}. Path to Conefor 2.6 with command line interface (http://www.conefor.org/coneforsensinode.html). Example, \code{"C:/Users/coneforWin64.exe"}.
#' @param dA \code{logical}. If TRUE, then the delta attribute will be added to the node's importance result.
#' @param dvars \code{logical}. If TRUE, then the absolute variation will be added to the node's importance result.
#' @param parallel \code{numeric}. (\emph{optional, default =} \code{NULL}).
#' Specifies the number of cores used to parallelize the estimation of the PC or IIC index and its deltas. Particularly useful when you have more than 1000 nodes. By default the analyses are not parallelized.
#' @param rasterparallel \code{logical}. If \code{parallel} is not used and \code{nodes} is a raster, you can use this argument to parallelize the assignment of the metric values to the nodes raster. It is useful when the raster resolution is less than 100 m\out{<sup>2</sup>}.
#' @param write \code{character}. Path to write the output shapefile, for example, "C:/ejemplo.shp".
#' @param intern \code{logical}. Show the progress of the process, default = \code{TRUE}. Sometimes the progress bar does not reach 100\% when operations are completed very quickly.
#' @details To use this function you need to download and uncompress Conefor 2.6 with command line interface from \url{http://www.conefor.org/coneforsensinode.html}\cr
#' Betweenness Centrality metrics can be calculated in three different ways:\cr
#' - \bold{BC} calculates the classical Betweenness Centrality metric as originally defined by Freeman (1977; Sociometry 40: 35–41). \cr
#' - \bold{BCIIC and BCPC} calculate the improved version of the BC metric by Bodin and Saura (2010) integrated within the same analytical framework as the IIC (binary) and the PC (probabilistic) metrics.
#' @references Saura, S. and Torne, J. (2012). Conefor 2.6. Universidad Politecnica de Madrid. Available at \url{www.conefor.org}.\cr
#' Freeman L.C. (1977). A Set of Measures of Centrality Based on Betweenness. Sociometry 40: 35-41.\cr
#' Bodin, O. and Saura, S. (2010). Ranking individual habitat patches as connectivity providers: integrating network analysis and patch removal experiments. Ecological Modelling 221: 2393-2405.
#' @export
#' @examples
#' \dontrun{
#' library(Makurhini)
#' data("habitat_nodes", package = "Makurhini")
#' nrow(habitat_nodes) #Number of nodes
#'
#' #Two distance thresholds.
#' #You need to place your path to the conefor .exe
#' BCIIC <- MK_BCentrality(nodes = habitat_nodes, id = "Id",
#'                         coneforpath = "C:/Users/coneforWin64.exe",
#'                         distance = list(type = "centroid"),
#'                         metric = "BCIIC", LA = NULL,
#'                         distance_thresholds = c(10000, 30000)) #10 and 30 km
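#'
#' #A minimal sketch of the classical "BC" metric with the same inputs
#' #(assumes the same hypothetical Conefor path; "BC" requires neither
#' #probability nor LA):
#' BC <- MK_BCentrality(nodes = habitat_nodes, id = "Id",
#'                      coneforpath = "C:/Users/coneforWin64.exe",
#'                      distance = list(type = "centroid"),
#'                      metric = "BC",
#'                      distance_thresholds = 10000) #10 km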
#'
#' #Using raster
#' data("habitat_nodes_raster", package = "Makurhini")
#' ##Using parallel
#' BCPC_parallel <- MK_BCentrality(nodes = habitat_nodes_raster,
#'                                 coneforpath = "C:/Users/coneforWin64.exe",
#'                                 id = "id", attribute = NULL,
#'                                 distance = list(type = "centroid"),
#'                                 metric = "BCPC", LA = NULL, probability = 0.5,
#'                                 distance_thresholds = 40000,
#'                                 parallel = 4) #40 km
#' }
#' @importFrom sf st_as_sf
#' @importFrom utils write.table
#' @importFrom raster values as.matrix extent raster stack extent<- writeRaster reclassify crs crs<-
#' @importFrom future multicore multisession plan availableCores
#' @importFrom furrr future_map
#' @importFrom purrr walk
#' @importFrom magrittr %>%
#' @importFrom utils installed.packages txtProgressBar setTxtProgressBar
MK_BCentrality <- function(nodes, id, attribute = NULL, area_unit = "ha",
distance = list(type= "centroid", resistance = NULL),
metric = c("BC", "BCIIC", "BCPC"), distance_thresholds = NULL,
probability = NULL, LA = NULL, coneforpath = NULL,
dA = FALSE, dvars = FALSE,
parallel = NULL,
rasterparallel = FALSE,
write = NULL, intern = TRUE) {
. = NULL
if (missing(nodes)) {
stop("error, missing nodes object")
} else {
if (is.numeric(nodes) | is.character(nodes)) {
stop("error, nodes must be a spatial vector, raster or data.frame object")
}
}
# match.arg() validates metric against the choices in the formal default
# and returns "BC" when the full default vector is supplied; the original
# if() on a length-3 logical vector would fail in current R.
metric <- match.arg(metric)
if (metric == "BCPC") {
if (is.null(probability) | !is.numeric(probability)) {
stop("error missing probability")
}
}
if (is.null(distance_thresholds)) {
stop("error, missing numeric distance threshold(s)")
}
if (!is.null(write)) {
if (!dir.exists(dirname(write))) {
stop("error, output folder does not exist")
}
}
if(!is.null(parallel)){
if(!is.numeric(parallel)){
stop("if you use parallel argument then you need a numeric value")
}
}
if(isFALSE(parallel)){
parallel <- NULL
}
if(isTRUE(parallel)){
message(paste0("The number of available cores is ", as.numeric(availableCores()),
", so ", as.numeric(availableCores()) - 2, " cores will be used."))
parallel <- as.numeric(availableCores()) - 2
}
if (!is.null(coneforpath)) {
if (!dir.exists(dirname(coneforpath))) {
stop("error, the Conefor folder does not exist")
}
if(!file.exists(coneforpath)){
stop("error, Conefor 2.6 with command line interface does not exist")
}
}
if (class(nodes)[1] == "SpatialPolygonsDataFrame") {
nodes <- st_as_sf(nodes)
}
options(warn = -1)
ttt.2 <- getwd(); temp.1 <- paste0(tempdir(), "/TempInputs", sample(1:1000, 1, replace = TRUE))
dir.create(temp.1, recursive = TRUE)
if(class(nodes)[1] == "sf"){
if (is.null(id)) {
nodes$IdTemp <- 1:nrow(nodes)
} else {
nodes$IdTemp <- nodes[[id]]
}
id = "IdTemp"
} else {
id = NULL
}
nodesfile(nodes, id = id, attribute = attribute, area_unit = area_unit,
write = paste0(temp.1, "/nodes.txt"))
distancefile(nodes = nodes, id = id, type = distance$type,
distance_unit = distance$distance_unit, keep = distance$keep,
resistance = distance$resistance,
resist.units = distance$resist.units,
CostFun = distance$CostFun, ngh = distance$ngh,
mask = distance$mask,
threshold = distance$threshold,
geometry_out = distance$geometry_out,
bounding_circles = distance$bounding_circles,
parallel = distance$parallel,
pairwise = TRUE,
least_cost.java = distance$least_cost.java,
cores.java = distance$cores.java, ram.java = distance$ram.java,
write = paste0(temp.1,"/Dist.txt"))
setwd(temp.1)
if (is.null(distance$threshold)) {
pairs = "all"
} else {
pairs = "notall"
}
x = NULL
if(is.null(parallel)){
if(length(distance_thresholds)>1 & isTRUE(intern)){
pb <- txtProgressBar(0, length(distance_thresholds), style = 3)
}
BC_metric <- tryCatch(lapply(1:length(distance_thresholds), function(i){
# Update the progress bar with the iteration index before replacing it
# with the distance value (the original code set the bar to the distance
# itself, overshooting the bar's range).
if(length(distance_thresholds) > 1 & isTRUE(intern)){
setTxtProgressBar(pb, i)
}
x <- distance_thresholds[i]
dMetric <- EstConefor(nodeFile = "nodes.txt", connectionFile = "Dist.txt",
coneforpath = coneforpath,
typeconnection = "dist",
typepairs = pairs, index = metric,
thdist = x, multdist = NULL, conprob = probability,
onlyoverall = FALSE, LA = LA, nrestauration = FALSE,
prefix = NULL, write = NULL)
if(class(nodes)[1] == "sf"){
result_interm <- merge_conefor(datat = dMetric[[which(lapply(dMetric, function(x) ncol(x)) >= 11)]],
pattern = NULL,
merge_shape = nodes, id = "IdTemp",
write = if (!is.null(write)) paste0(write, "_d", x,".shp"),
dA = dA, var = dvars)
result_interm$"IdTemp" <- NULL
} else {
datat <- dMetric[[which(lapply(dMetric, function(x) ncol(x)) >= 11)]]
datat <- as.data.frame(datat); datat <- datat[, as.numeric(which(colSums(!is.na(datat)) > 0))]
if(isFALSE(dA)){
datat$dA <- NULL; datat$varA <- NULL
datat[,which(unique(is.na(datat)) == TRUE)] <- NULL
}
if (isFALSE(dvars)){
datat <- datat[which(grepl("var",names(datat), fixed = TRUE) == FALSE)]
}
###rasters values
rp <- unique(raster::values(nodes)) %>% as.vector(.); rp <- rp[which(!is.na(rp))]
m <- matrix(nrow = nrow(datat), ncol = 2); m[,1] <- datat[,1]
if(isTRUE(rasterparallel)){
works <- as.numeric(availableCores())-1
if(.Platform$OS.type == "unix") {
strat <- future::multicore
} else {
strat <- future::multisession
}
plan(strategy = strat, gc=TRUE, workers = works)
r_metric <- tryCatch(future_map(2:ncol(datat), function(c){
x1 <- datat[,c(1, c)]
for(i in rp){
m[which(m == i),2] <- x1[which(x1[,1]== i),2]
}
x1 <- reclassify(nodes, rcl = m)
return(x1)}, .progress = TRUE), error = function(err) err)
close_multiprocess(works)
} else {
r_metric <- lapply(2:ncol(datat), function(c){
x1 <- datat[,c(1, c)]
for(i in rp){
m[which(m == i),2] <- x1[which(x1[,1]== i),2]
}
x1 <- reclassify(nodes, rcl = m)
return(x1)})
}
result_interm <- list()
for(i in 2:(length(r_metric)+1)){
result_interm[[i]] <- r_metric[[i-1]]
}
result_interm[[1]] <- nodes; result_interm <- stack(result_interm)
names(result_interm) <- names(datat[,1:ncol(datat)])
if (!is.null(write)){
n <- names(datat[,1:ncol(datat)]); walk(as.list(2:length(n)), function(w){
x1 <- result_interm[[w]]
crs(x1) <- crs(result_interm)
writeRaster(x1, filename = paste0(write, "_", n[w], "_", x, ".tif"), overwrite = TRUE, options = c("COMPRESS=LZW", "TFW=YES"))
})
}
}
return(result_interm)
}), error = function(err) err)
} else {
works <- as.numeric(availableCores())-1; works <- if(parallel > works){works}else{parallel}
if(.Platform$OS.type == "unix") {
strat <- future::multicore
} else {
strat <- future::multisession
}
plan(strategy = strat, gc = TRUE, workers = works)
BC_metric <- tryCatch(future_map(1:length(distance_thresholds), function(x) {
x <- distance_thresholds[x]
dMetric <- EstConefor(nodeFile = "nodes.txt", connectionFile = "Dist.txt",
coneforpath = coneforpath,
typeconnection = "dist", typepairs = pairs, index = metric,
thdist = x, multdist = NULL, conprob = probability,
onlyoverall = FALSE, LA = LA, nrestauration = FALSE,
prefix = NULL, write = NULL)
if(class(nodes)[1] == "sf"){
result_interm <- merge_conefor(datat = dMetric[[which(lapply(dMetric, function(x) ncol(x)) >= 11)]], pattern = NULL,
merge_shape = nodes, id = "IdTemp",
write = if (!is.null(write)) paste0(write, "_d", x,".shp"),
dA = dA, var = dvars)
result_interm$"IdTemp" <- NULL
} else {
datat <- dMetric[[which(lapply(dMetric, function(x) ncol(x)) >= 11)]]
datat <- as.data.frame(datat)
datat <- datat[, as.numeric(which(colSums(!is.na(datat)) > 0))]
if(isFALSE(dA)){
datat$dA <- NULL; datat$varA <- NULL
datat[,which(unique(is.na(datat)) == TRUE)] <- NULL
}
if (isFALSE(dvars)){
datat <- datat[which(grepl("var",names(datat), fixed = TRUE) == FALSE)]
}
###rasters values
rp <- unique(raster::values(nodes)) %>% as.vector(.); rp <- rp[which(!is.na(rp))]
m <- matrix(nrow = nrow(datat), ncol = 2); m[,1] <- datat[,1]
r_metric <- lapply(2:ncol(datat), function(c){
x1 <- datat[,c(1, c)]
for(i in rp){
m[which(m == i),2] <- x1[which(x1[,1]== i),2]
}
x1 <- reclassify(nodes, rcl = m)
return(x1)})
result_interm <- list()
for(i in 2:(length(r_metric)+1)){
result_interm[[i]] <- r_metric[[i-1]]
}
result_interm[[1]] <- nodes; result_interm <- stack(result_interm)
names(result_interm) <- names(datat[,1:ncol(datat)])
if (!is.null(write)){
n <- names(datat[,1:ncol(datat)]); walk(as.list(2:length(n)), function(w){
x1 <- result_interm[[w]]
crs(x1) <- crs(result_interm)
writeRaster(x1, filename = paste0(write, "_", n[w], "_", x, ".tif"), overwrite = TRUE, options = c("COMPRESS=LZW", "TFW=YES"))
})
}
}
return(result_interm)
}, .progress = intern), error = function(err) err)
close_multiprocess(works)
}
if (inherits(BC_metric, "error")) {
setwd(ttt.2)
stop(BC_metric)
} else {
if (length(distance_thresholds) == 1) {
BC_metric <- BC_metric[[1]]
} else {
names(BC_metric) <- paste0("d", distance_thresholds)
}
setwd(ttt.2)
}
return(BC_metric)
}