wdpa_fetch (R Documentation)

Description

Fetch data from Protected Planet. Specifically, data are downloaded from the World Database on Protected Areas (WDPA) and the World Database on Other Effective Area-Based Conservation Measures (WDOECM). Note that data are downloaded assuming non-commercial use.

Usage
wdpa_fetch(
  x,
  wait = FALSE,
  download_dir = tempdir(),
  force_download = FALSE,
  check_version = TRUE,
  n = NULL,
  page_wait = 2,
  datatype = "gdb",
  verbose = interactive()
)
Arguments

x: character country for which to download data. This can be the name of the country (e.g., "Liechtenstein") or its ISO-3 code (e.g., "LIE"). It can also be set to "global" to download the full global dataset.

wait: logical should the function wait until the data are ready for download on Protected Planet? Defaults to FALSE.

download_dir: character folder path where the data should be downloaded. Defaults to a temporary directory (i.e., tempdir()).

force_download: logical should the data be downloaded again even if a previously downloaded copy is available in download_dir? Defaults to FALSE.

check_version: logical when importing previously downloaded data, should the function check whether a newer version of the dataset is available on Protected Planet? Defaults to TRUE.

n: integer number of records to import per data source. Defaults to NULL, such that all records are imported.

page_wait: numeric number of seconds to wait for web pages on Protected Planet to load when finding the download URL. Defaults to 2.

datatype: character file format in which to download the data. Defaults to "gdb" (file geodatabase format).

verbose: logical should progress be reported? Defaults to interactive().
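For illustration, the following sketch (not part of the original documentation) shows how several of these arguments might be combined. It assumes that a "shp" (shapefile) option is accepted by datatype alongside the default "gdb", and that an internet connection and a supported browser are available.

# illustrative sketch: download the Liechtenstein data in shapefile format,
# allow extra time for pages to load on slow connections, and cache the
# download in a persistent folder so it is reused across R sessions
lie_shp_data <- wdpa_fetch(
  "LIE",
  wait = TRUE,
  datatype = "shp",
  page_wait = 10,
  download_dir = rappdirs::user_data_dir("wdpar")
)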
Details

This function obtains and imports data from Protected Planet. By default (per force_download = FALSE), it will check to see if the data have already been downloaded and, if so, simply import the previously downloaded data. It will also check to see if a newer version of the dataset is available on Protected Planet (per check_version = TRUE) and, if so, provide an alert. If the latest version is not required, this alert can be safely ignored. However, if the latest version of the data is required, then using force_download = TRUE will ensure that the latest version is always obtained. After importing the data, it is strongly recommended to clean the data prior to analysis (see wdpa_clean()).
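For example, a minimal sketch of this recommended workflow, which forces a fresh download and then cleans the result with wdpa_clean() using its default settings:

# fetch the latest version of the data, ignoring any previously cached copy
lie_raw <- wdpa_fetch("LIE", wait = TRUE, force_download = TRUE)
# clean the raw data prior to analysis (strongly recommended)
lie_clean <- wdpa_clean(lie_raw)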
Value

A sf::sf() object.

The PA_DEF column indicates the data source for the individual areas and sites that comprise the imported dataset. Specifically, data obtained through the World Database on Protected Areas (WDPA) are indicated with a value of 1 in the PA_DEF column. Additionally, data obtained through the World Database on Other Effective Area-Based Conservation Measures (WDOECM) are indicated with a value of 0 in the PA_DEF column. For more details on data conventions, please consult the official manual (UNEP-WCMC 2019).
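As an illustration (not part of the original documentation), the PA_DEF column can be used to tabulate and separate the two data sources after fetching. The object name lie_raw_data follows the examples below, and comparing against 1 and 0 works whether the column is stored as numeric or character, because R coerces the numbers to strings in the comparison.

# count records from the WDPA (PA_DEF = 1) and the WDOECM (PA_DEF = 0)
table(lie_raw_data$PA_DEF)
# subset the two data sources separately
wdpa_records   <- lie_raw_data[lie_raw_data$PA_DEF == 1, ]
wdoecm_records <- lie_raw_data[lie_raw_data$PA_DEF == 0, ]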
Troubleshooting

The function requires a Chromium-based browser (e.g., Google Chrome, Chromium, or Brave) to be installed. This is because it uses the chromote package to find the URL for downloading data from Protected Planet. If you don't have one of these browsers installed, then please try installing Google Chrome. If you do have one of these browsers installed and this function throws an error indicating that it can't find the browser, try setting the CHROMOTE_CHROME environment variable to the file path of the executable. For example, you could do this with:

Sys.setenv(CHROMOTE_CHROME = "INSERT_FILE_PATH_HERE.exe")

Also, the function will sometimes produce a message that complains about a handle_read_frame error. Please understand that this message is, in fact, not an error and can be safely ignored (see https://github.com/rstudio/chromote/pull/111). As such, if you see this message when running the function, you can assume that the function still worked correctly. For reference, the misleading message will look something like this:

[error] handle_read_frame error: websocketpp.transport:7 (End of File)

For further help with troubleshooting, please refer to the documentation for the chromote package (https://rstudio.github.io/chromote/).
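As a quick diagnostic, the following sketch checks which browser executable chromote has detected; the Windows path in the second line is purely illustrative and should be replaced with the location of your own installation.

# report the browser executable that chromote will use (if one can be found)
chromote::find_chrome()
# point chromote at a specific executable (illustrative path; adjust for your system)
Sys.setenv(CHROMOTE_CHROME = "C:/Program Files/Google/Chrome/Application/chrome.exe")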
References

UNEP-WCMC (2019). User Manual for the World Database on Protected Areas and World Database on Other Effective Area-Based Conservation Measures: 1.6. UNEP-WCMC: Cambridge, UK. Available at: https://wcmc.io/WDPA_Manual.
See Also

wdpa_clean(), wdpa_read(), wdpa_url(), countrycode::countrycode()
Examples

## Not run:
# fetch data for Liechtenstein
lie_raw_data <- wdpa_fetch("Liechtenstein", wait = TRUE)
# print data
print(lie_raw_data)
# plot data
plot(lie_raw_data)
# fetch data for Liechtenstein using the ISO3 code
lie_raw_data <- wdpa_fetch("LIE", wait = TRUE)
# since data are saved in a temporary directory by default,
# a persistent directory can be specified to avoid having to download the
# same dataset every time the R session is restarted
lie_raw_data <- wdpa_fetch("LIE", wait = TRUE,
download_dir = rappdirs::user_data_dir("wdpar"))
# data for multiple countries can be downloaded separately and combined;
# this is useful to avoid having to download the global dataset
## load packages to easily merge datasets
library(dplyr)
library(sf)
library(tibble)
## define country codes for the data to download
country_codes <- c("LIE", "MHL")
## download data for each country
mult_data <- lapply(country_codes, wdpa_fetch, wait = TRUE)
## merge datasets together
mult_dat <- st_as_sf(as_tibble(bind_rows(mult_data)))
## print data
print(mult_dat)
## End(Not run)