
esri2sf

A great deal of geographic data is still delivered through ESRI's ArcGIS Server, and it is not easy to use data sitting in those GIS servers from data-analysis platforms like R or pandas. This package enables users to access vector data in an ArcGIS Server from R through the server's REST API. It downloads geographic features from the server and saves them as Simple Features.

The Git repository is at https://github.com/yonghah/esri2sf

How it works

This program sends a request to an ArcGIS Server and receives JSON responses containing the coordinates of the geometries. Unfortunately, ESRI's JSON format is not the same as GeoJSON, so the package first converts the JSON into WKT strings, then creates simple feature geometries from those strings, and finally joins the attribute data to the geometries to build an sf data frame. ArcGIS servers often limit the maximum number of rows in a result set, so the program retrieves 500 features per request and keeps sending requests until it has all the features.
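
As a rough illustration of that batching (a minimal sketch, not the package's internal code), you can page through a public Feature Server layer yourself with httr and jsonlite: first ask the server for all object IDs, then request the features in chunks of 500.

library(httr)
library(jsonlite)

url <- "https://services.arcgis.com/V6ZHFr6zdgNZuVG0/arcgis/rest/services/Landscape_Trees/FeatureServer/0"

# Ask the layer for every object ID (no geometries yet)
ids <- fromJSON(content(GET(
  paste0(url, "/query"),
  query = list(where = "1=1", returnIdsOnly = "true", f = "json")
), as = "text"))$objectIds

# Request the features 500 object IDs at a time
batches <- split(ids, ceiling(seq_along(ids) / 500))
pages <- lapply(batches, function(batch) {
  fromJSON(content(GET(
    paste0(url, "/query"),
    query = list(
      objectIds = paste(batch, collapse = ","),
      outFields = "*",
      f = "json"
    )
  ), as = "text"))$features
})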

Install

I recommend remotes to install this package. It depends on dplyr, sf, httr, jsonlite, rstudioapi, DBI, RSQLite, and crayon.

remotes::install_github("yonghah/esri2sf")
library(esri2sf)

How to use

All you need is the URL of the REST service you want. You can get the URL by asking your GIS admin or by looking at the JavaScript code that creates the feature layer.

Point data

url <- "https://services.arcgis.com/V6ZHFr6zdgNZuVG0/arcgis/rest/services/Landscape_Trees/FeatureServer/0"
df <- esri2sf(url)
plot(df)

Polyline data

You can filter the output fields. This may take a minute since it downloads 18,000 polylines.

url <- "https://services.arcgis.com/V6ZHFr6zdgNZuVG0/arcgis/rest/services/Florida_Annual_Average_Daily_Traffic/FeatureServer/0"
outFields <- c("AADT", "DFLG")
df <- esri2sf(url, outFields)
plot(df)

Polygon data

You can also filter rows by supplying a where condition.

url <- "https://sampleserver1.arcgisonline.com/ArcGIS/rest/services/Demographics/ESRI_Census_USA/MapServer/3"
where <- "STATE_NAME = 'Michigan'"
df <- esri2sf(url, where = where, outFields = c("POP2000", "pop2007", "POP00_SQMI", "POP07_SQMI"))
plot(df)

Tabular data

You can download non-spatial tables of the 'Table' layer type using esri2df().

url <- "https://sampleserver1.arcgisonline.com/ArcGIS/rest/services/WaterTemplate/WaterDistributionInventoryReport/MapServer/5"
df <- esri2df(url, objectIds = paste(1:50, collapse = ","))
df

crs parameter example

When the crs parameter is specified, any transformation that happens is done within ESRI's REST API. Take care when specifying an output crs that requires a datum transformation, as ESRI will automatically apply a default transformation (with no feedback as to which one), which could introduce small unexpected errors into your data. By default, esri2sf() transforms any data source to WGS 1984 (EPSG:4326), but it may be safer to set crs = NULL, which returns the data in the same CRS in which it is hosted in the Feature/Map Service. That way you can control any transformation manually in a known, reproducible manner (see the sketch after the examples below).

url <- "https://sampleserver1.arcgisonline.com/ArcGIS/rest/services/Demographics/ESRI_Census_USA/MapServer/3"
where <- "STATE_NAME = 'Michigan'"
outFields <- c("POP2000", "pop2007", "POP00_SQMI", "POP07_SQMI")

#default crs = 4326
esri2sf(url, where = where, outFields = outFields) 

#No transformation (recommended)
esri2sf(url, where = where, outFields = outFields, crs = NULL)
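
If you take the crs = NULL route, a minimal sketch of handling the reprojection yourself (assuming you want WGS 1984 in the end) is to call sf::st_transform() on the result:

# Download in the hosted CRS, then reproject locally in a controlled way
df_native <- esri2sf(url, where = where, outFields = outFields, crs = NULL)
df_wgs84 <- sf::st_transform(df_native, 4326)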

Also, since the addition of the WKT1_ESRI output to sf::st_crs() in sf version 1.0-1, you can pass any common CRS format (anything that sf::st_crs() can handle) to the crs parameter, and it will be converted to the ESRI-formatted WKT needed for the outSR field in the REST query. Below are examples of the variety of input types you can use with the crs parameter. All are just different formulations of the ESRI:102690 CRS.

#ESRI Authority Code
df1 <- esri2sf(url, where = where, outFields = outFields, crs = "ESRI:102690")
#PROJ string
df2 <- esri2sf(url, where = where, outFields = outFields, crs = "+proj=lcc +lat_1=42.1 +lat_2=43.66666666666666 +lat_0=41.5 +lon_0=-84.36666666666666 +x_0=4000000 +y_0=0 +datum=NAD83 +units=us-ft +no_defs")
#OGC WKT
df3 <- esri2sf(url, where = where, outFields = outFields, crs = 'PROJCS["NAD_1983_StatePlane_Michigan_South_FIPS_2113_Feet",GEOGCS["GCS_North_American_1983",DATUM["North_American_Datum_1983",SPHEROID["GRS_1980",6378137,298.257222101]],PRIMEM["Greenwich",0],UNIT["Degree",0.017453292519943295]],PROJECTION["Lambert_Conformal_Conic_2SP"],PARAMETER["False_Easting",13123333.33333333],PARAMETER["False_Northing",0],PARAMETER["Central_Meridian",-84.36666666666666],PARAMETER["Standard_Parallel_1",42.1],PARAMETER["Standard_Parallel_2",43.66666666666666],PARAMETER["Latitude_Of_Origin",41.5],UNIT["Foot_US",0.30480060960121924],AUTHORITY["EPSG","102690"]]')

That these yield the same output CRS can be verified with the following function, which calculates the mean difference in the X and Y coordinates at each point. All differences are very close to 0.

coord_diff <- function(df1, df2) {
  suppressWarnings({
    c(
      "x" = mean(sf::st_coordinates(sf::st_cast(df1, "POINT"))[,1] - sf::st_coordinates(sf::st_cast(df2, "POINT"))[,1]),
      "y" = mean(sf::st_coordinates(sf::st_cast(df1, "POINT"))[,2] - sf::st_coordinates(sf::st_cast(df2, "POINT"))[,2])
    )
  })
}
coord_diff(df1, df2)
coord_diff(df1, df3)
coord_diff(df2, df3)

