For a better version of the stars vignettes see https://r-spatial.github.io/stars/articles/
knitr::opts_chunk$set(echo = TRUE, collapse = TRUE, dev = "png")
suppressPackageStartupMessages(library(dplyr))
knitr::opts_chunk$set(fig.height = 4.5)
knitr::opts_chunk$set(fig.width = 6)
EVAL = suppressWarnings(require(starsdata, quietly = TRUE))
NetCDF data sources are available as files and/or OPeNDAP endpoints of varying granularity. This article demonstrates how stars enables discovery, access, and processing of NetCDF data across a wide range of such source-data organization schemes.
We'll start with some basics using datasets included with the stars installation. A call to read_ncdf() on a dataset smaller than the default size threshold will simply read all the data into memory. Below we read in and display the reduced.nc NetCDF file.
library(stars)
f <- system.file("nc/reduced.nc", package = "stars")
(nc <- read_ncdf(f))
Let's assume reduced.nc contained 10 years of hourly data rather than a single time step. It would be over 10GB rather than about 130KB, and we could not simply read it all into memory. In that case, we need a way to read the file's metadata so that we can iterate over the data in a way that meets our workflow objectives. This is where proxy = TRUE comes in. Below, we lower the option that controls the size threshold above which read_ncdf() defaults to proxy mode, and also pass proxy = TRUE explicitly, showing both ways of getting the same result.
old_options <- options("stars.n_proxy" = 100)
(nc <- read_ncdf(f, proxy = TRUE))
options(old_options)
The above shows that we have a NetCDF-sourced stars proxy derived from the reduced.nc file. It has four variables, whose units are displayed. The normal stars dimensions are available, as is an nc_request object. The nc_request object contains the information needed to make requests for data according to the dimensions of the NetCDF data source. With this information, we have what we need to request a chunk of data that is exactly what we want and not too large.
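Before issuing a request, it can help to inspect the proxy's dimensions to decide on a sensible hyperslab. A minimal sketch (not part of the original example), recreating the proxy from the bundled file:

```r
library(stars)

# Recreate the proxy from the bundled example file, then inspect it
# without reading any data values.
f <- system.file("nc/reduced.nc", package = "stars")
nc <- read_ncdf(f, proxy = TRUE)

# Sizes of each NetCDF dimension: these bound the valid `start` and
# `count` values for an `ncsub` request.
dim(nc)

# Offsets, deltas, and reference systems for each dimension.
st_dimensions(nc)
```

With the dimension sizes in hand, the start/count matrix passed to ncsub below can be chosen to stay within bounds.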
(nc <- read_ncdf(f,
                 var = "sst",
                 ncsub = cbind(start = c(90, 45, 1, 1),
                               count = c(90, 45, 1, 1))))
plot(nc)
The ability to view NetCDF metadata so we can make well-formed requests against the data is useful, but the real power of a proxy object is that we can use it in a "lazy evaluation" coding style. That is, we can perform virtual operations on the object, such as subsetting with another dataset, before actually accessing the data volume.
Lazy operations.
There are two kinds of lazy operations possible with stars_proxy objects. Some can be applied to the stars_proxy object itself without accessing underlying data. Others must be composed as a chain of calls that will be applied when data is actually required.
Methods applied to a stars_proxy object:
- `[` - nearly the same as the `stars_proxy` method
- `[[<-` - `stars_proxy` method works
- `print` - unique method for `nc_proxy` to facilitate unique workflows
- `dim` - `stars_proxy` method works
- `c` - `stars_proxy` method works
- `st_redimension` - not sure what this entails; it might not make sense for `nc_proxy`
- `st_mosaic` - calls `read_stars()` on an assembled list; not supported for now
- `st_set_bbox`

Methods that add a call to the call_list:

- `[<-`
- `adrop`
- `aperm`
- `is.na`
- `split`
- `st_apply`
- `predict`
- `merge`
- `st_crop`
- `drop_levels`
- `Ops` (group generic for `+`, `-`, etc.)
- `Math` (group generic for `abs`, `sqrt`, `tan`, etc.)
- `filter`
- `mutate`
- `transmute`
- `select`
- `rename`
- `pull`
- `slice` - hyperslabbing for NetCDF could be as above?
- `replace_na`

Methods that cause a stars_proxy object to be fetched and turned into a stars object:

- `as.data.frame`
- `plot`
- `st_as_stars`
- `aggregate`
- `st_dimensions<-` - see https://github.com/r-spatial/stars/issues/494
- `hist`
- `st_downsample`
- `st_sample`
- `st_as_sf`
- `write_stars`
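Putting these together, a lazy chain might look like the following sketch. The bounding-box values here are illustrative assumptions, not part of the original example:

```r
library(stars)

f <- system.file("nc/reduced.nc", package = "stars")
nc_proxy <- read_ncdf(f, proxy = TRUE)

# st_crop() on a proxy is lazy: it only appends to the call list,
# no data are read yet. The bbox values below are arbitrary for
# illustration.
bb <- st_bbox(c(xmin = -80, ymin = 0, xmax = 0, ymax = 60),
              crs = st_crs(nc_proxy))
cropped <- st_crop(nc_proxy, bb)

# Data are actually fetched only when a materializing method runs,
# e.g. st_as_stars() or plot().
result <- st_as_stars(cropped)
```

Because the crop is recorded rather than executed, only the cropped portion of the source needs to be read when st_as_stars() finally materializes the result.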