extractLarge | R Documentation
Description

This function is the same as the extract function in the terra package, except that it is built to better handle cases in which the points/lines/polygons at which the extractions are made are very large in memory. Essentially, the function is a wrapper for extract with a for loop that allows you to discard unwanted columns. Note that this function is not intended to make things fast, just feasible for cases where memory may be a problem. A few tips for memory management:
If you do not need specific fields from an object you will be extracting from, remove them (e.g., x$doNotNeed <- NULL).
Likewise, removing rasters in a raster stack that you do not need may help.
On a Windows machine, you can give R more memory using memory.limit(memory.limit() * 2^n), where n is an integer. On my machine, which has 32 GB of RAM, I use n = 29.
Run R through the terminal, not through RStudio or the R GUI. This reduces overhead, since those programs themselves take up memory.
Close other programs you may not be using.
Start in a fresh R session and load only what you need to get the extraction done. Save the output before proceeding.
Go get a coffee, because it may take a while.
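The chunked strategy described above can be sketched in plain R. This is a minimal illustration under assumptions, not the package's actual implementation: chunkedExtract and extractFun are hypothetical names, with extractFun standing in for any per-record extraction function (such as terra's extract).

```r
# Minimal sketch (assumed, not the package's actual code) of the chunked
# strategy: process y a few records at a time, drop unwanted columns from
# each chunk so it stays small, then bind the chunks back together.
chunkedExtract <- function(extractFun, y, keep = NULL, atATime = 10000) {
	starts <- seq(1, nrow(y), by = atATime)
	out <- vector('list', length(starts))
	for (i in seq_along(starts)) {
		rows <- starts[i]:min(starts[i] + atATime - 1, nrow(y))
		thisOut <- extractFun(y[rows, , drop = FALSE])
		# discard unwanted columns before accumulating, to save memory
		if (!is.null(keep)) thisOut <- thisOut[ , keep, drop = FALSE]
		out[[i]] <- thisOut
	}
	do.call(rbind, out) # one data frame, as from a single extraction
}

# toy "extraction" that returns more columns than we want to keep
toyExtract <- function(pts) data.frame(val = pts$x * 2, junk = NA)
pts <- data.frame(x = 1:25)
ex <- chunkedExtract(toyExtract, pts, keep = 'val', atATime = 10)
```

Because columns are dropped chunk by chunk, only one chunk's worth of unwanted data is ever held in memory at a time.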
Usage

extractLarge(x, y, keep = NULL, atATime = 10000, verbose = TRUE, ...)
Arguments

x: Object from which values are extracted.

y: Points, lines, or polygons at which extractions are made; this is the object that may be very large in memory.

keep: Names or indices of the fields or values to keep; all others are discarded.

atATime: Number of records in y to process per iteration of the loop.

verbose: Report progress (TRUE or FALSE).

...: Arguments to pass to extract.
Value

Data frame.
Examples

# This example does *not* require this function because the raster
# and points are so small in memory. But it illustrates the process.
data(mad0)
data(lemurs)
ll <- c('longitude', 'latitude')
lemurs <- lemurs[ , ll]

# using extractLarge()
ex1 <- extractLarge(mad0, lemurs, atATime=10)

# using just extract()
ex2 <- extract(mad0, lemurs)