stats.ecors {ecors}    R Documentation
Description

Computes descriptive statistics for the study polygons at the defined sampling periods.
Usage

stats.ecors(
  x,
  edge.pixels = "weighted",
  remove.samples = list(num.pixelOK = NULL, prop.pixelOK = 0.9),
  summarizing = "all",
  by.image.save = T,
  summary.save = T,
  stats = list(mean = T, median = T, sd = T, count = F),
  lower.cutoff = NULL,
  upper.cutoff = NULL,
  bands.index.subset = NULL,
  spreadsheet.folder = getwd()
)
Arguments

x
an ecors object (from get.ecors).

edge.pixels
treatment of pixels that coincide with the polygon edges: "weighted" weights the value of these pixels in the descriptive statistics by the proportion of their area inside the polygon; "centroid" ignores pixels whose centroid falls outside the polygon.

remove.samples
removes sample units that are poorly represented after bad pixels are excluded. A minimum number of good pixels (num.pixelOK) and/or a minimum proportion of good pixels (prop.pixelOK) can be set.

summarizing
selects whether the data are summarized by year ("yearly") or over the entire evaluated period ("all"). The "yearly" option treats a year as every 12 months counted from the initial month of image collection.

by.image.save
save a csv file with descriptive statistics for each image?

summary.save
save a csv file with descriptive statistics summarizing all the images?

stats
enables/disables the calculation of the mean, median, standard deviation (sd) and count (of pixels with values between lower.cutoff and upper.cutoff).

lower.cutoff
lower threshold value for ignoring pixels in all stats (required for count). Must be a vector whose length equals the number of bands/indices to be analyzed (see the sketch after this argument list).

upper.cutoff
upper threshold value for ignoring pixels in all stats (required for count). Must be a vector whose length equals the number of bands/indices to be analyzed.

bands.index.subset
subset of bands/indices to use in the statistics.

spreadsheet.folder
local folder in which to save the csv files.
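As a rough sketch of how the cutoff arguments line up with the analyzed bands: the values below are copied from the Examples section and are purely illustrative, and the position-by-position pairing is assumed from the requirement that the cutoff vectors match the number of bands/indices.

# Assumed pairing (by position) between the cutoff vectors and the
# bands/indices being analyzed; values are illustrative only.
bands.index.subset <- c("SR_B3", "SR_B4")
lower.cutoff <- c(9000, 8500)      # assumed: first value for SR_B3, second for SR_B4
upper.cutoff <- c(10000, 10000)    # assumed: first value for SR_B3, second for SR_B4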
Value

A list with result tables and metadata. If enabled, csv files are saved with the tables (two for each descriptive statistic): results.mean.samples.csv, summary.mean.samples.csv, results.median.samples.csv, summary.median.samples.csv, results.sd.samples.csv, summary.sd.samples.csv, results.count.samples.csv, summary.count.samples.csv.
References

https://developers.google.com/earth-engine/guides/reducers_weighting
Examples

#get an ecors class object
FAL.IBGE.JBB<-sf::st_read(system.file("extdata/FAL.IBGE.JBB.gpkg", package="ecors"))
test.plots<-sf::st_read(system.file("extdata/Plots_tests.gpkg", package="ecors"))
test.points<-sf::st_read(system.file("extdata/Points_tests.gpkg", package="ecors"))

#library(ecors)

d2020<-get.ecors(site=FAL.IBGE.JBB, points=test.points, plots=test.plots,
  buffer.points=500, buffer.plots=500, eval.area="site",
  projected=F, custom.crs=32723,
  collection="LANDSAT/LC08/C02/T1_L2",
  start=c("2020-01-01"), end=c("2020-12-31"),
  bands.eval=c("SR_B3","SR_B4"), bands.vis=F, indices=c("NDVI"),
  resolution=30, pOK=0.3, c.prob=NULL, c.dist=100,
  clouds.sentinel=NULL, cirrus.threshold=NULL, NIR.threshold=NULL,
  CDI.threshold=NULL, dmax.shadow=NULL,
  seasons=list(s1=c(11,12,1,2), s2=c(3,4), s3=c(5,6,7,8), s4=c(9,10)),
  group.by="season", composite=NULL)

#descriptive statistics over all good pixels
allpixels<-stats.ecors(x=d2020, edge.pixels="weighted",
  remove.samples=list(num.pixelOK=10, prop.pixelOK=0.8),
  summarizing="all", by.image.save=T, summary.save=T,
  stats=list(mean=T, median=F, sd=F, count=F),
  spreadsheet.folder=getwd())

#descriptive statistics restricted to a range of pixel values
rangepixels<-stats.ecors(x=d2020, edge.pixels="weighted",
  remove.samples=list(num.pixelOK=10, prop.pixelOK=0.8),
  summarizing="all", by.image.save=T, summary.save=T,
  stats=list(mean=T, median=F, sd=F, count=T),
  lower.cutoff=c(9000,8500), upper.cutoff=c(10000,10000),
  bands.index.subset=c("SR_B3","SR_B4"),
  spreadsheet.folder=getwd())
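After the calls above finish, the saved summary spreadsheets can be read back into R for further analysis. This is a minimal sketch assuming the csv files listed under Value were written directly to spreadsheet.folder (here getwd()); adjust the paths if your setup differs.

#load the saved summaries (file names as listed under Value; paths assume
#spreadsheet.folder = getwd())
mean.summary  <- read.csv(file.path(getwd(), "summary.mean.samples.csv"))
count.summary <- read.csv(file.path(getwd(), "summary.count.samples.csv"))
head(mean.summary)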