p2t: Run Shiny app to label spatial data and imagery


View source: R/p2t.R

Description

Provide p2t with a directory containing UMAP tile data, a directory in which to save labeled tiles, metadata about the target classes, and the band indices of the imagery. The function then launches a Shiny app in which users can label imagery through a variety of mechanisms. The resulting labels are saved as .tif files whose integer values correspond to those provided in label_key.

Usage

p2t(
  umap_dir,
  label_dir,
  label_key,
  label_cols,
  r_band = 4,
  g_band = 5,
  b_band = 6,
  nir_band = 7
)

Arguments

umap_dir

Path to UMAP tiles.

label_dir

Path where label data .tifs will be saved.

label_key

A named list of integers corresponding to target label classes.

label_cols

Color palette for representing labeled classes.

r_band

The index of the red wavelength band in the UMAP tile stack. Because the umap_tile function prepends the 3 UMAP axes to the raster stack, add 3 to the band's index in the original pre-processed imagery.

g_band

The index of the green wavelength band in the UMAP tile stack. As above, add 3 to the band's index in the original pre-processed imagery.

b_band

The index of the blue wavelength band in the UMAP tile stack. As above, add 3 to the band's index in the original pre-processed imagery.

nir_band

The index of the near-infrared wavelength band in the UMAP tile stack. As above, add 3 to the band's index in the original pre-processed imagery.
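As a concrete sketch of how the defaults arise, assume 4-band input imagery ordered red, green, blue, NIR (bands 1-4 before pre-processing):

```r
# Bands of the original 4-band imagery, before pre-processing
r_orig   <- 1; g_orig <- 2; b_orig <- 3; nir_orig <- 4

# umap_tile prepends the 3 UMAP axes, shifting every imagery band up by 3
r_band   <- r_orig   + 3  # 4, the default
g_band   <- g_orig   + 3  # 5
b_band   <- b_orig   + 3  # 6
nir_band <- nir_orig + 3  # 7
```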

Value

None. Tiles are saved to label_dir.
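As a hedged sketch (not part of the package API), a saved label tile can be inspected with the terra package; the filename below is hypothetical:

```r
library(terra)

# Read one labeled tile back in; 'tile_001.tif' is a hypothetical filename
labels <- rast(file.path(label_dir, 'tile_001.tif'))

# Tabulate pixel counts per class; the integers match label_key,
# e.g. 0 = Unknown, 1 = Not woody, 2 = Woody
table(values(labels))
```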

Examples

library(paint2train)

image_dir <- tempfile()
image_url <- 'https://storage.googleapis.com/mpgranch_data/sample_4band.tif'
download.file(url = image_url, destfile = image_dir, mode = 'wb') #binary mode for .tif
tdir <- tempdir()
setwd(tdir) 
preproc_dir <- 'preproc_tiles'
umap_dir <- 'umap_tiles'
lab_dir <- 'label_tiles'
dir.create(preproc_dir)
dir.create(umap_dir)
dir.create(lab_dir)

#some test coordinates
xcoords <- c(727495,
             727919)

ycoords <- c(5175339,
             5175408)

coord_mat <- cbind(xcoords, ycoords)

ls <- 30 #side length of the tiles, in the native units of the CRS (meters here)
buff <- 5  #buffer in native units of CRS
cores <- ifelse(.Platform$OS.type == 'unix', #how many cores to use for preprocessing
                   parallel::detectCores() - 1,
                   1) 
umap_cores <- parallel::detectCores() - 1

tile_at_coords(coords = coord_mat,
 len_side = ls,
 buffer = buff,
 out_dir = preproc_dir,
 img = image_dir,
 ncores = cores)

# Pre-processing applied to each tile:
# t = path to a tile, fs = focal window width(s), b = buffer to remove
preproc_pipeline <- function(t, fs, b){
 ndvi_msavi(tile = t, r_band = 1, nir_band = 4) #vegetation indices
 sobel(t, axes = 3, fill_na = TRUE) #edge detection
 mean_var(t, axes = 3, f_width = fs, fill_na = TRUE) #focal mean and variance
 remove_buffer(tile = t, b = b)
}

targ_tiles <- list.files(preproc_dir, full.names = TRUE)

parallel::mclapply(FUN = preproc_pipeline, 
 X = targ_tiles, 
 mc.cores = cores, 
 fs = c(0.5, 1),
 b = buff)
 
lapply(FUN = umap_tile,
 X = targ_tiles,
 out_dir = umap_dir,
 n_threads = umap_cores, #args passed to umap
 n_sgd_threads = umap_cores) #args passed to umap

label_key <- list(Unknown = 0,
        `Not woody` = 1,
        `Woody` = 2)
#Establish the color for each class, used for app visualization
pal <- c('royalblue',
        'tan',
        'green')

# Start the app, note that work will be saved every time the 
# label, filter, fill buttons are clicked within the app.
# Prior work saved in the label_dir will be loaded to resume labeling
p2t(umap_dir = umap_dir, 
   label_dir = lab_dir, 
   label_key = label_key, 
   label_cols = pal)

mosscoder/paint2train documentation built on Jan. 21, 2022, 11 a.m.