chunkedMap: Applies a Function on Each Chunk of a File-Backed Matrix

View source: R/chunkedApply.R

chunkedMap {BGData}  R Documentation

Applies a Function on Each Chunk of a File-Backed Matrix

Description

Similar to lapply, but designed for file-backed matrices. The function brings chunks of the object into physical memory by taking subsets and applies FUN to each chunk. If nCores is greater than 1, FUN is applied in parallel using mclapply. In that case, the subsets of the object are extracted on the worker processes.
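
For intuition, the chunking behaviour described above can be sketched in base R. The snippet below uses an ordinary in-memory matrix as a hypothetical stand-in for a file-backed one and mirrors the chunkSize/chunkBy semantics only; it is not the package's internal implementation.

X <- matrix(rnorm(20), nrow = 4, ncol = 5)   # stand-in for a file-backed matrix
chunkSize <- 2
starts <- seq(1, ncol(X), by = chunkSize)    # first column of each chunk
lapply(starts, function(start) {
    cols <- start:min(start + chunkSize - 1, ncol(X))
    chunk <- X[, cols, drop = FALSE]         # bring one chunk into physical memory
    colSums(chunk)                           # apply FUN to that chunk
})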

Usage

chunkedMap(X, FUN, i = seq_len(nrow(X)), j = seq_len(ncol(X)),
  chunkBy = 2L, chunkSize = 5000L,
  nCores = getOption("mc.cores", 2L), verbose = FALSE, ...)

Arguments

X

A file-backed matrix, typically the genotypes of a BGData object.

FUN

The function to be applied on each chunk.

i

Indicates which rows of X should be used. Can be integer, logical, or character. By default, all rows are used.

j

Indicates which columns of X should be used. Can be integer, logical, or character. By default, all columns are used.

chunkBy

Whether to extract chunks by rows (1) or by columns (2). Defaults to columns (2).

chunkSize

The number of rows or columns of X that are brought into physical memory for processing per core. If NULL, all elements in i or j are used. Defaults to 5000.

nCores

The number of cores (passed to mclapply). Defaults to getOption("mc.cores", 2L).

verbose

Whether progress updates will be posted. Defaults to FALSE.

...

Additional arguments to be passed to the apply-like function.

See Also

file-backed-matrices for more information on file-backed matrices. multi-level-parallelism for more information on multi-level parallelism. BGData-class for more information on the BGData class.

Examples

# Restrict number of cores to 1 on Windows
if (.Platform$OS.type == "windows") {
    options(mc.cores = 1)
}

# Load example data
bg <- BGData:::loadExample()

# Compute column sums of each chunk
chunkedMap(X = geno(bg), FUN = colSums)
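
# The calls below are a hypothetical sketch illustrating the remaining
# arguments documented above; they assume the example data has at least
# 10 rows and 20 columns and are not part of the package's shipped examples.

# Process by rows instead of columns, in smaller chunks
chunkedMap(X = geno(bg), FUN = rowSums, chunkBy = 1L, chunkSize = 100L)

# Restrict the computation to a subset of rows and columns
chunkedMap(X = geno(bg), FUN = colSums, i = 1:10, j = 1:20)

# Use an anonymous function to handle missing genotypes within each chunk
chunkedMap(X = geno(bg), FUN = function(chunk) colMeans(chunk, na.rm = TRUE))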
