bed_tcrossprodSelf: tcrossprod / GRM

View source: R/bed-tcrossprodSelf.R


tcrossprod / GRM

Description

Compute G G^T from a bed object, with possible filtering and scaling of G. For example, this can be used to compute GRMs.

Usage

bed_tcrossprodSelf(
  obj.bed,
  fun.scaling = bed_scaleBinom,
  ind.row = rows_along(obj.bed),
  ind.col = cols_along(obj.bed),
  block.size = block_size(length(ind.row))
)

Arguments

obj.bed

Object of type bed, which is the mapping of some bed file. Use obj.bed <- bed(bedfile) to get this object.

fun.scaling

A function with parameters X (or obj.bed), ind.row and ind.col, which returns a data.frame with $center and $scale for the columns corresponding to ind.col, used to scale each of their elements as follows:

\frac{X_{i,j} - center_j}{scale_j}.

Default uses binomial scaling. You can also provide your own center and scale by using bigstatsr::as_scaling_fun(); see the sketch after the arguments list.

ind.row

An optional vector of the row indices (individuals) that are used. If not specified, all rows are used.
Don't use negative indices.

ind.col

An optional vector of the column indices (SNPs) that are used. If not specified, all columns are used.
Don't use negative indices.

block.size

Maximum number of columns read at once. Default uses block_size.
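
As a complement to the fun.scaling argument above, here is a minimal sketch of supplying a custom scaling function through bigstatsr::as_scaling_fun(), rebuilding a binomial-like scaling by hand from allele frequencies given by bed_MAF(). This is an illustration only; it assumes the first two arguments of as_scaling_fun() are the per-column centers and scales, and that bed_MAF() returns an $af column.

## Sketch: custom centering/scaling built from allele frequencies
bedfile <- system.file("extdata", "example.bed", package = "bigsnpr")
obj.bed <- bed(bedfile)
af <- bed_MAF(obj.bed)$af                              ## allele frequencies of all SNPs
my_scaling <- bigstatsr::as_scaling_fun(2 * af, sqrt(2 * af * (1 - af)))
K2 <- bed_tcrossprodSelf(obj.bed, fun.scaling = my_scaling)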

Value

A temporary FBM, with the following two attributes:

  • a numeric vector center of column scaling,

  • a numeric vector scale of column scaling.
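
For instance, a brief sketch of retrieving this scaling information, assuming the attributes can be read with attr() as listed above:

K <- bed_tcrossprodSelf(obj.bed)
str(attr(K, "center"))  ## numeric vector of per-column centers
str(attr(K, "scale"))   ## numeric vector of per-column scales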

Matrix parallelization

Large matrix computations are performed block-wise and are not parallelized, so that the size of these blocks does not have to be reduced. Instead, you can use an optimized BLAS such as MKL or OpenBLAS to accelerate these block matrix computations. You can control the number of cores used by these matrix libraries with bigparallelr::set_blas_ncores().
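
For instance, a sketch of such a setup, assuming R is linked to an optimized BLAS and that bigparallelr::get_blas_ncores() is available to read the current setting:

old_ncores <- bigparallelr::get_blas_ncores()
bigparallelr::set_blas_ncores(4)            ## let the BLAS use 4 cores for block products
K <- bed_tcrossprodSelf(obj.bed)
bigparallelr::set_blas_ncores(old_ncores)   ## restore the previous setting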

Examples

bedfile <- system.file("extdata", "example.bed", package = "bigsnpr")
obj.bed <- bed(bedfile)

K <- bed_tcrossprodSelf(obj.bed)
K[1:4, 1:6] / ncol(obj.bed)
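
## Further sketch (not in the original example): restrict the computation to
## subsets of individuals (ind.row) and SNPs (ind.col), then divide by the
## number of SNPs used to get a GRM on that subset.
ind.row <- head(rows_along(obj.bed), 100)
ind.col <- head(cols_along(obj.bed), 500)
K.sub <- bed_tcrossprodSelf(obj.bed, ind.row = ind.row, ind.col = ind.col)
K.sub[1:4, 1:4] / length(ind.col)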

