if (requireNamespace("ggplot2", quietly = TRUE) &&
    requireNamespace("albersdown", quietly = TRUE)) {
  ggplot2::theme_set(albersdown::theme_albers(family = params$family, preset = params$preset))
}
knitr::opts_chunk$set(
  collapse = TRUE,
  comment = "#>",
  message = FALSE,
  warning = FALSE
)
suppressPackageStartupMessages({
  library(neuroim2)
  library(purrr)
})
Most analysis work in neuroim2 reduces to the same pattern: define a spatial
support, extract values from it, then summarize or map those values into a new
result. This article shows the three main ways to do that.
Start with a 4D image and define one spherical region of interest.
vec_file <- system.file("extdata", "global_mask_v4.nii", package = "neuroim2")
vol <- read_vol(vec_file)
vec <- read_vec(vec_file)
roi <- spherical_roi(vol, c(12, 12, 12), radius = 6)
roi_ts <- series_roi(vec, roi)
dim(values(roi_ts))
stopifnot(length(roi) > 0L)
stopifnot(nrow(values(roi_ts)) == dim(vec)[4])
That gives you one compact object containing the time series from every voxel inside the ROI.
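A common next step is to collapse the ROI to a single mean time course. A minimal sketch, assuming `values(roi_ts)` is a timepoints-by-voxels matrix (which is what the dimension check above implies):

```r
# Average across voxels at each timepoint to get one mean time course.
# Assumes rows of values(roi_ts) index timepoints and columns index voxels.
mean_ts <- rowMeans(values(roi_ts))
length(mean_ts)  # one value per timepoint
```

The same matrix supports any column- or row-wise summary, so voxel-wise statistics fall out of `colMeans()` and friends just as easily.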
When you already have a parcellation or cluster assignment, split_clusters()
turns one NeuroVec into a list of region-wise objects.
set.seed(1)
mask_vol <- vol > 0
cluster_ids <- sample(1:4, sum(mask_vol), replace = TRUE)
clustered <- ClusteredNeuroVol(mask_vol, cluster_ids)
parts <- split_clusters(vec, clustered)
length(parts)
part_means <- map_dbl(parts, ~ mean(values(.)))
part_means
stopifnot(length(parts) == 4L)
stopifnot(all(is.finite(part_means)))
This is the right pattern when the support is fixed in advance by an atlas, parcel map, or clustering step.
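As a sketch of the reduce step, the per-parcel results can be collected into one small summary table. This is plain R over the `parts` list from above, not a dedicated neuroim2 API:

```r
# Reduce: one row per parcel, with a couple of illustrative summaries.
parcel_summary <- data.frame(
  cluster  = seq_along(parts),
  mean_val = map_dbl(parts, ~ mean(values(.))),
  sd_val   = map_dbl(parts, ~ sd(values(.)))
)
parcel_summary
```

A data frame keyed by cluster id is also the natural input for joining parcel-level results back onto an atlas lookup table.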
Searchlights define many overlapping ROIs, one centered at each voxel in a mask. The lazy form is useful because you only realize the neighborhoods you actually touch.
sl <- searchlight(mask_vol, radius = 4, eager = FALSE, nonzero = FALSE)
first_sl <- sl[[1]]
nrow(coords(first_sl))
stopifnot(nrow(coords(first_sl)) > 0L)
The eager form is better when you want to iterate repeatedly over the full set and can afford the up-front construction cost.
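As a sketch, the eager form can be requested with the same arguments (this assumes `eager = TRUE` is accepted alongside the options shown above, and that the result can be indexed like a list):

```r
# Materialize all neighborhoods up front, then iterate over a few of them.
sl_eager <- searchlight(mask_vol, radius = 4, eager = TRUE, nonzero = FALSE)
map_dbl(sl_eager[1:5], ~ nrow(coords(.)))
```

The trade-off is memory for speed: construction pays the full cost once, after which repeated passes over the neighborhoods are cheap.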
You do not need a specialized pipeline framework to use split-map-reduce style
workflows. A small list of ROIs plus purrr::map_*() is often enough.
first_five <- lapply(seq_len(5), function(i) sl[[i]])
first_five_means <- map_dbl(first_five, ~ mean(values(.)))
first_five_means
stopifnot(length(first_five_means) == 5L)
stopifnot(all(is.finite(first_five_means)))
The pattern is the same whether the pieces come from parcels, ROIs, connected components, or searchlights: split the data into pieces, map a function over each piece, and reduce the results into one object.
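Stripped of the imaging objects, the skeleton is ordinary R: split column indices by a grouping factor, map a summary over each piece, and reduce to a named vector. A self-contained sketch:

```r
# Generic split-map-reduce, independent of neuroim2.
set.seed(1)
x <- matrix(rnorm(20), nrow = 4)   # 4 timepoints x 5 voxels
groups <- c(1, 1, 2, 2, 2)         # parcel label per voxel

# Split: one sub-matrix of columns per group.
pieces <- lapply(split(seq_len(ncol(x)), groups),
                 function(idx) x[, idx, drop = FALSE])

# Map + reduce: one summary value per piece.
piece_means <- vapply(pieces, mean, numeric(1))
piece_means
```

Everything in this article is a specialization of this skeleton, with the split supplied by an ROI, a parcellation, or a searchlight generator.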
See also:

- vignette("VolumesAndVectors") for the core data containers
- vignette("regionOfInterest") for the full ROI API surface
- vignette("pipelines") for older split-map-reduce examples
- vignette("clustered-neurovec") for parcel-based data structures