convertToSeurat: Convert slide object to Seurat object


Convert slide object to Seurat object

Description

This function converts our slide object of class SummarizedExperiment to a Seurat object of class Seurat so that users can directly proceed with Seurat's spatial analysis pipelines. Built on top of Seurat's Load10X_Spatial().

Usage

convertToSeurat(slide_obj, image_dir, slice = "slice1", filter_matrix = TRUE)

Arguments

slide_obj

A slide object created by or inherited from createSlide().

image_dir

(chr) Path to the directory containing 10x Genomics Visium image data; it should include the files tissue_lowres_image.png, scalefactors_json.json, and tissue_positions_list.csv.

slice

(chr) Name for the stored image of the tissue slice. Default: "slice1"

filter_matrix

(logical) If TRUE, only keep spots that have been determined to be over tissue. If slide_obj contains only tissue spots, filter_matrix must be TRUE. If slide_obj contains both tissue and background spots, setting filter_matrix = TRUE subsets the expression matrix to tissue spots only. Default: TRUE
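
A sketch of both settings, reusing the objects built in the Examples section below (mbrain_obj created from the raw matrix, so it still carries background spots, and spatial_dir):

# keep only spots over tissue (default behavior)
seurat_tissue <- convertToSeurat(mbrain_obj, image_dir = spatial_dir,
                                 filter_matrix = TRUE)

# keep all spots, tissue and background alike
seurat_all <- convertToSeurat(mbrain_obj, image_dir = spatial_dir,
                              filter_matrix = FALSE)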

Value

A Seurat object with spatial information.

Examples


library(SpotClean)

# load count matrix and slide metadata
data(mbrain_raw)
spatial_dir <- system.file(file.path("extdata",
                                     "V1_Adult_Mouse_Brain_spatial"),
                           package = "SpotClean")
mbrain_slide_info <- read10xSlide(
    tissue_csv_file = file.path(spatial_dir, "tissue_positions_list.csv"),
    tissue_img_file = file.path(spatial_dir, "tissue_lowres_image.png"),
    scale_factor_file = file.path(spatial_dir, "scalefactors_json.json"))

# Create slide object
mbrain_obj <- createSlide(mbrain_raw,
                          mbrain_slide_info)

# Convert to Seurat object, storing the image under the slice name "raw"
seurat_obj <- convertToSeurat(mbrain_obj, spatial_dir, "raw")
str(seurat_obj)
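
# The converted object plugs into Seurat's spatial workflow. A minimal
# sketch of one possible next step (assuming the Seurat package is
# installed and the assay is named "Spatial", as in objects produced by
# Load10X_Spatial()):
library(Seurat)

# normalize counts, then visualize total UMI counts over the tissue image
# ("nCount_Spatial" assumes the assay is named "Spatial")
seurat_obj <- SCTransform(seurat_obj, assay = "Spatial")
SpatialFeaturePlot(seurat_obj, features = "nCount_Spatial")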

