View source: R/brainmaps-api.R
brainmaps_xyz2id (R Documentation)
Convert 3D x,y,z locations in brainmaps volumes to segmentation ids
brainmaps_xyz2id(
xyz,
volume = getOption("fafbseg.skeletonuri"),
rawcoords = FALSE,
rawvoxdims = c(8, 8, 40),
chunksize = getOption("fafbseg.brainmaps_xyz2id.chunksize", 200),
...
)
xyz: N x 3 matrix of points or an object containing vertex data that is
  compatible with xyzmatrix.

volume: character vector identifier string for the volume containing
  segmentation data. By default it uses the value of the
  fafbseg.skeletonuri option.

rawcoords: whether the coordinates are voxel indices (rawcoords=TRUE) or
  physical coordinates in nm (rawcoords=FALSE, the default).

rawvoxdims: the implied voxel dimensions for the volume. When
  rawcoords=TRUE the input coordinates are scaled by rawvoxdims (the
  default c(8, 8, 40) nm matches the image data) to convert them to
  physical coordinates. Set rawvoxdims=NULL to pass the coordinates to
  the API untouched, i.e. when they are already raw coordinates for the
  volume being queried.

chunksize: send queries in batches, each of which has at most chunksize
  points. Smaller batches are less likely to hit the server-side timeout
  described in the details below.

...: additional arguments passed to brainmaps_fetch.
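Because the volume argument defaults to the fafbseg.skeletonuri option, a segmentation volume can be set once per session rather than passed to every call. A minimal sketch; the brainmaps URI below is a placeholder, not a real volume identifier:

# placeholder URI: substitute a brainmaps volume you have access to
options(fafbseg.skeletonuri = "brainmaps://<project>:<dataset>:<volume>")
# subsequent calls pick the volume up from the option ...
brainmaps_xyz2id(c(433368, 168208, 128480))
# ... but it can still be overridden per call
brainmaps_xyz2id(c(433368, 168208, 128480), volume = "brainmaps://<another volume>")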
The underlying brainmaps API expects raw coordinates, i.e. voxel indices.
This is slightly complicated by the fact that different segmentation,
skeleton etc. volumes may have different associated voxel dimensions.
brainmaps_xyz2id automatically looks up this voxel dimension.

However, you may want to pass in raw coordinates from neuroglancer. These
will generally be associated with the resolution of the image data, not
e.g. skeletons, which may have a larger (coarser) voxel size. As a
convenience in this situation you can set rawcoords=TRUE.
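For clarity, the following sketch shows that rawcoords=TRUE is equivalent to scaling the neuroglancer voxel indices by rawvoxdims yourself before the call (the point is the same physical location used in the examples):

# a location as displayed in neuroglancer (voxel indices at 8 x 8 x 40 nm)
p = c(54171, 21026, 3212)
# let brainmaps_xyz2id do the scaling ...
brainmaps_xyz2id(p, rawcoords=TRUE)
# ... which matches converting to physical (nm) coordinates by hand
brainmaps_xyz2id(p * c(8, 8, 40))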
If you have problems with requests failing sporadically, it may be helpful to know that:

* Google does not keep the underlying data live, so there may be some spin-up time.
* there is an arbitrary timeout at the Google end (currently ~5 s).
* batching nearby points helps.

Using the chunksize argument or the retry argument passed on to brainmaps_fetch can overcome these issues. See the examples.
A numeric vector of Google segment ids
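Unmapped locations are returned with id 0 (see the examples), so a typical follow-up is to drop zeros and tabulate the remaining ids, e.g. to see which segments cover most of the query points. A post-processing sketch, reusing the dl4 neuron read in the examples below:

segs = brainmaps_xyz2id(dl4)                # one id per node of the neuron
segs = segs[segs != 0]                      # id 0 means the location was unmapped
head(sort(table(segs), decreasing = TRUE))  # segments hit by the most nodes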
## Not run:
# Physical location in nm
brainmaps_xyz2id(c(433368, 168208, 128480))
# Same location as displayed in neuroglancer
brainmaps_xyz2id(c(54171, 21026, 3212), rawcoords=TRUE)
# Raw coordinates for the brainmaps volume in question - don't touch
brainmaps_xyz2id(c(433368, 168208, 128480)/c(32,32,40), rawvoxdims=NULL)
library(elmr)
# get a manually traced neuron (just keep first and only entry in neuronlist)
dl4=read.neurons.catmaid('glomerulus DL4 right')[[1]]
# map every node location to segmentation ids
dl4.segs=brainmaps_xyz2id(dl4)
# remove unmapped locations which get id 0
dl4.segs=setdiff(dl4.segs, 0)
# read in corresponding skeletons
dl4.skels=read_segments2(dl4.segs)
# read in corresponding skeletons after including agglomeration merge groups
dl4.allskels=read_segments2(find_merged_segments(dl4.segs))
## retries / cache / chunksize issues
# set an explicit chunk size (points are sent in batches of 500)
dl4.segs=brainmaps_xyz2id(dl4, chunksize=500)
# use retries in case of failure
dl4.segs=brainmaps_xyz2id(dl4, chunksize=500, retry=3)
# cache successful requests (if you might need to repeat)
dl4.segs=brainmaps_xyz2id(dl4, chunksize=500, retry=3, cache=TRUE)
## End(Not run)
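If you want to keep the mapping alongside the morphology, the ids can be attached to the neuron's node table, since brainmaps_xyz2id returns one id per vertex. A sketch following on from the dl4 example above, assuming the standard nat neuron representation with a per-node data.frame in dl4$d:

# attach a segment id to every node of the traced neuron
dl4$d$segid = brainmaps_xyz2id(dl4)
# how many nodes failed to map (segid 0)?
table(dl4$d$segid == 0)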