xform.ngscene    R Documentation
Add a transformation to one or more layers in a neuroglancer scene
Usage

## S3 method for class 'ngscene'
xform(x, reg, layers = NULL, ...)
Arguments

x: A neuroglancer scene as produced by ngl_decode_scene().

reg: A registration, either a homogeneous affine matrix of the kind neuroglancer supports (see Details) or another registration object such as that returned by fit_xform() (see Examples).

layers: A character vector specifying the layers in the scene to transform. If the elements are named, the names specify the new names of the transformed layers.

...: Additional arguments passed on to other methods.
Details

Neuroglancer only implements homogeneous affine transforms for layers. However, these can still be quite useful when a non-rigid transform cannot be applied to a layer, e.g. because the underlying neurons are undergoing rapid editing and it is not practical to generate a static set of transformed meshes.
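For orientation, the sketch below (illustrative only, not part of the package API; the matrix and points are made up) shows how a homogeneous affine combines a 3x3 linear part with a translation in a single 4x4 matrix, and how it acts on 3D points expressed in the same coordinate space as the matrix.

# Illustrative sketch: action of a 4x4 homogeneous affine on 3D points
A = diag(c(-1, 1, 1))                  # hypothetical linear part: flip about x
trans = c(1e6, 0, 0)                   # hypothetical translation back into the volume
aff = rbind(cbind(A, trans), c(0, 0, 0, 1))
pts = rbind(c(100, 200, 300),
            c(400, 500, 600))          # example x, y, z coordinates
xyzh = cbind(pts, 1)                   # append homogeneous coordinate
t(aff %*% t(xyzh))[, 1:3]              # transformed points, one per row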
Value

A new ngscene object.
Examples

# flywire scene
u='https://ngl.flywire.ai/?json_url=https://globalv1.flywire-daf.com/nglstate/4559706743898112'
scf=ngl_decode_scene(u)
m=matrix(c(-0.9663, -0.0695, 0.17, 0, 0.0351, 1.043, -0.028, 0, 0.1082,
           -0.0093, 0.9924, 0, 1021757.1284, 31409.0911, -85626.0572, 1), ncol=4)
# nb replaces existing layer of this name
xform(scf, m, layers=c('Production-mirrored'))
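# Illustrative aside (not from the original example): the same matrix can be
# applied directly to a point, assumed to be in the matrix's own coordinate
# space, to see numerically what the layer transform does
pt = c(5e5, 2.5e5, 1e5)
(m %*% c(pt, 1))[1:3]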
scu=ngl_decode_scene('https://tinyurl.com/kj9rwn26')
# make a new layer mirroring an existing layer
scu2=xform(scu, m, layers=c('fly_v31_m'='fly_v31'))
scu2
## Not run:
browseURL(as.character(scu2))
## End(Not run)
## Not run:
# mirror a flywire scene based on points from a specific pair of neurons
mbon18.dps=read_l2dp('MBON18')
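# fit a single transform approximating the mirroring registration,
# since neuroglancer layers only support affine transforms (see Details)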
mirror_reg=fit_xform(samplepts = mbon18.dps,
refpts = nat.jrcbrains::mirror_fafb(mbon18.dps), subsample = 500)
flywire_scene('MBON18') %>%
ngl_decode_scene %>%
xform(mirror_reg, layers=c("mirror"="Production-segmentation_with_graph")) %>%
as.character() %>%
browseURL()
## End(Not run)