knitr::opts_chunk$set(eval = TRUE)
library(analysisPipelines)
The meta-pipeline construct allows users to export a pipeline created for a particular use case as a generic analysis flow that can be reused with a different dataset and a different set of parameters. In a pipeline, the data can change (while retaining the same schema), but the function parameters stay fixed. In a meta-pipeline, only the analysis flow, function dependencies and so on are retained; the specific parameters of each function can be set afresh for a new use case.
The objective of a meta-pipeline is to define and execute reusable analysis flows: the same flow can be applied to different datasets with different parameter values.
Through this package, meta-pipelines are created by exporting an already created pipeline to a meta-pipeline. The export retains the analysis flow and the function dependencies, but not the input dataset or the specific parameter values.
In the example below, we first create a pipeline, similar to the one described in the other vignettes.
pipeline <- AnalysisPipeline(input = iris)

getColor <- function(color){
  return(color)
}

getColumnName <- function(columnName){
  return(columnName)
}

registerFunction(functionName = "getColor", isDataFunction = FALSE, firstArgClass = "character")
registerFunction(functionName = "getColumnName", isDataFunction = FALSE, firstArgClass = "character")

getRegistry()
We then generate an output from the pipeline to validate that it works properly. Note that generating output is not required in order to define a meta-pipeline.
pipeline %>>%
  getColor(color = "blue") %>>%
  getColumnName(columnName = "Sepal.Length") %>>%
  univarCatDistPlots(uniCol = "Species", priColor = ~f1,
                     optionalPlots = 0, storeOutput = TRUE) %>>%
  outlierPlot(method = "iqr", columnName = ~f2, cutoffValue = 0.01,
              priColor = ~f1, optionalPlots = 0) -> complexPipeline

complexPipeline %>>% getPipeline
complexPipeline %>>% prepExecution -> complexPipeline
complexPipeline %>>% generateOutput -> op
op %>>% getOutputById("3")
Once a pipeline has been created, be it a batch or a streaming pipeline, it can be exported using the exportAsMetaPipeline method. This returns an object of class MetaAnalysisPipeline which stores the required information.
The meta-pipeline can be visualized, similar to a normal pipeline object, by calling the visualizePipeline method on the MetaAnalysisPipeline object.
complexPipeline %>>% exportAsMetaPipeline -> complexMetaPipeline
# complexMetaPipeline %>>% visualizePipeline
The next step in using a meta-pipeline is creating a new pipeline from it with a different set of parameters. For this purpose, the user first exports the pipeline prototype, which contains the set of functions used in the pipeline and their respective arguments.
The pipeline prototype is exported as an object of class proto from the 'proto' package, which is a thin wrapper over environments with usability advantages, such as using methods like names to get the names of the objects it contains, and using the '$' operator to refer to specific objects. The aim of using this class is to provide an easy-to-use interface for setting new values of the arguments.
The pipeline prototype has a nested structure. The first level is a list of objects representing the functions in the pipeline; a specific function can be referred to simply by its name. The second level is the list of arguments of each of those functions, again referred to by name.
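As a minimal sketch of this nested structure (independent of the package, assuming the 'proto' package is installed, and using hypothetical function and argument names), a proto object containing proto objects can be navigated and modified with the '$' operator:

```r
library(proto)

# A hypothetical two-level prototype: one proto object per function,
# each holding that function's arguments as fields
pipelineProtoSketch <- proto(
  getColor      = proto(color = "blue"),
  getColumnName = proto(columnName = "Sepal.Length")
)

# First level: refer to a function by name;
# second level: refer to one of its arguments by name
pipelineProtoSketch$getColor$color        # "blue"

# Setting a new value for an argument
pipelineProtoSketch$getColor$color <- "green"
pipelineProtoSketch$getColor$color        # "green"
```

Because proto objects are environments, the assignment modifies the prototype in place, which is what makes this a convenient interface for reconfiguring a meta-pipeline.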
New parameter values can be set simply by using the '$' operator. By default, the exported pipeline prototype contains the parameter values defined in the original pipeline, so the user can change as few or as many of them as required.
In the following example, we reconfigure the pipeline for use with the 'ToothGrowth' dataset.
pipelineProto <- getPipelinePrototype(complexMetaPipeline)
str(pipelineProto)

# Setting new parameters for the ToothGrowth dataset
pipelineProto$getColor$color <- "green"
pipelineProto$getColumnName$columnName <- "len"
pipelineProto$univarCatDistPlots$uniCol <- "supp"

# complexMetaPipeline %>>% visualizePipeline
Once the parameters have been set, a new, executable pipeline object can be created by calling the createPipelineInstance method and passing it the meta-pipeline object and the pipeline prototype. This creates a pipeline object with the usual properties.
We set the input of the pipeline object to the ToothGrowth dataset and then execute it to generate the output.
complexMetaPipeline %>>% createPipelineInstance(pipelineProto) -> newPipelineObj
newPipelineObj %>>% setInput(input = ToothGrowth) -> newPipelineObj
newPipelineObj %>>% generateOutput %>>% getOutputById("3")
Similar to pipelines, meta-pipelines can be saved and loaded using the savePipeline method and the loadMetaPipeline function. As with pipelines, when a meta-pipeline is loaded, it overwrites the existing registry with the registry stored alongside the meta-pipeline.
complexMetaPipeline %>>% savePipeline("metaPipeline.RDS")

# Checking whether the registry is updated on load
getC <- function(color){
  return(color)
}

getCol <- function(columnName){
  return(columnName)
}

registerFunction(functionName = "getC", isDataFunction = FALSE, firstArgClass = "character")
registerFunction(functionName = "getCol", isDataFunction = FALSE, firstArgClass = "character")
getRegistry()

loadMetaPipeline(path = "metaPipeline.RDS") -> loadedMetaPipeline
getRegistry()

pipelineProtoLoaded <- getPipelinePrototype(loadedMetaPipeline)
str(pipelineProtoLoaded)

pipelineProtoLoaded$getColor$color <- "green"
pipelineProtoLoaded$getColumnName$columnName <- "Sepal.Length"
pipelineProtoLoaded$univarCatDistPlots$uniCol <- "Species"

loadedMetaPipeline %>>% createPipelineInstance(pipelineProtoLoaded) -> newPipelineObjLoaded
newPipelineObjLoaded %>>% setInput(input = iris) %>>% generateOutput %>>% getOutputById("3")