This framework allows you to design and implement complex pipelines and deploy them on your institution's computing cluster. It was built with the needs of bioinformatics workflows in mind, but it is easily extensible to any field where a series of steps (shell commands) is to be executed as a (work)flow.
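As a rough sketch of how the pieces below fit together (assuming the `flowr` package is installed; the step names and shell commands here are made up for illustration, and argument names follow my reading of the package's conventions):

```r
library(flowr)

# A flowmat is a table of shell commands: one row per command,
# grouped by sample and by step (jobname).
flowmat <- to_flowmat(
  list(sleep = "sleep 5", merge = "cat a.txt b.txt > merged.txt"),
  samplename = "samp1"
)

# A flowdef describes how the steps depend on each other and what
# resources each step should request on the cluster.
flowdef <- to_flowdef(flowmat)

# Stitch the two into a flow object and perform a dry run;
# execute = TRUE would actually submit jobs to the cluster.
fobj <- to_flow(flowmat, flowdef, flowname = "example_flow")
fobj <- submit_flow(fobj, execute = FALSE)
```

Once submitted, `status()`, `kill()`, and `rerun()` (described below) operate on the resulting flow directory.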
| Field | Value |
|---|---|
| Author | Sahil Seth [aut, cre] |
| Date of publication | 2016-04-19 01:17:29 |
| Maintainer | Sahil Seth <email@example.com> |
| License | MIT + file LICENSE |
check: Check consistency of flowdef and flowmat
check_args: Assert that none of the arguments of a function are NULL.
error: Error Handler
fetch: Two generic functions to search for pipelines and...
flow: Describing the flow class
flowopts: Default options/params used in flowr and ngsflows
get_wds: Get all the (sub)directories in a folder
job: Describing details of the job object
kill: Kill all jobs submitted to the computing platform, for one or...
plot_flow: Plot a clean and scalable flowchart describing the (work)flow
queue: A 'queue' object defines details regarding how a job is...
replace_slots: Replace slots in an S4 object
rerun: Re-run a pipeline in case of hardware or software failures.
run: Run automated pipelines
setup: Setup and initialize flowr
status: Monitor status of flow(s)
submit_flow: Submit a flow to the cluster
submit_job: Submit a step of a flow
submit_run: Submit several flow objects, limit the max running...
to_flow: Create flow objects
to_flowdef: A flow definition defines how to stitch steps into a (work)flow.
to_flowdet: Create a flow's submission detail file
to_flowmat: Create a flowmat using a list of commands.
verbose: Verbose levels, controlling the verbosity of messages
whisker_render: Wrapper around whisker.render with some additional checks
write_flow_details: Write files describing this flow