Large data files can be difficult to work with in R, where data generally resides in memory. This package encourages a style of programming in which data is 'streamed' from disk into R via a Producer and through a series of Consumers that typically reduce the original data to a manageable size. The package provides useful Producer and Consumer stream components for operations such as data input, sampling, indexing, and transformation; see `package?Streamer` for details.
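A minimal sketch of this streaming idiom, assuming the Streamer package is installed (e.g., via `BiocManager::install("Streamer")`). The end-of-stream convention shown here (the producer function returning `NULL`, and `yield()` returning a zero-length result when exhausted) and the exact shapes accepted by `FunctionProducer()` / `FunctionConsumer()` are assumptions; consult `?Stream` and `?FunctionProducer` for the authoritative interface.

```r
## Sketch only: assumes the Bioconductor package 'Streamer' is installed.
library(Streamer)

i <- 0L
producerFun <- function() {
    ## Emit the values 1:100 in chunks of 10; returning NULL is assumed
    ## to signal end-of-stream to the Producer.
    if (i >= 100L)
        return(NULL)
    chunk <- (i + 1L):(i + 10L)
    i <<- i + 10L
    chunk
}

## Reduce each incoming chunk to a single summary value (its sum).
consumerFun <- function(y) sum(y)

s <- Stream(FunctionProducer(producerFun), FunctionConsumer(consumerFun))

## Draw records from the stream until it is exhausted.
while (length(result <- yield(s)))
    print(result)
```

The point of the pattern is that only one chunk is resident in memory at a time; the Consumer reduces each chunk as it arrives, so the full 100-element sequence is never materialized at once.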
| Field | Value |
| --- | --- |
| Author | Martin Morgan, Nishant Gopalakrishnan |
| Date of publication | None |
| Maintainer | Martin Morgan <firstname.lastname@example.org> |
- `ConnectionProducer-classes`: Producer classes to read file connections
- `Consumer-class`: Class defining methods for all Consumers
- `DAGTeam-class`: Consumer classes for directed acyclic graph evaluation
- `Downsample-class`: Consumer class to down-sample data
- `FunctionProducerConsumer-classes`: Classes for user-defined Producers and Consumers
- `ParallelParam-classes`: Classes to configure parallel evaluation
- `Producer-class`: Class defining methods for all Producers
- `RawInput-class`: Producer class to read raw (binary) input
- `Reducer-class`: Consumer class to combine successive records
- `reset`: Function to reset a Stream, Producer, or Consumer
- `Seq-class`: Producer class to generate (numeric) sequences
- `status`: Function to report the current status of a stream
- `Stream-class`: Class to represent a Producer and zero or more Consumers
- `Streamer-package`: Package to enable stream (iterative) processing of large data
- `Team-class`: Consumer classes for parallel evaluation
- `Utility-classes`: Consumer classes with simple functionality, e.g., RawToChar,...
- `yield`: Function to yield one task from a Stream or Producer