bigReg: Generalized Linear Models (GLM) for Large Data Sets
Version 0.1.2

Allows the user to fit generalized linear models (GLMs) to very large data sets. Data sets are created with the data_frame() function and extended with object$append(data); data_frame and data_matrix objects let the user store large data on disk. The data is stored as doubles in binary format; any character columns are converted to factors and stored as numeric (binary) data, with a look-up table written to a separate .meta_data file in the same folder. The data is stored in blocks, and the GLM fitting algorithm is modified to carry out a MapReduce-like procedure over those blocks. The functions bglm(), summary(), and bglm_predict() are available for creating and post-processing models. The library requires Armadillo to be installed on your system. It probably will not work on Windows, since multi-core processing is done with mclapply(), which forks R on Unix/Linux-type operating systems.
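A minimal usage sketch of the workflow described above. The function names (data_frame(), object$append(), bglm(), bglm_predict()) come from the description; the exact argument names and signatures shown here are assumptions, not confirmed against the package documentation:

```r
library(bigReg)

# Create an on-disk, block-stored data set from an in-memory data frame
# ("folder" is a hypothetical argument name for the storage location):
dat <- data_frame(mtcars, folder = "cars_db")

# Append further rows to the on-disk object:
dat$append(mtcars)

# Fit a logistic regression using the MapReduce-like block algorithm
# (formula/family interface assumed by analogy with stats::glm):
fit <- bglm(am ~ mpg + wt, data = dat, family = binomial())

summary(fit)                     # post-process the fitted model
preds <- bglm_predict(fit, dat)  # predictions over the stored blocks
```

Because the data lives on disk in blocks, each step touches only one block at a time, which is what makes the MapReduce-like fitting pass feasible for data larger than memory.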

Getting started

Package details

Author: Chibisi Chima-Okereke <[email protected]>
Date of publication: 2016-07-25 19:16:58
Maintainer: Chibisi Chima-Okereke <[email protected]>
License: GPL (>= 2)
Package repository: CRAN
Installation

Install the latest version of this package by entering the following in R:
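Since the package is on CRAN, the standard CRAN installation command applies:

```r
install.packages("bigReg")
```

Note that, per the description above, the package depends on Armadillo and on Unix/Linux process forking, so installation on Windows may not produce a working package.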


bigReg documentation built on May 29, 2017, 8:15 p.m.