gcs_data: Access datasets from Google Cloud Storage


View source: R/gcs.R

Description

Helper functions for loading datasets from Google Cloud Storage (GCS). For tabular data, provides functions that can be used in concert with import.

Usage

gcs_data(bucket, path, loader)

gcs_table(bucket, path, loader, name = NULL)

## S3 method for class 'gcs_table'
materialize(reference)

gcs_auth(...)

Arguments

bucket

a valid GCS bucket name.

path

a path pointing to a dataset within the GCS bucket.

loader

a post-processing function (e.g. read_csv) that transforms the downloaded file into usable data. When used with import, 'loader' must return an object with S3 class 'data.frame' (e.g. a tibble).
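
A loader is simply a function that takes the path of the downloaded file and returns usable data. As a minimal sketch, a custom loader might fix parsing options before handing the result to import (the column specification below is hypothetical):

   # Hypothetical loader that pins down a column specification via readr.
   cities_loader <- function(file) {
      readr::read_csv(file, col_types = readr::cols(city = readr::col_character()))
   }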

Details

'gcs_data()'

loads and parses data from GCS.
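
A minimal sketch of direct use, assuming it returns the parsed object (the bucket and path below are placeholders):

   cities <- gcs_data(
      bucket = 'somebucket.example.com',
      path = '/datasets/cities.csv',
      loader = read_csv
   )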

'gcs_table()'

lazy table reference for using GCS objects with import.
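
A reference can also be resolved outside of import via the materialize() method; a minimal sketch, assuming materialize() downloads and parses the referenced object (bucket and path are placeholders):

   ref <- gcs_table(
      bucket = 'somebucket.example.com',
      path = '/datasets/cities.csv',
      loader = read_csv
   )
   cities <- materialize(ref)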

'gcs_auth()'

thin wrapper over gargle and googleAuthR for facilitating user-based authentication.

Examples

## Not run: 
   # The typical use case is importing datasets into notebooks locally.
   gcs_auth(email = 'user@example.com')
   
   # Places the contents of cities.csv into the global object "cities", and
   # stores the data in a local cache for future use.
   import(
      gcs_table(
         bucket = 'somebucket.example.com',
         path = '/datasets/cities.csv',
         loader = read_csv
      )
   )

## End(Not run)
