bq_table_download: Download table data

Description Usage Arguments Value Complex data Larger datasets API documentation Examples

Description

This retrieves rows in chunks of page_size. It is most suitable for results of smaller queries (say, less than 100 MB). For larger queries, it is better to export the results to a CSV file stored on Google Cloud Storage and use the bq command-line tool to download it locally.

Usage

bq_table_download(x, max_results = Inf, page_size = 10000,
  start_index = 0L, max_connections = 6L, quiet = NA)

Arguments

x

A bq_table

max_results

Maximum number of results to retrieve. Use Inf to retrieve all rows.

page_size

The number of rows returned per page. Make this smaller if you have many fields or large records and you are seeing a 'responseTooLarge' error.

start_index

Starting row index (zero-based).

max_connections

Maximum number of simultaneous connections to BigQuery servers.

quiet

If FALSE, displays a progress bar; if TRUE, is silent; if NA, displays a progress bar only for long-running jobs.
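Putting the arguments together in one call (a sketch, not a definitive recipe: it assumes you have authenticated with BigQuery and configured a billing project, and it reuses the public sample table from the Examples section):

```r
library(bigrquery)

# Smaller pages help avoid 'responseTooLarge' errors on wide tables;
# quiet = FALSE always shows the progress bar
tb <- bq_table_download(
  "publicdata.samples.natality",
  max_results = 50000,
  page_size = 5000,
  start_index = 0L,
  max_connections = 4L,
  quiet = FALSE
)
```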

Value

Because data retrieval may generate list-columns, and the data frame print method can have problems with list-columns, this method returns tibbles. If you need a data frame, coerce the results with as.data.frame().
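For example (a sketch, assuming authenticated BigQuery access; the table is the public sample used in the Examples section):

```r
library(bigrquery)

# Download a small slice of a public table; the result is a tibble
tb <- bq_table_download("publicdata.samples.natality", max_results = 10)
class(tb)

# Coerce to a plain data frame if downstream code requires one
df <- as.data.frame(tb)
```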

Complex data

bigrquery will retrieve nested and repeated columns into list-columns.
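A minimal sketch of working with a list-column result. The project, dataset, table, and repeated field names here are hypothetical placeholders, and tidyr is assumed to be installed:

```r
library(bigrquery)

# Hypothetical table with a repeated (ARRAY) column named `tags`
tb <- bq_table_download(bq_table("my-project", "my_dataset", "my_table"))

# Repeated values arrive as a list-column; inspect one row's vector
tb$tags[[1]]

# Flatten with tidyr if you prefer one row per repeated value
flat <- tidyr::unnest(tb, tags)
```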

Larger datasets

In my timings, this code takes around 1 minute per 100 MB of data. If you need to download considerably more than this, I recommend exporting the results to a CSV file stored on Google Cloud Storage and downloading it with the bq command-line tool.

Unfortunately you cannot export nested or repeated fields to CSV, and the formats BigQuery supports that do allow nested/repeated values (Avro and newline-delimited JSON) are not well supported in R.
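The large-download workflow described above can be sketched as follows. This is a hedged example: the bucket name is a placeholder, and it assumes the Cloud SDK's bq and gsutil tools are installed and authenticated:

```shell
# Export the table to sharded CSV files on Google Cloud Storage
bq extract --destination_format=CSV \
  'publicdata:samples.natality' 'gs://my-bucket/natality-*.csv'

# Download the shards locally; read them into R with e.g. readr::read_csv()
gsutil cp 'gs://my-bucket/natality-*.csv' .
```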

API documentation

Examples

if (bq_testable()) {
df <- bq_table_download("publicdata.samples.natality", max_results = 35000)
}

wilpoole-essence/bigrquery documentation built on May 6, 2019, 8:06 p.m.