Description
This function is useful for taking huge CSV files that do not fit into RAM and storing them as an SQLite database that can be streamed via dplyr. This way you can work with very large data files without being limited by your machine's RAM.
Usage

csvToSQLite(
  csv_file,
  sqlite_file,
  table_name,
  pre_process_size = 1000,
  chunk_size = 50000,
  delim = ","
)
Arguments

csv_file          name of the CSV file to convert
sqlite_file       name of the newly created SQLite file
table_name        name of the table in which to store the data in the SQLite database
pre_process_size  number of lines used to determine the data types of the individual columns (default 1000)
chunk_size        number of lines to read per chunk (default 50000)
delim             the field delimiter to use (default ",")
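
The roles of pre_process_size and chunk_size can be illustrated with a short sketch built on readr and DBI. This is only an assumed outline of the typical pattern, not the package's actual implementation, and the helper name csv_to_sqlite_sketch is hypothetical: read the first pre_process_size lines once to fix the column types, then stream the file in chunk_size-row pieces, appending each piece to the SQLite table so memory use stays bounded.

library(readr)
library(DBI)
library(RSQLite)

# Hypothetical sketch of a chunked CSV-to-SQLite import; the real
# csvToSQLite() may differ in its details.
csv_to_sqlite_sketch <- function(csv_file, sqlite_file, table_name,
                                 pre_process_size = 1000,
                                 chunk_size = 50000,
                                 delim = ",") {
  # guess column types from the first pre_process_size lines so that
  # every later chunk is parsed with the same specification
  sample <- read_delim(csv_file, delim = delim, n_max = pre_process_size)
  col_types <- spec(sample)

  con <- dbConnect(SQLite(), sqlite_file)
  on.exit(dbDisconnect(con), add = TRUE)

  # stream the file chunk_size rows at a time and append each chunk
  # to the table, so only one chunk is ever held in RAM
  read_delim_chunked(
    csv_file,
    callback = function(chunk, pos) {
      dbWriteTable(con, table_name, as.data.frame(chunk), append = TRUE)
    },
    delim = delim,
    chunk_size = chunk_size,
    col_types = col_types
  )

  invisible(sqlite_file)
}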
Examples

library(RSQLite)
library(dplyr)

sqlite_file <- "example.sqlite"
table_name <- "example"

# write a sample CSV file to disk
write.csv(airquality, "example.csv", row.names = FALSE)

# convert the CSV file into an SQLite database
csvToSQLite("example.csv", sqlite_file, table_name,
  pre_process_size = 1000, chunk_size = 50000
)

# open the database and access the table lazily with dplyr
mydb <- src_sqlite(sqlite_file, create = FALSE)
mydata <- tbl(mydb, table_name)
head(mydata)
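
Because tbl() returns a lazy table, dplyr verbs are translated to SQL and evaluated inside SQLite; only collect() pulls the result into RAM. A minimal follow-up, assuming the example above has already been run:

# filter and aggregate inside the database, then fetch the small result
mydata %>%
  filter(Month == 5) %>%
  summarise(mean_ozone = mean(Ozone, na.rm = TRUE)) %>%
  collect()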