* `save_object()` now uses `httr::write_disk()` to avoid having to load a file into memory. (#158, h/t Arturo Saco)
* Removed use of `endsWith()` in two places to reduce (implicit) base R dependency. (#147, h/t Huang Pan)
* `put_object()` and `put_bucket()` now expose explicit `acl` arguments. (#137)
* `get_acl()` and `put_acl()` are now exported. (#137)
* New `put_folder()` convenience function for creating an empty pseudo-folder.
* `put_bucket()` now errors if the request is unsuccessful. (#132, h/t Sean Kross)
* Fixed a bug in `setup_s3_url()` when `region = ""`.
* `bucketlist()` gains both an alias, `bucket_list_df()`, and an argument `add_region` to add a region column to the output data frame.
* Added an `s3sync()` function. (#20)
* `save_object()` now creates a local directory if needed before trying to save. This is useful for object keys containing `/`.
* … `s3HTTP()`.
* … `s3readRDS()` and `s3saveRDS()`.
* … `s3readRDS()`. (#59)
* … `put_object()`. (#80)
* Use `tempfile()` instead of `rawConnection()` for high-level read/write functions. (#128)
* … `get_bucket()`. (#88)
* `get_object()` now returns a pure raw vector (without attributes). (#94)
* `s3sync()` now relies on `get_bucket(max = Inf)`. (#20)
* `s3HTTP()` gains a `base_url` argument to (potentially) support S3-compatible storage on non-AWS servers. (#109)
* `s3HTTP()` gains a `dualstack` argument to provide "dual stack" (IPv4 and IPv6) support. (#62)
* Fixed a bug in `get_bucket()` when `max = Inf`. (#127, h/t Liz Macfie)
* `s3read_using()` and `s3write_using()` provide a generic interface to reading and writing objects from S3 using a specified function. This provides a simple and extensible interface for the import and export of objects (such as data frames) in formats other than those provided by base R. (#125, #99)
* `s3HTTP()` gains a `url_style` argument to control use of "path"-style (new default) versus "virtual"-style URL paths. (#23, #118)
* `s3save()` gains an `envir` argument. (#115)
* `get_bucket()` now automatically handles pagination based upon the specified number of objects to return. (PR #104, h/t Thierry Onkelinx)
* `get_bucket_df()` now uses an available (but unexported) `as.data.frame.s3_bucket()` method. The resulting data frame always returns character rather than factor columns.
* … `s3HTTP()`. (#46, #106, h/t John Ramey)
* `bucketlist()` now returns (in addition to past behavior of printing) a data frame of buckets.
* `get_bucket_df()` returns a data frame of bucket contents; `get_bucket()` continues to return a list. (#102, h/t Dean Attali)
* `s3HTTP()` gains a `check_region` argument (default is `TRUE`). If `TRUE`, attempts are made to verify the bucket's region before performing the operation in order to avoid confusing out-of-region errors. (#46)
* Objects can now be specified as `object = "s3://bucket_name/object_key"`. In all cases, the bucket name and object key will be extracted from this string (meaning that a bucket does not need to be explicitly specified). (#100; h/t John Ramey)
* Added a `get_bucket()` S3 generic and methods.
* … (`=`). (#64)
* Added `s3save_image()` to save an entire workspace.
* … `Remotes` field.
* Added `s3source()` as a convenience function to source an R script directly from S3. (#54)
* `s3save()`, `s3load()`, `s3saveRDS()`, and `s3readRDS()` no longer write to disk, improving performance. (#51)
* … `s3saveRDS()` and `s3readRDS()`. (h/t Steven Akins, #50)
* All functions now use a consistent naming format (e.g., `get_object()`). Previously available functions that did not conform to this format have been deprecated. They continue to work, but issue a warning. (#28)
* The order of `bucket` and `object` names was swapped in most object-related functions, and the bucket name has been added to the object lists returned by `getbucket()`. This means that `bucket` can be omitted when `object`
is an object of class "s3_object".
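The explicit `acl` arguments and the newly exported `get_acl()`/`put_acl()` can be illustrated with a short sketch. This is not taken from the package documentation: the bucket name `mybucket`, the file `report.csv`, and the canned-ACL strings are illustrative assumptions, and valid AWS credentials are assumed to be configured in the environment.

```r
library("aws.s3")

# Create a bucket and upload an object, passing the now-explicit
# `acl` argument (canned-ACL strings are assumed here).
put_bucket("mybucket", acl = "private")          # hypothetical bucket
put_object(file = "report.csv", object = "report.csv",
           bucket = "mybucket", acl = "public-read")

# get_acl() and put_acl() are now exported, so an object's ACL
# can be inspected or changed directly.
acl <- get_acl(object = "report.csv", bucket = "mybucket")
```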
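A minimal sketch of the `s3read_using()`/`s3write_using()` interface described in the entries above. The bucket name `mybucket` is an illustrative assumption and configured credentials are assumed; any matching reader/writer pair that accepts a file path should work the same way.

```r
library("aws.s3")

# Export a data frame using an arbitrary writer function (here
# write.csv, which takes the object and a file path).
s3write_using(mtcars, FUN = write.csv,
              object = "mtcars.csv", bucket = "mybucket")

# Import it back with the matching reader function.
df <- s3read_using(FUN = read.csv,
                   object = "mtcars.csv", bucket = "mybucket")
```

This is what makes formats beyond those provided by base R usable without dedicated wrapper functions: the import/export logic lives in the supplied `FUN`.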
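The listing-related entries (`bucketlist()` returning a data frame, the new `get_bucket_df()`, and automatic pagination in `get_bucket()`) combine as in the following sketch; `mybucket` is again a hypothetical bucket and credentials are assumed to be configured.

```r
library("aws.s3")

# bucketlist() now returns (not just prints) a data frame of buckets.
buckets <- bucketlist()

# get_bucket() returns a list; pagination is handled automatically,
# so max = Inf retrieves the complete listing.
b <- get_bucket("mybucket", max = Inf)

# get_bucket_df() returns the same listing as a data frame with
# character (never factor) columns.
bd <- get_bucket_df("mybucket")
```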
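The `s3://` URL specification described above might be used as follows (a sketch only; `mybucket` and `report.csv` are assumed names, and configured credentials are assumed).

```r
library("aws.s3")

# The bucket name and object key are extracted from the URI,
# so no separate `bucket` argument is needed.
x <- get_object("s3://mybucket/report.csv")

# Equivalent explicit form:
x <- get_object(object = "report.csv", bucket = "mybucket")
```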