aws_s3_upload: Upload files or folders to AWS

View source: R/utils-aws-upload.R

aws_s3_upload    R Documentation

Upload files or folders to AWS

Description

When uploading folders, the subdirectory structure will be preserved. To upload files from a folder without preserving the directory structure, pass a vector of file paths to the path argument.
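
A minimal sketch of the two modes (assuming, as in the examples below, that the bucket name is stored in the AWS_BUCKET environment variable):

# Upload a folder, preserving its sub-directory structure
containerTemplateUtils::aws_s3_upload(path = "R/",
                                      bucket = Sys.getenv("AWS_BUCKET"))

# Upload the files from a folder without preserving the directory structure
containerTemplateUtils::aws_s3_upload(path = list.files("R", full.names = TRUE),
                                      bucket = Sys.getenv("AWS_BUCKET"))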

Usage

aws_s3_upload(
  path,
  bucket,
  key = basename(path),
  prefix = "",
  check = TRUE,
  error = FALSE,
  file_type = "guess"
)

Arguments

path

String. The path to the file(s) or folder(s) to be uploaded.

bucket

String. The name of the bucket to upload to.

key

String. The "path" of the file(s) or folder(s) in the AWS bucket. Should end with "/" for folders. Use "" (an empty string) to upload files in folder without top-level folder.

prefix

String. A prefix to prepend to the file or folder keys. Generally should end with "/".

check

Logical. Whether to check if the exact file already exists in the bucket and, if so, skip uploading it. Defaults to TRUE.

error

Logical. Whether to error out if a file is missing, a folder is empty, or required system environment variables are missing. If FALSE, a message is printed and an empty list is returned.

file_type

String. Provide a file type from mime::mimemap (e.g. "html", "csv") or "guess" to call mime::guess_type().
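
For reference, a minimal sketch of how the mime package (named above) resolves types; the file names here are only illustrative:

mime::guess_type("report.html")  # "text/html"
mime::guess_type("data.csv")     # "text/csv"
mime::mimemap[["pdf"]]           # "application/pdf"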

Details

If you would like to change the directory structure, pass in a vector of file paths and a corresponding vector of keys.

Value

A list, with each element containing the key and etag (hash) of an uploaded file.
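
A minimal sketch of inspecting the return value (the element names key and etag are assumed from the description above; AWS_BUCKET mirrors the examples below):

uploads <- containerTemplateUtils::aws_s3_upload(
  path = "README.md",
  bucket = Sys.getenv("AWS_BUCKET")
)

# Expected: one element per uploaded file, each with its key and etag
vapply(uploads, function(x) x$key, character(1))
vapply(uploads, function(x) x$etag, character(1))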

Examples

## Not run: 

# Upload a single file to a specific location in the bucket.
# this will take the readme.md file and place in the exact location
# specified by the key and prefix. Notice the key ends in a file
# extension.

containerTemplateUtils::aws_s3_upload(path = "README.md",
key = "test/key/param/readme.md",
error = TRUE,
bucket =Sys.getenv("AWS_BUCKET"))

# A vector of paths with a matching vector of keys
# will also result in exact placement.

paths <- list.files("R", full.names = TRUE)

file_names <- basename(paths)

keys <- sprintf("%s/%s", "example_dir", file_names)

containerTemplateUtils::aws_s3_upload(path = paths,
                                      key = keys,
                                      error = TRUE,
                                      bucket = Sys.getenv("AWS_BUCKET"))

# aws keys will be "example_dir/<file_name>"

# Supplying a single file path and key with no file extension will
# result in the key being treated as a directory and the file being placed
# in that directory.

containerTemplateUtils::aws_s3_upload(path = "R/utils-aws-upload.R",
                                      key = "test/key/param",
                                      error = TRUE,
                                      bucket = Sys.getenv("AWS_BUCKET"))

# aws key will be "test/key/param/utils-aws-upload.R"

# Supplying a single file path and no key argument will result in the file
# being uploaded to the top level directory of the bucket.

containerTemplateUtils::aws_s3_upload(path = "R/utils-aws-upload.R",
                                      error = TRUE,
                                      bucket = Sys.getenv("AWS_BUCKET"))

# aws key will be "./utils-aws-upload.R"

# If the path argument is a folder, the key argument should also be a folder.
# Files from the folder will be uploaded into that directory.

containerTemplateUtils::aws_s3_upload(path = "R/",
                                      key = "test/upload_folder/",
                                      error = TRUE,
                                      bucket = Sys.getenv("AWS_BUCKET"))

# aws keys will be "test/upload_nested_folder/<files from R/>"

# If the path argument is a folder with sub-directories, the structure of
# the sub-directories will be preserved.

dir.create("example_with_sub_dirs")
dir.create("example_with_sub_dirs/sub_dir")
file.create("example_with_sub_dirs/sub_dir/test.txt")

containerTemplateUtils::aws_s3_upload(path = "example_with_sub_dirs/",
                                      key = "test/upload_nested_folder/",
                                      error = TRUE,
                                      bucket = Sys.getenv("AWS_BUCKET"))

# aws key will be "test/upload_nested_folder/example_with_sub_dirs/sub_dir/test.txt"

# If the path argument is a folder and no key argument is supplied,
# the local directory structure will be copied to the S3 bucket.

containerTemplateUtils::aws_s3_upload(path = "example_with_sub_dirs/",
                                      error = TRUE,
                                      bucket = Sys.getenv("AWS_BUCKET"))

# aws keys will be "R/<files from R/>"

# If the path argument is a folder and key is an empty string, then only
# the files from the folder will be uploaded.

containerTemplateUtils::aws_s3_upload(path = "R/",
                                      key = "",
                                      error = TRUE,
                                      bucket = Sys.getenv("AWS_BUCKET"))

# aws keys will be "./<files from R/>"

# The prefix argument can be used to add a directory to the beginning of
# a path in the AWS bucket. This can be used with files or folders.

containerTemplateUtils::aws_s3_upload(path = "R/",
                                      key = "example_r_scripts",
                                      error = TRUE,
                                      prefix = "my_example_prefix",
                                      bucket = Sys.getenv("AWS_BUCKET"))

# aws keys will be "my_example_prefix/example_r_scripts/<files from R/>"

# This can be useful if you're using version control
# systems like git and would like to organize files by branch

library(gert)
git_prefix <- gert::git_branch()

containerTemplateUtils::aws_s3_upload(path = "R/",
                                      key = "",
                                      error = TRUE,
                                      prefix = git_prefix,
                                      bucket = Sys.getenv("AWS_BUCKET"))

# aws keys will be "<current GIT branch>/<files from R/>"




## End(Not run)



