| Imports | R Documentation |
R6 Class for managing storage imports resource endpoints.
sevenbridges2::Resource -> Imports
URL: List of URL endpoints for this resource.
new(): Create a new Imports object.
Imports$new(...)
...: Other response arguments.
query(): This call lists import jobs initiated by a particular user. Note that when you import a file from your volume on your cloud storage provider (Amazon Web Services or Google Cloud Storage), you are creating an alias on the Platform which points to the file in your cloud storage bucket. Aliases appear as files on the Platform and can be copied, executed, and modified as such. They refer back to the respective file on the given volume.
Imports$query(
volume = NULL,
project = NULL,
state = NULL,
limit = getOption("sevenbridges2")$limit,
offset = getOption("sevenbridges2")$offset,
...
)
volume: Volume id or Volume object. List all imports from this particular volume. Optional.
project: Project id or Project object. List all volume imports to this particular project. Optional.
state: The state of the import job. Possible values are:
PENDING: the import is queued;
RUNNING: the import is running;
COMPLETED: the import has completed successfully;
FAILED: the import has failed.
Example:
state = c("RUNNING", "FAILED")
limit: The maximum number of collection items to return
for a single request. Minimum value is 1.
The maximum value is 100 and the default value is 50.
This is a pagination-specific attribute.
offset: The zero-based starting index in the entire collection
of the first item to return. The default value is 0.
This is a pagination-specific attribute.
...: Other arguments that can be passed to the core api() function, such as 'fields', etc.
Returns: Collection of Import objects.
\dontrun{
imports_object <- Imports$new(
auth = auth
)
# List import jobs
imports_object$query()
}
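The documented volume, state, limit and offset parameters can be combined for a filtered, paginated listing. A minimal sketch; the volume id below is a placeholder:
# List failed and running imports from one volume, 10 items per page
imports_object$query(
  volume = "rfranklin/my-volume",
  state = c("RUNNING", "FAILED"),
  limit = 10,
  offset = 0
)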
get(): This call returns the details of an import job.
Imports$get(id, ...)
id: The import job identifier (id).
...: Other arguments that can be passed to the core api() function, such as 'fields', etc.
Returns: Import object.
\dontrun{
imports_object <- Imports$new(
auth = auth
)
# Get import job by id
imports_object$get(id = id)
}
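A concrete sketch of fetching a single job: the id string is a placeholder, and reading the state field from the returned Import object is an assumption about that class:
# Fetch one import job by its identifier and inspect its state
# (placeholder id; the 'state' field is assumed to exist on Import)
imp <- imports_object$get(id = "import-job-id")
imp$state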
submit_import(): This call lets you queue a job to import a file or folder
from a volume into a project on the Platform.
Essentially, you are importing an item from your cloud storage provider
(Amazon Web Services, Google Cloud Storage, Azure or Ali Cloud) via the
volume onto the Platform.
If successful, an alias will be created on the Platform. Aliases appear
on the Platform and can be copied, executed, and modified as such.
They refer back to the respective item on the given volume.
If you want to import multiple files, the recommended way is to do it in bulk with the bulk_submit_import() method described below, keeping the API rate limit in mind.
Imports$submit_import(
source_volume,
source_location,
destination_project = NULL,
destination_parent = NULL,
name = NULL,
overwrite = FALSE,
autorename = FALSE,
preserve_folder_structure = NULL,
...
)
source_volume: Volume id or Volume object you want to import files or folders from.
source_location: File location name or folder prefix name on the volume you would like to import into some project/folder on the Platform.
destination_project: Destination project id or Project
object. Not required, but either
destination_project or destination_parent directory must be
provided.
destination_parent: Folder id or File object
(with type = 'FOLDER'). Not required, but either destination_project
or destination_parent directory must be provided.
name: The name of the alias to create. This name should be unique
to the project.
If the name is already in use in the project, you should
use the overwrite query parameter in this call to force any item with
that name to be deleted before the alias is created.
If name is omitted, the alias name will default to the last segment of
the complete location (including the prefix) on the volume.
Segments are considered to be separated with forward slashes /.
Allowed characters in file names are all alphanumeric and special
characters except forward slash /, while folder names can contain
alphanumeric and special characters _, - and ..
overwrite: Set to TRUE if you want to overwrite the item if
another one with the same name already exists at the destination.
Bear in mind that if used with folders import, the folder's content
(files with the same name) will be overwritten, not the whole folder.
autorename: Set to TRUE if you want to automatically rename the
item (by prefixing its name with an underscore and number) if another
one with the same name already exists at the destination.
Bear in mind that if used with folders import, the folder content will
be renamed, not the whole folder.
preserve_folder_structure: Set to TRUE if you want to keep the
exact source folder structure. The default value is TRUE if the item
being imported is a folder. Should not be used if you are importing a
file. Bear in mind that if you use preserve_folder_structure = FALSE,
the response will be the parent folder object containing imported files
alongside other files if they exist.
...: Other arguments that can be passed to the core api() function, such as 'fields', etc.
Returns: Import object.
\dontrun{
imports_object <- Imports$new(
auth = auth
)
# Submit new import into a project
imports_object$submit_import(
source_location = volume_file_object,
destination_project = test_project_object,
autorename = TRUE
)
}
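The same method can queue a folder import by pointing source_location at a folder prefix (note the trailing slash). A minimal sketch; the volume, folder and project ids below are placeholders:
# Import a folder, keep its structure, and overwrite a same-named item
imports_object$submit_import(
  source_volume = "rfranklin/my-volume",
  source_location = "my-folder/",
  destination_project = "rfranklin/my-project",
  name = "imported-folder",
  overwrite = TRUE,
  preserve_folder_structure = TRUE
)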
delete(): Import jobs cannot be deleted.
Imports$delete()
bulk_get(): This call returns the details of a bulk import job. Note that when you import files from your volume on a cloud storage provider (Amazon Web Services or Google Cloud Storage), you create an alias on the Platform which points to the files in your cloud storage bucket. Aliases appear as files on the Platform and can be copied, executed, and modified.
Imports$bulk_get(imports)
imports: The list of import job IDs as returned by the call to start a bulk import job, or a list of Import objects.
Returns: Collection with a list of Import objects.
\dontrun{
imports_object <- Imports$new(
auth = auth
)
# Get details of multiple import jobs
imports_object$bulk_get(
imports = list("import-job-id-1", "import-job-id-2")
)
}
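Import objects can also be passed directly, for example those returned by bulk_submit_import(). A sketch under the assumption that the returned Collection exposes its elements through an items field:
# Refresh details of previously submitted bulk import jobs
# ('items' is a prepared list as described for bulk_submit_import())
submitted <- imports_object$bulk_submit_import(items = items)
imports_object$bulk_get(imports = submitted$items)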
bulk_submit_import(): This call lets you perform a bulk import of files from your volume (either Amazon Web Services or Google Cloud Storage) into your project on the Platform.
You can use this call to import files into a specific folder or project, or to import a folder and its files into another destination folder while preserving the folder structure. One call can contain up to 100 items. Learn more about using the Volumes API for Amazon S3 and for Google Cloud Storage.
Imports$bulk_submit_import(items)
items: Nested list of elements containing information about each file/folder to be imported. For each element, users must provide:
source_volume - Volume object or its ID to import
files/folders from,
source_location - Volume-specific location pointing to the
file or folder to import.
This location should be recognizable to the underlying cloud
service as a valid key or path to the item. If the item being
imported is a folder, its path should end with a /.
Please note that if this volume was configured with a prefix
parameter when it was created, the value of prefix will be
prepended to the location before attempting to locate the item on
the volume.
destination_project - Project object or ID to import
files/folders into. Should not be used together with
destination_parent. If project is used, the items will be
imported to the root of the project's files.
destination_parent - File object of type 'folder' or its ID
to import files/folders into. Should not be used together with
destination_project. If parent is used, the import will take
place into the specified folder, within the project to which the
folder belongs.
name - The name of the alias to create.
This name should be unique to the project. If the name is already
in use in the project, you should use the autorename parameter
in this call to automatically rename the item (by prefixing its
name with an underscore and number).
If name is omitted, the alias name will default to the last
segment of the complete location (including the prefix) on the
volume. Segments are considered to be separated with forward
slashes ('/').
autorename - Whether to automatically rename the item
(by prefixing its name with an underscore and number) if another
one with the same name already exists at the destination.
preserve_folder_structure - Whether to keep the exact
source folder structure. The default value is TRUE if the item
being imported is a folder. Should not be used if you are
importing a file.
Example of the list:
items <- list(
list(
source_volume = 'rfranklin/my-volume',
source_location = 'chimeras.html.gz',
destination_project = 'rfranklin/my-project'
),
list(
source_volume = 'rfranklin/my-volume',
source_location = 'my-folder/',
destination_project = 'rfranklin/my-project',
autorename = TRUE,
preserve_folder_structure = TRUE
),
list(
source_volume = 'rfranklin/my-volume',
source_location = 'my-volume-folder/',
destination_parent = '567890abc1e5339df0414123',
name = 'new-folder-name',
autorename = TRUE,
preserve_folder_structure = TRUE
)
)
Read more on how to import folders from your volume into a project or a project folder.
Utility function prepare_items_for_bulk_import
can help you prepare the items parameter based on the provided
list of VolumeFile or VolumePrefix objects.
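A minimal sketch of that helper; the parameter names (volume_items, destination_project) and the input objects are assumptions here, so check the helper's own documentation for the exact signature:
# Build the items list from VolumeFile / VolumePrefix objects
# (volume_file_obj and volume_prefix_obj are placeholders)
items <- prepare_items_for_bulk_import(
  volume_items = list(volume_file_obj, volume_prefix_obj),
  destination_project = "rfranklin/my-project"
)
imports_object$bulk_submit_import(items = items)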
Returns: Collection with a list of Import objects.
\dontrun{
imports_object <- Imports$new(
auth = auth
)
# Submit new import into a project
imports_object$bulk_submit_import(items = list(
list(
source_volume = "rfranklin/my-volume",
source_location = "my-file.txt",
destination_project = test_project_object,
autorename = TRUE
),
list(
source_volume = "rfranklin/my-volume",
source_location = "my-folder/",
destination_parent = "parent-folder-id",
autorename = FALSE,
preserve_folder_structure = TRUE
)
)
)
}
clone(): The objects of this class are cloneable with this method.
Imports$clone(deep = FALSE)
deep: Whether to make a deep clone.
## ------------------------------------------------
## Method `Imports$query`
## ------------------------------------------------
## Not run:
imports_object <- Imports$new(
auth = auth
)
# List import jobs
imports_object$query()
## End(Not run)
## ------------------------------------------------
## Method `Imports$get`
## ------------------------------------------------
## Not run:
imports_object <- Imports$new(
auth = auth
)
# Get import job by id
imports_object$get(id = id)
## End(Not run)
## ------------------------------------------------
## Method `Imports$submit_import`
## ------------------------------------------------
## Not run:
imports_object <- Imports$new(
auth = auth
)
# Submit new import into a project
imports_object$submit_import(
source_location = volume_file_object,
destination_project = test_project_object,
autorename = TRUE
)
## End(Not run)
## ------------------------------------------------
## Method `Imports$bulk_get`
## ------------------------------------------------
## Not run:
imports_object <- Imports$new(
auth = auth
)
# Get details of multiple import jobs
imports_object$bulk_get(
imports = list("import-job-id-1", "import-job-id-2")
)
## End(Not run)
## ------------------------------------------------
## Method `Imports$bulk_submit_import`
## ------------------------------------------------
## Not run:
imports_object <- Imports$new(
auth = auth
)
# Submit new import into a project
imports_object$bulk_submit_import(items = list(
list(
source_volume = "rfranklin/my-volume",
source_location = "my-file.txt",
destination_project = test_project_object,
autorename = TRUE
),
list(
source_volume = "rfranklin/my-volume",
source_location = "my-folder/",
destination_parent = "parent-folder-id",
autorename = FALSE,
preserve_folder_structure = TRUE
)
)
)
## End(Not run)