tar_meta (R Documentation)
Description:

Read the metadata of all recorded targets and global objects.

Usage:
tar_meta(
  names = NULL,
  fields = NULL,
  targets_only = FALSE,
  complete_only = FALSE,
  store = targets::tar_config_get("store")
)
Arguments:

names:
Optional, names of the targets. If supplied, tar_meta() only returns
metadata on these targets. You can supply tidyselect helpers such as
any_of() and starts_with(). If NULL, all names are selected.

fields:
Optional, names of columns/fields to select. If supplied, tar_meta()
only returns the selected metadata columns. If NULL, all fields are
selected. The name column is always included first regardless of the
selection.

targets_only:
Logical, whether to just show information about targets or also return
metadata on functions and other global objects.

complete_only:
Logical, whether to return only complete rows (no NA values).

store:
Character of length 1, path to the targets data store. Defaults to
tar_config_get("store"), which in turn defaults to _targets/.
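For instance, a minimal sketch, assuming the pipeline from the examples
at the end of this page has already run, so dynamic branches of a target
y exist:

library(targets)
# Select two metadata fields for the dynamic branches of y.
# starts_with() and any_of() are tidyselect helpers.
tar_meta(names = starts_with("y_"), fields = any_of(c("time", "seconds")))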
Details:

A metadata row only updates when the target completes. tar_progress()
shows information on targets that are running. That is why the number
of branches may disagree between tar_meta() and tar_progress() for
actively running pipelines.
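A sketch of that difference, assuming a dynamic target y as in the
examples at the end of this page:

library(targets)
# Completed branches recorded in the metadata so far:
nrow(tar_meta(starts_with("y_"), targets_only = TRUE))
# Branches known to the progress data, including ones still running:
nrow(tar_progress(starts_with("y_")))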
Value:

A data frame with one row per target/object and the selected fields.
Storage access:

Several functions like tar_make(), tar_read(), tar_load(), tar_meta(),
and tar_progress() read or modify the local data store of the pipeline.
The local data store is in flux while a pipeline is running, and
depending on how distributed computing or cloud computing is set up,
not all targets can even reach it. So please do not call these
functions from inside a target as part of a running pipeline. The only
exception is literate programming target factories in the tarchetypes
package such as tar_render() and tar_quarto().
Cloud metadata:

Metadata files help targets read data objects and decide if the
pipeline is up to date. Usually, these metadata files live in the
local _targets/meta/ folder in your project, e.g. _targets/meta/meta.
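As a quick sketch, you can confirm that flat file exists after a run:

library(targets)
# The metadata lives in a single flat file inside the data store.
file.exists(file.path(tar_config_get("store"), "meta", "meta"))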
But in addition, if you set repository to anything other than "local"
in tar_option_set() in _targets.R, then tar_make() continuously
uploads the metadata files to the bucket you specify in resources.
tar_meta_delete() will delete those files from the cloud, and so will
tar_destroy() if destroy is set to either "all" or "cloud".
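A sketch of such a configuration in _targets.R (the bucket and prefix
below are hypothetical placeholders):

# _targets.R
library(targets)
tar_option_set(
  repository = "aws", # continuously upload data and metadata to S3
  resources = tar_resources(
    aws = tar_resources_aws(
      bucket = "my-example-bucket", # hypothetical bucket
      prefix = "my-pipeline"        # hypothetical prefix
    )
  )
)
list(
  tar_target(x, seq_len(2))
)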
Other functions in targets, such as tar_meta(), tar_visnetwork(),
tar_outdated(), and tar_invalidate(), use the local metadata only and
ignore the copies on the cloud. So if you are working on a different
computer than the one running the pipeline, you will need to download
the cloud metadata to your current machine using tar_meta_download().
Other functions such as tar_meta_upload(), tar_meta_sync(), and
tar_meta_delete() also manage metadata across the cloud and the local
file system.
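For example, a minimal sketch on a second machine, assuming the
project's _targets.R configures a cloud repository as described above:

library(targets)
# Pull the cloud copies of the metadata into the local _targets/meta/ folder.
tar_meta_download()
# Local-only functions now reflect the pipeline that ran elsewhere.
tar_outdated()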
Remarks:

The repository_meta option in tar_option_set() is actually what
controls where the metadata lives in the cloud, but it defaults to
repository (see the sketch after these remarks).

Like tar_make(), tar_make_future() and tar_make_clustermq() also
continuously upload metadata files to the cloud bucket specified in
resources.

tar_meta_download() and related functions need to run _targets.R to
detect the tar_option_set() options repository_meta and resources, so
please be aware of side effects that may happen when running your
custom _targets.R file.
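As a sketch of the first remark (the bucket name is hypothetical),
repository_meta can keep the metadata local while target data goes to
the cloud:

# _targets.R
library(targets)
tar_option_set(
  repository = "aws",        # target data goes to the cloud
  repository_meta = "local", # metadata stays on the local file system
  resources = tar_resources(
    aws = tar_resources_aws(bucket = "my-example-bucket") # hypothetical
  )
)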
See also:

Other metadata: tar_meta_delete(), tar_meta_download(),
tar_meta_sync(), tar_meta_upload()
Examples:

if (identical(Sys.getenv("TAR_EXAMPLES"), "true")) { # for CRAN
  tar_dir({ # tar_dir() runs code from a temp dir for CRAN.
    tar_script({
      library(targets)
      library(tarchetypes)
      list(
        tar_target(x, seq_len(2)),
        tar_target(y, 2 * x, pattern = map(x))
      )
    }, ask = FALSE)
    tar_make()
    tar_meta()
    tar_meta(starts_with("y_")) # see also any_of()
  })
}