This document gets you up and running with `hdfqlr`, an R interface to HDFql. In order to use this package, you will need to download HDFql for your system.
If you are going to be using HDFql regularly, it's a good idea to set a default HDFql directory for use with `hdfqlr`. You can do this in either of two ways, sketched below:

- Setting the R option `hdfqlr.dir` in your .Rprofile file.
- Setting the system environment variable `HDFQL_DIR`.
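For example (a minimal sketch; the install path is a placeholder, and `Sys.setenv()` only affects the current session, so the environment variable is usually set in your shell profile instead):

```r
# In your .Rprofile: set the default HDFql directory for hdfqlr
options(hdfqlr.dir = "path/to/hdfql-x.x.x")

# Alternatively, set the environment variable before loading the package
Sys.setenv(HDFQL_DIR = "path/to/hdfql-x.x.x")
```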
If either of these settings exists, the HDFql library will be loaded on package start:

```r
library(hdfqlr)
```
Otherwise, you can load the HDFql library after package start with `hql_load()`:

```r
hql_load('path/to/hdfql-x.x.x')
```
If you are on a Linux system, you will need to update the environment variable `LD_LIBRARY_PATH` to include the HDFql directories prior to using the package:

```{bash, eval = FALSE}
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:path/to/hdfql-x.x.x/lib
```
The `hdfqlr` package relies on the R wrapper provided by HDFql. Functions exported by the package are prefixed with `hql_` to make it easy to differentiate them from the functions provided by the wrapper, which are prefixed with `HDFQL_` (for constants) or `hdfql_` (for functions). The contents of the HDFql wrapper are imported into the environment `hql$wrapper`. If you need to use the HDFql wrapper functions directly in an interactive session, you can access them with `with`, or by attaching the environment to the search path. Any existing scripts written for use with the HDFql wrapper can therefore be run by attaching the `hql$wrapper` environment prior to running the script, or by wrapping the script in `with(hql$wrapper, ...)`.

```r
# using with
with(hql$wrapper, HDFQL_VERSION)

# or attach the environment
attach(hql$wrapper)
HDFQL_VERSION
detach(hql$wrapper)
```
The `hdfqlr` package is primarily designed for simple read and write use cases, i.e. inspecting, reading, and writing HDF datasets and attributes. In order to access HDF files, they must be opened or "used":

```r
file = tempfile(fileext = ".h5")
hql_create_file(file)
hql_use_file(file)
```
Creation of datasets and attributes is accomplished with the `hql_write_*` family of functions. Groups and sub-groups are created on the fly when writing datasets or attributes, but can also be explicitly created using `hql_create_group()`, as shown below.
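For instance (a minimal sketch; the group name is just an example):

```r
# explicitly create an empty group
hql_create_group("group2")
```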
The following example writes a matrix to the file. The dataset is then read back in and compared to the original R object. The function `hql_flush()` is used to ensure that any buffered data is written to the HDF file prior to reading the data back into R.

```r
values = matrix(rnorm(1000), nrow = 50)
hql_write_dataset(values, "dataset0")
hql_flush()
all.equal(values, hql_read_dataset("dataset0"), check.attributes = FALSE)
```
Any attributes of the R object (other than `dim`) are also written to the dataset. To write an attribute (or list of attributes) to the root of the HDF file or to a group, use `hql_write_attribute()` (or `hql_write_all_attributes()`).
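A minimal sketch of writing an attribute to a group (the attribute name and value are invented for illustration, and the argument layout is an assumption based on the function names above):

```r
# write a single attribute to a group; the path is assumed to
# include the attribute name (here, "author" on "group2")
author = "hdfqlr user"
hql_write_attribute(author, "group2/author")
```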
When writing character datasets, the maximum string length is used for all elements of the dataset.

```r
char.values = month.name
attr(char.values, "abb") = month.abb
hql_write_dataset(char.values, "group1/dataset1")
hql_flush()
hql_read_dataset("group1/dataset1", include.attributes = TRUE)
```
The package does not currently support editing existing datasets or attributes; this is instead accomplished by reading the dataset or attribute into an R object, modifying it, and writing it back to the HDF file using the argument `overwrite = TRUE`, as sketched below.
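A hedged sketch of this read-modify-write pattern (assuming `hql_write_dataset()` accepts the `overwrite` argument mentioned above):

```r
# read the existing dataset, modify it, and write it back
modified = hql_read_dataset("group1/dataset1")
modified[1] = "JANUARY"
hql_write_dataset(modified, "group1/dataset1", overwrite = TRUE)
```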
The package also provides options for listing the contents of an HDF file via the functions `hql_list_groups()`, `hql_list_datasets()`, and `hql_list_attributes()`. Both `hql_list_groups()` and `hql_list_datasets()` provide support for recursively listing sub-groups and sub-datasets.

```r
hql_list_groups()
hql_list_datasets()
hql_list_datasets(recursive = TRUE)
hql_list_attributes("group1/dataset1")
```
Datasets, attributes, and groups can be removed from an HDF file via the functions `hql_drop_dataset()`, `hql_drop_attribute()`, and `hql_drop_group()`.

```r
hql_drop_dataset("dataset0")
```
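The other two functions follow the same pattern (a sketch; the attribute and group paths here are assumptions based on the earlier examples):

```r
# drop an attribute by its full path within the file
hql_drop_attribute("group1/dataset1/abb")

# drop a group
hql_drop_group("group2")
```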
Once you're finished working with the HDF objects, close the file with `hql_close_file()`:

```r
hql_close_file(file)
```