As the project has grown, we've added a few processes around updating parts of the package.
Adding new methods to the dust object requires a few steps (most of which, if missed, will be caught by CI).
The example models live in src/ (sir.cpp, sirs.cpp, variable.cpp, volatility.cpp, walk.cpp). While developing, it can be useful to remove all but one of these, particularly if your changes break compilation. Then you'll be in a position to iterate faster and control where the compilation errors are thrown (package tests will fail with this change, but the package will work fine). You can use ./scripts/remove_examples to automate this. When editing an example, run ./scripts/update_example, and be sure to edit both the .cpp and .hpp files or you will get compilation errors as the definitions will not match.

When adding new methods or otherwise changing the generated interface, run ./scripts/update_dust_generator before running devtools::document(); running make roxygen will do this for you. The precomputed GPU vignette can be rebuilt with ./scripts/build_gpu_vignette.
All PRs, no matter how small, must increase the version number; this is enforced by GitHub Actions. We aim to use semantic versioning as much as is reasonable, but our main aim is that all commits to master are easily findable in future. Update the version in the package DESCRIPTION before the GitHub Actions checks will pass. If you change anything in inst/include/dust then you also need to run ./scripts/update_version to reflect the new version number into the header file contents; this is also automatically checked on GitHub Actions.

We keep the random number library in inst/include/dust so that it does not depend on anything else in the source tree and could be reused in other projects (R or otherwise).
To update the underlying generator code against the reference implementations at https://prng.di.unimi.it/, run the script at ./extra/generate.sh, which will download, compile, and run small programs with the upstream implementation and write out reference output in the test directory.
As the project has become more complex, keeping the headers under control has become harder. Basic principles here:

- random/random.hpp imports all distributions alphabetically, separately from the generators, so that it's clearer what is included.
- Interface headers live in dust/r/; within these files throw errors only with throw, not with cpp11::stop (these errors will be correctly caught).
- The script scripts/check_headers will validate that headers are self contained and that only interface headers include cpp11 files (directly or indirectly).
There are lots of places to consider putting documentation:

- R help pages (man/).
- Vignettes (vignettes/); most build as usual, but a few are precomputed (see vignettes_src/) where they need special resources such as access to a GPU or more cores than usually available.
- The C++ documentation: ./scripts/docs_build_cpp will build these at sphinx/_build/html for local development.

To debug GPU code, run R under cuda-gdb:

R -d cuda-gdb
It might be useful to set this:
set cuda api_failures stop
Then start the process by running r <enter>
and all the usual gdb things work reasonably well.
To find memory errors, compile a model with gpu = dust::dust_cuda_options(debug = TRUE)
to enable debug symbols, then run with
R -d cuda-memcheck
which will report the location of the invalid access.
Using printf()
within kernels works fine, though it does make a mess of the screen.
To track down unwanted double-precision (f64) usage in kernels, you want the -warn-double-usage argument, passed via -Xptxas:
gpu <- dust::dust_cuda_options(
fast_math = TRUE, profile = FALSE, quiet = FALSE, debug = FALSE,
flags = paste("--keep --source-in-ptx --generate-line-info",
"-Xptxas -warn-double-usage"))
The additional flags are required to make this nice to use:

- --source-in-ptx: interleaves the source with the ptx so you know where the f64 calls come from
- --keep: retains the ptx so that you can read it (otherwise it is deleted)
- --generate-line-info: required for the --source-in-ptx option to do anything