If you do not have much free space on your local disk but a Hadoop cluster is close at hand, you can save your R workspace on HDFS.
This is especially useful on shared servers whose home directories are shared (via NFS, for example) and whose users need a large amount of RAM (> 10 GB).
First, set a few Hadoop-related environment variables, for example in /etc/R/Renviron.site:
HADOOP_BIN=/opt/hadoop-3.0.0-alpha4/bin
JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64/
HADOOP_HOME=/opt/hadoop-3.0.0-alpha4
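You can verify from an R session that the variables set in Renviron.site are actually picked up at startup. This is just a quick sanity check; the variable names match the snippet above:

```r
# Sanity check: these should print the paths configured in Renviron.site,
# not empty strings. Run this in a fresh R session after editing the file.
Sys.getenv(c("HADOOP_BIN", "HADOOP_HOME", "JAVA_HOME"))
```

If any of them prints as an empty string, R did not read your Renviron.site (check the file's location and permissions).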
To automatically save and restore a workspace stored on HDFS, define the .First and .Last functions, for example in /etc/R/Rprofile.site:
.First <- function() {
  if (interactive()) {
    require("HDFSWorkspace", quietly = TRUE)
    loadWorkspaceHDFS()
  }
}

.Last <- function() {
  if (interactive()) {
    require("HDFSWorkspace", quietly = TRUE)
    saveWorkspaceHDFS()
  }
}