```r
# Vignette setup
knitr::opts_chunk$set(
  collapse = TRUE,
  comment = "#>",
  eval = FALSE,
  purl = FALSE
)
library(httptest2)
start_vignette("Local Data Collection")
```

```r
library(microinverterdata)
```
For inverters that do not collect historical data themselves, you may want to do this collection with R.
Here are some ideas on how to achieve it.
## pins storage

```r
library(pins)

board <- board_local()

# Create the pin with an initial snapshot if it does not exist yet
if (!"inverter_data" %in% pin_list(board)) {
  initial_data <- tibble::tibble(
    date = Sys.time(),
    get_output_data(c("192.168.0.175"))
  )
  board |> pin_write(initial_data, name = "inverter_data", versioned = TRUE)
}
```
```r
# Read the history already stored in the pin
history <- board |> pin_read("inverter_data")

# Append a new measurement and write the result back as a new version
new_data <- tibble::tibble(
  date = Sys.time(),
  get_output_data(c("192.168.0.175"))
)
board |> pin_write(rbind(history, new_data), name = "inverter_data", versioned = TRUE)

# List the stored versions of the pin
board |> pin_versions("inverter_data")
```
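Once a few versions have accumulated, the pin can be read back for analysis. As a minimal illustration (assuming dplyr is available; the only column used is `date`, which is set at write time above), you could check how many measurements were collected per day:

```r
# Read back the accumulated history and count measurements per day
history <- board |> pin_read("inverter_data")

history |>
  dplyr::mutate(day = as.Date(date)) |>
  dplyr::count(day)
```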
Now that we know the dynamic behavior, we can move it to an R script and run it on a regular basis with system tools. You can use and edit the following file as a baseline:

```r
system.file("inverter_data.R", package = "microinverterdata")
```

Set up (or remove) the environment variables required for the script to run, and finally save the modified script in an accessible folder.
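As a purely illustrative sketch (this is not the packaged `inverter_data.R`, whose content may differ), a standalone collection script could look like the following, assuming a local pins board, one inverter at 192.168.0.175, and a hypothetical `INVERTER_IP` environment variable:

```r
# Illustrative standalone collection script (not the packaged inverter_data.R)
library(pins)
library(microinverterdata)

# The device address can come from an environment variable
# (INVERTER_IP is a hypothetical name used only in this sketch)
device_ip <- Sys.getenv("INVERTER_IP", unset = "192.168.0.175")

board <- board_local()

# Take one measurement, timestamped at collection time
new_data <- tibble::tibble(
  date = Sys.time(),
  get_output_data(c(device_ip))
)

# Create the pin on first run, otherwise append to the existing history
if (!"inverter_data" %in% pin_list(board)) {
  board |> pin_write(new_data, name = "inverter_data", versioned = TRUE)
} else {
  history <- board |> pin_read("inverter_data")
  board |>
    pin_write(rbind(history, new_data), name = "inverter_data", versioned = TRUE)
}
```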
On Linux, the system tool `crontab` is the tool of choice for job scheduling. You can list your current entries with:

```sh
crontab -l
```

The last step here is to configure the crontab to run the script every 30 minutes, as in the following crontab entry:

```
# m h dom mon dow command
0,30 * * * * R CMD BATCH /path/of/modified/inverter_data.R
```
On Windows, depending on your version, the last step here is to configure the Task Scheduler or the PowerShell PSScheduledJob module to run it.
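As an illustration only (this package is not used by the vignette), the taskscheduleR package wraps the Windows Task Scheduler from R; a minimal sketch of a 30-minute schedule, assuming the modified script was saved under a hypothetical path `C:/scripts/inverter_data.R`, could look like this:

```r
# Illustrative sketch: register the collection script with the Windows
# Task Scheduler every 30 minutes via the taskscheduleR package.
# The script path is a hypothetical example; adjust it to your own setup.
library(taskscheduleR)

taskscheduler_create(
  taskname  = "inverter_data_collection",
  rscript   = "C:/scripts/inverter_data.R",
  schedule  = "MINUTE",
  modifier  = 30,
  starttime = format(Sys.time() + 60, "%H:%M")
)
```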
```r
end_vignette()
```