adjust.duplicateTimes | R Documentation
Duplicated DateTime values within ID are adjusted forward (recursively) by one second until no duplicates are present. This is considered a reasonable way of avoiding the nonsensical problem of duplicate times.
adjust.duplicateTimes(time, id)
time: vector of DateTime values
id: vector of ID values, matching DateTimes that are assumed sorted within ID
This function is used to resolve duplicate time records in animal track data by adjusting them forward, rather than removing the records completely.
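The core behaviour can be sketched in a few lines of R. The function below is an illustrative assumption about the logic, not the package's actual implementation: any record whose ID and time duplicate an earlier record is pushed forward by one second, and the pass repeats until every time within an ID is unique.

## Illustrative sketch of the adjustment logic (assumed, not the package source):
## push duplicated (id, time) pairs forward by one second, repeating until
## all times within each ID are unique.
adjustSketch <- function(time, id) {
  repeat {
    dup <- duplicated(paste(id, as.numeric(time)))
    if (!any(dup)) break
    time[dup] <- time[dup] + 1  ## POSIXct arithmetic is in seconds
  }
  time
}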
The adjusted DateTime vector is returned.
I have no idea what goes on at CLS when they output data that are either not ordered by time or have duplicates. If this problem exists in your data, it's probably worth finding out why.
readArgos
## DateTimes with a duplicate within ID
tms <- Sys.time() + c(1:6, 6, 7:10) * 10
id <- rep("a", length(tms))
range(diff(tms))
## duplicate record is now moved one second forward
tms.adj <- adjust.duplicateTimes(tms, id)
range(diff(tms.adj))
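As an additional illustration (not part of the original example), consecutive duplicates are resolved recursively: with three identical times in a row, each later duplicate ends up one second after the previous record.

## Illustrative only: three identical times within one ID
tms2 <- Sys.time() + c(1, 2, 2, 2, 3) * 60
id2 <- rep("a", length(tms2))
tms2.adj <- adjust.duplicateTimes(tms2, id2)
diff(tms2.adj)  ## the duplicates now sit one and two seconds after the first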