min_lag | R Documentation

Description:

Calculate the minimum time interval (min_lag) between successive detections of the same transmitter on the same receiver, and add it to the detection data set to aid identification of potential false detections.
Usage:

min_lag(det)
Arguments:

det: A data frame containing detection data, including columns identifying the detection timestamp, transmitter codespace, transmitter ID, and receiver (e.g., a data frame returned by read_glatos_detections).
Details:

min_lag is loosely based on the "short interval" described by Pincock (2012) and replicates the min_lag column in the standard glatos detection export file. In that file, min_lag is defined for each detection as the shortest interval (in seconds) between that detection and either the previous or next detection (whichever is closer) of the same transmitter code (defined here as the combination of transmitter_codespace and transmitter_id) on the same receiver.
A new column (min_lag) is added to the input data frame, giving the time (in seconds) between the current detection and the nearest detection (before or after) of the same transmitter on the same receiver.
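The logic above can be sketched in base R. This is a minimal illustration, not the glatos implementation; the toy data frame and the column names (detection_timestamp_utc, transmitter_codespace, transmitter_id, receiver_sn) are assumptions about the input layout:

```r
# Toy detection data: three detections of tag 100 and one of tag 200,
# all on one receiver (column names are assumptions, not a spec)
det <- data.frame(
  detection_timestamp_utc = as.POSIXct(
    c("2020-01-01 00:00:00", "2020-01-01 00:00:30",
      "2020-01-01 00:10:00", "2020-01-01 00:00:10"),
    tz = "UTC"),
  transmitter_codespace = "A69-1601",
  transmitter_id = c("100", "100", "100", "200"),
  receiver_sn = "450000",
  stringsAsFactors = FALSE
)

# Sketch of the min_lag idea: for each transmitter-receiver combination,
# the shortest gap (seconds) to the previous or next detection
min_lag_sketch <- function(det) {
  key <- paste(det$transmitter_codespace, det$transmitter_id,
               det$receiver_sn)
  det$min_lag <- NA_real_
  for (k in unique(key)) {
    i <- which(key == k)
    o <- i[order(det$detection_timestamp_utc[i])]      # time order
    t <- as.numeric(det$detection_timestamp_utc[o])
    if (length(o) > 1) {
      d <- diff(t)                                     # successive gaps
      # min of gap-to-previous and gap-to-next for each detection
      det$min_lag[o] <- pmin(c(Inf, d), c(d, Inf))
    }
  }
  det
}

det <- min_lag_sketch(det)
det$min_lag
# 30 30 570 NA  (tag 200 was detected only once, so its min_lag is NA)
```

Note that a tag detected only once on a receiver has no previous or next detection, so this sketch leaves its min_lag as NA.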
Value:

The input object with a new column, min_lag (defined above), added.
Author(s):

Chris Holbrook, Todd Hayden, Angela Dini
References:

Pincock, D. G. (2012). False detections: what they are and how to remove them from detection data. Vemco Division, Amirix Systems Inc., Halifax, Nova Scotia. http://www.vemco.com/pdf/false_detections.pdf
See Also:

false_detections
Examples:

# load example detection file
det_file <- system.file("extdata", "walleye_detections.csv",
                        package = "glatos")
det <- read_glatos_detections(det_file)

# rename existing min_lag column
colnames(det)[colnames(det) == "min_lag"] <- "min_lag.x"

# calculate min_lag
det <- min_lag(det)

head(det)
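A common follow-on use of the new column (see false_detections) is to flag detections whose min_lag exceeds a chosen time threshold, since an isolated detection with no nearby repeat of the same tag is more likely to be false (Pincock 2012). A minimal sketch with toy values; the 3600 s threshold and the treatment of NA are illustrative assumptions, not glatos defaults:

```r
# Toy min_lag values (seconds); NA = tag detected only once
min_lag_s <- c(25, 4000, NA, 120)

# Illustrative threshold (an assumption, not a glatos default)
tf <- 3600

# Flag detections lacking another detection of the same tag within tf
# seconds; NA (no second detection at all) is also flagged here
suspect <- is.na(min_lag_s) | min_lag_s > tf
suspect
# FALSE TRUE TRUE FALSE
```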