View source: R/prob_algorithm.R
prob_algorithm | R Documentation |
prob_algorithm is a simple probabilistic algorithm to be used with geolocation data.
prob_algorithm(
particle.number = 100,
iteration.number = 10,
loess.quartile = NULL,
trn = trn,
sensor = sen,
act = act,
tagging.date = start,
retrieval.date = end,
tol = 0.08,
tagging.location = c(-36.816, -54.316),
boundary.box = c(-180, 180, -90, 90),
sunrise.sd = tw,
sunset.sd = tw,
range.solar = c(-7, -1),
speed.wet = c(20, 0.2, 25),
speed.dry = c(20, 0.2, 25),
sst.sd = 0.5,
max.sst.diff = 3,
ice.conc.cutoff = 1,
wetdry.resolution = 1,
east.west.comp = T,
land.mask = T,
med.sea = T,
black.sea = T,
baltic.sea = T,
caspian.sea = T,
backward = F,
distance.method = "ellipsoid",
NOAA.OI.location = "E:/environmental data/SST/NOAA OI SST V2"
)
particle.number |
number of particles for each location cloud used in the model |
iteration.number |
number of iterations |
loess.quartile |
quartiles for loessFilter (GeoLight), if NULL loess filter is not used |
trn |
data.frame containing twilights and at least tFirst, tSecond and type (same as computed by trn_to_dataframe, ipe_to_dataframe or lotek_to_dataframe) |
sensor |
data.frame with daily SST data deduced from tag temperature readings (sst_deduction output), NULL if no SST data is available (SST will not be used) |
act |
data.frame containing wet/dry data (e.g. .act file from Biotrack loggers or .deg file from Migrate Technology loggers), NULL if no wet/dry data is available (the algorithm will assume the logger was always dry) |
tagging.date |
deployment date as POSIXct or Date object |
retrieval.date |
retrieval date as POSIXct or Date object |
tol |
|
tagging.location |
tagging location longitude and latitude |
boundary.box |
min lon, max lon, min lat and max lat of the outermost boundary box within which the animal is expected to be |
sunrise.sd |
output vector from twilight_error_estimation |
sunset.sd |
output vector from twilight_error_estimation |
range.solar |
min and max of solar angle range in degree |
speed.wet |
optimal speed, speed standard deviation and max speed allowed if logger is wet in m/s |
speed.dry |
optimal speed, speed standard deviation and max speed allowed if logger is dry in m/s |
sst.sd |
SST standard deviation in degree C |
max.sst.diff |
max difference in SST allowed in degree C |
ice.conc.cutoff |
max percentage of sea ice in which the animal is believed to be |
wetdry.resolution |
sampling rate of conductivity switch in sec (e.g. MK15 & MK3006 sample every 3 sec) |
east.west.comp |
if T apply the Biotrack east-west movement compensation (Biotrack manual v11, page 31 ff.) |
land.mask |
if T animal is only using ocean areas, if F animal is only using land areas, if NULL no land mask used |
med.sea |
if T classify the Mediterranean Sea as land |
black.sea |
if T classify the Black Sea as land |
baltic.sea |
if T classify the Baltic Sea as land |
caspian.sea |
if T classify the Caspian Sea as land |
backward |
if T run the algorithm from retrieval to tagging date (end to start) |
distance.method |
|
NOAA.OI.location |
directory location of the NOAA OI V2 NetCDF files as well as the land mask file 'lsmask.oisst.v2.nc' (downloadable from http://www.esrl.noaa.gov/psd/data/gridded/data.noaa.oisst.v2.highres.html) |
Many weighting parameters can be used. Others, not yet implemented, include surface air temperature, air pressure, water salinity, and topography/bathymetry.
A list with: [1] all positions, [2] geographic median positions, [3] all possible particles, [4] input parameters, [5] model run time. List items 1 to 3 are returned as SpatialPointsDataFrame objects.
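The returned list can be pulled apart like this (a minimal sketch; `pr` is assumed to hold the output of a completed prob_algorithm() run, and the item order follows the description above):

```r
library(sp)  # SpatialPointsDataFrame class

all.positions <- pr[[1]]  # all estimated positions, one track per iteration
median.track  <- pr[[2]]  # geographic median position per twilight
particles     <- pr[[3]]  # all particles considered by the model
settings      <- pr[[4]]  # input parameters used for this run
run.time      <- pr[[5]]  # model run time

# SpatialPointsDataFrame objects convert to plain data.frames,
# e.g. to inspect the median coordinates:
head(as.data.frame(median.track))
```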
######################################
# example black browed albatross track
# define start and end datetimes ----
start <- as.POSIXct("2014-12-13 17:55", tz="UTC")
end <- as.POSIXct("2014-12-22 08:55", tz="UTC")
# light data ----
trn <- twilightCalc(BBA_lux$dtime, BBA_lux$lig, ask = FALSE, LightThreshold = 2, maxLight = 5)
# sst data ----
sen <- sst_deduction(datetime = BBA_sst$dtime, temp = BBA_sst$temp, temp.range = c(-2,30))
# wet dry data ----
act <- BBA_deg[BBA_deg$wet.dry=="wet",]
act$wetdry <- act$duration
# twilight error distribution estimation ----
tw <- twilight_error_estimation()
# download environmental data ----
# download yearly NetCDF files for (replace YEAR with appropriate number):
# daily mean SST -> 'sst.day.mean.YEAR.v2.nc'
# daily SST error -> 'sst.day.err.YEAR.v2.nc'
# daily mean sea ice concentration -> 'icec.day.mean.YEAR.v2.nc'
# from:
# https://www.esrl.noaa.gov/psd/data/gridded/data.noaa.oisst.v2.highres.html
# and place all into the same folder
# Also, download the land mask file: 'lsmask.oisst.v2.nc' from the same directory
# and place it in the same folder as all the other NetCDF files
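The downloads above can be scripted; the sketch below uses download.file() for one year of data. The base URL and file names are assumptions based on the naming scheme quoted above — verify them against the NOAA page before running, as NOAA has reorganised these files in the past.

```r
# assumed NOAA PSL download location - check the page linked above
base <- "https://downloads.psl.noaa.gov/Datasets/noaa.oisst.v2.highres"
year <- 2014
dest <- "environmental data"   # target folder, must already exist

files <- c(paste0("sst.day.mean.",  year, ".v2.nc"),  # daily mean SST
           paste0("sst.day.err.",   year, ".v2.nc"),  # daily SST error
           paste0("icec.day.mean.", year, ".v2.nc"),  # daily sea ice concentration
           "lsmask.oisst.v2.nc")                      # land mask

for (f in files) {
  download.file(paste(base, f, sep = "/"),
                file.path(dest, f), mode = "wb")
}
```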
# run algorithm ----
pr <- prob_algorithm(trn = trn,
sensor = sen[sen$SST.remove==F,],
act = act,
tagging.date = min(trn$tFirst),
retrieval.date = max(trn$tSecond),
loess.quartile = NULL,
tagging.location = c(-36.816,-54.316),
particle.number = 2000,
iteration.number = 100,
sunrise.sd = tw,
sunset.sd = tw,
range.solar = c(-7,-1),
boundary.box = c(-120,40,-90,0),
speed.dry = c(12,6,45),
speed.wet = c(1,1.3,5),
sst.sd = 0.5,
max.sst.diff = 3,
east.west.comp = T,
land.mask = T,
ice.conc.cutoff = 1,
tol = 0.08,
wetdry.resolution = 1,
distance.method = "ellipsoid",
NOAA.OI.location = "folder with environmental data and land mask")
# plot lat, lon, SST vs time ----
plot_timeline(pr, solar.angle = mean(pr[[2]]$median.solar.angle))
# plot lon vs lat map ----
plot_map(pr, legend.position = "topright")
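The estimated track can also be exported for use elsewhere (a minimal sketch; `pr` is assumed to hold the prob_algorithm() output from the example above):

```r
# convert the geographic median track (list item 2, a SpatialPointsDataFrame)
# to a plain data.frame and write it to CSV
med <- as.data.frame(pr[[2]])
write.csv(med, "median_track.csv", row.names = FALSE)
```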