
if (getRversion() >= "3.1.0") {
  utils::globalVariables(c("distance",
                           "from", "i.size", "ind", "indClDT", "initialPixels",
                           "n", "newQuantity", "numNeighs", "numRetries", "origIndex", "pixels",
                           "quantityAdj", "quantityAdj2", "state", "size", "tooBigByNCells", "V1", "proportion"
  ))
}

################################################################################
#' Simulate a contagious spread process on a landscape, with data.table internals
#'
#' This can be used to simulate fires, seed dispersal, calculation of iterative,
#' concentric, symmetric (currently) landscape values and many other things.
#' Essentially, it starts from a collection of cells (\code{start}, called "events")
#' and spreads to neighbours, according to the \code{directions}
#' and \code{spreadProb} with modifications due to other arguments. \bold{NOTE:}
#' the \code{spread} function is similar, but sometimes slightly faster, but less
#' robust, and more difficult to use iteratively.
#'
#' There are 2 main underlying algorithms for active cells to "spread" to
#' nearby (adjacent) cells: \code{spreadProb} and \code{neighProbs}.
#' Using \code{spreadProb}, every "active" pixel will assess all
#' neighbours (either 4 or 8, depending on \code{directions}), and will "activate"
#' whichever neighbours successfully pass independent calls to
#' \code{runif(1,0,1) < spreadProb}.
#' The algorithm will iterate again and again, each time starting from the newly
#' "activated" cells. Several built-in decisions are as follows.
#' 1. No active cell can activate a cell that was already activated by
#' the same event (i.e., "it won't go backwards"). 2. If \code{allowOverlap} is
#' \code{FALSE}, then the previous rule will also apply, regardless of which
#' "event" caused the pixels to be previously active.
#'
#' This function can be interrupted before all active cells are exhausted if
#' the \code{iterations} value is reached before there are no more active
#' cells to spread2 into. The interrupted output (a data.table) can be passed
#' subsequently as an input to this same function (as \code{start}).
#' This is intended to be used for situations where external events happen during
#' a spread2 event, or where one or more arguments to the spread2 function
#' change before a spread2 event is completed.
#' For example, the \code{spreadProb} may need to change before a
#' spread2 event is completed because, say, a fire is spreading and a
#' new set of conditions arises due to a change in weather.
#'
#' \code{asymmetry} here is slightly different than in the \code{spread} function,
#' so that it can deal with a \code{RasterLayer} of \code{asymmetryAngle}.
#' Here, the \code{spreadProb} values of a given set of neighbours around each active pixel
#' are adjusted to create \code{adjustedSpreadProb}, which is calculated to maintain the
#' following two qualities: \deqn{mean(spreadProb) = mean(adjustedSpreadProb)} and
#' \deqn{max(spreadProb)/min(spreadProb) = asymmetry} along the axis of
#' \code{asymmetryAngle}. NOTE: this means that the 8 neighbours around an active
#' cell may not fulfill the preceding equality if \code{asymmetryAngle} is not
#' exactly one of the 8 angles of the 8 neighbours. This means that
#' \deqn{max(spreadProb)/min(spreadProb)} will generally be less than
#' \code{asymmetry}, for the 8 neighbours. The exact adjustment to the spreadProb
#' is calculated with:
#' \deqn{angleQuality <- (cos(angles - rad(asymmetryAngle))+1)/2}
#' which is multiplied to get an angle-adjusted spreadProb:
#' \deqn{spreadProbAdj <- actualSpreadProb * angleQuality}
#' which is then rescaled:
#' \deqn{adjustedSpreadProb = (spreadProbAdj - min(spreadProbAdj)) * par2 + par1},
#' where par1 and par2 are parameters calculated internally to make the 2 conditions above true.
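#'
#' As an illustration of the \code{angleQuality} term only (this sketch simply
#' evaluates the formula above for the 8 queen-case directions; the internal
#' helper derives the angles from pixel coordinates):
#' \preformatted{
#'   angles <- seq(0, 315, by = 45) * pi / 180      # 8 neighbour directions, radians
#'   asymmetryAngle <- 90                           # degrees; 0 is "up" (North)
#'   angleQuality <- (cos(angles - asymmetryAngle * pi / 180) + 1) / 2
#'   round(angleQuality, 2)   # largest at 90 degrees, smallest at 270 degrees
#' }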
#'
#' @section Breaking out of spread2 events:
#'
#' There are 3 ways for the spread2 to "stop" spreading.
#' Here, each "event" is defined as all cells that are spawned from each unique
#' \code{start} location.
#' The ways outlined below are all acting at all times, i.e., they are not
#' mutually exclusive.
#' Therefore, it is the user's responsibility to make sure the different rules
#' are interacting with each other correctly.
#'
#' \tabular{ll}{
#'   \code{spreadProb} \tab Probabilistically, if spreadProb is low enough,
#'                          active spreading events will stop. In practice,
#'                          this number generally should be below 0.3 to actually
#'                          see an event stop\cr
#'   \code{maxSize} \tab This is the maximum number of cells that can be "successfully"
#'                       turned on during a spreading event. \code{spreadProb} will still
#'                       be active, so it is possible that the end size of each event
#'                       is smaller than \code{maxSize}, but it will never be greater
#'                       than \code{maxSize} (see the sketch following this table)\cr
#'   \code{exactSize} \tab This is the number of cells that are "successfully" turned
#'                       on during a spreading event. This will override an event that
#'                       stops probabilistically via \code{spreadProb}, by forcing
#'                       its last set of active cells to try again to find neighbours.
#'                       It will try 10 times per event, before giving up.
#'                       During those 10 times, it will try twice to "jump" up to
#'                       4 cells outwards from each of the active cells.\cr
#'   \code{iterations} \tab This is a hard cap on the number of internal iterations to
#'                          complete before returning the current state of the system
#'                          as a \code{data.table}.\cr
#' }
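#'
#' As a sketch of \code{maxSize} acting as a cap (toy landscape, assuming the
#' \code{raster} package is loaded; values illustrative only):
#' \preformatted{
#'   a <- raster(nrows = 20, ncols = 20, xmn = 0, xmx = 20, ymn = 0, ymx = 20)
#'   out <- spread2(a, start = c(25, 355), spreadProb = 1, maxSize = c(10, 30),
#'                  asRaster = FALSE)
#'   attr(out, "spreadState")$clusterDT$size   # per-event sizes, never above maxSize
#' }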
#'
#' @param landscape Required. A \code{RasterLayer} object. This defines the possible
#'                  locations for spreading events to start and spread2 into.
#'
#' @param start Required. Either a vector of pixel numbers to initiate spreading, or a
#'              data.table that is the output of a previous \code{spread2}.
#'              If a vector, they should be cell indices (pixels) on the \code{landscape}.
#'              If user has x and y coordinates, these can be converted with
#'              \code{\link[raster]{cellFromXY}}.
#'
#' @param spreadProb  Numeric of length 1 or \code{RasterLayer}.
#'                    If numeric of length 1, then this is the global (absolute)
#'                    probability of spreading into each cell from a neighbour.
#'                    If a raster, then this must be the cell-specific (absolute)
#'                    probability of a "receiving" potential cell.
#'                    Default is \code{0.23}.
#'                    If relative probabilities are required, use \code{spreadProbRel}.
#'                    If used together, then the relative probabilities will be
#'                    re-scaled so that the mean relative probability of potential
#'                    neighbours is equal to the mean of \code{spreadProb} of
#'                    the potential neighbours.
#'
#' @param persistProb Numeric of length 1 or \code{RasterLayer}.
#'                    If numeric of length 1, then this is the global (absolute)
#'                    probability of a cell continuing to burn per time step.
#'                    If a raster, then this must be the cell-specific (absolute)
#'                    probability of a fire persisting.
#'                    Default is \code{NA}, which is the same as 0, i.e. a cell only burns
#'                    for one time step.
#'
#' @param spreadProbRel Optional \code{RasterLayer} indicating a surface of relative
#'                      probabilities useful when using \code{neighProbs} (which
#'                      provides a mechanism for selecting a specific number of
#'                      cells at each iteration).
#'                      This indicates the relative probabilities for the selection
#'                      of successful neighbours.
#'                      \code{spreadProb} will still be evaluated \emph{after}
#'                      the relative probabilities and \code{neighProbs} have been
#'                      evaluated, i.e., potential cells will be identified, then
#'                      some could be rejected via \code{spreadProb}.
#'                      If absolute \code{spreadProb} is not desired,
#'                      \emph{be sure to set} \code{spreadProb = 1}.
#'                      Ignored if \code{neighProbs} is not provided.
#'
#' @param asRaster Logical, length 1. If \code{TRUE}, the function will return a \code{Raster}
#'                 where non-NA values indicate the cells that were "active", and the
#'                 value identifies the event (the \code{id} of its initial starting pixel).
#'
#' @param maxSize       Numeric. Maximum number of cells for a single event or for
#'                      all events to be spread2. Recycled to match \code{start} length,
#'                      if it is not as long as \code{start}. This will be overridden if
#'                      \code{exactSize} is also provided.
#'                      See section on \code{Breaking out of spread2 events}.
#'
#' @param exactSize Numeric vector, length 1 or \code{length(start)}.
#'                  Similar to \code{maxSize}, but these will be the exact
#'                  final sizes of the events, i.e., the spread2 events
#'                  will continue until they reach \code{floor(exactSize)} cells.
#'                  This will override \code{maxSize} if both are provided.
#'                  See Details.
#'
#' @param directions    The number of adjacent cells in which to look;
#'                      default is 8 (Queen case). Can only be 4 or 8.
#'
#' @param iterations    Number of iterations to spread2.
#'                      Leaving this at its default (a large number, \code{1e6L}) allows
#'                      the spread2 to continue until it stops spreading on its own
#'                      (i.e., exhausts itself).
#'
#' @param returnDistances Logical. Should the function include a column with the
#'                      individual cell distances from the locus where that event
#'                      started? Default is \code{FALSE}. See Details.
#'
#' @param returnFrom Logical. Should the function return a column with the
#'                      source, i.e., the lag 1 "from" pixel, for each iteration?
#'
#' @param circle  Logical. If TRUE, then outward spread2 will be by equidistant rings,
#'                rather than solely by adjacent cells (via \code{directions} arg.).
#'                Default is \code{FALSE}.
#'                Using \code{circle = TRUE} can be dramatically slower for large problems.
#'                Note, this will likely create unexpected results if \code{spreadProb < 1}.
#'
#' @param skipChecks Logical. If TRUE, the argument checking (i.e., assertions) will be
#'              skipped. This should likely only be used once it is clear that the function
#'              arguments are well understood and function speed is of primary importance.
#'              This is likely most useful in repeated iteration cases, i.e., if this call
#'              is using the previous output from this same function.
#'
#' @param neighProbs An optional numeric vector, whose sum is 1. It indicates the
#'                   probabilities that an individual
#'                   spread iteration will spread to \code{1, 2, ..., length(neighProbs)}
#'                   neighbours, respectively. If this is used (i.e., something other than
#'                   NA), \code{circle} and \code{returnDistances} will not work currently.
#'
#' @param asymmetry     A numeric or \code{RasterLayer} indicating the ratio of the
#'                      asymmetry to be used, i.e., 1 is no asymmetry; 2 means that the
#'                      \code{spreadProb} values in the direction of \code{asymmetryAngle}
#'                      are 2x the \code{spreadProb} values in the direction opposite to
#'                      \code{asymmetryAngle}. Default is \code{NA}, indicating no asymmetry.
#'                      See details. This is still experimental.
#'                      Use with caution.
#'
#' @param asymmetryAngle A numeric or \code{RasterLayer} indicating the angle in degrees
#'                      (0 is "up", as in North on a map),
#'                      that describes which way the \code{asymmetry} is.
#' @param allowOverlap Logical. If TRUE, then individual events can overlap with one another,
#'                     i.e., they do not interact (this is slower than if
#'                     allowOverlap = FALSE). Default is FALSE. This can also be \code{NA},
#'                     which means that the event can overlap with other events,
#'                     and also itself. This would be, perhaps, useful for dispersal of,
#'                     say, insect swarms.
#'
#' @param plot.it  If TRUE, then plot the raster at every iteration,
#'                   so one can watch the spread2 event grow.
#'
#' @inheritParams spread
#'
#' @details
#'
#' If \code{exactSize} or \code{maxSize} are used, then spreading will continue and stop
#' before or at \code{maxSize}, or at \code{exactSize}. If \code{iterations} is specified,
#' then the function will end, and the returned \code{data.table} may (if \code{maxSize})
#' or will (if \code{exactSize}) still have at least one active
#' cell per event that did not already achieve \code{maxSize} or \code{exactSize}. This
#' is very useful for building new, customized higher-level wrapper functions that iteratively
#' call \code{spread2}.
#'
#' @note
#' \code{exactSize} may not be achieved if there aren't enough cells in the map.
#' Also, \code{exactSize} may not be achieved because the active cells are "stuck",
#' i.e., they have no unactivated cells to move to; or the \code{spreadProb} is low.
#' In the latter two cases, the algorithm will retry again, but it will only
#' re-try from the last iteration's active cells.
#' The algorithm will only retry 10 times before quitting.
#' Currently, there will also be an attempt to "jump" up to four cells away from
#' the active cells to try to continue spreading.
#'
#' A common way to use this function is to build wrappers around this, followed
#' by iterative calls in a \code{while} loop. See example.
#'
#' @section Building custom spreading events:
#'
#' This function can be used iteratively, with relatively little overhead compared to using
#' it non-iteratively. In general, this function can be called with arguments set as the user
#' needs, specifying, say, \code{iterations = 1}. This means that the function will spread
#' outwards 1 iteration, then stop. The returned object will be a \code{data.table} or
#' \code{RasterLayer} that can be passed immediately back as the \code{start} argument into
#' a subsequent call to \code{spread2}. This means that every argument can be updated at
#' each iteration.
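#'
#' A minimal sketch of that pattern (toy landscape, assuming the \code{raster}
#' package is loaded; the stopping test simply checks whether any cells are still active):
#' \preformatted{
#'   a <- raster(nrows = 50, ncols = 50, xmn = 0, xmx = 50, ymn = 0, ymx = 50)
#'   out <- spread2(a, start = sort(sample(ncell(a), 3)), spreadProb = 0.3,
#'                  iterations = 1, asRaster = FALSE)
#'   while (any(out$state == "activeSource")) {
#'     # any argument (e.g., spreadProb) could be changed here before the next call
#'     out <- spread2(a, start = out, spreadProb = 0.3, iterations = 1, asRaster = FALSE)
#'   }
#' }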
#'
#' When using this function iteratively, there are several things to keep in mind.
#' The output will likely be sorted differently than the input (i.e., the
#' order of start, if a vector, may not be the same order as that returned).
#' This means that when passing the same object back into the next iteration of the
#' function call, \code{maxSize} or \code{exactSize} may not be in the same order.
#' To get the same order, the easiest approach is to sort the initial \code{start}
#' objects by their pixel location, increasing, and then, of course, to sort any
#' vectorized arguments (e.g., \code{maxSize}) accordingly.
#'
#' \bold{NOTE}: the \code{data.table} or \code{RasterLayer} should not be altered
#' when passed back into \code{spread2}.
#'
#' @return
#' Either a \code{data.table} (\code{asRaster=FALSE}) or a \code{RasterLayer}
#' (\code{asRaster=TRUE}, the default).
#' The \code{data.table} will have one attribute named \code{spreadState}, which
#' is a list containing a \code{data.table} of current cluster-level information
#' about the spread events.
#' If \code{asRaster=TRUE}, then the \code{data.table} (with its \code{spreadState}
#' attribute) will be attached to the \code{Raster} as an attribute named \code{pixel} as it
#' provides pixel-level information about the spread events.
#'
#' The \code{RasterLayer} represents every cell in which a successful spread2 event occurred.
#' For the case of, say, a fire this would represent every cell that burned.
#' If \code{allowOverlap} is \code{TRUE}, the return will always be a \code{data.table}.
#'
#' If \code{asRaster} is \code{FALSE}, then this function returns a
#' \code{data.table} with 3 (or 4 if \code{returnFrom} is \code{TRUE}) columns:
#'
#' \tabular{ll}{
#'   \code{initialPixels} \tab the initial cell number of that particular
#'                            spread2 event.\cr
#'   \code{pixels} \tab The cell indices of cells that have
#'                        been touched by the spread2 algorithm.\cr
#'   \code{state} \tab a character string indicating whether the cell is active (i.e.,
#'                        could still be a source for spreading) or not (no
#'                        spreading will occur from these cells).\cr
#'   \code{from} \tab The pixel indices that were the immediately preceding
#'                    "source" for each \code{pixels}, i.e., the lag 1 pixels.
#'                    Only returned if \code{returnFrom} is \code{TRUE} \cr
#' }
#'
#' The attribute saved with the name "spreadState" (e.g., \code{attr(output, "spreadState")})
#' includes a \code{data.table} with columns:
#' \tabular{ll}{
#'   \code{id} \tab An arbitrary code, from 1 to \code{length(start)} for each "event".\cr
#'   \code{initialPixels} \tab the initial cell number of that particular
#'                            spread2 event.\cr
#'   \code{numRetries} \tab The number of re-starts the event did because it got
#'                          stuck (normally only because \code{exactSize} was used
#'                          and was not achieved).\cr
#'   \code{maxSize} \tab The number of pixels that were provided as inputs via
#'                      \code{maxSize} or \code{exactSize}.\cr
#'   \code{size} \tab The current size, in pixels, of each event.\cr
#' }
#' and several other objects that provide significant speed ups in iterative calls to
#' \code{spread2}. If the user runs \code{spread2} iteratively, there will likely be significant
#' speed gains if the \code{data.table} passed to \code{start} has this attribute
#' attached, or re-attached if it was lost, e.g., via
#' \code{setattr(outInput, "spreadState", attr(out, "spreadState"))}, where \code{out} is the
#' returned \code{data.table} from the previous call to \code{spread2}, and \code{outInput} is
#' the modified \code{data.table}. Currently, the modified \code{data.table} \bold{must} have the
#' same order as \code{out}.
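#'
#' A minimal sketch of re-attaching the attribute (hypothetical objects, assuming the
#' \code{raster} package is loaded; re-attaching is only needed if the attribute was
#' lost while modifying the \code{data.table}):
#' \preformatted{
#'   a <- raster(nrows = 20, ncols = 20, xmn = 0, xmx = 20, ymn = 0, ymx = 20)
#'   out <- spread2(a, start = 210, spreadProb = 0.3, iterations = 1, asRaster = FALSE)
#'   outInput <- data.table::copy(out)    # user modifications; keep the same row order
#'   data.table::setattr(outInput, "spreadState", attr(out, "spreadState"))
#'   out2 <- spread2(a, start = outInput, iterations = 1, asRaster = FALSE)
#' }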
#'
#' @author Eliot McIntire and Steve Cumming
#' @export
#' @importFrom bit bit
#' @importFrom checkmate assert assertClass assertNumeric
#' @importFrom checkmate checkClass checkDataTable checkLogical checkNumeric checkScalarNA
#' @importFrom checkmate qassert
#' @importFrom data.table := alloc.col as.data.table copy data.table is.data.table
#' @importFrom data.table rbindlist set setattr setcolorder setkeyv setnames uniqueN
#' @importFrom ff ff
#' @importFrom fpCompare %<=% %>>%
#' @importFrom magrittr %>%
#' @importFrom quickPlot Plot
#' @importFrom raster ncell raster res ncol pointDistance
#' @importFrom stats runif
#'
#' @seealso \code{\link{spread}} for a different implementation of the same algorithm.
#' \code{spread} is less robust but it is often slightly faster.
#'
#' @example inst/examples/example_spread2.R
#'
spread2 <- function(landscape, start = ncell(landscape) / 2 - ncol(landscape) / 2,
                    spreadProb = 0.23, persistProb = NA_real_, asRaster = TRUE,
                    maxSize, exactSize, directions = 8L, iterations = 1e6L,
                    returnDistances = FALSE, returnFrom = FALSE,
                    spreadProbRel = NA_real_, plot.it = FALSE, circle = FALSE,
                    asymmetry = NA_real_, asymmetryAngle = NA_real_,
                    allowOverlap = FALSE, neighProbs = NA_real_, skipChecks = FALSE) {

  #### assertions ###############
  assertClass(landscape, "Raster")
  ncells <- ncell(landscape)
  numCols <- ncol(landscape)
  if (!skipChecks) {
    assert(
      checkNumeric(start, min.len = 0, max.len = ncells, lower = 1, upper = ncells),
      checkClass(start, "Raster"),
      checkDataTable(start))

    qassert(neighProbs, "n[0,1]")
    assertNumeric(sum(neighProbs), lower = 1, upper = 1)

    assert(
      checkNumeric(spreadProb, 0, 1, min.len = 1, max.len = 1),
      checkClass(spreadProb, "RasterLayer")
    )
    assert(checkNumeric(persistProb, 0, 1, min.len = 1, max.len = 1),
           checkClass(persistProb, "RasterLayer"))
    assert(
      checkScalarNA(spreadProbRel),
      checkClass(spreadProbRel, "RasterLayer")
    )
    assert(
      checkNumeric(asymmetry, 0, Inf, min.len = 1, max.len = 1),
      checkClass(asymmetry, "RasterLayer")
    )
    assert(
      checkNumeric(asymmetryAngle, 0, 360, min.len = 1, max.len = 1),
      checkClass(asymmetryAngle, "RasterLayer")
    )
    qassert(directions, "N1[4,8]")
    qassert(iterations, "N1[0,Inf]")
    qassert(circle, "B")

    if (!missing(maxSize)) {
      if (is.data.table(start)) {
        n <- uniqueN(start, by = "initialPixels")
        assert(
          checkNumeric(maxSize, min.len = 1, max.len = 1),
          checkNumeric(maxSize, min.len = n, max.len = n)
        )
      } else {
        assert(
          checkNumeric(maxSize, min.len = 1, max.len = 1),
          checkNumeric(maxSize, min.len = NROW(start), max.len = NROW(start))
        )
      }
    }
    if (!missing(exactSize)) {
      if (is.data.table(start)) {
        n <- uniqueN(start, by = "initialPixels")
        assert(
          checkNumeric(exactSize, min.len = 1, max.len = 1),
          checkNumeric(exactSize, min.len = n, max.len = n)
        )
      } else {
        assert(
          checkNumeric(exactSize, min.len = 1, max.len = 1),
          checkNumeric(exactSize, min.len = NROW(start), max.len = NROW(start))
        )
      }
    }
  }
  ##### End assertions

  # Step 0 - set up objects -- main ones: dt, clusterDT -- get them from attributes
  ## on start or initiate them
  smallRaster <- ncells < 4e7 # should use bit vector (RAM) or ff raster (Disk)
  canUseAvailable <- !(isTRUE(allowOverlap) | is.na(allowOverlap))
  if (missing(maxSize)) {
    maxSize <- NA_real_
  }

  if (missing(exactSize)) {
    exactSize <- NA_real_
  } else {
    maxSize <- exactSize
  }

  # returnDistances = TRUE and circle = TRUE both require distance calculations
  needDistance <- returnDistances | circle
  usingAsymmetry <- !is.na(asymmetry)

  # This means that if an event can not spread any more, it will try 10 times, incl. 2 jumps
  maxRetriesPerID <- 10

  if (!is.numeric(start) & !is.data.table(start)) {
    if (is(start, "Raster")) {
      start <- attr(start, "pixel")
    } else {
      stop("Start must be either a vector of pixels, a data.table from",
           "previous spread2 or a Raster from a previous spread2")
    }
  }

  if (!is.data.table(start)) {
    # A "new" entry into spread2 -- need to set up stuff
    if (canUseAvailable) {
      if (smallRaster) {
        notAvailable <- bit(ncells)
      } else {
        notAvailable <- ff(vmode = "boolean", FALSE, length = ncells)
      }
      notAvailable[start] <- TRUE
    }

    start <- as.integer(start)

    whActive <- seq_along(start)
    whInactive <- integer()
    dt <- data.table(initialPixels = start)
    if (returnFrom) {
      set(dt, NULL, "from", NA_integer_)
    }
    set(dt, NULL, "pixels", start)
    set(dt, NULL, "state", "activeSource")

    clusterDT <- data.table(id = whActive, initialPixels = start,
                            numRetries = 0L, size = as.integer(iterations > 0))

    if (!anyNA(maxSize)) {
      set(clusterDT, NULL, "maxSize", maxSize)

      # de-activate ones that are 1 cell
      set(dt, which(clusterDT$maxSize == 1), "state", "inactive")
    }
    if (!anyNA(exactSize)) {
      set(clusterDT, NULL, "exactSize", TRUE)
    }

    setkeyv(clusterDT, "initialPixels")
    if (needDistance) set(dt, NULL, "distance", 0) # it is zero distance to self
    if (usingAsymmetry) {
      set(dt, NULL, "effectiveDistance", 0) # it is zero distance to self
      set(dt, NULL, "distClass", 0) # it is zero distance to self
    }
    totalIterations <- 0
  } else {
    # a "return" entry into spread2
    dt <- start
    if (!is.null(attr(start, "spreadState"))) {
      clusterDT <- attr(start, "spreadState")$clusterDT
      if (!key(clusterDT) == "initialPixels")
        # should have key if it came directly from output of spread2
        setkeyv(clusterDT, "initialPixels")
      if (!anyNA(maxSize)) {
        if (any(maxSize != clusterDT$maxSize)) {
          sizeType <- if (!anyNA(exactSize)) "exactSize" else "maxSize"
          message(
            sizeType, " provided. ",
            "It does not match with size attr(start, 'cluster')$maxSize. ",
            "Using the new ", sizeType, " provided. Perhaps sorted differently? ",
            "Try sorting initial call to spread2 so that pixel number of start ",
            "cells is strictly increasing")
          clusterDT$maxSize <- maxSize
        }
      }
      if (any(colnames(clusterDT) == "maxSize")) maxSize <- clusterDT$maxSize
      whActive <- attr(start, "spreadState")$whActive
      whInactive <- attr(start, "spreadState")$whInactive
      totalIterations <- attr(start, "spreadState")$totalIterations
      if (canUseAvailable) {
        notAvailable <- attr(start, "spreadState")$notAvailable
      }
    } else {
      # case where user has deleted the attributes
      whActive <- which(start$state == "activeSource")
      whInactive <- which(start$state == "inactive")
      canUseAvailable <- FALSE # not worth it if it has to be remade each time
      totalIterations <- if (needDistance) max(start$distance) else 0
      unIP <- unique(dt$initialPixels)
      clusterDT <- data.table(id = seq_along(unIP), initialPixels = unIP, numRetries = 0L)
      if (!anyNA(maxSize)) {
        set(clusterDT, NULL, "maxSize", maxSize)
        if (!anyNA(exactSize)) {
          set(clusterDT, NULL, "exactSize", TRUE)
        }
        set(clusterDT, NULL, "size", dt[, .N, by = "initialPixels"]$N)
        setkeyv(clusterDT, "initialPixels")
      }
    }
  }

  whTooSmall <- integer()
  dtPotentialColNames <- c("id", "from", "to", "state", "distance"[needDistance],
                           "effectiveDistance"[usingAsymmetry])

  # start at iteration 0, note: totalIterations is also maintained,
  # which persists during iterative calls to spread2
  its <- 0

  # Main loop -- continue if active and still below iterations & none is too
  # small (and doesn't have any active cells)
  while (length(whTooSmall) | (length(whActive) &  its < iterations)) {
    # Step 1
    # Get neighbours, either via adj (default) or cir (jumping if stuck)
    if (length(whTooSmall) > 0) {
      # cir
      ## get slightly further neighbours
      dtRetry <- dt[whTooSmall]
      set(dtRetry, NULL, "state", NULL)
      whNeedJump <- which(((clusterDT$numRetries + 1) %% 10) == 0)
      if (length(whNeedJump)) {
        # jump every 10, starting at 20
        resCur <- res(landscape)[1]
        dtRetryJump <- dtRetry[clusterDT[whNeedJump], nomatch = 0]
        fromPixels <- dtRetryJump$pixels
        dtPotential <- lapply(seq_along(fromPixels), function(fp) {
          cbind(id = fp, cir(landscape, loci = fromPixels[fp],
                             includeBehavior = "excludePixels",
                             minRadius = resCur,
                             maxRadius = 20 * resCur)[, "indices"]) # 20 pixels
        }) %>%
          do.call(what = rbind)

        dtPotential <- matrix(as.integer(dtPotential), ncol = 2)
        colnames(dtPotential) <- c("id", "to")
        dtPotentialJump <- cbind(from = fromPixels[dtPotential[, "id"]],
                                 to = dtPotential[, "to"],
                                 id = dtRetryJump$initialPixels[dtPotential[, "id"]])
        dtRetry <- dtRetry[!clusterDT[whNeedJump]] # remove jumped neighbours
      }
      ## get adjacent neighbours
      dtPotential <- adj(
        directions = directions,
        numCell = ncells,
        numCol = numCols,
        id = dtRetry$initialPixels,
        cells = dtRetry$pixels, cutoff.for.data.table = 5e2,
        returnDT = TRUE
      )

      if (exists("dtPotentialJump")) {
        if (is.data.table(dtPotential)) {
          dtPotential <- rbindlist(list(dtPotential, as.data.table(dtPotentialJump)))
        } else {
          dtPotential <- rbind(dtPotential, dtPotentialJump)
        }
        rm("dtPotentialJump")
      }

      set(dt, whActive, "state", "holding") # take them out of commission for this iteration
      set(dt, whTooSmall, "state", "activeSource")
      whTooSmall <- integer()
    } else {
      # adj
      ## Spread to immediate neighbours
      dtPotential <- adj(
        numCell = ncells,
        numCol = numCols,
        directions = directions,
        id = dt$initialPixels[whActive],
        cells = dt$pixels[whActive], cutoff.for.data.table = 5e2,
        returnDT = TRUE
      )

      # only iterate if it is not a Retry situation
      its <- its + 1
      totalIterations <- totalIterations + 1
    }

    # Step 2 - Randomize order

    # randomize row order so duplicates are not always in same place
    i <- sample.int(NROW(dtPotential))
    if (!is.data.table(dtPotential)) {
      dtPotential <- as.data.table(dtPotential)
    }
    for (x in colnames(dtPotential)) set(dtPotential, NULL, x, dtPotential[[x]][i])

    # Step 3 -- if required -- calculate distances, if required ... attach to dt
    if (needDistance) {
      fromPts <- xyFromCell(landscape, dtPotential$id)
      toPts <- xyFromCell(landscape, dtPotential$to)
      dists <- pointDistance(p1 = fromPts, p2 = toPts, lonlat = FALSE)
      if (usingAsymmetry) {
        actualAsymmetry <- if (length(asymmetry) == 1) {
          asymmetry
        } else {
          asymmetry[dtPotential$to]
        }
        actualAsymmetryAngle <- if (length(asymmetryAngle) == 1) {
          asymmetryAngle
        } else {
          asymmetryAngle[dtPotential$to]
        }

        angleQualities <- angleQuality(from = dtPotential$id, to = dtPotential$to,
                                       landscape, actualAsymmetryAngle)
        naAQ <- is.na(angleQualities[, "angleQuality"])
        angleQualities[naAQ, "angleQuality"] <- 1
        # convert dists to effective distance
        effDists <- dists * ((2 - angleQualities[, "angleQuality"]) / 2 *
                               (actualAsymmetry - 1) + 1)

        # For asymmetry, we also may want to know what proportion of the outward spreading
        #  event will hit each pixel, not just the effectiveDistance
        lociHere <- if (is.numeric(start)) start else
          attributes(dt)$spreadState$clusterDT$initialPixels
        # pureCircle <- cir(landscape,
        #                   loci = lociHere,
        #                   allowOverlap = TRUE, allowDuplicates = FALSE,
        #                   maxRadius = totalIterations,
        #                   minRadius = totalIterations - 0.999999,
        #                   returnIndices = TRUE,
        #                   returnDistances = TRUE,
        #                   includeBehavior = "excludePixels")


        # This is a very fast version with allowOverlap = TRUE, allowDuplicates = FALSE,
        #   returnIndices = TRUE, returnDistances = TRUE, and includeBehavior = "excludePixels"
        pureCircle <- .cirSpecialQuick(landscape,
                                       loci = lociHere,
                                       maxRadius = totalIterations,
                                       minRadius = totalIterations - 0.999999)
        pureCircle <- cbind(pureCircle[, c("id", "indices", "dists"), drop = FALSE],
                            distClass = ceiling(pureCircle[, "dists"]))
        colnames(pureCircle)[2] <- c("to")

        theoreticalAngleQualities <- angleQuality(pureCircle[, "id", drop = FALSE],
                                                  pureCircle[, "to", drop = FALSE],
                                                  landscape,
                                                  actualAsymmetryAngle = actualAsymmetryAngle)
        naAQ <- is.na(theoreticalAngleQualities[, "angleQuality"])
        theoreticalAngleQualities[naAQ, "angleQuality"] <- 1
        # convert dists to effective distance
        effDists1 <- pureCircle[, "dists"] *
          ((2 - theoreticalAngleQualities[, "angleQuality"]) / 2 * (actualAsymmetry - 1) + 1)

        pc <- pureCircle[, "dists"] / effDists1
        pureCircle <- cbind(pureCircle, proportion = pc)
        pureCircle <- as.data.table(pureCircle)
        pureCircle[, proportion := proportion / sum(proportion), by = "id"]
        set(pureCircle, NULL, "dists", NULL)
        setkeyv(pureCircle, c("id", "to"))
        pureCirclePrev <- attr(dt, "spreadState")$pureCircle
        if (!is.null(pureCirclePrev)) {
          pureCircle <- rbindlist(list(pureCircle, pureCirclePrev),
                                  use.names = FALSE, fill = FALSE)
          #pureCircle <- unique(pureCircle)
        }
      }

      if (circle) {
        if (usingAsymmetry) {
          distKeepers <- effDists %<=% totalIterations & effDists %>>%
            (totalIterations - 1)
          dtPotentialAsymmetry <- dtPotential[!distKeepers]
          if (sum(distKeepers) == 0) {
            # all failed
            set(dt, NULL, "state", "successful")
          } else {
            unDTPotAssym <- unique(dtPotentialAsymmetry$from)
            if (length(unDTPotAssym) == length(unique(dt$pixel))) {
              set(dt, NULL, "state", "successful")
            } else {
              dt[pixels %in% unDTPotAssym, state := "successful"]
            }
          }
        } else {
          distKeepers <- dists %<=% totalIterations & dists %>>% (totalIterations - 1)
        }

        dtPotentialAllNeighs <- copy(dtPotential)
        setkeyv(dtPotentialAllNeighs, "from")
        dtPotential <- dtPotential[distKeepers]
        dists <- dists[distKeepers]
      }

      set(dtPotential, NULL, "distance", dists)
      if (usingAsymmetry) {
        set(dtPotential, NULL, "effectiveDistance", effDists[distKeepers])
        if (circle) {
          dtPotential <- dtPotential[pureCircle, nomatch = 0, on = c("id", "to")][
            , proportion := proportion / .N, by = c("id", "to")]
        }
      }
    }

    # Step 4 -- assign "successful" to all dtPotentials --
    set(dtPotential, NULL, "state", "successful")

    # Step 5 -- optional -- Algorithm neighProbs - uses a specific number of neighbours
    if (!anyNA(neighProbs)) {
      # numNeighs algorithm
      numNeighsByPixel <- unique(dtPotential, by = c("id", "from"))
      if (is.list(neighProbs)) {
        if (NROW(numNeighsByPixel) != length(neighProbs)) {
          neighProbs1 <- neighProbs[match(numNeighsByPixel$from,
                                          start[state == "activeSource"]$pixels)]
        } else {
          neighProbs1 <- neighProbs
        }
        set(numNeighsByPixel, NULL, "numNeighs", unlist(lapply(
          neighProbs1, function(np) {
            sample.int(size = 1, n = length(np), replace = TRUE, prob = np)
          }))
        )
      } else {
        set(numNeighsByPixel, NULL, "numNeighs",
            sample.int(size = NROW(numNeighsByPixel), n = length(neighProbs),
                       replace = TRUE, prob = neighProbs))
      }
      setkeyv(numNeighsByPixel, c("id", "from"))

      # remove duplicates within dtPotential
      dupsWithinDtPotential <- duplicatedInt(dtPotential$to)
      successCells <- dtPotential$to[!dupsWithinDtPotential] # remove the dupsWithinDtPotential
      potentialNotAvailable <- notAvailable[successCells]
      whNoDupsCurItAndAll <- seq_along(dtPotential$to)[!dupsWithinDtPotential][
        !potentialNotAvailable]
      dtPotential <- dtPotential[whNoDupsCurItAndAll]
      setkeyv(dtPotential, c("id", "from")) # sort so it is the same as numNeighsByPixel

      if (NROW(dtPotential)) {
        if (is(spreadProbRel, "RasterLayer")) {
          set(dtPotential, NULL, "spreadProbRel", spreadProbRel[][dtPotential$to])
        } else {
          set(dtPotential, NULL, "spreadProbRel", 1)
        }
        spreadProbNA <- is.na(dtPotential$spreadProbRel) # This is where a mask enters
        if (any(spreadProbNA)) {
          dtPotential <- dtPotential[!spreadProbNA]
          # code below is a possible replacement for previous line -- faster for small problems
          # colnamesDtPot <- colnames(dtPotential)
          # ll <-  lapply(colnamesDtPot, function(x) dtPotential[[x]][!spreadProbNA])
          # names(ll) <- colnamesDtPot
          # dtPotential <- as.data.table(ll)
        }
        # might be zero length because of spreadProb NAs
        if (NROW(dtPotential)) {
          # If it is a corner or has had pixels removed bc of duplicates,
          # it may not have enough neighbours
          numNeighsByPixel <- numNeighsByPixel[dtPotential[, .N, by = c("id", "from")]]
          set(numNeighsByPixel, NULL, "numNeighs",
              pmin(numNeighsByPixel$N, numNeighsByPixel$numNeighs, na.rm = TRUE))
          dtPotential <- dtPotential[
            numNeighsByPixel[dtPotential][
              , .I[sample.int(length(numNeighs), size = numNeighs, prob = spreadProbRel)],
              by = "from"]$V1]
        }
        set(dtPotential, NULL, "spreadProbRel", NULL)
      }
    } # end of neighProbs -- should now have only dtPotentials that match number neighbours req'd

    # Step 6 -- spreadProb implementation - uses an absolute probability for
    # each potential neighbour
    # Extract spreadProb for the current set of potentials
    if (length(spreadProb) == 1) {
      actualSpreadProb <- rep(spreadProb, NROW(dtPotential))
    } else {
      actualSpreadProb <- spreadProb[dtPotential$to]
      # remove NA values that may come from a spreadProb raster
      NAaSP <- !is.na(actualSpreadProb)
      if (any(NAaSP)) {
        dtPotential <- dtPotential[NAaSP,]
        actualSpreadProb <- actualSpreadProb[NAaSP]
      }
    }

    # Step 6a -- asymmetry -- this will modify spreadProb if it is not a circle
    #  -- circle asymmetry happens elsewhere
    # modify actualSpreadProb if there is asymmetry
    if (usingAsymmetry & !circle) {
      actualAsymmetry <- if (length(asymmetry) == 1) {
        asymmetry
      } else {
        asymmetry[dtPotential$to]
      }

      actualAsymmetryAngle <- if (length(asymmetryAngle) == 1) {
        asymmetryAngle
      } else {
        asymmetryAngle[dtPotential$to]
      }

      angleQualities <- angleQuality(from = dtPotential$id, to = dtPotential$to,
                                     landscape, actualAsymmetryAngle)

      naAQ <- is.na(angleQualities[, "angleQuality"])
      angleQualities[naAQ, "angleQuality"] <- actualSpreadProb[naAQ]

      actualSpreadProb <- asymmetryAdjust(angleQualities, actualSpreadProb, actualAsymmetry)
    }

    # Step 7 <- calculate spread success based on actualSpreadProb
    spreadProbSuccess <- runifC(NROW(dtPotential)) <= actualSpreadProb

    # Step 8 - Remove duplicates & bind dt and dtPotential
    if (anyNA(neighProbs)) {
      if (isTRUE(allowOverlap) | is.na(allowOverlap) | !canUseAvailable) {
        # overlapping allowed
        dtPotential <- dtPotential[spreadProbSuccess]
        dtNROW <- NROW(dt)
        dt <- rbindlistDtDtpot(dt, dtPotential, returnFrom, needDistance, dtPotentialColNames)

        # this is to prevent overlap within an event ... in some cases,
        # overlap within an event is desired, so skip this block
        if (!is.na(allowOverlap)) {
          dt[, `:=`(dups = duplicatedInt(pixels)), by = initialPixels]
          dupes <- dt$dups
          set(dt, NULL, "dups", NULL)
          dt <- dt[!dupes]
        }

        # remove all the duplicated ones from dtPotential
        dtPotential <- dt[-seq_len(dtNROW)]
      } else {
        # no overlapping allowed

        # This block is instead of a dt[!duplicated(pixels)] which becomes very
        # slow on large problems
        successCells <- dtPotential$to[spreadProbSuccess]
        dupsWithinDtPotential <- duplicatedInt(successCells)

        #successCells <- na.omit(successCells[!dupsWithinDtPotential]) # remove the dupsWithinDtPotential
        successCells <- successCells[!dupsWithinDtPotential] # remove the dupsWithinDtPotential
        potentialNotAvailable <- notAvailable[successCells]

        # 3 reasons why potentials are not selected
        whSuccNoDupsCurItAndAll <- seq_along(spreadProbSuccess)[spreadProbSuccess][
          !dupsWithinDtPotential][!potentialNotAvailable]

        # next line is a fix with data.table 1.11.4 or so, can't pass length 0 vector?
        if (length(successCells[!potentialNotAvailable]) > 0)
          notAvailable[successCells[!potentialNotAvailable]] <- TRUE
        dtPotential <- dtPotential[whSuccNoDupsCurItAndAll]

        dt <- rbindlistDtDtpot(dt, dtPotential, returnFrom, needDistance, dtPotentialColNames)

        if (circle) {
          if (usingAsymmetry) {
            saturated <- dtPotentialAllNeighs[, sum(to %in% dt$pixels) == directions,
                                              by = from][V1 == TRUE]$from
          }
        }
      }
    } else {
      # neighProbs -- duplication checking already happened, but
      dtPotential <- dtPotential[spreadProbSuccess]
      dt <- rbindlistDtDtpot(dt, dtPotential, returnFrom, needDistance, dtPotentialColNames)
      if (NROW(dtPotential)) notAvailable[dtPotential$pixels] <- TRUE
    }

    # Step 9 -- Size issues: i.e., if too big (remove extras) or too small (make sure keeps going)
    if (!anyNA(maxSize) | !(anyNA(exactSize))) {
      # Too big first
      setkeyv(dt, "initialPixels") # must sort because maxSize is sorted
      setkeyv(dtPotential, "initialPixels")
      dtPotClusterDT <- dtPotential[, list(size = as.integer(.N)), by = "initialPixels"]
      clusterDT[dtPotClusterDT, size := size + i.size]

      # This next line is a work around for a problem that doesn't make sense:
      # See: https://stackoverflow.com/q/29615181/1380598
      alloc.col(clusterDT, 7)
      set(clusterDT, NULL, "tooBigByNCells", clusterDT$size - as.integer(clusterDT$maxSize))

      currentSizetooBigByNCells <- clusterDT[tooBigByNCells > 0]
      if (NROW(currentSizetooBigByNCells) > 0) {
        # sort them so join works between dt1 and currentSizetooBigByNCells
        setkeyv(currentSizetooBigByNCells, "initialPixels")
        set(dt, NULL, "origIndex", seq_len(NROW(dt)))
        dt1 <- dt[state == "successful"]
        dt1b <- dt1[currentSizetooBigByNCells] # attach tooBigByNCells
        dt1a <- tryCatch(dt1b[, list(omit = origIndex[sample.int(.N, tooBigByNCells)]),
                              by = "initialPixels"],
                         error = function(e) TRUE)

        dt <- dt[-dt1a$omit][, list(initialPixels, pixels, state)]
        dt[dt1a, state := "inactive"]
        clusterDT[currentSizetooBigByNCells[, list(initialPixels)],
                  size := size - tooBigByNCells]
      }

      # Too small second
      if (!(anyNA(exactSize))) {
        # push those that are too small into "tooSmall"
        currentSizeTooSmall <- clusterDT[tooBigByNCells < 0]
        if (NROW(currentSizeTooSmall) > 0) {
          # successful means will become activeSource next iteration,
          # so they don't need any special treatment
          currentSizeTooSmall <- currentSizeTooSmall[
            !dt[state %in% c("successful", "holding"), nomatch = 0]
            ]
        }
        # if the ones that are too small are unsuccessful, make them "tooSmall"
        set(dt, NULL, "ind", seq_len(NROW(dt)))
        whTooSmall <- dt[!(state %in% c("successful", "inactive"))][
          currentSizeTooSmall, nomatch = 0]$ind
        set(dt, NULL, "ind", NULL)

        if (length(whTooSmall)) {
          # add index column -- like doing a 'which( )'
          set(clusterDT, NULL, "indClDT", seq_len(NROW(clusterDT)))
          whNeedRetryClusterDT <- clusterDT[dt[whTooSmall]]$indClDT
          set(clusterDT, NULL, "indClDT", NULL)
          tooManyRetries <- clusterDT[whNeedRetryClusterDT, numRetries > maxRetriesPerID]
          if (sum(tooManyRetries) > 0) {
            whNeedRetryClusterDT <- whNeedRetryClusterDT[!tooManyRetries]
            whTooSmall <- whTooSmall[!tooManyRetries]
          }
          set(dt, whTooSmall, "state", "tooSmall")
          set(clusterDT, whNeedRetryClusterDT, "numRetries",
              clusterDT$numRetries[whNeedRetryClusterDT] + 1L)
        }
      }
      set(clusterDT, NULL, "tooBigByNCells", NULL)
    } # end size-based assessments

    # Step 10 - Change states of cells
    if (usingAsymmetry) {
      if (!(isTRUE(allowOverlap) | is.na(allowOverlap))) {
        if (circle) {
          if (length(saturated)) {
            set(dt, which(dt$pixels %in% saturated), "state", "activeSource")
          }
        }
      }
    }

    # Step 10a - Persistence: starting fire pixels (activeSource) continue burning
    # with a persistence probability, becoming "successful" and then
    # "activeSource" in Step 10b.
    # At the moment, this is skipped if persistProb is left as NA (equivalent to 0),
    # to avoid breaking some tests.

    ## Extract persistenceProb for the current set of source pixels
    if (length(persistProb) == 1) {
      if (is.na(persistProb)) {
        actualPersistProb <- NULL
      } else {
        actualPersistProb <- rep(persistProb, sum(dt$state == "activeSource"))
      }
    } else {
      actualPersistProb <- persistProb[dt[state == "activeSource", initialPixels]]
    }

    ## "activeSource" fires become "successful" depending on prob of persistence
    if (!is.null(actualPersistProb)) {
      startFires <- which(dt$state == "activeSource")
      persistingFires <- runifC(length(startFires)) <= actualPersistProb
      dt[startFires[persistingFires], state := "successful"]
    }

    # Step 10b convert previous states to new states
    notInactive <- dt$state != "inactive" # currently activeSource, successful, or holding
    whNotInactive <- which(notInactive)
    activeStates <- dt$state[whNotInactive]
    whActive <- whNotInactive[activeStates == "successful" | activeStates == "holding"]
    whInactive <- whNotInactive[activeStates == "activeSource"]

    #   activeSource ==> inactive
    #   holding ==> activeSource
    #   successful ==> activeSource
    #   tooSmall ==> tooSmall
    set(dt, whNotInactive, "state",
        c("inactive", "activeSource", "activeSource", "tooSmall")[
          fmatch(activeStates, c("activeSource", "holding", "successful", "tooSmall"))])

    # Step 11 - plot it if necessary
    if (plot.it) {
      newPlot <- FALSE
      if (totalIterations == 1) {
        newPlot <- TRUE
      }
      if (newPlot | !(exists("spread2Ras", inherits = FALSE)))
        spread2Ras <- raster(landscape)
      if (returnDistances) {
        spread2Ras[dt$pixels] <- dt$distance
        newPlot <- TRUE # need to rescale legend each time
      } else {
        set(dt, NULL, "order", seq_along(dt$initialPixels))
        setkeyv(dt, "initialPixels")
        spread2Ras[dt$pixels] <- dt[clusterDT]$id # get id column from clusterDT
        setkeyv(dt, "order")
        set(dt, NULL, "order", NULL)
      }
      Plot(spread2Ras, new = newPlot)
    }
  } # end of main loop

  # Step 12 -- Add attributes
  attrList <- list(clusterDT = clusterDT,
                   whActive = whActive,
                   whInactive = whInactive,
                   totalIterations = totalIterations)
  if (canUseAvailable) {
    attrList <- append(attrList, list(notAvailable = notAvailable))
  }

  if (usingAsymmetry) {
    if (exists("pureCircle", inherits = FALSE))
      attrList <- append(attrList, list(pureCircle = pureCircle))
  }
  setattr(dt, "spreadState", attrList)

  # Step 13 -- return either raster or data.table
  if (asRaster) {
    ras <- raster(landscape)
    # inside unit tests, this raster gives warnings if it is only NAs
    suppressWarnings(ras[dt$pixels] <- clusterDT[dt]$id)
    setattr(ras, "pixel", dt)
    return(ras)
  }

  return(dt)
}


#' Internal helper
#'
#' Not for users. A function to \code{setnames} and \code{rbindlist} that is used in 3 places in \code{spread2}.
#'
#' @param dt Data.table
#' @param dtPotential Data.table
#' @param returnFrom Logical
#' @param needDistance Logical
#' @param dtPotentialColNames Character Vector.
#' @rdname spread2-internals
#' @keywords internal
#'
rbindlistDtDtpot <- function(dt, dtPotential, returnFrom, needDistance, dtPotentialColNames) {
  # distance column is second last, but needs to be last
  # to merge with dt, need: from, to, state in that order
  reorderColsWDistance(needDistance, dtPotential, dtPotentialColNames)

  if (!returnFrom) {
    set(dtPotential, NULL, "from", dtPotential$id)
    set(dtPotential, NULL, "id", NULL)
    setnames(dtPotential, old = c("from", "to"), new = c("initialPixels", "pixels"))
  } else {
    setnames(dtPotential, old = c("id", "to"), new = c("initialPixels", "pixels"))
  }

  # convert state of all those still left, move potentialPixels into pixels column
  if (NROW(dtPotential)) {
    # need fill = TRUE if user has passed extra columns
    dt <- rbindlist(list(dt, dtPotential), fill = TRUE)
  }

  return(dt)
}

#' Internal helpers for \code{spread2}
#'
#' @inheritParams rbindlistDtDtpot
#' @rdname spread2-internals
#' @keywords internal
#'
reorderColsWDistance <- function(needDistance, dtPotential, dtPotentialColNames) {
  if (needDistance)
    setcolorder(dtPotential,
                neworder = c(
                  dtPotentialColNames[(dtPotentialColNames %in% colnames(dtPotential))],
                  colnames(dtPotential)[!(colnames(dtPotential) %in% dtPotentialColNames)]
                )
    )
}

#' @param from vector of cell locations which are the "from" or starting cells
#' @param to vector of same length as \code{from} which are the "to" or receiving cells
#' @param landscape \code{RasterLayer} passed from \code{spread2}.
#' @param actualAsymmetryAngle Angle in degrees, either a vector length 1 or
#'                             vector \code{NROW(dtPotential)}.
#' @keywords internal
#' @rdname spread2-internals
#'
angleQuality <- function(from, to, landscape, actualAsymmetryAngle) {
  from1 <- cbind(id = from, xyFromCell(landscape, from))
  to1 <- cbind(id = from, xyFromCell(landscape, to))
  d <- .pointDirection(from = from1, to = to1)

  angleQuality <- cbind(angleQuality = (cos(d[, "angles"] - rad(actualAsymmetryAngle)) + 1), d)
  angleQuality
}

#' @param angleQualities Matrix. The output from \code{angleQuality}
#' @param quantity Variable of interest to adjust, e.g., \code{spreadProb}
#' @param actualAsymmetry Asymmetry intensity. Derived from \code{asymmetry} arg in \code{spread2}
#' @keywords internal
#' @rdname spread2-internals
#'
asymmetryAdjust <- function(angleQualities, quantity, actualAsymmetry) {
  if (sum(angleQualities[, "angleQuality"]) %==% 0) {
    # the case where there is no difference in the angles, and they are all zero
    return(quantity)
  } else {
    dd <- data.table(angleQualities, quantity)
    dd[, quantityAdj := quantity * angleQualities[, "angleQuality"]]
    dd[, quantityAdj2 := quantityAdj / (mean(quantityAdj) / mean(quantity)), by = "id"]

    dd[, newQuantity := {
      minQuantity <- 0
      maxQuantity <- max(2 * quantity)
      aaMinus1 <- actualAsymmetry - 1
      par2 <- aaMinus1 * sum(quantityAdj) /
        (length(quantityAdj) * (maxQuantity - minQuantity) +
           aaMinus1 * sum(quantityAdj - minQuantity))
      par1 <- par2 / aaMinus1 * (maxQuantity - minQuantity)
      (quantityAdj2 - minQuantity) * par2 + par1
    }, by = "id"]
  }
  dd$newQuantity
}
