dsL2Sens: Calculate the l2 sensitivity on DataSHIELD servers

View source: R/l2sens.R

dsL2Sens    R Documentation

Calculate the l2 sensitivity on DataSHIELD servers

Description

Calculation of the l2 sensitivity using a histogram representation of the data; see the sketch under Details below. Source: Dwork and Roth, The Algorithmic Foundations of Differential Privacy, https://www.cis.upenn.edu/~aaroth/Papers/privacybook.pdf

Usage

dsL2Sens(
  connections,
  dat_name,
  pred_name,
  nbreaks = NULL,
  cols = NULL,
  drop_on_error = TRUE
)

Arguments

connections

('DSI::connection') Connection to an OPAL server.

dat_name

('character(1)') The name of the data at the DataSHIELD servers.

pred_name

('character(1)') Name of the prediction object at the DataSHIELD server.

nbreaks

('integer(1L)') Number of breaks used for the histogram (default = nrow(dat) / 3).

cols

('character()') Subset of columns used to find adjacent inputs.

drop_on_error

('logical(1L)') If TRUE, NA is returned when an error occurs on a server.
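
Details

Following the cited source, the l2 sensitivity is taken here as the maximal l2 norm of the difference between the predictions of two adjacent inputs, where the histogram representation is used to find adjacent inputs, i.e. rows that fall into the same bins on the selected columns. The client-side sketch below only illustrates this idea; it is not the server-side implementation, and the helper name, the returned field names, the adjacency rule via shared bins, and the default number of breaks are illustrative assumptions.

# Illustrative sketch only; not the dsROCGLM server-side code.
l2SensSketch <- function(dat, pred, nbreaks = NULL, cols = names(dat)) {
  if (is.null(nbreaks)) nbreaks <- max(2, floor(nrow(dat) / 3))

  # Bin every selected (numeric) column; rows sharing all bins are
  # treated as adjacent inputs at the chosen histogram resolution.
  bins <- vapply(
    dat[, cols, drop = FALSE],
    function(x) as.integer(cut(x, breaks = nbreaks)),
    integer(nrow(dat))
  )
  key <- apply(bins, 1L, paste, collapse = "-")

  out <- list(l2sens = 0, idx = c(NA_integer_, NA_integer_), n_adjacent = 0L)
  for (k in unique(key)) {
    idx <- which(key == k)
    if (length(idx) < 2L) next
    out$n_adjacent <- out$n_adjacent + length(idx)

    # Maximal l2 distance of the predictions over all adjacent pairs:
    for (i in seq_len(length(idx) - 1L)) {
      for (j in seq(i + 1L, length(idx))) {
        d <- sqrt(sum((pred[idx[i]] - pred[idx[j]])^2))
        if (d > out$l2sens) {
          out$l2sens <- d
          out$idx <- c(idx[i], idx[j])
        }
      }
    }
  }
  out
}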

Value

A list containing the maximal l2 sensitivity, the indices of the inputs used to calculate it, and the number of adjacent inputs.

Author(s)

Daniel S.
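
Examples

The following call sketch is not part of the original page. It assumes a working DataSHIELD/OPAL setup; the server URL, credentials, and table name are placeholders, and a prediction object named "pred" is assumed to already exist on the servers (created beforehand, e.g. with the usual difuture-lmu workflow).

## Not run:
library(DSI)
library(DSOpal)
library(dsROCGLM)

builder <- DSI::newDSLoginBuilder()
builder$append(server = "server1", url = "https://opal.example.org",
  user = "user", password = "password", table = "PROJECT.TABLE")
connections <- DSI::datashield.login(logins = builder$build(), assign = TRUE)

# The logged-in data are available on the servers under the default
# symbol "D"; a prediction object "pred" is assumed to exist there.
l2s <- dsL2Sens(connections, dat_name = "D", pred_name = "pred")
l2s  # maximal l2 sensitivity, indices used, and number of adjacent inputs

DSI::datashield.logout(connections)
## End(Not run)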

