# An introduction to owl

In owl: Generalized Linear Models Regularized with the Sorted L1-Norm

```r
y <- bodyfat$y
tune <- trainOwl(x, y, q = c(0.1, 0.2), number = 10, repeats = 3)
```

As before, the plot method offers the best summary.

```r
plot(tune, measure = "mae") # plot mean absolute error
```

Printing the resulting object will display the optimum values.

```r
tune
```

## False discovery rate

Under assumptions of orthonormality, SLOPE has been shown to control the false discovery rate (FDR) of non-zero coefficients (feature weights) in the model [@bogdan2015]. It is in many ways analogous to the Benjamini--Hochberg procedure for multiple comparisons.

Let's set up a simple experiment to see how SLOPE controls the FDR. We randomly generate data sets with various proportions of true signals. Under this Gaussian design with independently and identically distributed columns in $X$, SLOPE should asymptotically control FDR at the level given by the shape parameter $q$, which we set to 0.1 in this example.

```r
# proportion of real signals
q <- seq(0.05, 0.5, length.out = 20)
fdr <- double(length(q))

set.seed(1)

for (i in seq_along(q)) {
  n <- 1000
  p <- n/2
  sigma <- 1
  problem <- owl:::randomProblem(n, p, q = q[i], sigma = sigma)
  x <- problem$x
  y <- problem$y
  signals <- problem$nonzero

  fit <- owl(x,
             y,
             lambda = "gaussian",
             q = 0.1,
             sigma = sigma)

  selected_owl <- which(fit$nonzeros) # indices of selected features
  V <- length(setdiff(selected_owl, signals)) # false discoveries
  R <- length(selected_owl) # total discoveries
  fdr[i] <- V/R
}
```
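The regularization sequence behind this FDR control is a Benjamini--Hochberg-type sequence of decreasing penalties, $\lambda_i = \Phi^{-1}(1 - iq/(2p))$ [@bogdan2015]; the `lambda = "gaussian"` option used above builds on it with an additional adjustment for estimated noise. As a rough sketch (the helper `bh_lambda` is ours, not part of the owl API):

```r
# Sketch of the BH-type lambda sequence underlying SLOPE's FDR control;
# `bh_lambda` is a hypothetical helper for illustration, not an owl function.
bh_lambda <- function(p, q = 0.1) {
  i <- seq_len(p)
  qnorm(1 - i * q / (2 * p)) # decreasing: largest penalty for the top coefficient
}

bh_lambda(5, q = 0.1)
```

Because the sequence decreases, the most extreme coefficient faces the strictest penalty, mirroring how Benjamini--Hochberg compares the smallest p-value against the strictest threshold.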

```r
library(lattice)
xyplot(fdr ~ q, type = "b", ylab = "FDR",
       panel = function(...) {
         panel.refline(h = 0.1)
         panel.xyplot(...)
       })
```


SLOPE seems to control FDR at roughly the specified level.


owl documentation built on Feb. 11, 2020, 5:09 p.m.