# OGA: Orthogonal greedy algorithm

In Ohit: OGA+HDIC+Trim and High-Dimensional Linear Regression Models

## Description

Select variables via the orthogonal greedy algorithm (OGA).
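To illustrate how OGA proceeds, the following is a minimal sketch of one possible implementation (not the package's internal code): at each iteration, the predictor most correlated with the current residual is selected, and the residual is updated by regressing the response on all variables chosen so far. The function name `oga_sketch` is hypothetical.

```r
# Minimal OGA sketch (hypothetical helper, not the Ohit internals):
# greedily select Kn predictors by correlation with the current residual.
oga_sketch <- function(X, y, Kn) {
  Xc <- scale(X, center = TRUE, scale = FALSE)  # center the columns
  yc <- y - mean(y)
  residual <- yc
  J <- integer(0)                               # indices selected so far
  for (k in seq_len(Kn)) {
    # absolute correlation (up to scaling) of each column with the residual
    score <- abs(crossprod(Xc, residual)) / sqrt(colSums(Xc^2))
    score[J] <- -Inf                            # never reselect a variable
    J <- c(J, which.max(score))
    # refit on the selected columns and update the residual
    fit <- lm.fit(Xc[, J, drop = FALSE], yc)
    residual <- fit$residuals
  }
  J
}
```

With a strong signal on a single column, that column is picked in the first iteration.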

## Usage

```r
OGA(X, y, Kn = NULL, c1 = 5)
```

## Arguments

- `X`: Input matrix of `n` rows and `p` columns.
- `y`: Response vector of length `n`.
- `Kn`: The number of OGA iterations. `Kn` must be a positive integer between `1` and `p`. Default is `Kn = max(1, min(floor(c1*sqrt(n/log(p))), p))`, where `c1` is a tuning parameter.
- `c1`: The tuning parameter for the number of OGA iterations. Default is `c1 = 5`.
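The default `Kn` formula can be evaluated directly. The snippet below (using a hypothetical helper name, `default_Kn`) reproduces the documented default for the sample sizes used in the example later on this page.

```r
# Default number of OGA iterations when Kn is NULL, per the documented
# formula: max(1, min(floor(c1 * sqrt(n / log(p))), p)).
default_Kn <- function(n, p, c1 = 5) {
  max(1, min(floor(c1 * sqrt(n / log(p))), p))
}

# With n = 400 and p = 4000 (the example setup below):
default_Kn(400, 4000)  # 34
```

This matches the `Kn` reported in the example output.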

## Value

- `n`: The number of observations.
- `p`: The number of input variables.
- `Kn`: The number of OGA iterations.
- `J_OGA`: The index set of the `Kn` variables sequentially selected by OGA.

## Author(s)

Hai-Tang Chiou, Ching-Kang Ing and Tze Leung Lai.

## References

Ing, C.-K. and Lai, T. L. (2011). A stepwise regression method and consistent model selection for high-dimensional sparse linear models. Statistica Sinica, 21, 1473–1513.

## Examples

```r
# Example setup (Example 3 in Section 5 of Ing and Lai (2011))
n = 400
p = 4000
q = 10
beta_1q = c(3, 3.75, 4.5, 5.25, 6, 6.75, 7.5, 8.25, 9, 9.75)
b = sqrt(3/(4 * q))
x_relevant = matrix(rnorm(n * q), n, q)
d = matrix(rnorm(n * (p - q), 0, 0.5), n, p - q)
x_relevant_sum = apply(x_relevant, 1, sum)
x_irrelevant = apply(d, 2, function(a) a + b * x_relevant_sum)
X = cbind(x_relevant, x_irrelevant)
epsilon = rnorm(n)
y = as.vector((x_relevant %*% beta_1q) + epsilon)

# Select variables via OGA
OGA(X, y)
```

### Example output

```
$n
[1] 400

$p
[1] 4000

$Kn
[1] 34

$J_OGA
 [1] 2382  712   10    9    8    7    5    6    4    3    2    1 2331 2558 1226
[16] 2093 3448 3201 2345 2275 3101 2319  387 2471 2430 1623 1992 3756 2268  845
[31] 2463 1907 3444 3147
```

Ohit documentation built on May 1, 2019, 8:43 p.m.