Sets parameters for the Newton-Raphson iterations in Firth's penalized-likelihood logistic regression.

```r
logistf.control(maxit = 25, maxhs = 5, maxstep = 5, lconv = 1e-05, gconv = 1e-05,
                xconv = 1e-05, collapse = TRUE)
```

- `maxit`: the maximum number of iterations.
- `maxhs`: the maximum number of step-halvings in one iteration. The increment of the beta vector within one iteration is halved if the new beta leads to a decrease in the log likelihood.
- `maxstep`: the maximum step size for the beta vector within one iteration.
- `lconv`: the convergence criterion for the log likelihood.
- `gconv`: the convergence criterion for the first derivative of the log likelihood (the score vector).
- `xconv`: the convergence criterion for the parameter estimates.
- `collapse`: if TRUE, evaluates all unique combinations of x and y and collapses the data set. This may save computing time with large data sets containing only categorical (binary) covariates.

`logistf.control()` is used by `logistf` and `logistftest` to set the control parameters to their default values. Different values can be specified, e.g., by `logistf(..., control = logistf.control(maxstep = 1))`.
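As a minimal sketch of this usage (assuming the logistf package is installed; `sex2` is the example data set shipped with the package, so the formula below is illustrative):

```r
library(logistf)

# Fit with the default control settings (maxit = 25, maxstep = 5, ...).
fit_default <- logistf(case ~ age + oc + vic + vicl + vis + dia, data = sex2)

# Restrict the Newton-Raphson step size and allow more iterations;
# smaller steps can help when the default settings fail to converge.
ctrl <- logistf.control(maxstep = 1, maxit = 100)
fit_small_step <- logistf(case ~ age + oc + vic + vicl + vis + dia,
                          data = sex2, control = ctrl)

# Both fits should arrive at the same penalized-likelihood estimates.
coef(fit_default)
coef(fit_small_step)
```

With a smaller `maxstep`, each iteration moves more cautiously, which trades speed for stability.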


Georg Heinze

