Description

For a simple logistic regression model with an intercept and one slope parameter, this function returns the maximum likelihood estimates (MLEs) of both parameters. Iteration stops once the improvement between successive steps falls below a given tolerance.
Arguments

x: vector of covariates (the predictor).
y: vector of responses; each is distributed as a Binomial random variable.
n: vector giving the number of trials for each observation.
tol: tolerance level governing when to stop the iterations; the larger of the two parameters' absolute errors is compared against tol.
verbose: logical; if TRUE, progress information is printed at each iteration.
Details

We can view y as the number of successes out of n trials, with unknown probability of success p. The linear predictor is related to p by the log odds: log(p / (1 - p)) = α + β x.
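A quick numerical check of this link function may help; the values α = 0.5, β = 1, x = 2 below are hypothetical, chosen only for illustration.

```python
import math

# Hypothetical parameter values for illustration only.
alpha, beta, x = 0.5, 1.0, 2.0
eta = alpha + beta * x              # linear predictor: alpha + beta * x
p = 1.0 / (1.0 + math.exp(-eta))    # invert the log odds to recover p
log_odds = math.log(p / (1.0 - p))  # log(p / (1 - p)) returns eta
```

Here `log_odds` equals `eta` up to floating-point error, confirming that the logit map and its inverse are consistent.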
It is known that in logistic regression we cannot obtain closed-form solutions for α and β, so this function uses the Newton-Raphson method to estimate these parameters. The initial value is (α_0, β_0) = (0, 0), an intuitive choice corresponding to equally likely outcomes (p = 0.5).
In the function definition, the matrix L is the system of nonlinear score equations we need to solve, and L_prime is the derivative of L. Each step of the iteration therefore computes updated estimates via the standard Newton-Raphson update for solving L = 0: (α_{i+1}, β_{i+1}) = (α_i, β_i) - L_prime^{-1} L.
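The iteration described above can be sketched as follows. This is an illustrative Python implementation, not the package's actual R code: the function name `logistic_nr` and the `max_iter` safeguard are assumptions, while the score equations, their derivative, the (0, 0) start, and the stop-when-the-larger-absolute-step-falls-below-tol rule follow the text.

```python
import numpy as np

def logistic_nr(x, y, n, tol=1e-8, max_iter=100):
    """Newton-Raphson sketch for binomial logistic regression.

    x, y, n are equal-length vectors; returns (alpha_hat, beta_hat).
    """
    x, y, n = (np.asarray(v, dtype=float) for v in (x, y, n))
    X = np.column_stack([np.ones_like(x), x])  # design matrix: intercept, slope
    theta = np.zeros(2)                        # start at (alpha, beta) = (0, 0)
    for _ in range(max_iter):
        p = 1.0 / (1.0 + np.exp(-X @ theta))   # fitted success probabilities
        score = X.T @ (y - n * p)              # L: the score equations
        w = n * p * (1.0 - p)
        hessian = -(X.T * w) @ X               # L_prime: derivative of the score
        step = np.linalg.solve(hessian, score)
        theta = theta - step                   # Newton-Raphson update
        if np.max(np.abs(step)) < tol:         # larger absolute error vs. tol
            break
    return theta
```

Fed counts that sit exactly on a logistic curve, the sketch recovers the generating intercept and slope, since the score equations then have that point as their exact root.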
Value

A data.frame with one row and two columns: the MLE of the intercept followed by the MLE of the slope.