mrinfo | R Documentation |
The mrinfo function produces a wealth of additional information on multiple regression models above and beyond what summary(), anova(), and Anova() produce.
mrinfo(lm.fit, cilevel = 0.95, minimal = TRUE)
lm.fit |
A fitted model object produced by lm(). |
cilevel |
Confidence interval level. Default is 0.95. |
minimal |
Specifies the volume of returned information (see Details). Default is TRUE. |
The function takes an lm fit object, for which supplemental information is calculated. When minimal=TRUE is specified, a data frame of several indices (see Value) is returned. When minimal=FALSE, a list of information including the supplemental values is returned: zero-order correlations, the coefficients table, confidence intervals for the regression coefficients, the Type I SS anova table, and the Type III SS Anova table.
Structure coefficients can be very helpful in interpreting lm models. They are defined as the Pearson correlation of each IV in the model with the yhat (predicted values) vector. See Thompson and Borrello (1985), Cooley and Lohnes (1971), Cohen and Cohen (2003), or Nimon et al. (2008).
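The definition above can be sketched by hand: fit a model, take the fitted values, and correlate each IV with them. This is an illustrative computation with assumed variable names, not mrinfo's internal code.

```r
# Hand computation of structure coefficients: the Pearson correlation
# of each IV with the yhat vector. Illustrative sketch only.
data(attitude)
fit  <- lm(rating ~ complaints + learning + privileges, data = attitude)
yhat <- fitted(fit)                                     # predicted values
ivs  <- attitude[, c("complaints", "learning", "privileges")]
structure_r <- cor(ivs, yhat)                           # one correlation per IV
structure_r
```

Equivalently, each structure coefficient is the IV's zero-order correlation with the DV divided by the multiple R of the model.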
beta wt | Standardized Regression Coefficients |
structure r | Structure Coefficients |
partial r | Partial correlations of IVs with the DV |
semi-partial r | Semi-partial correlations of IVs with the DV |
tolerances | Tolerance for each IV |
unique | Unique proportion of variance in DV accounted for by each IV |
common | Common proportion of variance in DV accounted for by each IV shared with other IVs |
total | Total proportion of variance in DV accounted for by each IV |
pearsons | The zero-order Pearson correlation matrix among all variables |
The mrinfo function is designed to work with multiple regression objects in which an intercept is estimated. Models where the intercept is forced through the origin ('no intercept' models) are problematic for interpretation of the supplemental information listed above, so it is not returned for such models. Specifying minimal=FALSE will still provide the list of items described above.
Simple regression models do not require the supplemental information, but the user can specify minimal=FALSE to obtain the longer list of detailed information.
Models with factor IVs may create situations where interpretation of the supplemental indices is problematic. This can easily happen with coding schemes such as dummy coding (indicator coding, or contr.treatment()). Suppressor effects can often occur, rendering interpretation of beta weights, partial and semi-partial correlations, and particularly the unique and common variance proportions challenging. The unique proportion of variance index is appropriately calculated as the square of the semi-partial correlation, but the common proportion is calculated as the difference between the squared zero-order Pearson correlation for that IV and this unique fraction. When suppressor effects arise from patterns related to contrast coding schemes, this common proportion can sometimes be a nonsensical negative quantity. Careful understanding of one's model is required in these circumstances.
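The decomposition just described can be illustrated for a single IV. The sketch below (assumed variable names; not mrinfo's internal code) computes the unique proportion as the R-squared drop when the IV is removed, which equals the squared semi-partial correlation, and the common proportion as the squared zero-order correlation minus that unique fraction. Under suppression the common value can come out negative, as noted above.

```r
# Hand computation of unique and common variance proportions for one IV
# ('complaints'), following the decomposition described above. Sketch only.
data(attitude)
fit_full <- lm(rating ~ complaints + learning, data = attitude)
fit_red  <- lm(rating ~ learning, data = attitude)   # model without 'complaints'
# unique = squared semi-partial correlation = R^2 drop on removal
unique_p <- summary(fit_full)$r.squared - summary(fit_red)$r.squared
# common = squared zero-order correlation - unique
common_p <- cor(attitude$rating, attitude$complaints)^2 - unique_p
c(unique = unique_p, common = common_p)
```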
Bruce Dudek bruce.dudek@albany.edu
This function is modeled on a function originally coded in the regr function of the yhat package. It uses the effect.size function from that package.
Cohen, J., & Cohen, P. (2003). Applied multiple regression/correlation analysis for the behavioral sciences (3rd ed.). L. Erlbaum Associates.
Cooley, W. W., & Lohnes, P. R. (1971). Multivariate data analysis. Wiley.
Nimon, K., Lewis, M., Kane, R., & Haynes, R. M. (2008). An R package to compute commonality coefficients in the multiple regression case: An introduction to the package and a practical example. Behavior Research Methods, 40(2), 457-466.
Pedhazur, E. (1997). Multiple regression in behavioral research. Orlando, FL: Harcourt.
Ray‐Mukherjee, J., Nimon, K., Mukherjee, S., Morris, D. W., Slotow, R., & Hamer, M. (2014). Using commonality analysis in multiple regressions: a tool to decompose regression effects in the face of multicollinearity. Methods in Ecology and Evolution, 5(4), 320-328.
Thompson, B., & Borrello, G. M. (1985). The importance of structure coefficients in regression research. Educational and Psychological Measurement, 45(2), 203-209.
data(attitude)
fit1 <- lm(rating ~ complaints + learning + privileges, data=attitude)
#summary(fit1)
mrinfo(fit1, minimal=TRUE, cilevel=.99)
mrinfo(fit1, minimal=FALSE, cilevel=.95)
data(mtcars)
mtcars$cyl <- as.factor(mtcars$cyl)
contrasts(mtcars$cyl) <- contr.helmert(3)
fit2 <- lm(mpg ~ cyl + hp, data = mtcars)
mrinfo(fit2, minimal=TRUE)