visualize.model: Visualizations of one or two variable linear or logistic regressions, or of a partition model


Description

Provides useful plots that illustrate the inner workings of regression models with one or two predictors, or of a partition model with relatively few branches.

Usage

visualize.model(M,loc="topleft",level=0.95,cex.leg=0.7,...)

Arguments

M

A linear or logistic regression model with one or two predictors (not all categorical), or a partition model created with rpart.

loc

The location for the legend, if one is to be displayed. Can also be "top", "topright", "left", "center", "right", "bottomleft", "bottom", or "bottomright".

level

The level of confidence for the confidence and prediction intervals in simple linear regression. For models with two quantitative predictors, it also sets the percentiles used for the "small" and "large" values of the fixed predictor (e.g., the default 0.95 uses the 5th and 95th percentiles; see Details).

cex.leg

Magnification factor for text in legends. Smaller numbers indicate smaller text. Default is 0.7.

...

Additional arguments passed to plot. This is typically used only for logistic regression models, where xlim may need to be specified to see the entire curve rather than just the default range of the x variable.

Details

If M is a simple linear regression model, this provides a scatter plot, fitted line, and confidence/prediction intervals.
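
The intervals drawn in this case correspond to what predict() reports for the fitted model. As a minimal sketch (not the package's internal code), assuming the SALARY data used in the Examples below:

  data(SALARY)
  M <- lm(Salary~Education, data=SALARY)
  new.x <- data.frame(Education=seq(min(SALARY$Education), max(SALARY$Education), length=50))
  #Interval for the mean response (confidence band)
  predict(M, newdata=new.x, interval="confidence", level=0.90)
  #Interval for an individual response (prediction band)
  predict(M, newdata=new.x, interval="prediction", level=0.90)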

If M is a simple logistic regression model, this provides the fitted logistic curve.

If M is a regression with two quantitative predictors, this provides the implicit regression lines obtained when one of the variables equals its 5th (small), 50th (median), and 95th (large) percentiles. The model may contain interaction terms; if it does, the p-value of the interaction is displayed. The definition of small and large can be changed with the level argument (e.g., level=0.75 uses the 25th and 75th percentiles).
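
For illustration only (this is not the package's internal code), the implicit regression lines amount to predicting the response over a grid of one predictor while the other is held at a chosen percentile; a minimal sketch with the SALARY data from the Examples below:

  data(SALARY)
  M <- lm(Salary~Education*Experience, data=SALARY)
  #Hold Experience at its 5th, 50th, and 95th percentiles (the defaults when level=0.95)
  exp.fixed <- quantile(SALARY$Experience, probs=c(0.05, 0.50, 0.95))
  ed.grid <- seq(min(SALARY$Education), max(SALARY$Education), length=50)
  implicit.lines <- sapply(exp.fixed, function(e)
    predict(M, newdata=data.frame(Education=ed.grid, Experience=e)))
  #Each column of implicit.lines is one implicit regression line (small/median/large Experience)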

If M is a regression with a quantitative predictor and a categorical predictor (with or without interactions), this provides the implicit regression lines for each level of the categorical predictor. If an interaction is in the model, the p-value of its effect test is displayed.

If M is a partition model from rpart, this shows the tree.

Author(s)

Adam Petrie

References

Introduction to Regression and Modeling

See Also

rpart, lm, glm

Examples

  library(regclass)   #load regclass for visualize.model and the example datasets
  library(rpart)      #for the partition model examples

  data(SALARY)
  #Simple linear regression with 90% confidence and prediction intervals
  M <- lm(Salary~Education,data=SALARY)
  visualize.model(M,level=0.90,loc="bottomright")
  
  #Multiple linear regression with two quantitative predictors (no interaction)
  M <- lm(Salary~Education+Experience,data=SALARY)
  visualize.model(M)

  #Multiple linear regression with two quantitative predictors (with interaction)
  #Take small and large to be the 25th and 75th percentiles
  M <- lm(Salary~Education*Experience,data=SALARY)
  visualize.model(M,level=0.75)
  
  #Multiple linear regression with one categorical and one quantitative predictor
  M <- lm(Salary~Education*Gender,data=SALARY)
  visualize.model(M)

  data(WINE)
  #Simple logistic regression with expanded x limits
  M <- glm(Quality~alcohol,data=WINE,family=binomial)
  visualize.model(M,xlim=c(0,20))

  #Multiple logistic regression with two quantitative predictors
  M <- glm(Quality~alcohol*sulphates,data=WINE,family=binomial)
  visualize.model(M,loc="left")

  data(TIPS)
  #Multiple logistic regression with one categorical and one quantitative predictor
  #expanded x-limits to see more of the curve
  M <- glm(Smoker~PartySize*Weekday,data=TIPS,family=binomial)
  visualize.model(M,loc="topright",xlim=c(-5,15))
  
  #Partition model predicting a quantitative response
  TREE <- rpart(Salary~.,data=SALARY)
  visualize.model(TREE)
  
  #Partition model predicting a categorical response
  TREE <- rpart(Quality~.,data=WINE)
  visualize.model(TREE)
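
  #Additional hedged example (not from the original help page): cex.leg shrinks the
  #legend text when the legend crowds the plot; the WINE model above is reused
  M <- glm(Quality~alcohol*sulphates,data=WINE,family=binomial)
  visualize.model(M,loc="left",cex.leg=0.5)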
