MathExam14W (R Documentation)
Description

Responses of 729 students to 13 items in a written exam of introductory mathematics, along with several covariates.
data("MathExam14W")
Format

A data frame containing 729 observations on 9 variables.
solved: Item response matrix (of class itemresp) with values 1/0 coding solved correctly/other.

credits: Item response matrix (of class itemresp) with values 2/1/0 coding solved correctly/incorrectly/not attempted.

nsolved: Integer. The number of items solved correctly.

tests: Integer. The number of online test exercises solved correctly prior to the written exam.

gender: Factor indicating gender.

study: Factor indicating two different types of business/economics degrees. Either the 3-year bachelor program (571) or the 4-year diploma program (155).

semester: Integer. The number of semesters enrolled in the given university program.

attempt: Factor. The number of times the course/exam has been attempted (including the current attempt).

group: Factor indicating whether the students were in the first or second batch (with somewhat different items) in the exam.
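A quick way to inspect this structure (a minimal sketch, assuming only that the psychotools package is installed):

data("MathExam14W", package = "psychotools")
str(MathExam14W)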
Details

The data provide individual end-term exam results from a Mathematics 101 course for first-year business and economics students at Universität Innsbruck. The format of the course comprised biweekly online tests (26 numeric exercises, conducted in OpenOLAT) and a written exam at the end of the semester (13 single-choice exercises with five answer alternatives). The course covers the basics of analysis, linear algebra, financial mathematics, and probability calculus (the latter is not assessed in this exam).
In this exam, 729 students participated (out of 941 registered in the course). To avoid cheating, all students received items with essentially the same questions but different numbers (using the exams infrastructure of Zeileis et al. 2014). Moreover, due to the large number of participants, two groups of students had to be formed, which received partially different items. The items that differed (namely 1, 5, 6, 7, 8, 9, 11, 12) varied in their setup/story, but not in the mathematical skills needed to solve them. Prior to the exam, the students could select themselves into either the first group (early in the morning) or the second group (starting immediately after the end of the first group).
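With the data loaded as above, the sizes of the two batches can, for instance, be tabulated via the group factor (an illustrative sketch):

xtabs(~ group, data = MathExam14W)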
Correctly solved items yield 100 percent of the associated points. Items without a correct solution are either left unanswered (0 percent) or answered incorrectly (minus 25 percent, to discourage random guessing). In the examples below, the items are mostly treated as binary. Typically, students with 8 out of 13 correct answers passed the course.
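As a small worked example of this scoring rule (assuming, in line with the points computation in the examples below, that each item is worth 2 points, i.e., 26 points in total):

## hypothetical student: 9 items correct, 2 incorrect, 2 not attempted
9 * 2 - 2 * 0.5  ## = 17 out of 26 points, above the 50 percent threshold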
Source

Department of Statistics, Universität Innsbruck
References

Zeileis A, Umlauf N, Leisch F (2014). "Flexible Generation of E-Learning Exams in R: Moodle Quizzes, OLAT Assessments, and Beyond." Journal of Statistical Software, 58(1), 1–36. doi:10.18637/jss.v058.i01
See Also

itemresp, raschmodel, pcmodel, anchortest
Examples

## load data and exclude extreme scorers
data("MathExam14W", package = "psychotools")
MathExam14W <- transform(MathExam14W,
  points = 2 * nsolved - 0.5 * rowSums(credits == 1))
me <- subset(MathExam14W, nsolved > 0 & nsolved < 13)
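## how many students were dropped as extreme scorers?
## (illustrative check, not part of the original example)
table(extreme = MathExam14W$nsolved %in% c(0, 13))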
## item response data:
## solved (correct/other) or credits (correct/incorrect/not attempted)
par(mfrow = c(1, 2))
plot(me$solved)
plot(me$credits)
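## proportion of correct responses per item
## (illustrative addition; the comparison with 1 yields an item-wise
## indicator matrix, as used in the points computation above)
round(colMeans(me$solved == 1), digits = 2)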
## PCA
pr <- prcomp(me$solved, scale = TRUE)
names(pr$sdev) <- 1:10  ## label the components shown in the scree plot (at most 10 by default)
plot(pr, main = "", xlab = "Number of components")
biplot(pr, col = c("transparent", "black"), main = "",
  xlim = c(-0.065, 0.005), ylim = c(-0.04, 0.065))
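## proportion of variance explained by each principal component
## (illustrative addition)
summary(pr)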
## points achieved (and 50% threshold)
par(mfrow = c(1, 1))
hist(MathExam14W$points, breaks = -4:13 * 2 + 0.5,
  col = "lightgray", main = "", xlab = "Points")
abline(v = 12.5, lwd = 2, col = 2)
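## share of all participants above the 50 percent threshold
## (illustrative addition; 12.5 points matches the threshold line drawn above)
mean(MathExam14W$points > 12.5)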
## Rasch and partial credit model
ram <- raschmodel(me$solved)
pcm <- pcmodel(me$credits)
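## estimated item parameters of both models
## (illustrative addition; itempar() returns item difficulties for the Rasch
## model and item locations for the partial credit model)
itempar(ram)
itempar(pcm)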
## various types of graphics displays
plot(ram, type = "profile")
plot(pcm, type = "profile", add = TRUE, col = "blue")
plot(ram, type = "piplot")
plot(pcm, type = "piplot")
plot(ram, type = "region")
plot(pcm, type = "region")
plot(ram, type = "curves")
plot(pcm, type = "curves")
## test for differential item functioning (DIF) with automatic anchoring
## passing vs. not passing students
at1 <- anchortest(solved ~ factor(nsolved <= 7), data = me,
  adjust = "single-step")
at1
plot(at1$final_tests)
## -> "good" students discriminate somewhat more
## (quad/payflow/lagrange are slightly more difficult)
## group 1 vs. group 2
at2 <- anchortest(solved ~ group, data = me, adjust = "single-step")
at2
plot(at2$final_tests)
## -> quad/payflow/planning easier for group 1
## -> hesse slightly easier for group 2
## bring out differences between groups 1 and 2
## by (anchored) item difficulty profiles
ram1 <- raschmodel(subset(me, group == "1")$solved)
ram2 <- raschmodel(subset(me, group == "2")$solved)
plot(ram1, parg = list(ref = at2$anchor_items), ylim = c(-2, 3))
plot(ram2, parg = list(ref = at2$anchor_items), add = TRUE, col = "blue")
legend("topleft", c("Group 1", "Group 2"), pch = 21,
pt.bg = c("lightgray", "blue"), bty = "n")