Creating and grading randomized exams with RndTexExams

Introduction

The package RndTexExams creates exams with randomized content using R and LaTeX. The R code takes a LaTeX file as input and, for all multiple choice questions, randomly defines the order of the questions and the order of the answers. The user can also change the content of the questions in each version.

The main target audience of this package is examiners who want to minimize cheating in their exams. Based on this package and a database of questions, one can build a unique test and answer sheet for each student in the class, making it nearly impossible to cheat by looking around or sharing the answer sheet. Using the package together with a cloud-based spreadsheet tool also makes it possible to grade the exams digitally, without any manual intervention, a feature that greatly reduces grading work in large classes. If you still suspect that a student might be cheating, you can statistically test this hypothesis using RndTexExams (see the other vignette).

The code of RndTexExams is built around the stable framework of the examdesign and exam classes. Users who are not familiar with LaTeX and its exam classes are strongly advised to read the manuals before using RndTexExams. Users familiar with both can start using RndTexExams with minor additions to the LaTeX file.

How to use RndTexExams

Before you start

In order to use RndTexExams you will need to install LaTeX. You have two main choices, MiKTeX and TeX Live. I also advise using a nice IDE such as TeXstudio.

MiKTeX and TeX Live are both good choices and should work well with RndTexExams. My advice for those first starting out with LaTeX is to download the example file of the package from here and compile it using your flavor of LaTeX.

For Linux users it might be necessary to install the LaTeX packages examdesign and exam and the system utility texi2dvi. The following terminal command will make sure all LaTeX requirements are installed and ready for use:

sudo apt-get install texlive-base texlive-latex-extra texinfo

How to write questions with examdesign and RndTexExams

A standard multiple choice exam is composed of several questions, each with a main text and a set of alternative answers to choose from. A simple example is:


Given the next five options, which one is the correct answer?

a) Choice 1

b) Choice 2

c) Choice 3

d) Choice 4

e) Choice 5 - The CORRECT answer!


For the rest of this tutorial we will use the examdesign template, as it is a well-structured and self-contained exam class. The package RndTexExams also works with the exam class. In examdesign, the structure of a multiple choice question is defined using LaTeX commands. The multiple choice section is encapsulated by the commands \begin{multiplechoice} and \end{multiplechoice}. Within this environment, each question begins with \begin{question} and ends with \end{question}. The alternative answers of each question are marked with \choice{}.

Next I show the LaTeX code that produces the previous example:

% Example of a multiple choice question in examdesign (only one version)
% The preamble and the rest of the document are omitted for simplicity.
% Be aware that this snippet alone will NOT compile with pdflatex, as it is missing the rest of the document

\begin{multiplechoice}

\begin{question}

    Given the next five options, which one is the correct answer?

    \choice{Choice 1}
    \choice{Choice 2}
    \choice{Choice 3}
    \choice{Choice 4}
    \choice[!]{Choice 5 - The CORRECT answer!}

\end{question} 

\end{multiplechoice}

Notice from this simple example that each question has a main text and a set of choices. The correct answer is marked with the symbol [!], as in \choice[!]{text of answer here}. The correct answer for each question can later be used to build a version of the test with the correct answers for all questions.

The package RndTexExams will read the LaTeX file, search for all occurrences of multiple choice questions, randomly rearrange the order of questions and alternatives, and build a new LaTeX file for each version of the exam. Therefore, each version of the test will have a different answer sheet. The LaTeX files are later compiled from R, resulting in a set of pdf files that are ready for printing.

Changing the textual content of the exam

The package RndTexExams can use textual switches marked with specific symbols to define parts of the questions that change between versions. This is an optional feature for examiners who wish to create different versions of the same questions.

Any change in the text, whether in the main text of the question or in the text of the answers, is marked with the symbol @{text in ver 1}|{text in ver 2}|{text in ver 3}@. Each version of the test will show the text corresponding to its position: version one will show text in ver 1, version two will show text in ver 2, and so on. The version of the content changes every time R and LaTeX compile a new test.

Since we are changing the text of the questions and answers, it is also necessary to change the correct answers in each version. To do this, simply add the symbol [x] to the text of the answer, where x is the version in which that alternative is correct.

As an example, we can make different versions of the previous example question by using the following latex code with RndTexExams:

% Example of multiple choice question in examdesign, with 2 versions
\begin{multiplechoice}[resetcounter=no,  examcolumns=1]

\begin{question}

    Given the next five options, which one is the correct answer in @{version 1}|{version 2}@?

    \choice{Choice 1 - Incorrect in all versions}
    \choice{[2] Choice 2 - @{Incorrect in version 1}|{Correct in version 2}@ }
    \choice{Choice 3 - Incorrect in all versions}
    \choice{Choice 4 - Incorrect in all versions}
    \choice{[1] Choice 5 - @{Correct in version 1}|{Incorrect in version 2}@ }

\end{question} 

\end{multiplechoice}

And that's it! The R code in RndTexExams will look for these symbolic expressions and randomly choose one of them for the final version of each exam.

Once your questions have the proper syntax for use with RndTexExams, all you need to do is pass the tex file to function rte.analyze.tex.file and later to rte.build.rdn.test. Next, I present the use of RndTexExams with an example LaTeX file from the package.

library(RndTexExams, quietly = TRUE)

set.seed(10)

# Get latex file from package
f.in <- system.file("extdata", "MyRandomTest_examdesign.tex", package = "RndTexExams")

# Break down the latex file into a list
list.out <- rte.analyze.tex.file(f.in,
                                 latex.dir.out = 'latexOut',
                                 pdf.dir.out = 'PdfOut') 


# Options for build.rdn.test
list.in <- list.out       # output from rte.analyze.tex.file
f.out <- 'MyRandomTest_'  # pattern for names of pdfs
n.test <- 10            # number of random tests (usually the number of students) 
n.question <- 4           # number of questions in each test 
pdf.dir.out <- 'PdfOut'   # directory for pdf output

# Builds pdfs
list.build.rdn.exam <- rte.build.rdn.test(list.in = list.in,
                                          f.out = f.out,
                                          n.test = n.test,
                                          n.question = n.question,
                                          pdf.dir.out = pdf.dir.out) 

The function rte.analyze.tex.file will analyze the tex file and produce an R list with all the details of the LaTeX code. It will find and separate all of the multiple choice questions. I encourage the user to open the resulting list to see how all of the LaTeX code is broken down into pieces.
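For instance, you can inspect the top level of the parsed structure with base R. This is just an exploratory check; the element names depend on the version of the package:

# peek at the parsed exam structure
names(list.out)
str(list.out, max.level = 1)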

The function rte.build.rdn.test will use the output from rte.analyze.tex.file, randomize all of the contents of the multiple choice questions, and paste together a new LaTeX file that is used to compile the pdf files of the exam.

The correct answer sheet for all versions is available in list.build.rdn.exam$df.answer.wide, where each row is a version of the test and the columns are the answers. The list output from rte.build.rdn.test should be saved locally as a .RData file with the command save in order to be used later in the grading process. I also advise using function set.seed() so that every run of the code results in the same random answer sheet.
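As a minimal example of saving and restoring the output with base R (the file name is arbitrary):

# save the exam definition right after building the pdfs
save(list.build.rdn.exam, file = 'MyRandomExam.RData')

# later, in the grading session, restore it with
load('MyRandomExam.RData')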

Next I show the answer matrix of the exam, where each column is a question and each row is a version of the exam.

print(list.build.rdn.exam$answer.matrix)

Grading exams

The package RndTexExams also makes it easy to grade exams using R. This is especially helpful when the submission of the students' answers is integrated with a cloud-based spreadsheet service, allowing the examiner to record and process the answers of each student efficiently (details later).

When grading a class with RndTexExams, one should have the following information from the students: the name of each student, the version of the exam answered by each student, and the answers given by each student.

The workflow of grading begins by pairing each student's name with a version of the exam. After that, grading is accomplished by summing the number of correct answers, weighted by the grading score of each question. The function rte.grade.exams performs this calculation.
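To make the weighted-sum idea concrete, here is a minimal sketch of the calculation for a single student, assuming equal weights for all questions. The vectors below are illustrative only; rte.grade.exams performs this calculation for every student in the class:

# correct answers for this student's version and the answers actually given (illustrative)
correct.answers <- c('a', 'c', 'e', 'b')
given.answers   <- c('a', 'c', 'd', 'b')

# hypothetical grading scheme: equal weight for each question
q.weights <- rep(1/length(correct.answers), length(correct.answers))

# final score is the weighted sum of correct answers
final.score <- sum(q.weights*(given.answers == correct.answers))
print(final.score) # 0.75 in this example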

In the next example I show the use of function rte.grade.exams with some random data.

set.seed(10)

# create some (almost) random names
my.names <- c('John', 'Max','Michael','Marcelo','Ricardo', 'Tarcizio')

# random version of the test for each student
ver.test <- sample(n.test,size = length(my.names),replace = TRUE)

# number of simulated questions (same as before)
n.questions <- 4

# Get the correct answer sheet from previous code 
correct.answer.sheet <- list.build.rdn.exam$answer.matrix

# create simulated answers from students (cheat a little bit!)
q.to.cheat <- floor(n.questions/2)  # get at least half of questions right!
my.answers <- cbind(correct.answer.sheet[ver.test,1:q.to.cheat], 
                    matrix(sample(letters[1:5],                                          
                                  replace = T,
                                  size = length(my.names)*(n.questions-q.to.cheat)),
                           ncol = n.questions-q.to.cheat ))

# grade exams with rte.grade.exams 
grade.l.out <- rte.grade.exams(exam.names = my.names,
                              exam.version = ver.test, 
                              exam.answer.matrix = my.answers,
                              list.build.rdn.exam = list.build.rdn.exam)

Now we can plot the results of the grading process.

# print results in a bar plot

library(ggplot2)

p <- ggplot(grade.l.out$df.final.score, aes(y = final.score, x = my.names))
p <- p + geom_bar(stat = "identity") + labs(title = 'Final Score')
p <- p + theme(axis.text.x = element_text(angle = 90, hjust = 1))
print(p)

p <- ggplot(grade.l.out$df.grade, aes(y = n.question, x = exam.names, fill = grade.logical))
p <- p + geom_tile() + labs(title = 'Correct answer grade')
p <- p + theme(axis.text.x = element_text(angle = 90, hjust = 1))
print(p)

Integrating RndTexExams with Google Spreadsheets

The use of RndTexExams is optimized when a cloud-based spreadsheet tool is used to register the answers of the students. This avoids the manual work usually needed to process the information collected when the exam is applied. The package RndTexExams not only minimizes cheating but can also make the grading process painless.

The cloud services I currently use and recommend are Google Spreadsheets and Google Forms. Both are free and easy to use. Next I present the steps I suggest in order to build a random exam and grade it using a spreadsheet in the cloud.

  1. Create a randomized exam with rte.build.rdn.test. Make sure you locally save the correct answer sheet for each version of the test (output answer.matrix). I also advise using set.seed() in order to keep the same randomized content of the tests, just in case.

  2. Create a new form in Google Forms with the following questions (example here):
     - Name of student (short answer)
     - ID card (optional) (short answer)
     - Version of the test (short answer with numerical validation)
     - Answers of the test (multiple choice grid, rows = question numbers, columns = alternatives a, b, c, d, e)

  3. On the day of the test, hand out the printed exams from step 1 and give the students the link to the Google Forms page created in step 2. The site tinyurl will help to create a customized, human-readable link. I usually write the link on the blackboard so that only those who came to the exam can access it. Once a student finishes the exam, he/she can register the answers with a smartphone by accessing the link in a browser. I also bring a laptop with an internet connection in case some students do not have a smartphone. As a backup plan, make sure all students return the printed exams.

  4. Once all students have registered their answers, gain access to the spreadsheet in the cloud with the googlesheets package. This way you can easily import the data from the cloud into R. I also advise running some code to check the data in the cloud: make sure that each student has only one registration and that the version of the test is unique for each (see the sketch after this list). The link to the example spreadsheet previously created is here. See the vignette of googlesheets here for examples of how to gain access to a Google spreadsheet.

  5. Use the information from the cloud and the correct answer sheet from step 1 with rte.grade.exams in order to grade the exams and calculate the final score for each student.
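As an illustration, the import and sanity checks in step 4 could look like the sketch below. The sheet title and column names are assumptions based on the form described in step 2; adapt them to the names used in your own form:

library(googlesheets)

gs_auth()                             # authenticate with your Google account
my.sheet <- gs_title('Exam answers')  # register the spreadsheet by its title (assumed name)
df.answers <- gs_read(my.sheet)       # import the form responses as a data frame

# basic checks before grading (column names are assumptions)
any(duplicated(df.answers$Name))        # should be FALSE: one registration per student
all(df.answers$Version %in% 1:n.test)   # should be TRUE: only valid exam versions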

Getting started on your own test

The easiest way to get started on your own test is to use the templates distributed at Gist. There you can find LaTeX files for examdesign and exam. Download one or copy and paste the contents of the Gist file into a new tex file, which we will assume is called MyRandomTest_examdesign.tex. You can also download the examdesign gist file in R using:

setwd('Your path goes here')
download.file(url = 'https://gist.github.com/msperlin/ef1b93a8eb9026ba5e9a/raw/MyRandomTest_examdesign.tex', destfile = 'MyRandomTest_examdesign.tex' )

Once you have the LaTeX file, run the following script to build 5 random tests, each with 4 questions:

library(RndTexExams)

my.d <- 'Your folder to the tex file here!'
setwd(my.d)

f.in <- 'MyRandomTest_examdesign.tex'
f.out <- 'RandomTest-'
n.test <- 5
n.question <- 4
latex.dir.out <- 'latexOut'
pdf.dir.out <- 'PdfOut'

list.out <- rte.analyze.tex.file(f.in,
                                 latex.dir.out = latex.dir.out,
                                 pdf.dir.out = pdf.dir.out)

out <- rte.build.rdn.test(list.in = list.out,
                          f.out = f.out,
                          n.test = n.test,
                          n.question = n.question,
                          latex.dir.out = latex.dir.out)                                 

The five pdf files, named RandomTest-1.pdf, RandomTest-2.pdf, and so on, should now be available in the folder PdfOut.
