Spark's Work

RExams: A real-life use case

  1. Introduction
  2. Overall experience
  3. Generating random questions
  4. Designing test formats
  5. Concluding remarks


Between May and August 2020, I was a course instructor at UofT, teaching ACT240 (Mathematics of Investment and Credit) for the first half of the summer and ACT247 (Introductory Life Contingencies) for the second half.

As a first-time instructor, I was excited and nervous at the same time. Fortunately, things went mostly smoothly. I received much positive feedback from my students, while also learning a lot from the entire experience.

One major challenge for instructors in 2020 was the online course format during the COVID-19 pandemic, particularly how tests should be conducted online to prevent students from "collaborating" when they should not.

Considering I'd mainly use multiple-choice and fill-in-the-blank formats for test questions, I thought it would be a prime opportunity to experiment with the R package exams, which I had come across one day while browsing the Internet. It turned out to be quite efficient and helped me a lot, so I am writing down the experience for future reference.

Overall experience

The official package website already provides good step-by-step guidance (here). For the simplest case, the user only needs a handful of exercise files plus a short driver script, both of which can be generated by the exams_skeleton function.
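For reference, a minimal dynamic exercise is a single R Markdown file with a question, a solution, and a meta-information block. A rough sketch (the file content and numbers here are illustrative, not from the package's demo files):

```markdown
Question
========
What is the solution of $x + 3 = 5$?

Solution
========
Subtracting 3 from both sides gives $x = 2$.

Meta-information
================
extype: num
exsolution: 2
exname: linear_equation
```

Running exams_skeleton() in an empty directory writes several demo exercises of this shape together with a driver script, which is a convenient starting point.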

In my case, I was interested in generating questions usable on the UofT online learning system Quercus, which is an adaptation of the Canvas learning management system (LMS) widely used by universities around the world.

While the exams package does have a function exams2canvas that targets the Canvas LMS, I found the result unsatisfactory, as some answers were not correctly identified by the online LMS (the cause could have been me, the package, or Canvas).

Luckily, the function exams2blackboard worked fine, even though it was designed for a slightly different LMS. After generating the questions as a zip archive, I uploaded the file to the online system; from there, only a few clicks were needed to finish designing a complete test: simple!
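The export itself is essentially a one-liner. A sketch of the call (the exercise file names and counts here are made up for illustration):

```r
library("exams")

## generate 50 random versions of two exercises and bundle them
## into a Blackboard-compatible zip archive for upload
exams2blackboard(c("linear.Rmd", "interest.Rmd"),
                 n = 50, name = "act240-quiz1")
```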

Overall, I'd say there is a learning curve at the beginning, e.g. getting familiar with the package setup, formatting tests in a particular fashion (I also tried generating some pdf files from customized templates), and learning how to generate random questions for each test taker. After finishing the first few questions, I got quite comfortable with the package, and it ended up saving me a lot of time and energy.

Generating random questions

As mentioned, the exams package helped me prevent, at least to some extent, students from collaborating in an online test. Let me elaborate on this point.

Let us look at a very simple case: say I would like to test the students on solving a linear equation. In an offline test setting, I could just give a single question to everyone, e.g. solve the equation x + 3 = 5 for x.

Now, for online tests, it would be better to vary some parameters of the question (without changing what it essentially tests). This task is straightforward to implement in the exams package.
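Concretely, this amounts to (1) sampling the parameters in an R chunk, (2) referencing them inline in the question text, and (3) computing the solution from the same parameters. A sketch of such a randomized exercise file (all names and ranges here are illustrative):

````markdown
```{r, echo = FALSE, results = "hide"}
## randomly draw the coefficients of a*x + b = c
a <- sample(2:9, 1)
x <- sample(1:10, 1)      # the intended solution
b <- sample(1:20, 1)
c <- a * x + b            # right-hand side consistent with x
```

Question
========
Solve the equation $`r a` x + `r b` = `r c`$ for $x$.

Solution
========
$x = (`r c` - `r b`) / `r a` = `r x`$.

Meta-information
================
extype: num
exsolution: `r x`
exname: random_linear_equation
````

Each time the file is compiled, fresh parameters are drawn, so every generated copy of the exam carries its own variant of the question.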

Indeed, it is possible to generate a unique set of questions for each test taker, thus reducing the possibility of collaboration: at the very least, they cannot blindly copy an answer from someone else, since they may well be answering the same question with different parameters.

In my opinion, if the questions are properly designed, this mechanism also ensures the fairness of the test, since everyone is essentially tested on the same skill (in this case, solving a linear equation), but only with slightly different details which are immaterial.

Designing test formats

In addition to generating the questions, I also spent some time designing the multiple-choice and fill-in-the-blank test format.

For multiple-choice questions, I gave five possible choices with one correct answer. Of course, we do not want the correct answer to always be, say, C, so what I did was randomize the order of the answer choices in each generated copy.
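In the exams package this is handled declaratively: the answer list is written once and shuffled per generated copy. A sketch of the relevant parts of a single-choice exercise (question content illustrative):

```markdown
Question
========
What is the solution of $x + 3 = 5$?

Answerlist
----------
* $2$
* $3$
* $5$
* $8$
* $-2$

Meta-information
================
extype: schoice
exsolution: 10000
exshuffle: TRUE
exname: linear_equation_mc
```

Here exsolution: 10000 marks the first listed answer as correct, and exshuffle: TRUE tells the package to reorder the choices randomly for each copy, so the position of the correct answer varies across test takers.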

For fill-in-the-blank questions, the Canvas LMS allows for approximate numerical answers, which is especially useful when intermediate calculation steps involve rounding. This can easily be catered for by setting a tolerance level in the exams package.
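For a numeric (num) exercise, the tolerance goes into the meta-information block. For example, assuming an exact answer of 2 that should be accepted within ±0.01:

```markdown
Meta-information
================
extype: num
exsolution: 2
extol: 0.01
```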

Concluding remarks

All in all, using the exams package in a real-life setting was quite an experience. The package gets the job done nicely and proved intuitive and easy to use (albeit with some learning curve). I am not sure when I will use it a second time, but if I do, I will be much more comfortable with the package, saving even more time on designing test questions. My appreciation goes to the package developers for their contributions (see here).