Between May and August 2020, I was a course instructor at UofT, teaching ACT240 (Mathematics of Investment and Credit) for the first half of the summer and ACT247 (Introductory Life Contingencies) for the second half.
As a first-time instructor, I was excited and nervous at the same time. Fortunately, things went mostly smoothly. I received much positive feedback from my students, while also learning a lot from the entire experience.
One major challenge for instructors in 2020 was the online course format during the COVID-19 pandemic, particularly how tests should be conducted online to prevent students from "collaborating" when they should not.
Considering I'd mainly use multiple-choice and fill-in-the-blank formats for test questions, I thought it would be a prime opportunity to experiment with the R package exams, which I came across one day while searching for random stuff on the Internet. It turned out to be quite efficient and helped me a lot, so I am writing down the experience for future reference.
The official package website already provides good step-by-step guidance (here). For the simplest case, the user only needs the following:
Question (+ Answer) files (templates available here); and
A script to generate exam files in various formats (pdf, html, etc.).
For my case, I am interested in generating questions usable on the UofT online learning system Quercus, which is an adaptation of the Canvas learning management system (LMS) widely used by many universities around the world.
While the exams package does have a function exams2canvas that corresponds to the Canvas LMS, I found the result unsatisfactory, as some answers were not correctly identified by the online LMS (which could be caused by me, the package, or Canvas).
Luckily, the function exams2blackboard worked fine, even though it was designed for a slightly different LMS. After generating the questions in zip format, I uploaded the files to the online system, after which I only needed a few clicks to finish designing a complete test: simple!
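As an illustration, the export step can be sketched as follows. The exercise filename, replication count, and output settings are my own placeholders, and the call is only attempted when the exams package is available:

```r
# Sketch: export a dynamic exercise to a Blackboard-compatible zip.
# "linear-equation.Rmd" and the output settings are hypothetical placeholders.
if (requireNamespace("exams", quietly = TRUE)) {
  exams::exams2blackboard(
    "linear-equation.Rmd",  # exercise file (one per question)
    n    = 50,              # number of random replications per exercise
    name = "act240-quiz",   # base name of the generated zip file
    dir  = "output"         # directory where the zip is written
  )
}
```

The resulting zip is what gets uploaded to the LMS in one go.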
Overall, I'd say there is a learning curve at the beginning, e.g. getting familiar with the package setup, formatting tests in a particular fashion (I also tried generating some pdf files with customized templates), and learning how to generate random questions for each test taker. After finishing the first few questions, I became quite comfortable with the package, and it ended up saving me a lot of time and energy.
As mentioned, the exams package helped me prevent, at least to some extent, students from collaborating in an online test. I will elaborate on this point.
Let us look at a very simple case: say I would like to test the students on solving a linear equation. In an offline test setting, I could just give one single question to everyone, e.g. solve the equation 2x + 3 = 0 for x.
Now, for online tests, it would be better to vary some parameters in the question (without changing what it is essentially testing on). This task is easily implemented in the
exams package following these steps:
Design the question as: solve the equation ax + b = 0 for x, where a and b are parameters.
Randomly generate a and b from a reasonable range, so that students don't get very weird numbers. This step is invisible to the students.
The answer is coded as x = -b/a, which is automatically calculated from the a and b generated above, so each answer file matches its corresponding question.
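The three steps above can be sketched in plain R; this is the kind of data-generating logic one would put in an exercise's setup chunk, and the parameter ranges below are my own illustrative choices:

```r
# Randomly draw parameters for "solve a*x + b = 0 for x".
# The ranges below are illustrative; pick whatever avoids "weird" numbers.
a <- sample(1:9, 1)       # nonzero slope
b <- sample(-20:20, 1)    # intercept
sol <- -b / a             # the answer, computed automatically from a and b

# Sanity check: the coded answer indeed solves the equation.
stopifnot(abs(a * sol + b) < 1e-12)
```

Each rendered copy of the exercise then carries its own (a, b) pair and matching answer.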
Indeed, it is possible to generate a unique set of questions for each test taker, thus reducing the possibility of collaboration: at least they cannot blindly fill in an answer given by someone else, as they might well be answering the same question with different parameters.
In my opinion, if the questions are properly designed, this mechanism also ensures the fairness of the test, since everyone is essentially tested on the same skill (in this case, solving a linear equation), but only with slightly different details which are immaterial.
In addition to generating the questions, I also spent some time designing the multiple-choice and fill-in-the-blank test format.
For multiple-choice questions, I gave five possible choices with one correct answer. Of course, we do not want the correct answer to always be, e.g., C. To obtain a random assignment of the correct choice, what I did was:
Randomly choose the position of the correct answer from A to E, which is easily done by the sample function in R.
Say the correct answer is C. Then, the five possible choices would be:

A. x - 2d
B. x - d
C. x
D. x + d
E. x + 2d

where the choices are coded as x - 2d, x - d, ..., x + 2d for a suitable offset d around the correct value x, given C is the correct one.
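A minimal sketch of this randomization in R; the true answer x and the offset d are placeholder values:

```r
set.seed(1)                     # for reproducibility in this sketch
x <- 1.23                       # true answer (placeholder value)
d <- 0.25                       # offset between adjacent choices (placeholder)

pos <- sample(1:5, 1)           # random position of the correct choice (1 = A, ..., 5 = E)
choices <- x + (1:5 - pos) * d  # choices centered so position `pos` is exactly x
names(choices) <- LETTERS[1:5]

# The choice at the sampled position equals the true answer.
stopifnot(choices[[pos]] == x)
```

Because the whole vector is shifted around the sampled position, every test taker sees five plausible values, with the correct one landing in a random slot.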
The choices can also be designed as ranges rather than exact values. Say the correct answer is x. Then, the five possible choices could be:
A. x is less than 0.50
B. x is larger than or equal to 0.50, but less than 1.00
C. x is larger than or equal to 1.00, but less than 1.50
D. x is larger than or equal to 1.50, but less than 2.00
E. x is larger than or equal to 2.00
where the choices can be coded in a similar systematic fashion.
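One systematic way to code such range choices in base R is findInterval, which returns the bin a value falls into; the breakpoints below mirror the example above:

```r
breaks <- c(0.50, 1.00, 1.50, 2.00)   # range boundaries from the example above

correct_choice <- function(x) {
  # findInterval(x, breaks) counts how many breakpoints are <= x,
  # so 0 maps to A ("less than 0.50") and 4 maps to E (">= 2.00").
  LETTERS[findInterval(x, breaks) + 1]
}

correct_choice(0.30)   # "A"
correct_choice(1.23)   # "C"
correct_choice(2.50)   # "E"
```

This keeps the correct letter consistent with the randomly generated answer without any hand-written case logic.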
For fill-in-the-blank questions, the Canvas LMS allows for fuzzy answers, which is especially useful when intermediate calculation steps involve numerical rounding. This can easily be catered for by setting a tolerance level in the question metadata.
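Conceptually, a fuzzy answer is just an absolute-tolerance comparison, which can be sketched in a few lines; the tolerance of 0.01 here is my own illustrative choice, not a value from the course:

```r
# Sketch: accept a submitted numeric answer within an absolute tolerance.
# The tolerance of 0.01 is an illustrative choice.
is_correct <- function(submitted, true_answer, tol = 0.01) {
  abs(submitted - true_answer) <= tol
}

is_correct(1.2301, 1.23)   # TRUE: within rounding tolerance
is_correct(1.25,   1.23)   # FALSE: off by more than 0.01
```

The LMS performs this comparison on its side; the tolerance just needs to be chosen generously enough to absorb reasonable intermediate rounding.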
All in all, it was quite an experience using the exams package in a real-life setting. The package gets the job done nicely and has proven intuitive and easy to use (albeit with some learning curve). I am not sure when I'd use it a second time, but if so, I would be much more comfortable with the package, thus saving more time on designing test questions. My appreciation for the contributions made by the package developers (see here).