The covid goddess appeared to me in spring and told me to create random
exams for online examination, so that individual solutions could not be
shared. I started with auto-multiple-choice, but I soon noticed that its
basic computation capabilities, based on the LaTeX fpeval package, are not
enough to build questions on many topics.
I tried to call sagetex from auto-multiple-choice, and I did succeed, but
it was awkward and buggy, since both sagetex and auto-multiple-choice hack
LaTeX in incompatible ways to bypass the limitations of the verbatim
environment.
Eventually, I wrote my own system for generating exams, based on sagetex
alone. It works like this:
- "pdflatex file.tex" compiles the latex file, but does not eval sage,
as in sagetex, so it is compatible with latex editors. This file has some
markup for the solution, and for multiple choice questions (optional).
- the standard sagetex sequence works as usual:
- pdflatex file.tex
- sage file.sagetex.sage
- pdflatex file.tex
- the command "./process file.tex" is just the three above commands in
succesion
- the command "./process file.tex all", however, reads all student data
from "students.csv" and generates
- the statements id1.pdf, id2.pdf, etcétera, in the "question" folder
- the solutions, id1.pdf, id2.pdf, etcétera, in the "solution" folder
- If there are forms in the pdf, then "file.data" contains info about
the questions and the correct solution for each question.
It can be used in two ways:
- A multiple choice exam for each student, which you can send by email
or distribute using a "folder" activity in Moodle, and collect using an
"assign" activity. After downloading the filled exams from Moodle, the
script in the "grade" folder helps grade them automatically (see the
sketch after this list). It only works if the students fill in the PDF
forms properly; it does not work if they draw their answers on top by
some other means.
- An open question exam.
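About the automatic grading, here is a minimal sketch of how filled forms
could be graded. It assumes the form fields are named q1, q2, ..., that
"file.data" can be read as "question,answer" lines, and that the filled
exams sit in a "filled" folder; those assumptions are mine, as is the use
of the pypdf library, and the real "grade" script may work differently.

    # Hypothetical grading sketch (pip install pypdf).
    from pypdf import PdfReader
    import csv, glob, os

    def load_key(path="file.data"):
        # assumed format: one "question,answer" pair per line
        with open(path, newline="") as f:
            return {q: a for q, a in csv.reader(f)}

    def grade(pdf_path, key):
        # read the values of the PDF form fields and compare with the key
        fields = PdfReader(pdf_path).get_fields() or {}
        answers = {name: str(field.get("/V", "")).lstrip("/")
                   for name, field in fields.items()}
        return sum(1 for q, correct in key.items()
                   if answers.get(q) == correct)

    key = load_key()
    for pdf in sorted(glob.glob("filled/*.pdf")):
        print(os.path.basename(pdf), grade(pdf, key), "/", len(key))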
For both kinds of exam a detailed solution is useful to the student. For
open questions it is also useful to the grader, since it makes it easier
to detect mistakes in the computations. The examples in the attached zip
file do not include that possibility, but I did it in the messy exam built
with auto-multiple-choice and sagetex; it could be done with this system
as well, and I'm sure you know what I mean.
I'd be glad to have some feedback. It is important to work out the details,
so that a system like this can be used by more people who know little Sage
and Python. There are many possible improvements, like text-field questions
that are checked with Sage's symbolic capabilities (a sketch below), export
to Moodle questions, and many others. The real treat will come when we
start to design questions where the number of vectors, the dimensions of
the subspaces and of their intersections, and so on, are different each
time, so that the same question can be reused for practice a few times
(a second sketch below).
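For the text-field idea, a minimal sketch in Sage: parse the student's
string into the symbolic ring and test equivalence against the expected
expression. The function name and the simplification step are just
illustrative.

    # Check a free-text answer with Sage's symbolic capabilities.
    from sage.all import SR, var

    def answer_matches(student_text, expected):
        try:
            student_expr = SR(student_text)   # parse, e.g. "sin(x)^2 + cos(x)^2"
        except Exception:
            return False                      # unparsable input counts as wrong
        # equivalent expressions simplify to a zero difference
        diff = (student_expr - expected).simplify_full()
        return bool(diff == 0)

    x = var('x')
    print(answer_matches("sin(x)^2 + cos(x)^2", SR(1)))   # True
    print(answer_matches("2*x + 1", 2*x - 1))             # False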
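And for the questions whose shape changes each time, a sketch of how the
number of vectors and the dimensions could themselves depend on the
per-student seed. Everything here is standard Sage; only the idea of
seeding from the (numeric) student id is an assumption.

    # A subspace question whose dimensions vary with the student's seed.
    from sage.all import QQ, random_matrix, set_random_seed, randint

    def make_subspace_question(student_id):
        set_random_seed(student_id)               # same id -> same exam
        n = randint(3, 5)                         # ambient dimension varies
        A = random_matrix(QQ, randint(2, n), n)   # spanning vectors of U
        B = random_matrix(QQ, randint(2, n), n)   # spanning vectors of W
        U, W = A.row_space(), B.row_space()
        return {
            "U_generators": A.rows(),
            "W_generators": B.rows(),
            "dim_U": U.dimension(),
            "dim_W": W.dimension(),
            "dim_U_cap_W": U.intersection(W).dimension(),
            "dim_U_plus_W": (U + W).dimension(),
        }

    print(make_subspace_question(12345))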
But I'd like to get the basics right first. Maybe the R exams package is
showing the righteous way? But it doesn't work in a LaTeX editor...
Regards