
## July 07, 2006

### Looking back to 14th March …

Follow-up to Pi in the Sky? from Computer-aided assessment for sciences

… the date of a Computer–Aided Assessment workshop at Warwick showcasing some of the software we are evaluating as part of an in–house CAA Project (for 'in–house' read 'Warwick Science Faculty'). Four software packages were exposed to scrutiny, and I would now like to say a little about each, our experience so far and the plans we have for them. More details will be posted soon on the Project website.

**Maple TA** (MTA).

This commercial **T**eaching and **A**ssessment package from the Canadian firm *Maplesoft* is built on their well-known computer algebra system *Maple*. This foundation gives MTA one of its main strengths, namely the way it handles mathematics:

- at the authoring stage, where it offers a palette of mathematical symbols and a LaTeX facility
- in its rendering of equations onscreen
- in its ability to include random parameters in question templates
- in its ability to parse mathematically equivalent answers.
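The last point deserves a word of explanation: the marker should accept any answer that is mathematically equal to the model answer, not just an identical string. A minimal sketch of the idea in Python, using the SymPy library purely as an illustration (MTA itself uses the Maple engine for this):

```python
from sympy import simplify, sympify

def answers_equivalent(submitted: str, model: str) -> bool:
    """Treat two answer strings as equal if their difference simplifies to zero."""
    return simplify(sympify(submitted) - sympify(model)) == 0

# All of these are judged the same answer, however the student writes it:
print(answers_equivalent("2*(x + 1)", "2*x + 2"))       # True
print(answers_equivalent("sin(x)**2", "1 - cos(x)**2"))  # True
print(answers_equivalent("x + 1", "x + 2"))              # False
```

This is what a plain string-matching quiz engine cannot do, and it is why a computer-algebra foundation matters for assessing mathematics.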

At the workshop, its praises were sung by a representative from Adept Science (who distribute MTA in the United Kingdom) and given a more measured "warts and all" assessment by Michael McCabe, who had used it to assess Mathematics students at Portsmouth University a few months earlier.

Five weeks after the workshop, we used MTA, in tandem with *Question Mark Perception*, for a 50-minute summative test taken by 166 students registered for a second-year module in elementary number theory. The test contained 11 multiple-choice questions (MCQs) and contributed around 6% of the module credit. The questions were fairly easy to author, the test was delivered reliably on the day by the hosting service, and I found it straightforward to extract and process the students' answer files from its database. On the negative side, I found MTA's imposed marking scheme and MCQ format frustratingly restrictive: I had to adjust marks laboriously, student by student, to award 3 for a correct answer, 1 for 'don't know' and zero for a wrong choice, normalising the totals by subtracting 8. In particular, I could not return the students' scores as soon as they clicked the submit button. Here *Perception* won hands–down on pedagogical flexibility but couldn't compete on the maths.
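The re-marking itself was simple arithmetic; a minimal Python sketch of the scheme described above (the response labels are my own, not MTA's export format):

```python
def remark(responses):
    """Re-mark one student's MCQ responses: 3 for a correct answer,
    1 for 'don't know', 0 for a wrong choice, then subtract 8 to
    normalise the total, as in the 11-question test described above."""
    marks = {"correct": 3, "dont_know": 1, "wrong": 0}
    return sum(marks[r] for r in responses) - 8

# A student with 7 correct answers, 2 'don't know' and 2 wrong choices:
print(remark(["correct"] * 7 + ["dont_know"] * 2 + ["wrong"] * 2))  # 15
```

The tedium was not the arithmetic but having to apply it by hand, student by student, because MTA would not let us define the scheme within the test itself.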

We were fortunate in getting MTA for an extended trial evaluation period. We tried, without success, to run it locally on a Java server that was configured for other apps. Subsequently we changed to the hosted service. Apart from an unfortunate loss of data two days before the test, this worked robustly and we have now taken out a 500–student departmental licence for the coming academic year, when we hope to go beyond the simple MCQ format and begin to exploit MTA's full mathematical capabilities.

**WeBWorK**

This is a mature open–source assessment package developed with generous NSF funding at the University of Rochester. It has been around for over a decade and is currently used by over 80 (mainly North American) universities. When Jim Robinson of the Warwick Physics Department started looking for a suitable CAA package to improve and reinforce the mathematical abilities of his department's first–year students, he listed the following criteria. The software should preferably be:

- Capable of rendering mathematics
- Free
- Client–independent and available off site using web–based technology

It should also offer:

- A good bank of problems at the right level
- Easy authoring, customizable question formats and individualised problems
- Instant feedback — hints, right/wrong, model solutions
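WeBWorK problems are actually written in its Perl-based PG language, but the flavour of an individualised problem with instant feedback can be conveyed by a short Python sketch (the structure and names here are my own illustration, not PG syntax):

```python
import random

def make_problem(seed):
    """Generate an individualised differentiation question: the seed
    (e.g. a student ID) gives each student their own coefficients,
    reproducibly."""
    rng = random.Random(seed)
    a, n = rng.randint(2, 9), rng.randint(2, 5)
    question = f"Differentiate {a}*x^{n} with respect to x."
    answer = (a * n, n - 1)  # coefficient and exponent of the derivative
    return question, answer

def check(answer, coeff, exp):
    """Instant right/wrong feedback, with a hint after a wrong attempt."""
    if (coeff, exp) == answer:
        return "Correct!"
    return "Incorrect: remember d/dx x^n = n*x^(n-1). Try again."

question, answer = make_problem(seed=1234)
print(question)
print(check(answer, *answer))  # Correct!
```

Seeding by student means neighbours cannot simply copy answers, while the marker can regenerate any student's version of the problem when queries arise.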

WeBWorK has all these desirable features. Earlier this year, Jim installed WeBWorK on a Linux box, an old 500MHz PC. He needed some Linux systems-administration experience, plus a few hours (there is no installation wizard), to install the WeBWorK software and problem libraries; WeBWorK requires Perl, Apache, an SQL server (MySQL), LaTeX and dvipng, plus a few other applications (all free). Thereafter all course administration is web-based.

Since Christmas, we have been running a pilot using volunteer Physics students taking the first–year *Maths for Scientists* module. Initially we plundered the very large collection of question banks to create a sample assignment based on the first term's material (mainly calculus), and subsequently provided a second assignment of home-made questions on the second term's content (including linear algebra). The student feedback is currently being analysed, and we have begun to create assignments to be used by the whole cohort of Physics students taking the *Maths for Scientists* module next term.

As this entry is growing rather long, I will take a break now and discuss *STACK* and *Mathletics* later.