February 08, 2006
Feed back! I make no apologies for plugging "Feedback" again.
"Feedback, Feedback, Feedback", as Tony Blair might have said, is at the heart of any contract between teacher and learner. And providing it is one of the things computer-aided assessment can be really good at.
Here is a quote from a paper entitled Recent Developments in Setting and Using Objective Tests in Mathematics Using QM Perception, presented by E. Ellis, N. Baruah, M. Gill and M. Greenhow at the 9th International CAA Conference in Loughborough last year.
One of us (Martin Greenhow) initially worried that so much feedback was being made available to students that they would simply ignore it. The results of this study clearly show that extensive feedback is welcomed by, and has a positive effect on, most students. Some students requested even more feedback. In effect, the questions are being used as a learning tool alongside, or even instead of, lectures and seminars. This could have rather far-reaching consequences: question designers should focus much of their attention on feedback, the curriculum needs to make time for students to attend to it and the assessment criteria need to reward such student engagement.
Of course, it is one thing to provide feedback, another to ensure that it is acted upon. Encouraging students to make good use of feedback is one of the aims of the FAST Project cited in the related web page.
We started with a crossword clue, and so let's end with one:
Well constructed and square, like a stool perhaps (6 letters)
This scatological clue is attributed to Ximenes (and, as usual, culled from The Week magazine). Ximenes was the crossword pseudonym of Derrick Somerset Macnutt, who was Head of Classics at Christ's Hospital. A Housie friend of mine said he would regularly set his class a stiff translation while he got on with his weekly puzzle for the Observer newspaper.
January 06, 2006
This blog's title is an anagram of David Paul Ausubel, the educational psychologist who made serious play of the fact that you can't teach someone effectively until you know what they already know (and, by implication, don't know). In Educational Psychology: A Cognitive View (1968) he wrote: "The most important single factor influencing learning is what the learner already knows. Ascertain this and teach him accordingly". His dictum is often quoted but rarely acted upon, and I see no harm in running another advertisement to build up its value.
When students submit to any kind of assessment, they reveal information about what they know and don't know. This is often valuable information and usually it goes to waste (think of all those exams where the only feedback is a mystery number between 1 and 100). Does computer-aided assessment (CAA) offer any remedies? Can it find out "what the learner knows" and then act accordingly? I believe the answer is "yes" and will try to convince you of this with a couple of simple examples.
If a factual(ish) question (eg Which philosopher might have said "Blogo ergo sum"?) is wrongly answered ("Renée Zellweger" perhaps), the same question can be repeated in a later test, with a friendly admonition in the feedback for a second wrong answer. Persistent weaknesses over a series of formative tests can be routinely reported back to the student.
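The bookkeeping behind this is trivial for a computer. Here is a minimal sketch (the class and method names are my own invention, not those of any particular CAA package) of a question bank that re-queues missed items and escalates the feedback on a repeat mistake:

```python
from collections import defaultdict

class FormativeQuiz:
    """Sketch: re-ask wrongly answered questions and give a
    friendlier-but-firmer admonition on a repeated mistake."""

    def __init__(self, questions):
        # questions: dict mapping question text -> correct answer
        self.questions = questions
        self.wrong_count = defaultdict(int)

    def mark(self, question, answer):
        if answer == self.questions[question]:
            return "Correct!"
        self.wrong_count[question] += 1
        if self.wrong_count[question] == 1:
            return "Not quite - this one will come round again."
        return ("A friendly admonition: you have now missed this "
                f"question {self.wrong_count[question]} times.")

    def to_repeat(self):
        # Persistent weaknesses, reportable back to the student
        # over a series of formative tests
        return [q for q, n in self.wrong_count.items() if n > 0]

quiz = FormativeQuiz(
    {'Which philosopher might have said "Blogo ergo sum"?': "Descartes"})
print(quiz.mark('Which philosopher might have said "Blogo ergo sum"?',
                "Renée Zellweger"))
```

A real item bank would also randomise the order and spacing of the repeats, but the principle is just this: remember what went wrong and act on it.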
For conceptual questions, CAA can do even better. We will take a simple example to illustrate how computer assessment can identify a student's problem and try to deal with it. Consider the arithmetic question:
Add one third to one half and express your answer as a fraction (i.e. a number of the form m/n).
There are many reasons for getting the wrong answer: a simple error of calculation (unlikely here though), or a failure to understand
(i) the nature of fractions (both as numbers and processes — does the symbol 1/2 mean the number 'one half' or the process of dividing by 2?) and
(ii) the rules for calculating with them.
Let's suppose a student submits the wrong answer 2/5.
It's a fair guess that they have used the following wrong rule (mal-rule) of adding the numerators and the denominators: a/b + c/d = (a+c)/(b+d), which here gives (1+1)/(3+2) = 2/5.
We could reinforce this guess with another example, say two-thirds plus a fifth. If they come up with three-eighths, we can be pretty certain we've sussed out the mal-rule they're using.
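Detecting the mal-rule automatically is straightforward. As an illustrative sketch (my own code, not Mathletics): compare the submitted fraction against both the correct sum and the mal-rule's prediction, and target the feedback accordingly.

```python
from fractions import Fraction

def diagnose(a, b, c, d, num, den):
    """Mark the question a/b + c/d, where the student answered num/den.
    Recognise the 'add tops and bottoms' mal-rule."""
    correct = Fraction(a, b) + Fraction(c, d)
    submitted = Fraction(num, den)
    if submitted == correct:
        return "correct"
    # The mal-rule predicts (a+c)/(b+d)
    if submitted == Fraction(a + c, b + d):
        return "mal-rule: added numerators and denominators"
    return "wrong, cause unclear - probe further"

print(diagnose(1, 3, 1, 2, 2, 5))  # the guess above: 1/3 + 1/2 = 2/5?
print(diagnose(2, 3, 1, 5, 3, 8))  # the confirming example: 3/8
```

Note that for one half plus one half the mal-rule's prediction 2/4 equals 1/2, which is exactly the "fluke" exploited in the next step.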
Next we start to generate doubt by getting them to apply their rule to one half plus one half (giving two quarters, which one hopes they will know equals one half). Something funny going on here! A fluke exception? A further example or two of the same kind should settle the matter.
Having thoroughly undermined their faith in the mal-rule, we eventually return to the drawing board, asking the student how many sixths make a half (three) and how many sixths make a third (two). If the penny drops that three of a kind plus two of a kind is five of a kind, the following equations take on meaning: 1/2 = 3/6, 1/3 = 2/6, and so 1/2 + 1/3 = 3/6 + 2/6 = 5/6.
More examples will be needed to elicit a thorough understanding of the general rule for adding fractions (including unlearning the mal-rule) and yet more practice for the student to feel comfortable applying the rule as a fast and accurate reflex.
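Generating that practice mechanically is easy. As a hedged sketch (the ranges and function name are my own choices), the computer can pick random fractions and attach the full common-denominator working as feedback on every item:

```python
import random
from fractions import Fraction
from math import lcm

def practice_item(rng):
    """Generate one fraction-addition question together with
    fully worked common-denominator feedback."""
    b, d = rng.randint(2, 9), rng.randint(2, 9)
    a, c = rng.randint(1, b - 1), rng.randint(1, d - 1)
    m = lcm(b, d)  # lowest common denominator
    working = (f"{a}/{b} + {c}/{d} = {a * (m // b)}/{m} + {c * (m // d)}/{m} "
               f"= {a * (m // b) + c * (m // d)}/{m}")
    return (a, b, c, d), Fraction(a, b) + Fraction(c, d), working

rng = random.Random(0)
question, answer, feedback = practice_item(rng)
print(feedback, "=", answer)
```

Randomised parameters mean the student can drill until the correct rule becomes the fast, accurate reflex described above, with the worked steps available on every attempt.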
What I have just described is only one possible CAA response to just one of many misconceptions or mal-rules that may be revealed in the simple exercise of adding a half to a third. A student may be guessing or may have mislearnt the rule at an earlier stage of education. An effective face-to-face (human) tutor will patiently probe a student's mistakes, identify the root causes of their failures, then painstakingly rebuild their knowledge and understanding on a sound foundation. Intelligent computer assessment can be programmed to do the same, and it is nothing if not patient and painstaking.
November 28, 2005
As my source of inspiration pointed out while administering dinner to our 2-year-old Cameron last night, the blog title applies as well to "student feedback" as to "revenge". Effective formative assessment not only provides feedback to the assessor but, more importantly, gives fresh food for thought and enlightenment to the one being assessed.
A week is a long time (as Harold Wilson famously said of life in politics) to wait to find out where you went wrong; even a day later, your brain has probably gone cold. But a computer can tell you within microseconds.
So here's a situation where computer-aided assessment (CAA) can have a clear edge over traditional marking; I say "can" because, to gain the edge, you must take the trouble to design the CAA questions intelligently and to PROVIDE THE APPROPRIATE FEEDBACK (always assuming your software allows it).
I have been trawling through a lot of CAA software lately and have been disappointed by the perfunctory nature of the feedback provided in many samples that put the software through its paces (for instance, a single tick or a cross in response to a set of answers to a 6-part question). But, of course, there are beacons of good practice too. Here are two that caught my attention:
FAST (Formative Assessment in Science Teaching) — A collaboration between the Open University (OU) and Sheffield Hallam University (SHU) aiming to improve student engagement and learning through formative assessment following these principles. The Project has a Science focus (Biosciences, Chemistry, Physics) and is funded through the HEFCE Fund for the Development of Teaching and Learning (FDTL4). There are 30 development projects: 15 at the OU and SHU, and 15 more at 13 other HE institutions.
Mathletics (near the bottom of this link's page) — Among the many features of Martin Greenhow's approach to online assessment that commend themselves is the attention given to the pedagogy of question setting and providing detailed feedback. Using his CAA software for modules at Brunel, Martin discovered, surprisingly, that the feedback is used by some students as their main learning tool. Martin uses the idea of 'mal-rules' (reflecting common conceptual errors) to generate plausible distractors in multiple-choice or multiple-response questions, and more generally uses students' answers to make informed guesses at their misconceptions and provide targeted feedback.
Please extend this list with other examples of good assessment pedagogy.
Unfortunately the inspiration dried up when I was prevailed upon to take over the feeding of the aforesaid Cameron, slotting in spoonfuls as he wielded a sticky mouse to direct Adiboo's onscreen antics. Just as well this is not a blog on good parenting.