February 27, 2006
Alan Schofield (our consultant) met with Michael Whitby, Darren Wallis, Julian Moss, Graham Lewis, Hywel Williams, Jay Dempster and Steve Brydges for the first on-site consultancy visit. As John posted in his last entry, an earlier meeting had already decided how to share out the tasks in the Institutional Review Document, and some people have already made a start on gathering the statistics and departmental input, so this meeting was more about picking up on any concerns and reminding ourselves of the wider purposes of benchmarking.
It was a helpful introduction for me in particular, since I had been getting rather tied up in the definition of e-learning, and was concerned that without a reasonably clear consensus about what we were measuring, the statistics could become meaningless. There could even be a risk that, if we included all sorts of IT infrastructure spend under the e-learning heading (our e-learning architecture is well integrated with the range of IT systems, services and networks), we would come out significantly higher than the sector average. But the same is true of staff development, and even the curriculum itself. It may even be an attribute of successful e-learning that it is hard to disentangle: e-learning that is easy to identify as a discrete element is probably just a misguided innovation that failed to embed itself.
So accurate budget attribution is hard, but the good news is that's further down the road for benchmarking. The Pilot Phase is all about trying to capture the essence of what is happening, in an open and honest manner, and the statistical requirements are simply one measure, to which we can add as many footnotes as we feel worthwhile. Some of the other sections, where we are asked to reflect in a more narrative style, will give us the opportunity to test and document our aspirations and practices in e-learning, and to use that as a basis for sharing, reflection and collaborative improvement.
Of course this comes at a time when the e-learning strategy is being reviewed in any case. I hope departments that recently responded to the request to offer feedback through Information Groups will be happy to respond again to similar questions for benchmarking. Much of the IRD can be completed centrally, but it would be incomplete unless it captures the perspective of the departments, and that perspective must include not only the success stories, which the centre probably knows about anyway, but also the smaller, less obvious pieces of e-learning that may not even be commonly recognised as such. We also need to measure and describe uncertainty, hesitancy and the extent to which other academic pressures hinder e-learning innovation.