Here’s the initial plan for the departmental impact analysis of e-learning initiatives, part of the e-learning Pathfinder evaluation workpackage. See the attached document for appendices. The list of names is tentative and subject to invitation.
Comments/suggestions on the plan, questions or people to interview?
This work is being undertaken between February and September 2007 by Terry Wareham, Fourstones evaluation & consultancy for HE, in response to a request from Dr Jay Dempster, Deputy Director of the Centre for Academic and Professional Development (CAPD), who is leading the evaluation component of the University’s HEA-funded e-learning ‘Pathfinder’ project. As agreed, the data gathering and analysis will be conducted over 15 working days, covering initial discussion, planning and reading, preparation of the interview schedule and arrangement of appointments, data analysis, report drafting and presentation.
Concurrently, a university-wide web-based survey of academic staff (see appendix 4) has been developed to capture the spread of e-learning activity across the institution. These data will be reported separately.
a) identify the impact of current and new e-learning projects and initiatives on the student learning experience, using an approach that emphasises appreciative enquiry in order to draw out the positive benefits of specific approaches in particular departmental contexts
b) make draft recommendations to inform the development of institutional strategic goals for e-learning
c) provide a model for evaluation of e-learning developments
The evaluation will use two methods:
a) gathering of existing data on departmental impact from the WELA programme, on uptake and outcomes of initiative funding at departmental level, and on participation in and outcomes of the Elab faculty activities
b) interviews with staff at a range of levels in 6 departments across 4 faculties, together with further interviews with a sample of staff who provide or support development activities for e-learning (see appendix 3).
The sample of staff will be selected to ensure a range of engagement with e-learning initiatives: it will include those who have made changes at programme and module levels, and also those who have yet to embed e-learning approaches in their teaching.
The data will be analysed thematically using a process of constant comparison. Where comparable data are available from other institutions these will be used in the analysis as appropriate to draw out indications for future recommendations.
a) Draft report to be submitted to CAPD by 17th August 2007
b) Final report to be submitted 14th September 2007
5. Phases of the evaluation
Phase 1 (February to May 2007)
Initial scoping of the work, establishment of the list of interviewees, and background reading of institutional documentation and existing data.
Phase 2 (May to June 2007)
Interviews with identified sample of participants.
Phase 3 (June to August 2007)
Data analysis and report drafting.
Phase 4 (September 2007)
Completion and submission of report.
Note that phases overlap.
6. Interview protocol
Interview subjects will be contacted with a request for an interview lasting 30–45 minutes. Interviews will be confidential so that subjects feel comfortable speaking about their experiences and views, and all efforts will be made to ensure that particular views cannot be ascribed to particular individuals in the final report. Interview data will remain confidential to the consultant conducting the research. With the subjects’ permission, the interviews will be recorded in order to facilitate data analysis.
The interviews will be dialogic and thus semi-structured. They will be iterative in the sense that themes that emerge in earlier interviews will be tested with later subjects where appropriate. It is important that interview subjects have the opportunity to develop ideas and views that were not necessarily envisaged in the initial interview schedule.
A list of questions is given in appendix 3; these will provide the semi-structured framework for the interviews.