November 05, 2009
Writing about web page http://www.ithaca.edu/myhome
At a session about building a portal, I was struck by the similarities between the presenters’ institution – Ithaca College – and our own setup. They have three groups governing their web presence:
- Their web strategy group has oversight. This is a high level group with VPs, Marketing, Admissions, Provost’s office, etc.
- IT Services has technical leadership and hosts the institutional web site(s)
- Marketing & Comms shares responsibility with ITS for brand, high level content, UX, etc.
They have a richly functional and well-populated CMS which they built themselves, and a year or so ago they decided to build a portal to accomplish the following:-
- Provide a home for a person’s (not the institution’s) activities. User has complete control over portlets, tabs, etc. – except for the “message center” portlet which is mandatory. The Comms Office control what appears in the Message Center.
- Provide a single entry point leading to other resources
- Improve communications between institution and students
- Make transactions easier and information easier to find
- Make a lightweight system that reuses as much as possible of existing web services & content.
A fairly similar set of circumstances to our own. What they built was a PHP / mySQL based application which uses the iGoogle portlet standard to deliver the following:-
- Drag & drop UI for selecting & arranging content. (Choosing a background colour for each portlet turns out to be surprisingly popular and well used.)
- The portal is a single sign-on participant, so starting in the portal means you won’t need to sign in to move on to other apps, and data can be pulled from other apps without needing to reauthenticate.
- Webmail & calendar views in the portal (in fact, the only access to webmail is via the portal, to drive traffic)
- Access to third party email accounts (Yahoo, Gmail, IMAP)
- Lots of portlets for non-institutional data – Facebook, Digg, Reddit, Twitter, RSS Feeds, etc.
- Search portlet shows results inline for people, web pages, blogs, etc.
- “Service tabs” are whole-page applications (eg. change your password, see your calendar).
- Users can publish and share their tabs with others if they’ve made a useful combination of things.
- There’s a very Facebook-like gadget which shows you who else is online, their status updates, comments on other peoples’ statuses, their photos, etc. You can define who your friends are just like Facebook.
- Mobile-optimised rendition (webkit optimised) – mobile home page is a list of portlets, then each portlet gets its own mobile-optimised screen. Similarly, there’s an accessibility-optimised rendition of the portal.
What’s striking about this to me is that they reached a different conclusion from the one our own thinking has so far led us towards. Their portal at present doesn’t have access to much institutional information about the individual. So there are no gadgets for “My modules” or “My timetable” or “My coursework”. The gadgets are fundamentally just news, email and external. They hope to add gadgets which can display institutional data, but there’s back-end plumbing which needs to happen first (again, sounds kind of familiar). Until I saw this presentation, my take was that you absolutely had to have those institutional data gadgets to succeed. But the Ithaca portal has achieved the astonishingly high take-up rate of 80% of the members of the university visiting it at least once per day. Without institutional data. It’s given me pause for thought.
Ithaca have an excellent micro-site intended for people who are interested in their portal but who aren’t members of the university. See for instance the home page, some video tutorials, the presentation from today, and some usage stats.
November 04, 2009
Writing about web page http://www.educause.edu/E2009
I’m in Denver for the Educause conference. It’s probably the biggest IT-in-HE conference in the world, and whatever you’re interested in – e-learning, cloud computing, weathering the downturn – it’s a safe bet that there’ll be a session on it here.
I was last at the conference five years ago (also in Denver, coincidentally), and that time, one of my main interests was in helpdesk software, and I hoped to use the conference as a lever to try and persuade my colleagues that we should switch away from HEAT, which I thought then (and still think) was a bit of a dog’s breakfast. It’s taken five years for that particular plan to come to fruition, so I guess I should be cautious about what I might accomplish this time around. But if nothing else, there’s a bunch of people talking about things that Warwick is very much interested in right now, and, for as long as my laptop battery lasts, I’ll be taking notes which hopefully might prove useful to us in the future.
Some of the sessions I have my eye on include:-
Cloud Computing: Hype or Hope? Does this paradigm offer great promise or extreme peril to the core mission of the academy? Two academic IT leaders will debate the pros and cons of moving mission-critical services to the cloud.
Revisiting Your IT Governance Model. Four years after adopting an inclusive IT governance and prioritization process, we’ve completed 188 projects, spending $8.4 million and expending 250,000 hours. We will describe the history of our governance, our recent governance process review, and how the process has evolved to create a collegial and transparent method for prioritization.
Blackboard, Moodle, and Sakai. A discussion of the pros and cons of adopting proprietary versus open-source solutions. Issues addressed will include total cost of ownership, licensing, options for application hosting and technical support, and how new features find their way into a product.
Virtual Desktops: 60 Percent Cheaper, but Are They Worth It? Pepperdine University is conducting a 12-month study to assess the costs and feasibility of replacing desktop computers with virtual machines that allow multiple people to share a single PC. Results from a pilot implementation will be shared, revealing costs, power usage, user satisfaction, ease of administration, and recommendations for installation.
Project Management and IT Governance Through Agile Methods. Decision making within IT governance and project management is commonly driven by hierarchical, centralized, and formal planning. Agile Methods, adopted at SUNY Delhi, focusing on openness, transparency, self-organizing groups, collaboration and incremental development, deliver continuous innovation, reduced costs, and delivery times, as well as more reliable results.
IT Metrics. This session focuses on developing, collecting, and reporting IT metrics, leveraging peer efforts, and identifying benchmarks to improve the overall performance of IT departments. Frequently used metrics are customers’ feedback on IT services, balanced with internally recorded metrics of actual customer IT services usage. A constant goal of this group is to assist others in implementing metrics in a more rigorous, meaningful, and timely manner.
What Happened to the Computer Lab? Over 80 percent of respondents to the annual ECAR study of undergraduate students reported owning laptops; nevertheless, usage of expensive public computer labs remains high. Panelists from three institutions will lead a provocative discussion on updating existing computer labs.
The Heat Is On: Taming the Data Center. Power and cooling continue to be “hot” topics in the data center. Many vendors offer green solutions and products. Should an organization retrofit or build a new data center to meet increasing demands? This session will focus on some strategies to manage data centers more effectively.
Building a Research IT Division from the Ground Up. The nature of the research enterprise is changing rapidly. Large-scale computing, storage, and collaboration needs are now common. We describe how we scoped and funded a central IT division focused on research IT support to address these needs, and the successes and challenges we encountered along the way.
Ignorance Isn’t Bliss: How to Find Out What Your Clients Really Need. Providing IT tools and resources that meet client expectations requires persistent and creative efforts to understand their needs. Six presenters in this session will describe surveys, face-to-face discussions, and other means of learning about client needs and how to effectively follow up on those expressed needs.
According to my scribbled notes, I’m aiming to attend 16 sessions in two days, so I expect to get progressively less coherent as time progresses. Let the Powerpoint begin!
October 31, 2007
An idea which keeps popping up when I talk to my colleagues both within ITS and also within academic and administrative departments is that lots of people want a Document Management System (DMS for short from now on) to support their work. So I’ve agreed to try and pull together some kind of summary about what it is that people actually mean when they say DMS – do they all mean the same thing? – and thus what kind of system we might be looking for.
As luck would have it, I know nothing about what a DMS is or does, nor am I very aware of what products there are in this space. This is either a gross disadvantage or a refreshing lack of preconceptions allowing for open-minded consideration of the issues. However, some of the problem domains are pretty easy to understand: archiving of documents we need to keep around for legal or business reasons; users who want to work together on authoring documents; users who want a record of the history of a document. So my discussions and reading so far lead me to believe that a DMS might encompass some, all, none or fewer of the following:-
Document creation and editing
A DMS should support the creation and editing of documents using the same desktop applications which people already use. So you should be able to create and edit your Word documents, your Photoshop images, your Autocad drawings, etc. just as you do at the moment, but storing them in the DMS instead of on your local hard drive or a networked hard drive. An implication of this is that there would have to be some way to connect to the DMS directly from your desktop; it would add too much friction if you had to use a web app to download the document, edit it, save it to your hard disk, then re-upload it. You’d need to be able to open the document directly from the DMS. This would suggest that you’d need SMB or CIFS or WebDAV and possibly SFTP support, especially if access in this way was also supposed to work off-campus (for when you’re at home, or when you’re collaborating with someone at another university) as well as on.
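To make the “open directly from the DMS” idea concrete, here’s a rough sketch (Python, purely illustrative – the server name and folder path are invented) of the kind of request a WebDAV-aware desktop client sends to list a folder: a PROPFIND with a Depth header, as defined in the WebDAV specification. Nothing is sent over the network here; the function just builds the raw request text.

```python
def propfind_request(host: str, path: str, depth: int = 1) -> str:
    """Build a raw WebDAV PROPFIND request for listing a DMS folder.
    Depth: 1 means "this folder plus its immediate children"."""
    body = (
        '<?xml version="1.0" encoding="utf-8"?>'
        '<D:propfind xmlns:D="DAV:"><D:allprop/></D:propfind>'
    )
    return (
        f"PROPFIND {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        f"Depth: {depth}\r\n"
        f"Content-Type: application/xml\r\n"
        f"Content-Length: {len(body)}\r\n"
        "\r\n"
        f"{body}"
    )

# A hypothetical DMS folder; the hostname is made up for illustration.
req = propfind_request("dms.example.ac.uk", "/committees/2007/minutes/")
```

The point is that this is plain HTTP underneath, which is why WebDAV tends to travel through firewalls (and home broadband) more happily than SMB/CIFS does.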
And in order to properly support editing in this way, you’d also need to be able to lock documents for editing so that while I’m editing it, you can’t. And you might want different sorts of locks; a lock which says “I have this document open for editing right now” is one sort, but you might also want to be able to say “I’m going to be working on this document, on and off, for the next week. Nobody else should be able to change it until a pre-defined time comes around, or until I explicitly signal that I’m done with it.” And implicit in this idea is the idea that you should be able to set permissions on your files and folders to control who can see them, edit them, comment on them, allow others to edit them, etc.
Another feature you’d want over and above what you get from a normal file system is version history. As changes are made to a document in the DMS, metadata about the changes should be stored so that it’s possible to see a history showing who changed the document and when, and you might also want to store all the previous versions of the document if you had disk space to burn.
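The locking and version-history behaviour described in the last two paragraphs can be sketched together. This is a toy model (Python, all names invented for illustration, not based on any particular product) of a document that supports both a quick editing lock and a longer reservation, and records who changed it and when on every save:

```python
import datetime

class DmsDocument:
    """Toy model of DMS locking and version history."""

    def __init__(self, name):
        self.name = name
        self.content = None
        self.locked_by = None
        self.lock_expires = None   # None = lock holds until explicitly released
        self.history = []          # (version, author, timestamp, content)

    def lock(self, user, until=None):
        """Take a lock: either a plain editing lock, or (with `until`)
        a reservation that holds until a pre-defined time or release."""
        now = datetime.datetime.now()
        expired = self.lock_expires is not None and self.lock_expires <= now
        if self.locked_by not in (None, user) and not expired:
            raise PermissionError(f"{self.name} is locked by {self.locked_by}")
        self.locked_by, self.lock_expires = user, until

    def release(self, user):
        if self.locked_by == user:
            self.locked_by = self.lock_expires = None

    def save(self, author, content):
        """Record who changed the document and when, keeping previous
        versions around (disk space permitting)."""
        if self.locked_by not in (None, author):
            raise PermissionError(f"{self.name} is locked by {self.locked_by}")
        version = len(self.history) + 1
        self.history.append((version, author, datetime.datetime.now(), content))
        self.content = content
        return version

doc = DmsDocument("annual-report.doc")
# A week-long reservation, then two saves by the same author
doc.lock("alice", until=datetime.datetime.now() + datetime.timedelta(days=7))
doc.save("alice", "First draft")
doc.save("alice", "Second draft")
```

Even this toy version shows why the feature matters: anyone else attempting to lock or save the document while the reservation holds gets refused, and the history answers “who changed this, and when?” for free.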
Archiving and lifecycle management
Once you’re done editing your document, a feature which several people have mentioned is the ability to archive it – a place to store your documents which is secure and stable and allows for long-term storage of a “frozen” unchangeable version of a document. There are lots of documents which make the transition into this state – committee agendas, minutes and papers, annual reports, blueprints, etc. There’s also an interesting question about the lifecycle of archived documents: some documents, notably those which contain personal data, may not be stored for longer than is required to perform the work of the institution. So documents like that may have archiving rules such as “Keep for five years, then delete”. Others may be “Keep indefinitely” – but that raises challenges of its own, since it implies that your storage requirements for your DMS are going to rise every year. And how long is it reasonable to assume that “indefinitely” means? Our Estates office have paper documents going back forty years. Is it reasonable to try and design a DMS to store documents for that kind of period? What’s the lifespan of a given document format (eg. Word or Autocad)? Five years? Ten?
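The lifecycle rules above lend themselves to a simple sketch. The categories and periods here are invented for illustration (personal data carrying a “keep five years, then delete” rule, minutes kept indefinitely), and a real policy engine would use exact calendar arithmetic rather than the rough 365-day year used here:

```python
import datetime

# Invented retention policy: years to keep, or None for "keep indefinitely"
RETENTION_YEARS = {
    "personal-data": 5,
    "committee-minutes": None,
}

def should_delete(category, archived_on, today=None):
    """True once a document's retention period has elapsed.
    Uses a rough 365-day year purely for illustration."""
    today = today or datetime.date.today()
    years = RETENTION_YEARS.get(category)
    if years is None:
        return False  # keep indefinitely
    return today >= archived_on + datetime.timedelta(days=365 * years)
```

The hard part isn’t the rule evaluation, of course; it’s deciding the rules, and whether “indefinitely” can really mean forty years.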
Sharing, publishing and retrieval
Once you’ve put a document into your DMS, you’re likely to want to share it with some people, and you need to be able to find it again later. So you need the same kind of permissions system that you need for editing purposes, but for viewing purposes. And, equally importantly, you need to be able to find your document, and possibly you need other people to be able to find it too. Web sites tend to allow browsing through a hierarchical structure, but a DMS may or may not work in that way, so good indexing, searching and metadata become important. The metadata is particularly relevant because not every file type contains content which can be indexed and searched; if the file I’ve uploaded is an image, then it’s effectively unsearchable unless I also supply a description or some keywords alongside it. (This is particularly important if what you plan to do is scan lots of paper documents and add them to your DMS; unless you intend to do OCR – a slow and expensive proposition – then what you’ll have is effectively just an image, so it’ll only be discoverable if its metadata is good enough.)
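The point about metadata being the only route to non-text content can be shown in a few lines. In this sketch (Python, sample documents invented), full-text search works for the text file, but the scanned image is findable only through the keywords supplied alongside it:

```python
# A text document is searchable by content; a scanned image is not,
# so it can only be discovered via its metadata keywords.
documents = [
    {"name": "minutes.txt", "content": "library budget discussion",
     "keywords": []},
    {"name": "blueprint.tiff", "content": None,  # image: nothing to index
     "keywords": ["library", "floor plan"]},
]

def search(term):
    """Naive search over indexed content plus metadata keywords."""
    hits = []
    for doc in documents:
        text = (doc["content"] or "") + " " + " ".join(doc["keywords"])
        if term in text.lower():
            hits.append(doc["name"])
    return hits
```

Strip the keywords from the blueprint and it vanishes from every search – which is exactly the fate awaiting badly catalogued scans.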
An interesting extra wrinkle which some people have mentioned is that once you’ve got the ability to share documents and edit them collaboratively, then you might want other tools to help your collaboration too. So if you and I are working on a research paper, or a design for a new Library, then as well as the documents we’re creating, perhaps we’d also like to-do lists for the participants; maybe a calendar to show due dates, or Gantt charts, or ways of leaving messages for each other like a mini discussion forum. By this point, I think, you’ve moved beyond pure DMS into a different space. But I can see how, in people’s minds, the two spaces might be logically linked, and if you’re doing one, you might well want to do the other around it.
Some things which I think are probably out of scope for our purposes are:-
- Real-time collaborative editing; two or more people working on a document at the same time, able to see each other’s changes live on their respective screens. GoogleDocs lets you do this for word processed documents and spreadsheets, but short of building our own web apps to do the same, I don’t think this is something you could easily get from a DMS; it would need your editing applications – your word processor, your spreadsheet tool, etc. – to support this kind of editing explicitly, and I don’t think many of them, if any, do. Users who want this feature should probably be directed towards GoogleDocs or Zoho or whatever.
- Workflow. Some papers I’ve read have suggested that a DMS could be the tool by which you define and enforce your workflow for certain types of document. So if I create an invoice within the DMS, the system knows that because it’s an invoice, and it’s for more than £5K, it should go first to my head of department for approval, and then to the finance office for processing, and a record should be created in SAP, and so on, and so on. I can see how this could be useful, but I don’t think there’s any realistic chance of implementing this in our very diverse and decentralised environment. So perhaps workflow support is out of scope.
- Document scanning. At least some of the people who want a DMS want to scan lots of paper documents and put them into it. My presumption at the moment is that the scanning and possible OCR work would be a separate project to the creation of a DMS, and the DMS engine wouldn’t particularly distinguish, or have additional support for, scanned documents as opposed to documents which were fully digital.
- Records management. I’m not as sure about this as I am about the other exclusions, but it seems to me that records management, where you have a set of documents which are all in the same, highly structured format like, er, records (in a database), is a niche of its own within document management, and the general purpose nature of a system which allows you to edit and store any kind of file may not be sufficient for more highly structured data.
Phew. So, one of the first questions which occurs to me is, are all these activities really the same in the sense that a single application could or should support them? Or is long-term archiving conceptually and technically different from shared editing? I guess that’s something which might become clearer once we start to consider possible products in this space, though again, I don’t know much about what products there are or what their individual strengths and weaknesses are; several people I’ve spoken to so far have explicitly suggested Microsoft SharePoint as being what they’re thinking of when they say ‘DMS’; other products I’ve heard mentioned include Alfresco as a sort of open source SharePoint, FormScape (which I believe may already be in use within the Finance Office), and Documentum as the sort of heavyweight market leader in this space. One thing we’d need to watch out for with most off-the-shelf systems is that they generally claim to do a lot more than just document management; SharePoint in particular is like a sort of Swiss Army knife of a server, claiming to do document management, web content management, portal management, project management and for all I know moon landing management. We’d need to be sure that we could wall the application off so that it doesn’t offer features which would compete with tools which we already use.
Another possibility is to consider whether we could build something for the job, or adapt one of our existing applications. Files.Warwick is probably the closest to what would be required, though it lacks version history, check-in/out, the idea of a “locked” archive, and most challengingly of all, it lacks a way to connect to it directly from your desktop (other than FTP). But it does have some of what’s needed; granular permissions, easy sharing, notifications and so on. But then I also wonder if the name recognition which SharePoint, in particular, seems to enjoy, would be important in that anything which isn’t SharePoint runs the risk of being rejected on that basis alone.
Anyway, we’re a long way from package selection. But it does seem as if some sense of what we might be looking for is starting to emerge from the mist.
October 04, 2007
Well, the show’s now over and having been to more sessions than I can count, had a bunch of interesting conversations and seen some impressive applications, it’s time to try and wrap up and summarise what I’ve learned or concluded. It’s important to remember, I guess, that this is a vendor show. It’s not like ETech, where nobody’s trying to sell anything; the objective of this show is to persuade people to use Adobe technologies in preference to other tools. So one needs at least a pinch of the cynicism most elegantly expressed (was it Paxman?) in the famous line “What I have to ask myself all the time, is, Why is this lying bastard lying to me?”. I don’t think anyone lied to me as such, but inconvenient questions (AIR on Linux, anyone?) were somewhat glossed over.
My previous entries have mostly been “live” from a session, so they’ve been short and factual about what I was seeing at the time; now it’s time to reflect a bit more. There are several topics I’ve been thinking about, some based on what I wanted to learn about before I set off, some arising from the sessions I saw and the emphasis that Adobe themselves and other presenters were putting on different products, technologies or applications.
A no-brainer. I thought before I even came to the conference, and I feel even more strongly now, that video (and audio) playback through the Flash player is the right choice for us to make. At the first day’s keynote, Kevin Lynch mentioned the statistic that over 70% of web video is now in Flash (FLV) format (driven largely by Youtube and the other video sharing sites, presumably). The move to support H.264 video just makes it all the more obvious that this is the best way to deliver video compared with Windows (WMV) or Mac (MOV) formats. We may even be approaching the holy grail of one format to rule them all – H.264 video playing back in the browser, on consumer devices such as iPods or set-top boxes, on mobile phones, everywhere. There would still be questions and challenges relating to multiple encodings for different bandwidths and resolutions, but it’s a lot easier to contemplate multiple encodings all in the same format than multiple file formats.
Here the news is mixed. I saw some outstandingly rich and attractive applications, and I have invites to a couple of the ones I considered to be amongst the best – Buzzword and SlideRocket – which I look forward to demoing to people when I get back into the office. But, in completely unsurprising news, there were plenty of other applications which were clunky or slow or hard to figure out. Why should Flex be any different, right? And there are behavioural quirks with even the best of the apps; keyboard support is unpredictable, with even apparently basic things such as arrow up and down to scroll a block of text working in some places and some apps, but not others. Ctrl-Plus to make your text bigger (something my aging eyes are coming to depend on more and more) comes for free in HTML pages, but doesn’t work in Flex apps. None of these things are insurmountable, I expect, and I do think that if you were planning to build an app which you wanted to look and feel as much as possible like a desktop app, Flex would certainly be a tool to consider. But I didn’t come away with the feeling that there would be a big win to be had by converting parts of our existing apps into Flex-based interfaces, or doing any of our near-term planned work (video recording / playback / conferencing excepted) in Flex.
Having spent some time talking to developers working on relatively large-scale projects, I’ve come away reassured that there are plenty of people developing in Flex in ways that we would recognise, using Subversion or something similar, writing unit tests and build scripts, sharing development across a team. Should we decide one day that we wanted to make something in Flex, I see no reason why we shouldn’t be able to develop applications in exactly the same way as we currently do in Java.
A big theme of the conference was the usefulness of disconnected working; the idea that by using an appropriately designed AIR app, you could have your data both on the internet for ubiquitous access, but also on your desktop for when you aren’t connected. It’s not just Adobe who are pushing this idea, either; Google Gears and Firefox 3 both do similar things. But what I’m not sure about is how useful this would be for Warwick. The example that was invariably given in the sessions which talked about this is that of a company with a salesforce, who need to be able to take their presentations and spreadsheets and whatever out on the road with them so that even when internet access is unavailable or unreliable, they still have everything they need to annoy people with. But are we like that? I would guess that 95% of our people are connected 95% of the time. And when they’re not connected, how much of the data we can make available would be useful to them? I’m not saying that we shouldn’t make data available offline where it’s quick and easy for us to do so (and we do already, I guess, with Pages-to-go). But I don’t know whether it’s as big a win for us as it might be for other organisations.
A theme which came out quite strongly across a number of sessions was the importance of your application playing nice with other people’s applications. Plenty of people talked about how it would be possible to, say, send your data to everyone on your SalesForce contacts list, or get notifications on your Google home page or whatever. It’s not an especially Flash-focussed question, but I do think it’s an important issue which we need to keep in mind. We do a little bit of it already, with SiteBuilder news and calendars being viewable within Google. But looking beyond just web-dev and thinking about all the tools and data which IT Services manages, there’s a long way to go: students can’t get their timetables into Google or on to their phone; staff can’t expose their Exchange calendars to anyone except other staff, and so on. It’ll be interesting to see whether we get pushed by increasing demand from staff and students for more interoperability, or whether we’re enough of a closed eco-system that this won’t happen to us as much as it seems to be happening in the wider world.
I did like what I saw of AIR, the tool to let you convert Flash or HTML applications into Windows or Mac desktop applications. It seems fairly well thought out and although the lack of Linux for now is a bit of a bummer, I do think that it will arrive within twelve months. I think desktop applications to let you interact with our web applications in new ways – upload and download files, write content and then publish it, manage your sites / pages / users with a richer UI than the web one, perhaps – is potentially fertile ground for us to look into (can you look into the ground? Wouldn’t you just see the surface?).
Productivity apps on the internet
A big theme throughout the conference and elsewhere is the idea of moving productivity applications on to the internet. It’s clear that several large companies – Adobe, Google and others – are putting pretty big bets on the idea that having both your application and its data stored and running not on your PC but on a server on the internet, is Going To Be Big. One presentation asserted that this is a change comparable with the way that PCs killed typewriters and then Windows killed DOS; soon the idea that your apps and documents live just on one PC will seem archaic, is the argument. It’s an interesting idea, and there seem to be four distinct benefits being asserted:-
- Ubiquitous access for you to your data. No more not being able to continue working on your draft because it’s on your work PC when you’re at home, or vice versa.
- Easier publication. No longer will publication involve spawning many copies for distribution by email; just open up the permissions, send out an invite and away you go; you’ve published but there’s still a single canonical instance, and you can revoke publication later if you want to.
- More and better collaboration. If you want to work with several people on a document, the model of sending it round by email is clumsy and inefficient. Better to let everyone work on a single copy.
- Easier application upgrades. If the application is on somebody else’s server then upgrades can happen frequently and easily, without the end user needing to download or install or indeed do anything at all.
Against this are concerns about what happens when you’re not connected (but how big a deal is this going to be in the coming years?) and questions of privacy and security; would you be happy with all your documents being on Adobe’s or Google’s servers? Would your employer? But one could see that companies who want to offer this kind of service might be able to find ways around these problems, perhaps at a policy level by committing strongly to privacy and security, or perhaps technically by introducing ways for users to encrypt their data or choose where to store it (nothing to say that the application provider also has to be the storage provider), or even, as Google does with its search appliances, by giving you your own private instance of the application, running in your machine room, but managed by the supplier. But whatever happens, this does seem to be the coming thing; one has only to look at all the people already in this space, or pushing to get into it – Google, Adobe, Microsoft, plus a host of smaller players such as Zoho, Zimbra, Buzzword (until recently), SlideRocket, etc. – to see that there’s a lot of time and effort being invested in this area.
One last thing: if there’s one thing that came across more strongly than anything else throughout the whole conference, it’s that black is the new UI colour of choice. Every damn thing I saw used a black background, with optional accents in graphite or charcoal or carbon. Black, it would seem, is not just the new white, it’s the new any-colour-you-care-to-name.
November 28, 2006
One of the interesting things to come out of the recent email problems was the assertion that somebody should resign or be fired because of them. The most trenchant observation about this probably came from Edward Ryan, who said:-
Has the head of IT resigned yet? If not why not? If he has not he should be sacked immediately.
but there are often suggestions along the same lines, sometimes slightly less directly, whenever there are problems of one sort or another at the university.
I’m curious about this, not in the particular context of IT at Warwick, but in the more general sense of whether summary dismissal is a good, effective or widely used management tool. It seems to me that we might ask both why it would be a good strategy to dismiss somebody, and when (in what circumstances) you might choose to do so.
When I studied law, many years ago, one of the topics in my criminology module was about the reasons for sending people to prison. There were three distinct arguments, as I remember it; prison sentences achieve:-
- Retribution. Our sense of justice or morality requires us to punish wrong-doing.
- Deterrence. Sending people to prison discourages other people from committing criminal acts, and may also discourage the offender from re-offending later.
- Prevention. People in prison can’t commit other criminal acts while they’re locked up.
(There was a fourth argument too; rehabilitation, the idea that going to prison might improve you as a person, making you less likely to re-offend. However, nobody seriously believes that any more, and it’s not relevant to my discussion here, so let’s pretend I didn’t mention it.)
Those three arguments, it seems to me, map quite well on to arguments in favour of firing people when bad things happen. We’re angry at the loss and damage that the bad thing causes, and in our anger we want to see somebody punished for the problem. Or we believe that if we fire people when they screw up, everyone else will try that much harder not to screw up. Or we don’t care about punishment or setting an example, but we just don’t want the person who screwed up to be in a position to do it again, and firing them seems like a good way to accomplish that.
I find each of the arguments unsatisfactory in one way or another, though; punishment doesn’t seem as though it accomplishes anything for the organisation; deterrence pre-supposes that people can choose to perform better – be more competent, more expert, more aware, more responsible – if there are severe penalties for failure. But is this really true? People aren’t generally good at anything other than very short term cause and effect, so if you fired someone, you might expect everyone else to try a bit harder for a little while, but the effect would be very short lived, so you’d have to be firing people a lot, so you’d be doing it for less substantial reasons, so would it really send the right message? Prevention is more persuasive in some cases; if you’re a pilot who forgets to lower the landing gear even once, perhaps it’s a good idea not to take the chance that you might forget again (not an entirely serious example). But most failures are more complex than that, involving lots of events and lots of people. It’s hard to be sure that firing one person or one team will really have the preventative effect you’re hoping for.
And that brings us on to the second interesting question. If you can persuade yourself that some combination of the three arguments above justifies firing people in some circumstances, when would you choose to do it? What combination of factors would justify it? You might take into account:-
- The seriousness of the damage caused. If you do something wrong that causes lots of harm, you’re more culpable than if there was only minor damage. It seems reasonable, but it could also be kind of harsh: suppose, for example (and this is a purely fictitious idea), that in the recent outage we’d lost data from Warwick Blogs as well as from email, and that we established conclusively that Lazy Worker A had negligently failed to do email backups, and Lazy Worker B had negligently failed to do blogs backups. They’ve both failed in the same way, but people care much more about email than they do about blogs (yes, even yours). So is LWA more deserving of being sacked than LWB, if they’re generally comparable posts?
- The extent to which there is identifiable negligence. Sometimes we assume that when bad things happen, somebody or other is ipso facto responsible, if not in terms of what they actually did or didn’t do, then because the buck somehow stops with them. Ministers take the rap for errors in their department (or at least they used to) even when they personally didn’t know, and wouldn’t have been expected to know, what was going on. Lawyers have a term for it: strict liability. But would you fire the manager of a department where things had gone wrong, if that manager had been intentionally kept in the dark about problems? Or if the problem was attributable to issues beyond the manager’s control? If I owned a chain of shops and told each manager that they’d be fired if shop-lifting exceeded 5% of turnover, but refused to underwrite CCTV and store detectives and other anti-theft measures, would it then be appropriate to fire those managers who missed the target?
- The organisation’s confidence in its legal position. I might be as sure as sure can be in my own mind that Lazy Worker C is responsible for fruit flies ruining my apple crop this year. But do I have enough proof to back up my position and my decision to fire him if he decides to contest my decision in the courts? Did I tell him, in writing, what was required? Did I train him in how to do it? Did I provide reminders and/or warnings in the period prior to the disaster? And so on. Despite the popular belief that in the private sector people are fired as a matter of routine when things go wrong, a quick straw poll of my friends in private sector jobs reveals that firing as punishment/deterrent, as opposed to redundancy for financial reasons, might in fact be a fairly rare thing because of the legal exposure it can open up.
- The costs and benefits of firing. A prudent organisation might want to ask itself, even if firing someone is in principle justified and legally supportable, will we be cutting off our nose to spite our face? I don’t for a moment agree with Mr Ryan’s assertion that the Director of IT should have been fired, but it’s interesting to think for a moment about what the costs and benefits would have been if that had happened. Good IT directors aren’t easy to find, so the first effect would have been that IT Services would have been without a director for perhaps six months or so. Departments without a director don’t always do a good job of fixing whatever problems led to the dismissal, and they are prone to drift, sometimes harmlessly, sometimes expensively. We would have had to pay for adverts, agencies, head-hunters and so on, and we would have borne the internal costs of drafting, advertising, short-listing, interviewing and so on, perhaps more than once. Senior posts require senior people to participate in the recruitment process, so it wouldn’t have been cheap. I’ve heard estimates of about £60k spent to recruit university posts at this level; I don’t know if that’s accurate or typical, but it seems plausible to me. So there would be a measurable cost associated with the firing, and then further, harder to measure costs associated with the ramp-up time a new director would need, and with the risk that the new director might turn out to be a dud – worse than the old one.
- The marketplace you’re in. If Warwick became more gung-ho about summary dismissal, it would be unusual among UK universities. Would it be harder to hire academics, managers and administrators if your hiring and firing reputation was different from that of your competitors? It could go either way, I guess; some people might be attracted to a university which reacts more strongly to failure, on the basis that you’d be working with better, more motivated colleagues. Others might be less persuaded. My instinct is that being tougher than comparable employers in the sector wouldn’t be a net gain in recruitment terms.
None of which is to say that there aren’t ever cases where firing is an appropriate response. But I wonder whether people who suggest it are doing so because, having calmly weighed all the evidence at their disposal, they believe that it’s strategically the best choice for the organisation, or because they’re understandably furious that their email is unavailable and they need to vent.