October 08, 2009

The debate over the desirability of ‘pointless’ research continues to rage.

Follow-up to Draft David Mitchell for Board Membership in a UK Research Council from Making the university safe for intellectual life in the 21st century -- by Steve Fuller

The context, you may recall, is that the UK’s new ‘Research Excellence Framework’ (REF) aims to measure research ‘impact’ in ways that appear to favour economic relevance. The Times Higher last week covered my campaign to draft the comedian David Mitchell into membership of a research council because of his wise objections to this proposal. In fact, that issue of the Higher was full of like-minded sentiments.

In this week's Times Higher, Adam Corner, a psychologist at Cardiff, has written in defense of relevance measures, employing two arguments. First:

Their [i.e. mine and others’] arguments are couched in anti-establishment language and position academics as the guardians of truth-seeking. But the golden age of academia they long for was far from a meritocracy where independent inquiry ruled. Their desire to see research prised away from pragmatic objectives risks a return to intellectual elitism.

In response, first it’s worth pointing out that ‘impact’ is being proposed as a replacement for the ‘esteem’ measure used in previous research assessment exercises. No more coasting on reputations made twenty years ago! For younger researchers like Corner, this is potentially good news, at least in terms of levelling the playing field of merit. In this context, measurement of ‘impact’ might appear to be a step in the right direction.

But, speaking for myself, whatever intellectual elitism may have existed when academia was essentially a self-appointed club funded by the taxpayers has long disappeared. Certainly ideals of ‘social relevance’ (which Corner himself prefers to ‘economic relevance’) are strongly embedded in today’s academia, which is larger and more diverse than ever in its history – even without explicit steering in specific ‘policy relevant’ directions. The only question is whether academia should be somehow brought more into line with state policy concerns. My answer is no.

Corner then concludes:

we must not forget that the purpose of our research should be the advancement of socially useful knowledge - not simply the satisfaction of our own curiosity.

A false dichotomy often made in this debate. (Actually I hate the word ‘curiosity’: It makes intellectual work sound like a species of attention deficit disorder!) Luckily, the comedian Mitchell got the right end of the stick when he observed that various research endeavours which appeared pointless in the short term turned out to be quite relevant and useful in the long term. In other words, the real enemy here is not the fixation on ‘impact’ per se but short-termist thinking about research impact. We need a smarter economics of research that thinks in terms of capital investments, product life cycles and multiplier effects, within which the return on ‘pointless’ research would be obvious and manageable.
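To make the arithmetic behind that last point concrete, here is a minimal sketch in Python – my own toy illustration with entirely made-up figures, not anything proposed in the REF debate – of how a long-horizon, multiplier-aware valuation can flip the verdict on a ‘pointless’ project that a five-year impact window would write off:

    # Toy net-present-value comparison (hypothetical figures throughout).
    def npv(cash_flows, discount_rate):
        """Net present value of a stream of yearly cash flows."""
        return sum(cf / (1 + discount_rate) ** t
                   for t, cf in enumerate(cash_flows))

    grant = -1.0                   # normalized up-front cost of the grant
    payoff, multiplier = 5.0, 3.0  # assumed direct payoff and knowledge multiplier
                                   # (spin-offs, trained researchers, follow-on work)

    # 'Pointless' project: no returns for 15 years, then a multiplied payoff.
    long_horizon = [grant] + [0.0] * 14 + [payoff * multiplier]
    short_horizon = long_horizon[:5]   # a five-year impact window sees only the cost

    rate = 0.05
    print(f"NPV over 16 years: {npv(long_horizon, rate):+.2f}")   # clearly positive
    print(f"NPV over 5 years:  {npv(short_horizon, rate):+.2f}")  # pure loss

On these arbitrary assumptions the sixteen-year valuation comes out strongly positive while the five-year window shows nothing but loss: the same project, judged ‘impactful’ or ‘pointless’ purely by the accounting horizon.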


September 29, 2009

Does Genius Excuse Crime? Another Angle on the Polanski Case

Writing about web page http://www.guardian.co.uk/film/2009/sep/28/roman-polanski-french-government

The celebrated and notorious film-maker Roman Polanski has been re-arrested in Zurich for having sexually abused a drugged-out 13-year-old girl 32 years ago in the United States, a crime for which he was convicted, though he skipped the country before serving time. Nobody has denied that the criminal act occurred. The question is what to make of it now. What is striking is that the artistic community across the world has been virtually unanimous in calling for Polanski’s release, whereas virtually everyone else (though not the victim) wants him to pay for his crime – if not more.

What accounts for this vast difference in sentiment? Well, Polanski is a genius! Let’s assume that this claim is not only true but also relevant to judging his case. How would it be relevant? From his artistic defenders, you might think it has something to do with the quality of his cinematic output. Perhaps we’re supposed to think that Polanski served his prison time by creating great art, which more than makes up for the original heinous act: Community service on a grand scale, if you will.

The only – but crucial – difference, of course, is that Polanski wasn’t coerced by the legal system to create this great art. He just happened to luck out in being that sort of artist. A porn film producer who committed a comparable act, even if his films had bigger box office takings than Polanski’s, would not enjoy comparable sympathy. The late ethicist Bernard Williams, who popularized the phrase ‘moral luck’, actually justified this way of looking at things, which would now keep Polanski out of prison.

However, that can’t be right. Is genius nothing but a kind of miracle? On the contrary, I think ‘genius’ should be treated much more literally. After all, ‘genius’ refers primarily to the artist’s state of mind, not his or her output. In cases like Polanski’s, it makes most sense as the flipside of mental deficiency, possessors of which are also often given leniency in rape cases because they were either too dumb or too crazy to have a fully functioning moral compass -- to put not too fine a point on it. Of course, one needs to prove mental deficiency in court, which is not always easy. But why not the same for genius? Polanski’s defenders should welcome the opportunity to have his genius demonstrated in a courtroom through a variety of expert witnesses who could testify, in the face of cross-examination, to the necessity of his particular pattern of personal behavior to the quality of art that he has created. Still, just as insanity defenses don't always work in particular cases, neither might the genius defense.

We don’t have mental deficiency – whether of the cognitive or psychiatric variety – decided by a self-recognized class of ‘deficients’ for legal purposes. So why allow claims of genius to be decided by a self-recognized class of ‘geniuses’, even though that is what much of the artistic community, which recognizes Polanski as one of its own, seems to wish?


September 28, 2009

Draft David Mitchell for Board Membership in a UK Research Council

Writing about web page http://www.guardian.co.uk/commentisfree/2009/sep/27/david-mitchell-pointless-studies-survey

Yes, that David Mitchell – the one from the ‘Peep Show’, ‘That Mitchell and Webb Look’ and numerous comedy quiz shows on British radio and television. He’s also a Cambridge history grad and one of the most articulate and insightful commentators on the state of higher education today – professional academics and certainly government officials included.

Beneath the title of this blogpost, I have provided a link to an article that appeared in the Observer this past Sunday (where he has a regular column), which takes comic aim at the proposed standard of ‘practical relevance’ put forward by the Research Excellence Framework, which is the successor to the UK’s Research Assessment Exercise.

What’s most interesting about this piece is that Mitchell only has to tweak the straight version of the story a little to produce massive comic effect. Yes, it is pretty st-o-o-o-pid for the public sector to fund mainly research that demonstrates short-term economic and social utility when that would be precisely the sort of research that would most naturally attract private funding. State funding is supposed to make up for – not contribute to – market failure.

Of course, Mitchell overlooks the possibility that, as in the case of the banks, the state is trying to bail out the charities and other private funders, whose coffers have been depleted by the global credit crunch. It’s a stretch, I know. But given the absurdity of the official policy, why shouldn’t its justification be at least as absurd, if not more so?

Mitchell is also spot-on in associating so-called pointless research with a society that thinks beyond sheer animal survival. Indeed, if we can think only in terms of research for the sake of survival, then we should cut out the research middleman altogether and simply focus state funds on implementing solutions that facilitate survival. Surely, if our straits are so dire, we don’t have time for any research whatsoever!

To end on a constructive note, I would like to invite David Mitchell to put himself forward for membership in one of the UK’s publicly funded research councils, probably Arts and Humanities (AHRC) or Economic and Social (ESRC). These are administered by the ominously named Department for Business, Innovation and Skills (of which Higher Education is a subdivision). Here is the website: http://www.dius.gov.uk/science/research_councils/public_appointments/council_members

You will see that openings for these councils will be advertised later in the Autumn. Council members are typically people who do not work in academia but are seen as ‘stakeholders’ in the future of academia. Comedians certainly fit that bill, given their large student market – not to mention the source of much of their funniest material!

Pass the suggestion along: Draft David Mitchell!


September 15, 2009

Audio version of the play now posted

Follow-up to New Play: 'Three women after the soul of William James' from Making the university safe for intellectual life in the 21st century -- by Steve Fuller

I have now managed to post the audio file of ‘Three women after the soul of William James’ as item 31 here

The first 50 minutes of the file is the play itself, followed by 50 minutes of discussion with the audience.


September 08, 2009

New Play: 'Three women after the soul of William James'

I have just come back from the British Science Festival in Surrey, where I staged my second play, Three Women after the Soul of William James. Here is the script surrey_play.pdf. The running time of the play is 45 minutes. (I did an audio recording of the play and the 45 minutes of discussion that followed. I hope to upload it at some point but at the moment the file is too large.) Here are some nifty pictures of the actors in character. Many thanks to Rachel and Esther Armstrong and Zoe Walshe for their brilliant performances as the three female leads!

The festival is sponsored each year by the British Association for the Advancement of Science. Last year, as president of the sociology and social policy section, I staged Lincoln and Darwin – Live for One Night Only!, in which the two famous figures – both born on 12 February 1809 – return to one of today’s talk shows to reflect on what has happened to science and politics since they died. The play was subsequently performed at the Oxford Science Centre and made into a podcast by some actors in Sydney, Australia. It was also written up in the Times Higher.

The premise of this year's play is that William James, who would later become the great early 20th century US psychologist and pragmatist philosopher, appears for tea in London as a young recent medical school graduate travelling Europe to find himself. He has been invited by Harriet Martineau, an old liberal firebrand, and they are subsequently joined by Clemence Royer, Darwin’s French translator, and Helena Blavatsky, the Russian psychic and theosophist. The year is 1870.

Bearing in mind that the play takes place about a half-century before women enjoyed full political rights in most developed countries, the three female leads represent an array of scientific, political and personal positions that, in their day, marked them as operating on the radical fringe of European society. Of particular note is the way they turn potential female liabilities into epistemic and political strengths: e.g. the positive role of ‘receptiveness’ as a mode of discovery in both medicine and metaphysics, and the conversion of biological reproduction into a branch of political economy under the rubric of ‘eugenics’.

I am still working on the ideas underlying the play, some of which will feature in a book I am writing on the history of epistemology for Acumen.

Finally, those interested in following up the themes here might look at the following books:

  • Charles C. Gross, Brain, Vision, Memory: Tales in the History of Neuroscience (MIT Press, 1998). See especially chapter 3 on the spiritualist (Swedenborgian) legacy to brain science.
  • Jennifer Michael Hecht, The End of the Soul (Columbia University Press, 2005). On Clemence Royer and her quest for an atheist science based on Darwinist principles.
  • Louis Menand, The Metaphysical Club (Flamingo, 2001). On the intellectual context of William James’ development.
  • David Wootton, Bad Medicine (Oxford University Press, 2006). On the role of belief in healing.

September 07, 2009

Interdisciplinarity’s Mid-Life Crisis

Writing about web page http://www.journals.uchicago.edu/toc/ci/current?cookieSet=1

The latest (Summer 2009) issue of Critical Inquiry, arguably the leading humanities journal in the United States, is devoted to interdisciplinarity, an idea with which I have identified throughout my entire academic career, even as an undergraduate. In fact, all of my degrees are interdisciplinary. I have also theorized about interdisciplinarity from time to time. My most sustained treatment, Philosophy, Rhetoric and the End of Knowledge, appeared in a second edition a few years ago, now co-authored with Jim Collier of Virginia Tech. I am very much pro-interdisciplinarity, but after reading this issue of Critical Inquiry, I am reminded of Voltaire’s quip: ‘God save me from my friends -- my enemies I can take care of’. From the looks of it, interdisciplinarity is suffering from a mid-life crisis.

I will review a few of the matters raised – and not raised – in this special issue. But first, readers who are sensitive to the contemporary academic scene will be struck by the American-style parochial elitism, which mirrors an older period when British journals were over-represented by people from Oxford and Cambridge – the culprits here being, say, Harvard and Chicago instead – all of whom seem to know each other’s texts and even jokes. There are also various excruciating verbal mannerisms – the rhetorical throat-clearing, tie-adjusting and name-checking – that are the telltale signs of people who spend too much time talking to each other and hence are too reliant on what each thinks of the other.

Without denying the occasional insight in many of the articles, I come away from the special issue disappointed by the extent to which the discussion of interdisciplinarity does not seem to have progressed very much since Philosophy, Rhetoric and the End of Knowledge came out in 1993. In fact, many of these papers appear caught in the early 1980s time-warp of my graduate school days when, in the first flush of Anglophone postmodernism, it was cool and radical to ‘blur genres’ and ‘deconstruct binaries’, given the ossified nature of disciplinary boundaries.

Part of the problem may be the context of publication. Most of the papers appear to have been written for a conference held at the Institute for the Humanities at the University of Chicago. In terms of changes taking place in higher education across the world, this may be the ultimate backwater, simply by virtue of its relative immunity from those changes. We are so used to thinking of backwardness in terms of the dispossessed that it is easy to overlook that the elites may also be left behind by history, even while they largely keep their possessions – which just end up being worth less than before. To flourish for oneself and to be relevant to others are two distinct achievements, and while there need not be a tradeoff between the two, one should provide independent evidence for each.

James Chandler’s introduction to the issue illustrates what I’m talking about. He barely registers the fact that the recent drive to interdisciplinarity reflects not the inherent limitations of disciplinary knowledge or the discovery of new domains of knowledge (which he discusses at considerable length), but the relative weakening of the university’s position in defining how knowledge should be organized, even within its own institutional setting. Here I refer to the withdrawal of state subsidies for higher education, accompanied by various incentives for academics to justify their existence by collaborating with private funders. To be sure, universities have always had working arrangements with the private sector, but our own period is marked by increased dependency on a limited range of funders – largely due to the size of the budgets required for cutting-edge research.

To his modest credit, Mario Biagioli sees this point, but in a characteristically polite way that is unlikely to cause anyone to take notice. Biagioli’s article is about our shared field of science studies, and how it has managed to thrive amidst the post-Cold War meltdown of the university – unlike other interdisciplinary fields in the humanities and social sciences. The trick is simply to follow the funding fashions in research projects and not be overly concerned about theoretical coherence. The result is that while science studies certainly thrives in the academy, its exact disciplinary location always remains uncertain. Although Biagioli means well, his conclusion is about as intriguing as the idea that cockroaches can survive in virtually any previously inhabited space.

Lorraine Daston, in contrast, is keen to distinguish ‘history of science’ as a full-fledged discipline independent of science studies. She is happy to admit that the controversies that have embroiled science studies over the past decade or so – the so-called Science Wars – are to be expected of a field that shadows so closely the changing fortunes of science in society at large. Yes, of course. By implication, she suggests, ‘history of science’ is different – dealing with the past but only with an eye to the past, not the present or future. Daston unwisely presumes without argument that the relative insulation of ‘history of science’ from the present and future is to the field’s credit. Her piece left me with the sense of a discipline settling into a state of ‘genteel poverty’.

Finally, something must be said about Judith Butler’s piece, ‘Critique, Dissent, Disciplinarity’. It is presented as a defence of academic freedom, but it is the last article I would offer as evidence in support of this noble principle. Butler is here defending the quite reasonable proposition that if the state, which protects academic freedom, objects to questions being raised about the legitimacy of its own actions, then the state forfeits its own legitimacy. In this respect, the legitimacy of state power and academic freedom are mutually implicated. Fine. But why do we need 25 pages of digressions through the works of Kant, Derrida and Foucault to make this point – especially when nothing interesting is added to our understanding of these texts? Instead we are given a tour of Butler’s reading habits (and are left to infer that she is heroically abstracting from the specific case of Israel). Others may, and have, come to similar conclusions by a much less circuitous route – but probably in lower-profile places.

I know this sounds churlish, but our responsibility as academics goes beyond simply giving a running order of our states of mind – that is, unless we think that there is some intrinsic value in retracing the steps by which we have reached our conclusions. The discipline called ‘logic’ promotes itself on such grounds. However, Butler – and of course she is hardly alone – does not claim to be a logician. In that case, would it not be more academically responsible to make one’s arguments by adopting the most efficient means vis-à-vis the target audience? Wouldn’t that be the ultimate good faith demonstration of interdisciplinary communication?

In any case, I recommend that those interested in the future of interdisciplinarity look out for the publication of the Oxford Handbook of Interdisciplinarity in March 2010.


August 28, 2009

How to Tell a Failed Genius from a Diligent Mediocrity – in One’s Own Lifetime

I operate in many different fields, and I am always interested in passing judgement. In fact, I don’t feel that I’ve made my mark as a human being until I have passed judgement. My sense of ‘human being’ is theologically informed: God passes judgement but infallibly, whereas humans – created in his image and likeness – do so too, but fallibly. And this fallibility appears in the dissent, censure and/or ridicule that such judgements receive from others so created. But that’s no reason to stop passing judgement.

I realize that fellow academics are uncomfortable with the idea of passing judgement, which is routinely seen as the stuff of ethics, politics, aesthetics – but not ‘science’! But this is to shortchange science’s centrality to our humanity, understood in this robust Biblical sense. Of course, my colleagues may not feel that science needs to be understood this way, and that a careful and balanced presentation of various sides of an issue is sufficient for ‘scientific’ purposes.

My response is that, like any virtue, fairness needs to be exercised in moderation. And the refusal to pass judgement may amount to being ‘too fair’, in that it neglects the fact that however the arguments stack up now is unlikely to be how they will stack up in the future. Perhaps more importantly, and certainly more subtly, the refusal to pass judgement is itself a judgement that will affect what subsequently happens. Just because you cannot fully determine the future doesn’t mean that you can opt out of bearing at least some responsibility for it. As Sartre said, there is no exit.

So, given that we are ‘always already’ making judgements, which ones are most crucial for the future? How to tell a failed genius from a diligent mediocrity – that is, the ‘A-‘ from the ‘B+’ mind. Consider two rather different cases: In the future, will Craig Venter appear as a visionary who helped to turn biology into a branch of engineering or merely an entrepreneur who happened to hit upon a lucrative technique for sequencing genes? Will Slavoj Zizek be seen as someone who leveraged philosophy’s weak academic position to revive the modern mission of public enlightenment or merely a clever and popular purveyor of (by now) familiar Marxo-Freudian themes?

For some benchmarks on how to think about these matters, consider the difference between the significance that was accorded to Edison and Voltaire, respectively, in their lifetimes and today.

Readers may recall that a decade ago I published a long and scathing study of the origins and influence of Thomas Kuhn’s The Structure of Scientific Revolutions, the most influential account of science in the second half of the 20th century – and perhaps the entire century. To his credit, Kuhn seemed to be sensitive to the issue that I am raising here. But he was convinced – very much like Hegel – that it could only be decided in retrospect. In other words, it makes no sense to speculate about future value judgements. I disagree: The present is the site in which the future is constructed. What Kuhn did not fully appreciate – though he half-recognised it – is that we get the future that matches our current judgements by carefully selecting the chain of historical precedents that lay the foundation for them.


July 30, 2009

Steve Fuller on Karen Armstrong

Writing about web page http://www.amazon.co.uk/Case-God-Religion-Really-Means/dp/1847920349/ref=sr_1_1?ie=UTF8&s=books&qid=1248982005&sr=1-1

Karen Armstrong, The Case for God: What Religion Really Means (London: The Bodley Head, 2009).

Ludwig Wittgenstein ends his gnomic classic, the Tractatus Logico-Philosophicus, by admonishing his readers, ‘Whereof one cannot speak, thereof one must be silent’. If Karen Armstrong had her way, that’s exactly how she’d have us respond to God, putting an end to all the arguments for and against God’s existence that have only served to obscure our ability to grasp the divine, which is by definition – at least as far as she is concerned – beyond words. For many years and books, Armstrong, a former nun, has sharply distinguished the status of religion as logos and mythos – that is, as an account of how the world really is and as a way of making sense of our place in the world, respectively. According to Armstrong, religion works best when mythos has the upper hand over logos, and in this book she stresses the downright negative consequences that an overemphasis on logos can have for religion and the surrounding society. Against the backdrop of this thesis, Armstrong presents intelligent design (ID) as the latest and perhaps most monstrous spawn of logos-driven theism, resulting in bad theology and bad science – not to mention bad politics.

Armstrong’s thesis is worth taking seriously, and not only because she is knowledgeable, thoughtful and influential -- though less so in academia than in interfaith politics. In addition, her indictment is meant to extend beyond ID and its forebears in natural theology to what Armstrong regards as the hubris behind science’s own ‘quest for certainty’, to recall the title of John Dewey’s Gifford Lectures. Armstrong turns out to be very much a fellow-traveller of the Frankfurt School and those who believe that humanity’s logos-mania has led to untold cruelty, misery and harm to other humans and nature at large. ID supporters may be disoriented to find themselves the targets of such an Anti-Enlightenment harangue, but I think Armstrong has got the historical drift right. She even sees the scientism in many of the founders of Anglo-American Protestant fundamentalism a hundred years ago. However, Armstrong portrays it all as part of one unmitigated disaster. And here I beg to differ.

I also beg to differ on an assumption that I think her liberal readers will too easily grant her – namely, that there is some common core to all ‘religions’, whether by this term we mean indigenous native beliefs or High Church Christianity. As a matter of fact, this generic idea of religion, by which we (including Armstrong) normally mean the great world-religions, is a 19th century invention, basically designed to capture the nature of social order prior to the rise of the modern nation-state. (Here is a good book on the topic.) Thus, when the discipline of sociology defined itself in terms of problems arising from ‘modernity’, it was assumed that before, say, capitalism, democracy and science, a much less differentiated social glue called ‘religion’ held people together in so-called traditional societies. Thus, the founding fathers of sociology -- Marx, Durkheim, Weber et al -- spent much of their energies figuring out how the various social functions previously filled by religion were carried out in the modern secular world.

The key point here is that ‘religion’ in the broad sense that we currently use it began as a residual category for any complex form of social life of pre-modern origin. It implied nothing about specific beliefs (even in a deity), rituals or the status of human beings in the cosmos. In this respect, ‘religion’ is the classic postmodern category that was invented to give a face to the ‘other’ of modernity. Under the circumstances, it comes as no surprise that Armstrong’s candidate for the core ‘religious’ experience is silence before the ineffability of Being, or apophasis. After all, modernity is sociologically marked by the replacement of tacit knowledge and social relations with explicit formulas and contracts. Whereas for the modern Abrahamic believer, the logos defines the common ground between God and humanity, both in the Bible and the Book of Nature, Armstrong’s deity undergoes a sort of self-alienation – and believers engage in a sort of idolatry – in the presence of the logos.

Indeed, the second and larger half of The Case for God, called ‘The Modern World’, is basically a tale of steady decline in the West’s religious authenticity as religion becomes increasingly conflated with science and other worldly – and wordy -- preoccupations. ID under its proper name appears at the very end, in the final chapter, entitled ‘The Death of God?’ Here Armstrong calls for a version of Stephen Jay Gould’s segregationist line on science and religion – ‘Non-overlapping magisteria’ – as the best way to stem the rot. Despite her conspicuous silence on the recent rise of ‘radical orthodoxy’ within Anglican Christianity, my guess is that Armstrong would be comfortable with many of its signature counter-Reformationist moves, not least its rejection of ID. (For a quick study, here is my review of a BBC television show, ‘Did Darwin Kill God’, that aired this past Spring, which was done from a radical orthodox perspective. Here are two pieces in the Times Higher Education that give a good brief sense of the difference between my own and the radical orthodox position on the science-religion relationship.)

While it would be useful for ID supporters to read Armstrong’s book to see how their position can be demonised from the standpoint of someone who believes in a faith-neutral conception of pure religiosity, the audience that could really benefit from this book are the more boneheaded and blog-bound Darwinists who seem to think that ID is ‘anti-science’ and ‘pro-relativism’. Armstrong makes it absolutely clear that, if anything, ID is too enamoured of science (at least its worst tendencies) and too fixated on its own scriptural base in the Abrahamic faiths to appreciate religion’s ultimate basis in the ineffable. And she is right about all this: It’s a pity that she doesn’t like what she sees.


July 22, 2009

Oration for Bruno Latour

In our most recent graduation ceremony (16th July), I provided the Oration for the award of an Honorary Doctor of Laws to Bruno Latour. The full text is below and there you will find some links to our most pronounced disagreements. Those who know me well will realize that I hold in highest esteem those whose ideas persistently bother me -- and frankly most of what passes for intellectual life today is little more than eyewash and mosquito bites. With that in mind, here is the Oration:

It is a great honour and pleasure to present Bruno Latour for an honorary doctorate at Warwick. We are suited to him, and he to us. Warwick is the strongest general-purpose university in the United Kingdom that is anchored in the social sciences. We have a reputation for adventurousness not only across disciplinary boundaries but also across the boundary that separates academia from the rest of the world. Latour the person embodies these qualities as well. He regularly ranks among the ten most highly cited social scientists in the world yet his influence extends to business, policy, humanities and art. As someone whose teaching career has been spent in professional schools – first in engineering and now in politics – he was never inclined to think in narrowly disciplinary terms. Now speaking at a personal level, Latour is most responsible for making the field with which I am closely associated, science and technology studies, one of the most exciting academic endeavours of the last half-century. But to give the full measure of Latour’s significance – and if I may be forgiven some prophecy -- future historians will find Latour a most instructive figure to study as they try to make sense of the deep changes that are currently taking place in our attitudes towards science, modernity and even the primacy of the human on the planet. It would be fair to say that Latour offers grounds for taking all of these objects of concern down a peg.

If this sounds a bit heavy, well, in reality it is. However, Latour delivers his potentially disconcerting messages with a lightness of touch and wit that can at once provoke, entertain and educate. In Hong Kong in 2002 Latour and I debated whether a strong distinction between the human and the non-human is needed for social scientific purposes. In keeping with his writing on the topic, Latour argued against the proposition. Nevertheless, he managed to endear himself to the audience because, as my fellow sociologist Beverley Skeggs put it at the time, he stumbles upon these insights with the air of Inspector Clouseau. You can judge for yourself, and I certainly recommend his many books as vehicles for recreating the experience. Call me old-fashioned – or even downright modernist – but contra Latour I still believe that the social sciences need to retain the idea of the ‘human’, especially if we are to remain rightful heirs of the ‘moral sciences’. But that debate can be continued on another occasion.

I have always been drawn, however critically, to Latour’s trajectory, ever since my graduate student days nearly thirty years ago. I began very impressed with his landmark ethnography of the scientific laboratory, Laboratory Life. It has been one of the five most formative works in my own intellectual development. Here was a philosopher who endeavoured to make sense of scientists in their native habitat as an anthropologist might a native tribe, namely, by leaving his prejudices at the doorstep and observing very carefully what they actually say and do. The irony, of course, is that since in this case the ‘natives’ were scientists, the ‘prejudices’ that Latour had to abandon concerned their being massively superior – not inferior – to himself in rationality and objectivity.

Sharing Latour’s background in philosophy, I knew exactly what he was driving at: Many philosophers – but not only them – presume that the rigour associated with both popular and technical scientific rhetoric implies something extraordinary about scientists’ relationship to reality. Latour’s quite straightforward empirical conclusion was that these words have force only if they are backed up with indefinitely extended networks of people, places and things. The interesting question then is how those networks get forged and remain durable. The fancy scientific words themselves don’t necessarily pick out anything special happening in the laboratory, which, truth be told, looks more like a loosely managed factory than a sacred crucible of knowledge production.

Perhaps you can see why the knee-jerk response of many scientists was to regard Latour as ‘anti-science’. It was a complete misunderstanding. Yes, in a sense Latour wanted to cut scientists down to size, but it was for their own – and society’s – good. Scientists cannot possibly live up to the hype that would have them save the world with their superior knowledge – at least given science’s very chequered track record in the 20th century. However, a more modest conception of science, whose success depends not on mere ideas and words alone but on others willing to act on the basis of what scientists say and do, has done an exemplary job of, so to speak, ‘domesticating’ science as a public policy concern. Quite deservedly, Latour has been recognized in France for this achievement, which continues to exert considerable influence in state-funded ‘public understanding of science’ initiatives in this country and elsewhere.

As you can imagine even from this short address, Latour has never been one to shy away from controversy, sometimes quite heated and unfair. But even if you end up disagreeing with much of what he says, Latour more than deserves this honorary degree for consistently exhibiting a quality that is all too lacking in today’s academic culture: grace under fire. He is always more civilised than his opponents, always instructive, and sometimes even right.

* * * * *

Now here's a nice picture of the two of us in a death grip as we descend to Hades together!


July 07, 2009

A response to 'The value of higher education in the arts and humanities'

Writing about web page http://www.egovmonitor.com/node/25870/print

To those unfamiliar with UK higher education policy, it may come as a surprise to learn that of the three major branches of learning – the arts and humanities, the social sciences and the natural sciences – the arts and humanities consistently draw the short straw on both funding and state recognition more generally. So it was striking to find the new minister for higher education, David Lammy, speaking in such broad and positive terms about the arts and humanities in a recent speech delivered at the Royal Society for the Arts. Both the Guardian and the Times Higher gave the speech prominent coverage.

While Lammy made an excellent speech, it was really about the defence of a certain liberal arts ideal of general education, not about the humanities as specialist subjects. So, those who want to increase research funding in those fields still have their work cut out. And while Lammy is certainly justified in claiming all sorts of long-term social, cultural and even economic benefits from study in the arts and humanities, English (though not Scottish) higher education is currently not organized to realize them. This is because students come to university already specialised, not least in the various arts and humanities subjects. For Lammy’s vision to be truly realized, at least the first year at university would have to be an intellectually exploratory period, in which students are required to sample from an array of general education courses whose staff would be taken from many, if not all, departments. These courses would be specifically designed to cover both skills and content that encourage the openness of mind and breadth of knowledge needed to live in today’s world. The result would bring Britain closer to an American model of higher education (at its best). I don’t know if Lammy quite realizes the massive overhaul of the teaching and examination system across both the secondary and tertiary education sector that such a shift would entail, but I for one would welcome it.

You can find a précis of my general take on the value of the humanities in a syndicated column I published last year.

