All 4 entries tagged Steve Fuller
October 28, 2009
Writing about web page http://skepticblog.org/author/shermer/
Norman Levitt has died, aged 66, of heart failure. He was awarded a Ph.D. in mathematics from Princeton at age 24 in 1967, but his fame rests mainly on having been one of the great ‘Science Warriors’, especially via the book he co-authored with biologist Paul Gross, Higher Superstition (Johns Hopkins, 1994). I put the point this way because I imagine Levitt as someone of great unfulfilled promise – mathematicians typically fulfil their promise much earlier than other academics – who then decided that he would defend the scientific establishment from those who questioned its legitimacy. Why? Well, one reason would be to render his own sense of failure intelligible. All of the ‘postmodernists’ that Levitt hated so much – myself included – appeared to be arguing that his aspirations were illusory in one way or another. This is an obvious personal insult to those who define their lives in such ‘illusory’ terms. And yes, what I am offering is an ad hominem argument, but ad hominem arguments are fallacies only when they are used indiscriminately. In this case, it helps to explain – and perhaps even excuse – Levitt’s evolution into a minor science fascist.
As time marches on, it is easy to forget that before Alan Sokal’s notorious hoax, whereby the editors of the leading US cultural studies journal were duped into publishing a politically correct piece of scientific gibberish, Gross and Levitt had already launched a major frontal assault on a broad range of ‘academic leftists’ who were criticising science in the name of some multicultural democratic future. Sokal acknowledged his debt to them. Levitt was clearly in on Sokal’s joke, since he contacted me prior to its publication. I responded that Sokal was toadying unnecessarily to the Social Text editors, without catching the specific errors that Sokal had planted in the article. I had an article in the same issue, which served as Levitt’s launch pad for criticising me over the next decade and a half.
I wish I could say that I learned a lot from my encounters with Levitt, but in fact I learned only a little. His anger truly obscured whatever vision he might have been advancing. To be sure, he did point up a few errors of expression and fact, which I acknowledged at the time and corrected in subsequent publications. But Levitt’s general take on me and my work was so badly off the mark that I never deemed it appropriate to respond formally. (And I am not normally shy when it comes to responding to critics.) In this respect, it is striking that none of his widely publicised criticisms ever passed academic peer review, yet they are enshrined on the internet, not least in my own Wikipedia entry. And to be honest, this is the main reason why I am writing this obituary. Seemingly serious people have taken Levitt seriously.
I believe that Levitt’s ultimate claim to fame may rest on his having been a pioneer of cyber-fascism, whereby a certain well-educated but (for whatever reason) academically disenfranchised group of people have managed to create their own parallel universe of what is right and wrong in matters of science, which is backed up (at least at the moment) by nothing more than a steady stream of invective. Their resentment demands a scapegoat – and ‘postmodernists’ function as Jews had previously. My guess is that very few academically successful people have ever thought about – let alone supported – what Levitt touted as ‘science’s side’ in the Science Wars. Nevertheless, I am sure that a strong constituency for Levitt’s message has long existed amongst science’s many failed aspirants. This alienation of the scientifically literate yet undervalued in society will pose an increasingly painful reality check for policymakers who think that we are easing our way into a global ‘knowledge society’.
August 28, 2009
I operate in many different fields, and I am always interested in passing judgement. In fact, I don’t feel that I’ve made my mark as a human being until I have passed judgement. My sense of ‘human being’ is theologically informed: God passes judgement infallibly, whereas humans – created in his image and likeness – do so too, but fallibly. And this fallibility appears in the dissent, censure and/or ridicule that such judgements receive from others so created. But that’s no reason to stop passing judgement.
I realize that fellow academics are uncomfortable with the idea of passing judgement, which is routinely seen as the stuff of ethics, politics, aesthetics – but not ‘science’! But this is to shortchange science’s centrality to our humanity, understood in this robust Biblical sense. Of course, my colleagues may not feel that science needs to be understood this way, holding instead that a careful and balanced presentation of the various sides of an issue is sufficient for ‘scientific’ purposes.
My response is that, like any virtue, fairness needs to be exercised in moderation. And the refusal to pass judgement may amount to being ‘too fair’, in that it neglects the fact that however the arguments stack up now is unlikely to be how they will stack up in the future. Perhaps more importantly, and certainly more subtly, the refusal to pass judgement is itself a judgement that will affect what subsequently happens. Just because you cannot fully determine the future doesn’t mean that you can opt out of bearing at least some responsibility for it. As Sartre said, there is no exit.
So, given that we are ‘always already’ making judgements, which ones are most crucial for the future? How to tell a failed genius from a diligent mediocrity – that is, the ‘A-‘ from the ‘B+’ mind. Consider two rather different cases: In the future, will Craig Venter appear as a visionary who helped to turn biology into a branch of engineering or merely an entrepreneur who happened to hit upon a lucrative technique for sequencing genes? Will Slavoj Zizek be seen as someone who leveraged philosophy’s weak academic position to revive the modern mission of public enlightenment or merely a clever and popular purveyor of (by now) familiar Marxo-Freudian themes?
For some benchmarks on how to think about these matters, consider the difference between the significance that was accorded to Edison and Voltaire, respectively, in their lifetimes and today.
Readers may recall that a decade ago I published a long and scathing study of the origins and influence of Thomas Kuhn’s The Structure of Scientific Revolutions, the most influential account of science in the second half of the 20th century – and perhaps the entire century. To his credit, Kuhn seemed to be sensitive to the issue that I am raising here. But he was convinced – very much like Hegel – that it could only be decided in retrospect. In other words, it makes no sense to speculate about future value judgements. I disagree: The present is the site in which the future is constructed. What Kuhn did not fully appreciate – though he half-recognised it – is that we get the future that matches our current judgements by carefully selecting the chain of historical precedents that lay the foundation for them.
July 30, 2009
Karen Armstrong, The Case for God: What Religion Really Means (London: The Bodley Head, 2009).
Ludwig Wittgenstein ends his gnomic classic, the Tractatus Logico-Philosophicus, by admonishing his readers, ‘Whereof one cannot speak, thereof one must be silent’. If Karen Armstrong had her way, that’s exactly how she’d have us respond to God, putting an end to all the arguments pro and con God’s existence that have only served to obscure our ability to grasp the divine, which is by definition – at least as far as she is concerned – beyond words. For many years and books, Armstrong, a former nun, has sharply distinguished the status of religion as logos and mythos – that is, as an account of how the world really is and how we make sense of our place in the world. According to Armstrong, religion works best when mythos has the upper hand over logos, and in this book she stresses the downright negative consequences that an overemphasis on logos can have for religion and the surrounding society. Against the backdrop of this thesis, Armstrong presents intelligent design (ID) as the latest and perhaps most monstrous spawn of logos-driven theism, resulting in bad theology and bad science – not to mention bad politics.
Armstrong’s thesis is worth taking seriously not only because she is knowledgeable, thoughtful and influential – though less so in academia than in interfaith politics. In addition, her indictment is meant to extend beyond ID and its forebears in natural theology to what Armstrong regards as the hubris behind science’s own ‘quest for certainty’, to recall the title of John Dewey’s Gifford Lectures. Armstrong turns out to be very much a fellow-traveller of the Frankfurt School and those who believe that humanity’s logos-mania has led to untold cruelty, misery and harm to other humans and nature at large. ID supporters may be disoriented to find themselves the targets of such an Anti-Enlightenment harangue, but I think Armstrong has got the historical drift right. She even sees the scientism of many of the founders of Anglo-American Protestant fundamentalism a hundred years ago. However, Armstrong portrays it all as part of one unmitigated disaster. And here I beg to differ.
I also beg to differ on an assumption that I think her liberal readers will too easily grant her – namely, that there is some common core to all ‘religions’, whether by this term we mean indigenous native beliefs or High Church Christianity. As a matter of fact, this generic idea of religion, by which we (including Armstrong) normally mean the great world-religions, is a 19th century invention, basically designed to capture the nature of social order prior to the rise of the modern nation-state. Thus, when the discipline of sociology defined itself in terms of problems arising from ‘modernity’, it was assumed that before, say, capitalism, democracy and science, a much less differentiated social glue called ‘religion’ held people together in so-called traditional societies. Hence the founding fathers of sociology – Marx, Durkheim, Weber et al – spent much of their energies figuring out how the various social functions previously filled by religion were carried out in the modern secular world.
The key point here is that ‘religion’ in the broad sense that we currently use it began as a residual category for any complex form of social life of pre-modern origin. It implied nothing about specific beliefs (even in a deity), rituals or the status of human beings in the cosmos. In this respect, ‘religion’ is the classic postmodern category that was invented to give a face to the ‘other’ of modernity. Under the circumstances, it comes as no surprise that Armstrong’s candidate for the core ‘religious’ experience is silence before the ineffability of Being, or apophasis. After all, modernity is sociologically marked by the replacement of tacit knowledge and social relations with explicit formulas and contracts. Whereas for the modern Abrahamic believer, the logos defines the common ground between God and humanity, both in the Bible and the Book of Nature, Armstrong’s deity undergoes a sort of self-alienation – and believers engage in a sort of idolatry – in the presence of the logos.
Indeed, the second, larger half of The Case for God, called ‘The Modern World’, is basically a tale of steady decline in the West’s religious authenticity as religion becomes increasingly conflated with science and other worldly – and wordy – preoccupations. ID under its proper name appears at the very end, in the final chapter, entitled ‘The Death of God?’ Here Armstrong calls for a version of Stephen Jay Gould’s segregationist line on science and religion – ‘non-overlapping magisteria’ – as the best way to stem the rot. Despite her conspicuous silence on the recent rise of ‘radical orthodoxy’ within Anglican Christianity, my guess is that Armstrong would be comfortable with many of its signature counter-Reformationist moves, not least its rejection of ID. (For a quick study, here is my review of a BBC television show, ‘Did Darwin Kill God’, which aired this past Spring and was done from a radical orthodox perspective. Here are two pieces in Times Higher Education that give a good brief sense of the difference between my own position and the radical orthodox one on the science–religion relationship.)
While it would be useful for ID supporters to read Armstrong’s book to see how their position can be demonised from the standpoint of someone who believes in a faith-neutral conception of pure religiosity, the audience that could really benefit from this book are the more boneheaded and blog-bound Darwinists who seem to think that ID is ‘anti-science’ and ‘pro-relativism’. Armstrong makes it absolutely clear that, if anything, ID is too enamoured of science (at least its worst tendencies) and too fixated on its own scriptural base in the Abrahamic faiths to appreciate religion’s ultimate basis in the ineffable. And she is right about all this: It’s a pity that she doesn’t like what she sees.
July 22, 2009
In our most recent graduation ceremony (16th July), I provided the Oration for the award of an Honorary Doctor of Laws to Bruno Latour. The full text is below and there you will find some links to our most pronounced disagreements. Those who know me well will realize that I hold in highest esteem those whose ideas persistently bother me -- and frankly most of what passes for intellectual life today is little more than eyewash and mosquito bites. With that in mind, here is the Oration:
It is a great honour and pleasure to present Bruno Latour for an honorary doctorate at Warwick. We are suited to him, and he to us. Warwick is the strongest general-purpose university in the United Kingdom that is anchored in the social sciences. We have a reputation for adventurousness not only across disciplinary boundaries but also across the boundary that separates academia from the rest of the world. Latour the person embodies these qualities as well. He regularly ranks among the ten most highly cited social scientists in the world, yet his influence extends to business, policy, humanities and art. As someone whose teaching career has been spent in professional schools – first in engineering and now in politics – he was never inclined to think in narrowly disciplinary terms. Now speaking at a personal level, Latour is most responsible for making the field with which I am closely associated, science and technology studies, one of the most exciting academic endeavours of the last half-century. But to give the full measure of Latour’s significance – and if I may be forgiven some prophecy – future historians will find Latour a most instructive figure to study as they try to make sense of the deep changes that are currently taking place in our attitudes towards science, modernity and even the primacy of the human on the planet. It would be fair to say that Latour offers grounds for taking all of these objects of concern down a peg.
If this sounds a bit heavy, well, in reality it is. However, Latour delivers his potentially disconcerting messages with a lightness of touch and wit that can at once provoke, entertain and educate. In Hong Kong in 2002 Latour and I debated whether a strong distinction between the human and the non-human is needed for social scientific purposes. In keeping with his writing on the topic, Latour argued against the proposition. Nevertheless, he managed to endear himself to the audience because, as my fellow sociologist Beverley Skeggs put it at the time, he stumbles upon these insights with the air of Inspector Clouseau. You can judge for yourself, and I certainly recommend his many books as vehicles for recreating the experience. Call me old-fashioned – or even downright modernist – but contra Latour I still believe that the social sciences need to retain the idea of the ‘human’, especially if we are to remain rightful heirs of the ‘moral sciences’. But that debate can be continued on another occasion.
I have always been drawn, however critically, to Latour’s trajectory, ever since my graduate student days nearly thirty years ago. I began very impressed with his landmark ethnography of the scientific laboratory, Laboratory Life. It has been one of the five most formative works in my own intellectual development. Here was a philosopher who endeavoured to make sense of scientists in their native habitat as an anthropologist might a native tribe, namely, by leaving his prejudices at the doorstep and observing very carefully what they actually say and do. The irony, of course, is that since in this case the ‘natives’ were scientists, the ‘prejudices’ that Latour had to abandon concerned their being massively superior – not inferior – to himself in rationality and objectivity.
Sharing Latour’s background in philosophy, I knew exactly what he was driving at: Many philosophers – but not only them – presume that the rigour associated with both popular and technical scientific rhetoric implies something extraordinary about scientists’ relationship to reality. Latour’s quite straightforward empirical conclusion was that these words have force only if they are backed up with indefinitely extended networks of people, places and things. The interesting question then is how those networks get forged and remain durable. The fancy scientific words themselves don’t necessarily pick up anything special happening in the laboratory, which, truth be told, looks more like a loosely managed factory than a sacred crucible of knowledge production.
Perhaps you can see why the knee-jerk response of many scientists was to regard Latour as ‘anti-science’. It was a complete misunderstanding. Yes, in a sense Latour wanted to cut scientists down to size, but it was for their own – and society’s – good. Scientists cannot possibly live up to the hype that would have them save the world with their superior knowledge – at least given science’s very chequered track record in the 20th century. However, a more modest conception of science, whose success depends not on ideas and words alone but on others willing to act on the basis of what scientists say and do, has done an exemplary job of, so to speak, ‘domesticating’ science as a public policy concern. Latour has been duly recognised in France for this achievement, which continues to exert considerable influence in state-funded ‘public understanding of science’ initiatives in this country and elsewhere.
As you can imagine even from this short address, Latour has never been one to shy away from controversy, sometimes quite heated and unfair. But even if you end up disagreeing with much of what he says, Latour more than deserves this honorary degree for consistently exhibiting a quality that is all too lacking in today’s academic culture: grace under fire. He is always more civilised than his opponents, always instructive, and sometimes even right.
* * * * *