November 22, 2011

My Students Discuss Seven Transhuman Experiments

Writing about web page http://www.wired.com/magazine/2011/07/ff_swr/all/1

Last week, while I was in Moscow, I had two sets of students – one undergraduate and one postgraduate – discuss seven radical experiments proposed by Wired Magazine that, were they done, would have the potential to revolutionise our understanding of the human condition. Like most of the important experiments on humans done before the 1970s, these would not be allowed by most Institutional Review Boards today. But perhaps they should be? I asked the students, whose national origins range across Europe, Asia and North America, to discuss the matter in the role of policy makers who have the power to allow such experiments – and also as potential participants in them.

The seven experiments are described in more detail at the web page above and are summarised below:

THE EXPERIMENTS
1. Split up twins after birth—and then control every aspect of their environments.
2. Remove brain cells from a live subject to analyze which genes are switched on and which are off.
3. Insert a tracking agent into a human embryo to monitor its development.
4. Use beams of light to control the activity of brain cells in conscious human beings.
5. Switch the embryos of obese women with those of thin women.
6. Test each new chemical on a wide range of human volunteers before it comes on the market.
7. Cross-breed a human with a chimpanzee.

Here are the undergraduates (mostly 2nd and 3rd years), who are enrolled in my Sociology of Knowledge class:

http://dl.dropbox.com/u/14344203/Sociology%20Of%20Knowledge%20Week%207%20_0.m4a

Here are the postgraduates (mostly 1st years), who are enrolled in my Science, Media and Public Policy class:

https://docs.google.com/open?id=0B9E7irPbtBw8Mjc1NTk5ZTYtYjFhNi00MDc2LWI2OTUtOWI0NTAwYzg3ZTNk

I was very impressed with the level of discussion in both cases, though there were some interesting differences in the style and emphases of the two groups.


August 30, 2011

Advice About Academic Talks

If the audience is to get any value-added from an academic talk, then the academic should speak, not read, the talk. Reading the talk is, at best, good karaoke. To me it always suggests that the academic hasn’t mastered his/her material sufficiently to navigate without training wheels. Ditto for PowerPoint presentations, unless one really needs to point to something for added epistemic power. A good academic talk should be more like a jazz improvisation – i.e. the speaker provides some novel riffs on themes familiar from his/her texts that allow the audience to join in, sometimes contributing some novelty of their own.

We live in economically stretched times. Why invite famous drones, whose appreciation you could more cheaply acknowledge by buying their books or citing their articles? Anyone who is in charge of a speaker schedule – be it a seminar series or an international conference – should always bear in mind that, in the first instance, it is the speaker – not you – who most obviously benefits from an invitation. It is not unreasonable to request something more adventurous than boilerplate from the speaker. You might even – God forbid! – ask them to address a topic somewhat outside their comfort zone. (YouTube is beginning to provide a resource for making informed judgements about whom you should (not) invite.)

The increasing specialisation of academic life is way too often used to condone a multitude of sins that hover around the concept of ‘competence’. I never cease to be amazed at how often academics are willing to speak only to a rather narrow sense of ‘what they have already prepared’, or how easily flummoxed they get when they’re told they have 20 instead of 30 (or 10 instead of 20) minutes to present. After all, we’re supposed to be in the business of conveying ideas, not displaying powers of recitation.

Of course, academics are worried about saying something wrong in public. Putting aside assaults on one’s vanity, academics sometimes suggest that the public might be at risk if they misspoke. Well, we should be so lucky! In fact, academics misspeak all the time (even when they think they’ve adequately prepared) and either no one is paying attention or it doesn’t matter. I do not wish to counsel complacency but simply to demystify a cheap excuse for not having to speak one’s mind in public.

And let’s say you’re caught in an outright error – you should respond graciously. There is nothing wrong with learning in public. Often the blows can be softened by retaining enough wit to realize that your fault-finder is probably overreaching because they’re motivated to find fault in what you said. In other words, they’re interested in advancing their own agenda as they curtail yours. But you can concede a specific error without conceding an entire agenda. In short, make sure you’ve got the wherewithal to do a wheat-and-chaff job on the fault-finder’s comment. (And also do try to correct the error in future presentations!)

Younger academics may think that my advice applies only to more seasoned professionals. But truth be told, we are already pretty cynical about young people. Your word-perfect presentations are taken to be a prosthetic channel for your supervisor’s thoughts – but you might still get credit (if only for academic survival skills!) if you fend off criticism effectively. What’s harder to judge – and therefore places more of the burden on us who judge you – is whether you appear to be a normal person making a novel argument. Once we’re able to engage with you at that level, my advice starts to make sense. But what this means is that you need to integrate your academic message with your normal mode of being, so that you can shift with relative ease from banalities to profundities, without losing a sense of the difference between the two!

I have closed off comments on this blog post. But you can respond on Twitter, where I can also be found and where this blog post has been announced: @profstevefuller.


October 30, 2010

Second interview on 'Science: The Art of Living' published

Writing about web page http://whoeverfightsmonsters-nhuthnance.blogspot.com/2010/10/science-art-of-living-interview-with.html

A second interview about the book appears at the Australian left-futurist website Acheron LV-426, which goes into more detail about the sociological context of the book and its relationship to earlier aspects of my work. See above for the web page.


September 21, 2010

Interview on My Latest Book, 'Science: The Art of Living'

Writing about web page http://www.acumenpublishing.co.uk/display.asp?K=e2009012713293172


Science: The Art of Living is published in the US and Canada by McGill-Queen’s University Press.

What follows is an interview in which I explain some of the motivation and ideas behind the book:

Science: The Art of Living. Ten Questions for Steve Fuller

What inspired you to write Science: The Art of Living?

Mark Vernon, himself once a student of theology, contacted me to write this book for his ‘art of living’ series at Acumen, a distinguished UK philosophy publisher. He knew about my work in science studies, especially my interest in the democratisation of scientific authority – and how creationism and intelligent design theory played into that trend, certainly in the US and increasingly in the UK. With his help, I coined the term ‘Protscience’, which I explain later in this interview. Arguably that concept forms the centrepiece of the book. However, Mark probably did not expect that I would stress so much the positive role that even quite difficult and problematic theological concepts – such as the Eucharist, Grace and Providence – have played in motivating the scientific enterprise. Personally speaking, the book provided me with the opportunity to re-connect with my own religious roots. I was never an enthusiastic churchgoer or especially pious. To be sure, my mother always spoke of God as a source of personal strength, a view that I have always found attractive. However, I never fell in and out of love with God, the way so many born-again atheists seem to have done. Rather, I have been taken with theology as a kind of 3-D version of philosophy, in which abstract metaphysics acquired vivid personal qualities. Maybe this had to do with my excellent Jesuit teachers in high school (Regis, in New York City), who made the transition from Jesus to Marx and Teilhard de Chardin appear seamless. For me, all theology – if it’s any good – is ultimately liberation theology, in which science plays a central role as a change agent. While I realize that people tend to see the science-religion relationship in much more adversarial terms these days, that was not so obvious in the 1970s, when I first caught the God bug. For me, recovering the theological underpinnings of science is all about recovering science’s progressive mission in the world.

What’s the most important take-home message for readers?

A lot depends on the reader’s starting point. The basic message is that it is unlikely that science would have developed the way it has, or acquired the significance it has, were it not for the Abrahamic religions – specifically, the doctrine that we are created in ‘the image and likeness of God’, which figures most prominently in Christian theology. However, the science-friendly interpretation of this doctrine is the boldest one, namely, that we are capable of recovering from our fallen state to reunite with God. What secular philosophers and scientists call ‘the ultimate theory of everything’ is simply another name for the mind of God. While there are many reasons why both atheists and believers, theologians and scientists, might want to deny this equation, the price of denial is quite high – especially for science: there really is no other justification for our having tolerated all the violence and destruction unleashed on both humanity and nature by science-led policy decisions in the 20th century, unless some much higher end is in view, such that the end does truly justify the means. The theological doctrine of Divine Providence was designed precisely with this prospect in mind – to urge perseverance in the face of adversity. Our undiminished faith in science – even as we doubt particular scientific authorities – reflects a similarly providentialist mentality. In case there is any doubt, I support this perspective – but it needs to be pursued with open eyes. It will eventually mean that we re-adjust our collective moral compass. But that’s a project for another book.

Is there anything you had to leave out?

This is the third book I’ve written on the science-religion relationship, and I tend not to repeat myself from book to book. However, that doesn’t mean that I’m always able to say everything I want! Word limits and the need to retain a broad audience prevented me from ranging more widely into the medieval prehistory of science, as well as into the nascent ‘transhumanist’ movement that would use science to enable us to better realize our spiritual ambitions. In particular, a lot more needs to be said about the roots of the much – and to my mind, unjustly – maligned quest for a ‘literal’ interpretation of the Bible. Such literalism, whatever its shortcomings in practice, was born of the same spirit that begat the modern preoccupation with hard facts and the testability of knowledge claims. All of it is rooted in the fourteenth-century doctrine of ‘univocal predication’ associated with the great Franciscan philosopher John Duns Scotus, according to which, when we say that God is all good, powerful, knowing, etc., we mean ‘good’, ‘powerful’, ‘knowing’ in exactly the same sense as when we speak of humans – except, of course, that God possesses these qualities to an infinite extent. Scotus provocatively suggested that were we to imagine the maximum version of all of these qualities in one being, then we would come to know God. What makes this proposal so provocative is its suggestion that the ultimate spiritual quest might be achieved with some serious effort, since God is willing and able to communicate his existence through our normal linguistic channels, not least Scripture.

What are some of the biggest misconceptions about your topic?

There are many, but three stand out. The first is the idea that science and religion are either natural antagonists or peaceful parallel universes. Neither is true. As a matter of fact, science and religion address exactly the same questions of explaining and controlling reality, albeit using somewhat different institutional, technical and verbal means, which have for the most part been mutually supportive but occasionally have led to profound conflict. In this respect, today’s search for a neurological or genetic basis for religion is completely misguided, insofar as ‘religion’ is presumed to be something psychologically opposed to ‘science’ – at best a sense of the ineffable, at worst a fount of superstition. The second misconception is that a lot of ‘anti-scientific’ sentiment is afoot today. There is absolutely no evidence for this. What we see, rather, is considerable suspicion of scientific experts, given their palpable failings and excesses. This phenomenon is best interpreted on the model of the Protestant Reformation, when Christians decided to take the doctrines of their faith out of the hands of the priests and authorised theologians. Thus I dub it ‘Protscience’. The third misconception is that the recovery of the theological dimension of science would be a boon to conventional religious belief. On the contrary, if history is our guide, the sort of religiosity that has tended to focus the scientific mind has been heretical, dissenting or otherwise marginal to the religious establishment of its day. Nevertheless, as I stress in my book, the spirit of those positions is radically different from that of atheism. There’s a world of difference between saying that the devout are worshipping the wrong God because you know the right God and saying so because you know there is no God. (By the way, religious believers may take comfort in the fact that pure atheism in the latter sense has made virtually no contribution to the history of science.)

Did you have a specific audience in mind?

Since I am broadly supportive of Protscience, my target audience is the intellectually informed and adventurous. Although the text presupposes a basic understanding of Western intellectual history – is this too much to ask? – it is not littered with academic notes. Rather, the book concludes with an extended bibliographic essay designed to allow readers to follow up elsewhere the larger points I raise. One of the great virtues of writing for a general intellectual audience is that you are encouraged to foreground your own role and keep the supporting cast off centre stage. Very often academic writing is nothing more than an elaborate exercise in scaffolding and routing, where the ultimate objective is kept only dimly in view – even to the author!

Are you hoping to just inform readers? Give them pleasure? Piss them off?

As I see it, informing readers is simply a means to an end, though often that is the best you can hope for – especially if the reader comes to the topic either ill- or mis-informed, which tends to happen when ‘science’ and ‘religion’ are used in the same sentence. But generally speaking, I see all my writing as an irritant – grit in the oyster that eventually issues in a pearl. I am most disappointed by reviews of my work that fail to deal with what I actually say but rather respond to what the reviewer presumes someone versed in my topic would say. On matters of science and religion, self-avowed ‘liberals’ are most prone to slip into such stereotyping, which I think reveals the degree of anti-religious – especially anti-Christian – sentiment that is currently tolerated in polite secular culture. What’s bothersome about this response is less its negativity than its unreflective character. Perhaps I am an easy target for bigotry of this sort because, while trained by the Jesuits, I was never an avid churchgoer or especially devout believer, yet it strikes me as perfectly obvious that modern science would not have enjoyed its heroic levels of societal support were it not for its Christian heritage. This combination of background and beliefs befuddles the average secular liberal.

What alternative title would you give the book?

The title was forced on me by the editor to fit the series (which consists of single-word titles related to the ‘art of living’). I had published another book with the same one-word title – ‘Science’ – in 1997, which covers many of the same issues, though differently inflected. A more clever and descriptive – but possibly opaque – title would be: Believing in God Literally: The Hidden History of Science. Someone who would have got the point is the father of cybernetics (and a Unitarian), the mathematician Norbert Wiener. The epigraph to my book comes from him: ‘Science is a way of life that can flourish only when men are free to have faith’.

How do you feel about the cover?

Much better than I felt about the original cover proposal, which was made before I submitted the manuscript. It was an aerial view of a bacterial culture in a Petri dish. That would have been the consummate cliché of the philosopher-sociologist scrutinising scientific culture. However, once I turned over the manuscript, someone at Acumen realized that the apple dropping off the tree was much more to the point of my thesis – on at least two levels. Most obviously, it recalls the myth of Newton’s discovery of gravity, and my book stresses Newton’s mutually reinforcing scientific and theological concerns as a model for renewing our commitment to science in the 21st century. But more subtly, I also point out that much of the supposedly ‘anti-scientific’ sentiment of our times – ranging from New Age medicine to Intelligent Design Theory – really marks a maturation of the scientific sensibility in society at large. This is captured by ‘Protscience’. Instead of kowtowing to a science they don’t understand, people are increasingly motivated to learn about science for themselves and draw their own conclusions about its relevance for their physical and spiritual lives. The fully formed apple separating from the branch that nurtured it symbolises that transformation.

What book do you wish you’d written?

That’s tricky because it’s all too easy to rate a book by its impact and influence without worrying too much about its content and composition. This helps to explain the ease with which philosophers excerpt and gloss what they call ‘classics’ in ways that would be scandalous if the same practices were applied to works of ‘literature’. (A good antidote is the collection of faux publishers’ rejection letters to the likes of Kant and Kafka in Umberto Eco’s Misreadings.) With that concern in mind, I would say that Plato’s Dialogues have managed over the centuries to repay continued re-reading for new arguments and points of view about perennial issues. I actually think that drama is the most effective genre for the conveyance of ideas because it requires that people inhabit the roles that are scripted in order for the work to be fully realized. For the last three years, I have written and staged dramas at the annual British Science Festival. This started when I scripted an imaginary talk show in which Lincoln and Darwin (both born on 12 February 1809) are interviewed for their joint 200th birthday. This dramatic side to my writing has been very fulfilling for me, the actors and the audiences involved. I also think it is potentially quite influential as a sort of dress rehearsal for things that later happen in ‘real life’. In this respect, I have come to believe that the Protestant Reformation’s stress on Biblical ‘literalism’ is really about the Bible’s readers coming to inhabit the roles of the people – especially Jesus – whose lives are portrayed on its pages, as if the Scriptures were a divine script.

What’s your next book?

Science: The Art of Living is my seventeenth book. Two are currently competing for the eighteenth position. One is called Humanity 2.0 (Palgrave Macmillan), and it elaborates the transhumanist challenge to any future understanding of the nature of our species. The other is Socrates vs. Jesus: The Struggle for the Meaning of Life (Icon), which basically portrays Socrates as a ‘Christ-Lite’ for tender secular sensibilities. But to say any more at this point would spoil the reading experience!


April 05, 2010

What Crisis of Philosophy?

Writing about web page http://www.insidehighered.com/views/2010/04/05/stanley

Jason Stanley (I was going to say Alexander!) has produced an apologia for ‘philosophy’ – better known as ‘analytic philosophy’ – in the US magazine Inside Higher Ed that is so bad it’s good! Someone should give him a map of the history of philosophy facing the right way up and show him where he’s coming from and where he’s likely to end up. (Hint: Look for the sign marked ‘Scholasticism’.)

He presumes that there is no taste for the deep questions of metaphysics and epistemology in the humanities. Au contraire! In fact, German and, more to the point, French philosophy of the post-war period has been all about these matters – often dealing with the same figures that Stanley venerates, conducting arguments at the same abstract plane, and often in a prose style much less tractable than the analytic philosophers Stanley wishes to promote. (Deleuze comes most easily to mind here, given his sustaining interest in Spinoza and Hume.) These people have had enormous and quite diverse influence across the humanities, and – love it or hate it – the word ‘theory’ tracks the scope of that influence well.

This raises a puzzle. Few in the humanities doubt the virtues of abstractness and depth that Stanley champions for philosophy. So why is Stanley complaining – other than sheer narcissism (i.e. the humanists don’t like the philosophers he likes, or in whose footsteps he thinks he’s walking)? But let’s take narcissism off the table as an explanation – for the moment. It may simply be that analytic philosophers like himself are not especially good at dealing with the deep and abstract issues, such that if they did not control the most powerful graduate programmes, their influence would gradually wither away.

The MacArthur Foundation, the American Council of Learned Societies and other independent interdisciplinary awards bodies aren’t intimidated by the philosophy rankings in the Leiter Reports, probably the most visible indicator of the artificial stranglehold that analytic philosophy has on the discipline today. Indeed, Brian Leiter, self-appointed guardian of the profession, is notorious for issuing the diktat that the only relevant distinction in contemporary philosophy is not analytic–continental but good–bad. (Jason Stanley is probably best known to bloggers as a Joey Bishop figure in Leiter’s Rat Pack.) But history may prove Leiter remarkably prescient in this respect, except – as the cunning of reason would have it – he got the valences reversed!

A relevant insight here comes courtesy of the sociologist Randall Collins, whose view of the history of philosophy in his magisterial The Sociology of Philosophies corresponds to Stanley’s own metaphysics-and-epistemology-led view. Collins argues that what has kept generations of people intensely focused on philosophy’s deep and abstract issues, despite their prima facie removal from the stuff of normal living, is the emotional energy that they generate, which every so often spills over into the public sphere, resulting in cultural transformation, if not political revolt.

While I know from experience that analytic philosophers like Stanley feel quite passionately about what they do and how they do it, unfortunately the main operative passion appears to be self-regard: Everyone like me should see how wonderful I am because other people just like me have already done so. In contrast, as Hegel perhaps realized most clearly, philosophy proves its merit by its capacity to impose itself on a resistant world. To be sure, it’s a tough and dangerous standard. But in any case, it forces philosophy not only to regularly criticise its own foundations but also to break out of its own self-imposed institutional limitations – to preach beyond the easily converted. The continental, pragmatist and religious philosophers whom Stanley implicitly dismisses do this much better than analytic philosophers, generally speaking.

When I read someone like Jason Stanley, I am reminded of a well-placed 18th-century scholastic fretting about the corruption of philosophy at the hands of experimentalists and publicists, i.e. the agents of the Scientific Revolution and the Enlightenment who eventually succeeded in changing and raising the discipline’s game. I'll see you on the other side of the Revolution...


November 22, 2009

‘Primary Schools Need to Make Children “Media Savvy”’

Writing about web page http://www.guardian.co.uk/education/2009/nov/22/primary-school-children-media-lessons

This is the title of an article that appeared in today’s Observer by Anushka Asthana, in which I was interviewed on the topic. Until the current ESRC project on ‘mimetic processes’ (i.e. how and why behaviours are imitated), in which I collaborate with several Warwick colleagues, I had not really published much in this area. However, it is an interest that I have nurtured over the years through teaching a variety of courses and lectures here at Warwick, at a summer school in Sweden and at a liberal arts college in Germany. I have come to believe that media literacy ought to be introduced at the primary school level in the same spirit as reading, writing and numeracy are normally taught – insofar as they still are!

And what is that ‘spirit’? It’s simply that people should understand as well as possible the means by which they send and receive information. While much of media literacy may be regarded as technologically enhanced versions of traditional literacy and numeracy, there is clearly much more to it that is not normally covered in the traditional courses, especially in terms of the processing of visual and aural information – not to mention the blending of information channels (e.g. fonts as non-neutral displays of writing).

As I said in the interview, I believe that children already develop many of the relevant critical skills spontaneously because of their constant exposure to marketing campaigns, commercial and political advertisements and other forms of public relations through television, the internet, etc. However, the point of school, after all, is to provide systematic training, which means passing on some intellectual tools for dealing with these matters.

An interesting feature of the Observer article is the reaction that Anushka elicited from Cary Bazalgette (former head of education at the British Film Institute) and Tim Bell, one of the masterminds behind Margaret Thatcher’s successful election campaigns in the 1980s and nowadays the PR advisor for Belarus (if nothing else, the man certainly enjoys a challenge!). Bell’s comment was priceless PR spin. Here’s his criticism of my idea:

But Tim Bell, one of the best known figures in the communications industry, said that teaching children how to be critical in this way was a waste of time. Lord Bell added: "What we need are people who are educated and have open minds."

‘Open’, as in an empty vessel – or a blank slate, perhaps?

In any case, the workshops connected to the ESRC project on mimetic behaviours will continue on Monday 14th December. Among the speakers will be Liz McFall of the Open University, one of the UK’s leading social historians of advertising.


November 08, 2009

The Dawn of Weimar Britain: Wake Up and Smell the Coffee!

Writing about web page http://www.guardian.co.uk/environment/2009/nov/03/tim-nicholson-climate-change-belief

Last week a UK High Court gave the green light for a green activist to sue his employer, who had sacked him for refusing to do an errand because it conflicted with his green beliefs. For intellectual ballast, the judge quoted no less – or, should I say, no more? – than Bertrand Russell’s A History of Western Philosophy, a work whose authoritativeness matches that of Bill Bryson’s A Short History of Nearly Everything in the discipline of the history of science. But that’s not really my point….

My point is to draw attention to the five criteria that the judge offered to expand the definition of ‘religious discrimination’ – criteria that may be invoked by others in similar cases in the future:

• The belief must be genuinely held.

• It must be a belief and not an opinion or view based on the present state of information available.

• It must be a belief as to a weighty and substantial aspect of human life.

• It must attain a certain level of cogency, seriousness, cohesion and importance.

• It must be worthy of respect in a democratic society, not incompatible with human dignity and not conflict with the fundamental rights of others.

Humanism was given as an example that meets the criteria, while belief in a political party or in the supreme nature of the Jedi knights of the Star Wars movies was offered as an example that does not.

The general response to this ruling has been positive, with some lawyers seeing it as opening the door to the re-classification of stances like feminism, humanism and vegetarianism as protected religious beliefs. Even New Atheism might count!

I completely disagree with the ruling and the sentiment informing it. In fact, I published a letter in the Guardian the next day, which said:

Justice Burton’s ruling in favour of a green activist whose beliefs interfered with his job has the potential for becoming an epistemological nightmare. In particular, by raising what were previously treated as ‘political’ and ‘lifestyle’ choices to the status of ‘genuinely held beliefs’, the ruling effectively creates an incentive to be dogmatic in one’s opinions, simply in order to avoid forms of social intercourse that one finds disagreeable. After all, evidence of a changed mind is all that would be needed to lose one the protection afforded by the ruling.

A potential practical consequence of this ruling is complete social and political gridlock. It reminds me of Article 118 of the old Weimar Constitution, the first half of which reads as follows:

Every German is entitled, within the bounds set by general law, to express his opinion freely in word, writing, print, image or otherwise. No job contract may obstruct him in the exercise of this right; nobody may put him at a disadvantage if he makes use of this right.

What’s gone wrong here? Part of the answer lies in how ‘free individuals’ are conceptualised. The Weimar Constitution began with a majority principle based on the idea of a ‘German people’ whose common values uphold the constitution. One of those values, of course, is freedom of expression. But to enforce that freedom, the constitution then needs to allow for ‘minority rights’, whereby individuals with deeply held beliefs are allowed opt-out clauses from certain aspects of normal social life that inhibit their expression; otherwise, the majority principle would prove oppressive. Hans Kelsen, one of the great legal minds behind the Weimar Constitution, justified all this (though without quite seeing its practical consequences) in On the Essence and Value of Democracy (1929).

In the Weimar period, ‘minority rights’ were normally understood in ethnic terms, but of course this was also the time when feminism, vegetarianism, etc. started to be recognized as ‘identity politics’. In any case, the pernicious long-term consequence of this way of thinking about freedom of expression is that it encourages a hardening of one’s sense of identity in order to gain personal and political leverage. Of course, in the case of ethnic identity, such a move can easily be turned against oneself – as the Nazis showed all too well.

My own view is that liberal democratic societies should discourage the formation of strong identities – be they around blood or belief – otherwise they will end up undermining their own principles.


November 03, 2009

When scientists lose touch… the case of David Nutt

Writing about web page http://www.guardian.co.uk/politics/2009/nov/02/alan-johnson-drug-adviser-row

If a scientist – or any academic – were fired whenever she said something that her peers regarded as false, then scientists would hardly ever say anything at all, out of fear of rejection. As it happens, science’s own peer review process already induces a certain measure of timidity, but tenured scientists (admittedly a dying breed) can remain gainfully employed while rejected by colleagues. All of this is important to science because free and open inquiry is the only way knowledge truly progresses.

Politics is something else entirely. Politicians are directly affected by the consequences of their decisions. In fact, that is the whole point of politics, especially in a democracy, where politicians don’t exist apart from those they govern. If people don’t like a policy, no matter how well-thought out or well-evidenced it is, then the policy goes and the politicians pay. This explains the long tradition of scientists advising politicians but staying away from actually making policy.

David Nutt, recently departed chair of the current Labour government’s drugs council, has long argued strenuously and colourfully for the reclassification of drugs like ecstasy and cannabis. The scientific side of the argument is quite strong, though, given the taboos and mysteries that surround ordinary drug use, there is always room to doubt the reliability of what we know. In any case, Nutt is paid to be a scientist, not a politician. Once Nutt learned that the government would not implement his position, given how strongly he apparently felt about the matter, he should simply have resigned. And if he wants to get closer to politics, he can still sell his services to a more sympathetic party (the Liberal Democrats?) or start a political action committee.

What amazes me is that Nutt had to wait to be fired. Why didn’t he just resign in protest? This is certainly not an unheard-of option in the current Labour government! Much as I am inclined to agree with Nutt’s substantive position on drugs, I find his behaviour incredibly clueless. He clearly doesn’t understand the relationship between science and politics in a democracy. Politicians don’t ask scientists for advice because they want the scientists to rule on their behalf. Scientists are asked more in the spirit of a special interest group, albeit one with considerable mystique, rather like the church. Just as politicians would ideally like to have the church on their side, so too they would like to have the scientific community. However, politicians need to keep a lot of interests and prospects in balance, since in the end it is all about winning elections. And neither the clerics nor the scientists need to face the electorate. It’s as simple as that.

What is perhaps most striking about this episode is the demonstration of political backbone by the Home Office in standing down a formidable and noisy scientific advocate like Professor Nutt. This is a good sign that science is becoming normalised in democratic politics. I also suspect that politicians are becoming more informed about the sociology of science, which teaches not only that uncertainty is always present in science but also that the overall weight of scientific opinion can shift drastically with the appearance of a few well-supported studies. Imagine if Nutt got his way, and then as a conscientious scientist he was forced to change his mind six months later in light of new evidence, and then government policy changed alongside it. It’s hard to see how science’s – or, for that matter, the government's – public standing would become stronger in the process.


October 28, 2009

Norman Levitt RIP

Writing about web page http://skepticblog.org/author/shermer/

Norman Levitt has died, aged 66, of heart failure. He was awarded a Ph.D. in mathematics from Princeton at age 24 in 1967, but his fame rests mainly on having been one of the great ‘Science Warriors’, especially via the book he co-authored with the biologist Paul Gross, Higher Superstition (Johns Hopkins, 1994). I put the point this way because I imagine Levitt as someone of great unfulfilled promise – mathematicians typically fulfil their promise much earlier than other academics – who then decided that he would defend the scientific establishment from those who questioned its legitimacy. Why? Well, one reason would be to render his own sense of failure intelligible. All of the ‘postmodernists’ whom Levitt hated so much – myself included – appeared to be arguing that his aspirations were illusory in one way or another. This is an obvious personal insult to those who define their lives in such ‘illusory’ terms. And yes, what I am offering is an ad hominem argument, but ad hominem arguments are fallacies only when they are used indiscriminately. In this case, it helps to explain – and perhaps even excuse – Levitt’s evolution into a minor science fascist.

As time marches on, it is easy to forget that before Alan Sokal’s notorious hoax, whereby the editors of the leading US cultural studies journal were duped into publishing a politically correct piece of scientific gibberish, Gross and Levitt had already launched a major frontal assault on a broad range of ‘academic leftists’ who were criticising science in the name of some multicultural democratic future. Sokal acknowledged his debt to them. Levitt was clearly in on Sokal’s joke, since he contacted me prior to its publication, in response to which I said that Sokal was toadying unnecessarily to the Social Text editors – without catching the specific errors that Sokal had planted in the article. I had an article in the same issue, which served as Levitt’s launch pad for criticising me over the next decade and a half.

I wish I could say that I learned a lot from my encounters with Levitt, but in fact I learned only a little. His anger truly obscured whatever vision he might have been advancing. To be sure, he did point up a few errors of expression and fact, which I acknowledged at the time and corrected in subsequent publications. But Levitt’s general take on me and my work was so badly off the mark that I never deemed it appropriate to respond formally. (And I am not normally shy when it comes to responding to critics.) In this respect, it is striking that none of his widely publicised criticisms ever passed academic peer review, yet they are enshrined on the internet, not least in my own Wikipedia entry. And to be honest, this is the main reason why I am writing this obituary. Seemingly serious people have taken Levitt seriously.

I believe that Levitt’s ultimate claim to fame may rest on his having been a pioneer of cyber-fascism, whereby a certain well-educated but (for whatever reason) academically disenfranchised group of people have managed to create their own parallel universe of what is right and wrong in matters of science, which is backed up (at least at the moment) by nothing more than a steady stream of invective. Their resentment demands a scapegoat – and ‘postmodernists’ now fill the role that Jews filled previously. My guess is that very few academically successful people have ever thought about – let alone supported – what Levitt touted as “science’s side” in the Science Wars. Nevertheless, I am sure that a strong constituency for Levitt’s message has long existed amongst science’s many failed aspirants. This alienation of the scientifically literate yet undervalued in society will pose an increasingly painful reality check for policymakers who think that we are easing our way into a global ‘knowledge society’.


October 09, 2009

Congratulations Obama! The Postmodern Presidency Comes to Oslo

The 2009 Nobel Peace Prize has just been awarded to US President Barack Obama for his efforts at improving international diplomacy on many fronts, especially his attempts at reducing nuclear proliferation. Singling out this specific contribution, which may well stand the test of time (especially vis-à-vis Russia), reflects the Cold War vintage of the committee, since it is not clear to me that people born after, say, 1980 see nuclear annihilation as quite the global threat that older generations did – and the youngsters may be right on this point.

I say ‘postmodern presidency’ because the late Jean Baudrillard would relish the fact that the Nobel Peace Prize has been awarded to someone who really hasn’t brokered any peace at all – though Obama has certainly tried on many fronts and acted like a man of peace. Perhaps he is the 'anticipatory peacemaker'! The politician as simulacrum! Peace never looked so good – so maybe the look will do as grounds for prize-worthiness when the reality is way too grim! Sitting here in Boston, I learn that in a CNN-commissioned poll of viewers, Morgan Tsvangirai, the opposition leader of Zimbabwe, had been the preferred choice for the prize. Given Tsvangirai’s actual record to date, I can see why Obama’s potential might appear more attractive.

But there is also comic timing in Obama’s award. This week’s New Statesman, the UK’s historic centre-left weekly magazine of politics and ideas, sports a cover photograph of Obama morphing into George W. Bush, reflecting its dismay at Obama’s apparent, albeit dithering, willingness to commit more troops to Afghanistan, even though the war there appears both endless and pointless. Generally speaking, UK military commitments mirror US ones these days, so we tend to treat the US President as our own Commander-in-Chief. Whether the New Statesman is ultimately proved prescient or cheap will probably not hurt sales, since it has now unwittingly turned itself into a talking point.

Of course, Obama is being assaulted, in an increasingly vicious way, over his national health plan. I very much support the plan – as I did when Hillary Clinton, Ted Kennedy and countless others proposed it before him. The hostility to the plan – which must rank amongst the most mystifying features of US politics to foreigners – reflects the extent to which ‘Live Free or Die’ is really ingrained in the American psyche. At one level, it reflects a very positive view of what Ralph Waldo Emerson called ‘self-reliance’, predicated on the assumption that the US is the ultimate Land of Opportunity, in which health and wealth await those willing to put in a bit of effort. Unfortunately, this normative vision – however attractive – is never subject to a reality check by those who now worry that their taxes might benefit scroungers.

In all this, Obama is mainly saved by the fact that his opponents don’t have their act together – and Obama is the master of grace under pressure. This, I believe, reflects a larger background point: Because it is far from clear what really drives Obama’s political ambition, it is hard to find a way of getting him to sell out. ‘Selling out’ is always about finding the point when your mark distinguishes his own personal interests from those on whose behalf he would presume to speak, and then discreetly appealing to the personal interests alone. If they last long enough, politicians normally sell out because their vanity blows their cover – they are made to reveal that their own interests are really more important than those they’re speaking for – and they’ve become tired of maintaining the deception: Selling out = Cashing out.

Here’s a homework assignment: Given this definition of ‘selling out’, what sort of people will never sell out? Hint: Hegel would love them. And if Obama turns out to be one of them, then he definitely deserves the Nobel Prize!

As far as the Nobel Prize Committee is concerned, it would be a mistake to think that it is any more prone to wishful thinking these days than in the past. Its awards over the past century are strewn with purveyors of wishful thinking who at the time appeared quite plausible candidates for the ‘wave of the future’. Consider the last sitting US President to receive the award, Woodrow Wilson. He was praised for brokering the Treaty of Versailles that ended the First World War and for helping to establish the League of Nations. All of this was done against the backdrop of a US that Wilson had to drag into that war kicking and screaming, which then rewarded him by failing to join the League once it was established and by booting the Democrats out of office until FDR – a dozen years and a Great Depression later. Let’s hope that Obama’s prize doesn’t follow that precedent! But no denying it: Wilson was a well-spoken man of good intentions, much more popular abroad than at home.

Come to think of it: Didn’t Jimmy Carter get the Nobel Peace Prize too?

