The Pequod
Dr Alistair Brown | Associate lecturer in English Literature; researching video games and literature



What Scientists Read

Friday, July 20, 2012

In recent years, literary studies has increasingly appropriated science, opening new fields for critical enquiry. Darwinian literary studies, for example, shows how the reading of literature can be explained in terms of our evolutionary biology. Literary historians of science show how understanding the ways in which writers have represented science can help us better to communicate scientific knowledge today. Critical readers are turning to empirical studies, such as semantic analysis, to give their criticism the status of fact.

I would not want to demean such efforts to engage literature with science. Before I shifted more towards literature and game studies, my PhD research looked at the ways in which cybernetic science had been (mis)represented in literary and film science fiction. Science and Culture has been a key category under which I've posted on this blog over a number of years.

Nevertheless, I remain sceptical about the ultimate destination of such traffic between science and literature. There is always a feeling that such interdisciplinarity, whilst intellectually interesting in its own right, is also an attempt to lend literary studies the superficial credibility of the "real-world impact" that science possesses. If it is effective, scientific research invariably emerges from universities to have some social benefit, such as a new cure for cancer, or a green energy source. The "impacts" of science, especially the most exciting blue skies science, may not always be direct and instantaneous, but they are invariably assumed to be present. Literary studies clings to the coat-tails of the scientific impact-agenda, suggesting to policy makers and public - who increasingly demand pragmatic outcomes from their funding - that it has relevance, even if this is not always immediately obvious.



Posted by Alistair at 2:24 pm

Review of The Echo Maker, by Richard Powers

Saturday, July 14, 2012

I have just posted a review of Richard Powers's 2006 novel, The Echo Maker. Through exploring the psychopathology of Capgras syndrome, in which a patient mistakes a loved one for an imposter, The Echo Maker offers a sustained meditation on the ways in which we project our own problems onto other people. As a reflection on the mysteries of consciousness, the novel offers some interesting if not especially new insights into the fuzzy boundaries between scientific and literary interpretations of the mind.

Although the novel won a host of literary awards, I am a little more sceptical about its value both as a novel, and as an exploration of the "two cultures" of science and literature.

The full review can be read here: The Echo Maker.


Posted by Alistair at 11:07 am

Review of Different Engines: How Science Drives Fiction and Fiction Drives Science, by Mark Brake and Neil Hooke

Thursday, July 12, 2012

I have posted up a review of a critical survey of science fiction, entitled Different Engines: How Science Drives Fiction and Fiction Drives Science. The review was first posted at the BSLS website, but as this is now two years old I am reposting it here.

In short, Different Engines offers a racy, if fairly predictable, synopsis of the ways in which science has influenced science fiction writers. However, the book fails to offer any convincing evidence for or discussion of the more complex possibility that literature may also influence scientific discoveries.

The full review can be read here: Different Engines: How Science Drives Fiction and Fiction Drives Science.


Posted by Alistair at 11:04 am

Susan Greenfield and Autism

Monday, August 08, 2011

I was interested to read today that a leading neuropsychologist, Dorothy Bishop, has criticised her fellow Oxford professor, Susan Greenfield, after the latter claimed that the rise in internet use has led to an increase in cases of autism. In an open letter to Greenfield, published in New Scientist, Bishop said she was "dismayed by the way in which your public communications have moved away from science." Bishop suggested that Greenfield's claims depended on a fundamental (perhaps deliberate?) misreading of the evidence, since the rise in cases actually precedes the widespread adoption of the internet, and is best accounted for by a change in diagnostic techniques.

Greenfield's skewing of the evidence to make a point seems to tally with what I felt in my review of Greenfield's ID: The Quest for Identity in the 21st Century. So many of the arguments in this book seemed to be unsubstantiated, deriving from her ideological opposition to new technology, rather than from careful scrutiny of the scientific and social data.

We often bemoan the state of science reporting in the mass media. It does not help when scientists like Greenfield seek to become the story themselves by making lavish and apocalyptic pronouncements about the way in which games, social media and so on can affect the health of children.


Posted by Alistair at 2:14 pm

Table Talk

Wednesday, June 29, 2011

The solidly alliterative phrase "table talk" seems like it ought to originate in a novel or poem. In fact, surprisingly, it derives literally from the stomach.

According to Steven Shapin, the 15th-century scholar Ficino wrote that "it is bad to strain the stomach with food and drink, and worst of all, with the stomach so strained, to think difficult thoughts," whilst an 18th-century author of a treatise on occupational diseases noted that "all the men of learning used to complain of a weakness in the stomach." From Thomas Carlyle, described as a "martyr" to dyspepsia, to Charles Darwin, who avoided public engagements because of his embarrassment about his belches and farts, there has been a strong association between intellectuals and digestive suffering. Although modern medicine eventually downplayed this theoretical link between the hard-working mind and the ill-suffering gut, the sense of connection was enough to establish the etiquette of table talk, which was, Shapin explains, "light, airy and undemanding stuff that didn't draw the vital spirits away from the stomach's proper work. It was a courtesy medicine paid to manners."

Perhaps this university lecturer, musing on the difficulties of organising an academic dinner party, ought to take note.


Posted by Alistair at 8:44 am

Susan Greenfield's ID: The Quest for Identity in the 21st Century

Friday, June 24, 2011

I have just posted a review of Susan Greenfield's ID: The Quest for Identity in the 21st Century. It is a strange and in some ways interesting book, not for the science it contains but for what it tells us about Greenfield herself. Her complaints about the impact of technology on society lead this eminent neuroscientist to make a series of absurd hypotheses and unsubstantiated arguments. It amounts to a middle-aged grumble about the pace of social change, rather than a rigorous study of the neurological effects of technology.

The full review can be read here: ID: The Quest for Identity in the 21st Century.



Posted by Alistair at 4:10 pm

Beware the Spinal Trap - Reprint of Simon Singh's Original Chiropracty Article

Wednesday, July 29, 2009

Earlier this month, I blogged about the way in which Simon Singh is being sued for libel by the British Chiropractic Association, having made an entirely legitimate, scientifically substantiated comment about the ineffectiveness of the therapy for non-spinal-related conditions. The article led to a campaign to Keep Libel Laws Out of Science, orchestrated by the Sense About Science organisation. Today, as part of that campaign, a number of blogs, charities, and news organisations will be reprinting Singh's original article. I have joined in with this; the article - slightly edited - appears below.

Beware the Spinal Trap
Some practitioners claim it is a cure-all, but the research suggests chiropractic therapy has mixed results - and can even be lethal, says Simon Singh.

You might be surprised to know that the founder of chiropractic therapy, Daniel David Palmer, wrote that ‘99% of all diseases are caused by displaced vertebrae’. In the 1860s, Palmer began to develop his theory that the spine was involved in almost every illness because the spinal cord connects the brain to the rest of the body. Therefore any misalignment could cause a problem in distant parts of the body.
In fact, Palmer’s first chiropractic intervention supposedly cured a man who had been profoundly deaf for 17 years. His second treatment was equally strange, because he claimed that he treated a patient with heart trouble by correcting a displaced vertebra.

You might think that modern chiropractors restrict themselves to treating back problems, but in fact some still possess quite wacky ideas. The fundamentalists argue that they can cure anything, including helping treat children with colic, sleeping and feeding problems, frequent ear infections, asthma and prolonged crying - even though there is not a jot of evidence.

I can confidently label these assertions as utter nonsense because I have co-authored a book about alternative medicine with the world’s first professor of complementary medicine, Edzard Ernst. He learned chiropractic techniques himself and used them as a doctor. This is when he began to see the need for some critical evaluation. Among other projects, he examined the evidence from 70 trials exploring the benefits of chiropractic therapy in conditions unrelated to the back. He found no evidence to suggest that chiropractors could treat any such conditions.
But what about chiropractic in the context of treating back problems? Manipulating the spine can cure some problems, but results are mixed. To be fair, conventional approaches, such as physiotherapy, also struggle to treat back problems with any consistency. Nevertheless, conventional therapy is still preferable because of the serious dangers associated with chiropractic.

In 2001, a systematic review of five studies revealed that roughly half of all chiropractic patients experience temporary adverse effects, such as pain, numbness, stiffness, dizziness and headaches. These are relatively minor effects, but the frequency is very high, and this has to be weighed against the limited benefit offered by chiropractors.

More worryingly, the hallmark technique of the chiropractor, known as high-velocity, low-amplitude thrust, carries much more significant risks. This involves pushing joints beyond their natural range of motion by applying a short, sharp force. Although this is a safe procedure for most patients, others can suffer dislocations and fractures.

Worse still, manipulation of the neck can damage the vertebral arteries, which supply blood to the brain. So-called vertebral dissection can ultimately cut off the blood supply, which in turn can lead to a stroke and even death. Because there is usually a delay between the vertebral dissection and the blockage of blood to the brain, the link between chiropractic and strokes went unnoticed for many years. Recently, however, it has been possible to identify cases where spinal manipulation has certainly been the cause of vertebral dissection.

Laurie Mathiason was a 20-year-old Canadian waitress who visited a chiropractor 21 times between 1997 and 1998 to relieve her low-back pain. On her penultimate visit she complained of stiffness in her neck. That evening she began dropping plates at the restaurant, so she returned to the chiropractor. As the chiropractor manipulated her neck, Mathiason began to cry, her eyes started to roll, she foamed at the mouth and her body began to convulse. She was rushed to hospital, slipped into a coma and died three days later. At the inquest, the coroner declared: ‘Laurie died of a ruptured vertebral artery, which occurred in association with a chiropractic manipulation of the neck.’

This case is not unique. In Canada alone there have been several other women who have died after receiving chiropractic therapy, and Edzard Ernst has identified about 700 cases of serious complications in the medical literature. This should be a major concern for health officials, particularly as under-reporting will mean that the actual number of cases is much higher.

If spinal manipulation were a drug with such serious adverse effects and so little demonstrable benefit, then it would almost certainly have been taken off the market.

Simon Singh is a science writer in London and the co-author, with Edzard Ernst, of Trick or Treatment? Alternative Medicine on Trial. This is an edited version of an article published in The Guardian for which Singh is being personally sued for libel by the British Chiropractic Association.


Posted by Alistair at 7:33 am

Keep Libel Laws Out of Science: The Case of Simon Singh

Tuesday, July 14, 2009

If you were one of Britain's most respected science writers, you might expect that you had the right to publish articles surveying the evidence for the effectiveness of certain clinical techniques, without this costing you hundreds of thousands of pounds. If you were a voice on science to whom people listen, you might in fact think it your core duty to make the public aware of the claims being made by certain medical practitioners about remedies for which there is simply no evidence of benefit. If you did this, protecting the public from medical quackery, you might expect to be applauded rather than be dragged through the courts.


However, the latter is precisely what has happened to Simon Singh, bestselling author of books such as Fermat's Last Theorem. In 2008, Singh wrote an article for The Guardian which focused on chiropractic. This discussed its founder's belief that manipulating the spine could treat almost all diseases, by alleviating blockages in the flow of energy through the nervous system. Though the article acknowledged that many modern chiropractors have moved away from this extreme position, concentrating instead on alleviating back pain, some continue to claim that it can be used to treat various childhood conditions, including asthma.

Though the article has since been removed from The Guardian's website, it is still circulating freely online. Reading the article, it seems quite clear that Singh's doubts about the effectiveness of chiropractic in treating non back-related conditions are valid, and supported by evidence. In particular, Edzard Ernst, with whom Singh co-authored a book on alternative medicine, Trick or Treatment? Alternative Medicine on Trial, had done a meta-study of 70 trials on chiropractic treatments for conditions not related to the back, and found no evidence of effect. Ernst's research articles, all peer reviewed, can be found on PubMed.

Singh must, therefore, have been surprised to receive a letter notifying him that the British Chiropractic Association intended to sue him for libel, and for defaming the reputation of their organisation. Out of necessity, The Guardian could not support Singh, who was left to fight the case alone, which he did. Unfortunately, on 7th May, 2009, a preliminary ruling at the Royal Courts of Justice deemed that the article was a statement of fact rather than personal comment, and that the article contained "the plainest allegation of dishonesty and indeed it accuses them (the BCA) of thoroughly disreputable conduct."


Singh objects to this ruling that he was accusing the BCA of being dishonest and disreputable. He argues that although he called chiropractors deluded and reckless, he was not suggesting that they are dishonest. Additionally, the judge ruled only on meaning and the way in which Singh's statements affected public perceptions of the BCA, not whether that meaning was supported by valid evidence and was therefore a legitimate case to argue. Finally, the burden of proof is reversed in libel cases, which means that Singh is guilty until proven innocent. He has to prove the accuracy of his statement, rather than the BCA having to prove why Singh was wrong - something they would find difficult to do, given the balance of scientific evidence about the effectiveness of chiropractic.

Rather than going to trial (which would be expensive) or settling out of court (which would be to acquiesce to the BCA in spite of the weight of valid scientific evidence behind Singh's statements), Singh has boldly decided to take the case to the Court of Appeal and then, if that fails, to the European Court of Human Rights.

Luckily, Singh is a bestselling author who can afford to pursue his case, and he is supported by various organisations. The point is that other scientists may not be so fortunate. The case has drawn attention to serious flaws in the English libel system that threaten the ability of scientists to express opinions based on sound scientific evidence, opinions that might be of benefit to the public if disseminated through the media.

In particular, it is too easy in English law for anyone to launch a libel action. And, from the point at which an action is launched, the burden of proof is on the defendant, rather than the defendant being judged innocent until proven guilty, as is the case for criminal trials. Consequently, defending cases - even those for which there is sound scientific evidence - becomes both time-consuming and costly. Organisations such as the BCA know this - and as a result libel can be used as a way of silencing scientists who, unlike Singh, are understandably unwilling or unable to pursue their defence further.

Additionally, English libel cases are very costly and do not qualify for legal aid support. Going to trial can cost £1,000,000, which could be many times more than the damages at stake. By contrast (using this handy graph), a case in Sweden might cost a mere £10,000. Taking a case to trial puts one in a lose-lose scenario. Win, and you still lose some money. Lose, and you lose a lot of money. Again this means libel can be used to bully publishers and authors into silence, even if they have solid evidence for their claims, as is true of the vast number of scientific studies that endorse Singh's view against chiropractic as a treatment for non back-related syndromes.

As David Colquhoun wittily puts it in his Patient's Guide to Magic Medicine:

Libel: A very expensive remedy, to be used only when you have no evidence. Appeals to alternative practitioners because truth is irrelevant.


Clearly the skewed, unfair nature of English libel law is a problem in many areas of publishing, not just in science communication. Alan Rusbridger of The Guardian puts the case very well. But the problem is particularly evident in the case of science, because libel cases can be used maliciously to silence honest scientists, even though the scientists speak with the weight of evidence on their side. Statements supported by clear scientific facts can still be sued against, and the burden of proof means it is up to the scientist to prove otherwise, by which time he or she will be broke and exhausted, regardless of whether he or she ultimately wins.

I am, therefore, very worried about this phenomenon, and the effect it may have on scientific communication and practice, particularly in England where, we are continually told, the government is keen to promote innovation. Luckily, an army of bloggers, writers, commentators, publishers and scientists is fighting back, marshalled by the Sense About Science organisation. They have a campaign to Keep English Libel Laws Out of Science. For what it's worth, appropriately situated beneath the Irrepressible.info button campaigning for free speech, you will now see my own badge of support for this campaign. And if, having read this post, you feel similar concern about the way libel can be used to stifle legitimate scientific argument, I would urge you to sign up too.


Posted by Alistair at 11:13 am

Literature and Science: A Disciplinary Fracture?

Tuesday, April 07, 2009

Last week, I attended the annual British Society for Literature and Science conference in Reading. As in the previous two BSLS conferences I've been to, this was a fabulous event, an opportunity to renew old acquaintances, chat about common interests, enjoy sumptuous breakfasts...oh, and to hear some excellent panels and plenaries.

However, thinking broadly about the weekend's papers, there seems to me - and I stress that this is my general sense, or thought-in-progress, and may well turn out to be misguided or unfair - to be something of a crack emerging in the interdisciplinary approach to the field of literature and science.

On the one hand, there are those who treat literature and science in an essentially conventional historicist vein. Often focusing on Romantic poets and Victorian novelists, they explore the ways in which particular writers were influenced by scientific ideas in circulation at the time. Which scientists was George Eliot reading when she wrote Middlemarch? How was Wordsworth influenced by Humphry Davy? Often drawing on archives or letters, scholars in this vein connect ideas or metaphors at work in the creative text with scientific enquiries. This is very interesting and worthy work, but it uses an essentially conventional model of English literary studies, showing the influences upon a writer in an attempt to make better sense of their oeuvre. In this case, scholars look at science, but they might just as easily refer to an author's tour of Venice, or their reading of Milton.

On the other hand, others in the field see the confluence of science and literature as an opportunity to rethink the models of knowledge with which literary scholars work, asking what are (to me) very interesting epistemological questions. What is "science"? Can a scientific "fact" about the world be conveyed to readers via creative works, such as science fiction, or does a fact assume a different status the moment it transfers into a genre other than the scientific journal article? To what degree does scientific writing draw on narrative modes, employing devices such as metaphor, plot, drama, rhetoric in order to produce a stable and persuasive body of knowledge? What sort of knowledge is made available by literary fiction, and can fiction itself therefore be said to be a science of sorts? How can we use recent discoveries in science, such as neuroscience or evolution, to inform our interpretations of literary texts? Without invoking that outmoded postmodern belief that science has no greater claims to reality than any other way of looking at the world, when these sorts of questions are raised they trouble the "two cultures" boundary, broadening the remit of "knowledge" as construed by the sciences and the arts.

It seemed to me that very rarely did the two approaches come together. Presenters were either theorising science and literature, or historicising, but not really making connections across the parallel approaches. This is particularly odd because the matriarch of science and literature (and President of the BSLS), Gillian Beer, stood in the shoes of both the historicist and the theorist in both of her seminal works. Darwin's Plots shows how Darwin's language and rhetoric were essential to the way his argument operates and convinces, and Open Fields: Science in Cultural Encounter shows how science and literature interplayed in the late Victorian period in a way which makes the "two cultures" differences of the twentieth century seem quite arbitrary. For anyone working in science and literature, these works are founding manifestos of sorts; yet in a sense the fact that one of the most formidable (but charming) scholars of the present moment wrote them reminds us how difficult it is to do this sort of interdisciplinary work in a way that makes best use of science's introduction to literary studies to create a new paradigm for the latter.

If it is to be conducted in the fullest way, I would argue that science and literature must avoid doing two things. On the one hand, it cannot simply seize on scientific texts as just other examples of influential historical documents through which to understand a poem. On the other, it must avoid turning to science in order to claim some positivist legitimacy for literary studies, as if to say that literary criticism is a science just like physics, when in fact if there is a scientific knowledge encoded within literature and literary studies, it is a science of a different sort to that encoded in molecular mechanics. The latter is precisely what the current hot topic, evolutionary literary criticism, risks doing, when at its worst it appears to say that reading Jane Austen can somehow improve your evolutionary survival in society - which is simply to give a gloss of scientific kudos to what is essentially an old Arnoldian argument that reading literature is a moral activity (see Joseph Carroll's Evolution and Literary Criticism).


Posted by Alistair at 9:13 am

The Science of Seaweed

Wednesday, July 16, 2008

Dr. Frithjof Küpper cites one of his research interests as lying in the "chemical ecology of marine algae and microbes." Basically, Dr. Küpper researches seaweed. However, as Radio 4's recent Material World programme on the subject indicated, the science of seaweed is a highly significant research area.

In a co-authored paper entitled "Iodide Accumulation Provides Kelp with an Inorganic Antioxidant Impacting Atmospheric Chemistry," Küpper and his collaborators published their finding that large brown seaweed releases a form of iodine to protect itself from sunlight or low-lying atmospheric ozone. Consequently, iodine emissions may cause localised clouds to form. Harnessing this effect may make it easier to seed rain clouds in drought-ridden areas, and appreciating the global extensiveness of this biospheric feedback loop may have implications in understanding and tackling climate change.

My reason for this brief post is to highlight the anarchistic nature of scientific discoveries. For whilst the public may respond positively to news about breakthroughs in cures for cancer (the Daily Express carries headlines about one nearly every week) or more fuel efficient cars, they probably care little about the science of seaweed. However, the story admirably picked up by Material World indicates that the majority of science is not headline-grabbing stuff. It is often conducted in niche areas that may at first glance appear to have no relevance to human society, with the science itself being done purely to satisfy the curiosity of those involved.

This, then, exemplifies one of the myths of scientific research. For, asked which project they would prefer to fund, I expect most lay people would choose the cancer cure option. However, science is a holistic enterprise. You cannot necessarily have the cure for cancer, or the solution to the climate change crisis, without understanding esoteric processes in apparently unrelated fields. Although with a limited pot of money funding bodies necessarily prioritise and exclude some research proposals, it is not always possible to predict from which area vital findings are going to emerge.

Indeed, one suspects that many people will have balked at the $8 billion cost of the new Large Hadron Collider at CERN. What need for an expensive camera designed to take pretty snapshots of elementary particles? Sure, it may provide newspaper editors with impressive photos of small men standing in giant machines (The Guardian recently ran a supplement feature on the LHC), but surely there are cheaper ways to spark the imagination than by producing particulate fireworks? Well, probably the same questions were raised in 1990. Then a chap called Tim Berners-Lee came along and, faced with the need to share data among research groups, invented the World Wide Web, by which you are reading these very words. The sciences of seaweed or particles indicate that whilst some sciences may not be appreciated in anticipation of great discoveries, their inestimable value often emerges through hindsight. I suspect that when it becomes operational in August, along with its terabytes of data the LHC will continue to prove this one, vital rule.


Posted by Alistair at 7:52 am

The Funny Side of the Moon

Friday, July 11, 2008

Last Friday, I gave a conference paper on the moon. Well, OK, the paper itself was delivered in an old grammar school, the walls of which were carved with rather elegant seventeenth-century graffiti; but the paper was about the moon, particularly the NASA missions. Recently watching the elegiac documentary In The Shadow of the Moon (which won Best Documentary at the Sundance Festival), I was struck by the reading of Genesis performed by the Apollo 8 astronauts as they orbited the moon on Christmas Eve, 1968. Once back on Earth, the astronauts were astonished to learn that not everyone was happy with this Biblical reading, which they had thought was just “something appropriate” to the context; they were sued, unsuccessfully, by militant atheist Madalyn Murray O’Hair.

My paper was about the way in which ideology is apparently neutralised from the context of space, so that a specifically Christian passage is perceived – I argued naively – to apply to all mankind. Likewise, the plaque left by Apollo 11, “We came in peace for all mankind,” belies the fact that the missions were precisely the product of the Cold War, and that Communists probably had a less auspicious sense of the occasion. I argued that these moments maintain the myth of the Enlightenment: that scientific reason is also automatically socially reasonable, being applied for the good of universal humanism. In a sort of quasi-Marxist, quasi-postmodernist reading, I tried to deconstruct the myth of the moon missions as being the pinnacle performance of science for the audience of all mankind. I also looked at 2001: A Space Odyssey (1968) and Norman Mailer’s A Fire on the Moon (1970), showing how the first, though an apparently operatic celebration of scientific progress, is actually as doubtful about science as a universal humanist pathway as is the second, A Fire on the Moon, Mailer’s caustic retrospective on the Apollo 11 mission. In their markedly different ways, both texts encapsulate a comparable uneasiness about the moral universalism of the space missions, an uneasiness which retrospective celebrations, such as In the Shadow of the Moon, can elide.

I am hoping to write this into a full journal article, so I don’t want to say any more about it now (though just in case you get the wrong impression, and have not read my other Science and Culture pieces on this blog, I am not anti-science per se, just against unselfconscious science which fails to appreciate that society may not unanimously view science in the same positivistic way as the scientists within it do). However, I did want to provide a video that should, in and of itself, demolish that myth that the moon missions were a pure ambition, untainted by realpolitik, ideology, or crass commercialism. Take a look at the following, an excerpt from CBS News’ contemporary coverage of Apollo 11, and when you have stopped laughing, come back and tell me that the space missions were truly transcendent affairs, not grounded in the reality of Western capitalism:



If my paper was about the dark side of the moon, however, it was pleasant to discover that there is also a humorous side to the moon. When I returned to my computer that evening, the BBC carried a story about a caller to the police who was worried about an unexplained "bright stationary object" in the night sky. Here is the transcript:

Control Room: "South Wales Police, what's your emergency?"

Caller: "It's not really. I just need to inform you that across the mountain there's a bright stationary object."

Control room: "Right."

Caller: "If you've got a couple of minutes perhaps you could find out what it is? It's been there at least half an hour and it's still there."

Control: "It's been there for half an hour. Right. Is it actually on the mountain or in the sky?"

Caller: "It's in the air."

Control: "I will send someone up there now to check it out."

Caller: "OK."

The mystery was soon solved, as the exchange between control and an officer at the scene makes clear.

Control: "Alpha Zulu 20, this object in the sky, did anyone have a look at it?"

Officer: "Yes, it's the moon. Over."



Actually, of course, it's not so funny when you stop and think about the waste of police resources. But for a brief moment it is worth a laugh at the funny side of the moon.


Posted by Alistair at 7:45 am

What is Art?

Tuesday, June 17, 2008

There being no Euro 2008 on telly (or rather: because I am a dedicated student of culture), on Sunday night I went to a debate entitled "What is Art?", which was being run as part of my university's arts festival. The panel comprised a philosopher, two directors of modern art galleries, a theologian, and the director of Resonance FM.

I will not rehearse the debate here, which meandered largely around familiar ground, but I just wanted to note the way in which the various definitions put forward in relation to the question might be used to transgress the boundaries between science and art. I jotted down some of the epithets each contributor offered in answer to the "What is Art?" issue; these included:
  • Accident becoming intention: the artist is never quite sure of the destiny of his or her work from the outset, and there is always the sense of the haphazard about art which is then justified as such only after it has been produced
  • Reproducing consciousness in others: the artwork acts as a vehicle for the imagination by which the viewer can occupy the perspective with which the artist views a particular aspect of the world
  • Pleasure: art is that which generates a response that transcends (note the romanticism) or stands beyond further expression or deconstructive analysis
  • Utility: art can have a public function, either memorialising events to be shared by the community, or by generating a sense of excitement about the potential of a region or city (something the Resonance FM representative completely overlooked when he derided the Angel of the North as worthless kitsch - hardly something that will go down well with the residents of the rejuvenated Newcastle Gateshead, a destination whose numerous cultural sites receive more visitors per capita than London)
  • Vision: this one, not surprisingly, was contributed by the theologian, but is probably not too far removed from the ideals of pleasure and reproducing consciousness in others
All of these examples seem fairly mainstream in aesthetic debates, although naturally no one example is capable of containing the full range of what might be, or what has been, considered as art (or, with equal applicability, literature or music). And the one thing missing from the list was ideology: art is whatever a particular culture defines as such because it suits the norms or incarnates the values that the culture wants to perpetuate. Clearly such a view is not one that curators of publicly funded galleries can subscribe to. But enough Marxism; I want to focus really on the way in which each argument survives the translation across the disciplinary divide, into the sphere of scientific activity.

If accident becoming intention defines an artwork, does this not also describe Alexander Fleming's petri dishes, left unintentionally on a windowsill but leading to the discovery of antibiotics? If art is the reproduction of consciousness in others, might this not also be the effect of scientific writing, the conventions of which should allow any other scientist to step into the shoes of his predecessor and see the world - albeit within an emotionally neutral framework - as if through his eyes when he conducted the original experiment? Certain scientific writing, such as The Origin of Species, has a clear aesthetic quality, able to generate pleasure in its reader through rhetorical means; but I suspect that the moment when the most dispassionate paper generates new knowledge is not unlike the moment in literature or art when you recognise what you had always known to be true in the world, but never quite so succinctly or elegantly expressed. The ideals of vision and utility pretty much speak for themselves.

I suspect that the most viewed images (artwork?) of the last couple of weeks were not paintings or photographs in a gallery in London, but those astonishing shots captured by the Phoenix lander on Mars, some 35 million miles away. What is so remarkable is the self-consciousness of the shots: here is little Earthbound me, looking at an image taken by a man-made machine, which is looking at itself (or at least its leg), on another world. The pictures are a medium for the mind, vicariously transporting me imaginatively so that I can feel what it must be like to fly (there's transcendence again) beyond Earthly limits, to plant my foot on another world. I am not sure that cognitively, my response to these images is far removed from that which I might have standing before a Picasso. Science might in and of itself possess aesthetic qualities, as a recently-published book entitled The Ten Most Beautiful Experiments implies.

On the other hand, bringing art and science into uncanny proximity encourages me also to note a contrast that might provide my own epithet to use in response to the question "What is Art?" With apologies to Heidegger, I would suggest there is between art and science a general difference between being and becoming.

As I have suggested above, science has many of the same agendas as art, though the methods and tones in which the enterprise is couched seem superficially different. However, the test of success for the process of science is a test of ends, of being; the test of a successful piece of art is one of bringing that art into being, of means floating independently of specific ends. There is no such thing as art, but art describes the process of creating the artefacts which might be given such a name.

The ideal scientific experiment will be replicable numerous times, with no unexpected deviation from the predictions of the model or formula. The model or formula may initially be revealed by accidents like Fleming's mould, but once that process has become known science aims to remove any possibility of the accident happening again; the test of scientific knowledge is its predictive quality: that the same conditions will produce the same state in comparable situations.

Art, however, is a process rather than an end, the becoming about of that entity that might (or might not) be named art once the process is complete. One of the panelists (the one for whom art was defined by its pleasure-giving capacity) noted that he played the accordion very badly, but that he enjoyed the experience of making music, even if his listeners found his results unbearable. Musical notation might be said to be like scientific writing, in the sense that it is a formal recording system that enables anyone able to read the system to reproduce the original product. Except, of course, the whole point of musicality is that there is no such direct correspondence. The accordion player may not be able to reproduce the notes with perfect fidelity, but this does not necessarily mean that the process of reproduction is - for him - unmusical; it is a process of becoming, of discovering a connection between the self and the music that is not definitively posited or founded in the score. One might make a similar point about literary language, in which the creative word floats free of its author (even if, contra Barthes, the author is not quite dead), such that freshly creative interpretations of the same material are possible, even encouraged.

For the scientist, however, the failure to reach the anticipated end when he conducts an experiment signifies either a failure in the hypothesis, or in the methodology he is repeating, or that conditions not present in that original moment have had an unanticipated effect; such "errors" can, of course, turn out to be very productive in leading science down new paths. However, because a second experiment that fails to produce the same state of being as the first must lead to further experiments, the reproduction is not self-contained; it does not carry its end or purpose within itself.

By contrast, for the accordion player, the fact that he fails to reproduce the notes with the fidelity intended by the composer is essentially irrelevant to his or her personal enjoyment and investment in the process (or becoming) of reaching that end (or being); he or she may want to reproduce the notes more accurately in the future, but the process itself will remain satisfying because it is one of new creation personal to him or her. Indeed, if the player reaches professional standard, the test of his ability will not be whether he can reproduce the musical consciousness of the composer by translating the score through the medium of the instrument, but the degree to which he or she is also mediating, that is to say, translating and interpreting the music in a newly productive deviation from the original intention.

So what implication does this contrast between being and becoming of science and art have on the question "What is Art?" Essentially, I think, it is to signify that the question what is art cannot be grounded in any intrinsic quality of the artefact; nor can it be left ungrounded by talking romantically about metaphysical pleasure that cannot be referred to the mind of the creator or receiver; nor is the idea of utility particularly workable, given that econometrics cannot predict the social value of Guernica as opposed to the latest Damien Hirst installation.

Rather than a top-down approach to the question, by which the art is produced and we then must try to categorise it, the contrast between being and becoming operates in a bottom-up direction: art is whatever is produced with a sense of artistry, or art is the process of generating the thing that has the potential to be named "art." Though a tautology - or hermeneutic circle - it is a feasible definition because it refocuses attention not on the receivers of art but on the producers. The links that bind a viewer to art (or whatever is classified as "art") are potentially unrelated to any quality inherent in the artefact, perhaps intruding through ideology or preconceptions of what good art must do; on my model, there is a very definite connection between the artist and the production (art does not just spring from thin air), and any response to "What is Art?" must attend to the materialities (whether the cognitive processes in the mind of the artist or the nature of the medium being worked) that relate the artist to his creation, not those that flicker between a creation and a viewer.

Additionally, in spite of my contrast between science and art, this does not exclude the former from the potential of the latter: the child's process of discovering that a prism can split light into the rainbow may be treated as the artistic one of the child becoming conscious of a world otherwise hidden; likewise the scientist's process of discovery when something does not happen as expected might also be classed as art under my definition, no matter what the formal properties of the final result. If Fleming's experience of the growth of mould catalysed, for him, a comparable sense of personal growth, the process was artistic, regardless of the aesthetic qualities inherent (or not) in the green goo at the end of that becoming. On the other hand, not all science may be experienced in this cognitive way by the person conducting the experiment, whereas all art, or all science that is art, must be.


Posted by Alistair at 7:59 am

Am I Normal? Spirituality and Psychiatry

Tuesday, April 29, 2008

Until the BBC iPlayer was released, there would have been no point in blogging about programmes which the reader would have no chance of watching again. But the iPlayer is available, and so too is the exemplary documentary I watched last night: Am I Normal? presented by psychologist Dr. Tanya Byron.

In the hour-long film - a sensible, grown-up film without patronising background music or silly graphics - she explored the fine line between religious devotion and psychiatric disorder. Why is it that Pentecostals who speak in tongues are considered blessed, but schizophrenics who hear voices are institutionalised? Why is it that we pass by the street evangelist, thinking him to be slightly weird, but consider the grey-haired Carmelite nun, silently passing time in a convent, to be harmless?

Byron - an atheist herself - was open-minded about the value of religious belief for some people (statistically, patients with a spiritual background are more likely to recover from psychiatric syndromes than are atheists). But she was quite prepared to damn the cult of faith healing, which lacks any substantial evidence base and which may raise false hope for patients with severe medical conditions best treated by mainstream physical interventions. She was respectful in probing the values and beliefs of atheists (Matthew Parris) and believers (Jeremy Vine) alike. She witnessed an evangelical song meeting, noting the same symptoms of crowd arousal - raised arms, physical proximity - as occur at football matches and rock concerts. She was intrigued by a trained psychiatrist who treated patients by exorcising the dead child spirits by whom they were possessed, seemingly (though no hard data is available) with results akin to those achieved by therapies such as CBT. Byron examined the neuroscience of talking in tongues (neurotheology). This has shown how the neurological system that regulates semantic language does shut down when people are being "possessed" as mediums for the "spirit," proving that they are not deliberate fakes, though it does not (cannot) prove either way the mechanism by which the synaptic action happens in the first place, whether supernaturally Holy or a self-induced behaviour.

This serious and sensitive look at what could have been a greatly divisive issue ought to be well-received by religious believers, atheists and scientists. It did not make grand claims to prove or disprove the existence of God, or to castigate religion as anti-science (though this was implicitly there in the background, in the consistent lack of an evidence base for alternative therapies and faith healing). Rather, it stuck to its remit to expose the conventions by which "normal" is determined, and it concluded with some force that what we classify as psychologically normal - and the normal therapies deployed to treat psychiatric disorders - are generally socially-constructed ideologies.

Because of this, many of the conventions and methods between treatments may be comparable at root. I noted that the psychiatrist-exorcist asked many questions of his patient whilst rhetorically planting ideas; a similar sort of approach is used by mainstream therapy or even by the Eliza chatbot (the latter, a simple artificial intelligence programme, is peculiarly effective at helping interlocutors to express their anxieties). It seems that treating patients with psychological problems may be done effectively through talking with God, inner demons, keyboards, doctors or priests. The challenge science and religion must meet now is to confront the evidence: even if normal and mad are arbitrary categories, there must be one form of treatment that is most effective, for most people, most of the time. One suspects the scientists may be very prepared to explore this. The priests, less so. But with the likes of Tanya Byron moderating, there may be hopes for a start.
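The Eliza technique mentioned above can be sketched in a few lines, which is part of why its therapeutic effect is so striking. What follows is a minimal illustration of the pattern-matching and pronoun-reflection idea behind Weizenbaum's 1966 program; the rule set, response strings and function names are my own invention for the example, not the original script:

```python
import re
import random

# Keyword rules tried in order: a regex plus canned response templates.
# "{0}" is filled with the captured fragment, reflected back at the speaker.
RULES = [
    (r"i need (.*)", ["Why do you need {0}?", "Would it really help you to get {0}?"]),
    (r"i am (.*)", ["How long have you been {0}?", "Why do you think you are {0}?"]),
    (r"my (.*)", ["Tell me more about your {0}.", "Why does your {0} concern you?"]),
    (r"(.*)", ["Please go on.", "How does that make you feel?"]),
]

# Swap first- and second-person words, so "my job" becomes "your job".
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are", "you": "I"}

def reflect(fragment: str) -> str:
    return " ".join(REFLECTIONS.get(w, w) for w in fragment.lower().split())

def respond(statement: str, choose=random.choice) -> str:
    """Return an Eliza-style reply to the patient's statement."""
    for pattern, responses in RULES:
        match = re.match(pattern, statement.lower().strip(".!"))
        if match:
            return choose(responses).format(*(reflect(g) for g in match.groups()))
    return "Please go on."
```

Even this toy version shows why such a program can feel like attentive questioning: telling it "My job worries me" gets the patient's own words reflected straight back as a prompt to elaborate, much as the psychiatrist-exorcist's questions rhetorically planted ideas while appearing only to listen.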


Posted by Alistair at 8:27 am

Back to Kubrick's Future: Revisiting 2001: A Space Odyssey

Because the broad remit of my research allows such things, since Christmas I have gone beyond the infinite universe of books to write on science fiction film, with my current focus being Stanley Kubrick's 1968 masterpiece, 2001: A Space Odyssey. Watching this in 2008, and reading about its reception at the time, is a slightly bemusing experience.

As Jerome Agel's contemporary edited collection, The Making of Kubrick's 2001 reports, critics at the time were less than complimentary about Kubrick's ten million dollar baby (the contrast with the universal acclaim for Grand Theft Auto IV released today could not be more striking). Some excerpts from the more damning reviews:
You could see it a dozen times and still not understand it. But then, you didn't really expect to understand a movie that took $10.5 million and four years to make, did you?
The guesses of Messrs. Kubrick and Clarke must be as good as ours.
Were 2001 cut in half it would be a pithy and potent film, with an impact that might resolve the "enigma" of its point and preclude our wondering why exactly Mr. Kubrick has brought us to outer space in the year 2001...We hope he sticks to his cameras and stays down to earth - for that is where his triumph remains.
Granted: 2001 is the head flick of all time. Note the faintly resinous spoor of the audience, the people fighting at intermission to get those 50-cent chocolate bars, the spaced-out few who contemplate the curtain for long minutes after the movie ends.
The tedium is the message.
That last piece of pithy genius is from Joseph Gelmis, but in a second review, having watched the film again, he acknowledges one of the problems reviewers of the film at the time faced:
When a film of such extraordinary originality as Stanley Kubrick's 2001: A Space Odyssey comes along it upsets the members of the critical establishment because it exists outside their framework of apprehending and describing movies. They are threatened. Their most polished puns and witticisms are useless because the conventional standards don't apply. They need an innocent eye, an unconditioned reflex and a flexible vocabulary. With one exception (The New Yorker's Penelope Gilliatt), the daily and weekly reviewers offhandedly dismissed the film as a disappointment or found it an ambitious failure.
Gelmis's first review in Newsday (April 4, 1968) classified it precisely in these terms. However, his second review admired the fact that it "uncompromisingly demands acceptance on its own unique terms." Unfortunately, as Gelmis noted, such a refusal to buckle to the audience's demands for simple plot and exegesis meant that its stark originality did not make sense except on a second or third viewing.

But this is precisely why I am so surprised by all the negative reviews from 1968. Because, in 2008, one can only ever watch 2001 for the first time having already seen it many times before. This is to say that anyone who has ever played Frontier Elite to the soundtrack of the Blue Danube Waltz, or seen adverts for the Apple Macintosh, or watched Star Wars or Star Trek or last year's science fiction hit Sunshine has already experienced Kubrick's vision. It is hard to overemphasise how odd seeing 2001 retrospectively is; its visual coinage has been in the cinematic economy for four decades now, and numerous shots first witnessed in 1968 set off echoes in the head today. It is therefore impossible to read the contemporary reviews objectively, without a sense of historical irony: unless, like Gelmis, they were prepared to watch it a second time, they would all be proved wrong.

However, before one gets too heady with schadenfreude, one is brought down to earth with a bump. Kubrick's aesthetics may have survived in the cinematic medium, but the vision of science has not been realised by 2008 in reality. At the time, that famous dissolve in which the spinning bone morphs into a rotating space satellite signified the compression of technological development. A year before man actually did land on the moon, space travel and intelligent computers must have seemed a mere frame of history away. Looking back today, we are reminded that 2001 did not see the rise of artificial intelligence nor space exploration.

Indeed, a year before 2001 we'd all been terrified by millennium bugs infecting cranky dumb machines. That AI has failed to come to fruition as Kubrick and Clarke anticipated can be seen not as endorsing the fact that the human mind is so advanced no machine can match it, but that the human mind is so limited that it never can invent a machine to match it. For the twenty-first-century spectator of 2001, perhaps the most profound message is that Clarke and Kubrick, writing in the heyday of the space race and the Eliza chatbot, wrongly judged the acceleration of scientific development. In the twenty-first century the chronology of history and the future-time of the novel have switched places. Thus HAL symbolises not so much the potential nightmare we want to avoid as the dream we may never realise, given our own limited knowledge in comparison with that represented in his omniscient but fictional mind.

A similarly depressing story is told by 2001's vision of space travel. Famously, this is presented as being entirely mundane. It involves talk about freeze dried sandwiches ("What's that? Chicken?" "Something like that. Tastes the same anyway."), inane birthday greetings from mum and dad, lounging on sun beds. However, as we know from the Columbia disaster, space remains a risky and colossally expensive business. It is the specialist enterprise of big government, not space tourists (though Virgin Galactic may be seeking to change that).

Space science today is mundane, but in a significantly different way to that which Kubrick imagined. Until it was taken over by images of galaxies colliding - admittedly a pretty exciting firework, though not of our making - the BBC space section was reporting news of the Galileo satellite launch. Space is going to give us better sat nav so that we don't get stranded down country lanes on the way to the Dog and Duck. In comparison, the grand voyages to Jupiter and beyond the infinite seem - in the finite historical timeframe that separates 1968, 2001 and 2008 - a sorry world away.


Posted by Alistair at 8:00 am

Retrospective Reading: Frankenstein and the Embryology Debate

Tuesday, April 08, 2008

I recently presented a conference paper on science fiction, considering the theoretical problems of reading retrospectively, after its one-time futuristic visions have now been technologically realised. In one of the examples I used, contemporary reviewers of H.G. Wells' War of the Worlds were impressed by Wells' evolutionary imaginings of how the Martians might look, and how they might be defeated by bacteria; they enjoyed his novel presentation of heat rays, tripods and flying machines. But they do not seem to have focused much on how the invasion narrative was intended as a critique of Victorian society in his present, showing how quickly the veneer of civilisation would drop away under the stress of war. However, modern readings now emphasise the novel as a social satire, an approach given added plausibility since World War One did indeed bring Victorian civilisation almost to its knees, through the use of poison gas and flying machines.

In a lecture presented to the Royal Institution entitled "The Discovery of the Future," Wells ascribed to creative writers (himself included) the ability to discern the future with a near empirical accuracy. Like a palaeontologist who by piecing together fossil fragments is able to reconstruct prehistory, the creative writer is able to assimilate the ideas of the present and project a reliable scientific vision of the future. Whilst in the postmodern age of textual relativism such a view seems always suspect, Wells is not unique in holding this perspective on science fiction, though he is rare in the objective force of his argument. Wells would, I suspect, have got on with the recently departed Arthur C. Clarke, who similarly argued in his essay "Hazards of Prophecy: The Failure of Imagination" that good science fiction should be grounded in extrapolations of present reality, unless it was to become mere fantasy.

However, it seems clear that science fiction does not have any strong claim to predictive validity. Any judgements it makes are given empirical weight only with the benefit of hindsight. In order to seem predictive, science fiction only needs to be lucky once. Star Trek's communicator device seems not unlike a contemporary mobile phone, and so Star Trek is taken as a good predictor of the future. But where are the holodecks, warp drives, and voice-activated computers? Certainly, all these sorts of things will come to pass eventually - virtual reality, space travel, intelligent-type machines. But in reality they will come about not because Star Trek made them so, and not primarily because science has been inspired by the series, but because when they come to be we will recollect the fiction and structure the contemporary technologies according to its earlier, fictional versions. If science fiction seems to present an accurate picture of the future, it is principally because fiction is always going to be reframed in terms of the present.

The reason for this excursion into literary theory of science fiction is that the recent debate about the embryology bill currently being legislated in Parliament has also employed a science fiction text in considering the ethics of the present. The bill would allow scientists to create human-animal embryos for research purposes. Cytoplasmic embryos containing 99.9% human DNA, and the remainder animal, would be grown in the lab for a few days, and then be harvested for stem cells to be used in research into cognitive degeneration diseases: Parkinson's, Alzheimer's, Motor Neurone Disease.

However, whilst the scientific research that would be allowed by the legislation is specific and with particular medical benefits, the reaction to the bill - orchestrated by the Catholic church - has been anything but subtle. Particularly grabbing the headlines was the Easter sermon of the Archbishop of Edinburgh, Cardinal Keith O'Brien. He polemicised:

This bill represents a monstrous attack on human rights, human dignity and human life.

In some other European countries one could be jailed for doing what we intend to make legal.

I can say that the government has no mandate for these changes: they were not in any election manifesto, nor do they enjoy widespread public support.

The opposite has indeed taken place - the time allowed for debate in parliament and indeed in the country at large has been shockingly short.

One might say that in our country we are about to have a public government endorsement of experiments of Frankenstein proportion - without many people really being aware of what is going on.

Many excuses are being made for this present legislation, particularly that cures will soon be found for various diseases which afflict mankind through this legislation.

My objection to the Cardinal's squeals lies in his use of the terms "monstrous" and "Frankenstein" as a catch-all phrase designed to prevent engagement with his argument on any logical grounds, instead invoking the spirit of innate disgust. Given my introductory discussion about the retrospectivity of science fiction, what happens when Frankenstein is introduced into a debate like this (as it has previously been in relation to Genetic Modification, in the form of "Frankenfoods")?

The use of the "Frankenstein" metaphor disrupts logic. It prevents readers and listeners from considering what the science's future really is - immediate and specialised, to grow cells for a few days in a petri dish - and expands it in a limitless bubble of blind ambition. As we inevitably reconstruct the present science in terms of the past text, it seems as if Mary Shelley definitively predicted this would happen, that scientists in a laboratory in Newcastle would try to tamper with life in a grand way (they are, objectively, not doing this - simply manipulating a few cells, not whole human bodies). Therefore, any other such claims made in the fiction take on empirical weight as the definition of where science will inevitably, with absolute predictive truth as envisaged by Wells, want to travel morally in its discovery of the future.

Like a giant and unilateral weight, the fiction text is dropped on the science to make a number of associative predictions. The Cardinal invokes sexual deviancy: "The norm has always been that children have been born as the result of the love of man and woman in the unity of a marriage." Frankenstein indeed insinuates a slightly incestuous relationship between Victor and Elizabeth; because Frankenstein was right about scientists tampering with life, it must also be right about the horror of a society in which heterosexual monogamy is no longer an automatic given. The Cardinal challenges us to allow life "to triumph over these deathly proposals"; given the connection with Frankenstein, the implication is that if we fail to prevent the legislation we are performing the moral equivalent to Frankenstein's graveyard robberies. Because one aspect of Frankenstein's legacy appears to have come true, so must all the other aspects of the text.

Rereading Frankenstein, though, as I currently am, I am struck by how much more nuanced it is. Frankenstein is far from pure evil, which is why he is such a compelling and interesting figure. His ambition is directed to the best of purposes, to "renew life where death had apparently devoted the body to corruption." This is a reading which would also apply to the scientists, but the focus in the Cardinal's argument is not on them personally, but on the hideous objects - hybrids of life and technology - which they create. Does the Cardinal not think that scientists doing the work have themselves weighed carefully and personally the ethics of doing this research against the ethics of failing to pursue research which will almost certainly provide great medical benefits? In the novel there are numerous cases of ambition and intent for far less admirable and transient ends than those of Frankenstein - financial gain, sexual desire - even if Victor's methods are the most distasteful. Victor Frankenstein may confront the reader with a moral case, but he is far from simply morally corrupt. Frankenstein is a dialogue in the life sciences, not a diatribe against them. It is also a science - in the broadest sense - of human life, human nature, human passion and desire, and where the limits of the desire that drives civilisation should be curtailed.

Frankenstein is a wholly appropriate text to bring to the debate about embryological research, and the biosciences generally. Its nuances make it an ideal philosophical abstraction by which we can think through the ethics of science in a general sense, outside of the frantic contexts of our current time. However, it needs to be done in a way that treats the narrative with the complexity it deserves, not just extracting those elements which seem to mesh, with absolute predictive force, with where science is in the present. Constructing the present in terms of the past is a dangerous business, because we are doomed to carry out only the lessons from it which stand out most starkly. Those who oppose embryological research need to read carefully the fictional texts that they choose to use as empirical evidence; they should not unreflexively extract those moments that seem to suit their singular ends so well in the present.


Posted by Alistair at 4:33 pm Post your comments (0)

Fredric Brown's "Answer": A Short Story of the Internet (from 1964)

Thursday, January 31, 2008

Dinah Birch's Times Literary Supplement review of Brian Aldiss's latest Science Fiction Omnibus cites a "piercingly brief story" by Fredric Brown, called "Answer." It is brief. But if it is piercing, this is not wholly to do with its succinctness, but has much to do with its prescience. Written in 1964, there's something very disturbing, sublime and awe-ful about this description of the internet, as it was not then known. The story is probably still in copyright, but - what the heck - I loved it so much, and it's so brief, that I invoke the interests of "fair use" (and wider dissemination) to reproduce it here:
Dwar Ev ceremoniously soldered the final connection with gold. The eyes of a dozen television cameras watched him and the subether bore throughout the universe a dozen pictures of what he was doing.
He straightened and nodded to Dwar Reyn, then moved to a position beside the switch that would complete the contact when he threw it. The switch that would connect, all at once, all of the monster computing machines of all the populated planets in the universe -- ninety-six billion planets -- into the supercircuit that would connect them all into one supercalculator, one cybernetics machine that would combine all the knowledge of all the galaxies.
Dwar Reyn spoke briefly to the watching and listening trillions. Then after a moment's silence he said, "Now, Dwar Ev."
Dwar Ev threw the switch. There was a mighty hum, the surge of power from ninety-six billion planets. Lights flashed and quieted along the miles-long panel.
Dwar Ev stepped back and drew a deep breath. "The honor of asking the first question is yours, Dwar Reyn."
"Thank you," said Dwar Reyn. "It shall be a question which no single cybernetics machine has been able to answer."
He turned to face the machine. "Is there a God?"
The mighty voice answered without hesitation, without the clicking of a single relay.
"Yes, now there is a God."
Sudden fear flashed on the face of Dwar Ev. He leaped to grab the switch.
A bolt of lightning from the cloudless sky struck him down and fused the switch shut.
A large part of my current research involves looking for the ancestors of the concept of the cyborg, or posthuman (see N. Katharine Hayles or Donna Haraway). It's academic, dense, and theoretical - a quest for the roots of an idea that is ideologically very old, though its shiny technological manifestations are superficially novel (think The Terminator, The Matrix, the human genome project). But my circuitous (forgive the pun!) philosophy is brightened by anecdotal moments which connect the past - an age before the internet - to the present, in a way that reminds us in an instant that the human imagination has long transcended the limits of its environment, without the need for virtual reality helmets or the hyperlink. Which leads me to one other prescient factoid I recently discovered: the idea of the hyperlink, the structuring of information by association of content rather than alphabetical order, is almost unanimously traced back to Vannevar Bush, with his Memory Extender. The date he first raised the idea: 1933 - before even Alan Turing, let alone Tim Berners-Lee.


Posted by Alistair at 10:42 pm Post your comments (3)

Royal Society's Public Understanding of Science Report (1985)

Wednesday, September 19, 2007

I've just been skimming through the Royal Society's 1985 report into the Public Understanding of Science. Just over twenty years since it was published, I have not got time to go through the detail with a fine-tooth comb and observe how many of its recommendations have been taken up. A couple of details I did pick up, however: the recommendation that all universities introduce some form of "general studies" to allow students to learn from experts from disciplines other than their main one; and the idea that all science PhDs should be required to produce a brief publicly accessible report on their research (such as a press release) as a requirement of graduating. Neither of these proposals has been taken up directly across Higher Education. However, the key buzzword of academia today is very definitely interdisciplinarity, bringing together ideas and academics from different departments (even different faculties); in relation to the second point, the UK Grad programme includes training on writing press releases and publicising research. Whether these are a result of the Royal Society's influence, I don't know, but I expect the report contributed towards the atmosphere of positive change.

As for the findings of the public understanding of science in relation to the media, the picture is more pessimistic. Take a look at these examples of Bad Science, and you will see that fundamentally things have not changed in two decades.


Posted by Alistair at 3:11 pm Post your comments (0)

Miraculous Mitosis

Tuesday, September 18, 2007

I commented a couple of months ago about how Darwinian evolutionary theory is now so firmly entrenched in my mind that I cannot conceive of life working in any radically different way. However, that is not to say that my amazement at the natural world is in any way diminished (and, in large part, my photoblog is a celebration of the environment). Reading geneticist Mark Ridley's Mendel's Demon: Gene Justice and the Complexity of Life, I come across the following description of mitosis:
Eukaryotic cells have a distinct method of cellular reproduction. The genes and other cellular components first double up inside the cell. A special machinery of cables forms inside the cell, and they mechanically pull the two sets of genes into the two opposite halves of the parent cell. A membrane then forms between the two halves and division is complete. Such is the normal process of cell division, called mitosis, for instance in a growing plant or animal.
As a literary critic, I am aware that much of my seduction by this passage is triggered by Ridley's investment of agency in the cells, and his use of humanising metaphors: they "first double up"; "a special machinery forms"; "they mechanically pull." In fact, there is no such thing as "they" in a cell, which is simply a biological component, not a conscious or semi-conscious identity. It is only from the human perspective (and especially that of a popular science book) that it appears remarkable that cells pull sets of genes apart in a game of biotic tug-of-war. From the gene's eye view of the world, though, there is nothing intentional or teleological about the act; it is an entirely mundane process that gets on with its cellular housekeeping while someone is eating, or opening a window, or just walking dully along (apologies to Auden).

Nevertheless, even when you escape from the framings and manipulations of the text, there is something close to miraculous about watching this process - which happens billions of times a day, and has done for billions of years - going effortlessly about the action of creating life.


Posted by Alistair at 7:52 am Post your comments (0)

Postgraduate Diary: If In a Literature Thesis a PhD Student..., or, The Lotarian Trap

Thursday, September 13, 2007

In Italo Calvino's famous meditation on the relationship between novelists and readers, If On a Winter's Night A Traveller, comes a warning about the fundamental trap of a literary research thesis:
A girl came to see me who is writing a thesis on my novels for a very important university seminar in literary studies. I see that my work serves her perfectly to demonstrate her theories, and this is certainly a positive fact - for the novels or for the theories, I do not know which. From her very detailed talk, I got the idea of a piece of work being seriously pursued, but my books seen through her eyes seem unrecognisable to me. I am sure this Lotaria (that is her name) has read them conscientiously, but I believe she has read them only to find in them what she was already convinced of before reading them.
In science, you carefully choose the dataset on which you will run a test for a hypothesis, selecting a target which will provide results most efficiently and with the minimum of uncontrolled variables. But, ultimately, the dataset chosen by the scientist should be entirely irrelevant: the data must be independent of the conclusion if the scientific theory is valid. In the apocryphal story, Newton may have been standing under an apple tree when he reasoned the theory of gravity, but that theory applies equally whether the observational data is falling apples or dropped bombs. Were the theory to stop being applicable in a comparable situation - under a plum tree, for example - then the theory would have been falsified, such that we would need to recognise either that the theory must be fundamentally wrong, or that it requires modification in order that it apply (or appreciates why it cannot apply) equally for different varieties of fruit (or, more realistically, in the extreme conditions of entities such as black holes).

In literary study, however, the division between theory and data is less clear cut, as the Calvino passage makes clear in its parody. Currently researching some of Umberto Eco's semiotic theories, I notice that although deconstruction claims itself as a method applicable across all texts and language - since it places language as the very centre of our way of being in and knowing the world - most often the texts to which it is applied are always already open to deconstructive readings: works that are self-referential, embrace paradox rather than conclusiveness, are conscious of their being as texts. Thus Barthes examines some stories by Edgar Allan Poe, but not the editorial correspondence from the New England Magazine in which many of them were published.

And literary writers such as Italo Calvino (or A.S. Byatt, Umberto Eco, John Fowles), conscious of the ways in which the academy will appraise their texts, deliberately pre-empt and parody those modes of criticism. Thus texts such as If On a Winter's Night adopt what I call the critically sarcastic attitude. A Lotaria, or other academic reader, comes to the work from a pre-conceived theoretical angle, finds that the text deconstructs itself (or performs according to the predictions of some other theory), and thus the text can do nothing but applaud that critic ironically: "Well done," it says, "of course such and such a theoretically knowing symbol/structure/tone/philosophy etc. was there. I put it there. I knew you would come looking for it."

I am not a poststructuralist myself, though I am aware that I regularly (often subconsciously) dip into its toolbox in my analysis of texts, just as I do Marxism, psychoanalysis, historicism, or the close readings of new criticism. However, though I do not have a single preconceived critical angle, in my research I still risk falling into the Lotarian trap.

Without giving my game away too much (anonymity matters, as does the intellectual property of my original idea in my thesis), I am examining the use of a particular metaphor in literary fiction and science. Now hovering on the brink of its third year, my research is well-developed, most chapters are drafted or written, my ideas are well-honed and focused. Among other things, I am going to be looking at four novels and a couple of films which use my metaphor. However, to select these - effectively my dataset - I discarded tens of other novels which I read over the previous two years which did not happen to contain the image or symbol for which I was looking. It is therefore inevitable that I will give the impression that I "read them only to find in them what [I] was already convinced of before reading them." This is where Chapter One: The Introduction comes in, and I realise only now that, in spite of its being only a small component, it is probably the most important single chapter of my thesis, since it is this that will make or break it in a viva.

If I fully admit the qualifications and paradoxes of my research there, then what follows will stand or fall by the internal logic of the framework I have publicly set myself; I admit my theory works only within the orchard of texts in which I have chosen to wander. If I fail to recognise the inherent limits of my methods, however, then a single plum dropped by the examiner will falsify it, showing my data to rely wholly on my theory, rather than existing independently of it. The moral of my experience, and of Calvino's story, and of The Lotarian Trap, is that literary theory becomes a bad pseudoscience when it seems to explain both apples and plums.


Posted by Alistair at 9:33 am Post your comments (0)

Enemies of Reason

Sunday, August 12, 2007

In an era in which science is under threat from religious fundamentalism, medical quackery, and general scaremongering, there has been a scientific backlash against all forms of thinking outside the scheme of rational empiricism. This is evidenced in the Channel 4 documentary "Enemies of Reason," in which Richard Dawkins chases down "primitive" beliefs and outmoded ways of thinking that "impoverish our culture." Likewise, in the United States, the biologist Jerry Coyne recently asserted in Edge magazine:
We don't reject the supernatural merely because we have an overweening philosophical commitment to materialism; we reject it because entertaining the supernatural has never helped us understand the natural world. Alchemy, faith healing, astrology, creationism—none of these perspectives has advanced our understanding of nature by one iota.
The economic and political reasons for this polarising antagonism are understandable (see this previous post). However, in historical terms this total rejection of supernaturalism can be challenged. The first half of my research thesis examines the history of supernaturalist encounters from within - rather than opposed to - mainstream empirical science. It shows how, from Athanasius Kircher's Mundus Subterraneus, through the nineteenth-century's Society for Psychical Research, through James Clerk Maxwell's thermodynamic demon, to Marvin Minsky's demonic model of consciousness, rationalists have engaged with the supernatural when science reached the limits of Enlightenment methods of enquiry. And whilst science since the Enlightenment has driven us through multiple technological revolutions in a remarkably short span of time, it is worth remembering (as Coyne clearly has not) that for the majority of human history supernatural ways of interpreting the world have been the dominant ones, and human knowledge and technology still developed over this far longer period, albeit much less spectacularly.

In my view, then, the distinction between scientific and supernatural epistemologies is not quite so polar as scientists such as Dawkins or Coyne make out (though I appreciate their motivations). I should add that my argument does not assert that supernatural methods are in any systematic way better than rational ones, nor that things like ghosts or demons or astrological effects exist in reality, only that thinking that they might exist, and using alternative methods under that assumption, can produce insights normal science would struggle to reach were it to follow its usual paths. Once the alternative approaches map out the new ground, often quarantined from normal practice by being labelled "thought experiments" or "placeholder terms," science invariably assumes control once again in matching, or falsifying the match between, hypothesis and reality. I have to be supremely careful in my research that whilst re-evaluating the historical value of supernatural modes of enquiry, I also demarcate its limits, where rational science takes over with its time-honoured methodological reliability.

The best way to tackle the assertion that supernaturalism is the equal of science would be through systematically deconstructing supernaturalist claims and exposing them as empirically unreliable, whilst allowing that in special cases supernaturalism offers science a subtle sub-set of the methodologies at its disposal. Nevertheless, given the level of scientific illiteracy among the general public, the influence of a press generally insensitive to the difference between good and bad science, and, in my own field, the belief among postmodern academics that science is a relativistic and ideological epistemology, it is very tempting to do a wholesale demolition job on supernaturalist beliefs, and lose the subtlety of their merging with rationality. Thus the acerbic tone adopted by Paul Gross and Norman Levitt in their critique of postmodernist theory, tellingly entitled Higher Superstition; the aggressive manner adopted by Richard Dawkins; the patronising voice adopted by popular defenders against Bad Science, such as Ben Goldacre and David Colquhoun.

But when I read the response to Dawkins' programme by Neil Spencer, the Observer's astrologer, I realised that in spite of the nuances of my research it can be very difficult to avoid taking this directly oppositional stance in the public sphere, when the claims made are so obviously empirically false, and the tone of the supernaturalist thinker is just as acerbic as that of the scientist of whom he complains. Inspired by the methods of Goldacre and Colquhoun, I tried to deconstruct his counter-attack, in which he asserts the value of superstition, astrology, and alternative medicine. I start with astrology:
There was the usual objection to astrology dividing people into 12 Sun signs, and my usual reply: that's eight more than the Myers-Briggs personality test used by commerce. Actually, astrology's basic personality types number 1,728.
Rather than "more being better," one would expect that a personality model that divides people into four types will be more reliable than one that uses 1,728, since even a randomised response to the Myers-Briggs test would give a subject a 1-in-4 chance of being placed in the correct category (although as I understand it the test actually uses 16). It is not feasible that I fit neatly into one of 1,728 personality types; standard personality tests, by contrast, do not give absolute categories, but percentages which allow for people to straddle groups. Further, the Myers-Briggs test relies on subjects answering questions about themselves, and draws conclusions from that data based on aggregate samples of a large population. By contrast, astrology draws conclusions from the stars, and applies them to people based on nothing more than the coincidence of their birthday. Rather than people determining the range of possible personalities (which is what we do tacitly in everyday life when meeting another person for the first time, with a large degree of success), astrologers cherry-pick from a pool of personalities and apply them to people according to the rigid and arbitrary rule of celestial mechanics. As Dawkins showed, a reading for one star sign such as Capricorn has the same predictive value for an individual born under a different sign as for the person actually born in January.
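The chance-baseline point here can be made concrete with a toy simulation (a sketch of my own, in Python, rather than anything Spencer or Dawkins offers): if a system assigns people to one of k types at random, it still "matches" about 1/k of the time, so the more finely a scheme slices its types, the lower the bar that pure chance sets for it.

```python
import random

def random_match_rate(num_types: int, trials: int = 100_000, seed: int = 42) -> float:
    """Estimate how often a purely random type assignment happens to
    match a person's 'true' type, given num_types categories."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        true_type = rng.randrange(num_types)     # the type a person 'really' is
        guessed_type = rng.randrange(num_types)  # a type assigned at random
        if guessed_type == true_type:
            hits += 1
    return hits / trials

# Chance baselines for the type counts discussed above:
# 4 broad types ~ 0.25, 16 Myers-Briggs types ~ 0.06, 1,728 astrological types ~ 0.0006
for k in (4, 16, 1728):
    print(k, round(random_match_rate(k), 4))
```

On this naive model, a four-type scheme must beat a 25 per cent chance baseline to show any skill at all, while a 1,728-type scheme can look uncannily "specific" while outperforming almost nothing.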

But if the numbers game does not work, there's always the name game:
Am I bothered by Dawkins calling me names? Not really. I'm in some esteemed company - Resurgence publisher Satish Kumar, and Dr Peter Fisher, clinical director of the Royal Homeopathic Hospital (and the Queen's physician) - also fall under Dawkins' stony disapproval.
Declaring himself unaffected by being called names, he nevertheless decides to name names himself, assuming we will be impressed where he was not. So, in keeping with this intelligent tactic, let us name names back at him: Pinker, Crick, Maxwell, Darwin, Kelvin, Einstein... Actually, rather than going on with this squabbling, which is conducted on the level of a playground argument, let's switch to some serious empirical scrutiny:
Homeopathy's supposed cures are, according to Dawkins, merely the result of the placebo effect. 'It's our own minds that cure the pain,' he concludes. How that explains why animals respond to homeopathy isn't confronted.
I'm not sure which study Spencer was thinking of in asserting that animals respond to homeopathy. It certainly wasn't the large-scale, double-blind, placebo controlled trial on dairy herds in Sweden in 2003, which found no evidence of effect, but a "considerable risk to animal welfare" in the continuing use of the treatment. Nor was it this study from the Veterinary Record in 2006. Or this one from Oslo. Or this from Canada. In fact, if you use Google Scholar to search for "homeopathy animal placebo," you will be hard pressed to discover any of the evidence Neil Spencer cites (or, rather, fails to cite, given that he gives no further references).

But wait a minute. Clearly I am the one being silly by looking for such scientific studies at all. Perhaps the failure to detect any difference between placebo and homeopathic remedies is precisely that:
Everything must be subject to randomised, controlled double-blind trials, just like medical drugs - 'drugs that work' as Dawkins insists.
Now, instead of tackling Spencer by evidence, I'm just getting angry. That bloody medical science, always so pernickety when deciding whether or not to produce expensive quantities of a drug and release it into a large medical population; so annoyingly demanding in its tests for the effectiveness of alternative therapies. There is certainly a case for containing the burden of proof on medical trials, and for separating responsibility for testing from the pharmaceutical companies which produce the treatments (Goldacre himself comments on this in The Guardian this week). But in the meantime, I'm not sure I trust the coin-toss method.

Though, having said that, according to Spencer we are not certain of getting better even with drugs which have been subjected to such a lengthy, scientifically controlled testing process:
The medical profession admits that the success of approved drugs can be as low as 60 per cent.
True. But according to a study in the quacks' journal Homeopathy, the success rate of that alternative therapy is around 70 per cent, so not much better than mainstream medicine. (Though the study asked patients who had paid for and received homeopathic treatment - with no placebo control - whether they thought their condition had improved. Surprise, surprise, having handed over wads of cash, many of them did.) And when you consider that most mainstream medicine will often be treating otherwise chronic, life-threatening illness, whereas homeopathy will tend not to be used by people lying incapacitated in intensive care wards, the apparently lower success rate of some approved drugs is understandable.

Finally, keeping the argument at its markedly unsubtle ebb, we get back to names again:
Galileo was, after all, astrologer as well as astronomer. Likewise Johannes Kepler, who was preoccupied with Pythagorean mathematics and Platonic solids. Isaac Newton was fascinated by alchemy, as was Robert Boyle, father of chemistry.
It is noticeable that all these scientists date from before the eighteenth century, and it is entirely consistent with theories of paradigm shift that the new scientific methodology did not immediately replace the old, supernaturalist speculations. Today, four hundred years on, and with science having consistently proved its superiority, one would hope that the scientific revolution had been completed.

Nevertheless, the fact that it has not remains interesting; in the esteemed company of Boyle and Newton, I am intrigued by astrology and alternative therapy too, or I would not be dedicating a substantial chunk of my thesis to them. Likewise, the Times Higher this week reports on the nine "psi" research groups across UK universities. As parapsychologist Chris French explains, "The fact is that the majority of the population does believe in this stuff, and a sizeable minority of the population claims to have had direct experience of the paranormal. If psychologists have nothing to say about this topic, they are missing out on a broad part of human experience." Indeed, Dawkins' own programme featured a psychologist interested in the depth of belief in water dowsers; comically, they continued to believe they could dowse, despite their success rate being exposed as no better than would be expected by chance. I was disappointed that Dawkins, as an interested scientist, did not ask the follow-up question: dowsing outside laboratory conditions must have some effect, given its survival into the twenty-first century. Possibly water dowsers are excellent interpreters of natural signs, such as increased vegetation or the changing lie of the land, and might well use this entirely explicable if implicit method, rather than the twitching branch itself, to predict where water might run. Learning how they become so expert at interpretation would be fascinating, as indeed was Derren Brown's analysis of the manipulations of "cold reading" used by spiritualists at seances (believe me, once the illusionist reveals the subtle pressures they exert on an audience, those who continue to do it believing they are actually communicating with spirits seem nothing more than silver-tongued salesmen).

As serious researchers correctly suggest, there is no doubt that astrology, supernaturalism, ghosts are part of human culture. Whether they exist or not in the physical world, they undeniably exist for half of us in the mind, which is why even the most rational of scientists sometimes use them in thought experiments to provoke the scientific community into debate. They are therefore worthy of physical, psychological, and in my case literary study, and it is this significance that proponents should assert. Were they to do so, they would make opponents like Dawkins appear to be attacking a straw man, and one moreover which allows itself to be subjected to the same rigorous empirical enquiry as the more mainstream science of which he is an exponent.

But this will never happen, so the view has to remain thus: it is interesting that humans fall for it; it is interesting that it once was thought to work; it has generated some valid knowledge in the past. But just as I could use a flint to light a fire but prefer a match, true science has a way of getting things done which alternative therapies and superstitious beliefs simply cannot match. This is why I, like Dawkins and the other defenders of reason, find it hard to do otherwise than mock and patronise the absurd beliefs and false claims of a "primitive" such as Spencer.


Posted by Alistair at 9:50 pm Post your comments (2)

The content of this website is Copyright © 2009 using a Creative Commons Licence. One term of this copyright policy is that Plagiarism is theft. If using information from this website in your own work, please ensure that you use the correct citation.
