Review of Richard Milton - The Facts of Life, Shattering the myth of Darwinism.rtf
species. In fact, modern Darwinians agree with Darwin himself that natural selection chooses among
individuals within species. Such a fundamental misunderstanding would be bound to have far-reaching
consequences; and they duly make nonsense of several sections of the book.
In genetics, the word ‘recessive’ has a precise meaning, known to every school biologist. It means a gene
whose effect is masked by another (dominant) gene at the same locus. Now it also happens that large
stretches of chromosomes are inert - untranslated. This kind of inertness has not the smallest connection with
the ‘recessive’ kind. Yet Milton manages the feat of confusing the two. Any slightly qualified referee would
have picked up this clanger.
There are other errors from which any reader capable of thought would have saved this book. Stating
correctly that Immanuel Velikovsky was ridiculed in his own time, Milton goes on to say "Today, only forty
years later, a concept closely similar to Velikovsky’s is widely accepted by many geologists - that the major
extinction at the end of the Cretaceous ... was caused by collision with a giant meteor or even asteroid." But
the whole point of Velikovsky (indeed, the whole reason why Milton, with his eccentric views on the age of the
earth, champions him) is that his collision was supposed to have happened recently; recently enough to
explain Biblical catastrophes like Moses’s parting of the Red Sea. The geologists’ meteorite, on the other
hand, is supposed to have impacted 65 million years ago! There is a difference - approximately 65 million
years difference. If Velikovsky had placed his collision tens of millions of years ago he would not have been
ridiculed. To represent him as a misjudged, wilderness-figure who has finally come into his own is either
disingenuous or - more charitably and plausibly - stupid.
In these post-Leakey, post-Johanson days, creationist preachers are having to learn that there is no mileage
in ‘missing links.’ Far from being missing, the fossil links between modern humans and our ape ancestors
now constitute an elegantly continuous series. Richard Milton, however, still hasn’t got the message. For him,
"...the only ‘missing link’ so far discovered remains the bogus Piltdown Man." Australopithecus, correctly
described as a human body with an ape’s head, doesn’t qualify because it is ‘really’ an ape. And Homo
habilis - ‘handy man’ - which has a brain "perhaps only half the size of the average modern human’s" is ruled
out from the other side: "... the fact remains that handy man is a human - not a missing link." One is left
wondering what a fossil has to do - what more could a fossil do - to qualify as a ‘missing link’?
No matter how continuous a fossil series may be, the conventions of zoological nomenclature will always
impose discontinuous names. At present, there are only two generic names to spread over all the hominids.
The more ape-like ones are shoved into the genus Australopithecus; the more human ones into the genus
Homo. Intermediates are saddled with one name or the other. This would still be true if the series were as
smoothly continuous as you can possibly imagine. So, when Milton says, of Johanson’s ‘Lucy’ and
associated fossils, "the finds have been referred to either Australopithecus and hence are apes, or Homo and
hence are human," he is saying something (rather dull) about naming conventions, nothing at all about the fossils themselves.
But this is a more sophisticated criticism than Milton’s book deserves. The only serious question raised by its
publication is why. As for would-be purchasers, if you want this sort of silly-season drivel you’d be better off
with a couple of Jehovah’s Witness tracts. They are more amusing to read, they have rather sweet pictures,
and they put their religious cards on the table.
The Third Culture
RICHARD DAWKINS ON W.D. HAMILTON (1936-2000)
W D Hamilton is a good candidate for the title of most distinguished Darwinian since Darwin. Other
candidates would have to include R A Fisher, whom Hamilton revered as a young student at Cambridge.
Hamilton resembled Fisher in his penetrating biological intuition and his ability to render it in mathematics.
But, like Darwin and unlike Fisher, he was also a superb field naturalist and explorer. I suspect that, of all his
twentieth century successors, Darwin would most have enjoyed talking to Hamilton. Partly because they
could have swapped jungle tales and beetle lore, partly because both were gentle and deep, but mostly
because Hamilton the theorist was responsible for clearing up so many of the very problems that had
intrigued and tantalised Darwin.
William Donald Hamilton FRS was Royal Society Research Professor in the Department of Zoology at
Oxford, and a Professorial Fellow of New College. He was born in 1936, spent a happy childhood botanising
and collecting butterflies in Kent, was educated at Tonbridge, then Cambridge where he read Genetics. For
his Ph.D. he moved to London where he was jointly enrolled at University College and LSE. He became a
Lecturer at Imperial College in 1964, where his teaching skills were not highly rated. After a brief Visiting
Professorship at Harvard, he accepted a Museum Professorship at the University of Michigan in 1977.
Finally, in 1984 he moved to Oxford at the invitation of Richard Southwood, who had been his Professor at Imperial College.
Hamilton was showered with medals and honours by the academies and learned societies of the world. He
won the Kyoto Prize, the Fyssen Prize, the Wander Prize, and the Crafoord Prize - instituted by the Swedish
Academy because Alfred Nobel unaccountably failed to include non-medical Biology in his list of eligible
subjects. But honours and recognition did not come early. The autobiographical chapters of Hamilton's
collection of papers, Narrow Roads of Gene Land, reveal a lonely young man driven to self-doubt by lack of
comprehension among his peers and superiors. To epitomise the Cambridge of his undergraduate days,
where "many biologists hardly seemed to believe in evolution" he quotes one senior professor: "Insects do
not live for themselves alone. Their lives are devoted to the survival of the species . . ." This is "Group
Selection", a solecism which would cause today's biology undergraduates to wince, but they have the
advantage of a post-Hamilton education. The young Hamilton felt that in Cambridge he was wincing alone.
Only the cantankerous Fisher made sense to him, and he had been advised that Fisher "was good with
statistics but knew nothing about biology."
For his doctoral work he proposed a difficult mathematical model with a simple conclusion now known as
"Hamilton's Rule." It states that a gene for altruistic self sacrifice will spread through a population if the cost to
the altruist is outweighed by the benefit to the recipient devalued by a fraction representing the genetic
relatedness between the two. Hamilton's original paper was so difficult and innovative that it almost failed to
be published, and was largely ignored for a decade. When finally noticed, its influence spread exponentially
until it became one of the most cited papers in all of biology. It is the key to understanding half the altruistic
cooperation in nature. The key to the other half - reciprocation among unrelated individuals - is a theory to
which Hamilton was later to make a major contribution, in collaboration with the social scientist Robert Axelrod.
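The rule described above compresses to a one-line inequality, usually written rB > C. A minimal sketch in Python (the relatedness values are the standard ones, but the payoff numbers are my own illustrative choices, not drawn from Hamilton's paper):

```python
def altruism_spreads(r, b, c):
    """Hamilton's Rule: a gene for altruism is favoured when r * b > c,
    where r is the genetic relatedness between altruist and recipient,
    b the fitness benefit to the recipient, and c the cost to the altruist."""
    return r * b > c

# A full sibling shares half your genes (r = 0.5), a first cousin one eighth.
print(altruism_spreads(0.5, 3.0, 1.0))    # True: helping a sibling at 3:1 pays
print(altruism_spreads(0.125, 3.0, 1.0))  # False: the same act toward a cousin
```

The strict inequality matters: when r * b exactly equals c the gene is selectively neutral. The sibling example is the quantitative content behind Haldane's famous quip about laying down his life for two brothers or eight cousins.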
The great obsession of his later career was parasites - their evolutionary rather than their medical impact.
Over twenty years, Hamilton convinced more and more biologists that parasites are the key to many
outstanding problems left by Darwin, including the baffling riddle of the evolution of sex. The sexual shuffling
of the genetic pack is an elaborate trick for outrunning parasites in the endless race through evolutionary
time. This work led Hamilton into the arcane world of computer simulation, where his models were as richly
textured, in their way, as his beloved Brazilian jungle. His spin off theory of sexual selection (how Darwin
would have relished it!) was that bird of paradise tails and similar male extravaganzas are driven by the
evolution of female diagnostic skills: females are like sceptical doctors, actively seeking parasite-free males
to supply genes for their shared posterity. Male advertisement is an honest boast of health.
Hamilton's mathematical models never became arid; they were laced with, and often inspired by, bizarre
natural history. Would that every mathematical lump were leavened, as Hamilton's were, by eye-witness
accounts of, say, the male mite who copulates with all his sisters and then dies before any of them are born.
Or of aphid females who give live birth to their daughters and granddaughters simultaneously.
For most scientists, good ideas are a scarce commodity, to be milked for everything they are worth. Hamilton,
by contrast, would bury, in little throwaway asides, ideas for which others would kill. Sometimes he buried
them so deeply that he overlooked them himself. Extreme social life in termites poses a particular
evolutionary problem not shared by the equally social ants, bees and wasps. An ingenious theory exists,
widely attributed to an author whom I shall call X. Hamilton and I were once talking termites, and he spoke
favourably of X's theory. "But Bill", I protested, "That isn't X's theory. It's your theory. You thought of it first."
He gloomily denied it, so I asked him to wait while I ran to the library. I returned with a bound journal volume
and shoved under his nose his own discreetly buried paragraph on termites. Eeyorishly, he conceded that,
yes, it did appear to be his own theory after all, but X had explained it much better. In a world where scientists
vie for priority, Hamilton was endearingly unique.
Those who loved him saw a Felix with nine lives. Charmingly accident-prone, Bill would always bounce back.
A childhood experiment with explosives cost him several finger joints of his right hand. He was frequently
knocked off his bicycle, probably because of misjudgements by Oxford motorists who couldn't believe a man
of his age with a great shock of white hair could possibly cycle so fast. And he travelled dangerously in wilder
and more remote places than Oxford. He hiked through Rwanda at the height of the civil war, and was
treated as a spy, so implausible was his (true) story that he was looking for ants. Held up at knife point in
Brazil, he made the mistake of fighting back, and was viciously wounded. He jumped into an Amazon
tributary when his boat was sinking, in order to plug the hole, like the little Dutch boy, with his thumb (the
ferocity of Piranha fish, he explained, is over-rated). Finally, to gather indirect evidence for the theory (of
which he was a strong supporter) that the AIDS virus was originally introduced into the human population in
an oral polio vaccine tested in Africa in the 1950s, Hamilton went, with two brave companions, to the depths
of the Congo jungle in January this year. He was rushed back to London, apparently with severe malaria,
seemed to recover, then collapsed into complications and coma. This time, he didn't bounce back.
He is survived by his wife, Christine, from whom he had been amicably separated for some time, by their
three daughters Helen, Ruth and Rowena, and by his devoted companion of recent years, Luisa Bozzi.
(This obituary also appeared in The Independent - 3.10.2000)
RICHARD DAWKINS is an evolutionary biologist and the Charles Simonyi Professor For The Understanding
Of Science at Oxford University; Fellow of New College; author of The Selfish Gene, The Extended
Phenotype, The Blind Watchmaker, River Out Of Eden (Science Masters Series), Climbing Mount Improbable,
and Unweaving The Rainbow.
SCIENCE AND SENSIBILITY
Queen Elizabeth Hall Lecture, London, 24th March 1998. Series title: Sounding the Century (‘What will the
Twentieth Century leave to its heirs?’)
With trepidation and humility, I find myself the only scientist in this list of lecturers. Does it really fall to me
alone to ‘sound the century’ for science; to reflect on the science that we bequeath to our heirs? The
twentieth could be science’s golden century: the age of Einstein, Hawking and relativity; of Planck,
Heisenberg and Quantum Theory; of Watson, Crick, Sanger and molecular biology; of Turing, von Neumann
and the computer; of Wiener, Shannon and cybernetics, of Plate Tectonics and radioactive dating of the
rocks; of Hubble’s Red Shift and the Hubble Telescope; of Fleming, Florey and penicillin; of moon landings,
and – let’s not duck the issue – of the hydrogen bomb. As George Steiner noted in the previous lecture, more
scientists are working today than in all other centuries combined. Though also – to put that figure into
alarming perspective – more people are alive today than have died since the dawn of Homo sapiens.
Of the dictionary meanings of sensibility, I intend "discernment, awareness" and "the capacity for responding
to aesthetic stimuli". One might have hoped that, by century’s end, science would have been incorporated
into our culture, and our aesthetic sense have risen to meet the poetry of science. Without reviving the mid-century pessimism of C P Snow, I reluctantly find that, with only two years to run, these hopes are not
realised. Science provokes more hostility than ever, sometimes with good reason, often from people who
know nothing about it and use their hostility as an excuse not to learn. Depressingly many people still fall for
the discredited cliché that scientific explanation corrodes poetic sensibility. Astrology books outsell
astronomy. Television beats a path to the door of second rate conjurors masquerading as psychics and
clairvoyants. Cult leaders mine the millennium and find rich seams of gullibility: Heaven’s Gate, Waco, poison
gas in the Tokyo underground. The biggest difference from the last millennium is that folk Christianity has
been joined by folk science-fiction.
It should have been so different. The previous millennium, there was some excuse. In 1066, if only with
hindsight, Halley’s Comet could forebode Hastings, sealing Harold’s fate and Duke William’s victory. Hale-Bopp in 1997 should have been different. Why do we feel gratitude when a newspaper astrologer reassures
his readers that Hale-Bopp was not directly responsible for Princess Diana’s death? And what is going on
when 39 people, driven by a theology compounded of Star Trek and the Book of Revelation, commit
collective suicide, neatly dressed and with overnight bags packed by their sides, because they all believed
that Hale-Bopp was accompanied by a spaceship come to "raise them to a new plane of existence"?
Incidentally, the same Heaven’s Gate Commune had ordered an astronomical telescope to look at Hale-Bopp. They sent it back when it came, because it was obviously defective: it failed to show the accompanying spaceship.
Hijacking by pseudoscience and bad science fiction is a threat to our legitimate sense of wonder. Hostility
from academics sophisticated in fashionable disciplines is another, and I shall return to this. Populist
‘dumbing down’ is a third. The ‘Public Understanding of Science’ movement, provoked in America by Sputnik
and driven in Britain by alarm over a decline in science applicants at universities, is going demotic. A spate of
‘Science Fortnights’ and the like betrays a desperate anxiety among scientists to be loved. Whacky
‘personalities’, with funny hats and larky voices, perform explosions and funky tricks to show that science is
fun, fun, fun.
I recently attended a briefing session urging scientists to put on ‘events’ in shopping malls, designed to lure
people into the joys of science. We were advised to do nothing that might conceivably be a ‘turn-off’. Always
make your science ‘relevant’ to ordinary people – to what goes on in their own kitchen or bathroom. If
possible, choose experimental materials that your audience can eat at the end. At the last event organized by
the speaker himself, the scientific feat that really grabbed attention was the urinal, which automatically
flushed as soon as you stepped away. The very word science is best avoided, because ‘ordinary people’ find it threatening.
When I protest, I am rebuked for my ‘elitism’. A terrible word, but maybe not such a terrible thing? There’s a
great difference between an exclusive snobbery, which no-one should condone, and a striving to help people
raise their game and swell the elite. A calculated dumbing down is the worst, condescending and patronising.
When I said this in a recent lecture in the United States, a questioner at the end, no doubt with a warm glow
in his white male heart, had the remarkable cheek to suggest that ‘fun’ might be especially necessary to bring
‘minorities and women’ to science.
I worry that to promote science as all larky and easy is to store up trouble for the future. Recruiting
advertisements for the army don’t promise a picnic, for the same reason. Real science can be hard but, like
classical literature or playing the violin, worth the struggle. If children are lured into science, or any other
worthwhile occupation, by the promise of easy frolics, what happens when they finally confront the reality?
‘Fun’ sends the wrong signals and might attract recruits for the wrong reasons.
Literary studies are at risk of becoming similarly undermined. Idle students are seduced into a debased
‘Cultural Studies’, where they will spend their time ‘deconstructing’ soap operas, tabloid princesses, and
tellytubbies. Science, like proper literary studies, can be hard and challenging but science is – again like
proper literary studies – wonderful. Science is also useful; but useful is not all it is. Science can pay its way
but, like great art, it shouldn’t have to. And we shouldn’t need whacky personalities and explosions to
persuade us of the value of a life spent finding out why we have life in the first place.
Perhaps I’m being too negative, but there are times when a pendulum has swung too far and needs a push in
the other direction. Certainly, practical demonstrations can make ideas vivid and preserve them in the mind.
From Michael Faraday’s Royal Institution Christmas Lectures, to Richard Gregory’s Bristol Exploratory,
children have been excited by hands-on experience of true science. I was myself honoured to give the
Christmas Lectures, in their modern televised form, with plenty of hands-on demonstrations. Faraday never
dumbed down. I am attacking only the kind of populist whoring that defiles the wonder of science.
Annually in London there is a large dinner, at which prizes for the year’s best science books are presented.
One prize is for children’s science books, and it recently went to a book about insects and other so-called
‘ugly bugs.’ Such language is not best calculated to arouse the poetic sense of wonder, but let that pass.
Harder to forgive were the antics of the Chairman of the Judges, a well known television personality (who had
credentials to present real science, before she sold out to ‘paranormal’ television). Squeaking with game-show levity, she incited the audience to join her in repeated choruses of audible grimaces at the
contemplation of the horrible ‘ugly bugs’. "Eeeuurrrgh! Yuck! Yeeyuck! Eeeeeuurrrgh!" That kind of vulgarity
demeans the wonder of science, and risks ‘turning off’ the very people best qualified to appreciate it and
inspire others: real poets and true scholars of literature.
The true poetry of science, especially 20th century science, led the late Carl Sagan to ask the following acute question:
"How is it that hardly any major religion has looked at science and concluded, ‘This is better than we
thought! The Universe is much bigger than our prophets said, grander, more subtle, more elegant’? Instead
they say, ‘No, no, no! My god is a little god, and I want him to stay that way.’ A religion, old or new, that
stressed the magnificence of the Universe as revealed by modern science might be able to draw forth
reserves of reverence and awe hardly tapped by the conventional faiths."
Given a hundred clones of Carl Sagan, we might have some hope for the next century. Meanwhile, in its
closing years, the twentieth must be rated a disappointment as far as public understanding of science is
concerned, while being a spectacular and unprecedented success with respect to scientific achievements.
What if we let our sensibility play over the whole of 20th century science? Is it possible to pick out a theme, a
scientific leitmotif? My best candidate comes nowhere near doing justice to the richness on offer. The
twentieth is The Digital Century. Digital discontinuity pervades the engineering of our time, but there is a
sense in which it spills over into the biology and perhaps even the physics of our century.
The opposite of digital is analogue. When the Spanish Armada was expected, a signalling system was
devised to spread the news across southern England. Bonfires were set on a chain of hilltops. When any
coastal observer spotted the Armada he was to light his fire. It would be seen by neighbouring observers,
their fires would be lit, and a wave of beacons would spread the news at great speed far along the coastal chain.
How could we adapt the bonfire telegraph to convey more information? Not just "The Spanish are here" but,
say, the size of their fleet? Here’s one way. Make your bonfire’s size proportional to the size of the fleet. This
is an analogue code. Clearly, inaccuracies would be cumulative. So, by the time the message reached the
other side of the kingdom, the information about fleet size would have degraded to nothing. This is a general
problem with analogue codes.
But now here’s a simple digital code. Never mind the size of the fire, just build any serviceable blaze and
place a large screen around it. Lift the screen and lower it again, to send the next hill a discrete flash. Repeat
the flash a particular number of times, then lower the screen for a period of darkness. Repeat. The number of
flashes per burst should be made proportional to the size of the fleet.
This digital code has huge virtues over the previous analogue code. If a hilltop observer sees eight flashes,
eight flashes is what he passes along to the next hill in the chain. The message has a good chance of
spreading from Plymouth to Dover without serious degradation. The superior power of digital codes has been
clearly understood only in the twentieth century.
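The contrast between the two beacon codes can be sketched as a toy simulation (the Gaussian noise model and all of the numbers are my own assumptions, purely for illustration): the analogue relay lets per-hop errors accumulate as a random walk, while the digital relay re-rounds to a whole number of flashes at every hilltop, regenerating the message each time.

```python
import random

def relay_analogue(size, hops, noise_sd=0.1):
    """Each hilltop estimates the fire's size and rebuilds it;
    small per-hop errors accumulate and are never corrected."""
    signal = float(size)
    for _ in range(hops):
        signal += random.gauss(0.0, noise_sd)
    return signal

def relay_digital(flashes, hops, noise_sd=0.1):
    """Each hilltop counts discrete flashes; rounding to the nearest
    whole number regenerates the exact message at every hop."""
    signal = float(flashes)
    for _ in range(hops):
        signal = round(signal + random.gauss(0.0, noise_sd))
    return int(signal)

random.seed(1588)  # the year of the Armada
print(relay_analogue(8, 50))  # has drifted away from 8.0
print(relay_digital(8, 50))   # 8, intact from Plymouth to Dover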
Nerve cells are like armada beacons. They ‘fire’. What travels along a nerve fibre is not electric current. It’s
more like a trail of gunpowder laid along the ground. Ignite one end with a spark, and the fire fizzes along to
the other end.
We’ve long known that nerve fibres don’t use purely analogue codes. Theoretical calculations show that they
couldn’t. Instead, they do something more like my flashing Armada beacons. Nerve impulses are trains of
voltage spikes, repeated as in a machine gun. The difference between a strong message and a weak is not
conveyed by the height of the spikes – that would be an analogue code and the message would be distorted
out of existence. It is conveyed by the pattern of spikes, especially the firing rate of the machine gun. When
you see yellow or hear Middle C, when you smell turpentine or touch satin, when you feel hot or cold, the
differences are being rendered, somewhere in your nervous system, by different rates of machine gun
pulses. The brain, if we could listen in, would sound like Passchendaele. In our meaning, it is digital. In a
fuller sense it is still partly analogue: rate of firing is a continuously varying quantity. Fully digital codes, like
Morse, or computer codes, where pulse patterns form a discrete alphabet, are even more reliable.
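The rate code described here can be caricatured in a few lines (a toy sketch under my own simplifying assumptions: a 100-tick window and perfectly regular spikes, nothing like real neural dynamics). Every spike is the same all-or-none height; the stimulus strength lives only in how many of them fire.

```python
def encode(intensity, window=100):
    """Rate code: stimulus intensity sets how many fixed-height
    spikes fire within a time window of `window` ticks."""
    n_spikes = int(intensity * window)
    return [1] * n_spikes + [0] * (window - n_spikes)

def decode(spike_train):
    """The receiver recovers intensity from the firing rate alone,
    ignoring spike height entirely."""
    return sum(spike_train) / len(spike_train)

train = encode(0.3)   # a moderate stimulus
print(decode(train))  # 0.3, read off the spike count, not the spike size
```

Scaling any spike's height up or down leaves the decoded message untouched, which is the sense in which the code is (partly) digital.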
If nerves carry information about the world as it is now, genes are a coded description of the distant past.
This insight follows from the selfish gene view of evolution.
Living organisms are beautifully built to survive and reproduce in their environments. Or that is what
Darwinians say. But actually it isn’t quite right. They are beautifully built for survival in their ancestors’
environments. It is because their ancestors survived – long enough to pass on their DNA – that our modern
animals are well-built. For they inherit the very same successful DNA. The genes that survive down the
generations add up, in effect, to a description of what it took to survive back then. And that is tantamount to
saying that modern DNA is a coded description of the environments in which ancestors survived. A survival
manual is handed down the generations. A genetic Book of the Dead.
Like the longest chain of beacon fires, the generations are uncountably many. No surprise, then, that genes
are digital. Theoretically the ancient book of DNA could have been analogue. But, for the same reason as for
our analogue armada beacons, any ancient book copied and recopied in analogue language would degrade
to meaninglessness in very few scribe generations. Fortunately, human writing is digital, at least in the sense
we care about here. And the same is true of the DNA books of ancestral wisdom that we carry around inside
us. Genes are digital, and in the full sense not shared by nerves.
Digital genetics was discovered in the nineteenth century, but Gregor Mendel was ahead of his time and
ignored. The only serious error in Darwin’s world-view derived from the conventional wisdom of his age, that
inheritance was ‘blending’ – analogue genetics. It was dimly realised in Darwin’s time that analogue genetics
was incompatible with his whole theory of natural selection. Less clearly realised, it was also incompatible
with obvious facts of inheritance. The solution had to wait for the 20th century, especially the neo-Darwinian
synthesis of Ronald Fisher and others in the 1930s. The essential difference between classical Darwinism
(which we now understand could not have worked) and neo-Darwinism (which does) is that digital genetics
has replaced analogue.
But when it comes to digital genetics, Fisher and his colleagues of the Synthesis didn’t know the half of it.
Watson and Crick opened floodgates to what has been, by any standards, a spectacular intellectual
revolution – even if Peter Medawar was going too far when he wrote, in his review of Watson’s The Double Helix:
"It is simply not worth arguing with anyone so obtuse as not to realise that this complex of discoveries is
the greatest achievement of science in the twentieth century."
My misgiving, about this engagingly calculated piece of arrogance, is that I’d have a hard time defending it
against a rival claim for, say, quantum theory or relativity.
Watson and Crick’s was a digital revolution and it has gone exponential since 1953. You can read a gene
today, write it out precisely on a piece of paper, put it in a library, then at any time in the future reconstitute
that exact gene and put it back into an animal or plant. When the human genome project is completed,
probably around 2003, it will be possible to write the entire human genome on a couple of standard compact
discs, with enough space over for a large textbook of explanation. Send the boxed set of two CDs out into
deep space and the human race can go extinct, happy in the knowledge that there is now at least a sporting
chance for an alien civilisation to reconstitute a living human being. In one respect (though not in another),
my speculation is at least more plausible than the plot of Jurassic Park. And both speculations rest upon the
digital accuracy of DNA.
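The two-CD claim is easy to check on the back of an envelope (assuming roughly 3.2 billion base pairs and the minimal two bits per base; the 650-megabyte CD capacity is the standard of the period):

```python
BASES = 3.2e9              # approximate haploid human genome length
BITS_PER_BASE = 2          # four letters (A, C, G, T) -> 2 bits each
CD_BYTES = 650 * 1024**2   # one standard data CD

genome_bytes = BASES * BITS_PER_BASE / 8
print(genome_bytes / 1024**2)   # roughly 763 megabytes
print(genome_bytes / CD_BYTES)  # about 1.2 CDs: two discs leave ample room
```

The arithmetic bears out the estimate: the raw sequence fits on two discs with space left over for the textbook of explanation.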
Of course, digital theory has been most fully worked out not by neurobiologists or geneticists, but by
electronic engineers. The digital telephones, televisions, music reproducers and microwave beams of the late
twentieth century are incomparably faster and more accurate than their analogue forerunners, and this is
critically because they are digital. Digital computers are the crowning achievement of this electronic age, and
they are heavily implicated in telephone switching, satellite communications and data transmission of all
kinds, including that phenomenon of the present decade, the World Wide Web. The late Christopher Evans
summed up the speed of the twentieth century digital revolution with a striking analogy to the car industry.
"Today’s car differs from those of the immediate post-war years on a number of counts. . . But suppose
for a moment that the automobile industry had developed at the same rate as computers and over the same
period: how much cheaper and more efficient would the current models be? If you have not already heard the
analogy the answer is shattering. Today you would be able to buy a Rolls-Royce for £1.35, it would do three
million miles to the gallon, and it would deliver enough power to drive the Queen Elizabeth II. And if you were
interested in miniaturization, you could place half a dozen of them on a pinhead."
It is computers that make us notice that the twentieth century is the digital century – lead us to spot the digital
in genetics, neurobiology and – though here I lack the confidence of knowledge – physics.
For it could be argued that quantum theory – the part of physics most distinctive of the twentieth century – is
fundamentally digital. The Scottish chemist Graham Cairns-Smith tells how he was first exposed to this idea:
I suppose I was about eight when my father told me that nobody knew what electricity was. I went to
school the next day, I remember, and made this information generally available to my friends. It did not create
the kind of sensation I had been banking on, although it caught the attention of one whose father worked at
the local power station. His father actually made electricity so obviously he would know what it was. My friend
promised to ask and report back. Well, eventually he did and I cannot say I was much impressed with the
result. ‘Wee sandy stuff’ he said, rubbing his thumb and forefinger together to emphasise just how tiny the
grains were. He seemed unable to elaborate further.
The experimental predictions of quantum theory are upheld to the tenth place of decimals. Any theory with
such a spectacular grasp on reality commands our respect. But whether we conclude that the universe itself
is grainy – or that discontinuity is forced upon an underlying deep continuity only when we try to measure it –
I do not know; and physicists present will sense that the matter is too deep for me.
It should not be necessary to add that this gives me no satisfaction. But sadly there are literary and
journalistic circles in which ignorance or incomprehension of science is boasted with pride and even glee. I
have made the point often enough to sound plaintive. So let me quote, instead, one of the most justly
respected commentators on today’s culture, Melvyn Bragg:
There are still those who are affected enough to say they know nothing about the sciences as if this
somehow makes them superior. What it makes them is rather silly, and it puts them at the fag end of that
tired old British tradition of intellectual snobbery which considers all knowledge, especially science, as ‘trade’.
Sir Peter Medawar, that swashbuckling Nobel Prize-winner whom I’ve already quoted, said something similar:
It is said that in ancient China the mandarins allowed their fingernails – or anyhow one of them – to grow
so extremely long as manifestly to unfit them for any manual activity, thus making it perfectly clear to all that
they were creatures too refined and elevated ever to engage in such employments. It is a gesture that cannot
but appeal to the English, who surpass all other nations in snobbishness; our fastidious distaste for the
applied sciences and for trade has played a large part in bringing England to the position in the world which
she occupies today.
So, if I have difficulties with quantum theory, it is not for want of trying and certainly not a source of pride. As
an evolutionist, I endorse Steven Pinker’s view that Darwinian natural selection has designed our brains to
understand the slow dynamics of large objects on the African savannahs. Perhaps somebody should devise
a computer game, in which bats and balls behave according to a screened illusion of quantum dynamics.
Children brought up on such a game might find modern physics no more impenetrable than we find the
concept of stalking a wildebeest.
Personal uncertainty about the uncertainty principle reminds me of another hallmark that will be alleged for
twentieth century science. This is the century, it will be claimed, in which the deterministic confidence of the
previous one was shattered. Partly by quantum theory. Partly by chaos (in the trendy, not the ordinary
language, meaning). And partly by relativism (cultural relativism, not the sensible, Einsteinian meaning).
Quantum uncertainty, and chaos theory, have had deplorable effects upon popular culture, much to the
annoyance of genuine aficionados. Both are regularly exploited by obscurantists, ranging from professional
quacks to daffy New-Agers. In America, the self-help ‘healing’ industry coins millions, and it has not been
slow to cash in on quantum theory’s formidable talent to bewilder. This has been documented by the
American physicist Victor Stenger. One well-heeled healer wrote a string of best-selling books on what he
calls ‘Quantum Healing’. Another book in my possession has sections on quantum psychology, quantum
responsibility, quantum morality, quantum aesthetics, quantum immortality, and quantum theology.
Chaos theory, a more recent invention, is equally fertile ground for those with a bent for abusing sense. It is
unfortunately named, for ‘chaos’ implies randomness. Chaos in the technical sense is not random at all. It is
completely determined, but it depends hugely, in strangely hard-to-predict ways, on tiny differences in initial
conditions. Undoubtedly it is mathematically interesting. To the extent that it impinges on the real world, it rules out
ultimate prediction. If the weather is technically chaotic, weather forecasting in detail becomes impossible.
Major events like hurricanes might be determined by tiny causes in the past – such as the now proverbial flap
of a butterfly’s wing. This does not mean that you can flap the equivalent of a wing and hope to generate a
hurricane. As the physicist Robert Park says, this is "a total misunderstanding of what chaos is about . . .
while the flapping of a butterfly’s wings might conceivably trigger a hurricane, killing butterflies is unlikely to
reduce the incidence of hurricanes."
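The point about tiny causes can be made concrete with a toy deterministic system. The logistic map below is my own illustrative choice (the text names no particular equation): two runs begin a hair’s breadth apart, every step is completely determined, and yet the histories soon bear no resemblance to one another.

```python
# Sensitive dependence on initial conditions, sketched with the
# logistic map x -> r * x * (1 - x), which is chaotic at r = 4.
# (An illustrative example; the map itself is not from the text.)

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0, returning the whole history."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.3)
b = logistic_trajectory(0.3 + 1e-10)  # a "butterfly-flap" of a difference

# The rule is fully deterministic, yet the initial discrepancy of one
# part in ten billion is amplified until the two runs disagree utterly.
for step in (0, 10, 30, 50):
    print(step, abs(a[step] - b[step]))
```

Nothing random happens anywhere in this little program, which is the whole point: chaos in the technical sense is determinism that defeats prediction, not randomness.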
Quantum theory and chaos theory, each in its own peculiar way, may call into question the predictability of
the universe, in deep principle. This could be seen as a retreat from nineteenth century confidence. But
nobody really thought that such fine details would ever be predicted in practice, anyway. The most confident
determinist would always have admitted that, in practice, sheer complexity of interacting causes would defeat
accurate prediction of weather or turbulence. So chaos doesn’t make a lot of difference in practice.
Conversely, quantum events are statistically smothered, and massively so, in most realms that impinge on
us. So the possibility of prediction is, for practical purposes, restored.
In the late twentieth century, prediction of future events in practice has never been more confident or more
accurate. This is dramatic in the feats of space engineers. Previous centuries could predict the return of
Halley’s Comet. Twentieth century science can hurl a projectile along the right trajectory to intercept it,
precisely computing and exploiting the gravitational slings of the solar system. Quantum theory itself,
whatever the indeterminacy at its heart, is spectacular in the experimental accuracy of its
predictions. The late Richard Feynman assessed this accuracy as equivalent to knowing the distance
between New York and Los Angeles to the width of one human hair. Here is no licence for anything-goes,
intellectual flappers, with their quantum theology and quantum you-name-it.
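Feynman’s comparison squares with the ‘tenth place of decimals’ mentioned earlier. The arithmetic, using round figures of my own choosing rather than any stated in the text, is a one-liner:

```python
# Rough check of Feynman's analogy, with assumed round figures
# (both distances are my estimates, not taken from the text).
ny_to_la_m = 3.9e6     # New York to Los Angeles, roughly 3,900 km
hair_width_m = 1e-4    # a human hair, roughly 0.1 mm across

fractional_precision = hair_width_m / ny_to_la_m
print(f"one part in {1 / fractional_precision:.0e}")  # one part in ~4e+10
```

A precision of one part in some tens of billions, in other words, which is indeed agreement to around the tenth decimal place.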
Cultural relativism is the most pernicious of these myths of twentieth century retreat from Victorian certainty.
A modish fad sees science as only one of many cultural myths, no more true or valid than the myths of any
other culture. In the United States it is fed by justified guilt over the appalling treatment of Native Americans.
But the consequences can be laughable; as in the case of Kennewick Man.
Kennewick Man is a skeleton discovered in Washington State in 1996, carbon-dated to older than 9000
years. Anthropologists were intrigued by anatomical suggestions that he might be unrelated to typical Native
Americans, and might represent a separate early migration across what is now the Bering Strait, or even from
Iceland. They were about to do all-important DNA tests when the legal authorities seized the skeleton,
intending to hand it over to representatives of local Indian tribes, who proposed to bury it and forbid all further
study. Naturally there was widespread opposition from the scientific and archaeological community. Even if
Kennewick Man is an American Indian of some kind, it is highly unlikely that his affinities lie with whichever
particular tribe happens to live in the same area 9000 years later.
Native Americans have impressive legal muscle, and ‘The Ancient One’ might have been handed over to the
tribes, but for a bizarre twist. The Asatru Folk Assembly, a group of worshippers of the Norse Gods Thor and
Odin, filed an independent legal claim that Kennewick Man was actually a Viking. This Nordic sect, whose
case you may read in your copy of The Runestone, were actually allowed to hold a religious service over the
bones. This upset the Yakama Indian community, whose spokesman feared that the Viking ceremony could
be "keeping Kennewick Man’s spirit from finding his body." The dispute between Indians and Norsemen
might be settled by DNA comparison with Kennewick Man, and the Norsemen are quite keen to be put to this
test. More probably, DNA would decide the case in favour of neither side. Further scientific study would
certainly cast fascinating light on the question of when humans first arrived in America. But Indian leaders
resent the very idea of studying this question, because they believe their ancestors have been in America
since the creation. As Armand Minthorn, religious leader of the Umatilla tribe, puts it: "From our oral
histories, we know that our people have been part of this land since the beginning of time. We do not believe
our people migrated here from another continent, as the scientists do."
Perhaps the best policy for the archaeologists would be to declare themselves a religion, with DNA
fingerprints their sacramental totem. Facetious, but, such is the climate in the United States at the end of the
20th century, it is possibly the only recourse that would work. If you say, "Look, here is overwhelming
evidence from carbon dating, from mitochondrial DNA, and from archaeological analyses of pottery, that X is
the case" you will get nowhere. But if you say, "It is a fundamental and unquestioned belief of my culture that
X is the case" you will immediately hold a judge’s attention.
Also the attention of many in the academic community who, in the late twentieth century, have discovered a
new form of anti-scientific rhetoric, sometimes called the ‘postmodern critique’ of science. The most thorough
whistle-blowing on this kind of thing is Paul Gross and Norman Levitt’s splendid book, Higher Superstition:
The Academic Left and its Quarrels with Science. The American anthropologist Matt Cartmill sums up the credo:
"Anybody who claims to have objective knowledge about anything is trying to control and dominate the
rest of us. . . There are no objective facts. All supposed "facts" are contaminated with theories, and all
theories are infested with moral and political doctrines. . . Therefore, when some guy in a lab coat tells you
that such and such is an objective fact . . . he must have a political agenda up his starched white sleeve."
There are even a few, but very vocal, fifth columnists within science itself who hold exactly these views, and
use them to waste the time of the rest of us.
Cartmill’s thesis is that there is an unexpected and pernicious alliance between the know-nothing
fundamentalist religious right, and the sophisticated academic left. A bizarre manifestation of the alliance is
joint opposition to the theory of evolution. The opposition of the fundamentalists is obvious. That of the left is
a compound of hostility to science in general, of ‘respect’ for tribal creation myths, and various political
agendas. Both these strange bedfellows share a concern for ‘human dignity’ and take offence at treating
humans as ‘animals’. Moreover, in Cartmill’s words,
"Both camps believe that the big truths about the world are moral truths. They view the universe in terms
of good and evil, not truth and falsehood. The first question they ask about any supposed fact is whether it
serves the cause of righteousness."
And there is a feminist angle, which saddens me, for I am sympathetic to true feminism.
"Instead of exhorting young women to prepare for a variety of technical subjects by studying science,
logic, and mathematics, Women’s Studies students are now being taught that logic is a tool of domination. . .
the standard norms and methods of scientific inquiry are sexist because they are incompatible with "women’s
ways of knowing." The authors of the prize-winning book with this title report that the majority of the women
they interviewed fell into the category of ‘subjective knowers’, characterized by a ‘passionate rejection of
science and scientists.’ These ‘subjectivist’ women see the methods of logic, analysis and abstraction as
‘alien territory belonging to men’ and ‘value intuition as a safer and more fruitful approach to truth’."
That was a quotation from the historian and philosopher of science Noretta Koertge, who is understandably
worried about a subversion of feminism which could have a malign influence upon women’s education.
Indeed, there is an ugly, hectoring streak in this kind of thinking. Barbara Ehrenreich and Janet McIntosh
witnessed a woman psychologist speaking at an interdisciplinary conference. Various members of the
audience attacked her use of the
. . . oppressive, sexist, imperialist, and capitalist scientific method. The psychologist tried to defend
science by pointing to its great discoveries – for example, DNA. The retort came back: "You believe in DNA?"
Fortunately, there are still many intelligent young women prepared to enter a scientific career, and I should
like to pay tribute to their courage in the face of such bullying intimidation.
I have come so far with scarcely a mention of Charles Darwin. His life spanned most of the nineteenth
century, and he died with every right to be satisfied that he had cured humanity of its greatest and grandest
illusion. Darwin brought life itself within the pale of the explicable. No longer a baffling mystery demanding
supernatural explanation, life, with the complexity and elegance that defines it, grows and gradually emerges,
by easily understood rules, from simple beginnings. Darwin’s legacy to the twentieth century was to demystify
the greatest mystery of all.
Would Darwin be pleased with our stewardship of that legacy, and with what we are now in a position to pass
to the twenty first century? I think he would feel an odd mixture of exhilaration and exasperation. Exhilaration
at the detailed knowledge, the comprehensiveness of understanding, that science can now offer, and the
polish with which his own theory is being brought to fulfilment. Exasperation at the ignorant suspicion of
science, and the air-headed superstition, that still persist.
Exasperation is too weak a word. Darwin might justifiably be saddened, given our huge advantages over
himself and his contemporaries, at how little we seem to have done to deploy our superior knowledge in our
culture. Late twentieth century civilisation, Darwin would be dismayed to note, though imbued and
surrounded by the products and advantages of science, has yet to draw science into its sensibility. Is there
even a sense in which we have slipped backwards since Darwin’s co-discoverer, Alfred Russel Wallace, wrote
The Wonderful Century, a glowing scientific retrospective on his era?
Perhaps there was undue complacency in turn-of-the-century science, about how much had been achieved and
how little more advancement could be expected. William Thomson, First Lord Kelvin, President of the Royal
Society, pioneered the transatlantic cable – symbol of Victorian progress – and also the second law of
thermodynamics – C. P. Snow’s litmus of scientific literacy. Kelvin is credited with the following three confident
predictions: ‘Radio has no future.’ ‘Heavier than air flying machines are impossible.’ ‘X-rays will prove to be a hoax.’
Kelvin also gave Darwin a lot of grief by ‘proving,’ using all the prestige of the senior science of physics, that
the sun was too young to have allowed time for evolution. Kelvin, in effect, said, "Physics argues against
evolution, so your biology must be wrong." Darwin could have retorted: "Biology shows that evolution is a
fact, so your physics must be wrong." Instead, he bowed to the prevailing assumption that physics
automatically trumps biology, and fretted. Twentieth century physics, of course, showed Kelvin wrong by
powers of ten. But Darwin did not live to see his vindication, and he never had the confidence to tell the
senior physicist of his day where to get off.
In my attacks on millenarian superstition, I must beware of Kelvinian over-confidence. Undoubtedly there is
much that we still don’t know. Part of our legacy to the 21st century must be unanswered questions, and
some of them are big ones. The science of any age must prepare to be superseded. It would be arrogant and
rash to claim our present knowledge as all there is to know. Today’s commonplaces, such as mobile
telephones, would have seemed to previous ages pure magic. And that should be our warning. Arthur C.
Clarke, distinguished novelist and evangelist for the limitless power of science, has said, ‘Any sufficiently
advanced technology is indistinguishable from magic.’ This is Clarke’s Third Law.
Maybe, some day in the future, physicists will fully understand gravity, and build an anti-gravity machine.
Levitating people may one day become as commonplace to our descendants as jet planes are to us. So, if
someone claims to have witnessed a magic carpet zooming over the minarets, should we believe him, on the
grounds that those of our ancestors who doubted the possibility of radio turned out to be wrong? No, of
course not. But why not?
Clarke’s Third Law doesn’t work in reverse. Given that ‘Any sufficiently advanced technology is
indistinguishable from magic’ it does not follow that ‘Any magical claim that anybody may make at any time is
indistinguishable from a technological advance that will come some time in the future.’
Yes, there have been occasions when authoritative sceptics have come away with egg on their pontificating faces.
But a far greater number of magical claims have been made and never vindicated. A few things that would
surprise us today will come true in the future. But lots and lots of things will not come true in the future.
History suggests that the very surprising things that do come true are in a minority. The trick is to sort them
out from the rubbish – from claims that will forever remain in the realm of fiction and magic.
It is right that, at the end of our century, we should show the humility that Kelvin, at the end of his, did not. But
it is also right to acknowledge all that we have learned during the past hundred years. The digital century was
the best I could come up with, as a single theme. But it covers only a fraction of what 20th century science
will bequeath. We now know, as Darwin and Kelvin did not, how old the world is. About 4.6 billion years. We
understand – what Alfred Wegener was ridiculed for suggesting – that the shape of geography has not
always been the same. South America not only looks as if it might jigsaw neatly under the bulge of Africa. It
once did exactly that, until they split apart some 125 million years ago. Madagascar once touched Africa on
one side and India on the other. That was before India set off across the widening ocean and crashed into
Asia to raise the Himalayas. The map of the world’s continents has a time dimension, and we who are
privileged to live in the Plate Tectonic Age know exactly how it has changed, when, and why.
We know roughly how old the universe is, and, indeed, that it has an age, which is the same as the age of
time itself, and less than twenty billion years. Having begun as a singularity with huge mass and temperature
and very small volume, the universe has been expanding ever since. The 21st century will probably settle the
question whether the expansion is to go on for ever, or go into reverse. The matter in the cosmos is not
homogeneous, but is gathered into some hundred billion galaxies, each averaging a hundred billion stars. We
can read the composition of any star in some detail, by spreading its light in a glorified rainbow. Among the
stars, our sun is generally unremarkable. It is unremarkable, too, in having planets in orbit, as we know from
detecting tiny rhythmic shifts in the spectra of stars. There is no direct evidence that any other planets
house life. If they do, such inhabited islands may be so scattered as to make it unlikely that one will ever
encounter another.
We know in some detail the principles governing the evolution of our own island of life. It is a fair bet that the
most fundamental principle – Darwinian natural selection – underlies, in some form, other islands of life, if any
there be. We know that our kind of life is built of cells, where a cell is either a bacterium or a colony of
bacteria. The detailed mechanics of our kind of life depend upon the near-infinite variety of shapes assumed
by a special class of molecules called proteins. We know that those all-important three-dimensional shapes
are exactly specified by a one-dimensional code, the genetic code, carried by DNA molecules which are
replicated through geological time. We understand why there are so many different species, although we
don’t know how many. We cannot predict in detail how evolution will go in the future, but we can predict the