Riddled with jealousy, rivalry, missed opportunities and moments of genius, the history of the atom's discovery is as bizarre, as capricious, and as weird as the atom itself. John Dalton gave us the first picture of the atom in the early 1800s. Almost 100 years later the young misfit New Zealander, Ernest Rutherford, showed the atom consisted mostly of space, and in doing so overturned centuries of classical science. It was a brilliant Dane, Niels Bohr, who made the next great leap - into the incredible world of quantum theory. Yet, he and a handful of other revolutionary young scientists weren't prepared for the shocks Nature had up her sleeve. This 'insightful, compelling' book (New Scientist) reveals the mind-bending discoveries that were destined to upset everything we thought we knew about reality and unleash a dangerous new force upon the world. Even today, as we peer deeper and deeper into the atom, it throws back as many questions at us as answers.
Page count: 352
Publication year: 2017
Cover
Title Page
Copyright
List of illustrations
Foreword by Jim Al-Khalili
Acknowledgements
About the author
Preface
Part One Energy in Pieces
From botany to the atom
Smaller than the smallest thing
Scientific romance
The dollar lode
The reluctant revolutionary
The patent clerk
Part Two The Empty Atom
Fifteen-inch shells
The ‘Great Dane’
Part Three Not Even Wrong
More trouble with waves
Werner Heisenberg
Waves of emotion
The Solvay wars
The ‘Old One’ defeated
Spooky action
The dead-and-alive cat
Why do you dance?
Matter in the mirror
Part Four Playing With Marbles
Big science
The hidden weight
Lise Meitner
In Hahn’s way
Part Five Blast Radius
Now teach me something
The pile
The ‘nim-nim-nim’ boys
Uncertainty about Heisenberg
The belly of a Mosquito
The heroes of Telemark
Unholy Trinity
Teller’s testimony
Pandora’s box
Part Six Renormalising the Infinities
Surely you’re joking, Mr Feynman!
‘They don’t know’
Different infinities
A clash of styles
Tempted by Mephistopheles
No contest
The atomic spaceship
Part Seven ‘Three Quarks for Muster Mark!’
The nano-world
Part Eight Ylem
Black Sea boat to freedom
The Yorkshireman
One last stand
Proof of the Bang?
Part Nine New Frontiers
Multiple realities
The evolving cosmos
The matrix
First and last things
Further Reading
Index
Published by Icon Books Ltd, Omnibus Business Centre, 39–41 North Road, London N7 9DP
Email: [email protected]
ISBN: 978-1-78578-216-9
The author has asserted his moral rights
Text copyright © 2007 Piers Bizony
No part of this book may be reproduced in any form, or by any means, without prior permission in writing from the publisher.
Figures in the text
A wave interference pattern
A wave-like interference pattern from a double-slit experiment
A sketch of Enrico Fermi’s uranium pile under the Stagg Field stadium
A Feynman diagram showing an electron and a positron colliding to create a photon
Pictures in the plate section
Marie Curie
Pierre and Marie Curie
Max Planck
Albert Einstein
Hans Geiger and Ernest Rutherford
J.J. Thomson and Ernest Rutherford
Niels Bohr, the ‘Great Dane’
Wolfgang Pauli
Werner Heisenberg
Erwin Schrödinger
The Fifth Solvay Conference of 1927
Paul Dirac
John Cockcroft and George Gamow
Ernest Walton, Ernest Rutherford and John Cockcroft
James Chadwick
Otto Hahn and Lise Meitner
Otto Frisch
Enrico Fermi
Robert Oppenheimer
Edward Teller and Enrico Fermi
Murray Gell-Mann and Richard Feynman
Freeman Dyson
Fred Hoyle
Now that we have comfortably settled in and acclimatised ourselves to the new millennium, it is tempting to look back at the science of the last century with haughty arrogance at the naivety and innocence of the early pioneers of the atom. Surely we are dealing with loftier, more exciting and more profound issues in science today: from genetic engineering to the challenge of tackling climate change; from the internet to nanotechnology; from artificial intelligence to dark matter, string theory and parallel universes? Sexy science is always that which pushes at the frontiers of our understanding – the questions we have yet to answer and the phenomena we have yet to explain. Why would we wish to dwell on unfashionable stories relegated to the ignominy of a thousand textbooks when we have a brave new world to wonder at and make sense of? What is there left unsaid about those brave pioneers of what is still nostalgically referred to as ‘modern physics’, beyond the respectful acknowledgements of centenaries of their births and deaths?
I am not a historian of science, nor am I immune to the lure of current ideas and unresolved puzzles in fundamental physics. After all, I spend an unhealthy fraction of my waking hours pondering such things. That is my job. But I put it to you that it is hard to imagine an age of more exciting revelations and revolutions in humanity’s quest to understand the ultimate building blocks of our universe than the one that faced the scientists of the first half of the 20th century. Never before or since has mankind successfully unlocked so many of nature’s inner secrets. The story of how a band of dedicated men and women slowly but surely unravelled the inner workings of the atom is not only a great drama, but a lesson in scientific collaboration, dedication and resourcefulness that is worth telling to each new generation.
In little over half a century, our understanding of the world developed from debate over whether atoms even existed at all to the realisation that they are indeed the building blocks of the universe, and on to probing their inner workings and structure. It is quite remarkable that we can today describe the rich variety of everything we see around us – from tables and chairs to the ground we stand on and the air we breathe, to our own bodies and the minds we use to think about such things – in terms of just three elementary particles much tinier than we can ever hope to see directly: the electron and two types of quarks; just three basic building blocks that make up everything there is.
It is interesting to note that such a reductionist approach, in which everything in the universe, however complex, can be ultimately described in terms of its constituent parts, its basic building blocks, is today acknowledged by many scientists to be inadequate. A nice example of this is the most basic property of water: its ‘wetness’. This is something that emerges only when we are dealing with trillions of water molecules combined together. We would never expect to predict water’s wetness by studying a single water molecule and probing its atomic or subatomic structure. Clearly our knowledge that all matter is composed of atoms is not enough to explain the rich variety of phenomena we see in the world around us. Despite this, mankind’s discovery of atoms ranks as being the most important in the history of science (followed not so far behind, I would argue, by Darwinian natural selection). The story of the atom is a fascinating one, and I don’t mean just to nerdy historians of science. Many of the personalities involved in this epic quest are household names. Who hasn’t heard of Curie and Einstein? Others are well known to scientists but may be less so to the general public. Of course, you may well know of Rutherford, Bohr, Schrödinger, Heisenberg and Feynman. But the story of Wolfgang Pauli, the man who laid the foundations of modern chemistry, or that of Fred Hoyle and George Gamow who between them explained how all the atoms were created in the first place, is not so widely known and deserves to be told.
Today, it is hard to imagine just what little knowledge Marie and Pierre Curie started with when they embarked on their quest to explain radioactivity, or what a great leap of intuition Rutherford needed to make to figure out what the inside of an atom looks like. What is even more remarkable is that in cracking the code of the atom, physicists needed to invent a new way of thinking, a mathematical construct far stranger than anything they could have dreamt up – ideas so powerful that they form to this day the most important scientific theory ever produced: quantum mechanics. In their quest for simplicity, these geniuses found that Nature was more remarkable than they could ever have imagined. Old notions of a logical mechanistic universe that could be explained by a few simple rules were overthrown so completely that one prominent physicist, questioning the correctness of a new theory presented to him, wondered not whether the ideas it was based on were crazy, but rather whether they were crazy enough!
Being able to describe the nature of the atom was far from the whole story, of course. Its energy was harnessed and its origins were explained. For instance, there is no idea more romantic or powerful than that we are all made of stardust. Every atom on earth, including our own bodies, was forged in the crucibles of space, being either formed at the birth of the universe, just after the Big Bang, or cooked inside a star billions of years ago.
What must it have been like to be part of that great adventure, designing simple but ingenious experiments on lab benches across Europe, building fantastic machines in the Cavendish in Cambridge that would smash open the atomic nucleus, or filling blackboards in Copenhagen with symbols and algebra, inventing new mathematics as they went along? This book tells that story.
Jim Al-Khalili
January 2007
Professor Jim Al-Khalili OBE is a physicist, author and broadcaster based at the University of Surrey. He received his PhD in theoretical nuclear physics in 1989 and has published over a hundred research papers on quantum physics. His many popular science books have been translated into 26 languages. He is a recipient of the Royal Society Michael Faraday medal and the Institute of Physics Kelvin medal. In 2016 he received the inaugural Stephen Hawking medal for science communication.
The author would like to express his gratitude to Paul Sen and his colleagues, whose fine BBC series Atom has been the inspiration for this book. The series presenter, Dr Jim Al-Khalili, was tirelessly patient answering my questions and sympathetically correcting my scientific and historical mistakes. Any errors that remain in this text are my fault alone. Thanks also to Icon Books for making this project possible. They are a very pleasant team to work with. Likewise, I thank Peter Tallack at Conville and Walsh, who first introduced me to Icon. Above all, I would like to thank my family, Fiona, Alma and Oskar, for giving me such a happy atmosphere in which to work. From now on, I promise not to talk about atoms at meal-times.
Although every effort has been made to contact copyright holders for pictures reproduced here, there are instances where we have been unable to do so. If notified, the publisher will be pleased to acknowledge the use of copyright material in future editions.
Piers Bizony has written about science and the history of technology for a wide variety of publishers in the UK and the US, and also has worked in the film and TV industries. His previous book, The Man Who Ran the Moon (Icon 2006), picked up rave reviews for its politically astute account of NASA’s lunar-era chief, Jim Webb. Bizony’s most recent project, in collaboration with the family of film director Stanley Kubrick, is a major retrospective of 2001: A Space Odyssey.
All our cities have been reduced to dust. All our books have been vaporised in a global firestorm, and every last byte of computerised knowledge has been wiped out by the electromagnetic disturbance of some terrible disaster. Of the six billion humans that existed a few days ago, only a few thousand are still alive. Is this the absolute end of civilisation, or can some of the great intellectual achievements from the last two thousand years be rescued from the dust and ashes?
Suppose that the vanished civilisations of Earth had seen this day coming, and had created dozens of blast-proof tablets of toughened steel in the hope that at least one of them might be discovered by survivors of a future calamity. And suppose that these tablets had space for only one quick sentence of information, perhaps repeated in several different languages, as if on a futuristic Rosetta Stone. What might that sentence be? According to Richard Feynman, one of the 20th century’s greatest physicists, humanity could recover pretty much everything it needed to know about the material world from the simple statement:
All things are made of atoms, little particles that move around in perpetual motion, attracting each other when they are a little distance apart, but repelling upon being squeezed into one another.
Our quest for the atom has been one of the greatest scientific adventures in history. Not least of the problems is that an atom is a mind-numbingly small entity. No technical comparison can really help us grasp how small an atom is. It’s about a tenth of a millionth of a millimetre across. Can you picture that in your mind’s eye? Analogies drawn from our everyday experience also leave us reeling. Let’s try one. If you dip a cup into the sea and extract a cupful of water, there are as many atoms in that cup as there are cups of water in all the oceans of the world. No? Let’s try one more, this time based on an ever-popular yardstick of scientific communication, a strand of human hair. The width of the finest hair is longer than the span of one million carbon atoms stretched out in a row. Still no good? Don’t worry. The atom’s tiny size is only one of its multitude of ungraspable qualities. Another challenge to our sensory expectations, based as they are on the solidity of tables and chairs, the lumpen density of lead, the satisfying heft of gold coins in our hand, is this. The atom is mainly just empty space. If all the atoms in your body could be squeezed down to get rid of the empty volume inside them, you would weigh the same, but would be as small as a grain of salt. Almost all of you – almost all of everything in the entire cosmos – is nothing. The atom is not a hard-edged ‘thing’. It’s a ghostly shimmer.
Atoms in their turn are made from subatomic components thousands of times smaller again. The instruments designed for atomic science are made out of atoms, so we are essentially trying to measure entities smaller than the smallest tips of the finest instruments that we can ever create. Atomic science is like investigating a slippery coin while wearing a blindfold and a pair of thick gloves, and trying to find out which side of the coin is ‘heads’ and which ‘tails’. Ordinary fumbling is not sufficient. We have to be unusually clever to find the clues: not with, but despite the gloves and the blindfold, for we can never take them off. The gloves and the blindfold represent the limits of our animal senses, which allow us only so much access to the world of the ultimately small.
Yet the basic secrets of the atom were explored nearly a hundred years ago, at a time when scientific instruments consisted of little more than glass tubes, hand-made wooden and brass boxes, and simple electrical circuits wired up to batteries. There were no electron microscopes, no positron emission scanners, no supercomputers and plasma screen read-outs. There was just the human mind, the human imagination. Before the atom could be investigated with instruments, it needed to be imagined. After all, how could we find any atoms before we knew what we were looking for?
Today the atom is regularly subjected to multi-billion-dollar experiments in super-laboratories employing thousands of people. The tips of our instruments have become so fine that we can move individual atoms from place to place. We can even write words by nudging atoms across superpolished sheets of metal. Yet when we read this writing with our fantastic machines, we still never really see the atoms themselves. What we see are shadowy images of them rendered as computer graphics, which in turn are based on predictions of science that are far from absolutely certain. At heart the atom remains a thing of the mind. We still do not know the reality of it, and this book explains why its mysteries refuse to be resolved.
The atom may not be knowable, but it can be harnessed. With terrifying swiftness we have learned how to liberate its dangerous powers, and now that these inner demons have been released into the world, we cannot expect any time soon to coax them back into their cages. We have also discovered that the world of the unimaginably small is the direct cousin of the unimaginably vast. In a single atom lie all the forces and energies that brought the cosmos into being.
Is this not a fantastic drama? Science is so often seen as coldly objective, and perhaps that’s why so many people think of scientists as dull, colourless technicians. But science is a profoundly creative act, and the larger-than-life personalities involved in the quest for the atom were more creative than most. They came up with revolutionary theories that were shaped by their characters as much as by the data from their experiments. Some saw the atom as beyond our reach, an abstraction far removed from the treacherous human realm of tactile experience. Others with a more sensual outlook on life were sure that there must be something real in the atom’s shadow. The great figures of atomic discovery in the 20th century could not necessarily tell us what was true about their work. They told us what their favoured versions of that truth happened to be. Their joint legacy is still the cause of vigorous argument today, for it calls into question our most cherished concepts of reality.
In the year 1900 a deeply conservative physicist called Max Planck concluded, somewhat reluctantly, that energy is not smooth and continuous. It is divided into discrete amounts, mysterious packets, which he called ‘quanta’. It was a discovery that would revolutionise all of science.
The word ‘atom’ is derived from a Greek idea formulated 2,400 years ago. The philosopher Democritus argued that matter is made from indivisible, imperishable and unchanging particles, which he called atomos. It was a brilliant insight, based on logical argument, but Democritus did not choose to test any of his concepts experimentally. Like most of his contemporaries, he thought that logic alone should be able to resolve the mysteries of nature. During the next 22 centuries the atom made almost no impact on the human imagination, until a precocious young teacher at the dawn of the Industrial Revolution discerned the hard-edged practical value of talking about atoms. John Dalton was born in 1766 into a modest Quaker family in Cumberland, and earned his living for most of his life as a teacher, first at his local village school (where he began giving classes at the age of twelve) and then in the factory-dominated city of Manchester. Here he reanimated the atomic theory in a strict mathematical framework rather than just as a vague philosophical idea. He concluded that all atoms of a given element must be identical to each other, and argued that chemical compounds are formed by a combination of two or more different kinds of atoms. Carefully weighing his chemicals before and after they reacted with each other, he worked out the ratios of different elements that went into certain well-known compounds. The atom emerged from his work as a spectacularly reliable chemical counting unit. As a statistical way of looking at gas and steam pressures, the atom was also invaluable. Yet it remained, for now, just that: a workable counting tool with no proper physical explanation behind it.
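Dalton's bookkeeping can be sketched in a few lines of code. This is an illustration only, using modern atomic masses: Dalton's own 1803 figures were cruder, and he famously assumed water was 'HO' rather than H2O.

```python
# Modern atomic masses in g/mol -- an assumption for illustration;
# Dalton worked from his own, much rougher, measured weights.
ATOMIC_MASS = {"H": 1.008, "O": 15.999, "C": 12.011}

def mass_fractions(compound):
    """Return each element's share of a compound's total mass:
    the kind of ratio Dalton recovered by weighing his chemicals
    before and after they reacted."""
    total = sum(ATOMIC_MASS[el] * n for el, n in compound.items())
    return {el: ATOMIC_MASS[el] * n / total for el, n in compound.items()}

water = mass_fractions({"H": 2, "O": 1})
# Oxygen outweighs hydrogen in water by roughly 8 to 1,
# a fixed ratio however the sample was prepared.
print(round(water["O"] / water["H"], 1))
```

The fixed, reproducible ratio is the point: it is what made the atom such a reliable chemical counting unit even before anyone could explain what an atom physically was.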
By the end of the 19th century, a self-confident set of rules had been assembled to describe just about everything that could be looked at, listened to, weighed or measured: the precise movements of the stars and planets across the sky, the temperatures and pressures of gases under given conditions, the rate of transfer of heat from one substance into another, the equations for shaping glass lenses so that they would bring rays of light to a focus, and so on. It was a mechanistic and results-driven way of looking at the world, and it ushered in the age of the electric lightbulb, the radio telegraph, the telephone, the motion picture, the motor car and the aeroplane. Science also seemed capable of unveiling secrets of nature at the profoundest levels. In the 1820s, a number of astronomers insisted that the gravity of an unknown planet must be responsible for observed irregularities in the orbit of an already familiar planet, Uranus. For three decades the pure and logical rule of Newtonian mathematics was their only guide. And then in 1846 they steered their telescopes to the point in the sky where the mathematics said that the planet should be. And there it was. Neptune existed because the classical laws of nature said it had to exist.
Scientists were satisfied of two things. First, almost everything that could be understood was understood; and second, the remaining mysteries were the province of religion and metaphysics, not science. Many properties of the commonly available chemical elements – hydrogen, oxygen, carbon, nitrogen, copper, iron, and so on – were predictable. Thanks to Dalton and his successors, chemists knew the precise ratios of elements in thousands of industrially useful compounds. There was an elaborate counting system based on the atom, which was widely held to be the most fundamental unit of all matter. The atom was a useful idea, but it was too small to be seen in any microscope, so its existence could not be verified.
We knew everything. And yet we knew nothing. Scientists were thoroughly accustomed to putting different kinds of knowledge into separate compartments. Botany was popular in the late Victorian age as a respectable hobby for the leisured classes, as were mathematics and the study of optics. The smellier and more hands-on business of chemistry and mechanical engineering was best left to the newly powerful industrialists. Few people would have imagined that all these disciplines might share common ground. No one suspected that the shapes of molecules, the specific arrangements of atoms in chemical compounds, might give strength to a tree, while another arrangement lent flexibility to its rubbery sap, and yet another controlled the shape of its leaves. There were many individual natural philosophers and amateur scientists intrigued by such questions, but the burgeoning numbers of university specialists, military research arsenals and commercial laboratories had no common framework with which to tackle deep, abstract problems in science. There were no grants available to study questions that did not already appear on the list of approved and potentially profitable questions: How could catalytic converters be made more efficient? How could the casings of steam engines be machined in stronger but more lightweight configurations? What mix of explosives could most effectively hurl a 15-pound shell the greatest distance?
The ‘what happens when …’ questions of science were incredibly well understood by 1900. The ‘why’ questions were barely addressed. No one had the tools to understand the strange invisible energies emanating from Henri Becquerel’s experiments with uranium salts. It was a puzzle, also, that his compounds emitted their energies week after week, month after month, without apparently depleting like any normal energy source. Similarly, and on a grander scale, it was a mystery how the sun could keep shining and not burn itself out. There was no unifying concept that could link all these disparate wonders together and explain them. We didn’t even know why red things look red.
A seemingly unremarkable young clerk in the Patent Office at Berne, Switzerland, reflected that the science of his day was like a vast library of unrelated books in myriad subjects. Yet he had faith in ‘a definite plan in the arrangement of the books, a mysterious order, which we do not comprehend, but dimly suspect’. The clerk believed that if only we could study how that order worked, instead of concentrating just on the individual books, then the entire library might one day resolve itself into a single, compact and breathtakingly tidy volume. The clerk was not an experimenter and had no laboratory. He drew his conclusions purely from the logic of scientific papers already available to his generation. In 1905, aged just 25, he published three papers which, in principle, should have revolutionised all scientific thinking. A few scientists adopted his ideas enthusiastically, and the clerk quickly gained acceptance in academia, but the world as a whole remained unmoved, and he graduated from youthful revolutionary to middle-aged professor without attracting much attention outside his close-knit scientific coterie. Albert Einstein was 40 years old by the time he even began to become famous.
It often happens in science that an observation in one field of research eventually throws a startling new light on another area. An observation in botany unexpectedly laid the foundations for one of the most important discoveries in all of modern science: the discovery of the atom. And it was Einstein who spotted the clues. His 1905 theory on Special Relativity, for which he is now best known, was only one of several intellectual breakthroughs he published in that extraordinary year. Another paper was all about little grains of pollen.
In 1827 the Scottish botanist Robert Brown was examining pollen grains under a simple microscope. He put some grains in a droplet of water, and noticed that they moved about, tracing random zigzag paths across his microscope’s field of view. At first, he concluded that the movement of each grain ‘arose neither from currents in the fluid, nor from its gradual evaporation, but belonged to the particle itself’. Other botanists enthusiastically decided that Brown had witnessed a fundamental ‘life force’ animating these tiny pieces of biological matter (typically the pollen grains measured no more than 1/100th of a millimetre across). This was a perfect example of something that tends to happen in science. There can be an unseemly rush to try to confirm theories which have already gained currency among researchers. Observational facts are sometimes interpreted to fit a favourite theory, and this is one of the biggest mistakes that any scientist can make. It’s much better to adjust the theory to fit the facts, even – or perhaps especially – when it involves abandoning cherished ideas about how nature works.
Wisely, as it turns out, Brown was more cautious. Even as he prepared to publish his results, he revised his text to warn that he had seen a similar motion among pollen grains he had preserved in alcohol many months before, so that surely they must have been lifeless by the time he put them under the microscope. Of course there was a slim chance that pollen was harder to kill than he had assumed, so one more experiment was needed to remove any ambiguity. He ground down some inorganic mineral samples into powders and suspended them in water. Again, he saw random movements through his microscope. If there was some kind of a force at work here, it almost certainly wasn’t coming from the grains. And yet they moved. The only logical conclusion was that something in the water was pushing them around.
Throughout the Victorian era, the jiggling of the grains remained an intriguing enigma whose significance was not truly understood. Then, at the dawn of the 20th century, a Swiss-Italian electrical engineer named Michelangelo Besso introduced his very good friend Albert Einstein to what he called ‘Brownian motion’. In 1905 Einstein was inspired to write a paper on this theme, in which he described how the motion could be understood as the buffeting of billions of water molecules against the grains. Previous theorists had been confused by the idea that something so infinitesimally small and lightweight as a molecule could push against the comparatively massive grains. Einstein certainly wasn’t suggesting that the individual zigs and zags of the grains were caused by each one being hit by individual molecules. They were the cumulative outcomes of many millions of random impacts. He even calculated the probability of the motions in a way that could be tested by subsequent experimenters. Ignoring all the zigs and zags, and focusing only on the straight-line distances covered between the start of a grain’s journey and where it ended up after a given amount of time, Einstein accurately predicted how far a grain would travel. In other words he ‘smoothed out’ the irrelevant details of every last zig and zag, and treated the whole problem statistically.
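Einstein's 'smoothing out' of the zigzags can be demonstrated with a toy simulation. This sketch is not his 1905 calculation (which related the displacement to temperature, viscosity and grain size); it simply illustrates the statistical signature he predicted: the straight-line distance covered grows with the square root of the elapsed time, so a grain watched for four times as long wanders, on average, only twice as far.

```python
import math
import random

def rms_displacement(n_walkers, n_steps, seed=1):
    """Simulate many one-dimensional random walks of unit steps
    (each step a molecular kick to the left or right) and return
    the root-mean-square distance from the starting point."""
    rng = random.Random(seed)
    total_sq = 0.0
    for _ in range(n_walkers):
        x = 0
        for _ in range(n_steps):
            x += rng.choice((-1, 1))
        total_sq += x * x
    return math.sqrt(total_sq / n_walkers)

# Quadrupling the observation time should roughly double the
# typical distance covered: the square-root law of diffusion.
r1 = rms_displacement(5000, 100)
r4 = rms_displacement(5000, 400)
print(round(r4 / r1, 2))  # a value close to 2.0
```

Each individual walk is unpredictable, exactly as each grain's path was; only the statistics of thousands of walks are lawful, which is precisely the distinction Einstein exploited.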
Yet he drew back from claiming that the apparent action of the molecules on the grains specifically proved the existence of molecules. Instead he suggested that the statistical effects he described would produce the visible large-scale motions observed in Brownian motion if it turned out to be the case that molecules existed. As for the molecules themselves, he warned that ‘the data available to me are so imprecise that I could not form a judgement on the question.’ It was an important and conscientious distinction. For now, these invisible entities remained a useful theoretical model for predicting the thermal and kinetic forces in liquids and gases, and for predicting the outcomes of chemical reactions. But still, no one had yet ‘seen’ a molecule, let alone an atom.
Strangely enough, someone had by now demonstrated the existence of something even smaller than an atom. In 1897 the English physicist Joseph J. Thomson was experimenting with a sealed glass tube from which most of the air had been drawn out. Inside were two metal plates, mounted at opposite ends, and wired up to a battery. The plate attached to the positive terminal was called the anode, and the negative plate was known as a cathode. When the current was switched on, a mysterious glow could be observed at the anode end of the tube. Certain types of glass exhibited a very faint glow unaided, but when a coat of phosphor was applied to the inside of the glass tube at the positive end, the glow was unmistakable. Some kind of invisible beam was crossing the gulf between the cathode and the anode, striking the end walls of the tube and producing the glow.
These ‘cathode ray’ tubes, tremendously popular among experimenters in the late 1800s, were the ancestors of television. Thomson’s special contribution was to prove that electric currents or magnets just outside the vacuum tube could deflect and even steer the rays, altering the positions of the luminous spots on the phosphor screens. A wide variety of different anode and cathode metals produced similar results, so he concluded that the rays were a fundamental constituent of nature, and not just some oddity connected with a particular material. He showed that the rays were narrow beams of negatively charged particles, which he called ‘corpuscles’. By measuring the influences of external magnets and electrical currents on these particles, he proved they were 2,000 times less massive than an atom of even the lightest element, hydrogen. We know these particles, today, as electrons.
It was an odd state of affairs, that an entity even smaller than the atom had been identified while the atom itself remained just a theoretical notion. In April 1897 Thomson admitted to a meeting of the Royal Society in London that ‘the assumption of a state of matter more finely divided than the atom is a somewhat startling one’. In 1904 he took the bold step of asserting that ‘the atom consists of a number of corpuscles [electrons] moving about in a sphere of uniform positive electrification’. This came to be known as the ‘plum pudding’ model. For now, that’s all it was: a mind’s eye visualisation, a vague speculation. Yet Thomson’s confidence was bolstered by news from Paris, where a brave and romantic couple were at last finding substance in the atom’s elusive shadow.
A hundred years ago, even at the tail end of the Victorian era, a male scientist could get away with having an adulterous affair as long as his work was good enough. A woman had to tread more carefully in her private life, even if her work was in the Nobel Prize-winning class. One who always refused to toe the line was Madame Curie.
Maria Skłodowska was born in 1867 in Warsaw, the fifth and youngest child of Bronisława Boguska, a pianist, singer and teacher, who died of tuberculosis while Maria was still a child, and Władysław Skłodowski, a professor of mathematics and physics. At sixteen Maria won a gold medal for outstanding achievement at her secondary school. She dreamed now of travelling to Paris and entering the Sorbonne, one of the few major European universities where a young woman might be allowed to study. Unfortunately her father lost all the family savings in a failed investment scheme. Maria was forced to find work as a teacher, and then became a governess, essentially a nanny-cum-tutor role familiar to well-brought-up young women in the Victorian age whose families had fallen on hard times. Meanwhile, she found time to read Polish books to poor women who would not otherwise have had access to education, except at the hands of Poland’s Russian overlords. Maria was a member of a clandestine and politically radical ‘flying university’ which convened wherever it could, and just as quickly dispersed whenever threatened. The Russian authorities did not approve of Maria’s nationalist ideals.
Her next risky adventure was to fall in love with one of the sons of the family she was living with as a governess. The affair was passionate, and the love completely mutual, but the young man’s parents refused to let him marry a penniless governess, and Maria was forced to leave the household. Now her only hope was her sister Bronia, who had struck a deal with her some years earlier. Funded by Maria’s earnings as a governess, Bronia had made it to Paris, where she was studying for a medical degree. On her return to Poland, the sisters were supposed to swap roles. These two strong-minded women kept to their plan, and in November 1891 Maria at last set out on the thousand-mile train journey to Paris and the Sorbonne. She attended physics and mathematics lectures by day, then at night returned to her very humble student’s lodgings in the city’s Latin Quarter. She ate little more than bread and butter, and seldom drank anything more costly than tea. Her studies went well, and after three years of dedicated student life, she passed examinations in physics and mathematics with outstanding grades. Marie’s goal (she changed her name while in Paris) was to obtain a teacher’s diploma and then return to Poland. Her instincts told her that her widowed father must surely expect her to come home and play her part in supporting the family.
And then she chanced to meet her soulmate. ‘I was struck by the open expression on his face’, she recalled. ‘His simplicity, his smile, at once grave and youthful, inspired confidence.’ Thirty-five-year-old Pierre Curie was the head of a laboratory at the School of Industrial Physics and Chemistry. He had already made something of a name for himself by discovering, with help from his brother Jacques, that when an electric current was applied to a quartz crystal, it changed shape by a tiny amount. Conversely, when the crystal was squeezed or pulled, it delivered a jolt of electricity in response. This might sound obscure, yet it was the key to a new range of super-sensitive scientific instruments. Pierre was a skilled designer of measuring equipment, and he was well respected in Paris at that time, although he was not so good at insinuating his way into the professional élite of French science. He despised political games-playing and had little appetite for medals and awards, or any of the other back-slapping perks of his trade.
Marie’s good fortune was that this decent and unegotistical man proved very happy to collaborate with her in work as well as in love. She knew she was supposed to return to Poland after completing her studies at the Sorbonne, and she visited her family to give them the news about Pierre, uncertain how they would react. Much to her relief, her father made it quite clear that she should return to Paris straight away and marry him. In July 1895, after an idyllic wedding, one of the most romantic couples in the history of science got down to work, studying the strange invisible rays recently discovered by another French scientist, Henri Becquerel. He had worked with a very unusual element. It was called uranium.
At the beginning of the 16th century, the rough mountain territory dividing Bohemia from Saxony, the borderland between modern Germany and the Czech Republic, was covered by an impenetrable virgin forest, a refuge for wolves, bears and bandits. The discovery of precious metals triggered the first ‘silver rush’ in history. The previously insignificant little town of Joachimsthal soon became the largest mining centre in Europe. In just a couple of years an eager influx of chancers swelled the population to 20,000. The silver was minted into a coin called a Joachimsthaler, later known more simply as a thaler. Rather like a certain currency in today’s world, the thaler was accepted worldwide. The silver coins are no longer in circulation, but the name has stayed with us, slightly transmuted. It’s now pronounced ‘dollar’.
After just three decades, Joachimsthal’s silver reserves were exhausted. Plagues killed off much of the population, and the Thirty Years War finished the job. Joachimsthal became a ghost town with an unhealthy reputation. Miners had always fallen ill there, even before the plague struck. But just because the silver was gone didn’t mean that the mines had nothing left to offer. Along with the silver, the miners had often come across a shiny black mineral, which didn’t immediately impress them as being of much use. They called it Pechblende, from the German words Pech, meaning both pitch and bad luck, and Blende, meaning mineral. In 1789 an amateur German chemist, Martin Klaproth, decided to see what it was made of. He found that it contained ‘a strange kind of half-metal’, which he named in honour of the planet Uranus, at that time believed to be the last planet in the solar system.
During the next century, ‘pitchblende’ was found in Cornwall, France, Austria and Romania, and by the end of the Victorian era, thousands of scientific papers had been published on geological and mineral occurrences of ‘uranium’. The metal, as dense as gold, was apparently the heaviest element on earth. Its principal value appeared to lie in the vivid colours of its oxides and salts, which were used to create glassware with an attractive fluorescent glow, or glazes for ceramics and porcelain in orange, yellow, red, green and black. Some of these techniques had been known since Roman times. No one suspected the invisible dangers lurking inside those decorative flourishes. Uranium’s more dramatic potentials emerged in 1896, when Becquerel discovered quite by accident that it gave off invisible rays capable of fogging photographic plates wrapped in lightproof black paper.
Just a few months earlier, the German physicist Wilhelm Roentgen was experimenting with electron beams in an apparatus similar to J.J. Thomson’s cathode ray vacuum tube, except that his electrons streamed from a heated metal cathode. He noticed that a fluorescent phosphor screen at the other end of his workbench began to glow. Yet another mysterious ray had revealed itself, this time by escaping from the tube altogether and flying across the room. When Roentgen placed a thick black card between the tube and the screen, the screen still glowed. Finally, he put his hand in front of the tube, and saw the silhouette of his bones projected onto the screen. He had no way of knowing that the electrons in his vacuum tube, flying off a particularly hot filament, were so energetic that they were generating a different kind of radiation when they knocked into atoms in the glass walls of the tube.
One week later, Roentgen discovered the ray’s most beneficial application. He captured an image of his wife’s hand on a photographic plate. Her bones (her wedding ring too) showed up as solid shadows. She was horrified by the death-like image, but the medical world quickly embraced this new discovery, which seemed little short of magic. Here was a tool that could look inside a living patient and reveal broken bones, or seek out the exact location of a shard of shrapnel prior to surgery. But what were these rays? An invisible kind of light? Something akin to Thomson’s corpuscles? At first no one was sure, and this is why they were called X-rays.
Becquerel’s uranium rays seemed to have similar characteristics. Pierre and Marie Curie were fascinated by these exciting emanations, which they described, in French, as ‘radio-actif’. The word ‘radio’, derived from the Latin geometrical term ‘radius’, did not then have the specific meanings we assign to it today. It simply meant ‘related to rays’. The Curies had discovered an invisible ‘active ray’. The English derivation of their term is one we all recognise and fear. It is ‘radioactive’.
By 1898 they had secured the use of a laboratory space granted to them by the School of Physics and Chemistry in Paris. Here they began the grimy and labour-intensive business of extracting microscopic quantities of pure uranium from huge, boiling vats of pitchblende solution. In truth, their laboratory was nothing more than a disused dissecting room purloined from a nearby medical school. It had an unreliable glass roof that let the rain in, and it was unbearably hot in summer and inhumanely cold in winter. Visitors were surprised by the roughness of the conditions, and by the noxious smells emanating from the chemical vats and gas burners. It was more like a grim factory than a laboratory. The German chemist Wilhelm Ostwald paid a visit on one of the rare days when the Curies were absent on other business. The lab, he wrote, ‘was a cross between a stable and a potato shed, and if I had not seen the worktable and items of chemical apparatus, I would have thought that someone had played a practical joke on me’.
Day after day the routine was the same. Sacks of pitchblende arrived freshly dug from the giant discarded slag heaps that still festered around the abandoned Joachimsthal mines. Marie would clean off the mud and grass and pine needles from the rough lumps, which then had to be ground down into fine powder, then boiled into a liquid which could be sieved and refined yet further. There was an acid bath to dissolve unwanted impurities. Then came an electrolysis process, similar to the method used for silver-plating cutlery. At the end of many months’ labour, just a few grammes of purified uranium might be extracted. ‘Sometimes I had to spend a whole day mixing a boiling mass with a heavy iron rod nearly as large as myself. I would be broken with fatigue at the day’s end’, wrote Marie. Yet she had no urge to complain, for Pierre was by her side, and their discoveries were mounting up. ‘It was in this miserable old shed that we passed the best and happiest years of our life.’