The rise of bombs: how physics lost its innocence

    On 30 October 1961, a year before John F Kennedy squared up to Nikita Khrushchev in the Cuban missile crisis (and, rather less momentously, a year to the day before I entered the world), the largest ever human-made explosion erupted 4,000 metres over the tundra of the Novaya Zemlya archipelago in the Arctic Ocean. It was caused by Tsar Bomba, “King of Bombs”, a Soviet thermonuclear device the size of a bus that weighed 27 tonnes and was, by itself, responsible for a tenth of the total energy released in all nuclear tests worldwide.

    The Hiroshima and Nagasaki bombs were squibs in comparison, despite causing probably over 200,000 deaths. They were old-fashioned atom bombs that derived their energy from the fission (splitting apart) of the atomic nuclei of the heavy elements uranium and plutonium rather than, like the later hydrogen bombs, from the fusion (joining together) of hydrogen atoms, the process that powers the sun. Tsar Bomba was too big for any regular aircraft and was delivered by a long-range bomber adapted so that the bomb hung beneath the fuselage. The plane’s crew were given a 50:50 chance of surviving the blast.

    Tsar Bomba’s mushroom cloud reached an altitude of 65km, above the stratosphere and two-thirds of the way to outer space. The explosion destroyed everything within a 25km radius and levelled all brick buildings 55km away. As the physicist Frank Close helpfully tells readers of his new book, Destroyer of Worlds, that is the equivalent of a blast over central London that flattens both Gatwick and Heathrow airports. It also broke windowpanes 900km away and disturbed the earth’s atmosphere for four days. A cameraman on a documentation flight accompanying the bomber reported a “bright orange ball” emerging through the clouds that “seemed to suck the whole earth into it”.

    Tsar Bomba provides the climax to Close’s book, and it captures the lunacy of the postwar nuclear arms race. There was no military need for such a device, Close explains, and it was probably unusable, being too heavy to be carried to an enemy target without an impractical fuel load. “The monstrosity was a machine of genocide,” he says: a sheer symbol of power demanded by Khrushchev.

    It was initially meant to be twice the size, but Andrei Sakharov, the physicist widely considered the “father of the Russian bomb” and recipient of the 1975 Nobel peace prize for his efforts towards nuclear disarmament, urged that the plan be scaled back. Sakharov had in fact insisted that no further testing of Soviet H-bombs was necessary at all by that stage, only for Khrushchev to scoff: “If we listened to people like Sakharov, I’d be a jellyfish, not chairman of the Council of Ministers.”

    Close recounts the story that led from the scientific discovery of the perplexing phenomenon of radioactivity at the end of the 19th century to potentially world-destroying weapons six decades later. It was a timespan short enough for some of the scientists to witness pretty much all of it. The Danish physicist Niels Bohr was awarded his master’s degree in Copenhagen in 1911, the year his later mentor Ernest Rutherford, arguably the greatest experimental physicist of the modern age, described the atomic nucleus. Bohr died a year after Tsar Bomba was detonated, his desperate efforts to avoid the cold war nuclear arms race having come to nothing.

    Marie Curie never saw the terrible harvest of her pioneering work on radioactivity – a word she coined – in weapons of mass destruction: the initially unrecognised dangers of that work killed her first. With her husband Pierre she isolated the radioactive elements radium and polonium from uranium ore in Paris at the turn of the century; her death from a blood disorder in 1934 was almost certainly caused by exposure to their deadly emanations. Her body, interred in a coffin lined with thick lead shielding, will remain hazardously radioactive for over a thousand years. The Curies’ daughter Irène, who herself became an expert in nuclear physics and won the 1935 Nobel prize in chemistry for her work, died aged 58 from leukaemia, again probably caused by exposure to polonium.

    The Curies did not actually discover radioactivity. That distinction belongs to the French scientist Henri Becquerel, who was trying to understand the origin of X-rays, discovered in 1895 – the launch point for Close’s survey. He found that uranium compounds spontaneously emit energetic rays that darkened photographic emulsion.

    Curie, seeking a subject for her doctorate at the University of Paris, decided to study these “uranic rays”, which most scientists found of little interest compared with the mysterious X-rays. She and Pierre realised that natural uranium ore contained substances far more intensely radioactive than uranium itself. Using painstaking chemical separation methods over several years, they isolated these intensely radioactive elements, in work that won them the 1903 Nobel prize in physics (shared with Becquerel). It was the first of two Nobels for Marie. Pierre was already showing signs of radiation sickness when he died in 1906 after being run over by a horse-drawn carriage on a Paris street.

    Radioactivity was deeply unsettling, the rays seeming to stream incessantly from atoms of elements like uranium. Where was all this energy coming from? Ernest Rutherford – working first at Cambridge, then at McGill University in Montreal, before moving to Manchester and returning to Cambridge in 1919 to head the famous Cavendish Laboratory – did more than anyone to answer that question. In 1899 he showed that there are two types of “uranic rays”, neither of which is really a ray (like light or X-rays) at all: both are electrically charged particles with mass, which he called alpha and beta particles.

    The first of these, he showed in 1907, are identical to helium atoms stripped of their electrons – what we now know as helium nuclei. (This identification with helium actually came before Rutherford’s discovery of the nuclear structure of the atom.) Beta particles turned out to be nothing but electrons, the subatomic constituents discovered by Rutherford’s mentor at the Cavendish, JJ Thomson, in 1897.

    In other words, the particles of radioactivity are themselves pieces of atoms. A third radioactive emanation, called gamma rays, was later identified too, these indeed being electromagnetic waves like light but with much shorter wavelengths. And it was found in the 1930s that another particle accompanies beta particles: the perplexing, electrically neutral and almost massless neutrino. Radioactivity wasn’t, then, a single or simple thing at all.

    All radioactive decay, however, involves a change in an atomic nucleus that transforms it into the nucleus of another chemical element. When uranium atoms decay, for example, they become atoms of the element thorium. Such transmutation was profoundly unsettling to Rutherford when he and the English chemist Frederick Soddy, working at McGill, first realised what was happening.

    The chemical identity of atoms had until then been supposed fixed and unchanging, which was why the alchemists’ dream of turning lead into gold was considered futile. “They’ll have our heads off as alchemists!” Rutherford exclaimed to his colleagues. Instead, they gave him a Nobel prize – in chemistry – in 1908 for his work on radioactive decay.

    In all this, radioactivity was offering fundamental insights into the nature of the atom, pictured by Rutherford as a dense nucleus orbited by electrons. But it did not seem very useful. The immense reservoir of energy in the nucleus was released in tiny amounts at a pace set by nature, and there seemed at first little we could do about that. This changed in 1919, when Rutherford showed that alpha particles could be used as projectiles to “split the atom” artificially. Even so, Rutherford famously proclaimed in 1933 that it was “moonshine” to suppose the atom’s energy could ever be harnessed as a power source (Close is sympathetic to claims that he worded his scepticism more cautiously than was reported).

    How that picture changed during the 1930s is a complex story that Close tells crisply and precisely. It hinged on the discovery in 1932 of the nuclear particle called the neutron – “the moment”, Close writes, “when the science of nuclear physics began”. The neutron “would become key to unleashing forces able to destroy life on Earth”.

    Because it has no electrical charge and so is not repelled by the positively charged nucleus, the neutron (which Irène Curie kicked herself for not discovering first) is a better projectile for inducing nuclear decay and the release of nuclear energy. In particular, neutrons are released when a uranium nucleus splits and can then split other uranium nuclei in turn – the process of nuclear fission, discovered by the German chemists Otto Hahn and Fritz Strassmann in Berlin in 1938.

    The duo could not understand their results, however; they were correctly interpreted as fission by Hahn’s former collaborator Lise Meitner – who, as an Austrian Jew, had had to flee Germany for Sweden after the Anschluss – working with her nephew Otto Frisch. Given a large enough piece of uranium, fission can be induced in an accelerating chain reaction that liberates the nuclear energy all at once – in a bomb.

    As war broke out, it was clear to all the nuclear physicists what the discovery of fission implied. Legend ascribes the impetus for the Manhattan Project, which created the Hiroshima and Nagasaki bombs, to a letter written to Franklin Roosevelt by the exiled Albert Einstein, fearful that his former colleagues in Germany might deliver an atom bomb to Hitler. As Close explains, the US efforts to develop nuclear technology – not just a bomb but a reactor – probably owe more to the research on the feasibility of a bomb by Frisch and Rudolf Peierls, another Jewish émigré, in England, which was conveyed to scientists in America in 1941. 

    From that point, the work that would lead to Los Alamos, Tsar Bomba, and the Cuban missile crisis took on a momentum of its own, beyond the ability of the scientists to control. “Now I am become Death, the Destroyer of Worlds”, the Manhattan Project’s scientific leader J Robert Oppenheimer portentously quoted from the Bhagavad Gita after the Trinity bomb test of July 1945. Others – like Bohr, and the physicist Joseph Rotblat, who quit the project when it was clear the war with Germany was over and campaigned against nuclear weapons for the rest of his life – were less grandiose and, like Sakharov, more ready to engage directly in the political process. It was they who acted as the real conscience of science, seeking to avert the madness that their enquiry into nature had unleashed.

    Close’s book is a history (and a splendid one), so it would perhaps be unfair to expect it to dwell on the perilous situation today, in which nuclear-armed Russia wages war on Europe’s border, India and Pakistan square up to one another in Kashmir, and the finger of an infantile narcissist hovers over the American button. But missing from it is any reflection on the change the nuclear age brought to both the public perception and the self-image of science – some sense of cultural as well as scientific history.

    By 1964 the scientist had morphed into Kubrick’s Dr Strangelove, an amoral blend of the H-bomb inventor Edward Teller and the German rocket scientist Wernher von Braun. Physics had lost its innocence. The world has lived in the shadow of the apocalypse ever since.

    Destroyer of Worlds: The Deep History of the Nuclear Age 1895-1965, by Frank Close, is published by Allen Lane
