I have just finished reading an excellent book that traces various theories about beta decay in the first third of the Twentieth Century. Controversy and Consensus: Nuclear Beta Decay 1911-1934 is an edited version of the successful PhD thesis written by Carsten Jensen, who clearly had a deep passion for unravelling physics history but died of cancer soon after he was awarded his doctorate. This post has been inspired by that book.

When radioactivity is taught at school it is introduced using the conservation of mass number and charge (https://physbang.com/2022/06/05/alpha-and-beta-nuclear-decay/) before moving on to conservation laws that refer to different types of particles.

Protons and neutrons are both baryons and their total number (the baryon number) must remain constant during radioactive decay. Beta particles are high-speed electrons, which belong to a different class of particles known as leptons. As with baryons, the total lepton number must also remain constant through any decay process.

This causes a problem: there are no leptons in the nucleus, yet leptons are detected, as beta particles, after nuclear decay has occurred. The possibility that electrons could exist in the nucleus, bound to protons to make neutrons, was taken seriously during the early days of nuclear physics and is still used sometimes as a stepping-stone idea in school, but it simply isn’t true.

In order to balance having no leptons before decay occurred, we must also have no leptons (in total) after the decay. We clearly DO have a lepton (the beta particle), so the conservation of particle type appears to have failed. We avoid this issue by saying the electron is accompanied by another particle, called the electron antineutrino, which is very hard to detect. Antineutrinos are anti-leptons, so the decay produces a total lepton number of zero and the conservation laws are obeyed once more.
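As a sketch of the bookkeeping, writing the decay in modern notation as a neutron in the nucleus turning into a proton:

$$n \rightarrow p + e^- + \bar{\nu}_e$$

Baryon number: 1 before, 1 + 0 + 0 = 1 after. Lepton number: 0 before, 0 + 1 + (−1) = 0 after, because the electron counts as +1 and the antineutrino, being an anti-lepton, counts as −1. Without the antineutrino the lepton total on the right would be +1 and the conservation law would fail.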

Beta decay expressed complete with the accompanying electron antineutrino, which is hard to detect on account of having zero charge and negligible mass. The bar above its symbol indicates that it is an anti-particle.

Neat though this is when presenting ideas to students, it isn’t how things happened in the history of physics, which is where Carsten Jensen’s treatise comes in.

Early in the Twentieth Century, alpha particles were known to have energies characteristic of the decay process that had produced them. The energy of an emitted alpha particle is fixed by the energy difference between the parent nucleus and its decay products. The same was assumed to be true for beta particles but the experimental data suggested otherwise.
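To see why, as a rough sketch: the energy released in an alpha decay (the Q-value) is fixed by the masses involved,

$$Q_\alpha = \big[\,m(\text{parent}) - m(\text{daughter}) - m(\alpha)\,\big]c^2,$$

and because only two bodies fly apart, conservation of momentum forces the alpha particle to take a definite share of that energy (nearly all of it, since the daughter nucleus is far heavier). Every decay of a given parent therefore yields alpha particles of essentially the same energy.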

The first controversy hinged on the absorption of beta particles as they travelled through different thicknesses of the same material. Working in Berlin, Lise Meitner and Otto Hahn found that the transmitted intensity fell off exponentially with thickness, suggesting the beta particles were homogeneous (that is, they all had the same energy). This was consistent with previous expectations.

But what if the notion of homogeneous beta particles was wrong? Over in England, Ernest Rutherford suggested an investigation to check this and one of his students, William Wilson, took up the challenge. Using magnetic deflection and a shielded detector, Wilson found that single-energy beta particles exhibited approximately linear absorption, not exponential.
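To illustrate the distinction with a sketch (illustrative forms, not either group’s actual fitting functions): exponential absorption means the transmitted count falls as

$$N(x) = N_0\, e^{-\mu x}$$

for absorber thickness $x$ and absorption coefficient $\mu$, with no definite cut-off, whereas Wilson’s single-energy beta particles behaved more like a straight-line fall,

$$N(x) \approx N_0\left(1 - \tfrac{x}{R}\right),$$

dropping to zero at a definite range $R$.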

The exponential relationship discovered by Meitner and Hahn could be explained if the beta particles had a variety of energies, but that seemed to clash with the conservation of energy. Fortunately, there was a way out: daughter nuclei undergo their own decays, so a small spectrum of beta energies would still be expected and all the experimental data, both exponential and linear, could be valid.

Rutherford, whose name is most commonly associated with the gold-foil scattering experiment (https://physbang.com/2020/10/04/rutherfords-gold-foil-experiment/), proposed a different solution: he suggested beta particles from a single decay process could have different energies if they were accompanied by gamma rays that carried the appropriate amount of energy required to maintain a constant total for every decay.

Further experiments conducted by Meitner and Hahn in 1911 revealed clear evidence that beta particles were released with a number of different energies, not a single energy as they had previously believed. The energies did not vary continuously but rather took a number of fixed values: sadly, although that number started small it quickly grew, thanks to a new and more sensitive experimental technique invented by the previously unknown physicist Jean Danysz.

The minor inconvenience of multiple beta energies became a flood of data that challenged physicists’ understanding of the nucleus and nuclear decay processes. Things got worse in 1914 when James Chadwick adopted Hans Geiger’s discrete-event counter in place of the analogue photographic and ionisation methods used previously. Now, save for a few obvious peaks, beta particles seemed to exhibit a continuous spectrum of energies.

The peaks were explained either by internal energy transfer, as nuclear electrons made their way past the atom’s orbiting electrons, or by proposing that some of the beta particles were transformed into gamma rays within the nucleus, and that these gamma rays then ionised orbiting electrons from the innermost shells. These two approaches, which were championed by Lise Meitner (in Berlin) and Charles Ellis (in Cambridge), are discussed in enormous detail in Carsten Jensen’s thesis.

The same two parties also represented opposite views on the origin of the continuous spectrum: Ellis said it was an intrinsic part of the nuclear decay process, whereas Meitner claimed it was due to subsequent effects after the decay had taken place. Importantly, both models failed to explain data obtained from the decay of radium-E (now known to be bismuth-210), and the validity of continuous spectra was further challenged by the rise of quantum physics. This in turn led to what would previously have been unthinkable: a suggestion that energy might not be conserved on the quantum scale.

The idea of energy non-conservation was supported by none other than Niels Bohr, the man who successfully modelled energy levels for orbiting electrons, building on the nuclear structure of the atom that Rutherford had established. His nemesis in this debate was Wolfgang Pauli, who insisted energy must be conserved at all times and on all scales.

Pauli initially followed in Rutherford’s footsteps, echoing the idea that continuous spectra could be explained by gamma rays carrying the balance of energy. When the gamma rays’ energies didn’t add up as expected, in 1930 Pauli proposed a new particle with zero charge and little or no mass, making it almost impossible to detect. Originally called the neutron, Pauli’s elusive particle was subsequently renamed the neutrino (by Enrico Fermi) after Chadwick announced his discovery of the “heavy” neutron in 1932.
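The way the new particle rescues energy conservation can be sketched with a simple balance (in modern notation rather than Pauli’s original): each decay releases a fixed energy $Q$, shared between the electron and the antineutrino,

$$Q = E_{e} + E_{\bar{\nu}},$$

with the split varying from one decay to the next. The electron energies therefore form a continuous spectrum running up to an end-point at $Q$ (where the antineutrino carries away almost nothing), yet the total energy released in every individual decay is still exactly $Q$.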

Despite resorting to an almost undetectable particle that would remain entirely theoretical until its eventual detection in 1956, Pauli had been right and Bohr was wrong: energy is indeed always conserved in radioactive decay. The notion expressed earlier, that neutrinos were “invented” to satisfy particle conservation, is misleading and obscures the scientific debates that led to their proposal. This is the injustice that Carsten Jensen put right in his thesis and I thoroughly recommend Controversy and Consensus: Nuclear Beta Decay 1911-1934 (ISBN: 978-3-7643-5313-1) to those who want the full details. The book is available from various online sellers.
