
Flash Physics: Nuclear diamond battery, M G K Menon dies, four new elements named

Diamond batteries run on nuclear waste

Radioactive waste from nuclear reactors could be used to create tiny diamonds that produce small amounts of electricity for thousands of years. That is the claim at the heart of a proposal from researchers at the University of Bristol in the UK, who say they have a practical way of dealing with some of the nearly 95,000 tonnes of radioactive graphite that was used as a moderator in the UK’s nuclear reactors. The idea is to make the waste less radioactive by removing radioactive carbon-14 nuclei, which are concentrated on the surface of the graphite. The isotope would then be integrated into artificial diamonds. Carbon-14 has a half-life of about 5700 years and decays to non-radioactive nitrogen-14 by emitting a high-energy electron. It turns out that diamond is very good at turning the energy released in the decay into an electrical current – essentially creating a battery that will last for thousands of years. Embedding carbon-14 in diamond is a safe option, say the researchers, because diamond is hard and non-reactive, so it is unlikely that the radioactive carbon will leak into the environment. And because nearly all of the decay energy is deposited within the diamond, the radiation emitted by such a battery would be about the same as that emitted by a banana. The team reckons that a diamond battery containing about 1 g of carbon-14 would deliver about 15 joules per day. A standard 20 g AA battery could sustain this power for about 2.5 years, whereas the diamond battery would last hundreds of years without a significant drop in output. “We envision these batteries to be used in situations where it is not feasible to charge or replace conventional batteries,” says Bristol’s Tom Scott. 
“Obvious applications would be in low-power electrical devices where long life of the energy source is needed, such as pacemakers, satellites, high-altitude drones or even spacecraft.” The team has already shown that the device could work by placing a non-radioactive diamond next to nickel-63, which emits high-energy electrons.
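The longevity claim follows directly from carbon-14's exponential decay law. A minimal sketch, using the commonly quoted 5730-year half-life and the 15 J/day figure above (the numbers are illustrative, not the Bristol team's own model):

```python
import math

HALF_LIFE_YEARS = 5730           # carbon-14 half-life
INITIAL_OUTPUT_J_PER_DAY = 15.0  # figure quoted for ~1 g of carbon-14

def output_after(years: float) -> float:
    """Daily energy output after a given time, following exponential decay."""
    return INITIAL_OUTPUT_J_PER_DAY * 2 ** (-years / HALF_LIFE_YEARS)

# After a century the drop is barely measurable...
print(round(output_after(100), 2))   # 14.82 J/day
# ...and even after one full half-life the battery still delivers half power
print(round(output_after(5730), 2))  # 7.5 J/day
```

This is why the output stays essentially flat over any engineering timescale: a century is less than 2% of one half-life.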

M G K Menon 1928–2016

Photograph of M G K Menon and Cecil Powell

The Indian particle physicist and cosmic-ray expert M G K Menon has died at the age of 88. Menon was educated at Jaswant College, Jodhpur, and the Royal Institute of Science in Bombay (now Mumbai), before moving to the University of Bristol in 1953, where he did a PhD in particle physics under the supervision of Nobel laureate Cecil Powell. Two years later, he joined the Tata Institute of Fundamental Research in Bombay, researching cosmic rays before becoming the institute’s director from 1966 to 1975. Later in his career, Menon was appointed to a number of notable policy positions. He became a member of India’s Planning Commission from 1982 to 1989 and was science advisor to Indian prime minister Rajiv Gandhi from 1986 to 1989. In 1989 he became minister of state for science and technology and education, and a year later was elected as a member of parliament.

Four new elements officially named

The International Union of Pure and Applied Chemistry (IUPAC) has officially named four new elements: 113, 115, 117 and 118. Element 113 was discovered at the RIKEN Nishina Center for Accelerator-Based Science in Japan and will be called nihonium (Nh). Nihon, one of the Japanese names for Japan, translates as “land of the rising sun”. Moscovium (Mc) is the new moniker for element 115, which was discovered at the Joint Institute for Nuclear Research (JINR) in Dubna, near Moscow. Element 117 will be called tennessine (Ts) after the US state of Tennessee, which is home to the Oak Ridge National Laboratory, while element 118 will be named oganesson (Og) after the Russian physicist Yuri Oganessian, who led the team at JINR that discovered the element. “The names of the new elements reflect the realities of our present time,” says IUPAC president Natalia Tarasova. She adds that the names reflect the “universality of science, honouring places from three continents where the elements have been discovered – Japan, Russia, the US – and the pivotal role of human capital in the development of science, honouring an outstanding scientist – Yuri Oganessian”. The names were proposed in June and then underwent a five-month consultation period before they were approved by the IUPAC Bureau on Monday.

 

  • You can find all our daily Flash Physics posts in the website’s news section, as well as on Twitter and Facebook using #FlashPhysics. Tune in to physicsworld.com later today to read today’s extensive news story on creating ghost images with atoms.

Photons created in a superposition of two colours

Individual photons have been put into a quantum superposition of two different colours by a team of physicists in the US and Germany. Such photons could be useful for connecting different parts of quantum-information networks that operate using differently coloured light.

Superposition is an important concept of quantum mechanics that allows a physical system to be in two or more quantum states at the same time – until a measurement on the system puts it into a specific state. A photon, for example, can be in a superposition of a horizontally polarized state and a vertically polarized state until it passes through a polarizer.

Information can be encoded into quantum states and then processed in a quantum computer, which uses superposition and other features of quantum mechanics to process information much faster than is possible with conventional computers.

Two-colour states

Normally when physicists think of a photon, it is in a well-defined energy state having a specific colour. However, quantum mechanics allows the photon to be in a superposition of two or more energy states – or colours. In this latest work, Stéphane Clemmen and colleagues at Cornell University, Humboldt-University Berlin and Columbia University have created photons that are “bichromatic” by being in a superposition of two different colours.

The team made the bichromatic photons using a technique called “Bragg-scattering four-wave mixing”. This takes place in a 100 m-long optical fibre that is pumped with two laser beams. When a “red” photon is shone into the fibre, it interacts with the laser light and is put into a bichromatic superposition of the initial red state and a second “blue” state.

The set-up can be adjusted so that the photon emerges from the opposite end of the fibre with an equal probability of being either red or blue when its colour is measured.

Phase proof

Clemmen and colleagues were also able to adjust the relative phase between the red and blue states in the quantum superposition. This allowed them to create photons that were all blue when detected, or all red, or a specific combination of red and blue. This ability to adjust the phase is proof that the photons were in a coherent quantum superposition. The team also showed that to a very high probability, the experiment detects one photon at a time – which means that the researchers are really seeing single photons in a superposition of two colours.
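The phase dependence described above can be sketched numerically. For a photon in the state (|red⟩ + e^{iφ}|blue⟩)/√2, mixing the two colour modes in a 50/50 interference step (a simplified stand-in for the experiment's conversion stage, not the team's exact set-up) gives detection probabilities that swing between all-red and all-blue as the phase is tuned:

```python
import cmath
import math

def detection_probs(phi: float):
    """Probabilities of measuring 'red' or 'blue' for a photon in the
    superposition (|red> + e^{i*phi}|blue>)/sqrt(2), after a 50/50
    interference step that mixes the two colour modes."""
    red_amp = (1 + cmath.exp(1j * phi)) / 2
    blue_amp = (1 - cmath.exp(1j * phi)) / 2
    return abs(red_amp) ** 2, abs(blue_amp) ** 2

print(detection_probs(0.0))               # (1.0, 0.0): always detected red
p_red, p_blue = detection_probs(math.pi)
print(round(p_blue, 6))                   # 1.0: always detected blue
p_red, p_blue = detection_probs(math.pi / 2)
print(round(p_red, 6), round(p_blue, 6))  # 0.5 0.5: an equal split
```

The key point is that these probabilities depend on the phase at all – a classical mixture of red and blue photons would give a 50/50 split regardless of φ, which is why the phase dependence is evidence of a coherent superposition.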

The technique could someday be used to connect quantum devices that operate using different colours of light. Two quantum memories, for example, could be put into a state of quantum entanglement by inputting a bichromatic photon. Such entangled memories would prove useful for a range of quantum-computing and quantum-communication applications. Other potential uses include spectroscopy measurements on living samples such as eyes, which must be done using very low levels of incident light.

The research is described in Physical Review Letters.

Flash Physics: Exotic cosmic rays have mundane origins, Swiss reactors keep running, programmable material

Exotic cosmic rays have mundane origins

Measurements made by the Alpha Magnetic Spectrometer (AMS) on the International Space Station suggest that exotic cosmic rays comprising boron nuclei have rather mundane origins. Astrophysicists divide cosmic rays into two categories: primary and secondary. Primary cosmic rays are produced in supernovae and other violent astrophysical processes, whereas secondary cosmic rays are created when their primary cousins collide with gas atoms in the interstellar medium. The vast majority of carbon-nuclei cosmic rays are thought to be primary in origin, whereas all boron cosmic rays are thought to be secondary in nature. As a result, the ratio of boron-to-carbon cosmic rays (B/C) reaching the AMS should provide a measure of the average amount of interstellar matter that the cosmic rays have passed through. There are several models that predict the shape of the B/C spectrum as a function of energy, but previous balloon-borne measurements of the B/C ratio were not precise enough to decide which model is best. After analysing 80 billion cosmic rays collected over five years, AMS physicists have concluded that a relatively simple model developed in 1941 by the Russian mathematician Andrey Kolmogorov best describes the data. The result is of great interest to physicists studying the apparent excess of cosmic-ray positrons that reach the Earth. These particles were expected to have been created by secondary processes similar to those that produce the boron nuclei, but the AMS results could mean that there are hitherto unknown astrophysical sources of positrons in the universe. The research is described in Physical Review Letters.

Swiss reject nuclear phase-out

Switzerland has voted to reject an early shutdown of the country’s five ageing nuclear reactors in a referendum held yesterday. Some 54.2% of people voted “No” – on a turn-out of 45% – to phasing out the country’s nuclear plants by 2030. In Switzerland, nuclear power provides around a third of electricity – the second largest source behind hydro. A few months after the Fukushima nuclear disaster in Japan in March 2011, however, the Swiss government abandoned plans to build new nuclear power plants. The referendum held yesterday was to decide whether the existing plants should be closed before their expected lifetimes come to an end. The plants are now likely to continue operating well into the 2030s, subject to approval from safety regulators.

Defects allow material properties to be programmed

Researchers at Purdue University in Indiana have unveiled a new type of cellular material with physical properties that can be “programmed” after manufacture. The honeycomb-like structures are made from shape-memory polymers and contain engineered defects that make the materials respond in certain ways to external forces. The programming can be done by heating the material and then applying a force to change its shape. The new shape is then retained when the material cools down. The stiffness of one material, for example, increases by 55% when it is compressed by 5%. “That is pretty impressive because ordinarily you would have to fabricate a new material with at least twice the thickness of the walls to obtain a material with a 50% increase in stiffness,” says Purdue’s David Restrepo. Possible applications of the new materials include acoustic metamaterials that can be tuned to absorb sound at specific frequencies and “stealthy” surfaces that do not reflect radar waves. Other uses, according to the researchers, include protective helmets and car seats that adjust to a driver’s weight. The materials are described in two papers in the International Journal of Solids and Structures.

 

  • You can find all our daily Flash Physics posts in the website’s news section, as well as on Twitter and Facebook using #FlashPhysics. Tune in to physicsworld.com later today to read today’s extensive news story on qubits with two colours.

Torsion-bar antenna adds new twist to gravitational-wave search

Physicists in Japan have developed a new kind of compact gravitational-wave detector that works by measuring the tiny rotations of two suspended blocks of aluminium. A far cheaper alternative to the more conventional interferometer-based devices, this “torsion-bar antenna” could plug a gap in the gravitational-wave spectrum – between the high-frequency waves observable today from the ground and the lower-frequency radiation potentially detectable in space – so expanding the range of very massive objects that astronomers can study.

Gravitational waves are ripples in the fabric of space–time predicted by Albert Einstein in 1916 and detected directly for the first time last September by the Laser Interferometer Gravitational-wave Observatory (LIGO) in the US. Each of LIGO’s two detectors is a laser interferometer with two 4 km-long arms at right angles to each other. A passing gravitational wave can stretch one arm by a minuscule amount while compressing the other – and these changes can be measured with very high precision.

The LIGO detectors are shielded from terrestrial vibrations by suspending the interferometer mirrors – turning each mirror into a pendulum. As a result, LIGO cannot detect gravitational waves with frequencies below about 1 Hz, which is the resonant frequency of the mirror pendulums. Since heavier astronomical objects emit gravitational waves at lower frequencies, LIGO is only able to study relatively small objects – its first signal having been produced by the merger of two black holes weighing in at about 30 times the mass of the Sun.

Detectors in space

To observe gravitational waves at lower frequencies, many scientists are instead looking to vibration-free space. Among proposed missions is the European Space Agency’s evolved Laser Interferometer Space Antenna (eLISA), which is due to be launched in the early 2030s. This would fire laser beams between free-floating test masses arranged in a triangular formation with arms a million kilometres long, and would target waves with frequencies between about 0.1 and 100 mHz.

In contrast, Masaki Ando of the University of Tokyo and colleagues aim to detect low-frequency gravitational waves on the ground – at a cost of just a few million dollars. Their detection process involves monitoring the effect of passing gravitational waves on two bar-shaped test masses positioned at right angles to one another and which rotate around a common axis of suspension. Rather than recording a length change, the Japanese group instead measures a tiny relative rotation – the waves would cause one test mass to move in a clockwise direction while sending the other anticlockwise.


In this set-up the resonant frequency is not fixed by the strength of gravity and the length of the suspension – as for a pendulum – but instead by the suspension wire’s torsional stiffness, diameter and length, as well as the bar’s moment of inertia. Putting forward their idea in 2010, Ando and co-workers calculated that 10 m-long bars suspended by very narrow, soft wires would have resonant frequencies as low as a few millihertz and could turn through angles as small as 10⁻¹⁷ of a degree. This, say the researchers, would enable them to detect significant numbers of merging intermediate-mass black holes, which can weigh in at up to about a million solar masses.
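The scaling can be illustrated with the textbook torsion-pendulum formula, f = (1/2π)√(κ/I), where κ = πGr⁴/2L is the torsional stiffness of a cylindrical wire and I = ml²/12 is the bar's moment of inertia. The numbers below are illustrative guesses, not the group's actual design parameters:

```python
import math

def torsion_resonance_hz(shear_modulus_pa, wire_radius_m, wire_length_m,
                         bar_mass_kg, bar_length_m):
    """Resonant frequency f = (1/2pi)*sqrt(kappa/I) of a bar hung on a wire.

    kappa = pi*G*r^4/(2*L) is the torsional stiffness of a cylindrical wire;
    I = m*l^2/12 is the bar's moment of inertia about its centre.
    """
    kappa = math.pi * shear_modulus_pa * wire_radius_m ** 4 / (2 * wire_length_m)
    inertia = bar_mass_kg * bar_length_m ** 2 / 12
    return math.sqrt(kappa / inertia) / (2 * math.pi)

# Guessed numbers: a 10 m, 100 kg bar on a 1 m tungsten wire (G ~ 161 GPa)
# of 0.5 mm radius
f = torsion_resonance_hz(161e9, 0.5e-3, 1.0, 100.0, 10.0)
print(f"{f * 1000:.2f} mHz")  # sub-millihertz, in line with the proposal
```

Because κ grows as the fourth power of the wire radius while I grows with the bar's length squared, a long heavy bar on a thin wire pushes the resonance far below the ~1 Hz floor of a pendulum suspension.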

Prototype built

The researchers have now built a small prototype detector comprising two bars, each 24 cm long. The detector is shielded from vibrations and the researchers used a laser interferometer to achieve angular sensitivities of up to 10⁻⁸ of a degree.

The team also showed that its antenna would be able to obtain three independent measurements from each passing gravitational wave – the average of the two bars’ horizontal rotation and both vertical rotations. According to Ando’s colleague Ayaka Shoda of Japan’s National Astronomical Observatory, this increases the chances of detecting a wave in the first place (given that its direction would be unknown) and also provides more information about the wave’s source, such as its location and rate of spin.

Shoda says that the biggest technical challenge in building the full-scale version of the antenna will be developing the cryogenics needed to reduce vibrations in the bars and wire, pointing out that the cryopump, which will be connected to the bars, will itself vibrate. She estimates that reaching design sensitivity could take anywhere between 10 and 20 years, but says that as an intermediate step, the team first plans to demonstrate an angular sensitivity of some 10⁻¹³ of a degree. At this point, their device could pick up fluctuations in the local gravitational field due, for example, to seismic waves or atmospheric sound waves. Indeed, the researchers say that their technology might one day be used to generate earthquake alerts, given that gravitational effects travel at the speed of light while seismic waves typically travel at just a few times the speed of sound.

Despite the work that still needs to be done on the torsion-bar technology, Hartmut Grote of the Albert Einstein Institute in Hannover, Germany, believes the concept is worth pursuing. “It will take quite a while, plus uncertainties of funding, to get to an astrophysically interesting sensitivity,” he says. “But it is probably the best concept for a roughly 1 Hz ground-based detector proposed to date.”

Also enthusiastic is Jan Harms of the University of Urbino in Italy. He underlines how difficult it will be to remove gravitational noise from observations, noting that ideas for carrying out such screening remain unproven. But he says it is “important to close the frequency gap” between ground-based interferometers and LISA, and believes that the torsion-bar antenna is “one of the most promising concepts” for doing so.

The research is reported on arXiv.

Flash Physics: Spin-Hall effect switches magnet, ‘Big Bell Test’ kicks off, quasiparticles multiplex light

Spin-Hall effect switches insulator’s magnetic state

The magnetization direction of a magnetic insulator has been switched by passing an electrical current through a metal layer adjacent to it. The new switching technique has been developed by Caroline Ross, Geoffrey Beach and colleagues at the Massachusetts Institute of Technology in the US, who describe it in Nature Materials. The technique takes advantage of the spin-Hall effect, whereby an electrical current can generate a spin current that flows in a direction perpendicular to the charge current. In this experiment, the electrical current flows along a layer of platinum that is adjacent to a layer of garnet – which is a magnetic insulator. “The spin current interacts with the magnetic moment of the garnet, exerting a spin torque on it, and this torque is strong enough to switch the garnet’s magnetization,” explains Ross. The technique could be used to write information to magnetic memory devices based on magnetic insulators. Data are currently written to magnetic memories by generating magnetic fields, which is much trickier and more expensive to achieve than simply creating an electrical current. “We can also use electric effects to read back the state of the magnetic material, which allows us to make an all-electrical magnetic device,” adds Ross. A longer version of this article appears on nanotechweb.org.

“Big Bell Test” will use human randomness to test quantum physics

Twelve physics labs worldwide will conduct a series of quantum-physics experiments on 30 November with the help of a global army of volunteers. Dubbed the “Big Bell Test”, the event involves members of the public playing an online game that challenges players to create random sequences of binary bits. These numbers will then be used to control experiments that perform Bell tests. These test the idea that two quantum particles such as photons can be in an entangled state in which a measurement on one particle instantaneously affects the other – no matter how far apart they may be. Named after the physicist John Bell – who derived an inequality that quantifies entanglement – Bell tests have proven difficult to do in the lab. This is because practical implementations include one or more “loopholes” whereby non-quantum effects cannot be ruled out as the cause of the observed correlations. The Big Bell Test aims to use human-generated random numbers to ensure that measurement biases are not introduced into several Bell-test experiments. The event is co-ordinated by the Institute of Photonic Sciences in Barcelona. It begins on Wednesday at 00:00 local time in Brisbane, Australia, and ends at 23:59 local time in Boulder, Colorado, US.
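The quantity such experiments measure can be sketched with the CHSH form of Bell's inequality: any local hidden-variable model obeys S ≤ 2, while quantum mechanics predicts up to 2√2 for a maximally entangled photon pair. A minimal calculation of the quantum prediction at the standard analyser angles:

```python
import math

def chsh(a, a2, b, b2):
    """CHSH parameter S for a maximally entangled photon pair, using the
    quantum correlation E(x, y) = cos(2*(x - y)) at analyser angles x, y."""
    E = lambda x, y: math.cos(2 * (x - y))
    return abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))

# Standard angle choices: 0, 45, 22.5 and 67.5 degrees
S = chsh(0.0, math.pi / 4, math.pi / 8, 3 * math.pi / 8)
print(round(S, 4))  # 2.8284 = 2*sqrt(2), beating the classical bound of 2
```

The volunteers' bit streams play the role of the angle choices: if the settings are genuinely unpredictable, a measured S above 2 cannot be blamed on biased setting choices.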

Quasiparticles multiplex light

Diagram of how the multiplexer works

An optical device that uses quasiparticles to convert one optical signal into two signals at different colours has been unveiled by Hyun Seok Lee and colleagues at the Institute for Basic Science, in Suwon, Korea. The device comprises two tiny pieces of semiconductor – molybdenum sulphide and tungsten selenide – that are about 2 μm apart and connected by a tiny silver nanowire. The device works by shining green light onto the nanowire at the molybdenum-sulphide side of the device. This creates quasiparticles called surface plasmon polaritons (SPPs) on the silver. The SPPs then create electron–hole pairs in the molybdenum sulphide. These pairs remain bound to each other and are described as quasiparticles called excitons. Eventually, the excitons decay and some of their energy goes into creating orange light, which is emitted from the device. The remaining energy creates new SPPs, which propagate along the nanowire to the tungsten selenide. There the SPPs create more excitons, which then decay to create red light that is emitted from the device. As a result, the device works as a multiplexer that converts green light into orange and red light. The conversion process occurs very quickly and this combined with the tiny size of the device means that it could someday find use in high-speed computers of the future that use light – rather than electrical signals – to process information. The device is described in Nature Communications.

 

  • You can find all our daily Flash Physics posts in the website’s news section, as well as on Twitter and Facebook using #FlashPhysics. Tune in to physicsworld.com later today to read today’s extensive news story on a new gravitational wave detector.

New optical device absorbs just one photon

Physicists in Germany have created a new optical device that can absorb exactly one photon. They say that this device, which exploits the physical properties of giant micron-sized atoms known as Rydberg atoms, could be used in optical quantum computing networks of the future.

Sebastian Hofferberth of the University of Stuttgart explains that the device first behaves like a dark sunglasses lens, but once it absorbs its first photon it becomes transparent to light. One important application of the device, says Hofferberth, could be to absorb single photons from a quantum network. Another potential application is a precise photon counter, which could be made by putting a number of the devices in series.

Atomic cloud

At the heart of the device is a micron-sized diffuse cloud of rubidium atoms cooled to near absolute zero. To make the cloud absorb only a single photon, the researchers first illuminate it with laser light that has precisely enough energy to excite an atom’s outermost electron into the 121st energy level. There the electron is about a thousand times further from the nucleus than it would be in the atom’s ground state. Such atoms have radii of more than a micron and are known as Rydberg atoms.
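The huge size quoted above follows from the n² scaling of hydrogen-like orbits. A rough order-of-magnitude check – treating rubidium's valence electron as hydrogen-like, which is only approximate, and taking its ground configuration as n = 5:

```python
BOHR_RADIUS_NM = 0.0529  # Bohr radius in nanometres

def orbit_radius_nm(n: int) -> float:
    """Hydrogen-like orbital radius, which grows as n squared."""
    return BOHR_RADIUS_NM * n ** 2

ground = orbit_radius_nm(5)     # rubidium's valence electron starts in n = 5
rydberg = orbit_radius_nm(121)  # the state excited in the experiment
print(round(rydberg))           # ~775 nm: micron-scale, as stated
print(round(rydberg / ground))  # ~586: the (121/5)^2 scaling factor
```

The crude estimate gives a factor of roughly 600 – the same order of magnitude as the "thousand times" quoted; the exact numbers depend on rubidium's quantum defects, which this sketch ignores.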

When a rubidium atom in the cloud absorbs a single photon to become the first Rydberg atom, no other atom can accept another photon from the laser beam. This is because the first Rydberg atom’s outermost electron is so far from its nucleus that it overlaps with all the other atoms in the cloud, changing their electronic structures. “The presence of the first Rydberg atom has such a strong influence that it changes the resonance conditions for all the other atoms,” Hofferberth says, adding “Rydberg atoms can interact with its neighbours about 10 microns away”. Because no other atoms can absorb photons, the cloud becomes transparent.

To verify that only one photon had been captured, the researchers use the fact that the outer electron is loosely bound to the Rydberg atom’s nucleus. “They’re very fragile,” Hofferberth says. So to confirm that the cloud had absorbed only one photon, he and his colleagues converted the Rydberg atom into a rubidium ion by knocking the outermost electron away. Then they counted how many rubidium ions were present – and measured only one.

Delicate process

Creating this photon absorber was experimentally difficult, Hofferberth says. While laser cooling and trapping the rubidium atoms is a standard technique, creating the atomic cloud and the single Rydberg atom is still a very delicate process.

The concept behind this single-photon absorber was first proposed in 2011, says Alexey Gorshkov, a physicist at the University of Maryland who has collaborated with Hofferberth in the past, but was not involved in this most recent work. “These guys have implemented it, which is pretty cool,” Gorshkov says. However, he points out that when the device absorbs a single photon, it also distorts the signal of subsequent photons passing through it, which may complicate its use in quantum information applications.

Hofferberth explains that his team’s overarching goal is to create an array of general tools to precisely add, subtract and control individual photons. “We have now built the most primitive version of such a tool for manipulating light,” he says. “We can subtract exactly one photon.” A similar single-photon absorber based on a different physical mechanism was unveiled in 2015 by Barak Dayan and colleagues at the Weizmann Institute of Science in Israel, and it is too early to tell which will prove the more effective tool. The next step, according to Hofferberth, is to create a device that does the reverse – a collection of atoms that can produce exactly one photon.

The research is described in Physical Review Letters.

Chasing gravitational waves in song, physicists on Broadway, the 'impossible space engine' returns

 

By Hamish Johnston

These days anyone making a major breakthrough in physics is expected to follow up with a cheesy music video. So give it up for The Mavericks and “Chasing the Waves”, which chronicles the quest to detect gravitational waves – a quest that culminated in LIGO’s success earlier this year. I don’t know much about this video, but it seems to have been filmed at the University of Glasgow, which is part of the LIGO collaboration.


Flash Physics: Glitch crashed Mars lander, Microsoft hires quantum stars, cosmic speed test for light

European Mars lander doomed by computer glitch

The European Space Agency (ESA) has confirmed that the recent crash of a Mars probe was caused by a computer glitch that made the spacecraft assume it had already landed on the red planet. ESA’s Entry, Descent and Landing Demonstrator, known as Schiaparelli, was launched together with ESA’s Trace Gas Orbiter and arrived at Mars in October. It was supposed to test landing techniques that would be employed on the upcoming ExoMars rover. However, as the probe descended through Mars’s atmosphere, ESA scientists lost contact with Schiaparelli shortly after it jettisoned its parachute. Investigations have now shown that the parachute was jettisoned too early – at some 4 km above the surface of the planet – and that the probe fired its braking thrusters only briefly before shutting them down. The problem was due to a sensor failure that generated a negative altitude reading and made the probe think it was below ground level. A full report on the cause of the crash is expected early next year.
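Schematically, the failure mode is a sensor-fusion problem: a wildly wrong attitude estimate can turn a valid radar range into a negative altitude. The sketch below is purely illustrative – the function and numbers are invented for the example and bear no relation to ESA's actual flight software:

```python
import math

def estimated_altitude_m(radar_range_m: float, tilt_estimate_rad: float) -> float:
    """Project the radar's slant range onto the vertical using the current
    attitude estimate. Purely illustrative - invented for this sketch."""
    return radar_range_m * math.cos(tilt_estimate_rad)

# A saturated inertial reading yields a nonsense attitude estimate...
altitude = estimated_altitude_m(3700.0, math.radians(165))
print(altitude < 0)  # True: the fused altitude comes out negative
# ...so the guidance logic concludes the probe is already below ground level
# and runs its touchdown sequence: parachute released, thrusters cut early
```

The point of the sketch is that a single out-of-range sensor, fed unchecked into a geometric projection, is enough to drive the altitude estimate below zero.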

Was the speed of light faster in the early universe?

A way of testing whether the speed of light was faster in the very early universe than it is today has been put forth by João Magueijo of Imperial College London and Niayesh Afshordi at the Perimeter Institute for Theoretical Physics in Canada. Although a variable speed of light is at odds with Einstein’s special theory of relativity, it could solve the “horizon problem” of cosmology. The conventional theory of the early universe is that it underwent a rapid exponential expansion just 10⁻³⁶ s after the Big Bang. Known as inflation, this phenomenon explains several properties of the universe including the fact that it appears more or less the same in every direction. Dubbed the horizon problem, this homogeneity is unexpected because it would require energy to be transferred across the universe faster than the speed of light if the universe expanded gradually. Inflation solves the problem because an exponentially expanding universe would not have time to lose its initial homogeneity. However, if the speed of light was much faster in the early universe then inflation – which itself is not fully understood – could be dispensed with. Writing in Physical Review D, Magueijo and Afshordi explain how a varying speed of light would leave a specific signature in the tiny fluctuations in the cosmic microwave background (CMB) – radiation that was produced 380,000 years after the Big Bang and can be detected today. While their prediction falls within the current measurement uncertainty of the CMB by the Planck space observatory, Magueijo and Afshordi say that “improved observations will soon vindicate or disprove this model”.

Four quantum-computing stars join Microsoft

Photograph of Leo Kouwenhoven and Charles Marcus

Four leaders in the field of quantum computing are joining Microsoft to help the company develop a topological quantum computer. Leo Kouwenhoven of the Delft University of Technology in the Netherlands and Charles Marcus of the University of Copenhagen have already been hired by the US-based company, and both will build dedicated Microsoft quantum labs at their respective universities – while maintaining their academic research labs. Kouwenhoven and Marcus are both experimentalists who study solid-state systems that could be used to create hardware for quantum computers. Microsoft has also announced that it will soon be hiring Matthias Troyer of ETH Zurich, who is a theorist working on quantum algorithms, and David Reilly of the University of Sydney, who develops quantum devices based on nanostructures. The quartet will help Microsoft in its attempt to build a quantum computer based on topological quantum bits (qubits). Such qubits have inherent physical properties that should make them immune to environmental noise – which would otherwise degrade or even destroy quantum computations. Marcus and other experts explain the challenges of building a quantum computer in the podcast “Quantum computing: challenges, triumphs and applications”.

 

  • You can find all our daily Flash Physics posts in the website’s news section, as well as on Twitter and Facebook using #FlashPhysics. Tune in to physicsworld.com later today to read today’s extensive news story on a single-photon absorber based on Rydberg atoms.

Do solar neutrinos affect nuclear decay on Earth?

Further evidence that solar neutrinos affect radioactive decay rates on Earth has been put forth by a trio of physicists in the US. While previous research looked at annual fluctuations in decay rates, the new study presents evidence of oscillations that occur with frequencies around 11 and 12.5 cycles per year. The latter oscillation appears to match patterns in neutrino-detection data from the Super-Kamiokande observatory, in Japan. Other physicists, however, are not convinced by the claim.

The idea of fluctuating beta-decay rates is very controversial because, for more than 80 years, radioactive substances have been thought to follow a fixed exponential decay under all conditions. The theory of invariable decay constants was set out by Ernest Rutherford, James Chadwick and Charles Ellis in Radiations from Radioactive Substances, published in 1930.

In recent years, however, there have been suggestions that decay rates are not constant and are influenced by the Sun. In 2009, physicists from Purdue University in Indiana published a paper discussing unexplained annual fluctuations in long-term measurements of decay rates of silicon-32 and chlorine-36 at Brookhaven National Laboratory (BNL) in New York and radium-226 at the Physikalisch-Technische Bundesanstalt (PTB) in Germany.

Not so constant

The Purdue researchers noted that decay rates at both experiments appeared to be fastest early in the year when Earth is closest to the Sun. They suggested that the annual decay oscillations could be related to yearly variations in the Earth–Sun distance, with solar neutrinos somehow affecting decay rates.

This idea, however, was met with scepticism in the physics community. Part of the criticism was that environmental factors, such as ambient temperature, are known to affect decay-rate measurements and might explain the seasonal variations. Others pointed out that neutrinos interact with other particles only very rarely, and that there is no known mechanism by which they could influence decay rates.

In the latest research, Peter Sturrock, an applied physicist at Stanford University, Ephraim Fischbach at Purdue and Jeffrey Scargle, an astrophysicist at NASA’s Ames Research Center, performed power-spectrum and spectrogram analysis of the BNL silicon-32 and chlorine-36 data. The study revealed oscillations at frequencies of 11 and 12.5 cycles per year, as well as the previously reported annual oscillation. They also analysed five years of measurements from the Super-Kamiokande observatory and found similar oscillations in solar-neutrino flux.
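The power-spectrum analysis described above amounts to looking for excess power at particular frequencies in a long time series of measurements. As a rough illustration of the idea (using synthetic data and NumPy only, not the actual BNL measurements or the authors' analysis pipeline), the sketch below injects a weak oscillation at 12.5 cycles per year into ten years of simulated daily readings and recovers its frequency from the power spectrum:

```python
import numpy as np

# Synthetic daily "decay-rate" readings over 10 years: a constant level,
# a weak oscillation at 12.5 cycles per year, and measurement noise.
# Purely illustrative -- not the BNL silicon-32/chlorine-36 data.
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / 365.25)                 # time in years, daily sampling
data = 1.0 + 1e-3 * np.sin(2 * np.pi * 12.5 * t) \
           + 1e-4 * rng.standard_normal(t.size)

# Power spectrum; with d = 1/365.25 the frequency axis is in cycles per year.
power = np.abs(np.fft.rfft(data - data.mean())) ** 2
freqs = np.fft.rfftfreq(t.size, d=1 / 365.25)

peak = freqs[np.argmax(power)]
print(f"strongest oscillation: {peak:.1f} cycles/year")
```

With ten years of data the frequency resolution is about 0.1 cycles per year, which is why long measurement campaigns are needed to separate nearby frequencies such as 11 and 12.5 cycles per year.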

Similar oscillations

In the Super-Kamiokande data, which were collected between 1996 and 2001, they found oscillations at 12.5 and 9.5 cycles per year. The researchers say that the oscillation at 12.5 cycles per year could be related to the rotation of the Sun’s radiative zone, while the oscillation at 9.5 cycles per year may be related to the rotation of the solar core.

The oscillations in decay rates and neutrino flux at 12.5 cycles per year match each other. However, the oscillations at 9.5 (neutrino flux) and 11 (decay rate) cycles per year are more difficult to reconcile. The researchers suggest that the 11 cycles-per-year oscillation could originate in the region between the Sun’s core and its radiative zone.

Sturrock told Physics World that his team is the first “to show similar patterns in both decay data and neutrino data. I see evidence of internal solar rotation in both BNL and Super-Kamiokande data”. He adds: “Comparison of spectrograms formed from BNL data and from Super-Kamiokande data shows a remarkable similarity to each other and to what we know (from helioseismology) about the rotation rate of the solar radiative zone.”

Unknown mechanism

Sturrock says that the mechanism behind the effect of neutrinos on beta-decay rates is unknown. “I speculate that neutrinos interact with the W-boson that is believed to mediate beta decay,” he explains. “But I am hoping that some theoretical physicists will take up this problem.”

Others, however, remain unconvinced. Karsten Kossert, a physicist at PTB, says that his own research, with others, on decay rates has shown that there are “some fluctuations in some instrument readings”. “However, since different instruments and/or measurement techniques show different variations, we can exclude solar neutrinos as a common reason for these variations.” He adds: “In some cases, we have shown a clear correlation between environmental parameters – such as temperature, humidity, air pressure – and instrument readings.”

Kossert recently co-authored a study looking at data on decay rates from 14 laboratories around the world. The report concluded that “observed seasonal modulations can be ascribed to instrumental instability” and that “there are also no apparent modulations over periods of weeks or months”.

Not persuaded

The evidence that neutrinos affect beta-decay rates “is not persuasive”, according to Hamish Robertson, director of the Center for Experimental Nuclear Physics and Astrophysics at the University of Washington, in Seattle. He says: “Evidence that fits the hypothesis has been brought forward, while other evidence that does not fit (for example, long-term studies of the beta decay of tritium), is ignored.” He adds: “Fitting the fluctuations to one natural phenomenon after another will eventually lead you to reach a spectacular conclusion.”

Patrick Huber, of Virginia Tech in the US, echoes this, saying that “correlation is not causation”. “Even if we assume there is this variation [in decay rates], I do not find anything in the data indicating that neutrinos have anything to do with it.”

Huber adds that if the oscillations are real and “not due to some experimental artefact”, this requires “extraordinary new physics, and hence it will require extraordinary proof – which the present work is not”. “In particular, it makes no suggestion how to test the hypotheses put forward or where to go next to study this question.”

The study is described in Solar Physics.

The beauty of gravitational waves

Painting by Penelope Cowley depicting gravitational waves is being unveiled at Cardiff University's school of physics and astronomy on 25 November 2016

By Matin Durrani

A new painting by Welsh artist Penelope Cowley is the latest attempt to bring art and science together. Set to be unveiled on Friday 25 November at Cardiff University’s school of physics and astronomy, the 1.2 × 1.5 m picture was inspired by the recent detection of gravitational waves by the LIGO collaboration.

According to the university, the oil painting “combines a visualization of data taken from the equipment used to detect the first gravitational waves…with an imagination of some of the celestial bodies that are responsible for creating these waves, such as binary black holes and neutron stars”.


Copyright © 2026 by IOP Publishing Ltd and individual contributors