
New energy territories

 

ATLAS: Hanspeter Beck and Anna Lipniacka

• Hanspeter Beck, deputy leader, Bern ATLAS group

• Anna Lipniacka, deputy co-ordinator of ATLAS, Norway

With both ATLAS and CMS searching for the Higgs, how much rivalry is there?

AL: [laughs] Yes, there is a healthy competition between the two of us, a bit like an Olympic Games of CERN. No, but seriously, it is very necessary because we are the only people on Earth who have an accelerator like this. If we only had one detector it would have a complete monopoly on seeing the result and if we were wrong there would be no-one to check us.

What do we know so far about the Higgs?

HB: We know from LEP [the Large Electron Positron collider, the predecessor of the LHC] that its mass should lie between 115 and 160 GeV, and could even be as heavy as 200 GeV. If it does not exist in this mass range, then we know that it does not exist at all. In itself this would be an extremely interesting discovery because we would have to go over our textbooks and look for other mechanisms for how particles get their mass.

But do you have any idea where it might lie within this range, and will that affect the difficulty of the search?

HB: From previous findings at the Tevatron [at Fermilab in the US] and theoretical studies, it seems most probable that the Higgs mass is a tick above 115 GeV. And if it is there, it will be extremely difficult to find and could take up to six years to prove or disprove its existence.

If it is around 160 GeV then that is different. In 18 months from now we will be able to say if we have found it. But who knows where it is.

Is there anything else you hope to see in your experiment?

AL: My personal dream is that the LHC will become a dark-matter factory.

HB: This substance does not interact with light and is supposed to make up one-quarter of the mass in the universe.

AL: We can study dark matter on the galactic scale and make a map of dark matter, but we don’t know the exact local structures. We suspect that dark matter interacts weakly, and from this we can estimate the probability of its particles interacting with one another. There is a possibility that we are now reaching high enough energies to produce them.

The LHC has received a lot of attention over the past few years, in part because of the sheer scale of the project. Do you think this interest will be maintained as the scientific results emerge?

HB: Good research will always have an impact on society. It changes the way we look at ourselves. This goes back to Galileo when he said the Earth is not the centre of the universe.

AL: This is not only physics. We are going to a new land with new instruments. We have never ever seen matter in such extreme conditions and we are bound to learn new things.

CMS: Albert de Roeck

• Albert de Roeck, deputy spokesperson, Compact Muon Solenoid (CMS)

Do you have any sort of timetable for results?

By autumn it is going to get quite interesting here and, particularly by winter, when the accelerator is going to stop for one or two months, we will be frantically looking at the data. We will have our first pop at the Higgs at the end of the year. We will certainly exclude mass regions, but discovery is much more difficult.

But supersymmetry could well be there by the end of the year. “Sparticles” [supersymmetric particles that are partners of every normal particle] would be of the order of 500 GeV, and we are kinematically getting to this region. If the luminosity of the LHC beams lives up to its promise, we are going to overtake the Tevatron in the search for supersymmetric particles in this region – definitely by the end of 2011.

If you still can’t find the Higgs after running the LHC at full power (14 TeV) for two years, as planned, can you say that it doesn’t exist?

We can certainly say by then that the Standard Model Higgs does not exist.

It will make people uneasy because most people think there will be a sort of Standard Model type of Higgs. But there are more exotic models out there where the Higgs can be mimicked, or not seen, or have negative interferences at much smaller cross sections.

However, I should say that many of these non-Standard Models should show up within the capabilities of this machine. So people will be more inclined to think that the Higgs does not exist, and there is something else going on there.

The four LHC experiments seem to function very separately: why is this?

We have what we call a friendly competition.

For example, the experiments will not talk to each other if they start to see something. There is, though, a sort of gentleman’s agreement that when results are about to come out, the other experiments get a warning. That is not enough for them to start and finish their own analysis, but at least they are not taken completely by surprise when someone goes into a public meeting and announces a discovery.

It’s just like a politeness thing – for the benefit of science and a good healthy collaboration. For example, when we had the first paper out [Feb 2010], we gave a week’s notice.

ALICE: David Evans

• David Evans is a physicist at the University of Birmingham who leads the UK team on the ALICE experiment

How have things been going and what are your hopes for this year?

We are taking a lot of data and we are expecting to see a lot of new physics, especially when the LHC starts circulating lead ions towards the end of the year. Only by colliding lead ions will we recreate a quark–gluon plasma, the exotic primordial soup that would have existed up to ten millionths of a second after the Big Bang.

After the start of the physics programme went so well, we are hoping that 2010 will be the year that Christmas comes twice.

What kind of physics are you hoping to discover at ALICE?

Among the fascinating things that may emerge is a more thorough understanding of the strong interaction – the least well known of the four fundamental forces.

Up to 98% of particle mass is accounted for by this strong force, but we still don’t really understand why quarks get confined in groups. At ALICE we will get as close as you can to studying free quarks.

To make an analogy, we are in a similar position to the pioneers of electromagnetism in the 19th century. Now, in the 21st century, we are studying another fundamental force: the strong force.

Do you think it was a good decision to run the LHC at half power for 18 months?

I think it will be an advantage to raise the energy of collisions in steps. If we went straight to 14 TeV then, if we saw any exciting new phenomena, we would need to check those results at a lower energy anyway, to see when this new physics kicks in.

LHCb: Niko Neufeld

• Niko Neufeld, deputy leader of the online system project in the LHCb collaboration

LHCb is smaller than the other detectors. What is it looking for?

We have a specialized machine for the study of CP violation, a phenomenon responsible for the difference between matter and antimatter.

There is a fundamental asymmetry in a small part of the Standard Model. It is a violation in the decay properties that makes antimatter decay just that little bit faster and that is why all the antimatter has gone – there are no anti-Suns, no antimatter galaxies…

How long do you think it will take before you can start drawing any conclusions?

The way the machine is going right now, I think we can really start serious physics within a few months’ time – when the accelerator gives us a few hundred thousand “beauty hadrons” per second. Then towards the end of this year we should be in full swing and hopefully have the first draft papers for the winter conferences of next year.

How do you find working at CERN?

For me, the international aspect is one of the best things. Before, it was a sort of European club, which was already cool, but now, with the LHC, CERN has become a truly international laboratory. It is very enriching; you learn about different science cultures. It is very interesting to find that, even within the same field, physics can be taught in a range of ways.

Randomness is no lottery thanks to entangled ions

An international team of physicists has created the first system that can produce verifiably random numbers. The technique relies on the inherent uncertainties in quantum mechanics and future versions could help cryptographers to encode information more securely than ever before.

Randomness is central to modern cryptography, which uses long strings of random numbers to form “keys” that can encode and decode sensitive information. Normally such strings are churned out by complex mathematical algorithms, called pseudorandom-number generators. But these only approximate random strings, and there is the constant worry that hackers could somehow predict the sequences and gain access to secret files.

Worse still, cryptographers can never be sure that a pseudorandom-number generator is unique and genuine. There is no way to prove a string is truly random, and even if it is, there is no way to know if a copy exists elsewhere. “If somebody gives you a bunch of numbers and claims they are random you should be suspicious,” says Christopher Monroe, a physicist at the University of Maryland, US, and leader of the experimental group.

Impossible to copy

Monroe’s group, which has worked alongside physicists at the Université Libre de Bruxelles in Belgium, the Institute of Photonic Sciences in Spain and Cambridge University in the UK, has used quantum mechanics to produce random strings that are impossible to copy. In the quantum world, an object exists in a mixed-up superposition of states until it is measured, at which point it collapses randomly into one of them. In principle, therefore, by observing a sequence of superposed objects, one can generate a sequence of random results.

To do this experimentally, Monroe’s team employed a method known as a Bell test, named after the late physicist John Bell, who devised it in 1964. They placed two atomic ions in separate enclosures one metre apart, and then “entangled” them using single photons emitted by the ions. Once entangled, the state of one atom is inextricably linked to the superposed state of the other, so that a measurement of one – in this case, a measurement made by recording the emission of light – causes the states of both atoms to collapse.

Over the course of a month, the researchers measured the states of more than 3000 entangled pairs of atomic ions, generating a string of 42 binary digits. Because the correlations between the measured states exceeded the limit that Bell’s famous “inequality” places on any classical description, they were – according to quantum mechanics – certifiably random.
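The certification rests on a simple statistic. As a hedged illustration (this is not the team’s analysis code, and the arrays below are random placeholders rather than real ion data), the Python sketch estimates the CHSH combination of correlations from binary measurement records taken with two settings per ion; any classical description keeps this quantity at or below 2, so a value above 2 certifies genuine randomness.

import numpy as np

# Placeholder measurement records: settings a, b (0 or 1) chosen for each ion
# on every trial, and outcomes x, y recorded as +1 or -1. Real data would come
# from the entangled-ion experiment; here they are random stand-ins.
rng = np.random.default_rng(0)
n_trials = 3000
a = rng.integers(0, 2, n_trials)
b = rng.integers(0, 2, n_trials)
x = rng.choice([-1, 1], n_trials)
y = rng.choice([-1, 1], n_trials)

def correlator(sa, sb):
    """Average of x*y over the trials taken with settings (sa, sb)."""
    mask = (a == sa) & (b == sb)
    return np.mean(x[mask] * y[mask])

# CHSH combination of the four correlators.
S = abs(correlator(0, 0) + correlator(0, 1) + correlator(1, 0) - correlator(1, 1))

# Classical (local) models obey S <= 2; quantum mechanics allows up to 2*sqrt(2).
print(f"S = {S:.3f} (classical bound 2, quantum maximum {2**1.5:.3f})")

With the random placeholder data S hovers near zero; it is real entangled-ion data that pushes it above the classical bound.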

Perfect detection


Bell tests have been performed many times before, however. In particular, they have been performed to show that entanglement can affect the state of objects instantly, even if they are far enough apart that no signals could travel between them without breaking the speed of light – and Einstein’s theory of special relativity. What makes Monroe’s experiment different is that every entanglement event is recorded. In past attempts, limitations in detector efficiency have allowed many state collapses to pass by unnoticed. “Our system of trapped atomic ions separated in space with perfect detection is the only system that can be used for this purpose,” says Monroe.

The researchers are now looking to improve the speed of random-number generation by increasing the efficiency of entanglement, perhaps embedding the system in a solid-state chip.

The research is published in Nature 464 1021.

Panel rules out malpractice by climate scientists

By Hamish Johnston

“We saw no evidence of any deliberate scientific malpractice in any of the work of the Climatic Research Unit and had it been there we believe that it is likely that we would have detected it.”

That is the main conclusion of an independent panel of scientists nominated by the UK’s Royal Society to scrutinize the scientific methodology of researchers at the University of East Anglia’s Climatic Research Unit (CRU).

The seven-member panel was set up by the university and chaired by Ron Oxburgh – a geologist, former oil-company executive and member of the UK’s upper house of parliament. It released its findings today.

The panel looked at 11 “representative publications” produced by CRU members over the past 24 years.

While the report is good news for CRU scientists, some climate-change sceptics have accused the panel of being biased because Oxburgh is chairman of the wind energy company Falck Renewables and president of the Carbon Capture and Storage Association. Oxburgh has insisted that the panel had no pre-conceived views on the CRU science.

This is the second report published after private e-mails of CRU members were hacked last year and made public. Critics of the CRU have alleged that the e-mails show that the scientists incorrectly interpreted data to support manmade climate change and also flouted freedom-of-information requests to make data and computer code available to their critics.

The first report – which was released on 31 March by the House of Commons Science and Technology Committee – concluded that the University of East Anglia was mostly to blame for supporting a culture of non-disclosure.

Sun blamed for Europe’s colder winters

When the Sun’s magnetic output is low, winters in Europe tend to be cooler than average – whereas higher output corresponds to warmer winters. That is the conclusion of a new study by physicists in the UK and Germany that looked at the relationship between winter temperatures in England and the strength of the Sun’s magnetic emissions over the last 350 years. The group predicts that, global warming notwithstanding, Europe is likely to continue to experience cold winters for many years to come.

The possibility of a link between European winter temperatures and solar activity can be seen in historical records from the second half of the seventeenth century. For about 50 years the Sun remained free of sunspots (in contrast to its normal 11-year cycle of sunspot highs and lows) and at this time Europe experienced a number of harsh winters. Motivated by the fact that the relatively cold winters of the past few years have come at a time when solar activity fell to the lowest values for 100 years, Mike Lockwood of the University of Reading and colleagues set out to establish whether or not there is a strong connection.

Lockwood and colleagues used data from the Central England Temperature record. This provides monthly temperature data from several monitoring stations in central England all the way back to 1659 – the world’s longest instrumental temperature record. The researchers first removed the estimated contribution from the warming recorded in the northern hemisphere as a whole over the past century – which is widely believed to have been caused by increasing levels of manmade carbon dioxide in the atmosphere. Hemispheric temperature records date back to 1850; to extend the analysis back to 1659 they used data from a number of different proxy sources, such as tree rings, isotope concentrations in stalagmites, sediment depths, lake heights and documentary evidence.

Sunspot counting in the 1600s

To establish how solar activity varied over the same period, Lockwood’s group calculated changes to the total magnetic flux carried away from the Sun by the solar wind. They could do this back to 1868 based on measurements of fluctuations in the Earth’s magnetic field (caused by the solar changes). To extend these data back to 1659 they used a model that links solar magnetic flux levels to sunspot numbers, observations of which extend back to 1600. They did not use sunspot numbers directly because these exhibit very little variation from one minimum to the next and therefore cannot be used to establish a meaningful long-term trend.

Comparing the changes in English temperatures (which the researchers say are representative of European temperatures as a whole) with fluctuations in solar activity, the researchers found a strong correlation. Indeed, they say, winter temperatures are on average about a half degree centigrade lower when solar activity is low. Further analysis of the data allowed the team to conclude that the probability of the connection being a statistical fluke was less than 5%.
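To illustrate the style of analysis (this is not the group’s code, and every series and number below is a placeholder), the Python sketch first subtracts an assumed hemispheric-warming contribution from a winter-temperature series and then correlates the residuals with a solar-activity series, reporting the chance of the correlation being a fluke.

import numpy as np
from scipy import stats

# Placeholder annual series: winter temperature, a slow hemispheric warming
# contribution and a solar-activity index (all invented for illustration).
rng = np.random.default_rng(1)
years = np.arange(1659, 2010)
hemispheric_trend = 0.002 * (years - years[0])
solar_activity = np.sin(2 * np.pi * (years - 1659) / 11) + rng.normal(0, 0.3, years.size)
winter_temp = 4.0 + hemispheric_trend + 0.25 * solar_activity + rng.normal(0, 0.8, years.size)

# Step 1: remove the estimated hemispheric warming before looking for a solar signal.
residual_temp = winter_temp - hemispheric_trend

# Step 2: correlate the residual winter temperatures with solar activity.
r, p_value = stats.pearsonr(residual_temp, solar_activity)
print(f"correlation r = {r:.2f}, probability of a statistical fluke ~ {p_value:.3g}")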

But what causes these changes in the Sun to modify winter temperatures, and why should this effect be limited to Europe, rather than apply to the Earth as a whole? The answer, believes Lockwood, lies in changes to the behaviour of a current of air known as a jet stream that travels west to east across the Atlantic. The jet stream can get caught up in itself and remain blocked over the ocean, preventing mild maritime winds from reaching Europe and allowing icy arctic winds to take their place. Changes in solar magnetic activity would affect the amount of ultraviolet radiation emitted by the Sun, which could then affect temperatures and wind patterns in the stratosphere, effects which, as shown by other recent research, can feed down to the troposphere – the lowest portion of the atmosphere.

According to Lockwood, lower solar activity does not guarantee a cold winter. He points out that England’s coldest winter on record was 1684 but that the following year was the third warmest on record, even though solar activity remained very low. Conversely, he adds, 1947 was a cold winter even though solar activity was high. However, he says, the results show that there are more cold winters when solar activity is low and more warm ones when it is high.

Extrapolating forward, Lockwood predicts that European winters in the coming years are likely to be colder than they have been in recent decades. He has calculated, based on evidence of past solar activity contained within cosmogenic isotope data from tree rings and ice cores, that there is an 8% chance that we will see another 50-year solar low starting within the next 50 years and that this would lower the average winter temperature in central England by half a degree.

No insight into global climate change

However, Lockwood is keen to emphasize that this research can tell us nothing about global climate change. He and his colleagues also analysed temperature data from central England in their raw state, rather than corrected for the underlying hemispheric warming trend, and found the relationship with solar activity breaks down after about 1900, when other studies show that central England temperatures began to respond to global climate change. “There is a tendency to see a local or regional effect as evidence for or against global warming,” says Lockwood. “But our work shows how one can have a regional and seasonal variation that shows solar influence but which is different from the trends in global average data.”

Michael Mann of Penn State University in the US says the research “appears to be a very solid analysis”, which “provides further support” for the idea that the Sun was behind Europe’s cold winters 300 years ago. He adds that he and other researchers have shown that fluctuations in solar activity can also explain the relatively warm winters that occurred in Europe about 1000 years ago.

The research is reported in Environmental Research Letters.

BEC coupled to mechanical oscillator

Physicists in Germany and France have coupled a Bose–Einstein condensate (BEC) of ultracold atoms to the vibrations of a mechanical oscillator for the first time. Although the oscillator is relatively large and obeys the laws of classical physics, the team believes that the technique could be applied to quantum oscillators. If so, it could lead to the development of BEC-oscillator quantum-computing devices or to new techniques for measuring very small magnetic or other forces.

Bose–Einstein condensates are formed when identical atoms with integer spin are cooled until all the atoms are in the same quantum state. In the new work, carried out by Philipp Treutlein and colleagues at the Ludwig-Maximilians University in Munich and the Kastler Brossel Laboratory in Paris, a tiny magnetic microtrap known as an “atom chip” was used to cool about 2000 rubidium-87 atoms to below 1 µK. Roughly 3 cm across, the chip uses metal wires just 50 µm wide to create magnetic fields that trap the atoms.

Treutlein’s team began by creating a BEC with a radius of about 300 nm that was then guided to within 1 µm of the surface of the mechanical oscillator. The oscillator, which is integrated within the chip, is an off-the-shelf cantilever for an atomic-force microscope. It is about 200 µm long and 450 nm thick, has a natural frequency of vibration of 10 kHz and is driven using a piezoelectric device.

Sloshing about

Atoms in the BEC interact with the oscillator via the Casimir–Polder force – an attractive force between an atom and a surface that arises from quantum fluctuations in the vacuum. The BEC has a spectrum of vibrational modes, the simplest of which involves the atoms moving back and forth in the trap like water sloshing in a tilted bowl. The frequencies of the modes can be changed by adjusting the size of the trap.

When the frequency of one of the BEC modes matches the frequency of the oscillator, the atoms gain enough energy to escape the trap. Treutlein and colleagues detect these resonances by keeping a BEC next to the oscillator for several milliseconds and then moving it away to count the number of atoms that remain. If the frequencies do not match, hundreds of atoms remain, whereas tens of atoms are left at resonance.

Towards quantum-coherent coupling

By repeating this process for BECs trapped under a number of different conditions the team was able to identify several different vibrational modes. Because these resonances are very sensitive to external fields, such a device could be used as a tool for measuring magnetic and other fields.

A remarkable aspect of the experiment is that the cantilever is at room temperature, whereas the BEC remains near absolute zero. This is possible, according to team member David Hunger, because thermal fluctuations in the cantilever (other than the fundamental vibrational mode) do not resonate with the BEC.

A downside of this temperature difference, however, is that the oscillator is a classical – rather than a quantum – entity and therefore cannot be coupled quantum-coherently with the BEC. Such coupling could allow quantum information to be transferred to the BEC, where it could be stored for a relatively long time because the BEC is very well isolated from the surrounding environment. The oscillator could also be coupled coherently to light or electrons, thereby offering a way of reading and writing the quantum information.

Hunger told physicsworld.com that one way of achieving quantum coupling is to use a very small oscillator, such as a carbon nanotube, that is chilled to ultracold temperatures. The team is also exploring the use of extremely thin silicon nitride membranes that could interact with a BEC via quantum back-action.

Klemens Hammerer of the University of Innsbruck believes that the work of Hunger, Treutlein and colleagues represents an important step forward in the use of ultracold atoms to study mechanical systems. “This experiment clearly demonstrates the possibility to use ultracold atomic systems as diagnostic tools and high-precision sensors; this is a very promising route of research,” he said.

The work is described in Phys. Rev. Lett. 104 143002.

Combing makes for neat qubits

Physicists in the US have used an optical “frequency comb” to reliably entangle a pair of atomic qubits. The breakthrough bodes well for practicable quantum computing because it allows for simpler manipulation of quantum states than in previous systems.

Quantum computing exploits the innate ambiguities of quantum physics to process certain calculations, such as searching or factorizing, much faster than any of today’s computers. Whereas conventional bits of information can take only the values 0 or 1, a quantum computer’s “qubits” exist in a mixed-up superposition of both. This uncertainty allows any number of qubits, N, to be lumped together – or “entangled”, in quantum speak – to represent a huge 2^N values, and then processed in parallel. Or, to put it another way, a quantum computer with just 10 entangled qubits could perform 1024 calculations at once.

Entangling isn’t easy, however. Achieving it with atomic-ion qubits, for example, requires two in-phase laser beams that have a frequency separation exactly matching that of the ions’ spin states. In the past, physicists have made such beams from a single modulated laser, or from two lasers locked to a common source, but in either case the lasers must be very powerful to control the spin states with a reasonable speed. And because the spin transitions often lie in the ultraviolet, the lasing frequencies have to be up-shifted with optical systems that are often inefficient.

Nobel inspiration

Now, Chris Monroe and colleagues from the University of Maryland have shown that entanglement can be made more straightforward by using an optical frequency comb. These devices use a single pulsed laser to produce a train of pulses whose spectrum is a series of lines equally spaced in frequency, like the teeth of a comb – an invention that won Theodor W Hänsch and John L Hall a share of the 2005 Nobel Prize for Physics.

Because all the pulses come from the same laser cavity, they are automatically in phase, and frequencies can be altered simply by altering the length of the cavity. This can also be achieved by adding devices known as acousto-optic modulators. Monroe’s group used the beams from a frequency comb to control and entangle two qubits made of ytterbium ions.

“What this paper demonstrates is using the frequency comb itself as thousands of pairs of lasers,” says David Hanneke, a quantum-optics researcher at the National Institute of Standards and Technology in Boulder, Colorado. “This fast laser-pulse method could prove useful in many systems with large qubit splittings, and the high power available in pulsed lasers could give an advantage even in those systems that currently use conventional frequency modulators.”

This research is published in Physical Review Letters.

Black hole twins spew gravitational waves

Astronomers could be on the cusp of detecting gravitational waves after four decades of trying, according to a team of Polish astrophysicists. They say that if current gravitational-wave detectors are upgraded to search for binary black-hole systems, gravitational waves would be expected “within the first year of operation”. If correct, it would open up a new window to the cosmos, allowing astronomers to see the universe with fresh eyes.

Unlike waves of light which travel through space, gravitational waves are ripples in the fabric of space–time itself. Sources of these waves, which were predicted by Einstein’s theory of general relativity, include binary systems of compact objects such as neutron stars and black holes. As one of the duo inspirals toward the other, gravitational waves propagate out into space.

Searches for gravitational waves, such as the Laser Interferometer Gravitational-Wave Observatory (LIGO), have concentrated on binary systems of two neutron stars because they were thought to be more numerous, despite being weaker sources than rarer double black hole systems.

Wrong decision

However, a team of researchers led by Chris Belczynski of the Los Alamos National Laboratory report that these projects have taken the wrong option, saying that double black-hole systems may be far more common than previously thought. The reason is related to stars’ metallicity, which is the fraction of their material made up of elements heavier than helium. The lower the metallicity, the less mass is lost at the end of a star’s life, and therefore the black holes that form are more likely to survive to become a black-hole binary.

Until now, models have assumed that most stars had a similar metallicity to the Sun. But by analysing data from the Sloan Digital Sky Survey, Belczynski and his team found that this is only true for 50% of stars, while the rest have a significantly lower metallicity, at 20% of the Sun’s.

The finding is particularly significant given the sensitivity of black hole binary formation to changes in metallicity. “If you reduce metallicity by a factor of ten then you increase the number of black hole binaries by a hundred or a few hundred times,” says Tomasz Bulik, one of the researchers at the Nicolaus Copernicus Astronomical Center in Warsaw.

Imminent upgrades

The current generation of experiments that are searching for gravitational waves, such as LIGO and fellow detector VIRGO, fall short of the sensitivity that Belczynski’s team predicts is required. However, ten-fold upgrades for both are imminent. “The upgrades mean that we are looking at reaching sensitivities at which this paper suggests we are guaranteed to see something,” explained Stuart Reid, a gravitational wave researcher at the University of Glasgow, who is not involved in this research.

Intermediate upgrades to VIRGO could be online as early as this autumn, bringing the instrument to the edge of the range of sensitivities predicted by Belczynski. Both detectors are due to be fully upgraded by 2015. Should they find gravitational waves it would open up new possibilities for probing the cosmos, allowing astronomers to become stellar cartographers.

“As a neutron star inspirals into a black hole, gravitational waves are emitted, mapping out the space–time curvature formed by the black hole. Measuring those waves tells us how the black hole affects objects around it,” says Reid. Gravitational wave astronomy also has advantages over electromagnetic radiation. “It’s difficult to account for how light is affected as it travels towards you. Gravitational waves’ interaction with matter is very weak, so you don’t get the same distortion,” he added.

The research is published on arXiv.

New element 117 discovered

Scientists in Russia and the US have observed the fleeting existence of a new element with 117 protons, produced by firing calcium ions at a radioactive target. The discovery fills in a notable gap in the periodic table and bolsters the idea that neutron-rich superheavy nuclei could be extremely stable, perhaps having lifetimes of many millions of years.

Before 1930 the periodic table was filled entirely with naturally occurring elements, the heaviest of which was uranium with 92 protons. Since then, nuclear-physics experiments have yielded a further 27 elements. In the early 1990s scientists at the GSI laboratory in Darmstadt, Germany, synthesized elements 107 through to 112, and over the last decade experiments at the Joint Institute for Nuclear Research in Dubna, Russia, have added elements 113 to 116 and 118.

The Dubna discoveries were made by firing beams of the rare isotope calcium-48 at targets made of heavy elements. Having a high ratio of neutrons to protons (28 versus 20), calcium-48 produces more neutron-rich elements than other kinds of projectile, which is significant because the shell model of the nucleus predicts that superheavy elements will become more stable as their neutron number goes up – reaching a peak, or “island of stability”, at a neutron number of 184.

Preparing the target

Until now, however, element 117 remained undiscovered. That is because the target material needed to produce it – berkelium-249 – is itself extremely hard to generate. But scientists at Dubna and the Lawrence Livermore National Laboratory in California, under the leadership of Dubna’s Yuri Oganessian, have now produced 22 mg of the substance after completing a two-year experimental programme that involved intense neutron irradiation and processing. After preparing the target the researchers then bombarded it with calcium-48 for 150 days using the heavy-ion cyclotron at Dubna.

Oganessian and colleagues observed the tell-tale signs of the decay chains of two isotopes of element 117 – one having 176 neutrons and the other 177. Both chains involved a series of alpha decays in which the existing nuclide at each step was transformed into a new nuclide after losing two protons and two neutrons, with each chain terminating in nuclear fission. The former chain involved three alpha decays and occurred five times, whereas the latter consisted of six alpha decays and took place just once.
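The bookkeeping of such a chain is straightforward, since each alpha decay removes two protons and two neutrons. The short Python sketch below (purely illustrative) traces the proton and neutron numbers along the six-step chain that starts from the 177-neutron isotope of element 117, passing through the elements 115, 113, 111, 109 and 107 mentioned below.

# Each alpha decay removes 2 protons and 2 neutrons.
# Start from the element-117 isotope with 177 neutrons (mass number 294).
Z, N = 117, 177
print(f"start: Z={Z}, N={N}, A={Z + N}")
for step in range(6):  # the longer of the two observed chains had six alpha decays
    Z -= 2
    N -= 2
    print(f"after alpha decay {step + 1}: Z={Z}, N={N}, A={Z + N}")
# In the experiment each chain ended in nuclear fission rather than a further alpha decay.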

Intriguingly, the new element-117 isotope containing 177 neutrons, the most neutron-rich isotope yet produced, has a half-life (78 ms) that is 87 times longer than that of the element-118 isotope containing one fewer neutron. Also, the new isotopes of elements 115, 113, 111, 109 and 107 observed at Dubna each have one or two more neutrons than the isotopes of these elements detected previously, and their half-lives are 2.5–42 times as long. “These longer half-lives give stronger evidence that there is a special stability to nuclei as you approach N = 184,” says team member Joseph Hamilton of Vanderbilt University in Tennessee.

Towards the island

Actually producing a nucleus containing 184 neutrons, however, is likely to take some time, according to Hamilton, given the intensely radioactive and neutron-rich ion beams that will probably be needed. He estimates that nuclei containing up to 181 neutrons could be produced in experiments “over the next few years”. But he points out that extremely stable superheavy isotopes could perhaps be found in nature, maybe in remote sites such as the ocean floor. “There are some predictions that they could last for very long times, even approaching the age of the Earth,” he adds.

Walter Loveland, a chemist at Oregon State University, says that Oganessian’s group has produced a “convincing demonstration” of the first observation of element 117, which is dubbed “ununseptium”, though it is yet to be officially named by the International Union of Pure and Applied Chemistry. Loveland also believes that the work is a strong vindication of modern theories of heavy element synthesis.

The research will appear in Physical Review Letters.

Strange quark weighs in

A collaboration of particle physicists in Europe and North America have calculated the mass of the strange quark to an accuracy of better than 2%, beating previous results by a factor of 10. The result will help experimentalists to scrutinize the Standard Model of particle physics at accelerators such as CERN’s Large Hadron Collider and Fermilab’s Tevatron.

Quarks are elementary particles possessing familiar properties such as mass and charge, but they never exist as free particles. Instead they are bound together by the strong force into states called hadrons, which include the proton and the neutron. Theorists predict that a large portion of hadron mass is generated by the strong force, mediated by particles known as gluons, but the exact nature of these interactions is still poorly understood.

Quark colour

To determine the mass of individual quarks, therefore, theorists have to combine experimental measurements of hadrons with calculations based on quantum chromodynamics (QCD) – the theory of the strong force. Refinements to this theory over the years have enabled physicists to calculate the masses of the three heavier quarks – the top, bottom and charm – to an accuracy of 1%. Unfortunately, however, it has been much harder to make accurate predictions for the masses of the three lighter quarks – the up, down and strange – and reference tables still contain errors of up to 30%.

Christine Davies at the University of Glasgow and her colleagues in the High Precision QCD collaboration have now finally produced an accurate figure for the mass of the strange quark by taking a different, mathematical approach. They used a technique known as “lattice QCD”, in which quarks are defined at the sites of a lattice and their interactions via gluons are represented on the connecting links.

Lattice QCD, which requires the use of powerful supercomputers, enabled the researchers to determine the ratio of the charm-quark mass to the strange-quark mass to an accuracy of 1%. Since the mass of the charm quark is already well known, Davies calculates that the strange quark has a mass of 92.4 ± 2.5 MeV/c².
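A hedged sketch of the arithmetic: if the strange-quark mass is obtained by dividing a precisely known charm-quark mass by the measured charm-to-strange mass ratio, the relative uncertainties combine roughly in quadrature. The numbers below are illustrative placeholders, not the collaboration’s actual inputs.

import math

# Illustrative inputs (placeholders): a charm-quark mass known to ~1% and a
# lattice-QCD charm/strange mass ratio measured to ~1%.
m_charm, dm_charm = 1100.0, 11.0   # MeV/c^2
ratio, dratio = 11.9, 0.12         # m_c / m_s

m_strange = m_charm / ratio        # central value of the strange-quark mass

# For a quotient, relative uncertainties add in quadrature.
rel_err = math.sqrt((dm_charm / m_charm) ** 2 + (dratio / ratio) ** 2)
dm_strange = m_strange * rel_err

print(f"m_s ~ {m_strange:.1f} +/- {dm_strange:.1f} MeV/c^2 ({100 * rel_err:.1f}%)")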

Precision programme

This result is part of a programme of precision calculations in lattice QCD that will help experimentalists at accelerators like the LHC to make sense of the collisions they observe. The results are of particular interest to researchers at the LHCb experiment who, by studying mesons containing bottom quarks, are trying to understand whether current physics can describe how our universe developed.

Indeed, many particle physicists believe that once the LHC is ramped up to 14 TeV it will be in a position to either confirm or destroy the Standard Model of particle physics. “This is all part of pinning down the Standard Model and asking how nature can tell the difference between matter and antimatter,” says Davies. In the short term, the High Precision QCD team intends to extend its research by applying the same method to the bottom quark, to obtain accurate results for its mass and for the decay rates of its hadrons, which are needed by LHCb.

David Evans, a researcher at the University of Birmingham and a member of the ALICE experiment at CERN, says that it is important to know quark masses for the pursuit of new physics. “If you want to predict new particles in higher energy states, it is very important to know the mass of its constituent parts,” he says. “As far as I know, this is the only group to pin down the mass of light quarks to such high accuracy”.

This research is published in Physical Review Letters.

Argonne lab tackles exotic nuclei

Nuclear physicists at the Argonne National Laboratory in the US have obtained the first results from a new spectrometer that contains the magnet from a mothballed MRI machine. They used the Helical Orbit Spectrometer (HELIOS) to make the most precise measurements to date of two excited states of boron-13 – an “exotic” nucleus containing an unusually high ratio of neutrons to protons. The researchers say that HELIOS could eventually yield precise data on the structure of a range of rare nuclei.

Carried out at Argonne’s ATLAS facility, the experiment involves slamming a beam of stable boron-11 nuclei, containing five protons and six neutrons, into a gas cell filled with much lighter deuterium nuclei, which have just one proton and one neutron. This method of rare-isotope production – known in the trade as “inverse kinematics” – leads to neutrons being “stripped” from the deuterium and tacked onto the nuclei in the beam. The result is a “secondary” beam of short-lived boron-12 nuclei containing seven neutrons.

A subsequent stripping reaction takes place inside the magnetic field of HELIOS, built from a converted 2 m long superconducting solenoid donated to Argonne by the Max Planck Institute in Germany. The boron-12 beam is then passed into the magnet’s hollow centre, where it hits a target of thin deuterium-rich foil, from which another neutron is stripped to produce the desired boron-13 nuclei. By studying the protons left behind from the stripped deuterium nuclei, the researchers were able to gain insights into the boron-13 nuclei.

The advantage of HELIOS is in how it handles these protons, which are caught by the solenoid’s strong 3 Tesla magnetic field before being bent and focused back into a bank of silicon detectors along the beam axis. In similar experiments that rely on spectrometers with no magnetic fields, the protons cannon off the target at various angles, requiring expensive umbrella arrays of detectors to capture all information about a given reaction. And even with such arrays in place, the rapid variation in proton energies in such experiments makes it difficult to record precise measurements, particularly for those protons downstream from the target.
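The principle can be illustrated with a back-of-the-envelope calculation: in a uniform solenoidal field a proton emitted from a point on the beam axis spirals around and returns to the axis after exactly one cyclotron period, whatever its emission angle, so a single line of detectors along the axis can catch it. The Python sketch below, using illustrative values (the 3 T field is the only figure taken from the article), computes that period and the return distance for one such proton.

import math

q = 1.602e-19   # proton charge, C
m = 1.673e-27   # proton mass, kg
B = 3.0         # solenoid field, T (as quoted for HELIOS)

# A charged particle launched from the axis returns to it after one cyclotron
# period, independent of its emission angle.
T_cyc = 2 * math.pi * m / (q * B)

# Illustrative proton energy and emission angle (placeholders, not data).
E_kin = 5e6 * q                   # 5 MeV kinetic energy, in joules
v = math.sqrt(2 * E_kin / m)      # non-relativistic speed
v_parallel = v * math.cos(math.radians(30))

z_return = v_parallel * T_cyc     # distance along the axis to the return point

print(f"cyclotron period ~ {T_cyc * 1e9:.1f} ns")
print(f"returns to the axis ~ {z_return * 100:.0f} cm from the target")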

Sniffing out rare nuclei

Using HELIOS, the researchers were able to discern the spin quantum numbers of two such states less than 200 keV apart, a hair’s breadth on the nuclear scale. The data are key to determining the gap between the neutron shells that models suggest provide the essential structural scaffolding of all nuclei. However, much more significant than this first finding is the possibility of using HELIOS to investigate heavier neutron-rich isotopes such as tin-132 (50 protons, 82 neutrons).

“It wasn’t this measurement that the device was built for,” says Argonne physicist John Schiffer, who first proposed the idea of a HELIOS-type spectrometer about a decade ago.

HELIOS is able to resolve protons with energies that differ by a mere 50–100 keV, whereas existing spectrometers have resolutions of 150–250 keV. Birger Back, who is a member of the team that includes researchers from Manchester, Michigan State and Western Michigan universities, says that this three- or four-fold boost in resolution is an amount that “in many cases can either make or break an experiment”.

A related HELIOS benefit, say Back and Schiffer, is its high “efficiency”, which means that it can sniff out and make measurements for a higher percentage of rare nuclei produced during the experiment. This is important, says Wilton Catford, a nuclear physicist at Surrey University in the UK, “because the most interesting radioactive beams will always tend to be the rarest ones, with quite low intensities”.

Although Catford is not part of the HELIOS collaboration, he helped to develop two detectors – TIARA, currently based at the GANIL lab in France, and SHARC, at the TRIUMF facility in Canada – used in similar experiments with exotic beams. He says that, along with TIARA and SHARC, “the HELIOS detector provides one promising way to push the efficiency limits”.

The work is described in Physical Review Letters.
