
Former Lawrence Livermore physicist begins jail term

A former Lawrence Livermore National Laboratory physicist has been sentenced by a court in California to 18 months in prison for submitting false data and reports with the “purpose of defrauding a government agency”. Sean Darin Kinion, 44, who earned his physics PhD from the University of California, Davis, pleaded guilty in June to mail fraud and acknowledged that he was involved in “a scheme to defraud the government out of money intended to fund research”. In addition to his prison term, which will begin on 26 January, Kinion will have to pay $3,317,893 in compensation to the US government. He will also undergo supervision for three years following his release from prison.

According to Lawrence Livermore spokesperson Lynda Seaver, the lab began to see “red flags” in Kinion’s work in late summer 2012 and, after initially putting him on leave, lab administrators sacked the physicist in February 2013. That action was relatively simple because Kinion was not a civil servant – he was a contract employee who could be “dismissed for cause”. The lab then referred the case to the Department of Energy (DOE), which oversees national laboratories in the US. The DOE’s inspector general forwarded the case on to the Department of Justice, which began the process that led to Kinion’s prosecution.

Non-existent experiments

Kinion’s case focused on funding that he received between 2008 and 2012 from the Intelligence Advanced Research Projects Activity (IARPA), which is part of the Office of the Director of National Intelligence. According to the local US attorney’s office, the funding was to let him “design, build, and test experimental components in the field of quantum computing”. Prosecutors put particular focus on an experimental design that involved the deposition of ion-trap electrodes on polished sapphire wafers, which were covered by a layer of niobium that was wet-etched with hydrofluoric acid. The prosecutors noted that Kinion had received a grant of $539,000 for the necessary equipment. “[He] claimed he had used the equipment successfully to build and test [the] experimental components, and submitted reports and information in support of these claims,” the prosecution stated. “Kinion, however, never set up nor operated the equipment.”

The prosecutors also noted that Kinion mailed “bogus” non-functioning components to IARPA’s validation team and also “altered and backdated Federal Express mailing labels and falsely claimed that he mailed items on dates prior to the date he actually mailed them”. According to the prosecutors, he also conducted a three-day “charade” experiment for a scientist visiting Lawrence Livermore to establish the legitimacy of the research he claimed to have carried out.

False and fraudulent data

In court documents, prosecutors noted that Kinion set out to win prestige and advance his career rather than to enrich himself. Yet he “presented to the government false and fraudulent data and information in a scheme to defraud IARPA into thinking he had performed the work…[and] took deliberate additional steps to conceal and prevent IARPA from discovering his fraudulent scheme”.

According to Kinion’s lawyer James Phillip Vaughns, “Kinion does not, and did not, admit to any embellishment of his theoretical work.” While no papers were published based on his research, prosecutors charged that the fraud wasted the time and effort of scientists who tried to test and duplicate Kinion’s results, as well as taking IARPA funds that other researchers could have received. Prison time for scientific fraud is extremely rare in the US. But noting that the prosecution had requested 51 months in prison, Vaughns told Retraction Watch that the 18 months Kinion received “could be viewed as a favourable outcome”.

Flash Physics: Plasma makes an ultraviolet twist, p-wave superconductivity in graphene, Sun does not affect radioactivity

Plasma shortens the wavelength of twisted light

A new way of creating twisted light at extreme-ultraviolet (EUV) wavelengths has been developed by Fabien Quéré and colleagues at the University of Paris-Saclay in France. Twisted light carries orbital angular momentum and has a range of potential applications, from boosting the capacity of optical-telecommunications networks to high-resolution microscopy. The new technique involves first creating a powerful pulse of twisted infrared light by passing a 25 fs, 100 TW laser pulse through a 1 mm-thick silica plate with a spiral pattern on it. This twisted light is then fired at a plasma that has been created by heating a silica target with a second infrared pulse. The plasma acts as a mirror that reflects some of the pulse as twisted light at much shorter EUV wavelengths. Physicists are currently working on several different schemes for making EUV twisted light. Writing in Physical Review Letters, Quéré and colleagues say that such sources “might find intriguing applications as advanced probes of matter”. Vortices created within the plasma during the conversion process could also be useful for accelerating charged particles to very high energies – effectively operating as table-top accelerators.
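For readers unfamiliar with the jargon, “twisted” light is light with a helical phase front. In the standard textbook description – not something specific to this experiment – the field picks up a phase factor that winds around the beam axis, and each photon carries a well-defined orbital angular momentum:

E(r, \phi, z) \propto A(r, z)\, e^{i \ell \phi}, \qquad L_z = \ell \hbar \ \text{per photon}, \qquad \ell = 0, \pm 1, \pm 2, \dots

The integer ℓ is the “twist” that the spiral silica plate imprints on the infrared pulse.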

Is graphene a p-wave superconductor?


Scientists have unlocked graphene’s superconductivity, which could be the elusive p-wave type. Superconductors exhibit zero electrical resistance at very low temperatures. The type of superconductivity exhibited (s-, d- or p-wave) is defined by how the material’s electrons form superconducting pairs. Graphene – a sheet of carbon one atom thick – is expected to be a superconductor, but superconductivity has only been seen when it is doped with another superconducting material. Now, researchers at the University of Cambridge in the UK have managed to overcome this problem. Angelo Di Bernardo, Jason Robinson and colleagues coupled the graphene with praseodymium-cerium copper oxide (PCCO). PCCO is a known d-wave superconductor because its electron-pair spin states are oriented in a certain way. But when coupled with graphene, that orientation changed. The group speculates that this means the graphene is exhibiting rare p-wave superconductivity, whose existence physicists have been struggling to verify for more than 20 years. The combination of graphene’s superconductivity and the potential observation of p-wave superconductivity could lead to the development of new technologies based on the material. The work is published in Nature Communications.
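For context, the s-, p- and d-wave labels refer to the relative orbital angular momentum L of the two electrons in a Cooper pair, and the antisymmetry of the pair wavefunction ties that orbital part to the spin part. This is the standard classification rather than anything taken from the paper:

\Psi_{\text{pair}} = \varphi_{\text{orbital}}(L)\, \chi_{\text{spin}}, \qquad L = 0\ (s),\ L = 2\ (d) \ \Rightarrow\ \text{spin singlet}, \qquad L = 1\ (p) \ \Rightarrow\ \text{spin triplet}

A change in that pairing character when graphene is coupled to the d-wave PCCO is what the Cambridge team interprets as a possible sign of p-wave behaviour.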

Sun does not affect radioactive decay, says comprehensive study

The rates at which radioactive nuclei decay are constants and do not vary with time – according to Stefaan Pommé of the Joint Research Centre (JRC) of the European Commission in Geel, Belgium, and colleagues. The team looked at decay-rate measurements made on a number of different isotopes at 14 laboratories worldwide and spanning 60 years. After performing careful statistical analyses of the data, the researchers showed that the decay rates do not change over time and are not influenced by the experiments’ proximity to the Sun. Several studies had suggested that decay rates are affected by the distance between the Earth and the Sun – speculating that the corresponding fluctuations in solar-neutrino flux were responsible. “The study confirms that the foundation of our common measurement system of radioactivity is valid and that radioactivity behaves the same in every place on Earth,” says a statement from JRC Geel. The study is reported in four papers, including three published in Metrologia.
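The kind of statistical test described above can be illustrated with a short script: fit an annual modulation to decay-rate residuals and check whether the fitted amplitude is consistent with zero. The sketch below is a minimal illustration using synthetic data and made-up variable names – it is not the JRC team’s analysis code.

import numpy as np
from scipy.optimize import curve_fit

# Synthetic example: 20 years of weekly decay-rate residuals (fractional
# deviations from a fitted exponential decay), containing only noise.
rng = np.random.default_rng(0)
t = np.arange(0, 20 * 365.25, 7.0)            # time in days
residuals = rng.normal(0.0, 5e-4, t.size)     # no annual signal injected

def annual_modulation(t, amplitude, phase, offset):
    # Fractional modulation with a one-year (365.25-day) period
    return amplitude * np.cos(2 * np.pi * t / 365.25 - phase) + offset

popt, pcov = curve_fit(annual_modulation, t, residuals, p0=[1e-4, 0.0, 0.0])
amplitude, amplitude_err = popt[0], np.sqrt(pcov[0, 0])

# A Sun-linked effect would appear as an amplitude many standard deviations
# from zero; a truly constant decay rate gives an amplitude consistent with zero.
print(f"annual amplitude = {amplitude:.2e} +/- {amplitude_err:.2e}")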

 

  • You can find all our daily Flash Physics posts in the website’s news section, as well as on Twitter and Facebook using #FlashPhysics. Tune in to physicsworld.com later today to read today’s extensive news story on fraud and research funds.

Optical supercavity drives tiny and efficient laser

A new type of compact and highly efficient laser that is compatible with optical telecommunications has been created by Boubacar Kanté and colleagues at the University of California, San Diego, in the US. The tuneable device, which uses a wave phenomenon first proposed more than 80 years ago, can output light with a range of different beam profiles. According to Kanté, the laser could someday be used in a wide range of applications including spectroscopy and optical trapping.

The wave phenomenon exploited by the device was first suggested in 1929 by John von Neumann and Eugene Wigner, who calculated that certain quantum systems can have bound states above the continuum threshold. The discovery of these “bound states in the continuum”, or BICs, was surprising because this threshold is normally the energy required to break apart a quantum system – ionizing an atom, for example.

No one paid much attention to the result until the 1970s when physicists suggested that BICs could exist in regular arrays of semiconductor materials. More recently, it has been realized that BICs are a general property of all waves and can occur in classical systems based on light, sound and microwaves.

A BIC could also function as a very high-quality optical “supercavity” that can confine light to regions as small as several microns. Making high-quality yet small cavities in conventional lasers, in contrast, is hard because they need to be formed from two opposing mirrors that bounce light back and forth through a lasing medium. Shrinking the cavities is tricky, making it a challenge to create tiny lasers that are very efficient at producing high-quality light.

Tiny cylinders

Now, however, Kanté and colleagues have made tiny lasers that are based on BICs created within semiconductor structures as small as 10 μm across. The structures are square arrays of tiny cylinders made of indium-gallium phosphide. Normally, BICs arise in systems in which the lattice repeats infinitely in at least one direction. Kanté and colleagues got around this restriction by creating a cavity that supports several standing waves and then adjusting the structure so it best resembles a BIC.

The researchers made several different arrays – containing 8 × 8 to 20 × 20 elements and comprising cylinders with radii in the 500–550 nm range. They were able to create lasers with all of these arrays.

Kanté told Physics World that the use of a BIC supercavity allows the devices to efficiently produce high-quality laser light – even when the array is tiny. Furthermore, light is emitted vertically from the surface of the array, which offers advantages during the production process. Another plus for the laser is that it works at room temperature. And because the laser is based on a simple semiconductor array, its overall size can be easily changed – with larger arrays producing more light.

Compact spectroscopy

Kanté also points out that the colour of the laser light can be fine-tuned by adjusting the size of the semiconductor cylinders in the array. This means it could be useful for creating compact spectroscopy instruments such as the Tunable Laser Spectrometer, which is used by NASA’s Curiosity rover to study the chemical composition of the Martian atmosphere.

Another facet of the new laser is that it can create “vector” beams of light, which have specific profiles such as a Gaussian distribution or a doughnut shape. Such light can be used to trap, manipulate and study tiny objects such as bacteria and red blood cells. Vector beams can also carry orbital angular momentum and this “twisted light” has a number of applications including increasing the data capacity of optical telecommunications networks.

The BIC lasers are described in Nature.

Create films with the sounds of space

 

By James Dacey

Last weekend I went to a David Bowie tribute night at a local pub in Bath. It was a fun evening – roughly a year since the artist passed away – where local musicians played classic tracks by Ziggy Stardust, the Thin White Duke and several of Bowie’s other alter egos. One of the more surreal moments of the night was when a man in a pink suit took to the stage to play what the band called his “spaceship” – producing a whirring, repetitive electronic sound that built up to a crescendo. For a few minutes we were transported into space, just as Bowie intended with many of his memorable songs.


Flash Physics: Robot hugs broken heart, cool magnetic storage, antiprotons prop up Standard Model

Soft robot hugs a broken heart

Researchers have developed a soft robot that can fit around a heart and help it beat. In an attempt to improve cardiovascular treatments, scientists at Harvard University and Boston Children’s Hospital in the US have created a customizable robotic sleeve that can augment heart functions. Today’s treatments for heart failure include heart transplants and ventricular-assist devices (VADs) – mechanical pumps that force blood from the ventricles into the aorta. While VADs are improving, there is always the risk of clotting, blood-thinner side effects and infection because the device is in contact with the blood. In contrast, the robotic sleeve, developed by Ellen Roche and team, wraps around the heart. Made of thin silicone, it uses soft pneumatic actuators to mimic the heart’s two outer muscle layers. The actuators are supplied with air by an external pump, which allows them to twist and compress in sync with the heartbeat. Another benefit of the soft robot is that it is customizable to the patient. Should one side of a patient’s heart be weaker than the other, the robot can account for that. If the heart begins to weaken, the actuators can apply more pressure. While the device is still in the early stages of development, in the future it could be used for patients with heart failure and aid in cardiac rehabilitation and recovery. The work is described in Science Translational Medicine.

New magnetic data storage is cool and quick


An ultrafast and energy-efficient way of writing data to a magnetic storage medium has been unveiled by researchers in Poland and the Netherlands. Their storage medium is a thin layer of yttrium-iron garnet (YIG) in which some of the iron atoms have been replaced with cobalt. These cobalt atoms tend to align their spin magnetic moments along one of three high-symmetry directions in the material’s cubic lattice, leading to regions of net magnetization. Andrzej Stupakiewicz of the University of Bialystok, Poland, and Alexey Kimel of Radboud University, the Netherlands, and colleagues have shown that this magnetization can be “steered” from one lattice direction to another by shining a tiny spot of linearly polarized light on the material. The infrared light is tuned to correspond to a transition between two different d orbitals in cobalt – the d electrons being involved in the magnetic properties of the material. Information can be stored in the direction of the magnetization, which can be switched in less than 20 ps. This is much faster than today’s magnetic storage media. Writing in Nature, the team reports that switching requires less than 6 J/cm3. This is much less than today’s magneto-optic storage devices, which rely on heating up the medium to switch the magnetization.

CERN antiproton measurement props up Standard Model


The magnetic moment of the antiproton has been measured with a sixfold improvement in accuracy. The BASE collaboration at CERN has reported the best measurement to date, with an experimental uncertainty of 0.8 parts per million. This improves upon the 2013 measurement by the CERN-based ATRAP collaboration, which had a 4.4 parts-per-million uncertainty. Both experiments had aimed to shed light on an important mystery of antimatter: while antimatter appears identical to matter at a particle level, the universe as a whole contains little antimatter – which suggests it is different. By measuring properties such as magnetic moment, scientists can test the Standard Model and its matter–antimatter symmetry for chinks that could explain the disparity. The BASE experiment cools antiprotons from CERN’s antiproton decelerator to about 1 K before trapping them in electromagnetic containers. The containers store the antiprotons for long periods of time and release them individually into further traps where measurements are made. However, while the latest measurements show impressive accuracy, the antiproton magnetic moment was found to be equal in magnitude and opposite in sign to the proton’s – just as predicted by the Standard Model. The next step for BASE is to try a new trapping technique that should improve uncertainty by a factor of up to 800, which could reveal a glimpse of new physics. The current study has been reported in Nature Communications.

  • You can find all our daily Flash Physics posts in the website’s news section, as well as on Twitter and Facebook using #FlashPhysics. Tune in to physicsworld.com later today to read today’s extensive news story on a new and extremely efficient miniature laser.

Web life: Nanoscale Views

So what is the site about?

Nanoscale Views is a blog written by physicist Douglas Natelson, who heads the physics and astronomy department at Rice University in the US. The blog has a catchy strapline – “A blog about condensed matter and nanoscale physics. Why should high energy and astro folks have all the fun?” – that essentially describes why Natelson began blogging in mid-2005. Indeed, in his first post, he describes hunting around the Internet for condensed-matter and nanoscale-physics blogs, only to come up empty-handed. Since then, Natelson’s blog has been regularly updated, at a rate of four to five posts a month, and examines a myriad of topics.

What are some of the topics covered?

As the blog’s title suggests, condensed-matter posts are a regular feature: from quantum computing to metasurfaces. These are often in-depth, but Natelson keeps the contents and his language as simple as possible, making all his writing enjoyable. He often covers a topic in a series of posts over a week or two, so no individual post is onerously long. He also regularly follows up on previous research news and topics, even a few years on, providing a swiftly developing field with the necessary context. Apart from research, Natelson also writes a variety of posts on everything from conference and workshop reports to collections of short and interesting news briefs and job postings. He often writes about academic life – be it career advice, academic publishing or, occasionally, funding and policy news. Interestingly, Natelson also reports on general big physics news – for example, the discovery of a terrestrial exoplanet in the habitable zone of the Sun’s nearest neighbour – that does not lie strictly in the condensed-matter field, but is of definite interest to anyone in physics.

Who is it aimed at?

The vast variety of topics covered means that the blog’s readership is wide. While a healthy interest in condensed-matter physics would benefit regular readers, it is not at all a prerequisite. In fact, the blog would be a good place for anyone looking for a solid introduction to the field (especially thanks to Natelson’s wide archive of topics, which he links to in most posts), as well as for those who wish to keep up to date with developments within it.

Can you give me a sample quote?

From a February 2016 post introducing “density functional theory”: “Let me try an analogy. You’re trying to arrange the seating for a big banquet, and there are a bunch of constraints: Alice wants very much to be close to the kitchen. Bob also wants to be close to the kitchen. However, Alice and Bob both want to be as far from all other people as possible. Chairs can’t be on top of each other, but you still need to accommodate the full guest list. In the end you are going to care about the answers to certain questions: How hard would it be to push two chairs closer to each other? If one person left, how much would all the chairs need to be rearranged to keep everyone maximally comfortable? You could imagine solving this problem by brute force – write down all the constraints and try satisfying them one person at a time, though every person you add might mean rearranging all the previously seated people. You could also imagine solving this by some trial-and-error method, where you guess an initial arrangement, and make adjustments to check and see if you’ve improved how well you satisfy everyone. However, it doesn’t look like there’s any clear, immediate strategy for figuring this out and answering the relevant questions.”

Of old habits and new ideas

Poliomyelitis, or polio, is a disease that causes irreversible paralysis and even death. Unfortunately, it mainly affects children under the age of five. This debilitating disease must be banished and it is indeed on its way out. According to the World Health Organization, polio cases have decreased by over 99% since 1988, when there were an estimated 350,000 cases in more than 125 endemic countries. This is mainly thanks to the “Global Polio Eradication Initiative” (GPEI) that rolled out a large-scale polio vaccination programme in the late 1980s. One might be excused, then, for assuming that the invention and administration of the polio vaccine, which made the GPEI possible, is a good thing that must surely be globally applauded. Well, not quite. Innovations such as the polio vaccine are often not embraced by everyone. It should therefore come as no surprise that in certain parts of the world, polio vaccination is still dogged by controversy even today.

There are various reasons why a section of the population will vocally, and sometimes even militantly, oppose any innovation. Some of these reasons are explored by author Calestous Juma in Innovation and Its Enemies: Why People Resist New Technologies. When it comes to innovation, the book explains that neither the genuineness nor the gravity of a problem being addressed is enough to make the innovation universally acceptable. For example, the author talks of “transgenic farming” or genetically modified crops. If you think the burgeoning world population requires food security and that engineering the genetic make-up of crops to increase yield is a universally acceptable solution, then dream on. Environmentalists, green campaigners and self-styled “friends of the Earth” believe that it is in the best interests of our planet to protect it from any such farming and its proponents, who are often deemed “foes of the Earth”.

Juma argues that some of the controversies that follow innovations could be directly traced to the very change brought about by the innovation itself. He echoes Austrian-American economist Joseph Schumpeter’s concept of “creative destruction”, which suggests that “the change that comes with innovation requires the destruction of something old and replacing it with something new”. This very process brews tension between the proponents of incumbency and the people promoting innovative ideas. The palpable tension is further fuelled by the “fear of loss”, or better still, “perceived loss” that is associated with change. Loss here could be anything from economic loss, to loss of the Earth’s biodiversity and even one’s cultural heritage.

In the book, Juma addresses the controversy that follows each of nine flagship innovations. Despite its attention to fine historical detail, the book never bores the reader, thanks in no small part to Juma’s lively and engaging style. Each chapter starts with a detailed historical background to the chosen innovation, followed by a lucid description of its specifics, before delving into the controversies surrounding the innovation and the reasons behind the tension. Starting with the ancient but lively story of the innovations relating to coffee and its consumption, through to controversy-laden transgenic farming, Juma tries to communicate to the reader the justifications and evidence that are often used by antagonists in suppressing a particular innovation. Some of the reasons why a certain group will vehemently oppose a breakthrough, and the methods used to stop it, are quite obvious, but readers should be prepared to be surprised or, at the very least, amused.

Take the battle of electric currents, for example. The book chronicles how the proponents of Thomas Edison’s direct current (DC) vilified the alternating current (AC) championed by George Westinghouse in the 1800s. Edison and his supporters publicly demonized AC as cruel and unsafe. To substantiate their claim, they sent 300 volts AC through the spinal cord and brain of animals to electrocute them. This act of horror would later, inadvertently, pave the way for the use of the electric chair as a form of capital punishment in the US. Edison actually realized the superiority of AC and his primary intention was not to stop its adoption, but rather to delay it so that he could recoup his investment in DC.

Juma really demonstrates his storytelling prowess while writing about the incredibly rich history of coffee (Coffea arabica) and how this beverage, native to the highlands of Ethiopia, conquered the rest of the world – and, of course, the resistance encountered along the way in doing so. Juma discerningly narrates the political, cultural and, at times, “dodgy” premises used by antagonists to oppose the global spread of coffee and coffee houses. Sadly, not all the chapters in the book are as rich in detail as the story of coffee.

In the very last chapter of the book, Juma addresses the question of whether the controversies that accompany innovations can ever be avoided, and offers some practical suggestions on this for policy-makers and public-office holders. While his ideas might not entirely avert controversies, he certainly believes they can help manage them better.

Readers should be aware that Juma gives no reason for his choice of the innovations discussed in the book. Some very topical and interesting innovations, along with their controversies, are therefore not tackled. Of the long list of overlooked controversy-prone innovations, one that is conspicuously missing in my view is birth-control pills. These are embroiled in controversy even today in different parts of the world. Ever since Margaret Sanger, who coined the term “birth control” in 1914, opened her first family-planning clinic in around 1916, birth control has been controversial. The Zubik v. Burwell case before the US Supreme Court in May 2016 is a stark reminder of the never-ending controversy surrounding birth-control pills. It certainly would be interesting to know what Juma thinks about this innovation and why many people are so against it. I doubt very much that the author is oblivious to this example; it must have been a conscious decision to leave it out. In doing so, he narrows the breadth of the book, which might disappoint some readers.

Moreover, except for the stories of coffee and the printing press, the other controversies discussed in the book are overly focused on a particular geographical location – the US – though this may well reflect the author’s target audience. With so much history of innovation and resistance to choose from, some level of selectivity is to be expected in a book of this size. Nonetheless, some readers will find the geographical confinement unappealing.

What is certainly not missing in the book is Juma’s unmistakable passion for innovation. This comes across strikingly clearly, from the first page of the book to the very last. Reading Innovation and its Enemies will not make anyone an innovator, but it brings to the fore why being an innovator is never plain sailing.

  • 2016 Oxford University Press £19.99hb 432pp

Dark energy emerges when energy conservation is violated

The conservation of energy is one of physicists’ most cherished principles, but its violation could resolve a major scientific mystery: why is the expansion of the universe accelerating? That is the eye-catching claim of a group of theorists in France and Mexico, who have worked out that dark energy can take the form of Albert Einstein’s cosmological constant by effectively sucking energy out of the cosmos as it expands.

The cosmological constant is a mathematical term describing an anti-gravitational force that Einstein had inserted into his equations of general relativity in order to counteract the mutual attraction of matter within a static universe. Einstein later described it as his “biggest blunder” after it was discovered that the universe is in fact expanding, but the constant returned to favour in the late 1990s following the discovery that the expansion is accelerating.

For many physicists, the cosmological constant is a natural candidate to explain dark energy. Since it is a property of space–time itself, the constant could represent the energy generated by the virtual particles that quantum mechanics dictates continually flit into and out of existence. Unfortunately, the theoretical value of this “vacuum energy” is up to a staggering 120 orders of magnitude larger than observations of the universe’s expansion imply.
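To put that mismatch in numbers – a rough and commonly quoted back-of-envelope comparison, not a figure from the new paper – the measured dark-energy density in natural units is dwarfed by a naive vacuum-energy estimate that sums field modes up to a Planck-scale cutoff:

\rho_{\Lambda}^{\text{obs}} \sim 10^{-47}\ \text{GeV}^4, \qquad \rho_{\text{vac}}^{\text{theory}} \sim M_{\text{Pl}}^4 \sim 10^{73\text{--}76}\ \text{GeV}^4, \qquad \frac{\rho_{\text{vac}}^{\text{theory}}}{\rho_{\Lambda}^{\text{obs}}} \sim 10^{120\text{--}123}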

Running total

The latest work, carried out by Alejandro Perez and Thibaut Josset of Aix-Marseille University together with Daniel Sudarsky of the National Autonomous University of Mexico, proposes that the cosmological constant is instead the running total of all the non-conserved energy in the history of the universe. The “constant” would in fact vary – increasing when energy flows out of the universe and decreasing when it returns. However, the constant would appear unchanging in our current (low-density) epoch because its rate of change would be proportional to the universe’s mass density. In this scheme, vacuum energy does not contribute to the cosmological constant.
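Schematically – a sketch in the article’s own terms rather than the paper’s exact equations – this running-total picture can be written as

\Lambda(t) \approx \Lambda_0 + \kappa \int_0^{t} \rho(t')\, \xi(t')\, \mathrm{d}t'

where ρ is the universe’s mass density, ξ parametrizes the tiny fractional rate of energy non-conservation and κ is a constant. Because ρ is now very small, dΛ/dt is negligible today and Λ appears constant.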

The researchers had to look beyond general relativity because, like Newtonian mechanics, it requires energy to be conserved. Strictly speaking, relativity requires the conservation of a multi-component “energy-momentum tensor”. That conservation is manifest in the fact that, on very small scales, space–time is flat, even though Einstein’s theory tells us that mass distorts the geometry of space–time.

Even though each individual violation of energy conservation is tiny, the accumulated effect of these violations over the very long history of the universe can lead to dark energy
Alejandro Perez, Aix Marseille University

In contrast, most attempts to devise a theory of quantum gravity require space–time to come in discrete grains at the smallest (Planck-length) scales. That graininess opens the door to energy non-conservation. Unfortunately, no fully formed quantum-gravity theory exists yet, and so the trio instead turned to a variant of general relativity known as unimodular gravity, which allows some violation of energy conservation. They found that when they constrained the amount of energy that can be lost from (or gained by) the universe to be consistent with the cosmological principle – on very large scales the process must be both homogeneous and isotropic – the unimodular equations generated a cosmological-constant-like entity.

Modified quantum mechanics

In the absence of a proper understanding of Planck-scale space–time graininess, the researchers were unable to calculate the exact size of the cosmological constant. Instead, they incorporated the unimodular equations into a couple of phenomenological models that exhibit energy non-conservation. One of these describes how matter might propagate in granular space–time, while the other modifies quantum mechanics to account for the disappearance of superposition states at macroscopic scales.

These models both contain two free parameters, which were adjusted to make the models consistent with null results from experiments that have looked for energy non-conservation in our local universe. Despite this severe constraint, the researchers found that the models generated a cosmological constant of the same order of magnitude as that observed. “We are saying that even though each individual violation of energy conservation is tiny, the accumulated effect of these violations over the very long history of the universe can lead to dark energy and accelerated expansion,” Perez says.

In future, he says it might be possible to subject the new idea to more direct tests, such as observing supernovae very precisely to try to work out whether the universe’s accelerating expansion is driven by a constant or a varying force. The model could also be improved so that it captures dark energy’s evolution from just after the Big Bang, allowing the results to be compared with observations of the cosmic microwave background.

If the trio are ultimately proved right, it would not mean physicists having to throw their long-established conservation principles completely out of the window. A variation in the cosmological constant, Perez says, could point to a deeper, more abstract kind of conservation law. “Just as heat is energy stored in the chaotic motion of molecules, the cosmological constant would be ‘energy’ stored in the dynamics of atoms of space–time,” he explains. “This energy would only appear to be lost if space–time is assumed to be smooth.”

Fanciful yet viable

Other physicists are cautiously supportive of the new work. George Ellis of the University of Cape Town in South Africa describes the research as “no more fanciful than many other ideas being explored in theoretical physics at present”. The fact that the models predict energy to be “effectively conserved on solar-system scales” – a crucial check, he says – makes the proposal “viable” in his view.

Lee Smolin of the Perimeter Institute for Theoretical Physics in Canada, meanwhile, praises the researchers for their “fresh new idea”, which he describes as “speculative, but in the best way”. He says that the proposal is “probably wrong”, but that if it’s right “it is revolutionary”.

The research is described in Physical Review Letters.

Flash Physics: Tornado mystery solved at long last, nanoparticles self-heal, firm bags £35m nuclear contract

Tornado mystery solved at long last

In 1955, in Scottsbluff, Nebraska, US, a group of broadcasters was forced to take shelter from a tornado in the basement of a stone building. But they weren’t completely safe from the violent storm. As the funnel of the tornado passed overhead, the temperature dropped and suddenly they found it difficult to breathe. Sixty-two years on, Georgios Vatistas and his students at Concordia University in Canada have developed a new mathematical model that can explain what happened that day. While past studies have focused on laminar (smooth) vortices, the current study looked at vortices with more complex turbulent flow. In a laminar vortex, the converging flow cools steadily with decreasing radius. The new model, however, takes into account density variation and turbulence. The team found that the temperature first rises before cooling to a minimum at the vortex centre. This is the result of competition between heating of the air by mechanical friction and cooling by expanding air pockets. The result is a cooler vortex centre, with lower air pressure. Vatistas and colleagues were able to work out that during the 1955 tornado, the temperature went from 27 °C to 12 °C and the air pressure dropped to 80% of that found at an altitude of 8000 m (known as the “death zone”). Thankfully, it passed quickly before the broadcasters were suffocated. The findings, published in the Journal of Aircraft, will help to improve the operation of refrigeration vortex tubes used to cool equipment such as electronic components and cutting tools. The work also sheds some light on the mysterious world of tornadoes and waterspouts.
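The “cooling because of expanding air pockets” is essentially adiabatic expansion. As a rough illustration – the textbook dry-adiabatic relation rather than the paper’s turbulent model – air that expands rapidly into the low-pressure core cools according to

\frac{T_2}{T_1} = \left( \frac{p_2}{p_1} \right)^{(\gamma - 1)/\gamma}, \qquad \gamma \approx 1.4 \ \text{for dry air}

so a 20% pressure drop cools air near 300 K by roughly 19 K – the same order as the 15 °C temperature drop inferred for the 1955 tornado.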

Nanoparticles self-heal after absorbing hydrogen


Researchers at Stanford University in the US have used state-of-the-art electron-microscopy techniques to watch tiny particles of palladium absorb hydrogen ions in real time. Jen Dionne and colleagues first created palladium nanocubes that were 15–80 nm in size. The nanocubes were then placed in a scanning transmission electron microscope in the presence of hydrogen. The team watched as hydrogen molecules reacted with the surface of the palladium, creating hydrogen ions that entered the bulk of the nanocubes in a process called intercalation. Studies that lasted more than 24 h revealed that imperfections in the palladium crystal structure developed as the nanocube filled up with hydrogen. However, once the material could absorb no more hydrogen, these imperfections appeared to be “pushed out” of the nanocube. “The nanoparticle has the ability to self-heal,” explains Dionne. “When you first introduce hydrogen, the particle deforms and loses its perfect crystallinity. But once the particle has absorbed as much hydrogen as it can, it transforms itself back to a perfect crystal again.” The storage of hydrogen in metals plays an important role in advanced energy sources such as fuel cells. The research – described in Nature Communications – could help to boost the performance of such systems.

Amec Foster Wheeler bags £35m nuclear-propulsion contract

UK-based engineering firm Amec Foster Wheeler has been awarded a five-year contract to provide research and technology services to the UK Ministry of Defence’s Naval Nuclear Propulsion Programme (NNPP). The deal is worth £35m and involves analysis and laboratory testing at the firm’s facilities in Warrington and Dorchester. Programme, project and technical management services will also be provided by Amec Foster Wheeler. “Long-term investment in innovative research and technology like this is absolutely essential to ensuring that Britain’s nuclear programme remains cutting-edge,” says UK defence minister Harriet Baldwin.

 

  • You can find all our daily Flash Physics posts in the website’s news section, as well as on Twitter and Facebook using #FlashPhysics. Tune in to physicsworld.com later today to read today’s extensive news story on the origins of dark energy.

Squeezed light plays a quantum drum

Physicists in the US have cooled a microscopic aluminium drum closer to absolute zero than previously thought possible. The researchers, working at the National Institute of Standards and Technology (NIST) in Boulder, Colorado, say that the cooling technique could be used in a range of applications including quantum computers or high-precision sensors.

The drum, which is 20 μm in diameter and 100 nm thick, was embedded on a chip as part of a superconducting circuit, explains John Teufel of NIST. To cool the drum, the researchers first placed the circuit into a vacuum chamber at 37 mK. They then circulated microwaves through the circuit. The microwave photons struck the drum’s atoms and robbed the atoms of some of their momentum. Thus, by slowing the atoms’ thermal motion, the photons lowered the drum’s temperature to 360 μK.

Teufel points out that in this experiment, temperature is defined differently from how it is conventionally understood in thermodynamics. When the researchers say they have achieved 360 μK, this does not mean that the atoms are nearly stationary. Rather, the drum is capable of vibrating simultaneously at a range of frequencies, but in this experiment it was restricted to only its lowest-frequency mode.
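In practice, the “temperature” of a single vibrational mode is usually expressed through its mean phonon number. For a mode of angular frequency ω_m – a value not quoted in this article – the thermal occupancy follows the standard Bose–Einstein form:

\bar{n} = \frac{1}{e^{\hbar \omega_m / k_B T} - 1}

so quoting 360 μK is equivalent to quoting how close the drum’s average occupancy of that mode is to zero, i.e. to its quantum ground state.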

Random kicks

For years, physicists have used photons to cool physical systems including atoms, molecules and even mirrors in optical cavities. But the photons’ cooling ability was always limited by noise, due to Heisenberg’s uncertainty principle. Teufel explains that overall, while the photons steal momentum from the drum, occasionally they give “random kicks” of momentum back to the drum because of quantum fluctuations. It had been thought that this noise would be present whenever photons were used for cooling, and that it would prevent the photons from cooling an object below a specific temperature called the “quantum limit”.

His group’s innovation, Teufel explains, was to figure out how to get rid of the noise to surpass the quantum limit. The researchers achieved this by using a special type of light known as squeezed light.

Squeezed light is light with engineered uncertainty. All photons are subject to the uncertainty principle: for example, the better you know a photon’s position, the worse you know its momentum. While physicists can never get rid of this fundamental uncertainty, they have figured out that they can redistribute it. If they want to know the photon’s position more precisely, they can sacrifice precision in the momentum measurement.
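In terms of the two field quadratures of the light – the standard description of squeezing, not something specific to this experiment – the trade-off looks like this:

\Delta X_1\, \Delta X_2 \geq \tfrac{1}{4}, \qquad \text{vacuum: } \Delta X_1 = \Delta X_2 = \tfrac{1}{2}, \qquad \text{squeezed: } \Delta X_1 = \tfrac{1}{2} e^{-r},\ \Delta X_2 = \tfrac{1}{2} e^{+r}

The fluctuations in one quadrature dip below the vacuum level at the expense of the other – the property the NIST team exploits to suppress the random kicks.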

Cleverly using the rules

Teufel’s group redistributed the uncertainty between a photon’s intensity and its phase, such that photons hit the drum in a co-ordinated way to eliminate random noise. He is careful to point out that they have not broken the rules of quantum mechanics. “We’re just cleverly using the rules,” he says. This technique can theoretically cool an object “arbitrarily close” to absolute zero, meaning that while reaching absolute zero is impossible, experimentalists can inch towards it, unconstrained by photon quantum noise.

This level of cooling means that they can observe vibrations in the drum due to quantum effects, which are usually drowned out by thermal motion. Furthermore, Teufel says that the cooling technique can be used to make ultrasensitive detectors such as force sensors or magnetometers. “You can very sensitively measure anything that pushes on the drum,” he says.

Teufel also wants to use the drum in quantum computing: quantum information could be encoded and stored as long-lasting vibrations in the drum.

Minds were blown

“The work is exciting because most people in the field thought they were stuck with the quantum noise forever,” says Aashish Clerk of Canada’s McGill University, who was not involved in the work. To surpass the quantum limit, Teufel’s group really thought outside the box. “No one had any intuitive reason to think that [squeezed light] would actually do something,” Clerk says. He first heard the researchers talk about the technique at a conference. “It was a big audience, and their minds were just blown,” he says.

The research is described in Nature.
