
Flash Physics: LHCb spots baryon resonances, climate-change centre to close, forecasting turbulent flow

LHCb spots five new baryon resonances

Five new Ωc0 baryon resonances have been detected by the LHCb experiment at the Large Hadron Collider (LHC) at CERN. Ωc0 baryons comprise a charm quark and two strange quarks, and because they are composite particles they can exist in a number of different energy states – or resonances. Two low-energy states of the baryon were already known, but now physicists working on LHCb have observed five higher-energy Ωc0 resonances with masses of 3000 MeV/c², 3050 MeV/c², 3066 MeV/c², 3090 MeV/c² and 3119 MeV/c². The discovery could provide physicists with further insights into quantum chromodynamics – the theory that describes how quarks interact with each other – and is described in a preprint on arXiv.

Australian climate-change centre to close for lack of funds

Australia’s most prominent non-governmental climate-change organization will shut down in June due to a lack of funding. The Climate Institute – a policy think tank based in Sydney – was founded in 2005 and has played a vital role in guiding Australian climate policy, including the development of the country’s renewable-energy target in 2008, a carbon-pollution-reduction scheme in 2009 and an emissions-trading scheme in 2011. The institute was originally financed by the Poola Foundation’s Tom Kantor fund, which supported the organization for 10 years, after which the institute began to seek new donors. “Despite ongoing support from a range of philanthropy and business entities, the board has been unable to secure sufficient funding to continue the level and quality of work that is representative of the Climate Institute’s strong reputation,” Mark Wooton, the institute’s founding director and board chair, notes in a statement. One potential difficulty in attracting funding could have been the current Australian government’s inclination towards fossil fuels. “We are disappointed that some in government prefer to treat what should be a risk-management issue as a proxy for political and ideological battles,” Wooton notes. There is, however, still a chance that some of the institute’s activities will continue: once it closes in June, its board says it will work with other organizations that could carry on some “key aspects” of its work.

Turbulent flow can be forecast

(Image: the vorticity of fluid flow)

Turbulence is characterized by chaotic changes in the flow and pressure of a fluid, and therefore it is very difficult to predict the time evolution of turbulent systems. Recently, however, physicists have noticed that non-chaotic “exact coherent structures” appear to exist in turbulent flows and that these ECSs recur in space and time. Now, Michael Schatz and colleagues at the Georgia Institute of Technology in the US have done theoretical and experimental studies of a weakly turbulent system that confirm the existence of ECSs and the important role they play in turbulence. They have also shown that the time evolution of turbulent flow can be calculated using knowledge of the relevant ECSs. The research could lead to new ways of predicting the evolution of turbulent flow and is described in Physical Review Letters.

  • You can find all our daily Flash Physics posts in the website’s news section, as well as on Twitter and Facebook using #FlashPhysics. Tune in to physicsworld.com later today to read today’s extensive news story on a new touch-screen technology.

A quantum boost for machine learning

This week, one year ago, one of the world’s top players of the board game Go took on an artificial intelligence (AI) program from Google’s DeepMind project. Watch the video to find out what happened. The program, called AlphaGo, is a demonstration of machine learning, which can be thought of as the data side of AI. Discover more about machine learning – and how it could be boosted by quantum computing – by reading this article from the March issue of Physics World.

Interstate discomfort

Laws, unlike nature, cannot recognize indeterminacy and will go to any lengths to squash an ambiguous phenomenon into a recognized state. The recent declaration by the US Patent Trial and Appeal Board that magnetic resonance imaging machines are “abstract ideas” is one example. Looming cases in which courts will have to decide if an embryo is a person or property are another. Intellectual Ventures v. Symantec – a case involving software decided by the US Court of Appeals for the Federal Circuit last September – is yet one more.

An anti-virus corporation based in California, Symantec had been sued by Intellectual Ventures – a company that buys and licenses intellectual property – for infringing patents relating to certain software features. (Intellectual Ventures was founded in 2000 by physicist and former Microsoft chief technology officer Nathan Myhrvold.) The court rejected the suit, partly following the 2014 US Supreme Court case Alice Corp. v. CLS Bank International.

That case implied that many software patents were unfounded because they involve “abstract ideas”. What shocked and baffled lawyers and geeks alike, however, about the Intellectual Ventures decision was one judge’s declaration that “software is a form of language” and that patenting software would therefore amount to a “suppression of free speech”.

Inventions versus ideas

A patent is a right to a limited monopoly that a nation grants an inventor in exchange for the public disclosure of an invention. The social policy of such a deal is to encourage openness and risk-taking by making it unnecessary for innovators to hide inventions to protect their ability to exploit them. But patenting has limits. As I discussed in a previous column written as the Alice decision loomed (see “Patenting science”, April 2014), the US Supreme Court ruled in 1972 that patents on “products of nature”, natural laws and abstract ideas are impermissible because such patents would “inhibit future innovation premised on them”.

Digital technologies can blur the legal distinction between inventions and abstract ideas. The 2014 Alice decision failed to clear up this indeterminacy. While not precluding patenting software, it did convey scepticism about the patentability of many already-issued software patents. The Intellectual Ventures decision expressed still stronger scepticism.

To quote from concurring judge Haldane Mayer’s opinion: “Software lies in the antechamber of patentable invention. Because generically-implemented software is an ‘idea’ insufficiently linked to any defining physical structure other than a standard computer, it is a precursor to technology rather than technology itself.” The Internet, he continued, is “the most participatory form of mass speech yet developed” and therefore “deserves the highest protection from governmental intrusion”. As a form of language, software enjoys such protection.

Software states

Software does not fit any existing legal categories. Elsewhere in Judge Mayer’s opinion he compared it to literature or music, in which case it would be subject to copyright. Copyright is a legal category often used to cover legally disruptive new technologies. The first copyright law (the 1710 British Statute of Anne) covered books, but was eventually extended by analogy to maps, music and photographs. In 1976, after companies started to sell hardware and software separately, the US Copyright Act explicitly put software in the conceptual box of copyrightable things. But copyright, which protects against the copying of original material, is of limited value for protecting software, which is often created by combining original elements with publicly available code. It is easy to avoid restrictions by coding an attractive feature borrowed from another program’s innards a little differently.

This explains the desirability of trying to put new software developments in the patent state, which is designed to protect inventions. But software doesn’t fit well here, either. It’s not a concrete object, like a mousetrap, but more like a set of instructions.

The difficulties of using either copyright or patent legislation to protect software invite turning to trademark legislation. This is an attractive analogy because, while trademarks are identifiers of a product, they can apply to its overall shape – the look of a Toyota Prius, say – leaving aside the specific details of its innards. But determining the identity of a piece of software poses other difficulties, and trademarks protect the look of something rather than its functioning.

Faced with the immense intellectual property perplexities of software, Judge Mayer appealed to yet another analogy: language. Software indeed has many linguistic features, with a distinctive, crafted structure mixing original and conventional parts. The analogy let him invoke well-established freedom of expression legislation applying to things in the linguistic state. The social policy of free-speech legislation is to encourage sharing views about the world with other humans. But software – instructions that tell machines what to do – lacks properties that make expression valuable to communal life. The language analogy is as far-fetched as the others, but has the advantage of allowing courts to opt out of the issue’s subtleties.

The critical point

Software thus creates a good legal example of what I’d call “interstate discomfort” – the deliberate forcing of something into a category where it obviously doesn’t belong, but one that allows it to be handled as if it were in a conventional state. While physicists regard indeterminate phases – liquid crystals, say – as interesting challenges to be explored, judges find them intolerable and demand binaries; an invention, for instance, must be in one state or another, patentable or not.

The problem will only worsen. When the courts eventually get a case on a patent claim for an invention made by a computer, I can’t wait to see what state they try to squash that into. Physics, you might say, has a degree of freedom that the law lacks.

Invisible robots overshadowed by metallic hydrogen

By Sarah Tesh in New Orleans, Louisiana, US

After much coffee and a lot of crispy bacon, the second day of the APS March Meeting began. The hot topic of the day: metallic hydrogen. Even though we arrived 15 minutes early for Isaac Silvera’s talk, the crowd was overflowing from the room, but despite all the pushing and shoving (my foot has not recovered from being stood on), we did manage to get seats. Silvera began by saying that he had been working on the problem for “probably longer than [most of us] were born” before taking us through the nearly 45 years of research on the subject. He also gave a press conference that included talks by theoretical physicists David Ceperley from the University of Illinois and Jeffrey McMahon from Washington State University. My colleague Tushna Commissariat caught up with Silvera later on, so be on the lookout for a more detailed update from her.

In a biomedical session, we heard how Xuanhe Zhao of the Massachusetts Institute of Technology recently developed “invisible soft robots” made of hydrogels. These nifty devices are both optically and acoustically invisible. Zhao and his team exploit the fact that hydrogels expand when water is pumped into them: a block containing regions of different hydrogel density bends as the water flows in. Zhao showed us a film of their “claw robot”, made of hydrogel and controlled with hydraulic power. The strength of the hydrogel, its controlled flexibility and the layer of waterproof material on the outside mean the claw can pick up objects underwater. Zhao hopes such water-based soft robots and materials could have biomedical applications because the human body itself is mostly water.

Early galaxies shunned dark matter

Dark matter had less influence in galaxies 10 billion years ago than it does today, according to observations by the European Southern Observatory’s Very Large Telescope (VLT).

In the modern universe, stars on the outskirts of galaxies rotate just as fast as those in the dense galactic cores. This is surprising because the gravitational glue from the visible matter in a galaxy is not strong enough to stop these outliers from being flung out into space. Such galaxies are said to have “flat rotation curves” and the fact that the rotational velocity does not drop off at large distances from the core has been attributed to extra gravity provided by otherwise invisible dark matter.
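The contrast between a Keplerian falloff and a flat rotation curve can be sketched numerically. The snippet below (all masses and densities are illustrative round numbers, not values from the study) computes the circular velocity v = √(GM(r)/r) for a galaxy with and without a simple halo whose enclosed mass grows linearly with radius:

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
KPC = 3.086e19       # one kiloparsec in metres

M_VISIBLE = 1e41     # illustrative visible mass, concentrated in the core (kg)
HALO_K = 6e20        # illustrative halo enclosed-mass slope, kg per metre of radius

def v_circ(r, with_halo):
    # circular velocity from the enclosed mass: all the visible matter
    # plus, optionally, an isothermal halo with M_halo(r) proportional to r
    M = M_VISIBLE + (HALO_K * r if with_halo else 0.0)
    return math.sqrt(G * M / r)

# velocity (km/s) at several radii, without and with the halo
for r_kpc in (5, 10, 20, 40):
    r = r_kpc * KPC
    print(r_kpc, round(v_circ(r, False) / 1e3), round(v_circ(r, True) / 1e3))
```

Without the halo the velocity falls off as 1/√r beyond the visible mass; with it, the curve tends towards the constant √(G × HALO_K) – the classic flat rotation curve.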

Now, astronomers led by Reinhard Genzel of the Max Planck Institute for Extraterrestrial Physics in Germany have shown that galaxies in the early universe didn’t seem to play by the same rules. Instead of flat rotation curves, these young galaxies had rotation curves with a sharp downward dip, caused by stars and gas orbiting the galaxies more slowly on the outskirts than in the centre. The implication is that there was not as much dark matter in the discs of these galaxies as there is in galaxies today. Instead, these early galaxies appear to be dominated by normal, visible “baryonic” matter.

To make the discovery, Genzel’s team measured the rotation velocity of bright star-forming regions in distant galaxies using the KMOS and SINFONI spectrographs on the VLT. Selecting the six best galaxies from a larger data set, they found that all six had the downward dip in their rotation curve. To ensure that these six were not simply flukes, the researchers then stacked and averaged the rotation curves of 97 other galaxies in their data set and found that downward dips were a common trend.

Unruly beasts

The observations show the galaxies during the era when the formation of stars and galaxies in the universe was at its peak. During this time, “galaxies were unruly beasts”, Genzel told physicsworld.com. They exhibited exceptionally high rates of gas flowing onto them, as well as powerful feedback from supernovae, stellar winds and black holes blowing material back out of the galaxies.

The various accretion and feedback processes afflicting galaxies in the early universe could dump a lot of energy into them, resulting in a wide disparity in rotational velocities around a galaxy. This velocity dispersion would increase with radius, resulting in the outermost parts of a galaxy’s disc rotating more slowly.

Also, during the earliest times in the universe, gas and dark matter were mixed together, but gas was able to separate from dark matter by losing energy more quickly through interactions with other baryonic matter. If the decoupled gas formed a galactic disc fast enough, “then the central regions of galaxies could become dominated by baryons”, says Mark Swinbank of the University of Durham, who was not involved in the research. The high accretion rates of gas flowing onto the galaxies would then reinforce the surfeit of baryonic matter.

The dark-matter story

However, the decoupling of matter as well as the velocity dispersions may not be enough to explain the data, says Genzel. Instead, astrophysicists may also need to look at how dark matter was behaving.

Although physicists have very little understanding of what dark matter is, several theories have been developed to try to describe the mysterious substance. According to the theory of cold dark matter, galaxies are nestled within giant haloes of dark matter. These haloes aid the growth of galaxies by feeding them gas from the “cosmic web” of matter that spans the cosmos.

“The cold dark-matter paradigm does an extremely good job in mapping out what we have learned over cosmic time about large scale structure and the distribution of galaxies, but maybe the story on smaller scales is wrong,” says Genzel.

Since galaxies were still forming during this early epoch, it is possible that their dark-matter haloes were still in flux and had not yet settled into an equilibrium state where the dark-matter’s gravity could affect the rotation curves. It is also possible that feedback processes could drag dark matter around, rearranging its distribution in the halo relative to the discs of the galaxies. If true, the interaction could provide constraints on dark matter’s properties.

Modelling the data

Astrophysicists will now pore over these data to see if they can be explained using current models of galaxy formation, or whether new ideas will be needed. Current models are in reasonable agreement with the observations, says Simon White, director of the Max Planck Institute for Astrophysics in Germany, “but they would require fine-tuning in terms of star-formation and feedback processes to get precise agreement”.

A more difficult task may be describing how these baryon-dominated galaxies developed into the dark-matter-dominated galaxies that we see in the modern universe. “It’s up to the models to understand how and when dark matter begins to dominate the central regions,” says Swinbank.

The observations themselves may be the key to solving that problem. By determining the distribution of mass in galaxies during different epochs, it will become possible to join the dots. “Putting it all together will allow us to see how things developed from the early universe all the way to the present day,” says White.

Nevertheless, the new findings have raised questions about the relationship between dark matter and galaxy growth. “There will, I’m sure, be a very active and probably controversial discussion,” admits Genzel. “It’s going to be an interesting time.”

The research is described in four papers, one published in Nature and three supporting papers published in The Astrophysical Journal.

Flash Physics: Musical vibrato analysed, women choose biology after basic physics course, CMS upgrade complete

Musical vibrato analysed using quantum-physics technique

Vibrato – the controlled oscillation of the pitch of a musical instrument or voice – has been analysed by researchers at Queen Mary University of London in the UK using a technique originally developed for atomic and molecular physics. Musical vibratos usually involve modulating the pitch of a note by as much as a semitone at a rate of about 4–8 Hz. Because most vibratos only last a second or so, it is difficult to analyse the nature of the sound waves because the signals include only a few vibrato cycles. Now, Luwei Yang, Elaine Chew, Khalid Rajab and colleagues have characterized vibratos using the filter diagonalization method (FDM). This was originally developed to study quantum-mechanical resonances of atoms and molecules from time-sequence data, particularly from nuclear magnetic resonance experiments. “Although musical signals are very different from their quantum counterparts, mathematically they share many similarities, including the characteristics of their resonances,” says Rajab. “In fact, we found that, because they oscillate with time, the harmonics in musical signals can be more complicated to analyse than their quantum counterparts.” The project focussed on understanding the differences between the violin and the erhu, which is a two-stringed Chinese fiddle. Chew explains: “When music for a folk instrument like the erhu is performed on a violin, it lacks the stylistic and expressive qualities of the original.” “One of the major sources of these differences lies in the way in which notes are elaborated (with vibrato) and the way in which the instrumentalists make their transitions between notes (using portamentos),” she adds. “We were interested in creating computing tools that can help reveal these differences.” The research is described in the Journal of Mathematics and Music.
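The FDM itself is well beyond a short example, but the basic task – pulling a vibrato rate of a few hertz out of a one-second note – can be sketched with elementary spectral tools. This numpy-only snippet synthesizes a vibrato tone and recovers the modulation rate from the instantaneous frequency of the analytic signal (all parameter values are illustrative, not from the paper):

```python
import numpy as np

fs = 8000                            # sample rate, Hz
t = np.arange(fs) / fs               # one second of samples
f0, depth, rate = 440.0, 0.03, 6.0   # carrier (Hz), ~half-semitone depth, vibrato rate (Hz)

# frequency-modulated sine: instantaneous frequency f0 * (1 + depth*sin(2*pi*rate*t))
phase = 2*np.pi * (f0*t - (f0*depth / (2*np.pi*rate)) * np.cos(2*np.pi*rate*t))
x = np.sin(phase)

# analytic signal via an FFT-based Hilbert transform (even-length signal)
X = np.fft.fft(x)
h = np.zeros(len(x))
h[0] = h[len(x)//2] = 1.0
h[1:len(x)//2] = 2.0
analytic = np.fft.ifft(X * h)

# instantaneous frequency from the unwrapped phase of the analytic signal
inst_freq = np.diff(np.unwrap(np.angle(analytic))) * fs / (2*np.pi)

# the vibrato rate is the strongest oscillation in the instantaneous frequency
spec = np.abs(np.fft.rfft(inst_freq - inst_freq.mean()))
freqs = np.fft.rfftfreq(len(inst_freq), 1/fs)
vibrato_rate = freqs[np.argmax(spec)]
print(round(vibrato_rate, 1))
```

With only a second of signal the frequency resolution of this naive approach is about 1 Hz; the appeal of the FDM is precisely that it resolves closely spaced resonances from such short records far better than a plain Fourier transform.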

Women turn to biology after taking basic physics course

A survey of almost 10,000 undergraduates at the University of Auckland in New Zealand has found that women are more likely to choose to study life sciences after taking a first-year physics course, rather than progressing further in the physical sciences. Carried out by Auckland physicist Dion O’Neale and colleagues, the study did not find the same effect in men. The research could help to explain why women are significantly underrepresented in fields like physics, but not in subjects such as biology or medicine. The team examined five years of records of all students who took at least one physics course, looking in particular for relationships between gender, course selection and performance. They found that male students were twice as likely to go on to study physical-science subjects such as physics after taking a stage-one “Advancing Physics” course, while women were around 2.5 times more likely to progress in life-science subjects instead. The researchers propose that women are socially discouraged from seeing physics as a realistic and suitable study option. They also say that female students may feel they would be better off in the life sciences, where there are more women to act as role models, the perception of a better work/life balance and the sense that stereotypical women’s traits might be more valued. The study is described in a preprint on arXiv.
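Figures such as “twice as likely” are relative rates. With made-up counts (for illustration only – these are not the study’s data), the arithmetic looks like this:

```python
# hypothetical counts, for illustration only -- not the study's data
men_total, men_to_physics = 1000, 200
women_total, women_to_physics = 1000, 100

rate_men = men_to_physics / men_total        # fraction of men who progress
rate_women = women_to_physics / women_total  # fraction of women who progress

relative_rate = rate_men / rate_women
print(relative_rate)   # 2.0 -- "twice as likely" in this made-up example
```

The study’s actual analysis controls for course performance and other variables, so its reported ratios are adjusted figures rather than raw quotients like this one.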

CMS completes particle-tracker upgrades

Photograph of the new pixel tracker being installed

Scientists at CERN’s Large Hadron Collider (LHC) have completed a major upgrade to the Compact Muon Solenoid (CMS) – one of four main detectors at the facility. The work focussed on CMS’s particle-tracking system – consisting of the pixel tracker and the strip tracker – which determines the trajectories of charged particles. The upgrade will allow CMS to take advantage of numerous ongoing improvements to the LHC. The bulk of the work involved removing the original pixel tracker – which makes up the innermost part of CMS – and replacing it with a brand-new system. The new pixel tracker is a four-layer device with 124 million silicon pixels, whereas the device it replaced had only three layers. The additional layer is designed to cope with the higher collision rates planned for upcoming runs of the LHC. The upgrades will increase CMS’s ability to make precise measurements of the properties of the Higgs boson and aid in the search for physics beyond the Standard Model. Engineers began the upgrade in December, when the LHC was shut down for the winter. The LHC is expected to resume collisions in April.

 

  • You can find all our daily Flash Physics posts in the website’s news section, as well as on Twitter and Facebook using #FlashPhysics. Tune in to physicsworld.com later today to read today’s extensive news story on missing dark matter.

APS sees friction as fracture, cat pictures and brain implants

Cats and DFT: Thomas Baker chats about machine learning and DFT (Courtesy: Sarah Tesh)

By Sarah Tesh in New Orleans, Louisiana, US

So the first day of the APS March Meeting has been and gone, and the second is nearly at an end. As this was my first conference as a journalist rather than a scientist, I was definitely as nervous as some of the speakers looked. The conference centre is huge, there are thousands of people and almost as many talks – a rather daunting prospect for a newbie. Thankfully there were some very interesting press talks, covering a variety of topics.

The first session began with Jay Fineberg from the Hebrew University of Jerusalem in Israel talking about “friction as fracture”. While we all learn about friction at school, the fundamental physics behind it remains shrouded in mystery. Fineberg therefore treats the problem as the fracture of contact points between surfaces – an approach that is particularly useful for studying the motion of tectonic plates and, with them, earthquakes. As Fineberg points out, seismologists have no idea about conditions deep in the ground at a fault. He and his team hope to work out “what makes earthquakes tick”.


Flash Physics: Star in tight orbit around black hole, nanocubes detect nitrogen dioxide, Born’s rule prevails

Star in tight orbit around black hole

Astronomers have spotted what they think is a star in the closest known orbit around a black hole – just 2.5 times the separation between the Earth and the Moon. Located in the 47 Tucanae globular cluster about 14,800 light-years away, the white dwarf is seen to oscillate in X-ray brightness with a period of about 28 min – which astronomers believe corresponds to its orbital period around a black hole. The observation was made using NASA’s Chandra X-ray Observatory and NuSTAR telescope along with the Australia Telescope Compact Array. “This white dwarf is so close to the black hole that material is being pulled away from the star and dumped onto a disc of matter around the black hole before falling in,” says team member Arash Bahramian, from the University of Alberta in Canada and Michigan State University in the US. The team believes that the system could have been formed when a black hole smashed into a red giant star. Gas from the red giant was ejected during the collision, leaving behind the white dwarf, which was drawn closer to the black hole over time. The binary system is called X9 and will be described in Monthly Notices of the Royal Astronomical Society and in a preprint on arXiv.
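Taking the two quoted round numbers at face value, Kepler's third law gives a rough combined mass for the binary. Both figures are approximate, so treat this as an order-of-magnitude sketch of the method rather than a result:

```python
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30       # solar mass, kg
EARTH_MOON = 3.844e8   # mean Earth-Moon distance, m

P = 28 * 60            # quoted orbital period, s
a = 2.5 * EARTH_MOON   # quoted orbital separation, m

# Kepler's third law: total mass of the binary from period and separation
M_total = 4 * math.pi**2 * a**3 / (G * P**2)
print(M_total / M_SUN)   # combined mass in solar masses
```

Because both inputs are rounded, the number that comes out should be read only as an illustration of how period and separation constrain the mass; the published analysis uses far more careful modelling.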

Iron nanocubes detect nitrogen dioxide

Photograph of researchers with a model iron nanocube

Nanometre-sized cubes of iron could be used to sense the presence of nitrogen dioxide, thanks to a manufacturing advance inspired by the ancient Lycurgus cup. The famous cup was made in the 4th century by Roman artisans, who embedded metal nanoparticles in glass to create structural colour that changes hue depending on which way light is shone through it. Now, Jerome Vernieres and colleagues at the Okinawa Institute of Science and Technology in Japan have come up with a way of making large numbers of iron nanoparticles that are all the same size – something that had proven difficult to do in the past. This uniformity is important because it should allow the nanoparticles to be used to detect specific molecules such as nitrogen dioxide. The team’s manufacturing process involves firing an argon plasma at a piece of iron, which knocks out iron atoms that then join together to form nanoparticles. A magnetic field is used to achieve precise control over the plasma, which allows the team to create nanoparticles of a specific size. Once they had made their nanoparticles, the researchers noticed that the electrical resistance of the tiny cubes changed in the presence of nitrogen dioxide. They then joined forces with others at the University of Toulouse in France to create a prototype nitrogen-dioxide sensor that they say could be useful for a range of applications, including the diagnosis of asthma and the detection of environmental pollution. The research is described in Advanced Functional Materials.

Born’s rule prevails in five-path interferometer

Schematic of the five-path interferometer

An important tenet of quantum mechanics is that interference always occurs between pairs of paths in an interferometer – and that higher-order interference effects between more than two paths do not occur. This is a result of Born’s rule, which was put forth by Max Born in 1926 and defines how the result of a measurement on a quantum system is related to its wave function. Any deviation from Born’s rule would identify a significant flaw in quantum theory and therefore be of great interest to physicists. Now, Thomas Kauten and colleagues at the University of Innsbruck and University of Vienna in Austria have put Born’s rule to the test in a five-path interferometer. By implementing single-photon detection, the team was able to run the interferometer in the “quantum regime” with one photon at a time passing through it. The researchers were able to exclude the existence of higher-order interference effects in this quantum regime to an uncertainty of 2 × 10⁻³, which they say is much better than previous attempts. The research is described in New Journal of Physics.
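Born's rule – the detection probability is the squared magnitude of the summed path amplitudes – implies that any multi-path interference pattern decomposes into pairwise terms. The standard test statistic is the Sorkin combination, sketched here for three paths (the same identity extends to five):

```python
import numpy as np

rng = np.random.default_rng(1)

def P(*amps):
    # Born's rule: detection probability is the squared magnitude
    # of the sum of the contributing path amplitudes
    return abs(sum(amps))**2

# three random complex path amplitudes
a, b, c = rng.normal(size=3) + 1j * rng.normal(size=3)

# third-order Sorkin combination: identically zero if Born's rule holds,
# because |a+b+c|^2 contains only pairwise cross terms
kappa = P(a, b, c) - P(a, b) - P(a, c) - P(b, c) + P(a) + P(b) + P(c)
print(abs(kappa))   # zero up to floating-point rounding
```

The Innsbruck–Vienna experiment effectively measures this combination with real photon-count statistics, which is why the result is quoted as an upper bound on any higher-order term rather than an exact zero.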

 


Space-race technology revived to generate greener energy

Thermionic energy conversion – a technology first developed in the 1950s to power spacecraft – could soon be significantly more efficient, thanks to two key innovations made by researchers in the US. The work is at an early stage, but the researchers believe that the technology could eventually produce electricity in situations where a traditional steam turbine would not be feasible. As well as making conventional power stations more efficient, the energy conversion could also lead to more environmentally friendly sources of electrical energy.

In the 19th century, physicists noticed that a hot negatively charged filament will tend to discharge in a vacuum, whereas a hot positive filament will not. After J J Thomson’s discovery of the electron in 1897, the British physicist Owen Richardson won the 1928 Nobel Prize for Physics for working out that electrons with enough energy to escape the hot negative electrode will travel through a vacuum to a positive electrode. This can work without an applied potential: electrons escaping the hot electrode (the emitter) move randomly through the vacuum until they are absorbed by the cold electrode (the collector), creating a potential difference between them. In this way, a device can convert a temperature difference into electrical energy. This effect was used in the 1950s to develop small “thermionic energy converters” with no moving parts, for use on-board spacecraft. However, the efficiency of the devices was too low for practical applications on Earth and the technology has fallen into disuse.
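Richardson's result is usually written as the Richardson–Dushman law, J = A T² exp(−W/kT), where W is the emitter's work function. A quick sketch of how steeply emission rises with temperature (the 4.5 eV value below is a typical figure for tungsten, not a number from this article):

```python
import math

A = 1.2017e6      # Richardson constant, A m^-2 K^-2 (theoretical value)
K_EV = 8.617e-5   # Boltzmann constant, eV/K

def emission_current_density(T, work_function_eV):
    """Richardson-Dushman current density (A/m^2) at temperature T (K)."""
    return A * T**2 * math.exp(-work_function_eV / (K_EV * T))

# emission rises steeply with temperature for a ~4.5 eV tungsten-like surface
for T in (1500, 2000, 2500):
    print(T, emission_current_density(T, 4.5))
```

The exponential dependence on W/kT is why the work-function engineering described below matters so much: small changes in W produce order-of-magnitude changes in the balance of emission and losses.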

Solving two problems

In the new research, Roger Howe and colleagues at Stanford University in California have addressed two efficiency problems that had limited the use of thermionic energy converters. First, they reduced the work function of the collector – the energy required to transfer one electron from the material to the vacuum surrounding it. An electron entering the collector from the inter-electrode vacuum releases this energy to the atomic lattice as heat, so a high collector work function dramatically reduces the device efficiency; for the traditional collector material, tungsten, the work function in the same experimental set-up is 2.15 eV. The researchers exploited the fact that, unlike in metals, the work function of graphene can be tuned by applying a voltage to it with respect to a conductor. They deposited a 20 nm dielectric layer on top of a doped silicon gate, before covering this with monolayer graphene. By applying a voltage to the gate, the researchers increased the electron density in the graphene, reducing its work function to as little as 1.69 eV.
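In the idealized picture of a thermionic converter, the output voltage is roughly the emitter–collector work-function difference divided by the electron charge, so the benefit of the gated-graphene collector can be estimated directly. This is a back-of-the-envelope sketch, not the authors' analysis:

```python
# collector work functions quoted in the experiment (eV)
W_TUNGSTEN = 2.15   # plain tungsten collector
W_GRAPHENE = 1.69   # gated graphene-covered collector

# in the ideal-converter picture, output voltage ~ (W_emitter - W_collector)/e,
# so lowering the collector work function recovers the difference per electron
# as useful output instead of heat dumped into the collector lattice
gain_eV = W_TUNGSTEN - W_GRAPHENE
print(round(gain_eV, 2))   # extra energy per collected electron, in eV
```

Roughly half an electronvolt per electron is a large fraction of the energy budget of a device operating at these work functions, which is why this single change contributes so much to the efficiency gain reported below.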

Second, the researchers minimized the space-charge effect, whereby electrons in the gap between electrodes repel each other, pushing electrons back towards the emitter. To reduce this, they used nanofabrication techniques to narrow the inter-electrode gap to 17 μm, ensuring that electrons reach the collector – where they are extracted into the external circuit – as quickly as possible. As a bonus, the narrow gap allowed an atomic layer of barium atoms to evaporate from the coating of the tungsten emitter and cover the collector, providing strong surface dipoles that further reduced its work function.

Big improvement

The researchers estimate that up to 9.8% of the heat radiated from emitter to collector at a temperature difference of 800 °C is converted to electricity – a nearly sevenfold improvement on previous technologies. Other devices have reported higher efficiencies (10–20%), but only by using much higher temperature differences, which can be difficult to sustain in a real device. The researchers are now working to increase the efficiency by reducing the collector work function and inter-electrode spacing further, as well as improving the general device stability.
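To put the 9.8% figure in context, it can be compared with the Carnot limit for the same temperature difference. The article does not state the collector temperature, so the room-temperature value below is an assumption:

```python
T_COLD = 300.0            # assumed collector temperature, K (not stated in the article)
T_HOT = T_COLD + 800.0    # emitter held 800 degrees C hotter, as reported

eta_carnot = 1 - T_COLD / T_HOT   # thermodynamic upper bound for this temperature pair
eta_device = 0.098                # reported conversion efficiency

print(round(eta_carnot, 2), round(eta_device / eta_carnot, 2))
```

Under this assumption the device reaches on the order of a tenth of the Carnot limit, a respectable figure for a heat engine with no moving parts.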

Team member Hongyuan Yuan says the device might eventually generate electricity from waste heat in the existing power infrastructure, reducing the amount of coal that needs to be burned, for example. It could also generate electricity from renewable sources such as geothermal or solar thermal energy, where a steam turbine would not be viable. “Thermal energy is one of the most abundant energy sources on our planet,” says Yuan. “The only thing needed for this technology to generate electricity is something that’s very hot.”

“What I really like about the work is that it’s focused on some nano-engineering materials, but they’ve developed the whole device for energy conversion,” says Ali Shakouri of Purdue University in Indiana. “Are the results earth-shattering? Not really – but they bring some ideas into a full device and show the principles working.” He adds, however: “On the theory side, this work is weak. Output power can be modelled, and they should tell us ‘OK, the gap decreases from 1 mm to 17 μm and the output power goes up this much – how well does that match theory?'”

The new technology is described in Nano Energy.

Physicists take over the Big Easy

New Orleans: city with a view

By Sarah Tesh and Tushna Commissariat in New Orleans, Louisiana, US

It is that time of the year again when around 10,000 physicists gather for the American Physical Society (APS) March Meeting and this year we’re in the Big Easy. While yesterday was a jetlag-recovery day, it’s all kicking off today at the sprawling Ernest Morial Convention Center, where more than 9600 papers will be presented during the week.

Despite our sleep-deprived state yesterday, we played the traditional game of “spot the physicist” during our wanderings in the French Quarter. This was made particularly interesting with the simultaneous game of “spot the spring-breakers”. Relaxed, youthful students chatting loudly about their late-night escapades were a stark contrast to academics looking anxious and lost while over-burdened with poster tubes, suitcases and laptop bags.


Copyright © 2026 by IOP Publishing Ltd and individual contributors