
A theremin fit for a gerbil, hairdos for physicists and the trouble with Richard Feynman

“How we created spooky experimental music in a superconductor lab”: what physicist could resist clicking on this story, which appeared on the Guardian website earlier this week? Written by the physicist-turned-computational-biologist Andrew Steele, the article describes how Steele and a few pals converted a magnetic sensor into a musical instrument. Like the theremin, which is played by waving your hands around an antenna, this new instrument responds to the player’s motion. But because the sensor was optimized for studying superconductors rather than creating freaky mood music, Steele explains that the “instrument covered three octaves in less than a centimetre of hand movement”. He suggests that playing the instrument should probably be left to a talented gerbil rather than to talented superconductor researchers. You can listen to Steele’s attempt at making music on SoundCloud.


The July 2014 issue of Physics World is out now


Unless you’re prepared to modify our understanding of gravity – and most physicists are not – the blunt fact is that we know almost nothing about 95% of the universe. According to our best estimates, ordinary, visible matter accounts for just 5% of everything, with 27% being dark matter and the rest dark energy.

The July issue of Physics World, which is out now in print and digital formats, examines some of the mysteries surrounding “the dark universe”. As I allude to in the video above, the difficulty with dark matter is that, if it’s not ordinary matter that’s too dim to see, how can we possibly find it? As for dark energy, we know even less about it, other than that it is causing the expansion of the universe to accelerate and hence making certain supernovae dimmer (because they are further away) than we’d expect if the cosmos were growing uniformly in size.


Charging up with jumping droplets

Superhydrophobic surfaces can be used to harvest small amounts of energy from the atmosphere, according to new research carried out by scientists at the Massachusetts Institute of Technology (MIT). The team has developed a device that uses jumping droplets of condensed water vapour to carry charges between two sets of metal plates. The device could become a handy tool in remote areas, charging mobile phones or powering environmental sensors.

As the name suggests, superhydrophobic surfaces are highly effective at repelling water. This ability is down to their nano-scale topography, which limits contact – and therefore adhesive forces – between the surface and overlying liquids. Dominated by cohesive forces, droplets on these surfaces tend to ball up, forming beads that can easily roll away. Droplets of the right size – around 10–100 μm in diameter – can sometimes spontaneously jump, converting their excess surface energy into kinetic energy.
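To get a rough feel for the energy balance behind the jumping, the sketch below is a minimal back-of-envelope calculation in Python, not taken from the study: it assumes two equal droplets coalescing on the surface, and an illustrative few-percent conversion of the released surface energy into kinetic energy.

```python
import numpy as np

# Rough energy balance for a jumping droplet: two equal droplets coalesce and a
# small (assumed) fraction of the released surface energy becomes kinetic energy.
sigma = 0.072          # surface tension of water, N/m
rho   = 1000.0         # density of water, kg/m^3
r     = 25e-6          # radius of each droplet, m (50 um diameter, within the 10-100 um range quoted)
eta   = 0.04           # assumed conversion efficiency of surface energy to motion (illustrative)

area_before = 2 * 4 * np.pi * r**2          # total surface area of the two droplets
R = 2**(1/3) * r                            # radius of the merged droplet (volume conserved)
area_after  = 4 * np.pi * R**2
delta_E = sigma * (area_before - area_after)  # surface energy released on coalescence

mass = rho * 2 * (4/3) * np.pi * r**3
v = np.sqrt(2 * eta * delta_E / mass)
print(f"released surface energy: {delta_E:.2e} J, estimated jump speed: {v:.2f} m/s")
```

With these assumed numbers the droplet leaves the surface at a few tenths of a metre per second, in line with the general picture described above.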

Jump start

These jumping droplets do not travel alone, however – they take a tiny electric charge with them. This effect – first noted by Nenad Miljkovic and colleagues in a paper last year – occurs through the interaction of free charges in the water with the superhydrophobic surface. “An electric double layer forms at the water/coating interfaces, and because of the fast jumping dynamics, charge separation can occur,” says Miljkovic. This, he explains, leaves a small positive charge on the droplets and a small negative charge on the surface.

In their new study, Miljkovic and colleagues have put their earlier discovery to practical use – harvesting atmospheric energy. To do this, they created two sets of interleaved copper plates: one patterned to be superhydrophobic, the other hydrophilic. As water droplets condense on the first set of plates, they merge and leap across to the adjacent hydrophilic plates. With each drop, a small charge is transferred – building up a difference between the two sets of plates that can be used to power a circuit. The design is scalable: the larger the plates, the greater the overall charge transferred.

Remote tool

The concept does have its limitations, though: it requires a humid environment, and the researchers’ prototype provides only 15 picowatts of power per square centimetre of plate. Despite this, Miljkovic is confident that the concept can be fine-tuned to harvest at least one microwatt per square centimetre. If achieved, the harvester might become a useful tool in remote regions. Such a device, 50 cubic cm in size, could charge a mobile phone in about 12 hours – an output comparable with other waste-energy harvesting solutions – and would also collect clean water. One specific application for this technology could be in automated environmental sensors, whose low power requirements might be covered by the morning dew alone. For more extensive uses, however, a cold sink – such as a flowing river – would be required to keep up the condensation.
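As a quick sanity check of those figures, here is a back-of-envelope calculation in Python. The battery capacity is my own assumption (roughly a typical smartphone battery); the power densities and the 12-hour charging time come from the article.

```python
# Back-of-envelope check of the harvesting numbers quoted above.
battery_wh    = 5.0          # assumed ~5 Wh smartphone battery (not given in the article)
charge_time_s = 12 * 3600    # 12 hours, as quoted
target_w_per_cm2  = 1e-6     # hoped-for 1 microwatt per square centimetre
current_w_per_cm2 = 15e-12   # prototype output: 15 picowatts per square centimetre

needed_power_w  = battery_wh * 3600 / charge_time_s    # average power for a 12-hour charge
area_needed_cm2 = needed_power_w / target_w_per_cm2    # plate area required at the target density

print(f"average power to charge in 12 h: {needed_power_w:.2f} W")
print(f"plate area needed at 1 uW/cm^2: {area_needed_cm2:.0f} cm^2 (~{area_needed_cm2/1e4:.0f} m^2)")
print(f"the same area at today's 15 pW/cm^2 would deliver only {area_needed_cm2*current_w_per_cm2*1e6:.1f} uW")
```

The result shows how much interleaved plate area a compact module would have to pack in to meet the 12-hour figure, and how far the current prototype's output is from that target.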

Julie Crockett – a mechanical engineer from Brigham Young University who was not involved in the study – calls the work a “significant contribution to [the] field”, highlighting the potential for optimization of the superhydrophobic structuring. She adds, however, that “questions still remain about the longevity of the process, because of the jumping droplets coating the opposing hydrophilic wall, and the heat flux available for direct condensation of atmospheric moisture”.

“This finding is very exciting because it suggests that electricity could be passively harvested from the condensers found in many engineering systems,” comments Jonathan Boreyko – a mechanical engineer, formerly of the University of Tennessee, who will be joining Virginia Tech in the autumn – who was also not involved in this study. “It would be interesting to see how the performance is altered when using smooth parallel surfaces, which would remove the temperature gradient in the chilled superhydrophobic surface to enable more uniform jumping-drop condensation across the gap to the opposing electrode,” he adds.

Having demonstrated their prototype’s viability, the researchers are now refining their design, aiming for increased power output. They are also working on an aluminium version, which should be cheaper to make than their copper prototype.

The research is described in Applied Physics Letters.

Seismic study digs into volcanic plumbing

Map of the seismic velocity of Mount Fuji

Plumbing problems do not get any bigger or more complicated than a backed-up volcano. But geophysicists studying how the ground below Japanese volcanoes responds to seismic waves have now come up with a technique for identifying where pressurized volcanic fluids build up, allowing them to better anticipate when a volcano may erupt. Scientists already knew that seismic waves from large earthquakes agitate volcanic systems and that large eruptions generally follow a build-up of pressurized fluids at some depth. But they had been unable to pin down the specific physical changes that seismic waves cause.

Now, though, Florent Brenguier of the Institut des Sciences de la Terre in Grenoble, France, and colleagues at the University of Tokyo have used recordings of seismic-wave velocity from the devastating 2011 Tōhoku earthquake to create a map of seismic-velocity changes in its aftermath. Surprisingly, the largest changes were not observed in the area closest to the earthquake epicentre near the Pacific coast but farther inland, immediately below volcanic regions. The image above highlights an anomalously low seismic velocity below the Mount Fuji volcano after the earthquake, despite it being some 500 km from the epicentre. The velocity drops because these volcanic regions are particularly susceptible to earthquake shaking – cracks in the crust open, allowing fluids at high pressure to escape – and such drops could therefore serve as proxies for high-pressure fluid build-up (Science 345 80).
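The basic observable here – a small fractional drop in seismic velocity showing up as a slight stretching of a repeated waveform – can be illustrated with a toy calculation. The Python sketch below uses entirely synthetic data and is far simpler than the analysis in the actual paper; it recovers an assumed 0.2% velocity drop with a grid search over trial stretch factors.

```python
import numpy as np

# Toy illustration: a small drop in seismic velocity delays later arrivals, i.e.
# stretches a repeating waveform in time; a grid search over trial stretch
# factors recovers the relative change dv/v.
t = np.linspace(0, 20.0, 4001)                                # time axis, seconds
reference = np.exp(-0.1 * t) * np.sin(2 * np.pi * 1.0 * t)    # synthetic reference waveform

true_dvv = -0.002                                             # assumed 0.2% velocity drop
current = np.interp(t * (1 + true_dvv), t, reference)         # slower medium -> stretched waveform

trial_dvv = np.linspace(-0.01, 0.01, 2001)

def correlation(eps):
    stretched = np.interp(t * (1 + eps), t, reference)
    return np.corrcoef(stretched, current)[0, 1]

best = trial_dvv[np.argmax([correlation(e) for e in trial_dvv])]
print(f"recovered dv/v = {best:.4%} (true value {true_dvv:.4%})")
```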

Scattered neutrons could mimic DAMA/LIBRA’s ‘dark matter’ modulation

For the last 16 years, researchers at the DAMA/LIBRA experiment in Italy have seen a controversial annual oscillation in the signal from their dark-matter detector. This type of variation would be seen if the Milky Way galaxy were wreathed in a “halo” of dark matter. But apart from the CoGeNT dark-matter experiment in the US, no other dark-matter search has seen a similar effect. Now, a physicist at Durham University in the UK has proposed an alternative source for the modulation in the form of neutrons, which are knocked out of atoms by muons and neutrinos scattering in the rock or shielding material around DAMA/LIBRA.

The most recent cosmic microwave background (CMB) data from the Planck mission reveal a universe that is composed of 26.8% dark matter and 68.3% dark energy, with less than 5% of “normal” visible matter, such as galaxies and gas clouds. Dark matter is thought to interact weakly with ordinary matter, making dark-matter particles extremely difficult to detect. The DAMA detector – located deep underground at the Gran Sasso National Laboratory – reported the first signs of dark matter in 1998, and further data gathered over the years have cemented the result at a statistical significance of 9.3σ – well beyond the 5σ that usually signifies a discovery in particle physics.

Summer high

The modulation, which peaks in the month of May, is thought to come about as the solar system sweeps through a dark-matter halo enveloping the Milky Way. The peak occurs in the northern-hemisphere summer because the tangential velocity of the Earth as it orbits the Sun is in the same direction as the motion of the solar system at that time of the year. This means that the number of collisions detected by DAMA should be at a maximum in summer and then drop in the winter.
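The expected signature is usually written as a constant rate plus a small cosine term with a period of one year. The Python sketch below illustrates this with an assumed few-percent modulation amplitude and a late-May peak, as quoted in this article; the numbers are illustrative rather than DAMA’s fitted values.

```python
import numpy as np

# Sketch of the expected annual-modulation signature: a constant event rate plus
# a small cosine term peaking when the Earth's orbital velocity adds to the solar
# system's motion through the halo. Amplitude and phase are illustrative only.
days = np.arange(365)
mean_rate    = 1.0      # arbitrary units
mod_fraction = 0.02     # assumed few-percent modulation amplitude
peak_day     = 144      # ~late May, per the phase quoted in the article

rate = mean_rate * (1 + mod_fraction * np.cos(2 * np.pi * (days - peak_day) / 365.25))
print("maximum rate on day", days[np.argmax(rate)], "minimum on day", days[np.argmin(rate)])
```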

Now, however, Jonathan Davis of Durham University has developed a new model to explain the DAMA/LIBRA signal without invoking dark matter. Rather, he shows that neutrons scattering in the detector could easily produce an annual signal. The neutrons are released when solar neutrinos and atmospheric muons scatter in the shielding material or the rock that envelops the DAMA/LIBRA set-up.

Scattered signal

The muons come from cosmic rays decaying in the atmosphere and their rate varies across the year, peaking around 21 June. The solar-neutrino rate also varies annually, but peaks around 4 January. “When combined, this means that the neutrons from both of these sources also have a rate that varies annually but peaks somewhere in between the two and can match the DAMA phase that is in late May,” explains Davis. “There is an annual peak because the interference between the muons and neutrinos is not perfect. So, they don’t exactly cancel…the idea is that because both of the constituent signals peak at different times, when they add up there is some cancellation but this is not total,” he says, further explaining that it is the remnant signal after the cancellation that peaks around late May, just like the DAMA data.
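The partial-cancellation argument is easy to illustrate numerically. In the Python sketch below, the muon- and neutrino-induced neutron rates are modelled as two cosines peaking on 21 June and 4 January respectively; the relative amplitudes are illustrative values chosen for this sketch (not figures from Davis’s paper) to show how the residual sum can end up peaking in late May with a much reduced amplitude.

```python
import numpy as np

# Two annually modulated neutron sources with nearly opposite phases partially
# cancel; the residual modulation peaks between the two individual peaks.
days = np.arange(365)
omega = 2 * np.pi / 365.25

muon_mod     = 1.00 * np.cos(omega * (days - 172))   # muon-induced term, peaks ~21 June (assumed amplitude)
neutrino_mod = 0.68 * np.cos(omega * (days - 4))     # neutrino-induced term, peaks ~4 January (assumed amplitude)

total = muon_mod + neutrino_mod
peak = days[np.argmax(total)]
print(f"combined modulation peaks on day {peak} (late May is around day 145),")
print(f"with amplitude {total.max():.2f} compared with 1.68 if the two peaked together")
```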

Muon mimic

While the idea of muons mimicking the DAMA signal is not new, the timing of muons in isolation does not match the DAMA data and the idea was dismissed. But Davis’s model solves this problem by adding the effect of solar neutrinos. Davis told physicsworld.com that it is currently unclear how important a role the lead shielding surrounding the experiment plays in neutron production; however, it is likely to be significant. Lead is particularly good at producing neutrons from neutrinos and muons. “Indeed, the cross-section – which gives you the interaction rate – for neutron production from neutrinos and muons is particularly high. Also the lead shielding is very close to the DAMA detector,” he says. He also points out that neutrons produced in this way have a spectrum that tends to peak at low energy, similar to what one would expect from dark matter, meaning that “the signals can be easily confused”.

Davis acknowledges the fact that neutrinos – often referred to as the ghosts of matter – are known particularly for their ability to not interact with matter as they pass through it. But he says that as the DAMA detector is particularly sensitive to low-energy recoils, it will pick up the neutrons produced by these neutrinos. “Also, most other experiments, such as LUX, have more advanced shielding than DAMA, so they would be able to stop the neutrons before they get to the detector,” he says. He also states that other neutrino experiments do see the modulation he considers – he points to papers from the Borexino and SuperKamiokande experiments, which measure the modulation caused by neutrinos precisely. “However, these are directly down to neutrino scattering, not neutrons from neutrinos. The phase should be the same though,” he says.

Other experiments?

When it comes to the CoGeNT experiment, which also sees an annual modulation, Davis is intrigued. “In principle, the model would be the same, however, since CoGeNT is in a different lab to DAMA, the phase of the signal would be different. CoGeNT has had a lot of trouble recently with surface-event backgrounds, so we will have to wait and see as it is not clear what it is seeing,” he cautions. Other dark-matter experiments, he says, have not seen the signal, probably thanks to a combination of shielding and thresholds. Because most of the more recent experiments employ more effective neutron shields than DAMA, the neutrons that would make up the DAMA signal in Davis’s model would not reach detectors such as CDMS, LUX or XENON100. Also, DAMA is more sensitive to low-energy recoils than most experiments, and so might be more susceptible to the muon/neutrino signal.

To check the accuracy of Davis’s model, the DAMA/LIBRA collaboration could study in more detail how the phase of its signal changes with the energy of the events. According to Davis, this has been studied before, and DAMA found that the phase does change with energy – something that you would not expect from standard dark matter but that is explained by his model. Also, with the increasing amount of data that DAMA will collect in the coming years, the researchers will be able to look for “an additional mode with a period of 11 years, which would be expected if the signal is due to muons (it comes from solar activity), but not for dark matter”, Davis says.

Davis is also keen to emphasize the importance of future dark-matter experiments – such as DM-Ice, KIMS, and ANAIS – which are looking to replicate DAMA. “My model gives them something they can test as a comparison with dark matter,” he says. “This is especially interesting for DM-Ice because it will be in Antarctica, so the muons will have the opposite phase.”

A preprint of the research is available on the arXiv server.

UPDATE: The paper has now been published in Physical Review Letters.

A century of general relativity

It doesn’t seem that long since “Einstein Year”, the worldwide celebration held in 2005 to commemorate the great physicist’s extraordinary scientific output a century earlier. 2015 will mark another important Einstein anniversary: the centenary of the presentation of his general theory of relativity.

Among physicists, this theory is regarded as Einstein’s greatest achievement, a towering scientific theory that remains unsurpassed in terms of its originality, elegance and predictive power. By replacing Newton’s force of gravity with a warping of space–time, Einstein transformed our view of space, time, force and gravitation, a revolution that continues to deliver astonishing insights into the world of the very large.

The general theory stands alongside quantum theory as one of the great pillars of 20th-century physics, but where quantum theory had a long and difficult birth, with multiple modifications and many “parents”, general relativity sprang from the mind of one man and has remained virtually unchanged ever since. Over the years, the theory has provided the framework for almost all of our knowledge of the universe, from the “Big Bang” model of the evolution of the universe to our understanding of black holes.

In The Perfect Theory, Pedro Ferreira provides a timely, expert and highly readable history of general relativity. An astrophysicist at the University of Oxford, Ferreira is renowned for his work on the problem of galaxy formation and his research into alternative theories of gravity. His biography of the general theory is affectionate and meticulous, although the narrative is that of a physicist rather than a mathematician or relativist. This approach is neatly summed up in the book’s excellent prologue, where Ferreira writes that “The reward for harnessing Albert Einstein’s general theory of relativity is nothing less than the key to understanding the universe, the origin of time, and the evolution of all the stars and galaxies in the cosmos.” At the same time, the book is firmly aimed at a public audience and is a welcome addition to popular books on the topic such as Jean Eisenstaedt’s The Curious History of Relativity or God’s Equation by Amir Aczel.

From a cosmologist’s point of view, the story of general relativity can be usefully divided into five distinct periods. The first era saw the formulation of the theory and its initial application to the universe as a whole, resulting in the “static” cosmic models of Einstein and Willem de Sitter. In the second epoch, time-varying models of the cosmos were proposed by Alexander Friedmann and Georges Lemaître; such models were further explored by Einstein, De Sitter, Lemaître, Howard Percy Robertson, Richard Tolman and Arthur Eddington in the wake of Edwin Hubble’s observations of the recession of the galaxies in 1929. Little theoretical progress was made in relativity during the third period (1940–1960), but this era did see the proposal of a hot, radiation-dominated infant universe by George Gamow, Ralph Alpher and Robert Herman, and the rise of a rival “steady-state” cosmology proposed by Fred Hoyle, Hermann Bondi and Tommy Gold. Next came the “golden decade” of 1963–1973, which saw the discovery of radio-galaxies, quasars, pulsars and the cosmic microwave background, and parallel progress in theoretical work on singularities. This period was followed by the modern era of precision measurements of the cosmic microwave background and the emergence of theories such as cosmic inflation and dark energy.

Ferreira covers each of these periods in an engaging, conversational way. He does not skimp on detail in most instances, yet the lively narrative holds the reader’s attention throughout. I particularly enjoyed the section on the renaissance of general relativity, from John Wheeler’s famous presentation “The issue of the final state” at the 1963 Texas Symposium on Relativistic Astrophysics to the furious efforts of the world’s top relativists at Cambridge, Princeton and Moscow to crack the problem of black holes in the 1970s. Another unusual section is the description of attempts by some theorists to reinstate the cosmological constant before the discovery of the universe’s accelerating expansion in 1998.

That said, there is some unevenness in the level of detail in the narrative, no doubt owing to considerations of length. For example, there is surprisingly little discussion about the plethora of dynamic cosmic models that were proposed in the early 1930s, almost no details are given of the pioneering work of the Gamow group in the 1940s and very little information is presented on modern measurements of the cosmic microwave background by the COBE, WMAP or Planck satellites. On the other hand, the author does present an intriguing chapter on alternative theories of gravity that have recently come to the fore, and expertly conveys the excitement of future experiments that “could confirm or refute the fundamental tenets of general relativity”.

With regard to audience, the book will be an enjoyable read for physicists in any field. Some physics teachers and students might be disappointed by the complete absence of equations and diagrams, and wonder what the mathematical machinery of general relativity looks like – in this respect, the story is less satisfying than the author’s earlier book The State of the Universe. Mathematically inclined readers might also be disappointed that there is very little description of the theoretical development of general relativity by key players such as Hermann Weyl and Cornelius Lanczos in the 1920s, or John Lighton Synge and William McCrea in the 1950s and 1960s. On the other hand, the book is very approachable for a lay audience, despite the level of historical detail.

Historians of science may notice some minor historical errors. For example, it is known from Einstein’s travel diaries that his conversion to the expanding universe was influenced by discussions with Tolman (not Hubble, as stated). It is also known that Einstein first formally embraced the expanding universe and banished the cosmological constant in the Friedmann–Einstein model of 1931 (not the Einstein–de Sitter model of 1932 as stated). The existence of a universal background radiation was first predicted by Alpher and Herman (not Gamow). Finally, it is not made clear that Alan Guth’s proposal of cosmic inflation addressed a theoretical puzzle concerning spatial flatness, rather than an observational problem. Useful historical notes are given for each chapter at the end of the book, but they are easy to miss because they are not flagged in the main text.

The above are minor criticisms. In this book, the story of general relativity is told with clarity and authority, and the narrative speeds along at a cracking pace. I particularly admired how the book opens with Eddington’s 1919 address to the Royal Society in which he announced the observation of a warping of space by our Sun – an experiment that was carried out on the island of Principe during an eclipse – and closes with a description of the author’s visit to the island 90 years later to lay a plaque in honour of that landmark experiment. All in all, this is a masterful, well written and timely addition to the literature on the greatest theory of them all.

  • 2014 Little, Brown/Houghton Mifflin Harcourt £20/$28 hb 304pp

Portugal slashes funding for physics research

At least half of all Portugal’s scientific research units will receive only a limited amount of cash during the next five years from the country’s main funding agency, the Science and Technology Foundation (FCT). An evaluation process carried out by the agency in collaboration with the European Science Foundation (ESF) assessed 322 proposals in science, assigning each one of six grades – “exceptional”, “excellent”, “very good”, “good”, “fair” or “poor”.

The process resulted in 71 of the 322 proposals being ranked “poor”; those will receive no funding. A further 83 were ranked “good” or “fair”, and will now get a maximum of €40,000 per year from 2015 to 2020 – for most of them a substantial cut in funding. The remaining 52% were graded “exceptional”, “excellent” or “very good”, and will now compete for a total of €50m in funding per year – about the same amount as in the previous evaluation five years ago – in a second round of evaluation this autumn, which could, however, see more proposals downgraded.

Bibliometric evaluation

The FCT evaluates the country’s research every five years. In the previous evaluation, only 16% of proposals were denied funding. This round was carried out for the first time in collaboration with the ESF, with the FCT also asking the publisher Elsevier to provide bibliometric data about the researchers involved.

Physics in Portugal is being badly damaged
Carlos Fiolhais, Coimbra University

The results of the first round have been met with outrage by the Portuguese scientific community. “Physics in Portugal is being badly damaged,” says Carlos Fiolhais, a physicist at Coimbra University. “The government is trying to shut down very active physics research units.” In a statement, the Physics Society of Portugal also expressed concern, stating that “the majority of units in the centre and north of Portugal are going to be eliminated, or heavily constrained”.

Critics also point out the mismatch between the evaluation and the actual performance of the units. For example, the Center for Nuclear Physics and the Center of Physics and Technological Research, both based in Lisbon, have the highest numbers of papers and citations per researcher in physics in the country, yet they have not progressed to the second round.

“We were graded ‘excellent’ in the previous evaluation and our bibliometric indexes have improved since then, but still we have been graded ‘good’ now,” says Nuno Miguel Reis Peres, the director of the Center of Physics at the universities of Minho and Oporto. This now means that the institute’s cash from the FCT will fall from €380,000 to just €40,000 per year.

The FCT and the ESF have defended the quality of the evaluation. “The bibliometric output is only relevant to part of the evaluation. The strategic research plans proposed also had to be convincing to the panels,” Nicholas Walter, a senior science officer at the ESF who reviewed the FCT’s process, told physicsworld.com.

Prioritizing excellence

Indeed, the Portuguese government insists that there have been no budget cuts and that the exercise is simply a matter of prioritizing excellence. “I think it is a deliberate effort to redirect funding to areas that the FCT and the government feel are going to be competitive, and where innovation is likely to occur,” says biophysicist Alex Quintanilha, who is a member of the European Commission’s Scientific Advisory Panel. “Social sciences, humanities and certain basic sciences are less important, in their view.”

While Quintanilha says that evaluation is necessary, he is concerned that the panellists evaluating the units were not experts in the same field. However, FCT spokesperson Ana Godinho maintains that each application was reviewed “by at least two area-specific experts” before the proposal was sent to the panels.

Rebuilding Tesla Tower

Two Russian physicists have turned to the fundraising website Indiegogo in the hope of raising a cool $800,000 to build a Tesla Tower.

Leonid and Sergey Plekhanov – graduates of the Moscow Institute of Physics and Technology who now work in industry – want to reconstruct the famous Wardenclyffe Tower, which was built by the inventor and engineer Nikola Tesla in an attempt to commercialize long-distance wireless energy transmission.


Dark-matter searches get US government approval

Two key US federal funding agencies – the Department of Energy’s Office of High Energy Physics and the National Science Foundation’s Physics Division – have revealed the three “second generation” direct-detection dark-matter experiments that they will support. The agencies’ programme will include the Super Cryogenic Dark Matter Search-SNOLAB (SuperCDMS), the LUX-ZEPLIN (LZ) experiment and the next iteration of the Axion Dark Matter eXperiment (ADMX-Gen2).

“We are pleased to announce that the joint DOE/NSF second-generation programme will include the LZ and SuperCDMS-SNOLAB experiments with their collective sensitivity to both low- and high-mass WIMPs, and ADMX-Gen2 to search for axions. It will also include a programme of R&D to test and develop technologies for future experiments, consistent with the recent P5 recommendations,” says a joint statement from the two agencies. The P5 recommendations refer to the 2014 report of the Particle Physics Project Prioritization Panel (P5), which was released in May this year. The P5 advisory panel considered which high-energy particle-physics experiments and collaborations the US government should fund over the next five years, in the light of declining spending on particle physics in the US.

Narrowing the field

Programme directors at both the DOE and the NSF have been looking into which of the many dark-matter-detector design bids they have received would give the best results. After narrowing it down to five experiments last year, the final three were chosen based on the advice of an external panel of experts.

Second-generation dark-matter experiments are those that will reach sensitivities at least 10 times better than current detectors can achieve. ADMX-Gen2 will hunt for a dark-matter candidate particle known as the axion.

Both the LZ and the SuperCDMS will look for a type of dark-matter particle called WIMPs – weakly interacting massive particles – across a range of masses. SuperCDMS, which will be underground at SNOLAB in Ontario, Canada, will be particularly good at looking for light WIMPs with masses lower than 10 GeV.

The LZ is the union of the UK-based ZEPLIN programme, which has run three experiments at the Boulby mine during the past decade, and the current most sensitive dark-matter detector – the US-based LUX experiment. Like most of today’s dark-matter experiments, LUX is currently nestled deep in a subterranean cavern at the Sanford Underground Research Facility in South Dakota in the US, to shield it from any background sources – if all goes to plan, LZ will occupy the same space in a few years.

Unprecedented sensitivity

“While the SuperCDMS is wonderful at low masses, the LZ has unprecedented sensitivity across the scale,” says Chamkaur Ghag, from University College London, who was involved in ZEPLIN and now works on both LUX and LZ, and is delighted by the news of the US backing. “None of the competing experiments will be able to match it at that timescale,” he says.

Ghag told physicsworld.com that dark-matter searches must sweep from heavy to light particles because the masses of WIMPs are currently unknown. The technology developed for the current LUX detector was initially pioneered in the UK, “so it is wonderful to see the progress it has now made”, he says. Ghag explains that the UK carried out a similar consolidation process last year, in which LZ was once more chosen as the project of choice. He points out that the added US backing will further boost the already advanced design of LZ, and that there is major UK involvement in the project: three of the 10 main “work packages” – building the cryostat that will hold the experiment, developing the detector’s internal “eyes” (its photomultiplier tubes) and screening all of the materials that will be used in constructing the detector – will be co-led by researchers in the UK, who will also work on other parts of the project.

“This announcement by the US funding agencies DOE and NSF means that the LZ dark-matter experiment is on its way to becoming an approved project in the US. In the UK we are about to submit a proposal to the Science and Technology Facilities Council (STFC) for a three-year construction programme,” says Henrique Araujo, from Imperial College London, who is the principal investigator for LZ in the UK. “LZ uses extremely sensitive liquid-xenon detector technology to search for the very rare and extremely faint interactions of dark-matter particles – that we believe make up most of the mass of the universe.” With all three detectors gearing up to be running within the next few years, it could be only a matter of time before the dark-matter debate is settled once and for all.

Hydrogel matrix makes superhydrophobic surface

A superhydrophobic thin film that can be coated onto virtually any substrate has been synthesized by an international team of researchers. The material, produced using a 3D nanotextured hydrogel matrix, is strong, very flexible and optically transparent. It might be used as a waterproof coating in applications such as self-cleaning windows, antifouling surfaces, and as a filter and sponge to separate oil from water after an industrial oil spill.

Superhydrophobic surfaces efficiently repel water in a phenomenon that is also known as the “lotus effect”. Now, a team led by Guihua Yu and Yi Shi from Nanjing University in China, along with colleagues at the University of Texas at Austin in the US, has made a new type of superhydrophobic surface comprising a 3D silica nanostructure replicated from a hydrogel template. The resulting hybrid coating consists of 3D interconnected nanofibres with uniform diameters of about 100 nm. Its morphology is like that of the lower surface of a lotus leaf, which contains micron-sized bumps that, in turn, are covered with nanoscale hair-like tubes. The nanofibres trap air under any water drops falling on them, creating a surface that repels water.

Stretched to their limit

The films produced by these inherently 3D nanotextured hydrogel templates remain superhydrophobic, even when stretched to their limit – and after more than 5000 stretching cycles at 100% strain. This is a first, because most superhydrophobic surfaces made to date lose their hydrophobic properties when exposed to a strain of more than 30%.

The films can be coated onto virtually any substrate, including metals, cement, wood, fabrics and plastics, thanks to their good wettability. They are also optically transparent (letting through 98% of light falling on them). They might come in handy as screen filters and sponges for separating oil from water, says Yu, because they can absorb up to 40 times their weight in oil.

The researchers made their superhydrophobic films using a polyaniline (PAni) hydrogel template. First, they mixed three precursor solutions together: an aqueous solution of oxidative initiator; an aqueous solution of aniline monomer and phytic acid; and tetraethoxysilane in isopropanol. The polyaniline hydrogel polymerizes and gels fairly quickly, forming a 3D structure within three minutes.

Thanks to the highly acidic, high-water-content hydrogel matrix, the silica layer preferentially coats onto the PAni nanostructured template. Next, the silica layer is chemically modified, or “silanized”, by depositing trichloro(octadecyl)silane onto the template to produce a superhydrophobic surface. The overall process is simple and can be scaled up to produce large amounts of superhydrophobic film, team member Lijia Pan told physicsworld.com.

The Texas–Nanjing researchers say that they are now looking at making super-oleophobic (oil-repelling) surfaces using the same hydrogel matrix template but a different version of their process.

The research is published in Nano Letters.
