
Putting a damper on wobbly bridges

Wobbly footbridges can both delight and terrify pedestrians. Now, researchers in the USA and Russia have developed a model showing how an apparently stable bridge can suddenly show alarming, potentially dangerous wobbles when a certain number of people walk across it.

Designing a footbridge can be a challenge because it can be difficult to predict how a structure will respond to the pounding of many feet at once. The London Millennium Footbridge across the River Thames, for example, opened with great fanfare in 2000, only to close within days after large crowds found the bridge rocking unnervingly as they walked. The bridge remained closed for almost two years while dampers were installed.

Bridges, like any other structures, have natural frequencies of vibration. It is well known that bridges can collapse if large numbers of feet simultaneously excite vibrations at these natural frequencies. The Albert Bridge – built across the Thames in 1873 – bears a sign instructing marching soldiers to break step when crossing. However, ordinary pedestrians do not march in step. Moreover, the Millennium Bridge oscillated left to right, not up and down.

Inadvertent amplification

In 2004 Steven Strogatz of Cornell University in the US, and international collaborators, modelled pedestrians on a bridge as coupled oscillators to show how, if a bridge does begin to vibrate naturally, pedestrians can fall into step with the vibrations to maintain their balance. In doing so, they inadvertently amplify the oscillations. This is analogous to the famous observation, first made by the Dutch physicist Christiaan Huygens in 1665, of pendulum clocks suspended from the same beam becoming synchronized (in anti-phase, in Huygens’s case) through motion transmitted along the beam.

Strogatz’s model has been highly influential in the applied mathematics community, but it cannot provide the precise, quantitative predictions of the conditions under which a given bridge will wobble that would be needed for computer modelling in bridge design. “The existing industry programs used to develop bridges are based on linear calculations,” explains Igor Belykh of Georgia State University in the US. “These are very outdated and cannot capture highly non-linear phenomena like this switching to larger wobbling as a result of very complicated two-way interactions between the pedestrians and the bridge.”

Belykh and colleagues in Russia combined crowd synchronization and bridge dynamics with a biomechanical model of walking humans as inverted pendulums pressing alternately on the ground with left and right feet. They considered many such pendulums on the bridge at once, with a range of frequencies and phases, and formulated two nonlinear differential equations for the amplitude and phase of the bridge’s oscillations.
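The flavour of such a model can be sketched in a few lines of code. The toy below is not the actual system of equations from Belykh and colleagues: the bridge is reduced to a single damped harmonic mode driven by the walkers’ lateral footfall forces, each walker is a bare phase oscillator whose stepping rate is nudged by the deck’s sideways velocity, and every parameter value is an illustrative stand-in.

```python
import numpy as np

def bridge_amplitude(n_people, t_max=200.0, dt=0.005, seed=1):
    """Toy pedestrian-bridge model; all parameter values are illustrative."""
    rng = np.random.default_rng(seed)
    M, C, K = 1.13e5, 1.1e4, 4.73e6    # bridge modal mass (kg), damping, stiffness
    G = 30.0                           # lateral footfall force per pedestrian (N)
    eps = 16.0                         # how strongly gait responds to deck sway
    omega = rng.normal(6.5, 0.3, n_people)      # stepping rates (rad/s), ~1 Hz
    theta = rng.uniform(0, 2 * np.pi, n_people) # stepping phases
    x = v = amp = 0.0
    for step in range(int(t_max / dt)):
        force = G * np.sum(np.sin(theta))        # net sideways force on the deck
        v += (force - C * v - K * x) / M * dt    # semi-implicit Euler update
        x += v * dt
        theta += (omega + eps * v * np.cos(theta)) * dt  # deck motion nudges gait
        if step * dt > 0.8 * t_max:              # record the late-time sway
            amp = max(amp, abs(x))
    return amp
```

Sweeping `n_people` upwards reproduces the qualitative behaviour described above: small crowds barely move the deck, while large ones can lock to its motion and drive a much larger sway.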

Pendulum crowd

The researchers showed that, above a specific critical number of pendulums, a stable solution can appear in which the oscillators all fall into phase and the amplitude suddenly increases: “We were able to give specific estimates of the relationship of this critical size to the natural frequency of the bridge, to the mass of the bridge and to the natural frequency of human walking,” says Belykh. The model predicted oscillations of the Millennium Bridge would occur when more than around 165 people walked on it at once – matching the experimental findings of the engineering company Arup, who designed and fixed the bridge. In future, says Belykh, the work could predict whether the anticipated number of pedestrians using a planned bridge will cause problems, and whether additional dampers or other design modifications are needed. The researchers also developed a more mathematically abstract model that gives very similar predictions and can be solved analytically.

Questions remain, however, and it is uncertain how phase synchronization arises initially. For example, the Clifton Suspension Bridge in Bristol, UK, was closed to large crowds after it developed oscillations when thousands of pedestrians surged across it during the city’s annual Balloon Fiesta. However, the oscillation frequency of this bridge was different from the average pedestrian’s frequency and people did not fall into phase when crossing. The researchers are now investigating these phenomena in collaboration with John Macdonald and colleagues at the University of Bristol, who originally developed the inverted pendulum model.

Adilson Motter of Northwestern University in Illinois says the work fits into a body of complex systems research on bridges and synchronization phenomena that followed the Millennium Bridge affair: “A key step here is that they model what a person does on a bridge that is not very stable and try to understand how the person interacts with the bridge response,” he says.

Moving forward

Henk Nijmeijer of Eindhoven University of Technology in the Netherlands agrees that “it’s a very interesting [piece of work] that brings together aspects of crowd dynamics and wobbly bridge dynamics, and there is still a lot that’s not well understood”. He notes, however, that, in using inverted pendulums to model the pedestrians, the researchers have ignored the crucial fact that pedestrians cross a bridge: “Pendulum clocks are not supposed to walk from left to right or right to left,” he says. “If you miss in the walker the forward motion which must be there, there is something weird in the model.”

The research is described in Science Advances.

Pulsars could reveal nanohertz gravitational waves within 10 years

Evidence for gravitational waves from binary supermassive black holes could be spotted in pulsar-frequency anomalies in the next 10 years, according to researchers in Germany, the UK and the US. Distortions in space–time caused by the passage of gravitational waves should temporarily alter the distance between Earth and certain highly regular pulsars, affecting the periods of the radio pulses received from them.

Frequency threshold

The recent observation of gravitational waves by the LIGO and Virgo experiments represents one of the most important astronomical breakthroughs of the last few decades. But although there is no overstating the potential of this new eye on the cosmos, there are some gravitational-wave sources to which the technique will always be blind.

Earthbound laser interferometers such as LIGO and Virgo are sensitive to gravitational-wave frequencies between 10 Hz and 10 kHz – a range that corresponds approximately to the human-audible sound spectrum. Some astronomical sources produce signals far below the bottom end of this range, however. When two galaxies collide and merge, for example, the gargantuan black holes at their respective centres can end up orbiting each other as a supermassive black-hole binary (SMBHB). Even if the objects are destined ultimately to coalesce, such relationships can last for billions of years, with gravitational waves emitted continuously at frequencies as low as 1 nHz.
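To put that frequency in context: a circular binary radiates its dominant gravitational-wave harmonic at twice the orbital frequency, so the orbit behind a 1 nHz signal takes decades to complete. A quick back-of-envelope check:

```python
# A circular binary emits its dominant gravitational-wave harmonic at
# twice the orbital frequency, so P_orbital = 2 / f_gw.
SECONDS_PER_YEAR = 3.156e7   # roughly one year in seconds

f_gw = 1e-9                               # 1 nHz, the bottom of the band
orbital_period_s = 2.0 / f_gw             # seconds
orbital_period_yr = orbital_period_s / SECONDS_PER_YEAR
print(f"{orbital_period_yr:.0f} years")   # roughly 63 years per orbit
```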

Chance of detection

Writing in Nature Astronomy, Chiara Mingarelli of the Max Planck Institute for Radio Astronomy in Germany and the California Institute of Technology in the US, together with a multi-institutional collaboration, has calculated the likelihood of such an SMBHB being detected against the gravitational-wave background under a range of possible conditions. The group based its analysis on a catalogue of more than 5000 suitably sized “local” galaxies identified by the Two Micron All-Sky Survey (in this context, “local” means within about 730 million light-years of Earth). The researchers then used the results of cosmological simulations conducted by the Illustris project to estimate that about 100 of these galaxies are likely to contain SMBHBs.

In the researchers’ probabilistic simulations based on these local sources, currently available pulsar timing arrays detected gravitational waves in fewer than 1% of runs, which helps explain the lack of positive results obtained so far. Projecting the addition of dozens of new pulsars to the timing arrays over the next decade, and assuming that the gravitational-wave background can be subtracted, the researchers found that continuous gravitational waves from at least one SMBHB should be detectable within the next 10 years.

What philosophers do

There are some questions in physics that no amount of physics research can answer. Why, for example, is doing string theory scientific despite its lack of empirical predictions? How should we interpret quantum mechanics? And what, while we’re at it, is so fundamental about physics? We can answer such questions dogmatically by appealing to textbooks or by making rough and ready pronouncements, but the underlying issues are best clarified with the help of the systematic, critical reflection that philosophy practises.

You’d expect me to say that; I am a philosopher after all. So I’ll ignore all the stupid and half-arsed remarks about philosophy that I’ve heard from physicists who should know better and come straight to the point: philosophers seek to understand, not what physicists know, but how they know it. And because physicists are constantly discovering new ways to know things, philosophy of physics is as alive, valuable and active as physics itself.

Three traditions

Philosophy comes in several traditions, of which three – “analytic”, “pragmatic”, and “continental” – have paid particular attention to physics. They are stylistically and methodologically divergent and, to outsiders, may erroneously look like political parties squabbling over ideological commitments. These traditions, however, have distinct perspectives on science. It’s a bit like how chemists, physicists and engineers have distinct perspectives on atoms: different features of the subject matter are put centre-stage, and analysed in different vocabularies for different ends.

Analytic philosophers, whose founding figures include logicians and mathematicians such as Rudolf Carnap and Bertrand Russell, are mainly interested in the logic of science and the meaning of its basic concepts. Starting with the language of scientific theorizing, they seek the logical conditions for its successes. Analysts tend to agree that concepts and theories are what can be known about the world, and that these are judged by testing models against observations. They focus on the “epistemology” of science – on its conceptual and methodological issues, on the logic of scientific inquiry, on evidence, and on the conceptual structure of its findings. Analysts essentially regard physicists as logicians of the world.

Pragmatic philosophers, whose founders include Charles Peirce (a physicist), William James and John Dewey, are interested in how scientists solve puzzles and what the consequences are. They know that humans don’t spring into being thinking like scientists but apprentice to become them. Pragmatists believe that true scientific ideas make a difference to the world and to science, that inquiry involves doing rather than just cognition, and that scientific work is judged by how well it explains, predicts, and gives us power over (rather than just describes) nature. Pragmatic philosophers view physicists as puzzle-solvers of the world.

Continental philosophers, whose founding figures include Edmund Husserl and Martin Heidegger, approach scientific activity as one way of life, among others, in which humans engage with the world. Continental philosophers agree that scientific activity gives a primacy to things that appear in a certain (framed) way – to things that can be measured and manipulated – and tends to ignore things that do not, such as the powerful metaphors, images and deeply embedded habits of thought that shape our thinking. Continental philosophers agree it’s a mistake to assume that the original human encounter with the world is cognitive, for all ways of being, scientific activity included, spring from a pre-scientific engagement with the world. Humans must be trained, technically and interpretatively, to think like scientists. Continental philosophers view physicists as disclosers of the world insofar as it is knowable and manipulable.

Three approaches

Philosophers of physics take their most important problems not from textbooks but from the practice of physics itself: what problems in physics can’t more research make go away? How a philosopher approaches such problems – the scientific character of string theory, say – depends on their tradition.

Analytic philosophers would start with their traditional description of scientific method – in which testability is essential – and add additional criteria to make string theory conform. Pragmatists wouldn’t be obsessed with whether string theorists were following any specific method, which might change with science itself. Instead, they’d judge string theory by whether it made a difference to physics – whether it yielded insights about existing physics (field theory for instance), and carried forward the aims of the theoretical physics community. Continental philosophers would start by investigating why physicists are stuck on this question – why one group of physicists thinks that ascertaining whether string theory is scientific should be settled by appealing to traditional concepts of “method” and “confirmation” while another group finds it sufficient to consult the actual experience of practising physicists. Each group evidently understands something about physics that cannot yet be articulated to everyone’s satisfaction, making such a controversy deeply revealing about physics itself.

The critical point

What philosophers can do, in short, is to encourage reflection on the practice of physics, especially on urgent and obstinate questions such as “Is string theory scientific?” The various philosophical approaches each bring different kinds of expertise to their analyses of this question and scrutinize in detail different features of what is taking place: logic, puzzle-solving, and interpretative and self-interpretative activity.

Analytic philosophers can stimulate the question of how much the answer has to do with methodology. Pragmatists can argue that answering such a question is less methodological and more a matter of evaluating the consequences of accepting or rejecting string theory. Continentals can point to the relevance of scientists consulting their own experience, so that they are not just reflecting on questions of method, confirmation, inquiry and community consensus, but also considering how the relationship between science and the wider world can become part of the regular practice of science itself.

Is encouraging these kinds of reflection of value to physicists? How could it not be?

20,000 pings under the sea

Friday 16 June 2017, 8.30 p.m., 260 km off the west coast of Vancouver Island in the north-east Pacific Ocean. I’m in a darkened control room on the exploration ship Nautilus. Roaming below, remotely operated vehicles (ROVs) Argus and Hercules are diving into the ocean depths, while their pilots and navigators sit beside me with rapt faces illuminated in the glow of giant HD monitors.

Staring at the screens, our 10-strong exploration team is gripped by a scene straight out of a science-fiction movie: an alien landscape of sculpted and forbidding towers is emerging from the dark abyss. Black, mineral-rich clouds are billowing from a panorama of irregular chimneys created by fissures in the Earth’s surface called hydrothermal vents. Some of these obelisks are giants, such as the aptly named Godzilla, which grew to a height of 45 m before collapsing under its own weight. Others feature delicate fluted branches and fans like underwater candelabra. And in addition to these giant smoking chimneys there is a multitude of smaller vents, some like miniature engines with spewing exhausts, others, emitting cooler, clearer water, like shimmering mirages in a scorched desert.

Photograph of the ROV Hercules while underwater

It is to this seemingly industrial landscape engineered by heat and chemistry that I have come to glimpse the future, to imagine echoes of the technology before me applied to the vast ocean worlds of our solar system where, on moons such as Jupiter’s Europa, we may one day look for new life.

Astronomer turned seafarer

My background, my day job if you like, is astronomy. Based at the University of Victoria in Canada, I use large telescopes and space-based observatories to study the universe at its largest scales, looking at clusters of galaxies and their distribution over the sky. However, I also have a keen interest in astrobiology – the search for life beyond Earth. As that search takes its first, faltering steps, we still have much to learn from our own planet about life in extreme and novel environments. This is why I have come to see the deep-sea hydrothermal vents, for the geochemical energy coursing through the multitude of chimneys and towers also supports truly unique ecosystems in conditions that many believe may exist within the ice moons of Jupiter and Saturn.

I have joined the Nautilus for two weeks as a science communication fellow, one of a group of formal and informal educators with a passion for outreach, selected by the Ocean Exploration Trust (OET) – a non-profit, US-based society that seeks to bring ocean discoveries to an enthused and interested public. Its president and founder is Robert Ballard, a marine geologist who has spent a lifetime exploring the world’s oceans with innovative technology and is best known for discovering the Titanic shipwreck in 1985.

This Nautilus voyage is part of Ocean Networks Canada’s (ONC) Expedition 2017 – Wiring the Abyss. ONC, an initiative of the University of Victoria, operates deep-sea observatories off the coasts of Canada and in the Arctic, and our two-week voyage will be taking us to numerous locations to perform a variety of tasks. Currently though, we are floating above one of their deepest sites – Endeavour. Located at depths of 2200–2600 m, Endeavour is part of the Juan de Fuca (JdF) Ridge – a boundary between the Pacific and JdF tectonic plates. The site is one of five connected via the North East Pacific Time-series Underwater Networked Experiments (NEPTUNE) observatory – an 840 km loop of power and fibre-optic cable that links a network of instruments on the sea floor to scientists ashore. NEPTUNE spans the JdF plate and offers researchers access to a wide range of deep ocean geological environments within a relatively small geographical area.

NEPTUNE was switched on in 2009 and, in many ways, it functions like the local power network in your own town or city. At intervals along the cabled system are clusters of instruments around sites of scientific interest. At these “nodes” the main line voltage of 10,000 V is stepped down to 400 V by transformers and directed to junction boxes, each functioning like the circuit panel in your own home. From here, power at voltages between 15 and 48 V is distributed via extension cables to individual instruments installed on the sea floor. Fibre-optic cables run through the entire network and allow scientists ashore to gather data in real time – with 500 terabytes of data (and counting) stored on ONC’s servers.

In case you were thinking that all of this sounds fairly straightforward, just remember that all of this hardware has to be installed and operated at depths of up to 2600 m below sea level and up to 260 km offshore. The water temperature is just 2 °C above freezing, the pressure of the overlying water is up to 260 atmospheres (about 26 MPa) and it is very, very dark.
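Those pressure figures follow directly from hydrostatics, P = ρgh. A quick check with round numbers for cold seawater:

```python
# Hydrostatic pressure at the deepest Endeavour instruments; round figures.
RHO_SEAWATER = 1025.0    # kg/m^3, typical density of cold seawater
G = 9.81                 # m/s^2, gravitational acceleration
ATM = 101_325.0          # Pa per standard atmosphere

depth_m = 2600.0
pressure_pa = RHO_SEAWATER * G * depth_m
print(f"{pressure_pa / 1e6:.1f} MPa, or about {pressure_pa / ATM:.0f} atm")
# → 26.1 MPa, or about 258 atm
```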

This is where the Nautilus comes in. Named after Captain Nemo’s ship in Jules Verne’s novel Twenty Thousand Leagues Under the Sea, this 64 m long ship offers a comfortable, cosy home for 17 professional crew and 31 scientists, engineers and educators. My own personal piece of real estate while aboard is a bunk in cabin 81. Located in the deepest, darkest section of the vessel, the cabin might seem a poor deal, designed for those on the lowest rung of the crew ladder. However, I soon discovered that being close to the ship’s roll axis means that heavy seas disturb me less than my crew mates higher up in the ship (literally and figuratively). The lack of a porthole in my cabin is also a welcome blessing when I try to nap during the middle of the day (a regular occurrence given that my watches are 4–8 a.m. and 4–8 p.m.). As a science-communication fellow, my job while on watch is to occupy the comms seat in the control van – acting as the human link between the watch crew and the public ashore following us on nautiluslive.org. I am afforded a ringside seat, my only responsibility being to convey the interest and excitement of deep-ocean exploration to our online audience.

It is to explore the ocean at great depth that the Nautilus deploys its two ROVs Hercules and Argus. The Argus of antiquity was a giant commanded by Hera to watch over the nymph Io with his 100 eyes, and the current-day ROV Argus lives up to the name by acting as a chaperone to Hercules. The two ROVs are linked together by a 50 m neutrally buoyant cable, and Argus is then attached to the Nautilus via a 4 km-long cable that provides power and fibre-optic connectivity. The ROV-babysitter, which can go to depths of 6 km, serves as both a watcher, with HD cameras keeping an eye on Hercules, and a shock absorber, preventing the rolling surface motion of the Nautilus from affecting its charge. Meanwhile, Hercules does the research to depths of up to 4 km – it operates as a completely stable work platform and, with two manipulator arms, a suite of thrusters and HD cameras, it provides its pilot aboard the Nautilus with an immersive sense of presence on the sea floor.

Underwater chimneys

The hydrothermal vent systems we have come to explore are associated with tectonic boundaries, where weaknesses in the Earth’s crust allow magma from the mantle to approach far closer to the surface than anywhere else – to within 1 km in some locations, compared with 5–10 km beneath most other areas of ocean crust. Seawater percolating down through faults and fissures is heated near the magma, and this abundance of energy powers aqueous reactions with rocks that saturate the superheated water with dissolved minerals. Chief among these are sulphides of iron, copper and zinc.

As water is heated it becomes less dense and therefore more buoyant than the seawater above it. The faulted and fractured rocks overlying tectonic plate boundaries therefore act like a complex plumbing system, with cold seawater descending and hot mineral-rich water ascending through the rock layers. Upon reaching the sea floor, these plumes of hot water, which can be up to 400 °C, encounter seawater with an ambient temperature of 2–4 °C. The crushing pressure at Endeavour keeps the superheated water liquid, yet contact with colder water causes the dissolved minerals to precipitate. In particular, precipitating iron sulphide forms dense hazes of fine black particulates which, viewed underwater, appear as belching clouds and give these vent systems their common name – black smokers.

Underwater photograph of a black smoker

The temperature gradients around the vent systems are immense and test the nerve of any submersible or ROV pilot with the temerity to explore them. Water escaping at the very base of a vent may be as hot as 400 °C and rises in a vertical, expanding plume. However, a probe 1 m off to the side will still register the temperature of ambient sea water, say 3 °C in this case. Even as the probe is inched closer, to within a few centimetres of the plume base, it will only register a temperature of 20 °C or so. When black smokers were first discovered, the pilots of the submersible Alvin used a manipulator arm to insert a temperature probe directly into water escaping from the base of a vent. The probe, constructed of the same perspex material as the viewing ports, promptly melted and the pilots beat a measured yet deliberate retreat. Indeed, one treasured memento from my visit to Endeavour is a section of three-quarter inch electrical cable in a thick plastic sheath. It had the misfortune to fall across the opening of a seemingly innocuous vent and, in the space of about 1 s, it melted and parted – now appearing like an oversized bright-green stalk of asparagus.

One of our first tasks when we arrived at Endeavour was to install a probe into the water emerging from a specific vent location and determine the concentration of dissolved ions. Known as a Benthic and Resistivity Sensor (BARS), the device consists of a custom-built titanium pressure cylinder housing the electronics needed to measure temperature and resistivity. Set on short stubby legs about 15 cm long, the BARS receives data from a ceramic probe inserted into the base of a nearby vent system. This is one example where extended time series data provide critical insight as the temperature and chemistry of the emerging vent water can vary rapidly in response to changing conditions in the reaction zone several hundreds of metres below. However, to be of value, the water must be sampled as it emerges from the very base of the vent, right on top of the fissure in the volcanic rock.

I was therefore shocked that my first encounter with a hydrothermal vent would be to topple a 2 m tower to expose the base of the vent. Had I come all this way just to nuke the chimney? I need not have worried as vent chimneys have been observed to grow at prodigious rates – up to 30 cm in a day and up to 5 m over the span of a year. BARS devices have themselves been embedded within tall chimneys that have enveloped them during their time on the sea floor. On a personal note, any chagrin I felt at my involvement in toppling the chimney was replaced with scientific glee as I was later able to handle a sizeable chunk of the vent material – the grey, friable rock has the strange consistency of highly compressed cigarette ash. Sadly, my prized sample will dry and crumble with time, the particles of iron sulphide steadily rusting as they are oxidized in our atmosphere. Vent chimneys, and the hydrothermal activity that creates them, are transient phenomena.

Life in a dark, cold world

Geologically speaking, these vent systems were a revelation when discovered in 1977 – they provide the missing link in understanding both heat flow through Earth’s crust and the chemical composition of its oceans. What was completely unexpected, however, was that such environments, well below the reach of any sunlight, would host abundant and unique ecosystems.

When scientists visited the first of these black smoker vents, they were astounded by the sight of gardens of giant tube worms wafting in the turbulent currents at the periphery of the vents – their vivid red bodies and gills in stark contrast to their white, calcareous tubes. Ghostly white crabs prowl through this forest of tubes and frond-like gills, snipping off tasty morsels from any tube worm too hesitant in withdrawing. Gauzy mats of filamentous bacteria are harvested by hosts of eyeless shrimp, some of which transport their own travelling gardens of bacteria growing on their undersides.

The important question is what supports this complex ecosystem, where the pressure is as high as 260 atm (it increases by about 1 atm for every 10 m of depth), there is no sunlight whatsoever and the temperature is near freezing. The key resource turns out to be hydrogen sulphide (H2S) and oxygen (O2) dissolved in the sea water. Combined with dissolved carbon dioxide (CO2), these gases provide an abundant and constantly renewed source of geochemical energy for colonies of microbes. What was poorly understood at the time, however, and has since been revealed in a number of elegant studies, is the extent of the symbiotic relationships between macrofauna and microbes.

Underwater photograph of a black smoker covered with tube worms

The critical requirement for the tube worms is that they must simultaneously provide their symbiotic bacteria with both H2S from the vent fluid and O2 from ambient seawater. Access to both occurs on the turbulent periphery of vent systems, in waters of temperature 2–60 °C, where partial mixing of vent and ambient seawater exposes organisms to both reservoirs of gas on timescales of tens of seconds or less. ONC’s infrastructure offers a simple, yet highly effective, way to study this mixing process by laying out a sensor mat over the garden of tube worms. Consisting of a wired grid of temperature and dissolved oxygen sensors, the mat allows competing flows of vent and ambient seawater to be monitored in real time and checked visually using a pan-and-tilt video camera equipped with powerful lights (though to avoid disrupting life at the vents the lights are used sparingly).

Upon seeing these ecosystems for the first time in 1979, the late biologist and oceanographer Holger Jannasch, from the Woods Hole Oceanographic Institution, made an immediate astrobiological connection. “We were struck by the thought, and its fundamental implications,” he once recalled, “that here solar energy, which is so prevalent in running life on our planet, appears to be largely replaced by terrestrial energy – chemolithoautotrophic bacteria taking over the role of green plants. This was a powerful new concept and, in my mind, one of the major biological discoveries of the 20th century.”

Deep-ocean, sulphide-oxidizing bacteria remain metabolically reliant on surface photosynthetic organisms (plants, algae and bacteria), which provide all of the dissolved oxygen in Earth’s oceans. Anaerobic chemosynthetic bacteria and archaea – microbes that can metabolize geochemical energy without using molecular oxygen – are present but rarer and less well studied than their oxygen-breathing brethren. Clearly there is still much to learn.

My watch is ending

Our time at Endeavour is nearly done. The four hours of my watch have passed in a shimmering blur and by the time I come back for my next shift on comms, the exploration group will have moved on to the next task of the voyage. While this particular dive has lasted nine hours, more demanding, complex dives with Hercules and Argus can go on for up to 72 hours. Each dive, conducted with remotely operated yet highly capable vehicles over a near-instantaneous link, shows how such techniques might one day be used to explore the oceans beneath the ice sheets of Europa and other moons in our solar system. By being present at the Endeavour hydrothermal vent system, I hoped to see the vents as close up as possible, yet also glimpse a more distant vision – of one possible future way to explore beyond our planet. Those were my hopes and I was not disappointed.

A lunar ocean on Europa?

Artist’s impression of an ROV exploring a watery world

Arriving at Jupiter in 1995, the NASA spacecraft Galileo began the first detailed exploration of our solar system’s largest planet and its retinue of moons. One of Galileo’s major discoveries was of a weak magnetic field emanating from Europa, the second of Jupiter’s four large Galilean moons. Oddly, this field rotated not once every 3.6 days, as Europa rotates, but once every 9.8 hours, in step with Jupiter itself. The inference is that Europa’s magnetic field is induced by Jupiter’s: a physical effect that requires an electrically conductive layer beneath the surface ice sheet. The Galileo observations point to a 100 km deep, moon-wide, salty, liquid-water ocean – a volume of liquid water roughly twice that of all Earth’s oceans. Internal heat, generated by the rhythmic tidal forces raised as the moon orbits Jupiter, maintains Europa’s ocean in a liquid state. Furthermore, the inferred presence of salt in Europa’s ocean suggests that the liquid water is in contact with rock. On Earth, heat flow across rock–ocean boundaries occurs most intensely at deep-ocean hydrothermal vents, and the possibility that such vents may exist within Europa’s ocean is motivating a new generation of solar-system oceanographers in their quest for new discoveries.

Living cells weigh-in on tiny cantilever

Changes in the masses of microscopic living cells have been measured using a tiny vibrating cantilever. By monitoring the resonant frequency of the cantilever, researchers in Switzerland and the UK could detect changes in mass as small as 1%. ETH Zurich’s Daniel Müller says that this new ability to measure cell mass introduces “a new parameter into biology”, which the team has already used to make new discoveries about cells’ behaviour. The researchers believe the technique could have a wide range of applications including stem-cell biology, drug discovery and even cancer research.

Tricky measurement

Measuring the size of living cells has been integral to biology for decades, but accurately tracking their mass is much trickier. For over 50 years, the basis of cell analysis has been flow cytometry. This determines the size of cells by measuring the changes in the electrical resistance and/or the optical properties of a solution of cells when it passes through a narrow tube.

“These devices are very powerful,” says David Martínez Martín, also of ETH Zurich. “Doctors use them to do a blood analysis to tell you the size of your red blood cells and see, for example, whether you have anaemia.”

The technique has several limitations, however. It measures volume but not mass, so changes in density are undetectable. Also, it cannot study changes in specific cells on short time scales. Finally, cells may behave differently if extracted from tissue and placed in solution. “It’s like trying to characterize the behaviour of a Swiss cow on the moon,” says Müller. “It’s just not a native environment.”

Small masses can be measured using a tiny cantilever like that used in a scanning probe microscope. When a mass is attached to the free end, the cantilever’s resonant frequency drops and this drop can be measured. Similar principles have already been used to measure the mechanical properties of living cells by using piezoelectric materials to drive the cantilever or simply using its natural thermal oscillations. The noise in these oscillations, however, compromises the cantilever’s sensitivity to tiny changes in cell mass.

Müller and colleagues attach single mammalian cells or small cell clusters to a cantilever, which is then set oscillating by a laser beam that is modulated at the cantilever’s resonant frequency. A second laser is used to measure the actual frequency of oscillation and an electronic feedback loop adjusts the modulation frequency to ensure that the cantilever is always driven on resonance.
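The feedback loop exploits the fact that a driven damped oscillator lags its drive by exactly 90° on resonance, so the servo can steer the modulation frequency until that phase condition is met. A toy simulation of the principle, with an invented resonance, damping and gain (a sketch of the idea, not the instrument’s actual control electronics):

```python
from math import atan2, pi

# Hypothetical oscillator: resonance at omega0, damping rate gamma (illustrative values)
omega0 = 2 * pi * 50e3  # resonant angular frequency (50 kHz)
gamma = 2 * pi * 50.0   # damping rate

def phase_lag(omega):
    """Steady-state phase lag of a driven damped oscillator; pi/2 exactly on resonance."""
    return atan2(gamma * omega, omega0**2 - omega**2)

# Start the drive 1 kHz below resonance and servo on the 90-degree phase condition
omega = 2 * pi * 49e3
for _ in range(200):
    error = pi / 2 - phase_lag(omega)  # positive when driving below resonance
    omega += 0.4 * gamma * error       # proportional feedback; gain chosen for stability

print(f"locked at {omega / (2 * pi):.1f} Hz (resonance: {omega0 / (2 * pi):.1f} Hz)")
```

The loop pulls the drive frequency onto resonance regardless of where the resonance sits, which is what lets the real instrument track the cell-induced frequency shifts continuously.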

Cool running

This presented a challenge to the team, explains Martínez Martín, because they had to excite sufficient oscillations of the cantilever to measure changes in its resonant frequency without heating it up and destroying the cells. “Most other physicists said it would not work,” he says. Detecting oscillation amplitudes as small as 0.1 nm allowed the researchers to use microwatt laser powers and thereby keep the temperature of the cantilever within 0.1°C of their desired temperature for days.

The team could monitor mass changes of around 15 pg (approximately 1–4% of a cell’s mass) with a time resolution of 10 ms. They noticed two distinct sets of oscillations – one with a period of about 2 s and another of around 18 s – neither of which had been seen before. The researchers found that the oscillations were suppressed when they disrupted cellular energy and water exchange, so they concluded that these processes were responsible for the cyclic changes in mass.
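Periodicities of about 2 s and 18 s can be pulled out of a noisy mass record with standard spectral analysis. A sketch on synthetic data, with the amplitudes, sampling rate and noise level invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic mass record: two oscillations (2 s and 18 s periods) buried in noise
fs = 10.0                      # samples per second
t = np.arange(0, 720, 1 / fs)  # twelve minutes of data
mass = (15 * np.sin(2 * np.pi * t / 2.0)
        + 15 * np.sin(2 * np.pi * t / 18.0)
        + 5 * rng.standard_normal(t.size))  # picograms

# Locate the two strongest peaks in the amplitude spectrum (DC removed)
spectrum = np.abs(np.fft.rfft(mass - mass.mean()))
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
top = freqs[np.argsort(spectrum)[-2:]]
print("recovered periods:", sorted(1 / top), "s")
```

With enough data the two periods stand well clear of the noise floor, although the real analysis would also have to contend with drifts as the cell grows.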

The team also studied the cells’ response to viral infection. Over 40 h, healthy cells were seen to increase in mass as they grew and divided. The mass of cells infected with a virus, however, did not increase. This was unexpected as an infected cell continuously produces new virus particles that then burst out from inside it. “Most people would have said ‘Of course a cell grows if it produces viruses’, although there was no real data until now,” says Müller.

Cell regulation

The researchers believe the technique could find numerous applications: “We’ve got an enormous response from biologists,” says Müller, adding that it will allow scientists to study how cells regulate their masses and volumes. Crucially, the technique could reveal how this regulation is disrupted by disease.

“Within cell biology, mass is not something that you regularly see measured and reported,” says Thomas Burg of the Max Planck Institute for Biophysical Chemistry in Göttingen, Germany, who was not involved in the research. “I think this [work] will significantly contribute to raising awareness that mass measurements can reveal interesting phenomena in cells and to raising new questions and hypotheses around mass, which will lead to new insights into how cells develop, live and grow.”

The device is described in Nature.

Silicon probe measures hundreds of neurons in moving animals

A neural probe able to monitor the activity of hundreds of neurons simultaneously has been designed and constructed by researchers in the UK, the US, Canada and Belgium. The new device, dubbed “Neuropixels”, squeezes 960 titanium nitride recording sites onto a single, 10 mm-long silicon strip (or shank), with 384 user-programmable recording channels.
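The 384-of-960 arrangement means the user chooses which sites are wired to the recording channels at any one time. A toy sketch of one possible selection scheme – the bank-style grouping below is an assumption for illustration, not the switching scheme described in the article:

```python
# Toy model of the probe's programmable readout: 960 recording sites on the
# shank, but only 384 channels can be digitized at once. The bank grouping
# here is an illustrative assumption, not the device's documented scheme.
N_SITES, N_CHANNELS = 960, 384

def site_for(channel: int, bank: int) -> int:
    """Map a recording channel to a physical site in the chosen bank."""
    if not 0 <= channel < N_CHANNELS:
        raise ValueError("channel out of range")
    site = bank * N_CHANNELS + channel
    if site >= N_SITES:
        raise ValueError("bank/channel combination falls off the end of the shank")
    return site

# Selecting bank 1 records the middle 384 of the 960 sites
selected = [site_for(ch, bank=1) for ch in range(N_CHANNELS)]
print(len(selected), min(selected), max(selected))
```

Any such scheme lets experimenters re-target the probe along its length in software without moving it in the brain.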

High resolution, large area

Direct measurement of large ensembles of neurons in the brain is essential for our understanding of sensory, motor and cognitive processes. For the best insight into brain dynamics, experiments must exhibit high spatiotemporal resolution and large volume coverage.

A previous experiment demonstrated simultaneous measurement of 300 neurons using a 16-shank configuration, but the size of the associated amplification and multiplexing equipment made it unsuitable for long-term use in mobile subjects. Achieving the same resolution and coverage in a device engineered for unrestrained animals has been challenging.

As well as employing a dense and extensive array of recording sites, the multi-institute collaboration also needed their new probe to have a small cross-sectional area (to minimize brain damage), to be resistant to noise and motion-induced artefacts, and to allow stable recording over weeks and months.

Integrated design

The design that the researchers came up with has a 70 × 20 µm cross-section and a base in which the analogue amplifiers, multiplexers and digitizers are integrated. This makes the complete setup small, and light enough to be implanted chronically into freely moving mice. Furthermore, because the device was fabricated using a standard complementary metal–oxide–semiconductor (CMOS) process, it can be manufactured at large scale and low cost.

The group demonstrated the effectiveness of the Neuropixels instrument by making recordings of hundreds of neurons in mice and rats over more than eight weeks, and finding no sign of performance degradation. As the probe length is of the same order as the rodents’ brain size, the device can span several brain structures simultaneously, helping to reveal how neural activity is co-ordinated during waking behaviour.

Full details of the research are published in Nature.

Name a distant world, fireworks through a diffraction grating, radio telescope helps Puerto Rican relief

By Hamish Johnston

Here is an opportunity to put your mark on the solar system. NASA and the team behind the New Horizons spacecraft are asking the public to nickname the mission’s next flyby target. Located in the Kuiper belt and called “(486958) 2014 MU69”, the target is likely to be two objects – each about 20 km across – in a very close orbit. So, a name like “Cheech and Chong” could be a winner. To enter, go to “Help us nickname a distant world”.

Ever wonder what fireworks would look like when viewed through a diffraction grating? You are in luck because astrophysicist Jen Gupta has posted a video of such a scene in all its psychedelic glory on Twitter.

In September, Hurricane Maria roared through Puerto Rico and the US territory is still struggling to deal with the aftermath. Some good news is that the huge Arecibo radio telescope only suffered minor damage and is now allowing relief agencies to use its facilities including a fresh-water well, electric generators and a helicopter pad. The telescope is also coming back to life with an unlikely ally – local WiFi providers. The facility has lost its Internet connection and is relying on wireless communications at the very radio frequencies it must police to avoid interference with its observations. See “Giant radio telescope lends a hand in Puerto Rico relief” for more.

Applied physics and Japan’s ageing population

By Matin Durrani in Osaka, Japan

By most measures, Japan is one of the wealthiest nations in the world. Depending on which criterion you use, it’s either the third or fourth biggest economy on the planet. Much of that success is built on the country’s prowess in science and technology, which have supported numerous hi-tech giants of the corporate world.

Still, not everything is rosy in the Japanese garden. After the post-war boom years, the economy began to slump in the early 1990s and has picked up only slowly since then. To make matters worse, Japan has also had to contend with rising social-security costs to support an ageing population. Plummeting birth rates and steadily rising life expectancy mean that Japan’s population has fallen by just over 1% since 2010, to 126 million.

I was thinking about such matters yesterday as I walked through a shopping mall in central Osaka on my way to meet applied physicist Satoshi Kawata from the University of Osaka. Okay, it was a weekday lunchtime and this is just one data point, but there sure were lots of pensioners out shopping. I was also surprised to see a guy selling The Big Issue – the magazine that supports homeless people who want to make a living. I’d not seen any inkling of poverty in the country up to that point.

The ageing population means that most learned societies in Japan are shrinking in numbers simply because there are fewer young people around. However, sitting in his office on the third floor of the Photonics Center, Kawata was pleased to say that membership of the Japan Society of Applied Physics (JSAP), which he served as president from 2014 to 2016, remains relatively constant.

“It’s good given that the number of 18-year-olds as a fraction of the total population has halved in the last 30 years,” says Kawata, who is 65 but still massively active in his new position as an emeritus professor at Osaka. Indeed, he claims that the proportion of young researchers in the JSAP membership is going up.

Having a vibrant applied-physics community – JSAP has more than 23,000 members – seems vital if Japan wants to continue to innovate. It also needs more people like Kawata, who is fizzing with ideas and willing to challenge orthodox thinking. And just because he’s “officially” retired doesn’t mean that older researchers like him have nothing left to offer.

Superconducting quantum computer achieves 10-qubit entanglement

Physicists in China and the US have built a 10-qubit superconducting quantum processor that could be scaled up to tackle problems not solvable by classical computers. The performance of the device was verified using quantum tomography, which showed that the new approach can generate a true 10-partite Greenberger–Horne–Zeilinger (GHZ) state – the largest yet achieved in a solid-state system.

The field of quantum computing is in its infancy, and a genuinely useful, practical device that outperforms classical computers has not yet been built. At this stage of development, researchers do not even agree on the basics of implementation, but techniques employing superconducting circuits have an advantage over some other designs in that they are based on established and scalable microfabrication processes.

Robust to noise

Writing in Physical Review Letters, a multi-institution collaboration led by Jian-Wei Pan of the University of Science and Technology of China, Shanghai, reports a superconducting architecture in which information is encoded in transmons – a form of charge qubit that is especially robust to noise. The team used a bus resonator to mediate qubit–qubit coupling, and showed that a single collective interaction could produce a 10-qubit GHZ state from initially non-entangled qubits.
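An n-partite GHZ state is the equal superposition (|00…0⟩ + |11…1⟩)/√2 of all n qubits. A minimal statevector construction showing its defining property (an idealized sketch, not the tomographic verification the team actually performed):

```python
import numpy as np

n = 10
dim = 2 ** n

# GHZ state: equal superposition of |00...0> and |11...1>
ghz = np.zeros(dim)
ghz[0] = ghz[-1] = 1 / np.sqrt(2)

# The state is normalized, and a computational-basis measurement can only
# return all-zeros or all-ones: measuring one qubit fixes all ten.
probs = ghz**2
print("norm:", probs.sum())
print("non-zero outcomes:", np.flatnonzero(probs))
```

In the experiment, confirming that the prepared state really has this structure (rather than a classical mixture of the two outcomes) is what required full quantum state tomography.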

Pan and colleagues propose that the efficient generation of entanglement, and the ability to operate on different qubit pairs in parallel, make their approach a promising route to achieving a large-scale quantum computer.

Japan pushes ahead for Hyper-Kamiokande neutrino detector

The University of Tokyo has created the Next-generation Neutrino Science Organization (NNSO) to promote the construction of a new neutrino facility in Japan called Hyper-Kamiokande (Hyper-K). The occasion was marked on Wednesday by a gathering of around 100 people at the Kamioka Observatory, which is already home to Hyper-K’s predecessor – Super-Kamiokande. The 2015 Nobel-prize-winning neutrino physicist Takaaki Kajita, who was also at the event, has been appointed director of NNSO.

The $800m Hyper-K neutrino experiment would, if built, involve a 260,000 tonne tank of pure water some 74 m in diameter and 60 m tall. Located 650 m underground in Kamioka, it aims to detect the Cherenkov radiation produced by the collision of neutrinos with water molecules using 40,000 photomultiplier tubes that each have a diameter of 50 cm. The project is being led by the Institute for Cosmic Ray Research (ICRR) together with the Kavli Institute for the Physics and Mathematics of the Universe (IPMU) and the University of Tokyo’s School of Science.

Fundamental question

The main aim of Hyper-K is to detect charge–parity symmetry violation (CP violation) in neutrinos, which could explain why there is more matter than antimatter in the universe. Indeed, the 10-fold increase in detector volume over Super-Kamiokande will allow physicists to gather much better statistics, with the aim of reaching a 5σ “gold standard” measurement by 2030. “This is about answering the fundamental question of why we exist,” says neutrino physicist Masahiro Kuze from the Tokyo Institute of Technology. Yet CP violation is not the only goal: Hyper-K will also aim to determine the “mass hierarchy” of the three types of neutrinos – electron, muon and tau.
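The statistical payoff of a bigger detector is simple counting: the significance of a rate-limited measurement grows roughly as the square root of the number of events, so ten times the detector volume buys about a √10 ≈ 3.2-fold gain for the same exposure. A back-of-the-envelope sketch with invented event counts:

```python
from math import sqrt

# Toy counting experiment: significance ~ signal / sqrt(background)
signal, background = 100.0, 400.0
significance = signal / sqrt(background)

# Scale both rates by Hyper-K's 10x volume increase over Super-Kamiokande
scale = 10.0
scaled = (signal * scale) / sqrt(background * scale)

print(f"significance gain: {scaled / significance:.2f}x")
```

The gain is exactly √scale in this simple model; real sensitivity projections also fold in systematic uncertainties and beam exposure.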

Physicists in Japan hope that Hyper-K will be approved later this year by the country’s Ministry of Education, Culture, Science and Technology (MEXT). Indeed, the project is already part of MEXT’s roadmap of future facilities, in which it was selected earlier this year together with six other planned facilities. If given the go-ahead, construction could begin next year and be completed by 2026. Japanese physicists are looking for foreign investment in the project. It is expected that the Japanese photonics firm Hamamatsu will build at least half of the 40,000 photomultiplier tubes, with the rest made by other countries.

America first?

Hyper-K is not the only next-generation neutrino experiment aiming to detect CP violation. The Deep Underground Neutrino Experiment in the US, which began construction in July, will aim to do so via four 10,000 tonne tanks of liquid argon. While the US project is further along in terms of construction, the technology is much less established than that at Hyper-K, so physicists in Japan are confident – if the project is approved soon – that they can measure it first.

“I firmly believe that the Hyper-K experiment will be one of the most important experiments in the foreseeable future to study the universe,” says Hitoshi Murayama, director of the IPMU. “I am very excited, let’s make Hyper-K happen.”

Copyright © 2026 by IOP Publishing Ltd and individual contributors