
Building bridges with the west

By Michael Banks in Beijing, China

It’s my final day in Beijing and, true to the daily weather reports, it is still raining. But that is better than the snow that was forecast only a couple of days ago.

My time in Beijing has been short, but packed full of interesting discussions with researchers.

Yesterday I headed to the Beijing Institute for Nanoenergy and Nanosystems. Today, I visited the theoretical condensed-matter physicist Fuchun Zhang, who is director of the Kavli Institute of Theoretical Sciences (KITS).


Molecular motors drive liquid through large channels

Conventional fluids flow only in response to external forces, but scientists are increasingly interested in “active matter” that consumes energy and moves itself. Until now, such self-driven flows had been seen only on millimetre or centimetre scales, but researchers in the US have observed fluid driven by tiny molecular motors flowing coherently around metre-long channels.

Flocks of birds, swarms of bees and other collective animal movements can be explained by cognitive decisions of the organisms involved. A bird, for example, sits in the wake of the bird in front to minimize drag. However, collective behaviour can also be found at the cellular and sub-cellular level. “Non-equilibrium behaviour is more general than what living or sentient creatures exhibit,” explains Seth Fraden of Brandeis University in Massachusetts.

The mechanisms through which this kind of self-organization occurs are not well understood. To gain more insight, Fraden, together with colleagues at Brandeis and Georgia Institute of Technology, studied the motor protein kinesin – which Brandeis’ Zvonimir Dogic describes as “the simplest biological engines capable of transducing chemical energy into mechanical motion”. They dispersed the kinesin in water with filaments called microtubules extracted from the brain tissue of cows. When fed with ATP – a chemical fuel used in the human body – the kinesin moves microtubules against each other, pulling the water along by viscous drag.

Hallmark behaviour

Experiments on concentrated bacterial suspensions reported in 2004 by the group of Raymond Goldstein at the University of Cambridge revealed a hallmark behaviour of active fluids. They observed that in an unconfined bulk, different parts of fluids move in different directions, creating turbulent vortices with a characteristic size that depends on the fluid. However, they later showed that when confined, these fluids can produce self-organized bulk motion on macroscopic scales.

These experiments, as well as theoretical simulations, have suggested that this requires the smallest confinement dimension to be smaller than the characteristic vortex size of the fluid. In the case of the Brandeis experiments, this would mean about 100 μm. To the Brandeis researchers’ astonishment, however, they observed coherent flows persisting in channels of various shapes that were over a millimetre wide and over a metre in length. “We found that you could keep going larger and larger and larger,” says Fraden. “No one knows what the upper bound is or whether there is an upper bound. We were limited by the number of cows that had to be slaughtered [to provide sufficient sample material].”

The researchers found that – regardless of the size – the aspect ratio of the space into which the fluid is confined appears crucial. If the cross-section of the channel is more than about three times as wide as it is tall, or vice versa, then the flow becomes turbulent. The reason, however, remains puzzling. “No one knows the answer – no one even knew it was a question until this experimental discovery,” says Fraden. “Here we have channels that are 50 vortices wide and we still have coherent flow, so there’s something more going on, and this is something that no theorist has even considered yet.” The researchers suspect this has not been seen previously because aspect ratio is an inherently 3D feature. Dogic says: “The vast majority of investigations so far – both experimentally and theoretically – have been confined to 2D systems, which are much simpler to investigate. This is a new direction that people have not really thought about.”

Not seen in nature

The large-scale, self-organized flow does not appear to be used in nature, and applications in engineering are probably “many, many years away”, says Dogic – partly because synthetic molecular motors are much less efficient than the biological ones used here. Fraden, however, points out that there were no applications to the classical equations of hydrodynamics when Claude-Louis Navier and George Gabriel Stokes formulated them 150 years ago – whereas today the equations are used to design aircraft. “In 150 years we’re going to have buildings and all sorts of things we haven’t imagined based on this kind of continuum mechanics where the boundary between animate and inanimate has become so blurred as to be completely unrecognisable,” he predicts.

Andreas Bausch of the Technical University of Munich is excited by the researchers’ ability to control whether flow is turbulent or coherent by tailoring the width and height of the channels. “This is a real scaling up of properties,” he says. “The flow is driven in long-range flow patterns by nanometre motors.”

The research is described in Science.

Quantitative finance: what’s it really all about?

By Matin Durrani

Among the many joys of studying physics is that a degree in the subject can take you down lots of different paths. As our recent Physics World Careers 2017 guide revealed, they range from research and industry to education, IT and even sports, politics and the arts.

One particularly popular destination is the world of finance, which is hardly surprising given physicists’ love of numbers. Those in finance work in many different areas, with one of the most high profile – and lucrative – being the field of “quantitative finance”.

But what exactly does the term mean and what’s the field all about? To find out more, do check out the new, free-to-read Physics World Discovery ebook entitled Quantitative Finance, written by Jessica James – a managing director and senior quantitative researcher at Commerzbank in London.

As James explains in the introduction to her book, the field includes “complex models and calculations that value financial contracts, particularly those which reference events in the future, and applies probabilities to these events”. I encourage you to read her book, which is available in PDF, ePub and Kindle formats. And to whet your appetite, James has kindly answered some questions about what she does, her career to date and what the book’s about.


Flash Physics: Mapping Earth’s magnetism, single-molecule vibronic spectroscopy, nanoclip could treat disease

Mapping Earth’s magnetism

The magnetic field of the Earth’s crust has been revealed like never before in a new high-resolution map. To produce the plot, Nils Olsen from the Technical University of Denmark (DTU) and colleagues used a new modelling technique to combine data from the European Space Agency’s (ESA) Swarm satellites and their predecessor, the German CHAMP satellite. Presented at the ESA’s Fourth Swarm Science Meeting in Canada, the map is the highest-resolution plot to date of the crust’s magnetic field. While the majority of the Earth’s magnetic field is generated by molten iron in the core, a small part of it is created by magnetized rocks in the lithosphere (the crust and upper mantle). This “lithospheric magnetic field” is very weak and difficult to detect. Taking different orbits around the Earth, Swarm’s three identical satellites use “new-generation” magnetometers to measure the direction and magnitude of the field. The new map resolves field features down to scales of about 250 km, highlighting anomalies such as one in the Central African Republic. The exact cause of this strong localized magnetic field is unknown, but Olsen and team speculate that it is the result of a meteor impact 540 million years ago.

Vibronic spectroscopy done at atomic resolution

A new technique that combines the chemical sensitivity of optical vibronic spectroscopy with the atomic resolution of scanning tunnelling microscopy (STM) has been developed by Guillaume Schull and colleagues at the University of Strasbourg in France. Near-infrared, Raman and low-temperature spectroscopies involve the vibrational modes of molecules and can therefore provide important information about the chemical, structural and environmental properties of organic molecules. But because these are optical techniques, their spatial resolutions are normally limited to approximately the wavelength of the light used to probe the molecules – typically hundreds of nanometres. STM involves bringing an atomically sharp tip near to a molecule on a surface and measuring the electrical current that flows between the two. While STM can create sub-nanometre-resolution images of molecules showing individual atoms, it does not provide much chemical information about its subjects. Now, Schull’s team has come up with a way to do vibronic spectroscopy without illuminating the sample with light. Instead, electrons from the STM tip excite vibrational modes of molecules of interest, causing them to emit light that can then be detected. The team used the technique to study individual phthalocyanine molecules deposited on a surface and measured an intense emission of red light at 1.9 eV as well as several weaker signals at lower energies. Writing in Physical Review Letters, Schull and colleagues say that a comparison to data acquired using conventional Raman spectroscopy confirms that the STM-acquired spectrum provides “a detailed chemical fingerprint” of the phthalocyanine molecules.

Electrical “nerve cuff” could help treat chronic disease

Image showing the nanoclip attached to a nerve

A tiny device that is implanted in the body and delivers electrical signals to the nervous system has been created by Timothy Gardner and colleagues at Boston University in the US. Described as a “nerve cuff” or a “nanoclip”, the device is designed to stimulate nerves as part of the treatment of a range of diseases including diabetes, polycystic ovarian syndrome, asthma and cancer. The implant is currently being tested in small animals and is therefore very small – measuring just 200 μm – to target nerves that are as small as 50 μm in diameter. The devices are made using a laser writing technique, which the team says allows the nanoclips to be designed for use in keyhole surgery. The devices were tested by implanting them on the hypoglossal nerves of zebra finches. This is the nerve that controls the tongues of the birds and it therefore plays a crucial role in how they sing. Writing in the Journal of Neural Engineering, Gardner’s team says that studies of the finches’ birdsong revealed no change due to the presence of the nanoclips. This, they say, suggests that the implant is safe to use.

 

  • You can find all our daily Flash Physics posts in the website’s news section, as well as on Twitter and Facebook using #FlashPhysics.

A blue energy dream

By Michael Banks in Beijing, China

I was told that it wouldn’t rain much in Beijing, a city known for its dry air – and pollution.

But since I arrived here last night courtesy of the bullet train, all I have seen is drizzle. The wet weather also made it a challenge during rush hour, but I finally made it to the Beijing Institute for Nanoenergy and Nanosystems (BINN).

I met with BINN’s director, Zhong Lin Wang, who has been in the US for more than 39 years, most of which has been spent at the Georgia Institute of Technology. While he is still affiliated to Georgia Tech, he came back to China in 2012 to establish BINN.


Between the lines

Picture of an apple falling

Defining gravity

Since 1995 Oxford University Press has been publishing its Very Short Introductions book series, which covers a hugely varied range of topics – from atheism and British cinema to dinosaurs – aimed at a general audience but written by an expert in the field. It seems surprising, then, that a topic as basic as gravity has not been covered, until now. Written by physicist Timothy Clifton at Queen Mary University of London, this little book is a heavyweight, if you will excuse the pun. Split into six chapters, Gravity: a Very Short Introduction will make an excellent primer for students, teachers and, indeed, anyone who is interested in the concept of gravity as we know it today. Unsurprisingly, the opening chapter deals with the history of gravity, beginning with Aristotle and moving on to Galileo Galilei before delving into Isaac Newton and Albert Einstein’s respective treatises on gravity. But Clifton then swiftly moves on to experiment, as the next two chapters talk about testing gravity both within our solar system and beyond, explaining everything from time dilation to pulsars and even introducing devices such as the interferometer. An exciting addition to the book is the chapter on gravitational waves, which, thanks to the amazing discovery made last year by researchers working on the Laser Interferometer Gravitational-wave Observatory in the US, are no longer merely theoretical. The final two chapters get a bit more technical as Clifton gets into the nitty-gritty of explaining curved space–time, the history of our universe and different cosmological models, before dipping into quantum gravity and the multiverse, while making clear that all of the above are still being investigated. Dig into Gravity to get a rapid refresh of everyone’s favourite fundamental force.

  • 2017 Oxford University Press 144pp £7.99pb

Cavendish pioneers

Founded in 1874, the Cavendish Laboratory at the University of Cambridge is one of the most famous physics labs in the world. It was here that J J Thomson discovered the electron, Ernest Rutherford “split the atom”, Francis Crick helped to determine the structure of DNA and James Chadwick discovered the neutron. But the lab’s physicists have accomplished far more besides, as is made clear in Maxwell’s Enduring Legacy: a Scientific History of the Cavendish Laboratory. Written by Malcolm Longair, who was head of the Cavendish from 1997 to 2005, this comprehensive book details, in chronological order, the lab’s main scientific achievements, based on descriptions from the original scientific literature. Longair focuses squarely on the Cavendish’s research output, but he cannot avoid the politics, personalities and finances of the lab entirely. Indeed, the book’s title pays tribute to James Clerk Maxwell – the lab’s first head – whose philosophy of how to do physics influenced successive generations of Cavendish workers. These days almost 1000 people are based at the lab, with the head of the Cavendish effectively being the “chief executive officer of a middle-sized company”. As a former boss himself, Longair will hopefully have the skills to handle any Cavendish staff who find their efforts haven’t been included, for reasons of space, in this well-illustrated and clearly written compendium of current and past successes.

  • 2016 Cambridge University Press 664pp £39.99/$69.99hb

A self-inflicted doomsday

In 1859 the Carrington Event – when a solar coronal mass ejection scored a direct hit on Earth’s magnetosphere, leading to a large geomagnetic storm – caused telegraph systems to fail all over Europe and North America. At the time there was only limited local electricity generation and few people or businesses relied on electronic communications. An event of this magnitude today would be far more problematic – potentially catastrophic – wiping out power and communications for millions. This is just the first of several disaster scenarios that physicist Peter Townsend presents in The Dark Side of Technology. It’s also surprisingly likely to happen (events such as the Carrington one have historically occurred every 100–150 years) and the clearest fit for the title of the book. Townsend runs through various types of technology and different reasons for our modern reliance on technology, and extrapolates doomsday scenarios from limited evidence (at least, limited evidence is presented to us here). The majority of the detail and research on display is in the historical background. There are summaries of the history of fertilisers and insecticides, of hygiene and medicine, of beauty and fashion, but few of these lead to a clear argument about technology that has a sinister side. Too often, Townsend relies on hyperbole and historical examples of “unfortunate technologies” that were short-lived precisely because they were unsafe or didn’t work as intended. The scientific explanations are unclear as well – where they are supplied at all. It’s a shame because these are interesting topics.

  • 2016 Oxford University Press 320pp £25hb

Then and back again?

Time travel has formed a lively strand of science fiction, ever since H G Wells published The Time Machine in 1895. Back then, the concept was pure fantasy, but soon after, special relativity made travel forward in time a theoretical possibility, while general relativity made backward time travel open to scientific debate. Author James Gleick’s latest book, Time Travel: a History, looks into the formation and evolution of the concept that is time travel.

Gleick broke the mould of popular-science writing with his 1987 bestseller Chaos: Making a New Science. Rather than being presented with a dry history, the reader was plunged into a novel-like world where the individuals involved in chaos theory came alive. This was character-driven science, and it was a highly effective approach. Ever since, Gleick has proved at his best when writing about the personalities of scientists or mathematicians, such as his biography of Richard Feynman, Genius.

It comes as somewhat of a shock, then, that the opening chapters of Time Travel are dominated by fiction, most notably from Wells’ The Time Machine. Gleick makes the case that, around the time of Wells, humanity underwent a change in the way time itself was perceived – from both a philosophical and a scientific point of view – and he provides both fictional and factual examples to support that premise.

While it is nearly impossible to write a book about time travel without bringing Wells into the discussion, giving such weight to his novel with sweeping statements like “he invented a new mode of thought” seems somewhat excessive. Yet, Gleick does bring out a major shift in attitude to time, taking it from a distant characteristic of nature to something that we interact with far more directly.

Perhaps the strangest decision that Gleick makes in a book with this title is to announce early on that “we still need to remind ourselves that time travel is not real. It’s an impossibility”. The reader may find this a worrying statement in a book that has yet to get out of its introductory chapters and that now appears to be limiting itself to fiction. Many physicists might at this point raise an eyebrow, wondering how Gleick has missed the amply tested time dilation that emerges from Albert Einstein’s theory of relativity.

He hasn’t, but dismisses it by saying “It is hardly time travel, though. It is time dilation.” As anyone with a grasp of the theory will know, by moving away from the Earth at fast enough speeds and returning, more time would have passed on Earth than for the traveller. What is that, if not time travel? But Gleick refers to this, the so-called “twin paradox”, as merely an “anti-ageing device” and is quick to point out that this form of travel is a “one way street”.
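The arithmetic behind that claim is straightforward special relativity. The sketch below is a standard textbook calculation (not taken from Gleick’s book): for a round trip lasting T years of Earth time at speed v, the traveller’s own clock records T divided by the Lorentz factor γ = 1/√(1 − v²/c²).

```python
import math

# Twin-"paradox" bookkeeping under special relativity.
# A traveller cruises at a fraction v/c of light speed on a round trip
# that takes T_earth years in the Earth frame; their own clock records
# T_earth / gamma, where gamma is the Lorentz factor.
def traveller_time(T_earth, v_over_c):
    gamma = 1.0 / math.sqrt(1.0 - v_over_c**2)
    return T_earth / gamma

# At 80% of light speed, a 10-year round trip (Earth frame) ages the
# traveller only about 6 years: in effect a one-way jump roughly
# 4 years into Earth's future.
print(traveller_time(10.0, 0.8))
```

The faster the trip, the larger the jump: at 0.99c the same 10 Earth-years cost the traveller well under two years of proper time.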

Gleick may be unimpressed by this technique as it doesn’t match the fictional idea of a magic box that disappears from the present and appears at a different date, but it smacks of ignorance to say that time travel isn’t involved. Ironically, the author himself suggests that physicists professing realistic theories of time travel have been influenced by “a century of science fiction”.

Gleick’s thesis becomes apparent as he gets into the paradoxes of time travel, and examines the impressive early stories of Robert A Heinlein, a master of the tangled web of causality and connection that appears to result from travelling backwards in time. Having deemed actual time travel impossible, Gleick instead uses the concept as the vehicle for a philosophical exercise.

It would be fair to say that this book is not popular science, but rather a combination of a history of time travel in science fiction and an examination of time itself in culture and philosophy. For example, there is an entire chapter dedicated to Gleick’s view that creating time capsules is folly. Where science does come into it, Gleick is far more concerned with the nature of time than of time travel, for example exploring the thermodynamic arrow of time and its implications.

The approach taken is best illustrated with an example. A reader’s opinion of the book is likely to reflect how they respond to this prose. For me, this is a triumph of verbal dexterity over communication:

“Having dispensed with simultaneity, [author and poet Jorge Luis] Borges also denies succession. The continuity of time – the whole of time – another illusion. Furthermore, this illusion, or this problem, the never-ending effort to assemble a whole from a succession of instants, is also the problem of identity. Are you the same person you used to be? How would you know? Events stand alone; the totality of all events is an idealization as false as the sum of all the horses: ‘The universe, the sum of all events, is no less ideal than the sum of all the horses – one, many, none? – Shakespeare dreamed between 1592 and 1594.’ Oh, Marquis de Laplace.”

A reader may feel that, not only has Gleick not bothered with science much in this book, but also that he hasn’t always got it right. At one point, he tells us that in quantum mechanics “the wave function is timeless”, which will surprise anyone who put in the effort of learning the time-dependent Schrödinger equation. To enjoy the book, it is necessary to put aside any concern about the scientific aspects of time travel and to focus purely on the cultural. Though Gleick touches on science, and spends a number of pages looking at whether physicists really believe that time doesn’t exist, it is the metaphysics that concerns him here, not the nuts and bolts of practical time travel offered by relativity.

This isn’t a bad book. It’s a thoughtful and interesting exploration of cultural ideas of time and the philosophy behind them. But only if you accept the author’s premise – that actually traversing through time will never be possible – will you find that the book does what it says on the tin.

  • 2017 Fourth Estate (Harper Collins UK) 352pp £16.99hb

Spin glass provides insight into brain activity

Spin-glass-like states that occur in models of neural networks can provide important insights into states of low and high brain activity that have been observed in mammals. That is the claim of a team of theoretical biophysicists in Spain who are the first to show that these disordered states in neurological networks could have a functional role in living brains.

In familiar magnetic materials such as ferromagnets, the interaction between individual spin magnetic moments causes all of the spins to point in the same direction of magnetization. In spin-glass states, by contrast, the interactions do not allow individual spins to align with all of their neighbours at once. This leads to “frustration”, whereby no overall direction of magnetization exists and the spins freeze pointing in random directions.
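Frustration is easiest to see in the smallest possible example: three Ising spins on a triangle where every pair “wants” to be anti-aligned. The sketch below (purely illustrative, not connected to the study) enumerates all eight configurations and shows that at least one bond always remains unsatisfied.

```python
from itertools import product

# Three Ising spins on a triangle with antiferromagnetic bonds:
# each pair prefers to be anti-aligned, but all three pairs cannot
# be satisfied simultaneously -- the hallmark of frustration.
bonds = [(0, 1), (1, 2), (0, 2)]

def energy(spins):
    # Antiferromagnetic convention: each aligned pair costs +1,
    # each anti-aligned pair gains -1.
    return sum(spins[i] * spins[j] for i, j in bonds)

energies = {s: energy(s) for s in product([-1, +1], repeat=3)}
ground = min(energies.values())

# Full satisfaction would give -3; frustration caps it at -1,
# and six of the eight states tie for that ground energy.
print(ground, sum(e == ground for e in energies.values()))
```

The six-fold degenerate ground state, with no single preferred configuration, is a miniature version of the rugged energy landscape that defines a spin glass.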

Brains are not magnetic systems and their working cells – neurons – do not resemble magnetic moments, but mathematically they behave in a similar manner. This is because neurons also have a binary variable – firing or not firing – which is similar to the up or down states of a spin. Neurons are also linked by synapses in a way that is similar to how magnetic spins interact with each other. As a result, a neural network in which all of the neurons are firing (or not firing) is similar to a magnetic material in which the spins are all pointing up (or down).

Synapse strength

When we create a memory, it is stored in our brain as a pattern of neural activity encoded by the strength of the synapses. These synapses can be either excitatory – they favour the transmission of information – or inhibitory – they inhibit transmission – and vary in connection strength. When the memory is triggered, the neurons fire or stay silent in a pattern configured by these synaptic connections.

Mathematical models of the recall of learned memories are based around simulations of binary neurons linked by connections of varying strengths. In such neural-network models, memories are introduced in the same way as in real brains – as patterns of binary activity encoded by excitatory and inhibitory connections of varying strengths. In these models, disordered states that resemble spin glasses emerge only when both the number of stored patterns and the size of the network approach infinity. They are essentially a frustrated state of frozen neural activity.
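Models of this kind are essentially Hopfield networks. The following minimal sketch (the generic textbook construction, not the Spanish team’s actual model; all names are illustrative) stores a few random patterns in the couplings via a Hebbian rule and then recovers one of them from a corrupted cue.

```python
import numpy as np

rng = np.random.default_rng(0)

# Binary "neurons": +1 = firing, -1 = silent, exactly like Ising spins.
N, P = 100, 3
patterns = rng.choice([-1, 1], size=(P, N))   # P stored "memories"

# Hebbian learning: symmetric couplings encode the patterns.
J = (patterns.T @ patterns) / N
np.fill_diagonal(J, 0)                        # no self-coupling

# Start from a corrupted copy of pattern 0 (15 neurons flipped).
state = patterns[0].copy()
state[rng.choice(N, size=15, replace=False)] *= -1

# Asynchronous relaxation: each neuron aligns with its local field.
for _ in range(10):
    for i in rng.permutation(N):
        state[i] = 1 if J[i] @ state >= 0 else -1

# Overlap with the stored pattern: ~1 means the memory was recalled.
overlap = state @ patterns[0] / N
print(overlap)
```

With only three patterns in a hundred neurons the network sits far below its storage capacity, so the corrupted cue relaxes cleanly back onto the stored memory; spin-glass-like frozen states appear when this construction is pushed much harder.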

Previous models, however, do not accurately represent the balance of synapse configurations that have been found in brains. Instead the models had generally assumed an equal balance of excitatory and inhibitory synapses with similar strengths.

Balanced brains

“In the brain you have heterogeneity, but you have a balance,” explains Joaquín Torres at the University of Granada. “80% of the synapses are excitatory and 20% are inhibitory, but inhibitory synapses are stronger than excitatory synapses. So you have a kind of balance that retains the heterogeneity in the brain within an optimal range.”

“We also introduce this into our model”, adds Torres. Using this more realistic balance of excitatory and inhibitory synapses and synaptic strengths, the researchers found that at low “temperatures”, disordered states with spin-glass behaviours appeared naturally even when only a few memories had been introduced – it was no longer necessary for the network size to approach infinity. They also demonstrated mathematically that these states are not linked to memory retrieval.

“We can measure how far we are from a memory,” explains Torres. “If you reach a memory you will have a value of nearly one, meaning that you recover the memory. If you compute the same measure for a spin glass you will get zero.”
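The measure Torres describes is the standard “overlap” order parameter, m = (1/N) Σᵢ sᵢ ξᵢ, between the network state s and a stored pattern ξ. A small illustration (assumed notation, not the group’s code) shows the two limiting cases:

```python
import numpy as np

rng = np.random.default_rng(1)

# Overlap m = (1/N) * sum_i s_i * xi_i between a network state s and a
# stored pattern xi. Retrieval gives m ~ 1; a frozen disordered
# (spin-glass-like) state, uncorrelated with the memory, gives m ~ 0.
N = 10_000
pattern = rng.choice([-1, 1], size=N)

retrieved = pattern.copy()                  # state sitting on the memory
disordered = rng.choice([-1, 1], size=N)    # uncorrelated frozen state

m_retrieved = retrieved @ pattern / N       # exactly 1
m_disordered = disordered @ pattern / N     # ~0, fluctuations ~ 1/sqrt(N)
print(m_retrieved, m_disordered)
```

Because the fluctuations of m for an uncorrelated state shrink as 1/√N, the overlap cleanly separates memory retrieval from the spin-glass states in large networks.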

Frustrated states

The researchers also found that these frustrated states – unlike memory recall – are associated with periods of relatively high or low brain activity, and that they correlate with the well-known “up” and “down” states that have been described in neural models and observed in the brain activity of mammals.

“We have proven both theoretically and through simulation that the up and down states observed in the activity of mammal brains would be but a mere manifestation of these spin-glass states,” says Torres. “This spin-glass state is due to the heterogeneity that is observed in the synaptic strength in the brain – this balance between excitation and inhibition.” The research is described in Neural Networks.

Shanghai round-up

By Michael Banks in Shanghai, China

It’s been a busy three days in Shanghai and now I’m on my way to Beijing to continue reporting for the China special report, which will be published in June.

As I mentioned in previous blog posts, Shanghai has thrown up some interesting stories. I heard about plans for a new 12 m telescope and also received a progress update on the construction of a new X-ray free electron laser in the city.


How LIGO got the word out about gravitational waves

Tweeting to millions: LIGO made a social media plan before announcing the detection (Courtesy: Sarah Tesh)

By Sarah Tesh

Nowadays, social media plays a big role in communicating science to the public. It has two important qualities – it’s free and it’s international.  A great case study for social media and science came last year when the Laser Interferometer Gravitational-Wave Observatory (LIGO) announced the first ever detection of gravitational waves. To tell us more about how the team grabbed the public’s attention (and got its work on Sheldon Cooper’s T-shirt in The Big Bang Theory), LIGO scientist Amber Stuver gave a witty talk at the APS March Meeting 2017 about the outreach strategy.

She began by telling us the story of that exciting detection day. Before the first detection, LIGO had published 80 papers on “detecting nothing”.  Yet on 14 September 2015 – the first morning of the first day of Advanced LIGO – the much-sought-after signal appeared. The first thing that had to be done was to check it wasn’t a fake. Having detected nothing for so long, those with the knowledge to do so would sometimes “inject” results to check the system worked and keep the scientists on their toes.


Copyright © 2026 by IOP Publishing Ltd and individual contributors