
Inferior by Angela Saini wins Physics World’s 2017 Book of the Year

Intrepid, detailed, upbeat and busting through over 100 years of misrepresentation and dodgy science, Angela Saini’s book Inferior: How Science Got Women Wrong and the New Research That’s Rewriting the Story is Physics World’s Book of the Year for 2017. In Inferior, Saini attempts to get to grips with, and challenge, the large body of (so-called) science that is often used to diminish women, by referring to supposed differences between men and women.

Cover of Inferior by Angela Saini

Saini told Physics World that one of the reasons she wrote Inferior was to tackle the contradictory information on gender studies put forth both in the media and in scholarly journals. “Really I just wanted to get to the heart of that riddle… what does science actually say about men and women and what is the true extent of the sex differences between us?” Saini found that the clashing research results reflected the controversy within the fields themselves, fuelled by bias, prejudice and speculation on the part of researchers.

The book touches upon controversial gender studies in biology, neuroscience, anthropology and even evolutionary psychology, in an attempt to separate the real science from the bias. Popular misconceptions – such as women being “better at multi-tasking”, or that they don’t like playing chess or can’t read maps – have no scientific basis, and yet have become ingrained in our society. Saini calls into question all the things that we think we know about the differences between men and women.

Clear and non-judgemental

Written in a clear and non-judgemental tone, Inferior can sometimes be difficult reading, no matter your views on the subject. Somewhat miraculously, Saini manages to remain unbiased throughout the book, and never once comes across as simply airing her grievances – instead, she objectively analyses the research and delves into what has been overlooked. The book also does a good job of pointing out the positive impact that society as a whole would see if these misconceptions became a thing of the past. “Women, as a subject of study, really are a scientific battleground,” says Saini, adding that it was this very controversy that fascinated her as a writer.

Thanks to her fair and comprehensive research on one of the thorniest issues currently being debated, Inferior stood out in a strong shortlist of books that are all novel, well written and scientifically interesting to physicists – the criteria used to determine Physics World’s Book of the Year.

Tune in to the latest Physics World podcast in which reviews editor Tushna Commissariat and managing editor Matin Durrani discuss the books on the shortlist, and announce Inferior as the winner. You will also hear from Saini herself, on what’s happened since Inferior came out, and the stellar reception it has had.

This is the ninth year the magazine has picked a Book of the Year. Previous winners include Why String Theory?, Joseph Conlon’s robust defence of the subject (2016); Trespassing on Einstein’s Lawn, Amanda Gefter’s personal quest to understand the meaning of “nothing” (2015); Stuff Matters, Mark Miodownik’s salute to everyday materials science (2014); and The Strangest Man, Graham Farmelo’s landmark biography of Paul Dirac (2009).

CERN-MEDICIS produces first medical isotopes

The new CERN-MEDICIS facility in Geneva has produced its first radioactive isotopes for medical research. Linked to CERN’s ISOLDE facility, CERN-MEDICIS will provide a range of medical isotopes for hospitals and research centres across Europe.

ISOLDE produces exotic radioactive nuclei by firing a high-intensity proton beam at a solid target. These nuclei are then formed into low-energy beams and studied by nuclear physicists. However, only about 10% of the beam is absorbed by the ISOLDE target – leaving the rest for CERN-MEDICIS.

Medical isotopes are made by placing the appropriate solid target downstream from ISOLDE. After it is irradiated with protons, the target is moved on a conveyor belt to a mass-separation system where the isotopes are extracted from the target and then implanted in metal foils. These foils are shipped to nuclear chemistry labs at hospitals and research labs where they are bound to specific biological molecules – which are used for therapy and imaging applications.

Gamma-ray imaging

The first isotope produced at CERN-MEDICIS is terbium-155, which is one of four terbium isotopes that show great potential for medical applications. When terbium-155 decays it emits a gamma-ray, which means that it could be used for single-photon emission computed tomography. This involves binding the isotope to a biological molecule and injecting it into a patient, where it accumulates in a region of interest such as a tumour or specific parts of the brain or heart. The patient is then surrounded by gamma-ray detectors to produce a 3D image of the region of interest.

Led by CERN, the CERN-MEDICIS programme is paid for in part by CERN’s Knowledge Transfer Fund. Money also comes from private foundations, other participating research institutes and the European Commission.

Book of the Year 2017

Here at Physics World, we love talking about popular-science books. Indeed, we enjoy it so much that we braved the cold, not to mention a sore throat and cracked ribs (you’ll have to listen to find out more!), to share our thoughts on a few of the year’s best popular-physics books in a special edition of our podcast.

As is becoming a tradition, this chat was hosted by our regular podcast presenter and producer Andrew Glester, in his garden shed, where he can often be found musing about “science fiction, science fact and everything in-between” for his own podcast the Cosmic Shed. Despite the freezing December morning, we gathered in the shed with hot drinks, blankets and a pile of books, as we discussed some of the themes that link the year’s books, on what was a somewhat out-of-the-ordinary shortlist.

Congratulations to all of the shortlisted authors on their fantastic books – tune in to the podcast to hear some words from our winner. We hope that everyone will find something to appreciate on this list, and hopefully we have given you a few ideas for some excellent holiday presents.

Shortlist for Physics World Book of the Year 2017 (in no particular order)

Marconi: the Man Who Networked the World by Marc Raboy

Hidden Figures: the Untold Story of the African American Women Who Helped Win the Space Race by Margot Lee Shetterly

The Glass Universe: How the Ladies of the Harvard Observatory Took the Measure of the Stars by Dava Sobel

Scale: the Universal Laws of Life and Death in Organisms, Cities and Companies by Geoffrey West

Not A Scientist: How Politicians Mistake, Misrepresent and Utterly Mangle Science by Dave Levitan

Inferior: How Science Got Women Wrong and the New Research That’s Rewriting the Story by Angela Saini

Mapping the Heavens: the Radical Scientific Ideas That Reveal the Cosmos by Priyamvada Natarajan

We Have No Idea by Jorge Cham and Daniel Whiteson

The Secret Science of Superheroes edited by Mark Lorch and Andy Miah

The Death of Expertise: the Campaign Against Established Knowledge and Why it Matters by Tom Nichols

Evaluating bioinks for vascular tissue engineering

Panel of hydrogel-based bioinks

Advancements in biofabrication have enabled the translation of simple 2D cell models into complex multicellular 3D models that could be used in tissue engineering, for drug testing in vitro or even for transplantation. Biofabrication for tissue engineering and regenerative medicine applications uses cells and biomaterials as building blocks to assemble a layered tissue construct. A limitation of tissues produced in this way has been a lack of vascularization, as this requires precise spatial distribution of vasculature within a 3D model. This vascularization could be introduced through 3D bioprinting, where precise deposition of endothelial cells within a 3D construct could rectify the issue.

A further complication to introducing vascularization into a 3D bioprinted construct comes from selecting the correct biomaterial, which must have bioprintable properties (bioink) whilst allowing for cell-cell and cell-bioink interactions. This issue has been addressed by researchers from the University of Freiburg, who have assessed a panel of bioinks for their suitability as a 3D culture material of endothelial cells, which form the blood vessels of the circulatory system (J. Biomed. Mater. Res. Part A doi: 10.1002/jbm.a.36291).

Hydrogel-based bioinks
For this study, the group tested bioinks based on hydrogels – water-rich polymer networks – from either natural or synthetic sources. They also evaluated the printability of the hydrogels using an inkjet-based 3D bioprinting system. Of the hydrogels tested, they found that fibrin and collagen performed best across the range of characteristics assessed, showing good bioprinting behaviour and enabling the formation of blood vessels, with capillary-like cords developing from the cells in these gels.

The researchers observed issues with the other gels, such as a lack of inkjet-based printability or an inability to be cultured with cells. For example, the synthetically sourced pluronic acid hydrogel and naturally sourced alginate/gelatin blend hydrogel were unstable in the medium required for cell sustenance, resulting in rapid degradation and an inability to form a sustained solid 3D structure.

Future tests
The panel-based hydrogel screening approach used in this study proved useful in establishing the key functional characteristics for inkjet bioinks in vascularization applications. There are also many other hydrogels and bioinks that have not been tested in this research. The authors note that the functional outputs recorded and protocols used to assess the suitability of these hydrogels could easily be translated to assess other hydrogels for similar applications.

Translational applications
Further research could utilize these results to build a tissue-specific panel of vascularization bioinks. For example, the development of brain-specific endothelial cells through stem cell differentiation could establish materials suitable for modelling and transplantation of brain tissue.

The advantages of using stem cells are that they can be extracted from a skin biopsy as adult stem cells and reprogrammed to have pluripotency (capability to differentiate into multiple lineage cell types), referred to as induced pluripotent stem cells (iPSCs). Studies of such iPSCs may result in more effective treatment of diseases that affect multicellular tissues and organs, adding complexity and stratified screening strategies to medical research.

Space debris threat to geosynchronous satellites has been drastically underestimated

A new analysis has found that the threat posed by space debris to satellites in geosynchronous Earth orbits (GEO) is much greater than has been assumed until now. Daniel Oltrogge at Analytical Graphics Inc (AGI), and collaborators at AGI and satellite operators SES and Inmarsat, used six separate approaches to estimate the risk, finding broad agreement between them. The results indicate that the chances of collision in GEO are up to four orders of magnitude higher than some estimates have suggested, and those collisions can occur at much higher relative velocities than previously thought. The researchers predict that the population of active GEO satellites can be expected to suffer one potentially mission-terminating impact every four years on average.

Desirable orbits

Because objects in GEO circle the Earth once per sidereal day (about 23 hours and 56 minutes), GEO satellites with circular, equatorial orbits keep a constant position over the Earth’s surface. Such orbits – about 35,786 km above the ground – are especially useful for communications satellites, since ground stations can maintain contact without the need for active tracking.
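That altitude follows directly from Kepler’s third law, T² = 4π²a³/μ: setting the orbital period equal to one sidereal day fixes the orbit radius. A minimal back-of-the-envelope sketch, using standard textbook values for Earth’s gravitational parameter and equatorial radius:

```python
import math

# Standard gravitational parameter of the Earth (m^3/s^2) and equatorial radius (m)
MU_EARTH = 3.986004418e14
R_EARTH = 6378.137e3

# One sidereal day in seconds -- the Earth's true rotation period
T_SIDEREAL = 86164.1

# Kepler's third law: T^2 = 4*pi^2*a^3/mu  =>  a = (mu*T^2 / (4*pi^2))^(1/3)
a = (MU_EARTH * T_SIDEREAL**2 / (4 * math.pi**2)) ** (1 / 3)
altitude_km = (a - R_EARTH) / 1000

print(f"GEO altitude: {altitude_km:.0f} km")  # ≈ 35786 km
```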

Unfortunately, the desirability of this orbital real estate has led to the GEO region above the equator becoming congested with satellites both operational and defunct, as well as smaller “resident space objects” (RSOs) generated by collisions, explosions, and other fragmentation events. This means that, in addition to the 466 active GEO satellites, the region is also host to thousands of objects larger than 10 cm, and at least tens of thousands larger than 1 cm.

Risk assessment

“Collision risk is significant in all orbit regimes, but too often people gravitate toward low Earth orbit (LEO) and fail to properly assess the risk at GEO. This research is intended to open that discussion, particularly as GEO is where the majority of space commerce is currently conducted,” says Oltrogge, who presented the work at the 68th International Astronautical Congress in Adelaide, Australia in September 2017.

Assessing the hazard is more challenging for GEO than it is for LEO for various reasons: the number of orbiting objects is harder to establish with confidence; orbits tend to be largely synchronous; and the population of active satellites is distributed unevenly around the equator. Still, researchers have produced a range of estimates for the danger, with those at the bottom end going so far as to suggest that the risk of collision is so low that even the routine use of graveyard orbits for retired GEO satellites is unnecessary. When they do occur, impacts have generally been assumed to involve relative velocities below 1 km/s.

The methods used by Oltrogge and colleagues involved statistical extrapolation from catalogues of known debris, records of close approaches, and simulations of the passage of satellites through known and estimated RSO populations. The analyses included orbital drift induced by the gravitational effects of the Sun and Moon, and also took account of the elongated shape and alignment of the typical GEO satellite. “An important element of the process was the corroboration that each of the many independent approaches provides to the others,” explains Oltrogge. “The collective body of work is more impactful than each of the parts on their own, and consequently we thought that having them all done and published in a single paper was worth the extra time.”

Surprisingly hazardous

The researchers found that the average time between satellite–debris collisions is just four years for the population of RSOs 1 cm and larger, but this was not the only alarming result. The team also found that, contrary to previous analyses, collisions could take place at relative velocities of up to 4 km/s, due to the existence of GEO-crossing debris in eccentric orbits. Such impacts are energetic enough to cause catastrophic damage to satellites, which are not designed with mechanical robustness in mind.

Collisions with objects 20 cm in size were inferred to occur every 50 years on average, and events like this can produce clouds of high-velocity fragments that spread throughout the GEO region, potentially triggering a cascade of secondary impacts. “While the relative speeds are not as high as what can be encountered in LEO, they are plenty large to completely fractionate a spacecraft,” says Hanspeter Schaub of the University of Colorado Boulder, who was not involved in this study. Furthermore, continues Schaub, “In contrast to low Earth orbits, the GEO satellites are essentially sitting ducks with limited ability to evade the space debris flow.”

Suspected events

But if we should expect collisions among GEO satellites to be commonplace, one might wonder why we hear so little about them. In fact, although only two such events have ever been confirmed, a further 20 or more are suspected. Additionally, optical surveys by ESA found large numbers of objects that can only have originated from unreported fragmentation events.

It does seem, then, that the GEO region might well be a much more hazardous environment for satellites than assumed. “Our findings indicate that unmitigated collision risk is fairly high,” says Oltrogge. “Although there is plenty of GEO satellite redundancy to weather a single mission-terminating collision with a debris fragment, of far greater consequence are GEO collisions with larger debris objects; all in the industry need to ensure that effective mitigation measures are implemented to avoid placing the entire market for GEO-based satellite services at risk.”

Schaub agrees with the importance of the research: “This study is very timely, as the geosynchronous region is the most heavily insured region in space. Oltrogge et al. provide an excellent overview of the limited research in GEO debris risks, and summarize extensive new work illustrating the challenge of GEO debris.”

Shadows of Saturn’s rings complicate ionosphere

Low-altitude flybys by the recently retired Cassini spacecraft have shown Saturn’s electrical environment to be surprisingly dynamic. By manoeuvring Cassini into a series of orbits that took the probe within the planet’s innermost ring, researchers obtained direct measurements of the electron density and temperature in the ionosphere. The results revealed that the variability and fine structure of the plasma conditions close to Saturn are due in part to the way in which the A- and B-rings prevent ionizing solar radiation from reaching the upper atmosphere.

Saturn up close

Before April 2017, when Cassini began a series of close orbits around Saturn, study of the planet’s ionosphere was limited to averaged, low-resolution measurements acquired using remote-sensing techniques.

Now, writing in Science, Jan-Erik Wahlund and collaborators at the Swedish Institute of Space Physics, University of Iowa, and NASA Goddard Space Flight Center, have reported the results of an in situ investigation using Cassini’s onboard Radio and Plasma Wave Science (RPWS) instrument package.

Sunscreen

Where Saturn is illuminated directly, extreme ultraviolet (EUV) radiation from the Sun partially ionizes the upper atmosphere, producing electrons and hydrogen ions. The planet’s two most prominent rings, however, are opaque to EUV radiation, and in their shadows the researchers found a local decrease in plasma density. The lack of charged particles in this region could explain the previously observed leakage of lightning-induced radio signals, which would otherwise be blocked by the ionosphere.

Complex dynamics

Variability was also observed away from the rings’ shadows, with electron densities fluctuating between 50 and 1300 per cm³ from one orbit to another. Wahlund and colleagues propose that these inconstant conditions, too, are caused by the rings – in this case by the electrodynamic interaction between the ionosphere and the electrically charged D-ring. Resulting flows of ionospheric ions along magnetic flux tubes could impart the fine structure observed by the spacecraft. An additional effect might be caused by reactions between negatively charged molecules delivered from the D-ring, and ionized hydrogen from the atmosphere, which could combine to locally depress the population of ion species.

Once a physicist: Apoorva Jayaraman

What sparked your interest in physics?

I suppose I was always captivated by the mysteries of the night sky, even as a child. But it was during my undergraduate years that the faculty at St Stephen’s College in Delhi, India, made a deep impression on me with their passionate approach to the subject. A physics teacher at school once quoted Henri Poincaré to me: “a scientist delights in science not because it is useful, but because it is beautiful”. Sitting in a classroom, I was not using physics merely as a tool to solve textbook problems, but was learning to ponder over the implications of a law, to investigate the validity of a result, to value the insight of instructive approximations – I think that is where I truly started seeing the beauty of physics and seriously engaging with it.

Did you ever consider an academic career?

I did not undertake my journey in physics as a necessary stepping stone to academia. When I began my PhD in astronomy at the University of Cambridge, I was really quite open to the possibilities after. I always knew that dance would be a big part of my life, whether professionally or not, and that my professional choice would be greatly influenced by this factor. During my PhD, I definitely contemplated doing a postdoc and I did consider a career in astronomy teaching.

How did you get interested in dancing, particularly the traditional Indian dance Bharatanatyam?

I was exposed to Bharatanatyam as early as the age of four, through a teacher who came to my kindergarten class to tell stories through the medium of dance. I was drawn not necessarily to the genre, but to the fascinating world of dance-theatre that it embodied, which has a powerful potential for narration. I started formally training in the dance form at the age of five. I was fortunate to find good teachers along the way, but it was my mother’s influence on my learning process that made me see and experience greater beauty in what I was doing. This is probably what transformed an engaging hobby into a passion that began to deeply influence my life.

What were some of the challenges transitioning from academia to performing arts?

There is no laid-out career path in the performing arts. You steer it where you want to, and your ascent is as fast, as far and in the direction that you determine. While creatively this could be true of both disciplines, in terms of career progression, the performing arts is not supported by any established ecosystem in India. This has obvious repercussions on its financial viability as a career choice. I think it is fair to say that science accords a relatively high degree of genuine equality in the workspace. Good ideas are recognized no matter who they come from. Professional growth is, by and large, proportional to merit. Coming from such a space to the world of art, which is intrinsically non-quantitative and unstructured, was a sort of culture shock and took some time to realign to. Apart from these differences, I have found that the two areas of study reconcile fairly elegantly.

What are you working on now?

I am currently under a fellowship from the Government of India’s Ministry of Culture, which supports my research proposal to study the role of Indian performing arts in the advancement of knowledge. Artistic pursuits in the Indian subcontinent – such as classical dance and music – are vitally linked to the quest of the human mind for higher knowledge. This accords Indian classical art a seat of gravity that raises it beyond being an isolated pursuit of aesthetics or beauty. I am investigating ideas of embodied learning, abstraction, and the role of art in training and sharpening the cognitive and intuitive faculties. This month, I am organizing a national conference on dance for young professionals in the field of Bharatanatyam, to deliberate on the motivations, purpose and scope of Indian classical dance as a discipline. For the first time within the community of performers, I hope to expand the discussion to include evolutionary and cognitive studies and art as an enquiry into reality. December is also the high season of classical music and dance in Chennai, where I live.

How has your physics background been helpful in your work, if at all?

Doing research in physics has trained me to be aware of what I do and don’t know, in any given context. I have a fairly critical assessment of the quality of my ideas and the robustness of my current understanding of them. This has lent conviction to my work in the performing arts beyond the realm of performance.

Any advice for today’s students?

An education is to prepare you for life, not necessarily to prepare you for a profession. Don’t be afraid to seek an education that is independent of a specific professional or vocational choice.

A salty safety solution

You’ve spent the weekend skiing in the Norwegian mountains. The conditions were amazing: it had snowed all week and when you arrived on Saturday, the Sun was shining and there was a crisp chill in the air. But by Sunday afternoon the weather had warmed up and now, as you drive home through the dark forest-covered mountains, it’s raining. The road, whose covering of white packed snow had made driving up so easy, is now icy, slippery and dangerous.

Your car’s fitted with studded tyres, but you’re struggling to get a good grip. One false move and you could skid out of control, slide down the mountain, crash into a tree or hit another vehicle. Your heart’s in your mouth, your hands are sore from gripping the steering wheel and your eyes are focused wide.

Then, salvation: you reach the main highway. The glassy forest road merges into black, bare tarmac, and you soon see why this road is free of snow or ice – a winter maintenance truck is reassuringly trundling along in the opposite direction, spreading salt on the roadway. But what exactly happens when we spread salt on icy and snowy roads? How does it transform these dangerous highways into safe surfaces to drive on?

Mounds of mineral

Salt – or sodium chloride (NaCl) – is the most common material used to increase grip on icy roads and it is an invaluable tool for winter maintenance services in many countries. In Norway, where we both live, some 250,000 tonnes of road salt are used on average every year, while the US gets through about 17 million tonnes and the UK uses two million tonnes. Salt is particularly efficient in those countries where winter temperatures fluctuate above and below 0 °C, and for roads with high volumes of traffic (more than about 1500 vehicles per day).

In Norway, the government’s Public Roads Administration does not do maintenance itself but instead hires contractors for defined regions. Their staff follow the weather forecast closely so that they can respond both pre-emptively and reactively to weather events. This strategy leads to three different approaches to spreading road salt: anti-icing, anti-compaction and de-icing.

Anti-icing is a pre-emptive operation, in which salt is applied to stop ice forming on a road in the first place. This strategy is used, for example, when the road is wet and temperatures are expected to fall below 0 °C, or when the air is humid and the roadway is cold, causing frost to build up.

Photographs of winter maintenance trucks implementing the anti-icing, anti-compaction and de-icing strategies

Anti-compaction, meanwhile, refers to roads being salted before or during snowfall. In this case, the purpose is not to melt all the snow, but to prevent that snow compacting into a strong, hard crust that is difficult to remove. Contractors aim to get the salt down before too much snow has fallen, as salted snow is less cohesive so easily removed by traffic and snow ploughs.

As for de-icing, this involves using salt to get rid of ice that is already on the road and is strongly attached to the surface. Contractors try to avoid de-icing because it means their anti-icing or anti-compaction actions have failed and dangerous conditions have formed. De-icing, in other words, is a reactive strategy intended to regain control over a precarious situation.

The de-icing power of chaos

But how does road salt work? The simple answer is that salt dissolves in water, increasing its entropy and thereby lowering its freezing point. Entropy – often loosely referred to as chaos – is a measure of disorder in a substance. Dissolving salt in water provides the solution with more possible geometric configurations than pure water and therefore creates a system that is more disordered. Since the universe tends to higher entropy, a more disordered phase (i.e. liquid) becomes more attractive relative to its other phases (i.e. vapour and ice). The salt therefore stabilizes the water relative to ice, meaning a lower temperature is needed to freeze the liquid. And the more salt we dissolve, the lower this freezing point becomes. For most de-icers the relationship between concentration and freezing point has been determined experimentally.
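For dilute solutions, this freezing-point depression can be estimated with the standard colligative formula ΔTf = i·Kf·b, where i is the number of ions per dissolved formula unit (two for NaCl), Kf is water’s cryoscopic constant and b is the molality. A minimal sketch under the ideal-solution assumption – which, as the text notes, is why the concentration–freezing-point relationship for real de-icers is determined experimentally rather than from this formula:

```python
# Ideal (dilute-solution) freezing-point depression for NaCl brine.
# Assumptions: complete dissociation (van 't Hoff factor i = 2) and ideal
# behaviour, which deviates from measured values at road-salting concentrations.
KF_WATER = 1.86    # cryoscopic constant of water, K*kg/mol
M_NACL = 58.44     # molar mass of NaCl, g/mol

def freezing_point(wt_percent_nacl):
    """Estimated freezing point (deg C) of brine at the given NaCl mass percent."""
    grams_salt = wt_percent_nacl            # per 100 g of solution
    grams_water = 100 - wt_percent_nacl
    molality = (grams_salt / M_NACL) / (grams_water / 1000)  # mol per kg water
    return -2 * KF_WATER * molality         # i = 2: one Na+ and one Cl- per NaCl

print(freezing_point(10))  # roughly -7 deg C (the measured value is about -6.6)
```

The more salt dissolved, the lower the predicted freezing point – exactly the trend described above.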

While this is a key mechanism behind salt’s powers, it does not explain everything. Take the anti-icing strategy, which is designed to stop water freezing. That should be simple, right? Presumably you just need to apply enough salt to give the water on the roadway a freezing point lower than the coldest road temperature expected from the weather forecast. It turns out, however, that to prevent freezing, contractors only need 40% of what they use to melt ice. The reason for this discrepancy is that salt also changes the way water freezes.

We are all taught at school that liquid water becomes solid when it reaches its freezing point (0 °C). For the most part (minus the inherent complexities of any phase transition), this is true – you put a tray of water in the freezer and after a while there’s no liquid left, just solid ice. That’s simple enough. Salt water, on the other hand, doesn’t completely become a solid at its freezing point (below 0 °C) – there will be both solid and liquid present, and the solid does not contain salt.

Microscope images of dendritic ice crystals in a salt solution and the ice crystals in a plain water solution

This is because ice does not accept salt ions into its lattice – it has strict geometric constraints on where and how water molecules can be placed. An ice lattice forms a pattern of layered hexagons and for a foreign ion to be included, it must fit either inside the hexagons or between the layers. However, salt ions (Na⁺ and Cl⁻) are too large to do this. As a result, the freezing process only removes water molecules from the solution and leaves the salt unaffected in the remaining liquid water. The salt in the leftover solution therefore becomes more concentrated, the freezing point falls further and freezing stops – you end up with some ice and some salty liquid. If the temperature drops further, more ice will form but there will be liquid present all the way down to the “eutectic temperature” of the salt-water mixture – the lowest possible temperature for a solution to be stable. Below this temperature, the salt crystallizes and the remaining water freezes. For sodium chloride this occurs at –21 °C.

Adding to salt’s power, ice that forms in a salt solution consists of branch-like crystals (figure 1) that aren’t seen in fresh-water ice. These “dendritic needles” are not as strong as solid ice made from pure salt-free water and will be destroyed by traffic if they form on a road. Only when more than 60% of the solution has been transformed into ice will it be strong enough to become a problem for road users.

No salty snowballs

So how does salt work when spread on snow to stop it from compacting? Again, the decreased freezing point is central but it doesn’t explain everything. Salt melts snow because, as mentioned earlier, salt solution has a higher entropy than ice – it is more attractive for the water molecules to be in the solution than be solid crystals. But to melt all the snow that falls would require extreme amounts of salt; much more than is actually used.

Instead, the salt melts a bit of snow and the resulting salt solution changes the mechanical properties of the remaining solid. It’s a bit like what happens when you make a snowball. “Wet” snow – snow close to its melting point (0 °C) – sticks together really fast to make good, strong snowballs that can withstand being lobbed through the air. This happens through a process called sintering, whereby the small ice crystals constituting the snow bind together. Snow has a very large surface area – after all, a flat snowflake with all its arms has very little interior and a lot of surface. This structure is, however, unfavourable from a thermodynamic point of view, giving the material a high surface energy. But because ice exists very close to its melting point, its molecules are very mobile. The molecules will then move from energetically unfavourable positions (snowflake arms) to more favourable positions at the contact point between crystals. The snow crystals therefore bond together, or “sinter”, providing strength to the snow.

Salt water, however, is very different to fresh water. If the snow is wetted by salt water, you can’t make a perfect, proper snowball – the crystals don’t stick to each other and the snow behaves more like a collection of very small pearls. The reason for this is not entirely understood, but scientists believe that the dissolved salt might attach to the water molecules on the surface of the ice, preventing them from forming bonds to other water molecules in other ice crystals. The result on a road is that we need to melt only 20% of the falling or fallen snow, and the remaining snow can be easily removed.

Photograph of piles of salt

This brings us to de-icing – where water on the road has already frozen. Once again, de-icing is all about salt melting ice. But this only happens until meltwater dilutes the solution so much that a new equilibrium is reached – the freezing point of the diluted salt solution becomes equal to the road temperature.

In the winter maintenance industry, we use the term “melting capacity” for the number of grams of ice that one gram of de-icer can melt – the lower the temperature, the lower the melting capacity. But remember that contractors perform de-icing when the road is already too slippery. It is therefore important that a de-icer not only melts a lot of ice but also does it quickly – the melting rate is just as important as the melting capacity. At low temperatures in particular, salts other than sodium chloride, such as magnesium chloride or calcium chloride, are more efficient de-icers. Indeed, in some parts of the world, such as the US, these compounds are used when it becomes particularly cold.
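The temperature dependence of melting capacity follows from the equilibrium described above: salt melts ice only until the brine's freezing point matches the road temperature. The sketch below inverts the same ideal dilute-solution law to estimate grams of ice melted per gram of NaCl; industry melting-capacity figures are measured, not computed this way, so treat the numbers as illustrative only.

```python
# Approximate melting capacity of NaCl: grams of ice one gram of salt can
# melt at a given road temperature. Sketch only - real de-icer capacities
# are measured, and the ideal law used here is rough for strong brines.
KF_WATER = 1.86   # cryoscopic constant of water, K*kg/mol
M_NACL = 58.44    # molar mass of NaCl, g/mol
VANT_HOFF = 2     # NaCl dissociates into two ions

def melting_capacity(road_temp_c):
    """Grams of ice melted per gram of NaCl at road_temp_c (must be < 0)."""
    # Equilibrium molality at which the brine freezes exactly at road_temp_c
    molality = -road_temp_c / (VANT_HOFF * KF_WATER)  # mol salt per kg water
    mol_salt_per_gram = 1.0 / M_NACL
    kg_water = mol_salt_per_gram / molality           # water held liquid
    return kg_water * 1000.0

for t in (-1, -5, -10, -20):
    print(f"at {t:4d} deg C: ~{melting_capacity(t):5.1f} g ice per g salt")
```

The output falls steeply with temperature, which is why colder roads need disproportionately more salt – or a different de-icer altogether.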

Good salt, bad salt

Salt can also have negative effects. Although it helps keep roads safe and predictable during winter, salt corrodes exposed metal, damaging vehicles as well as roadside installations such as signs and barriers. Road salt can also damage vegetation near salted roads and contaminate freshwater sources.

It is therefore important not to use more salt than necessary and to work systematically on minimizing salt exposure. Indeed, in Norway, salt-spreading is a finely honed affair, with the amount used depending on a road’s importance, usage and local climate. On 13,000 km of roads (about a seventh of the entire network), the aim is to keep them completely free of snow and ice throughout winter. These are the most important roads, seeing between 1500 and 80,000 vehicles per day, and are routinely salted and ploughed. On smaller roads, salt is used only when the temperature is close to 0 °C and not when it’s colder – the roadway is kept bare when the weather is mild but a snow/ice crust is allowed to form during colder periods. On the rest of the network, the aim is to maintain a snow/ice crust throughout winter, and only use salt at the beginning and end of the season. This strategy requires snow ploughing to keep the layer level and spreading abrasive particles such as sand when the crust becomes too slippery. While this is still practised in many snow-rich countries, in more temperate climates virtually all roads are salted in winter.

At the Norwegian University of Science and Technology (NTNU) in Trondheim, our group is trying to find out how little salt we need to keep the roads safe. To do this, we need to understand the fundamental physics through which salt acts for its different purposes. For example, we are trying to work out the minimum amount of salt that needs to be added to snow to get sufficient anti-compaction and also what effect salt has on the formation of “hoar frost” (the deposition from humidity onto road surfaces). But knowing how salt works is only one piece of the puzzle. The mineral only helps as long as it is spread on the road and not in the roadside ditch. Unfortunately, traffic can easily blow or spray salt off the road, so another research topic in the field of “winter maintenance” is to develop methods to keep salt or other de-icers longer on the road and improve models to predict their operational longevity after they have been applied.

The purpose of all this work is to come closer to an optimal salting practice whereby contractors use as much salt or other de-icers as needed, but as little as possible without compromising safety. If we can reach this salting nirvana, it could mean fewer of those treacherous journeys on slippery, icy roads and a more relaxing end to your weekend skiing trip.

First multimessenger observation of a neutron-star merger is Physics World 2017 Breakthrough of the Year

On 17 August 2017 the LIGO–Virgo gravitational-wave detectors, the Fermi Gamma-ray Space Telescope and the INTEGRAL gamma-ray space telescope detected nearly simultaneous signals. They came from the merger of two neutron stars – an event now called GW 170817. This was the first time that LIGO–Virgo scientists had seen a neutron-star merger, and within five hours they had worked out the location of the source in the sky. Over the following hours and days, more than 70 telescopes were pointed at GW 170817 and a wealth of observations were made in the gamma-ray, X-ray, visible, infrared and radio portions of the electromagnetic spectrum. Astrophysicists also searched for neutrinos, but none were seen.

These coordinated observations have already provided a vast amount of information about what happens when neutron stars collide in what is called a “kilonova”. The observations have yielded important clues about how heavy elements, such as gold, are produced in the universe. The ability to measure both gravitational waves and visible light from neutron-star mergers has also given a new and independent way of measuring the expansion rate of the universe. In addition, the observation settles a long-standing debate about the origin of short, high-energy, gamma-ray bursts.
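The "standard siren" idea behind that expansion-rate measurement is simple enough to sketch: the gravitational waveform yields the distance directly, while the host galaxy's redshift gives the recession velocity. The snippet below uses illustrative round numbers (redshift ~0.01 and distance ~40 Mpc, roughly the values reported for GW 170817's host galaxy) – it is a back-of-the-envelope sketch, not the collaboration's actual analysis, which must also account for the host's peculiar velocity and waveform uncertainties.

```python
# Back-of-the-envelope "standard siren" Hubble-constant estimate.
# Illustrative inputs only, roughly those reported for GW 170817.
C_KM_S = 299_792.458   # speed of light, km/s

def hubble_constant(redshift, distance_mpc):
    """H0 in km/s/Mpc for a low-redshift source, where v ~ c * z."""
    velocity = C_KM_S * redshift   # recession velocity, km/s
    return velocity / distance_mpc

# z ~ 0.01 for the host galaxy; ~40 Mpc inferred from the waveform amplitude
print(f"H0 ~ {hubble_constant(0.01, 40.0):.0f} km/s/Mpc")
```

Even this crude estimate lands in the same ballpark as values from supernova and cosmic-microwave-background methods, which is what makes the technique a genuinely independent cross-check.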

Epitome of collaboration

This year’s award has been given to thousands of scientists working in nearly 50 collaborations worldwide. While some awards, notably the Nobel prizes, are given to individuals and not groups, here at Physics World we recognize that science is a collaborative effort. Furthermore, we believe that the multimessenger observation of GW 170817 epitomizes the collaborative nature of science and is a shining example of how our knowledge of the universe can move forwards when people from all over the world join together with a common scientific cause.

A comprehensive description of the multimessenger observation and a full list of all the scientists and collaborations involved can be found in The Astrophysical Journal Letters.

The podcast “Exploring the cosmos with gravitational waves” features interviews with several of the scientists who took part in the award-winning observations.

For a more in-depth look at the significance of these latest discoveries, take a look at the ebook Multimessenger Astronomy, which is free to read.

The Physics World top 10 breakthroughs of 2017 are awarded to research reported on physicsworld.com in 2017. The 10 breakthroughs are chosen by Physics World editors from a shortlist based on popularity with our readers. The criteria for judging included:

  • fundamental importance of research;
  • significant advance in knowledge;
  • strong connection between theory and experiment; and
  • general interest to all physicists.

Now for our nine runner-up breakthroughs, which are listed below in no particular order.

Physicists create first ‘topological’ laser

Photo of researchers making topological laser

To Boubacar Kanté and colleagues at the University of California, San Diego, for creating the first “topological laser”. The device involves light snaking around a cavity of any shape without scattering – much like the motion of electrons on the surface of a topological insulator. The laser works at telecom wavelengths and could lead to better photonic circuits or even protect quantum information from scattering.

Lightning makes radioactive isotopes

To Teruaki Enoto of Kyoto University and colleagues for providing the first detailed, convincing evidence that lightning strikes can lead to the synthesis of radioactive isotopes in the atmosphere. Physicists already knew that lightning strikes can produce gamma rays and neutrons, and had suspected that interactions between this radiation and nitrogen nuclei in the air could create radioactive nuclei. Enoto and colleagues confirmed this by measuring a gamma-ray signal indicative of nuclear decay that peaked about 1 min after a lightning strike. This, they say, is evidence for the production of radioactive nuclei such as nitrogen-13.
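The timing evidence rests on simple radioactive-decay kinetics: nitrogen-13 has a half-life of roughly 10 min, so a gamma-ray signal that rises and then fades on that timescale after a strike points to its decay. A minimal sketch of the decay curve, assuming a half-life of 9.97 min:

```python
# Exponential decay of nitrogen-13 (half-life ~ 9.97 min), the isotope
# whose positron-annihilation gamma rays the team measured after strikes.
HALF_LIFE_MIN = 9.97

def fraction_remaining(minutes):
    """Fraction of the initial N-13 nuclei still undecayed after `minutes`."""
    return 0.5 ** (minutes / HALF_LIFE_MIN)

for t in (1, 10, 30):
    print(f"after {t:2d} min: {fraction_remaining(t):.2f} of the N-13 remains")
```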

Super-resolution microscope combines Nobel-winning technologies

Stefan Hell (left) and colleagues

To Francisco Balzarotti, Yvan Eilers, Klaus Gwosch, Stefan Hell and colleagues at the Max Planck Institute for Biophysical Chemistry, Uppsala University and the University of Buenos Aires, for developing a new type of super-resolution microscope that can track biological molecules in living cells in real time. The new technique is called maximally informative luminescence excitation probing (MINFLUX) and it combines the merits of two Nobel-prize-winning techniques – one of which was developed by Hell. MINFLUX attains nanometre-scale resolution more quickly and with fewer emitted photons than previously possible.

Particle-free quantum communication is achieved in the lab

To Hatim Salih of the University of Bristol and colleagues and Jian-Wei Pan of the University of Science and Technology of China and colleagues for the theory and experimental realization of transmitting information using quantum physics without the exchange of any particles. Four years ago, Salih and colleagues proposed a new quantum-communication scheme that does not require the transmission of any physical particles. While some physicists were sceptical, this year a team led by Pan created such a system in the lab and used it to transfer a simple image while sending (almost) no photons in the process. Dubbed “counterfactual imaging”, the technique could prove handy in imaging delicate pieces of ancient art that cannot be exposed to direct light.

Ultra-high-energy cosmic rays have extra-galactic origins

Night-time photograph of a Cherenkov detector in Argentina

To the Pierre Auger Observatory collaboration for showing that ultra-high-energy cosmic rays come from outside the Milky Way. For decades, astrophysicists have believed that the sources of cosmic rays with energies greater than about 1 EeV (10¹⁸ eV) could be worked out from the arrival directions of these particles. This is unlike lower energy cosmic rays, which appear to come from all directions after being deflected by the Milky Way’s magnetic fields. Now, Pierre Auger’s 1600 Cherenkov particle detectors in Argentina have revealed that the arrival rate of ultra-high-energy cosmic rays is greater in one half of the sky. What is more, the excess lies away from the centre of the Milky Way – suggesting that the cosmic rays have extra-galactic origins.

‘Time crystals’ built in the lab

To Christopher Monroe at the University of Maryland and colleagues and Mikhail Lukin of Harvard University and colleagues for their independent creation of “time crystals”. Like conventional crystals, which spontaneously break translational symmetry, time crystals spontaneously break discrete time symmetry. Time crystals were first predicted five years ago and now two spin-based systems with properties resembling time crystals have been created. Lukin used spins in diamond defects, while Monroe’s spins were trapped ions.

Metamaterial enhances natural cooling without power input

To Ronggui Yang and Xiaobo Yin of the University of Colorado Boulder and colleagues for creating a new metamaterial film that provides cooling without the need for a power source. Made out of glass microspheres, polymer and silver, the material uses passive radiative cooling to dissipate heat from the object that it covers. It emits the energy as infrared radiation, so it can travel through the atmosphere and ultimately into space. The material also reflects sunlight, which means that it works both day and night. But perhaps most importantly, it can be produced cheaply at an industrial scale.

Three-photon interference measured at long last

To Sascha Agne and Thomas Jennewein of the University of Waterloo and colleagues and Stefanie Barz, Steve Kolthammer and Ian Walmsley of the University of Oxford and colleagues for independently measuring quantum interference involving three photons. Seeing the effect is very difficult because it requires the ability to deliver three indistinguishable photons to the same place at the same time and also to ensure that single-photon and two-photon interference effects are eliminated from the measurements. As well as providing deep insights into the fundamentals of quantum mechanics, three-photon interference could also be used in quantum cryptography and quantum simulators.

Muons reveal hidden void in Egyptian pyramid

Virtual-reality representation of the interior of Khufu's Pyramid

To the ScanPyramids collaboration for using cosmic-ray muons to find evidence for a hitherto unknown large void in Khufu’s Pyramid at Giza, Egypt. By placing different types of muon detectors in and around the pyramid, the team measured how showers of muons were attenuated as they passed through the huge structure. Computer algorithms analysed the data and revealed an unexpected and very large void deep within the pyramid.

Learning with LEGO

An illustration from the book, of LEGO bricks to explain particle physics

I have lost count of the number of wheezes to get people hooked on particle physics. There have been straightforward scientific accounts, personal tales of discovery, books filled with cartoons, essays and even historical vignettes. Now in Particle Physics Brick By Brick, science communicator Ben Still, who is also an honorary research fellow at Queen Mary University of London, has decided to use LEGO bricks to coax readers into learning about the mysteries of the subatomic world. The book is divided into about 80 different topics, covering everything from the various fundamental particles and forces to nucleosynthesis, cosmic radiation, dark matter and more. Each topic is tackled via a richly illustrated double-page spread, in which LEGO bricks represent different particles – an up quark, for example, is a red two-by-two brick, a heavier charm quark is a green three-by-two brick, while a massive top quark is a blue four-by-two brick (the mass/brick-size relationship is only approximate of course). Using LEGO makes for a pretty way to illustrate what are quite technical topics, such as Feynman diagrams, symmetries, pentaquarks, antimatter and radioactivity. Still’s effort is a fine introduction to particle physics, but his book will only take you so far, LEGO bricks or not. Just as well then that, according to the title page, the LEGO Group “does not sponsor, authorize or endorse this book”.

Copyright © 2026 by IOP Publishing Ltd and individual contributors