
Online conferences, auxetic materials and a new kind of rocket engine: the October 2020 issue of Physics World is now out

Physics World October 2020 cover

When it comes to scientific conferences, we’ve all been affected by the restrictions from the COVID-19 pandemic. But even when the pandemic is over, as it surely will be one day, I don’t think we’ll ever entirely go back to massive scientific conferences as we knew them.

Sending thousands of people halfway round the world at great expense to mix in stuffy rooms increasingly seems a thing of the past, not least on the grounds of cost and pollution. What’s more, we can do so much more online.

Sure, it’ll take time to figure out what works best, but there are some intriguing developments already under way. In the new issue of Physics World, for example, you can read about a recent online conference in attophysics that deliberately set out to create scientific “battles”. More importantly, online conferences can make physics meetings more accessible and open.

If you’re a member of the Institute of Physics, you can read the whole of Physics World magazine every month via our digital apps for iOS, Android and web browsers. Let us know what you think about the issue on Twitter, Facebook or by e-mailing us at pwld@ioppublishing.org.

For the record, here’s a rundown of what else is in the issue.

• US election focuses on science – from dealing with the COVID-19 pandemic to stimulating industries of the future, science policy will play a larger role than usual in next month’s US election, as Peter Gwynne reports

• Redefining the scientific conference – Eleanor S Armstrong, Divya M Persaud and Christopher A-L Jackson argue that the COVID-19 pandemic offers an opportunity to start making scientific meetings more inclusive

• Courting controversy online – Carla Figueira de Morisson Faria and Andrew Brown say that academic debate can still be fostered in an online-only world

• Monumental mistake – Robert P Crease laments the disappearance of a landmark in US science history: the cooling tower at the Brookhaven National Laboratory

• Green strings attached – solving many of today’s environmental problems will require advanced technological solutions, says James McKenzie

• Eliminating the boundary between sky and space – reusable vehicles are vital to make access to space more affordable, but conventional rocket engines have their limits. Oliver Nailard describes how UK firm Reaction Engines hopes to revolutionize space access with a new class of propulsion system, the Synergetic Air Breathing Rocket Engine (SABRE)

• Stretching the limits – most materials get thinner when stretched, but “auxetics” do the opposite and get thicker. Helen Gleeson describes her group’s discovery of a material that is auxetic at the molecular level, which could be used in everything from body armour to laminated glass

• Optical microscopy – how small can it go? For centuries diffraction limited the resolution of optical microscopy. The past 50 years have, however, seen one limitation after another buckle under the ingenuity of a host of wide-ranging techniques, from lenses to tips, chips and doughnuts. Anna Demming reports

• Age of cosmic explosion – Tushna Commissariat reviews Look Up: Our Story with the Stars by Sarah Cruddas

• Intrepid interstellar adventurers – Ian Randall reviews Spacefarers: How Humans Will Settle the Moon, Mars, and Beyond by Christopher Wanjek

• Path of least resistance – graduate student Rosemary Teague and undergraduate Amber Yallop share their non-traditional degree pathways, some difficult choices they made along the way, and what a future in physics looks like for them now

• Is anybody there? – Chris Holt on the physics of ghosts

Why Google builds quantum computers, the LGBT+ experience in physics, CERN’s carbon footprint

In this episode of the Physics World Weekly podcast Google’s Sergio Boixo explains why the tech giant is building its own quantum computers. Boixo will be a plenary speaker at the upcoming Quantum 2020 virtual conference, and we will be interviewing other plenary speakers in future episodes of the podcast.

Next up is Ramon Barthelemy – a physicist at the University of Utah who has surveyed more than 300 LGBT+ physicists about their careers. Barthelemy chats with Physics World’s Matin Durrani about the survey and the issues faced by the LGBT+ physics community, which are also described in a paper in the European Journal of Physics.

CERN is home to the world’s largest particle collider and several experiments that are the size of small office blocks. So, it is not unexpected that the Geneva-based lab has a large carbon footprint. What is surprising, however, is the main source of CERN’s greenhouse gas emissions – as the science writer Kate Ravilious explains in the final segment of the podcast.

Overlooked for the Nobel: Nicola Cabibbo

In 2008 three physicists bagged that year’s Nobel Prize for Physics for developing predictions and concepts on symmetry breaking that became the cornerstones of the Standard Model of Particle Physics.

Makoto Kobayashi of the KEK lab and Toshihide Maskawa of Kyoto University, both in Japan, shared one half of the prize for their work in 1972 on the mechanism of broken symmetry, which led to the prediction of a new family of quarks. Yoichiro Nambu of the University of Chicago in the US, who died in 2015, received the other half of the prize for realizing in 1960 how to apply spontaneous symmetry breaking to particle physics. Nambu’s work described how the vacuum is not the most symmetrical state, work that underpinned the mechanism for the Higgs field.

Symmetry breaking seeks to explain the subtle differences in physics that enable matter to tip the balance with antimatter in the universe. Charge (C) symmetry involves particles behaving like their oppositely charged antiparticles, while “parity” (P) symmetry means events should look the same when the three spatial co-ordinates x, y and z are flipped. In 1957 physicists discovered that parity symmetry is broken in the weak interaction, which governs radioactive beta decay, and charge symmetry soon proved to be broken there too. Then, in 1964, the combined CP symmetry was shown to be violated in the decay of neutral kaons.
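In standard textbook notation (our addition, not part of the original article), the two operations act as:

```latex
% Parity reverses all three spatial coordinates;
% charge conjugation exchanges particles and antiparticles.
\[
P:\ (x,\, y,\, z) \;\longrightarrow\; (-x,\, -y,\, -z),
\qquad
C:\ \text{particle} \;\longleftrightarrow\; \text{antiparticle}
\]
```

CP violation means that applying both operations together does not leave the laws of the weak interaction unchanged.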

It is the theory explaining CP violation that handed the 2008 Nobel prize to Kobayashi and Maskawa. In 1972, while at Kyoto University, the duo formulated a 3 × 3 matrix that describes how the strange and down quarks inside a kaon can oscillate into their antiparticles and, in doing so, occasionally break CP symmetry. Moreover, the mixing in the matrix implied the existence of new quarks – the charm, bottom and top – all of which were discovered over the following decades.

Missing link

Kobayashi and Maskawa’s groundbreaking matrix was later named the CKM matrix after the initials of the physicists involved. But the first initial, C, reveals the contribution of someone else: Nicola Cabibbo. He was an Italian theorist from the CERN particle-physics lab near Geneva who in 1963 developed a smaller 2 × 2 quark-mixing matrix that ultimately laid the groundwork for Kobayashi and Maskawa almost a decade later.
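For reference, here is the textbook form of both matrices (our addition; the notation is standard but does not appear in the article). Cabibbo’s 1963 scheme mixes the two then-known down-type quarks through a single angle, and the CKM matrix extends this to three generations:

```latex
% Cabibbo's two-generation mixing, governed by the Cabibbo angle:
\[
\begin{pmatrix} d' \\ s' \end{pmatrix}
=
\begin{pmatrix} \cos\theta_C & \sin\theta_C \\ -\sin\theta_C & \cos\theta_C \end{pmatrix}
\begin{pmatrix} d \\ s \end{pmatrix}
\]
% The CKM generalization to three generations, whose irremovable
% complex phase is what allows CP violation:
\[
\begin{pmatrix} d' \\ s' \\ b' \end{pmatrix}
=
\begin{pmatrix}
V_{ud} & V_{us} & V_{ub} \\
V_{cd} & V_{cs} & V_{cb} \\
V_{td} & V_{ts} & V_{tb}
\end{pmatrix}
\begin{pmatrix} d \\ s \\ b \end{pmatrix}
\]
```

Kobayashi and Maskawa’s key insight was that only with three generations does the mixing matrix acquire a complex phase that cannot be rotated away, and it is this phase that permits CP violation.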

Unfortunately for Cabibbo, the statutes of the Nobel Foundation, set down in 1900 some three years after the reading of Alfred Nobel’s will, state that the prize can go to no more than three people in any given year. While some maintained that Cabibbo could be honoured in another year, he died in 2010 at the age of 75, and another Nobel rule stipulates that awards cannot be given posthumously.

Some suggested that the committee could have given the prize in 2008 to Cabibbo, Kobayashi and Maskawa, leaving Nambu to be honoured in the future. Indeed, Roberto Petronzio, president of Italy’s National Institute for Nuclear Physics, said at the time that he was “filled with bitterness” at Cabibbo missing out.

So, in an alternative universe, Cabibbo, Kobayashi and Maskawa would have shared the 2008 Nobel, leaving Nambu to be awarded the prize following the discovery of the Higgs boson in 2012. But hindsight is a wonderful thing.

Redefining the scientific conference to be more inclusive

The ongoing COVID-19 pandemic has led to the cancellation or postponement of many in-person scientific meetings that have historically been a key forum for scientists to present and share their ideas as well as to foster academic collaborations. In response, scientists have moved their meetings online, which has been met with a mixed response, despite offering the possibility to dodge bad coffee, rubbish Wi-Fi and awkward poster sessions where you’re ignored in favour of warm beer. 

Based on our experiences of organizing and participating in many conferences both before and during the pandemic, we see many advantages that the “new normal” could bring, particularly for those from under-represented groups. Online conferences present an opportunity to challenge the problematic norms of existing conferences, which lead to the exclusion of already marginalized groups. They can also eliminate the environment that facilitates sexual or other forms of harassment and, when appropriate care is taken, also improve access to disabled people. 

Many academics travel extensively for scientific meetings – to develop and sustain collaborative projects as well as undertake fieldwork. Although sometimes exhilarating, travel presents barriers. The opportunity to travel is not equally distributed and can be prohibitively expensive. Most large conferences take place in the Global North and entering these countries from the Global South may require costly visas. Potential delegates may be subject to countrywide travel bans and upon entry researchers of colour may experience racial aggressions. Alongside the physical demands and inaccessibility of travel, unfamiliar locations may present challenges to disabled researchers and their carers. Conferences held in countries with discriminatory laws and attitudes may be unsafe for people from marginalized groups such as LGBTQ+ researchers. Travel also contributes to the unnecessary release of CO₂.

Virtual conferences have the benefit of eliminating many of these barriers, making conferences more accessible to delegates who otherwise would be unable to attend. For example, more early-career researchers from China, India and Latin America attended the 2020 Virtual Perovskite Conference than usual, and disabled academics took part in this year’s Space Science in Context meeting at a rate that reflected the roughly 24% of people in the US and UK who are disabled.

Smile, you’re on camera

Video presentations – whether pre-recorded or live – provide an opportunity to showcase a more diverse range of speakers. As questions can be asked anonymously and in advance, video-based talks have been shown to encourage questions from historically marginalized members of our communities such as early-career researchers or researchers of colour. Online events also allow the value of different contributions to be reshaped. Poster presentations are disproportionately delivered by early-career researchers and those of minoritized genders and people of colour. Running them in the middle of the day rather than alongside other, typically alcohol-based “social” events, places a higher value on this research. 

Conferences have typically been hostile to disabled and neurodivergent people. Combining pre-recorded and live events can improve accessibility for those who are disabled, chronically ill, neurodivergent and those with caregiving needs, who can access the material in their own time. Having a choice of format – for example, text or audio – also lets people choose how they engage with the conference based on their needs and preferences. Recorded meetings and having sessions repeat over the course of the conference can also overcome challenges related to time zones.

Disabled people must now be involved in conference organizing, especially as online conferences do not present a one-size-fits-all approach to accessibility. For example, lack of audio captioning, text that is not compatible with screen readers, video-based sessions without sign-language interpreters, and networking events conducted on inaccessible platforms will affect disabled researchers’ ability to connect. Evening events, meanwhile, may present barriers to attendees with caring or other responsibilities. These are important considerations for conference committees.

While online conferences lessen the burden for those in the most financially precarious positions, there will be additional costs linked to interpreters, captioning, Internet access and childcare. But this could be prioritized over travel, location and conference-funded socializing. 

Some have suggested that going online only could hinder how science is conducted in the long term and affect the “culture” of science. Unsurprisingly, these commentaries are typically generated by scientists for whom the traditional conference format is not exclusionary. Switching to an online conference can threaten meaningful networking, but tools and mechanisms do exist to support such engagement, including the use of randomly allocated breakout “coffee rooms”, using virtual worlds, and different forms of “virtual booths” for networking based on shared interests. Much like in-person events, ensuring a high-quality, rigorously enforced code of conduct is central to the success of online relationships. Relationships that develop online can be just as strong as those formed offline. 

In the face of COVID-19 we must be bold enough to redefine our norms. Rather than trying to maintain business as usual in an online format, we should ask ourselves who is not in the room where it happens. We should ensure that we share knowledge and opportunity in an accessible and inclusive way. Rather than representing an unmitigated disaster for the future of conferences and networking, COVID-19 is an opportunity to open up these spaces and make them welcoming to all.

Fostering academic debate in an online world

The past six months have seen scientists shift from working in the lab to conducting their research and collaborations online using tools such as Zoom. Conferences, which had almost always been held in-person before the COVID-19 pandemic, have also had to switch to online-only. This move has led some to warn of the long-term dangers for science, especially those fields in which there is “much disagreement and passion”. Face-to-face meetings, they contend, are the “only way to propel science forward”.

We disagree. We recently co-organized an online conference devoted entirely to controversy. Held in early July over three days, the Quantum Battles in Attoscience event had more than 300 registered participants from 34 countries. Attoscience is a fairly new branch of physics and deals with some of the shortest times in nature (10⁻¹⁸ s). At these time scales, researchers can image the real-time movements of electrons. And since electrons carry energy in systems from biomolecules to nanostructures and metals, attoscience may impact many areas of science and even lead to “optoelectronic” computers.

Despite – or perhaps because of – its vibrancy, the attoscience community is very divided on almost all issues, siloed into factions without a co-ordinated effort toward constructive debate. We have seen plenty of “street fights” at major international conferences and in journals, with little respect between the different parties. The idea of embracing, instead of avoiding, conflict emerged when we were writing a workshop proposal. Privately, some of us had practised martial arts, where rigorous codes of conduct are enforced: breaking them can see you expelled for tarnishing your school’s reputation or, in extreme cases, stripped of your belts. This approach is very different from academic street fights, so we asked: “If people want to fight, why not go for the scientific equivalent of a martial-arts tournament?”


Mortal combat 

We initially intended to host a “battle” event at University College London, but moved it online when the pandemic hit. This posed several challenges, but also gave us plenty of opportunity to test this debating format with a specific code of conduct that was developed especially for the event. We invited early-career researchers – who we dubbed “combatants” – from opposing groups to participate in three “battles” on contentious topics. Every combatant was promoted on the workshop website and on social-media platforms leading up to the conference. They also became co-organizers of the conference, invested enormously in the planning of the battles and passed on the excitement to their groups.

Bringing these people together to trust each other in a virtual environment took around three months. This was done via Zoom meetings and dedicated channels on Slack. Two organizers – Bridgette Cooper and Andrew Maxwell – managed the interaction between participants. Once the arguments had been agreed and prepared, we then carried out several mock battles. Traditionally, panel discussions happen on the fly and involve leaders in the field, who would not have the time for such a lengthy preparation. We heard from our combatants that as early-career researchers they wanted the practice as well as the reassurance that they would not be caught off guard. The preparation allowed them to explore controversial points, let go of their impostor syndrome and step outside their comfort zones to discuss more “fringe” topics. The mock battles helped to set boundaries and timing, ensuring that everyone was equally represented. The battles were mediated by leading scientists in the field who were not affiliated with the panelists. They met a few times to establish how the battles would be conducted. 

Our conference still boasted big names but they were invited to give more traditional talks. By focusing on early-career researchers we could avoid a lot of politics and ego: the combatants were willing to invest in the process precisely because they had more to gain from it. During the conference, it was also much easier to poll people online as anonymity helped to increase audience participation. In a real conference we would only have the usual suspects asking or replying to questions.

Culture change

Current conference culture is built to encourage the participation of principal investigators. This needs to change – why do we need the same lectures every year from the same people? We would like to see fresh faces and ideas, but this is a double-edged sword: a conference with no big names may not attract interest and may even look suspicious. With this new initiative we wanted to change this mindset. Furthermore, an onsite conference requires a huge investment in terms of local resource, sponsorship and infrastructure, both for the participants and the organizers. This poses further barriers and favours those with privilege and time. Online meetings avoid some of these issues. 

The meeting was a huge success and by subverting a few paradigms, we hope to have shown that alternatives are possible. Not only can debate happen in an online forum, but it can be done while maintaining respect for those involved. 

White papers: Mad City Labs and Edinburgh Instruments

 

This time we are featuring two white papers, from Mad City Labs and Edinburgh Instruments.

Making sense of viruses

Single-molecule microscopy techniques allow researchers to directly study molecular mechanisms, enabling them to boost our understanding of, for example, how biological viruses assemble, disassemble and interact with their hosts. In the new white paper from Mad City Labs, entitled Understanding Virus Mechanisms – One Particle At A Time, you can discover how Tijana Ivanovic from Brandeis University in the US has been using the company’s equipment to understand cell entry mechanisms and the relationship between the structure and organization of a virus particle and the early steps of infection. With her lab having studied viruses ranging from those responsible for Ebola to COVID-19, the white paper shows how these single-molecule techniques allow you to measure the “trajectories” of individual molecules in a population.

Studying quantum dots

Semiconductor quantum dots have unique tuneable photoluminescence properties, which make them ideal for a range of important technological applications including solid-state lighting, displays, photovoltaics, and biomedical imaging. Indium-phosphide quantum dots are of particular interest as an environmentally friendly and non-toxic alternative to traditional heavy metal-based quantum dots containing cadmium and lead. Although indium-phosphide quantum dots do not emit light on their own, they can do this if coated with a layer of zinc sulphide. In this latest white paper from Edinburgh Instruments, entitled Emission Tail of Indium Phosphide Quantum Dots Investigated using the FS5 Spectrofluorometer, you can find out how one of the company’s spectrofluorometers was used to characterize the absorbance, emission and lifetime of such indium-phosphide/zinc-sulphide quantum dots, thereby helping to establish important structure–property relationships.

Ice-core mission in the Swiss Alps abandoned due to surprisingly hard glacier

If you’re into home improvement, you may have experienced that sinking feeling of drilling into a wall and hitting something very hard. Whether it’s a metal support or stubborn brickwork the result is the same: you can’t drill any deeper and you may have damaged your drill. Last week, an Italian-Swiss group of climate scientists experienced a similar drilling defeat played out on a grand scale when they were forced to abandon attempts to drill ice cores from a glacier in the Swiss Alps.

The mission was part of Ice Memory, a UNESCO-backed project in which ice cores from several of the world’s threatened glaciers will be extracted and ultimately stored in an “ice sanctuary” in Antarctica. The project’s main goal is to provide future scientists with access to this ice record before it vanishes due to climate change. Tiny bubbles of gas, ancient pollen and possibly even microbes frozen within the ice can reveal information about Earth’s climate history.

“The ice cores also certainly contain still other deposits and residues that could help us answer scientific questions – which ones, we don’t even know yet – in the future”, said Margit Schwikowski, an environmental chemist at the Paul Scherrer Institute, who led the recent expedition.

Breaking the ice

Things began well for Schwikowski’s eight-person team. On Monday 14 September the group set up base camp at an altitude of 4100 m on the Corbassière Glacier, on the Grand Combin mountain massif 90 km east of Geneva. The mission was to extract three ice cores – each with a 7.5 cm diameter – which could extend as far as the underlying rock 80 m below.

But the group soon ran into difficulties. In two closely spaced drilling locations, the scientists encountered an unexpected transition at a depth of just 20 m, which halted progress and damaged the drill. In early reports, the researchers suggest they may have encountered ice “lenses” – extremely resistant blocks of ice, likely formed from refrozen meltwater, that are largely free from sediment.


Not to be deterred, the team managed to transport the equipment to the manufacturer’s lab in Bern to get it fixed. For the third attempt, during the weekend of 19–20 September, the scientists started drilling 10 m away from the previous holes, but progress was yet again halted at the same shallow depth. With harsh weather conditions forecast, the team decided to leave the mountain and postpone the mission.

“The water has complicated the whole operation. We weren’t expecting to find the glacier like this,” said Carlo Barbante, director of Italy’s Institute of Polar Sciences and researcher at Ca’ Foscari University of Venice. “We will need to change the way we drill the ice and hope it’s not too late to extract a full ice core from the Grand Combin.”

A challenging laboratory

Although the scientists will be disappointed, it is not surprising to face setbacks when working in extreme mountain environments. Harsh weather conditions also restricted the number of ice cores retrieved from earlier Ice Memory missions in Bolivia and Russia in 2017 and 2018 respectively. However, both of those missions ultimately resulted in successful ice-core extractions, as did expeditions in the French Alps (2016) and a Russian site in the Altai Mountains (2018).


Barbante is also the co-ordinator of Beyond EPICA, which seeks to drill ice cores in Antarctica, providing a climate record for at least the past 1.5 million years. That project is the follow-up to the European Project for Ice Coring in Antarctica (EPICA), in which researchers drilled 3270 m into the Antarctic ice between 1996 and 2005, enabling them to reconstruct the climate history of the past 800,000 years.

So although this latest mission may have hit a metaphorical brick wall, it is part of a wider programme to preserve the world’s ice record. Undoubtedly there will be further challenges ahead, but climate scientists are depending on its success.

Unprecedented ice loss is predicted for Greenland Ice Sheet

Over the next eighty years global warming is set to melt enough ice from the Greenland Ice Sheet to reverse 4000 years of cumulative ice growth – with rates of ice loss more than quadruple even the fastest melt rates of the past 12,000 years. These stark conclusions come from new simulations that, for the first time, put current and projected future rates of ice loss into context by comparing them directly with historical rates. These latest results are consistent with previous research showing that if we continue on our current high trajectory of greenhouse-gas emissions, we can expect Greenland to become ice-free in as little as 1000 years.

A couple of weeks ago a massive chunk of ice – equivalent in size to the Caribbean island of Montserrat – broke away from north-east Greenland. It serves as yet another indicator of the rapid pace of change in this region, with rising temperatures currently driving ice-loss at a rate of around 6100 billion tonnes per century. But how unprecedented is this? How quickly did the Greenland Ice Sheet melt during warm periods in the past?

The Greenland Ice Sheet currently stretches across 1,710,000 square kilometres and geologists have used ice-core data to understand how it has waxed and waned in the past. Until now, however, no-one had compared how ice-sheet activity in the past matched up with what scientists expect to see in the future. Jason Briner, from the University at Buffalo in New York, and colleagues have made use of recent reconstructions of climate and ice-sheet thickness to simulate the evolution of a portion of ice-sheet in southwestern Greenland, running their simulations from 12,000 years ago, through to eighty years into the future.

“Shocking” pace of ice loss

The scientists found that the highest rates of ice melt during the Holocene (the past 12,000 years) occurred during a warm period between 10,000 and 7000 years ago, when it was 3–5 °C warmer than today. This resulted in ice being shed at a rate of around 6000 billion tonnes per century, which is similar to the loss rate seen over the past 20 years.

But as their simulation ventured into the future they discovered that the rate of ice loss is likely to dwarf anything seen in the past. Under a high-emissions “business as usual” scenario, Briner and his colleagues show that ice loss could reach an eye-watering 35,900 billion tonnes per century by 2100, whilst under a low-emissions scenario it is likely to rise to around 8800 billion tonnes per century. “It was shocking to me to see that even with low emissions the pace of ice loss is going to be faster than it was during the warmest period in the past,” says Briner, whose findings are published in Nature.

Under a low emissions scenario the global sea-level rise associated with the melt from this segment of ice sheet is around 2 cm by 2100, whilst under high emissions it adds around 10 cm. Scaling this up to include the rest of the Greenland Ice Sheet suggests that Greenland ice-melt will produce at least 4–20 cm of sea level rise by 2100.
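As a rough sanity check, the scaling above can be reproduced in a few lines. The factor of two between the modelled segment and the full ice sheet is our illustrative assumption, inferred from the quoted 2 cm/4 cm and 10 cm/20 cm figures rather than taken from the paper itself:

```python
# Back-of-envelope check of the sea-level-rise numbers quoted above.
# Assumption (ours, for illustration): scaling the modelled south-western
# segment up to the whole Greenland Ice Sheet roughly doubles its
# contribution to global sea-level rise by 2100.
segment_rise_cm = {"low_emissions": 2.0, "high_emissions": 10.0}

SEGMENT_TO_SHEET = 2.0  # assumed segment-to-ice-sheet scaling factor

greenland_rise_cm = {
    scenario: rise * SEGMENT_TO_SHEET
    for scenario, rise in segment_rise_cm.items()
}
print(greenland_rise_cm)  # {'low_emissions': 4.0, 'high_emissions': 20.0}
```

This reproduces the quoted 4–20 cm range, though the real scaling in the study depends on how representative the south-western segment is of the wider ice sheet.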

“Centuries of sea level rise”

“The findings of this study are no surprise, but they highlight that we should do everything in our power to slow the rate of melt,” says Timothy Lenton, director of the Global Systems Institute at the University of Exeter. “Buying time for ‘managed realignment’ of the world’s coastlines and several major cities is a huge deal. When we are talking about centuries of sea level rise, we are considering a timescale where the global human population could conceivably be declining (as affluence leads couples to have fewer children, below ‘replacement rate’). That could make a realignment easier if it is further in the future. Whereas with currently growing population and growing coastal megacities it could be a nightmare,” he adds.

Previous research has suggested that we have already passed the point of no return for the Greenland Ice Sheet, with no hope of preventing a complete meltdown, but Briner and his colleagues are not convinced that this tipping point has passed. “It is clear that we are now committed to a lot of ice loss through this century, but our simulation shows that if we follow a low-emissions pathway the rate of ice-loss may slow as we approach 2100. It is possible to leave future generations with a healthy Greenland Ice Sheet,” he says. Lenton concurs and thinks there is still time to act. “Even if tipping points have been passed, because the ice sheet dynamics are relatively slow, it is possible to ‘overshoot’ an ice sheet tipping point temporarily and still recover the situation. That of course requires bringing greenhouse gas levels down, which is going to require deliberate greenhouse gas removal on top of stopping greenhouse gas emissions.”

Overlooked for the Nobel: Jocelyn Bell Burnell

I’ve met Jocelyn Bell Burnell twice.

The first was when I sat next to her at a dinner in London in 2007. The other occasion was last year when I interviewed her about her incredibly generous donation of $3m to set up the Bell Burnell Graduate Scholarship Fund.

Run by the Institute of Physics, which publishes Physics World, the fund supports PhD students from under-represented groups at universities in the UK and Ireland, with the first recipients having recently been announced.

On both occasions, I resisted the temptation to ask Bell Burnell why she feels she was never awarded a share of the Nobel Prize for Physics for the discovery of pulsars.

Famously, her PhD supervisor Antony Hewish won the 1974 Nobel prize for the pulsar discovery – sharing it with his astrophysicist colleague Martin Ryle – while Bell Burnell was left empty-handed.

The omission might appear to be due to her gender. But speaking at the International Conference on Women in Physics in Birmingham, UK, in 2017, Bell Burnell attributed it to the fact that she was a PhD student at the time of the discovery in 1967 at the University of Cambridge.

Bell Burnell and five colleagues had built a radio telescope in a huge field outside the city, which she then operated. Combing through the mountains of data, Bell Burnell saw regular peaks in luminosity that she and Hewish attributed to a pulsar – a rotating neutron star that emits a regular ticking signal of radio waves. Their paper announcing the finding was published in Nature in January 1968.

Still, as described in this feature by Sarah Tesh and Jess Wade from 2017, Bell Burnell doesn’t think such an injustice could happen again, pointing to the 1993 Nobel Prize for Physics for binary pulsars. It went to Russell Alan Hulse, who was a student at the time of the discovery, along with his supervisor Joseph Hooton Taylor Jr. “At least they don’t make the same mistake twice,” Bell Burnell told delegates at the Birmingham meeting.

However, we won’t have to wait too long to find out the real reason for Bell Burnell’s omission. The Nobel archives remain sealed for 50 years after a prize is awarded, which means that details of the 1974 prize will become available in just over three years’ time, in January 2024. The archives contain not just data on who was nominated, by whom and when, but also information about the committee’s thinking.

I, for one, will be fascinated to find out what was going through the minds of that year’s Nobel Committee for Physics. We’ve recently learned a lot more about how they operate, but why they make their choices still remains a mystery. However, I’d lay a fair bet that, consciously or subconsciously, Bell Burnell’s gender played a role too in their deliberations.

Commissioning and QA workflows of MR-Linac with the THALES 3D MR SCANNER

Want to learn more on this subject?

The introduction of MR-Linac technology improves the quality of patient care through real-time imaging of planning target volumes (PTVs). Online monitoring of the PTVs permits gating of the radiation-beam delivery, reducing the dose delivered to healthy and sensitive tissues. Conventional water phantoms containing ferromagnetic materials cannot be used in the static magnetic field of an MR-Linac. To overcome this, LAP has introduced an MR-compatible water phantom, the THALES 3D MR SCANNER, to support end users with the dosimetry measurements needed to commission the beam model of the MRIdian Linac from ViewRay.

The key product specifications and end-user benefits will be presented by Dr Thierry Mertens (business development manager for LAP). Daan Hoffmans, a physicist at Amsterdam UMC, will share his clinical experience with the THALES 3D MR SCANNER, including first evaluations of its use with the Ethos™ (Varian) machine.


Thierry Mertens has a PhD in physics and nearly 15 years of experience in medical physics and radiotherapy, with a major commitment to developing innovative quality-assurance solutions that support medical end users in their clinical tasks. As business development manager for LAP since 2016, Thierry has been instrumental in the development of the THALES 3D MR SCANNER and has worked closely with pioneering users of the MRIdian system, feeding their needs back to the LAP R&D team.

Daan Hoffmans, a physicist in the Department of Radiation Therapy at Amsterdam UMC, has been closely involved in the acceptance, commissioning and QA strategy of state-of-the-art radiation units for 15 years. In 2016 Europe’s first MRIdian (ViewRay) system was installed at Amsterdam UMC. At that time the market for QA equipment for MR-guided radiotherapy was still at an early stage, which demanded the maximum of Daan’s creativity and improvisation skills. In the years that followed, a second MRIdian was installed and both systems underwent several upgrades. During this period, Daan had plenty of opportunities to test and validate prototypes of what became the THALES 3D MR SCANNER.

Copyright © 2025 by IOP Publishing Ltd and individual contributors