
Why knocking down Brookhaven’s iconic smokestack is a monumental mistake

For 70 years a red-and-white striped smokestack was the first thing visitors would see when arriving at Brookhaven National Laboratory on Long Island, New York. Painted that way to meet the requirements for navigational markers, the stack had been built for the Brookhaven Graphite Research Reactor (BGRR), which fired up in 1950. Along with the neighbouring brown reactor building, the stack formed part of the lab’s first logo and was the centrepiece of Brookhaven’s stock photos.

Measuring about 6 m in diameter and standing almost 100 m high, the stack was the tallest object in that part of Long Island. If you climbed the external ladder to the top, you saw water both to the north (Long Island Sound) and to the south (the Atlantic Ocean). The stack was designed to discharge the air that cooled the BGRR but had become slightly radioactive due to the presence of argon-41. This isotope has a half-life of about 110 minutes, so the gas decayed away before reaching the ground.
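As a rough back-of-the-envelope illustration (my own numbers, not the lab’s): radioactive decay follows

$$N(t) = N_0\,2^{-t/T_{1/2}},$$

so with $T_{1/2} \approx 110$ minutes for argon-41, after ten half-lives – roughly 18 hours of drift and dilution downwind – less than $2^{-10} \approx 0.1\%$ of the discharged activity remains.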

Now, however, this monument to US science history is being demolished.

A world first

The BGRR was not just Brookhaven’s first major instrument but also the world’s first reactor built for the sole purpose of supporting basic research. Nuclear physicists used it to develop quantitative models of the nucleus, chemists to explore the structure of matter, and biologists to study tissue and to create radioisotopes for research and treatment. Engineers, meanwhile, came to the BGRR to study materials for submarines, ships, aircraft and spacecraft.


After the BGRR closed in 1968, the stack served as an air-exhaust pathway for the High Flux Beam Reactor, another basic research tool. But when that instrument was terminated in 1999, the stack became obsolete. Since then, apart from periodic inspections for structural integrity, it has stood there, alone and unused.

The BGRR’s project leader was the nuclear physicist Lyle Borst (1912–2002), who had been research supervisor at the X-10 reactor at Oak Ridge National Laboratory in Tennessee, which served the US military during the Second World War. Like X-10, the BGRR consisted of a huge, air-cooled cube of graphite moderator with natural uranium fuel.

Air flowed through a gap in the centre of the cube and was sucked into the fuel channels and a series of filters before passing into air ducts that carried it to a fan building, where powerful motors blew it up the stack. The turbulent air flow made a huge noise, and 5 m tall vertical steel panels were installed downstream of the fan house and upstream of the stack to dampen the sound by converting it to laminar flow.

While the stack was being built, Borst would treat visitors to a dramatic sonic shift by taking them into the stack and then to the fan house area, going from what engineers call an echoic chamber to an anechoic chamber. Standing inside the stack was like being in a giant organ pipe, ultra-reflective of every sound. “You’d clap your hands,” Borst told me when I spoke to him 30 years ago, “and this would go to the top of the stack, and bounce back down, and about 10 seconds later you’d hear a BANG, which was your clap.”

Sound-wise, the area near the silencers was the opposite. “If your partner were just a few feet away,” Borst said, “you couldn’t hear each other. The sound was all absorbed – there was no reflection, no echo, no nothing. It was just dead.” Sadly, the silencers have been removed in preparation for the stack’s demolition, which is scheduled to take place in early November.

John Carter, director of communications at the US Department of Energy (DOE), told me that the stack is being demolished because of “the ongoing cost and risks of inspection, maintenance, repair, and collection and disposal of contaminated rainwater”. Features like the stairway I climbed would need to be periodically surveyed and fixed. The DOE had contacted agencies involved in navigation and historical preservation, but none expressed a strong enough interest in the stack to warrant devoting the required resources.

The critical point

We’ve seen many public objects toppled this year. Throughout the US, statues of Confederate soldiers were removed, while in the UK statues of 17th-century slave traders were pulled down in both Bristol and London. These statues are coming down because they memorialize people whom we no longer honour.


The BGRR’s stack is not a monument to a cause we have ceased to honour, except perhaps to those who regard reactors as objectionable. But its removal makes me ponder what science historians should keep. Galileo’s telescopes and James Clerk Maxwell’s mechanical models are preserved in museums because they were literally instrumental to key discoveries. Other objects – like components of particle accelerators – end up in museums because the historic facilities to which they belonged are too big to preserve.

Occasionally, buildings that once housed important scientific research are saved, such as the Atomic Physics Observatory at the Carnegie Institution near Washington DC, which was built in 1938 to house a forefront Van de Graaff generator. That building has some architectural interest, being designed to resemble an astronomical observatory so nearby residents would not be frightened by the nuclear instrument inside.

The BGRR stack is in a different category. It is not part of the BGRR’s innards, but a monument by default as the last and most visible piece of the building that once contained it. Soon, however, it will be no more. I find that sad, because the stack is as much of a monument as Brookhaven has. Models, images, artifacts and a time-lapse photo series of the demolition will soon be all that remain.

In vivo dosimetry should play a pivotal role in radiation therapy


Radiation therapy is a complex procedure, with a series of equipment and dosimetry checks performed before every treatment to ensure its safety and accuracy. However, there’s still potential for errors to occur during the actual radiation delivery, such as changes in patient geometry, inaccuracies in beam delivery or mispositioning of brachytherapy sources.

In vivo dosimetry (IVD), which measures the dose to the patient during the treatment, could detect any such errors and help ensure the accurate delivery of radiotherapy. But its adoption in clinical practice has so far been low.

In November 2017, at the first ESTRO Physics Workshop, a task group was created to investigate this low uptake and stimulate the wider adoption of IVD. After three years of work, the task group has now published its recommendations for the future development and clinical use of IVD in the two most common forms of radiotherapy: external-beam photon radiotherapy (EBRT) and high dose rate (HDR) or pulsed dose rate (PDR) brachytherapy.

“Our task group believes there should be more and tighter checks on radiotherapy, in particular during the beam delivery to the patient,” explain Frank Verhaegen from Maastro Clinic and Kari Tanderup from Aarhus University Hospital, who led the EBRT and brachytherapy teams, respectively. “IVD is one of those techniques that all clinics could have, the equipment is available, but hardly anyone does it. We analysed the reasons for this and tried to come up with technical requirements and guidelines for equipment manufacturers and clinical users.”

Issues arising

The project, which integrated academics, clinicians and equipment vendors from Europe, North America and Australia, was coordinated by Gabriel Fonseca and Jacob Johansen (for brachytherapy), and Igor Olaciregui-Ruiz (for EBRT). The first challenge was to create a formal definition for IVD. After much debate, the group agreed that “IVD is a radiation measurement that is acquired while the patient is being treated, containing information related to the absorbed dose in the patient”. As such, an IVD system must be able to detect errors arising from equipment failure, dose calculation errors, anatomical changes, and patient (EBRT) or applicator (brachytherapy) positioning errors.

Writing in an editorial in Physics and Imaging in Radiation Oncology (phiRO), Verhaegen and Tanderup detail the key requirements identified for IVD. In addition to acting as a safety system to catch errors that could affect the patient, an IVD method should also provide tools for treatment adaptation and record the true dose received by the patient. Ideally, an IVD system should record signals in real time without perturbing the dose to the patient.

But with such potential to improve radiotherapy, why is IVD so under-utilized? The task group suggests that many clinics do not perform IVD because they consider the clinical benefit to be too low, or because workflows are too complex and resource-heavy. Manufacturers, meanwhile, are unwilling to invest due to limited demand from clinics and a lack of regulations.

“It’s a bit of a chicken-and-egg problem,” says Verhaegen. “There are quite a few products that one can buy, but they all only do part of the work. And because there is little guidance on how to use it, people just don’t use it.”

Commercial IVD systems available for EBRT include point detectors placed on the patient’s skin in the treatment field and electronic portal imaging devices (EPIDs), which use the treatment beam to image the patient. The EBRT task group focused on EPIDs, as they are ubiquitous on modern linear accelerators, easy to use, can be automated and can perform 2D or 3D dosimetric verification.

Initially employed for verification of patient set-up on the treatment couch, EPIDs have since been adapted for dosimetric measurements, including IVD. “Developing IVD methods for EBRT requires little investment in hardware, but needs a lot of software and methodology development,” Verhaegen notes.

Within brachytherapy, the main goal of IVD is to catch large deviations from the treatment plan that could affect the clinical outcome. Such deviations arise, for example, from source misplacement, deviations in dwell times or anatomical changes. In particular, the use of real-time IVD could allow treatment interruption and prevent gross errors. IVD systems should also record smaller deviations, enabling inter-fraction adaptation, and provide an estimate of the actual delivered dose.

Currently, there are two IVD methods that could be used with brachytherapy. One involves locating a radiation detector inside the applicator itself. While this approach can identify a variety of errors, it cannot detect movement of the entire applicator with respect to the patient, which would generate serious dose errors. A second option is to put the radiation detector on, or near to, the patient’s skin. This design can detect moving applicators, but the position of the detector itself may be uncertain.

“Both of these methods currently have some uncertainties which should be reduced,” says Tanderup. “Furthermore, treatment verification and error detection relies on a rather complex post-processing of the raw signals from the detectors. As long as software for such post-processing is not commercially available, the IVD methods will not have significant clinical value.”

Several groups, including those of Verhaegen and Tanderup, are currently working to develop novel IVD systems for brachytherapy.

Optimizing IVD

To fully exploit IVD and encourage its clinical introduction, the task group created a wish list for vendors to address. For starters, IVD methods require high sensitivity and specificity, to accurately identify clinically relevant errors while minimizing false alarms. The workflow should be easy to implement in the clinic, but able to trigger alerts when needed. In addition, IVD systems should be fully integrated with treatment planning software and treatment delivery equipment.
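To make the jargon concrete: sensitivity and specificity are simple ratios computed from a confusion matrix of alerts versus real delivery errors. Below is a minimal Python illustration; the audit numbers are invented for the example and do not come from the task group’s report.

```python
# Sensitivity and specificity of an IVD error-detection system, computed
# from confusion-matrix counts (alert raised vs real error present).
def sensitivity_specificity(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)   # fraction of real errors that triggered an alert
    specificity = tn / (tn + fp)   # fraction of error-free fractions with no alert
    return sensitivity, specificity

# Hypothetical audit: 40 real delivery errors among 1000 treatment fractions
sens, spec = sensitivity_specificity(tp=36, fn=4, tn=930, fp=30)
print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")
# -> sensitivity = 0.90, specificity = 0.97
```

A high specificity matters clinically because frequent false alarms are exactly the kind of workflow burden the task group identifies as a barrier to adoption.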

Automation could also accelerate the uptake of IVD, which currently involves performing a large number of manual actions, particularly for brachytherapy, and generates large amounts of data, especially with EBRT. “Fully automated systems that interpret discrepancies between planned and monitored therapy are key,” says Verhaegen. “Artificial intelligence could help a lot in catching errors, and even determining their cause and suggesting corrective action. This is a bit of sci-fi for now, but good to aim for – if we can get vendors on board.”

Tanderup and Verhaegen hope that the task group’s recommendations will encourage vendors to get excited by this urgent need for complex treatment verification. “It may be the small start-up companies that rise to the occasion,” Verhaegen tells Physics World. “Hopefully, the recommendations will also motivate clinics with current in-house-developed IVD systems to start collecting clinical data that demonstrate that IVD has clinical value,” adds Tanderup.

Optical receiver for space communications has ‘unprecedented’ sensitivity


The most sensitive receiver to date for picking up optical signals in free space has been designed and demonstrated by researchers in Sweden. Peter Andrekson and colleagues at Chalmers University of Technology say they achieved this “unprecedented” sensitivity of one photon per bit of information in their receiver using a novel approach to signal preparation, combined with virtually noiseless amplification at the receiver. Their technique could have important implications for future space missions.

As space agencies seek both to expand their scope of exploration and to improve the data outputs of their satellites, existing radio-based communications systems are struggling to keep up. To enable operation at higher data rates and transmission across larger distances, optical signals are increasingly being considered over radio waves, owing to their lower power losses during propagation. All the same, losses can be substantial over the vast distances of space. To realize higher transmission rates using as few photons as possible, receivers with the highest possible sensitivities are critical for success.

To achieve this, Andrekson’s team introduce a new setup in which data are first encoded onto a signal light wave, then combined with a continuous pump light wave at a different frequency. When these waves are passed through a nonlinear optical fibre, they then generate a third “idler” wave. Afterwards, all three waves are amplified to the desired output power, and launched into free space. At the receiving end, the depleted signal is captured in an optical fibre, then amplified by a phase-sensitive optical amplifier – a device unique in adding almost no noise to signals. Finally, the restored signal reaches a conventional receiver, where the original information can be recovered.
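Though the paper’s technical details go beyond this article, the generation of an idler in a nonlinear fibre is characteristic of four-wave mixing, in which energy conservation (assuming a single, degenerate pump – my assumption) fixes the idler frequency:

$$\omega_{\mathrm{idler}} = 2\,\omega_{\mathrm{pump}} - \omega_{\mathrm{signal}}.$$

Transmitting the phase-correlated signal, pump and idler together gives the phase-sensitive amplifier at the receiver the reference it needs to amplify with almost no added noise.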

Room-temperature operation

Currently, even the most sophisticated free-space optical communications systems can only run at speeds of under 1 Gb/s, and require ultracold temperatures to operate. In contrast, the system designed by Andrekson’s team achieved a receiver sensitivity of close to one photon per bit of information at room temperature, enabling a data transmission rate as high as 10.5 Gb/s. In addition, the system relies on straightforward techniques for signal modulation, processing, and error calculation. This means it could easily be scaled up to accommodate higher data rates.
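To put “one photon per bit” into perspective, here is a quick order-of-magnitude estimate (my own arithmetic, assuming a typical telecom wavelength of 1550 nm, which the paper may not use):

$$P = R_b\,\frac{hc}{\lambda} \approx \left(10.5\times10^{9}\ \mathrm{s^{-1}}\right)\times\frac{(6.63\times10^{-34}\ \mathrm{J\,s})(3.0\times10^{8}\ \mathrm{m/s})}{1.55\times10^{-6}\ \mathrm{m}} \approx 1.3\ \mathrm{nW}.$$

In other words, a received power of just over a nanowatt would suffice for error-free decoding at 10.5 Gb/s.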

Through further theoretical calculations of the sensitivity of their technique, Andrekson and colleagues concluded that it is the best possible approach to transmission across a broad range of data rates. If integrated into the communications systems of real space missions in the future, their approach could hasten the transition from radio to optical signals for transmissions across large distances. This could lead to operational improvements to future missions to distant parts of the solar system; the transfers of data between satellites; and the monitoring of Earth’s surface using the optical technique LIDAR.

The research is described in Light: Science & Applications.

Raman microscopy: when chemicals become images


Whether in materials, life or earth sciences, obtaining the global chemical composition of a sample is often not enough to characterize it completely. Spatial distribution and morphological information are also needed to give a full and realistic understanding of the sample studied.

Confocal Raman microscopy is the perfect technique to provide complete and deep chemical characterization of a sample. With our new LabRAM Soleil ultrafast imaging confocal microscope – the result of more than 50 years of HORIBA expertise in spectroscopy – you can quickly and easily obtain a high-definition Raman image, even for the most difficult samples.

Join this webinar, presented by Thibault Brulé, to discover how LabRAM Soleil can solve your research challenges.


Thibault Brulé is a Raman application scientist at HORIBA France, working in the Demonstration Centre at the HORIBA laboratory in Palaiseau. He is responsible for providing Raman spectroscopy applications support to key customers from various industries, as well as contributing to HORIBA’s application strategies. Prior to joining HORIBA in 2017, he conducted research on blood protein characterization based on dynamic surface-enhanced Raman spectroscopy, and then applied this technique to cell-secretion monitoring. Thibault holds an MSc from the University of Technology of Troyes, completed his PhD at the University of Burgundy and followed on with a postdoctoral fellowship at the University of Montreal.

Look Up examines the age of cosmic exploration

I’ve always been somewhat of an amateur astronomer. Despite having lived most of my life in big cities (Mumbai, London and now Bristol) with skies often obscured by light pollution or smog, I make a point of going out and looking at the night sky a couple of times a week. Earlier this year, during the most stringent period of lockdown, I found solace in the familiarity and allure of the cosmos.

How refreshing then to read Look Up: Our Story with the Stars by journalist, TV presenter and author Sarah Cruddas. “Recently, I have found myself looking up at the stars more than ever,” she writes in the introduction. “Doing so is a reminder that we are so tiny compared to the vastness of what is out there.” With a background in astrophysics, Cruddas is a leading voice in the rapidly expanding commercial space sector, and is a director at Space for Humanity, a US non-profit aimed at democratizing access to space. Look Up is her first popular-science book aimed at adults, and she describes it as “part memoir and part manifesto” – a fairly apt description of this short, sharp and impassioned look at the history and the future of space exploration.

As Cruddas writes in the first chapter, the majority of the human race have never been to space and, for at least the next few generations, that balance is unlikely to tip. Indeed, of the 100 billion humans who have ever existed, fewer than 600 have left the planet. Despite this, it is fair to say that exploring the cosmos – from landing on the Moon and imaging the solar system to sending robots to Mars and detecting the first planets beyond our own star system – is one of humanity’s most significant enterprises, and one that is only just coming into its own.

The first half of Look Up actually looks back, as Cruddas swiftly and deftly takes the reader through the history of human exploration, beginning with our own planet. While we humans may have been looking at the heavens since time immemorial, it was only in the 1500s that the first real attempts were made to circumnavigate the Earth, beginning with Vasco da Gama’s journey around the Cape of Good Hope to India, and Ferdinand Magellan’s attempt to sail from Spain to Indonesia, travelling west. As Cruddas describes the importance and the impact of this period – the Age of Exploration, which reshaped many people’s ideas of the world – she does well to highlight the cost of this kind of endeavour.

She points, for example, to the Portuguese prince often called Henry the Navigator, who commissioned many expeditions. “[This was] at a time when lots of sailors were afraid of setting out into the Atlantic Ocean, tasking them with recording as much information as they could about the coastlines they visited,” she writes. “However, Henry was also responsible for starting the Atlantic slave trade. So when we celebrate humans’ drive to explore, at the same time it is also important to reflect on the horrendous mistakes we made.” Hindsight and perspective are common themes in the book, and crucial ones too, as a reminder that difficult lessons learnt in the past should not be forgotten.

From the Age of Exploration, the book jumps to the birth of aviation, including a particularly amusing tale of the first hot-air balloon flight in Paris in 1783. The cargo featured a sheep, a duck and a rooster, though you’ll have to read the book to find out why those three animals were chosen. She then moves on to the Space Race of the Cold War era. Cruddas spends a significant chunk of the book telling the stories and histories of the US and Soviet pioneers who first forged a path for humans in space.

What I found particularly enjoyable and useful was the commentary she provides in parallel for both the US and Soviet attempts at lunar domination, including successes and failures. While I have read many books on the topic, Cruddas’ lucid writing and sharp narrative made for pleasant reading, even if I had heard most of the stories before. For those who may not be interested in reading a book solely detailing the Space Race, Look Up could be just the summary they need, as it highlights many important figures often sidelined. They include JoAnn Morgan (the first woman to become an engineer at NASA’s Kennedy Space Center), Katherine Johnson and the other African-American women who worked as “human computers” at NASA, as well as the wives and families of the celebrity astronauts.


The rest of the book focuses on a rich mix of topics, from the beginning of commercial space flight ideas that took off as early as the 1970s, to the birth of Space Age technology. Cruddas spends a whole chapter highlighting the various benefits, skills and technologies that investment in space has given us over the years – a list of firm facts to use the next time you come across a naysayer who brings up the old argument of investing only in Earth’s problems. “While a generation was dreaming of jetpacks, they never imagined Deliveroo,” she writes. “Even though we’ve been launching satellites since the late 1950s, no-one during the heyday of the space race predicted a future of cyber space – the Internet that we have become so reliant on. But the combination of the connected world we live in today and the satellites above us is what has fuelled the unexpected space age.” Cruddas points out that today, the lines are completely blurred between what constitutes a space company and a tech company – a fascinating point that hadn’t occurred to me before.

The final chapters of the book highlight the current leaders in space exploration – Elon Musk’s SpaceX and Jeff Bezos’ Blue Origin. With sights set on Mars, near-Earth asteroids and moons beyond our own, humans will most definitely become a space-faring species within the next few centuries. As we look forward, it’s important to look back too, and with Look Up, readers will get a complete (if somewhat brief) narrative of our attempts to unravel the cosmos. As Cruddas puts it: “It is only from the vantage point of space that we are truly able to understand our Earth – a perspective that has been made possible by leaving.”

  • 2020 HQ 256pp £16.99hb

Astronomers propose telescope to monitor Betelgeuse dimming

An international team of astronomers has proposed a telescope to monitor the bright star Betelgeuse to provide clues about the cause of its sudden drop in brightness. The Betelgeuse Scope concept – which is anticipated to cost about $0.4m – would use twelve off-the-shelf 10 cm-aperture telescopes secured to a radio telescope dish to provide detailed, nightly observations of the supergiant star.

Betelgeuse’s “great dimming” began late last year and changed the naked-eye appearance of the constellation Orion. As the event continues to enthral astronomers, theories have emerged to explain why Betelgeuse’s glow plummeted. A leading contender is that the star’s surface churned out an immense dust cloud that hid some of its famously ruddy light. To closely scrutinize this “mass-loss” activity, researchers will need frequent, high-resolution views of the roiling surface of the star – views that are difficult to acquire with most telescopes, but possible via interferometry, which connects several telescopes so that they act as one instrument.
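To see why an interferometer helps, note that the diffraction-limited angular resolution is set by the separation $B$ between telescopes rather than by any single small aperture. With illustrative numbers of my own – visible light at 550 nm and telescopes spread across the 6.1 m dish mentioned below –

$$\theta \approx \frac{\lambda}{B} = \frac{550\times10^{-9}\ \mathrm{m}}{6.1\ \mathrm{m}} \approx 9\times10^{-8}\ \mathrm{rad} \approx 19\ \mathrm{milliarcseconds},$$

comfortably finer than Betelgeuse’s roughly 50-milliarcsecond disc.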


The team is building a prototype of the Betelgeuse Scope that would be placed on a University of Arizona 6.1 m radio antenna, and is now seeking funding for the final telescope. “If successful, we will bring it to a larger antenna [of] 12 m or more to increase the interferometer array size,” says astronomer Narsireddy Anugu from the University of Arizona, who is leading the project. By using relatively inexpensive instruments affixed to the structure of an already-constructed radio telescope, the final Betelgeuse Scope system should be cheaper than a more complex, conventional arrangement. “It also saves money by using the pointing and tracking of the existing radio antenna,” adds Anugu. “So we don’t have to build it for all the individual amateur optical telescopes.”

A step forward

Graham Harper, an astrophysicist at the University of Colorado who has studied Betelgeuse and is not associated with the Betelgeuse Scope, says that if the proposal is successful then it would be “a major step forward” beyond existing observations. “Coordinating major observatories to look at a common object is difficult enough, even for one-off events, but for systematic monitoring of a couple of sources it is totally impractical,” he says. “The idea of a dedicated Betelgeuse Scope is a really nice one, because it attempts to address this problem.”

According to Harper, the proposed telescope would also glean information about Betelgeuse’s surface temperature. “They should be able to determine how the surface temperatures change with time across the stellar surface,” he adds. “This would help tell us exactly where the interesting phenomena are occurring, for example shock waves, outbursts and ejections.”

Accretion, not colliding spaghetti, flares up as star is devoured by black hole

An extremely bright flare, originating from a star being devoured by a supermassive black hole, has been observed in a galaxy 215 million light-years away – making this the nearest tidal disruption event (TDE) ever seen. The event was spotted by an international team of astronomers, headed by Matt Nicholl at the University of Birmingham. They caught the event well before its climax using instruments at the European Southern Observatory (ESO) in Chile. For the first time, the observations allowed astronomers to connect the characteristic brightening of these events with rapid outflows of material from stars.

If a star wanders too close to the supermassive black hole at the centre of its galaxy, it can experience tidal forces that exceed the gravitational forces holding the star together. As a result, the star will be dramatically shredded into thin streams of debris, through the process of spaghettification. During these TDEs, stellar remnants will flare, emitting large amounts of light.
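As a standard order-of-magnitude estimate (not a calculation from the paper itself), disruption occurs once the star strays inside the tidal radius

$$R_{\mathrm{t}} \approx R_\star\left(\frac{M_{\mathrm{BH}}}{M_\star}\right)^{1/3},$$

so for the Sun-like star and million-solar-mass black hole discussed below, $R_{\mathrm{t}} \approx 100\,R_\odot \approx 0.5$ au. That is well outside the hole’s Schwarzschild radius of roughly 0.02 au, which is why the shredding plays out in view rather than behind the event horizon.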

Colliding spaghetti

Currently, there are two competing theories for these surges in brightness: either they occur as material accretes onto the black hole; or they result from earlier collisions between spaghettified streams. Although astronomers are now discovering several TDEs every year, they have yet to determine which of these theories is best. The main problem is that as the stars disintegrate, their colliding streams, combined with strong inflows and outflows of gas, make for a messy debris geometry which is incredibly difficult to disentangle.

In September 2019 the ESO’s Very Large Telescope and New Technology Telescope each spotted a new surge in brightness in a spiral galaxy. Through further calculations, Nicholl’s team concluded that the flash originated from a supermassive black hole of around 1 million solar masses as it devoured a star with a similar mass to the Sun.

Over six months, the instruments recorded the event across multiple regions of the electromagnetic spectrum as its brightness grew and faded. Owing to its proximity, Nicholl and colleagues were able to identify the TDE well before its peak brightness. This allowed them to capture the whole process unfolding, well before the geometry of the debris became too convoluted to untangle.

Sudden transition

By tracking changes in the blueshift of key absorption lines in the stellar debris, Nicholl’s team showed that the origin of the TDE’s early optical emission was dominated by an outflow of bright material, with speeds reaching roughly 10,000 km/s. Then, after around 30 days, the outflow underwent a sudden transition: first cooling, and then contracting.
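The quoted speed follows from the simple non-relativistic Doppler relation (a textbook formula, not one specific to this paper):

$$\frac{\Delta\lambda}{\lambda} = \frac{v}{c} \approx \frac{10^{4}\ \mathrm{km/s}}{3\times10^{5}\ \mathrm{km/s}} \approx 3\%,$$

that is, an outflow at 10,000 km/s shifts the absorption lines blueward by about 3% of their rest wavelength.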

Overall, the size and mass of the outflow remained consistent with the high ratio between optical and X-ray emissions observed by the team, as well as in many TDEs from previous studies. This suggested that the outflow was more likely to have been powered by black hole accretion, instead of collisions between spaghettified streams of debris.

With the upcoming launch of ESO’s Extremely Large Telescope, now scheduled for first light in 2025, the team’s findings will provide researchers with key guidance as they uncover increasingly fainter and more rapidly evolving TDEs.

The research is described in a paper in Monthly Notices of the Royal Astronomical Society.

Physics in the pandemic: ‘Our physics and dosimetry facilities were vacated for COVID-19 patients’

India reported its first case of COVID-19 infection on 30 January 2020, and currently has the largest number of confirmed cases in Asia. The first case in the state of Rajasthan was in Jaipur, reported on 3 March. Since then, the SMS Medical College and Hospitals in Jaipur has served as the main COVID-19 treatment centre in Rajasthan. The hospital established a fully equipped outpatient department and isolation wards, as well as procuring a number of new X-ray machines and mobile X-ray units for diagnosis and treatment evaluation of COVID-19 patients.

As part of this transition, with the exception of the radiation delivery facility and a small intensive care unit for cancer patients, all other radiotherapy and radiological physics facilities were utilized for COVID-19 patient management. When lockdown was initiated at the end of March, there were four medical physicists in the hospital (two of whom were in vulnerable groups). Having handed over department space for COVID-19 management, they had to adapt rapidly to cope with the ensuing professional and personal challenges.

The hospital’s radiological physics department, which provides medical physics services to all departments that utilize ionizing radiation, took the lead role in managing cancer patients. As well as maintaining equipment and infrastructure, the team needed to rearrange radiation physics equipment and facilities to ensure patient services continued unhindered. Challenges included establishing protocols for radiography and radiotherapy delivery, maintaining the quality of diagnostic and treatment systems, achieving high patient throughput in the minimum time possible and managing the workload with a reduced workforce.

Priya Saini is a medical physicist in the hospital’s radiological physics department. Here is her account of maintaining a radiotherapy service in the midst of a pandemic.

Optimizing efficiency

During lockdown, I worked on cobalt teletherapy (on the Bhabhatron-II system) and brachytherapy. My work included treatment planning, radiation safety monitoring, quality assurance (QA), routine equipment calibration and treatment plan reviews, as well as teaching and research.

In early March, at the start of the pandemic, COVID-19 isolation wards were created in one or two departments in our hospital. As the number of patients increased, our cancer wards and other departments’ wards were also converted into isolation wards. Our physics and dosimetry facilities were vacated for doctors and nurses treating COVID-19 patients. At that time, only one room was left for us to conduct our routine work.

In this early phase of the COVID-19 pandemic, I had many difficulties in managing patients because of a lack of awareness of, and fear of, this virus. I spent extra time reading the available information and followed the guidelines given by the WHO, and slowly I overcame my fear.

In our department, we used to treat 100 to 120 patients each day on the Bhabhatron-II telecobalt machine. This number quickly halved, because it became harder for patients to travel to the hospital and some had already returned to their hometowns or villages. However, some patients recommended for surgery were transferred to radiation therapy, so workload in our department slowly increased.

For certain cases, including some patients with early-stage cancer, we delivered radiation therapy over a shorter period of time. The main reason for such hypofractionated treatments was to minimize patients’ exposure to the virus and their risk of infection, without reducing the effectiveness of the treatments. Our aim was to establish a better way to treat all patients who could benefit from radiotherapy, and not to delay the start of treatment for any patient whose deferral might worsen the prognosis of their disease.
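For background – this is my illustration using the standard linear-quadratic model, not the department’s published protocol – hypofractionation aims to keep the biologically effective dose (BED) roughly constant while reducing the number of hospital visits:

$$\mathrm{BED} = nd\left(1 + \frac{d}{\alpha/\beta}\right),$$

where $n$ is the number of fractions and $d$ the dose per fraction. With an illustrative $\alpha/\beta = 10$ Gy, a conventional $25 \times 2$ Gy course gives a BED of 60 Gy, which is approximately matched by $15 \times 3$ Gy (BED $\approx 58.5$ Gy) – ten fewer trips to the hospital for the patient.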

After lockdown, as it became essential to provide regular medical services to the general public, our hospital resumed normal activity by shifting the COVID-19 patients to the university hospital, and radiation treatments continued normally again. Every day, healthcare workers screened patients with thermal scanning before registering them for treatment and providing them with hand sanitizer.

To help avoid cross contamination, I made a separate box in the manual treatment planning room for each patient to store their treatment documents. I also instructed the security guard to send only one patient at a time into the room, with every patient advised to maintain physical distancing of one metre minimum. Before starting treatment, every patient was verified to be COVID-19 negative.

At the brachytherapy treatment console, three people were present at the time of treatment: one technologist, one resident doctor and myself. We maintained physical distancing from each other. We did not allow patients or their companions to enter the treatment planning room or console and instead interacted with them while maintaining a one metre distance outside the minor operating theatre.

Additional challenges

Personal protective equipment (PPE) has become an emotive subject during the current COVID-19 epidemic, as an important component, though only one part, of a system to protect staff and patients from cross-infection. During the lockdown period, however, the temperature in Jaipur reached up to 46°C. As per guidelines at that time, we were not using centralized air conditioning, to prevent the spread of virus via air circulation. So it was very difficult, as well as unhealthy, for us to wear PPE kit while treating the patients.

In cases of intraluminal brachytherapy, there was direct interaction with the patient while taking the measurements for planning. During such instances I wore a PPE kit. After treatment completion, the PPE kits were discarded properly and I washed my hands and face with soap.

To increase awareness of COVID-19 among patients and others, we posted notices in the treatment planning room, the calculation room and on the doors of other rooms, with messages in local languages: “no entry without a mask”, “maintain social distancing” and “do not enter without permission”. We asked all ongoing patients about symptoms and other health-related issues; if found symptomatic, patients were sent for COVID-19 tests.

My other roles included QA of the machines in the department and teaching. We performed mechanical QA of both treatment machines weekly and performed dosimetry QA monthly. Before starting, we would sanitize the treatment room and control console to prevent contamination and spread of the virus.

During this period, a few resident doctors in my department became COVID-19 positive. I had interacted with one of them two days earlier, which made me scared. I did self-assessments for 3–4 days and also consulted with a general physician. Following that, two technologists posted on the Bhabhatron-II machine tested positive for COVID-19. Following institutional protocol, they were home quarantined for 14 days. After quarantine, they tested negative and re-joined the hospital.

As per local government instructions, undergraduate paramedical students were taught via online Zoom classes during lockdown. There are five postgraduate students and, after lockdown, they re-joined regular classes. I conducted many classes for students of radiotherapy technology and radiology technology, maintaining proper distance in a department seminar room. I also instructed them regarding hospital protocols, the necessity of proper hygiene, and not to sit together without masks and maintaining social distancing.

Recently, one of the students developed severe COVID-like symptoms. He was asked to take a COVID-19 test and not to return until he had a negative result, while the other four students were asked to self-quarantine. After 24 hours, he tested negative, and he rested for a few days to recover from weakness.

The COVID-19 pandemic has affected everyone globally, and our department of radiological physics is no exception. We are striving to keep ourselves and our near and dear safe, while continuing to provide medical physics services to all radiological facilities of our institute, without compromising international standards.

Software tool for MRI predicts motor development disorders in preterm babies

A new software quantification tool has been developed by researchers in the US for analyzing white-matter abnormalities in very preterm babies. This magnetic resonance imaging (MRI) biomarker can predict motor development risks, including cerebral palsy, in a much more objective way than existing diagnostic tools.

Today, most children born extremely prematurely (earlier than 30 weeks of gestation) survive. However, nearly 70% of these very preterm infants show abnormalities in the white matter of their brains. Furthermore, half of very prematurely born children develop minor motor abnormalities, and up to 10% develop cerebral palsy (CP).

CP is a group of permanent neurological disorders that affect movement, posture and balance. It is the most common physical disability in children. Even though the last three decades have brought improved survival and care for extremely preterm babies, they are about 50 times more likely to develop CP than full-term babies. Improved care for these children has not translated into fewer disabilities: most children with CP are diagnosed at 1–2 years of age, and conventional neuroimaging is subjective and not sensitive enough on its own to accurately predict CP.

Early intervention

Yet, identifying biomarkers of CP at an early stage of life could significantly increase the quality of life for both affected patients and their families. Early intervention by physical and occupational therapy could take advantage of early neuroplasticity and mitigate symptoms such as tremors, stiff muscles, or difficulty performing precise movements.

To address these issues, a group of researchers led by Nehal Parikh, a neonatologist at the Perinatal Institute, Cincinnati Children’s Hospital Medical Center, has been developing better diagnostic tools using advanced imaging risk factors to improve outcome prediction. They have developed an algorithm that can analyze structural MRI scans of preterm infants to predict their risk of motor development abnormalities based on spotting and quantifying brain defects.

Fully automated detection

Parikh’s team targeted diffuse white matter abnormality (DWMA), the most common finding in premature infants. It appears on a brain MRI scan as areas with increased signal intensity. Early evidence suggests these abnormalities are associated with inflammation-initiating illnesses and may represent areas of increased fluid in the brain’s wiring.

Current tools to assess DWMA rely on visual analysis of the MRI data. “Manual tracing of DWMA regions within the brain is highly subjective and produces poor reliability and reproducibility,” explains Parikh. “It is clear that we need a better tool for diagnostics. I started working with computer scientists to come up with an algorithm that can segment these lesions.”

In the study, the team enrolled very prematurely born infants and performed brain structural MRI scans at the term-equivalent age. These images were fed into an algorithm created using a probabilistic brain atlas constructed specifically for a very preterm infant brain.

“Since it was not possible to create a gold standard manual brain atlas for DWMA, Lili He, a computer scientist in my lab, decided to use a synthetic one to train the algorithm. I drew out the respective brain regions manually,” explains Parikh. The resulting algorithm could detect signal intensity and location of normal white matter, gray matter, and cerebrospinal fluid and isolate DWMA regions based on their different signal intensity and spatial information with an accuracy greater than 95% when compared with the ground truth.
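To make the principle concrete, here is a deliberately minimal Python sketch of atlas-guided intensity segmentation – combining a tissue-probability atlas with per-voxel signal statistics to flag abnormally bright white matter. It is not the team’s published algorithm; every input, name and threshold here is hypothetical.

```python
# Minimal, hypothetical sketch of atlas-guided DWMA-style segmentation.
# Idea: use a white-matter probability atlas to find likely white matter,
# then flag voxels whose signal is abnormally bright for that tissue class.
import numpy as np

def segment_bright_wm(image, wm_prob_atlas, wm_prob_min=0.8, z_thresh=2.0):
    """Return a boolean mask of abnormally bright white-matter voxels.

    image         : 3D array of MRI signal intensities (hypothetical input)
    wm_prob_atlas : 3D array of white-matter probabilities, already
                    registered to the image (hypothetical input)
    """
    wm_mask = wm_prob_atlas >= wm_prob_min   # voxels likely to be white matter
    mu = image[wm_mask].mean()               # normal white-matter statistics
    sigma = image[wm_mask].std()
    z = (image - mu) / sigma                 # per-voxel z-score
    return wm_mask & (z > z_thresh)          # bright outliers = candidates

# Toy usage on synthetic data
rng = np.random.default_rng(0)
img = rng.normal(100.0, 10.0, size=(64, 64, 32))
atlas = np.full((64, 64, 32), 0.9)
img[30:34, 30:34, 15:17] += 50.0             # inject a "bright" region
print("candidate voxels:", int(segment_bright_wm(img, atlas).sum()))
```

In the real study, the atlas was built specifically for very preterm infant brains and the algorithm also exploited spatial information, which a bare z-score threshold like this one ignores.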

Towards clinical translation

Critics might say that the algorithm represents a computer simulation trained on a synthetic brain model that is not relevant to a real patient. However, it can already predict clinical outcomes relevant to patients and their families. Indeed, the team correlated their results with cognitive and language scores of their cohort at the age of two, and with motor scores at three years of age. “The results correlated very nicely,” says Parikh. “It showed that DWMA volume was predictive of standardized developmental scores up to three years of age, independent of other conventional MRI and other known predictors.”

The team is now working on implementing a deep learning approach to improve the DWMA region identification. In addition, Parikh is planning to work with leading MRI vendors to implement the algorithm into their consoles to allow for the earliest diagnostics possible.

“Ideally, when the MRI scan is complete, our algorithm would do its job within seconds and provide a value for diffuse white matter abnormality,” he says. “If this value was abnormal, the child would be referred for aggressive early intervention therapies and/or research trials of novel interventions.”

Manon Benders, a neonatologist at the Wilhelmina Children’s Hospital, UMC Utrecht, foresees a big future for this research as a decision-support tool. “It is very important since obtaining the most accurate prognosis as early as possible is essential in order to inform the child’s family adequately.”

“Moreover, beginning the intervention therapy even before the onset of clinical symptoms of cerebral palsy or adverse motor outcomes is also crucial, particularly given that the brain’s plasticity is highest in the first few months after birth,” she adds.

The research is described in Scientific Reports.


Celebrating Emmy Noether, Sameera Moussa, Caroline Bleeker, Toshiko Yuasa and other inspiring women in science


Tuesday was Ada Lovelace Day, which celebrates achievements of women in science, technology, engineering and maths (STEM). Named after the 19th-century polymath Ada Lovelace, the annual initiative also seeks to engage with the challenges of attracting more women into STEM careers and supporting career development.

Nature’s On Your Wavelength blog celebrated with a piece by Ankita Anirban about five inspiring female physicists. Perhaps the most intriguing of the biographies is that of the Egyptian nuclear physicist Sameera Moussa, who some believe was murdered in 1952 to prevent Egypt from developing nuclear weapons.

There are also tales of wartime daring. The Dutch physicist and entrepreneur Caroline Bleeker, for example, hid Jewish people from Nazi occupiers in her factory; when it was raided in 1944, she managed to usher them to safety. After the war, her factory produced the world’s first complete phase-contrast microscopes.

Meanwhile in wartime Berlin, the Japanese physicist Toshiko Yuasa developed a double-focussing beta spectrometer that she carried on her back through Siberia to Japan when she was expelled from Germany by the Soviet army.

Emmy Noether: The most important mathematician you’ve never heard of is a new book from the Canadian children’s author Helaine Becker. Illustrated by Kari Rust, the book was commissioned by Lisa Lyons Johnston, president and publisher of Kids Can Press. Lyons Johnston became aware of the remarkable life and work of Noether when she joined the Emmy Noether Council at the Perimeter Institute for Theoretical Physics.

“Boys and girls, men and women, need to see more women in physics and science,” says Lyons Johnston. “It’s so frustrating that people haven’t heard of her. I wanted people to know more about Emmy, who is a remarkable person in her own right, and also an inspiration for girls.”

You can read more about the book here.
