
Strategic wind farm siting could help seabirds

Back in 2016 the US installed its first offshore wind farm: Block Island Wind Farm, off the coast of Rhode Island. Many more projects are underway and the US government plans to produce 86 GW – enough energy to power around 60 million homes – from offshore wind by 2050. But how will these new wind farms affect US seabird populations? A new study shows careful placement can help minimize the exposure of foraging birds that may be vulnerable to widespread development.

Offshore wind farms are already commonplace in Europe, particularly in the North Sea, where there are over 80, producing more than 18 GW. Research from Europe has shown that offshore wind farms can affect seabirds, both directly – when birds collide with turbines – and indirectly, when birds lose habitat as a result of their tendency to avoid wind farm areas. When multiple offshore wind farms are installed, there are concerns that there may be cumulative effects on seabirds.

Wing Goodale from the University of Massachusetts at Amherst and Biodiversity Research Institute, US, and colleagues wondered if the cumulative impact on seabirds could be reduced by careful design of the spatial arrangement of offshore wind farms. To investigate, they modelled the exposure of seven foraging seabird guilds – groups of birds that rely on the same set of resources – under three different wind farm siting scenarios along the US east coast.

Their findings show that cumulative effects are more likely for coastal birds than pelagic birds, which spend most of their time on the ocean and away from land.

“Coastal bottom gleaners, such as sea ducks, and coastal divers, such as loons, are vulnerable to displacement, whilst coastal surface gleaners such as gulls are vulnerable to collision,” says Goodale. Pelagic birds like puffins, razorbills, fulmars and shearwaters are less vulnerable because their habitat is less disturbed.

The results, which are published in Environmental Research Letters (ERL), show that some birds will be exposed wherever the wind farms are placed. Gulls will be exposed the most in near-shore and shallow water locations, whilst pelagic birds will be exposed the most in high wind areas, which tend to be further out to sea.

Proposed wind energy areas already avoid known bird hotspots such as the Nantucket Shoals and aim to disperse developments along most of the east coast, from South Carolina to Massachusetts. The new findings suggest that the detailed siting of the wind farms is important too.

“We recommend that when two or more wind farms are sited in one wind energy area they should, to the extent practicable, be separated as much as possible, to provide movement corridors for species vulnerable to displacement,” says Goodale.

Spreading out developments even further may help to minimize adverse impacts on seabirds. Goodale and colleagues suggest that future siting should focus on areas with no current leases, such as the Gulf of Maine.

Transparent peer review trialled by IOP Publishing

IOP Publishing has become the first physics publisher to introduce “transparent” peer review – in which the referee reports of an accepted paper are made available to read alongside the published research. The publisher, which also produces Physics World, will carry out a year-long trial of transparent peer review on three of its journals – JPhys Materials, Environmental Research Letters and Journal of Neural Engineering. The outcome of the pilot – and whether it will be extended to other journals – is expected to be announced next year.

Even publishing anonymous peer reviews can greatly increase the transparency of the process

Simon Harris

During the trial period, which began yesterday, authors who submit their papers to the three journals – two of which are fully open access – will be asked if they want the reviewer reports to be published alongside the final article. If so, then the peer reviewers will also be asked if they wish their reports to be published and whether they want to waive their anonymity. For transparent peer review to go ahead, both authors and reviewers must agree to it, but peer reviewers can maintain their anonymity if they wish.

Opening the black box

As well as measuring how many authors opt for transparent peer review, IOP Publishing will also monitor whether this deters scientists from reviewing manuscripts and what effect, if any, it has on the quality of peer-review reports. A trial run between 2015 and 2018 on five Elsevier journals revealed that transparent peer review did not significantly affect referees’ willingness to review or how quickly the peer review was carried out. Although the trial also found no difference in whether a paper would be accepted or rejected, only 10% of peer reviewers agreed to lift their anonymity (Nature Communications 10 322).

However, even if reviewers prefer to remain anonymous, it is hoped that publishing the peer-review reports will be another step towards openness in research. “Even publishing anonymous peer reviews can greatly increase the transparency of the process, which was previously a black box,” says Simon Harris, a managing editor at IOP Publishing. “Such transparency increases accountability and may increase quality and also reduce bias in peer review.”

New optical timekeeper is 10 times more reliable than caesium atomic clocks

A new timekeeper based on trapped strontium atoms accumulates an error of just 48 ps over 34 days of operation – making it 10 times more reliable than current caesium time standards. This new record for performance has been set by physicists in the US and Germany, who used a silicon cavity and a diode laser to back up the time signal generated by the atoms.

The current international time standard is Coordinated Universal Time (UTC), which combines the signals of hundreds of caesium atomic clocks worldwide. These clocks operate in the microwave region of the electromagnetic spectrum (at 9 GHz) and this relatively low frequency means that they are less accurate than atomic clocks that operate at optical frequencies of around 500 THz.

Optical clocks have not been more widely adopted because they tend to have “dead times” when a time signal cannot be obtained. This dead time degrades the long-term stability of the clock signal, making it impractical for use as a time standard. This can be overcome by converting the optical signal to a microwave signal, which is used to set the frequency of a hydrogen maser. The maser then delivers a steady time signal during clock dead times. This is not ideal, however, because the maser’s microwave signal is inherently less accurate than the original optical signal.

Stabilized laser

A better solution would be to use an optical laser instead of a maser. But until now, the challenge has been to build a laser that remains stable for long enough to stand in for the optical clock. In 2012 researchers at NIST and the University of Colorado – both in Boulder – and the PTB standards lab in Braunschweig stabilized the frequency of a diode laser for about 100 s by locking it to a silicon optical cavity cooled to a chilly 120 K.

This was not quite good enough, but now the team has made several improvements to the system that extend the laser’s stability. These improvements include using super-polished optical lenses, active control of the laser power and better thermal control of the system.

The team used the stable laser in conjunction with an optical clock comprising a 1D lattice of trapped strontium-87 atoms. They created a timekeeper with an accumulated error of just 48 ps in 34 days, which is about ten times better than the best caesium clocks used at metrology centres worldwide. What is more, the team is not finished improving the system and could achieve another factor-of-ten improvement.
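
As a quick sanity check on these figures, a few lines of Python convert the accumulated error into a fractional timekeeping error (the ~1e-16 caesium figure in the comments is our assumed comparison point, not a number from the paper):

```python
# Back-of-envelope check: convert the reported 48 ps accumulated error
# over 34 days of operation into a fractional timekeeping error.
accumulated_error = 48e-12        # seconds
elapsed = 34 * 24 * 3600          # 34 days in seconds

fractional_error = accumulated_error / elapsed
print(f"Fractional error: {fractional_error:.1e}")   # ~1.6e-17
# Assuming a caesium time scale at the ~1e-16 level, the same interval
# would accumulate roughly ten times as much error.
```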

Redefinition of time

“I think this new time scale demonstration will be very important for the redefinition of time in the future,” said project leader Jun Ye, who is at NIST and the University of Colorado.

The new timekeeper could have a wide range of other applications including improving the timing in radio-telescope arrays, which currently use hydrogen masers, and boosting the accuracy of global positioning systems. An array of timekeepers at different locations could even be used to detect the tiny gravitational effects of dark matter passing through the Earth.

The research is described in Physical Review Letters.

ASTRO showcase: Siemens Healthineers presents a new era of CT simulation 

In this short video, which was filmed at ASTRO 2019, Cecile Mohr highlights how Siemens Healthineers is working to expand precision medicine for more personalized treatment, transform patient care through more streamlined workflows, and leverage artificial intelligence to get more out of CT simulations. 

Optical imaging provides quality assurance for small radiotherapy beams

Optical scintillation imaging is proving feasible as a quality assurance (QA) tool for small static beams and for pre-treatment verification of radiosurgery and volumetric-modulated arc therapy (VMAT) plans. Researchers at Dartmouth College have shown that the technique can perform QA for each control point within five dynamic VMAT plans. Moreover, the approach is sensitive to small errors in gantry angle and multileaf collimator (MLC) leaf position (Med. Phys. 10.1002/mp.13797).

Recently, Brian Pogue and colleagues at the Thayer School of Engineering successfully used Cherenkov imaging as a QA tool for broad beams from an MR-linac. They demonstrated good agreement between optical images acquired from an intensified CCD camera and simulated dose distributions from the treatment planning system (TPS), but were not able to image small beamlets reliably due to a low signal-to-noise ratio.

“While the detection principle is the same as Cherenkov imaging, in scintillation imaging, an organic phosphor is used to boost the signal, allowing the recording of the smallest, dynamic beams with high fidelity,” explains co-author Petra Bruza.

In their latest study, the researchers used a unique water-based scintillator, which minimizes optical blurring and provides high optical signal while being fully tissue (water) equivalent. They acquired data with a blue-sensitive intensified CMOS camera, which captures optical photons generated due to ionizing radiation incident on a cylindrical water phantom containing quinine sulphate.

“The camera contains an optical intensifier that is remotely synchronized with the X-ray beam,” Bruza tells Physics World. “Optical amplification greatly improves the signal from small beamlets, while the remote trigger suppresses background and also provides additional information about the beam stability.”

The team compared projected percentage depth dose and cross beam profiles for static 6 MV beams of various widths with data from the TPS. The average differences between the TPS and optical depth dose profiles were 0.52%, 1.25% and 1.53%, respectively, for 50, 10 and 5 mm beam widths. All cross beam profiles exhibited a 3%/1 mm gamma passing rate of greater than 99%.
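
For readers unfamiliar with the gamma metric quoted here, the sketch below shows a minimal 1D global gamma analysis in Python. It is an illustration of the criterion, not the Dartmouth group’s code: clinical tools work on 2D or 3D dose grids and interpolate between points, but the pass/fail test is the same.

```python
import numpy as np

def gamma_pass_rate_1d(ref_dose, eval_dose, positions, dose_tol=0.03, dta_mm=1.0):
    """Minimal global 1D gamma analysis, e.g. 3%/1 mm criteria."""
    norm = dose_tol * ref_dose.max()              # dose tolerance, normalized to global max
    passed = []
    for x, d in zip(positions, eval_dose):
        dist = (positions - x) / dta_mm           # distance-to-agreement term
        diff = (ref_dose - d) / norm              # dose-difference term
        gamma = np.sqrt(dist**2 + diff**2).min()  # gamma is the minimum over reference points
        passed.append(gamma <= 1.0)               # a point passes if gamma <= 1
    return 100 * np.mean(passed)

# Example: a measured profile shifted by 0.2 mm relative to the reference
x = np.arange(-25.0, 25.0, 0.1)          # positions in mm
ref = np.exp(-x**2 / 50.0)               # reference dose profile
meas = np.exp(-(x - 0.2)**2 / 50.0)      # slightly shifted "measurement"
print(f"{gamma_pass_rate_1d(ref, meas, x):.1f}% of points pass 3%/1 mm")
```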

Next, the researchers delivered five clinically approved VMAT plans to the water phantom, including a prostate plan, two head-and-neck plans, a stereotactic body radiation therapy plan and a multi-lesion stereotactic radiosurgery (SRS) plan. Representing a range of different VMAT plan types with varying degrees of modulation and field sizes, these had dose rates of 120 to 2400 MU/min.

All five plans passed the 3%/3 mm gamma criteria. The technique achieved good agreement between the TPS-simulated dose and the optical images, with an average gamma pass rate of 97.8%. The technique was sensitive to MLC errors down to 1 mm and gantry angle errors of 1°, and achieved a spatial resolution of 1 mm.

“The technique was found to be more sensitive than the diode array for common delivery errors,” says co-author Muhammad Ramish Ashraf. “Moreover, it was found that the technique was particularly sensitive to overmodulated plans.”

“A key advantage of optical scintillation imaging is that it records dose from the target treatment volume inside the phantom, where the quality assurance is most important,” explains Bruza. “It does it with true submillimetre lateral resolution. Some of the most advanced other approaches record dose either from the ‘skin’ of the phantom, or from a single plane inside the phantom, both with much coarser resolution of 5 to 10 mm.”

He notes that optical scintillation imaging also has advantages for MRI-linac systems: because it places all of the electronics outside the phantom, it further minimizes interference between the dosimeter and the MRI.

The team’s next project is to improve detection of the X-ray cross-beam profile to enable seamless 3D imaging and to extend the technique to proton therapy QA. “Our research is focused on further refinement of our 4D (3D plus time) dose reconstruction project and translating it into clinics,” says Bruza. “By using additional information about the X-ray cross-beam section, we can reconstruct 3D dose with submillimetre resolution, at a frame rate of at least 20 frames per second.”

Cumulating these volumes in time provides the final 3D dose distribution. “With that, we can check how the dose truly matches the tumour shape, and how it avoids organs-at-risk, in the form of a dose–volume histogram,” says Bruza. “The dose–volume histogram is the best way to represent the quality of dose delivery, because it relates the dose distribution to the actual clinical structures.”

How big ideas can change the world: the revolutionary potential of LiFi

“What if every light bulb in the world could also transmit data?”

That was the question posed by Harald Haas, a professor from the University of Edinburgh, in a 2011 TED talk. It was a simple yet profound notion, especially if, like me, you can’t get a decent WiFi signal in the kitchen because your router’s in the study and you can’t stream that crucial cooking video when trying to make dinner. If I could send and receive data at high speed using the light bulbs right above my head, I might not have to rely on WiFi radio signals at all.

That’s the principle behind LightFidelity (LiFi), which emerged from an Edinburgh research project known as D-Light that ran from 2009 to 2011. Haas co-founded a spin-off firm called pureLiFi to commercialize the technology, and I recently met him and the firm’s chief executive Alistair Banham at the Institute of Physics (IOP) in London. They’d come to launch the IOP’s Accelerator Centre, which offers space for small start-ups in the Institute’s new headquarters. In fact, the rooms in the centre are equipped with LiFi technology developed by the firm, which won an IOP Business Innovation Award in 2017.

At the IOP event, Haas described his journey of building a technology company in the UK. PureLiFi began with a few researchers in a lab and now has more than 130 LiFi deployments in 24 nations. Haas spoke about the advantages of LiFi and his vision for pureLiFi’s revolutionary light-communications technology transforming global communication. He showed, for example, a pureLiFi transceiver just a few millimetres across that could be integrated into a mobile phone. Haas said it could securely download data at 1 gigabit per second (Gbps) – almost 10 times faster than with WiFi. Upload speeds would be 0.4 Gbps.

Bright prospects

Researchers at the University of Oxford have shown that LiFi can work in lab demonstrations at speeds of 224 Gbps, illustrating the huge progress since Haas introduced LiFi in his 2011 TED talk. In that presentation, Haas demonstrated the benefits of LiFi with an LED light bulb housed in a desk lamp sitting on a small, filing-cabinet-sized receiver unit. He showed how the bulb could stream a high-definition movie to the receiver, with the results displayed on the screen. To gasps and applause, Haas paused the video by blocking the light with his hand, before stopping the transmission altogether by angling the lamp away from the receiver.

Now most people who haven’t heard of LiFi will probably say “faster WiFi” if you ask them what they want when it comes to better wireless communication. But innovators like Haas have a vision that others don’t share. As automobile innovator Henry Ford is once alleged to have said: “If I had asked people what they wanted, they would have said faster horses.” Non-specialists, in other words, find it easy to describe their problems, but aren’t always the best at coming up with a solution.

In his 2011 TED talk, Haas eloquently outlined the four fundamental challenges that LiFi addresses. First, it’s efficient. Second, LED lights are widely available. Third, LiFi offers high capacity (the visible spectrum is 10,000 times wider than the radio spectrum). And finally, it’s secure: light doesn’t travel through walls so you can deliver the data only where you want it.

The security aspect of LiFi is a great example of how a good entrepreneur can turn a potential bug into a feature

James McKenzie

The security aspect of LiFi is a great example of how a good entrepreneur can turn a potential bug into a feature. At first sight, the limited transmission range might seem a problem. On closer inspection, however, light’s inability to pass through walls is LiFi’s biggest benefit because it allows data to be precisely and securely delivered. Indeed, I think LiFi’s inherent security and high capacity will be what drives the market to adopt the technology. Its high efficiency, in contrast, is only a “nice-to-have”, while the wide availability of light sockets may prove only a long-term benefit given that they’ll all have to be upgraded to get data to the sockets.

LiFi will be particularly attractive for high-security applications in defence and industry. But I also think it’ll be a boon in high-density residential premises, where WiFi routers currently battle it out over the increasingly congested WiFi spectrum to deliver bandwidth to residents who also want secure transmissions. LiFi benefits too from a low “latency” – the time it takes to request data from a server and then get the information back. LiFi is therefore a great prospect for virtual-reality applications, where overly high latency can give people motion sickness.

Market forces

For LiFi to become a mass-market technology, however, transmission equipment made by different manufacturers will have to talk to receivers in phones, tablets and laptops. Developing standardized technology and interoperable systems can be tortuous, but there are international attempts to do just that by 2021, with pureLiFi closely involved. Indeed, other firms are already getting in on the LiFi act.

Signify (formerly Philips Lighting) has recently developed one of the first commercial LiFi systems. Users will need a USB-access key, plugged into a laptop, to receive the LiFi signal. Once connected, Signify’s Trulifi systems can provide wireless connectivity at up to 150 Mbps. One benefit unique to Trulifi is that it uses optical wireless transceivers built, or retrofitted, into Philips lights, meaning users won’t necessarily have to replace their existing lighting infrastructure to give LiFi a go.

As for pureLiFi, it has decided to focus on the optical front end (OFE) rather than a whole system, with its miniature 1 Gbps transceiver already being offered to system integrators, including mobile-phone manufacturers. These firms have huge sales, but with volumes falling for the first time in decades, perhaps a LiFi-enabled phone could be the next big thing for the mobile-phone industry keen to boost handset demand. Indeed, if you thought LiFi was just a replacement for WiFi, you may have missed the point entirely.

3D printing technique improves vertical conductivity of 2D materials

Extrusion-based 3D printing can vertically align 2D nanosheets to improve heat and electricity transport perpendicular to the sheet plane, report researchers at the University of Maryland. Among many possible uses, the researchers suggest that the technique could accelerate the cooling of CPU chips by transporting heat vertically away from the source.

In their recent ACS Nano paper, the researchers, led by Liangbing Hu, describe a method for obtaining vertically oriented 2D materials, using boron nitride (BN) nanosheets as a proof of concept. The lack of vertical alignment has been a significant challenge and a limiting factor in the application of 2D materials.

The new methodology

One of the problems with solution-based self-assembly processes is that they are highly inefficient at generating ordered structures in the vertical orientation. As an alternative, Hu and co-workers suggest a modification to conventional extrusion-based 3D printing methods.

Although extrusion-based 3D printing can print in both horizontal and vertical directions and works with many types of materials, it requires the use of ink with the right rheological properties, as well as a means of ensuring the printed BN rods are self-supporting.

To obtain vertical structures of BN nanosheets that are stable after drying, the researchers disperse them in a water-based ink binder. Extruding an ink with 50% BN content from a nozzle 600 μm in diameter gives a self-standing rod that does not collapse. The new methodology allows them to print in air without any further manipulation, and speeds up the printing process compared with the layer-by-layer deposition sequence that is commonly used.

The researchers 3D-print vertically aligned BN rods 3–10 mm in height and demonstrate how the adjustable printing conditions (e.g. nozzle diameter, printing pressure) can control the size of the rods and the array. In addition, they embed the arrays into a PDMS matrix and demonstrate an improved thermal conductivity compared with PDMS alone.

The researchers suggest that the method can be extended to other 2D materials to obtain 3D-printed structures with higher levels of complexity for different applications, such as batteries in which electrical conductivity is extended into a new dimension.

Sonic shock waves could help desalinate water

Shock waves fired repeatedly into water samples can remove dissolved salts, according to A Sivakumar and Martin Britto Dhas of the Sacred Heart College in Tirupattur, India. The researchers say that the effect involves a cavitation-based nucleation mechanism that could be useful for the pretreatment of water at desalination plants. However, not everyone is convinced by their findings.

When supersonic shock waves pass through liquids, they can trigger cavitation and create tiny bubbles that then collapse and trigger liquid-to-solid phase transformations.

“The phase transformation rate is associated with physicochemical parameters such as pressure, temperature, ionic strength, velocity of waves, degree of supersaturation, non-equilibrium thermodynamics of the medium and the kinetics of the nucleation phase,” write Sivakumar and Dhas in the Journal of Applied Crystallography.

More work needed

“In spite of these important facts,” they continue, “only a very small amount of work has been so far directed to understand the impact of shockwaves on various conditions such as pH, temperature or saturation index — which give rise to nucleation and growth kinetics”.

To investigate the latter, the duo passed shock waves through two water samples of different salinities – one a groundwater sample and the other seawater. The shock waves were generated by an air-compressor device that builds up pressure on one side of a diaphragm. When the diaphragm ruptures, shock waves are produced on the other side. Two microphones are used to measure each shock wave’s acoustic profile and speed. By varying the strength of the diaphragm, Sivakumar and Dhas say, the device can create shock waves ranging in speed from Mach 1 to Mach 5.
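
The Mach number follows directly from the microphone measurements: the shock speed is the microphone separation divided by the arrival-time difference, expressed in units of the ambient sound speed. A short illustrative calculation (the separation and delay below are assumed values, not figures from the paper):

```python
# Illustrative shock-speed estimate from two-microphone timing.
# Both input values are assumptions for the sake of the example.
mic_separation = 0.50        # m, assumed spacing between the two microphones
arrival_delay = 0.65e-3      # s, assumed delay between the two signals
speed_of_sound = 343.0       # m/s in air at ~20 degrees C

shock_speed = mic_separation / arrival_delay
mach_number = shock_speed / speed_of_sound
print(f"Shock speed: {shock_speed:.0f} m/s -> Mach {mach_number:.1f}")
# ~770 m/s, i.e. about Mach 2.2, the speed the authors report as
# giving the maximum precipitation.
```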

The researchers subjected samples of both water types to shock waves of varying speeds, repeating the shock waves every five seconds. They report that, as the shocks were applied, the water samples grew turbid, with precipitates of salt subsequently settling at the bottom of each sample container.

Non-equilibrium state

“The main concept of acoustic cavitation is that the momentum of the applied shock wave produces vibrating micro-sized bubbles in the cavitation phase and such bubbles undergo collapse in a very short time,” the researchers write. This establishes a non-equilibrium thermodynamic state, they add, with bubble collapse forcing dissolved elements into a solid phase, at which point “the nucleation process separates these previously dissolved solid particles from the liquid medium”.

“A Mach number of 2.2 gives the maximum value [of precipitation],” they report, adding that above this, “crystallized salts are redissolved owing to the high dynamic pressure and temperature, hence the net amount of crystallization is reduced.”

At Mach 2.2, total precipitation volumes increase linearly with the number of shocks, up to a maximum of 200 shocks, after which no further salts are deposited.

Crystalline deposits

Collecting the precipitated salts by running the post-shockwave samples through filter paper, the team analysed the deposits using powder X-ray diffraction, energy-dispersive X-ray spectroscopy and scanning electron microscopy. They report that the precipitate is crystalline in nature, comprising spherical particles that are larger when derived from groundwater. The particles contain various elements – such as chlorine, potassium and magnesium — that are consistent with the dissolved contents of the original water.

“Although this process could not remove the dissolved salts completely, shockwaves could be used as an alternative tool for the pretreatment of sea water in desalination plants,” the team conclude, noting that the method would require no additional chemicals and would not call for any water rejection as part of the process. Their paper provides no indication, however, of how the approach might scale up for commercial applications, and even the volume of water used in the current study is left unclear.

“There has been growing interest in new methods of crystallization in recent years,” says Robert Compton of the University of Tennessee, Knoxville. He points out that these methods include using lasers to create shock waves in liquids. “The paper by Sivakumar and Dhas presents another method of sonication using simple salts as examples of its utility,” he adds. “We await the application of this method to the production of difficult-to-produce crystals such as amino acid crystals and other biologically relevant materials.”

Questions remain

However, another expert, who wished to remain anonymous, is skeptical and questions both the study’s lack of a control sample of distilled or deionized water and the robustness of the described analysis of the precipitates.

“Their claim is thermodynamically unbelievable – you can’t precipitate below saturation, the nucleation sites would redissolve as soon as the collapsed bubble is gone. I suspect that they are reacting some organics in the cavitation event and precipitating salts from the resulting oxidation of organics.”

Tim Mason of Coventry University compared the work to the field of sonocrystallization, a cavitation-based technique that uses ultrasound and was discovered in 1927. “Yet,” he adds, “this paper on shock wave precipitation makes no mention whatever of sonocrystallization.”

With their initial study complete, the researchers are now conducting “an investigation of the physical, chemical and biological parameters of the shock wave-treated water”.

Automated algorithm determines patient’s response to Ra-223 therapy

An automated bone scan algorithm has shown its value in determining the response to radium-223 (Ra-223) treatment for patients with metastatic castration-resistant prostate cancer (mCRPC), according to a study published online October 4 (J. Nucl. Med. 10.2967/jnumed.119.231100).

Swedish researchers compared baseline and post-treatment values from the automated Bone Scan Index (aBSI) software with the tumour response biomarkers alkaline phosphatase and prostate-specific antigen (PSA) to see how the trio fared before and after Ra-223 treatment. aBSI proved quite helpful in determining how well patients responded to Ra-223 therapy and could also indicate their prospects for overall survival.

“In daily clinical practice, there is no validated imaging biomarker for assessment of treatment response,” wrote the authors, led by Aseem Anand from Skåne University Hospital and Lund University. “aBSI has been demonstrated to be a prognostic biomarker in several studies, and, as a fully quantitative assessment of bone, aBSI may have the potential to be a radiographic biomarker to assess efficacious response in metastatic lesions of the bone scan images.”

Prostate metastases

Bone metastases are frequent occurrences in patients with prostate cancer, with estimates that as many as 80% of patients with mCRPC will develop skeletal metastases in the spine, skull, ribs and extremities. The primary modality for detecting and assessing bone metastases is whole-body bone scintigraphy, which tracks increased uptake of technetium-99m (Tc-99m).

In 2012, Ulmert et al developed the automated Bone Scan Index algorithm to quantify the degree of skeletal tumour burden in bone scans, with calculations based on the percentage of total skeletal weight. The method has since become a “valuable metric and potentially helpful tool for estimating the total quantitative skeletal metastatic burden in patients with mCRPC,” Anand and colleagues wrote. A May 2018 study by Armstrong et al used the aBSI algorithm to predict which prostate cancer patients were more likely to survive after bone scintigraphy with Tc-99m.

“An advantage of aBSI is that it is based on bone scan examinations, which is still the most common method to detect metastatic spread to the bone in patients with prostate cancer,” the Swedish authors wrote.

In terms of treatment, radium-223 is designed to selectively target bone metastases and subsequently allow oncologists to plot the most appropriate therapy for these patients. Ra-223 has been shown to extend overall survival by approximately four months, with minor or no side effects.

Despite the clinical success of Ra-223 in targeting bone metastases, there has been “little work to evaluate the radiological response in patients being treated with Ra-223,” noted the authors, adding that the “lack of a response biomarker is a significant challenge in clinical routine management of these patients being treated with the Ra-223 treatment.”

Biomarker data

To evaluate the efficacy of aBSI with Ra-223, Anand and colleagues retrospectively gathered bone scan results of 267 patients treated with monthly injections of Ra-223 for metastatic castration-resistant prostate cancer at seven hospitals in Sweden. The collected data included baseline values of alkaline phosphatase and PSA for 156 patients (median age, 68 years; range, 62–77 years), and post-treatment values for 67 patients (median age, 70 years; range, 64–77 years), acquired within three months of their last Ra-223 injection (median, 5 injections; range, 1–6 injections).

The researchers used aBSI version 3.2 (Exini Diagnostics) to analyse the bone scans and generate each patient’s index at baseline and the end of treatment by segmenting various regions of the skeleton and detecting and classifying the abnormal “hot spots” as metastatic lesions.
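
Exini’s implementation is proprietary, but the index itself is conceptually simple: each hot spot classified as metastatic contributes the fraction of skeletal weight it represents, and the aBSI is the sum of those contributions expressed as a percentage. A hypothetical sketch (the region names and weight fractions are illustrative placeholders, not Exini’s actual values):

```python
# Hypothetical sketch of a Bone Scan Index calculation. The regional
# weight fractions below are illustrative placeholders, not the values
# used by the aBSI software.
REGION_WEIGHT = {"spine": 0.25, "pelvis": 0.20, "ribs": 0.12, "skull": 0.10}

def bone_scan_index(hot_spots):
    """hot_spots: list of (region, fraction_of_region_covered) tuples
    for lesions classified as metastatic."""
    return 100 * sum(REGION_WEIGHT[region] * coverage
                     for region, coverage in hot_spots)

# Example: two spine lesions and one pelvic lesion
absi = bone_scan_index([("spine", 0.04), ("spine", 0.02), ("pelvis", 0.03)])
print(f"aBSI = {absi:.1f}")   # 2.1, i.e. lesions totalling 2.1% of skeletal weight
```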

Metastatic hot spots

After the last round of Ra-223 treatment, 54 patients (80%) showed a decrease in alkaline phosphatase value, with 38 patients (57%) achieving a decline of 25% or more. By comparison, 24 patients (36%) had a lower aBSI, and nine patients (13%) exhibited a reduction in PSA value.

The number of administered Ra-223 doses correlated with a more advantageous response to treatment. Patients who received five or six doses of Ra-223 were most likely to experience declines in alkaline phosphatase (85%), aBSI (53%) and PSA (20%) values. The number of Ra-223 injections also significantly influenced aBSI results: of the patients showing a lower aBSI, 21 (87%) had received five or six Ra-223 doses and only three (13%) had received fewer than five injections.

Alkaline phosphatase decline

In the overall survival analysis, baseline aBSI (p = 0.01) and baseline alkaline phosphatase levels (p = 0.001) both significantly correlated with overall survival, while baseline PSA values did not achieve statistical significance (p = 0.60). When combined, the median overall survival of patients with declines in both aBSI and alkaline phosphatase values (median, 134 weeks) also was significantly longer than in patients with only an alkaline phosphatase decline (median, 77 weeks) (p = 0.023).
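
Comparisons of this kind are typically made with Kaplan–Meier survival estimates and a log-rank test. Here is a minimal sketch of how one might reproduce such an analysis in Python with the lifelines package (the data file and column names are hypothetical):

```python
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Hypothetical columns: follow-up in weeks, death indicator (1 = died),
# and a flag for whether both aBSI and alkaline phosphatase declined.
df = pd.read_csv("ra223_cohort.csv")             # hypothetical file
both = df[df["absi_and_alp_decline"] == 1]
alp_only = df[df["absi_and_alp_decline"] == 0]

km = KaplanMeierFitter()
km.fit(both["weeks"], both["death"])
print("Median OS, both markers declined:", km.median_survival_time_)
km.fit(alp_only["weeks"], alp_only["death"])
print("Median OS, ALP decline only:", km.median_survival_time_)

# Log-rank test for a difference between the two survival curves
result = logrank_test(both["weeks"], alp_only["weeks"],
                      both["death"], alp_only["death"])
print("Log-rank p-value:", result.p_value)
```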

“For an accurate and comprehensive understanding of disease status, a quantitative and reproducible radiographic assessment like the aBSI can potentially add clinical utility to existing CT evaluation of soft-tissue metastases and to the blood-based biomarkers,” the authors concluded. “In this study, we demonstrated that aBSI at baseline and its change after treatment were significantly associated with [overall survival] and additive to the current predictor of response in mCRPC patients treated with Ra-223. Prospective studies are warranted to validate aBSI as a quantitative imaging response biomarker to Ra-223.”

  • This article was originally published on AuntMinnie.com. ©2019 by AuntMinnie.com. Any copying, republication or redistribution of AuntMinnie.com content is expressly prohibited without the prior written consent of AuntMinnie.com.