
Triangular crystals improve PET performance

The spatial resolution of a PET image is mainly determined by the size of the scintillation crystals in the detector array. But as crystals get smaller, one-to-one coupling with photodetectors becomes tricky. Optical multiplexing can be employed instead, but this degrades crystal separation and necessitates the use of light guides. Such light guides, in turn, decrease light collection efficiency and degrade energy resolution.

A research team at the University of California, Davis, aims to address these shortfalls by using triangular-shaped crystals to improve the separation of edge crystals in PET detectors, without decreasing light collection efficiency. To test their proposed design, the researchers compared edge crystal identification in a triangular-shaped crystal array with the performance of a conventional square-shaped crystal array (Biomed. Phys. Eng. Express 4 025031).

“Compared with traditional crystals with square cross sections, the untraditional triangular-shaped crystals on the edges of the detector module can be better separated from their neighbours in the flood histogram,” explained first author Peng Peng.

Changing shape

The researchers examined two crystal arrays: a 4 × 4 matrix of square-shaped LYSO crystals (“square array”) and 16 triangular-shaped LYSO crystals (“triangular array”). They coupled the two crystal arrays to a silicon photomultiplier (SiPM) array, either directly using optical grease or via 1 and 2 mm thick acrylic light guides, and irradiated the detectors with a 22Na point source.

The crystal arrays on the photodetector array

After acquiring coincidence events, the team calculated three values – flood histogram quality, light collection efficiency and energy resolution – for each crystal in the two arrays. To compare the performance of the two detector configurations, they grouped the crystals into centre, edge or corner elements, and compared parameters within each group.

To quantify the quality of the flood histogram, the researchers determined the fraction of events positioned in the correct crystal, based on a 2D Gaussian fit of the segmented flood histograms. In the first scenario, with crystal arrays coupled directly to the SiPM array, the corner crystals in both configurations were not separable and showed the lowest flood histogram quality. Edge crystals were resolved when using the triangular array, but not with the square array.
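
As a rough illustration of this metric, the fraction of events positioned in the correct crystal can be estimated by clustering flood-histogram events around fitted spot centres. The sketch below is a simplification of the paper's approach: it uses a nearest-centroid assignment and an isotropic spot width in place of a full 2D Gaussian fit, and the function name, spot positions and widths are all hypothetical.

```python
import numpy as np

def flood_histogram_quality(events, centroids, n_sigma=2.0):
    """Estimate flood-histogram quality: the fraction of events that land
    within n_sigma of the centre of their nearest crystal spot.

    events    : (N, 2) array of flood-histogram positions
    centroids : (M, 2) array of fitted crystal-spot centres
    Each spot's width is estimated from the events assigned to it.
    """
    events = np.asarray(events, float)
    centroids = np.asarray(centroids, float)
    # assign each event to the nearest crystal centroid
    d = np.linalg.norm(events[:, None, :] - centroids[None, :, :], axis=2)
    nearest = d.argmin(axis=1)
    ok = 0
    for m in range(len(centroids)):
        cluster = events[nearest == m]
        if len(cluster) == 0:
            continue
        sigma = cluster.std(axis=0).mean() + 1e-12  # isotropic width estimate
        r = np.linalg.norm(cluster - centroids[m], axis=1)
        ok += int((r <= n_sigma * sigma).sum())
    return ok / len(events)

# toy flood histogram: two well-separated crystal spots
rng = np.random.default_rng(0)
spots = np.array([[0.0, 0.0], [5.0, 0.0]])
ev = np.concatenate([rng.normal(c, 0.3, (500, 2)) for c in spots])
q = flood_histogram_quality(ev, spots)
```

Poorly separated spots (as for the corner crystals described above) pull this fraction down, because events from one crystal spill into the fitted region of its neighbour.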

Flood histogram quality, photopeak and energy resolution

To evaluate light collection efficiency, the team examined the amplitude of the 511 keV photopeak. The outermost crystals generally showed the lowest light collection, as did crystals positioned over gaps between SiPM elements. The triangular array showed higher light collection efficiency than the square array, with 6.3%, 1.5% and 12.4% higher values for the centre, edge and corner crystals, respectively. The average light collection efficiency for the triangular array was 5.9% higher than for the square array.

The average energy resolution achieved by the triangular and square arrays was 11.6% and 13.2%, respectively. For the triangular array, there was no clear correlation between resolution and crystal position, while for square arrays, resolution was degraded towards the detector edge.
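
Energy resolution in this context is conventionally quoted as the photopeak's full width at half maximum divided by its position. Here is a minimal sketch, assuming a Gaussian 511 keV photopeak and a hypothetical selection window; it is not the team's actual analysis.

```python
import numpy as np

def energy_resolution(energies, window=(400, 620)):
    """Estimate energy resolution (% FWHM) at the 511 keV photopeak.

    A Gaussian photopeak is assumed, so FWHM = 2.355 * sigma, and the
    resolution is the FWHM divided by the peak position. `window`
    selects events near the photopeak (hypothetical keV bounds).
    """
    e = np.asarray(energies, float)
    peak = e[(e > window[0]) & (e < window[1])]
    mu, sigma = peak.mean(), peak.std()
    fwhm = 2.355 * sigma
    return 100.0 * fwhm / mu

rng = np.random.default_rng(1)
# toy spectrum: 511 keV photopeak with sigma of about 25 keV
sim = rng.normal(511.0, 25.0, 20000)
res = energy_resolution(sim)  # roughly 11.5%
```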

Adding the light guides

Next, Peng and colleagues examined the impact of 1 and 2 mm acrylic light guides on these performance measures. For the triangular array, all three parameters worsened as the light guide became thicker and introduced light loss. For the square array, however, the 2 mm light guide improved flood histogram quality and energy resolution, relative to the 0 and 1 mm cases. This finding confirms previous studies showing the benefit of light guides for square-shaped crystal arrays.

Performance of arrays with optimal light guides

Overall, the triangular array exhibited the best performance with no light guide; the square array only achieved comparable performance with a 2 mm light guide. Examining the arrays with their optimal light guides (0 mm for the triangular, 2 mm for the square array) revealed that the triangular array outperformed the square array across the majority of parameters, particularly in terms of light collection efficiency.

The researchers note that the triangular array with no light guide may also offer improved timing resolution – due to both higher light collection efficiency and the elimination of additional path length dispersion induced by the light guide. “The next step is to verify this hypothesis of improving timing resolution without jeopardizing the spatial resolution,” explained Peng. “If this goal is achieved, a full-size detector module will be built based on this design.”

Homogeneous 2D Fermi gas is created in the lab

Homogeneous 2D Fermi gases have long fascinated theoretical physicists, but creating a real system to study in the laboratory had eluded researchers. Now, Klaus Hueck, Henning Moritz and colleagues at the University of Hamburg have created such a gas and also confirmed several theoretical predictions about its properties. Their work could lead to a wide range of new experiments that investigate the perplexing properties of quantum many-body systems including superconductors and superfluids.

Theoretical physicists have known for decades that creating and studying a 2D, ultracold gas of fermionic (half-integer spin) atoms would provide a wealth of information about quantum matter. In the past, experimentalists had to be content with trapping fermions inside a harmonic potential well, which allowed them to study its thermodynamic properties. However, the curved shape of a harmonic potential (which resembles a sloped valley) means that the gas inside is non-uniform — and this has important effects on the quantum properties of the gas.

Fermion silo

In their study, Hueck and colleagues created a potential well that looks less like a valley and more like a grain silo. The physicists used optical dipole potentials to create the circular well, which repels quantum particles only at its edges to keep them inside the trap. Crucially, the well has little influence on the density of the gas inside. This potential had already been used to contain gases of integer-spin particles (bosons) as well as 3D Fermi gases.

Once they created their 2D gas, the team tested several theoretical predictions about its quantum properties. They measured its thermodynamic properties, which agreed with the theoretical calculations. The physicists also created matter waves in the gas and measured the momentum distribution of the fermions. For the first time, they directly observed how certain momentum states were blocked due to the Pauli exclusion principle – just as predicted by theory.
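
The Pauli blocking observed here follows from the Fermi-Dirac distribution: at low temperature every momentum state below the Fermi energy holds exactly one fermion per spin state, so no further atom can scatter into it. A toy numerical sketch, with units and values chosen purely for illustration rather than taken from the experiment:

```python
import math

def fermi_dirac(eps, mu, kT):
    """Mean occupation of a fermionic mode at energy eps, for chemical
    potential mu and temperature kT; the Pauli principle caps it at 1."""
    return 1.0 / (math.exp((eps - mu) / kT) + 1.0)

mu = 1.0            # Fermi energy (arbitrary units, hypothetical)
cold, hot = 0.01, 0.5

# at low temperature the distribution is nearly a step function:
# states below the Fermi energy are essentially full ("blocked"),
# those above are essentially empty
low = fermi_dirac(0.5, mu, cold)    # deep below the Fermi energy -> ~1
high = fermi_dirac(1.5, mu, cold)   # well above the Fermi energy -> ~0
smeared = fermi_dirac(1.5, mu, hot) # warmer gas: the step smears out
```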

The trapping technique is described in Physical Review Letters and could prove invaluable for physicists studying the dynamics of uniform systems of fermionic particles, a task that has previously been notoriously difficult. Researchers can look forward to making further experimental observations of exotic theoretical ideas, and even to making new discoveries of their own.

Building a firm foundation in vacuum technology

AVT was founded in 1989 by Jerry Bohn, previously director of Perkin Elmer’s Vacuum Products Division, who was keen to start his own business and saw the opportunity as Perkin Elmer entered a restructuring phase. Despite leaving the corporation, Bohn remained a key element of its operation, with AVT supplying vacuum parts and other hardware to Physical Electronics (PHI) – a leading maker of surface analysis equipment that was owned by Perkin Elmer at the time.

Under Bohn’s leadership, AVT’s operation grew from supplying flanges to building chambers and premium products for users of ultrahigh vacuum (UHV). The company also supplied several small ion pumps for government projects, recalls Dan Korolchuk, who was a manager at PHI during that period, and had known Bohn since the Perkin Elmer days.

The next step for AVT was to tackle bigger projects, which would mean expanding its relatively small team. Unfortunately, 10 years on from starting the firm, Bohn faced health issues that led him to sell the business, and he convinced Korolchuk to buy AVT to safeguard its operation. Thankfully, Bohn recovered after his surgery.

Background in physics

Keen to show that AVT was in safe hands and excited by the opportunities ahead, Korolchuk moved fast to develop the business. “Within eight years we had a new building and more than 50 employees. I took pains to differentiate AVT from other vacuum suppliers because of my background in physics,” he says. “Any supplier can read and accurately interpret a customer’s engineering drawings, whereas I sought to understand the customer’s technology and carry that information back to my production staff.”

I sought to understand the customer’s technology and carry that information back to my production staff

His strategy attracted manufacturers of molecular beam epitaxy (MBE) equipment as well as other new customers requiring vacuum knowledge. By understanding more about a client’s technology, AVT could advise on how various processing steps could enhance or hinder a customer’s efforts.

Working with different sized companies – from start-ups with minimal engineering staff to vertically integrated operations with large technical teams – brought a wide range of requests. This included everything from full design solutions to help with vacuum-specific issues.

AVT’s strong reputation was a major asset across the whole spectrum of services that it offered, helping to ease customer concerns ahead of a project. “Developers want to be sure you understand what they need without divulging any secrets or too much detail,” Korolchuk explains. “They want some assurances that you won’t inject variables into their efforts.”

Understanding client needs

For business relationships to flourish, clients need to feel that their projects are in safe hands. Knowing how to construct and clean chambers to avoid real or virtual leaks may seem like basic vacuum-fabrication skills, but they are extremely important parts of the service. Focusing on reliability upfront ensures that equipment builders and research organizations don’t waste effort chasing down problems further down the line.

There are other aspects too that are important for clients, which is why Korolchuk and his team were proactive in making use of the latest information technology. “Thanks to a state-of-the-art enterprise resource planning system, we were able to give customers real-time status on their orders,” he says. “We knew where everything was at any time and the software could project completion times based on the activities competing for resources on the shop floor.”

The set-up’s cost-estimating feature, which was customized for AVT’s products and shop configuration, made sure that pricing was fast, easy and accurate. Advances in computing also pushed AVT’s technical capabilities to new levels as the team integrated the latest modelling and simulation packages into its workflow.

At the same time, the firm purchased new hardware such as a high-performance co-ordinate-measuring machine. Capable of measuring the machined surfaces of a product within an eight-foot diameter sphere, the tool automatically checked that all critical dimensions were within specified tolerances – a common customer request.

Finding the right solution

These developments helped to open the door to a range of exciting projects. Korolchuk recalls developing hardware to suit a miniature medical device for treating tumours, as well as building massive chambers for use at Sandia National Laboratories.

Vacuum parts from AVT

Customer orders presented AVT with a variety of challenges to overcome. Makers of surface analytical instruments are concerned with precision alignment, which must be maintained even for complex welded parts. Providers of MBE systems and other deposition equipment must preserve clean surfaces and support high-temperature operation. There were other considerations too.

“We built several generations of a product that used a vacuum-generated plasma to sterilize surgical instruments,” Korolchuk remembers. “The device needed to be portable and the customer used AVT expertise for exploring ways to reduce weight while keeping a UHV environment.”

Useful skills for the job

Reflecting on his tenure at the firm, Korolchuk has some sound advice for physicists who aspire to build their own companies. “It’s very easy to become absorbed in the technology and forget there is a business to run – you need resources to support that technology,” he points out.

One of the most important skills, according to Korolchuk, is listening. “Listen to your customers, employees and suppliers,” he advises. Indeed, one of AVT’s key strengths is its ability to partner with clients to deliver the most appropriate solution for each application.

Enhanced operation

In 2013 AVT was acquired by Anderson Dahlen, one of the premier stainless steel and speciality alloy fabricators in the US, and Korolchuk was retained as a consultant during the transition. David Knoll became president of the new company and assumed management of all operations at AVT’s facility in Waconia, Minnesota. Perry Henderson joined AVT to direct the sales and marketing efforts, and a series of investments in personnel, equipment and the facility layout helped to streamline operations and add capacity for growth.

Over the past several years AVT has expanded and enhanced its operation in numerous areas, and now boasts a wider range of customers and applications, additional engineers and project managers, upgraded equipment, and improved capabilities in machining, welding and assembly. The company also worked to develop a vacuum group at Anderson Dahlen’s primary manufacturing facility to take on large-scale projects and chambers that require pressure as well as vacuum. These investments have resulted in nearly 300% growth in vacuum products for the company since the acquisition.

Today, development of the Applied Vacuum Division continues as the company builds on its strong links with physics and materials science research. In 2018 Anderson Dahlen announced further expansions in its vacuum capabilities to take on XHV applications, clean-room assembly and testing, as well as custom-designed vacuum-research equipment.

Portable optical lattice clock measures elevation

A portable optical lattice clock on a trailer has been used to measure elevation deep inside a mountain near the France-Italy border. Although this elevation measurement is not as precise as conventional geodetic methods, the experiment marks an important advance in optical clock technology.

Optical lattice clocks are the most precise timekeepers ever developed, losing about 1 s every 10 billion years. The clocks are extremely sensitive to changes in their local environment, and until now have only been operated in laboratories under highly-controlled conditions. However, a portable optical lattice clock would allow researchers to exploit the device’s remarkable precision to make a wide range of measurements from geodesy (measuring the shape of the Earth) to putting physical theories to the test.

“It’s the first time anybody has operated one of these lattice clocks outside of the laboratory,” explains Andrew Ludlow of the National Institute of Standards and Technology in the US, who was not involved in the research. “It’s kind of a big deal because these are generally pretty complicated laboratory experiments.”

Running behind

The elevation measurement was made by a Europe-based team led by Christian Lisdat of PTB (Germany’s national metrology laboratory) and relies on principles of Einstein’s general theory of relativity. The idea is that a clock deeper in a gravitational potential – in this case closer to sea level – will tick more slowly than a clock at a higher elevation. By comparing the time kept by clocks at separate locations, the difference in elevation of the two places can be deduced.
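
For small height differences the effect is simple to sketch: the fractional frequency shift between two clocks is g·Δh/c², so a measured rate difference can be inverted to give a height difference. A hedged back-of-envelope example follows; the 1000 m figure is hypothetical and chosen only for illustration, not a reproduction of the team's analysis.

```python
# Gravitational time dilation as used in chronometric levelling:
# a clock raised by dh in Earth's gravity runs fast by the
# fraction g * dh / c**2.
g = 9.81            # gravitational acceleration, m/s^2
c = 299_792_458.0   # speed of light, m/s
day = 86_400.0      # seconds per day

def nanoseconds_gained_per_day(dh_metres):
    """Time gained per day by the higher of two clocks separated
    in elevation by dh_metres."""
    return g * dh_metres / c**2 * day * 1e9

# a hypothetical 1000 m height difference gives roughly 9.4 ns/day
gain = nanoseconds_gained_per_day(1000.0)
```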

The team towed their portable clock to the Modane Underground Laboratory, which is in the French Alps and accessed via the Fréjus road tunnel between France and Italy. They then compared the time it kept in Modane to another clock at a laboratory down the road in Turin, Italy. They found that the Modane-based clock ran about 0.2 ns faster per day.

Working with geodesy experts, they could determine Modane’s elevation with an uncertainty of about 20 m. This is by no means a state-of-the-art measurement of elevation – but rather a proof-of-principle exercise – and the team hopes to reduce the uncertainty of future measurements to several centimetres.

Localized measurements

Optical-clock geodesy could offer several technical advantages over conventional methods such as satellite imaging. Clocks have much better spatial resolution, for example. “The atoms in the clock are tiny and very well-localized, so we measure exactly at that spot,” says Lisdat. Clock data can also be used in conjunction with other methods to confirm measurements. Measurements using a portable clock could be useful, for example, for monitoring sea level rise due to climate change.

Lisdat’s team used a clock that employs strontium atoms that are trapped by laser beams. When excited with light at the correct frequency, the atoms emit electromagnetic waves that cycle more than a hundred trillion times a second, and the clock counts these cycles to keep time. The cycles remain constant as long as the atoms are trapped in a vacuum and held at a constant, ultracold temperature.

“We have about six different laser systems at different wavelengths that have to be guided to the right spot in the vacuum chamber to make the experiment work,” explains Lisdat. These lasers need to emit light consistently at strontium’s precise transition frequencies and the optical system that delivers the light must remain aligned to an exquisite degree of precision.

“One and a half PhD theses”

These requirements are difficult to meet in a laboratory, and to create their portable clock, the team also had to miniaturize its components and make them robust enough to work in different environments. Lisdat says this took “one and a half PhD theses, plus a lot of knowledge of building clocks from more PhD theses.” The team also relied on several decades of institutional experience building other atomic clocks, such as the caesium microwave clock that currently defines the second.

For its first run at Modane, the clock worked inconsistently and the team often had to tinker with and re-align its components. As a result, the physicists are now working on making their clock more reliable and more accurate.

Beyond geodesy, portable optical lattice clocks could have a variety of other applications, says Ludlow. As well as providing a precise definition of time, physicists could use portable clocks to test general relativity and look for possible variations in fundamental constants that could point to the nature of dark matter or extensions to the Standard Model of particle physics.

The research is described in Nature Physics.

Photocatalytic heterostructures unleash the potential of doped silver halides

Nanoparticles are ideal for use within solar energy devices because they have large surface areas and small volumes. Silver-doped silver halide particles also display sensitized optical properties, making them ideal for photocatalysis, but they are prone to aggregating in light. Now Yugang Sun and co-workers at Temple University in the United States have discovered that adding silica helper nanoparticles to nanoparticle catalysts dramatically improves both catalyst activity and particle stability.

Despite recent technological breakthroughs that continue to unveil new and interesting nanomaterials, an ideal photocatalyst for renewable energy efforts has not been developed. An optimal catalyst should: absorb an appropriate range of light via sensitized bandgaps, exhibit solution stability, and be both robust and active. Silver-doped silver halides (AgX(Ag)) show impressive bandgap tunability. However, few research groups have considered these particles for photocatalysis due to their colloidal instability.

Sun et al. synthesized the helper silica nanoparticle spheres using a Stöber sol-gel method. By electrostatically affixing AgX(Ag) nanocubes to these significantly larger nanoparticles, the colloidal stability of the modified AgX(Ag) improved dramatically: the particles no longer aggregated immediately. Additionally, the silica nanoparticles support resonant light scattering, which improved nanocube photo-absorption within the hybrid structure. Impressively, Sun et al. demonstrate that the heterostructure lost only 9% of its catalytic activity after 10 repeated operation cycles with the model dye methylene blue.

A fine-tuned bandgap and an effective helper

These nano-heterostructures owe their catalytic effectiveness to the optimized bandgaps of AgX(Ag) and to the large surface area to particle volume ratio of the hybrid particle. The addition of doped Ag0 to AgX introduces multiple energy levels between the conduction and valence bands of AgX, thus promoting visible light absorption, an important criterion when considering solar energy developments.

In addition to preventing aggregation in solution, the silica helper particles also increase the photocatalytic activity of the AgX(Ag) nanocubes. The silica spheres produce resonant light scattering, which increases the amount of light absorbed by the doped nanocubes thus promoting electron excitation and catalytic activity. Additionally, silica’s inertness prevents charge transfer at the silica-AgX(Ag) junction.

Sun et al. note that these results clearly demonstrate how critical the helper silica particles are to the photocatalysis. Such a result paves the way for effective solar energy developments.

Full details are reported in Nano Futures.

Silicon nanowires modulate neuronal activity

Light can be used to modulate the behaviour of cells like neurons for applications in clinical therapeutics and for fundamental single-cell studies. However, most of the methods available today to do this are invasive, require cells to be genetically modified and are not precise enough. A new technique that makes use of free-standing photoactivated coaxial silicon nanowires containing atomic gold on their surfaces could help overcome these problems.

Photobiology is the study of the effects of light on living organisms, which have evolved specialized machinery over millions of years to enhance the amount of light they absorb. Vision in animals and energy harvesting for photosynthesis in plants are two well-known examples.

Researchers have recently extended this concept to artificially make cells and living organisms more sensitive to light. New work by a team of researchers at the University of Chicago in the US makes use of silicon nanowires measuring just a few hundred nanometres across and consisting of a p-type (boron-doped) core and an n-type (phosphorus-doped) shell lined with surface atomic gold.

When illuminated with light, photoexcited charge carriers (electrons and holes) are produced in the wires. These carriers then separate at the core-shell junction with the photogenerated holes migrating to the core and the electrons being trapped at the surface of a nanowire by the surface gold.

Cathode current depolarizes neuronal membrane

“These electrons then participate in cathodic electrochemical reactions in a surrounding electrolyte solution, generating a cathodic current,” explains team leader Bozhi Tian. “When we then interface the coaxial nanowires with a target neuronal membrane, this current depolarizes the membrane, mimicking the effect of a nerve impulse and causing the neuron to fire an action potential.”

Surprisingly, a single nanowire can prompt this neuronal firing, he adds.

The researchers say they successfully tested out their technique on primary rat dorsal root ganglion neurons.

Towards clinical therapeutics

“This tool could be used for both fundamental single-cell bioelectric studies and clinical therapeutics,” Tian tells nanotechweb.org. “Silicon strongly absorbs light in the near-infrared, a wavelength of light that deeply penetrates biological tissue, which means that the nanowires could be used to stimulate peripheral nerves (lying as far as 1 cm below the skin) if injected into tissue. This could ultimately allow for non-invasive treatment of diseases characterized by severe neuropathic pain, such as diabetic peripheral neuropathy, for example.”

The researchers, reporting their work in Nature Nanotechnology doi:10.1038/s41565-017-0041-7, say they will now try to use these nanowires to modulate voltage in both photo-excitable and non-excitable cellular systems for fundamental bioelectric studies. “We also plan to target specific cell types after modifying their surfaces and develop non-invasive treatments in mouse models of neuropathic diseases,” says Ramya Parameswaran, lead author of this study.

Swarm science

Large groups of swarming midges are composed entirely of males, whose combined motion generates high-pitched sounds that attract females. Physicists are among the scientists trying to understand the underlying principles of “collective behaviour”, which is observed in a range of biological systems. Insights from their work could lead to real-world applications such as crowd-management strategies and smart energy networks that are more resistant to catastrophic failure. Find out more about the research in this feature by Jennifer Ouellette originally published in the February 2018 issue of Physics World.

US citizens less well informed on fracking than those in UK

To frack, or not to frack: that is the question. And what a heated question it is. In both the UK and the US fracking regularly hits the headlines, but people’s concerns about this unconventional gas extraction method are not necessarily the same on either side of the Atlantic. Now a study has compared public perception and beliefs surrounding fracking in both the US and the UK.

People in the UK are generally better informed on the facts surrounding fracking, the results show, but less willing to take a positive or negative stance than people in the US. Energy security is a massive incentive for supporters of fracking in the UK, but less so in the US. The results also showed that concerns around fracking are highly nuanced in both countries and that there’s a need for better communication of the issues.

Fracking – injecting a high-pressure mix of water, sand and chemicals into oil and gas wells to force more of the fuel to flow out – has been around since 1949. But it is only since the late 1990s, with the advent of horizontal drilling, that fracking has really taken off, enabling the exploitation of low-porosity sandstones and shales that were previously uneconomical to extract from.

Today 95% of the new wells drilled in the US are hydraulically fractured, and the output makes up 43% of the oil and 67% of the natural gas production in the US. But this fracking boom hasn’t been welcomed by all, and environmental and health concerns have made the technique contentious. Back in 2014 New York state banned fracking, and in March 2017 Maryland followed suit.

Meanwhile, in the UK low volume fracking has been used in around 200 onshore oil and gas wells since the 1980s, but it is only since 2008, when licences were awarded for onshore shale gas exploration, that fracking really started to attract public attention. To date only one well in the UK – the Cuadrilla Resources shale gas exploration well near Blackpool in Lancashire – has used high-volume fracking, but this was halted after a few months due to seismic activity concerns.

So how do people in the US and the UK feel about fracking? Darrick Evensen from Cardiff University, UK, and his colleagues carried out an online survey in September 2014, asking 3823 UK nationals and 1625 people from the US a similar set of questions about their understanding, feelings and beliefs surrounding fracking.

In the US, 60% of respondents were in favour of fracking, 25% were against it, and 16% said they didn’t know. By contrast, only 44% of UK respondents were in favour, 27% were against and 29% said they didn’t know. One factor that may contribute to the differing opinions between the two countries is the source of people’s information.

“Research in the UK suggests that national newspapers are more important as an information source, whereas in the US local newspapers tend to be a more predominant information source, particularly in areas where shale gas development is occurring,” said Evensen.

The surveys also revealed that people in the UK tend to be more accurately informed – 72% of the UK sample could answer factual questions about fracking accurately whereas only 36% of the US sample answered the same questions correctly – but less willing to take a position and express an opinion.

“One conjecture is that political leaning, both on a conservative-liberal scale and a Republican/Democrat or Conservative/Labour split, might more strongly relate to support or opposition to fracking in the US than the UK,” said Evensen, whose findings are published in Environmental Research Letters (ERL). “Therefore, amongst people who do not know much about fracking, this political leaning is coming out more strongly in the US as causing people to take sides.”

Politics aside, the survey results clearly showed that energy security was a predominant concern in the UK, with those who thought that fracking would increase energy security being three times more likely to support fracking than people who held the same belief in the US.

“There is not a natural gas supply issue in the US, whereas in the UK a large proportion of natural gas is imported and people are presumably aware that supply problems from Russia would be felt throughout Europe,” said Evensen.

Looking forward, Evensen and his colleagues hope that a greater understanding of which fracking issues really matter to people will help government, industry, environmental organisations and activists to share the appropriate information and start a more open two-way dialogue, enabling more informed public decision-making.

Algorithm enhances Cherenkov-based dose verification

Verification of radiation dose at the time of radiotherapy is a necessity to ensure accurate dose delivery to a malignant tumour. With many radiation treatments, daily verification is performed using a series of technologies that give partial information, such as optical surface positioning, electronic portal dosimetry or cone-beam CT position evaluation. But the process requires a series of tools and involves substantial data analysis. Cherenkov imaging offers the potential for automated, non-contact verification of the beam position on the patient’s body, imaged in real-time as the patient is receiving treatment.

Researchers at the Thayer School of Engineering at Dartmouth College have been investigating Cherenkov imaging for use in external-beam radiotherapy for three years. This novel optical imaging technique enables visualization of the radiation dose delivery into tissue. In 2014, the team reported its use in monitoring real-time changes in the delivery of fractionated radiation beams to the patient’s tissue, during a study of 12 cancer patients (see Cerenkoscopy monitors breast treatments).

The research team has now developed a Cherenkov image processing algorithm that maximizes accurate recovery of beam edges. The algorithm enables quantification of daily variations of intrafraction treatment. This capability is a step closer to automating analysis of this imaging approach, and facilitating its development as a commercial radiotherapy verification tool (J. Med. Imaging 5 015001).

Because Cherenkov images are inherently noisy, with signal quality affected by factors such as the lighting conditions in the radiotherapy treatment suite and the camera acquisition parameters, image processing is needed to maximize accurate recovery of beam edges. Principal investigator Brian Pogue and colleagues examined methods to quantify the beam edges and tested the quantification approaches with both phantom and patient data.

The team acquired images from seven breast cancer patients undergoing radiotherapy in up to nine treatment sessions. They then imaged five treatment plans with varying levels of beam modulation delivered to a breast phantom. These images were used to create a calibration set of quantitatively known values for comparison with day-to-day variations of patient images. For each plan, the phantom was imaged after being moved by 1.0, 3.0 and 5.0 mm from its original position in the anteroposterior (AP) direction.

The researchers analysed treatment dose at three control points for each patient. Video data were temporally integrated into a single, composite summary image at each control point. They used three different edge-preserving filtering methods to reduce noise, and bundled a series of image processing functions together to enforce edge preservation while segmenting the images, outlining the Cherenkov field where the beam was incident.
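The paper’s exact processing chain isn’t reproduced here, but the basic idea – temporally integrating video frames into a composite image, denoising with an edge-preserving filter, then thresholding to outline the incident field – can be sketched as follows. This is a minimal illustration only: the median filter stands in for the team’s (unspecified here) edge-preserving filters, and the function name, threshold and synthetic data are all assumptions.

```python
import numpy as np
from scipy.ndimage import median_filter

def segment_beam_field(frames, threshold_frac=0.5, filter_size=5):
    """Integrate a stack of Cherenkov video frames into one composite
    image, denoise with an edge-preserving median filter, and threshold
    to outline the incident beam field as a binary mask (hypothetical
    parameter choices)."""
    composite = frames.sum(axis=0)                         # temporal integration
    smoothed = median_filter(composite, size=filter_size)  # edge-preserving denoise
    mask = smoothed >= threshold_frac * smoothed.max()     # fractional threshold
    return mask

# Synthetic example: 30 noisy frames with a bright square "beam" region
rng = np.random.default_rng(0)
frames = rng.normal(0.0, 0.2, size=(30, 64, 64))
frames[:, 20:44, 20:44] += 1.0
mask = segment_beam_field(frames)
```

In this toy case the thresholded mask recovers the bright square while rejecting the noisy background; a real pipeline would also have to handle ambient-light correction and camera gain, as the article notes.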

The authors used Dice coefficients and mean distance to conformity values for analysis and comparison of day-to-day variation. They determined that the patient data could be compared with the calibrated breast phantom data, such that the patient variations could be estimated by known systematic shifts of the phantom image data.
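As a rough illustration of the overlap metric, the Dice coefficient between two binary field outlines is 2|A∩B| / (|A| + |B|): 1.0 for identical fields, 0.0 for no overlap. The grid, pixel scale and 5 mm shift below are invented purely for demonstration and are not taken from the study.

```python
import numpy as np

def dice_coefficient(mask_a, mask_b):
    """Dice similarity coefficient between two binary masks:
    2|A∩B| / (|A| + |B|)."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # two empty fields are trivially identical
    return 2.0 * np.logical_and(a, b).sum() / denom

# Two square fields on a 100x100 grid (1 px = 1 mm, hypothetical scale),
# the second shifted 5 mm to mimic a day-to-day positioning change
ref = np.zeros((100, 100), dtype=bool); ref[20:60, 20:60] = True
day = np.zeros((100, 100), dtype=bool); day[25:65, 20:60] = True
print(round(dice_coefficient(ref, day), 3))  # prints 0.875
```

The mean-distance-to-conformity metric mentioned in the paper instead measures how far non-overlapping boundary pixels sit from the reference outline, so it reports the disagreement in millimetres rather than as an area fraction.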

They also found that the patient data had slightly more variation than the absolute 3 mm phantom position shift, indicating that daily variation was greater than expected, based upon positioning goals. The authors suggested that this occurred because of the complexity of day-to-day positioning of the patients.

The Dartmouth team will conduct further tests using their processing algorithm to analyse the variations seen in patients receiving whole-breast radiotherapy, to assess day-to-day variations in beam delivery as a quality audit system for patient position and beam variation.

Solar heating shines again

The UK may be doing reasonably well on green power, but it is not making much progress on green heat. What about solar heating? The familiar roof-top solar heat collector, with a flat glass plate on top of a box with pipework carrying water to be heated, has been upgraded over the years to more efficient evacuated-tube, focused-solar and hybrid PV/thermal (PVT) variants. These systems can offer cost-effective no/low-carbon heating in some locations and are mostly scalable, suitable for a range of small or large-scale applications. You can also store solar heat for use later and use it to drive cooling systems. All in all, it’s an energy source with multiple attractions.

The UK Solar Trade Association (STA) says: “It’s time to look again at solar thermal. The strategic importance of this mature, proven technology is growing as our homes become more thermally efficient and require less space heating – we will continue to need hot water. The UK also needs to do much more to decarbonise heat, where we lag badly behind in Europe.”

STA’s Leonie Greene says: “One of the largest solar thermal schemes in the world, in Silkeborg, Denmark, provides a fifth of the heat for the district heating for a town of 43,000.” She adds, “Analysis by IRENA shows solar thermal could technically meet half of heat demand in the industrial sector. Indeed, half of industrial heat requirements are for medium to low temperatures, such as washing, drying, sterilizing and pasteurising. This makes solar thermal, which can do very hot process steam, well suited to sectors where heating needs are below 250 °C, such as paper, chemicals, tourism, pharmaceuticals and textiles. The farming industry also offers great opportunities.”

The UK’s Renewable Heat Incentive (RHI) provides support for solar thermal projects up to 200 kW, but more support is needed for a wider range and scale of projects if the full potential of solar heat is to be reached. The Energy Research Partnership’s report on decarbonizing heat, Transition to Low-carbon Heat, which I mentioned in an earlier post, says nearly half of the UK’s existing heat demand could be economically connected to green heat supply networks, up from 2% now, saving £30bn.

Solar heat, along with green gas, ought to play a part in that, with big solar arrays feeding heat stores and district heating networks. There are also many other options, including using solar heat for air-con absorption chillers and alongside heat inputs from heat pumps. High-temperature focused-solar, or concentrated solar power (CSP), sun-tracking systems can also be used for generating electricity on a large scale, but they are best suited to sunny climates with direct (not diffuse) sunlight, and to desert regions, given the large areas needed.

More relevant to the rest of us, there is also so-called PVT – hybrid solar thermal and PV systems that combine PV electricity production with heat absorption. That can improve the efficiency of PV cells, which falls off with temperature as they heat up in the sun – PV works on light photons, not infrared heat, so PV cells can get very hot.
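The fall-off of PV output with temperature is usually modelled as a linear derating from the 25 °C rated value. The sketch below assumes a typical crystalline-silicon power coefficient of about −0.4%/°C; the coefficient, panel rating and operating temperature are illustrative values, not figures from the article.

```python
def pv_power(p_stc_w, cell_temp_c, temp_coeff=-0.004, t_ref_c=25.0):
    """Linear temperature derating of PV output:
    P = P_STC * (1 + gamma * (T - 25 degC)),
    with gamma ~ -0.4%/degC as a typical (assumed) crystalline-silicon value."""
    return p_stc_w * (1.0 + temp_coeff * (cell_temp_c - t_ref_c))

# A 300 W-rated panel running at 65 degC on a hot roof loses about 16%
# of its rated output: 300 * (1 - 0.004 * 40) = 252 W
hot_output = pv_power(300.0, 65.0)
```

This is why actively cooling the cells – and putting the extracted heat to use, as PVT does – can raise both the electrical yield and the overall energy captured per unit area.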

In one approach to dealing with this problem, forced (powered) or passive (convection) air-cooling flows may be used, with the warmed air then used for building heating. Alternatively, and more efficiently, water cooling can be arranged in a heat-collector substrate on the underside of a PV array, so that the pipework does not interfere with the incoming light. Some of the incident light will pass through to this heat collector, but the PV cell material, as well as converting some of it to electricity, will block some of it, so direct heat collection is reduced compared with a simple solar heat collector. Newer, more translucent dye-based cell materials will reduce this effect but, in any case, the unconverted energy in the direct light blocked by the cell materials is in effect stored as heat, so it is still available for collection. As a result, in theory, with PV efficiency enhanced and overall heat-collection efficiency not much reduced, a liquid-cooled hybrid PVT collector can generate up to 40% more energy than PV and solar-thermal panels installed side by side over the same area. Some clever new designs have emerged, some with evacuated-tube systems to reduce heat loss.

Solar heating has obviously been somewhat eclipsed by solar PV in the UK and elsewhere, but these two solar options can complement each other via PVT, especially in hot climates for utility scale systems, including some with lens focusing (then competing with CSP), but also for domestic scale and building integrated units. For example, one recent review noted that “Building-integrated air PVT systems have the potential to reduce the primary energy required for heating and cooling from conventional systems by 30%, and in pre-heating applications where the fluid temperature is kept low, such as in combined PVT/heat-pump systems, the electrical output can be enhanced by 4–10% compared to an equivalent non-cooled PV system.”

Solar heating has been around for some time and continues to expand at around 12% p.a. PV is the relatively new kid on the block and is expanding faster – at around 45% p.a. But it is worth noting that, globally, solar heating is still more widespread than solar PV, with over 465 GW(th) of rooftop solar thermal capacity in place (much of it in China), compared with only around 300 GW of PV so far. It is hard to predict exactly what will happen next but, although PV seems set to continue to grow rapidly, one big advantage of solar heat units is that heat can be stored much more easily than electricity. PV is getting cheaper, but we still need heat, and although some PV power might be used for heating (e.g. with excess daytime power run into an immersion heater), direct solar heating will continue to have a role, and PVT may become an important new option – in effect, rooftop solar CHP. Pity it’s not supported in the RHI.

It’s perhaps worth noting that, whereas nearly all solar heat collectors in the UK are on rooftops, apart from some swimming pool schemes and the like, and so take up no extra land, PV solar is increasingly being put on the ground in large solar farms. About two-thirds of the UK’s PV projects are ground mounted.

With the cutbacks in FiTs for smaller domestic projects, and the block on CfD access, the trend to larger ground-mounted PV projects looks likely to increase. The STA has commented that “Government policy of excluding solar from clean power auctions is driving larger projects in a bid to get the economics to work.” Certainly some very large solar farm PV projects are planned, such as the ~350 MW array in Kent, covering 850 acres.

Is it a good idea to cover so much farm land with PV? In theory sheep can still graze on the site, but that seems rare in practice – most projects have security fences around them to keep people and animals out. Solar farms may be popular with farmers since some can no longer make enough money from food farming, but some prime farm land is being lost, with the only upside, apart from the power output, being that they may become a haven for wildlife and wildflowers, assuming the area isn’t sprayed to stop weeds and grasses from growing too high and blocking the sun. Well-sited arrays may not be a problem in visual terms, but the sheer scale of some PV projects can lead to opposition.

Obviously rooftop PV deployment is best (it’s then handily angled and also near to the energy user), and the same goes for factory/warehouse roofs. However, there may then be contests for roof space with solar heat collectors! If we must put PV on land, then brownfield/marginal land is surely preferable to prime farm land. That has been the government’s view, and for once they may be right, though small community-owned projects may be an exception. But who knows what’s next for farming policy, post-Brexit and post-CAP. Meanwhile, rooftop solar heat collectors will hopefully continue to find roles in all sectors, reducing the need for fossil fuels for heating and, in some locations with large heat stores, offering summer heat for winter use.

Copyright © 2026 by IOP Publishing Ltd and individual contributors