
Polarization-sensitive photoacoustic microscopy reveals heart tissue health

MIR-DS-PAM images of fibrotic and normal cardiac tissue

Many of the tissues in the human body rely upon highly organized microstructures to function effectively. If the collagen fibres in heart muscle become disordered, for instance, this can lead to or reflect disorders such as fibrosis and cancer. To image and analyse such structural changes, researchers at Pohang University of Science and Technology (POSTECH) in Korea have developed a new label-free microscopy technique and demonstrated its use in engineered heart tissue.

The ability to assess the alignment of microstructures such as protein fibres within tissue’s extracellular matrix provides a valuable tool for diagnosing disease, monitoring therapy response and evaluating tissue engineering models. Currently, however, this is achieved using histological imaging methods based on immunofluorescent staining, which can be labour-intensive and sensitive to the imaging conditions and antibodies used.

Instead, a team headed up by Chulhong Kim and Jinah Jang is investigating photoacoustic microscopy (PAM), a label-free imaging modality that relies on light absorption by endogenous tissue chromophores to reveal structural and functional information. In particular, PAM with mid-infrared (MIR) incident light provides bond-selective, high-contrast imaging of proteins, lipids and carbohydrates. The researchers also incorporated dichroism-sensitive (DS) functionality, resulting in a technique referred to as MIR-DS-PAM.

“Dichroism-sensitivity enables the quantitative assessment of fibre alignment by detecting the polarization-dependent absorption of anisotropic materials like collagen,” explains first author Eunwoo Park. “This adds a new contrast mechanism to conventional photoacoustic imaging, allowing simultaneous visualization of molecular content and microstructural organization without any labelling.”

Park and colleagues constructed a MIR-DS-PAM system using a pulsed quantum cascade laser as the light source. They tuned the laser to a centre wavelength of 6.0 µm to correspond with an absorption peak from the C=O stretching vibration in proteins. The laser beam was linearly polarized, modulated by a half-wave plate and used to illuminate the target tissue.

Tissue analysis

To validate the functionality of their MIR-DS-PAM technique, the researchers used it to image a formalin-fixed section of engineered heart tissue (EHT). They obtained images at four incident angles and used the acquired photoacoustic data to calculate the photoacoustic amplitude, which visualizes the protein content, as well as the degree of linear dichroism (DoLD) and the orientation angle of linear dichroism (AoLD), which reveal the extracellular matrix alignment.
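The paper's exact formulas are not reproduced here, but degree- and angle-of-linear-dichroism maps are typically computed Stokes-style from amplitude images taken at four polarization angles (0°, 45°, 90°, 135°). A minimal sketch under that assumption – the function name and formulas are illustrative, not those of the POSTECH system:

```python
import math

def dichroism_params(a0, a45, a90, a135):
    """Stokes-style estimate of linear-dichroism parameters from
    photoacoustic amplitudes measured at four incident polarization
    angles (0, 45, 90, 135 degrees). Illustrative formulas only."""
    s0 = (a0 + a45 + a90 + a135) / 2.0        # total absorption strength
    s1 = a0 - a90                             # 0 vs 90 degree contrast
    s2 = a45 - a135                           # 45 vs 135 degree contrast
    dold = math.hypot(s1, s2) / s0            # degree of linear dichroism
    aold = 0.5 * math.degrees(math.atan2(s2, s1)) % 180.0  # orientation angle
    return dold, aold

# A fibre aligned near 0 degrees absorbs most at 0-degree polarization:
dold, aold = dichroism_params(1.0, 0.6, 0.2, 0.6)
```

Applied pixel by pixel, these two quantities would yield DoLD and AoLD images of the kind described above.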

“Cardiac tissue features highly aligned extracellular matrix with complex fibre orientation and layered architecture, which are critical to its mechanical and electrical function,” Park explains. “These properties make it an ideal model for demonstrating the ability of MIR-DS-PAM to detect physiologically relevant histostructural and fibrosis-related changes.”

The researchers also used MIR-DS-PAM to quantify the structural integrity of EHT during development, using specimens cultured for one to five days before fixing. Analysis of the label-free images revealed that as the tissue matured, the DoLD gradually increased, while the standard deviation of the AoLD decreased – indicating increased protein accumulation with more uniform fibre alignment over time. They note that these results agree with those from immunofluorescence-stained confocal fluorescence microscopy.

Next, they examined diseased EHT with two types of fibrosis: cell-induced fibrosis (CIF) and drug-induced fibrosis (DIF). In the CIF sample, the average photoacoustic amplitude and AoLD uniformity were both lower than in normal EHT, indicating reduced protein density and disrupted fibre alignment. DIF exhibited a higher photoacoustic amplitude and lower AoLD uniformity than normal EHT, suggesting extensive extracellular matrix accumulation with disorganized orientation.

Both CIF and DIF showed a slight reduction in DoLD, again signifying a disorganized tissue structure, a common hallmark of fibrosis. The two fibrosis types, however, exhibited diverse biochemical profiles and different levels of mechanical dysfunction. The findings demonstrate the ability of MIR-DS-PAM to distinguish diseased from healthy tissue and identify different types of fibrosis. The researchers also imaged a tissue assembly containing both normal and fibrotic EHT to show that MIR-DS-PAM can capture features in a composite sample.

They conclude that MIR-DS-PAM enables label-free monitoring of both tissue development and fibrotic remodelling. As such, the technique shows potential for use within tissue engineering research, as well as providing a diagnostic tool for assessing tissue fibrosis or remodelling in biopsied samples. “Its ability to visualize both biochemical composition and structural alignment could aid in identifying pathological changes in cardiological, musculoskeletal or ocular tissues,” says Park.

“We are currently expanding the application of MIR-DS-PAM to disease contexts where extracellular matrix remodelling plays a central role,” he adds. “Our goal is to identify label-free histological biomarkers that capture both molecular and structural signatures of fibrosis and degeneration, enabling multiparametric analysis in pathological conditions.”

 

Astronomer Daniel Jaffe named president of the Giant Magellan Telescope project

Daniel Jaffe

Astronomer Daniel Jaffe has been appointed the next president of the Giant Magellan Telescope Corporation – the international consortium building the $2.5bn Giant Magellan Telescope (GMT). He succeeds Robert Shelton, who announced his retirement last year after eight years in the role.

Jaffe headed the astronomy department at the University of Texas at Austin from 2011 to 2015, served as the university’s vice-president for research from 2016 to 2025, and was interim provost from 2020 to 2021.

Jaffe has sat on the board of directors of the Association of Universities for Research in Astronomy and the Gemini Observatory and played a role in establishing the University of Texas at Austin’s partnership in the GMT.

Under construction in Chile and expected to be completed in the 2030s, the GMT will combine seven mirrors into a single telescope with a 25.4 m aperture. From the ground it will produce images 4–16 times sharper than those of the James Webb Space Telescope, and it will investigate the origins of the chemical elements and search for signs of life on distant planets.

“I am honoured to lead the GMT at this exciting stage,” notes Jaffe. “[It] represents a profound leap in our ability to explore the universe and employ a host of new technologies to make fundamental discoveries.”

“[Jaffe] brings decades of leadership in research, astronomy instrumentation, public-private partnerships, and academia,” noted Taft Armandroff, board chair of the GMTO Corporation. “His deep understanding of the Giant Magellan Telescope, combined with his experience leading large research enterprises and cultivating a collaborative environment, make him exceptionally well suited to lead the observatory through its next phase of construction and toward operations.”

Jaffe joins the GMT at a pivotal time, as it aims to secure the funding necessary to complete the telescope, with just over $1bn in private funds pledged so far. The collaboration recently added Northwestern University and the Massachusetts Institute of Technology to its international consortium, taking the number of members to 16 universities and research institutions.

In June 2025 the GMT, which is already 40% complete, received NSF approval confirming that the observatory will advance into its “major facilities final design phase”, one of the final steps before becoming eligible for federal construction funding.

Yet it faces competition from another next-generation telescope – the Thirty Meter Telescope (TMT), which will have a 30 m-diameter segmented primary mirror consisting of 492 elements of zero-expansion glass.

The TMT team chose Hawaii’s Mauna Kea peak as its location. However, protests by indigenous Hawaiians, who regard the site as sacred, have delayed the start of construction, with officials identifying the island of La Palma, in Spain’s Canary Islands, as an alternative site in 2019.

India turns to small modular nuclear reactors to meet climate targets

India has been involved in nuclear energy and power for decades, but now the country is turning to small modular nuclear reactors (SMRs) as part of a new, long-term push towards nuclear and renewable energy. In December 2025 the country’s parliament passed a bill that, for the first time, allows private companies to participate in India’s nuclear programme, which could see them involved in generating power, operating plants and making equipment.

Some commentators are unconvinced that the move will be enough to help meet India’s climate pledge of achieving 500 GW of non-fossil-fuel based energy generation by 2030. India has, however, now joined other nations, such as Russia and China, in taking an interest in SMRs. They could help stem the overall decline in nuclear power, which now accounts for just 9% of electricity generated around the world – down from 17.5% in 1996.

Last year India’s finance minister Nirmala Sitharaman announced a nuclear energy mission, funded with 200 billion Indian rupees ($2.2bn), to develop at least five indigenously designed and operational SMRs by 2033. Unlike huge conventional nuclear plants, such as those based on pressurized heavy-water reactors (PHWRs), an SMR has most or all of its components manufactured in factories before being assembled at the reactor site.

SMRs typically generate less than 300 MW of electrical power, but – being modular – additional capacity can be brought online quickly and easily. They also promise lower capital costs, shorter construction times, compatibility with lower-capacity grids and lower carbon emissions. Despite their promise, there are only two fully operating SMRs in the world – both in Russia – although two further high-temperature gas-cooled SMRs are currently being built in China. In June 2025 Rolls-Royce SMR was selected as the preferred bidder by Great British Nuclear to build the UK’s first fleet of SMRs, with plans to provide 470 MW of low-carbon electricity.

Cost benefit analysis

An official at the Department of Atomic Energy told Physics World that the mix of five new SMRs in India could include the 200 MW Bharat small modular reactor, which is based on pressurized-water-reactor technology and uses slightly enriched uranium as fuel. Other options include 55 MW small modular reactors, and the Indian government also plans to partner with the private sector to deploy 220 MW Bharat small reactors.

Despite such moves, some are unconvinced that small nuclear reactors could help India scale its nuclear ambitions. “SMRs are still to demonstrate that they can supply electricity at scale,” says Karthik Ganesan, a fellow and director of partnerships at the Council on Energy, Environment and Water (CEEW), a non-profit policy research think-tank based in New Delhi. “SMRs are a great option for captive consumption, where large investment that will take time to start generating is at a premium.”

Ganesan, however, says it is too early to comment on the commercial viability of SMRs as cost reductions from SMRs depend on how much of the technology is produced in a factory and in what quantities. “We are yet to get to that point and any test reactors deployed would certainly not be the ones to benchmark their long-term competitiveness,” he says. “[But] even at a higher tariff, SMRs will still have a use case for industrial consumers who want certainty in long-term tariffs and reliable continuous supply in a world where carbon dioxide emissions will be much smaller than what we see from the power sector today.”

M V Ramana from the University of British Columbia, Vancouver, who works on international security and energy supply, is concerned about the cost efficiency of SMRs compared with their traditional counterparts. “Larger reactors are cheaper on a per-megawatt basis because their material and work requirements do not scale linearly with power capacity,” says Ramana. This, he argues, means that the electricity SMRs produce will be more expensive than nuclear energy from large reactors, which is already far more expensive than renewables such as solar and wind energy.

Clean or unclean?

Even if SMRs take over from PHWRs, there is still the question of what to do with their nuclear waste. As Ramana points out, all activities linked to the nuclear fuel chain – from mining uranium to dealing with the radioactive wastes produced – have significant health and environmental impacts. “The nuclear fuel chain is polluting, albeit in a different way from that of fossil fuels,” he says, adding that those pollutants remain hazardous for hundreds of thousands of years. “There is no demonstrated solution to managing these radioactive wastes – nor can there be, given the challenge of trying to ensure that these materials do not come into contact with living beings,” says Ramana.

Ganesan, however, thinks that nuclear energy is still clean as it produces electricity with a much lower environmental footprint, especially when it comes to so-called “criteria pollutants”: ozone, particulate matter, carbon monoxide, lead, sulphur dioxide and nitrogen dioxide. While nuclear waste still needs to be managed, Ganesan says the associated costs are already included in the price of setting up a reactor. “In due course, with technological development, the burn up will [be] significantly higher and waste generated a lot lesser.”

Gravitational lensing sheds new light on Hubble constant controversy

By studying how light from eight distant quasars is gravitationally lensed as it propagates towards Earth, astronomers have calculated a new value for the Hubble constant – a parameter that describes the rate at which the universe is expanding. The result agrees more closely with previous “late-universe” probes of this constant than it does with calculations based on observations of the cosmic microwave background (CMB) in the early universe, strengthening the notion that we may be misunderstanding something fundamental about how the universe works.

The universe has been expanding ever since the Big Bang nearly 14 billion years ago. We know this, in part, because of observations made in the 1920s by the American astronomer Edwin Hubble. By measuring the redshift of various galaxies, Hubble discovered that galaxies further away from Earth are moving away faster than galaxies that are closer to us. The relationship between this speed and the galaxies’ distance is known as the Hubble constant, H0.
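Hubble’s law is a one-line proportionality, v = H0 × d; a quick back-of-envelope check (the H0 value here is purely illustrative, sitting between the two disputed measurements discussed below):

```python
# Hubble's law: recession velocity v = H0 * d
H0 = 70.0          # km/s/Mpc, an illustrative value
d = 100.0          # distance to a galaxy, in megaparsecs
v = H0 * d         # recession velocity in km/s

# A galaxy 100 Mpc away recedes at ~7000 km/s, a few per cent of light speed
fraction_of_c = v / 299792.458
```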

Astronomers have developed several techniques for measuring H0. The problem is that different techniques deliver different values. According to measurements made by the European Space Agency’s Planck satellite of the CMB radiation “left over” from the Big Bang, the value of H0 is about 67 kilometres per second per megaparsec (km/s/Mpc), where one Mpc is about 3.3 million light years. In contrast, “distance-ladder” measurements, such as those made by the SH0ES collaboration using observations of type Ia supernovae, yield a value of about 73 km/s/Mpc. This discrepancy is known as the Hubble tension.

Time-delay cosmography

In the latest work, the TDCOSMO collaboration, which includes astronomers Kenneth Wong and Eric Paic of the University of Tokyo, Japan, measured H0 using a technique called time-delay cosmography. This well-established method dates back to 1964 and uses the fact that massive galaxies can act as lenses, deflecting the light from objects behind them so that from our perspective, these objects appear distorted.

“This is called gravitational lensing, and if the circumstances are right, we’ll actually see multiple distorted images, each of which will have taken a slightly different pathway to get to us, taking different amounts of time,” Wong explains.

By looking for identical features in these images that appear slightly out of sync, astronomers can measure the differences in the time taken for the light to reach Earth via each pathway. Then, by combining these data with estimates of how the mass of the lensing galaxy is distributed, they can calculate H0.
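The reason a measured delay pins down H0 is that all the lensing distances scale as c/H0, so for a fixed lens model the predicted delay between images scales as 1/H0. A schematic rescaling – every number below is invented for illustration:

```python
# Time-delay cosmography in miniature: at fixed lens model and redshifts,
# the predicted delay between lensed images scales as 1/H0. If a model
# built with a fiducial H0 predicts one delay and we observe another,
# the inferred H0 follows by simple rescaling. (Numbers are made up.)
H0_fiducial = 70.0    # km/s/Mpc assumed when modelling the lens
dt_predicted = 30.0   # days, delay predicted by the model at H0_fiducial
dt_observed = 29.3    # days, delay actually measured from the light curves

H0_inferred = H0_fiducial * dt_predicted / dt_observed  # ~71.7 km/s/Mpc
```

In the real analysis the lens mass model and its uncertainties dominate the error budget, as Wong notes later in the article.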

A real tension, not a measurement artefact

Wong and colleagues measured the light from eight strongly lensed quasars using various telescopes, including the James Webb Space Telescope (JWST), the Keck Telescopes and the Very Large Telescope (VLT). They also made use of observations from the Sloan Lens ACS (SLACS) sample with Keck and the Legacy Survey (SL2S) sample.

Based on these measurements, they obtained an H0 value of roughly 71.6 km/s/Mpc, which is more consistent with present-day observations (such as those from SH0ES) than early-universe ones (such as those from Planck). Wong explains that this supports the idea that the Hubble tension arises from real physics, not just some unknown error in the various methods. “Our measurement is completely independent of other methods, both early- and late-universe, so if there are any systematic uncertainties in those, we should not be affected by them,” he says.

The astronomers say that the SLACS and SL2S sample data are in excellent agreement with the new TDCOSMO-2025 sample, while the new measurements improve the precision of H0 to 4.6%. However, Paic notes that nailing down the value of H0 to a level that would “definitely confirm” the Hubble tension will require a precision of 1–2%. “This could be possible by increasing the number of objects observed as well as ruling out any systematic errors as yet unaccounted for,” he says.

Wong adds that while the TDCOSMO-2025 dataset contains its own uncertainties, multiple independent measurements should, in principle, strengthen the result. “One of the largest sources of uncertainty is the fact that we don’t know exactly how the mass in the lens galaxies is distributed,” he explains. “It is usually assumed that the mass follows some simple profile that is consistent with observations, but it is hard to be sure and this uncertainty can directly influence the values we calculate.”

The biggest hurdle, Wong adds, will “probably be addressing potential sources of systematic uncertainty, making sure we have thought of all the possible ways that our result could be wrong or biased and figuring out how to handle those uncertainties.”

The study is detailed in Astronomy and Astrophysics.

RFID-tagged drug capsule lets doctors know when it has been swallowed

Taking medication as and when prescribed is crucial for it to have the desired effect. But nearly half of people with chronic conditions don’t adhere to their medication regimes, a serious problem that leads to preventable deaths, drug resistance and increased healthcare costs. So how can medical professionals ensure that patients are taking their medicine as prescribed?

A team at Massachusetts Institute of Technology (MIT) has come up with a solution: a drug capsule containing an RFID tag that uses radiofrequency (RF) signals to communicate that it has been swallowed, and then bioresorbs into the body.

“Medication non-adherence remains a major cause of preventable morbidity and cost, but existing ingestible tracking systems rely on non-degradable electronics,” explains project leader Giovanni Traverso. “Our motivation was to create a passive, battery-free adherence sensor that confirms ingestion while fully biodegrading, avoiding long-term safety and environmental concerns associated with persistent electronic devices.”

The device – named SAFARI (smart adherence via Faraday cage and resorbable ingestible) – incorporates an RFID tag with a zinc foil RF antenna and an RF chip, as well as the drug payload, inside an ingestible gelatin capsule. The capsule is coated with a mixture of cellulose and molybdenum particles, which blocks the transit of any RF signals.

SAFARI capsules with and without RF-blocking coating

Once swallowed, however, this shielding layer breaks down in the stomach. The RFID tag (which can be preprogrammed with information such as dose metadata, manufacturing details and unique ID) can then be wirelessly queried by an external reader and return a signal from inside the body confirming that the medication has been ingested.

The capsule itself dissolves upon exposure to digestive fluids, releasing the desired medication; the metal antenna components also dissolve completely in the stomach. The use of biodegradable materials is key as it eliminates the need for device retrieval and minimizes the risk of gastrointestinal (GI) blockage. The tiny (0.16 mm²) RFID chip remains intact and should safely leave the body through the GI tract.

Traverso suggests that the first clinical applications for the SAFARI capsule will likely be high-risk settings in which objective ingestion confirmation is particularly valuable. “[This includes] tuberculosis, HIV, transplant immunosuppression or cardiovascular therapies, where missed doses can have serious clinical consequences,” he tells Physics World.

In vivo demonstration

To assess the degradation of the SAFARI capsule and its components in vitro, Traverso and colleagues placed the capsule into simulated gastric fluid at physiological temperature (37 °C). The RF shielding coating dissolved in 10–20 min, while the capsule and the zinc layer in the RFID tag disintegrated into pieces after one day.

Next, the team endoscopically delivered SAFARI capsules into the stomachs of sedated pigs, chosen because their GI tract is similar in size to a human’s. Once in contact with gastric fluid in the stomach, the capsule coating swelled and then partially dissolved (as seen in endoscopic images), exposing the RFID tag. The researchers found that, in general, the tag and capsule parts disintegrated in the stomach within 24 h.

A panel antenna positioned 10 cm from the animal captured the tag data. Even with the RFID tags immersed in gastric fluid, the external receiver could effectively record signals in the frequency range of 900–925 MHz, with RSSI (received signal strength indicator) values ranging from 65 to 78 dB – demonstrating that the tag could effectively transmit RF signals from inside the stomach.

The researchers conclude that this successful use of SAFARI in swine indicates the potential for translation to clinical research. They note that the device should be safe for human ingestion as its composite materials meet established dietary and biomedical exposure limits, with levels of zinc and molybdenum orders of magnitude below those associated with toxicity.

“We have demonstrated robust performance and safety in large-animal models, which is an important translational milestone,” explains first author Mehmet Girayhan Say. “Before human studies, further work is needed on chronic exposure with characterization of any material accumulation upon repeated dosing, as well as user-centred integration of external readers to support real-world clinical workflows.”

Quantum state teleported between quantum dots at telecoms wavelengths

Physicists at the University of Stuttgart, Germany, have teleported a quantum state between photons generated by two different semiconductor quantum dot light sources located several metres apart. Though the distance involved in this proof-of-principle “quantum repeater” experiment is small, members of the team describe the feat as a prerequisite for future long-distance quantum communications networks.

“Our result is particularly exciting because such a quantum Internet will encompass these types of distant quantum nodes and will require quantum states that are transmitted among these different nodes,” explains Tim Strobel, a PhD student at Stuttgart’s Institute of Semiconductor Optics and Functional Interfaces (IHFG) and the lead author of a paper describing the research. “It is therefore an important step in showing that remote sources can be effectively interfaced in this way in quantum teleportation experiments.”

In the Stuttgart study, one of the quantum dots generates a single photon while the other produces a pair of photons that are entangled – meaning that the quantum state of one photon is closely linked to the state of the other, no matter how far apart they are. One of the photons in the entangled pair then travels to the other quantum dot and interferes with the photon there. This process produces a superposition that allows the information encapsulated in the single photon to be transferred to the distant “partner” photon from the pair.
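The procedure described here is the standard quantum teleportation protocol, and its logic can be checked with a toy statevector simulation. The sketch below models the abstract three-qubit protocol, not the photonic experiment itself: qubit 0 holds the state to be teleported, qubits 1 and 2 share an entangled pair, and after a Bell measurement on qubits 0 and 1 a classically communicated correction leaves qubit 2 carrying the original state:

```python
import numpy as np

# Single-qubit gates
I = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

def kron(*ops):
    """Tensor product of a list of operators."""
    out = np.eye(1)
    for op in ops:
        out = np.kron(out, op)
    return out

# State to teleport, |psi> = a|0> + b|1>, held on qubit 0
a, b = 0.6, 0.8j
psi = np.array([a, b])

# Qubits 1 (sender) and 2 (receiver) share a Bell pair (|00> + |11>)/sqrt(2)
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
state = np.kron(psi, bell)  # 3-qubit state, qubit 0 leftmost

# Bell-measurement circuit on qubits 0 and 1: CNOT(0 -> 1), then H on qubit 0
CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0]])
state = kron(H, I, I) @ kron(CNOT, I) @ state

# For each measurement outcome (m0, m1), the conditional state of qubit 2
# is row 2*m0 + m1 of the amplitude array; after the Pauli correction it
# should always equal |psi>
fidelities = []
for m0 in (0, 1):
    for m1 in (0, 1):
        out = state.reshape(4, 2)[2 * m0 + m1]
        out = out / np.linalg.norm(out)
        corr = (Z if m0 else I) @ (X if m1 else I)  # X if m1, then Z if m0
        out = corr @ out
        fidelities.append(abs(np.vdot(psi, out)) ** 2)
```

All four fidelities come out as 1: whatever the measurement outcome, the corrected receiver qubit carries the input state, which is what the photonic version accomplishes with the distant partner photon.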

Quantum frequency converters

Strobel says the most challenging part of the experiment was making photons from two remote quantum dots interfere with each other. Such interference is only possible if the two particles are indistinguishable, meaning they must be similar in every regard, be it in their temporal shape, spatial shape or wavelength. In contrast, each quantum dot is unique, especially in terms of its spectral properties, and each one emits photons at slightly different wavelengths.

To close the gap, the team used devices called quantum frequency converters to precisely tune the wavelength of the photons and match them spectrally. The researchers also used the converters to shift the original wavelengths of the photons emitted from the quantum dots (around 780 nm) to a wavelength commonly used in telecommunications (1515 nm) without altering the quantum state of the photons. This offers further advantages: “Being at telecommunication wavelengths makes the technology compatible with the existing global optical fibre network, an important step towards real-life applications,” Strobel tells Physics World.

Proof-of-principle experiment

In this work, the quantum dots were separated by an optical fibre just 10 m in length. However, the researchers aim to push this to considerably greater distances in the future. Strobel notes that the Stuttgart study was published in Nature Communications back-to-back with an independent study carried out by researchers led by Rinaldo Trotta of Sapienza University in Rome, Italy. The Rome-based group demonstrated quantum state teleportation across the Sapienza University campus at shorter wavelengths, enabled by the brightness of their quantum-dot source.

“These two papers that we published independently strengthen the measurement outcomes, demonstrating the maturity of quantum dot light sources in this domain,” Strobel says. Semiconducting quantum dots are particularly attractive for this application, he adds, because as well as producing both single and entangled photons on demand, they are also compatible with other semiconductor technologies.

Fundamental research pays off

Simone Luca Portalupi, who leads the quantum optics group at IHFG, notes that “several years of fundamental research and semiconductor technology are converging into these quantum teleportation experiments”. For Peter Michler, who led the study team, the next step is to leverage these advances to bring quantum-dot-based teleportation technology out of a controlled laboratory environment and into the real world.

Strobel points out that there is already some precedent for this, as one of the group’s previous studies showed that they could maintain photon entanglement across a 36-km fibre link deployed across the city of Stuttgart. “The natural next step would be to show that we can teleport the state of a photon across this deployed fibre link,” he says. “Our results will stimulate us to improve each building block of the experiment, from the sample to the setup.”

Quantum metrology at NPL: we explore the challenges and opportunities

This episode of the Physics World Weekly podcast features a conversation with Tim Prior and John Devaney of the National Physical Laboratory (NPL), which is the UK’s national metrology institute.

Prior is NPL’s quantum programme manager and Devaney is its quantum standards manager. They talk about NPL’s central role in the recent launch of NMI-Q, which brings together some of the world’s leading national metrology institutes to accelerate the development and adoption of quantum technologies.

Prior and Devaney describe the challenges and opportunities of developing metrology and standards for rapidly evolving technologies including quantum sensors, quantum computing and quantum cryptography. They talk about the importance of NPL’s collaborations with industry and academia and explore the diverse career opportunities for physicists at NPL. Prior and Devaney also talk about their own careers and share their enthusiasm for working in the cutting-edge and fast-paced field of quantum metrology.

This podcast is sponsored by the National Physical Laboratory.

Further reading

Why quantum metrology is the driving force for best practice in quantum standardization

Performance metrics and benchmarks point the way to practical quantum advantage

End note: NPL retains copyright on this article.

Mapping electron phases in nanotube arrays

Carbon nanotube arrays offer a way to investigate the behaviour of electrons in low-dimensional systems. By arranging well-aligned 1D nanotubes into a 2D film, the researchers create a coupled-wire structure that allows them to study how electrons move and interact as the system transitions between different dimensionalities. Using a gate electrode positioned on top of the array, the researchers were able to tune both the carrier density (the number of electrons and holes per unit area) and the strength of electron–electron interactions, enabling controlled access to three distinct regimes. The nanotubes can behave as weakly coupled 1D channels, in which electrons move along each nanotube; as a 2D Fermi liquid, in which electrons can move between nanotubes and the film behaves like a conventional metal; or, at low carrier densities, as a set of quantum-dot-like islands showing Coulomb blockade, in which sections of the nanotubes become isolated.

The dimensional transitions are set by two key temperatures: T₂D, where electrons begin to hop between neighbouring nanotubes, and T₁D, where the system behaves as a Luttinger liquid – a 1D state in which electrons cannot easily pass each other and therefore move in a strongly correlated, collective way. Changing the number of holes in the nanotubes changes how strongly the tubes interact with each other. This controls when the system stops acting as separate 1D wires and when strong interactions make parts of the film break up into isolated regions that show Coulomb blockade.

The researchers built a phase diagram by looking at how the conductance changes with temperature and voltage, and by checking how well it follows power‑law behaviour at different energy ranges. This approach allows them to identify the boundaries between Tomonaga–Luttinger liquid, Fermi liquid and Coulomb blockade phases across a wide range of gate voltages and temperatures.
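The power-law check described above amounts to fitting a straight line to conductance versus temperature on log–log axes, since G ∝ T^α implies log G = α log T + const. A schematic example with synthetic data – G0 and the exponent are made-up illustrative values, not numbers from the paper:

```python
import math
import random

# Synthetic conductance data following G = G0 * T**alpha with 1% noise,
# mimicking the power-law behaviour used to flag a Tomonaga-Luttinger
# liquid. G0 and alpha are illustrative, not taken from the paper.
random.seed(0)
G0, alpha = 2.0e-6, 0.5
temps = [1.0, 2.0, 4.0, 8.0, 16.0, 32.0]  # kelvin
conds = [G0 * T**alpha * (1 + 0.01 * random.uniform(-1, 1)) for T in temps]

# Least-squares slope of log G versus log T recovers the exponent alpha
xs = [math.log(T) for T in temps]
ys = [math.log(G) for G in conds]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
```

Deviations from a single straight line over some temperature or voltage window are then the signal that the system has crossed into a different phase.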

Overall, the work demonstrates a continuous crossover between 2D, 1D and 0D electronic behaviour in a controllable nanotube array. This provides an experimentally accessible platform for studying correlated low‑dimensional physics and offers insights relevant to the development of nanoscale electronic devices and future carbon nanotube technologies.

Read the full article

Dimensionality and correlation effects in coupled carbon nanotube arrays

Xiaosong Deng et al 2025 Rep. Prog. Phys. 88 088001

Do you want to learn more about this topic?

Structural approach to charge density waves in low-dimensional systems: electronic instability and chemical bonding Jean-Paul Pouget and Enric Canadell (2024)

CMS spots hints of a new form of top‑quark matter

The CMS Collaboration investigated in detail events in which a top quark and an anti‑top quark are produced together in high‑energy proton–proton collisions at √s = 13 TeV, using the full 138 fb⁻¹ dataset collected between 2016 and 2018. The top quark is the heaviest fundamental particle and decays almost immediately after being produced in high-energy collisions; the anti-top quark has the same mass and lifetime but opposite charges. When a top quark and an anti-top quark are produced together, they form a top–antitop pair (tt̄). Because of the top quark's fleeting lifetime, the formation of a bound top–antitop state was long considered highly unlikely and had never been observed.

Focusing on events with two charged leptons (the top and anti-top quarks decay to yield two electrons, two muons or one electron and one muon) and multiple jets (sprays of particles associated with top-quark decay), the analysis examines the invariant mass of the top–antitop system along with two angular observables that directly probe how the spins of the top and anti‑top quarks are correlated. These measurements allow the team to compare the data with the prediction for non-resonant tt̄ production based on fixed-order perturbative quantum chromodynamics (QCD), the framework physicists normally use to calculate how quarks behave according to the standard model of particle physics.

Near the kinematic threshold for top–antitop production, CMS observes a significant excess of events relative to the QCD prediction. The number of extra events can be translated into a production rate: using a simplified model based on non‑relativistic QCD, the team estimates that the excess corresponds to a cross section of about 8.8 picobarns, with an uncertainty of roughly +1.2/–1.4 picobarns. The pattern of the excess, including its spin‑correlation features, is consistent with the production of a colour-singlet pseudoscalar (a top–antitop pair in the ¹S₀ state, the simplest, lowest-energy configuration), and therefore with the prediction of non-relativistic QCD near the tt̄ threshold. The statistical significance of the excess exceeds five standard deviations, indicating that the effect is very unlikely to be a statistical fluctuation. Researchers are keen to find a toponium‑like state because it would reveal how the strongest force in nature behaves at the highest energies, test key theories of heavy‑quark physics and potentially expose new physics beyond the Standard Model.
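A back‑of‑envelope sketch shows what the quoted cross section implies, using only the numbers from the article: the number of events produced is N = σ × L, where L is the integrated luminosity. This is the raw production count, before branching fractions, detector acceptance and efficiency, so only a small fraction appears in the dilepton channel actually analysed.

```python
# Assumed inputs, taken from the article text.
sigma_pb = 8.8           # excess cross section (picobarns)
lumi_fb = 138.0          # integrated luminosity (inverse femtobarns)

# Unit conversion: 1 fb^-1 = 1000 pb^-1.
lumi_pb = lumi_fb * 1e3

# Raw number of produced events, before any selection efficiencies.
n_produced = sigma_pb * lumi_pb
print(f"~{n_produced:.2e} threshold tt-bar events produced")
```

So the 8.8 pb excess corresponds to over a million produced events in the full dataset, which is why a sub‑percent‑level deviation near threshold can still reach five‑sigma significance.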

The researchers emphasise that modelling the tt̄ threshold region is theoretically challenging, and that alternative explanations remain possible. Nonetheless, the result aligns with long‑standing predictions from non‑relativistic QCD that heavy quarks could form short‑lived bound states near threshold. The analysis also showcases spin correlation as an effective means to discover and characterise such effects, which were previously considered beyond the reach of experimental capabilities. Starting with the confirmation by the ATLAS Collaboration last July, this observation has sparked and continues to inspire follow-up theoretical and experimental work, opening up a new field of study involving bound states of heavy quarks and providing new insight into the behaviour of the strong force at high energies.

Read the full article

Observation of a pseudoscalar excess at the top quark pair production threshold

The CMS Collaboration 2025 Rep. Prog. Phys. 88 087801

Do you want to learn more about this topic?

The sea of quarks and antiquarks in the nucleon D F Geesaman and P E Reimer (2019)

Photonics West explores the future of optical technologies

The 2026 SPIE Photonics West meeting takes place in San Francisco, California, from 17 to 22 January. The premier event for photonics research and technology, Photonics West incorporates more than 100 technical conferences covering topics including lasers, biomedical optics, optoelectronics, quantum technologies and more.

As well as the conferences, Photonics West also offers 60 technical courses and a new Career Hub with a co-located job fair. There are also five world-class exhibitions featuring over 1500 companies and incorporating industry-focused presentations, product launches and live demonstrations. The first of these is the BiOS Expo, which begins on 17 January and examines the latest breakthroughs in biomedical optics and biophotonics technologies.

Then starting on 20 January, the main Photonics West Exhibition will host more than 1200 companies and showcase the latest innovative optics and photonics devices, components, systems and services. Alongside, the Quantum West Expo features the best in quantum-enabling technology advances, the AR | VR | MR Expo brings together leading companies in XR hardware and systems and – new for 2026 – the Vision Tech Expo highlights cutting-edge vision, sensing and imaging technologies.

Here are some of the product innovations on show at this year’s event.

Enabling high-performance photonics assembly with SmarAct

As photonics applications increasingly require systems with high complexity and integration density, manufacturers face a common challenge: how to assemble, align and test optical components with nanometre precision – quickly, reliably and at scale. At Photonics West, SmarAct presents a comprehensive technology portfolio addressing exactly these demands, spanning optical assembly, fast photonics alignment, precision motion and advanced metrology.

SmarAct’s photonics assembly portfolio

A central highlight is SmarAct’s Optical Assembly Solution, presented together with a preview of a powerful new software platform planned for release in late Q1 2026. This software tool is designed to provide exceptional flexibility for implementing automation routines and process workflows into user-specific control applications, laying the foundation for scalable and future-proof photonics solutions.

For high-throughput applications, SmarAct showcases its Fast Photonics Alignment capabilities. By combining high-dynamic motion systems with real-time feedback and controller-based algorithms, SmarAct enables rapid scanning and active alignment of photonic integrated circuits (PICs) and optical components such as fibres, fibre array units, lenses, beam splitters and more. These solutions significantly reduce alignment time while maintaining sub-micrometre accuracy, making them ideal for demanding photonics packaging and assembly tasks.

Both the Optical Assembly Solution and Fast Photonics Alignment are powered by SmarAct’s electromagnetic (EM) positioning axes, which form the dynamic backbone of these systems. The direct-drive EM axes combine high speed, high force and exceptional long-term durability, enabling fast scanning, smooth motion and stable positioning even under demanding duty cycles. Their vibration-free operation and robustness make them ideally suited for high-throughput optical assembly and alignment tasks in both laboratory and industrial environments.

Precision feedback is provided by SmarAct’s advanced METIRIO optical encoder family, designed to deliver high-resolution position feedback for demanding photonics and semiconductor applications. The METIRIO stands out by offering sub-nanometre position feedback in an exceptionally compact and easy-to-integrate form factor. Compatible with linear, rotary and goniometric motion systems – and available in vacuum-compatible designs – the METIRIO is ideally suited for space-constrained photonics setups, semiconductor manufacturing, nanopositioning and scientific instrumentation.

For applications requiring ultimate measurement performance, SmarAct presents the PICOSCALE Interferometer and Vibrometer. These systems provide picometre-level displacement and vibration measurements directly at the point of interest, enabling precise motion tracking, dynamic alignment, and detailed characterization of optical and optoelectronic components. When combined with SmarAct’s precision stages, they form a powerful closed-loop solution for high-yield photonics testing and inspection.

Together, SmarAct’s motion, metrology and automation solutions form a unified platform for next-generation photonics assembly and alignment.

  • Visit SmarAct at booth #3438 at Photonics West and booth #8438 at BiOS to discover how these technologies can accelerate your photonics workflows.

Avantes previews AvaSoftX software platform and new broadband light source

Photonics West 2026 will see Avantes present the first live demonstration of its completely redesigned software platform, AvaSoftX, together with a sneak peek of its new broadband light source, the AvaLight-DH-BAL. The company will also run a series of application-focused live demonstrations, highlighting recent developments in laser-induced breakdown spectroscopy (LIBS), thin-film characterization and biomedical spectroscopy.

AvaSoftX is developed to streamline the path from raw spectra to usable results. The new software platform offers preloaded applications tailored to specific measurement techniques and types, such as irradiance, LIBS, chemometry and Raman. Each application presents the controls and visualizations needed for that workflow, reducing time and the risk of user error.

The new AvaSoftX software platform

Smart wizards guide users step-by-step through the setup of a measurement – from instrument configuration and referencing to data acquisition and evaluation. For more advanced users, AvaSoftX supports customization with scripting and user-defined libraries, enabling the creation of reusable methods and application-specific data handling. The platform also includes integrated instruction videos and online manuals to support users directly within the platform.

The software features an accessible dark interface optimized for extended use in laboratory and production environments. Improved LIBS functionality will be highlighted through a live demonstration that combines AvaSoftX with the latest Avantes spectrometers and light sources.

Also making its public debut is the AvaLight-DH-BAL, a new and improved deuterium–halogen broadband light source designed to replace the current DH product line. The system delivers continuous broadband output from 215 to 2500 nm and combines a more powerful halogen lamp with a reworked deuterium section for improved optical performance and stability.

A switchable deuterium and halogen optical path is combined with deuterium peak suppression to improve dynamic range and spectral balance. The source is built into a newly developed, more robust housing to improve mechanical and thermal stability. Updated electronics support adjustable halogen output, a built-in filter holder, and both front-panel and remote-controlled shutter operation.

The AvaLight-DH-BAL is intended for applications requiring stable, high-output broadband illumination, including UV–VIS–NIR absorbance spectroscopy, materials research and thin-film analysis. The official launch date for the light source, as well as the software, will be shared in the near future.

Avantes will also run a series of live application demonstrations. These include a LIBS setup for rapid elemental analysis, a thin-film measurement system for optical coating characterization, and a biomedical spectroscopy demonstration focusing on real-time measurement and analysis. Each demo will be operated using the latest Avantes hardware and controlled through AvaSoftX, allowing visitors to assess overall system performance and workflow integration. Avantes’ engineering team will be available throughout the event.

  • For product previews, live demonstrations and more, meet Avantes at booth #1157.

HydraHarp 500: high-performance time tagger redefines precision and scalability

One year after its successful market introduction, the HydraHarp 500 continues to be a standout highlight at PicoQuant’s booth at Photonics West. Designed to meet the growing demands of advanced photonics and quantum optics, the HydraHarp 500 sets benchmarks in timing performance, scalability and flexible interfacing.

At its core, the HydraHarp 500 delivers exceptional timing precision combined with ultrashort jitter and dead time, enabling reliable photon timing measurements even at very high count rates. With support for up to 16 fully independent input channels plus a common sync channel, the system allows true simultaneous multichannel data acquisition without cross-channel dead time, making it ideal for complex correlation experiments and high-throughput applications.

The HydraHarp 500

A key strength of the HydraHarp 500 is its high flexibility in detector integration. Multiple trigger methods support a wide range of detector technologies, from single-photon avalanche diodes (SPADs) to superconducting nanowire single-photon detectors (SNSPDs). Versatile interfaces, including USB 3.0 and a dedicated FPGA interface, ensure seamless data transfer and easy integration into existing experimental setups. For distributed and synchronized systems, White Rabbit compatibility enables precise cross-device timing coordination.

Engineered for speed and efficiency, the HydraHarp 500 combines ultrashort per-channel dead time with industry-leading timing performance, ensuring complete datasets and excellent statistical accuracy even under demanding experimental conditions.

Looking ahead, PicoQuant is preparing to expand the HydraHarp family with the upcoming HydraHarp 500 L, a new variant that will set new standards for data throughput and scalability. With outstanding timing resolution, excellent timing precision and up to 64 flexible channels, the HydraHarp 500 L is engineered for the highest-throughput applications and is powered – for the first time – by USB 3.2 Gen 2×2, making it ideal for rapid, large-volume data acquisition.

With the HydraHarp 500 and the forthcoming HydraHarp 500 L, PicoQuant continues to redefine what is possible in photon timing, delivering precision, scalability and flexibility for today’s and tomorrow’s photonics research. For more information, visit www.picoquant.com or contact us at info@picoquant.com.

  • Meet PicoQuant at BiOS booth #8511 and Photonics West booth #3511.


Copyright © 2026 by IOP Publishing Ltd and individual contributors