For as long as computers have existed, physicists have used them as tools to understand, predict and model the natural world. Computing experts, for their part, have used advances in physics to develop machines that are faster, smarter and more ubiquitous than ever. This collection celebrates the latest phase in this symbiotic relationship, as the rise of artificial intelligence and quantum computing opens up new possibilities in basic and applied research.
Manufacturing silicon qubits at scale
As quantum computing matures, will decades of engineering give silicon qubits an edge? Fernando Gonzalez-Zalba, Tsung-Yeh Yang and Alessandro Rossi think so
Small computers find an industrial niche
Physicist and Raspberry Pi inventor Eben Upton explains how simple computers are becoming integral to the Internet of Things
30 years of the web
Challenges of interdisciplinary physics and the Web at 30
Physics World journalists discuss the week’s highlights
Vague but exciting: how the Web transformed business
James McKenzie explains how Tim Berners-Lee's invention of the World Wide Web at CERN has revolutionized the way we do business
Electronic publishing and visions of hypertext
Tim Berners-Lee predicts the future of online publishing in an article he wrote for Physics World in 1992
Illustrating 30 years of the Web
Jess Wade illustrates the history of the World Wide Web, from the technology that enabled it to the staple it is today
The future of the Internet
Emerging technologies shaping our connected world
Physics World 30th anniversary podcast series – 30 years of the World Wide Web
Fifth episode in mini-series revisits the birth of the Web and the challenges it now faces
The third pillar of science
Computing is transforming scientific research, but are researchers and software code adapting at the same rate? Benjamin Skuse finds out
Simulations reveal new insights
Defying gravity: insights into hula hoop levitation
Successful hula hooping requires a gyrating body with a particular slope and curvature
Virtual patient populations enable more inclusive medical device development
University of Leeds spin-out adsilico is using computational medicine to enable more inclusive and patient-centric medical device development
Mathematical model sheds light on how exercise suppresses tumour growth
A simple mathematical model examines the intricate relationship between exercise, immune function and cancer
Optimization algorithm gives laser fusion a boost
Simulations suggest that iterative technique could increase the energy output of direct-drive inertial confinement fusion
Electromagnetic waves solve partial differential equations
New photonic technique could boost analogue alternatives to numerical methods
Bursts of embers play outsized role in wildfire spread, say physicists
Experiments on tracking firebrands could improve predictions of spot-fire risks
Machine learning reveals new science
Deep learning helps radiologists detect lung cancer on chest X-rays
Introducing artificial intelligence into the clinical workflow helps radiologists detect lung cancer lesions on chest X-rays and dismiss false positives
Machine learning puts nanomaterials in the picture
Algorithms help materials scientists recognize patterns in structure-function relationships
Deep learning algorithm helps diagnose neurological emergencies
A deep learning algorithm detects brain haemorrhages on head CT scans with comparable performance to highly trained radiologists
Artificial intelligence helps detect atrial fibrillation
An artificial intelligence model can identify patients with intermittent atrial fibrillation from scans performed during normal heart rhythm
Machine learning is implemented on an IBM quantum processor
Proof-of-concept demonstration performed using two superconducting qubits
AI framework uses medical images to individualize radiotherapy dose
An image-based artificial intelligence framework predicts a personalized radiation dose that minimizes the risk of treatment failure
AI predicts coma outcome from EEG trace
A machine learning algorithm can read electroencephalograms as well as clinicians
The latest in quantum computing
Quantum processor enters unprecedented territory for error correction
Errors on Google Quantum AI device drop below threshold needed to “win” the error-correction game
Quantum error correction research yields unexpected quantum gravity insights
A universal boundary that distinguishes effective approximate error-correction codes from ineffective ones turns out to be connected to the fundamental nature of the universe
Thermal dissipation decoheres qubits
Superconducting quantum bits release their energy into their environment as photons
Spins hop between quantum dots in new quantum processor
Hopping-based logic achieved at high fidelity
‘Poor man’s Majoranas’ offer testbed for studying possible qubits
A new approach could put Majorana particles on track to become a novel qubit platform, but some scientists doubt the results’ validity
Quantum error correction produces better ‘magic’ states
Proof-of-concept demonstration yields encoded magic states that are robust against any single-qubit error
Related events
- Quantum | Workshop: Machine Learning for Quantum Matter | 24–28 February 2025 | Dresden, Germany
- Mathematics and computation | Virtual event: Comprehensive online CasaXPS short course | 27–28 March 2025
- Mathematics and computation | Conference: World Conference on Data Science & Statistics | 16–18 June 2025 | Amsterdam, Netherlands