Vacuum and cryogenics

…And now for the next 20 years

01 Oct 2008

Six leading physicists look at the big challenges that lie ahead

In Physics World and also on our website physicsworld.com, we report on breakthroughs in physics and describe the latest research trends with in-depth features. This issue (pages 21, 29, 43, 49; print version only) contains some flashbacks to the big events that took place in physics over the last 20 years, but what are the burning questions in physics today?

These six answers are the most lucid and interesting replies to a survey we sent out to a number of physicists in different fields. Because theorists were happier to speculate than experimentalists, theoretical topics predominate. Many important areas — environmental physics, quantum computing, renewable-energy research and almost all of condensed-matter physics — are absent. The result is less a comprehensive overview of the future of physics than a series of snapshots. Yet with physics increasingly fragmented into sub-fields, some of which have only a tangential connection to each other (see pages 36–37, print version only), it is almost impossible to capture the future direction of the subject as a whole. Indeed, one of the fields we profile here — network theory — was almost nonexistent two decades ago. Others, including string theory and (arguably) quantum field theory, were still in their infancy compared with where they are today.

So where will the future lead us? Predictions in physics can be problematic — witness the late 19th-century belief that the future of the subject lay in tying up a few loose ends — but here are the views of six researchers who were willing to go out on a limb.

Emergence of network theory

An important recent phenomenon in the field of complex systems has been the emergence of network theory. Most truly complex systems — from the cell to the Web and even social systems — have a network behind them that tells us how the system’s components interact with each other. Given the differences between these systems — both in the nature of the components and the function of the whole system — one would not expect there to be any intrinsic similarities between them. Yet, since the turn of the century, scientists have discovered that the underlying structure of these networks is rather similar: most have a scale-free topology (the number of links per node follows a power law), they display a high degree of clustering, and the distance between the nodes is small (the small-world property). We have realized that the network matters, and this has not only produced novel questions, but also led us to re-evaluate how we describe a complex system.
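The scale-free topology Barabási describes arises naturally from preferential attachment: new nodes link to existing nodes with probability proportional to their degree. Here is a minimal sketch of that growth process (the node counts and random seed are arbitrary illustration choices, not from the article):

```python
import random
from collections import Counter

def barabasi_albert(n, m, seed=0):
    """Grow a network by preferential attachment: each new node
    links to m existing nodes, chosen with probability
    proportional to their current degree."""
    rng = random.Random(seed)
    # Start from a small complete core of m + 1 nodes.
    edges = [(i, j) for i in range(m + 1) for j in range(i)]
    # 'targets' lists each node once per incident edge, so a
    # uniform draw from it is a degree-weighted draw over nodes.
    targets = [v for e in edges for v in e]
    for new in range(m + 1, n):
        chosen = set()
        while len(chosen) < m:
            chosen.add(rng.choice(targets))
        for old in chosen:
            edges.append((new, old))
            targets.extend((new, old))
    return edges

edges = barabasi_albert(2000, 3)
degree = Counter(v for e in edges for v in e)
# A heavy tail emerges: a few hubs collect far more links
# than the median node ever does.
median = sorted(degree.values())[len(degree) // 2]
print("max degree:", max(degree.values()), "median degree:", median)
```

The heavy-tailed degree distribution this produces is the "scale-free" signature: there is no typical node, only a hierarchy of ever-rarer, ever-larger hubs.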

Another key question concerns human dynamics. The description and prediction of human behaviour may not initially look like physics questions, but recently the empirical tools of physics have begun to play a fundamental role in addressing these problems. The future of physics will be determined by our ability to address issues of fundamental importance to society. For physicists, such questions have traditionally involved finding new sources of energy or discovering new materials, but now the focus of the questions is slowly shifting towards increasingly interdisciplinary problems at the boundary of physics, social sciences, biology and engineering. If physics is to remain relevant, and maintain its leading role among the sciences, it will have to embrace these questions.

Albert-László Barabási is director of the Center for Network Science at Northeastern University, US

Developments in medical-imaging techniques

Over the past two decades, major advances have been achieved in the ability to image normal and diseased structures at the tissue and organ level, thereby substantially improving the ability to detect and treat macroscopic disease. These advances reflect developments in imaging methods such as magnetic resonance imaging, emission computed tomography, digital X-ray imaging, and imaging with ultrasound.

At the same time, the ability to study the structural and functional integrity of tissue at the cellular and multi-cellular level has made great strides, principally through the development of a number of optical- and nuclear-imaging techniques. These advances in what is termed “molecular imaging” provide the potential to distinguish normal cells from cancerous ones, and to determine the presence or absence of cancer within any microscopic region of tissue.

The biggest unsolved problem in medical physics is how to combine macroscopic and microscopic imaging advances so that the precise margins of cancers can be delineated during the planning and delivery of radiation treatment. A chasm exists between the visualization of the microscopic and macroscopic aspects of cancer, and this chasm must be bridged if advances in molecular imaging are to be used to improve the treatment of the disease with sources of ionizing radiation.

Bill Hendee is a physicist at the Medical College of Wisconsin, Milwaukee, US

The quantum vacuum

At the beginning of the 20th century, Einstein replaced aether theory with relativity, but a 21st-century aether — the quantum vacuum — is still puzzling physicists today. The aether was originally thought to be an all-penetrating substance that carried light through space like air carries sound. Take away all the light, and the aether would still be there. Now, according to quantum field theory, the state of absolute darkness, the vacuum state, is still a physical state, filling space completely, much like the aether. There is an important difference, though: one does not notice motion at a uniform speed relative to the quantum vacuum, but during acceleration the vacuum should glow, due to friction. The quantum vacuum should also cause black holes to evaporate, because at the event horizon, particles are produced from the vacuum, at the expense of the black hole’s mass.
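The two effects just mentioned each come with a characteristic temperature; as a reminder of the standard results (not quoted in the article), the Unruh temperature seen by an observer with proper acceleration $a$ and the Hawking temperature of a black hole of mass $M$ are:

```latex
T_{\mathrm{Unruh}} = \frac{\hbar a}{2\pi c k_B},
\qquad
T_{\mathrm{Hawking}} = \frac{\hbar c^3}{8\pi G M k_B}
```

Plugging in numbers shows why these effects are so hard to see: an acceleration of roughly $10^{20}\,\mathrm{m/s^2}$ is needed for a vacuum glow of about one kelvin.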

None of these phenomena have been observed yet — they are astronomically weak, although they could be demonstrated in laboratory analogues. However, some aspects of the quantum vacuum appear in daily life: the quantum vacuum causes things to stick. For example, a gecko can hang on a glass surface using only one toe, because the microhairs in its feet stick to the glass by ceaselessly exchanging virtual photons that bind the two together. The vacuum force is small and acts only at short ranges, so a gecko needs many hairs to suspend its weight. In some applications of nanotechnology, this stickiness has been a problem — particularly in microelectromechanical systems that integrate electronics with moving parts. Understanding the quantum vacuum is not only a challenge for 21st-century physics, but also for 21st-century technology.

Ulf Leonhardt is a theoretical physicist at the University of St Andrews, UK

Water on other earths

Why does the Earth contain the exact amount of water it does? And do earths around other stars contain a similar amount? These are not idle questions. Water on a drier Earth would be absorbed into the silicate mantle, leaving a dry surface. In contrast, if the Earth had twice as much water, the continents would be submerged. Advanced, technological life would be inconceivable in the resulting water world, as no exofish or alien dolphins could invent metallurgy, computers or guitars. Does our Earth contain a “lucky” amount of water?

The Earth formed by accumulating silicates, iron and water in similar amounts from the early protoplanetary disk. The original Earth accumulated many oceans’ worth of water (if not more) during its formation. But our present-day Earth has only a 1000th of its mass in water, far less than the silicates and iron. Where does this critical fraction of water come from?

Within its first 100 million years, the Earth was dramatically desiccated when a Mars-sized planet slammed into it, vaporizing the oceans and sending a huge plume of water into space. (The Moon also formed from this impact.) The Earth was left a parched planet, not unlike Mars today.

That dry Earth acquired its current complement of water from asteroids and comets that randomly slammed into it. Jupiter’s gravity acted as a slingshot and dispersed these objects in all directions, with a few sent toward the Earth, depositing their water upon splashdown. Thus the Earth lost its water, and then re-acquired some, all by chance collisions during the early bumper-car era of the solar system.

Other earths may not be so lucky. Simulations show that they could acquire anywhere from 0.01 to 100 oceans of water. Only a tiny fraction of these earths contain just the right amount of water to have both oceans and continents. The great variation in water on these other earths stems from the orbit and mass of a Jupiter-like planet (if any) that disturbs the asteroids. A desiccating glancing blow from a Mars-sized planet is a freak event; without one, other earths keep many oceans of water. Any binary-star companion would perturb the asteroids and comets differently, yielding different amounts of water. The presence of isotopes such as aluminium-26 can significantly change the heating inside, and hence the evaporation of water from, asteroids and comets.
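The spread Marcy quotes can be explored with a toy Monte Carlo. Both the distribution (log-uniform over the 0.01–100 ocean range) and the “oceans and continents” window (0.5–2 oceans) are assumptions for illustration, not taken from the published simulations:

```python
import random

def sample_water(rng):
    """Draw one planet's final water endowment, log-uniform
    across the 0.01-100 ocean range quoted in the text."""
    return 10 ** rng.uniform(-2, 2)

rng = random.Random(42)
n = 100_000
# Assumed habitable window: enough water for oceans, but little
# enough that continents stay above sea level (thresholds made up).
lucky = sum(1 for _ in range(n) if 0.5 <= sample_water(rng) <= 2.0)
frac = lucky / n
print(f"fraction with both oceans and continents: {frac:.3f}")
```

Even this crude model makes the point: when outcomes span four orders of magnitude, only a modest fraction of planets land in any narrow habitable window, and the fraction shrinks rapidly as the window narrows.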

In the random card game of planet formation, the Earth was dealt an aquarian straight flush, cashing in with Homo sapiens. But the good fortune is circular: if Earth were not just-so endowed with water, we would not be here to discuss it.

Geoff Marcy is an astronomer at University of California-Berkeley, US

The future of cosmology

Cosmology may be on the verge of its most exciting decades — or its most boring. The last 20 years have seen the rise of a standard cosmological model, described by general relativity and the interactions of just a few components: normal matter, dark matter and dark energy, with hints of an early epoch of accelerated expansion known as inflation. The simplicity of this description is deceptive, since we do not know what kind of particles make up the dark components, nor do we know what mechanism is responsible for their relative amounts. The goal of the biggest upcoming cosmological projects — microwave background telescopes like the European Space Agency’s Planck Surveyor; ground-based surveys like the Large Synoptic Survey Telescope and the Square Kilometre Array; or future satellite telescopes like Euclid or the Joint Dark Energy Mission — is to understand these components by observing their effect on the overall expansion of the universe and the growth of structure within it.

Ideally, the next generation of microwave experiments will indirectly observe a background of gravitational radiation — a crucial signature of that early inflationary era. Detailed observations of distant supernovae and telescopes probing the local universe could allow us to measure the properties of dark energy, which seems to be causing the expansion of the universe to accelerate today. However, one very real possibility is that these observations will bring us a few digits of increased precision on the cosmological parameters describing the universe but no real understanding of the underlying physics of dark matter or dark energy, and few, if any, hints about the mechanism behind inflation or other epochs in the early universe.

Even in that scenario, what we will have in a few decades’ time is a phenomenally detailed map of the universe, over larger areas of the sky and reaching further away (and hence further back in time) with each new telescope, eventually to the very first objects to coalesce out of the primordial gas. But if these data are silent on the early universe and particle physics, then cosmology will continue the fragmentation that has already begun, between those applying theoretical tools and particle physics to the early universe and those using astrophysical techniques to understand the evolution of the objects within it.

Andrew Jaffe is an astrophysicist at Imperial College London, UK

The holographic principle and the string-theory landscape

The big new ideas that have emerged from quantum gravity have been the “holographic principle” and the “string-theory landscape”. The holographic principle was an outgrowth of Stephen Hawking’s insight into the clash between the equivalence principle and the quantum principle of information conservation. The outcome is one of the most startling concepts in modern physics: the degrees of freedom of a region of space, instead of filling the region, reside on the boundary surface. A hologram is a 2D sheet of film that stores the information of a 3D scene. If you look at the film through a microscope, then all you see is a random bunch of marks; but if you know the rules, then you can reconstruct the solid scene that it depicts. The holographic principle says that the 3D universe is like a reconstructed image stored on a distant mathematical boundary.
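The area scaling described here can be made quantitative with the Bekenstein–Hawking entropy, a standard result not stated in the article: the maximum entropy of a region is set by its boundary area $A$ in Planck units, not by its volume:

```latex
S_{\mathrm{BH}} = \frac{k_B c^3 A}{4 G \hbar}
= k_B \, \frac{A}{4\,\ell_P^2},
\qquad
\ell_P = \sqrt{\frac{G\hbar}{c^3}}
```

One bit of information per four Planck areas of boundary — far less than a volume’s worth — is what forces the conclusion that the fundamental degrees of freedom live on the surface.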

The holographic principle has radically restructured our ideas about quantum gravity, black holes, and the nature of fundamental degrees of freedom. At the same time it has closed a circle of ideas that began in the late 1960s. String theory started as the theory of hadrons — ordinary sub-nuclear particles like protons and neutrons — but the same mathematics describes objects like gravitons and black holes. Remarkably, the circle has now been closed and black-hole theory is now used to explain properties of colliding nuclei.

The string-theory landscape, meanwhile, grew out of the search for a string theory of elementary particles. The important thing about string theory is not that elementary particles are strings, but that it provides a kind of DNA that codes the properties of a universe, in the same way that the base-pair sequence in DNA codes the biological phenotype. Just as there is a huge landscape of biological designs — all the possible rearrangements of the tens of millions of base pairs in a DNA strand — string theory provides an enormous number of patterns for rearranging the elements that comprise a compactification of the extra dimensions. (The number 10^500 is often quoted.) This has had a sobering effect on the ambition of finding a unique string theory of particle physics, but it fits extremely well with cosmological ideas.

The landscape naturally lends itself to speculations about an eternally inflating multiverse of “pocket-universes” isolated from one another by event horizons. On the other hand, the holographic principle suggests that ordinary quantum mechanics only makes sense within an observer’s horizon. So there is a serious tension between the two: how to describe a multiverse holographically?

Leonard Susskind is a string theorist at Stanford University, US

Copyright © 2024 by IOP Publishing Ltd and individual contributors