
A gathering for Gardner

Martin Gardner, who turns 94 this autumn, seems to have pulled off an astounding trick. Every other year hundreds of people gather to honour Gardner, who is the author of over 70 books and wrote the popular “Mathematical Games” column that appeared in Scientific American for a quarter of a century from 1956. What is astonishing is that the people come from a bewildering variety of professions and include jugglers, magicians, artists, puzzle-makers, logicians, computer scientists, pseudoscience debunkers and mathematicians.

One long-standing fan of Gardner is the Atlanta businessman and puzzle collector Tom Rodgers. After Gardner stopped writing his column, Rodgers began trying to convince the famously humble author to attend an event in his honour. When Gardner finally relented, Rodgers used Gardner’s extensive correspondence files to compile a list of invitees. The first “Gathering for Gardner”, which Rodgers organized in 1993, was so successful that a second was held in 1996. Since then, it has been held every two years in an Atlanta hotel. Each event is called “G4Gn”, with the n denoting the number in the series. The most recent gathering, which took place over four days last March, was G4G8.

Gardner, who now resides in an “assisted-living” facility in Oklahoma, sadly no longer travels. But his presence was everywhere at G4G8. Talks ranged over his favourite topics — including mathematics, logic, games, puzzles, sculptures, mosaics and knots — and all were delivered, and often staged, with wit, infectious enthusiasm and commitment. One mathematician ran the Atlanta marathon in the morning, then turned up to co-present his talk still wearing running shorts and his marathon number, in a kind of run-and-prove biathlon.

Raymond Smullyan, an 89-year-old magician and logician with white, shoulder-length hair and long, crooked fingers, gave a talk that consisted entirely of reciting a string of puzzles, paradoxes and one-liner-like, self-refuting sentences. It began when he strode to the podium and said “Before I begin speaking, let me say something”, and ended with paradoxes associated with the remark “Only an idiot would believe this sentence”.

Gardner is renowned for debunking pseudoscience — he is a founding member of the Committee for the Scientific Investigation of Claims of the Paranormal — and several talks were in that vein. One debunked the “Indian rope trick”, which supposedly involves a climbable, vertically rising rope but is evidently a hoax that can be traced back to an 1890 article in the Chicago Daily Tribune.

Gardner is also famous for his creation of “Dr Matrix”, a fictitious scholar whom Gardner used to mock numerology — the belief that all numbers are intrinsically linked to real objects and living things. In homage, a G4Gn tradition is for one Dr Matrix impersonator to promote n, followed by another who attacks n. At this year’s meeting, we heard tongue-in-cheek conspiracy theories; for instance, about why a US president can serve for at most eight years, this summer’s Olympics start on 8.8.08, Santa has eight reindeer, a piano has 88 keys, there are now eight planets in the solar system, and so forth. When one speaker asked “What’s 987,654,321 divided by 123,456,789?”, my 13-year-old son Alex grabbed his calculator, already suspecting the answer.
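
For anyone who wants to check Alex’s hunch, a single line of Python (or any calculator) settles it: the quotient lands conspicuously close to the evening’s favourite digit.

```python
# Quick check of the Dr Matrix tease from the talk
print(987654321 / 123456789)   # ≈ 8.0000000729, about as close to 8 as a numerologist could wish
```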

Kenneth Brecher, a physicist and astronomer from Boston University, peppered his talk, “A torque about tops”, with phrases like “I simply love tops!” and “I’m a top-a-holic!” and made it clear why their delightful properties attracted the affection of great physicists. James Clerk Maxwell created “the fanciest top ever made”, Einstein worked on the subject, and Felix Klein and Arnold Sommerfeld spent 13 years writing a four-volume, 966-page text on rotating bodies.

After discussing the chirality of the peculiar spinning objects known as celts, Brecher mentioned scarab beetles, whose carapaces are the only known natural objects that circularly polarize light, and wondered about the evolutionary mechanism, if any, behind it. Brecher’s enthusiasm was so infectious that when he displayed a picture of Wolfgang Pauli and Niels Bohr intently playing with a top, the first thought that sprang to mind was not “Why did such great minds stoop to such a frivolous activity?”, but rather “Of course!”.

Between sessions, people juggled, danced, played the piano, built group sculptures, taught card tricks and how to fold intricate origami objects, and argued about difficult mathematical sequences. An exhibition area displayed puzzles, toys and art. Everyone had an interesting story. Colin Wright (who has a PhD in mathematics from Cambridge University in the UK) used a notation system to discover novel juggling tricks and was demonstrating them to anyone with an interest. Alex Stone, a graduate physics student from Columbia University, was researching a book on physics and magic. Louis Brown, an 11-year-old from New Jersey, had written a fan letter to Smullyan a week before G4G8 and Smullyan invited him to come along.

Though Gardner did not attend, nothing is said to delight him more than the collaborations fostered by the gatherings. When G4G8 ended, several participants flew to Oklahoma to bring him news and gifts.

The critical point

The key to Gardner’s trick — of assembling such a diverse array of people — is that creating and solving puzzles is an integral part of many fields. Mathematicians and scientists study puzzles posed by nature that perhaps only they can solve. Puzzle-makers design puzzles that they hope others can solve with difficulty, while magicians create perceptual puzzles to foist on audiences. Taking unashamed delight in the play of puzzle-solving, and witnessing others do it, is essential to being able to do it well. The G4Gs reveal that Martin Gardner does not force people to cross disciplinary boundaries — rather, he reminds us that these boundaries are artificial to begin with.

A quantum renaissance


Pure curiosity has been the driving force behind many groundbreaking experiments in physics. Nowhere is this better illustrated than in quantum mechanics, initially the physics of the extremely small. Ever since the theory’s beginnings in the 1920s and 1930s, researchers have wanted to observe its counterintuitive properties directly in the laboratory. However, because experimental technology was not sufficiently developed at the time, people like Niels Bohr, Albert Einstein, Werner Heisenberg and Erwin Schrödinger relied instead on “gedankenexperiments” (thought experiments) to investigate the quantum physics of individual particles, mainly electrons and photons.

By the 1970s technology had caught up, which produced a “gold rush” of fundamental experiments that continued into the 1990s. These experiments confirmed quantum theory with striking success, and challenged many common-sense assumptions about the physical world. Among these assumptions are “realism” (which, roughly speaking, states that the results of measurements reveal features of the world that exist independently of the measurement), “locality” (that the results of measurements here and now do not depend on some action that might be performed a large distance away at exactly the same time), and “non-contextuality” (asserting that results of measurements are independent of the context of the measurement apparatus).

But a big surprise awaited everyone working in this field. The fundamental quantum experiments triggered a completely new field whereby researchers apply phenomena such as superposition, entanglement and randomness to encode, transmit and process information in radically novel schemes. “Quantum information science” is now a booming interdisciplinary field that has brought futuristic-sounding applications such as quantum computers, quantum encryption and quantum teleportation within reach. Furthermore, the technological advances that underpin it have given researchers unprecedented control over individual quantum systems. That control is now fuelling a renaissance in our curiosity about the quantum world by allowing physicists to address new fundamental aspects of quantum mechanics. In turn, this may open up new avenues in quantum information science.

Against intuition

Both fundamental quantum experiments and quantum information science owe much to the arrival of the laser in the 1960s, which provided new and highly efficient ways to prepare individual quantum systems to test the predictions of quantum theory. Indeed, the early development of fundamental quantum-physics experiments went hand in hand with some of the first experimental investigations of quantum optics.

One of the major experimental leaps at that time was the ability to produce “entangled” pairs of photons. In 1935 Schrödinger coined the term “entanglement” to denote pairs of particles that are described only by their joint properties instead of their individual properties — which goes against our experience of the macroscopic world. Shortly beforehand, Einstein, Boris Podolsky and Nathan Rosen (collectively known as EPR) used a gedankenexperiment to argue that if entanglement exists, then the quantum-mechanical description of physical reality must be incomplete. Einstein did not like the idea that the quantum state of one entangled particle could change instantly when a measurement is made on the other particle. Calling it “spooky” action at a distance, he hoped for a more complete physical theory of the very small that did not exhibit such strange features (see “The power of entanglement” by Harald Weinfurter Physics World January 2005 pp47–51).

This lay at the heart of a famous debate between Einstein and Bohr about whether physics describes nature “as it really is”, as was Einstein’s view, or whether it describes “what we can say about nature”, as Bohr believed. Until the 1960s these questions were merely philosophical in nature. But in 1964, the Northern Irish physicist John Bell realized that experiments on entangled particles could provide a test of whether there is a more complete description of the world beyond quantum theory. EPR believed that such a theory exists.

Bell based his argument on two assumptions made by EPR that are directly contradicted by the properties of entangled particles. The first is locality, which states that the results of measurements performed on one particle must be independent of whatever is done at the same time to its entangled partner located at an arbitrary distance away. The second is realism, which states that the outcome of a measurement on one of the particles reflects properties that the particle carried prior to and independent of the measurement. Bell showed that a particular combination of measurements performed on identically prepared pairs of particles would produce a numerical bound (today called a Bell’s inequality) that is satisfied by all physical theories that obey these two assumptions. He also showed, however, that this bound is violated by the predictions of quantum physics for entangled particle pairs (Physics 1 195).
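
For reference, the bound that is usually tested in the laboratory is the Clauser–Horne–Shimony–Holt (CHSH) form of Bell’s inequality rather than Bell’s original 1964 version. For two measurement settings a, a′ on one side and b, b′ on the other, with E denoting the correlation of the two outcomes, any local realistic theory obeys

\[
S = E(a,b) - E(a,b') + E(a',b) + E(a',b'), \qquad |S| \le 2 ,
\]

whereas quantum mechanics predicts values of |S| up to \(2\sqrt{2} \approx 2.83\) for suitably chosen settings on a maximally entangled pair.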

Bell experiment diagram

Take, for example, the polarization of photons. An individual photon may be polarized along a specific direction, say the horizontal, and we can measure this polarization by passing the photon through a horizontally oriented polarizer. A click in a photon detector placed behind it indicates a successful measurement and shows that the photon is horizontally polarized; no click means that the photon is polarized along the vertical direction. In the case of an entangled pair of photons, however, the individual photons turn out not to carry any specific polarization before they are measured! Measuring the polarization of one of the photons along, say, the horizontal direction will give a completely random result: the photon is equally likely to be found horizontally or vertically polarized. Yet performing the same measurement on the other photon of the entangled pair (assuming a specific type of entangled state) will show both photons to be polarized along the same direction. This is true for all measurement directions and is independent of the spatial separation of the particles.

Bell’s inequality opened up the possibility of testing specific underlying assumptions of physical theories — an effort rightfully referred to by Abner Shimony of Boston University as “experimental metaphysics”. In such Bell experiments, two distant observers measure the polarization of entangled particles along different directions and calculate the correlations between them. Because the quantum correlations between independent polarization measurements on entangled particles can be much stronger than is allowed by any local realistic theory, Bell’s inequality will be violated.
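
As a rough numerical illustration, the short Python sketch below assumes the textbook quantum prediction E(a,b) = cos 2(a − b) for polarization measurements at angles a and b on a maximally entangled photon pair; the angle settings are the standard illustrative choice, not those of any particular experiment described here.

```python
# Minimal CHSH sketch, assuming the textbook correlation E(a, b) = cos 2(a - b)
# for a maximally entangled polarization state (e.g. the Phi+ state).
import numpy as np

def E(a, b):
    """Quantum prediction for the polarization correlation at analyser angles a, b (radians)."""
    return np.cos(2 * (a - b))

# standard illustrative settings, in degrees: a = 0, a' = 45, b = 22.5, b' = 67.5
a, a2, b, b2 = np.deg2rad([0, 45, 22.5, 67.5])
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(S)   # ~2.83, i.e. 2*sqrt(2), comfortably above the local-realistic bound of 2
```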

Quantum loopholes

The first such test was performed using entangled photons in 1972 by Stuart Freedman and John Clauser of the University of California at Berkeley; Bell’s inequality was violated and the predictions of quantum theory were confirmed (Phys. Rev. Lett. 28 938). But from early on there existed some loopholes that meant researchers could not exclude all possible “local realistic” models as explanations for the observed correlations. For example, it could be that the particles detected are not a fair sample of all particles emitted by the source (the so-called detection loophole) or that the various elements of the experiment may still be causally connected (the locality loophole). In order to close these loopholes, more stringent experimental conditions had to be fulfilled.

In 1982 Alain Aspect and colleagues at the Université Paris-Sud in Orsay, France, carried out a series of pioneering experiments that were very close to Bell’s original proposal. The team implemented a two-channel detection scheme to avoid making assumptions about photons that did not pass through the polarizer (Phys. Rev. Lett. 49 91), and the researchers also periodically — and thus deterministically — varied the orientation of the polarizers after the photons were emitted from the source (Phys. Rev. Lett. 49 1804). Even under these more stringent conditions, Bell’s inequality was violated in both cases, thus significantly narrowing the chances of local-realistic explanations of quantum entanglement.

In 1998 one of the present authors (AZ) and colleagues, then at the University of Innsbruck, closed the locality loophole by using two fully independent quantum random-number generators to set the directions of the photon measurements. This meant that the direction along which the polarization of each photon was measured was decided at the last instant, such that no signal (which cannot travel faster than the speed of light) would be able to transfer information to the other side before that photon was registered (Phys. Rev. Lett. 81 5039). Bell’s inequality was violated.

Then in 2001, David Wineland and co-workers at the National Institute of Standards and Technology (NIST) in Colorado, US, set out to close the detection loophole by using detectors with near-perfect efficiency in an experiment involving entangled beryllium ions (Nature 409 791). Once again, Bell’s inequality was violated. Indeed, all results to date suggest that no local-realistic theory can explain quantum entanglement.

GHZ experiment

But the ultimate test of Bell’s theorem is still missing: a single experiment that closes all the loopholes at once. It is very unlikely that such an experiment will disagree with the prediction of quantum mechanics, since this would imply that nature makes use of both the detection loophole in the Innsbruck experiment and of the locality loophole in the NIST experiment. Nevertheless, nature could be vicious, and such an experiment is desirable if we are to finally close the book on local realism.

In 1987 Daniel Greenberger of the City College of New York, Michael Horne of Stonehill College and AZ (collectively GHZ) realized that the entanglement of three or more particles would provide an even stronger constraint on local realism than two-particle entanglement (Am. J. Phys. 58 1131). While two entangled particles are at variance with local realism only in their statistical properties, which is the essence of Bell’s theorem, three entangled particles can produce an immediate conflict in a single measurement result because measurements on two of the particles allow us to predict with certainty the property of the third particle.
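
In its textbook form (a compact restatement rather than the original 1987 formulation), the argument runs as follows. The three-particle GHZ state

\[
|\mathrm{GHZ}\rangle = \tfrac{1}{\sqrt{2}}\bigl(|000\rangle + |111\rangle\bigr)
\]

is a +1 eigenstate of the joint measurement \(X \otimes X \otimes X\) but a −1 eigenstate of each of \(X \otimes Y \otimes Y\), \(Y \otimes X \otimes Y\) and \(Y \otimes Y \otimes X\), where X and Y are Pauli operators. If every particle carried pre-existing values \(x_i, y_i = \pm 1\), multiplying the last three relations together would force \(x_1 x_2 x_3 = (-1)^3 = -1\), because each \(y_i\) appears twice and drops out; quantum mechanics, by contrast, predicts +1 for the \(X \otimes X \otimes X\) measurement. The contradiction appears, in principle, in a single run rather than only in the statistics.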

The first experiments on three entangled photons were performed in late 1999 by AZ and co-workers, and they revealed striking agreement with quantum theory (Nature 403 515). So far, all tests of Bell’s inequalities and all experiments on three or more entangled particles (known as GHZ experiments; see “GHZ experiments”) confirm the predictions of quantum theory, and hence are in conflict with the joint assumption of locality and realism as underlying working hypotheses for any physical theory that seeks to explain the features of entangled particles.

Quantum information science

The many beautiful experiments performed during the early days of quantum optics prompted renewed interest in the basic concepts of quantum physics. Evidence for this can be seen, for example, in the number of citations received by the EPR paper, which argued that entanglement renders the quantum-mechanical description of physical reality incomplete. The paper was cited only about 40 times between its publication in 1935 and 1965, just after Bell developed his inequalities. Yet today it has more than 4000 citations, with an average of 200 per year since 2002. Part of the reason for this rise is that researchers from many different fields have begun to realize the dramatic consequences of using entanglement and other quantum concepts to encode, transmit and process information.

Entanglement-based quantum cryptography

Take quantum cryptography, which applies randomness, superposition and, in one scheme proposed by Artur Ekert of Oxford University in the UK, two-particle entanglement to transmit information such that its security against eavesdropping is guaranteed by physical law (see “Entanglement-based quantum cryptography”). This application of quantum information science has already left the laboratory environment. In 2004, for instance, AZ and co-workers at the University of Vienna transferred money securely between an Austrian bank and Vienna City Hall using pairs of entangled photons that were generated by a laser in a nonlinear optical process and distributed via optical fibres. More recently, two international collaborations were able to distribute entangled photons over a distance of 144 km between La Palma and Tenerife, including a demonstration of quantum cryptography, and earlier this year even showed that such links could be established in space by bouncing laser pulses attenuated to the single-photon level off a satellite back to a receiving station on Earth (Physics World May p4). Commercial quantum-encryption products based on attenuated laser pulses are already on the market (see “Key to the quantum industry”), and the challenge now is to achieve higher bit rates and to bridge larger distances.

In a similar manner, the road leading to the GHZ experiments opened up the huge field of multiparticle entanglement, which has applications in, among other things, quantum metrology. For example, greater uncertainty in the number of entangled photons in an interferometer leads to less uncertainty in their phases, thereby resulting in better measurement accuracy compared with a similar experiment using the same number of non-entangled photons.
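
To put rough numbers on this: with N independent photons, the phase uncertainty in an interferometer is bounded by the shot-noise limit, whereas an ideal, loss-free N-photon entangled state (a so-called NOON state) produces interference fringes that oscillate as \(\cos(N\varphi)\) and can reach the Heisenberg limit,

\[
\Delta\varphi_{\mathrm{shot}} \sim \frac{1}{\sqrt{N}}
\qquad\text{versus}\qquad
\Delta\varphi_{\mathrm{Heisenberg}} \sim \frac{1}{N} .
\]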

Multiparticle entanglement is also essential for quantum computing. Quantum computing exploits fundamental quantum phenomena to allow calculations to be performed with unprecedented speed — perhaps even solving problems that are too complex for conventional computers, such as factoring large numbers into primes or enabling fast database searches. The key idea behind quantum computing is to encode and process information in physical systems following the rules of quantum mechanics. Much current research is therefore devoted to finding reliable quantum bits or “qubits” that can be linked together to form registers and logical gates analogous to those in conventional computers, which would then allow full quantum algorithms to be implemented.

One-way quantum computation

In 2001, however, Robert Raussendorf and Hans Briegel, then at the University of Munich in Germany, suggested an alternative route for quantum computation based on a highly entangled multiparticle “cluster state” (Phys. Rev. Lett. 86 5188). In this scheme, called “one-way” quantum computing, a computation is performed by measuring the individual particles of the entangled cluster state in a specific sequence that is defined by the particular calculation to be performed. The individual particles that have been measured are no longer entangled with the other particles and are therefore not available for further computation. But those that remain within the cluster state after each measurement end up in a specific state depending on which measurement was performed. As the measurement outcome of any individual entangled particle is completely random, different states result for the remaining particles after each measurement. But only in one specific case is the remaining state the correct one. Raussendorf and Briegel’s key idea was to eliminate that randomness by making the specific sequence of measurements depend on the earlier results. The whole scheme therefore represents a deterministic quantum computer, in which the remaining particles at the end of all measurements carry the result of the computation.
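
Readers who like to see the bookkeeping explicitly may find a toy simulation useful. The Python sketch below is only an illustration of the feed-forward idea with the smallest possible “cluster” of two qubits (the basis conventions and the sign of the rotation angle are our own choices, and it is not a description of any photonic set-up): the input qubit is entangled with a |+⟩ qubit by a controlled-Z gate, the input qubit is measured in a rotated basis, and the random outcome is undone by a Pauli-X correction, leaving the surviving qubit deterministically carrying H·Rz(α) applied to the input.

```python
# Toy sketch of one measurement-based ("one-way") step, with illustrative conventions.
import numpy as np

H  = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
X  = np.array([[0, 1], [1, 0]], dtype=complex)
I2 = np.eye(2, dtype=complex)
CZ = np.diag([1, 1, 1, -1]).astype(complex)

def Rz(alpha):
    return np.diag([1, np.exp(-1j * alpha)])

def one_way_step(psi_in, alpha, rng):
    """Consume one cluster qubit to apply H @ Rz(alpha) to the input state psi_in."""
    plus  = np.array([1, 1], dtype=complex) / np.sqrt(2)
    state = CZ @ np.kron(psi_in, plus)               # entangle the input with the cluster qubit
    # measure qubit 1 in the rotated basis (|0> +/- e^{i alpha}|1>)/sqrt(2)
    basis = [np.array([1,  np.exp(1j * alpha)]) / np.sqrt(2),
             np.array([1, -np.exp(1j * alpha)]) / np.sqrt(2)]
    branches = [np.kron(b.conj(), I2) @ state for b in basis]   # unnormalized states of qubit 2
    probs = [float(np.vdot(v, v).real) for v in branches]
    m = rng.choice(2, p=probs)                       # the individual outcome is completely random...
    out = branches[m] / np.sqrt(probs[m])
    return np.linalg.matrix_power(X, m) @ out        # ...but the X^m correction removes the randomness

rng = np.random.default_rng(1)
alpha = 0.7
psi = np.array([0.6, 0.8j])                          # an arbitrary normalized input qubit
expected = H @ Rz(alpha) @ psi
for _ in range(5):
    out = one_way_step(psi, alpha, rng)
    print(abs(np.vdot(expected, out)))               # 1.0 (up to rounding) every run, whichever outcome occurred
```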

In 2005 the present authors and colleagues at Vienna demonstrated the principle of one-way quantum computing (see “One-way quantum computation”) and even a simple search algorithm using a four-photon entangled state (Nature 434 169). Then in 2007, Jian-Wei Pan of the University of Science and Technology of China in Hefei and co-workers implemented a similar scheme involving six photons. A distinct advantage of a photonic one-way computer is its unprecedented speed, with the time from the measurement of one photon to the next, i.e. a computational cycle, taking no more than 100 ns.

Foundational questions

The technology developed in the last 20 years to make quantum information processing and communication a reality has given researchers unprecedented control over individual quantum systems. For example, it is now clear that all-photonic quantum computing is possible but that it requires efficient and highly pure single-photon and entangled-photon sources, plus the ability to reliably manipulate the quantum states of photons or other quantum systems in a matter of nanoseconds. The tools required for this and other developments are being constantly improved, which is itself opening up completely new ways to explore the profound questions raised by quantum theory.

One such question concerns once again the notions of locality and realism. The whole body of Bell and GHZ experiments performed over the years suggests that at least one of these two assumptions is inadequate to describe the physical world (at least as long as entangled states are involved). But Bell’s theorem does not allow us to say which one of the two should be abandoned.

Leggett experiment

In 2003 Anthony Leggett of the University of Illinois at Urbana-Champaign in the US provided a partial answer by presenting a new incompatibility theorem very much in the spirit of Bell’s theorem but with a different set of assumptions (Found. Phys. 33 1469). His idea was to drop the assumption of locality and to ask if, in such a situation, a plausible concept of realism — namely to assign a fixed polarization as a “real” property of each particle in an entangled pair — is sufficient to fully reproduce quantum theory. Intuitively, one might expect that properly chosen non-local influences can produce arbitrary correlations. After all, if you allow your measurement outcome to depend on everything that goes on in the whole universe (including at the location of the second measurement apparatus), then why should you expect a restriction on such correlations?

For the specific case of a pair of polarization-entangled photons, the class of non-local realistic theories Leggett sets out to test fulfils the following assumptions: each particle of a pair is emitted from the source with a well-defined polarization; and non-local influences are present such that each individual measurement outcome may depend on any parameter at an arbitrary distance from the measurement. The predictions of such theories violate the original Bell inequalities due to the allowed non-local influence, so it is natural to ask whether they are able to reproduce all predictions of quantum theory.

Leggett showed that this is not the case. In analogy with Bell, he derived a set of inequalities for certain measurements on two entangled particles that are fulfilled by all theories based on these specific non-local realistic assumptions but which are violated by quantum-theoretical predictions. Testing Leggett’s inequalities is more challenging than testing Bell’s inequalities because they require measurements of both linear and elliptical polarization, and much higher-quality entanglement. But in 2007, thanks to the tremendous progress made with entangled-photon sources, the present authors and colleagues at Vienna were able to test a Leggett inequality experimentally by measuring correlations between linear and elliptical polarizations of entangled photons (Nature 446 871).

The experiment confirmed the predictions of quantum theory and thereby ruled out a broad class of non-local realistic theories as a conceptual basis for quantum phenomena. Similar to the evolution of Bell experiments, more rigorous Leggett-type experiments quickly followed. For example, independent experiments performed in 2007 by the Vienna team (Phys. Rev. Lett. 99 210406) and by researchers at the University of Geneva and the National University of Singapore (Phys. Rev. Lett. 99 210407) confirmed a violation of a Leggett inequality under more relaxed assumptions, thereby expanding the class of forbidden non-local realistic models. Two things are clear from these experiments. First, it is insufficient to give up completely the notion of locality. Second, one has to abandon at least the notion of naïve realism that particles have certain properties (in our case polarization) that are independent of any observation.

Macroscopic limits

The close interplay between quantum information science and fundamental curiosity has also been demonstrated by some fascinating experiments that involve more massive particles. According to quantum theory, there is no intrinsic upper limit on the size or complexity of a physical system above which quantum effects no longer occur. This is at the heart of Schrödinger’s famous cat paradox, which ridicules the situation by suggesting an experiment whereby someone could prepare a cat that is in a superposition of being alive and dead. One particularly interesting case is that of “matter–wave” interference.

Macroscopic quantum experiments

Electrons, neutrons and atoms have been shown to exhibit interference effects when passing through a double slit, proving that these massive systems cannot be said to have passed through just one slit or the other. Similar behaviour was observed more recently, in 1999, by AZ and colleagues at Vienna for relatively large carbon-60 and carbon-70 fullerene molecules (Nature 401 680), and ongoing research has demonstrated interference with even heavier and larger systems (see “Macroscopic quantum experiments”). One of the main goals of this research is to realize quantum interference for small viruses or maybe even nanobacteria.
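
To get a feel for the scales involved, the estimate below computes the de Broglie wavelength λ = h/mv for a C60 molecule; the beam velocity of roughly 200 m/s is an assumed, typical order of magnitude rather than the exact value used in any particular run.

```python
# Back-of-the-envelope de Broglie wavelength of a C60 fullerene
h = 6.626e-34            # Planck constant, J s
m = 60 * 12 * 1.66e-27   # mass of C60 (60 carbon-12 atoms), kg
v = 200.0                # assumed beam velocity, m/s
print(h / (m * v))       # ~2.8e-12 m: a few picometres, far smaller than the molecule itself
```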

Very recently the ability to cool nanomechanical devices to very low temperatures has opened up a new avenue to test systems containing up to 10²⁰ atoms. One fascinating goal of experiments that probe the quantum regime of mechanical cantilevers is to demonstrate entanglement between a microscopic system such as a photon and a mechanical system — or even between two mechanical systems.

While the underlying motivation for studying macroscopic quantum systems is pure curiosity, the research touches on important questions for quantum information science. This is because increasingly large or complex quantum systems suffer from interactions with their environment, which is as important for macromolecules and cantilevers as it is for the large registers of a quantum computer.

One consequence of this interaction with the outside world is “decoherence”, whereby the system effectively becomes entangled with the environment and therefore loses its individual quantum state. As a result, measurements of that system are no longer able to reveal any quantum signature. Finding ways to avoid decoherence is thus a hot topic both in macroscopic quantum experiments and in quantum information science. With fullerene molecules, for instance, the effect of decoherence was studied in great detail in 2004 by coupling them to the outside environment in different, tuneable ways (Nature 427 711). From an experimental point of view, we see no reason to expect that decoherence cannot be overcome for systems much more macroscopic than is presently feasible in the laboratory.

Quantum curiosity

Quantum physics and the information science that it has inspired emerge as two sides of the same coin: as inspiration for conceptually new approaches to applications on the one side; and as an enabling toolbox for new fundamental questions on the other. It has often happened that new technologies raise questions that have not been asked before, simply because people could not imagine what has become possible in the laboratory.

One such case may be our increasing ability to manipulate complex quantum systems that live in high-dimensional Hilbert space — the mathematical space in which quantum states are described. Most of the known foundational questions in quantum theory have hitherto made use only of relatively simple systems, but larger Hilbert-space dimensions may add qualitatively new features to the interpretation of quantum physics. We are convinced that many surprises await us there.

We expect that future theoretical and experimental developments will shed more light on which of the counterintuitive features of quantum theory are really indispensable in our description of the physical world. In doing so, we expect to gain greater insight into the underlying fundamental question of what reality is and how to describe it. The close connection between basic curiosity about the quantum world and its application in information science may even lead to ideas for physics beyond quantum mechanics.

At a Glance: A quantum renaissance

  • Quantum mechanics challenges intuitive notions about reality, such as whether the property of a particle exists before a measurement is performed on it
  • Entanglement is one of the most perplexing aspects of quantum theory; it implies that measurement results on two particles are intimately connected to one another instantaneously no matter how far apart they are
  • Since the 1970s, experiments have shown repeatedly that quantum theory is correct, but researchers are still devising measurements in order to find out what quantum mechanics tells us about physical reality
  • These tests have sparked a new field called quantum information science, in which entanglement and other quantum phenomena are used to encrypt, transmit and process information in radically new ways
  • The increased level of control over individual quantum systems that has driven quantum information science is now enabling physicists to tackle once again the fundamental puzzles raised by quantum theory

More about: A quantum renaissance

M Arndt, K Hornberger and A Zeilinger 2005 Probing the limits of the quantum world Physics World March pp35–40
D Bouwmeester et al. (ed) 1999 The Physics of Quantum Information (Springer, Heidelberg)
A Steinberg et al. 1996 Quantum optical tests of the foundations of physics The American Institute of Physics Atomic, Molecular, and Optical Physics Handbook (ed) G W F Drake (AIP Press)
A Zeilinger et al. 2005 Happy centenary, photon Nature 433 239

Blog life: Chris Lintott’s Universe

Blogger: Chris Lintott
URL: chrislintott.net
First post: January 2006

Who is the blog written by?

Chris Lintott is a postdoc at Oxford University in the UK, where he is studying the application of astrochemical models of star formation to galaxies beyond the Milky Way. He is also heavily involved in science popularization: he co-presents the 50-year-old BBC TV programme The Sky at Night with Sir Patrick Moore, and co-authored, along with Moore and Queen guitarist Brian May, the book Bang!, a popular account of the history of the universe (see Physics World October 2006 pp12–13, print edition only). Lintott is also principal investigator on the Galaxy Zoo project, which enlists the help of members of the public to classify galaxies imaged by telescopes, and contributes to the project’s blog.

What topics does the blog cover?

Compared with many other scientists’ blogs, this one is quite focused. On the whole, Lintott restricts himself to writing about and commenting on topics related to physics, and astronomy in particular. Readers are often treated to previews of upcoming episodes of The Sky at Night, as well as being pointed in the direction of Lintott’s other outreach activities, such as his posts on the Galaxy Zoo blog and articles he writes for The Times newspaper, not to mention other interesting science-related articles and posts on other blogs. The funding crisis at the UK’s Science and Technology Facilities Council (STFC) has naturally featured quite heavily over the past few months.

Who is it aimed at?

General readers with an interest in astronomy should have no trouble following most of the topics discussed on Lintott’s blog, although some — such as the STFC funding crisis — are probably aimed more at those in the research community. Judging by the comments the blog receives, readers are a varied bunch ranging from Lintott’s Galaxy Zoo colleagues to physicists working in completely different areas, and even non-scientists.

Why should I read it?

Lintott writes well and explains the science clearly, and his passion for all things space infuses his posts with enthusiasm. He also provides some interesting behind-the-scenes insights into the making of The Sky at Night and what it is like working with Moore; for instance, the veteran astronomy popularizer apparently insists on doing all his writing on a nearly 80-year-old Woodstock typewriter that he has owned since he was a boy of nine. Don’t expect to hear too much about the minutiae of Lintott’s life, however: astronomy is definitely the star of this blog.

How often is it updated?

Lintott often posts in bursts, putting up several posts in a day or over a few days, and then has a break for a few days to a week (presumably to allow him to catch up with his other commitments).

Can you give me a sample quote?

My trip to Hawaii was more or less a complete washout. Of the four nights we had on the telescope, we made it to the summit for one and a half of them. The last night was the most depressing, when we sat there for hours waiting for fog to clear only for it to start snowing heavily. Once that happens, you need a team with shovels to be able to open up and it’s time to head down. We did manage to get about an hour’s data on the third night, for a program which didn’t need as good conditions as mine did, providing data on targets which will be viewed by the new Herschel telescope.

Breaking through

Like all magazines, Physics World thinks carefully about the images that it puts on the cover of each issue. We aim for illustrations that are attractive, eye-catching and usually related to one of the main features in the magazine. This year a variety of striking images have appeared, including a battleship, a Bombardier beetle, a glacier and a snowflake. This month’s cover, though, is highly abstract, consisting of a projection into 2D of a complex 8D lattice called E8.

This image — or at least a version of it — first appeared in a paper posted last year on the arXiv preprint server by a US physicist living in Hawaii called Garrett Lisi. In the paper, Lisi controversially claims that E8 could form the basis of a “theory of everything” that unites nature’s four forces. The picture has 240 vertices and Lisi believes that 220 of these are occupied by all the fundamental particles of the Standard Model. The other 20, unoccupied, slots hold additional particles, the existence of which could be used to test his proto-theory.

Lisi’s paper was reported widely last year, partly because of its eye-catching illustrations and partly because he gave it a deceptively simple title — “An exceptionally simple theory of everything”. However, the article was widely criticized in blogs on various technical grounds, including the fact that Lisi’s work does not give the strengths of the interactions between the particles. Lisi himself has admitted that the paper, which has still to appear in a peer-reviewed journal, is not the final word. And although the article has been heavily downloaded, many of the downloads are believed to be by non-scientists (rather than mainstream researchers) coming from “news-aggregation” websites like Digg and Reddit that link to it.

Another reason that the paper got so much attention was no doubt Lisi’s unusual background. Although he has a PhD from the University of California, San Diego, Lisi currently holds no academic position and enjoys surfing and snowboarding. “Surfer dude makes stunning breakthrough” must have been too tempting a story for the media to ignore, even if it is far from the truth. Our feature this month on E8 (“Symmetry’s physical dimension”) is not so much about Lisi’s work itself, but rather uses his paper as a hook to illustrate some of the links between symmetry and basic physics.

Whether or not Lisi’s paper is a major contribution to science — the jury is still out — it illustrates the difficulty that those outside the mainstream can sometimes face in making worthwhile contributions to physics. For example, one UK physicist in his mid-50s, who recently completed a PhD in particle physics, explains elsewhere in this issue the difficulties he faced breaking into the subject as an older person (pp18–19, print edition only). He has dubbed this problem “the grey ceiling”, in analogy to the “glass ceiling” that can stop women’s careers from progressing.

It would be a shame if talented people were excluded from physics, which needs to do all it can to encourage interest in the subject. Of course, those with unconventional career paths have to work twice as hard to be taken seriously. But one only has to look at William Henry Bragg (pp42–43, print edition only) to see that age need not be a hindrance to making valuable contributions to the subject. Despite never carrying out an original experiment before the age of 40, he went on to share the 1915 Nobel Prize for Physics with his son for their work on X-ray crystallography. Breakthroughs in physics can sometimes happen in the most unconventional of manners.

Building for the future

When I left university in 1975 having completed a degree in applied physics, embarking on a long-term career was pretty much the last thing on my mind. Exhausted by exams, and with the summer of 1975 shaping up to be a real scorcher, the prospect of beaches, sunshine and sailing seemed much more attractive. So I took a few months off — think of it as a delayed gap year.

Come the winter, however, with the beaches deserted and funds dispiritingly low, I looked in my local paper for something temporary to do while I considered how best to put my shiny new degree to good use. The Building Services Research and Information Association — a small research and development outfit in Bracknell — was looking for graduates to carry out laboratory experimentation into the comfort issues associated with buildings. Although the money was not great, the job was certainly interesting and would do fine as a stopgap. As it happened, the job was rather better than that and 33 years later I am chief executive of that research association, which is now the company known as BSRIA.

When I began my career, it was clear that while construction was a big industry (it now accounts for about 10% of the UK’s gross domestic product), on the whole it was not at the forefront of technology and did not have many graduates — especially those with a background in science or technology — working in it. This meant that I, as a physicist, suddenly found myself working with very senior people at major UK and international companies, as well as gaining immediate access to some of the best people in the industry through professional organizations such as the Chartered Institution of Building Services Engineers. I was able to make a difference almost immediately.

For example, my first task was to develop a method for testing “whole house ventilation heat recovery devices”. This was quite complex, since it required the airflow volume, moisture content, temperature and electrical load to be measured in each of four positions simultaneously. A curiosity 30 years ago, this research is now essential as we work towards “zero carbon” homes. I was helped by having an employer that wanted junior employees to put their names to their work rather than hiding under their supervisors’ titles, and also by colleagues who were sociable, supportive and at the top of their game.

Diverse sector

The construction industry is involved in everything from motorways and bridges (civil construction) through to schools, hospitals, offices and homes (the built environment). Within these sectors there are designers (architects, structural engineers, building-services consultants and so on) and contractors, who actually put things together on-site. This latter group consists of the main contractors that erect the structures and specialist contractors that put in all the plant and equipment that make buildings comfortable and safe to live in. It is at the latter end of the spectrum that BSRIA works.

Founded in the mid-1950s by a group of companies that wanted to collaborate in research and development, BSRIA is one of about 50 research associations (RAs) in the UK that each deal with a particular market sector. For example, there are RAs for shoes, timber, drop forging and even (my second favourite) Scotch whisky. They are where industrial companies come together to do collaborative research, and over the years they have transformed from membership subscription-based organizations to wholly self-funded enterprises. Nevertheless, many research associations, including BSRIA, continue to operate a membership base and to facilitate collaborative efforts both in research and on a more political level.

In recent years the role of the built-environment engineer and contractor has changed radically. The escalating need to create buildings that produce very low or zero carbon-dioxide emissions has created new challenges that are taxing the very best brains. Indeed, we have to meet targets set by the UK government that all new houses built in the country will be “zero carbon” by 2016, with all non-domestic buildings following suit by 2019. You just have to look at your own home, with its heating, hot water and lighting needs, to appreciate just how difficult this is going to be to achieve if the occupants of these buildings are also going to be comfortable and healthy.

If this is problematic for new buildings, then it is even more difficult to achieve in existing buildings — and it is these that can make the real difference to carbon emissions. Roughly half of all carbon emissions come from buildings and their uses but only 1% of these buildings are newly constructed, while just 2% have some form of refurbishment each year. In other words, 97% of buildings have nothing done to them to improve their performance.

Building physics can do something about this by trying to understand the complex interactions of energy flows around structures. This involves resolving the influences of highly multivariate interactions and creating models that can eventually be used to build new structures. These tasks are well suited to people with a physics background, and there is a wonderful future ahead for bright and committed people who want to make a difference.

Path to success

BSRIA is not a large organization — it currently has 150 employees — and about half of the staff have numerate degrees, including mechanical engineering, aeronautical engineering and physics. We work both with those who are at the sharp end of invention, such as universities and other research institutions, and those who prefer to be a little further away from the cutting edge. Construction companies and their clients, for example, are highly risk averse, so it is our role to take innovation and create proven, derisked processes that can be used with confidence by constructors. As a result, both the company and its individual staff have a high exposure in the trade press and at conferences, and they have a lot of contact with the policy-makers in government. It is this diversity of activity that has made mine such a rewarding job.

I have now been chief executive of BSRIA for 10 years, having worked up through a variety of posts within the company. I started as a supervised project engineer, then progressed to running a small section of five people, and later a group of four sections. I took up the role of technical director and joined the first board of directors following a restructuring of the business in 1989 before being appointed to the top job in 1998.

Since I joined, the firm’s horizons have widened considerably — for example, in March of this year I opened a new office in Beijing, which employs five local staff. China is likely to undertake nearly half of the entire world’s activity in construction within the next decade and its government is anxious to ensure that its carbon footprint does not rise at the same rate. With many overseas companies now working there, the opportunities for transferring expertise are significant, although this expansion also presents many cultural and economic challenges for the firm. It is this mix of technology and business that has made my career at BSRIA such a delightful experience. Despite having been with the organization for 33 years, it seems like only yesterday since the beaches started to empty and the rain stopped play.

Once a physicist: Frank Reed


Why did you choose to study physics?

Being a child of the Moon-landing era, I was always interested in astronomy and space science. I eventually came to understand that physics was the power behind all the interesting things in the physical sciences, so I decided to study the subject at Wesleyan University in Connecticut in the US. I specialized in gravitational physics and wrote a thesis on the quantum behaviour of hydrogen atoms in the highly curved space–time near small black holes. I enjoyed it immensely, and my only regret is that there was not a better career path open to me afterwards.

What did you do after you left university?

I graduated with a BA in physics in 1984, after which I taught physics at an elite secondary school in New York City for three years. After that, I got into options trading and pricing models, which are actually rather closely related to some of the maths that appears in physics.

How did you come to create the atlas?

I got the idea for the Centennia Historical Atlas in the summer of 1990 when the Cold War ended. At that time the map of Europe was changing after decades of stability, and I thought I could make some money by putting those changes into a historical perspective — and that’s how I’ve been making my living ever since.

Can you explain what the atlas is all about?

It is a dynamic atlas of Europe and the Middle East from the year AD 1000 to the present day that puts 1000 years of history and 10,000 maps on a history student’s computer. When the software launches, the student has a map in front of them that can be played forwards or backwards in time. Countries and empires come and go right before your eyes.

How successful has it been?

After its initial release in the early 1990s, the atlas was immediately adopted by the US Naval Academy for its introductory course in Western civilization. By now, over 18,000 students at the Naval Academy, many of whom are now officers in the US Navy, have studied history using it. The atlas is also popular with diplomats and others connected with international relations — for example, I have one former director of the CIA and one former US secretary of state on my customer list. The atlas is also required reading at many universities across the world, and it is a popular tool among history enthusiasts and genealogists.

What are you working on at the moment?

As well as constantly making incremental improvements to the current edition (a free version is available for download at www.historicalatlas.com), I’m extending the atlas to cover the history of North America from the pre-Columbian period forward. I’m also working on software projects connected with celestial navigation and nautical astronomy. For example, I’ve developed tools that predict the exact position of the Moon and analyse lunar distance sights, allowing modern navigation enthusiasts to determine their “longitude by lunar distances”, once the epitome of the navigator’s art. I also occasionally write options-pricing software for the financial markets.

Does your physics degree help you in your current work?

Though the contribution was minor and basically metaphorical, my physics education perhaps made it a bit easier to envision time as the fourth dimension for a map of history. It does help significantly with the work that I do on nautical astronomy, however. For example, I recently ran some numerical integrations to create refraction tables for variable atmospheric conditions. Also, the mathematics of options-pricing models is essentially identical to that of diffusion in physics.
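
One standard way to see the connection he mentions: the Black–Scholes equation for the price V(S, t) of an option on an asset with price S, volatility σ and risk-free rate r,

\[
\frac{\partial V}{\partial t} + \tfrac{1}{2}\sigma^2 S^2 \frac{\partial^2 V}{\partial S^2} + r S \frac{\partial V}{\partial S} - r V = 0 ,
\]

turns, after the substitution S = e^x and a suitable rescaling of V and t, into the one-dimensional heat (diffusion) equation \(\partial u/\partial\tau = \tfrac{1}{2}\sigma^2\,\partial^2 u/\partial x^2\) familiar from Brownian motion.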

Do you still keep up with physics at all?

I take every chance I get. Of course, it’s impossible to keep up with all of physics, but I try to keep up with the latest developments in gravitational physics, both in terms of the practical application of gravimetry and the higher level of developments in general relativity. I manage an online list related to nautical astronomy and celestial navigation, and every now and then I end up talking about gravitational physics. Just recently I managed to get a few people interested in the local tidal field of the Earth by segueing from a discussion of inertial navigation.

Lay-offs at Fermilab set to be reversed

Pier Oddone

Six months of lobbying brought some success for the American physics community yesterday when President George W Bush signed a $186bn “supplemental” spending bill. The bill, which continues funding for military campaigns in Iraq and Afghanistan, also includes $338m for government agencies that support science research.

The extra funds become available immediately and will go part of the way to compensating for severe cuts in funding for US physics that were announced last December. In particular, the new money could allow the Fermi National Accelerator Laboratory near Chicago to reverse a decision to lay off 140 staff. However, physicists think that serious problems with the US science budget remain.

Lay-offs reversed

The new legislation provides $62.5m apiece for the National Science Foundation, the Office of Science of the Department of Energy (DOE), and NASA. Although the DOE has yet to decide how it will spend the fresh funds, it is expected to give the bulk to Fermilab and the Stanford Linear Accelerator Center (SLAC) in California. Since the funds apply to the current financial year, which ends on 30 September, they will allow the institutions to cancel or reverse lay-offs planned as a result of last December’s budget.

Fermilab’s director Pier Oddone, who was in the process of laying off 140 employees, has called a meeting of all his staff for tomorrow. “I expect to announce an end of involuntary layoffs at the laboratory,” he said.


SLAC, which laid off 125 people in April, is less certain of its course. “At this stage we don’t really know what will happen,” said spokesman Lee Lyon. “We’re not anticipating that we would do a large rehiring at this point. But as critical positions open up going forward, we would anticipate that people would be interested in reapplying.”

Analysts expect that any funds remaining after those efforts will support DOE’s fusion programmes. That should help the American contribution to the ITER project, support for which was cut to zero in the 2008 budget signed last December.


Physicists have reacted to the supplemental bill with less than unalloyed glee, seeing it more as a bandage than a cure. “While it’s not as great as it could have been for physics, the supplemental is a good sign that Congress has recognized the importance of physical science and other science funding,” said Kei Koizumi, a policy analyst at the American Association for the Advancement of Science (AAAS).


Stanford University physicist Arthur Bienenstock, this year’s president of the American Physical Society (APS), takes a more sceptical view. “I’m very pleased that we will be able to retain some of the extreme capabilities in science and technology at Fermilab and SLAC,” he said. “But I’m disappointed that there isn’t more funding for the basic energy science facilities at DOE.”

Michael Lubell, a physicist at the City College of New York and APS director of public affairs, also points out that three key issues that physicists raised last December with budget makers remain “largely unaddressed”. These are cuts in funding for particle physics, the lack of support for national facilities, and the damage to America’s credibility as an international partner because of the decision to cut funds for ITER.

“Unless these issues are addressed some time in the next six months, the country will pay a heavy penalty,” said Lubell, who plans to continue lobbying over the 2009 budget.

Where are all the physics teachers?

By Hamish Johnston

The British media were talking about physics today, and I’m afraid the news wasn’t good.

The BBC was reporting on a study on the state of physics teaching in England by Alan Smithers and Pamela Robinson of the Centre for Education and Employment Research at the University of Buckingham.

Robinson and Smithers found that in 2007 just 12% of scientists accepted on teacher training programmes were trained as physicists — down from 30% in 1983. If this trend continues, it could be very difficult for the government to hit its target of having 25% of all science teachers specialising in physics by 2014.

The decline in physics teachers has meant that many education authorities have opted for “general science” teachers who cover biology and chemistry as well as physics. Indeed, the researchers found that half the schools in inner London have no teachers specialising in physics.

However, all is not gloom and doom for teaching physics in England. Last week we had our summer company meeting, and Bob Kirby-Harris, chief executive of the Institute of Physics (which owns the company that publishes physicsworld.com), told us how the organization was tackling the problem. The IOP has set up the Physics Enhancement Project, which aims to boost the physics expertise of trainee science teachers who don’t have formal qualifications in physics.

Most of our readers are outside England, so please let us know about the state of physics teaching where you are.

Gamma-rays put limit on universe’s background light

A team of astronomers has detected a burst of very-high-energy gamma rays that was created over five billion years ago – the oldest such burst ever detected. On their journey to Earth, the gamma rays had to travel through the extragalactic background light (EBL), which is known to be a strong absorber of such gamma rays.

The fact that the burst was detected at all means that astronomers may be able to use such gamma rays to determine how the EBL changed over billions of years – something that could shed considerable light on how stars and galaxies form and evolve.

The universe is bathed in diffuse background light that comes from every star that has ever shone. As a result, the nature of this EBL could tell us much about the evolution of stars and galaxies over the history of the universe. However, because it is so faint compared to nearby stars and galaxies, it is very difficult to measure the EBL directly. As a result there are currently several theories about how much background light should be out there, and they make very different predictions.

History of the EBL

Very-high-energy gamma rays with energies between about 80 and 500 GeV can be used to probe the history of the EBL, particularly if they have travelled great distances across the universe. This is because the EBL is very good at blocking such gamma rays, and by measuring this effect physicists can get a handle on the density of the EBL.
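
Schematically, gamma rays of energy E emitted at redshift z are absorbed through pair production (γγ → e⁺e⁻) on EBL photons, so the flux that reaches Earth is related to the emitted flux by

\[
F_{\mathrm{obs}}(E) = F_{\mathrm{emitted}}(E)\, e^{-\tau(E, z)} ,
\]

where the optical depth τ encodes the EBL photon density integrated along the line of sight; measuring the attenuation therefore constrains that density.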

Now, Rudolf Bock of the Max Planck Institute for Physics in Munich, along with other members of the MAGIC gamma-ray telescope collaboration, has used the ground-based instrument to detect the oldest burst of such gamma rays ever seen (Science 320 1752). The radiation came from a distant quasar called 3C 279, which the team knows is more than five billion light-years from Earth.

According to Bock, a key result of the measurement is that most current theories tend to overestimate the density of the EBL. This is actually good news for astronomers because it suggests that gamma-ray telescopes can be used to trace the evolution of the EBL much farther back in the history of the universe than most models had predicted.

He told physicsworld.com that it is likely that the technique developed by the MAGIC team “will teach us much more about the history of star formation”.

The measurement is also in line with estimates of the EBL made by the Hubble and Spitzer space telescopes.

Dark energy at the Science Cafe

By Hamish Johnston

There’s nothing more annoying than the sound of your own voice…

I have come to this conclusion after spending many painful hours transcribing hundreds of taped interviews that I have done with scientists, industrialists and other luminaries.

But today, the shoe was on the other foot (or ear). At noon I was in a radio studio speaking to the host of the BBC Radio Wales programme The Science Cafe about dark energy.

2008 marks the tenth anniversary of the discovery of dark energy and Adam Walton wanted an update on the stuff from Physics World.

I’ll be the first to admit that cosmology is not one of my strengths and I know that I don’t have a voice for radio — but no-one else on the editorial team seemed to be available, so I agreed.

I spent the last few days reading up on dark energy — indeed, Physics World recently published three articles to mark the 10th anniversary, all of which were very helpful.

I did my best to answer Adam’s questions and I hope that after some skilful editing, his listeners will learn something about dark energy.

One thing I learned from the experience is that many of the things that I find interesting about dark energy — the huge discrepancy between the cosmological constant invoked to explain dark energy and the “cosmological constant” that can be derived from quantum field theory, for example — are very difficult to explain in sound bites.

If you live in Wales, you can listen to the show on Sunday, 29 June at 5pm. The rest of us can listen online.

I was pretty nervous during the interview, so I can’t really remember half of what I said. It will be interesting to hear what they use and what ends up on the cutting room floor.
