Part of my job as Features Editor on Physics World is to dig up great ideas for possible feature articles. That’s one reason why I am spending this week on an island in Lake Constance in southern Germany at the 58th meeting of Nobel Laureates at Lindau.
The meeting, which is held every year, gives top young students the chance to hear, talk to and debate with leading researchers from a particular field of endeavour. This year’s meeting is dedicated to physics and there are some 25 Nobel-prize-winning physicists here as well as over 550 students.
Yesterday we were treated to a fascinating debate about the Large Hadron Collider (LHC) at CERN, featuring Nobel laureates David Gross, Martinus Veltman, George Smoot, Gerard ’t Hooft and Carlo Rubbia, along with LHC accelerator supremo Lyn Evans and CERN chief scientific officer Jos Engelen.
Chairing the session was my predecessor in the Physics World features hot-seat Matthew Chalmers, who is now forging a career as a freelance science journalist.
Some speakers, like Smoot and Gross, preferred to talk about the hope that the LHC will yield a cornucopia of new physics, most prominently Higgs bosons and supersymmetric particles. Others, like Veltman and Rubbia, took a more cautious stance on what might be discovered.
The experiment itself is a complex beast and it will take years before the experimentalists understand it completely. The computing challenge is also gargantuan: the proton–proton collisions will yield some 10⁹ events per second, of which only about 200 per second can be written to disk.
This places a huge responsibility on the shoulders of the thousands of young researchers working in the bowels of the LHC to make sure that the interesting events are the ones that get saved to the computing grid.
As Rubbia told the meeting: “The discussion about the Higgs is not the right discussion at the moment. This is a very complex machine, and presumably, it will take years before we understand it properly. One should let the physicists do their work instead of pressuring the scientists for results.”
I hope to tell you more about what’s been happening here on Lake Constance later this week. Meanwhile, back to those Nobel laureates…
A new postgraduate centre for maths and computer science has opened in the Nigerian capital of Abuja as part of an ambitious plan to find the “next Einstein” in Africa. The centre is providing advanced training to graduate students from across Africa in maths and related fields. It wants to attract the best young African scientists and nurture their talents as problem-solvers and teachers.
The new Nigerian centre is modelled on — and has close ties with — the African Institute for Mathematical Sciences (AIMS) in Cape Town, South Africa, which was set up in 2003 by the Cambridge University cosmologist Neil Turok. In recognition of the close ties with AIMS, the new centre is called AIMS (Abuja).
The plan is to set up another 15 AIMS-type centres across Africa over the next five years. Each centre will be run as a partnership with AIMS and AIMS (Abuja), plus one or more local universities. The centres will host students from across Africa but focus on particular branches of mathematical science.
New centres are planned for countries including Ghana, Madagascar, Sudan and Uganda — and they will join the African Mathematical Institutes Network (AMI-Net), which was created in 2005.
One wish to change the world
Turok, who was born in South Africa, wants to raise an endowment of $150m, which would allow 50 graduates to be supported at each of the 15 centres over the next five years. He has so far raised $2.7m after giving a talk earlier this year at the annual TED (technology, entertainment and design) conference in California, where he was one of three people to win a $100 000 TED prize. Each winner is obliged to give a talk that includes their “one wish to change the world”, which for Turok was for the next Einstein to come from Africa.
Karl Voltaire, chief executive of AIMS (Abuja), told physicsworld.com that the new centre already has 50 students from across the continent and that classes began on Monday. “[Setting up AIMS (Abuja)] has been an exhilarating experience, frustrating at times, but well worth the effort of bringing students from across the continent to a place where they can learn with top faculty from all over the world,” he said. “The challenges now include bringing more African women into science and convincing policy-makers across the continent of the importance of developing a scientific elite.”
AIMS (Abuja) is based at the African University of Science and Technology (AUST), which also opens this month. Focusing on engineering, materials science, computing as well as petroleum and natural-gas engineering, it has been funded by the World Bank and the Nigerian government. The plan is for AUST to eventually have up to 5000 students.
Physicists in the US and Singapore are the first to use light from distant galaxies to perform a systematic search for cosmic strings — massive structures that may have been created just after the Big Bang. Although the team has found no evidence of cosmic strings in the small patch of sky they surveyed, they have been able to set an upper limit on the mass per unit length of the strings. The team is now working to improve their results by looking at larger patches of the sky.
Cosmic strings are extremely long and dense structures that many physicists believe were created about 10⁻³⁵ s after the Big Bang. At this time the universe became cool enough for the electrostrong force to begin to separate into the strong and electroweak forces. This “symmetry breaking” process marked a phase transition in the state of the universe — something akin to liquid water freezing into ice.
Just as crystal defects occur when water freezes, some cosmologists believe that defects in the fabric of the universe in the form of cosmic strings emerged during this phase transition. These extremely massive 1D objects could endure to this day — and studying them could provide important information about the early universe and how stars and galaxies evolved from the primordial fireball.
Lack of evidence
The only problem is that no-one has managed to find any compelling evidence for the existence of cosmic strings. The best that researchers have managed is to put an upper limit on the density of cosmic strings. So far, this has been done by looking for their effects on the cosmic microwave background (CMB); searching for gravitational waves created when a cosmic string cracks like a whip; and looking for evidence of gravitational lensing, whereby light from a distant galaxy is bent by the strong gravitational field of a cosmic string, making a single galaxy appear as a pair of galaxies to an observer on Earth.
Now, however, a team of researchers in the US and Singapore has performed the first systematic search of a section of the sky for evidence of gravitational lensing by cosmic strings. The team used a survey of about 300 square arcminutes of the sky — just a few millionths of the entire sky — obtained by the Hubble Space Telescope.
The survey contains about 78,000 galaxies, and the team used a computer algorithm to sift through all these images to look for nearby pairs of galaxies that may actually be a single galaxy seen through the gravitational lens of a cosmic string (Phys. Rev. D 77 123509).
Jodi Christiansen and colleagues at California Polytechnic State University, the University of California, Berkeley and the National University of Singapore began by cataloging the position, shape and brightness of all 78,000 individual galaxies above a certain brightness threshold. The team then searched the catalogue for “pairs” of galaxies of similar size and brightness that were separated by less than 15 arcseconds, finding about 6600 pairs that satisfied these criteria.
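In outline, the pair search is a simple geometric and photometric cut applied across the whole catalogue. The sketch below, in Python, illustrates the idea; the catalogue columns and the similarity tolerances are illustrative assumptions, not details of the team’s actual analysis.

```python
import numpy as np

def find_candidate_pairs(ra, dec, flux, radius, max_sep_arcsec=15.0,
                         flux_tol=0.5, size_tol=0.5):
    """Return (i, j, separation) for galaxy pairs closer than max_sep_arcsec
    whose brightness and angular size agree to within the given fractional
    tolerances (the tolerances here are illustrative, not the survey's)."""
    pairs = []
    n = len(ra)
    for i in range(n):
        for j in range(i + 1, n):
            # small-angle separation on the sky, degrees -> arcseconds
            dra = (ra[i] - ra[j]) * np.cos(np.radians(0.5 * (dec[i] + dec[j])))
            ddec = dec[i] - dec[j]
            sep = 3600.0 * np.hypot(dra, ddec)
            if sep > max_sep_arcsec:
                continue
            # a lensed "pair" should look like two near-identical images
            if abs(flux[i] - flux[j]) / max(flux[i], flux[j]) > flux_tol:
                continue
            if abs(radius[i] - radius[j]) / max(radius[i], radius[j]) > size_tol:
                continue
            pairs.append((i, j, sep))
    return pairs
```

For 78,000 galaxies a real analysis would use a spatial index (such as a k-d tree) rather than this brute-force double loop, but the selection logic is the same.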
Background level
Computer simulations of how a cosmic string would bend the light from distant galaxies suggested that lensed galaxy pairs would be separated by 6 arcseconds or less. As a result, Christiansen and colleagues assumed that the pairs separated by 7–15 arcseconds give a measure of the background level of pairs that appeared by chance, rather than as a result of cosmic-string-induced gravitational lensing.
While the analysis failed to find a significant excess of pairs that could be attributed to the presence of cosmic strings, the team was able to put an upper limit on the mass of cosmic strings: they can account for no more than about 2% of the total mass of the universe. In terms of the mass per unit length of an individual cosmic string, the upper limit is about 10⁻⁷ in dimensionless units.
This is better than the limit (about 10⁻⁶) set by a recent search by team member and Nobel laureate George Smoot, who looked for temperature fluctuations in the CMB that could be caused by cosmic strings. However, it is not as tight as the limits derived from an analysis of the CMB power spectrum and from searches for gravitational waves produced by cosmic strings (about 10⁻⁸).
However, Christiansen points out that the power-spectrum and gravitational-wave searches depend on several assumptions about how cosmic strings move through the universe, which makes them potentially more difficult to interpret than the gravitational-lensing results.
One drawback of the current survey is that it covers only a very small patch of the universe. Christiansen says that the team is now working to improve the precision of the technique by using a more recent survey called COSMOS, which covers a much larger patch of the sky measuring two square degrees.
Each charged elementary particle has a counterpart with the opposite charge, which is known as an antiparticle. The antiparticle partner of the negative electron, for example, is the positive positron, which was predicted by Paul Dirac in 1930 and discovered by Carl Anderson in 1932; while for the proton it is the antiproton, which was discovered by Emilio Segrè and Owen Chamberlain in 1955. Just like normal particles, antiparticles can combine, forming atoms of “antimatter”. Dirac’s theory suggested that the laws of physics were exactly the same for matter and antimatter; so given this symmetry, why is our visible universe made of matter with no antimatter? This is the question addressed by Helen Quinn and Yossi Nir in The Mystery of the Missing Antimatter.
A surprising experimental discovery in 1964 suggested a possible answer. While experimenting with K-mesons, which belong to the class of “strange” particles that contain a single strange quark, Jim Cronin and Val Fitch at Princeton University found a small asymmetry between particles and antiparticles. Their experiments revealed that there is an interaction that is not the same for quarks and antiquarks — a phenomenon that now goes by the name of CP violation (where C is charge conjugation and P is parity).
This led the Russian theoretical physicist Andrei Sakharov — who later became famous as a campaigner for human rights — to propose that at the beginning of the universe there were equal numbers of particles and antiparticles, but then, at an early stage in the evolution of the universe, some reaction or decay process that involved CP violation led to the destruction of some of the antiparticles. As the universe evolved further, particles and antiparticles annihilated each other until all the antiparticles were gone and only particles were left. Processes — such as Sakharov’s proposal — that produce an asymmetry between particles and antiparticles are referred to as “baryogenesis”.
Baryogenesis sees elementary particle theory intersecting with cosmology — the history of the universe. This intersection has proved to be a fascinating area of study in recent years as physicists have succeeded in applying known laws to interpret observational evidence about the early universe. However, at times it has also proved necessary to speculate about fundamental laws not yet discovered by experiments here on Earth. The mystery of dark matter, for example, points to some new fundamental physics, and Quinn and Nir discuss this in a quick summary of the standard cosmology of the expanding universe.
The major part of the book, however, is an exposition of the Standard Model of particle physics. The authors provide a systematic introduction to all the particles in the Standard Model, from the strongly interacting quarks to the weakly interacting neutrinos. They particularly emphasize how this model incorporates CP violation and discuss recent experiments at the Stanford Linear Accelerator Center in the US (where Quinn works) and at the KEK laboratory in Japan on particles called B-mesons, which exhibit large CP-violating effects.
Quinn and Nir then discuss how this source of CP violation can be the key to baryogenesis, and this discussion is the most complicated part of the book. This form of baryogenesis depends on a phase transition occurring just after the Big Bang. The authors describe how a bubble of the phase in which particles have mass and the weak interaction is less active expands very rapidly within a hotter phase in which all the fundamental particles are massless and the weak interaction is very active. According to the Standard Model, baryogenesis can occur at the surface of this expanding bubble. The result is that there are more quarks than antiquarks inside the bubble, which eventually expands to become the entire universe. This model does not, however, fit with the observational evidence, which has led researchers to conclude that CP violation in the Standard Model is too small to do the job suggested by Sakharov.
In order to solve the mystery, therefore, it is necessary to speculate about particles and interactions not yet discovered. It was originally suggested that the answer might lie with grand unified theories, which predict that at extremely high energies the electromagnetic, weak nuclear and strong nuclear forces are fused into a single unified field. One of the predictions of these theories (which Quinn herself helped develop) was that the proton would decay. Large water-based Cherenkov detectors were built in the US and Japan to search for such decay, but none was observed, and most researchers have abandoned this approach.
The detectors were successful in another respect, however. They were also used to study neutrinos produced by cosmic rays in the atmosphere and neutrinos from the Sun, which led to the discovery that neutrinos have a small mass — contrary to the predictions of the Standard Model (in which neutrinos are massless). The discovery of neutrino masses has inspired new speculation as to the source of the matter/antimatter asymmetry, which goes by the name of leptogenesis.
The basic idea behind leptogenesis is that, besides the three types of neutrinos that have been discovered experimentally, there are three much more massive “singlet” neutrinos, which have no interactions under the Standard Model but instead have a new CP-violating interaction that allows them to decay into more antineutrinos than neutrinos. Under the high-temperature conditions present in the early universe, certain reactions could occur in accordance with the Standard Model that would convert antineutrinos into matter particles, thus converting this neutrino asymmetry into an asymmetry between matter and antimatter.
Future experiments may give hints that this idea is correct; for example, sending neutrino and antineutrino beams to detectors hundreds of kilometres away to try to detect CP violation in the oscillation of neutrinos of one type into another. However, there is no way to directly test these speculations because singlet neutrinos are so massive that they could never be produced in the laboratory.
The Mystery of the Missing Antimatter is extremely well written and easy to read. The style is quite informal, avoiding equations and including occasional personal recollections from Quinn. This means that the authors have needed to simplify sophisticated concepts; however, there is nothing written that is really misleading. The only illustrations in the book are cartoon-like drawings that offer simplistic analogies to physics concepts, such as an elephant sitting on a chair to illustrate forces and energy, and some diagrams or graphs that relate more directly to the physics might have been helpful. Overall, though, the book provides a very stimulating introduction to our current theories of cosmology and particle physics.
And the answer to the mystery? My own belief is that at some point we may have to accept arbitrary initial conditions — in other words, that the universe just started out with an imbalance of matter and antimatter. There may be some questions that have no simple answers. Nevertheless, pursuing questions, even those that may not have an answer, is the only way that science can progress.
Martin Gardner, who turns 94 this autumn, seems to have pulled off an astounding trick. Every other year hundreds of people gather to honour Gardner, who is the author of over 70 books and wrote the popular “Mathematical Games” column that appeared in Scientific American for a quarter of a century from 1956. What is astonishing is that the people come from a bewildering variety of professions and include jugglers, magicians, artists, puzzle-makers, logicians, computer scientists, pseudoscience debunkers and mathematicians.
One long-standing fan of Gardner is the Atlanta businessman and puzzle collector Tom Rodgers. After Gardner stopped writing his column, Rodgers began trying to convince the famously humble author to attend an event in his honour. When Gardner finally relented, Rodgers used Gardner’s extensive correspondence files to compile a list of invitees. The first “Gathering for Gardner”, which Rodgers organized in 1993, was so successful that a second was held in 1996. Since then, it has been held every two years in an Atlanta hotel. Each event is called “G4Gn”, with the n denoting the number in the series. The most recent gathering, which took place over four days last March, was G4G8.
Gardner, who now resides in an “assisted-living” facility in Oklahoma, sadly no longer travels. But his presence was everywhere at G4G8. Talks ranged over his favourite topics — including mathematics, logic, games, puzzles, sculptures, mosaics and knots — and were all delivered and often staged wittily, with infectious enthusiasm and commitment. One mathematician ran the Atlanta marathon in the morning, then turned up to co-present his talk still wearing running shorts and his marathon number, in a kind of run-and-prove biathlon.
Raymond Smullyan, an 89-year-old magician and logician with white, shoulder-length hair and long, crooked fingers, gave a talk that consisted entirely of reciting a string of puzzles, paradoxes and one-liner-like, self-refuting sentences. It began when he strode to the podium and said “Before I begin speaking, let me say something”, and ended with paradoxes associated with the remark “Only an idiot would believe this sentence”.
Gardner is renowned for debunking pseudoscience — he is a founding member of the Committee for the Scientific Investigation of Claims of the Paranormal — and several talks were in that vein. One debunked the “Indian rope trick”, which supposedly involves a climbable, vertically rising rope but is evidently a hoax that can be traced back to an 1890 article in the Chicago Daily Tribune.
Gardner is also famous for his creation of “Dr Matrix”, a fictitious scholar whom Gardner used to mock numerology — the belief that all numbers are intrinsically linked to real objects and living things. In homage, a G4Gn tradition is for one Dr Matrix impersonator to promote n, followed by another who attacks n. At this year’s meeting, we heard tongue-in-cheek conspiracy theories; for instance, about why the maximum term of a US president is eight years, this summer’s Olympics start on 8.8.08, Santa has eight reindeer, a piano has 88 keys, there are now eight planets in the solar system, and so forth. When one speaker asked “What’s 987,654,321 divided by 123,456,789?”, my 13-year-old son Alex grabbed his calculator, already suspecting the answer.
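For readers without a calculator to hand, the point of that last puzzle is quickly checked; the snippet below is just the division itself.

```python
print(987654321 / 123456789)   # 8.0000000729... as close to 8 as makes no difference
```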
Kenneth Brecher, a physicist and astronomer from Boston University, peppered his talk, “A torque about tops”, with phrases like “I simply love tops!” and “I’m a top-a-holic!” and made it clear why their delightful properties attracted the affection of great physicists. James Clerk Maxwell created “the fanciest top ever made”, Einstein worked on the subject, and Felix Klein and Arnold Sommerfeld spent 13 years writing a four-volume, 966-page text on rotating bodies.
After discussing the chirality of the peculiar spinning objects known as celts, Brecher mentioned scarab beetles, whose carapaces are the only known natural objects that circularly polarize light, and wondered about the evolutionary mechanism, if any, behind it. Brecher’s enthusiasm was so infectious that when he displayed a picture of Wolfgang Pauli and Niels Bohr intently playing with a top, the first thought that sprang to mind was not “Why did such great minds stoop to such a frivolous activity?”, but rather “Of course!”.
Between sessions, people juggled, danced, played the piano, built group sculptures, taught card tricks and how to fold intricate origami objects, and argued about difficult mathematical sequences. An exhibition area displayed puzzles, toys and art. Everyone had an interesting story. Colin Wright (who has a PhD in mathematics from Cambridge University in the UK) used a notation system to discover novel juggling tricks and was demonstrating them to anyone with an interest. Alex Stone, a graduate physics student from Columbia University, was researching a book on physics and magic. Louis Brown, an 11-year-old from New Jersey, had written a fan letter to Smullyan a week before G4G8 and Smullyan invited him to come along.
Though Gardner did not attend, nothing is said to delight him more than the collaborations fostered by the gatherings. At its conclusion, several participants flew to Oklahoma to bring him news and gifts.
The critical point
The key to Gardner’s trick — of assembling such a diverse array of people — is that creating and solving puzzles is an integral part of many fields. Mathematicians and scientists study puzzles posed by nature that perhaps only they can solve. Puzzle-makers design puzzles that they hope others can solve with difficulty, while magicians create perceptual puzzles to foist on audiences. Taking unashamed delight in the play of puzzle-solving, and witnessing others do it, is essential to being able to do it well. The G4Gs reveal that Martin Gardner does not force people to cross disciplinary boundaries — rather, he reminds us that these boundaries are artificial to begin with.
Pure curiosity has been the driving force behind many groundbreaking experiments in physics. This is no better illustrated than in quantum mechanics, initially the physics of the extremely small. Since its beginnings in the 1920s and 1930s, researchers have wanted to observe the counterintuitive properties of quantum mechanics directly in the laboratory. However, because experimental technology was not sufficiently developed at the time, people like Niels Bohr, Albert Einstein, Werner Heisenberg and Erwin Schrödinger relied instead on “gedankenexperiments” (thought experiments) to investigate the quantum physics of individual particles, mainly electrons and photons.
By the 1970s technology had caught up, which produced a “gold rush” of fundamental experiments that continued into the 1990s. These experiments confirmed quantum theory with striking success, and challenged many common-sense assumptions about the physical world. Among these assumptions are “realism” (which, roughly speaking, states that the results of measurements reveal features of the world that exist independently of the measurement), “locality” (that the results of measurements here and now do not depend on some action that might be performed a large distance away at exactly the same time), and “non-contextuality” (asserting that the results of measurements are independent of the context of the measurement apparatus).
But a big surprise awaited everyone working in this field. The fundamental quantum experiments triggered a completely new field whereby researchers apply phenomena such as superposition, entanglement and randomness to encode, transmit and process information in radically novel schemes. “Quantum information science” is now a booming interdisciplinary field that has brought futuristic-sounding applications such as quantum computers, quantum encryption and quantum teleportation within reach. Furthermore, the technological advances that underpin it have given researchers unprecedented control over individual quantum systems. That control is now fuelling a renaissance in our curiosity about the quantum world by allowing physicists to address new fundamental aspects of quantum mechanics. In turn, this may open up new avenues in quantum information science.
Against intuition
Both fundamental quantum experiments and quantum information science owe much to the arrival of the laser in the 1960s, which provided new and highly efficient ways to prepare individual quantum systems to test the predictions of quantum theory. Indeed, the early development of fundamental quantum-physics experiments went hand in hand with some of the first experimental investigations of quantum optics.
One of the major experimental leaps at that time was the ability to produce “entangled” pairs of photons. In 1935 Schrödinger coined the term “entanglement” to denote pairs of particles that are described only by their joint properties instead of their individual properties — which goes against our experience of the macroscopic world. Shortly beforehand, Einstein, Boris Podolsky and Nathan Rosen (collectively known as EPR) used a gedankenexperiment to argue that if entanglement exists, then the quantum-mechanical description of physical reality must be incomplete. Einstein did not like the idea that the quantum state of one entangled particle could change instantly when a measurement is made on the other particle. Calling it “spooky” action at a distance, he hoped for a more complete physical theory of the very small that did not exhibit such strange features (see “The power of entanglement” by Harald Weinfurter Physics World January 2005 pp47–51).
This lay at the heart of a famous debate between Einstein and Bohr about whether physics describes nature “as it really is”, as was Einstein’s view, or whether it describes “what we can say about nature”, as Bohr believed. Until the 1960s these questions were merely philosophical in nature. But in 1964, the Northern Irish physicist John Bell realized that experiments on entangled particles could provide a test of whether there is a more complete description of the world beyond quantum theory. EPR believed that such a theory exists.
Bell based his argument on two assumptions made by EPR that are directly contradicted by the properties of entangled particles. The first is locality, which states that the results of measurements performed on one particle must be independent of whatever is done at the same time to its entangled partner located at an arbitrary distance away. The second is realism, which states that the outcome of a measurement on one of the particles reflects properties that the particle carried prior to and independent of the measurement. Bell showed that a particular combination of measurements performed on identically prepared pairs of particles would produce a numerical bound (today called a Bell’s inequality) that is satisfied by all physical theories that obey these two assumptions. He also showed, however, that this bound is violated by the predictions of quantum physics for entangled particle pairs (Physics 1 195).
Bell experiment Experiments on entangled pairs or triplets of photons can be used to test the notion of physical realism. In the original Bell-type experiments, both photons of an entangled pair have the same linear polarization for parallel polarizers. But for polarizers oriented at a small angle with respect to each other, as shown, the same result is more often obtained for both photons than would be permitted if polarization were a real local property of the photons.
Take, for example, the polarization of photons. An individual photon may be polarized along a specific direction, say the horizontal, and we can measure this polarization by passing the photon through a horizontally oriented polarizer. A click in a photon detector placed behind it indicates a successful measurement and shows that the photon is horizontally polarized; no click means that the photon is polarized along the vertical direction. In the case of an entangled pair of photons, however, the individual photons turn out not to carry any specific polarization before they are measured! Measuring the horizontal polarization of one of the photons will always give a random result, thus making it equally likely to find an individual photon horizontally or vertically polarized. Yet performing the same measurement on the other photon of the entangled pair (assuming a specific type of entangled state) will show both photons to be polarized along the same direction. This is true for all measurement directions and is independent of the spatial separation of the particles.
Bell’s inequality opened up the possibility of testing specific underlying assumptions of physical theories — an effort rightfully referred to by Abner Shimony of Boston University as “experimental metaphysics”. In such Bell experiments, two distant observers measure the polarization of entangled particles along different directions and calculate the correlations between them. Because the quantum correlations between independent polarization measurements on entangled particles can be much stronger than is allowed by any local realistic theory, Bell’s inequality will be violated.
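To get a feel for the numbers involved, the short sketch below evaluates the most commonly tested form of Bell’s inequality (the CHSH version) using the quantum prediction E(a, b) = cos 2(a − b) for the polarization correlation of the entangled state described above. The choice of measurement angles is the standard textbook one, included purely for illustration; it is not taken from any particular experiment.

```python
import numpy as np

def E(a, b):
    """Quantum prediction for the polarization correlation of a photon pair
    entangled so that parallel polarizers always give the same result."""
    return np.cos(2 * (a - b))

# standard CHSH measurement angles (radians)
a, a_prime = 0.0, np.pi / 4
b, b_prime = np.pi / 8, 3 * np.pi / 8

S = E(a, b) - E(a, b_prime) + E(a_prime, b) + E(a_prime, b_prime)

print(f"quantum CHSH value S = {S:.3f}")   # about 2.828, i.e. 2*sqrt(2)
print("bound for any local realistic theory: |S| <= 2")
```

Any local realistic theory must satisfy |S| ≤ 2, so the quantum prediction of 2√2 is what the experiments described below set out to confirm.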
Quantum loopholes
The first such test was performed using entangled photons in 1972 by Stuart Freedman and John Clauser of the University of California at Berkeley; Bell’s inequality was violated and the predictions of quantum theory were confirmed (Phys. Rev. Lett. 28 938). But from early on there existed some loopholes that meant researchers could not exclude all possible “local realistic” models as explanations for the observed correlations. For example, it could be that the particles detected are not a fair sample of all particles emitted by the source (the so-called detection loophole) or that the various elements of the experiment may still be causally connected (the locality loophole). In order to close these loopholes, more stringent experimental conditions had to be fulfilled.
In 1982 Alain Aspect and colleagues at the Université Paris-Sud in Orsay, France, carried out a series of pioneering experiments that were very close to Bell’s original proposal. The team implemented a two-channel detection scheme to avoid making assumptions about photons that did not pass through the polarizer (Phys. Rev. Lett. 49 91), and the researchers also periodically — and thus deterministically — varied the orientation of the polarizers after the photons were emitted from the source (Phys. Rev. Lett. 49 1804). Even under these more stringent conditions, Bell’s inequality was violated in both cases, thus significantly narrowing the chances of local-realistic explanations of quantum entanglement.
In 1998 one of the present authors (AZ) and colleagues, then at the University of Innsbruck, closed the locality loophole by using two fully independent quantum random-number generators to set the directions of the photon measurements. This meant that the direction along which the polarization of each photon was measured was decided at the last instant, such that no signal (which by necessity has to travel slower than the speed of light) would be able to transfer information to the other side before that photon was registered (Phys. Rev. Lett. 81 5039). Bell’s inequality was violated.
Then in 2001, David Wineland and co-workers at the National Institute of Standards and Technology (NIST) in Colorado, US, set out to close the detection loophole by using detectors with near-perfect efficiency in an experiment involving entangled beryllium ions (Nature 409 791). Once again, Bell’s inequality was violated. Indeed, all results to date suggest that no local-realistic theory can explain quantum entanglement.
GHZ experiment In the so-called GHZ experiments, knowing, say, the circular polarizations of two photons of a three-photon entangled state allows quantum mechanics to predict with certainty the linear polarization of the third (top) photon, which in this case is horizontal. A local realist would predict the orthogonal linear polarization – in this case vertical.
But the ultimate test of Bell’s theorem is still missing: a single experiment that closes all the loopholes at once. It is very unlikely that such an experiment will disagree with the prediction of quantum mechanics, since this would imply that nature makes use of both the detection loophole in the Innsbruck experiment and of the locality loophole in the NIST experiment. Nevertheless, nature could be vicious, and such an experiment is desirable if we are to finally close the book on local realism.
In 1987 Daniel Greenberger of the City College of New York, Michael Horne of Stonehill College and AZ (collectively GHZ) realized that the entanglement of three or more particles would provide an even stronger constraint on local realism than two-particle entanglement (Am. J. Phys. 58 1131). While two entangled particles are at variance with local realism only in their statistical properties, which is the essence of Bell’s theorem, three entangled particles can produce an immediate conflict in a single measurement result because measurements on two of the particles allow us to predict with certainty the property of the third particle.
The first experiments on three entangled photons were performed in late 1999 by AZ and co-workers, and they revealed striking agreement with quantum theory (Nature 403 515). So far, all tests of Bell’s inequalities and all experiments on three entangled particles (known as GHZ experiments, see “GHZ experiment”) confirm the predictions of quantum theory, and hence are in conflict with the joint assumption of locality and realism as underlying working hypotheses for any physical theory that seeks to explain the features of entangled particles.
Quantum information science
The many beautiful experiments performed during the early days of quantum optics prompted renewed interest in the basic concepts of quantum physics. Evidence for this can be seen, for example, in the number of citations received by the EPR paper, which argued that entanglement renders the quantum-mechanical description of physical reality incomplete. The paper was cited only about 40 times between its publication in 1935 and 1965, just after Bell developed his inequalities. Yet today it has more than 4000 citations, with an average of 200 per year since 2002. Part of the reason for this rise is that researchers from many different fields have begun to realize the dramatic consequences of using entanglement and other quantum concepts to encode, transmit and process information.
Entanglement-based quantum cryptography Quantum cryptography, which allows information to be sent totally securely between two sites, relies on entanglement. Pairs of polarization-entangled photons are distributed between Alice and Bob, who want to share a secret message (in this case an image of the famous Venus of Willendorf effigy). Measuring the polarization of an individual entangled photon will give a totally random outcome. However, if Alice and Bob perform measurements along the same polarization direction, then their outcomes are always the same within each entangled pair. By measuring many pairs Alice and Bob obtain the same random sequence, which they then use as a secret key. Alice mixes her key (top left) with the picture of the original (lower left). The encrypted picture (lower middle) is secure against eavesdropping because of the randomness of the key. Bob, however, can easily decode the message using his key (lower right). A potential eavesdropper can be detected because tampering with the photons of the entangled pair destroys the entanglement. This is a consequence of the uncertainty principle, and thus the security of quantum cryptography is guaranteed. The original experiment demonstrating this effect was performed by a team at the University of Innsbruck in 1998 over a distance of more than 300 m (Phys. Rev. Lett. 84 4729).
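The “mixing” step described in the box above is the classical one-time pad: once Alice and Bob hold identical random bit strings, encryption and decryption are the same bitwise XOR. Here is a minimal sketch in Python, in which an ordinary pseudorandom generator stands in for the string of measurement outcomes on entangled photons.

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """One-time-pad encryption/decryption: bitwise XOR with the shared key."""
    assert len(key) >= len(data), "the key must be at least as long as the message"
    return bytes(d ^ k for d, k in zip(data, key))

message = b"Venus of Willendorf"           # stand-in for the image data
key = secrets.token_bytes(len(message))    # stand-in for the shared quantum key

ciphertext = xor_bytes(message, key)       # what Alice transmits
recovered = xor_bytes(ciphertext, key)     # what Bob decodes

assert recovered == message
```

Because the key is random, used only once and known only to Alice and Bob, the ciphertext alone carries no information about the message; the quantum part of the protocol is solely about distributing that key securely.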
Take quantum cryptography, which applies randomness, superposition and, in one scheme proposed by Artur Ekert of Oxford University in the UK, two-particle entanglement to transmit information such that its security against eavesdropping is guaranteed by physical law (see “Entanglement-based quantum cryptography”). This application of quantum information science has already left the laboratory environment. In 2004, for instance, AZ and co-workers at the University of Vienna transferred money securely between an Austrian bank and Vienna City Hall using pairs of entangled photons that were generated by a laser in a nonlinear optical process and distributed via optical fibres. More recently, two international collaborations were able to distribute entangled photons over a distance of 144 km between La Palma and Tenerife, including a demonstration of quantum cryptography, and earlier this year even showed that such links could be established in space by bouncing laser pulses attenuated to the single-photon level off a satellite back to a receiving station on Earth (Physics World May p4). Commercial quantum-encryption products based on attenuated laser pulses are already on the market (see “Key to the quantum industry”), and the challenge now is to achieve higher bit rates and to bridge larger distances.
In a similar manner, the road leading to the GHZ experiments opened up the huge field of multiparticle entanglement, which has applications in, among other things, quantum metrology. For example, greater uncertainty in the number of entangled photons in an interferometer leads to less uncertainty in their phases, thereby resulting in better measurement accuracy compared with a similar experiment using the same number of non-entangled photons.
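In textbook terms this is the trade-off expressed by the number–phase uncertainty relation, roughly ΔN Δφ ≳ 1, which for N suitably entangled photons takes the achievable phase uncertainty from the shot-noise limit to the Heisenberg limit:

\[
\Delta\phi_{\mathrm{shot}} \sim \frac{1}{\sqrt{N}} \quad\longrightarrow\quad \Delta\phi_{\mathrm{Heisenberg}} \sim \frac{1}{N}.
\]

(This scaling argument is a standard heuristic rather than a description of any particular experiment mentioned here.)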
Multiparticle entanglement is also essential for quantum computing. Quantum computing exploits fundamental quantum phenomena to allow calculations to be performed with unprecedented speed — perhaps even solving problems that are too complex for conventional computers, such as factoring large numbers into their prime factors or enabling fast database searches. The key idea behind quantum computing is to encode and process information in physical systems following the rules of quantum mechanics. Much current research is therefore devoted to finding reliable quantum bits or “qubits” that can be linked together to form registers and logical gates analogous to those in conventional computers, which would then allow full quantum algorithms to be implemented.
One-way quantum computation Quantum computers, by processing information via the states of quantum systems such as atoms and photons, promise to outperform classical computers for specific tasks. One approach to making a practical version of such a device is the so-called one-way quantum computer. Proposed in 2001, it relies on entanglement. In a photonic realization, the output from a pump laser passes through a nonlinear beta barium borate (BBO) crystal twice in order to generate two entangled pairs of photons, which means that the four spatial modes contain four photons altogether. Coherent superposition at the two polarizing beamsplitters ensures that the final four detected photons are in a “cluster state”, given appropriate placing of half-wave plates and polarizers. Since 2005, proof-of-concept experiments (above) with this basic quantum-computer set-up have allowed computations of Grover’s search algorithm, a quantum prisoner’s dilemma and the demonstration of decoherence-free subspaces (as required for fault-tolerant quantum computation) to be carried out.
In 2001, however, Robert Raussendorf and Hans Briegel, then at the University of Munich in Germany, suggested an alternative route for quantum computation based on a highly entangled multiparticle “cluster state” (Phys. Rev. Lett. 86 5188). In this scheme, called “one-way” quantum computing, a computation is performed by measuring the individual particles of the entangled cluster state in a specific sequence that is defined by the particular calculation to be performed. The individual particles that have been measured are no longer entangled with the other particles and are therefore not available for further computation. But those that remain within the cluster state after each measurement end up in a specific state depending on which measurement was performed. As the measurement outcome of any individual entangled particle is completely random, different states result for the remaining particles after each measurement. But only in one specific case is the remaining state the correct one. Raussendorf and Briegel’s key idea was to eliminate that randomness by making the specific sequence of measurements depend on the earlier results. The whole scheme therefore represents a deterministic quantum computer, in which the remaining particles at the end of all measurements carry the result of the computation.
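A toy simulation makes the role of the feed-forward explicit. The sketch below is our own illustrative reconstruction rather than anything from the cited experiments: it builds the smallest possible cluster (two qubits), measures the first qubit in a rotated basis and shows that, once the perfectly random outcome is fed forward as a Pauli-X correction, the surviving qubit always ends up in the same rotated state.

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
X  = np.array([[0, 1], [1, 0]], dtype=complex)
H  = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
CZ = np.diag([1, 1, 1, -1]).astype(complex)
Rz = lambda t: np.diag([1, np.exp(1j * t)])

rng = np.random.default_rng(1)

def one_way_step(psi_in, theta):
    """Consume qubit 1 (carrying psi_in) by measurement; with feed-forward,
    qubit 2 deterministically ends up in H Rz(theta) |psi_in>."""
    plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
    state = CZ @ np.kron(psi_in, plus)                 # two-qubit cluster

    # measurement basis on qubit 1: (|0> +/- e^{-i theta}|1>)/sqrt(2)
    basis = [np.array([1,  np.exp(-1j * theta)]) / np.sqrt(2),
             np.array([1, -np.exp(-1j * theta)]) / np.sqrt(2)]

    # unnormalized qubit-2 states for each outcome: (<b_s| tensor I) |state>
    branch = [np.kron(b.conj().reshape(1, 2), I2) @ state for b in basis]
    p0 = np.linalg.norm(branch[0]) ** 2                # = 0.5: the outcome is random
    s = 0 if rng.random() < p0 else 1
    out = branch[s] / np.linalg.norm(branch[s])

    return X @ out if s == 1 else out                  # feed-forward correction

theta = 0.73                                           # arbitrary rotation angle
psi_in = np.array([0.6, 0.8j], dtype=complex)          # arbitrary input qubit
target = H @ Rz(theta) @ psi_in

out = one_way_step(psi_in, theta)
print(abs(np.vdot(target, out)))                       # ~1.0: equal up to a global phase
```

Real cluster-state computations chain many such steps, with each measurement basis chosen according to the outcomes already recorded, which is exactly the adaptive sequence Raussendorf and Briegel proposed.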
In 2005 the present authors and colleagues at Vienna demonstrated the principle of one-way quantum computing (see “One-way quantum computation”) and even a simple search algorithm using a four-photon entangled state (Nature 434 169). Then in 2007, Jian-Wei Pan of the University of Science and Technology of China in Hefei and co-workers implemented a similar scheme involving six photons. A distinct advantage of a photonic one-way computer is its unprecedented speed, with the time from the measurement of one photon to the next (i.e. one computational cycle) taking no more than about 100 ns.
Foundational questions
The technology developed in the last 20 years to make quantum information processing and communication a reality has given researchers unprecedented control over individual quantum systems. For example, it is now clear that all-photonic quantum computing is possible but that it requires efficient and highly pure single-photon and entangled-photon sources, plus the ability to reliably manipulate the quantum states of photons or other quantum systems in a matter of nanoseconds. The tools required for this and other developments are being constantly improved, which is itself opening up completely new ways to explore the profound questions raised by quantum theory.
One such question concerns once again the notions of locality and realism. The whole body of Bell and GHZ experiments performed over the years suggests that at least one of these two assumptions is inadequate to describe the physical world (at least as long as entangled states are involved). But Bell’s theorem does not allow us to say which one of the two should be abandoned.
Leggett experiment In a Leggett-type experiment the correlations between the linear polarization of one photon and the elliptical polarization of the other can be shown to violate a realistic world view even if instant non-local communication is allowed. As shown in the figure, circular and elliptical polarizations of single photons can be measured by using appropriate combinations of quarter-wave plates and linear polarizers.
In 2003 Anthony Leggett of the University of Illinois at Urbana-Champaign in the US provided a partial answer by presenting a new incompatibility theorem very much in the spirit of Bell’s theory but with a different set of assumptions (Found. Phys. 33 1469). His idea was to drop the assumption of locality and to ask if, in such a situation, a plausible concept of realism — namely to assign a fixed polarization as a “real” property of each particle in an entangled pair — is sufficient to fully reproduce quantum theory. Intuitively, one might expect that properly chosen non-local influences can produce arbitrary correlations. After all, if you allow your measurement outcome to depend on everything that goes on in the whole universe (including at the location of the second measurement apparatus), then why should you expect a restriction on such correlations?
For the specific case of a pair of polarization-entangled photons, the class of non-local realistic theories Leggett sets out to test fulfils the following assumptions: each particle of a pair is emitted from the source with a well-defined polarization; and non-local influences are present such that each individual measurement outcome may depend on any parameter at an arbitrary distance from the measurement. The predictions of such theories violate the original Bell inequalities due to the allowed non-local influence, so it is natural to ask whether they are able to reproduce all predictions of quantum theory.
Leggett showed that this is not the case. Analogous to Bell, he derived a set of inequalities for certain measurements on two entangled particles that are fulfilled by all theories based on these specific non-local realistic assumptions but which are violated by quantum-theoretical predictions. Testing Leggett’s inequalities is more challenging than testing Bell’s inequalities because they require measurements of both linear and elliptic polarization, and much higher-quality entanglement. But in 2007, thanks to the tremendous progress made with entangled-photon sources, the present authors and colleagues at Vienna were able to test a Leggett inequality experimentally by measuring correlations between linear and elliptical polarizations of entangled photons (Nature 446 871).
The experiment confirmed the predictions of quantum theory and thereby ruled out a broad class of non-local realistic theories as a conceptual basis for quantum phenomena. Similar to the evolution of Bell experiments, more rigorous Leggett-type experiments quickly followed. For example, independent experiments performed in 2007 by the Vienna team (Phys. Rev. Lett. 99 210406) and by researchers at the University of Geneva and the National University of Singapore (Phys. Rev. Lett. 99 210407) confirmed a violation of a Leggett inequality under more relaxed assumptions, thereby expanding the class of forbidden non-local realistic models. Two things are clear from these experiments. First, it is insufficient to give up completely the notion of locality. Second, one has to abandon at least the notion of naïve realism that particles have certain properties (in our case polarization) that are independent of any observation.
Macroscopic limits
The close interplay between quantum information science and fundamental curiosity has also been demonstrated by some fascinating experiments that involve more massive particles. According to quantum theory, there is no intrinsic upper limit on the size or complexity of a physical system above which quantum effects no longer occur. This is at the heart of Schrödinger’s famous cat paradox, which ridicules the situation by suggesting an experiment whereby someone could prepare a cat that is in a superposition of being alive and dead. One particularly interesting case is that of “matter–wave” interference.
Macroscopic quantum experiments By studying the quantum-mechanical behaviour of ever larger and more massive objects, researchers stand a better chance of building reliable quantum computers or developing other applications of quantum information science. The current record for the most massive objects to exhibit quantum interference in a two-slit-type experiment, i.e. to demonstrate wave-like behaviour despite their corpuscular character, is held by fluorinated fullerene molecules, C60F48, with a mass of 1632 atomic mass units (top left) (Phys. Rev. Lett. 91 090408). The largest object to show interference is the molecule azobenzene (bottom) (Nature Physics 3 711), as demonstrated by Markus Arndt’s team at the University of Vienna. Mechanical resonators (top right) have not yet entered the quantum regime, but this could soon change with such objects being cooled to their quantum ground state. This particular resonator contains a highly reflecting mirror placed on top of a vibrating singly clamped cantilever, which can be actuated via radiation pressure in combination with a high-finesse optical cavity.
Electrons, neutrons and atoms have been shown to exhibit interference effects when passing through a double slit, demonstrating that these massive systems cannot be described as having passed through just one slit or the other. Similar behaviour was observed more recently, in 1999, by AZ and colleagues at Vienna for the relatively large carbon-60 and carbon-70 fullerene molecules (Nature 401 680), and ongoing research has demonstrated interference with even heavier and larger systems (see “Macroscopic quantum experiments”). One of the main goals of this research is to realize quantum interference for small viruses or maybe even nanobacteria.
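The wave-like behaviour of such heavy molecules is striking because their de Broglie wavelength, λ = h/mv, is tiny. A rough estimate for carbon-60, assuming a beam velocity of order 200 m/s (a typical value in such experiments, not a figure quoted here):

```python
h = 6.626e-34        # Planck constant (J s)
amu = 1.661e-27      # atomic mass unit (kg)

m = 720 * amu        # carbon-60: 60 atoms of mass 12 u
v = 200.0            # assumed beam velocity (m/s)

wavelength = h / (m * v)
print(f"de Broglie wavelength ~ {wavelength * 1e12:.1f} pm")   # a few picometres
```

A wavelength of a few picometres is hundreds of times smaller than the molecule itself, which is why such experiments need finely spaced gratings and exquisite control over the molecular beam.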
Very recently the ability to cool nanomechanical devices to very low temperatures has opened up a new avenue for testing systems containing up to 10²⁰ atoms. One fascinating goal of experiments that probe the quantum regime of mechanical cantilevers is to demonstrate entanglement between a microscopic system such as a photon and a mechanical system — or even between two mechanical systems.
While the underlying motivation for studying macroscopic quantum systems is pure curiosity, the research touches on important questions for quantum information science. This is because increasingly large or complex quantum systems suffer from interactions with their environment, which is as important for macromolecules and cantilevers as it is for the large registers of a quantum computer.
One consequence of this interaction with the outside world is “decoherence”, whereby the system effectively becomes entangled with the environment and therefore loses its individual quantum state. As a result, measurements of that system are no longer able to reveal any quantum signature. Finding ways to avoid decoherence is thus a hot topic both in macroscopic quantum experiments and in quantum information science. With fullerene molecules, for instance, the effect of decoherence was studied in great detail in 2004 by coupling them to the outside environment in different, tuneable ways (Nature 427 711). From an experimental point of view, we see no reason to expect that decoherence cannot be overcome for systems much more macroscopic than is presently feasible in the laboratory.
Quantum curiosity
Quantum physics and the information science that it has inspired emerge as two sides of the same coin: as inspiration for conceptually new approaches to applications on the one side; and as an enabling toolbox for new fundamental questions on the other. It has often happened that new technologies raise questions that have not been asked before, simply because people could not imagine what has become possible in the laboratory.
One such case may be our increasing ability to manipulate complex quantum systems that live in high-dimensional Hilbert space — the mathematical space in which quantum states are described. Most of the known foundational questions in quantum theory have hitherto made use only of relatively simple systems, but larger Hilbert-space dimensions may add qualitatively new features to the interpretation of quantum physics. We are convinced that many surprises await us there.
We expect that future theoretical and experimental developments will shine more light on which of the counterintuitive features of quantum theory are really indispensable in our description of the physical world. In doing so, we expect to gain greater insight into the underlying fundamental question of what reality is and how to describe it. The close connection between basic curiosity of the quantum world and its application in information science may even lead to ideas for physics beyond quantum mechanics.
At a Glance: A quantum renaissance
Quantum mechanics challenges intuitive notions about reality, such as whether the property of a particle exists before a measurement is performed on it
Entanglement is one of the most perplexing aspects of quantum theory; it implies that measurement results on two particles are intimately connected to one another instantaneously no matter how far apart they are
Since the 1970s, experiments have shown repeatedly that quantum theory is correct, but researchers are still devising measurements in order to find out what quantum mechanics tells us about physical reality
These tests have sparked a new field called quantum information science, in which entanglement and other quantum phenomena are used to encrypt, transmit and process information in radically new ways
The increased level of control over individual quantum systems that has driven quantum information science is now enabling physicists to tackle once again the fundamental puzzles raised by quantum theory
More about: A quantum renaissance
M Arndt, K Hornberger and A Zeilinger 2005 Probing the limits of the quantum world Physics World March pp35–40
D Bouwmeester et al. (ed) 1999 The Physics of Quantum Information (Springer, Heidelberg)
A Steinberg et al. 1996 Quantum optical tests of the foundations of physics The American Institute of Physics Atomic, Molecular, and Optical Physics Handbook (ed) G W F Drake (AIP Press)
A Zeilinger et al. 2005 Happy centenary, photon Nature 433 239
Blogger: Chris Lintott URL: chrislintott.net First post: January 2006
Who is the blog written by?
Chris Lintott is a postdoc at Oxford University in the UK, where he is studying the application of astrochemical models of star formation to galaxies beyond the Milky Way. He is also heavily involved in science popularization: he co-presents the 50-year-old BBC TV programme The Sky at Night with Sir Patrick Moore, and he co-authored, along with Moore and Queen guitarist Brian May, the book Bang!, a popular account of the history of the universe (see Physics World October 2006 pp12–13, print edition only). Lintott is also principal investigator on the Galaxy Zoo project, which enlists the help of members of the public to classify galaxies imaged by telescopes, and contributes to the project’s blog.
What topics does the blog cover?
Compared with many other scientists’ blogs, this one is quite focused. On the whole, Lintott restricts himself to writing about and commenting on topics related to physics, and astronomy in particular. Readers are often treated to previews of upcoming episodes of The Sky at Night, as well as being pointed in the direction of Lintott’s other outreach activities, such as his posts on the Galaxy Zoo blog and articles he writes for The Times newspaper, not to mention other interesting science-related articles and posts on other blogs. The funding crisis at the UK’s Science and Technology Facilities Council (STFC) has naturally featured quite heavily over the past few months.
Who is it aimed at?
General readers with an interest in astronomy should have no trouble following most of the topics discussed on Lintott’s blog, although some — such as the STFC funding crisis — are probably aimed more at those in the research community. Judging by the comments the blog receives, readers are a varied bunch ranging from Lintott’s Galaxy Zoo colleagues to physicists working in completely different areas, and even non-scientists.
Why should I read it?
Lintott writes really well and always explains the science clearly, and his passion for all things space infuses his posts with enthusiasm. He also provides some interesting behind-the-scenes insights into the making of The Sky at Night and what it is like working with Moore; for instance, the veteran astronomy popularizer apparently insists on doing all his writing on a nearly 80-year-old Woodstock typewriter that he has owned since he was a boy of nine. Don’t expect to hear too much about the minutiae of Lintott’s life, however: astronomy is definitely the star of this blog.
How often is it updated?
Lintott often posts in bursts, putting up several posts in a day or over a few days, and then has a break for a few days to a week (presumably to allow him to catch up with his other commitments).
Can you give me a sample quote?
My trip to Hawaii was more or less a complete washout. Of the four nights we had on the telescope, we made it to the summit for one and a half of them. The last night was the most depressing, when we sat there for hours waiting for fog to clear only for it to start snowing heavily. Once that happens, you need a team with shovels to be able to open up and it’s time to head down. We did manage to get about an hour’s data on the third night, for a program which didn’t need as good conditions as mine did, providing data on targets which will be viewed by the new Herschel telescope.
Like all magazines, Physics World thinks carefully about the images that it puts on the cover of each issue. We aim for illustrations that are attractive, eye-catching and usually related to one of the main features in the magazine. This year a variety of striking images have appeared, including a battleship, a bombardier beetle, a glacier and a snowflake. This month’s cover, though, is highly abstract, consisting of a projection into 2D of a complex 8D lattice called E8.
This image — or at least a version of it — first appeared in a paper posted last year on the arXiv preprint server by Garrett Lisi, a US physicist living in Hawaii. In the paper, Lisi controversially claims that E8 could form the basis of a “theory of everything” that unites nature’s four forces. The picture has 240 vertices, and Lisi believes that 220 of these are occupied by the fundamental particles of the Standard Model. The other 20, unoccupied, slots hold additional particles, the existence of which could be used to test his proto-theory.
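For readers wondering where those 240 vertices come from, the E8 root system can be written down explicitly: 112 roots are the permutations of (±1, ±1, 0, …, 0) and 128 have every coordinate equal to ±1/2 with an even number of minus signs. The Python sketch below is purely illustrative; it enumerates the roots and flattens them onto an arbitrary plane, whereas the published image uses a special “Coxeter plane” projection chosen for its symmetry.

```python
from itertools import combinations, product
import numpy as np

def e8_roots():
    roots = []
    # 112 roots: permutations of (+-1, +-1, 0, ..., 0)
    for i, j in combinations(range(8), 2):
        for si, sj in product((1.0, -1.0), repeat=2):
            v = [0.0] * 8
            v[i], v[j] = si, sj
            roots.append(v)
    # 128 roots: all coordinates +-1/2, with an even number of minus signs
    for signs in product((0.5, -0.5), repeat=8):
        if sum(s < 0 for s in signs) % 2 == 0:
            roots.append(list(signs))
    return np.array(roots)

roots = e8_roots()
assert len(roots) == 240

# Project the 8D points onto an arbitrary orthonormal 2D plane
rng = np.random.default_rng(0)
plane, _ = np.linalg.qr(rng.standard_normal((8, 2)))
xy = roots @ plane   # 240 points in the plane, ready to plot
```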
Lisi’s paper was reported widely last year, partly because of its eye-catching illustrations and partly because he gave it a deceptively simple title — “An exceptionally simple theory of everything”. However, the article was widely criticized in blogs on various technical grounds, including the fact that Lisi’s work does not give the strengths of the interactions between the particles. Lisi himself has admitted that the paper, which has yet to appear in a peer-reviewed journal, is not the final word. And although the article has been heavily downloaded, many of those downloads are believed to come from non-scientists (rather than mainstream researchers) arriving via “news-aggregation” websites such as Digg and Reddit.
Another reason that the paper got so much attention was no doubt Lisi’s unusual background. Although he has a PhD from the University of California, San Diego, Lisi currently holds no academic position and enjoys surfing and snowboarding. “Surfer dude makes stunning breakthrough” must have been too tempting a story for the media to ignore, even if it is far from the truth. Our feature this month on E8 (“Symmetry’s physical dimension”) is not so much about Lisi’s work itself, but rather uses his paper as a hook to illustrate some of the links between symmetry and basic physics.
Whether or not Lisi’s paper is a major contribution to science — the jury is still out — it illustrates the difficulty that those outside the mainstream can sometimes face in making worthwhile contributions to physics. For example, one UK physicist in his mid-50s, who recently completed a PhD in particle physics, explains elsewhere in this issue the difficulties he faced breaking into the subject as an older person (pp18–19, print edition only). He has dubbed this problem “the grey ceiling”, in analogy to the “glass ceiling” that can prevent women’s careers from progressing.
It would be a shame if talented people were excluded from physics, which needs to do all it can to encourage interest in the subject. Of course, those with unconventional career paths have to work twice as hard to be taken seriously. But one only has to look at William Henry Bragg (pp42–43, print edition only) to see that age need not be a hindrance to making valuable contributions to the subject. Despite not carrying out an original experiment before the age of 40, he went on to share the 1915 Nobel Prize for Physics with his son for their work on X-ray crystallography. Breakthroughs in physics can sometimes happen in the most unconventional of ways.
When I left university in 1975 having completed a degree in applied physics, embarking on a long-term career was pretty much the last thing on my mind. Exhausted by exams, and with the summer of 1975 shaping up to be a real scorcher, the prospect of beaches, sunshine and sailing seemed much more attractive. So I took a few months off — think of it as a delayed gap year.
Come the winter, however, with the beaches deserted and funds dispiritingly low, I looked in my local paper for something temporary to do while I considered how best to put my shiny new degree to good use. The Building Services Research and Information Association — a small research and development outfit in Bracknell — was looking for graduates to carry out laboratory experiments into the comfort issues associated with buildings. Although the money was not great, the job was certainly interesting and would do fine as a stopgap. As it happened, the job turned out to be rather better than that, and 33 years later I am chief executive of that research association, which is now the company known as BSRIA.
When I began my career, it was clear that while construction was a big industry (it now accounts for about 10% of the UK’s gross domestic product), on the whole it was not at the forefront of technology and did not employ many graduates — especially those with a background in science or technology. This meant that I, as a physicist, suddenly found myself working with very senior people at major UK and international companies, as well as gaining immediate access to some of the best people in the industry through professional organizations such as the Chartered Institution of Building Services Engineers. I was able to make a difference almost immediately.
For example, my first task was to develop a method for testing “whole house ventilation heat recovery devices”. This was quite complex, since it required the airflow volume, moisture content, temperature and electrical load to be measured at each of four positions simultaneously. A curiosity 30 years ago, this research is now essential as we work towards “zero carbon” homes. I was helped by having an employer that wanted junior employees to put their names to their work rather than hiding behind their supervisors’ titles, and also by colleagues who were sociable, supportive and at the top of their game.
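To give a feel for the kind of quantity such a test is designed to pin down (the article does not spell out BSRIA’s actual procedure, so this is only an illustration), a heat-recovery unit is often summarized by its sensible effectiveness: the fraction of the available temperature difference that is transferred to the incoming fresh air. A minimal Python sketch with hypothetical readings:

```python
def sensible_effectiveness(t_outdoor_in, t_supply_out, t_extract_in):
    """Fraction of the available temperature difference recovered by the
    incoming fresh-air stream (assumes balanced supply and extract flows)."""
    return (t_supply_out - t_outdoor_in) / (t_extract_in - t_outdoor_in)

# Hypothetical readings: fresh air enters at 2 C, leaves the exchanger at 16 C,
# stale air is extracted from the house at 20 C
eta = sensible_effectiveness(2.0, 16.0, 20.0)
print(f"effectiveness = {eta:.2f}")   # about 0.78
```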
Diverse sector
The construction industry is involved in everything from motorways and bridges (civil construction) through to schools, hospitals, offices and homes (the built environment). Within these sectors there are designers (architects, structural engineers, building-services consultants and so on) and contractors, who actually put things together on-site. The contractors consist of the main contractors that erect the structures and specialist contractors that install all the plant and equipment that make buildings comfortable and safe to live in. It is at this specialist end of the spectrum that BSRIA works.
Founded in the mid-1950s by a group of companies that wanted to collaborate in research and development, BSRIA is one of about 50 research associations (RAs) in the UK that each deal with a particular market sector. For example, there are RAs for shoes, timber, drop forging and even (my second favourite) Scotch whisky. They are where industrial companies come together to do collaborative research, and over the years they have transformed from membership subscription-based organizations to wholly self-funded enterprises. Nevertheless, many research associations, including BSRIA, continue to operate a membership base and to facilitate collaborative efforts both in research and on a more political level.
In recent years the role of the built-environment engineer and contractor has changed radically. The escalating need to create buildings that produce very low or zero carbon-dioxide emissions has created new challenges that are taxing the very best brains. Indeed, we have to meet targets set by the UK government that all new houses built in the country will be “zero carbon” by 2016, with all non-domestic buildings following suit by 2019. You just have to look at your own home, with its heating, hot water and lighting needs, to appreciate just how difficult this is going to be to achieve if the occupants of these buildings are also going to be comfortable and healthy.
If this is problematic for new buildings, then it is even more difficult to achieve in existing buildings — and it is these that can make the real difference to carbon emissions. Roughly half of all carbon emissions come from buildings and their uses but only 1% of these buildings are newly constructed, while just 2% have some form of refurbishment each year. In other words, 97% of buildings have nothing done to them to improve their performance.
Building physics can do something about this by trying to understand the complex interactions of energy flows around structures. This involves resolving highly multivariate interactions and creating models that can eventually be used in the design of new structures. These tasks are well suited to people with a physics background, and there is a wonderful future ahead for bright and committed people who want to make a difference.
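To give a flavour of what such a model can look like (this is a generic textbook-style sketch, not one of BSRIA’s models), the simplest approach treats a whole building as a single thermal capacitance C connected to the outside air through a thermal resistance R and driven by a heat input Q:

```python
import numpy as np

def simulate_indoor_temperature(t_outdoor, q_heater, R=0.01, C=2.0e7,
                                t_start=15.0, dt=60.0):
    """Lumped first-order model: C dT/dt = (T_out - T)/R + Q.
    R in K/W, C in J/K, dt in seconds, temperatures in degrees C."""
    t = t_start
    history = []
    for t_out, q in zip(t_outdoor, q_heater):
        t += dt * ((t_out - t) / R + q) / C
        history.append(t)
    return np.array(history)

# 24 hours at a steady 10 C outside with a 1 kW heater, in one-minute steps
steps = 24 * 60
indoor = simulate_indoor_temperature(np.full(steps, 10.0), np.full(steps, 1000.0))
```

With these illustrative parameters the room temperature creeps towards its steady-state value of T_out + QR = 20 C; real building models couple many such nodes together and add solar gains, ventilation and occupant behaviour.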
Path to success
BSRIA is not a large organization — it currently has 150 employees — and about half of the staff have numerate degrees, in subjects including mechanical engineering, aeronautical engineering and physics. We work both with those who are at the sharp end of invention, such as universities and other research institutions, and with those who prefer to be a little further away from the cutting edge. Construction companies and their clients, for example, are highly risk averse, so it is our role to take innovation and create proven, de-risked processes that can be used with confidence by constructors. As a result, both the company and its individual staff have a high profile in the trade press and at conferences, and they have a lot of contact with policy-makers in government. It is this diversity of activity that has made mine such a rewarding job.
I have now been chief executive of BSRIA for 10 years, having worked up through a variety of posts within the company. I started as a supervised project engineer, then progressed to running a small section of five people, and later a group of four sections. I took up the role of technical director and joined the first board of directors following a restructuring of the business in 1989 before being appointed to the top job in 1998.
Since I joined, the firm’s horizons have widened considerably — for example, in March of this year I opened a new office in Beijing, which employs five local staff. China is likely to undertake nearly half of the entire world’s construction activity within the next decade, and its government is anxious to ensure that its carbon footprint does not rise at the same rate. With many overseas companies now working there, the opportunities for transferring expertise are significant, although this expansion also presents many cultural and economic challenges for the firm. It is this mix of technology and business that has made my career at BSRIA such a delightful experience. Despite having been with the organization for 33 years, it seems like only yesterday that the beaches emptied and the rain stopped play.
Being a child of the Moon-landing era, I was always interested in astronomy and space science. I eventually came to understand that physics was the power behind all the interesting things in the physical sciences, so I decided to study the subject at Wesleyan University in Connecticut in the US. I specialized in gravitational physics and wrote a thesis on the quantum behaviour of hydrogen atoms in the highly curved space–time near small black holes. I enjoyed it immensely, and my only regret is that there was not a better career path open to me afterwards.
What did you do after you left university?
I graduated with a BA in physics in 1984, after which I taught physics at an elite secondary school in New York City for three years. After that, I got into options trading and pricing models, which are actually rather closely related to some of the maths that appears in physics.
How did you come to create the atlas?
I got the idea for the Centennia Historical Atlas in the summer of 1990 when the Cold War ended. At that time the map of Europe was changing after decades of stability, and I thought I could make some money by putting those changes into a historical perspective — and that’s how I’ve been making my living ever since.
Can you explain what the atlas is all about?
It is a dynamic atlas of Europe and the Middle East from the year AD 1000 to the present day that puts 1000 years of history and 10,000 maps on a history student’s computer. When the software launches, the student has a map in front of them that can be played forwards or backwards in time. Countries and empires come and go right before your eyes.
How successful has it been?
After its initial release in the early 1990s, the atlas was immediately adopted by the US Naval Academy for its introductory course in Western civilization. By now, over 18,000 students at the Naval Academy, many of whom are now officers in the US Navy, have studied history using it. The atlas is also popular with diplomats and others connected with international relations — for example, I have one former director of the CIA and one former US secretary of state on my customer list. It is also required reading at many universities across the world, and it is a popular tool among history enthusiasts and genealogists.
What are you working on at the moment?
As well as constantly making incremental improvements to the current edition (a free version is available for download at www.historicalatlas.com), I’m extending the atlas to cover the history of North America from the pre-Columbian period forward. I’m also working on software projects connected with celestial navigation and nautical astronomy. For example, I’ve developed tools that predict the exact position of the Moon and analyse lunar distance sights, allowing modern navigation enthusiasts to determine their “longitude by lunar distances”, once the epitome of the navigator’s art. I also occasionally write options-pricing software for the financial markets.
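By way of illustration (the function below is a hypothetical sketch, not code from the software mentioned above), the classic way to “clear” a lunar distance exploits the fact that refraction and parallax shift each body along its vertical circle, so the azimuth difference between the Moon and the other body is the same in the apparent and the true position triangles:

```python
from math import sin, cos, acos, radians, degrees

def clear_lunar_distance(d_apparent, alt_moon_app, alt_body_app,
                         alt_moon_true, alt_body_true):
    """Return the cleared (geocentric) lunar distance in degrees, given the
    apparent distance, the apparent altitudes, and the altitudes already
    corrected for refraction and parallax (all angles in degrees)."""
    d, hm, hb = map(radians, (d_apparent, alt_moon_app, alt_body_app))
    # Azimuth difference from the apparent triangle (zenith, Moon, other body)
    cos_dz = (cos(d) - sin(hm) * sin(hb)) / (cos(hm) * cos(hb))
    Hm, Hb = map(radians, (alt_moon_true, alt_body_true))
    # Reuse the azimuth difference with the corrected altitudes
    return degrees(acos(sin(Hm) * sin(Hb) + cos(Hm) * cos(Hb) * cos_dz))
```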
Does your physics degree help you in your current work?
Though the contribution was minor and basically metaphorical, my physics education perhaps made it a bit easier to envision time as the fourth dimension for a map of history. It does help significantly with the work that I do on nautical astronomy, however. For example, I recently ran some numerical integrations to create refraction tables for variable atmospheric conditions. Also, the mathematics of options pricing is essentially identical to the mathematics of diffusion in physics.
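He does not name a particular model, but the textbook example of that correspondence is the Black–Scholes equation for the price V(S, t) of an option on an asset with price S, volatility σ and risk-free rate r,

$$\frac{\partial V}{\partial t} + \tfrac{1}{2}\sigma^2 S^2 \frac{\partial^2 V}{\partial S^2} + rS\,\frac{\partial V}{\partial S} - rV = 0,$$

which, after the standard change of variables S = K e^x, t = T - 2τ/σ² and an exponential rescaling of V, becomes the one-dimensional heat (diffusion) equation

$$\frac{\partial u}{\partial \tau} = \frac{\partial^2 u}{\partial x^2}.$$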
Do you still keep up with physics at all?
I take every chance I get. Of course, it’s impossible to keep up with all of physics, but I try to keep up with the latest developments in gravitational physics, both in terms of the practical application of gravimetry and the higher level of developments in general relativity. I manage an online list related to nautical astronomy and celestial navigation, and every now and then I end up talking about gravitational physics. Just recently I managed to get a few people interested in the local tidal field of the Earth by segueing from a discussion of inertial navigation.