Ultracold neutrons probe the particle-physics frontier

So what are ultracold neutrons?
Peter Geltenbort: The “thermal neutrons” produced by a research reactor such as ILL have energies of a few hundredths of an electron volt. On average they travel at about 2200 m/s. The ultracold neutrons (UCNs) that we work with have speeds of about 5–8 m/s – so even I can run faster than a UCN.
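The quoted speeds follow directly from the kinetic-energy relation E = ½mv². As a rough check (using standard values for the neutron mass and the electron volt), a minimal sketch:

```python
# Rough check of the quoted neutron speeds and energies, using
# E = (1/2) m v^2 with standard values for the constants.
import math

M_N = 1.675e-27   # neutron mass, kg
EV = 1.602e-19    # one electron volt, J

def speed(energy_ev):
    """Speed (m/s) of a neutron with the given kinetic energy (eV)."""
    return math.sqrt(2 * energy_ev * EV / M_N)

def energy(v):
    """Kinetic energy (eV) of a neutron moving at v (m/s)."""
    return 0.5 * M_N * v**2 / EV

print(f"thermal (0.025 eV): {speed(0.025):.0f} m/s")  # ~2190 m/s
print(f"UCN at 5 m/s: {energy(5.0)*1e9:.0f} neV")     # ~130 neV
```

A few hundredths of an electron volt indeed gives about 2200 m/s, while a 5 m/s UCN carries only around a hundred nano-electron-volts.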

How are they produced at ILL?
PG: The first step is to make thermal neutrons from the much more energetic “fission neutrons” that are produced in the ILL’s reactor. This happens naturally when these neutrons collide with deuterium nuclei in the heavy water within the reactor core. Some of these thermal neutrons are then cooled further through collisions in vessels close to the core that are filled with liquid deuterium at about 25 K, which takes the speed down to about 700 m/s. Next, some of these neutrons are guided up using a special pipe, which transmits only neutrons with lower energies – so at the top they emerge with speeds of about 50 m/s.

The neutrons are then piped into a large vacuum vessel above the reactor core, containing a turbine with rotating metal blades. The neutrons hit the blades, which rotate backwards and cause the neutrons to lose kinetic energy. It’s the same as when you’re playing tennis and instead of striking the ball you pull back your racquet. In physics terms this is a Doppler shift of the neutron from a velocity of about 50 m/s down to about 5 m/s.
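The "pulled-back racquet" picture can be sketched in one dimension: a neutron reflecting elastically from a blade receding at speed u leaves with speed v − 2u. The blade speed below is an illustrative assumption chosen to match the quoted numbers, not a quoted ILL value:

```python
def reflect(v_in, u_blade):
    """1D elastic reflection from a mirror receding at u_blade.
    In the blade's rest frame the neutron arrives at v_in - u_blade
    and simply reverses, so in the lab frame v_out = v_in - 2*u_blade."""
    return abs(v_in - 2 * u_blade)

# A blade receding at ~22.5 m/s (illustrative value) takes a
# 50 m/s neutron down into the UCN range:
print(reflect(50.0, 22.5))  # 5.0 m/s
```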

What can physicists learn from studying extremely cold neutrons?
Oliver Zimmer: The nice thing about the neutron is that it is a complete laboratory – it feels all types of forces including gravity and magnetic forces, the weak force responsible for radioactivity and the strong force that keeps the particles bound together in atomic nuclei. It even responds to electric fields – although this might sound strange because the neutron has no electrical charge. However, the neutron has a rich internal structure and contains quarks, which are charged particles. The quarks are subject to the strong and weak forces. The weak force also gives rise to the decay of the neutron, which can be best studied by trapping UCNs in bottles and watching them decay.

So you do particle physics using UCNs?

PG: Yes, but in a different way from a high-energy physics lab like CERN, which tries to reproduce the conditions less than a microsecond after the Big Bang. With UCNs we look for the consequences of the Big Bang at very high precision.

OZ: Interactions between particles are mediated by particles called bosons. In an accelerator the energy is often high enough to produce these bosons and see how they decay. In UCN research you don’t actually produce these bosons as free particles. Instead, you see their effects indirectly in the static and decay properties of the neutron. With the stunningly high precision of some experiments you can learn about the fine details of fundamental interactions and possibly even find hitherto unknown forces – that’s the beautiful thing about this research.

What have you learned so far?
PG: A recent example is that physicists at ILL have measured for the first time the quantum gravitational states of the neutron, which is done by bouncing UCNs on a very smooth mirror. The repulsive force from the mirror pushes the neutrons up, while gravity pushes them down. The effect is to confine the neutrons in a potential well and quantum mechanics tells us that the neutrons exist in distinct energy levels.

This was demonstrated by bringing a second, absorbing surface down to the mirror and then slowly pulling it upwards. We measure how many neutrons can fly through this gap and find that there are none until the upper surface reaches the height for which the first gravitational quantum state fits in the gap. As the height is raised further, a step-like intensity increase indicates transmission via the subsequent quantum states.
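The energy scale of these levels can be sketched from first principles: for a linear gravitational potential above a hard mirror, the eigenenergies are set by the zeros of the Airy function. A minimal calculation (first few Airy zeros hardcoded):

```python
# Sketch of the neutron's gravitational quantum levels above a mirror.
# For V(z) = m*g*z with a hard floor at z = 0, the energy levels are
# E_n = (hbar^2 * m * g^2 / 2)**(1/3) * |a_n|, where the a_n are the
# zeros of the Airy function Ai.
HBAR = 1.0546e-34   # J s
M_N = 1.675e-27     # neutron mass, kg
G = 9.81            # m/s^2
AIRY_ZEROS = [2.338, 4.088, 5.521, 6.787]  # |a_n| for n = 1..4

scale = (HBAR**2 * M_N * G**2 / 2) ** (1 / 3)   # energy scale, J
for n, a in enumerate(AIRY_ZEROS, start=1):
    e_peV = scale * a / 1.602e-19 * 1e12        # level energy, peV
    z_um = scale * a / (M_N * G) * 1e6          # classical turning height, µm
    print(f"n={n}: E = {e_peV:.2f} peV, height = {z_um:.1f} um")
```

This reproduces the familiar result that the lowest level sits at roughly a pico-electron-volt, with a classical turning height of order ten microns, which is why the absorber in the experiment has to be positioned with micron precision.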

OZ: Physicists from Vienna and the ILL have now even started to do spectroscopy of these gravitational states. You can think of the neutron on a mirror being much like an atom that is being excited using light – shifting the atom from one electronic state to another. Similarly, jumps between different gravitational quantum states of the neutron are induced by vibrating the mirror and the energy quanta involved in these transitions are detected as resonances that occur at certain vibrational frequencies. The success of this experiment gives us hope that we will be able to use spectroscopy to check Newton’s gravity law at micron distances.

If we found a deviation, what would that tell us?
OZ: It is not understood why gravity is so weak compared with the other known fundamental interactions. On the other hand, there are now many theoretical models predicting the existence of “extra” dimensions – in addition to the space and time of our 4D universe. Quite generically, in such scenarios gravity would become much stronger at short enough distances where these extra dimensions would manifest themselves. A deviation from Newton’s law on the micron scale found with spectroscopy of neutrons close to a mirror surface would indeed be a spectacular support of such theoretical ideas.

What are neutron oscillations and how could UCNs shed light on them?

OZ: An example is the oscillation of a neutron into an antineutron. This is a process forbidden by the Standard Model of particle physics and is therefore a very interesting testing ground to search for new physics. A strong limit on the oscillation period of three years was set at the ILL more than 20 years ago in an experiment using cold neutrons. With the advent of new UCN sources there is some hope to either push this limit further or – even better – discover a neutron–antineutron oscillation using UCNs.

I’ve heard about something called the mirror neutron – what’s that?
It’s a hypothetical particle proposed by theorists as a viable candidate for dark matter – the “missing matter” that we know must exist in our universe. Like dark matter, we would feel only the gravitational effects of mirror neutrons. The basic idea is that a “mirror world” is immersed in our own, in which particles behave the same way apart from the handedness of the weak interaction: in the mirror world, weak interactions would be right-handed rather than left-handed. A neutron might oscillate into its mirror partner, and such an effect can be examined very well using UCNs. Indeed, the first experiments to look for it have been carried out at ILL.

What are your next plans?
OZ: We are currently building a new apparatus to significantly improve the knowledge of the neutron lifetime, which is a very important ingredient for understanding how the light chemical elements formed after the Big Bang. We will use a magnetic trapping technique where the neutrons never touch the walls of the container so cannot get lost through such contact, which has been a major difficulty in past experiments.

There is also a very interesting theoretical prediction of a nonlinear effect due to fluctuations in the quantum-electrodynamic vacuum. As you know, the vacuum is far from being “empty” – virtual electron–positron pairs constantly appearing and vanishing make it resemble a medium that can be polarized. Theorists have predicted that a neutron moving in the strong electric field close to a heavy atomic nucleus would develop an induced electric dipole field. We hope to be able to demonstrate this astonishing nonlinear effect in a neutron-scattering experiment. If successful, it would provide the first observation of a violation of the superposition principle for classical electric and magnetic fields. We hope to be able to report good news on this soon.

PG: For me, an interesting project is the search for an electric dipole moment of the neutron. The Standard Model of particle physics predicts that this dipole moment is extremely small. However, if it is bigger than expected, that could point towards new physics that could explain why there is much more matter than antimatter in the universe.

This idea was proposed by Norman Ramsey and Edward Purcell in 1950 and the first experiment putting an upper limit on the dipole moment was done at Oak Ridge National Laboratory a year later. In the last 60 years we have been trying to measure what is essentially zero with ever higher precision. There is a nice way to look at the problem: imagine the neutron as blown up to the size of the Earth. The current experimental sensitivity of the electric dipole moment would then correspond to a separation of two elementary electric charges in this Earth-sized object by only 3 µm.
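The Earth-sized-neutron analogy can be checked with a back-of-envelope scaling. The values below are illustrative assumptions (a sensitivity of order 3 × 10⁻²⁶ e·cm and a neutron radius of ~0.7 fm), not quoted experimental numbers:

```python
# Back-of-envelope check of the "neutron blown up to Earth size" analogy.
# Illustrative assumed values:
EDM_LIMIT = 3e-26         # e*cm: equivalent charge separation, in cm
NEUTRON_RADIUS = 0.7e-15  # m, approximate neutron charge radius
EARTH_RADIUS = 6.37e6     # m

separation = EDM_LIMIT * 1e-2                           # m, neutron scale
magnified = separation * EARTH_RADIUS / NEUTRON_RADIUS  # m, Earth scale
print(f"{magnified * 1e6:.1f} micrometres")             # a few micrometres
```

Magnifying a ~10⁻²⁸ m charge separation by the ratio of the Earth's radius to the neutron's indeed lands in the micron range quoted above.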

What is the point of art–science collaborations?

By James Dacey


CERN has recently announced that it will be opening its doors to artists. Those working in different art forms will have the opportunity to take up a funded residency of up to three months, during which they will work alongside CERN researchers. The artists will then produce works based on their experiences, to be exhibited at CERN and other locations. In the same statement, CERN revealed that details of a separate residency scheme, coupling scientists with dance and performance artists, will be unveiled in November.

We are interested to know what you think of this kind of scheme. In the latest poll on the Physics World Facebook page, we ask the following question: Which of these statements best describes your opinion of art–science collaborations?

a) I love them, they’re fantastic
b) Hmm, some are great, some are not
c) They can be ok but often I don’t ‘get’ the point
d) Who cares? They’re a total waste of time

Take part by visiting our Facebook page. And please feel free to post a comment on the poll to describe your personal experiences of art–science collaborations.

In last week’s poll we celebrated the 100th anniversary of the publication of Rutherford’s seminal paper on the structure of the atom. Rutherford was an industrious researcher who made many remarkable contributions to science, including three discoveries that revolutionized our view of matter. So we asked people to choose which of Rutherford’s discoveries they think was the greatest:

a) That atoms are not always stable (his Nobel-prize-winning work on radioactivity)
b) That atoms have the majority of their mass concentrated in a nucleus
c) The world’s first alchemy (converting nitrogen into oxygen)

The results were fairly conclusive with 76% of respondents believing that Rutherford’s discovery of the atomic nucleus was indeed his greatest contribution to science. 17% opted for his work on radioactivity and just 7% went for his later work on transmutation. The general sentiment was captured nicely in this comment from one of the respondents, Pradeep Sharma, who said: “Rutherford’s most important discovery is undoubtedly the nucleus of the atom. He is one of the rare breed of scientists who did his most important work after getting the Nobel prize.”

CRESST uncovers hint of dark matter

Physicists are finally closing in on dark matter, the elusive substance thought to make up most of the matter in the universe. Either that, or they’re being misled by some unknown source of error.

That seemed to be the general idea, after the team behind the CRESST (or the Cryogenic Rare Event Search with Superconducting Thermometers) experiment in Italy announced on Tuesday that it had uncovered signals that could be interpreted as dark matter. These signals join possible traces of dark matter seen by two other direct-detection experiments in recent years, which suggests that evidence is mounting. The trouble is, physicists cannot agree whether the different signals match up or not.

“There is no consensus – which is good, I think,” says Rafael Lang, a physicist at Columbia University, New York, and member of another dark-matter experiment called XENON. “My personal feeling is that we are not understanding the detectors well enough. At the same time, I think that if any of these [signals] were dark matter, that would be fantastic … It would be ‘easy’ dark matter, one that likes to interact with us and tell us where it’s coming from, what it’s doing.”

Tricky search

Dark matter is thought to make up more than 80% of the universe’s matter. However, it is invisible and has so far only been inferred by the gravitational pull that it exerts on normal matter. Physicists think that it probably takes the form of weakly interacting massive particles, or WIMPs.

To spot these WIMPs directly, researchers have built detectors in underground labs where the low background noise ought to allow any signals to stand out. The biggest underground lab is at Gran Sasso, a mountain in central Italy, home to various dark-matter experiments such as DAMA, XENON and CRESST. For just over a decade, the team behind the DAMA experiment has claimed to see a WIMP signal – an annual modulation that would fit in with the Earth orbiting with and against the prevailing “wind” of dark matter in our galaxy. Last year, the CoGeNT collaboration, based in the Soudan mine in Minnesota, US, reported hundreds of blips in their detectors that could also be WIMPs.

Such dark-matter detectors work when a WIMP collides with an atomic nucleus, which then recoils, producing a trademark scintillation, or flash. CRESST has an advantage here, in that its detectors use three types of nuclei – calcium, tungsten and oxygen – bonded together in calcium tungstate. Each of these nuclei has a different mass, which effectively means that dark matter can be probed in three different ways. For instance, if the recoil comes from a tungsten nucleus – the heaviest of the three – that probably means the WIMP itself is heavy.

Speaking at a conference in Munich on Tuesday, the CRESST team announced that they ran their latest experiment from June 2009 to April 2011, using 2.4 kg of calcium tungstate. They recorded 67 WIMP-like signals, of which roughly half could not be explained by any background phenomena. A preprint of their analysis is available at the arXiv preprint server.

Background to blame?

The question is whether the signal from CRESST, which points to a relatively light WIMP, can be reconciled with results from other direct-detection experiments. DAMA and CoGeNT have both recorded positive signals, but not for WIMPs with the same range of properties. Worse, the CRESST signal suggests a WIMP with properties that had previously been ruled out by experiments such as XENON and CDMS, the latter of which is based at the Soudan mine.

“It is clear that it is difficult to reconcile the results from CDMS, XENON, CRESST and other dark-matter experiments with a single, simple dark-matter interpretation,” says Jodi Cooley, a physicist at the Southern Methodist University in Texas, US, who works on the CDMS experiment. “So, that leaves one of two possibilities. Either dark matter is behaving in a very strange way that we do not understand, or the backgrounds in the CRESST experiment are not well enough understood. To me, these results underline the need to have experiments that are capable of operating in a mode where background subtraction is not necessary.”

Not everyone agrees. The properties of a detected WIMP are estimates, liable to change with varying assumptions about the equipment used. This leads some physicists to believe that the positive results can be reconciled.

Dan Hooper of the Fermi National Accelerator Laboratory near Chicago, US, is one such physicist. “The new results from CRESST are indeed very exciting,” he says. “They seem to be roughly compatible with previous signals reported by the DAMA and CoGeNT collaborations. The null results from CDMS and XENON do introduce some tension into this interpretation, although I am of the opinion that a self-consistent picture could come out of this complicated situation.” Members of the CRESST team could not be reached for comment.

More experience necessary

It will take time to understand whether the CRESST signals do, indeed, signify WIMP collisions and this requires a better understanding of the detector equipment itself. But the experimentalists are hopeful.

“[The CRESST signal] looks to me to be consistent with an ‘additional’ background,” says Alex Murphy, a physicist at the University of Edinburgh who works on the ZEPLIN dark-matter experiment in Boulby Mine, UK. “At the sensitivity level that the latest detectors are reaching, extremely rare interactions, involving multiple scattering, pile up, partially dead regions of detectors, edge effects and combinations of all of these, begin to be important. Even backgrounds have backgrounds!”

Become a Wikipedian!

By Louise Mayor

[Image: laser beam pointing towards the Milky Way’s centre (ESO/Yuri Beletsky)]

If you’re online right now, chances are you’ve recently visited the fifth most visited site on the Web: Wikipedia.

I am happy to admit that I use Wikipedia frequently and find it very useful – particularly for physics. It’s great when I want an introduction to a phenomenon or technique, or to get the cogs going again on something I learned long ago at university.

However, I do remember a time when using Wikipedia was a bit more hit and miss. It was pot luck whether an article would be either well written and accessible, or an impenetrable wall of techno-speak and equations.

Now, thanks to more than a billion edits since Wikipedia’s inception, the odds of finding a well-written article are much higher and article quality continues to improve every day.

But there’s still a long way to go before the site’s eventual goal is achieved: to assemble a complete overview of human knowledge. And this is where you come in. Yes, you! With a lay or professional interest in physics, you are ideally placed to contribute.

According to Martin Poulter, a new media manager at the University of Bristol, and Mike Peel, an astrophysicist at the University of Manchester, it is rewarding work. In “Physics on Wikipedia”, an article published this month in Physics World, Poulter and Peel argue that if you have knowledge you can share, Wikipedia needs you.

Also, how about images you can share? You may be ideally placed, for example, to capture photographs of things the public would not normally be able to see, such as pieces of equipment or research facilities. The image at the top of this blog entry (by ESO/Yuri Beletsky, http://www.eso.org/public/images/potw1036a/, CC-BY-3.0, via Wikimedia Commons) is a great example of this, and was picture of the year 2010 on Wikimedia Commons, an online repository where you can upload your images for free use.

Read “Physics on Wikipedia” now to find out why you should click that edit button.

How Rutherford shaped nuclear physics

This year is the 100th anniversary of Ernest Rutherford publishing his seminal paper describing the discovery of the atomic nucleus. The New-Zealand-born physicist reached this profound insight after his landmark alpha-particle scattering experiments carried out at the University of Manchester. To mark the centenary, the university hosted a special week-long conference in August, organized by the UK’s Institute of Physics, which publishes Physics World.

Physics World‘s multimedia team interviewed a number of invited speakers at the conference to find out how Rutherford’s discovery had inspired their own fields of research. Michael Pennington of the Jefferson Laboratory in Virginia, for instance, describes how physicists came to realize over the past 100 years that protons and neutrons are themselves divisible into smaller subatomic particles. In his own research, Pennington is concerned with the manner in which quarks and gluons interact inside hadrons. “The idea is to understand the nature of the colour force – how quarks are bound together by gluons to make protons and neutrons and all of nuclear matter,” he tells Physics World.

Another speaker, Hendrik Schatz of Michigan State University, is interested in how nuclear processes can explain astrophysical phenomena. Where he works at the National Superconducting Cyclotron Laboratory, Schatz recreates the explosive hydrogen burning that takes place on the surface of neutron stars. By studying the data from these experiments, Schatz is trying to understand the types of nuclear matter that exist within neutron stars and the physical properties of these stars.

In addition to these vox pops you can also watch a short film about the Rutherford Centennial Conference.

Physics on Wikipedia

In a 1992 article for Physics World, Tim Berners-Lee wrote about the difficulties of managing the explosion of information available through his new invention, the World Wide Web. He saw that while easy, global online publishing would bring many benefits, users would be overwhelmed by huge numbers of documents. Electronic publication was also blurring the traditional lines between academic work and personal opinions. Users would need overviews of each area of knowledge, with reviews to help them assess the reliability of what they read. This would have to happen on a network with no central control. In short, what people really needed was a Web of knowledge, not just of information.

Berners-Lee saw that the Web needed an encyclopedia. This would be “An attempt by the knowledgeable, the learned societies or anyone else, to represent the state of the art in their field. [It] will be a living document, as up to date as it can be, instantly accessible at any time.” Nearly 20 years on, it is time to revisit the idea of organizing the world’s knowledge. Wikipedia, the free online encyclopedia that anyone can edit, has become the fifth most visited site on the Web, with nearly half a billion visitors per month. It fits Berners-Lee’s description, but is still (and always will be) a work in progress.

This article looks at two developments. Having established itself as the largest reference work ever created, Wikipedia is looking to be ever more reliable and detailed in what it covers. This involves collaborating with scholarly communities, including the Institute of Physics (which publishes Physics World), and individual educators or researchers. If you want to inform and excite the public about the techniques and discoveries of your favourite area of physics, Wikipedia is a way to reach the greatest audience.

Additionally, although a great many articles are incomplete, this is increasingly being seen as an educational opportunity. Some university courses have started to assign students the task of improving Wikipedia articles. This process encourages some very good habits, such as proper sourcing of statements and respectful collaboration with others. The same opportunity is open to anybody with the right skills. If you can look up facts; summarize, structure or illustrate them; and make them understandable to other people, then Wikipedia needs you.

Free for all

Wikipedia is the best known of nine online projects run by the Wikimedia Foundation, a US-based charity. Each of them serves a different educational or reference need. For example, people can look up Richard Feynman’s witticisms on Wikiquote or define “attophysics” on Wiktionary. Each project is multilingual: Wikipedia itself is currently being written in 270 different languages. As the names suggest, all of these projects are based on a type of software called a “wiki”. This gives each page an edit button so that a site’s readers can rapidly make changes.

It is an unusual publishing enterprise, not least because it depends on volunteer labour. The roughly 100,000 regular contributors all work for no pay, because they believe in the shared goal of creating “a world in which every single human being on the planet is given free access to the sum of all human knowledge”. As well as the writing, all of the editing and reviewing is done collaboratively by volunteers.

The projects also differ from traditional publishing in that they provide free content. This is “free” not just in the sense of “for no money” but also in the sense of free speech. All the content is available under “copyleft” licences that guarantee the users’ right to copy, modify and redistribute, given certain conditions. So while most sites on the Web would not be happy with you taking their images or video for your own site or publication, anyone can reuse content from the Wikimedia projects, so long as they obey the licence conditions. These vary, but usually involve fully crediting the original source. The entire text of Wikipedia, and the software it runs on, can be taken and copied onto other media, again so long as the original source is credited.

Wikipedia does not accept original research; that has to be published and critically examined in the usual way in peer-reviewed journals. The function of Wikipedia (or any encyclopedia) is to give overviews of subjects in language that a layperson can understand. Different readers want different amounts of information, and Wikipedia‘s structure reflects this. Someone who wants a quick overview of what is known about stars can read the summary paragraphs at the top of the “Star” article. If they want more detail, they can read the full article and follow links to sub-articles such as “Stellar evolution”, “Neutron star” and then even to specific noteworthy neutron stars. All of these articles cite published sources, so that readers can check the facts for themselves. In this way, the encyclopedia serves a “pre-research” function, satisfying the curiosity of laypeople while driving experts to the most relevant sources for a specific topic.

Subject matter

Wikipedia‘s 3.7 million articles in English are backed by other pages such as policies, guidelines, user profiles and noticeboards. Though not part of the encyclopedia, these are also open for the public to view and edit. These pages include WikiProject Physics, a shared space for contributors – some expert, some amateur – who improve articles on the subject. On the WikiProject Physics discussion pages, they review articles, share tasks and ask for advice.

The wikiproject reviews and monitors about 14,000 articles with some relevance to physics. The 80 most popular of these are each getting more than a million hits per year. All you can assume about the readers of English Wikipedia is that they know English – not necessarily at a high level – so a lot of effort is put into making the encyclopedia not just technically correct but also accessible to people with no background in physics. Sometimes this takes the form of dedicated overview articles such as “Introduction to special relativity”. The extreme end of this drive for accessible explanations is the Simple English Wikipedia, written in a restricted vocabulary for learners of English around the world. Roughly 150 of its articles are on physics, including heroic attempts to explain terms such as “frame-dragging” and “negentropy”.

At their best, Wikipedia articles ignite curiosity as well as satisfying it. While being neutral and accurate, they also draw the reader in and show them why the topic is worth knowing about. That involves using not only engaging text, but also images, video clips, tables, equations and formulae, and of course references. The “free content” requirement means that Wikipedia cannot just take any image or video from the Internet – they need to be freely licensed or out of copyright. Photographs, diagrams, animations and video are all collected on, and curated by, its sister project Wikimedia Commons (see images at top of this article). Many of these files are uploaded by researchers, scientific bodies or educational projects. These files also include lecture videos – for example, the Massachusetts Institute of Technology has shared clips of Walter Lewin’s physics lectures.

Articles on a common theme can be organized into what is called a Wikipedia Book, which can be downloaded as a print-quality file. The books relating to physics include one on Isaac Newton and another about the Large Hadron Collider. As the overall quality of Wikipedia articles improves, this raises the prospect of highly specific textbooks customized to individual educational courses.

Drive for quality

WikiProject Physics, and the Wikipedia community generally, are working to drive articles to ever higher levels of quality. The site’s status as the largest ever encyclopedia is only a step towards the eventual goal of assembling a complete overview of human knowledge. Improvement takes the form of a repeating cycle of editing and reviewing. Whether a change remains or is undone by another editor often comes down to the quality of the sources. For example, if you add a fact that is cited to a peer-reviewed journal, that will probably remain, though anyone is welcome to improve the phrasing or structure. Statements based on a self-published source, such as a personal blog, are much more likely to be replaced or deleted.

Evaluating the quality of articles is one of the crucial activities going on behind the scenes of Wikipedia, and one of the ways it depends on respectful collaboration. There are multiple stages of review, at each of which the article is checked against specified criteria. Some criteria are straightforward, such as the correct capitalization of headings. Others require a good knowledge of the published works on a given subject, so that reviewers can assess whether an article is complete and well written. Although Wikipedia does not give any special status to qualified users, experts have a real practical advantage because of their knowledge of the scientific literature.

As articles are developed, they can be given various ratings. Most begin as a stub (figure 1a), which includes a definition and some basic facts but no references. A few ratings above this is the C-class article, an example of which is “Hydrogen spectral series”. This has a lot of sourced material and tables but is a list of facts rather than an accessible overview of the topic. An article that has been confirmed as well written, factually accurate, broad, neutral, stable and illustrated can be labelled a good article, an example of which is “Weak interaction”. Only about 0.5% of Wikipedia‘s content is at or above this level. The criteria for a featured article are even more demanding (figure 1b). This label is awarded to articles that are “professional, outstanding and thorough”, meeting a long list of style requirements. The 44 current featured articles in physics include “Supernova”, “Atom”, “General relativity”, “Uranium” and “Magnetosphere of Jupiter”. Articles at this level can appear on the front page of the site as “Today’s Featured Article”. This happened to “Gamma-ray burst” on 18 June 2011. As a result, five million people saw the one-paragraph summary on the front page, and 32,000 of them followed the link to the full article. This shows how, through Wikipedia, scientific ideas and theories can access a readership in one day that pages on many other sites would struggle to reach in a whole year.

There are many ways to get involved in improving Wikipedia (see box). Some of the biggest contributions are made not by people writing articles, but by research groups or archives that give permission for collections of images to be reused.

Educational projects

We in the Wikipedia community have set ourselves a truly enormous task: to write high-quality, accessible articles on every area of knowledge. The estimated 100 million person-hours achieved in the first decade are only a start. Meanwhile, educators in schools and universities want to create experiences that encourage critical thinking, collaboration, proper use of sources and other good scholarly habits. Perhaps these problems can solve each other if educators and Wikipedians work together.

That is the thinking behind Wikipedia educational assignments. Students or school classes work with each other and with the Wikipedia community to create or improve articles or specific items such as graphs, timelines or diagrams. The wiki software records their contributions and instructors assess their work. The first university assignments focused on articles about public policy. For example, the extensive article “Nuclear energy policy in the United States” was written by Kasey Baker, a graduate student, in collaboration with Wikipedia volunteers. This opportunity has now been thrown open to all subjects. Thousands of students, at universities on four continents, are now improving Wikipedia for course credits.

Students who are working on undergraduate projects, or are beginning postgraduate work, are the ideal people to benefit. They already have to create neutral, well-sourced reviews of the existing literature, which is just what Wikipedia needs. The site is also an excellent self-study opportunity. For someone who is no longer a student and wants to extend their knowledge, improving a Wikipedia article and submitting it for review is a way to get independent feedback.

The above-mentioned quality scale can be used to guide assessment of students’ work, although a more finely grained scale has been created specifically to help instructors. The students see their work being published online for a global audience, perhaps becoming more widely read than their own tutors. This exposure, and the prospect of their work being evaluated by strangers, can motivate students to work especially hard. One university professor remarked that this was the first assignment in which students took a keen interest in the library. A downside is that students might experience stage fright when taking their first steps. This can be handled by creating the first drafts elsewhere, perhaps in a virtual learning environment, and adding them to Wikipedia once they have been checked.

Wikipedia provides two kinds of help for educational assignments. Campus ambassadors are trained to work with staff and students in person, giving them the confidence to use the site. Online ambassadors are experienced Wikipedians – perhaps geographically remote – who monitor and support the students’ contributions. With this volunteer help, which is available if you get in touch with us, instructors and students can take part without having to learn every little detail of Wikipedia.

Wikipedia’s sister projects can also assist education. Wikibooks is another project that invites educational assignments from universities and schools. Unlike Wikipedia, it is not constrained to an encyclopedic style: it allows “how to” books or textbooks, so classes can be set the task of actually writing the textbook for their course. Wikiversity hosts a variety of educational resources and course pages.

In summary, Wikipedia and its related projects have an ambitious goal that we believe is worth pursuing in its own right. Berners-Lee saw that the Web was incomplete without a global, comprehensive, up-to-date encyclopedia. This is what the Wikipedia community is working to provide. If you have access to books about physics, you can help, even in a small way. We can testify that it is rewarding work that develops your knowledge of the subject, while potentially reaching an audience of millions. Clicking the edit button is just the first step.

Box: Get involved

In addition to writing and improving Wikipedia articles, there are many other ways to contribute.

  • Translate high-quality articles into other languages
  • Review articles and suggest improvements
  • Contribute to discussion pages and noticeboards
  • Share files such as images, animations and video clips
  • Meet up with other interested people to share tips
  • Keep an eye on articles and watch for attempts to vandalize them

If you have an idea that needs some money to get off the ground, get in touch with your local Wikimedia chapter, which in the UK is Wikimedia UK.

Young Earth was sprinkled with precious metals

Early Earth was showered in meteorites containing precious metals left over from the beginnings of the solar system, claims a group of researchers in the UK. This explains why Earth’s surface contains far more precious metals than is predicted from our understanding of how the Earth formed. These meteorite impacts may also have triggered the onset of large-scale convection cells in the Earth’s mantle – the driving force behind plate tectonics.

As Earth began to emerge from the protoplanetary disc surrounding the Sun some 4.56 billion years ago, its first 100 million years were marked by violence and turmoil. Collisions between the Earth and Moon- to Mars-sized objects caused widespread melting, creating magma oceans in which iron separated from the surrounding liquid silicate. The iron began to sink into the Earth’s interior, dragging with it metals such as gold and platinum-group elements, which have a high affinity for iron. Geologists believe that between five and ten million years into Earth’s history, the vast majority of these metals should have accumulated at the planet’s centre, forming the Earth’s core. Indeed, it has been estimated that there are enough precious metals in the core to cover the entire surface of the Earth with a 4 m-thick layer.
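That 4 m figure is easy to sanity-check with a back-of-the-envelope calculation. A quick sketch (assuming a mean Earth radius of 6371 km, which is not stated in the article):

```python
import math

R_EARTH = 6.371e6      # mean Earth radius in metres (assumed value)
LAYER_THICKNESS = 4.0  # metres, the figure quoted in the text

# Volume of a thin shell of precious metals covering the whole surface
surface_area = 4 * math.pi * R_EARTH ** 2       # ~5.1e14 m^2
layer_volume = surface_area * LAYER_THICKNESS   # ~2.0e15 m^3

print(f"Surface area: {surface_area:.2e} m^2")
print(f"Volume of a 4 m layer: {layer_volume:.2e} m^3")
```

A couple of thousand cubic kilometres of metal sounds enormous, but it is a tiny fraction of the core's total volume, which is why the estimate is plausible.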

But researchers have been puzzled to discover that these “iron-loving” elements exist at the Earth’s surface, and inside its molten mantle, in much higher quantities than predicted by models and experiments. One proposed explanation is that these extra precious metals could have been delivered via meteorites that continued to crash into the Earth for a long time after the core had become established. Proponents of this theory suggest that an initial period of large impacts – one of which may have created the Moon – was followed by a sustained spell of smaller impacts lasting for 500 million years. This period terminated with a final bombardment around 4–3.8 billion years ago that could have enriched the mantle with a “late veneer” of precious metals.

‘Late veneer’

In a new study, a group of researchers led by Matthias Willbold at the University of Bristol has added significant weight to this theory after analysing some of the oldest known rocks on Earth – the 3.8-billion-year-old Isua belt in Greenland. These rocks have survived so long without being recycled by Earth’s tectonic processes because they were deposited into the heart of a young continent, away from continental subduction zones. Willbold and colleagues realized that because some of these rocks predate the end of the final bombardment of meteorites, they contain a geological record of the mantle at this time. By comparing the chemical composition of these rocks with younger rocks, the researchers set out to discover whether the Greenland rocks contained evidence to support the late-veneer theory.

“We still do not have a clear picture of how the Earth formed and what processes took place early on in its history,” Willbold tells physicsworld.com. “We don’t have a well-preserved geological record from this time, so we have to use isotopic and chemical traces, like in this study.”

In their analysis, the researchers looked at the relative abundance of a particular tungsten isotope, tungsten-182, which is generated by the decay of hafnium-182. As the core formed, tungsten gradually sank towards it because of its affinity for iron – an affinity shared by all of the precious metals – but the hafnium remaining in the mantle continued to decay, enriching the mantle with tungsten-182. Willbold and colleagues showed that the ancient Greenland rocks from this time are, indeed, enriched in tungsten-182 by 13 parts per million when compared with younger rocks.
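The enrichment quoted here is a relative deviation of the tungsten isotope ratio from a modern standard, conventionally reported in parts per million. A minimal sketch of that bookkeeping, using illustrative ratio values rather than the team's measured data:

```python
def mu_182w(ratio_sample: float, ratio_standard: float) -> float:
    """Relative deviation of the 182W/184W ratio, in parts per million."""
    return (ratio_sample / ratio_standard - 1.0) * 1e6

# Illustrative numbers only (not measured data): a sample ratio
# 13 ppm above the standard reproduces the enrichment quoted above.
standard = 0.864700
sample = standard * (1 + 13e-6)

print(f"mu_182W = {mu_182w(sample, standard):.1f} ppm")
```

Deviations this small are why such measurements demand extremely precise mass spectrometry.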

Primitive space rocks

The researchers attribute this finding to the fact that younger mantle rocks contain a small contribution from the space rocks that hit Earth in the final meteorite bombardment. Some of these meteorites, known as chondrites, consist of primitive material that contains relatively low quantities of tungsten-182 and higher amounts of precious metals. This terminal bombardment lowered the relative tungsten-182 content of the ancient Earth down to its low present-day level, and replenished the mantle with precious metals.

Thorsten Kleine, a geoscientist at the University of Münster in Germany who specializes in the formation of terrestrial planets, believes that the new findings strongly support the late-veneer hypothesis. “The new data prove that there have been late additions of meteoritic material to the Earth’s mantle, because we don’t know of any other way to produce the observed tungsten isotope variations,” he says.

In addition to explaining the distribution of precious metals within the Earth, these meteorite impacts could also explain how large-scale convection cells became established within the mantle. These rising and falling columns of magma are the driving force behind plate tectonics, causing crust to be created where magma upwells at mid-ocean ridges, and to be destroyed at subduction zones where colder material sinks deep into the Earth’s interior.

Willbold and colleagues speculate that the high energies released by those meteorite impacts would have melted the crust, creating kilometre-wide melt ponds. “At the bottom of these magma lakes, dense and heavy rocks would have formed when these magma lakes crystallized,” explains Willbold. Eventually, these heavy bottom layers would have sunk into the mantle and could have provided the thermodynamic disturbance to initiate a convection cell.

This research is described in a paper published in this week’s Nature.

Special report: China

By Matin Durrani

Ever since Physics World was launched by the Institute of Physics in 1988, we have sought to report on physics wherever it is going on in the world – first in our print magazine and then here on this website.

But with the huge recent expansion of physics research in China – which has included everything from next-generation telescopes to powerful synchrotron and neutron sources – we have, in turn, massively increased our coverage of the country’s physics.

We’ve now put together a new Physics World special report, which you can view online here, that draws together a selection of our recent news stories, features and careers articles published about physics in China.

Among the highlights is an exclusive interview with the chief scientist of China’s lunar programme and a profile of the Kavli Institute for Astronomy and Astrophysics in Beijing.

There’s also an analysis of the massive growth in scientific papers produced by researchers in China, who – at least, according to a recent study by the UK’s Royal Society – now publish more than 10% of the world’s total. Sadly, the rise in quantity has not always been accompanied by a rise in quality, with some unfortunate though high-profile examples of plagiarism.

Several of the articles are based on a week-long trip to Beijing that I went on earlier this year – it was my first visit to China and I found the country a fascinating place. I hope you find this special report equally stimulating and please do let me have your comments by e-mailing pwld@iop.org.

View the special report now!

Calcium ions simulate the quantum world

The first digital “quantum simulator” based on trapped ions has been built by physicists in Austria. The system, developed by Ben Lanyon and colleagues at the University of Innsbruck, comprises a number of trapped calcium ions that are manipulated using sequences of laser pulses. The team has used the system to simulate the time-evolution of several multi-particle systems.

A quantum simulator uses one quantum system to simulate the behaviour of another, less accessible system. For example, by carefully manipulating the laser light and magnetic fields trapping an ensemble of ultracold atoms, researchers can control the interactions between atoms – and therefore simulate interactions that occur between electrons in solids. But unlike in solids, the strength of these interactions can be easily adjusted, allowing physicists to test theories of condensed-matter physics.

Analogue to digital

Most quantum simulators are “analogue” in the sense that the interactions between the trapped atoms are directly analogous to those between electrons. A digital quantum simulator, in contrast, contains an ensemble of interacting quantum particles that act as quantum bits (qubits) and can be used to create quantum logic gates. The quantum system to be simulated is encoded into the qubits, and its behaviour is determined by performing a quantum calculation.

Unlike analogue simulators, which address specific systems, a digital simulator could be used to study a wide range of quantum systems. Furthermore, digital simulators can benefit from error-correction schemes, which means that physicists can be more confident in their results.

But while researchers have had some success creating digital quantum simulators using nuclear magnetic resonance (NMR) techniques, these work with just two or three qubits and it has proven difficult to scale up to the 40 or so qubits needed to do a useful quantum simulation. The new trapped-ion quantum simulator created by Lanyon and colleagues means that it should, in principle, be much easier to scale up such a system to do useful simulations.

Easily scalable

The team’s experiments begin with a small number of calcium ions (a maximum of six) that are lined up in a row in an electromagnetic trap. Each ion can exist in two electronic states – “0” and “1” – and can therefore act as a qubit. Interactions between individual ions can be controlled by firing carefully selected laser pulses at the trapped ions.

A calculation begins by putting the ions into a specific quantum state. In an experiment involving four ions, for example, each qubit was given the value “1”. A series of laser pulses was then fired at the ions, causing them to interact with each other and creating a sequence of logic gates that process the quantum information held in the initial state.

It is this sequence that simulates the interactions that occur in a real (or imagined) quantum system. In this particular example, the qubits were used to simulate four spin-1/2 particles in which the spin of each particle can interact with the three other particles.

Approximate solutions

Lanyon and colleagues were interested in calculating the time evolution of the spins, which is particularly difficult to do using a classical computer. To do this, the team implemented the “Trotter approximation” on their system, firing a series of pulses that simulates the evolution of the system over a certain period of time before the values of the qubits are read out. The system is then reset and an identical simulation is repeated many times to obtain average values for the qubits – which gives an approximate solution to the problem being simulated.
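For readers curious about what Trotterization actually does, here is a minimal classical sketch (not the team's actual pulse sequence): four spins with all-to-all Ising couplings plus a transverse field, evolved both exactly and by repeated first-order Trotter steps. The Hamiltonian and coupling strengths are illustrative assumptions, not the experiment's parameters.

```python
import numpy as np

# Pauli matrices and helpers
I2 = np.eye(2, dtype=complex)
SX = np.array([[0, 1], [1, 0]], dtype=complex)
SZ = np.array([[1, 0], [0, -1]], dtype=complex)

def op_on(single, site, n):
    """Embed a single-qubit operator on `site` in an n-qubit register."""
    out = np.array([[1.0 + 0j]])
    for k in range(n):
        out = np.kron(out, single if k == site else I2)
    return out

def expmh(H, t):
    """exp(-i H t) for Hermitian H, via eigendecomposition."""
    evals, evecs = np.linalg.eigh(H)
    return evecs @ np.diag(np.exp(-1j * evals * t)) @ evecs.conj().T

N = 4             # four spin-1/2 particles, as in the experiment
J, B = 1.0, 1.0   # illustrative coupling strengths

# All-to-all Ising coupling plus a transverse field
H_zz = sum(J * op_on(SZ, i, N) @ op_on(SZ, j, N)
           for i in range(N) for j in range(i + 1, N))
H_x = sum(B * op_on(SX, i, N) for i in range(N))
H = H_zz + H_x

t, steps = 1.0, 200
dt = t / steps

# One first-order Trotter step: evolve under each term separately
U_step = expmh(H_zz, dt) @ expmh(H_x, dt)
U_trotter = np.linalg.matrix_power(U_step, steps)
U_exact = expmh(H, t)

psi0 = np.zeros(2 ** N, dtype=complex)
psi0[-1] = 1.0  # all qubits in state "1", as in the four-ion experiment

fidelity = abs(np.vdot(U_exact @ psi0, U_trotter @ psi0)) ** 2
print(f"Trotter fidelity after {steps} steps: {fidelity:.6f}")
```

Increasing the number of steps shrinks the splitting error, which mirrors the trade-off the trapped-ion experiment faces: more Trotter steps mean more gates, and hence more accumulated gate error.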

This entire process is then repeated to simulate a number of different time periods, building up a map of the time evolution of the spins. Time-evolution simulations were done for as many as six trapped-ion qubits, involving up to 100 quantum gates.

Towards quantum chemistry

“Six qubits and 100 gates for quantum simulation is a feat that paves the way for more complex and rich digital quantum simulations in the future,” says Alán Aspuru-Guzik of Harvard University in the US. “What Ben Lanyon and his colleagues did was to implement one of the most important building blocks for quantum simulation, what we call a ‘Trotter step’ in a generic or universal sense. This is one of the required and essential building blocks to do exact quantum chemistry on quantum computers, when they become powerful enough.”

Lanyon told physicsworld.com that his team’s next challenge is to perform the simulations with 10 or more ions. Creating such a system is not the problem – the team has already trapped and entangled as many as 14 ions. However, performing large numbers of operations on the ions is tricky because the qubits tend to lose their quantum nature over time as they interact with their surroundings.

The work is described in Science 10.1126/science.1208001

The end of astronomy’s golden age?

The James Webb Space Telescope (JWST) – the planned successor to the Hubble Space Telescope – is in serious trouble. The most powerful body in the US House of Representatives – the Appropriations Committee – has adopted legislation that would specifically terminate the JWST, as part of a drastic reduction of NASA’s 2012 budget by $1.9bn to $16.8bn. The cancellation is not expected to be included in the US Senate’s separate spending bill, which is to be released this month. However, the negotiations for a compromise bill between the House and the Senate would entail substantial risk for the JWST under “normal” circumstances. In the current political climate, the risks are even greater – early autumn will be a critical time for the JWST.

Hubble is arguably the most widely recognized scientific mission of all time, delivering thousands of images of the cosmos. The JWST will be immensely more productive and powerful than Hubble thanks to its much larger primary mirror and its hugely advanced technology. The JWST will continue the tradition of NASA’s “great observatories” – Hubble, the Chandra X-ray observatory and the Spitzer infrared telescope – by shedding light on the first galaxies and stars, as well as studying exoplanets for evidence of water and other molecules related to life. No existing or planned telescope, on the ground or in space, could do the amazing science that the JWST would do.

Lost generation

So far, $3.5bn has been spent on the JWST, with excellent technical progress being made. Roughly 75% of the hardware for the mission has been delivered or is in the final fabrication stage. Indeed, in June polishing of the JWST’s 18 beryllium mirrors was completed and their cryogenic performance is now being tested. The mirrors are so smooth that if each were the size of the US, the typical surface ripples would be just 5 cm high. Unfortunately, all this good news has not percolated as widely as it should have to government, and we are now in the invidious situation of possibly losing the most powerful observatory ever conceived. In doing so we would not only harm US astronomy but also seriously damage our collaborations with Europe and Canada, which have already spent hundreds of millions of dollars on the JWST programme.
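The mirror-smoothness analogy amounts to a simple ratio. A quick sketch (the ~4500 km continental-US width and ~1.3 m mirror-segment size are assumed round numbers, not figures from the article):

```python
US_WIDTH = 4.5e6        # continental US width in metres (assumed round number)
RIPPLE_SCALED = 0.05    # 5 cm ripples at US scale, as quoted in the text
MIRROR_SEGMENT = 1.3    # JWST mirror-segment size in metres (assumed)

# Scale the US-sized analogy back down to a real mirror segment
ratio = RIPPLE_SCALED / US_WIDTH
ripple_actual = ratio * MIRROR_SEGMENT  # of order tens of nanometres

print(f"Implied surface ripple: {ripple_actual * 1e9:.1f} nm")
```

The implied tolerance, on the order of tens of nanometres, is a fraction of the infrared wavelengths the JWST will observe, which is what "smooth" has to mean for a diffraction-limited telescope.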

The cancellation would spell a dangerous time for US science. The ability of NASA and the wider science community to establish and sell “flagship” missions to Congress would be seriously hampered in future. Given that the JWST was first proposed 22 years ago, it is likely that US astronomy would not recover for at least a decade or two, effectively losing a generation of scientific progress.

Management issues

The JWST took its first major step in 2000, when it was included as the top-ranked space-based mission in the “decadal survey” – where astronomers in the US identify the highest priority research activities in astronomy and astrophysics for the coming decade. The mission’s initial cost of around $1bn was, however, underestimated and substantial efforts were made to put the JWST on a firmer fiscal footing in the early 2000s. Then in 2005 US President George W Bush (and Congress) placed further stress on NASA’s budget by requiring that the space agency do “everything” with no additional funding. This included continuing with the space shuttle for five more years, replacing it with a new rocket capability, finishing the International Space Station and planning for missions to the Moon and Mars. Following the adoption of Bush’s plan, the science budget was cut, losing roughly $10bn since 2005. It was then very hard for spending on JWST to be ramped up in its most critical funding years.

Budgetary stress, however, was not the only issue. Progress of the JWST was hit by poor oversight and lack of independent validation, together with communication and leadership issues at NASA’s Goddard Space Flight Center and the Science Mission Directorate. Following concerns about schedule delays and cost overruns, the Independent Comprehensive Review Panel (ICRP) was set up in June 2010 to look into NASA’s management of the telescope.

Chaired by John Casani from NASA’s Jet Propulsion Laboratory, the ICRP concluded that while the JWST project had made considerable technical progress that was “commendable and often excellent”, there were concerns that NASA was not exercising adequate independent oversight and evaluation of project performance (see “Hubble successor hit by budget setback”). NASA has since implemented virtually all of the ICRP’s 22 recommendations and has released its revised plans for the JWST’s completion, setting a possible 2018 launch. However, the full cost of the mission has not been announced, although some estimates put it in the range $7–8bn, which is consistent with extrapolating the ICRP’s estimate of $6.5bn if the JWST was launched in late 2015.

The answer to why the JWST has problems is simple. Any and every project that involves new technology and is on the scale of the JWST will have difficulties. This is not bridge-building – nobody has made a telescope like the JWST before. If the JWST did not have problems, it would mean that we lacked ambition and were not utilizing our full technological potential. What must be done is to plan for resolving and fixing problems quickly. In my view, it is impossible to carry out a one-off project of this scale, with such challenging technologies, without adequate funding and substantial contingency to minimize the inevitable problems. The core issue for the JWST is that it did not have the contingency needed to address such problems immediately.

Counting the costs

If the House of Representatives’ budget for NASA becomes law and the JWST is cancelled, I would compare the damage to astronomy to that suffered by the high-energy physics community in the early 1990s when the Superconducting Super Collider was canned. There would be a similar devastating change in scientific opportunities, but with a very important difference – no other nation (or group of nations) could currently build a JWST. There is no plan B for the JWST as US particle physicists had with CERN’s Large Hadron Collider. China and Europe might eventually build a replacement for the JWST, but only far into the future.

Cancellation would also mean that more than a third of the long-term astronomy budget would disappear overnight. The full expectation and understanding has been that as the JWST is completed, its funding would revert to the astrophysics budget and be available for new missions. But with no JWST, those funds would not be available. The recommendations of the 2010 decadal survey could not be implemented, meaning no Wide-Field Infrared Space Telescope, no International X-ray Observatory, no Laser Interferometer Space Antenna and no significant post-Kepler exoplanet mission. Astrophysics would become one of the smaller programmes in NASA space science, similar to what heliophysics is now. Study of the entire universe would receive similar funding to that of one star – the Sun.

Once Hubble and Chandra die by the end of the decade, US space-based astronomy would consist of a series of small missions. This would be a devastating retreat from where we are now, and certainly from where we would be with the JWST. Astronomy research would lose more than $30m per year of funding for research students and postdocs at institutions across the US. While small missions have done great science, and will do so in the future, they are narrowly focused, infrequent and have a narrow research support base. They are far from being “observatories” that can respond quickly to new scientific issues and involve a broad swath of the international science community and its students and postdocs. If we are to continue the remarkable productivity and iconic visibility of NASA’s great missions such as Hubble, Chandra and Spitzer, then we must continue with the JWST. Scientists, and the public, need to have their voices heard if astronomy is to remain as dynamic and exciting as it is today. Amazing discoveries await when the JWST flies.

Copyright © 2026 by IOP Publishing Ltd and individual contributors