
physicsworld.com wins at the Online Media Awards

Me (James Dacey, right) receiving the award from Noel Young, one of the judges

By James Dacey

I was absolutely delighted to be in London on Thursday night on behalf of physicsworld.com, to receive the award for Best Specialist Site for Journalism at the Online Media Awards. It was humbling just to be shortlisted for the award alongside the likes of popular sports sites espn.com and foxsports.com – but to win was really fantastic!

The awards, which are sponsored by the Press Association, are said to “identify the best and boldest of online news-based creativity and also the most original.” Websites in the different categories were assessed by a panel of 11 judges based in a number of countries including the US, China and Australia.

Other winners included bbc.co.uk and theguardian.co.uk, which shared the award for Best Site For News-Led Journalism. And the biggest haul of the evening went to thesundaytimes.co.uk, which took six awards including Best Video Journalism and Best Campaigning/Investigative Journalism.

The award ceremony was held at the swanky Marriott Hotel in Kensington and the opening address was given by Gordon Young, editor of The Drum, the magazine that organized the event. “We believe that online media is really becoming a discipline on its own,” he said. “It’s important to establish an event that gets down to the business of comparing like with like and really celebrating and appreciating the very best of these skills.”

Flattering words indeed! I have to admit that I wasn’t entirely convinced that we’d win the prize, largely because we were up against some websites with far more general remits. But it’s great that a specialist site like ours can occasionally get recognition from the mainstream. Of course, it helps that it’s a really exciting time for physics right now. You’ve got particle physicists closing in on new physics at the Tevatron and the LHC. Then there are the exoplanet hunters, who seem to be discovering new alien worlds every other day. I could go on.

But online media – including videos and embedded audio clips – is bringing new opportunities for us to tell these exciting stories to new audiences in different ways. We’ve plunged headfirst into digital publishing in the last few years, having existed as a print publication for over two decades. Who knows how you will be able to digest Physics World content two decades from now. But whatever form it takes, I’m sure it will be the fascinating stories from the world of physics that will give the magazine its enduring appeal.

Solar wind sheds light on early solar system

In 2004 NASA’s Genesis space mission made an unplanned crash landing, damaging its precious cargo of solar-wind particles. Now, after years of painstaking work, two independent groups of scientists have managed to measure the relative abundances of nitrogen and oxygen isotopes in the solar wind. Their studies reveal that the isotopic compositions of these elements on Earth are very different from those in the Sun. The result could prove important for understanding conditions in the early solar system, when the Earth was forming.

While scientists know a great deal about the isotopic abundance of elements on the Earth, Moon and meteorites, very little is known about the Sun. Fortunately, the Sun spews out a steady stream of ions called the solar wind, which can be captured by spacecraft. Although mostly hydrogen, the wind does contain small amounts of heavier elements, and their isotopic composition is believed to be similar to that of the material from which the solar system formed.

Focusing the wind

Genesis collected the particles over about two years using a solar-wind concentrator, which uses electric fields to accelerate the oxygen and nitrogen ions and focus them onto a number of ultra-pure silicon-carbide targets. Despite the concentrator boosting the number of ions hitting the targets by a factor of 20, the concentrations of the isotopes in the targets were still very small when the mission returned to Earth, and would therefore require careful analysis.

But tragedy struck in 2004 when the mission’s sample-return capsule failed to deploy its parachute as it fell towards Earth. The capsule overheated and smashed into the ground, breaking open and shattering much of its contents, including many of the solar-wind targets.

What remained of the targets was contaminated by a range of materials (including a mysterious oily film) and scientists embarked on a painstaking process to clean the samples. This cleaning was difficult because the oxygen and nitrogen ions reside about 100 nm below the surface of the targets and could easily be scrubbed away.

Scanning the surface

Now, the silicon-carbide targets have been cleaned sufficiently to have their oxygen and nitrogen contents analysed. One study has been undertaken by Kevin McKeegan and colleagues at the University of California, Los Angeles, and other institutions in the US, UK and Japan. They used an instrument specially designed for Genesis called MegaSIMS, which is a secondary ion mass spectrometer coupled to an accelerator mass spectrometer. An important feature of the instrument is that it can analyse tiny regions of the sample about 2 µm across, in order to find regions of the surface that are not contaminated.

The team measured the abundances of oxygen-17, oxygen-18 and oxygen-16 in the samples. They found that the ratios of oxygen-17 to oxygen-18 are the same in the Sun and on Earth. However, they found that oxygen-16 is about 7% more abundant in the Sun than on Earth.

“We found that the Earth, the Moon, as well as Martian and other meteorites which are samples of asteroids, have a lower concentration of O-16 than the Sun,” said McKeegan. “The implication is that we did not form out of the same solar nebula materials that created the sun – just how and why remains to be discovered,” he added.
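To put such percentage differences on a common footing, cosmochemists usually quote isotope ratios in “delta” notation – the deviation of a sample’s ratio from a standard, in parts per thousand. Here is a minimal Python sketch of the conversion; the figures are illustrative only, not the teams’ published values.

def delta_permil(r_sample, r_standard):
    # delta = (R_sample / R_standard - 1) * 1000, in parts per thousand
    return (r_sample / r_standard - 1.0) * 1000.0

# Illustrative numbers only: if a heavy-to-light ratio such as 18O/16O is
# 7% lower in the solar wind than on Earth (because the Sun is relatively
# richer in oxygen-16), the corresponding delta value is about -70 per mil.
print(delta_permil(0.93, 1.00))  # -> -70.0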

Cleaned with ions

Meanwhile, Bernard Marty and colleagues at the University of Nancy in France, together with researchers at several US institutions, used a secondary ion mass spectrometer to measure the ratio of nitrogen-15 to nitrogen-14 in the solar wind. A target from Genesis was placed in the instrument and its surface was first cleaned using a low-energy beam of ions. Then the isotope concentration was measured as a function of depth in several regions about 10 µm in diameter.

The concentrations of the nitrogen isotopes peaked at about 80 nm into the target, which the team says is consistent with how Genesis captured the solar wind. The researchers found that the ratio of nitrogen-15 to nitrogen-14 is about 40% smaller in the solar wind than it is in Earth’s atmosphere.

Robert Clayton of the University of Chicago, who was not involved in either study, believes that the oxygen and nitrogen differences could be a result of the interaction between sunlight and the cloud of gas molecules present in the early solar system. In this process, called photolysis, carbon monoxide and molecular nitrogen are split into their constituent atoms.

Opaque gas cloud

The precise wavelength of light required to break down a molecule depends on its isotopic composition, explains Clayton. It turns out that the light needed to break down carbon monoxide molecules containing oxygen-16 does not travel very far through a gas cloud. This means that the interior of the cloud would have contained enhanced concentrations of atomic oxygen-17 and oxygen-18. These liberated atoms took part in the chemical reactions that led to the formation of the mineral dust that eventually became Earth and the inner planets. A similar process would also have enhanced the amount of nitrogen-15 in the dust, according to Clayton.

Clayton points out that the relative abundance of the nitrogen isotopes on Jupiter is similar to that of the solar wind, suggesting that isotope separation by photolysis only occurred in the inner solar system.

The research is described in two papers in Science.

Welcome to Australia


By Matin Durrani, Sydney, Australia

It’s a tough life, but someone had to do it.

I’m here in Sydney with three other European science journalists after accepting an invitation from Australia’s Department of Foreign Affairs and Trade to take part in a week-long fact-finding tour of the country on the theme of “science and innovation”.

We’re being introduced to a range of Australian scientists and later this week are flying to Perth, before being taken to the proposed site for the main component of the Square Kilometre Array (SKA). The SKA is an array of radio telescopes that will be built either in Australia and New Zealand or in southern Africa; the international astronomy community is set to choose between the two competing bids on 29 February 2012.

The Australian government has a regular programme of inviting journalists from around the world to help showcase the country’s efforts in a range of fields, not just science. You can’t blame them for making the effort. After all, Australia is just so far from the rest of the world – it’s a five-hour flight from Sydney to Perth alone – that a well-crafted programme of events is what’s needed to encourage busy journalists to give up their time to find out more.

So over the next few days I’ll be keeping you up to date with events Down Under. In the meantime, I hope you enjoy the photo I took from Circular Quay as an early Sunday-morning passenger ferry from Manly approaches, with the iconic Sydney Harbour Bridge in the background.

Australians have been moaning about all the poor weather they’ve been having in the last week or two, but all I can say is that having left the UK late last week, the Sydney winter seems as good as the summer I left behind.

I won’t make you jealous by showing what the beaches look like – oh, go on then (see below).

A Sydney beach

Nanoparticles play at being red blood cells

Nanoparticles disguised as red blood cells could be used to deliver anti-cancer drugs directly to a tumour. So say researchers at the University of California at San Diego, who have developed a new way of hiding drug-carrying nanoparticles from the immune system.

Drug delivery systems that mimic naturally occurring biological molecules seem to be the most efficient when it comes to delivering drugs to tumours. Such systems – usually based on nanoparticles – can also circulate in the body for extended periods of time without being rejected by the body’s immune system.

Cocktail of anti-cancer drugs

The new method, invented by Liangfang Zhang and colleagues, involves exploiting the cell membrane of a red blood cell. The researchers wrap the membrane around a biodegradable polymer nanoparticle around 100 nm across that has been filled with a cocktail of anti-cancer drugs. This is the first time that scientists have combined a natural cell membrane with a synthetic nanoparticle for drug-delivery applications, explained Zhang. “Such a nanoparticle platform will have little risk of immune response,” he said.

Such “stealth” nanoparticles, as they are called, have already been used with success in clinical cancer trials to deliver chemotherapy drugs. However, previous studies looked at nanoparticles coated with synthetic materials such as polyethylene glycol (PEG) – the current gold standard in the field. These coatings protect the drugs contained inside the nanoparticles; without such protection, the particles would rapidly trigger an immune response in the body. The coatings also allow the particles to circulate in the body for longer, giving them time to deliver their drug payload.

Cloak tricks the body

Zhang and colleagues’ strategy lies in another direction altogether – it involves using a naturally occurring membrane rather than a synthetic one. Such an approach avoids having to build a system that exactly mimics all the biological functions on the surface of a cell. A nanoparticle with a red blood cell membrane “cloak” tricks the body because it looks just like a real red blood cell to all intents and purposes, says Zhang.

Preliminary experiments by the team show that nanoparticles coated with a red blood cell membrane circulated safely in the bodies of laboratory mice for nearly 72 hours.

The researchers say that they would now like to be able to produce their biomimetic nanoparticles in larger quantities for future clinical trials. They also plan to add a targeting molecule to the red blood cell membranes so that the particles can seek out and bind to specific types of cancer cell.

The work was reported in PNAS.

Quarks break free at two trillion degrees

Physicists in the US, India and China have calculated that quarks and gluons can break free from their confinement inside protons and neutrons at a temperature of around two trillion kelvin – the temperature of the universe a fraction of a second after the Big Bang. The researchers arrived at this figure by combining the results of supercomputer calculations and heavy-ion collision experiments. They say that it puts our knowledge of quark matter on a firmer footing.

According to the Big Bang model, the very early universe was filled with “quark–gluon plasma”, in which quarks and gluons (the carriers of the strong nuclear force) existed as individual entities. The strong force between quarks increases rapidly with distance, which means that the quarks need large amounts of energy to remain free – and therefore the plasma can only exist at extremely high temperatures. When the cosmos was only about a millionth of a second old, it had cooled to the point where quarks and gluons combined to form composite particles such as protons and neutrons. Exactly what this temperature is, however, has not been easy to work out.

The theory of quantum chromodynamics (QCD) explains the interactions of quarks and gluons extremely well at the very small distances probed in collisions inside the Large Hadron Collider (LHC) at CERN in Geneva. But at the larger distances characteristic of the quark–gluon plasma, straightforward QCD calculations fail because it becomes impossible to account for all of the constituent interactions, which include many virtual quark–antiquark pairs. So physicists use an approximation of the theory known as lattice QCD, which keeps the complexity of quark–gluon interactions manageable by discretizing space–time into a finite grid of points.

Anchoring lattice QCD

Now Nu Xu of the Central China Normal University and the Lawrence Berkeley National Laboratory in California, together with colleagues, has anchored the value of one of the key parameters of lattice QCD. The team used results from the STAR detector at Brookhaven National Laboratory’s Relativistic Heavy Ion Collider (RHIC), which collides gold ions at high energies, to work out the temperature at which the quark–gluon plasma “condenses” to form individual hadrons.

Team member Bedangadas Mohanty of the Variable Energy Cyclotron Centre in Kolkata, India, explains that knowing this temperature helps to map out the phase diagram of QCD. This diagram charts the transition from normal hadronic matter to quark matter (or possibly to another exotic state known as “colour superconductivity”) as two variables are altered. These are the temperature and the “baryonic chemical potential”, the latter being the energy needed to remove or add a proton or neutron to the strongly interacting matter. He points out that thermodynamics can be used to work out how the temperature of water’s phase transitions varies with pressure, but that absolute values for these temperatures require the measurement of at least one fixed point within the phase diagram – say, the boiling point at atmospheric pressure. “Likewise,” he says, “in QCD we want to find out what is the temperature of the phase transition at zero chemical potential.”

Calculating susceptibilities

Xu and co-workers didn’t measure this temperature directly but derived it from theory and experiment. On the theoretical side, Sourendu Gupta and others at the Tata Institute of Fundamental Research in India calculated the first, second, third and fourth derivatives of the pressure with respect to the baryonic chemical potential, and then worked out how these “susceptibilities” should vary with temperature. Meanwhile, the experimental half of the collaboration counted how many more protons than antiprotons were produced in millions of collisions of gold ions at RHIC and plotted the event-by-event distribution of this measured quantity. At the quark–gluon plasma transition temperature, certain combinations of the theoretical susceptibilities should numerically equal particular quantities relating to the shape of the measured distributions. So, by varying the susceptibilities with temperature until they equalled the quantities derived from experiment, the researchers arrived at a value for the transition temperature.
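Schematically – using standard lattice conventions rather than the exact combinations employed in the paper – the matching works as follows. The lattice supplies the susceptibilities

\chi_n(T) = \frac{\partial^n (P/T^4)}{\partial (\mu_B/T)^n},

while the measured net-proton distribution, with mean M, variance \sigma^2 and skewness S, should satisfy

\frac{\sigma^2}{M} = \frac{\chi_2}{\chi_1}, \qquad S\sigma = \frac{\chi_3}{\chi_2}

at the transition, so T is varied until both sides agree.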

The value obtained by Xu’s team was 175 +1/–7 MeV, equivalent to about 2 × 10^12 K, which is exactly the value predicted by other indirect methods in lattice QCD. “This is the first time that there has been a direct comparison between high-temperature quark-matter theory and high-energy experiments,” says Mohanty. “People have predicted what the theoretical susceptibilities should be, but you need to compare these predictions with experiment to be sure that the theory is correct.”
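That unit conversion is just T = E/k_B with the Boltzmann constant – a quick sanity check in Python, not anything taken from the paper itself:

# Convert a temperature quoted in energy units (MeV) to kelvin via T = E / k_B
K_B_EV_PER_K = 8.617333262e-5     # Boltzmann constant in eV/K

def mev_to_kelvin(e_mev):
    return e_mev * 1.0e6 / K_B_EV_PER_K

print(f"{mev_to_kelvin(175):.2e} K")  # -> 2.03e+12 K, i.e. about 2 x 10^12 K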

Finding a critical point

The next step, adds Mohanty, is to measure a predicted critical point within the QCD phase diagram. At a critical point, a boundary between two phases comes to an end and the properties of the two phases become identical. There is a critical point for liquid water and steam, for example, and nuclear physicists believe that likewise there is one for normal and quark matter. Finding this critical point will involve carrying out heavy-ion collisions over a range of collision energies, something, says Mohanty, which RHIC is ideally suited to do. The LHC’s ALICE detector, on the other hand, should be able to nail down the quark–gluon plasma’s viscosity, with previous measurements having suggested that the plasma has a lower viscosity than any other liquid in the universe.

David Evans, a physicist at Birmingham University and head of the UK group at ALICE, is impressed by the latest work. “I think these techniques will allow theorists to tune up and improve lattice QCD by direct comparisons with experiment,” he says, “and hence provide even better calculations and predictions in the future.”

However, Johann Rafelski of the University of Arizona believes that the research suffers from “major deficiencies”, in particular a lack of analysis of systematic errors. For example, he says, Xu and colleagues have not accounted for the fact that the detector counts only a limited fraction of all collision products. “The total systematic error is very probably much, much larger than the statistical error [as presented],” he says, adding that his “colleagues from the lattice-QCD community believe that this analysis has ‘errors at every step’”.

The research is published in Science 332 1525.

Icy spray from Saturn's moon Enceladus sampled by Cassini

Cassini’s enhanced and false-coloured image of Enceladus backlit by the Sun shows the fountain-like plumes of the fine spray of material that spews from the south polar region (Courtesy: NASA/JPL/Space Science Institute)

By Tushna Commissariat

Enceladus, Saturn’s sixth-largest moon, is back in the news, as the Cassini-Huygens mission has managed to directly sample the water plumes jetting into space from its south polar region. These plumes of ice and salt originate from the moon’s famed “tiger stripes” region – four giant parallel fissures on the southern face of the moon.

The findings from these fly-throughs are the strongest evidence yet for the existence of large-scale saltwater reservoirs beneath the moon’s icy crust. “Enceladus is a tiny icy moon located in a region of the outer solar system where no liquid water was expected to exist, because of its large distance from the Sun,” says Nicolas Altobelli, ESA’s project scientist for the Cassini-Huygens mission. “This finding is therefore a crucial new piece of evidence showing that environmental conditions favourable to the emergence of life may be sustainable on icy bodies orbiting gas-giant planets.”

Indeed, the moon has been described previously by other Cassini researchers as one of the “most habitable spots beyond Earth in the solar system for life as we know it”.

Enceladus’ water plumes are thought to help replenish Saturn’s faint, outermost E-ring, which traces the moon’s orbit around Saturn. The Cassini spacecraft discovered the plumes in 2005 and more recently has been able to fly directly through them.

During three of Cassini’s passes in 2008 and 2009, its Cosmic Dust Analyser measured the composition of freshly ejected plume grains. The icy particles hit the detector target at speeds of 6.5–17.5 km/s, and vaporized instantly. Electrical fields inside the instrument then separated the various constituents of the resulting impact cloud for analysis.

Researchers looking at the data from the detector have found that the grains ejected in the plumes, into the atmosphere of the moon and out towards the E-ring, are relatively small and mostly salt-poor, closely matching the composition of the E-ring. Closer to the moon itself, however, they found relatively large, salt-rich ice grains.

This mosaic of 21 Cassini images is a false-colour full-disc view of the anti-Saturn hemisphere on Enceladus (Courtesy: NASA/JPL/Space Science Institute)

The researchers’ explanation is that more than 99% of the total mass of ejected solids is in salt-rich grains, but most of these are heavy and fall back to the moon, so they never make it into the E-ring. The salt-rich particles have an “ocean-like” composition, which indicates that most, if not all, of the expelled ice comes from a liquid saltwater body somewhere beneath the surface, rather than from the icy face of the moon.

The scenario envisioned by the team goes something like this: deep beneath Enceladus’ surface, perhaps 80 km down, there is a reservoir of water between the rocky core and the icy mantle, kept liquid by tidal forces generated by Saturn and its neighbouring moons, as well as by heat from radioactive decay. When the outermost layer cracks open, the reservoir is exposed to space. The drop in pressure causes the liquid to evaporate, with some of it flash-freezing into salty ice grains; together these create the plumes.

When salty water freezes slowly, the salt is squeezed out, leaving pure water ice behind. So, if the plumes were coming from the surface ice, there should be very little salt in them. “There currently is no plausible way to produce a steady outflow of salt-rich grains from solid ice across the tiger stripes other than from saltwater under Enceladus’ icy surface,” says Frank Postberg of Universität Heidelberg in Germany, lead author of a Nature paper announcing the results.

Have your say by taking part in our reader polls


By James Dacey

Like many people, I was initially quite sceptical about the value that social media could add to journalism. To the uninitiated just “dipping their toes” into the likes of Facebook and Twitter, it can seem like an unrelenting stream of throwaway remarks. But I’m starting to realize that, used effectively, social media can offer certain things that go beyond traditional journalism. One of its real strengths is its ability to break down boundaries and connect with audiences.

Physics World has had a presence on various social-media platforms for some time now, and we are keen to explore new ways of engaging with readers through them. One mini-project we have tried recently is a series of polls on our Facebook page relating to hot topics in the physical sciences. This week we asked about a topic that never fails to provoke opinions: climate change. It is an issue that grabbed the headlines again recently after it emerged that several Australian climate researchers had received death threats on account of their research.

So we asked our Facebook followers whether they believe it is realistic to think that the science and politics of climate issues can ever be truly separated. While I was perhaps not surprised by the outcome, I was surprised by its decisiveness: 93% of respondents said “no” – these issues cannot be separated. Of course, it is highly reductionist to boil such a complicated issue down to a simple yes/no response. But that’s why we encourage you to offer your comments too when taking part in the poll.

Our latest poll concerns particle physics. The community is geared up for an important few months as the LHC continues to perform well and Fermilab’s Tevatron prepares to close at the end of September after a dazzling career. What we want to know is the following: if the LHC and the Tevatron fail to find the Higgs boson, should the world invest in a new machine to continue the search?

To take part in the poll, go to our fan page on Facebook. And, as I said, feel free to add a comment explaining your reasons!

The sounds of science

 

When I was in my early 20s, I was diagnosed with diabetic retinopathy, or damage to the retina due to complications of diabetes. At the time, I was studying physics at the University of Puerto Rico, and at first I thought I would have to either change subjects or stop studying altogether. Instead, with help from a series of mentors, my sight loss has spurred me to develop other ways to observe and study the world – using my hands, my ears and what some people would call “physics intuition”, but which I call the heart.

My journey into this other way of doing physics began when a researcher I met at a conference told me about the Access internship programme at NASA’s Goddard Space Flight Center in Maryland. These internships are aimed at people with disabilities, so with some hesitation – and fear – I applied. To my surprise, I was accepted, and I went to Goddard for the first time in the summer of 2005.

At Goddard, I met Robert (“Bobby”) Candey, a computer scientist who has become my long-term mentor. Candey is interested in using sound to explore numerical data sets; this use of sound to convey information is called “sonification”. During that first summer, I worked with Candey and another researcher to build a sonification computer application. This program allows users to present numerical data as sound, using any of its three main attributes – pitch, volume and rhythm – to distinguish between different values in the data. For example, the largest value in a particular data set might be assigned an instrument’s highest pitch (maximum frequency), with the smallest value getting the lowest pitch (minimum frequency), and interim values scaled so that each tone represents a different value. The resulting sequences sound very atonal – there is no “melody” as such – but they allow listeners to identify features such as peaks, pulses, and noisy data.
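To make the mapping concrete, here is a minimal Python sketch of the pitch-scaling idea described above; the function names and parameters are illustrative only, not taken from the actual application.

import numpy as np

def values_to_pitches(values, f_min=220.0, f_max=880.0):
    # Map data values linearly onto frequencies (Hz): the smallest value
    # gets the lowest pitch, the largest value the highest.
    v = np.asarray(values, dtype=float)
    span = v.max() - v.min()
    if span == 0.0:                          # a flat data set maps to one tone
        return np.full(v.shape, 0.5 * (f_min + f_max))
    return f_min + (v - v.min()) / span * (f_max - f_min)

def render_tones(freqs, tone_s=0.2, rate=44100):
    # Concatenate one short sine tone per data point into a single waveform.
    t = np.linspace(0.0, tone_s, int(rate * tone_s), endpoint=False)
    return np.concatenate([np.sin(2.0 * np.pi * f * t) for f in freqs])

# A peak in the data becomes a rising-then-falling sweep of pitch.
waveform = render_tones(values_to_pitches([1, 2, 5, 9, 4, 2, 1]))

Played back at the audio sample rate, each data point becomes a short tone, so peaks, pulses and noise stand out as patterns of pitch.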

Once we had got the application working, I used it to sonify data from radio telescopes and satellites, identify changes and trends (if any) in the data stream, characterize the latter with numerical analysis, and then present the results to other scientists. I have returned to Goddard every summer since 2005 to work with Candey on sonifying various space-physics data sets. During this time, I also became interested in some of the computer-science and perception-psychology aspects of sonification, so in 2008 I started a PhD at the University of Glasgow in the UK that combines research on these topics with my ongoing work in space physics. My supervisor at Glasgow, Stephen Brewster, is a computer scientist who is an expert in both sonification and human–computer interaction, and we are studying ways of using sound as a companion to data-visualization techniques.

A new way of exploring

Sonification is a developing field, and one of the beauties of it is that it brings together researchers from a wide variety of professional fields. As the website of the International Community for Auditory Display (www.icad.org) indicates, anyone who is interested in finding new ways of interacting with and exploring data should be interested in sonification – it is certainly not just for people who are blind or partially sighted. In fact, one thing I am trying to find out as part of my PhD research is whether sound can be used to augment signatures in a data set. This has meant learning how to do experiments on human perception, while also polishing my knowledge of Java so that I could improve the sonification prototype I created with Candey in 2005. Some of the improvements we have made have included making it possible for users to mark areas of interest in data displayed on a chart, save sonifications, zoom in and out, and have more options when co-ordinating sound and visual displays.

These improvements are important because large data sets – particularly time series, where measurements are recorded over many hours or days – typically contain much more information than can be effectively displayed on a monitor using currently available technologies. Even the best computer screens have a limited spatial resolution, and the restrictions imposed by nature on the human eye act as an additional constraint. Together, these limitations affect the useful dynamic range of the display, thus reducing the amount of data scientists can study at any one time.
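Some illustrative arithmetic makes the point (my numbers are examples, not tied to any particular instrument):

# Three days of measurements recorded once per second, viewed on a display
# 1920 pixels wide: each pixel column must somehow summarize ~135 samples.
samples = 3 * 24 * 3600      # 259,200 data points
pixels = 1920                # horizontal resolution of a typical monitor
print(samples / pixels)      # -> 135.0 samples per pixel column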

Scientists currently work around this limitation by filtering the data, so as to display only the information they believe is important to the problem at hand. But since this involves making some guesses about the result they are searching for, they may miss some discoveries. Working with Candey, Brewster and our collaborators at the Harvard-Smithsonian Center for Astrophysics (among others), I have used sonification to analyse plasma, particle, radio and X-ray data. Together, we aim to show that using sound in tandem with visual techniques is a viable tool for finding changes and trends in data sets. We are currently in the process of testing the latest version of our sonification application with users at the University of St Andrews, and I plan to start writing up my thesis later this summer.

Bringing sounds to schools

In addition to my sonification research, I have also worked with members of Goddard’s Radio JOVE team, who run a public-outreach project that helps students and amateur scientists to observe and analyse radio emissions from Jupiter, the Sun and the Milky Way. This project, in particular, brought a lot of hope to my life at a time when I felt I had none for the future. At first, I worked on Radio JOVE’s audification tool, which translates a data waveform into the audible domain to help students monitor and comprehend the signals they record. Later, I began to visit schools, teaching the kids about radio astronomy and helping them to do the soldering and other tasks needed to build radio telescopes that monitor emissions from Jupiter and the Sun at 20.1 MHz.

As part of this work, I helped to develop a method of soldering the Radio JOVE telescope that is safe for anyone to use, including children and people who are blind or partially sighted. This is really important, because it can be hard to convince teachers that it is safe for their students to solder – I have to show them how to do it first. Once the telescopes are built, children and parents can gather data together. Their contribution to science is significant: Candey and I used the data acquired by a group of nine-year-olds from the Rosa Cecilia Benitez school in Caguas, Puerto Rico, to publish a paper (2008 Sun and Geosphere 3 42) on using sonification to detect plasma bubbles.

For me, sonification has been the beginning of a journey that I hope will bring space science closer to people, since it uses humans’ ability to adapt to data to detect and/or augment interesting signals. People have different ways of approaching knowledge, and my mentors have inspired me to investigate another way for people to observe and approach numerical data sets. And the word “people” includes each and every one of us: from astrophysicists doing serious research, to children, parents, the shopper in the supermarket, and you and me.

So long, Tevatron

In this interview, Fermilab theorist Chris Quigg discusses the last-ditch efforts to extend the lifetime of the Tevatron by three years and why that appeal was ultimately rejected by the US Department of Energy. “The US government feels under great fiscal pressure,” he says. “I think much of the perception is vacuum-polarized and self-generated.” Quigg concedes, however, that the good performance of the LHC in the first few months of this year has made it easier for him to accept that the baton is quickly passing to CERN.

Beyond the Tevatron

But Quigg believes that the LHC and future accelerators will benefit greatly from the legacy of the Tevatron, particularly its ability to make highly precise measurements. “There are areas in which the Tevatron has a long lead, not just in the number of events that it has recorded but in the development of expertise in making very difficult measurements,” he says.

Of all the Tevatron’s achievements, Quigg singles out the discovery of the top quark in 1995 as the jewel in the crown. “What the researchers did was almost impossible,” he says, referring to the relatively small amount of data and the way that the different experimental groups skilfully used a silicon vertex detector for the first time.

Looking to the future, Quigg would like to see the US develop a closer relationship with CERN. He believes this will come, in part, through continued involvement in various experiments, including work on the LHC’s two general-purpose detectors – ATLAS and CMS. In the longer term, Quigg would like the US to carry out its own extensive R&D programme to feed into the design of a next-generation accelerator such as the proposed International Linear Collider.

If you enjoyed this interview, you may also be interested to hear the views of Harvard theorist Lisa Randall in an interview from March. Randall was highly vocal in her support for the extension of the Tevatron’s lifetime, but she is very excited about the wealth of results – and new physics – that the LHC promises to deliver.

More surprises for the Voyager mission at the edge of the solar system

Unexpected observations by NASA's Voyager 1 spacecraft have astronomers once again revising their theories about the radial extent of the heliosheath – the heated outer shell of the solar system. Recent data from the spacecraft have shown a gentle decrease in the velocity of the solar wind at the heliopause – the outer boundary of the heliosheath – not the abrupt discontinuity predicted by current theories. Also, scientists looking at other data from both Voyager 1 and Voyager 2 have found that the magnetic field in the heliosheath is a tumultuous foam of magnetic bubbles, as compared to the graceful arcs of magnetic field lines they had expected.

At the edge

Ionized particles emitted at high speed from the Sun – the solar wind – inflate a bubble around our solar system known as the heliosphere. The outer skin of this bubble comprises the termination shock, the heliosheath and the heliopause. The solar wind travels at supersonic speeds until it crosses the termination shock, where it abruptly slows down and heats the heliosheath. The heliopause is the outer edge of the heliosheath, where the outward flow of the solar wind drops to zero.

Launched nearly 34 years ago, and now cruising through space some 14.4 billion kilometres from the Sun, both Voyager 1 and Voyager 2 are currently in the heliosheath. A team of scientists led by Stamatios Krimigis of the Johns Hopkins University Applied Physics Laboratory in Maryland, US, has been using Voyager's Low-Energy Charged Particle instrument to determine the solar wind's velocity. Since 2007 Voyager 1 has been crossing a region in which the velocity of the solar wind slows gradually to zero. As the spacecraft has moved outwards over the past three years, the radial velocity of the wind has decreased almost linearly from 208,000 km/h to zero, while the transverse component – the flow sideways relative to the Sun – is also trending towards zero.

"This tells us that Voyager 1 may be close to the heliopause, or the boundary at which the interstellar medium basically stops the outflow of solar wind," says Krimigis. "The extended transition layer of near-zero outflow contradicts theories that predict a sharp transition to the interstellar flow at the heliopause – and means, once again, we will need to rework our models."

As velocities may fluctuate, the team looked at multiple monthly readings before confirming that the velocity really was zero. However, the scientists believe that Voyager 1 has not yet crossed the heliopause into interstellar space; doing so would bring a sudden drop in the density of hot particles from the heliosheath and an increase in the density of cold interstellar plasma. The researchers, writing in Nature, estimated the location of the heliopause by combining the Voyager 1 observations with energetic-neutral-atom images of the heliosheath from the Cassini mission. They believe that the heliopause may be as close as 18 billion kilometres from the Sun, meaning that Voyager 1 could exit the transition layer and enter the interstellar medium by the end of 2012.

Bubble trouble

At the same time, another team from NASA has found distinct bubbles of magnetism, each about 160 million kilometres wide, in the heliosheath. According to the researchers, Voyager 1 entered this "foam zone" in around 2007 and Voyager 2 followed about a year later; it would take either probe weeks to cross just one bubble.

"The Sun's magnetic field extends all the way to the edge of the solar system," explains Merav Opher of Boston University, US. "Because the Sun spins, its magnetic field becomes twisted and wrinkled, a bit like a ballerina's skirt. Far, far away from the Sun, where the Voyagers are now, the folds of the skirt bunch up."

When a magnetic field gets severely folded, lines of magnetic force criss-cross and reorganize themselves into foamy magnetic bubbles. This magnetic reconnection is the same energetic process underlying solar flares. The actual bubbles appear to be self-contained and disconnected from the broader solar magnetic field.

Sensor readings from the spacecraft show that the Voyagers sometimes travel in and out of bubbles in the foam zone, while at other times they seem to move through foam-free regions. This further complicates our picture of the heliosphere.

The researchers suggest that the foam zone might protect the solar system from cosmic rays, which would be trapped inside the bubbles and have to work their way through them, one by one, before reaching the smoother magnetic-field lines that lead in towards the Sun. "The magnetic bubbles appear to be our first line of defence against cosmic rays," points out Opher. "We haven't figured out yet if this is a good thing or not."

So far, most evidence for the bubbles comes from the Voyager energetic-particle and flow measurements and magnetic-field observations, but because the magnetic field is so weak, the data take much longer to analyse accurately. "We'll probably discover [if our model] is correct as the Voyagers proceed deeper into the froth and learn more about its organization," says Opher. "This is just the beginning, and I predict more surprises ahead."

Watch the video below, from NASA Heliophysics and the Scientific Visualization Studio, to find out more about the bubbles and how cosmic rays may travel through them:
