Supercomputing "Grid" passes latest test
Apr 26, 2005
When the Large Hadron Collider (LHC) comes online at CERN in 2007, it will produce more data than any other experiment in the history of physics. Particle physicists have now passed another milestone in their preparations for the LHC by sustaining a continuous flow of 600 megabytes of data per second (MB/s) for 10 days from the Geneva laboratory to seven sites in Europe and the US.
This is the first time that such high rates of data transfer have been maintained between so many sites and over such a long period of time. The total amount of data transmitted during the challenge -- some 500 terabytes -- would take about 250 years to download using a typical 512 kilobit per second household broadband connection. The exercise was the second of four "service challenges" that will be performed before data starts to flow from the four detectors at the collider -- ALICE, ATLAS, CMS and LHCb.
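The figures quoted above can be checked with some back-of-the-envelope arithmetic. A minimal sketch in Python, assuming decimal units (1 MB = 10^6 bytes, 1 TB = 10^12 bytes), which the article does not specify:

```python
# Back-of-the-envelope check of the transfer figures quoted above.
# Assumes decimal units (1 MB = 1e6 bytes, 1 TB = 1e12 bytes).

SECONDS_PER_DAY = 86_400
SECONDS_PER_YEAR = 365 * SECONDS_PER_DAY

# 600 MB/s sustained for 10 days:
total_bytes = 600e6 * 10 * SECONDS_PER_DAY
print(f"Data moved: {total_bytes / 1e12:.0f} TB")  # ~518 TB, i.e. "some 500 terabytes"

# Downloading 500 TB over a 512 kilobit/s broadband link:
link_bytes_per_s = 512e3 / 8  # 512 kilobits/s -> 64,000 bytes/s
years = 500e12 / link_bytes_per_s / SECONDS_PER_YEAR
print(f"Download time: {years:.0f} years")  # ~248 years, i.e. "about 250 years"
```

Both results are consistent with the rounded numbers reported in the article.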
"This service challenge is a key step on the way to managing the torrents of data anticipated from the LHC," says Jamie Shiers, manager of the service challenges at CERN. "When the LHC starts it will be the most data-intensive physics instrument on the planet, producing more than 1500 megabytes of data every second for over a decade."
CERN plans to handle the data from the collider using a worldwide supercomputing "Grid" of computing centres -- similar in spirit to the World Wide Web but much more powerful. The data will then be analyzed by more than 6000 scientists all over the world.
The labs that participated in the latest challenge were: the Karlsruhe research centre in Germany, CCIN2P3 in France, INFN-CNAF in Italy, SARA/NIKHEF in the Netherlands, the Rutherford Appleton Laboratory in the UK, and Brookhaven and Fermilab in the US.
"This is the highest ever end-to-end data transfer across a computing grid," Fermilab supercomputing project manager Lothar Bauerdick told PhysicsWeb. "Today's result shows that the LHC project can be a truly international collaboration. However, more work on software and network research is still needed to reach higher rates of data transfer."
The next service challenge will take place this summer and will involve connecting more computing centres over a three-month period.
About the author
Belle Dumé is a science writer at PhysicsWeb