iSGTW - International Science Grid This Week

Feature - ATLAS: the data chain works


Tracks recorded in the muon chambers of the ATLAS detector can now be delivered express to physicists all over the world, enabling simultaneous analysis at sites across the globe. The European and U.S. sites are connected via the Enabling Grids for E-sciencE and Open Science Grid infrastructures.
Image courtesy of ATLAS

This month particle physics experiment ATLAS went “end-to-end” for the first time.

Buried in Switzerland, 90 meters under the ground at the base of the French Jura mountains, ATLAS (A Toroidal LHC ApparatuS) is one of four high-energy physics experiments attached to the Large Hadron Collider, a 27-kilometer particle accelerator nearing the final stages of completion.

When the LHC is turned on, physicists worldwide will be waiting by their computers. They’ll be expecting express and non-stop delivery of massive amounts of data, streamed in a virtually seamless sequence direct to their doorstep.

And this month, for the first time, ATLAS proved that this data distribution—from the LHC to physicists across the globe—will be possible.

From cosmic ray to you 

“We did the whole thing,” smiles ATLAS’ Kors Bos. “We mastered the entire data chain for the first time, from measurement of a real cosmic ray muon in the detector to arrival of reconstructed data at Tier-2 computers all over the world, with all the steps in between.”

This feat boils down to one thing: “The chain works,” says Bos.

“We measured about two million muons over two weeks. We spent the first week setting up, but in the last week we started getting something useful. The detector measured real particles, real analysis happened at sites across Europe and the U.S., and it happened in quasi real-time. The whole machine worked without human intervention.”

Terabytes of data were moved from the Tier-0 site at CERN to Tier-1 sites across Europe (seven sites), North America (one site in the United States and one in Canada) and Asia (one site in Taiwan). Data transfer rates reached the expected maximum during early September.
Image courtesy of ATLAS
Image courtesy of ATLAS

Quasi real-time, says Bos, will be more than sufficient to keep the chain moving.

“It’s good that we could achieve this transfer in hours, but with the real data there will be a greater delay. We will need to do regular calibrations and the processing and subsequent data transport will have to wait for that. The data chain is designed to cope with this delay and there are sufficient disk buffers at the Tier-0 stage to keep the data for as long as several days.”
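In code terms, the buffering Bos describes amounts to a hold-and-release queue: raw files sit on Tier-0 disk until the calibration for their run is ready, and only then move on to reconstruction and export. The Python sketch below is a toy illustration of that idea only; the names (RawFile, Tier0Buffer, release_calibrated) are invented for this example and are not part of the ATLAS software.

    from collections import deque
    from dataclasses import dataclass

    @dataclass
    class RawFile:
        run: int  # run number assigned at data taking (hypothetical field)

    class Tier0Buffer:
        """Toy model of the Tier-0 disk buffer: raw files wait here until
        calibration constants for their run exist, then they are released
        for reconstruction and export to the Tier-1s."""
        def __init__(self):
            self.pending = deque()

        def record(self, f: RawFile):
            self.pending.append(f)  # data can sit here for several days

        def release_calibrated(self, calibrated_runs: set[int]):
            ready, waiting = [], deque()
            while self.pending:
                f = self.pending.popleft()
                (ready if f.run in calibrated_runs else waiting).append(f)
            self.pending = waiting
            return ready  # these files go on to reconstruction and export

    buf = Tier0Buffer()
    buf.record(RawFile(run=123))
    buf.record(RawFile(run=124))
    print(buf.release_calibrated({123}))  # only run 123 is released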

Domino data chain of events

The data chain is triggered only when a particle enters the ATLAS detector and produces a signal in its sensitive layers.

“We know these particles must be muons, because everything else is stopped by the meters of clay above the detector. Only muons can get through and only muons can trigger the detector,” Bos explains.

Once the particle has been detected, the domino data chain is set in motion. The raw data is sent to the Tier-0 center, the computer center at CERN, where it is recorded onto tape before being sent to a different part of the center for reconstruction.

“The reconstructed data also goes to tape,” says Bos. “It is then exported to all the Tier-1s, and when it arrives it is exported by the Tier-1s to their Tier-2s. When it arrives at the Tier-2s the physicists can pick it up and say, ‘wow, look at that’”.
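Viewed as a data-distribution topology, the chain Bos describes is a simple fan-out: the Tier-0 copy at CERN feeds every Tier-1, and each Tier-1 re-exports to its own Tier-2s. The short Python sketch below illustrates only that shape; the tier names are placeholders and the real ATLAS data management machinery is far more involved.

    # Toy fan-out of a reconstructed dataset down the tiers (illustrative names).
    TIER1_TO_TIER2 = {
        "Tier-1 A": ["Tier-2 A1", "Tier-2 A2"],
        "Tier-1 B": ["Tier-2 B1"],
    }

    def distribute(dataset: str):
        """Tier-0 exports to every Tier-1; each Tier-1 re-exports to its Tier-2s."""
        copies = {"Tier-0 (CERN)": dataset}
        for tier1, tier2s in TIER1_TO_TIER2.items():
            copies[tier1] = dataset          # Tier-0 -> Tier-1
            for tier2 in tier2s:
                copies[tier2] = dataset      # Tier-1 -> Tier-2
        return copies

    print(distribute("reconstructed muons, run 123"))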

Bos says that while researchers from different Tier-2 centers will be analyzing the data in different ways, they will all be analyzing the same data.

“The data analyses are bound to be different, but if you say ‘give me the data from Run 123, Event 345 on Sunday morning’, they should all be able to provide the same thing.”
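In other words, an analysis only needs to name the run and event it wants, and any replica should hand back the same content. A minimal sketch of that idea, assuming a hypothetical lookup keyed by (run, event):

    # Every site holds the same events; any replica returns identical content
    # for the same (run, event) key. Data values here are placeholders.
    replica_site_a = {(123, 345): "reconstructed muon track"}
    replica_site_b = dict(replica_site_a)  # same dataset, different site

    def fetch(replica, run, event):
        return replica[(run, event)]

    assert fetch(replica_site_a, 123, 345) == fetch(replica_site_b, 123, 345)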

This data chain will be used to share ATLAS data with more than 1900 physicists from more than 160 participating institutions in 35 countries. The massive data transfer requirements are supported in Europe and the U.S. by the Enabling Grids for E-sciencE and Open Science Grid infrastructures.

- Cristy Burne, iSGTW 
