iSGTW - International Science Grid This Week

Feature - Mega grid for mega science


From left: Ian Foster, Carl Kesselman and Steve Tuecke, who together created the Globus software for grid computing.

Image courtesy of ANL.

An editorial chronicling the development of grid computing and its relationship to the Large Hadron Collider, which originally appeared in the October 2006 print issue of R&D Magazine and ran again at rdmag.com last Friday (29 August), has been adapted and reprinted with permission. 

In 1995, Ian Foster of Argonne National Laboratory and the University of Chicago, and Carl Kesselman of the Information Sciences Institute at the University of Southern California, Los Angeles, now known as the fathers of grid computing, explored ways of using network technology to build very large, powerful systems: machines in different locations would each work on part of a problem and then combine their results. These ideas took shape as I-WAY, which used high-speed networks to connect computing resources at 17 sites across North America, marking the start of grid computing.

Advances followed, and in the summer of 2000, Kesselman went to CERN, near Geneva, to give a seminar on grid computing.  The seeds for the Large Hadron Collider (LHC) Computing Grid were planted.

The Large Hadron Collider (LHC) is the largest scientific instrument on the planet. It will produce roughly 15 petabytes of data annually, the equivalent of about 3 million DVDs. Access to experimental data will need to be provided for more than 5,000 scientists in 500 research institutes and universities worldwide over the 15-year estimated lifetime of the LHC.
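The DVD comparison is easy to sanity-check. A minimal sketch in Python, assuming the 4.7 GB capacity of a standard single-layer DVD and decimal (SI) units, neither of which the article states explicitly:

```python
# Back-of-the-envelope check of the LHC data-volume comparison.
# The 4.7 GB single-layer DVD capacity is an assumed constant,
# not a figure taken from the article.
PETABYTE = 10**15            # bytes (SI)
DVD_CAPACITY = 4.7 * 10**9   # bytes per single-layer DVD

annual_data = 15 * PETABYTE  # the LHC's estimated yearly output
dvds = annual_data / DVD_CAPACITY
print(f"{dvds / 1e6:.1f} million DVDs")  # roughly 3.2 million
```

The result lands close to the article's "about 3 million DVDs" figure.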

The LHC Computing Grid (LCG) is a worldwide network of thousands of PCs, organized into large clusters and linked by ultra-high speed connections to create the world’s largest international scientific computing grid.

Map of LCG sites. 

Image courtesy of LCG.  


LCG partners are achieving record-breaking results for high-speed data transfers, distributed processing, and storage. For example, in 2005, eight major computing centers completed a challenge to sustain a continuous flow of 600 MB/sec on average for 10 days from CERN to seven sites in Europe and the U.S. This exercise was part of a service challenge designed to test the infrastructure of the LCG. The total amount of data transmitted in the challenge—500 TB—would take about 250 years to download using a typical 512 kb/sec household broadband connection.
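Both figures in the paragraph above follow from simple arithmetic. A quick sketch, assuming decimal units and a 365-day year (conventions the article does not spell out):

```python
# Sanity-check of the 2005 service-challenge numbers.
SECONDS_PER_DAY = 86_400

# 600 MB/s sustained for 10 days gives the total volume moved.
rate_bytes = 600 * 10**6                    # bytes per second
total = rate_bytes * SECONDS_PER_DAY * 10   # ~5.18e17 bytes, i.e. ~518 TB
print(f"{total / 10**12:.0f} TB")           # roughly the quoted 500 TB

# Time to pull 500 TB over a 512 kb/s household broadband link.
broadband = 512 * 10**3                     # bits per second
seconds = (500 * 10**12 * 8) / broadband    # total bits / link rate
years = seconds / (SECONDS_PER_DAY * 365)
print(f"{years:.0f} years")                 # roughly 250 years
```

The sustained-rate total comes out slightly above 500 TB, and the download estimate matches the article's "about 250 years".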

Vicky White, head of the Fermilab Computing Division, one of the challenge participants, commented, “High-energy physicists have been transmitting large amounts of data around the world for years, but this has usually been in relatively brief bursts and between two sites. Sustaining such high rates of data for days on-end to multiple sites is a breakthrough, and augurs well for achieving the ultimate goals for grid computing.”

As the LHC’s start date approaches, the LCG is already close to its target scale of 50,000 PCs. Over the next year it will continue to improve its reliability and grow by adding sites and increasing the resources available at existing ones. In addition, the steady exponential growth in processor speed and disk storage capacity across the IT industry will help the LCG achieve its ambitious computing goals.

Martha Walz, R&D

Read the full editorial.
