iSGTW - International Science Grid This Week
iSGTW 18 April 2007


Feature - Open Science Grid: Living up to its Name


A neutrino event recorded by the MiniBooNE experiment. The ring of light, registered by some of the more than one thousand light sensors inside the detector, indicates the collision of a muon neutrino with an atomic nucleus.
Image courtesy of Fermi National Accelerator Laboratory

The Open Science Grid, supported by the U.S. Department of Energy’s Office of Science and the National Science Foundation, works hard to live up to its name: OSG is a grassroots consortium open to all scientific communities, accommodating of all resource providers, and open to new technologies and to integration and interoperability with other grids. OSG supports science and research, especially collaborative science requiring high-throughput computing. OSG enables the sharing of distributed computing and storage resources by connecting them into a common grid infrastructure, over production and research networks, via a common set of middleware.

OSG’s mission is to help satisfy the ever-growing computing and data management requirements of scientific researchers.

The OSG is a unique alliance of universities, national laboratories, scientific collaborations and software developers. Members’ independently owned and managed resources make up the distributed facility, agreements between them provide the glue for it, their requirements drive its evolution, and their effort makes it happen. OSG provides a ‘virtual facility’ onto which individual research communities, the ‘virtual organizations,’ add services according to their scientists’ needs.

A number of physics collaborations provided the initial driving force for OSG and now rely on it for their data processing. These include the high energy physics experiments at CERN’s Large Hadron Collider, the STAR nuclear physics experiment, the Laser Interferometer Gravitational Wave Observatory and the current experiments at Fermilab’s Tevatron. OSG has since grown to support other branches of physics, as well as biology and mathematics.

“Many of these physics experiments must sift out discoveries from prodigious quantities of data produced by detectors with up to hundreds of millions of electronics channels,” says Torre Wenaus, applications co-coordinator for OSG. “At the LHC in particular, the unprecedented particle collision energies will produce very complex data events. High complexity plus high rate requires vast amounts of processing power.”

OSG works with an expanding set of research communities to help them evaluate their cyberinfrastructure needs and plan solutions, both locally across their campuses and as part of national or international efforts. OSG also organizes hands-on training schools and education activities to help students and teachers learn how to use and provide distributed computing.

The OSG software is based on the Virtual Data Toolkit—a packaging and distribution of Condor, Globus and other software needed to support a distributed facility. The OSG software stack is installed on participating grid computing nodes and includes a client package for end-user scientists. The OSG provides a test environment for integrating new services and functionality and a fully supported distributed production environment for running applications.
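As a rough illustration of what this looks like for an end user, the sketch below generates a Condor-G submit description file of the kind the Condor and Globus tools in the OSG client package accept. It is a minimal sketch only: the gatekeeper hostname, executable and file names are hypothetical placeholders, not real OSG endpoints.

```python
# Minimal sketch: write a Condor-G submit description file that routes a job
# through a Globus gatekeeper, assuming the OSG client tools (Condor and Globus
# from the Virtual Data Toolkit) are installed. All hostnames and file names
# below are hypothetical placeholders, not real OSG endpooints' addresses.

SUBMIT_TEMPLATE = """\
universe      = grid
grid_resource = gt2 {gatekeeper}/jobmanager-condor
executable    = {executable}
arguments     = {arguments}
output        = job.out
error         = job.err
log           = job.log
should_transfer_files   = YES
when_to_transfer_output = ON_EXIT
queue
"""

def write_submit_file(path, gatekeeper, executable, arguments=""):
    """Write a Condor-G submit file targeting the given (hypothetical) gatekeeper."""
    with open(path, "w") as f:
        f.write(SUBMIT_TEMPLATE.format(
            gatekeeper=gatekeeper,
            executable=executable,
            arguments=arguments,
        ))

if __name__ == "__main__":
    # Example usage with made-up values; the resulting file would be handed to
    # Condor with `condor_submit analysis.sub`.
    write_submit_file(
        "analysis.sub",
        gatekeeper="osg-gatekeeper.example.edu",
        executable="analyze_events",
        arguments="run42.dat",
    )
```

Submitting the generated file with condor_submit hands the job to Condor-G, which uses Globus GRAM to forward it to the remote site’s local batch system, one of the ways the middleware described above ties independently managed resources into a single facility.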

OSG members work actively to achieve interoperability with partners and peers: in particular, Enabling Grids for E-sciencE in Europe and TeraGrid in the United States, network organizations such as ESnet and Internet2, and campus, regional, national and international grids.

“We want to open up gateways to enable communication between grids so that scientists can run the same application on multiple grids,” says Alan Blatecky, OSG’s engagement coordinator. “Interoperability increases the resources available to individual scientists, allowing them to become more productive.”

To learn more, visit the Open Science Grid Web site.

- Anne Heavey, OSG
iSGTW Contributing Editor
