The Large Hadron Collider (LHC) is a particle accelerator that aims to revolutionize our understanding of the universe. The LHC Computing Grid (LCG), launched in 2003, integrates thousands of computers in hundreds of data centers worldwide into a single global computing resource to store and analyze the enormous amounts of data the LHC will collect. The LHC is expected to produce roughly 15 petabytes (15 million gigabytes) of data annually, which will be made available to thousands of scientists around the world.
This article describes the challenges that the engineers faced in building this pioneering system and the solutions they came up with.
Read the full article here.
Community comments
Additional important software used
by Mark Pollack
The data analysis system used at the LHC is based on ROOT (root.cern.ch).
It is quite an impressive piece of work in itself.
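To give a rough idea of what working with ROOT looks like, here is a minimal, hypothetical C++ sketch (a toy example only, not actual LHC analysis code); the histogram name, file name, and numbers are made up for illustration:

// Hypothetical toy example of ROOT usage; not the LHC's actual analysis code.
#include "TFile.h"
#include "TH1F.h"
#include "TRandom3.h"

int main() {
    TFile out("example.root", "RECREATE");     // ROOT's own binary file format
    TH1F hist("h_mass", "Toy mass peak;GeV;Events", 100, 0.0, 200.0);

    TRandom3 rng(0);                           // seeded pseudo-random generator
    for (int i = 0; i < 100000; ++i)
        hist.Fill(rng.Gaus(91.0, 2.5));        // fill the histogram with a Gaussian "peak"

    hist.Write();                              // persist the histogram into the file
    out.Close();
    return 0;
}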
Taipei
by Mickael Faivre-Macon
"After initial processing, this data is distributed to eleven large computer centers - in Canada, France, Germany, Italy, the Netherlands, the Nordic countries, Spain, Taipei, the UK, and two sites in the USA."
Taipei is the capital of Taiwan. Write "Taiwan", not "Taipei".