Article: Data storage and analysis for the largest scientific instrument on the planet (LHC Grid)
The Large Hadron Collider (LHC) is a particle accelerator that aims to revolutionize our understanding of the universe. The LHC Computing Grid (LCG), launched in 2003, integrates thousands of computers in hundreds of data centers worldwide into a single global computing resource to store and analyze the enormous amounts of data the LHC collects. The LHC is estimated to produce roughly 15 petabytes (15 million gigabytes) of data annually, which is made available to thousands of scientists around the world.
This article describes the challenges that the engineers faced in building this pioneering system and the solutions they came up with.
Read the full article here.
The article also covers additional important software used in the system, which is quite an impressive piece of work in itself.
Craig Motlin Sep 01, 2014