Large Hadron Collider Smashes Particles and Crunches Numbers on a Massive Scale

By John Hazard  |  Posted 2008-09-10

It's been called the search for the God Particle, the very origins of the universe and the beginning of time.

The effort is massive, akin to the early days of the space program that put men on the moon in less than two decades.

It involves extreme equipment: the Large Hadron Collider is a 16-mile magnetic loop and particle accelerator buried 328 feet underground in a massive tunnel. Extreme speeds: protons circle the loop at 11,000 revolutions per second, nearly the speed of light. Extreme temperatures: the accelerator's superconducting magnets must be kept at minus 456 degrees Fahrenheit.

But smashing particles together is just the beginning. From there, thousands of scientists around the world will begin crunching the data.

The Large Hadron Collider will produce 10 terabytes of data during every eight-hour run, as much as 10 petabytes per year, all of which must be stored and scrutinized by scientists in search of the so-called "God Particle": the theoretical Higgs boson, believed to explain how other particles acquire their mass.
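The article's two figures are consistent with each other, as a quick back-of-envelope check shows (this sketch assumes continuous operation, i.e. three eight-hour runs per day, which is our assumption rather than a stated CERN schedule):

```python
# Back-of-envelope check of the stated data rates: 10 TB per eight-hour
# run vs. "as much as 10 petabytes per year".
TB_PER_RUN = 10
RUNS_PER_DAY = 24 // 8        # three eight-hour runs per day (assumed)
DAYS_PER_YEAR = 365

tb_per_year = TB_PER_RUN * RUNS_PER_DAY * DAYS_PER_YEAR
pb_per_year = tb_per_year / 1000   # decimal units: 1 PB = 1,000 TB

print(f"{tb_per_year} TB/year, about {pb_per_year:.2f} PB/year")
```

Run around the clock, the collider would generate roughly 11 petabytes a year, in line with the "as much as 10 petabytes" figure.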

The CERN project has connected more than 1,000 scientists in 80 countries to large clusters of computing power linked by high-speed connections: the Large Hadron Collider Grid, which technology advocates say may change the Web from a communications network into one giant computer.

CERN Lessons

Maybe your project doesn't match CERN's size and scope, but CERN has launched some of the technologies we use every day, including the Web, and it offers important lessons about running and managing a 10G-bps network.

Some of our CERN coverage over the past year:

  • Computing Grid Helps Get to the Heart of Matter

    A massive computing grid is helping physicists at CERN in Switzerland power the world's largest particle accelerator to answer the question: what other particles exist in the universe that we don't know about?

  • Supercomputer, Eh?

    A group of 10 Canadian universities will use an IBM supercomputer cluster to examine data being produced by the CERN project in Europe.

  • IBM's Storage Tank Technology to Aid Big-Bang Research

    IBM's Storage Tank storage virtualization and management technology was deployed in its first real-world test to handle the extraordinarily high volumes of data that CERN's new particle collider will generate.

  • HP Switches Key to CERN's LHC Project

    As part of CERN's network upgrade, some 2,000 tape and disk servers are connected via Hewlett-Packard's ProCurve switches.

  • CERN httpd: Among the Founding Technologies of the Web

    The very first Web server, built to run the first Web sites, CERN httpd remained in heavy use for many years afterward (and was the server on which the original PC Week Web site first ran).

  • CERN Gives Birth to the Web

    What about the technology that has changed everything we do today, namely the World Wide Web? That technology was created by a researcher at CERN, Tim Berners-Lee, who wanted to make it easier for groups (especially those he was a member of) to collaborate over the Internet.

  • Sun Debuts New Tape Drive for Interconnected Systems Such as CERN's

    CERN's storage environment must be prepared to handle a massive wave of data starting in 2007. At that time, CERN will have to deal with data delivered at 4GB per second and keep up to 15 petabytes of information every year, for as long as 20 years, from experiments run on its Large Hadron Collider.
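The storage figures cited above imply some striking totals. A rough sketch (treating 15 PB/year as an upper bound hit every year, and using decimal units, both our assumptions):

```python
# Cumulative storage implied by the cited figures: up to 15 petabytes per
# year for as long as 20 years, with peak delivery at 4 GB per second.
PB_PER_YEAR = 15
YEARS = 20
total_pb = PB_PER_YEAR * YEARS           # lifetime upper bound, in PB

# How long would one year's data take to move at the 4 GB/s peak rate?
GB_PER_PB = 1_000_000                    # decimal units: 1 PB = 1e6 GB
seconds = PB_PER_YEAR * GB_PER_PB / 4    # at 4 GB/s
days = seconds / 86_400                  # seconds per day

print(f"{total_pb} PB over {YEARS} years; one year's data at 4 GB/s "
      f"takes about {days:.0f} days of continuous transfer")
```

That works out to as much as 300 petabytes over the project's life, and it also shows that 4 GB/s is a peak delivery rate: sustained year-round, moving 15 PB needs only about six weeks of transfer time at that speed.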

