May 18, 2017

Using Big Data to understand how the universe works

By ITU News

17 May was World Telecommunication and Information Society Day (WTISD) with the theme “Big Data for Big Impact.” ITU and our members are exploring how Big Data can help solve the world’s biggest challenges.

The Large Hadron Collider is responsible for some of the most important scientific discoveries in physics, such as the Higgs boson particle. Lying underneath the Jura Mountains and stretching from CERN (the European Organization for Nuclear Research) into surrounding France, the 27km ring of superconducting magnets carries out experiments that aim to further our understanding of the universe.

But this work requires a lot of data. And data requires processing power.

While distributed computing networks take on much of this work, thousands of personal at-home computers around the world also carry out important simulations through the LHC@home project. The resulting data helps physicists make scientific discoveries by allowing them to compare theories with experimental results.

CERN, the institution that owns and operates the Large Hadron Collider (LHC), runs LHC@home projects such as ATLAS@Home and CMS@Home.

ITU News spoke to Dr David Cameron, Researcher at the University of Oslo, who is responsible for the ATLAS@Home programme at CERN, about the project and the benefits of Big Data for scientific research.

ATLAS@Home – uncovering the mysteries of the universe

ATLAS is one of two general-purpose detectors at the LHC. It investigates a wide range of physics, from the search for the Higgs boson to extra dimensions and particles that could make up dark matter.

“Fundamentally, [it is] trying to understand how the universe works by trying to recreate the conditions that we don’t get every day in nature,” Dr Cameron explained, “to hopefully discover the big questions like ‘Why are we here?’ and ‘Where do we come from?’ ”

ATLAS@Home started three years ago as a way to boost the project’s computing power and engage the global community in science. The volunteer computing platform runs simulations for the ATLAS project through downloadable scientific software, and operates on ‘idle time’, i.e. the time when a computer is on but not in use. To date, there are 100,000 registered accounts, of which 1,000 are ‘core’ participants that run tasks continuously, 24/7, with a few thousand more running projects on and off.
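The mechanics of that idle-time model can be pictured as a simple loop. The sketch below is not CERN’s actual client (the real LHC@home projects run on the BOINC volunteer-computing middleware); every function name is a hypothetical stand-in, included only to illustrate the cycle of waiting for idle time, fetching a work unit, running it, and uploading the result.

```python
"""Minimal sketch of an idle-time volunteer-computing work loop.

NOT CERN's actual client (the real projects run on BOINC); all
functions below are hypothetical stand-ins for illustration.
"""
import random
import time

def machine_is_idle() -> bool:
    # A real client asks the OS how long the user has been inactive;
    # stubbed with a coin flip so the sketch runs on its own.
    return random.random() > 0.3

def fetch_task() -> dict:
    # Stand-in for downloading a simulation work unit from the server.
    return {"task_id": random.randrange(10_000), "events": 50}

def run_simulation(task: dict) -> dict:
    # Stand-in for the downloaded scientific software doing the work.
    return {"task_id": task["task_id"], "events_done": task["events"]}

def upload_result(result: dict) -> None:
    # Stand-in for returning the finished work unit to the server.
    print(f"uploaded result for task {result['task_id']}")

def work_loop(cycles: int = 3) -> None:
    for _ in range(cycles):
        if machine_is_idle():
            upload_result(run_simulation(fetch_task()))
        else:
            time.sleep(0.1)  # user is active: back off, try again later

if __name__ == "__main__":
    work_loop()
```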

How does the simulation work?

Theoretical models of physics are plugged into the ATLAS@Home simulation, which uses a model of the detector to simulate how particles pass through its different parts. The results are then compared to real data from the detector, which not only helps to ensure that the detector is working properly, but also shows whether something new has been found.

“What people are doing on their PCs is they are simulating how those particles interact with different pieces of the ATLAS detector,” Dr Cameron explains. “If something different comes out of the detector in the real data, we know that we have discovered something that wasn’t in the theoretical model. This is basically the only way that we can detect that we have a new kind of physics.”
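In essence the comparison is statistical: histogram some observable in the simulated events, histogram it in the real ones, and look for an excess the theory cannot explain. The toy below is purely illustrative, with invented numbers rather than ATLAS data, and compresses an enormous analysis chain into a few lines.

```python
"""Toy version of the simulate-and-compare idea described above.

Every number here is invented; real ATLAS simulation models each
detector component in detail. The point is only the logic: compare
histograms and flag bins where data exceeds theory by more than a
plausible statistical fluctuation.
"""
import random

random.seed(42)
BINS = 10  # bins over an invented observable in [0, 100)

def histogram(values):
    counts = [0] * BINS
    for v in values:
        counts[min(max(int(v // 10), 0), BINS - 1)] += 1
    return counts

# "Theory" simulation: a smooth background distribution.
simulated = histogram(random.gauss(50, 20) for _ in range(10_000))

# "Real data": the same background plus a small hidden bump near 70,
# standing in for physics that was not in the theoretical model.
real = histogram(
    random.gauss(70, 3) if random.random() < 0.05 else random.gauss(50, 20)
    for _ in range(10_000)
)

# Flag bins where the excess is well beyond a rough Poisson fluctuation.
for i, (sim, obs) in enumerate(zip(simulated, real)):
    if obs > sim + 3 * max(sim, 1) ** 0.5:
        print(f"bin {i}: expected ~{sim}, observed {obs} -> worth a look")
```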

But this requires a lot of data.

Big data, very big data!

“Big Data can mean many things. … To me, ‘Big Data’ means unstructured, random Big Data that some algorithm has to make sense out of. This applies to ATLAS as well as Facebook,” Dr Cameron said.

“For science, the advantage is being able to collect way more information than we ever could before, but the disadvantage is that we have to have way more complicated algorithms and techniques to extract what we need from that data.”

The volunteers generate around 100MB every few hours which, in the grander scheme of things, is a rather small amount. Data processing runs at some 100 distributed computing sites around the world, moving about 1 petabyte (1,000 terabytes) of data per day between them.
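To see how those two figures relate, a rough calculation helps. The assumptions below are mine, not the article’s: “every few hours” is read as roughly three hours, the 100MB figure is taken per machine, and only the ~1,000 continuously active machines are counted, so the result is an order-of-magnitude estimate only.

```python
# Back-of-the-envelope comparison of the two figures quoted above.
# Assumptions (not from the article): "every few hours" ~ 3 hours,
# 100MB per machine per batch, ~1,000 continuously active machines.
ACTIVE_MACHINES = 1_000
MB_PER_BATCH = 100
HOURS_PER_BATCH = 3

volunteer_mb_per_day = ACTIVE_MACHINES * MB_PER_BATCH * (24 / HOURS_PER_BATCH)
grid_mb_per_day = 1_000_000_000  # 1 petabyte ~ 10^9 MB in decimal units

print(f"volunteers: ~{volunteer_mb_per_day / 1e6:.1f} TB/day")
print(f"grid traffic: ~{grid_mb_per_day / 1e6:,.0f} TB/day")
print(f"volunteer share: ~{volunteer_mb_per_day / grid_mb_per_day:.2%}")
```

Under those assumptions the volunteers produce well under a tenth of a percent of the daily grid traffic, which squares with the small fraction mentioned next.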

Though ATLAS@Home only contributes a small fraction of the required computing power, the work makes a valuable contribution to science. Around 300 papers per year on all kinds of different physics are written using ATLAS data – though physicists first use various processing algorithms to get these petabytes of data down to a size they can analyse on their laptops!
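The reduction behind that last step is essentially repeated filtering: apply selection cuts and keep only the event fields an analysis needs. The sketch below illustrates that generic skimming idea with invented cut values and field names; it is not ATLAS’s actual data pipeline.

```python
"""Generic sketch of the skimming idea: keep only events that pass
selection cuts, and only the fields the analysis needs. Cuts and
field names are invented; this is not ATLAS's actual pipeline.
"""
import random

random.seed(7)

def read_events(n):
    # Stand-in for streaming events out of bulk grid storage.
    for i in range(n):
        yield {
            "event_id": i,
            "energy_gev": random.expovariate(1 / 50),  # invented observable
            "n_muons": random.randrange(4),
            "raw_payload": b"\x00" * 1024,  # bulky detail dropped on skim
        }

def skim(events):
    # A selection cut plus column pruning: this is where petabytes
    # shrink to something a laptop can hold.
    for ev in events:
        if ev["energy_gev"] > 100 and ev["n_muons"] >= 2:
            yield {"event_id": ev["event_id"], "energy_gev": ev["energy_gev"]}

selected = list(skim(read_events(100_000)))
print(f"kept {len(selected):,} of 100,000 events after cuts")
```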

“If we didn’t have a means of handling Big Data, then we couldn’t do the physics that we need to do,” Dr Cameron explained. “When people who do the analysis use our systems, they don’t even know this data was processed from ATLAS@Home, they just know that it was processed somewhere in the world because it looks the same as data that was processed in other places.”

So, as Dr Cameron said, while 50 years ago a single scientist with a journal could write a paper or discover a new particle, today it takes a big experiment, up to 3,000 people and 1 petabyte of data per day to make new scientific discoveries. And people at home on their computers are directly contributing to expanding our understanding of the universe around us.

By Lucy Spencer (@inquisitivelucy), ITU News
