Modeling Heavy-Ion Collisions With Open Science Grid

First Posted: Apr 10, 2014 06:57 PM EDT

Scientists have traced the expansion of the universe back to the very beginning - when it occupied an infinitesimal point in space. This was the state of the universe at time t=0, roughly 13.8 billion years ago. It is from this starting point (the big bang) that everything we are familiar with - space, time, stars, galaxies, protons, neutrons, matter - came into existence.

If we go back in time to a millionth of a second after the big bang, all matter was still concentrated in a minuscule volume, in a state called the quark-gluon plasma (QGP), in which quarks and gluons floated freely. Scientists believe the QGP is a liquid that likely has the lowest viscosity of anything ever measured.

During the cooling and rapid expansion of the universe, the QGP underwent a phase transition, forming hadrons - particles such as the protons and neutrons that are the building blocks of matter as we know it. Today, scientists seek to recreate the QGP by colliding heavy ions at the highest energies attainable in particle colliders. One of the main tasks in relativistic heavy-ion research is to find clear connections between the transient quark-gluon plasma state and the experimentally observable hadronic final state.

Steffen Bass, a theoretical physicist and professor at Duke University in North Carolina, US, is actively involved in developing models for the dynamics of such highly energetic heavy-ion collisions. "The only way to create a QGP in the lab is with big nuclei and a large particle accelerator capable of creating the temperatures and pressures that were present just after the big bang," says Bass.

The Relativistic Heavy Ion Collider (RHIC), at Brookhaven National Laboratory in Upton, New York, US, was built specifically to collide such heavy ions. RHIC primarily uses ions of gold, one of the heaviest common elements, because of its densely packed nucleus. The ALICE detector at CERN, near Geneva, Switzerland, also measures the particles produced in heavy-ion collisions.

Collisions at this minuscule scale happen so quickly that they are not directly observable. Instead, physicists look at the collision debris field. "We study the subatomic particle debris that emerges from these light-speed collisions, looking for clues about what matter was like at the beginning of time," explains Bass. "By looking at the debris field we can piece together whether, for a minimal amount of time, we did indeed create a QGP."

Through modeling and simulation, Bass is able to connect the debris field with the quantities he wants to explore. "We compare the outcome of the experiment with the outcomes of the simulations noting, among other things, the temperatures and pressures we achieve." It is through this comparison that new scientific insight can be gleaned.
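
The comparison Bass describes is, at its core, a quantitative fit of simulated observables to measured ones. As a rough illustration only - not Bass's actual analysis - the following Python sketch uses entirely hypothetical particle-yield numbers and a simple chi-square to score how well a simulated spectrum matches a measured one.

    # Minimal sketch with hypothetical data: scoring a simulated observable
    # against a measured one with a simple chi-square, one common way to
    # quantify model-to-data agreement.
    import numpy as np

    # Hypothetical particle yields in six momentum bins
    measured_yield = np.array([120.0, 85.0, 55.0, 30.0, 14.0, 6.0])
    measured_error = np.sqrt(measured_yield)            # toy statistical errors
    simulated_yield = np.array([115.0, 90.0, 52.0, 33.0, 13.0, 7.0])

    # Chi-square per bin: smaller values mean the simulated spectrum
    # (e.g. for a given viscosity setting) tracks the measurement better.
    chi2 = np.sum(((simulated_yield - measured_yield) / measured_error) ** 2)
    print(f"chi^2 per bin = {chi2 / len(measured_yield):.2f}")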

Collider experiments generate petabytes of data; in some cases, collisions are recorded at a rate of millions per minute. "On the simulation side, some of our models may take on the order of 40 hours to simulate one collision," explains Bass. The simulations are so computationally expensive that standard computing resources cannot produce enough simulated events to match what the experiments have measured.

"We've tried many different things to gain efficiency and computational power over the years," Bass says. "The first simulations I programmed in the 1990s were optimized for vector CPUs. We then went to compute clusters since most of our simulations are trivially parallelizable."

The cluster resources available over the previous decade, however, didn't allow Bass and his students to perform the calculations and investigate the questions they were really interested in. In 2009, they began using the Open Science Grid (OSG). "Initially, OSG afforded us hundreds of thousands of CPU hours, which, to us at that time, was massive."

The availability of OSG eventually changed the questions Bass was willing to address. "Very often there are questions and calculations you ponder on the back of an envelope, but you assume they are not possible and you move on," he says. "Suddenly, with OSG we were able to ask the questions we wanted to ask. Not having to jump through formal hoops or compete for cluster compute cycles enabled us to get to projects we previously would have deemed too risky."

Over time, he's been able to see and experience first-hand exactly how OSG runs, and to determine a sort of 'sweet spot.' "The simulations and comparisons we're doing now use 50,000 to 100,000 CPU hours a day - that's pretty amazing. The efficient way in which OSG is run and the seamless access to massive amounts of computing power have both been a tremendous benefit to us. It is an impressive model that I hope will continue well into the future," says Bass.

By better understanding the QGP, scientists hope to gain insight into how the early universe expanded. But the impact such research will ultimately have is not always clear. Einstein's theory of gravity, for example, was fueled and formed by his desire to better understand the universe. Yet many years later, as a spin-off from his theories, we are able to send satellites into space and navigate with GPS.

Basic research pushes the boundaries of what is possible. OSG is one tool helping scientists push further. "Science is not happening in a vacuum," notes Bass. "Much of what we do connects to what we see in the world today, but more importantly, it connects to what will happen tomorrow and in the future. OSG is a fantastic tool that maximizes the payoff for investment in science and infrastructure." -- by Amber Harmon, © iSGTW
