Now on ASCR Discovery: Infant Cosmos
About 13.8 billion years ago, the Big Bang filled the universe with ions – atoms bearing electric charges. Intense heat kept every hydrogen atom in that ionic state for a few hundred thousand years, until things cooled enough that each could combine with an electron, creating the neutral gas of the so-called Cosmic Dark Age.
“That was before the first stars formed, when there were no photons to see,” says Nickolay Gnedin, a scientist in the theoretical astrophysics group at the Fermi National Accelerator Laboratory in Illinois.
After about 100 million years, gravity started pulling particles together and galaxies and quasars began to form. The radiation they produced started ionizing atoms once more, marking the beginning of the Epoch of Reionization. It took about a billion years to turn nearly all the atoms back into ions. “Today, about 98 percent of the gas in the universe is ionized,” Gnedin says.
Little information about these early cosmic times can be observed directly, so scientists turn to simulations, combining algorithms with high-performance computing to model the early universe. But bigger telescopes on the horizon – especially NASA’s James Webb Space Telescope (JWST), scheduled for launch in October 2018 – promise more observational data. Over the next five to 10 years, the amount of information from astronomical instruments will increase at least tenfold. Simulations must improve to keep up.
“None of the existing models are likely to survive,” Gnedin says. In short, observational data will have more detail than existing computational models.
As such, scientists must create new models and algorithms that can handle the increase in observational output and then run those simulations on the most advanced supercomputers. Rather than dismaying cosmologists, the challenge is one they relish. “The most significant progress in science took place in fields which had theory and experiment – read ‘observation’ in astronomy – on the same level of sophistication,” Gnedin says. “Pure observational or purely theoretical fields tend to stagnate.”
With a Department of Energy INCITE (Innovative and Novel Computational Impact on Theory and Experiment) award of 65 million processor hours on the Argonne Leadership Computing Facility’s IBM Blue Gene/Q at Argonne National Laboratory, Gnedin and his colleagues are developing their Cosmic Reionization On Computers (CROC) project. CROC’s main mission, Gnedin says, is “to model the signal that telescopes like the JWST will measure, to serve as a theoretical counterpart to the observational program.”
Other teams besides Gnedin’s are developing next-generation cosmic reionization simulations, he says, and the field is progressing well. “I am very optimistic. When the flood of new data comes, theorists will be ready to compare them with similar high-quality models.”
Read more at ASCR Discovery, a website highlighting research supported by the Department of Energy’s Advanced Scientific Computing Research program.
Image caption: A frame from a visualization of the early universe. The brownish opaque fog is neutral hydrogen; the glowing blue regions are dense ionized hydrogen, which becomes transparent where it is less dense; yellow dots are galaxies. Image courtesy of Nickolay Gnedin, Fermi National Accelerator Laboratory.