Galaxies forming stars at extreme rates nine billion years ago were more efficient at converting gas into stars than average galaxies today, researchers have found.

The majority of galaxies are believed to lie on a “main sequence,” where the larger a galaxy’s mass, the higher its rate of forming new stars.

However, every now and then a galaxy will undergo a burst of star formation that makes it shine more brightly than its peers.

A collision between two large galaxies is usually the trigger for such a starburst phase, with the cold gas in giant molecular clouds serving as the fuel that sustains these high rates of star formation.

The question astronomers have been asking is whether such starbursts in the early universe were the result of an overabundant gas supply, or of galaxies converting their gas into stars more efficiently.
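In more quantitative terms (these definitions are standard in the field rather than quoted from the article), the distinction can be expressed through the star formation efficiency and its inverse, the gas depletion time:

\[
  \mathrm{SFE} \equiv \frac{\mathrm{SFR}}{M_{\mathrm{gas}}},
  \qquad
  t_{\mathrm{dep}} \equiv \frac{M_{\mathrm{gas}}}{\mathrm{SFR}} = \frac{1}{\mathrm{SFE}}
\]

Here SFR is the star formation rate and M_gas the mass of molecular gas. A starburst fed by an overabundant gas supply would have a large M_gas but an ordinary efficiency, whereas a galaxy converting its gas more efficiently would show a high SFE and a correspondingly short depletion time.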

A new study led by John Silverman at the Kavli Institute for the Physics and Mathematics of the Universe examined the carbon monoxide (CO) gas content of seven starburst galaxies so distant that they are seen as they were when the universe was only four billion years old.

This was made possible by the advent of the Atacama Large Millimeter Array (ALMA), located on a high plateau in Chile, whose antennas work in tandem to detect electromagnetic waves at millimeter wavelengths (pivotal for studying molecular gas) with a sensitivity that astronomers are only beginning to exploit.

The researchers found that the amount of CO-emitting gas was already diminished even though the galaxies continued to form stars at high rates.

These observations are similar to those recorded for starburst galaxies near Earth today, but the rate of gas depletion was not quite as rapid as expected.

This led the researchers to conclude that star formation efficiency may increase continuously with how far a galaxy’s star formation rate lies above the main sequence.

The study was published in the Astrophysical Journal Letters.
