A brand new map of the universe could challenge Einstein's theory of the cosmos

By Miles Gilroy, SciTech Deputy Editor

Einstein’s greatest blunder

One of the greatest misconceptions in all of science is the idea that gravity is a force. Yes, Isaac Newton was wrong. I realise that contesting arguably the greatest scientist of all time as a mere undergrad is bold, but, fear not, I have Albert Einstein, his general theory of relativity - one of the most robust cosmological theories ever devised - and the entirety of the 21st-century scientific community on my side. Phew!

The reason this misconception is so widespread and so rarely noticed is that gravity behaves exactly like a force - in almost every situation, the two are mathematically indistinguishable. Even in the vast majority of cutting-edge physics experiments, we treat gravity as a force, simply because we can, and because it is by far the easiest way to fold its effects into our maths. In reality, gravity is an effect caused by the curvature of spacetime - the fabric of the universe, if you will. I would like nothing more than to elaborate on that, but I would also like this article to be less than ten pages, so you’re just going to have to trust me.

This theory of gravity was published by Einstein in four separate papers in November 1915, and is most commonly known as general relativity (GR). The fourth and final paper outlined the field equations of GR, which describe how energy and matter curve spacetime. These equations reproduce Newton’s predictions on everyday scales while also explaining observations that Newtonian gravity could not, making GR a remarkably complete cosmological theory.
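For the mathematically curious, the 1915 field equations can be written remarkably compactly (in one common convention):

G_{\mu\nu} = \frac{8\pi G}{c^4} T_{\mu\nu}

The left-hand side, the Einstein tensor G_{\mu\nu}, encodes the curvature of spacetime; the right-hand side, the stress-energy tensor T_{\mu\nu}, encodes the matter and energy doing the curving, with G being Newton’s gravitational constant and c the speed of light.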

However, Einstein made one big error - he assumed the universe was static. In fairness to him, he came up with GR before Edwin Hubble discovered the universe was expanding in 1929. To force his equations to describe a static universe, Einstein had bolted on an extra term - now known as the cosmological constant. In response to Hubble’s discovery, he retroactively removed it, allegedly referring to it as his ‘greatest blunder’.

Now, you may be thinking that this seems like an awfully simple fix for such a gravely false assumption. You would be absolutely correct. Deleting the constant didn’t fix the underlying problem: without it, a static universe cannot contain any matter at all - the equations only balance when the universe is completely empty.

Luckily for Einstein, a Russian physicist by the name of Alexander Friedmann reworked Einstein’s equations under the assumption that the universe could expand, solving these issues. These reworked equations predicted that the expansion should decelerate as the density of the universe fell with its growing volume.
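For the simple case of a flat universe filled with ordinary matter and no cosmological constant, the first Friedmann equation reads:

\left( \frac{\dot{a}}{a} \right)^2 = \frac{8\pi G}{3}\rho

Here a(t) is the ‘scale factor’ tracking the size of the universe and \rho is its density. As the universe grows, \rho falls, and the expansion rate falls with it - hence the prediction of a decelerating expansion.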

So we now have a set of equations that describe the curvature of spacetime in an expanding universe and that have taught us something new about it. Fantastic!

Wrong! Guess what? The expansion of the universe is accelerating.

I guess now we’re just stuck in some sort of sick back and forth with the universe itself and we can’t seem to get anything right. What could possibly make this worse?

Well, as it turns out, in order to account for this acceleration, we needed to reintroduce the cosmological constant. Einstein was right all along, just for the wrong reasons.
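You can see why the constant does the trick from the second Friedmann equation, which governs the acceleration of the expansion (written here for simple pressureless matter):

\frac{\ddot{a}}{a} = -\frac{4\pi G}{3}\rho + \frac{\Lambda c^2}{3}

The first term is the gravity of matter slowing things down; the second is the cosmological constant \Lambda pushing things apart. As the density \rho drops with expansion, the \Lambda term eventually wins and the expansion accelerates - exactly what we observe.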

Image: NASA Hubble

Enter the 21st century

It’s now been a good while since Einstein published the greatest set of equations ever devised, realised they were actually nonsense, and then realised they actually weren’t. Experimental data has reinforced the refined field equations relentlessly and they have stood the test of time. They even survived the discovery of dark matter and dark energy, with the cosmological constant now being seen as the mathematical representation of dark energy.

But the universe isn’t done screwing with us yet. Since its introduction, the cosmological constant has been assumed to be exactly that - constant, hence the name. However, brand new data from the Dark Energy Spectroscopic Instrument (DESI), an experiment creating a 3D map of the universe to track the effect of dark energy, is suggesting otherwise.

Technically, taken in isolation, the DESI data actually agrees with the constant being constant. But when combined with earlier measurements made using other methods - of the cosmic microwave background and of distant supernovae - it suggests that the effect of dark energy is weakening over time, meaning the cosmological constant may actually be a time-dependent variable.
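One common way cosmologists phrase this - and the parameterisation used in analyses like DESI’s - is through the dark energy ‘equation of state’ parameter w, which is allowed to drift with the scale factor a:

w(a) = w_0 + w_a (1 - a)

A true cosmological constant corresponds to w_0 = -1 and w_a = 0 at all times; the combined datasets prefer values that wander away from this, which is what ‘weakening over time’ means in practice.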

This is exciting stuff! We are back to tussling with the universe, and the scientific community has not been buzzing like this for a long time. But we must remain calm. When I say this is brand new data, I mean this is brand new data. It can’t even officially be classified as a discovery yet. An observation is only considered a discovery when the data reaches a statistical significance of five sigma - for reference, one sigma roughly corresponds to a 32 percent chance that the result is a fluke, and three sigma to a 0.3 percent chance. Currently, the significance sits at around four sigma - less than a 0.01 percent chance of a fluke - which is not enough to call it a discovery, but is more than enough to get people talking.
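If you fancy checking those numbers yourself, the sigma-to-probability conversion takes only a couple of lines of standard scientific Python (a quick illustrative sketch, nothing to do with the DESI pipeline itself):

# Convert an n-sigma significance into the two-sided probability of a fluke,
# i.e. the chance that pure noise would produce a result at least this extreme.
from scipy.stats import norm

for n_sigma in (1, 2, 3, 4, 5):
    p_fluke = 2 * norm.sf(n_sigma)  # two-tailed tail probability of a standard normal
    print(f"{n_sigma} sigma -> {p_fluke:.5%} chance of a fluke")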

Even notably cautious scientists are getting on board, with Prof Ofer Lahav of UCL saying ‘we may be witnessing a paradigm shift in our understanding of the universe.’

Scotland’s Astronomer Royal, Prof Catherine Heymans of the University of Edinburgh, has said ‘dark energy may be even weirder than we thought’, which says a lot, considering it already completely baffles us.

Image: Marilyn Sargent/Berkeley Lab

Can we trust this?

Easily the most difficult question in all of science.

Bias is a huge issue when analysing data and interpreting trends. Confirmation bias is defined by Britannica as ‘people’s tendency to process information by looking for, or interpreting, information that is consistent with their existing beliefs.’ This is one of the strongest forms of bias in science. If data is collected that supports current beliefs, it will likely be immediately accepted as accurate with no further scrutiny, while unexpected results are more likely to encourage tweaks to the experiment or interpretation to make them conform to current theories.

To mitigate the effect of confirmation bias, scientists use a method called blinding. The DESI team used the following blinding strategy. Firstly, the real data is shifted to hide its true nature. For example, galaxy positions are shifted along the line of sight to mimic a different, randomly selected cosmological model - the statistical properties of the data are preserved in doing this, so the same analysis can later be applied to the real data. Secondly, the analysis method is refined using this blinded data, which guarantees the refinements are not cherry-picked to agree with current ideas - the blinded data cannot confirm anyone’s expectations, because it no longer describes the real universe. Finally, the refined analysis method is applied to the real data in a process called ‘unblinding’, after which no further changes are allowed.
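To make the idea concrete, here is a deliberately simplified toy sketch of catalogue blinding in Python - the distances of some made-up galaxies are recomputed under a secret, randomly chosen cosmology, so any analysis tuned on the blinded catalogue cannot be nudged towards the expected answer. This is an illustration of the principle only, not the actual DESI pipeline.

import numpy as np
from scipy.integrate import quad

C_KM_S = 299_792.458  # speed of light in km/s

def comoving_distance(z, h0=70.0, omega_m=0.3):
    # Line-of-sight comoving distance (in Mpc) for a flat matter + Lambda universe.
    integrand = lambda zp: 1.0 / np.sqrt(omega_m * (1 + zp) ** 3 + (1 - omega_m))
    integral, _ = quad(integrand, 0.0, z)
    return (C_KM_S / h0) * integral

rng = np.random.default_rng(seed=42)

# A handful of made-up galaxy redshifts standing in for the real catalogue.
redshifts = rng.uniform(0.1, 1.5, size=5)

# Step 1: draw a secret 'blind' cosmology and recompute distances to mimic it.
blind_omega_m = rng.uniform(0.25, 0.35)
true_distances = np.array([comoving_distance(z) for z in redshifts])
blind_distances = np.array([comoving_distance(z, omega_m=blind_omega_m) for z in redshifts])

# Step 2: the analysis is developed and frozen using blind_distances only.
# Step 3: 'unblinding' - the shift is undone and the frozen analysis is rerun on true_distances.
for z, d_true, d_blind in zip(redshifts, true_distances, blind_distances):
    print(f"z = {z:.2f}: true {d_true:7.1f} Mpc, blinded {d_blind:7.1f} Mpc")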

This observation will only become more reliable as time goes on, thanks to the immense attention it has drawn. Attention comes hand in hand with scrutiny: the more people talk about something new, the more others will want to test it with their own instruments, techniques, and perspectives.

This new observation from DESI is extremely exciting in that it questions some of our most fundamental theories of the cosmos and proves to us that science never stops advancing. It also highlights some interesting questions around the credibility of observations and over-hyping by the media. It is safe to say, however, that this is an observation to be taken seriously. Given the statistical significance of the data, this is one of the rare occasions where the media coverage is apt. I cannot wait to see how this progresses as more data is gathered.


Featured image: Dark Energy Spectroscopic Instrument