|The obligatory conference photo. The photographer spoke to us in Mandarin. I think what he was trying to say was "more intensity".|
Just over a week ago I was at the annual COSMO conference. This year's host was Beijing. I had originally intended to live blog this event, but the Great Wall of China (alternative link) managed to prevent that entirely.
What follows are some reflections on the scientific bits and pieces people presented at the conference that I happened to find interesting. It might be a bit technical, but please ask questions if I use jargon you don't understand. Also, if you're an expert and I write something you want to comment on, please do (especially if something I write is misleading or just plain wrong).
The topics I've chosen below just happen to be what I found memorable. I made no attempt to choose these topics by any sort of theme. I apologise if I've missed anything particularly interesting. Perhaps if you were there and think I missed out something interesting you can either mention it in the comments or write a guest post for us.
Neutrinos and precision cosmology
|One of the first images captured by the Dark Energy Survey. The more interesting images it will take will be of very distant galaxies and won't look anywhere near as nice. This one is just for people to put in their blogs.|
Jan Hamann gave a talk on the future constraints that cosmology will provide for neutrino physics. I was pleasantly surprised by the power of large scale structure probes, such as Euclid.
We know from neutrino oscillation experiments that two of the neutrinos differ in mass by more than 0.06 electron volts. This means that the heaviest neutrino, whatever its absolute mass, must itself be heavier than 0.06 electron volts.
What is nice is that cosmological probes are sensitive to neutrino masses. Massive neutrinos affect the expansion rate of the universe at earlier times. This affects which scales we expect to see the likes of baryon acoustic oscillations (BAO) at. Massive neutrinos also affect the formation of structures in the universe. The more massive the neutrinos are, the more they contribute to the total mass of the universe. However, because neutrinos are very light they cluster much less on small scales than heavier dark matter – essentially, they're moving too fast to be captured by structures that are too small. On the larger scales though, neutrinos will cluster just as much as ordinary dark matter. This will therefore be seen as a suppression of the growth of structure on a characteristic scale, which will be determined by the sum of the masses of the neutrinos.
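To get a rough feel for the size of this suppression, here is a back-of-the-envelope sketch (my own, not from the talk) using two standard approximations: the neutrino density parameter satisfies Omega_nu h^2 ≈ (sum of masses)/93.14 eV, and on small scales the matter power spectrum is suppressed by roughly Delta P / P ≈ -8 f_nu, where f_nu = Omega_nu / Omega_m. The cosmological parameter values below are illustrative assumptions.

```python
# Rough sketch of the small-scale suppression of the matter power
# spectrum by massive neutrinos, using the standard approximations
#   Omega_nu * h^2 = (sum of neutrino masses) / 93.14 eV
#   Delta P / P  ~  -8 * f_nu,  with  f_nu = Omega_nu / Omega_m
# The default Omega_m and h are illustrative, not measured values.

def power_suppression(sum_masses_eV, omega_m=0.3, h=0.7):
    """Fractional suppression of small-scale matter power."""
    omega_nu = sum_masses_eV / (93.14 * h**2)  # neutrino density parameter
    f_nu = omega_nu / omega_m                  # neutrino fraction of matter
    return -8.0 * f_nu

# The minimal 0.06 eV sum suppresses power by a few percent for
# these parameters; a 0.6 eV sum suppresses it ten times as much.
for total_mass in (0.06, 0.3, 0.6):
    print(total_mass, power_suppression(total_mass))
```

Even the minimal allowed neutrino mass leaves a percent-level imprint, which is why surveys like Euclid, measuring structure to sub-percent precision, can hope to detect it.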
Currently, the tightest constraint on the sum of neutrino masses from cosmology is that it must be less than 0.6 electron volts.
According to the results presented in Jan's talk, based on work he has completed with collaborators, Euclid will measure BAO and structure growth well enough that cosmology will definitely be able to measure the sum of the neutrino masses, rather than just place upper limits on it. It's hard to think of a clearer example of how cosmology will be able to constrain fundamental physics just as well as, or better than, direct particle physics experiments.
We might not need to wait for Euclid for detections either. If the neutrinos are a little more massive than that 0.06 electron volt lower limit then the Dark Energy Survey, which coincidentally began taking measurements during COSMO, will beat Euclid to the detection.
So, keep an eye out for future neutrino measurements from cosmology.
Parameterising modified gravity
Pedro Ferreira gave a nice plenary talk on modified gravity. There are as many theories of modified gravity as there are papers on the arXiv. If there is a way to play with changing general relativity then somebody, at some point, has probably speculated about it.
Each of these modifications has its own theoretical motivation and its own particular ways in which it would cause the universe to look different from what general relativity predicts. On large enough scales we can use perturbation theory to calculate exactly what a modified gravity model will predict for cosmological measurements. However, on small enough distance scales, cosmology gets messier. The best way we have to deal with that in the standard cosmological model is to run enormous simulations of the universe.
However these simulations are very expensive and take weeks to run, or longer. Moreover, they run for only one set of parameters, even in the standard model. How the universe would change if any of those parameters were to change must be inferred. For modified gravity models this is a nightmare. How can we know what a modified gravity model predicts for small scale structure without running a full simulation? And how can we run simulations for every single modified gravity model and all of its parameterisations?
What Pedro and his collaborators have been trying to do is construct a very general parameterisation of gravitational phenomena that can encompass all of these models, or at least as many as possible. They've called it the Parameterised Post-Friedmann framework in this paper.
The motivation behind this framework is pretty simple. It is intended to be used as an intermediate step between modified gravity models and observations. Essentially, if you're a theorist, you can calculate quite simply how your modified gravity model fits into the framework and thus see what your model predicts and maybe even whether it has been ruled out already. If you're an observer with interesting new data you don't need to pick one individual modified gravity model and calculate how your model constrains it, instead you can choose to constrain the parameters of the PPF framework. If the data does require some sort of new physics, then the PPF framework will see this and hopefully tell the observer what types of models could have produced the new physics.
I'm far from an expert on modified gravity models so I don't know how well it does encompass the variety of possibilities of new physical phenomena that modified gravity models can produce, nor do I know whether it will be possible to make simulations that are able to show even what the PPF framework predicts for the very small scales, nor do I know how well data will be able to constrain the parameters in the framework. But it seems like a new, interesting and very well-motivated idea for how to solve a very real problem in fundamental physics.
So keep an eye out for the PPF framework and its future applications to modified gravity models.
Is the South Pole Telescope about to announce a discovery?
|The three plots on the left of this figure show the constraints on the number of effective neutrinos. The dotted line is the standard prediction. It is tantalisingly close to being ruled out at high significance.|
One of the most interesting talks was by Scott Dodelson on the cosmic microwave background. In the very near future the Planck satellite will be unveiling all of its measurements of the CMB and cosmologists will go a bit crazy for a few days/weeks/months, but most of Dodelson's talk was not about Planck at all.
Planck is a satellite that is measuring the CMB from space. However, right now two telescopes in the two driest places on Earth are also measuring the CMB, and poor Planck runs the risk of getting seriously scooped. These are the South Pole Telescope and the Atacama Cosmology Telescope. The problem for SPT and ACT is that they can only measure a patch of the CMB, whereas Planck can measure the entire sky.
Since WMAP first measured the acoustic peaks of the CMB there has been a mild tension between the measurements of the CMB and the amount of non-photonic radiation we expected to be present when the CMB was formed. This radiation is customarily quantified by the number of effective neutrino species because, whatever the masses of the neutrinos, when the CMB formed their masses would have been negligible compared to their kinetic energies. This has a measurable effect on the anisotropies that are imprinted into the CMB at the surface of last scattering.
There seems to have been slightly too much neutrino-like radiation at this time. The most interesting scales for probing the effects extra neutrino-like radiation would have are exactly those that SPT and ACT are starting to probe and that Planck will unveil next year. An independent analysis from earlier in the year studied the WMAP, SPT and ACT data and found that there is greater than a 95% chance that the three neutrinos of the standard model do not produce enough radiation for the CMB to look like it does.
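For concreteness (this is textbook background, not something from the talk): after electron-positron annihilation the neutrinos are slightly colder than the photons, so each effective neutrino species contributes a fraction (7/8)(4/11)^(4/3) ≈ 0.227 of the photon energy density. A quick sketch of how much total radiation a given N_eff implies:

```python
# The total radiation density at CMB formation, in units of the
# photon density, is
#   rho_rad / rho_gamma = 1 + (7/8) * (4/11)**(4/3) * N_eff
# so each extra "effective neutrino" adds ~22.7% of the photon density.

def radiation_boost(n_eff):
    """Total radiation density in units of the photon density."""
    per_species = (7.0 / 8.0) * (4.0 / 11.0) ** (4.0 / 3.0)  # ~0.227
    return 1.0 + per_species * n_eff

standard = radiation_boost(3.046)  # standard-model value
extra = radiation_boost(4.0)       # one extra dark-radiation species
print(standard, extra, (extra - standard) / standard)
```

One extra species is roughly a thirteen-percent increase in the total radiation density at that time, which is what shifts the damping tail of the CMB enough for SPT and ACT to notice.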
Dodelson showed the plot below from a forthcoming SPT paper, which includes their latest data release, and alluded to the idea that SPT will further increase the probability that this extra radiation exists. If it is verified, the popular name for this radiation will be "dark radiation". So, it appears that the duopoly of dark matter and dark energy may be on the verge of becoming a triumvirate.
|Readers of this blog will have seen the left hand side of this image before. CMB cosmology has made great leaps over the last few years. You might ask what will be left for Planck.|
Watch this space very closely. When the SPT paper comes out I will try to write a brief summary.
Dark energy clustering
Ole Eggers Bjælde gave a nice talk on dark energy clustering. In the usual cosmological model dark energy is completely constant, in both time and space. Some modifications to this model allow dark energy to interact (either gravitationally, or through other forces) with dark matter and ordinary matter. This would necessarily mean that it is not purely constant.
Rather than deal with any theoretical implications of such models, Ole and his collaborators were studying the observational consequences. In particular, they were looking at the effects on the abundances of galaxy clusters. They used just two parameters beyond the usual cosmological model: the equation of state of dark energy (essentially, a measure of how dark energy changes with time) and the speed of sound waves in the dark energy.
The talk and his work were certainly both interesting, but what particularly caught my eye was the possibility that dark energy clustering could be responsible for the “ISW mystery” that I have written about here extensively. Ignoring any theoretical issues, if dark energy clusters along with matter then the ISW effect will also be enhanced in the more clustered regions, which is precisely what this observation saw.
Could it be that the particular measurement I described as an ISW mystery is the first detection of the physical effects of the clustering of dark energy? Watch this space too.
The very, very small scales
There were two talks about the interesting, but ambitious, idea of probing the primordial fluctuations of the density of the universe on very, very small scales. And by small, I mean tiny. I mean Earthly scales, which by cosmological standards are minuscule.
One of the problems we face when trying to observe the universe is that the farthest back we can see is the CMB. However, the event that formed the CMB already took place around 400,000 years after the big bang. By then, any of the primordial fluctuations on extremely small scales had already been completely wiped out.
However, if the primordial fluctuations on very small scales were very large they would have caused things to happen before then that might still be observable now. The big problem with this is that those primordial fluctuations would have had to be very large compared to how large the fluctuations were on the larger scales that we have measured.
Nonetheless, why not check, just in case?
Christian Byrnes gave a talk about some interesting work he has done with collaborators looking at the effects we would expect if the primordial fluctuations on these small scales were both large and didn't follow a Gaussian distribution (which until now had always been assumed). The particular observable he looked at is primordial black holes (PBHs). If the fluctuations were big enough, PBHs would have formed just after the big bang. Black holes decay into radiation over time, and the smaller they are the faster they decay. Therefore, constraints on the initial abundance of PBHs depend on their size. For example, very small ones would have decayed very quickly and so are constrained by how much excess radiation there was early in the universe, whereas very large ones would still be around now and would act like dark matter, so their existence is constrained by the density of matter in the universe.
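To see why the constraints depend so strongly on the PBH mass, here is a quick sketch (mine, not from the talk) of the simplest Hawking-evaporation estimate, tau = 5120 pi G^2 M^3 / (hbar c^4). This ignores greybody factors and the number of particle species the hole can radiate into, so the numbers are only order-of-magnitude.

```python
import math

# Simplest estimate of a black hole's Hawking evaporation time:
#   tau = 5120 * pi * G**2 * M**3 / (hbar * c**4)
# The cubic dependence on mass is the key point: lifetime spans
# dozens of orders of magnitude across the possible PBH masses.

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
HBAR = 1.0546e-34    # reduced Planck constant, J s
C = 2.998e8          # speed of light, m/s
AGE_OF_UNIVERSE = 4.35e17  # seconds, ~13.8 billion years

def evaporation_time(mass_kg):
    """Hawking evaporation time in seconds (simplest estimate)."""
    return 5120.0 * math.pi * G**2 * mass_kg**3 / (HBAR * C**4)

# In this estimate a PBH of ~2e11 kg evaporates on roughly the age
# of the universe; lighter ones went off long ago (and are constrained
# by the early radiation budget), heavier ones are still around (and
# are constrained by the matter density).
for mass in (1e9, 2e11, 1e12):
    print(mass, evaporation_time(mass) / AGE_OF_UNIVERSE)
```

The dividing line near 10^11-10^12 kg is why the two constraint regimes described above are so different.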
What Chris found is that a positive skewness would enhance the expectations for PBHs. This wasn't surprising but it is difficult to use this to constrain the overall skewness because the abundance is also strongly dependent on the variance of the primordial fluctuations. More interestingly perhaps, they also found that the existence of even a very small negative skewness in the primordial fluctuations would be completely ruled out by the detection of PBHs. Essentially, no matter how big the variance gets, a small amount of negative skewness would remove the possibility of PBHs. PBHs haven't been detected, and may never be, so I won't say watch this space, but if they do get detected then suddenly works like this will become very relevant.
|Schematic of DECIGO. If a gravitational wave passes through the space between the mirrors the light will travel a slightly different distance. Then the photo-detectors should see an interference pattern.|
Laila Alabidi also gave an interesting talk on cosmological probes of these very small scales. Her probe was the amplitude of gravitational waves on these very small scales. Thankfully, unlike light, gravitational waves interact incredibly weakly. Of course this is a curse when we try to measure them, but it also means that gravitational waves created even before the formation of the CMB are still around propagating almost freely today. These gravitational waves carry within themselves information from extremely early times in our universe's history. And, one possible source of gravitational waves would be very large fluctuations in the density of the early universe. In other words, exactly the sort of initial conditions that generate PBHs will also generate gravitational waves. What's more, if the fluctuations were large enough, they would be detectable by future gravitational wave experiments (such as LISA and DECIGO).
Laila and collaborators looked at specific models of inflation (inflation is the most popular mechanism for generating the initial conditions of the big bang) that produce large fluctuations on small scales and found that many of them would have a measurable signal.
The caveat is again that there is no reason to expect these inflationary models are correct. No evidence of PBHs or the gravitational waves that would come with them has been found. I'm not going to say watch this space. But I like the idea of looking for signals in unexpected places because if one shows up even its mere existence will be interesting. What I will say is remain aware of this space, just in case.
I was fascinated by many talks about 21cm radiation from hydrogen during "the dark ages". The dark ages were the period after the CMB formed and before the first luminous structures started to form. I don't know much about 21cm radiation. However, the feeling I got from these talks is that, while it is incredibly difficult to measure, the wealth of information it holds about the universe is bigger than the CMB and nearby structures combined! Maybe they were over-selling the case, but I was impressed.
If you're a physicist and you've read this, feel free to point out the things I've not explained well and/or what else you thought was interesting about COSMO.
And, if you know about 21cm radiation, would you like to write a guest post?