Like my friend and colleague Peter Coles, I am just returned from the fine wine-soaked dinner for the workshop “Cosmology and Astroparticle physics from the LHC to PLANCK” held at the Niels Bohr Institute in Copenhagen. It is an honor to visit the place where so many discoveries of 20th Century physics were made, and an even greater honor to be able to speak in the same auditorium as many of the best physicists of the last hundred years.
(You can see the conference photo here; apparently, the trumpet is just the latest in a long series.)
I talked about the most recent results from the Planck Satellite, gave an overview of the state of the art of (pre-Planck) measurements of the Cosmic Microwave Background, and found myself in what feels like the unlikely role of mouthpiece for a large (and therefore conservative) community, basically putting forward the standard model of cosmology: a hot big bang with dark matter, dark energy, and inflation — a model that requires not one, not two, but (at least) three separate additions to the particles and fields we know from terrestrial observations: one to make up the bulk of the mass of today’s Universe, another to make the Universe accelerate in its expansion today, and another to make it accelerate at early times. It would sound absurd if it weren’t so well supported by observations.
(See below for an update.)
In one of the more bizarre meta-experiments that have come out of the latter-day social web, Trieste astrophysicist Paolo Salucci is trying to use Facebook to spread some astrophysics, not to the public, but within the astronomical community.
Specifically, he’s trying to “eliminate the deep-routed [sic] wrong misconception [sic] of Flat Rotation Curves of Spiral Galaxies”. A little scientific background: rotation curves are simply a measurement of how fast the stars are moving around the centers of their host galaxies (plotted as a function of distance from the center to make a curve). If gravity is responsible for the motion of the stars, we can use the curve to determine the amount of mass in the galaxy. And when we do this, we find that there appears to be much more mass than the luminous matter — the stars — can account for. This is one of the strongest pieces of evidence for dark matter.
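To put a rough number on that, here is a toy back-of-the-envelope calculation (my own illustration in Python, nothing to do with Salucci’s actual analysis; the stellar mass, scale radius and the 200 km/s figure are made up purely for the example). Newtonian gravity gives a circular speed v = √(GM(<r)/r), so a rotation curve that stays roughly flat out to large radii requires an enclosed mass growing linearly with radius — far more than the stars supply:

```python
# Toy illustration of the rotation-curve argument (invented numbers, not a fit).
import numpy as np

G = 4.30091e-6  # Newton's constant in kpc (km/s)^2 / Msun

def v_circ(r_kpc, m_enclosed_msun):
    """Circular speed in km/s at radius r_kpc given the mass enclosed within it."""
    return np.sqrt(G * m_enclosed_msun / r_kpc)

r = np.linspace(1.0, 30.0, 100)              # radii in kpc
m_stars = 5e10 * (1.0 - np.exp(-r / 3.0))    # toy stellar mass profile: converges at large r
v_stars = v_circ(r, m_stars)                 # falls off once most of the light is enclosed

# To keep the curve flat at ~200 km/s, the enclosed mass must keep growing linearly
# with radius; the difference from the stellar mass is what we attribute to dark matter.
m_needed = 200.0**2 * r / G

print(f"stars alone give v(30 kpc) = {v_stars[-1]:.0f} km/s")
print(f"extra (dark) mass implied within 30 kpc: {m_needed[-1] - m_stars[-1]:.1e} Msun")
```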
Salucci has a fairly specific axe to grind: the evidence is often caricatured as “flat rotation curves”. However, when considered in detail, the rotation curves are not completely flat, but do indeed seem to rise and (in the rare cases we can measure far enough out from the center) fall. More specifically, galaxy rotation curves do appear to take a very simple form, each of them being one of a very limited family of possibilities. This is much less variation than might have been naively surmised, but does seem to be borne out by massive numerical simulations of the Universe (including dark matter) as well as the observations.
Nonetheless, I think Salucci misses the point (or perhaps I miss his): indeed we do say “galaxies have flat rotation curves” but this meme (let’s not call it a “wrong misconception”) isn’t about the detailed shape of the rotation curves — rather, it is shorthand for “galaxies are dominated by dark matter”. Yes, we probably should be more precise in our language, but I don’t think we are spreading quite as gross a misconception as Salucci (who works directly in this field and so is admittedly more attuned to it than I) worries.
Of course the real interest in this experiment may just be the attempt to use a consumer social network to foster real discussion (or right thinking) within a specialized and technical community. Sarah Kendrew has an excellent dissection of the methodological side of Salucci’s attempt: can we actually measure how big a problem this “wrong misconception” is? How quickly should we expect Facebook to solve this problem? Would this be a good or a bad thing, outside of the usual professional channels of peer-review and conferences? I completely agree with Sarah that this experiment per se may not teach us much, but that the broader presence of professional astronomers, and scientists more generally, in the world of the web has already begun to prove itself useful as a tool for communication to the public and within the professional community.
Update: I had a very nice telephone discussion with Paolo Salucci today. I just want to re-emphasize the point that he is, indeed, right about the facts of the case: rotation curves are not flat (hence the name of the group), and moreover (and more subtly, which is the rub) it is exactly the rising and falling shape of these curves that makes the standard cosmological explanation of dark matter more compelling (and a just plain better fit to the data) than, say, alternative gravity theories such as MOND and its more theoretically coherent variants like TeVeS.
Luckily, not all the astrophysics news this week was so bad.
First, and most important, two of our Imperial College Astrophysics postgraduate students, Stuart Sale and Paniez Paykari, passed their PhD viva exams, and so are on their ways to officially being Doctors of Philosophy. Congratulations to both, especially (if I may say so) to Dr Paykari, whom I had the pleasure and fortune to supervise and collaborate with. Both are on their way to continue their careers as postdocs in far-flung lands.
Second, the first major results from the Herschel Space Telescope, Planck’s sister satellite, were released. There are impressive pictures of dwarf planets in the outer regions of our solar system, of star-forming regions in the Milky Way galaxy, of the very massive Virgo Cluster of galaxies, and of the so-called “GOODS” (Great Observatories Origins Deep Survey) field, one of the most well-studied areas of sky. All of these open new windows into these areas of astrophysics, with Herschel’s amazing sensitivity.
Finally, tantalisingly, the Cryogenic Dark Matter Search (CDMS) released the results of its latest (and final) effort to search for the Dark Matter that seems to make up most of the matter in the Universe, but doesn’t seem to be the same stuff as the normal atoms that we’re made of. Under some theories, the dark matter would interact weakly with normal matter, in such a way that it could possibly be distinguished from all the possible sources of background. These experiments are therefore done deep underground — to shield from cosmic rays which stream through us all the time — and with the cleanest and purest possible materials — to avoid contamination with both naturally-occurring radioactivity and the man-made kind which has plagued us since the late 1940s.
With all of these precautions, CDMS expected to see a background rate of about 0.8 events during the time they were observing. And they saw (wait for it) two events! This is on the one hand more than a factor of two greater than the expected number, but on the other is only one extra count. To put this in perspective, I’ve made a couple of graphs where I try to approximate their results (for aficionados, these are just simple plots of the Poisson distribution). The first shows the expected number of counts from the background alone:


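If you want to reproduce that sort of plot yourself, it only takes a few lines of Python. This is just my own sketch using the quoted background rate of 0.8 events, not anything from the CDMS analysis pipeline:

```python
# The Poisson probability of seeing n events when the expected background is 0.8.
import numpy as np
from scipy.stats import poisson
import matplotlib.pyplot as plt

background = 0.8            # expected background counts quoted by CDMS
n = np.arange(0, 8)

plt.bar(n, poisson.pmf(n, background), color="grey")
plt.xlabel("observed counts")
plt.ylabel("probability (background only)")
plt.title("Poisson distribution, mean 0.8")
plt.show()

# The chance of seeing 2 or more events from background alone:
print("P(n >= 2 | b = 0.8) =", poisson.sf(1, background))  # roughly 0.19
```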
(I should point out a few caveats in my micro-analysis of their data. First, I don’t take into account the uncertainty in their background rate, which they say is really 0.8±0.1±0.2, where the first uncertainty, ±0.1 is “statistical”, because they only had a limited number of background measurements, and the second, ±0.2, is “systematic”, due to the way they collect and analyse their data. Eventually, one could take this into account via Bayesian marginalization, although ideally we’d need some more information about their experimental setup. Second, I’ve only plotted the likelihood above, but true Bayesians will want to apply a prior probability and plot the posterior distribution. The most sensible choice (the so-called Jeffreys prior) for this case would in fact make the probability peak at zero signal. Finally, one would really like to formally compare the no-signal model with a signal-greater-than-zero model, and the best way to do this would be using the tool of Bayesian model comparison.)
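For the curious, here is the kind of calculation I have in mind, again only in sketch form: the Jeffreys-type prior written below is my assumption about the exact form, and I am ignoring the quoted background uncertainties that a full marginalization would fold in.

```python
# Sketch of the likelihood for an extra signal s on top of the CDMS background
# (my own illustration; the collaboration's full analysis is more involved).
import numpy as np
from scipy.stats import poisson

n_obs = 2        # observed events
b = 0.8          # expected background (its +/-0.1 and +/-0.2 uncertainties ignored here)

s = np.linspace(0.0, 10.0, 500)             # candidate signal rates
likelihood = poisson.pmf(n_obs, b + s)      # P(n_obs | s); peaks at s = n_obs - b

# An assumed Jeffreys-type prior ~ 1/sqrt(b + s) for the Poisson mean.
prior = 1.0 / np.sqrt(b + s)
posterior = likelihood * prior
posterior /= np.trapz(posterior, s)         # normalize over the plotted range

print("likelihood peaks at s =", s[np.argmax(likelihood)])
print("posterior peaks at s =", s[np.argmax(posterior)])
```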
Nonetheless, in their paper they go on to interpret these results in the context of particle physics, which can eventually be used to put limits on the parameters of supersymmetric theories which may be tested further at the LHC accelerator over the next couple of years.
I should bring this back to the aforementioned bad news. The UK has its own dark matter direct detection experiments as well. In particular, Imperial leads the ZEPLIN-III experiment, which has, at times, had the world’s best limits on dark matter, and is poised to possibly confirm this tentative detection — this will be funded for the next couple of years. Unfortunately, STFC has decided that the next generation of dark matter experiments, EURECA and LUX-ZEPLIN — which would be needed to make convincing statements about these results — cannot be funded.
PAMELA (Payload for Antimatter Matter Exploration and Light-nuclei Astrophysics) is a Russian-Italian satellite measuring the composition of cosmic rays. One of the motivations for the measurements is the indirect detection of dark matter — the very-weakly-interacting particles that make up about 25% of the content of the Universe (with, as I’m sure you all know by now, normal matter about 5% and the so-called Dark Energy the remaining 70%). By observing the decay products of the dark matter — with more decay occurring in the densest locations — we can probe the properties of the dark particles. So far, these decays haven’t been unequivocally observed. Recently, however, members of the PAMELA collaboration have been out giving talks, carefully labelled “preliminary”, showing the kind of excess cosmic ray flux that dark matter might be expected to produce.
But preliminary data is just that, and there’s a (usually) unwritten rule that the audience certainly shouldn’t rely on the numerical details in talks like these. Cirelli & Strumia have written a paper based on those numbers, “Minimal Dark Matter predictions and the PAMELA positron excess” (arXiv:0808.3867), arguing that the data fits their pet dark-matter model, so-called minimal dark matter (MDM). MDM adds just a single type of particle to those we know about, compared to the generally-favored supersymmetric (SUSY) dark matter model which doubles the number of particle types in the Universe (but has other motivations as well). What do the authors base their results on? As they say in a footnote, “the preliminary data points for positron and antiproton fluxes plotted in our figures have been extracted from a photo of the slides taken during the talk, and can thereby slightly differ from the data that the PAMELA collaboration will officially publish” (originally pointed out to me in the physics arXiv blog).
This makes me very uncomfortable. It would be one thing to write a paper saying that recent presentations from the PAMELA team have hinted at an excess — that’s public knowledge. But a photograph of the slides sounds more like amateur spycraft than legitimate scientific data-sharing.
Indeed, it’s to avoid such inadvertent data-sharing (which has happened in the CMB community in the past) that the Planck Satellite team has come up with its rather draconian communication policy (which is itself located in a password-protected site): essentially, the first rule of Planck is you do not talk about Planck. The second rule of Planck is you do not talk about Planck. And you don’t leave paper in the printer, or plots on your screen. Not always easy in our hot-house academic environments.
Update: Bergstrom, Bringmann, & Edsjo, “New Positron Spectral Features from Supersymmetric Dark Matter - a Way to Explain the PAMELA Data?” (arXiv:0808.3725) also refers to the unpublished data, but presents a blue swathe in a plot rather than individual points. This seems a slightly more legitimate way to discuss unpublished data. Or am I just quibbling?
Update 2: One of the authors of the MDM paper comments below. He makes one very important point, which I didn’t know about: “Before doing anything with those points we asked the spokeperson of the collaboration at the Conference, who agreed and said that there was no problem”. Essentially, I think that absolves them of any “wrongdoing” — if the owners of the data don’t have a problem with it, then we shouldn’t, either (although absent that I think the situation would still be dicey, despite the arguments below and elsewhere). And so now we should get onto the really interesting question: is this evidence for dark matter, and, if so, for this particular model? (An opportunity for Bayesian model comparison!?)
Starting tomorrow, you’ll be able to sign up with MI5 to receive an email notice when the “Threat Level” changes. Right now it’s “severe”, but they have the fine-grained menu of “low”, “moderate”, “substantial”, “severe” and “critical” to choose from — we certainly need that much more detail compared to the meagre green/yellow/red of the US system that everyone checks each and every morning. In a grotesque act of fearmongering, MI5 use a picture from the September 11 wreckage of the World Trade Center on their page outlining “The Threats”.
But maybe we need a new level for “smelly”, like New York?
[Yes, I know there has been big cosmology news today, but in this twenty-four-hour science-blogging culture everyone else, like Sean, Clifford and Steinn, has already posted the lovely pictures, and fine explanations, of the dark matter distribution.]
Anyone reading this blog has doubtless heard about the results announced a few weeks ago, observations of the “bullet cluster” claimed (in the title of the paper) to be “A direct empirical proof of the existence of dark matter.” (The basic idea is recounted better, and with prettier pictures, than I can do here by Sean in Cosmic Variance, and in their own press release.)
The bullet cluster is actually a pair of galaxy clusters that have recently slammed into one another. We call them “galaxy clusters” but in fact the galaxies themselves are a relatively small fraction of their mass. The rest is hot gas — shining in the x-rays — and, we think, dark matter. When the two clusters plowed into each other, the galaxies themselves, and the dark matter, just passed through, interacting only through gravity, but the gas actually collides, heating up in what is called a shock front. So we can easily observe that the galaxies, observed with optical telescopes, aren’t quite aligned with the gas, observed with the Chandra X-Ray satellite. Over the last decade it’s become possible to observe the mass distribution of clusters directly, using the technique of weak lensing. And it seems that the mass is aligned with the galaxies, not with the gas — much more mass, and more smoothly distributed, than the galaxies themselves. The argument rests on the idea that alternatives to dark matter, such as Modified Newtonian Dynamics (MOND), would have the mass exactly tracing the light, specifically the x-ray-emitting gas. So: it must be dark matter. Case closed.
Well, sort of.
The simplest versions of MOND were always known to be too simple to apply to the largest scales such as clusters and the Universe as a whole (i.e., cosmology). But more recently, Bekenstein has created a “relativistic” version of the theory which, Skordis and collaborators have shown, reproduces at least some cosmological observations. This theory, known as TeVeS (for Tensor, Vector, Scalar gravity) is hardly simpler than a theory with Dark Matter; as the name implies to physicists, it requires a vector and a scalar field in addition to the metric tensor that characterizes Einstein’s relativity. These fields don’t weigh much — they’re not the dark matter — but the forces that they implicitly exert mimic its effects.
The vector field, in particular, can have unexpected repercussions beyond this: it’s a vector, an arrow, which means it has a direction. It breaks the symmetry of the situation, and could, in specific circumstances, displace some gravitational effects from others (the lensing from the light, for example, perhaps in exactly the same way as dark matter — a distinction without a difference?). No one has performed the required calculations yet, but other groups have argued that other possibilities, such as Moffat’s MOG and massive neutrinos combined with these MOND-like theories, could in principle explain offsets between lensing and light. Indeed, they point out that MOND-like theories have already had a problem explaining observations of clusters, in which the details of the mass distribution have rarely lined up with the light. (See this Cosmocoffee discussion for more, from the authors of these papers themselves.)
These responses point to the real worry here: any single observation can be refuted. (Worse, of course, is the simple fact that lots of observations are wrong, and it’s impossible to know at the time which ones.) Moreover, such an argument from a single case is the same tactic used by those offering weak evidence against the current paradigms (not to mention the real crackpots). Despite claims to the contrary, science does not progress by simple falsification, by the single case that brings down the old paradigm. Extraordinary claims require extraordinary evidence. Back when it was first suggested by Zwicky in the 1930s, the existence of dark matter was an extraordinary claim. Now, dark matter, especially the weakly interacting massive particles that seem to be a natural corollary to supersymmetric theories in particle physics, is a much more parsimonious explanation of the various observations of galaxies, clusters and the cosmos as a whole than these baroque theories. Refuting its existence would certainly be extraordinary — but so would declaring the issue completely finished, at least until its definitive non-gravitational detection.
I spent the early part of the week in Sheffield at the first meeting of the Institute of Physics Astroparticle Physics Group. There were talks on the search for the Dark Matter, gravitational waves, neutrino astrophysics, gamma-ray astrophysics, and, of course, cosmology. All of this sometimes goes by the name “non-accelerator particle physics”: trying to learn about the basic constituents of the Universe in ways other than smashing particles against one another at high speeds in a particle accelerator. Aside from the excellent science, we had the conference dinner at the Kelham Island Museum, which had some great machines from Sheffield’s industrial past (the UK’s equivalent of Pittsburgh).
(An aside: I’ll go along with Clifford’s observations that the UK is woefully, frighteningly, ridiculously expensive. It cost me £120 to get between London and Sheffield.)
Another distraction from the science was a discussion of the recent announcement that the UK government wants to reorganize the funding of both particle physics and astrophysics, especially as they relate to so-called “large facilities”, such as big telescopes and the aforementioned particle accelerators. The heads of the current Particle Physics and Astronomy Research Council (PPARC) and the Council for the Central Laboratory of the Research Councils (CCLRC) got together last week to respond to the government and endorse the possible changes, and to nudge them in some specific directions. The worry, however, is that the progeny of these two councils would embody the worst of both worlds: big, unwieldy, badly managed, and driven more by industrial applications than science. Indeed, the Royal Astronomical Society has produced its own response to the government, and reacted somewhat warily to the PPARC/CCLRC paper: science must remain at the forefront, not the facilities.
The RAS is now headed by Michael Rowan-Robinson, my colleague at Imperial. Another Imperial colleague, Professor Sir John Pendry, has been in the news recently for his part in developing “metamaterials” with weird properties such as the ability to bend light around them, making them appear invisible (but invisibility to radar — stealth technology — is more likely to be developed before one that could make someone invisible in visible light).
This morning I woke to find that most of the UK media outlets (like the Guardian and the BBC) were carrying a story that astronomers, in particular a team led by Gerry Gilmore of Cambridge’s Institute of Astronomy, had unlocked some of the secrets of Dark Matter, the mysterious stuff that makes up, we think, 30% of the total mass of the universe, a substantially greater proportion than the atoms and molecules that we interact with every day.
It seems that the team has measured the velocity dispersion of stars in dwarf galaxies in the local group — small galaxies near our own Milky Way that, we think, are dominated not by the stars and gas that we can see, but by the dark matter. The motions of these stars are caused by their gravitational attraction to that dark matter, however it is spread through the galaxy, and so by measuring the speeds of these stars we can map out the distribution of matter.
It was originally expected that the matter would be “cuspy” — it would get denser and denser towards the center. This is the signature of “cold dark matter”, heavy particles moving so slowly that they could be compressed to very high densities. However, observations seem to show (and indeed have been showing for several years) that, instead, the distribution shows a “core” — it reaches a maximum density and cannot be compressed further. This is expected if the dark matter is, instead, “warm” — moving at a few kilometers per second.
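For the non-cognoscenti, here is a cartoon of the cusp-versus-core distinction. The two profiles below are standard illustrative forms (an NFW-like cusp and a pseudo-isothermal core), not fits to any of the dwarf galaxies in question:

```python
# Cartoon comparison of a cuspy and a cored dark matter density profile.
import numpy as np
import matplotlib.pyplot as plt

r = np.logspace(-2, 1, 200)   # radius in units of a scale radius

# Cuspy profile (NFW-like, the cold dark matter expectation): density keeps
# rising as 1/r toward the centre.
rho_cusp = 1.0 / (r * (1.0 + r)**2)

# Cored profile (pseudo-isothermal): density flattens to a constant maximum
# value inside the core radius.
rho_core = 1.0 / (1.0 + r**2)

plt.loglog(r, rho_cusp, label="cuspy (NFW-like)")
plt.loglog(r, rho_core, label="cored (pseudo-isothermal)")
plt.xlabel("radius / scale radius")
plt.ylabel("relative density")
plt.legend()
plt.show()
```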
Unfortunately, I have to keep using the word “seems” here, since, in fact, all the available information as I write this is from the mainstream media, rather than in journals or even unrefereed preprints.
Update: Steinn has a bit more detail.
Update 2: And this conference proceedings seems to actually be what the fuss is about.