To celebrate, the Planck team have released an image of the full sky. The telescope has detectors which see the sky in nine bands, at wavelengths ranging from 0.3 millimeters up to nearly a centimeter, from which we have made this false-color image. The center of the picture is toward the center of the Galaxy, with the rest of the sphere unwrapped into an ellipse so that we can put it onto a computer screen (so the left and right edges are really the same points).
At the longest and shortest wavelengths, our view is dominated by matter in our own Milky Way galaxy — this is the purple-blue cloud, mostly so-called galactic “cirrus” gas and dust, largely concentrated in a thin band running through the center which is the disk of our galaxy viewed from within.
In addition to this so-called diffuse emission, we can also see individual, bright blue-white objects. Some of these are within our galaxy, but many are themselves whole galaxies, viewed from millions or even billions of light-years away. Here’s a version of the picture with some objects highlighted:
Even though Planck is largely a cosmology mission, we expect these galactic and extragalactic data to be invaluable to astrophysicists of all stripes. Buried in these pictures we hope to find information on the structure and formation of galaxies, on the evolution of very faint magnetic fields, and on the evolution of the most massive objects in the Universe, clusters of galaxies.
But there is plenty of cosmology to be done: we see the Cosmic Microwave Background (CMB) in the red and yellow splotches at the top and bottom — out of the galactic plane. We on the Planck team will be spending much of the next two years separating the galactic and extragalactic “foreground” emission from the CMB, and characterizing its properties in as much detail as we can. Stay tuned.
I admit that I was somewhat taken aback by the level of interest in these pictures: we haven’t released any data to the community, or written any papers. Indeed, we’ve really said nothing at all about science. Yet we’ve made it onto the front page of the Independent and even the Financial Times, and yours truly was quoted on the BBC’s website. I hope this is just a precursor to the excitement we’ll generate when we can actually talk about science, first early next year when we release a catalog of sources on the sky for the community to observe with other telescopes, and then in a couple of years time when we will finally drop the real CMB cosmology results.
The cosmology community has had a terrible few months.
I am saddened to report the passing of Andrew Lange, a physicist from Caltech and one of the world’s preeminent experimental cosmologists. Among many other accomplishments, Andrew was one of the leaders of the Boomerang experiment, which made the first large-scale map of the Cosmic Microwave Background radiation with a resolution of less than one degree, sufficient to see the opposing action of gravity and pressure in the gas of the early Universe, and to use that to measure the overall density of matter, among many other cosmological properties. He went on to be an important leader in a number of other experiments, notably the Planck Surveyor satellite and the Spider balloon-borne telescope, currently being developed to become one of the most sensitive CMB experiments ever built.
I learned about this tragedy on the same day that people were gathering in Berkeley, California, to mourn the passing of another experimental cosmologist, Huan Tran of Berkeley. Huan was an excellent young scientist, most recently deeply involved in the development of PolarBear, another of the current generation of ultra-sensitive CMB experiments. Huan led the development of the PolarBear telescope itself, currently being tested in the mountains of California but to be deployed for real science on the Atacama plain in Chile. We on the PolarBear team are proud to name the PolarBear telescope after Huan Tran, a token of our esteem for him and a small tribute to his memory.
My thoughts go out to the friends and family of both Huan and Andrew. I, and many others, will miss them both.
I’m happy to be able to point to ESA’s first post-launch press release from the Planck Surveyor Satellite.
Here is a picture of the area of sky that Planck has observed during its “First Light Survey”, superposed on an optical image of the Milky Way galaxy:
(Image credit: ESA, LFI and HFI Consortia (Planck); Background image: Axel Mellinger. More pictures are available on the UK Planck Site as well as in French.)
The last few months since the launch have been a lot of fun, getting to play with Planck data ourselves. Here at Imperial, our data-processing remit is fairly narrow: we compute and check how well the satellite is pointing where it is supposed to, and calculate the shape of its beam on the sky (i.e., how blurry its vision is). Nonetheless, just being able to work at all with this incredibly high-quality data is satisfying.
Because of the way Planck scans the sky (in individual rings, slowly stepping around the sky over the course of about seven months, with a nominal mission of two full surveys), even the two weeks of “First Light Survey” data are remarkably powerful: we have seen a bit more than 5% of the sky with about half of the sensitivity that Planck is eventually meant to achieve (in fact, we hope to extend the mission beyond the initial 14 months). This is already comparable to the most powerful sub-orbital (i.e., ground- and balloon-based) CMB experiments to date.
But a full scientific analysis will have to wait a while: after the 14 month nominal mission, we will have one year to analyze the data, and another year to get science out of it before we need to release the data alongside, we hope, a whole raft of papers. So stay tuned until roughly Autumn of 2012 for the next big Planck splash.
[Warning: this post will be fairly technical and political and may only be of interest to those in the field.]
I spent the first couple of days this week stuck in a room in Cambridge with about 40 of my colleagues pondering a very important question: what is the future of the study of the Cosmic Microwave Background in the UK?
Organized by Keith Grainge of Cambridge’s MRAO, and held at Cambridge’s new Kavli Institute for Cosmology, the workshop brought together a significant fraction of the UK CMB community, from Cambridge itself, Cardiff, Imperial, Manchester, Oxford and elsewhere.
With the recent cancellation of the Clover experiment by STFC, there is no major UK-led CMB experiment (I am making a distinction between CMB experiments per se and those with other primary purposes, such as observing the Sunyaev-Zel’dovich effect with AMI, or astrophysical foregrounds with QUIJOTE). However, there is a huge amount of CMB expertise in the UK, from the design of detectors and telescopes through to the analysis of CMB data.
In the short term, it seems there is some appetite for attempting to revive the Clover effort at some level, perhaps in collaboration with other experimental teams outside of the UK. The major driver — and the only way it makes any sense at all — is to get this done quickly, before the other experiments pursuing the same goals begin to gather data (in the interests of full disclosure, I should point out that I am involved in a couple of those other experiments: EBEX and PolarBear). This decision, I imagine, will be dominated by the politics and economics of the current STFC funding debate, as well as what I understand are the internal relationships of the Clover team.
So of more scientific interest is the question of what to do next. Right now, the UK astronomy and particle physics community is undertaking a series of consultations to figure out what it thinks are the most important topics, instruments and experiments to concentrate upon over the next few years. One very real possibility is that we could decide not to lead any new CMB experiments, but just to continue to lend our expertise to other efforts. This is cost-effective but unsatisfying, especially to experimentalists who want to take the lead in the design of new efforts.

The only viable alternative, I think, is for the community to come together and, with apologies for the cliché, speak with a unified voice in support of a coherent plan. There is enough expertise in the UK to produce great CMB science over the next decade, but it is thinly spread. The basic design of any such experiment is clear: thousands of detectors observing the sky over as many frequencies as possible. But the details — exactly what sorts of detectors, whether to fly from a balloon or observe from the ground, or whether to wait for a future satellite — will be crucial to the success or otherwise of the experiment.

Unfortunately, these decisions can often degenerate into “not-invented-here” syndrome and personality clashes between strong scientific egos. But as Ben Franklin said on signing the Declaration of Independence, “we must, indeed, all hang together, or assuredly we shall all hang separately.”
Right now, the UK’s astronomy and nuclear/particle physics research council, STFC, is supposedly undergoing a series of “consultations” with the community to try to figure out exactly which of the many possible big-ticket items (telescopes, satellites, particle detectors, etc.) the community wants to pursue.
In the meantime, however, things are proceeding in their usual autocratic way, as our financial overlords attempt to deal with the financial shortfall that a combination of bad luck, the global financial crisis, their own mismanagement, and government policy (in no particular order), has bequeathed the council.
Following the cancellation of the Clover CMB experiment, this week we heard that the number of Advanced Fellowships per year will be cut in half, from twelve to six for all of astronomy and particle physics, and that the outreach budget will be cut by even more.
I came to the UK on an AF, and so have a soft spot for the program: the five-year fellowships have a very high profile worldwide and are indeed open to applicants from all over the world. They have traditionally been one of the best ways to attract and retain young scientists. In many institutions, coming with the imprimatur of a pretty rigorous peer-review process, they lead directly to a truly permanent academic position.
As my fellow AF-alumnus, Peter Coles (from whom I got most of this information and the inspiration for writing it here), puts it: “Who needs half a dozen top class scientists when you can have Moonlite instead?”
Update: There was a package on BBC news today, lamenting the state of UK “space policy” — even the representative from EADS Astrium (“industry”) was complaining. Meanwhile, Lord Drayson, the “space minister”, was on the Politics Show, at least admitting this sort of thing “is going to cost money” — especially the twenty-year plan he wants.
Not all CMB (Cosmic Microwave Background) experiments get launched on a rocket.
There’s a long history of telescopes flown from balloons — huge mylar balloons floating over 100,000 feet in the air. MAXIMA and BOOMERaNG were the first experiments to map out the microwave sky on the sub-degree scales containing information about the detailed physical conditions in the Universe over the first few hundred thousand years after the Big Bang. The Planck Satellite will close out that era of CMB experiments by giving us a complete picture of the microwave sky down to less than a tenth of a degree.
But there is still more to be done, even beyond what Planck is capable of. By measuring the polarization of the microwave background at even higher sensitivities than Planck, we hope to observe the effects of gravitational radiation in the early Universe.
Last week, EBEX, one of a new generation of balloon-borne experiments designed specifically with this goal in mind, had its maiden flight from Fort Sumner, New Mexico.
EBEX Launch, 6/11/09 from asad137 on Vimeo.
It’s worth remembering, of course, that even with a parachute, these telescopes hit the ground pretty hard. But these things are amazingly well-built, and the EBEX crew have managed to recover most of the hardware and all of the data. So now the team have some time to get the hardware and software ready to fly for a couple of weeks over Antarctica next year.
And let’s not forget that New Mexico is also the home of Roswell, where conspiracy theorists and other wackjobs have long tried to expose a supposed government cover-up of UFO sightings. Indeed, the EBEX balloon was spotted, but at least in neighbouring Arizona, they can tell the difference.
Meanwhile, another CMB experiment, PolarBear, is about to start its first set of important tests. PolarBear is a ground-based telescope, which means it can watch the sky for far longer than a balloon, at the cost of being at the bottom of the atmosphere and all of the extra noise that adds to the signal. So despite some hard times (especially here in the UK), the next generation of CMB experiments are on the way, hoping to probe all the way back to the epoch of inflation.
In today’s Sunday NY Times Magazine, there’s a long article by psychologist Steven Pinker, on “Personal Genomics”, the growing ability for individuals to get information about their genetic inheritance. He discusses the evolution of psychological traits versus intelligence, and highlights the complicated interaction amongst genes, and between genes and society.
But what caught my eye was this paragraph:
What should I make of the nonsensical news that I… have a “twofold risk of baldness”? … 40 percent of men with the C version of the rs2180439 SNP are bald, compared with 80 percent of men with the T version, and I have the T. But something strange happens when you take a number representing the proportion of people in a sample and apply it to a single individual…. Anyone who knows me can confirm that I’m not 80 percent bald, or even 80 percent likely to be bald; I’m 100 percent likely not to be bald. The most charitable interpretation of the number when applied to me is, “If you knew nothing else about me, your subjective confidence that I am bald, on a scale of 0 to 10, should be 8.” But that is a statement about your mental state, not my physical one. If you learned more clues about me (like seeing photographs of my father and grandfathers), that number would change, while not a hair on my head would be different. [Emphasis mine].
That “charitable interpretation” of the 80% likelihood to be bald is exactly Bayesian statistics (which I’ve talked about, possibly ad nauseam, before): it’s the translation from some objective data about the world — the frequency of baldness in carriers of this gene — into a subjective statement about the top of Pinker’s head, in the absence of any other information. And that’s the point of probability: given enough of that objective data, scientists will come to agreement. But even in the state of uncertainty that most scientists find themselves, Bayesian probability forces us to enumerate the assumptions (usually called “prior probabilities”) that enter into our reasoning along with the data. Hence, if you knew Pinker, your prior probability is that he’s fully hirsute (perhaps not 100% if you allow for the possibility of hair extensions and toupees); but if you didn’t, then you’d probably be willing to take 4:1 odds on a bet about his baldness — and you would lose to someone with more information.
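That translation from frequency to odds, and the way new clues shift the number without touching the scalp itself, can be made concrete in a few lines of Python. The 80% frequency and the resulting 4:1 odds come from the text; the photo likelihoods below are invented purely for illustration.

```python
# Toy Bayesian reading of the baldness statistics discussed above.

def odds(p):
    """Convert a probability into odds in favor (e.g. 0.8 -> 4, i.e. 4:1)."""
    return p / (1 - p)

# With no other information, the frequency among T-carriers (80%) becomes
# our subjective probability that a given T-carrier is bald:
p_bald = 0.80
prior_odds = odds(p_bald)  # the "4:1 odds" mentioned in the text

# New evidence changes the number without changing a hair on anyone's head.
# Suppose a photo looks fully hirsute; these likelihoods are made up:
#   P(photo looks hirsute | bald)     = 0.05
#   P(photo looks hirsute | not bald) = 0.90
posterior_odds = prior_odds * (0.05 / 0.90)
p_bald_given_photo = posterior_odds / (1 + posterior_odds)  # about 0.18
```

The last line is exactly Pinker’s point: the updated number describes our state of knowledge about him, not his physical state.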
In science, of course, it usually isn’t about wagering, but just about coming to agreement about the state of the world: do the predictions of a theory fit the data, given the inevitable noise in our measurements, and the difficulty of working out the predictions of interesting theoretical ideas? In cosmology, this is particularly difficult: we can’t go out and do the equivalent of surveying a cross section of the population for their genes: we’ve got only one universe, and can only observe a small patch of it. So probabilities become even more subjective and difficult to tie uniquely to the data. Hence the information available to us on the very largest observable scales is scarce, and unlikely to improve much, despite tantalizing hints of data discrepant with our theories, such as the possibly mysterious alignment of patterns in the Cosmic Microwave Background on very large angles of the sky (discussed recently by Peter Coles here). Indeed, much of the data pointing to a possible problem was actually available from the COBE Satellite; results from the more recent and much more sensitive WMAP Satellite have only reinforced the original problems — we hope that the Planck Surveyor — to be launched in April! — will actually be able to shed light on the problem by providing genuinely new information about the polarization of the CMB on large scales to complement the temperature maps from COBE and WMAP.
Although the big satellites get most of the press, a lot of astronomy is done from balloons, huge mylar bubbles that can carry a gondola up to about 120,000 feet over the earth — more than 22 miles, or about 36 km. That’s high enough that much of the atmospheric contamination is gone, but a lot cheaper and easier to reach than orbit. I’ve been involved in the BOOMERaNG and MAXIMA balloon experiments, to measure the Cosmic Microwave Background, and currently with EBEX. Some experiments, BOOMERaNG among them, take advantage of the conditions at the South Pole and launch from Antarctica, using the “polar vortex” in the atmosphere to keep the balloon aloft for as much as a couple of weeks. (I should point out that for me, “involved with” means that I stay home where it’s warm and comfortable, but get to play with the data once my hardier colleagues return from the field.)
If you want to get a feel for ballooning, check out BLAST!, a film about the campaign to fly the eponymous experiment (the acronym stands for Balloon-borne Large-Aperture Sub-millimeter Telescope), made by Paul Devlin, the film-maker brother of the experiment’s Principal Investigator. It follows the team from their university labs to the Northern launch site in Scandinavia, and finally to Antarctica. I haven’t seen the whole thing yet, but I’m told it does a good job of giving the impression of the alternating excitement and boredom — and lofty goals — of these experiments.
A quick pointer to Initiative for Cosmology (iCosmo). The website brings together a bunch of useful calculations for physical cosmology — relatively simple quantities like the relationship between redshift and distance, and also more complicated ones like the power spectrum of density perturbations (which tells us the distribution of galaxies on the largest scales in the Universe) and quantities derived from that like the distortions in the shapes of galaxies due to gravitational lensing, when the path of light from galaxies is perturbed by intervening mass in the Universe. Combined with good documentation and tutorials (and downloadable source), it makes a good companion to sites such as LAMBDA’s CMB toolbox, which provides similar services targeted specifically at Cosmic Microwave Background science. iCosmo looks like it will be useful for researchers in the field as well as students, so thanks and congratulations to its creators (I’d like to point directly at the page listing them, but that doesn’t seem to be possible… instead, there’s a discussion forum at CosmoCoffee.).
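As an illustration of the sort of “relatively simple” calculation a tool like iCosmo packages up, here is a minimal sketch (not iCosmo’s actual code) of the redshift-distance relation in a flat ΛCDM universe; the parameter values H0 = 70 and Ωm = 0.3 are assumptions chosen for the example.

```python
# Sketch of a basic physical-cosmology calculation: the comoving distance
# to redshift z in a flat LambdaCDM universe, via a trapezoid-rule integral.
from math import sqrt

C_KM_S = 299792.458  # speed of light in km/s
H0 = 70.0            # Hubble constant in km/s/Mpc (assumed value)
OMEGA_M = 0.3        # matter density parameter; flatness gives 0.7 for Lambda

def comoving_distance(z, steps=10000):
    """Comoving distance in Mpc: the integral of c dz' / H(z') from 0 to z."""
    def inv_H(zp):
        return 1.0 / (H0 * sqrt(OMEGA_M * (1.0 + zp) ** 3 + (1.0 - OMEGA_M)))
    dz = z / steps
    total = 0.5 * (inv_H(0.0) + inv_H(z))
    for i in range(1, steps):
        total += inv_H(i * dz)
    return C_KM_S * total * dz

# For these parameters, comoving_distance(1.0) comes out around 3300 Mpc.
```

The real packages add the harder quantities mentioned above (power spectra, lensing distortions) on top of distance integrals like this one.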
PAMELA (Payload for Antimatter Matter Exploration and Light-nuclei Astrophysics) is a Russian-Italian satellite measuring the composition of cosmic rays. One of the motivations for the measurements is the indirect detection of dark matter — the very-weakly-interacting particles that make up about 25% of the energy density of the Universe (with, as I’m sure you all know by now, normal matter about 5% and the so-called Dark Energy the remaining 70%). By observing the decay products of the dark matter — with more decay occurring in the densest locations — we can probe the properties of the dark particles. So far, these decays haven’t been unequivocally observed. Recently, however, members of the PAMELA collaboration have been out giving talks, carefully labelled “preliminary”, showing the kind of excess cosmic ray flux that dark matter might be expected to produce.
But preliminary data is just that, and there’s a (usually) unwritten rule that the audience certainly shouldn’t rely on the numerical details in talks like these. Cirelli & Strumia have written a paper based on those numbers, “Minimal Dark Matter predictions and the PAMELA positron excess” (arXiv:0808.3867), arguing that the data fits their pet dark-matter model, so-called minimal dark matter (MDM). MDM adds just a single type of particle to those we know about, compared to the generally-favored supersymmetric (SUSY) dark matter model which doubles the number of particle types in the Universe (but has other motivations as well). What do the authors base their results on? As they say in a footnote, “the preliminary data points for positron and antiproton fluxes plotted in our figures have been extracted from a photo of the slides taken during the talk, and can thereby slightly differ from the data that the PAMELA collaboration will officially publish” (originally pointed out to me in the physics arXiv blog).
This makes me very uncomfortable. It would be one thing to write a paper saying that recent presentations from the PAMELA team have hinted at an excess — that’s public knowledge. But a photograph of the slides sounds more like amateur spycraft than legitimate scientific data-sharing.
Indeed, it’s to avoid such inadvertent data-sharing (which has happened in the CMB community in the past) that the Planck Satellite team has come up with its rather draconian communication policy (which is itself located in a password-protected site): essentially, the first rule of Planck is you do not talk about Planck. The second rule of Planck is you do not talk about Planck. And you don’t leave paper in the printer, or plots on your screen. Not always easy in our hot-house academic environments.
Update: Bergstrom, Bringmann, & Edsjo, “New Positron Spectral Features from Supersymmetric Dark Matter - a Way to Explain the PAMELA Data?” (arXiv: 0808.3725) also refers to the unpublished data, but presents a blue swathe in a plot rather than individual points. This seems a slightly more legitimate way to discuss unpublished data. Or am I just quibbling?
Update 2: One of the authors of the MDM paper comments below. He makes one very important point, which I didn’t know about: “Before doing anything with those points we asked the spokeperson of the collaboration at the Conference, who agreed and said that there was no problem”. Essentially, I think that absolves them of any “wrongdoing” — if the owners of the data don’t have a problem with it, then we shouldn’t, either (although absent that I think the situation would still be dicey, despite the arguments below and elsewhere). And so now we should get onto the really interesting question: is this evidence for dark matter, and, if so, for this particular model. (An opportunity for Bayesian model comparison!?)
OK, not a vacation in the true sense of the word: I’ve been in the US, attending meetings (in Berkeley), workshops (in Santa Fe), conferences (in Pasadena) and, because I can’t seem to escape them, teleconferences everywhere and all the time.
In Berkeley, I attended the first all-hands collaboration meeting for PolarBear, an experiment that will measure the polarization of the CMB from a telescope that will eventually be situated on the Atacama desert plain in Chile — one of the highest, driest, least accessible places on the earth, and one of the least contaminated with light or radio interference. (Despite the name of the experiment, there are no polar bears there.) First, we’ll test it at the somewhat less remote White Mountain facility in California to shake out all the bugs.

PolarBear is one of a new generation of experiments that will measure the CMB using not just a few tens of detectors but a few thousand, which brings with it all sorts of technical challenges. In hardware, the first challenge is simply making so many detectors and keeping their properties uniform each to each — these are among the most sensitive microwave detectors ever built, essentially as good as the constraints of quantum mechanics and thermodynamics allow. The second, related to the first, is to pack as many of these into a small space — the focal plane of the telescope — as possible. Traditionally, microwave detectors have used horns to guide the electromagnetic waves from the sky onto the detectors, but those horns are much wider than the detector hardware. For experiments like PolarBear, we put the detectors themselves right at the focus of the telescope and make each of them into a little antenna, receiving the focused light directly after it passes through a hemispherical lens. The final hardware challenge is to get the information from these thousands of detectors off of the telescope and into our computers, which the PolarBear designers have solved with a new technique called “frequency-domain multiplexing”.
Sort of like the way FM radio conveys the full spectrum of sound by modulating a carrier at a particular frequency, each detector’s signal is modulated at its own frequency so that many detectors can share a single readout line; the very high-tech SQUIDs (Superconducting QUantum Interference Devices) then amplify these tiny CMB signals into data we can analyze.
In fact, the data analysis and computing challenges are almost as significant as those faced in hardware. With thousands of detectors and a telescope that will run for the better part of several years, we have many orders of magnitude more CMB data than we’ve ever dealt with before, combined with a sensitivity goal better than a millionth of a degree. By adding more and more detectors, we can make the raw experiment itself sensitive enough to do this. What we don’t know is whether we can eliminate everything else that can possibly contaminate our results: light may spill over our shield from the 300-Kelvin ground or directly from the atmosphere; dust in our solar system or our galaxy also glows in the bands we want to measure, as do external galaxies millions of light-years away. So our task is to compress the terabytes of data into a few interesting numbers (like the energy scale of inflation) and to simultaneously separate the cosmic signal from that produced by the instrument and from the rest of the Universe (which may be much brighter!). Suffice it to say, we have some good ideas, but until we’re confronted with real data we won’t know how successful we’ll be.
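The logic behind “more detectors means more sensitivity” comes down to a square-root scaling, which can be sketched in a few lines. Every number here (the per-detector noise level, observing time, and pixel count) is invented for illustration; these are not PolarBear specifications.

```python
# White-noise map sensitivity improves as 1/sqrt(N_detectors * observing time).
from math import sqrt

def map_noise(net_uk_rt_s, n_det, t_total_s, n_pixels):
    """Rough per-pixel map noise in microkelvin for an idealized experiment:
    per-detector noise level (NET, in uK*sqrt(s)) divided by the square root
    of the total integration time accumulated in each sky pixel."""
    t_per_pixel = n_det * t_total_s / n_pixels  # seconds of data per pixel
    return net_uk_rt_s / sqrt(t_per_pixel)

# One detector vs. a thousand, all else equal (made-up numbers):
single = map_noise(500.0, 1, 3.0e7, 1.0e5)
array = map_noise(500.0, 1000, 3.0e7, 1.0e5)
# the array wins by a factor of sqrt(1000), roughly 32
```

This is why the field moved from tens of detectors to thousands; it is also why the systematic effects listed above matter so much, since they do not average down the way this idealized noise does.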
Plus, I ate bagels (better than London; not as good as New York) and burritos, and bought shoes at cheap American prices (at least when I think in British Pounds).
Next up, Santa Fe…
In its continuing bid to take over all aspects of science communication, Nature magazine (or more properly, an alliance between Nature Network and the Royal Institution) will be hosting a European Science Blogging conference in August or September.
Right now, however, I’m in Norway. In addition to discussing how we’re going to measure the CMB power spectrum with Planck, I’ve already eaten a slab of reindeer, run for an hour up and down the snowy hills, and sweated in a sauna.
Today we heard that the (bizarrely agglomerated) UK Department for Innovation, Universities and Skills will be significantly cutting the physics budget that comes through the Science and Technology Facilities Council (STFC). STFC was formed earlier this year out of PPARC (Particle Physics and Astrophysics) and the CCLRC (which ran big facilities like the Rutherford Appleton Lab). When it was formed, we were told this would enable better science. But it seems we may have been sold a bill of goods: the science program is being saddled with what is, essentially, CCLRC’s debt, in the form of an £80 million shortfall that will fall disproportionately on academic research. And therefore, of course, on physics departments and, inevitably, physics education.
The Delivery Plan has just been announced, but of course the spin is all on the overall increase to funding, not these cuts. Happily (and a little surprisingly) the BBC highlighted the impact on physics in its usual stroppy manner.
Andy Lawrence has been following the news of the impending cuts over the last few weeks. Chris Lintott and Stuart have some more details. The headline cuts seem to be: withdrawal from the International Linear Collider (particle physicists’ next big instrument after the LHC at CERN), cessation of all support for ground-based solar-terrestrial physics facilities (i.e., telescopes and instruments that investigate the sun and its impact on the earth from the ground), and “revisiting the on-going level of investment” in gravitational wave detection, dark matter detection, the Clover CMB experiment and the UKIRT telescope. The UK will pull out of the Isaac Newton Group of telescopes.
Most important for me, so-called post-launch support for existing space missions (such as the Planck Surveyor CMB Mission, although it was never explicitly mentioned in the plan) will be cut by around 30%. This is a very cynical ploy: we will undoubtedly be so excited by the data from missions like Planck that we will donate our time, gratis, just to make sure that it gets analyzed.
There do appear to have been some small victories. Rather than a full termination as mooted last week, STFC plans “to withdraw from future investment in the twin 8-metre Gemini telescopes and we will work with our international partners to retain access to Gemini North.” So at least UK astronomers will have access to a world-class telescope in the Northern hemisphere. Most importantly, “Science Minister Ian Pearson said [on the BBC] funding arrangements would be reviewed,” — which we hope means actual compromises are possible — although of course he “did not promise extra money.”
Google has just released a new version of its Google Earth software — one that lets you look up to the sky instead of down to the ground. It’s essentially a consumer-grade Virtual Observatory, like the UK AstroGrid, the US National Virtual Observatory and the Euro-VO project.
It’s not so obvious when you fire it up and are presented with little icons for various stars and galaxies, but the underlying data is a continuous picture of the sky, although the resolution depends on what data exists in a given area. For example, type in “HDF” and it takes you to the Hubble Deep Field North, one of the deepest images ever taken of the sky, showing galaxies in every stage of their evolution. Conversely, however, most of the objects don’t have any information attached to them at all — just fuzzy blobs.
Of course, real astronomers would require a lot more information: how was the data taken? At what frequency? It would certainly be great to be able to use this as a front-end to the “real” Virtual Observatory like AstroGrid. These science-oriented projects have spent a considerable amount of time and effort refining their interface, but just don’t have the funds or expertise of a company like Google. And now I’m just waiting for someone to implement a layer showing the Cosmic Microwave Background and other “diffuse” sets of data on the sky. (Update: my very bright grad student, JZ, has figured out how to import CMB data as an image into the program.)
Update 2: VO/Blogger Alasdair Allan has started to work out how to connect Google Sky to the Virtual Observatory via the PLASTIC protocol. Alasdair was also interviewed about Google Sky for the Guardian’s science podcast.
I spent the week before last in Portugal working with the team designing and building the GEM telescope: the Polarized Galactic Emission Mapping Project in Portugal. GEM (aka GEM-P or even P-GEM-P) aims to measure the emission of our Milky Way galaxy using light at a wavelength of 6 cm. Those frequencies are dominated by synchrotron emission, generated by electrons deflected by the magnetic field of the galaxy. These measurements will give us invaluable information about the structure of the galaxy. Moreover, this emission is an important contaminant for the cosmological maps made by experiments like the Planck Surveyor and by its successors, such as the BPol project that many of us have just proposed to the European Space Agency.
GEM-P is being built mostly by a small group at the Instituto de Telecomunicações and the Universidade de Aveiro. Here’s me with a large part of the Aveiro team:
That’s Rui Fonseca, Domingos Barbosa, me, Dinis Magalhães and Luis Cupido. This wasn’t taken at the GEM-P lab, by the way, but in Luis’s backyard, and that’s the 5.5 meter dish he keeps there to play with and do things like track NASA spacecraft in the outer reaches of the solar system!
Aveiro is a lovely town (even The New York Times agrees) experiencing a pretty remarkable building boom that manages to combine a high-tech University with an old-fashioned fishing village. For me, that meant the useful combination of pretty ubiquitous wifi and great seafood. (More pictures of Aveiro here).
(I also got to spend some time wandering around Porto, where I stayed right down the road from Rem Koolhaas’s new Casa de Musica, just shortlisted for the RIBA Stirling Prize. Photos of the older bits of Porto here.)
The Observer featured a lengthy article by Tim Adams bemoaning the general scientific illiteracy of society today, tracing a line from CP Snow’s “Two Cultures” through Natalie Angier’s new book, The Canon: A Whirligig Tour of the Beautiful Basics of Science. It concentrates a bit too heavily on uber-agent John Brockman’s somewhat pretentious “Third Culture, a marriage of physics and philosophy, astronomy and art,” as exemplified by his website, The Edge, but it does finger a real and disturbing (but not really new) trend. But to me the real howler was the following quote:
George Smoot, the Nobel-winning astrophysicist who first identified the background radiation of the Big Bang and thereby invented cosmology.

OK, first, George Smoot didn’t identify “the background radiation of the Big Bang”; he was the Principal Investigator of the DMR instrument on the COBE satellite, which identified the fluctuations in the background radiation (aka the CMB), the seeds of structure that eventually grew into the galaxies and clusters of galaxies in the Universe today. The CMB itself was first identified by Penzias and Wilson in the 1960s — for which they also won the Nobel Prize. That may give you a hint about the other problem here: George, although a pretty smart guy, certainly didn’t invent cosmology, which has been around as a legitimate scientific field at least since Einstein’s discovery of General Relativity, and as a human endeavor for thousands of years.
We take so much of the web for granted today, we often forget how very contingent it all is. Without the very specific work by Tim Berners-Lee inventing the HTTP protocol, perhaps some sort of hypertext communication standard would have come along, but it’s hard to believe that it would be quite the same. Berners-Lee has always advocated a still more open “read/write” web, and about the closest we come to that is, of course, the weblog. Well, blogs were arguably launched ten years ago, on April 1, 1997, by Dave Winer. Scripting News was an outgrowth of his DaveNet emails, but had all the usual hallmarks of a blog: short items, lots of links, and, crucially, reverse chronological order. Dave has gone on to a career as a general computer pundit and curmudgeon — and also invented RSS (that orange “XML Feed” icon over at the side).
My own April Fool’s incident came a bit early: last Thursday night, I found myself unable to make my way from the arrival hall of Rome’s Fiumicino airport to the airport Hilton. I arrived around midnight, after the trains had stopped running into town. What they don’t tell you is that all the passageways between the airport buildings are also shut — without signs to tell you where to go. After conflicting information from three different sets of people, I found myself staggering around the deserted parking lots searching for the warm bed I had booked (I did eventually find it, and the front desk took pity on me in the form of an upgrade to the “executive suite” floor). The next day, although I was a bit sleepy, at least brought a productive discussion of the next step in our proposal for a new mission to measure the polarization of the microwave background — in about 2015 or 2020.
But since today is really April 2, you can also read a real blog post by Amedeo Balbi, my cosmology colleague (on MAXIMA and Planck and probably more in the future) over in Tommaso Dorigo’s blog; he’s got one of his own, but it will only make sense if you read Italian.
Yesterday evening I attended the launch party for Nature Network London, a new site run by Nature magazine, which hopes to be a web home for science and scientists in London. There are articles, blogs, discussion forums and calendars of scientific events.
Perhaps unsurprisingly, I ended up meeting lots of people from Imperial — whom of course I had never met here on campus. I also met the site’s editor, Matt Brown, as well as blogger Jennifer Rohn, who also runs the science/culture site LabLit.
It’s an ambitious idea, and anything that gets us out of our offices and talking with other scientists is welcome. The formal barriers to entry are quite low, but to get working scientists to spend their time blogging, posting in discussion forums, and just taking this newfangled social web 2.0 thing seriously may be a hard sell. We’ll have to hook ‘em young. However, “science” in London is dominated by medicine and biology — we physical scientists are a distinct minority, and our interests, academic lives and ways of working are often very different indeed (for example, the biologists last night spent a lot of time trying to decide whether to approach someone like Paul Smith to design a fashionable lab coat — I’ve never worn a lab coat in my life!). Anyway, if you’re a London-based scientist of any stripe reading this, sign up and join in!
Tonight I’m off on a 24-hour jaunt to Rome to discuss our proposal for a new satellite, BPol, to measure the CMB polarization (and thereby discover if inflation could be responsible for getting our Universe into the shape we find it today). Unfortunately, this satellite wouldn’t be launched until the late 2010s, which means that the data wouldn’t flow for a staggering decade and a half.
Luckily, cosmology will remain interesting while we’re waiting — as Tommaso Dorigo’s ongoing reports from our Outstanding questions for the standard cosmological model meeting continue to attest.
I’m just back from a couple of days up in Edinburgh, one of my favorite cities in the UK. London is bigger, more intense, but Edinburgh is more beautiful, dominated by its landscape—London is New York to Edinburgh’s San Francisco.
I was up there to give the Edinburgh University Physics “General Interest Seminar”. Mostly, I talked about the physical theory behind and observations of the Cosmic Microwave Background, but I was also encouraged to make some philosophical excursions. Needless to say, I talked about Bayesian Probability, and this in turn gave me an excuse to talk about David Hume, my favorite philosopher, and son of Edinburgh. Hume was the first to pose the “problem of induction”: how can we justify our prediction of the future based on past events? How can we logically justify our idea that there is a set of principles that govern the workings of the Universe? The canonical version of this asks: how can we be sure that the sun will rise tomorrow? Yes, it’s done so every day up until now, but tomorrow’s sunrise doesn’t logically follow from that. One possible argument is that induction has always worked up until now, so we can expect it to work again in this case. But this seems to be a vicious circle (rather than a virtuous spiral). As I discussed a few weeks ago, I think this whole problem just grows out of a category error: one cannot make logical proofs of physical theories.
I also went down the dangerous road of discussing anthropic arguments in cosmology, to some extent rehashing the discussion in my review of Paul Davies’ “Goldilocks Enigma”.
But in between I talked about the current state of CMB data, our own efforts to constrain the topology of the Universe, and the satellites, balloons and telescopes that we hope will improve our knowledge even further over the coming few years.
Next up, a more general talk on the topology of the Universe at next week’s Outstanding questions for the standard cosmological model meeting, and then a more general review of the CMB at the Institute of Physics Nuclear and Particle Physics Divisional Conference.
OK, this is going to be very technical. In his comment to my last post, my colleague Ned Wright asks a couple of important questions about the way that the Planck Surveyor satellite is going to observe the sky. In the spirit of Mark Trodden’s question about the use of blogs in the research process, let’s see if we can answer these questions in a way that will satisfy Ned (who knows more about observing the CMB than most people on the planet) and not be completely opaque to the rest of my readers. Ned asks:
What is the current plan for the Planck scan pattern? I see this quote from the 2005 Blue Book:
The spacecraft will spin at 1 rpm around an axis offset by 85 degrees from the telescope boresight, so that the observed sky patch will trace a large circle on the sky (Dupac and Tauber, Astronomy & Astrophysics 430, 363, 2005)….
As the spin axis follows the Sun, the circle observed by the instruments sweeps through the sky at a rate of 1 degree/day. The whole sky will be covered (by all feeds) in a little more than 6 months; this operation will be repeated twice, resulting in a mission lifetime of around 15 months.
This describes a terrible scan pattern that may ruin Planck’s ability to measure the low-ell polarization signal that is essential for determining tau.
And the claim of covering the whole sky is wrong since as described, a 5 degree radius about each ecliptic pole is left out. That’s most of the sky, but not all.
The inner quote describes the way Planck will scan the sky — every minute, the satellite will observe a circle on the sky with an opening angle of twice the 85 degrees described in the quote (a great circle, going through the poles, would have an opening angle of twice 90 degrees). Here’s a picture (from Dupac and Tauber 2005) of Planck’s location and the way it will scan the sky:
Ned is worried about two things, but I’ll discuss them in the opposite order.
He points out that the scan strategy leaves “holes” about five degrees across in the North and South poles. Indeed, the so-called “nominal” scan strategy above does suffer from this (although this is somewhat ameliorated by the fact that Planck has many detectors all looking at spots up to eight degrees from one another on the sky, so in fact those holes are largely filled). A more realistic scan strategy, as described in the paper by Dupac and Tauber mentioned above, will dip up and down out of the plane defined by pointing away from the sun, earth and moon. The exact way we perform these dips (how quickly and with what pattern) remains to be decided, but in any event will fill in those holes. For one possible strategy, the coverage looks like the following, from the same paper (yellow and red areas are observed more than blue and green):
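The size of those polar holes follows from simple spherical geometry, which we can check with a little arithmetic. The sketch below is a toy calculation, not Planck’s actual pointing code: it just assumes an idealized spin axis confined to the ecliptic plane (pointing anti-sun) with the boresight 85 degrees away from it, and asks which ecliptic latitudes the scan circle can ever reach.

```python
def is_covered(lat_deg, opening_deg=85.0):
    """Can a sky point at ecliptic latitude lat_deg ever land on the
    scan circle, as the spin axis sweeps through all ecliptic longitudes?

    For a point at latitude b and an axis at latitude 0, the angle theta
    between them satisfies cos(theta) = cos(b) * cos(dlon); as the
    longitude difference dlon sweeps through 360 degrees, theta ranges
    over [|b|, 180 - |b|].  The point is covered iff the opening angle
    falls inside that range.
    """
    return abs(lat_deg) <= opening_deg <= 180.0 - abs(lat_deg)

# Sweep integer latitudes: only the two polar caps beyond +/-85 degrees
# are never observed -- the 5-degree-radius "holes".
uncovered = [b for b in range(-90, 91) if not is_covered(b)]
print(uncovered)  # → [-90, -89, -88, -87, -86, 86, 87, 88, 89, 90]
```

With an 85-degree opening, the scan circle comes within 5 degrees of each pole but never closer, which is exactly the hole Ned describes; the small dips out of the ecliptic plane (and the 8-degree spread of the detectors) are what fill it in.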
Second, and most important, are the effects of long-term drifts in our detectors. An instrument like Planck can’t just look at a point in the sky and measure the temperature directly. Instead, the background level coming out of our detector is drifting over time, and these drifts can actually be large compared to the tiny CMB signal we’re trying to measure (for aficionados, this is often known as 1/f noise, after the power spectrum of the noise often observed in cases like this). This means that it’s relatively easy to measure the relative temperature of points that are observed nearby in time — since the background hasn’t drifted by much. But it’s much more difficult to measure relative temperatures over long periods of time. Therefore (as Ned points out), it might be difficult to observe patterns on large angular scales, across many individual rings. This is “the low-ell polarization signal that is essential for determining tau”: only on these scales can we observe the effects on the CMB of the very first objects to “light up”, more than twelve billion years ago.
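To see what such drifting noise looks like, here is a minimal simulation: shape the spectrum of a white-noise stream so that its power spectral density goes as 1 + (f_knee/f), the standard way to fake 1/f noise. The knee frequency and spectral slope below are made-up illustrative numbers, not Planck detector specifications.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_noise(n, f_knee=0.1, alpha=1.0, f_samp=1.0):
    """White noise plus a 1/f component: PSD(f) = 1 + (f_knee/f)**alpha.
    Built by filtering a white-noise stream in the Fourier domain."""
    freqs = np.fft.rfftfreq(n, d=1.0 / f_samp)
    psd = np.ones_like(freqs)
    psd[1:] += (f_knee / freqs[1:]) ** alpha
    psd[0] = 0.0  # drop the (formally divergent) DC mode
    spectrum = np.fft.rfft(rng.standard_normal(n)) * np.sqrt(psd)
    return np.fft.irfft(spectrum, n)

noise = simulate_noise(1 << 16)

# The signature of the drift: low-frequency (long-timescale) power
# dwarfs the high-frequency power in the periodogram.
p = np.abs(np.fft.rfft(noise)) ** 2
low, high = p[1:101].mean(), p[-100:].mean()
```

Nearby samples differ only by the (small) white-noise part, but samples separated by many correlation times have drifted apart by the large low-frequency component — which is exactly why comparisons across many rings are hard.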
This difficulty can only be ameliorated by “cross-linking” — making sure that you observe the same point at many different times. This lets us recalibrate the baseline of the detector every time we revisit that point or, better, set of points. The experiments that Ned Wright himself has worked on, COBE/DMR and WMAP, cleverly achieve this by the very complicated way they observe the sky.
Planck will definitely have a harder time, since its cross-linking only occurs at those points near the poles with many repeated observations. This puts a strong constraint on our detectors: they can’t drift very much over the one minute it takes to make a single circular scan. In this article, my Planck colleagues Christopher Cantalupo, Julian Borrill & Radek Stompor show that we can indeed handle these problems for more-or-less realistic kinds of noise.
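A toy version of the destriping idea behind such map-making codes (a sketch of the general technique, not the specific algorithm of that paper): model each one-minute ring as the sky signal plus a single unknown constant offset, and let the pixels shared between rings — the cross-linking — tie all the offsets together in one least-squares solve. All the sizes and noise levels here are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

npix, nrings, per_ring = 50, 40, 100  # toy sizes, far smaller than reality

sky = rng.standard_normal(npix)              # the map we want to recover
offsets = 5.0 * rng.standard_normal(nrings)  # slow drift: one offset per ring

# Cross-linked pointing: each ring hits a random subset of pixels, so
# different rings share pixels and the offsets become solvable.
pix = rng.integers(0, npix, size=nrings * per_ring)
ring = np.repeat(np.arange(nrings), per_ring)
tod = sky[pix] + offsets[ring] + 0.1 * rng.standard_normal(nrings * per_ring)

# Destripe: jointly fit per-pixel sky values and per-ring offsets.
# (Dense design matrix for this toy; real codes exploit sparsity.)
A = np.zeros((tod.size + 1, npix + nrings))
A[np.arange(tod.size), pix] = 1.0
A[np.arange(tod.size), npix + ring] = 1.0
A[-1, npix:] = 1.0  # pin the degenerate overall level: sum(offsets) = 0
b = np.append(tod, 0.0)
sol = np.linalg.lstsq(A, b, rcond=None)[0]
sky_hat, off_hat = sol[:npix], sol[npix:]

# Only the mean level is unrecoverable, so compare mean-removed maps.
sky_err = (sky_hat - sky_hat.mean()) - (sky - sky.mean())
```

Without cross-linking (each ring visiting its own disjoint set of pixels) this system would be degenerate, ring by ring; with it, the offsets are pinned down and the recovered map error is at the white-noise level, drifts and all.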
To be sure, the real world is always more complicated than our simulations. The hard part will be dealing with what we euphemistically call “systematic effects” — roughly speaking, those errors that we don’t know how to describe very well, or that we don’t know about when we first fly the satellite. The ability to find these systematic effects is another reason why we want both very small known sources of error and the greater redundancy afforded by cross-linking, by comparing the same signal seen under very different conditions at different times during the mission.
Undoubtedly, we will encounter such unexpected sources of noise when we confront real data from Planck late next year, but we hope that the quality of our detectors, combined with the design of our scan strategy, will give us enough extra information to account for these inevitable problems. (But I probably won’t be able to tell you for sure until about 2011 when we’re due to make our first release of Planck results!)
Update: Further comments from Ned below. Discretion being the better part of valor, I shan’t comment on why these decisions have been left until now (although it is certainly arguable that flexibility is a good thing), and why Ned himself wasn’t consulted (suffice to say I wasn’t a member of the team that far back). However I must certainly agree that Planck’s ability to measure the polarization of the CMB would certainly be better if, as he suggests, the scan strategy visited pixels from many different directions, rather than approximately along lines of “longitude”; the measurement of polarization depends on just those directions, and having many different such measurements at the same location would make it easier to account for the aforementioned 1/f noise and possible systematic effects. Indeed, the experience of the WMAP team teaches us the difficulties of the measurement of large-scale polarization. We do believe that our raw sensitivity will be such that we can recover this polarization sufficiently accurately, but the proof will be in our results, and not in any simulations we do beforehand.