Not all CMB (Cosmic Microwave Background) experiments get launched on a rocket.
There’s a long history of telescopes flown from balloons — huge mylar balloons floating over 100,000 feet in the air. MAXIMA and BOOMERaNG were the first experiments to map out the microwave sky on the sub-degree scales containing information about the detailed physical conditions in the Universe over the first few hundred thousand years after the Big Bang. The Planck Satellite will close out that era of CMB experiments by giving us a complete picture of the microwave sky down to less than a tenth of a degree.
But there is still more to be done, even beyond what Planck is capable of. By measuring the polarization of the microwave background at even higher sensitivities than Planck, we hope to observe the effects of gravitational radiation in the early Universe.
Last week, EBEX, one of a new generation of balloon-borne experiments designed specifically with this goal in mind, had its maiden flight from Fort Sumner, New Mexico.
EBEX Launch, 6/11/09 from asad137 on Vimeo.
It’s worth remembering, of course, that even with a parachute, these telescopes hit the ground pretty hard. But these things are amazingly well-built, and the EBEX crew have managed to recover most of the hardware and all of the data. So now the team have some time to get the hardware and software ready to fly for a couple of weeks over Antarctica next year.
And let’s not forget that New Mexico is also the home of Roswell, where conspiracy theorists and other wackjobs have long been trying to uncover a supposed government cover-up of UFO sightings. Indeed, the EBEX balloon itself was spotted, but at least in neighbouring Arizona they can tell the difference.
Meanwhile, another CMB experiment, PolarBear, is about to start its first set of important tests. PolarBear is a ground-based telescope, which means it can watch the sky for far longer than a balloon, at the cost of being at the bottom of the atmosphere and all of the extra noise that adds to the signal. So despite some hard times (especially here in the UK), the next generation of CMB experiments are on the way, hoping to probe all the way back to the epoch of inflation.
Despite my almost eight years in Britain as an astronomer, I suppose I have to be embarrassed to admit I’ve never actually watched “The Sky At Night”, apparently the longest-running show on television (possibly in the whole world, not just the UK). But I’m watching this evening’s episode, mostly because I’m on it. I was filmed during last month’s trip to the Planck launch. As always, it was painful to realize the fat figure with the bad posture and annoying voice was actually me. But it was fun to watch Patrick Moore do his studio interviews, with a style and on a set neither of which seem to have changed since the 1970s.
But it was beautiful and moving to see the launch again, and to watch the much closer movies and pictures than I was able to get on the day. Since then, parts of Planck have been slowly turned on, cooled down, and checked out. Everything is working well so far; we’re looking forward to the first data in a little more than two months. It’s going to be a long summer.
The episode will briefly be available on BBC’s iPlayer, but more of my cringeworthy discussion of Planck in a different context is up on YouTube; check out the next post for much cooler cosmology video from a more photogenic cosmologist with a better voice.
Planck and Herschel are en route to their orbit at L2!
We all milled around for half an hour, snapping pictures of friends, eminent scientists, and at least one Nobel prize winner, but it all went silent when they announced the last few minutes before launch. The inevitable 10, 9, 8, 7, 6, 5, 4, 3, 2, 1 and ignition was followed by a still, silent seven or so seconds, and then we saw the smoke and flames.
(Apologies for the poor quality; there were many people there with far more powerful zoom lenses than my meagre 2.5x.)
Huge thanks to the instrument teams for their hard work for more than the last decade. Soon, the hard part for us scientists and data-analysts begins: four or so years of data coming down from the satellite, being cleaned and calibrated, building and rebuilding our (computer) model of the instrument, letting us build and rebuild our models of the Universe.
Thanks also to the HFI Instrument Principal Investigator and co-PI, Jean-Loup Puget and François Bouchet (and especially Hélène Blavot), for arranging this extraordinary opportunity for us scientists to see this part of the fruits of our work.
Today we saw the rollout of the gargantuan Planck/Herschel Ariane 5 rocket, when they move it from its assembly building to the launchpad. Spectacular!
There are plenty more pictures, and some movies, which I’ll try to edit and post shortly. At the end of the day, I was interviewed and inadvertently kidnapped by Chris Lintott and the BBC Sky at Night team. But I am here to tell the tale (and better fed for it) and ready for the — very — big day tomorrow.
Live coverage of the launch, scheduled for 2:13pm on 14 May, at:
Today was spent in Cayenne — the capital of French Guiana, where most of the hotels are located — and Kourou — home of ESA’s Centre Spatial Guyanais. We climbed up a nearby peak for a look over the Spaceport, but mostly we saw hand-sized spiders and a hazy view of some very large if indistinct structures.
Closer up, we (about a hundred scientists, obviously more than the ESA staff were used to) got a tour of the facilities, starting in the “Jupiter II” control room where the launch will actually be, um, controlled:
We also saw the launch sites for the Vega and Soyuz rockets, and of course for our own Ariane 5:
But better will be tomorrow, when we get to see the rocket — our rocket — rolled the few kilometers from its current building to the pad in preparation for Thursday’s (hoped-for) launch.
With less than a week to go before its planned launch, The Planck Surveyor Satellite has been loaded into the fairing of its Ariane 5 rocket along with its sister satellite, Herschel. It is scheduled to be rolled out to the pad on May 13, and the launch window opens on May 14 at 13:12 GMT. Within three months, it will be at the Lagrange 2 (L2) point, from where it can watch the sky with the Sun, Earth and Moon all comfortably shielded from view.
Once there, Planck will scan the sky for at least 14 months. But don’t expect to see much out of the mouths (or blogs, or printers) of Planck scientists for a while: we’ve got a full year thereafter to analyze the data, followed by a year’s “proprietary period” during which we’ll do our best to extract the most exciting science. But until then — the first rule of Planck is: you do not talk about Planck. The second rule of Planck is: you DO NOT talk about Planck. (Luckily, Herschel expects to release its pictures of the infrared and submillimetre universe much more quickly.)
For now, the European Space Agency, the UK’s Science and Technology Facilities Council, and of course us Planck scientists ourselves have been gearing up both for the scientific data — and the press.
ESA has a Herschel and Planck launch campaign page with a nifty live countdown (which users of Apple’s Safari browser can make a dashboard widget out of). Last week, STFC held a pre-launch press event in London, which got us some coverage in The Independent, The Daily Mail, The Telegraph, The Times, as well as BBC Radio and TV news. (And Sky at Night will have coverage from the launch.) We’ve also been covered in New Scientist (complete with always-exciting quotes from me).
If this media saturation isn’t enough, you can check out the page dedicated to Planck in the UK, follow Planck on Twitter (and Herschel too), and read the Planck Mission Blog (there’s one for Herschel, too).
As for me, I’m taking a break from this term’s teaching — off to French Guiana next week for the launch (barring further delays). For those of you less lucky, it will be visible on satellite tv and streamed by ESA. I’ll do my best to keep up the twittering and blogging, probably cross-posting from here to the Planck Mission Blog. Wish us luck!
The Planck Surveyor Satellite has finished its assembly and testing in Liège, Belgium, and this week was loaded onto a Volga-Dnepr Antonov AN-124 plane, and sent to Kourou, French Guiana, location of the Centre Spatial Guyanais (one of the few places near the Equator politically connected to Europe). It’s due to be launched in tandem with Herschel on April 16. Here are some pictures of the “Planck Transport and Storage Container” making its way on a “Convoi Exceptionnel” to the airstrip. These photos came to me third-hand, so my apologies and thanks to the unknown (to me) photographer.
Just a quick apology for the lack of words appearing on the page here lately. In addition to planning for the upcoming launch of the Planck Satellite, I’ve been swamped with teaching my first-ever full-length undergraduate cosmology course. It’s lots of fun, but the biggest challenge is just systematizing this whole body of knowledge that I am supposed to already know so well. Like most scientists, I don’t quite want to take the information directly from someone else’s textbook (although there are quite a few good ones at the right level, notably Rowan-Robinson’s Cosmology and Liddle’s An Introduction to Modern Cosmology) so I am trying to put it all together in a way that fits my way of thinking about it (and, I hope, my students’). But probably, this is just my version of Blake’s “I must create a system or be enslaved by another man’s” (of course I am purposefully ignoring his next line from Jerusalem, the very wrongheaded miscomprehension of science, “I will not reason and compare: my business is to create”).
P.S. If you’re a student, feel free to comment here (anonymously, if you’d prefer) or on our favorite e-learning system at Imperial.
In today’s Sunday NY Times Magazine, there’s a long article by psychologist Steven Pinker, on “Personal Genomics”, the growing ability for individuals to get information about their genetic inheritance. He discusses the evolution of psychological traits versus intelligence, and highlights the complicated interaction amongst genes, and between genes and society.
But what caught my eye was this paragraph:
What should I make of the nonsensical news that I… have a “twofold risk of baldness”? … 40 percent of men with the C version of the rs2180439 SNP are bald, compared with 80 percent of men with the T version, and I have the T. But something strange happens when you take a number representing the proportion of people in a sample and apply it to a single individual…. Anyone who knows me can confirm that I’m not 80 percent bald, or even 80 percent likely to be bald; I’m 100 percent likely not to be bald. The most charitable interpretation of the number when applied to me is, “If you knew nothing else about me, your subjective confidence that I am bald, on a scale of 0 to 10, should be 8.” But that is a statement about your mental state, not my physical one. If you learned more clues about me (like seeing photographs of my father and grandfathers), that number would change, while not a hair on my head would be different. [Emphasis mine].
That “charitable interpretation” of the 80% likelihood to be bald is exactly Bayesian statistics (which I’ve talked about, possibly ad nauseam, before): it’s the translation from some objective data about the world — the frequency of baldness in carriers of this gene — into a subjective statement about the top of Pinker’s head, in the absence of any other information. And that’s the point of probability: given enough of that objective data, scientists will come to agreement. But even in the state of uncertainty that most scientists find themselves in, Bayesian probability forces us to enumerate the assumptions (usually called “prior probabilities”) that enter into our reasoning along with the data. Hence, if you knew Pinker, your prior probability is that he’s fully hirsute (perhaps not 100% if you allow for the possibility of hair extensions and toupees); but if you didn’t, then you’d probably be willing to take 4:1 odds on a bet about his baldness — and you would lose to someone with more information.
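To make the arithmetic concrete, here is a minimal sketch of that Bayesian update in Python. The 80% figure is the sample frequency Pinker quotes; the 20:1 likelihood ratio for the photographic evidence is an invented number, chosen purely for illustration:

```python
def posterior_probability(prior_p, likelihood_ratio):
    """Update a probability via Bayes' theorem in odds form."""
    prior_odds = prior_p / (1 - prior_p)
    post_odds = prior_odds * likelihood_ratio
    return post_odds / (1 + post_odds)  # convert odds back to probability

p0 = 0.80  # P(bald | T version of rs2180439), from the sample frequency
# Seeing photos of a full-haired father: suppose such photos are 20 times
# more likely if the son is not bald (an assumed number, for illustration)
p1 = posterior_probability(p0, 1 / 20)
print(p1)  # prior odds of 4:1 on baldness become 1:5 against, i.e. ~0.17
```

The odds form makes the wager language above literal: the prior 4:1 odds are just 0.8/0.2, and new evidence multiplies the odds directly.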
In science, of course, it usually isn’t about wagering, but just about coming to agreement about the state of the world: do the predictions of a theory fit the data, given the inevitable noise in our measurements, and the difficulty of working out the predictions of interesting theoretical ideas? In cosmology, this is particularly difficult: we can’t go out and do the equivalent of surveying a cross-section of the population for their genes; we’ve got only one universe, and can only observe a small patch of it. So probabilities become even more subjective and difficult to tie uniquely to the data. Hence the information available to us on the very largest observable scales is scarce, and unlikely to improve much, despite tantalizing hints of data discrepant with our theories, such as the possibly mysterious alignment of patterns in the Cosmic Microwave Background on very large angles of the sky (discussed recently by Peter Coles here). Indeed, much of the data pointing to a possible problem was already available from the COBE Satellite, and results from the more recent and much more sensitive WMAP Satellite have only reinforced the original puzzle. We hope that the Planck Surveyor — to be launched in April! — will be able to shed light on the problem by providing genuinely new information: the polarization of the CMB on large scales, to complement the temperature maps from COBE and WMAP.
So, apologies for taking so long between posts. For now, I’ll blame twitter and its ADD version of blogging, because that at least lets me point to an interesting meeting that went on last week: the .Astronomy Conference on Networked Astronomy and the New Media. The conference brought together several related strands of astronomical computing, from the grid (the Virtual Observatory), to “citizen astronomy” (Galaxy Zoo, which is apparently being upgraded to “Universe Zoo”, Google Sky, and blogs and podcasts), to hacks and mashups built on top of current bits of distributed infrastructure, not to mention twitter itself. (Connectivity is terrible here, but much of the material from the conference is available from the conference site.)
Now, I’m in the Macedonian Greek city of Thessaloniki, lucky enough to have been invited to give a talk at From the Antikythera Mechanism to Herschel and Planck: 2500 Years of Observational Astronomy, organized by one of Imperial’s postdocs. I won’t let it go to my head, but it’s nice being treated as someone vaguely important: lunch with the vice-mayor, a nice hotel, and amusing Thessaloniki swag to cart home (although when Ute Lemper came to sing she had lunch with the Mayor himself…). My talk is this evening; the rain outside is precluding much local exploration, but at least I have some time to finish my talk (and write this).
For me, home for about 12 hours tomorrow night, and then off to a Planck meeting in Rome and then Palermo.
Finally let me also welcome Peter Coles to the astro blogosphere. His current prolixity is putting me to shame.
PAMELA (Payload for Antimatter Matter Exploration and Light-nuclei Astrophysics) is a Russian-Italian satellite measuring the composition of cosmic rays. One of the motivations for the measurements is the indirect detection of dark matter — the very-weakly-interacting particles that make up about 25% of the matter in the Universe (with, as I’m sure you all know by now, normal matter making up about 5% and the so-called Dark Energy the remaining 70%). By observing the decay products of the dark matter — with more decay occurring in the densest locations — we can probe the properties of the dark particles. So far, these decays haven’t yet been unequivocally observed. Recently, however, members of the PAMELA collaboration have been out giving talks, carefully labelled “preliminary”, showing the kind of excess cosmic ray flux that dark matter might be expected to produce.
But preliminary data is just that, and there’s a (usually) unwritten rule that the audience certainly shouldn’t rely on the numerical details in talks like these. Cirelli & Strumia have written a paper based on those numbers, “Minimal Dark Matter predictions and the PAMELA positron excess” (arXiv:0808.3867), arguing that the data fits their pet dark-matter model, so-called minimal dark matter (MDM). MDM adds just a single type of particle to those we know about, compared to the generally-favored supersymmetric (SUSY) dark matter model which doubles the number of particle types in the Universe (but has other motivations as well). What do the authors base their results on? As they say in a footnote, “the preliminary data points for positron and antiproton fluxes plotted in our figures have been extracted from a photo of the slides taken during the talk, and can thereby slightly differ from the data that the PAMELA collaboration will officially publish” (originally pointed out to me in the physics arXiv blog).
This makes me very uncomfortable. It would be one thing to write a paper saying that recent presentations from the PAMELA team have hinted at an excess — that’s public knowledge. But a photograph of the slides sounds more like amateur spycraft than legitimate scientific data-sharing.
Indeed, it’s to avoid such inadvertent data-sharing (which has happened in the CMB community in the past) that the Planck Satellite team has come up with its rather draconian communication policy (which is itself located in a password-protected site): essentially, the first rule of Planck is you do not talk about Planck. The second rule of Planck is you do not talk about Planck. And you don’t leave paper in the printer, or plots on your screen. Not always easy in our hot-house academic environments.
Update: Bergstrom, Bringmann, & Edsjo, “New Positron Spectral Features from Supersymmetric Dark Matter - a Way to Explain the PAMELA Data?” (arXiv: 0808.3725) also refers to the unpublished data, but presents a blue swathe in a plot rather than individual points. This seems a slightly more legitimate way to discuss unpublished data. Or am I just quibbling?
Update 2: One of the authors of the MDM paper comments below. He makes one very important point, which I didn’t know about: “Before doing anything with those points we asked the spokeperson of the collaboration at the Conference, who agreed and said that there was no problem”. Essentially, I think that absolves them of any “wrongdoing” — if the owners of the data don’t have a problem with it, then we shouldn’t, either (although absent that I think the situation would still be dicey, despite the arguments below and elsewhere). And so now we should get onto the really interesting question: is this evidence for dark matter, and, if so, for this particular model? (An opportunity for Bayesian model comparison!?)
OK, not a vacation in the true sense of the word: I’ve been in the US, attending meetings (in Berkeley), workshops (in Santa Fe), conferences (in Pasadena) and, because I can’t seem to escape them, teleconferences everywhere and all the time.
In Berkeley, I attended the first all-hands collaboration meeting for PolarBear, an experiment that will measure the polarization of the CMB from a telescope that will eventually be situated on the Atacama desert plain in Chile — one of the highest, driest, least accessible places on the earth, and one of the least contaminated with light or radio interference. (Despite the name of the experiment, there are no polar bears there.) First, we’ll test it at the somewhat less remote White Mountain facility in California, shake out all the bugs. PolarBear is one of a new generation of experiments that will measure the CMB using not just a few tens of detectors, but a few thousand, which brings with it all sorts of technical challenges. In hardware, the first challenge is simply making so many detectors and keeping their properties uniform each to each — these are among the most sensitive microwave detectors ever built, essentially as good as the constraints of quantum mechanics and thermodynamics allow. The second, related to the first, is to pack as many of these into a small space — the focal plane of the telescope — as possible. Traditionally, microwave detectors have used horns to guide the electromagnetic waves from the sky onto the detectors, but those horns are much wider than the detector hardware. For experiments like PolarBear, we put the detectors themselves right at the focus of the telescope and make each of them into a little antenna, receiving directly the focused light after passing through a hemispherical lens. The final hardware challenge is to get the information from these thousands of detectors off of the telescope and into our computers, which the PolarBear designers have solved with a new technique called “frequency-domain multiplexing”. 
Sort of like the way FM radio carries many stations at once by modulating each onto its own carrier frequency, each detector’s signal is modulated onto a distinct carrier so that many detectors can share a single wire; the very high-tech SQUIDs (Superconducting QUantum Interference Devices) then amplify these tiny CMB signals into data we can analyze.
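As a toy illustration of the idea (emphatically not the real PolarBear readout electronics), the following numpy sketch modulates three “detector” amplitudes onto distinct carriers, sums them onto one simulated wire, and recovers each by demodulating at its own carrier frequency. All frequencies and amplitudes are invented:

```python
import numpy as np

# Toy frequency-domain multiplexing: several signals share one wire,
# each riding on its own carrier frequency.
fs = 100_000                        # sample rate in Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)       # one second of samples
carriers = [5_000, 7_000, 11_000]   # one carrier per "detector", in Hz
signals = [0.5, 1.0, 1.5]           # slowly-varying detector amplitudes

# All detectors summed onto a single wire
wire = sum(a * np.sin(2 * np.pi * f * t) for a, f in zip(signals, carriers))

# Demodulate: mix with each carrier and average (a crude low-pass filter);
# cross-terms between different carriers average to zero
recovered = [2 * np.mean(wire * np.sin(2 * np.pi * f * t)) for f in carriers]
print(recovered)  # ≈ [0.5, 1.0, 1.5]
```

The orthogonality of the sine carriers is what lets thousands of detectors share a handful of wires off the cold stage.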
In fact, the data analysis and computing challenges are almost as significant as those faced in hardware. With thousands of detectors and a telescope that will run for the better part of several years, we have many orders of magnitude more CMB data than we’ve ever dealt with before, combined with a sensitivity goal better than a millionth of a degree. By adding more and more detectors, we can make the raw experiment itself sensitive enough to do this. What we don’t know is whether we can eliminate everything else that can possibly contaminate our results: light may spill over our shield from the 300-degree ground or directly from the atmosphere; dust in our solar system or our galaxy also glows in the bands we want to measure, as do external galaxies millions of light-years away. So our task is to compress the terabytes of data into a few interesting numbers (like the energy scale of inflation) and to simultaneously separate the cosmic signal from that produced by the instrument and from the rest of the Universe (which may be much brighter!). Suffice it to say, we have some good ideas, but until we’re confronted with real data we won’t know how successful we’ll be.
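The first step of that compression can be caricatured in a few lines: bin the noisy time-ordered samples into sky pixels and average. A hypothetical sketch (the real pipelines handle correlated noise, pointing reconstruction, and polarization; every number here is invented):

```python
import numpy as np

# Naive "map-making": average all time-ordered samples that land in
# each sky pixel. Noise averages down as 1/sqrt(hits per pixel).
rng = np.random.default_rng(0)
npix = 10
sky = rng.normal(size=npix)                      # "true" sky, one value per pixel
pointing = rng.integers(0, npix, size=100_000)   # which pixel each sample sees
tod = sky[pointing] + rng.normal(scale=5.0, size=pointing.size)  # noisy samples

hits = np.bincount(pointing, minlength=npix)                       # samples per pixel
naive_map = np.bincount(pointing, weights=tod, minlength=npix) / hits

print(np.abs(naive_map - sky).max())  # small: 100,000 samples -> ~10,000 hits/pixel
```

Even this cartoon shows the scale of the problem: the map is thousands of times smaller than the time-ordered data, and all the astrophysics has to survive that compression.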
Plus, I ate bagels (better than London; not as good as New York) and burritos, and bought shoes at cheap American prices (at least when I think in British Pounds).
Next up, Santa Fe…
Thanks to Dave for pointing out that the final results of the STFC programmatic review — sweepstakes, popularity contest, consultation exercise — have been released. Following on from the recommendations, which grouped all projects into five bands, the STFC Council has decided where and how the money will flow.
The best news overall is that only the very lowest band of projects will no longer be funded, rather than the two lowest as had originally been planned. As expected, Imperial Astrophysics has fared relatively well, with continued support for Planck, Herschel, Scuba II, UKIDSS, LISA Pathfinder and XMM Newton.
Overall, it looks like a relatively small number of projects will be “discontinued” and that STFC “will therefore ramp down funding at an expeditious but appropriate rate in consultation with the PIs/stakeholders. Where possible [they] will look for ways to ensure that there is a return on … previous investments.” In astrophysics, these projects include the UK’s contributions to the gamma-ray observatory VERITAS and the astronomical computing and data-analysis projects AstroGrid and CASU/WFAU, in particle physics the b-physics experiment BaBar, and most of ground-based Solar and Terrestrial physics. On the other hand, despite the panel recommendations which put it into the lowest band, the Mercury mission BepiColombo — which apparently threatens to consume the entire ESA science budget — will continue to be funded, because the UK contribution “is subject to an MOU [memorandum of understanding] with the Agency and will be respected.”
But the dark underside to the entire process remains the assumed 25% cut to the “grants line” — the money to pay for the actual science return on these missions, as well as all of the science that doesn’t come from large projects, mostly in the form of salaries for postdocs: theoretical physics, observations of individual astronomical objects, and just thinking hard and opening up new areas to explore. I’ve just got a big stack of grant applications to referee from STFC — let’s see how many of even the best manage to survive the cut.
[I promise to find something new to talk about, now that this unsavory episode seems to be reaching its conclusion, for now at least. Until then, you’ll just have to follow my twittering, although you’re more likely to learn about my musical tastes than cosmology…]
No time for a full blog post, but I wanted to point out the results of the STFC Consultation, now available.
Some of my favorite projects like AstroGrid seem to have not fared too well (the consultation panel rated it highly, but PPAN, responsible for the final ranks, disagreed). Nonetheless, Imperial Astrophysics projects like Planck, Herschel, Scuba II, UKIDSS, LISA Pathfinder and XMM Newton appear to have survived the cut. However,
It is important to stress that these reports are not the final conclusions of the Programmatic Review. These conclusions will be reached by STFC Council using these reports to inform their decision-making.

More later as the repercussions become clear.
But this afternoon I took a few hours off and attended the Imperial College postgraduate degree ceremony. Amongst the several hundred students receiving their degrees were all three of my first students (I celebrated their successful PhD vivas here, here and here). There were a couple of short speeches, and a few honorary degrees awarded (the morning ceremony gave one to F1 head and infamous Labour donor Bernie Ecclestone), but most of the time was taken up by the students marching up one at a time and shaking the hand of an Imperial luminary. In addition to my students, there were a few other astrophysics PhDs awarded, including to Dr Brian May, who got (by far) the biggest cheer of the day. Me, I got the rare honor of sitting on the stage of the Royal Albert Hall in my academic regalia (American PhD robes are heavier than those from the UK, for reasons that escape me — not ideal for a couple of hours under stage lights — and it now appears that my hood wasn’t even the proper maroon and black combo that my Chicago degree apparently calls for).
I admit I was inordinately proud of my students, in my meagre supervisory role as Doktorvater (to use the excellent Germanic term for supervisor): they’ve all done fantastic theses, important science, and most importantly by the end I was able to just get out of the way while they did the hard work. Congratulations again to each of them.
I kind of like the retro 60s hand-drawn feel (or is it Le Petit Prince?) but the juxtaposition of typefaces on the bottom is awful (and “Planck” should probably be more important than “HFI”).
In its continuing bid to take over all aspects of science communication, Nature magazine (or more properly, an alliance between Nature Network and the Royal Institution) will be hosting a European Science Blogging conference in August or September.
Right now, however, I’m in Norway. In addition to discussing how we’re going to measure the CMB power spectrum with Planck, I’ve already eaten a slab of reindeer, ran for an hour up and down the snowy hills, and sweated in a sauna.
I’ve been distracted from preparing a presentation by trying to make sure the UK (and, yes, our group at Imperial in particular) gets its fair share of the dwindling UK astrophysics budget: Newsnight has a pretty extensive package, filmed over the last few weeks, discussing the ongoing astrophysics funding issues. Most impressive was the strong editorial line, starting with always-irascible host Jeremy Paxman’s opening comment that “the consequences [of the funding cuts] haven’t been thought through. And they could be dire.”
From there, Susan Watts presented interviews with luminaries such as Astronomer Royal and Royal Society President Martin Rees (describing the situation as “poor management and poor planning…. ineptitude”), Royal Astronomical Society President Michael Rowan-Robinson, and footage from the NAM Town Hall meeting with the STFC Chief Executive Keith Mason. Watts explicitly asks “Who mismanaged what?” and interviewed Mason, “the man many of them [the astronomers] hold responsible”, who could only say that “we have to think in new ways”. Indeed.
In what I assume wasn’t a coincidence, the government today released the PM’s response to the petition to “reverse the decision to cut vital UK contributions to Particle Physics and Astronomy.” Alas, it just seems to be parroting the comments of the STFC Executive over the last few months. Roughly paraphrasing: “Actually, there’s no cut. Really, it looks great, if you only look at the numbers we tell you to look at. OK, well, it’s not a bad cut, anyway, and maybe the current review will convince us to make it better in the future. Oh, just stop complaining, we really love science.”
Today we heard that the (bizarrely agglomerated) UK Department for Innovation, Universities and Skills will be significantly cutting the physics budget that comes through the Science and Technology Facilities Council (STFC). STFC was formed earlier this year out of PPARC (Particle Physics and Astrophysics) and the CCLRC (which ran big facilities like the Rutherford Appleton Lab). When it was formed, we were told this would enable better science. But it seems we may have been sold a bill of goods: the science program is being saddled with what is, essentially, CCLRC’s debt, in the form of an £80 million shortfall that will fall disproportionately on academic research. And therefore, of course, on physics departments and, inevitably, physics education.
The Delivery Plan has just been announced, but of course the spin is all on the overall increase to funding, not these cuts. Happily (and a little surprisingly) the BBC highlighted the impact on physics in its usual stroppy manner.
Andy Lawrence has been following the news of the impending cuts over the last few weeks. Chris Lintott and Stuart have some more details. The headline cuts seem to be: withdrawal from the International Linear Collider (particle physicists’ next big instrument after the LHC at CERN), cessation of all support for ground-based solar-terrestrial physics facilities (i.e., telescopes and instruments that investigate the sun and its impact on the earth from the ground), and “revisiting the on-going level of investment” in gravitational wave detection, dark matter detection, the Clover CMB experiment and the UKIRT telescope. The UK will pull out of the Isaac Newton Group of telescopes.
Most important for me, so-called post-launch support for existing space missions (such as the Planck Surveyor CMB Mission, although it was never explicitly mentioned in the plan) will be cut by around 30%. This is a very cynical ploy: we will undoubtedly be so excited by the data from missions like Planck that we will donate our time, gratis, just to make sure that it gets analyzed.
There do appear to have been some small victories. Rather than a full termination as mooted last week, STFC plans “to withdraw from future investment in the twin 8-metre Gemini telescopes and we will work with our international partners to retain access to Gemini North.” So at least UK astronomers will have access to a world-class telescope in the Northern hemisphere. Most importantly, “Science Minister Ian Pearson said [on the BBC] funding arrangements would be reviewed,” — which we hope means actual compromises are possible — although of course he “did not promise extra money.”