Nine New Dwarfs Discovered in Our Local Group of Galaxies

Just as astronomers are examining dwarf planets, they’re investigating dwarf galaxies too. Two weeks ago, an international collaboration of scientists with the Dark Energy Survey (DES) surveyed the southern sky and announced in a paper in the Astrophysical Journal that they had found candidates for nine new “satellite” galaxies of our Milky Way. For those of you keeping count, and many people are, if confirmed this would bring the Milky Way’s tally to roughly 35 known satellite galaxies, which could even tell us something about the dark matter out there.

An illustration of the previously discovered dwarf satellite galaxies (in blue) and the newly discovered candidates (in red) as they sit outside the Milky Way. (Image: Yao-Yuan Mao, Ralf Kaehler, Risa Wechsler.)

The smallest known galaxies (as might be inferred from their name), dwarf galaxies are extremely faint and difficult to detect, sometimes containing only a few hundred stars and blending in with the stars in the disk of the Milky Way. They can also be difficult to distinguish from globular clusters, dense clumps of stars that evolved with a galaxy and orbit its core but, unlike dwarf galaxies, contain little or no dark matter.

Astrophysicists refer to galaxies that travel around a larger galaxy as “satellite” galaxies. In many cases, these galaxies were previously floating through space, minding their own business, until the gravitational pull of the massive galaxy captured them. Some astronomers think that is what happened to the Small and Large Magellanic Clouds, the brightest satellites of the Milky Way. (The Persian astronomer Abd al-Rahman al-Sufi recorded the LMC in 964 A.D., and it does sort of look like a “cloud.”) To give these satellites some perspective, they’re mostly between 100,000 and 200,000 light-years away, while the Milky Way’s radius is about 50,000 light-years, which is already much longer than the road to the chemist’s.

Keith Bechtol (University of Chicago) and Sergey Koposov (University of Cambridge) led parallel studies with the DES, which uses an optical/near-infrared instrument on a telescope at the Cerro Tololo Inter-American Observatory in the Chilean Andes. “The discovery of so many satellites in such a small area of the sky was completely unexpected,” says Koposov. These findings draw on only the first year of DES data, though, and the research team stands poised to discover as many as two dozen more satellite galaxies as the survey continues.

Six of the nine newly discovered dwarf satellite galaxies. (V. Belokurov, S. Koposov. Photo: Y. Beletsky.)

In 2005-2006, Koposov and his colleagues (Vasily Belokurov, Beth Willman, and others) found about half of the previously known satellite galaxies of the Milky Way with the Sloan Digital Sky Survey (SDSS), the DES’s predecessor in the northern hemisphere. Surveys like the SDSS and DES are powerful enough to detect and resolve faint dwarf galaxies that earlier observations had missed, transforming this field and stimulating interest in the Milky Way’s neighborhood.

Dwarf galaxies could reveal new information about dark matter, since the dark matter surrounding them outweighs their stars by factors of up to thousands. Astrophysicists developing numerical simulations of growing clumps of dark matter, which are thought to host galaxies within them, have long been concerned that the simulations form many more satellite clumps than the number of satellite galaxies observed around the Milky Way, a discrepancy referred to as the “missing satellites” problem. It’s not clear yet whether the newly discovered satellite galaxy candidates will solve or complicate this problem. Moreover, astrophysicists continue to worry about related puzzles, including disagreements between observed galaxies and simulations over the masses and angular momenta of dark matter clumps. In any case, scientists working with the DES continue to push the debate forward, and their ongoing survey will be of great interest to the astronomical community.
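
To give a sense of where that mass ratio comes from, here is a minimal back-of-envelope sketch in Python. The numbers are illustrative values typical of ultra-faint dwarfs, not measurements from the DES paper, and the estimator M ≈ 4σ²R/G is a commonly used approximation for dispersion-supported systems (e.g., Wolf et al. 2010), not the collaboration’s actual analysis.

```python
# Back-of-envelope dark-to-stellar mass ratio for an ultra-faint dwarf.
# Illustrative values only; not measurements from the DES paper.

G = 4.30e-3  # gravitational constant [pc * (km/s)^2 / Msun]

sigma = 5.0      # line-of-sight stellar velocity dispersion [km/s] (assumed)
r_half = 50.0    # projected half-light radius [pc] (assumed)
m_stars = 1.0e3  # stellar mass [Msun]: a galaxy of only ~1,000 Sun-like stars

# Dynamical mass within the half-light radius, M ~ 4 * sigma^2 * R / G,
# a common estimator for dispersion-supported galaxies (Wolf et al. 2010).
m_dyn = 4.0 * sigma**2 * r_half / G

print(f"dynamical mass ~ {m_dyn:.1e} Msun")            # ~1e6 Msun
print(f"dark/stellar   ~ {m_dyn / m_stars:.0f} to 1")  # ~1,000 to 1
```

With plausible inputs, the dynamical mass comes out near a million solar masses against only about a thousand in stars, which is why these tiny galaxies are such appealing dark matter laboratories.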

A simulated dark matter “halo” with satellites, possibly similar to the Milky Way. (Credit: Volker Springel, Aquarius Simulation.)

For more coverage, check out this article by Monica Young in Sky & Telescope and articles in Wired and the Washington Post. If you’re interested, you can also see my own earlier research on satellite galaxies in dark matter models and on the Magellanic Clouds.

How Do Politics Interfere with the National Science Foundation and NASA?

Why do members of Congress keep getting involved in scientists’ work? Is it because they really love science? In my opinion, this interference prevents scientific communities from setting their own priorities and getting on with their work. (I argued as much when I spoke to Senator Feinstein’s staff at her San Diego office recently.) But first I’ll describe how Representatives on the House Science Committee seek to interfere with the National Science Foundation’s peer-review process and how a Subcommittee Chair in the Senate interferes with NASA’s scientific programs. As budget negotiations begin for FY 2016, these issues take on additional importance.

Suppose the scientist Dr. X wrote a paper about her findings and wants to publish it. She’d submit it to a journal, where it would go through the peer-review process: a reviewer would assess whether the paper is sound, publishable, and appropriate for the journal. When Dr. X submits a proposal for a research grant to a federal agency, such as the National Science Foundation (NSF), the process works somewhat similarly, though more is at stake: a panel of reviewers evaluates many competing proposals and assesses their scientific merits.

In the context of budget debates during the recession and ongoing “sequestration,” it’s natural that policy-makers would scrutinize agencies’ budgets. Nevertheless, among federal R&D budgets by agency, the NSF’s is rather small, much smaller than those of the National Institutes of Health and the Department of Defense, for example, and in any case, hasn’t the NSF been doing a good job? In spite of this, last year the House Committee on Science, Space, and Technology (“House Science Committee,” for short), chaired by Representative Lamar Smith (R–TX), began “an unprecedented—and some say bizarre—intrusion into the much admired process that NSF has used for more than 60 years to award research grants,” according to science policy analyst Jeffrey Mervis.

Representatives Eddie Bernice Johnson (D–TX) and Lamar Smith (R–TX). Credit: Science Insider

In 1976, Senator William Proxmire (D–WI) began attacking scientific research with his annual “Golden Fleece” Awards, the first of which went to the NSF. These awards and Proxmire’s grandstanding generated suspicion of government spending on science. Senator Tom Coburn (R–OK) continued this legacy, primarily criticizing research grants in the Social, Behavioral, and Economic (SBE) sciences. In response, a few years ago a coalition of scientific groups started the Golden Goose Awards to highlight “examples of seemingly obscure studies that have led to major breakthroughs and resulted in significant societal impact.”

Lamar Smith’s current attack goes further than the Golden Fleece Awards by investigating the NSF’s peer-review process itself, and scientists are concerned about whether the process will remain confidential. Moreover, Smith would like to ensure that every research grant funded by the NSF is in the “national interest”; any other research, according to him, constitutes “wasteful spending.” It seems that Smith’s mission is to attack research in the social sciences, and in doing so he threatens to “compromise the integrity of NSF’s merit review system as part of this campaign,” according to House Science Committee member Rep. Eddie Bernice Johnson (D–TX). (For more coverage, see these excellent articles in Science, National Geographic, and the LA Times.)

Finally, on a more positive note, it seems that Smith and NSF Director France Córdova may eventually resolve their disagreements. Following a hearing on the NSF’s grant-making policies and procedures, Smith backed down from his previous position and appears to have endorsed the NSF’s peer-review system. This is encouraging, but I fear that the battle isn’t over.

Senator Ted Cruz (R–TX). (Credit: AP)

But it’s not just the NSF that has experienced politicians interfering in its work. NASA faces a somewhat similar situation. (The Environmental Protection Agency has also withstood attacks in recent weeks, but that’s another story.) Senator Ted Cruz (R–TX), the new chair of the Senate Commerce Subcommittee on Science and Space, which oversees NASA, is getting involved in that agency’s work. At a budget hearing, Cruz pressed Charles Bolden, a former astronaut and NASA’s administrator, to explain NASA’s funding of the earth sciences (also known as the geosciences), which Cruz claimed are not “hard science.” Cruz argued that manned space exploration is NASA’s “core mission,” and that the earth sciences have nothing to do with that.

Bolden responded, “It is absolutely critical that we understand Earth’s environment, because this is the only place we have to live…We’ve got to take care of it, and the only way to take care of it is to know what’s happening.” Moreover, according to the American Geophysical Union (AGU), quoted in Science magazine, the earth sciences and planetary sciences are inextricably linked and can’t be decoupled. (For more coverage, also check out these articles in the Guardian, Slate, and Salon.)

Cruz is right that the proportion of NASA funding going to earth science research has increased over the past few years, but there is a reason for that. In my opinion, some of the news coverage focuses too narrowly on Senator Cruz’s misguided and ill-informed views on climate science in particular. The larger issue is that politicians generally shouldn’t interfere with scientists doing their work as best they can. Scientists in the space sciences (including the earth sciences) periodically write reports known as Decadal Surveys, in which they set their short- and long-term priorities for research funding. Though there could be more interaction and better communication between scientists and policy-makers, especially when some research programs might have policy implications, that doesn’t mean that non-scientists know better when it comes to setting priorities for scientific research.

These debates don’t happen in a vacuum but are related to the larger context of federal budgets for science research, education, and public outreach. Negotiations for FY 2016 budgets are already underway, and just last week scientists and their allies advocated for a 5% increase to the NSF’s budget, primarily going to telescope construction projects and the Atmospheric and Geospace Sciences Division, as well as an 11% increase to its education budget. The debates surely will continue, and I’ll keep you posted.

Book Review: “Fukushima: The Story of a Nuclear Disaster”

First the ground shook violently, and then a succession of towering waves smashed into the island of Honshu. As people sought shelter and braced themselves during a magnitude 9.0 earthquake and tsunami, the worst and deadliest Japan had experienced in a century, they had no idea what was still in store for them. The rest of the world watched, transfixed, as the events of 11th March 2011, four years ago this week, unfolded: multiple reactor cores at the Fukushima Daiichi nuclear power plant melted down and threatened millions with radiation exposure. Today, scientists continue to assess the effects on public health and the ecological damage, while the nuclear industry still reels from its worst disaster since Chernobyl.

Fukushima: The Story of a Nuclear Disaster, published last year by Dave Lochbaum, Edwin Lyman, Susan Q. Stranahan, and the Union of Concerned Scientists (UCS), analyzes these events and their implications and consequences in detail. The Japanese are still recovering from the disaster, and the rest of us are still coming to terms with it as well, so a thorough accounting of the disaster, of Tokyo Electric Power Company’s (TEPCO) handling of it, and of the nuclear industry’s response is necessary. This investigative and well-researched book accomplishes exactly that. [Disclosure: I am a member of the UCS Science Network.]

Credit: International Nuclear Safety Center

Lochbaum and Lyman are both senior scientists and nuclear energy analysts at UCS, while Stranahan was the lead reporter of the Philadelphia Inquirer‘s Pulitzer Prize-winning coverage of the Three Mile Island nuclear accident. They appear to have written the book for a US audience, as they include investigations of the Nuclear Regulatory Commission (NRC) and of the vulnerabilities of US nuclear reactors similar to Fukushima’s.

The authors describe the tumultuous week that followed 11th March 2011, as TEPCO workers, with little information about what was happening inside Units 1-4 of the plant, scrambled to contain the meltdowns and prevent more radiation from spreading into the air, water, and land. (Residents who weren’t evacuated were told to stay indoors but remained in danger.) Flooding spread throughout the plant; the available backup power generators failed; there was insufficient water to keep the reactors cool; workers couldn’t enter buildings because they had already exceeded their allowable radiation exposure; an explosion delayed recovery efforts and scattered more radioactive material; and the spent fuel pools turned out to be as dangerous as the meltdowns themselves.

As they note in the first chapter and elaborate upon later in the book,

If a natural disaster could trigger a crisis like the one unfolding at Fukushima Daiichi, then, one might wonder, why aren’t even more safety features required to prevent such a catastrophic event from occurring? The short answer is that developers of nuclear power historically have regarded such severe events [“beyond design-basis” accidents] as so unlikely that they needn’t be factored into a nuclear plant’s design.

Lochbaum, Lyman, and Stranahan give a blow-by-blow account of the worsening disaster, at times going into more detail or background than all but the most interested reader will want to follow. The writing style is sometimes a bit dry, though there are plenty of dramatic moments too. In one particularly moving scene, Katsunobu Sakurai, the mayor of Minamisoma, a devastated coastal community just outside the twelve-mile (twenty-kilometer) evacuation zone, recorded a video pleading for assistance from anyone. “With the scarce information we can gather from the government or TEPCO, we are left isolated,” Sakurai said. “I beg you to help us…Helping each other is what makes us human being[s].” He posted the recording on YouTube, where it was viewed by more than two hundred thousand people, and relief finally poured in.

The authors also describe debates and disagreements between TEPCO and NRC officials, for instance about which of the four most damaged reactors and spent fuel pools were at risk of releasing more radiation and which presented the most pressing danger, since responders could not focus on all four units at once. The two sides also disagreed about an appropriate evacuation zone (the NRC eventually recommended a larger one) and about what officials should tell the public and US citizens in the area.

Following the disaster, antinuclear protesters resisted re-opening plants or continuing construction of new ones. With nearly three fourths of the Japanese public supporting an energy policy that would eliminate nuclear power, Prime Minister Naoto Kan announced on 6th May, “Japan should aim for a society that does not depend on nuclear energy.” As the Japan Times put it in an editorial, nuclear power “worked for a while, until, of course, it no longer worked. Now is the time to begin the arduous process of moving towards safer, renewable and efficient energy resources.”

The NRC outlines several levels of nuclear power reactor “defense-in-depth”: first an initiating event occurs, which can be followed by core damage, then radiation release, then exposure to the public. Safety measures at each level are intended to prevent the accident from worsening to the next level, but each successive level carries more uncertainty. More importantly, a beyond-design-basis accident can overwhelm all levels of safety measures at once.
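
As a toy illustration of that logic (my own sketch, with invented probabilities, not figures from the book or the NRC): if the levels fail independently, the chance of an accident propagating all the way to public exposure is the product of small per-level failure probabilities, but a common cause like an extreme tsunami can defeat several levels at once.

```python
# Toy model of reactor "defense-in-depth" escalation.
# All probabilities are invented for illustration; none come from the book
# or from NRC documents.

p_event       = 1e-2  # an initiating event occurs in a given year
p_core_damage = 1e-2  # ...and safety systems fail to prevent core damage
p_release     = 1e-1  # ...and containment fails to prevent a radiation release
p_exposure    = 1e-1  # ...and emergency measures fail to shield the public

# If each level fails independently, the combined risk looks reassuringly tiny:
p_chain = p_event * p_core_damage * p_release * p_exposure
print(f"independent-failure estimate: {p_chain:.0e} per year")   # 1e-06

# A beyond-design-basis event breaks the independence assumption: one common
# cause (e.g., an extreme tsunami) can defeat every level at once, so the real
# risk is set by the rate of that event, not by the product above.
p_common_cause = 1e-4  # invented rate of an event that overwhelms all levels
print(f"common-cause estimate:        {p_common_cause:.0e} per year")
```

The point of the sketch is only that the multiplication trick, and the reassurance it provides, collapses when a single event can knock out all the barriers together.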

Credit: International Nuclear Safety Advisory Group (INSAG)

It turns out that the US has numerous Mark I boiling water reactors similar to the ones in Japan, with similar safety measures, since nuclear regulations in the two countries are broadly alike. Following Fukushima, some analysts argue that many nuclear reactors throughout the US could be vulnerable to floods, fires, and earthquakes, and that people are not sufficiently prepared for such events. For example, 34 reactors at 20 sites around the US are located downstream from large dams, and “the threat posed by the failure of those dams was not taken into account when the plants were licensed.” The authors highlight particular examples: the three-unit Oconee Nuclear Station in South Carolina is especially at risk, and the Prairie Island nuclear plant southeast of Minneapolis is another. People assume that “it can’t happen here” in the US, but apparently it can, which leads to the critical question: “how safe is safe enough?” It’s a complicated question, and it remains unanswered.

The Japanese continue to recover from the literal and figurative fallout of Fukushima. Four years after the disaster, scientists are assessing the damage and recovery, sailors are suing TEPCO over radiation exposure, the NRC can’t decide how to proceed, and researchers are studying possible contamination of food supplies and the ecological toll. The thorough analysis in Fukushima remains extremely relevant today, and those interested in the risks and challenges of the nuclear industry will do well to read it.

My Views

In my opinion, the authors could have included a little more discussion of nuclear energy in the context of energy policy, and of its implications as we move toward a carbon-limited economy, though that was admittedly beyond the scope of their book. In the US, in spite of Three Mile Island, Browns Ferry, and other accidents and near-accidents, nuclear energy remains a major source of electricity. Many countries oppose nuclear energy, while others, such as France, Russia, China, and South Korea, operate many plants and have more under construction.

Source: NRC, DOE/EIA

At this point, it might not be possible for the US to transition to a low-carbon economy without including nuclear energy as part of the transition. In the long term, I believe that solar and wind power have the most potential with the least risk, and countries such as Germany have shown that it is possible to ramp up investment in wind and solar in a short period of time. Who knows, maybe fusion energy will be a possibility in the very long term, but as I’ve noted before, the ITER experiment is behind schedule, over budget, and plagued by management problems. Finally, we must focus on energy demand, not just supply. We should work on making our cities, industries, transportation, and communities less energy intensive, and it will be worth the effort.

NASA Missions Exploring Dwarf Planets Ceres and Pluto

Now, I’m not a planetary astronomer, but like you, I’m excited by any kind of space exploration, and this year NASA’s Dawn and New Horizons missions will give us the closest and most detailed views of dwarf planets yet.

What is a “dwarf planet,” you ask? Excellent question. Until about ten years ago, astronomers usually referred to small planet-like objects that were not satellites (moons) as “planetoids.” In some ways, they resembled the eight more massive planets in our solar system as well as Pluto, which had a borderline status. Astronomers had recently discovered Eris (initially designated 2003 UB313), which turned out to be even more massive than Pluto, and under an early IAU draft proposal it, Charon, and Ceres would all have joined the ranks of planets; with many more such discoveries expected, our esteemed class of planets looked set to expand rapidly. Either they all had to be included, or a clear classification system would have to be devised and Pluto would be reclassified.

Courtesy: IAU

At the International Astronomical Union (IAU) meeting in Prague in 2006, astronomers opted for the latter in Resolution 5. They demoted poor Pluto, but I think they did the right thing. (I was working in Heidelberg, Germany at the time, and if I’d known how historic this IAU meeting would be, maybe I would’ve tried to attend!) The IAU defines a dwarf planet as “a celestial body that (a) is in orbit around the Sun, (b) has sufficient mass for its self-gravity to overcome rigid body forces so that it assumes a hydrostatic equilibrium (nearly round) shape, (c) has not cleared the neighbourhood around its orbit, and (d) is not a satellite.” Criterion (c) is the important one here, because it means that the object has not become gravitationally dominant in its orbital zone; this is the case for Pluto and the other planetoids beyond Neptune, and for Ceres, the only dwarf planet in the asteroid belt between Mars and Jupiter. These are contentious issues, and the debate even made it into the New Yorker. But let’s be clear: these things are small, and they’re all less massive than Earth’s Moon.
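
The resolution’s criteria amount to a small decision tree. Here is a minimal sketch of it in Python; the function and its flags are my own illustration, while the criteria (a)-(d) are quoted above.

```python
def classify_body(orbits_sun: bool, nearly_round: bool,
                  cleared_neighborhood: bool, is_satellite: bool) -> str:
    """Sort a solar system body using the IAU's 2006 criteria (a)-(d)."""
    if is_satellite or not orbits_sun:
        return "satellite (or not covered by the definition)"  # fails (a)/(d)
    if not nearly_round:
        return "small solar system body"                       # fails (b)
    if cleared_neighborhood:
        return "planet"                                        # meets (a)-(c)
    return "dwarf planet"                      # meets (a), (b), (d); fails (c)

# Pluto and Ceres orbit the Sun and are nearly round, but neither dominates
# its orbital zone, so criterion (c) makes them dwarf planets:
print(classify_body(True, True, False, False))  # -> dwarf planet
print(classify_body(True, True, True, False))   # -> planet (e.g., Neptune)
```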

We don’t know as much about dwarf planets as we do about the major planets in our solar system, so let’s go exploring! What do these new space missions have in store for us?

Ceres

In 2007, NASA launched the Dawn spacecraft to study Ceres up close. A couple of days ago, more than two centuries after the Sicilian astronomer Father Giuseppe Piazzi discovered Ceres, Dawn became the first spacecraft to orbit a dwarf planet. As deputy Principal Investigator Carol Raymond put it on Friday, this is an “historic day for planetary exploration.” Jim Green, NASA’s Planetary Science Division Director, says that with Dawn, we are “learning about building blocks of terrestrial planets in our solar system.”

Dawn has obtained excellent detailed images already, as you can see in the (sped up) animation below.

Credit: NASA/JPL-Caltech/UCLA/MPS/DLR/IDA

The pair of bright spots in a crater stand out, and astronomers are trying to figure out what they are. They might be an indication of geological activity on the dwarf planet’s changing surface. Ceres has a rocky core and an icy layer, so it’s also possible that these are reflective patches of ice exposed by space rocks striking the surface. For more information, check out this blog post by Emily Lakdawalla and these articles in the LA Times and Wired.

As Dawn uses its propulsion systems to reshape its orbit and get closer views, astronomers expect to learn more about those spots, look for plumes, and examine the surface for strange craters or other distinguishing features. The spacecraft will later turn on its spectrometers and determine which minerals are present and how abundant they are.

Pluto

NASA launched New Horizons in 2006, and it has had much farther to travel to reach Pluto. In January, NASA announced that New Horizons was making its approach to the erstwhile planet, though it was still about 200 million kilometers away. Mark your calendars: it will fly by Pluto on 14th July (it will be traveling too fast to enter orbit), and at a distance of only 13,000 km, New Horizons’ instruments will obtain the best images of Pluto yet. For more information, check out this article by Jason Major in Universe Today and this one by Phil Plait in Slate.
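
A quick sanity check, using only the rough figures above, shows why a flyby is the only option; the ~165-day interval between mid-January and 14th July is my own reading of the dates, so treat the result as approximate.

```python
# Rough average approach speed of New Horizons, from the figures in the post.
distance_km = 200e6  # distance remaining as of (roughly) mid-January [km]
days = 165           # approx. days from mid-January to the 14th July flyby

speed_km_s = distance_km / (days * 86400)
print(f"average approach speed ~ {speed_km_s:.0f} km/s")  # ~14 km/s
```

At roughly 14 km/s, carrying enough propellant to brake into orbit around a body with Pluto’s feeble gravity would have been out of the question, hence the flyby.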

Distant image of Pluto by New Horizons. Credit: NASA/Johns Hopkins APL/Southwest Research Institute.

A few years ago, leaders in planetary astronomy highlighted the importance of Dawn and New Horizons in their Decadal Survey. I think both space missions will turn out to be worthwhile, so let’s stay tuned to see what they discover over the next few months.

Geoengineering and Climate Interventions: Too Risky or Needs More Research?

At the American Association for the Advancement of Science (AAAS) meeting in San Jose in February, scientists from the US National Research Council released two high-profile reports on climate interventions and geoengineering techniques. The most thorough evaluation of its kind, this pair of studies assesses proposed climate intervention approaches, including their costs, technological readiness, uncertainties, impacts, challenges, and risks. As the Earth and its inhabitants experience a changing climate unlike anything in recorded human history, and as concentrations of greenhouse gases in the atmosphere continue to rise, scientists are interested in considering all possible responses.

The research committee, informed by numerous analysts and staff, consists of an impressive array of experts from a variety of institutions and universities, including Ken Caldeira (Carnegie Inst. for Science), Lynn Russell (Scripps Inst. of Oceanography), and David Titley (Penn State), and it is chaired by Marcia McNutt, editor-in-chief of Science and former director of the US Geological Survey. The National Academy of Sciences (NAS), the US intelligence community, NASA, NOAA, and the Dept. of Energy sponsored the studies. One can access both full reports and a 4-page summary at the NAS website.

Photograph: Frank Gunn/The Canadian Press/Associated Press

The authors avoid the more commonly used term “geoengineering,” which they themselves used in previous reports, both because the techniques under consideration involve the atmosphere and not just the Earth and because engineering “implies a greater degree of precision and control than might be possible.” Instead, they propose the term “intervention,” with its connotation of “an action intended to improve a situation.”

Through these reports, the committee makes three main recommendations and conclusions. First, the authors argue that there is no substitute for climate change mitigation and adaptation. Second, they recommend research and development investment to improve methods of carbon dioxide removal and disposal at scales that would have a significant global climate impact. Third, they oppose deployment of albedo-modification techniques but recommend further research.

Carbon dioxide removal and sequestration

NAS report: “Climate Intervention: Carbon Dioxide Removal and Reliable Sequestration”

Carbon dioxide removal (CDR) strategies involve capturing carbon in the terrestrial biosphere or the ocean after it’s been emitted. These approaches are intended to mimic or accelerate processes that are already occurring as part of the natural carbon cycle. The authors consider five types of CDR techniques: land-management approaches such as forest restoration; accelerated weathering techniques (allowing the oceans to absorb more CO2 than normal); ocean iron fertilization (so that more microorganisms such as plankton consume CO2, like a “biological pump”); bioenergy (using biomass) followed by CO2 capture and sequestration; and direct air capture of carbon.

The authors describe ocean fertilization and direct air capture as “immature technologies,” while land management and weathering processes have only been carried out on a limited scale, and bioenergy is limited by the availability of land for biomass and by the need to transport it to processing facilities. The barriers to CDR deployment involve slow implementation, limited capacity, policy considerations, and high costs of currently available technologies. The committee concludes the report with the following recommendation:

Recommendation 2: The Committee recommends research and development investment to improve methods of carbon dioxide removal and disposal at scales that would have a global impact on reducing greenhouse warming, in particular to minimize energy and materials consumption, identify and quantify risks, lower costs, and develop reliable sequestration and monitoring.

Albedo-modification research

NAS report: “Climate Intervention: Reflecting Sunlight to Cool Earth”

Albedo-modification techniques ignore the greenhouse gases themselves and instead seek to avoid global warming by reflecting sunlight before it reaches the Earth’s surface. Such methods could lower average global temperatures within a couple of years, much like the effects of volcanic eruptions such as Mount Pinatubo’s in the Philippines in 1991. The authors mainly consider two methods for scattering sunlight: injecting millions of tons of aerosol-forming gases into the stratosphere, and marine cloud brightening, which increases the efficiency with which ocean clouds reflect sunlight. They also briefly consider other techniques, including space-based methods that would place scatterers or reflectors beyond the atmosphere, and cirrus cloud modification, which would let more long-wave radiation escape into space.
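
For a sense of scale, here is a standard back-of-envelope energy-balance estimate (my own arithmetic with textbook values, not a calculation from the reports): offsetting the roughly 3.7 W/m² of radiative forcing from doubled CO2 would require deflecting only about one to two percent of the sunlight Earth currently absorbs.

```python
# How much extra reflection would offset the forcing from doubled CO2?
# Textbook values; this back-of-envelope calculation is mine, not the NAS's.

S = 1361.0     # solar constant [W/m^2]
albedo = 0.30  # Earth's approximate planetary albedo
dF = 3.7       # canonical radiative forcing of doubled CO2 [W/m^2]

absorbed = (S / 4.0) * (1.0 - albedo)  # global-mean absorbed sunlight
print(f"absorbed solar flux    ~ {absorbed:.0f} W/m^2")   # ~238 W/m^2

fraction = dF / absorbed               # fraction of sunlight to deflect
print(f"sunlight to reflect    ~ {100 * fraction:.1f}%")  # ~1.6%

d_albedo = dF / (S / 4.0)              # equivalent planetary albedo increase
print(f"albedo increase needed ~ {d_albedo:.3f}")         # ~0.011
```

A one or two percent change sounds small, but applying it uniformly and continuously over the whole planet is exactly the kind of intervention whose side effects the reports flag as risky.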

The authors acknowledge that albedo-modification techniques only temporarily mask the warming effect of greenhouse gases and would need to be sustained indefinitely. In addition, there could be unanticipated and unmanageable risks and consequences, “including political, social, legal, economic, and ethical dimensions.” Therefore, the committee comes to the following simple conclusion:

Recommendation 3: Albedo modification at scales sufficient to alter climate should not be deployed at this time.

Media Response

It’s interesting that, given the same pair of reports, journalists at different media outlets presented the results in a variety of ways, demonstrating the many perspectives with which people approach these issues. For example, journalists and editors at the New York Times, Los Angeles Times, Science, and National Geographic point to the need for more research, primarily on carbon dioxide removal techniques. On the other hand, Suzanne Goldenberg at The Guardian writes that the consideration of planetary-scale interventions shows how concerned scientists have become about advancing climate change, while Alexandra Witze at Nature writes that these reports legitimize geoengineering, even though many of the climate intervention approaches are deemed too risky.

I would argue that most of these journalists describe the study correctly, but since the study makes multiple recommendations that are somewhat in tension with each other, and since the committee includes people with different views and backgrounds, it’s inevitable that some readers would respond more to some aspects of the reports than to others. You may also be interested in critical responses by people blogging with the Union of Concerned Scientists and the National Association of Science Writers.

Moving Ahead

Finally, I’ll end with my view of this study and of climate interventions. I’m not sure that the term “climate interventions” is an improvement over “geoengineering”: the former amounts to re-branding the issue, and it sounds less serious. Make no mistake, what scientists consider in these reports is serious stuff indeed. And as some have noted before, such a scheme was already imagined by “The Simpsons” villain Mr. Burns.

The study’s authors state that there is no substitute for climate mitigation and that we should focus on the root cause of climate change, which is the carbon dioxide in the Earth’s atmosphere. However, the carbon emissions are themselves caused by human society’s growing energy demand and the widespread use of fossil fuels: coal, oil, and gas.

The authors point out that most geoengineering schemes are too risky, involve immature technologies, carry high costs, and could have unknown consequences on a planet-wide scale. Is it really worthwhile to invest in more research on them? The only exception is forest restoration and other land-management methods, which would help when combined with reduced carbon emissions; I wouldn’t group them with the other carbon-capture climate interventions.

I worry that these reports could pave the way for wasting large investments of funding and effort on researching these schemes, rather than focusing on the goal of slowing and eventually stopping climate change by transitioning to a low-carbon economy. Moreover, if people believe that a technological solution will be possible in the distant future, they will not strive so hard to reduce carbon emissions today and will continue with business as usual. Above all, we should be focusing on expanding climate mitigation efforts. We should also work on climate adaptation, since the carbon already in the atmosphere will cause some warming in the coming decades no matter what.