Current Views of Climate Change: the general public versus top presidential candidates

I crossed the country last weekend to participate in the annual meeting of the American Association for the Advancement of Science in Washington, DC. It’s the biggest general science meeting of the year in the U.S., and I was excited to attend along with thousands of other scientists, science writers, science policy experts, and educators. I darted from session to session to catch as many interesting talks as I could, including ones about gravitational waves (of course!), science in Iran, communicating science with humor, and grand visions of the future of science—presented by the heads of NASA and the National Science Foundation, among others.

But I’d like to share some other findings presented at the AAAS meeting, about public opinion on science and technology issues. Cary Funk of the Pew Research Center warned that journalists should not oversimplify the state of affairs. “There are a mix of factors underlying public attitudes toward science-related topics,” she said.

What do people think?

Based on Pew and Gallup surveys, it seems that people’s views on climate change vary with political ideology or party affiliation, with age, and to some extent with geographic location. Their views don’t seem to vary as much with gender, race, religion or education level.

Latin America and Africa are more concerned about climate change than the U.S. and China. (Credit: Pew Research Center.)

It turns out that views of climate change are different around the world. In particular, Latin Americans and Africans, more than people elsewhere, think that climate change is a very serious problem and that it’s harming people now, and they’re more concerned that climate change will harm them personally. In contrast, people from the U.S. and China—the world’s biggest greenhouse gas emitters—expressed much less concern.

“Overall, people in countries with high levels of carbon dioxide emissions per capita tend to express less anxiety about climate change than those in nations with lower per-capita emissions,” the 2015 Pew report said.

The political divide has nearly doubled in the last 15 years for people who “worry a great deal or fair amount about global warming.” (Credit: Gallup, Inc.)

This probably won’t surprise you, but Lydia Saad and her fellow researchers at Gallup see a huge political divide between those who identify as “Republican” and those who identify as “Democrat” on several questions: how much people worry about global warming; whether they consider it a serious threat; whether they believe its effects are already occurring; whether they believe there is a scientific consensus; and whether they believe it is caused by human activity.

The chasm widened after 2008—when President Obama took office—in spite of the fact that the Obama administration did almost nothing to address climate change until two years ago. (Saad didn’t say that; that’s me editorializing.)

The political gap has also widened for people who “think scientists believe global warming is occurring.” (Credit: Gallup, Inc.)

The 2015 Pew survey finds that people have similar political differences on fracking, prioritizing wind and solar energy over fossil fuels, offshore drilling, and regulating power plant emissions.

And here’s the kicker: during an election year and following the warmest January on record, climate change currently ranks only #14 on the list of voters’ priorities, according to a Gallup poll this month. (The economy, jobs, and national security topped the list.) Still, nearly half of those surveyed considered climate change extremely or very important in their vote for president, so we should ask what the top presidential candidates have to say about these issues.

What do the presidential candidates think?

Now that the relentless, ceaseless, interminable, monotonous and tedious political campaign nears its end—with nine months to go before it gives birth to a fledgling president—it seems to be a good time to review the candidates’ positions on important issues relevant to science, especially climate change and energy policy. This takes on extra importance now, as the Supreme Court has complicated or delayed efforts to implement the Clean Power Plan. Depending on who replaces Scalia, completing this plan and building on it may be the charge of Obama’s successor.

I’ve ordered these candidates alphabetically by party and then by last name.

Hillary Clinton (Democrat, former Senator and Secretary of State)

Clinton says that she will expand clean energy, especially solar; create clean energy jobs; improve energy efficiency in homes and other buildings; and increase the fuel efficiency of cars and trucks. Since last fall she has also opposed the Keystone XL pipeline. (She had not taken a position one way or the other before that.) Clinton also has a $30 billion plan to “revitalize coal communities” and help them transition toward an economy based on cleaner energy sources.

She has a modest goal of reducing greenhouse gas emissions by 30% below 2005 levels by 2030. Note that, like Obama, Clinton has moved the goalposts, as they say, from the standard baseline: relative to 1990 emissions, this would amount to a reduction of less than 4%, which is tiny compared to plans proposed by European countries and Russia.
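Since percentage targets shift with the baseline year, it may help to see the arithmetic. This is a minimal sketch; the emissions totals below are illustrative assumptions (round numbers in the ballpark of U.S. greenhouse gas inventories), not the exact figures behind any candidate’s pledge.

```python
# Sketch: the same emissions target reads as a different percentage cut
# depending on the baseline year you measure it against.
# These totals are illustrative assumptions (GtCO2e), not official inventory data.
E_1990 = 6.4  # assumed 1990 emissions
E_2005 = 7.4  # assumed 2005 emissions (higher, so cuts vs. 2005 sound bigger)

def cut_vs_1990(cut_vs_2005_pct):
    """Re-express a percent cut from 2005 levels as a percent cut from 1990 levels."""
    target = E_2005 * (1 - cut_vs_2005_pct / 100)
    return (1 - target / E_1990) * 100

# With these assumed totals, a hypothetical 20% cut from 2005 levels
# re-baselines to a smaller-sounding cut relative to 1990.
print(f"20% below 2005 = {cut_vs_1990(20):.1f}% below 1990")
```

The general point: a pledge measured against a high-emissions baseline year shrinks when re-expressed against 1990, which is why the choice of baseline matters when comparing plans.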

Bernie Sanders (Democrat, Senator)

Like Clinton, Sanders supports improving energy efficiency in buildings, electricity grids and cars; investing in renewable energies, especially solar and wind; and creating many green jobs. In contrast with Clinton, he opposes fracking and offshore drilling. In December he released a climate action plan, in which he advocates for a carbon tax and for steeper carbon emission cuts by 2030.

Dr. Jill Stein (Green, Physician)

Stein also has an ambitious climate action plan, and her stance on many energy and climate issues is similar to Sanders’s. Her plan includes a “Green New Deal” to promote the creation of millions of green living-wage jobs by investing in clean energy infrastructure, public transit, and more sustainable agriculture. Uniquely among the candidates, she aims to achieve 100% clean energy for the U.S. by 2030.

Gary Johnson (Libertarian, former Governor of New Mexico)

Johnson, a leading Libertarian candidate, does not appear to have a climate plan or a detailed energy policy. He accepts that climate change is human-caused. He favors natural gas and to some extent coal power plants, and he emphasizes a free-market approach and opposes cap-and-trade systems.

Ted Cruz (Republican, Senator)

Cruz, like all of the leading Republican candidates but unlike candidates from any other party, does not believe that climate change is happening. He opposes “climate change alarmism.” He is the chairman of the Senate subcommittee on Space, Science, and Competitiveness, and he believes that there is no consensus among scientists about climate change. Cruz supports fracking, the Keystone pipeline, and increasing offshore drilling.

Marco Rubio (Republican, Senator)

Rubio’s positions appear to be similar to Cruz’s. It’s not clear to me whether he believes climate change is occurring, but he has clearly stated that it is not human-caused. Like Cruz, he supports fracking, the Keystone pipeline, and increasing offshore drilling, and he opposes cap-and-trade programs.

Donald Trump (Republican, Businessman)

Trump does not have a climate or energy policy. He believes that climate change is not happening; it’s just the weather.

News from Monterey: Low Water Levels, Fracking Debates

Here are a couple of new stories I reported and wrote for the Monterey Herald newspaper over the past couple of weeks:


Water levels on the rise, but slowly in Monterey County

As Northern California was pelted with rain to start the new year, there seemed to be reason to celebrate as the critically low levels at some lakes and reservoirs rose quickly.

Lake Shasta went up by half since the beginning of winter, to 46 percent of capacity. Folsom Lake, east of Sacramento, rose 44 feet in just over a month, and Lake Oroville rose 20 feet, according to the California Department of Water Resources.

A fishing boat motors across Lake Nacimiento in San Luis Obispo County on Tuesday, January 26, 2016. The recent rains have raised the water level to 22% of capacity. (Vern Fisher – Monterey Herald)

But that wasn’t the case everywhere, including the two Southern Monterey County lakes, Nacimiento and San Antonio, key bodies of water in recharging Salinas Valley aquifers.

As at lakes and reservoirs throughout the state affected by years of drought, water levels at Nacimiento and San Antonio had dropped to dire lows. Last summer, Lake San Antonio fell dramatically to 3 percent of capacity, and recreational facilities there were closed.

Last month, Lake Nacimiento stored water at only 16 to 17 percent of capacity. It has now risen to 22 percent.

“Most of the gains came from the last couple rain events over the past week,” said German Criollo, a hydrologist with the Monterey County Water Resources Agency.

Lake San Antonio, on the other hand, currently remains at 3 percent of capacity. This is so low that engineers refer to it as a “dead pool”: gravity cannot pull any water out of the reservoir when it is at such levels…

[For more, check out the entire story in the Monterey Herald, published on 26 Jan. 2016.]


Fracking: Environmental group startled by pro-oil production radio/TV campaign

While Robert Frischmuth was watching the Democratic debate a few weeks back, he was startled to see a new commercial promoting oil and natural gas development in Monterey County. It turned out to be part of a TV and radio ad campaign that promises economic benefits and thousands of jobs for the region.

Oil fields near San Ardo in southern Monterey County. (Vern Fisher – Monterey Herald)

Frischmuth is a member of Protect Monterey County, a local nonprofit group that focuses on environmental issues. He and his colleagues believe the campaign comes in response to a ballot initiative they are preparing for the November election.

“We think it’s unprecedented that the oil companies would spend so lavishly on advertising when the initiative isn’t even drafted yet,” said Mary Hsia-Coron, another member of the group.

Californians for Energy Independence is credited on the video for sponsoring the ad. Its coalition includes Chevron Corp., Exxon Mobil Corp., Occidental Petroleum Corp., and other oil and gas corporations.

“We’re supporting local energy production,” said Karen Hanretty, spokeswoman for Californians for Energy Independence. “These folks in Monterey would like to ban oil production in California, but that would be terrible for the state and terrible for the county.”

The ballot measure would likely propose a ban on fracking, not on all forms of oil well stimulation…

[For more, check out the entire story in the Monterey Herald, published on 3 Feb. 2016. As usual, thanks to David Kellogg for help editing it.]

Book Review: “Trace” by Lauret Savoy

As I work on improving my essay writing skills, I’ve attempted to expand my horizons and read a wide variety of authors, including those with whom I’m not familiar. I recently came across Lauret Savoy, who in her new book, Trace, offers us a different perspective on nature, the environment, geography and American history, including its evolving race relations. She focuses on how people and communities interact with nature, which shouldn’t be viewed as some pristine thing that white people enjoy every once in a while.


Her writings dovetail with environmental justice, which is something I’ve been thinking about over the past few years. It refers to the “fair treatment and meaningful involvement of all people regardless of race, color, national origin, or income” (as the EPA defines it) with respect to environmental regulations and policies. In my opinion, people whose work or activism involves race, class, gender and other power relations often ignore environmental issues, while environmentalists are often white and operate in a vacuum as if those other divisions aren’t important.

But environmental justice brings these issues together. It grew out of the civil rights movement when people of color realized they were often suffering silently while disproportionately affected by toxic waste sites, power plants, landfills, and other environmental hazards. In one of my first guest blog posts (outside of this blog), for the Union of Concerned Scientists a couple years ago, I argued that climate change is an environmental justice issue, as the people most harmed by rising sea levels, floods, extreme droughts and heatwaves are those who did the least to contribute to the problem. Savoy considers these kinds of issues as she weaves in environmental justice in her new book, referring to “people of color and the economically poor [who] live, and die, next to degraded environments.” She argues that the concept of “ecological footprint” should account for dispossessed people and people’s labor.

In Trace, a slim yet powerful volume, Savoy invites us to accompany her as she traces through her travels, her past, and her family history, following the paths she and her predecessors have taken. She explores varied and uneven terrain through ever changing and troubled relations between race and the American landscape. The book is sort of a collection of interconnected essays, which fit together into a cohesive story. Each chapter searches a particular place, asks questions about its origins and names, and considers her and others’ experiences there. “The American landscape was in some ways the template, but also the trigger, to each of the searches,” she said in an interview about the book.

Savoy is a professor of environmental studies and geology at Mount Holyoke College in Massachusetts. She identifies as a woman of African American, Euro-American, and Native American heritage. During her childhood, she lived in California and journeyed to Arizona, through Mexican borderlands, and across the continental divide, and she takes us through each of these places. Early in the book, she tells a story about how, as a 7-year-old girl at a gift shop at the north rim of the Grand Canyon, she tried to purchase some postcards displaying photos of places she liked. But the woman behind the counter wouldn’t sell them to her, and she ran off into the woods. It was one of her first experiences of racism.

In another chapter, she analyzes a book she clearly loves, Aldo Leopold’s A Sand County Almanac. Leopold enlarged the boundaries of “community” to include “soils, waters, plants, and animals, or collectively: the land,” but she fears that the “we” in his book excludes her. She then recalls a novel her father wrote as a young man, called Alien Land, and this leads her to consider the chasms evident in an “alien land ethic.”

Savoy’s extensive background in the earth sciences comes through in beautifully written passages such as this one about stones lying on an island beach: “…each cobble a relic of a remote past and a piece of and in this present. These fragments of placed-memory could trace, to the geologist’s eye, a continent’s coming of age as it shifted and rifted in a tectonic-plated world. They also pointed north toward ghosts of ice sheets grinding across the shield.”

While exploring Trace and Savoy’s other writings, I encountered an excellently written essay by Catherine Buni in the LA Review of Books, where she champions a wider view of nature writing. In a few sentences, she sums up Savoy’s book: “In blazing, beautiful prose, unblinkingly researched and reported, Savoy explores how the country’s still unfolding history, along with ideas of ‘race,’ have marked her and the land. She also traces, in a mosaic of journeys across a continent and time, her mixed-blood ancestry, carefully taking apart the frame at dovetail joints, curiously inspecting and turning over the smallest points of connection, omission, dislocation, and break.”

Throughout her work, Savoy advocates for a more diverse and nuanced view of nature and the environment, and she encourages us to remember that each place we visit has a complex history. I welcome and respect her voice. I found Trace to be fascinating and inspiring to read, and I think anyone who enjoys Leopold, Walt Whitman, Rachel Carson, Annie Dillard and Terry Tempest Williams would love this book too. Next time you go on a walk through your neighborhood park or on a road trip in the Southwest, bring it with you.

Exciting and Controversial Science: Gravitational Waves and a New Ninth Planet?

We’ve had some fantastic astronomical news this month. Last week, we encountered evidence of a “new ninth planet” lurking in the outer reaches of our solar system—170 years after the discovery of Neptune. And earlier in January, we heard a cacophony of whispers about minute gravitational waves being detected for the first time ever. Either one, if confirmed, would thrill both astrophysicists and space lovers and would rank as the biggest discovery of 2016. We should be excited about them, but we should be careful about getting our hopes up so soon.

A New Planet, Far, Far Away?

A couple fellow science writers and I went hiking at Castle Rock State Park in the middle of the Santa Cruz Mountains yesterday, and along the trail, we encountered a variety of people. On our way down, we happened to overhear a conversation: “What’s your favorite planet?” followed by a reply, “Did you hear about the new planet scientists discovered?” Isn’t that great? I’m glad that the story got so much media attention and made it to the front pages of newspapers. It intrigued people, and they’re talking about it.

By studying the strangely aligned orbits of Kuiper Belt Objects far beyond Pluto’s realm, astronomers may have inferred evidence of a planet up to 10 times more massive than Earth. It would orbit much, much farther out than Pluto, making it hard to spot. And from that distance, our sun would look almost like any other star. But if it exists, a new world (dubbed “Planet X”) joining our solar system’s family, even such an estranged cousin, would be exciting indeed.

Eric Hand (Science magazine) points out that the Subaru Telescope could search for Planet X. (Data) JPL; Batygin and Brown/Caltech; (Diagram) A. Cuadra/Science

Nevertheless, we should be concerned that the results are still very uncertain. The authors of the paper in the Astronomical Journal, Konstantin Batygin and Mike Brown (both at Caltech), argue that there’s only a 0.007% chance, about one in 15,000, that the clustering of the distant objects’ orbits is a coincidence. But the behavior of the orbits could have other, possibly more likely, explanations, such as other unseen Kuiper Belt Objects with orbits aligned in the opposite way. (Other astronomers, like Scott Sheppard and Greg Laughlin, estimate the chance of a planet really being out there at 60–70%. I wouldn’t bank on those odds.)
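As a quick back-of-the-envelope check on that figure (the probability is the one quoted in the paper), converting 0.007% into “one in N” odds is a one-liner:

```python
# Convert the quoted 0.007% chance-of-coincidence into "one in N" odds.
p = 0.007 / 100        # 0.007 percent, expressed as a probability
odds = 1 / p           # roughly 14,000-to-1, i.e. "about one in 15,000"
print(f"about 1 in {odds:,.0f}")
```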

For that reason, we should remain skeptical for now. Some reporters and editors were a bit more careful than others. For example, while some headlines used appropriately hedging words like “suggest” and “may,” papers like the Denver Post and Washington Post had “The New No. 9” or “Welcome to Planet Nine.” This is already an exciting story to tell though, and we don’t need to exaggerate to get readers’ attention. If the planet turns out not to exist, people who read overblown headlines like those will be frustrated and confused.

Finally, we should all recall that Mike Brown was the main force behind Pluto’s demotion by the International Astronomical Union ten years ago. Since he calls himself the “Pluto Killer” (and wrote a book, “How I Killed Pluto and Why It Had It Coming”), it would be ironic if he helped discover a new ninth planet, replacing Pluto. But he and the Caltech news office seem to have hyped up his paper’s findings more than they deserved, given all the uncertainties involved.

Gravitational Waves Discovered?

While procrastinating and flipping through Twitter earlier this month, I came across some juicy gossip. I heard what sounded like the tantalizing detection of gravitational waves—an unprecedented achievement. These tiny ripples in space-time, predicted by Albert Einstein and thought to be produced by collisions of black holes or neutron stars, had been too small to measure before. Gravity is the weakest of forces, after all.

But it turns out that Lawrence Krauss, a well-known cosmologist and provocateur at Arizona State University, had caused the hullabaloo with some ill-advised tweets. He once again drew the media’s limelight to himself by spreading rumors that scientists in the Laser Interferometer Gravitational-Wave Observatory (LIGO) collaboration had detected gravitational waves for the first time. In the process, he put those scientists in a tough spot, as I’m sure they faced pressure to comment publicly on sensitive, ongoing research.

The LIGO Laboratory operates two detector sites, one near Hanford in eastern Washington (pictured here) and another near Livingston, Louisiana. (Credit: Caltech/MIT/LIGO Lab)

The LIGO team is still working on their analysis using a pair of detectors in Louisiana and Washington state, and they haven’t yet produced conclusive results. From what I can tell, they may have evidence but the situation is far from clear. There is nothing wrong with waiting a while until you’ve thoroughly investigated all the relevant issues and sources of error before announcing a momentous discovery. The alternative is to prematurely declare it, only to face the embarrassing possibility of retracting it later (which sort of happened to BICEP2 scientists with their supposed discovery of primordial gravitational waves).

Gravitational waves will have to remain elusive for now. And if and when LIGO physicists do have convincing evidence of gravitational waves, they need not share any of the glory or credit with Krauss.

Fortunately, in spite of this excitement, science writers and editors kept their cool and soberly pointed to Krauss’s rumors before digging into the fascinating and painstaking work LIGO scientists are doing. Here’s some excellent coverage by Clara Moskowitz in Scientific American and by Lisa Grossman in New Scientist.

[26 Jan. update: I decided to tone down my criticism of Mike Brown, but not of Lawrence Krauss.]

El Niño brings concerns of mudslides, flooding, and coastal erosion

Here are two related stories I reported and wrote for the Monterey Herald newspaper over the past week:


Concerns rise about mudslides in Monterey County

Higher-than-average rainfall last week, in the midst of an El Niño winter, has raised concerns about landslides and mudslides in Monterey County.

And those concerns are warranted. Although dangerous landslides have not been witnessed in the area this year, historically January and February are prime time. More than 60 percent of landslides happen in those months, once enough rain has been sopped up by the ground to loosen the soil, said John Stock, a U.S. Geological Survey landslide expert in the San Francisco Bay Area.

Vehicles get trapped in a mudslide on Highway 1 three miles south of Esalen on Feb. 13, 1987. The major winter storm caused this section of roadway to be closed for weeks. (Herald file photo)

There are two key ingredients for landslides: gravity and water. After four years of drought, the soil might take a little longer to become saturated by rain, but it is still vulnerable.

“Weak rocks and steep slopes are the main factors, and they’re helped by rainfall,” said Chris Wills, a geologist with the California Geological Survey.

Geologists have identified two varieties of landslides and debris flows: a fast-moving kind and a slow-moving kind. Fast-moving landslides tend to occur during or shortly after intense rainstorms. The soil acts like a sponge, soaking up water in its pores until no more water can be absorbed. Then the freely moving water exerts pressure and pushes the sand grains apart. On steep eroding hillsides, this can generate landslides, spreading shallow soil and rocks downhill.

Slow-moving landslides are much larger and deeper, typically happening after a delay — days, weeks or even months after a lot of rainfall has accumulated. Such a landslide occurred in La Conchita near Santa Barbara during the winter of 2005, killing 10 people and damaging dozens of homes. They begin by creeping downhill at less than an inch per year, making them hard to spot. Then they accelerate as massive amounts of earth, boulders and trees tumble down.

“There are two reasons to worry about landslides,” said Wills. “Fast-moving ones can be dangerous to people, and deep slow-moving ones can affect roads, pipelines and houses built on them. These cause significant costs to society,” he added…

[For more, check out the entire story in the Monterey Herald, published on 10 Jan. 2016. Thanks to David Kellogg for editing assistance.]


Rainstorms, tides and El Niño are reshaping Monterey Bay beaches

Just as big waves wash away carefully constructed sand castles, El Niño threatens to transform Monterey County beaches and coastlines.

Every winter rainy season brings storms and heavy surf that erode shores and wash away sand, which waves return to the coast in summer. But El Niño generates extra rain and higher sea levels, which increase erosion during intense, windy storms, affecting coastal bluffs and beaches around Carmel, Pacific Grove, Pebble Beach and Monterey.

People watch as the Carmel River flows to the ocean at Carmel River State Beach on Monday, January 11, 2016. County crews worked with equipment on Sunday to start the breach through the sand bar at the southern channel. The river broke through on its own sometime Sunday night. (Vern Fisher – Monterey Herald)

“Southern Monterey Bay has the most highly erosive beaches in all of California,” said Paul Michel, superintendent of the Monterey Bay National Marine Sanctuary.

Continual rainstorms swelled the Carmel River last week. After 10 p.m. Sunday, a sand barrier built to keep the river from breaching at the Carmel Lagoon was washed away, according to Melanie Beretti, program manager at the Monterey County Resource Management Agency. Officials considered bulldozing the sand back into place, but they were concerned it would just get flushed out to sea, so they decided to hold off.

“We need to balance when and how to utilize the sand,” Beretti said.

The beach has been reopened to visitors, and the Resource Management Agency and National Oceanic and Atmospheric Administration continue to monitor the situation to protect the steelhead trout habitat in the lagoon.

This incident could be a small taste of much bigger things to come. Abnormally large storms and big tides could damage roads, bike paths, hiking trails, water lines, sewage systems or even homes close to the coast…

[For more, check out the entire story in the Monterey Herald, published on 11 Jan. 2016.]

Reporting from the American Geophysical Union: Fire Risk Maps, Rocky Mountain Forests, Sierra Nevada Water

Here are three new stories I reported on and wrote at the American Geophysical Union meeting in San Francisco a week ago:


Assessing U.S. Fire Risks Using Soil Moisture Satellite Data

Soaring hundreds of kilometers above the Earth, a NASA satellite monitors soil moisture in the ground far below, probing drought conditions. Scientists at NASA Jet Propulsion Laboratory (JPL) analyzed these data and combined them with wildfire information from the U.S. Forest Service and land cover data from the U.S. Geological Survey. They used the results to assess fire risks, taking the first important step toward developing predictive maps for fires throughout the continental United States.

Burning trees and brush in the 2013 Rim Fire, which burned more than 1000 square kilometers in California’s Stanislaus National Forest. The Rim Fire was the largest ever recorded for the Sierra Nevada mountain range. (Credit: Mike McMillan, USFS, CC BY-NC 2.0)

The JPL scientists, a team led by Nick Rousseau in NASA’s DEVELOP Applied Sciences Program, find that soil moisture data alone can approximately explain the distribution and extent of fires, from the Sierra Nevada to the western plains to the Florida wetlands. Their results quantify how well a region’s dryness indicates the fuel available for fires. They reported their findings on Wednesday at the American Geophysical Union Fall Meeting in San Francisco, Calif.

Every year, wildfire outbreaks cause economic loss, property damage, and environmental degradation. Local, state, and federal agencies want to prepare for fire activity, and knowledge about particularly high risk areas would help them do so. If these new maps could be used to predict wildfire potential, then they would be an invaluable resource.

“This shows how much overall area is likely to burn, which could be a useful tool when Congress allocates resources for fire management,” said Sparkle Malone, a research ecologist at Rocky Mountain Research Station in Fort Collins, Colo., who was not involved in the study.

[For more, check out the entire article in Eos magazine, published on 17 Dec. 2015. Thanks to Peter Weiss and Nancy McGuire for help with editing it.]


Climate change and bark beetles spell doom for Rocky Mountain spruce forests

The combination of climate change and spruce bark beetles could drastically alter Rocky Mountain spruce and pine tree populations over the next three centuries, according to a new study. Using an improved model of forest growth, death, and regeneration, a group of scientists predicts that spruce populations will decline and lodgepole pines will take their place.

Nearly every mature spruce tree has been killed by spruce beetle in this area of the Rio Grande National Forest in southwest Colorado. (Credit: U.S. Forest Service; photo: Brian Howell)

According to new research presented at the 2015 American Geophysical Union Fall Meeting, the demographics of a forested region can be dramatically affected by insect outbreaks and fires over time. In addition, different kinds of trees have different tolerances to drought, strong winds and temperature changes. “These act to create competition between individual species and even between trees,” said Adrianna Foster, an environmental scientist at the University of Virginia and lead author of the new study.

Bark beetles are tiny – only a quarter inch in length, smaller than a grain of rice – but given the opportunity, they can rapidly consume a forest. According to the U.S. Forest Service, over the past 15 years, pine beetles have devastated Rocky Mountain forests, killing off tens of millions of lodgepole pines. But if the new study’s predictions are correct, the trees stand to make a comeback as spruce trees decline, according to Foster.

[For more, check out the entire article on the GeoSpace blog site, published on 21 Dec. 2015. Thanks to Lauren Lipuma for editing assistance.]

 

Parts of Sierra Nevada Mountains more susceptible to drought than previously thought, study finds

Particular areas in California’s Sierra Nevada Mountains have a high capacity to store water but are more susceptible to droughts than previously thought, new research finds.

The Salt Springs Reservoir, in the Sierra Nevada Mountains east of Sacramento, has experienced dropping water levels, barely covering the lowest gauges installed by the U.S. Geological Survey. (Credit: Rowan Gaffney)

“The areas we think that are most resilient to drought are actually more vulnerable to the transition from historical droughts to more extreme ones, like the one happening now,” said Rowan Gaffney, a geoscientist at the University of Nevada, Reno, and lead author of the study.

Every year, accumulated snowpack in the region stores water, which melts during the summer and recharges groundwater that flows into river and stream areas. Ecosystems, communities and agricultural irrigation depend on that water downstream, Gaffney said.

In the new study, Gaffney and fellow University of Nevada geoscientist Scott Tyler investigated the relationship between groundwater and stream flow in 10 strategically chosen locations throughout the Sierra Nevada in eastern California. High groundwater storage areas are losing the most water during the current drought, Gaffney reported at the 2015 American Geophysical Union Fall Meeting.

[For more, check out the entire article on the GeoSpace blog site, published in Dec. 2015.]

New Planetary Science: Habitable Planets and Saturn’s Moon Titan

Here are two new stories I’ve written about interesting new research presented at two conferences this fall: the National Association of Science Writers meeting in October and the American Geophysical Union meeting this week.

 

Sara Seager’s Search for Distant Habitable Worlds

Like a 21st-century Spock, Dr. Sara Seager seeks out new worlds and civilizations. With continually improving telescopes, she persistently and passionately pursues her grand quest: to search throughout our galaxy for habitable planets, a few of which might even resemble the Earth.

Sara Seager, MIT planetary physics professor. (Credit: MIT)

Seager, an accomplished professor of planetary science and astrophysics at MIT, gave an engaging presentation at the 2015 Science Writers meeting. She spoke clearly and intensely about her research and the exciting future of planetary exploration.

She and her research group have made important breakthroughs while characterizing newly discovered planets beyond our solar system, known as exoplanets, using the NASA Kepler space telescope. With powerful next-generation observatories, she also looks forward to the next frontier, where her ongoing mission could come to fruition.

…In most exoplanet work, astronomers consider only certain planets as potentially life-friendly. Their orbit, atmosphere, surface and climate all must be just right, falling within narrow ranges of parameters. A successful search requires a daunting breadth of knowledge, spanning biology, chemistry, and geology as well as astronomy and physics.

…Seager argues that the traditional concept of the habitable zone is too rigid and should be expanded. “Exoplanets are diverse, covering nearly all masses, sizes and orbits possible,” she says. What scientists mean by habitable should be more inclusive, or they risk overlooking outlier planets that nonetheless could be conducive to life. Letting the definition of habitability vary with the type of star or planet would ease that risk.

[For more, check out the entire article on the Minority Postdoc site, published on 14 Dec. 2015. Thanks to Matthew Francis for help with editing.]

 

Scientists Map Titan’s Lakes, Revealing Clues to their Origins

As Saturn’s largest moon, Titan earns its name. It’s also the only known body other than Earth with seas, numerous surface lakes, and even rainy weather. Now scientists have mapped out Titan’s polar lakes for the first time, revealing information about the moon’s climate and surface evolution. They found that the lakes formed differently than had been previously thought—and differently than any lakes on Earth.

A map of Titan’s North Pole, including its lakes, sediments and complex terrain. (Credit: NASA/JPL-Caltech/Space Science Institute.)

A collaboration of scientists led by Alexander Hayes of Cornell University presented their findings at the 2015 American Geophysical Union Fall Meeting. They used NASA’s Cassini spacecraft to penetrate Titan’s smoggy atmosphere and probe the complex lake systems below.

Titan’s seas and giant lakes, which are larger than the Caspian Sea and Great Lakes, appear unique in the solar system, the study found. They consist mostly of liquid hydrocarbons like methane and ethane, possibly making them a promising location to search for the building blocks of carbon-based extraterrestrial life. Because Titan is tilted with respect to its orbit, it also experiences seasons, which drive these lakes toward its North Pole. But Saturn’s eccentric orbit makes the lakes shift from pole to pole, Hayes explained.

By combining Cassini RADAR mapper observations with other data, Hayes and his colleagues compiled detailed information about Titan’s lake systems and topography, allowing scientists to test ideas for how these lakes developed.

“Topography in geology is the key because it drives the evolution of landscapes,” said Samuel Birch, lead author of one of the Titan studies and a Ph.D. student at Cornell.

[For more, check out the entire article on GeoSpace, published on 14 Dec. 2015. Thanks to Lauren Lipuma for editing assistance.]

Philanthropists are Enabling and Influencing the Future of Astronomy

[This is a longer version of an op-ed I published in the San Jose Mercury News with the title “Tech moguls increasingly deciding what scientific research will be funded.” Thanks to Ed Clendaniel for help editing it.]

Billionaires and their foundations are both enabling and shaping scientific endeavors in the 21st century, raising questions that we as a society need to consider more seriously.

I have spoken to many astronomers, who consistently clamor for more reliable funding for scientific research and education. With broad public support, these scientists passionately explore the origins of life, the Milky Way, and the universe, and they naturally want to continue their research.

But what does it mean when private interests fund a growing fraction of scientific work? Can we be sure that limited resources are being directed toward the most important science?

Research & Development as a Fraction of Discretionary Spending, 1962-2014. (Source: Budget of the U.S. Government FY 2015; American Association for the Advancement of Science.)

After the Apollo program, federal funding for science, and for astronomy in particular, has never been a top priority, and it has declined as a fraction of GDP. Since the Great Recession, science has received an increasingly narrow piece of the pie. Acrimonious budget debates perennially leave scientists worrying that the mission or research program they’ve devoted their careers to might be cut.

Trends in Federal Research & Development. (Source: National Science Foundation, AAAS.)

Perhaps as a result, philanthropic funding for scientific research has blossomed, increasing sharply relative to federal funding, according to the National Science Foundation. For example, the Palo Alto-based Gordon and Betty Moore Foundation, built on the success of Intel, agreed to provide $200 million for the Thirty Meter Telescope in Hawaii, intended to study distant stars and galaxies. This summer, Yuri Milner and the Breakthrough Prize Foundation dedicated $100 million to research at the University of California, Berkeley and elsewhere to expand the search for extraterrestrial intelligence.

“Because the federal role is more and more constrained, there is a real opportunity for private philanthropy to have a lot of influence on the way in which scientific research goes forward,” Robert Kirshner, head of the Moore Foundation’s science program, told me.

These laudable initiatives put personal wealth to good use. They enable important scientific research and technology development, and some scientists benefit from the philanthropists’ largesse. But they also transfer leadership from the scientific community and public interest to the hands of a few wealthy businesspeople and Silicon Valley tech moguls.

While philanthropists support leading scientists and valuable scientific research, they and their advisors decide what is “valuable.” If they desire, they could fund their favorite scientists or the elite university they attended. They have no obligation to appeal to the scientific community or to public interests.

Philanthropists sometimes go for attention-getting projects that get their name or logo on a major telescope (like Keck or Sloan) or a research institute (like Kavli), which also happen to enable important science for many years.

For better and perhaps also for worse, private funding of science is here to stay. Although fears of billionaires controlling science might be overblown, we should ensure that we support a democratic and transparent national system, with scientists’ and the public’s priorities guiding decisions about which projects to pursue.

Public funding involves thorough review by the scientific community, and projects build upon a strong base with considerable oversight and transparency. This takes time, but it’s worthwhile.

Government agencies and universities support “basic” science research, allowing scientists to focus on science for its own sake and to explore long-term projects. Private interests often ignore basic research, typically spending 80 cents of every research and development dollar on development. In response to this shortcoming, the Science Philanthropy Alliance formed recently near Stanford University to advise foundations about how to invest directly in fundamental scientific research.

“If you’re going to have an impact in the long run, then you should be supporting basic research, which is often where some of the biggest breakthroughs come from,” said Marc Kastner, its president, referring to the Internet and the human genome.

These well-intentioned efforts offer no guarantee, however. We should urge policy-makers to reliably fund science and consider it as sacrosanct as healthcare and Social Security, regardless of budget limits. At the same time, we should clearly delineate the role philanthropy and private industry will play.

More Engineering News: Protein Engineering and Next-Generation Computer Architecture

Here are two new articles I’ve written about exciting newly published research on protein engineering and computer systems, led by engineers at Stanford University:

 

Stanford engineers invent process to accelerate protein evolution

A new tool enables researchers to test millions of mutated proteins in a matter of hours or days, speeding the search for new medicines, industrial enzymes and biosensors.

All living things require proteins, members of a vast family of molecules that nature “makes to order” according to the blueprints in DNA.

Through the natural process of evolution, DNA mutations generate new or more effective proteins. Humans have found so many alternative uses for these molecules – as foods, industrial enzymes, anti-cancer drugs – that scientists are eager to better understand how to engineer protein variants designed for specific uses.

Now Stanford engineers have invented a technique to dramatically accelerate protein evolution for this purpose. This technology, described in Nature Chemical Biology, allows researchers to test millions of variants of a given protein, choose the best for some task and determine the DNA sequence that creates this variant.

An overview of the directed evolution process with μSCALE: preparing protein libraries, screening them, extracting desired cells, and then inferring the DNA sequence at work. (Credit: Cochran Lab, Stanford)

“Evolution, the survival of the fittest, takes place over a span of thousands of years, but we can now direct proteins to evolve in hours or days,” said Jennifer Cochran, an associate professor of bioengineering who co-authored the paper with Thomas Baer, executive director of the Stanford Photonics Research Center.

“This is a practical, versatile system with broad applications that researchers will find easy to use,” Baer said.

By combining Cochran’s protein engineering know-how with Baer’s expertise in laser-based instrumentation, the team created a tool that can test millions of protein variants in a matter of hours.

“The demonstrations are impressive and I look forward to seeing this technology more widely adopted,” said Frances Arnold, a professor of chemical engineering at Caltech who was not affiliated with the study.

[For more, check out the entire article in Stanford News, published on 7 Dec. 2015. Thanks to Tom Abate for help with editing.]

 

Stanford-led skyscraper-style chip design boosts electronic performance by factor of a thousand

In modern computer systems, processor and memory chips are laid out like single-story structures in a suburb. But suburban layouts waste time and energy. A new skyscraper-like design, based on materials more advanced than silicon, provides the next computing platform.

For decades, engineers have designed computer systems with processors and memory chips laid out like single-story structures in a suburb. Wires connect these chips like streets, carrying digital traffic between the processors that compute data and the memory chips that store it.

But suburban-style layouts create long commutes and regular traffic jams in electronic circuits, wasting time and energy.

That is why researchers from three other universities are working with Stanford engineers, including Associate Professor Subhasish Mitra and Professor H.-S. Philip Wong, to create a revolutionary new high-rise architecture for computing.

A multi-campus team led by Stanford engineers Subhasish Mitra and H.-S. Philip Wong has developed a revolutionary high-rise architecture for computing.

In Rebooting Computing, a special issue of the IEEE Computer journal, the team describes its new approach as Nano-Engineered Computing Systems Technology, or N3XT.

N3XT will break data bottlenecks by integrating processors and memory like floors in a skyscraper and by connecting these components with millions of “vias,” which play the role of tiny electronic elevators. The N3XT high-rise approach will move more data, much faster, using far less energy, than would be possible using low-rise circuits.

“We have assembled a group of top thinkers and advanced technologies to create a platform that can meet the computing demands of the future,” Mitra said.

Shifting electronics from a low-rise to a high-rise architecture will demand huge investments from industry – and the promise of big payoffs for making the switch.

“When you combine higher speed with lower energy use, N3XT systems outperform conventional approaches by a factor of a thousand,” Wong said.

[For more, check out the entire article in Stanford News, published on 9 Dec. 2015.]

Engineering News: Solar Cells and Plasma Combustion

Check out these new articles I’ve written about solar cells and plasma combustion research led by Stanford University engineers:

 

Plasma experiments bring astrophysics down to Earth

New laboratory technique allows researchers to replicate on a tiny scale the swirling clouds of ionized gases that power the sun, to further our understanding of fusion energy, solar flares and other cosmic phenomena.

Intense heat, like that found in the sun, can strip gas atoms of their electrons, creating a swirling mass of positively and negatively charged ions known as a plasma.

For several decades, laboratory researchers have sought to replicate plasma conditions similar to those found in the sun in order to help them understand the basic physics of ionized matter and, ultimately, harness and control fusion energy on Earth or use it as a means of space propulsion.

Now Stanford engineers have created a tool that enables researchers to make detailed studies of certain types of plasmas in a laboratory. Their technique allows them to study astrophysical jets—very powerful streams of focused plasma energy.

A long-exposure photographic image capturing the Stanford Plasma Gun during a single firing. The image shows where the plasma is brightest during the acceleration process, which occurs over tens of microseconds.

Writing in Physical Review Letters, mechanical engineering graduate students Keith Loebner and Tom Underwood, together with Professor Mark Cappelli, describe how they built a device that creates tiny plasma jets and enabled them to make detailed measurements of these ionized clouds.

The researchers also proved that plasmas exhibit some of the same behavior as the gas clouds created by, say, firing a rocket engine or burning fuel inside an internal combustion engine.

Their instrument, coupled with this new understanding of the fire-like behavior of plasmas, creates a down-to-earth way to explore the physics of solar flares, fusion energy and other astrophysical events.

“The understanding of astrophysical phenomena has always been hindered by the inability to generate scaled conditions in the laboratory and measure the results in great detail,” Cappelli said.

[For more, check out the entire article in Stanford News, published on 1 Dec. 2015. Thanks to Tom Abate for help with editing.]

 

Stanford designs underwater solar cells that turn captured greenhouse gases into fuel

Taking a cue from plants, researchers figure out how to use the sun’s energy to combine CO2 with H2O to create benign chemical products, as part of a futuristic technology called artificial photosynthesis.

Stanford engineers have developed solar cells that can function under water. Instead of pumping electricity into the grid, though, the power these cells produce would be used to spur chemical reactions to convert captured greenhouse gases into fuel.

This new work, published in Nature Materials, was led by Stanford materials scientist Paul McIntyre, whose lab has been a pioneer in an emerging field known as artificial photosynthesis.

Stanford engineers have shown how to increase the power of corrosion-resistant solar cells, setting a record for solar energy output under water. (Photo credit: Shutterstock)

In plants, photosynthesis uses the sun’s energy to combine water and carbon dioxide to create sugar, the fuel on which they live. Artificial photosynthesis would use the energy from specialized solar cells to combine water with captured carbon dioxide to produce industrial fuels, such as natural gas.

Until now, artificial photosynthesis has faced two challenges: ordinary silicon solar cells corrode under water, and even corrosion-proof solar cells had been unable to capture enough sunlight under water to drive the envisioned chemical reactions.

Four years ago, McIntyre’s lab made solar cells resistant to corrosion in water. In the new paper, working with doctoral student Andrew Scheuermann, the researchers have shown how to increase the power of corrosion-resistant solar cells, setting a record for solar energy output under water.

“The results reported in this paper are significant because they represent not only an advance in performance of silicon artificial photosynthesis cells, but also establish the design rules needed to achieve high performance for a wide array of different semiconductors, corrosion protection layers and catalysts,” McIntyre said.

Such solar cells would be part of a larger system to fight climate change. The vision is to funnel greenhouse gases from smokestacks or the atmosphere into giant, transparent chemical tanks. Solar cells inside the tanks would spur chemical reactions to turn the greenhouse gases and water into what are sometimes called “solar fuels.”

[For more, check out the entire article in Stanford News, published on 18 Nov. 2015.]