Reporting from the American Geophysical Union: Fire Risk Maps, Rocky Mountain Forests, Sierra Nevada Water

Here are three new stories I reported on and wrote at the American Geophysical Union meeting in San Francisco a week ago:

 

Assessing U.S. Fire Risks Using Soil Moisture Satellite Data

Soaring hundreds of kilometers above the Earth, a NASA satellite monitors soil moisture in the ground far below, probing drought conditions. Scientists at NASA's Jet Propulsion Laboratory (JPL) analyzed these data and combined them with wildfire information from the U.S. Forest Service and land cover data from the U.S. Geological Survey. They used the results to assess fire risks, taking an important first step toward developing predictive maps for fires throughout the continental United States.

Burning trees and brush in the 2013 Rim Fire, which burned more than 1000 square kilometers in California’s Stanislaus National Forest. The Rim Fire was the largest ever recorded for the Sierra Nevada mountain range. (Credit: Mike McMillan, USFS, CC BY-NC 2.0)

The JPL scientists, a team led by Nick Rousseau in NASA’s DEVELOP Applied Sciences Program, find that soil moisture data alone can approximately explain the distribution and extent of fires, from the Sierra Nevada to the western plains to the Florida wetlands. Their results show how well a region’s dryness indicates the fuel available for fires. They reported their findings on Wednesday at the American Geophysical Union Fall Meeting in San Francisco, Calif.
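
To give a flavor of this kind of analysis, here is a minimal Python sketch of how regional soil moisture might be compared with burned-area records. It is an illustration only, not the DEVELOP team's actual pipeline, and the file and column names are hypothetical.

```python
# Minimal sketch (not the DEVELOP team's pipeline) relating soil moisture to
# burned area. File and column names ("soil_moisture.csv", "region", etc.)
# are hypothetical placeholders.
import pandas as pd

# Hypothetical inputs: mean soil moisture and total burned area, both
# aggregated to the same set of regions.
soil = pd.read_csv("soil_moisture.csv")   # columns: region, soil_moisture
fires = pd.read_csv("fires.csv")          # columns: region, burned_area_km2

merged = soil.merge(fires, on="region")

# How well does dryness alone track fire extent? A simple rank correlation:
corr = merged["soil_moisture"].corr(merged["burned_area_km2"], method="spearman")
print(f"Spearman correlation between soil moisture and burned area: {corr:.2f}")

# Flag the driest quartile of regions as elevated fire risk for mapping.
threshold = merged["soil_moisture"].quantile(0.25)
merged["elevated_risk"] = merged["soil_moisture"] <= threshold
print(merged.sort_values("soil_moisture").head())
```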

Every year, wildfire outbreaks cause economic loss, property damage, and environmental degradation. Local, state, and federal agencies want to prepare for fire activity, and knowledge of particularly high-risk areas would help them do so. If these new maps could be used to predict wildfire potential, they would be an invaluable resource.

“This shows how much overall area is likely to burn, which could be a useful tool when Congress allocates resources for fire management,” said Sparkle Malone, a research ecologist at Rocky Mountain Research Station in Fort Collins, Colo., who was not involved in the study.

[For more, check out the entire article in Eos magazine, published on 17 Dec. 2015. Thanks to Peter Weiss and Nancy McGuire for help with editing it.]

 

Climate change and bark beetles spell doom for Rocky Mountain spruce forests

The combination of climate change and spruce bark beetles could drastically alter Rocky Mountain spruce and pine tree populations over the next three centuries, according to a new study. Using an improved model of forest growth, death, and regeneration, a group of scientists predicts that spruce populations will decline and lodgepole pines will take their place.
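
Forest models of this kind step a stand forward year by year through growth, mortality, and regeneration. The toy Python sketch below is not the authors' model; it only illustrates, with invented rates, how adding beetle-driven mortality to one species can tilt a mixed stand toward its competitor over centuries.

```python
# Toy illustration (not the study's model): two competing tree populations
# stepped forward annually with growth, background mortality, and an added
# episodic beetle-kill term for spruce. All rates and counts are hypothetical.
import random

spruce, pine = 500.0, 500.0      # hypothetical initial stem counts
carrying_capacity = 1500.0

for year in range(300):
    total = spruce + pine
    crowding = max(0.0, 1.0 - total / carrying_capacity)  # free growing space

    # Growth scales with available space; mortality removes a fraction each year.
    spruce_beetle_kill = 0.03 if random.random() < 0.2 else 0.0  # episodic outbreaks
    spruce += spruce * (0.05 * crowding - 0.02 - spruce_beetle_kill)
    pine += pine * (0.06 * crowding - 0.02)

print(f"After 300 years: spruce ≈ {spruce:.0f}, pine ≈ {pine:.0f} stems (toy numbers)")
```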

Nearly every mature spruce tree has been killed by spruce beetle in this area of the Rio Grande National Forest in southwest Colorado. (Credit: U.S. Forest Service; photo: Brian Howell)

According to new research presented at the 2015 American Geophysical Union Fall Meeting, the demographics of a forested region can be dramatically affected by insect outbreaks and fires over time. In addition, different kinds of trees have different tolerances to drought, strong winds, and temperature changes. “These act to create competition between individual species and even between trees,” said Adrianna Foster, an environmental scientist at the University of Virginia and lead author of the new study.

Bark beetles are tiny, only a quarter inch long (smaller than a grain of rice), but given the opportunity, they can rapidly consume a forest. According to the U.S. Forest Service, over the past 15 years, pine beetles have devastated Rocky Mountain forests, killing off tens of millions of lodgepole pines. But if the new study’s predictions are correct, the lodgepole pines stand to make a comeback as spruce trees decline, according to Foster.

[For more, check out the entire article on the GeoSpace blog site, published on 21 Dec. 2015. Thanks to Lauren Lipuma for editing assistance.]

 

Parts of Sierra Nevada Mountains more susceptible to drought than previously thought, study finds

Particular areas in California’s Sierra Nevada Mountains have a high capacity to store water but are more susceptible to droughts than previously thought, new research finds.

The Salt Springs Reservoir, in the Sierra Nevada Mountains east of Sacramento, has experienced dropping water levels that barely cover the lowest gauges installed by the U.S. Geological Survey. (Credit: Rowan Gaffney)

“The areas we think that are most resilient to drought are actually more vulnerable to the transition from historical droughts to more extreme ones, like the one happening now,” said Rowan Gaffney, a geoscientist at the University of Nevada, Reno, and lead author of the study.

Every year, accumulated snowpack in the region stores water, which melts during the summer and recharges groundwater that flows into rivers and streams. Ecosystems, communities, and agricultural irrigation depend on that water downstream, Gaffney said.

In the new study, Gaffney and fellow University of Nevada geoscientist Scott Tyler investigated the relationship between groundwater and stream flow in 10 strategically chosen locations throughout the Sierra Nevada in eastern California. High groundwater storage areas are losing the most water during the current drought, Gaffney reported at the 2015 American Geophysical Union Fall Meeting.
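
One common way to connect streamflow to upstream groundwater storage is to analyze how flow recedes after the melt season; under a simple linear-reservoir assumption, the drainable storage is proportional to discharge. The sketch below illustrates that generic approach with placeholder numbers; it is not necessarily the method Gaffney and Tyler used.

```python
# Generic linear-reservoir baseflow recession fit, not the study's actual method.
# Under Q(t) = Q0 * exp(-t / k), the storage that will eventually drain as
# baseflow is S = k * Q. The discharge series here is a placeholder.
import numpy as np

days = np.arange(0, 60)                        # days since recession began
discharge = 12.0 * np.exp(-days / 25.0)        # hypothetical daily flow, m^3/s

# Fit the recession constant k from the slope of ln(Q) versus time.
slope, intercept = np.polyfit(days, np.log(discharge), 1)
k = -1.0 / slope                               # recession timescale, days
storage = k * 86400 * discharge[0]             # drainable storage, m^3

print(f"Recession constant k ≈ {k:.1f} days")
print(f"Implied drainable groundwater storage ≈ {storage:.2e} m^3")
```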

[For more, check out the entire article on the GeoSpace blog site, published in Dec. 2015.]

New Planetary Science: Habitable Planets and Saturn’s Moon Titan

Here are two new stories I’ve written about interesting new research presented at recent conferences this fall: the National Association of Science Writers meeting in October and the American Geophysical Union meeting this week.

 

Sara Seager’s Search for Distant Habitable Worlds

Like a 21st-century Spock, Dr. Sara Seager seeks out new worlds and civilizations. With continually improving telescopes, she persistently and passionately pursues her grand quest: to search throughout our galaxy for habitable planets, a few of which might even resemble the Earth.

Sara Seager, MIT planetary physics professor. (Credit: MIT)

Seager, an accomplished professor of planetary science and astrophysics at MIT, gave an engaging presentation at the 2015 Science Writers meeting. She spoke clearly and intensely about her research and the exciting future of planetary exploration.

She and her research group have made important breakthroughs while characterizing newly discovered planets beyond our solar system, known as exoplanets, using the NASA Kepler space telescope. With powerful next-generation observatories, she also looks forward to the next frontier, where her ongoing mission could come to fruition.

…In most exoplanet work, astronomers consider only certain planets as potentially life-friendly. Their orbit, atmosphere, surface and climate all must be just right, falling within narrow ranges of parameters. A successful search requires a daunting understanding of biology, chemistry, and geology, as well as astronomy and physics.

…Seager argues that the traditional concept of the habitable zone is too rigid and should be expanded. “Exoplanets are diverse, covering nearly all masses, sizes and orbits possible,” she says. What scientists mean by habitable should be more inclusive, or they risk missing outlier planets that could nonetheless be conducive to life. Allowing habitability to vary with the type of star or planet helps address this.
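
One standard ingredient in habitability arguments is a planet's equilibrium temperature, which depends on the host star's luminosity, the orbital distance, and the planet's reflectivity. The short calculation below is a textbook estimate with illustrative Earth-like inputs, not Seager's own criteria.

```python
# Textbook equilibrium-temperature estimate:
#   T_eq = [ L * (1 - A) / (16 * pi * sigma * d^2) ]^(1/4)
# Illustrative inputs only; this ignores greenhouse warming and is not a
# complete habitability criterion.
import math

SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W m^-2 K^-4
L_SUN = 3.828e26          # solar luminosity, W
AU = 1.496e11             # astronomical unit, m

def equilibrium_temperature(luminosity_w, distance_m, albedo):
    """Blackbody equilibrium temperature of a planet (no atmosphere)."""
    return (luminosity_w * (1 - albedo) / (16 * math.pi * SIGMA * distance_m**2)) ** 0.25

# Earth-like example: Sun-like star, 1 AU orbit, Bond albedo ~0.3.
print(f"T_eq ≈ {equilibrium_temperature(L_SUN, AU, 0.3):.0f} K")  # ~255 K
```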

[For more, check out the entire article on the Minority Postdoc site, published on 14 Dec. 2015. Thanks to Matthew Francis for help with editing.]

 

Scientists Map Titan’s Lakes, Revealing Clues to their Origins

As Saturn’s largest moon, Titan earns its name. It’s also the only known body other than Earth with seas, numerous surface lakes, and even rainy weather. Now scientists have mapped out Titan’s polar lakes for the first time, revealing information about the moon’s climate and surface evolution. They found that the lakes formed differently than had been previously thought—and differently than any lakes on Earth.

A map of Titan’s North Pole, including its lakes, sediments and complex terrain. (Credit: NASA/JPL-Caltech/Space Science Institute.)

A collaboration of scientists led by Alexander Hayes of Cornell University presented their findings at the 2015 American Geophysical Union Fall Meeting. They used NASA’s Cassini spacecraft to penetrate Titan’s smoggy atmosphere and probe the complex lake systems below.

Titan’s seas and giant lakes, which are larger than the Caspian Sea and Great Lakes, appear unique in the solar system, the study found. They consist mostly of liquid hydrocarbons like methane and ethane, possibly making them a promising location to search for the building blocks of carbon-based extraterrestrial life. Because Titan is tilted with respect to its orbit, it also experiences seasons, which drive these lakes toward its North Pole. But Saturn’s eccentric orbit makes the lakes shift from pole to pole, Hayes explained.

By combining Cassini RADAR mapper observations with other data, Hayes and his colleagues compiled detailed information about Titan’s lake systems and topography, allowing scientists to test ideas for how these lakes developed.

“Topography in geology is the key because it drives the evolution of landscapes,” said Samuel Birch, lead author of one of the Titan studies and a Ph.D. student at Cornell.

[For more, check out the entire article on GeoSpace, published on 14 Dec. 2015. Thanks to Lauren Lipuma for editing assistance.]

Philanthropists are Enabling and Influencing the Future of Astronomy

[This is a longer version of an op-ed I published in the San Jose Mercury News with the title “Tech moguls increasingly deciding what scientific research will be funded.” Thanks to Ed Clendaniel for help editing it.]

Billionaires and their foundations are both enabling and shaping scientific endeavors in the 21st century, raising questions that we as a society need to consider more seriously.

I have spoken to many astronomers, who consistently clamor for more reliable funding for scientific research and education. With broad public support, these scientists passionately explore the origins of life, the Milky Way, and the universe, and they naturally want to continue their research.

But what does it mean when private interests fund a growing fraction of scientific work? Can we be sure that limited resources are being directed toward the most important science?

Research & Development as a Fraction of Discretionary Spending, 1962-2014. (Source: Budget of the U.S. Government FY 2015; American Association for the Advancement of Science.)

After the Apollo program, federal funding for science, and for astronomy in particular, has never been a top priority, and it has declined as a fraction of GDP. Since the Great Recession, science has received an increasingly narrow piece of the pie. Acrimonious budget debates leave scientists perennially worried that the mission or research program they’ve devoted their careers to might be cut.

Trends in Federal Research & Development. (Source: National Science Foundation, AAAS.)

Perhaps as a result, philanthropic funding for scientific research has bloomed, increasing sharply relative to the federal government, according to the National Science Foundation. For example, the Palo Alto-based Gordon and Betty Moore Foundation, built on the success of Intel, agreed to provide $200 million for the Thirty Meter Telescope in Hawaii, intended to study distant stars and galaxies. This summer, Yuri Milner and the Breakthrough Prize Foundation dedicated $100 million to research at the University of California, Berkeley and elsewhere to expand the search for extraterrestrial intelligence.

“Because the federal role is more and more constrained, there is a real opportunity for private philanthropy to have a lot of influence on the way in which scientific research goes forward,” Robert Kirshner, head of the Moore Foundation’s science program, told me.

These laudable initiatives put personal wealth to good use. They enable important scientific research and technology development, and some scientists benefit from the philanthropists’ largesse. But they also transfer leadership from the scientific community and public interest to the hands of a few wealthy businesspeople and Silicon Valley tech moguls.

While philanthropists support leading scientists and valuable scientific research, they and their advisors decide what is “valuable.” If they desire, they could fund their favorite scientists or the elite university they attended. They have no obligation to appeal to the scientific community or to public interests.

Philanthropists sometimes go for attention-getting projects that get their name or logo on a major telescope (like Keck or Sloan) or a research institute (like Kavli), which also happen to enable important science for many years.

For better and perhaps also for worse, private funding of science is here to stay. Although fears of billionaires controlling science might be overblown, we should ensure that we support a democratic and transparent national system, with scientists’ and the public’s priorities guiding decisions about which projects to pursue.

Public funding relies on thorough review systems that involve the community, and projects build on a strong base with considerable oversight and transparency. This takes time, but it’s worthwhile.

Government agencies and universities support “basic” science research, allowing scientists to focus on science for its own sake and to explore long-term projects. Private interests often neglect basic research, typically spending 80 cents of every research and development dollar on development instead. In response to this shortcoming, the Science Philanthropy Alliance formed recently near Stanford University to advise foundations about how to invest directly in fundamental scientific research.

“If you’re going to have an impact in the long run, then you should be supporting basic research, which is often where some of the biggest breakthroughs come from,” said Marc Kastner, its president, referring to the Internet and the human genome.

These well-intentioned efforts offer no guarantee, however. We should urge policy-makers to fund science reliably and to consider it as sacrosanct as healthcare and Social Security, regardless of budget limits. At the same time, we should clearly delineate the role philanthropy and private industry will play.

More Engineering News: Protein Engineering and Next-Generation Computer Architecture

Here are two new articles I’ve written about exciting newly published research on protein engineering and computer systems, led by engineers at Stanford University:

 

Stanford engineers invent process to accelerate protein evolution

A new tool enables researchers to test millions of mutated proteins in a matter of hours or days, speeding the search for new medicines, industrial enzymes and biosensors.

All living things require proteins, members of a vast family of molecules that nature “makes to order” according to the blueprints in DNA.

Through the natural process of evolution, DNA mutations generate new or more effective proteins. Humans have found so many alternative uses for these molecules – as foods, industrial enzymes, anti-cancer drugs – that scientists are eager to better understand how to engineer protein variants designed for specific uses.

Now Stanford engineers have invented a technique to dramatically accelerate protein evolution for this purpose. This technology, described in Nature Chemical Biology, allows researchers to test millions of variants of a given protein, choose the best one for a given task, and determine the DNA sequence that creates that variant.
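
Conceptually, directed evolution is a screen-and-select loop: generate many mutated sequences, score each one against the desired task, and carry the best forward to the next round. The Python toy below illustrates only that loop with a made-up scoring function; it is not the μSCALE instrument or the authors' protocol.

```python
# Conceptual toy of a directed-evolution loop, not the μSCALE workflow.
# The "fitness" function stands in for whatever assay scores a variant.
import random

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
TARGET = "MKTAYIAKQR"              # hypothetical "ideal" sequence for the toy score

def fitness(seq):
    """Toy score: fraction of positions matching the hypothetical target."""
    return sum(a == b for a, b in zip(seq, TARGET)) / len(TARGET)

def mutate(seq, rate=0.1):
    """Randomly substitute residues at the given per-position rate."""
    return "".join(random.choice(AMINO_ACIDS) if random.random() < rate else aa
                   for aa in seq)

population = ["A" * len(TARGET)] * 200        # start from a naive library
for generation in range(30):
    population = [mutate(seq) for seq in population]
    population.sort(key=fitness, reverse=True)
    population = population[:20] * 10         # keep the best 10% and re-expand

best = population[0]
print(f"Best variant after 30 rounds: {best} (score {fitness(best):.2f})")
```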

An overview of the directed evolution process with μSCALE: preparing protein libraries, screening them, extracting desired cells, and then inferring the DNA sequence at work. (Credit: Cochran Lab, Stanford)

“Evolution, the survival of the fittest, takes place over a span of thousands of years, but we can now direct proteins to evolve in hours or days,” said Jennifer Cochran, an associate professor of bioengineering who co-authored the paper with Thomas Baer, executive director of the Stanford Photonics Research Center.

“This is a practical, versatile system with broad applications that researchers will find easy to use,” Baer said.

By combining Cochran’s protein engineering know-how with Baer’s expertise in laser-based instrumentation, the team created a tool that can test millions of protein variants in a matter of hours.

“The demonstrations are impressive and I look forward to seeing this technology more widely adopted,” said Frances Arnold, a professor of chemical engineering at Caltech who was not affiliated with the study.

[For more, check out the entire article in Stanford News, published on 7 Dec. 2015. Thanks to Tom Abate for help with editing.]

 

Stanford-led skyscraper-style chip design boosts electronic performance by factor of a thousand

In modern computer systems, processor and memory chips are laid out like single-story structures in a suburb. But suburban layouts waste time and energy. A new skyscraper-like design, based on materials more advanced than silicon, could provide the next computing platform.

For decades, engineers have designed computer systems with processors and memory chips laid out like single-story structures in a suburb. Wires connect these chips like streets, carrying digital traffic between the processors that compute data and the memory chips that store it.

But suburban-style layouts create long commutes and regular traffic jams in electronic circuits, wasting time and energy.

That is why researchers from three other universities are working with Stanford engineers, including Associate Professor Subhasish Mitra and Professor H.-S. Philip Wong, to create a revolutionary new high-rise architecture for computing.

A multi-campus team led by Stanford engineers Subhasish Mitra and H.-S. Philip Wong has developed a revolutionary high-rise architecture for computing.

In Rebooting Computing, a special issue of the IEEE Computer journal, the team describes its new approach as Nano-Engineered Computing Systems Technology, or N3XT.

N3XT will break data bottlenecks by integrating processors and memory like floors in a skyscraper and by connecting these components with millions of “vias,” which play the role of tiny electronic elevators. The N3XT high-rise approach will move more data, much faster, using far less energy, than would be possible using low-rise circuits.

“We have assembled a group of top thinkers and advanced technologies to create a platform that can meet the computing demands of the future,” Mitra said.

Shifting electronics from a low-rise to a high-rise architecture will demand huge investments from industry – and the promise of big payoffs for making the switch.

“When you combine higher speed with lower energy use, N3XT systems outperform conventional approaches by a factor of a thousand,” Wong said.

[For more, check out the entire article in Stanford News, published on 9 Dec. 2015.]

Engineering News: Solar Cells and Plasma Combustion

Check out these new articles I’ve written about solar cells and plasma combustion research led by Stanford University engineers:

 

Plasma experiments bring astrophysics down to Earth

New laboratory technique allows researchers to replicate on a tiny scale the swirling clouds of ionized gases that power the sun, to further our understanding of fusion energy, solar flares and other cosmic phenomena.

Intense heat, like that found in the sun, can strip gas atoms of their electrons, creating a swirling mass of positively charged ions and negatively charged electrons known as a plasma.

For several decades, laboratory researchers have sought to replicate plasma conditions similar to those found in the sun in order to understand the basic physics of ionized matter and, ultimately, to harness and control fusion energy on Earth or use it as a means of space propulsion.

Now Stanford engineers have created a tool that enables researchers to make detailed studies of certain types of plasmas in a laboratory. Their technique allows them to study astrophysical jets—very powerful streams of focused plasma energy.

A long-exposure photographic image capturing the Stanford Plasma Gun during a single firing. The image shows where the plasma is brightest during the acceleration process, which occurs over tens of microseconds.

Writing in Physical Review Letters, mechanical engineering graduate students Keith Loebner and Tom Underwood, together with Professor Mark Cappelli, describe how they built a device that creates tiny plasma jets and enabled them to make detailed measurements of these ionized clouds.

The researchers also proved that plasmas exhibit some of the same behavior as the gas clouds created by, say, firing a rocket engine or burning fuel inside an internal combustion engine.

Their instrument, coupled with this new understanding of the fire-like behavior of plasmas, creates a down-to-earth way to explore the physics of solar flares, fusion energy and other astrophysical events.

“The understanding of astrophysical phenomena has always been hindered by the inability to generate scaled conditions in the laboratory and measure the results in great detail,” Cappelli said.
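
The usual logic behind "scaled conditions" is that two flows of wildly different size behave alike when their key dimensionless numbers match (for example, the Mach number) while dissipative effects remain negligible in both. The sketch below illustrates that bookkeeping with entirely hypothetical parameters; it does not use values from the Stanford experiments.

```python
# Generic illustration of the "scaled conditions" idea: flows of very different
# size and speed behave alike when their key dimensionless numbers match and
# dissipation (viscosity) is negligible in both. All parameter values here are
# hypothetical placeholders, not measurements from the Stanford experiments.

def mach_number(velocity, sound_speed):
    """Ratio of flow speed to sound speed; governs shock and jet behavior."""
    return velocity / sound_speed

def reynolds_number(velocity, length, kinematic_viscosity):
    """Large values mean viscosity barely matters, as similarity scaling requires."""
    return velocity * length / kinematic_viscosity

# Hypothetical laboratory plasma jet (centimeters across, tens of km/s), SI units.
lab = {"v": 2.0e4, "L": 0.05, "c_s": 1.0e4, "nu": 1.0}
# Hypothetical astrophysical jet (enormously larger and faster), SI units.
astro = {"v": 1.0e7, "L": 1.0e15, "c_s": 5.0e6, "nu": 1.0e12}

for name, p in (("lab", lab), ("astro", astro)):
    print(f"{name}: Mach ≈ {mach_number(p['v'], p['c_s']):.1f}, "
          f"Re ≈ {reynolds_number(p['v'], p['L'], p['nu']):.1e}")
```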

[For more, check out the entire article in Stanford News, published on 1 Dec. 2015. Thanks to Tom Abate for help with editing.]

 

Stanford designs underwater solar cells that turn captured greenhouse gases into fuel

Taking a cue from plants, researchers figure out how to use the sun’s energy to combine CO2 with H2O to create benign chemical products, as part of a futuristic technology called artificial photosynthesis.

Stanford engineers have developed solar cells that can function under water. Instead of pumping electricity into the grid, though, the power these cells produce would be used to spur chemical reactions to convert captured greenhouse gases into fuel.

This new work, published in Nature Materials, was led by Stanford materials scientist Paul McIntyre, whose lab has been a pioneer in an emerging field known as artificial photosynthesis.

Stanford engineers have shown how to increase the power of corrosion-resistant solar cells, setting a record for solar energy output under water. (Photo credit: Shutterstock)

In plants, photosynthesis uses the sun’s energy to combine water and carbon dioxide to create sugar, the fuel on which they live. Artificial photosynthesis would use the energy from specialized solar cells to combine water with captured carbon dioxide to produce industrial fuels, such as natural gas.

Until now, artificial photosynthesis has faced two challenges: ordinary silicon solar cells corrode under water, and even corrosion-proof solar cells have been unable to capture enough sunlight under water to drive the envisioned chemical reactions.
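
A rough way to see the second challenge: the voltage a photoelectrochemical cell generates must exceed the thermodynamic minimum for the target reaction (about 1.23 volts for splitting water) plus the overpotential its catalysts need. The check below is a generic illustration with placeholder numbers, not values from the Nature Materials paper.

```python
# Generic feasibility check for a photoelectrochemical cell, with placeholder
# numbers (not values from the Nature Materials paper).
WATER_SPLITTING_MIN_V = 1.23   # thermodynamic minimum at standard conditions, volts

def can_drive_reaction(photovoltage_v, catalyst_overpotential_v):
    """Return (ok, required): ok is True if photovoltage covers thermodynamics plus kinetic losses."""
    required = WATER_SPLITTING_MIN_V + catalyst_overpotential_v
    return photovoltage_v >= required, required

# Hypothetical comparison: a modest single junction versus a tandem stack.
for label, v in [("single junction", 0.63), ("tandem stack", 1.80)]:
    ok, required = can_drive_reaction(v, catalyst_overpotential_v=0.4)
    verdict = "sufficient" if ok else "insufficient"
    print(f"{label}: {v:.2f} V vs. required {required:.2f} V -> {verdict}")
```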

Four years ago, McIntyre’s lab made solar cells resistant to corrosion in water. In the new paper, working with doctoral student Andrew Scheuermann, the researchers have shown how to increase the power of corrosion-resistant solar cells, setting a record for solar energy output under water.

“The results reported in this paper are significant because they represent not only an advance in performance of silicon artificial photosynthesis cells, but also establish the design rules needed to achieve high performance for a wide array of different semiconductors, corrosion protection layers and catalysts,” McIntyre said.

Such solar cells would be part of a larger system to fight climate change. The vision is to funnel greenhouse gases from smokestacks or the atmosphere into giant, transparent chemical tanks. Solar cells inside the tanks would spur chemical reactions to turn the greenhouse gases and water into what are sometimes called “solar fuels.”

[For more, check out the entire article in Stanford News, published on 18 Nov. 2015.]