Exciting and Controversial Science: Gravitational Waves and a New Ninth Planet?

We’ve had some fantastic astronomical news this month. Last week, we saw evidence of a “new ninth planet” lurking in the outer reaches of our solar system, 170 years after the discovery of Neptune. And earlier in January, we heard a cacophony of whispers about minute gravitational waves being detected for the first time ever. Either one, if confirmed, would thrill astrophysicists and space lovers alike and would rank among the biggest discoveries of 2016. We should be excited about them, but we should be careful about getting our hopes up so soon.

A New Planet, Far, Far Away?

A couple of fellow science writers and I went hiking at Castle Rock State Park in the middle of the Santa Cruz Mountains yesterday, and along the trail, we encountered a variety of people. On our way down, we happened to overhear a conversation: “What’s your favorite planet?” followed by a reply, “Did you hear about the new planet scientists discovered?” Isn’t that great? I’m glad that the story got so much media attention and made it to the front pages of newspapers. It intrigued people, and they’re talking about it.

By studying the strangely aligned orbits of Kuiper Belt Objects far beyond Pluto’s realm, astronomers may have found evidence of a planet up to 10 times more massive than Earth. It would be much, much farther out than Pluto, making it hard to spot. And from that distance, our sun would look almost like any other star. But if it exists, a new world (dubbed “Planet X”) joining our solar system’s family, even such an estranged cousin, would be exciting indeed.

Eric Hand (Science magazine) points out that the Subaru Telescope could search for Planet X. (Data) JPL; Batygin and Brown/Caltech; (Diagram) A. Cuadra/Science

Nevertheless, we should be concerned that the results are still very uncertain. The authors of the paper in The Astronomical Journal, Konstantin Batygin and Mike Brown (both at Caltech), argue that there’s only a 0.007% chance, about one in 15,000, that the clustering of the distant objects’ orbits could be a coincidence. But the behavior of the orbits could have other, possibly more likely, explanations, such as other unseen Kuiper Belt Objects with orbits aligned in the opposite way. (Other astronomers, like Scott Sheppard and Greg Laughlin, estimate the chance of a planet really being out there at 60 to 70 percent. I wouldn’t bank on those odds.)
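For readers who want to check the arithmetic, the quoted probability converts to odds like this (a quick sketch using only the percentage reported in the coverage):

```python
# Convert the reported 0.007% chance of coincidence into "one in N" odds.
p = 0.007 / 100        # 0.007 percent expressed as a probability
odds = 1 / p           # about 14,300, i.e. roughly "one in 15,000"
print(round(odds))     # 14286
```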

For that reason, we should remain skeptical for now. Some reporters and editors were more careful than others: while some headlines used appropriately hedged words like “suggest” and “may,” papers like the Denver Post and Washington Post ran “The New No. 9” or “Welcome to Planet Nine.” This is already an exciting story to tell, though, and we don’t need to exaggerate to get readers’ attention. If the planet turns out not to exist, people who read overblown headlines like those will be frustrated and confused.

Finally, we should all recall that Mike Brown was the main force behind Pluto’s demotion by the International Astronomical Union ten years ago. Since he calls himself the “Pluto Killer” (and wrote a book, “How I Killed Pluto and Why It Had It Coming”), it would be ironic if he helped discover a new ninth planet, replacing Pluto. But he and the Caltech news office seem to have hyped up his paper’s findings more than they deserved, given all the uncertainties involved.

Gravitational Waves Discovered?

While procrastinating and flipping through Twitter earlier this month, I came across some juicy gossip. I heard what sounded like the tantalizing detection of gravitational waves—an unprecedented achievement. These tiny ripples in space-time, predicted by Albert Einstein and thought to be produced by collisions of black holes or neutron stars, had been too small to measure before. Gravity is the weakest of forces, after all.
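To get a sense of why these ripples had been too small to measure, a back-of-envelope calculation helps (the strain and arm-length figures below are commonly quoted order-of-magnitude values, not numbers from LIGO’s own analysis):

```python
# Rough numbers only: a typical strain from a strong astrophysical source
# and the length of each LIGO interferometer arm.
strain = 1e-21             # dimensionless strain h of a passing wave
arm_length = 4000.0        # meters (each LIGO arm is 4 km long)

delta_L = strain * arm_length    # length change the detector must sense
print(delta_L)                   # 4e-18 meters

proton_diameter = 1.7e-15        # meters, approximate
print(delta_L / proton_diameter) # a few thousandths of a proton's width
```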

But it turns out that Lawrence Krauss, a well-known cosmologist and provocateur at Arizona State University, had caused the hullabaloo with some ill-advised tweets. He once again drew the media’s limelight to himself by spreading rumors that scientists in the Laser Interferometer Gravitational-Wave Observatory (LIGO) collaboration had detected gravitational waves for the first time. In the process, he put those scientists in a tough spot, as I’m sure they faced pressure to comment prematurely on their ongoing research.

The LIGO Laboratory operates two detector sites, one near Hanford in eastern Washington (pictured here) and another near Livingston, Louisiana. (Credit: Caltech/MIT/LIGO Lab)

The LIGO team is still working on their analysis using a pair of detectors in Louisiana and Washington state, and they haven’t yet produced conclusive results. From what I can tell, they may have promising evidence, but the situation is far from clear. There is nothing wrong with waiting until you’ve thoroughly investigated all the relevant issues and sources of error before announcing a momentous discovery. The alternative is to declare it prematurely, only to face the embarrassing possibility of retracting it later (which more or less happened to the BICEP2 scientists with their supposed discovery of primordial gravitational waves).

Gravitational waves will have to remain elusive for now. And if and when LIGO physicists do have convincing evidence of gravitational waves, they need not share any of the glory or credit with Krauss.

Fortunately, in spite of this excitement, science writers and editors kept their cool, soberly labeling Krauss’s rumors as just that before digging into the fascinating and painstaking work LIGO scientists are doing. Here’s some excellent coverage by Clara Moskowitz in Scientific American and by Lisa Grossman in New Scientist.

[26 Jan. update: I decided to tone down my criticism of Mike Brown, but not of Lawrence Krauss.]

El Niño brings concerns of mudslides, flooding, and coastal erosion

Here are two related stories I reported on and wrote for the Monterey Herald newspaper over the past week:

 

Concerns rise about mudslides in Monterey County

Higher-than-average rainfall last week, in the midst of an El Niño winter, has raised concerns about landslides and mudslides in Monterey County.

And those concerns are warranted. Although dangerous landslides have not been witnessed in the area this year, historically January and February are prime time. More than 60 percent of landslides happen in those months, when enough rain has been sopped up by the ground to loosen the soil, said John Stock, a U.S. Geological Survey landslide expert in the San Francisco Bay Area.

Vehicles get trapped in a mudslide on Highway 1 three miles south of Esalen on Feb. 13, 1987. The major winter storm caused this section of roadway to be closed for weeks. (Herald file photo)

Landslides require two key ingredients: gravity and water. After four years of drought, the soil might take a little longer to be saturated by rain, but it is still vulnerable.

“Weak rocks and steep slopes are the main factors, and they’re helped by rainfall,” said Chris Wills, a geologist for the California Geological Survey.

Geologists have identified two varieties of landslides and debris flows: a fast-moving kind and a slow-moving kind. Fast-moving landslides tend to occur during or shortly after intense rainstorms. The soil acts like a sponge, soaking up water in its pores until no more water can be absorbed. Then the freely moving water exerts pressure and pushes the sand grains apart. On steep eroding hillsides, this can generate landslides, spreading shallow soil and rocks downhill.

Slow-moving landslides are much larger and deeper, typically happening after a delay of days, weeks or even months once a lot of rainfall has accumulated. Such a landslide occurred in La Conchita near Santa Barbara during the winter of 2005, killing 10 people and damaging dozens of homes. They begin by creeping downhill at less than an inch per year, making them hard to spot. Then they accelerate as massive amounts of earth, boulders and trees tumble down.

“There are two reasons to worry about landslides,” said Wills. “Fast-moving ones can be dangerous to people, and deep slow-moving ones can affect roads, pipelines and houses built on them. These cause significant costs to society,” he added…

[For more, check out the entire story in the Monterey Herald, published on 10 Jan. 2016. Thanks to David Kellogg for editing assistance.]

 

Rainstorms, tides and El Niño are reshaping Monterey Bay beaches

Just as big waves wash away carefully constructed sand castles, El Niño threatens to transform Monterey County beaches and coastlines.

Every winter rainy season brings storms and heavy surf that erode shores and wash away sand, which waves return to the coast in summer. But El Niño generates extra rain and higher sea levels, which increase erosion during intense and windy storms, affecting coastal bluffs and beaches around Carmel, Pacific Grove, Pebble Beach and Monterey.

People watch as the Carmel River flows to the ocean at Carmel River State Beach on Monday, January 11, 2016. County crews worked with equipment on Sunday to start the breach through the sand bar at the southern channel. The river broke through on its own sometime Sunday night. (Vern Fisher - Monterey Herald)

“Southern Monterey Bay has the most highly erosive beaches in all of California,” said Paul Michel, superintendent of the Monterey Bay National Marine Sanctuary.

Continual rainstorms swelled the waters of Carmel River last week. After 10 p.m. Sunday, a sand barrier built to keep the river from breaching at the Carmel Lagoon was washed away, according to Melanie Beretti, program manager at the Monterey County Resource Management Agency. After considering bulldozing sand back in place, officials expressed concern about it just getting flushed out to sea, so they decided to hold back.

“We need to balance when and how to utilize the sand,” Beretti said.

The beach has been reopened to visitors, and the Resource Management Agency and National Oceanic and Atmospheric Administration continue to monitor the situation to protect the steelhead trout habitat in the lagoon.

This incident could be a small taste of much bigger things to come. Abnormally large storms and big tides could damage roads, bike paths, hiking trails, water lines, sewage systems or even homes close to the coast…

[For more, check out the entire story in the Monterey Herald, published on 11 Jan. 2016.]

Reporting from the American Geophysical Union: Fire Risk Maps, Rocky Mountain Forests, Sierra Nevada Water

Here are three new stories I reported on and wrote at the American Geophysical Union meeting in San Francisco a week ago:

 

Assessing U.S. Fire Risks Using Soil Moisture Satellite Data

Soaring hundreds of kilometers above the Earth, a NASA satellite monitors soil moisture in the ground far below, probing drought conditions. Scientists at NASA Jet Propulsion Laboratory (JPL) analyzed these data and combined them with wildfire information from the U.S. Forest Service and land cover data from the U.S. Geological Survey. They used the results to assess fire risks, taking the first important step toward developing predictive maps for fires throughout the continental United States.

Burning trees and brush in the 2013 Rim Fire, which burned more than 1000 square kilometers in California’s Stanislaus National Forest. The Rim Fire was the largest ever recorded for the Sierra Nevada mountain range. (Credit: Mike McMillan, USFS, CC BY-NC 2.0)

The JPL scientists, a team led by Nick Rousseau in NASA’s DEVELOP Applied Sciences Program, find that soil moisture data alone can approximately explain the distribution and extent of fires, from the Sierra Nevada to the western plains to the Florida wetlands. Their results show how well the dryness of a region indicates the fuel available for fires. They reported their findings on Wednesday at the American Geophysical Union Fall Meeting in San Francisco, Calif.
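The kind of analysis described here, relating how dry a region is to how much of it burns, can be sketched in a few lines (the numbers below are invented purely for illustration; the team’s actual data and methods are in the Eos article):

```python
import numpy as np

# Invented, illustrative values for five hypothetical regions:
# volumetric soil moisture (fraction) and area burned (square kilometers).
soil_moisture = np.array([0.05, 0.10, 0.15, 0.25, 0.35])
burned_area = np.array([900.0, 600.0, 400.0, 150.0, 50.0])

# If drier soil means more available fuel, the correlation between soil
# moisture and burned area should be strongly negative.
r = np.corrcoef(soil_moisture, burned_area)[0, 1]
print(round(r, 2))  # strongly negative, close to -1
```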

Every year, wildfire outbreaks cause economic loss, property damage, and environmental degradation. Local, state, and federal agencies want to prepare for fire activity, and knowledge about particularly high risk areas would help them do so. If these new maps could be used to predict wildfire potential, then they would be an invaluable resource.

“This shows how much overall area is likely to burn, which could be a useful tool when Congress allocates resources for fire management,” said Sparkle Malone, a research ecologist at Rocky Mountain Research Station in Fort Collins, Colo., who was not involved in the study.

[For more, check out the entire story in the Monterey Herald, published on 10 Jan. 2016. Thanks to David Kellogg for editing assistance.]

 

Climate change and bark beetles spell doom for Rocky Mountain spruce forests

The combination of climate change and spruce bark beetles could drastically alter Rocky Mountain spruce and pine tree populations over the next three centuries, according to a new study. Using an improved model of forest growth, death, and regeneration, a group of scientists predicts that spruce populations will decline and lodgepole pines will take their place.

Nearly every mature spruce tree has been killed by spruce beetle in this area of the Rio Grande National Forest in southwest Colorado. (Credit: U.S. Forest Service; photo: Brian Howell)

According to new research presented at the 2015 American Geophysical Union Fall Meeting, the demographics of a forested region can be dramatically affected by insect outbreaks and fires over time. In addition, different kinds of trees have different tolerances to drought, strong winds and temperature changes. “These act to create competition between individual species and even between trees,” said Adrianna Foster, an environmental scientist at the University of Virginia and lead author of the new study.

Bark beetles are tiny, only a quarter inch in length, smaller than a grain of rice, but given the opportunity, they can rapidly consume a forest. According to the U.S. Forest Service, over the past 15 years, pine beetles have devastated Rocky Mountain forests, killing off tens of millions of lodgepole pines. But if the new study’s predictions are correct, the trees stand to make a comeback as spruce trees decline, according to Foster.

[For more, check out the entire article on the GeoSpace blog site, published on 21 Dec. 2015. Thanks to Lauren Lipuma for editing assistance.]

 

Parts of Sierra Nevada Mountains more susceptible to drought than previously thought, study finds

Particular areas in California’s Sierra Nevada Mountains have a high capacity to store water but are more susceptible to droughts than previously thought, new research finds.

The Salt Springs Reservoir, in the Sierra Nevada Mountains east of Sacramento, has experienced dropping water levels, barely covering the lowest gauges installed by the U.S. Geological Survey. (Credit: Rowan Gaffney)

“The areas we think are most resilient to drought are actually more vulnerable to the transition from historical droughts to more extreme ones, like the one happening now,” said Rowan Gaffney, a geoscientist at the University of Nevada, Reno, and lead author of the study.

Every year, accumulated snowpack in the region stores water, which melts during the summer and recharges groundwater that flows into river and stream areas. Ecosystems, communities and agricultural irrigation depend on that water downstream, Gaffney said.

In the new study, Gaffney and fellow University of Nevada geoscientist Scott Tyler investigated the relationship between groundwater and stream flow in 10 strategically chosen locations throughout the Sierra Nevada in eastern California. High groundwater storage areas are losing the most water during the current drought, Gaffney reported at the 2015 American Geophysical Union Fall Meeting.

[For more, check out the entire article on the GeoSpace blog site, published in Dec. 2015.]

New Planetary Science: Habitable Planets and Saturn’s Titan Moon

Here are two new stories I’ve written about interesting new research presented at recent conferences this fall, the National Association of Science Writers meeting in October and the American Geophysical Union meeting this week.

 

Sara Seager’s Search for Distant Habitable Worlds

Like a 21st-century Spock, Dr. Sara Seager seeks out new worlds and civilizations. With continually improving telescopes, she persistently and passionately pursues her grand quest: to search throughout our galaxy for habitable planets, a few of which might even resemble the Earth.

Sara Seager, MIT planetary physics professor. (Credit: MIT)

Seager, an accomplished professor of planetary science and astrophysics at MIT, gave an engaging presentation at the 2015 Science Writers meeting. She spoke clearly and intensely about her research and the exciting future of planetary exploration.

She and her research group have made important breakthroughs while characterizing newly discovered planets beyond our solar system, known as exoplanets, using the NASA Kepler space telescope. With powerful next-generation observatories, she also looks forward to the next frontier, where her ongoing mission could come to fruition.

…In most exoplanet work, astronomers consider only certain planets as potentially life-friendly. Their orbit, atmosphere, surface and climate all must be just right, falling within narrow ranges of parameters. A successful search requires a daunting understanding of biology, chemistry, and geology, as well as astronomy and physics.

…Seager argues that the traditional concept of the habitable zone is too rigid and should be expanded. “Exoplanets are diverse, covering nearly all masses, sizes and orbits possible,” she says. What scientists mean by habitable should be more inclusive, or they risk missing outlier planets that nonetheless could be conducive to life. Allowing habitability to vary with the type of star or planet alleviates the problem.

[For more, check out the entire article on the Minority Postdoc site, published on 14 Dec. 2015. Thanks to Matthew Francis for help with editing.]

 

Scientists Map Titan’s Lakes, Revealing Clues to their Origins

As Saturn’s largest moon, Titan earns its name. It’s also the only known body other than Earth with seas, numerous surface lakes, and even rainy weather. Now scientists have mapped out Titan’s polar lakes for the first time, revealing information about the moon’s climate and surface evolution. They found that the lakes formed differently than had been previously thought—and differently than any lakes on Earth.

A map of Titan’s North Pole, including its lakes, sediments and complex terrain. (Credit: NASA/JPL-Caltech/Space Science Institute.)

A collaboration of scientists led by Alexander Hayes of Cornell University presented their findings at the 2015 American Geophysical Union Fall Meeting. They used NASA’s Cassini spacecraft to penetrate Titan’s smoggy atmosphere and probe the complex lake systems below.

Titan’s seas and giant lakes, which are larger than the Caspian Sea and Great Lakes, appear unique in the solar system, the study found. They consist mostly of liquid hydrocarbons like methane and ethane, possibly making them a promising location to search for the building blocks of carbon-based extraterrestrial life. Because Titan is tilted with respect to its orbit, it also experiences seasons, which drive these lakes toward its North Pole. But Saturn’s eccentric orbit makes the lakes shift from pole to pole, Hayes explained.

By combining Cassini RADAR mapper observations with other data, Hayes and his colleagues compiled detailed information about Titan’s lake systems and topography, allowing scientists to test ideas for how these lakes developed.

“Topography in geology is the key because it drives the evolution of landscapes,” said Samuel Birch, lead author of one of the Titan studies and a Ph.D. student at Cornell.

[For more, check out the entire article on GeoSpace, published on 14 Dec. 2015. Thanks to Lauren Lipuma for editing assistance.]

More Engineering News: Protein Engineering and Next-Generation Computer Architecture

Here are two new articles I’ve written about exciting newly published research on protein engineering and computer systems, led by engineers at Stanford University:

 

Stanford engineers invent process to accelerate protein evolution

A new tool enables researchers to test millions of mutated proteins in a matter of hours or days, speeding the search for new medicines, industrial enzymes and biosensors.

All living things require proteins, members of a vast family of molecules that nature “makes to order” according to the blueprints in DNA.

Through the natural process of evolution, DNA mutations generate new or more effective proteins. Humans have found so many alternative uses for these molecules – as foods, industrial enzymes, anti-cancer drugs – that scientists are eager to better understand how to engineer protein variants designed for specific uses.

Now Stanford engineers have invented a technique to dramatically accelerate protein evolution for this purpose. This technology, described in Nature Chemical Biology, allows researchers to test millions of variants of a given protein, choose the best for some task and determine the DNA sequence that creates this variant.

An overview of the directed evolution process with μSCALE: preparing protein libraries, screening them, extracting desired cells, and then inferring the DNA sequence at work. (Credit: Cochran Lab, Stanford)

“Evolution, the survival of the fittest, takes place over a span of thousands of years, but we can now direct proteins to evolve in hours or days,” said Jennifer Cochran, an associate professor of bioengineering who co-authored the paper with Thomas Baer, executive director of the Stanford Photonics Research Center.

“This is a practical, versatile system with broad applications that researchers will find easy to use,” Baer said.

By combining Cochran’s protein engineering know-how with Baer’s expertise in laser-based instrumentation, the team created a tool that can test millions of protein variants in a matter of hours.

“The demonstrations are impressive and I look forward to seeing this technology more widely adopted,” said Frances Arnold, a professor of chemical engineering at Caltech who was not affiliated with the study.

[For more, check out the entire article in Stanford News, published on 7 Dec. 2015. Thanks to Tom Abate for help with editing.]

 

Stanford-led skyscraper-style chip design boosts electronic performance by factor of a thousand

In modern computer systems, processor and memory chips are laid out like single-story structures in a suburb. But suburban layouts waste time and energy. A new skyscraper-like design, based on materials more advanced than silicon, could provide the next computing platform.

For decades, engineers have designed computer systems with processors and memory chips laid out like single-story structures in a suburb. Wires connect these chips like streets, carrying digital traffic between the processors that compute data and the memory chips that store it.

But suburban-style layouts create long commutes and regular traffic jams in electronic circuits, wasting time and energy.

That is why researchers from three other universities are working with Stanford engineers, including Associate Professor Subhasish Mitra and Professor H.-S. Philip Wong, to create a revolutionary new high-rise architecture for computing.

A multi-campus team led by Stanford engineers Subhasish Mitra and H.-S. Philip Wong has developed a revolutionary high-rise architecture for computing.

In Rebooting Computing, a special issue of the IEEE Computer journal, the team describes its new approach as Nano-Engineered Computing Systems Technology, or N3XT.

N3XT will break data bottlenecks by integrating processors and memory like floors in a skyscraper and by connecting these components with millions of “vias,” which play the role of tiny electronic elevators. The N3XT high-rise approach will move more data, much faster, using far less energy, than would be possible using low-rise circuits.

“We have assembled a group of top thinkers and advanced technologies to create a platform that can meet the computing demands of the future,” Mitra said.

Shifting electronics from a low-rise to a high-rise architecture will demand huge investments from industry, but it promises big payoffs for making the switch.

“When you combine higher speed with lower energy use, N3XT systems outperform conventional approaches by a factor of a thousand,” Wong said.
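That factor of a thousand comes from combining speed and energy gains multiplicatively, in the style of an energy-delay-product metric. The split below is purely hypothetical, chosen only to show how the two improvements compound (the paper reports the combined figure, not this breakdown):

```python
# Purely hypothetical breakdown of the claimed 1000x combined benefit.
speedup = 50.0           # suppose data moves 50 times faster...
energy_reduction = 20.0  # ...while each operation uses 20 times less energy

# Speed and energy gains multiply when measured together.
combined_benefit = speedup * energy_reduction
print(combined_benefit)  # 1000.0
```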

[For more, check out the entire article in Stanford News, published on 9 Dec. 2015.]

Engineering News: Solar Cells and Plasma Combustion

Check out these new articles I’ve written about solar cells and plasma combustion research led by Stanford University engineers:

 

Plasma experiments bring astrophysics down to Earth

New laboratory technique allows researchers to replicate on a tiny scale the swirling clouds of ionized gases that power the sun, to further our understanding of fusion energy, solar flares and other cosmic phenomena.

Intense heat, like that found in the sun, can strip gas atoms of their electrons, creating a swirling mass of positively and negatively charged ions known as a plasma.

For several decades, laboratory researchers have sought to replicate plasma conditions similar to those found in the sun in order to help them understand the basic physics of ionized matter and, ultimately, harness and control fusion energy on Earth or use it as a means of space propulsion.

Now Stanford engineers have created a tool that enables researchers to make detailed studies of certain types of plasmas in a laboratory. Their technique allows them to study astrophysical jets—very powerful streams of focused plasma energy.

A long-exposure photographic image capturing the Stanford Plasma Gun during a single firing. The image shows where the plasma is brightest during the acceleration process, which occurs over tens of microseconds.

Writing in Physical Review Letters, mechanical engineering graduate students Keith Loebner and Tom Underwood, together with Professor Mark Cappelli, describe how they built a device that creates tiny plasma jets and enabled them to make detailed measurements of these ionized clouds.

The researchers also proved that plasmas exhibit some of the same behavior as the gas clouds created by, say, firing a rocket engine or burning fuel inside an internal combustion engine.

Their instrument, coupled with this new understanding of the fire-like behavior of plasmas, creates a down-to-earth way to explore the physics of solar flares, fusion energy and other astrophysical events.

“The understanding of astrophysical phenomena has always been hindered by the inability to generate scaled conditions in the laboratory and measure the results in great detail,” Cappelli said.

[For more, check out the entire article in Stanford News, published on 1 Dec. 2015. Thanks to Tom Abate for help with editing.]

 

Stanford designs underwater solar cells that turn captured greenhouse gases into fuel

Taking a cue from plants, researchers figure out how to use the sun’s energy to combine CO2 with H2O to create benign chemical products, as part of a futuristic technology called artificial photosynthesis.

Stanford engineers have developed solar cells that can function under water. Instead of pumping electricity into the grid, though, the power these cells produce would be used to spur chemical reactions to convert captured greenhouse gases into fuel.

This new work, published in Nature Materials, was led by Stanford materials scientist Paul McIntyre, whose lab has been a pioneer in an emerging field known as artificial photosynthesis.

Stanford engineers have shown how to increase the power of corrosion-resistant solar cells, setting a record for solar energy output under water. (Photo credit: Shutterstock)

In plants, photosynthesis uses the sun’s energy to combine water and carbon dioxide to create sugar, the fuel on which they live. Artificial photosynthesis would use the energy from specialized solar cells to combine water with captured carbon dioxide to produce industrial fuels, such as natural gas.
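For the chemically inclined: producing natural gas (methane) this way amounts to running combustion in reverse. The balanced reaction, a standard textbook equation rather than anything taken from the paper itself, is:

```latex
\mathrm{CO_2} + 2\,\mathrm{H_2O} \xrightarrow{\ \text{solar energy}\ } \mathrm{CH_4} + 2\,\mathrm{O_2}
```

Burning one methane molecule with two oxygen molecules releases energy and yields carbon dioxide and water; artificial photosynthesis must supply at least that much energy, here from sunlight, to push the reaction the other way.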

Until now, artificial photosynthesis has faced two challenges: ordinary silicon solar cells corrode under water, and even corrosion-proof solar cells had been unable to capture enough sunlight under water to drive the envisioned chemical reactions.

Four years ago, McIntyre’s lab made solar cells resistant to corrosion in water. In the new paper, working with doctoral student Andrew Scheuermann, the researchers have shown how to increase the power of corrosion-resistant solar cells, setting a record for solar energy output under water.

“The results reported in this paper are significant because they represent not only an advance in performance of silicon artificial photosynthesis cells, but also establish the design rules needed to achieve high performance for a wide array of different semiconductors, corrosion protection layers and catalysts,” McIntyre said.

Such solar cells would be part of a larger system to fight climate change. The vision is to funnel greenhouse gases from smokestacks or the atmosphere into giant, transparent chemical tanks. Solar cells inside the tanks would spur chemical reactions to turn the greenhouse gases and water into what are sometimes called “solar fuels.”

[For more, check out the entire article in Stanford News, published on 18 Nov. 2015.]

The Return of Persian Science

Like many multiethnic, multicultural people, I’ve had difficulty coming to terms with my multifaceted yet fragmented identity. As a half-Iranian in the midst of Americans, I’ve lacked key cultural influences and a US-centric worldview, while in Iran I sometimes feel like an outsider.

I’ve had the wonderful opportunity to visit twice so far: once as a teenager and once more recently as a physicist. Each time, I’ve been very observant in the hopes of better understanding an important side of myself. I’ve explored its fascinatingly unique cities, including the massive capital, Tehran, with its huge bazaars; Esfahan, with its spectacular architecture and Naqsh-e Jahan Square, a national landmark; and Shiraz, with the tombs of the poet giants Hafez and Saadi. I’ve also looked for signs of how the country appears to be changing as it becomes more open to the international community.

Me and Sohrab Rahvar outside the physics department of Sharif University of Technology, May 13, 2008. (Photo: Forood Daneshbad.)

At the invitation of Sohrab Rahvar, a physics professor at Sharif University of Technology, I gave two seminars, one there and another at the University of Tehran. I presented postdoctoral research I had been doing at the Max Planck Institute for Astronomy in Heidelberg, Germany, investigating connections between observations of galaxies and theories of dark matter.

I introduced myself in Farsi and gave the talks in English, the usual second language there. I had learned Farsi from my mother in the US, and I had a pretty good accent too, but I lacked the vocabulary to communicate astrophysics in the language. I did learn, though, that Iranians, like English speakers, use the same word for a “cluster” of galaxies and a “cluster” of grapes.

After my presentations, the students asked challenging questions about my work—both in English and Farsi. One student asked me for advice, as she was preparing a job application for the Max Planck Institute for the Science of Light, near Nuremberg.

For all their talent and promise, students and scientists like her face many difficulties under the tough nuclear-related sanctions imposed on Iran. Many have a hard time traveling to conferences, obtaining student visas, or meeting with international colleagues. Even the Iranian physicists who played an integral role in the CERN Large Hadron Collider collaboration ran into restrictions. Obtaining professional journals and lab equipment can be prohibitively expensive for Iranian scientists too. Perhaps for these reasons, many scientists have shifted to theoretical rather than experimental work; for example, I met a surprising number of string theory researchers there.

Science, medicine and mathematics have a long and glorious history in Iran and Persia. Six centuries before Galileo, the physicist Biruni was among the first scientists to propose that the speed of light is finite. Ibn al-Haytham developed the field of optics, Ibn Sina (known in the West as Avicenna) made important contributions to medicine and philosophy, and the 11th-century poet Omar Khayyam, author of The Rubaiyat, also worked out principles of algebra and devised an accurate solar calendar. Observatories proliferated throughout Persia then, and precise planetary records collected at the Maragheh observatory, in what is now northwestern Iran, likely influenced Copernicus’s hypothesis that the Earth revolves around the sun.

A thousand years later, Iran is a nation of 78 million people, almost as populous as Germany. More than half the population is under the age of 35, many of them politically active, and young adults, male and female alike, have a literacy rate of 97 percent. According to the Institute of International Education, 10,200 Iranian students and nearly 1,400 scholars studied at US colleges and universities, making Iran the 12th leading country of origin for students in the US. In 1979, by contrast, more than 51,000 Iranian students were enrolled in US universities, then the biggest source of overseas students. The large Iranian diaspora has become known for its accomplished work in science and other fields, but according to the International Monetary Fund, Iran suffers the highest “brain drain” among developing and developed countries, with 150,000 to 180,000 educated people emigrating every year. Now that may change.

As the international sanctions are gradually lifted, students and scientists in Iran and their colleagues abroad have much to look forward to. As part of the historic nuclear deal, the uranium enrichment facility in Fordo, between Tehran and Esfahan, will be converted into an international nuclear physics and technology center.

Iranians have other plans in the works too. Astronomers are working to build a new observatory within the next four or five years: a 3.4-meter optical telescope on a 12,000-foot peak in central Iran, at a site with observing conditions comparable to Hawaii’s Mauna Kea. Once it’s completed, the international community will be invited to use up to 70 percent of the observing time to study planets outside the solar system, gamma-ray bursts, distant galaxies and elusive dark matter. I hope to see the telescope the next time I travel there.

In addition, Iranian physicists plan to construct an ambitious $300 million “synchrotron” particle accelerator. Like the telescope, it would be difficult to complete on schedule, if at all, were the sanctions not removed. Iranian scientists and their international partners excitedly anticipate new experiments on a wide range of subjects, from research on biological molecules to advanced materials. “Big Science” is not limited to the West.

Other sciences also look forward to a changing environment, as described in a Science special issue on science in Iran.

Rahvar seems optimistic about the post-sanctions situation. “We hope to reestablish our previous scientific relations and make new collaborations,” he says. It will take time, but the prospect of an improving research climate in Iran could herald a new era of scientific achievements in the country, especially in the physical sciences.

I think that a more open political environment in Iran won’t just invigorate science in the country and in the international community; with time, it will stimulate a more open exchange of ideas and cultural understanding. I’m proud of my Iranian blood, and I excitedly await Iran’s renewal and resurgence.

[I’m cross-posting this from the Last Word on Nothing blog, where this was originally published. Thanks to Jessa Gamble and other LWON members for their editing assistance and helpful advice.]

Finding Earth 2.0

In honor of Carl Sagan’s birthday, I figured I’d write a few thoughts I had about a fascinatingly unique conference I attended in the Bay Area last week. It was called “Finding Earth 2.0,” and it was organized by 100 Year Starship, a group partially funded by NASA and the Defense Advanced Research Projects Agency (DARPA) to plan for interstellar travel within the next century.

A potential spacecraft called Icarus Pathfinder would be powered by electric propulsion engines called VASIMR, taking it out to 1,000 times the distance between the Earth and Sun. (Credit: NBC News)

As you might imagine for such an organization, the conference speakers and attendees were rather eclectic: astronomers, planetary physicists and science journalists, the people I usually hang out with, as well as aerospace engineers, science fiction writers, business people, teachers, space enthusiasts, and many others. But everyone displayed an active interest in exploring the distant universe and imagining what our future might be like.

Dr. Mae Jemison, the first woman of color in space, heads the 100 Year Starship, and she gave a plenary talk. She pointed to many motivations people have for finding another Earth, including conundrums and challenges our planet and species face, such as limited resources, overpopulation, and our own behavior—perhaps a reference to climate change or nuclear weapons. I think we have many other compelling reasons for interstellar space exploration, but I’ve written about that here before.

I also saw many interesting perspectives and presentations about hunting for planets beyond the solar system, called exoplanets, including habitable ones or even inhabited ones. Dr. Jill Tarter, SETI (Search for Extraterrestrial Intelligence) Institute co-founder and inspiration for Sagan’s protagonist in Contact (Dr. Arroway), gave a provocative presentation on attempts to detect “technosignatures” from distant planets. (She clarified that possessing technology doesn’t imply an intelligent civilization; however, technologies serve as a proxy for intelligence.) Advanced species on these planets could be giving off radio and optical signals that could reach the Earth, but we’d have to listen really, really hard to hear them. But if they had a Dyson sphere or an “alien superstructure,” that would be easier.

Other astronomers and astrobiologists talked about their work on related subjects. Margaret Turnbull, also of the SETI Institute, spoke about the “massive harvest” of planets reaped by NASA’s Kepler probe, which has confirmed more than 1,000 planets in our Milky Way neighborhood and shown that about 1 in 5 stars has a planet in the “habitable zone.” Stephen Kane (San Francisco State University) made a convincing case that we should view the habitable zone’s boundaries as uncertain, and that many planets in the zone would actually not be very hospitable to life. Natalie Batalha (NASA Ames) argued that we should be open-minded about planets in other systems. In one of a few relationship-like quotes, she said, “In our search for an [Earth-like] soul-mate, we may be a bit myopic.” She was referring to the fact that our solar system has no planets between Earth and Neptune in size, while according to Kepler observations such planets seem rather common throughout the galaxy. She and others also made the point that we need detailed imaging or spectra of planetary systems to learn more about their habitability.

Niki Parenteau (SETI) talked about her efforts to study exoplanets and spot signs of life, which would likely be microorganisms and would have to cover the world to be detectable. “There’s no one smoking gun for biosignatures,” she said. “We need multiple lines of evidence.” She looks for things like biogenic gases and certain planetary surface features. But for her, water is the #1 requirement…and then Morgan Cable, a nerdy joke-telling astrochemist from Jet Propulsion Laboratory, considered a range of other liquids life might be able to develop in, including ammonia, carbon dioxide, petroleum, and liquid hydrocarbons. She ended with her main argument: “NASA shouldn’t just be looking for places with liquid water.”

Artist's illustration of NASA's NEA Scout CubeSat, which is scheduled to launch aboard the maiden flight of the agency’s Space Launch System rocket in 2018. (Credit: NASA)

A bunch of people gave presentations about propulsion systems, trying to push the boundaries of space travel. I thought the most interesting one was by Les Johnson, Deputy Manager for NASA’s Advanced Concepts Office at Marshall Space Flight Center. In back-to-back talks, he described current efforts to design and construct giant solar and electric sails. The sails involve ultra-thin reflective materials that are unfurled in space and use the pressure of sunlight to propel a spacecraft to the distant reaches of the solar system and beyond. In an important step toward that goal, Johnson and NASA engineers are currently building a solar sail for the Near-Earth Asteroid Scout mission, which will transport a CubeSat “nanosatellite” to study a near-Earth asteroid two years from now. He and his colleagues are also testing electric sails for fast spacecraft propelled by the solar wind, which, if as powerful as hoped, could even send a probe to another star.

Finally, I saw a few strange talks at the conference, and I wasn’t sure what to make of them. For example, one person spoke about the new field of “astrosociology.” He avoided giving any specifics, though, even when discussing “deviant” behavior, and admitted after the talk that he had envisioned studying multi-year trips transporting tens of thousands of colonists beyond the solar system. Maybe for the 200 Year Starship! Unfortunately, the speaker had not considered smaller missions, such as handfuls of astronauts traveling to Mars or private ventures conducting asteroid mining. I’d imagine that such small groups of people stuck together for long periods could benefit from sociological study.

Frontiers of Computer Engineering: Graphene and Cognitive Networking

Check out these new articles I’ve written about exciting computer engineering research going on at Stanford University and the University of California, San Diego:

 

Graphene key to high-density, energy-efficient memory chips, Stanford engineers say

Only an atom thick, graphene is a key ingredient in three Stanford projects to create data storage technologies that use nanomaterials other than standard silicon.

The memory chips in phones, laptops and other electronic devices need to be small, fast and draw as little power as possible. For years, silicon chips have delivered on that promise.

But to dramatically extend the battery life of mobile gadgets, and to create data centers that use far less energy, engineers are developing memory chips based on new nanomaterials with capabilities that silicon can’t match.

Stanford electrical engineering professor H.-S. Philip Wong, left, graduate student Joon Sohn and postdoctoral fellow Seunghyun Lee (seated) are developing high-capacity, energy-efficient memory chips that are not based on silicon. (Photo: Norbert von der Groeben)

In three recent experiments, Stanford engineers demonstrate post-silicon materials and technologies that store more data per square inch and use a fraction of the energy of today’s memory chips.

The unifying thread in all three experiments is graphene, an extraordinary material isolated a decade ago but which had, until now, relatively few practical applications in electronics…

[For more, check out the entire article in Stanford News, published on 23 Oct. 2015. Thanks to Tom Abate for help with editing.]

 

New Frontiers of Cognitive Networking

Most of us use many devices — perhaps too many — throughout the day: A smartphone at home and then at the cafe around the corner, a laptop computer at work and maybe a tablet or e-reader in the evening. That’s not counting all of the other possible “smart” devices at our fingertips, such as health monitors, fitness trackers and smart watches.

Few people think about how all these devices could be efficiently networked via software and wireless technology; however, if you were trying to download a video and coworkers on the same network were attempting to communicate via Google Talk, you would want to be sure those limited wireless resources were allocated optimally.

UC San Diego Postdoctoral Researcher Giorgio Quer.

Giorgio Quer, a postdoctoral researcher in electrical and computer engineering at the University of California, San Diego, is on a quest to solve these and other networking problems. Quer has been working on “cognitive networking” and exploring related projects at the Qualcomm Institute, the UC San Diego division of the California Institute for Telecommunications and Information Technology, for the past five years as a visiting scholar from the University of Padua in northern Italy. He works with Ramesh Rao, director of the Qualcomm Institute and principal investigator of the research, as well as Matteo Danieletto, another postdoc in the department.

The concept of cognitive networking, according to Quer, refers to “a way to apply cognition to wireless, where a network learns from past history.” Such a process enables a network to “perceive” and learn about its current conditions and then plan, decide and act on those conditions, resembling (in a limited sense) cognition in the human brain…
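
That loop can be sketched in a few lines of code. The sketch below is entirely my own illustration, not taken from Quer’s research, and every name in it is hypothetical: the network keeps a learned estimate of each wireless channel’s quality, built up from the past history of transmissions, and then acts on the best one.

```python
class CognitiveRadio:
    """Toy model of the perceive/learn/decide/act loop described above.

    Illustrative only: the class, methods, and update rule are my own
    simplification, not from Quer's cognitive-networking work.
    """

    def __init__(self, channels, alpha=0.2):
        self.alpha = alpha  # learning rate: weight given to the newest observation
        # Learned quality estimate per channel, built up from past history
        self.quality = {ch: 0.5 for ch in channels}

    def perceive(self, channel, success):
        """Learn: fold one observed transmission result into the estimate."""
        observed = 1.0 if success else 0.0
        self.quality[channel] = (1 - self.alpha) * self.quality[channel] + self.alpha * observed

    def decide(self):
        """Act: choose the channel with the best learned quality."""
        return max(self.quality, key=self.quality.get)


radio = CognitiveRadio(channels=["ch1", "ch2"])
radio.perceive("ch1", success=False)  # a failed transfer lowers ch1's estimate
radio.perceive("ch2", success=True)   # a successful one raises ch2's estimate
print(radio.decide())                 # prints: ch2
```

A real cognitive network would learn far richer state, such as interference, traffic patterns and past spectrum occupancy, but the structure is the same: observe, update a model, act on it.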

[For more, check out the entire article at the California Institute for Telecommunications and Information Technology (Calit2), published on 21 Sep. 2015. Thanks to Tiffany Fox for help with editing.]

What happens if you fall into a black hole?

Q: What happens if you fall into a black hole?
– Jane Doe, Calif.

Ramin Skibba, a science communicator and astrophysicist at UC Santa Cruz, illuminates:

When a dying star much bigger than our sun burns the last of its fuel, it finally collapses under its own weight, explodes, and leaves behind a black hole. If you fell into the black hole, even in a sturdy spacecraft, powerful tides from its gravity would rip you into a ribbon of atoms.

Artist's drawing of the black hole Cygnus X-1, pulling matter from the blue star beside it. (Credits: NASA/CXC/M.Weiss)

According to Albert Einstein’s theory of general relativity, black holes have such densely compressed mass that they warp the very fabric of space around them; at a black hole’s center, the known laws of physics break down. If you got too close, it would inevitably suck you in. Along the way, you would perceive distorted colors and shapes, as if through carnival mirrors. Your clocks would run differently, too; black holes bend not only space, but time itself.

Suppose you fell in feet first. Your legs would feel a much stronger gravitational force than your head. In a fraction of a second, this tide would stretch you and tear you apart like taffy. The resulting shrapnel and debris would spiral into the hole, vanishing forever.
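
A back-of-the-envelope Newtonian estimate (my own addition) shows why this tide is so violent: the difference in gravitational pull across your height h, at a distance r from a black hole of mass M, is roughly

```latex
% Tidal (differential) acceleration across a body of height h
% at distance r from a mass M, in the Newtonian approximation:
\Delta a \;\approx\; \frac{2\,G M h}{r^{3}}
```

Because of the r cubed in the denominator, the stretching grows explosively as you fall inward; for a black hole of ten solar masses, a 2-meter body at a distance of 100 kilometers already feels a differential pull of millions of meters per second squared, hundreds of thousands of times Earth’s gravity.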

Astronomers see evidence of this at the centers of galaxies, where the largest black holes grow. It happens to entire stars that venture too close, then get shredded in blazes of energy.

[Thanks to Rob Irion for editing help with this piece, which is written to resemble the short Q&A-style articles previously published in Scientific American.]