The Physics of Sustainable Energy

I attended a conference this weekend called “The Physics of Sustainable Energy” at the University of California, Berkeley. It was organized by people affiliated with the American Physical Society, Energy Resources Group, and a couple other organizations. Most of the speakers and attendees (including me) seemed to be Californians. I had some interesting conversations with people and attended some great talks by experts in their fields, and here I’ll just give you a few highlights.

First though, I want to make two general comments. I noticed that only ~20% of the speakers were women, which is worse than astrophysics conferences, and it’s too bad the organizers weren’t able to make the conference more diverse. (There were a few people of color speaking, though.) Second, I think it’s excellent that people (and not just in California) are actively involved in working on solutions and innovations, but I think we should be careful about a technophilic or technocratic emphasis. This was a conference for physicists and engineers, though, and energy policy and communication with the media and policy-makers, for example, were mostly beyond its scope. I was struck by the apparently close ties some speakers had with industry (such as Amory Lovins and Jonathan Koomey); to some extent that’s necessary, but I was a little concerned about potential conflicts of interest.

…On to the conference. Ken Caldeira spoke about the global carbon balance. In per-capita CO2 emissions from fossil fuel use and cement production, the US is worst (50 kg CO2/person/day), followed by Russia, China, and the EU. California emits half as much per capita as the rest of the country, but 2/3 of the difference is due to a fortunate climate (it doesn’t get very cold); according to an audience member (Art Rosenfeld?), “we’re mostly blessed with good luck as well as some brains.” Daniel Kammen (one of the organizers) then talked about developing a framework for energy access for all. According to the Intergovernmental Panel on Climate Change (IPCC AR4, 2007): “warming will most strongly and quickly impact the global poor.” Kammen described the concept of “energy poverty”: 1.4 billion people lack access to electricity today, a similar number will still lack it in 2030, and more will have only unreliable or intermittent access. There appears to be a strong correlation between electricity access and the human development index.

[Figure: vertical-axis wind turbines (VAWTs)]

It seems that many people are working on interesting research & development on renewable energy sources. Jennifer Dionne spoke about “upconversion” in solar cells, which involves thermodynamic, electronic, and photonic design considerations. The upconversion process improves cell efficiency by at least 1.5× (see Atre & Dionne 2011), and it often works well at near-infrared wavelengths. (She pointed out that of the energy from the sun, 5% is in the UV, 43% in the optical, and 52% in the infrared. And if you’re interested in what those proportions are like for different types of galaxies, check out my recent paper.) Then Chris Somerville spoke about the status and prospects of biofuels, the production of which is currently dominated by the US and Brazil. The combustion of biomass poses challenges for providing low-carbon energy: its footprint depends on tilling of soil, land conversion, fertilizer, transportation, and processing. I’m concerned about deforestation and effects on ecosystems, as well as the effects on food/crop prices (remember the food riots in 2007-2008 and the rising cost of corn/maize?). In my opinion, Somerville didn’t sufficiently address this, though he did argue in favor of miscanthus and other biomass rather than corn. John Dabiri spoke about the advantages of vertical-axis wind turbines (VAWTs, see the figure above) in addition to the ubiquitous horizontal-axis variety. VAWTs have smaller structures and costs and simpler installation logistics, and they are safer for birds and bats as well. Currently only four countries get >10% of their electricity from wind (Spain, Portugal, Ireland, and Denmark, followed by Germany at 9%), but there is plenty of room for improvement.

[Figure: LLNL energy flow chart, 2012]

This flow diagram is pretty nice, and it describes current energy use in the US (presented by Valerie Thomas). And Daniel Kammen, in a paper on the relation between energy use, population density, and suburbanization, showed the spatial distribution of carbon footprints (in units of tCO2e, tonnes of CO2 equivalent, per household).

[Figure: Jones & Kammen 2014, Fig. 1]

Tilman Santarius gave a nice talk about energy-efficiency rebound effects, which is closely related to my previous post, where you can find more information. He discussed the interactions between energy efficiency, labor productivity, human behavior, and economic growth, and he distinguished between rebounds due to an income effect vs. a substitution effect. In any case, average direct rebound effects appear to be around 20-30% (Greening et al. 2000; Sorrell 2007), in addition to a 5-10% indirect rebound. In other words, around 1/3 of the savings due to energy efficiency is lost because of an increase in energy demand. He also talked about the psychology of rebounds, including moral licensing (such as Prius drivers who drive more) and moral leakage (people feeling less responsible). It will be a difficult task to decouple energy demand from economic growth.
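To make the rebound arithmetic concrete, here is a toy calculation using the ranges quoted in the talk (direct rebound ~20-30%, indirect ~5-10%); the function and the midpoint values I plug in are my own illustration, not from the talk.

```python
# Toy rebound-effect arithmetic. The direct (~20-30%) and indirect
# (~5-10%) rebound ranges come from the talk; everything else here is
# an illustrative sketch.
def realized_savings_fraction(direct_rebound, indirect_rebound):
    """Fraction of the engineering energy savings actually realized."""
    return 1.0 - (direct_rebound + indirect_rebound)

# Midpoints: 25% direct + 7.5% indirect -> roughly 1/3 of savings lost,
# i.e. about 2/3 of the engineering savings are actually realized.
print(realized_savings_fraction(0.25, 0.075))
```

With the midpoint numbers, about a third of the savings evaporate, matching the “around 1/3” figure above.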

There were many other interesting talks, but I’ll end with the issue of climate adaptation and geoengineering. Ann Kinzig described how the combined risk of a phenomenon is the sum over events of p(event) × impact(event). Mitigation seeks to reduce the probability p, while adaptation seeks to reduce the impact. Climate change will have impacts on food, water, ecosystems, and weather events, and decision-makers in urban areas can try to prepare for these (see this website). Kinzig also spoke about historical case studies of failed adaptations by people in the Hohokam (Arizona), Mesa Verde (Colorado), and Mimbres (New Mexico) regions, and the dependence on societal hierarchy and conformity. Alan Robock spoke about the risks and benefits of “geoengineering”: gigantic future projects to address climate change, such as space-based reflectors, stratospheric aerosols, and cloud brightening (seeding clouds), which basically involve using the Earth as a science experiment with a huge cost of failure. In particular, he studies the many problems of injecting sulfate aerosols into the stratosphere to cool the planet. (Some people have supported this idea because of the supposedly benign effects of volcanic eruptions in the past.) He discussed the potential benefits of stratospheric geoengineering but compiled a list of 17 risks, including drought in Africa and Asia, continued ocean acidification, ozone depletion, no more blue skies, military use of the technology, ruining terrestrial optical astronomy, moral issues, and unexpected consequences. For more on Robock’s research and for other useful references, go here.
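Kinzig’s risk framing can be sketched in a few lines: combined risk is the expected impact summed over events, mitigation shrinks the probabilities, and adaptation shrinks the impacts. The event list and all the numbers below are hypothetical, purely for illustration.

```python
# Minimal sketch of combined risk = sum over events of p * impact.
# These events and values are made up for illustration only.
events = [
    {"name": "severe drought", "p": 0.25,  "impact": 100.0},
    {"name": "coastal flood",  "p": 0.125, "impact": 400.0},
    {"name": "heat wave",      "p": 0.5,   "impact": 50.0},
]

def combined_risk(events):
    return sum(e["p"] * e["impact"] for e in events)

# Mitigation would lower each p; adaptation lowers each impact.
# Halving every impact halves the combined risk.
baseline = combined_risk(events)                                        # 100.0
adapted = combined_risk([{**e, "impact": e["impact"] / 2} for e in events])
print(baseline, adapted)  # 100.0 50.0
```

The symmetry of the formula is the point: a policy that halves every probability buys exactly as much as one that halves every impact.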

[Figure: from Robock’s talk]

An introduction to “space security”

I’m curious about what people refer to as “space security”, as well as space policy and sustainability, and if you’re interested, you can learn with me. This post will just be an introduction to some of the issues involved. Note that I’m not an expert on many of these issues, so take my comments and thoughts with a grain of salt.


The idea of “space security” might conjure images of invading aliens, but as much fun as that is, that’s not what I’m talking about. I’m also not planning on talking about killer asteroids and dangerous radiation, though these are much less far-fetched. For example, the Pan-STARRS survey (of which I was briefly a member a few years ago) received funding from NASA to assess the threat to the planet from Near Earth Objects, some of which pass closer to us than the moon. (A limitation of Pan-STARRS, however, was that in images that happened to contain passing satellites, software was applied to black out or blur the pixels in the region.) On the other hand, solar flares can produce “coronal mass ejections” and intense cosmic rays that could be hazardous to spacecraft, though on Earth we’re somewhat protected by our atmosphere and magnetosphere. This and other forms of “space weather” could be the subject of another post later.

I’d like to talk about the issue of satellites, as well as weapons and reactors, in space. More than 5,000 satellites have been launched into orbit, and about 1,000 are in operation today. Destroying a satellite, or a collision between satellites, can damage the space environment by creating dangerous amounts of debris. (If you’ve seen the Oscar-winning Gravity, then you know that debris from satellites can be a serious problem.) For example, in a demonstration of an anti-satellite weapon in 2007, China destroyed one of its own satellites; the resulting “space junk” then struck and destroyed a small Russian satellite last year. The following computer-generated images of the growing number of objects in low-earth orbit (courtesy of the NASA Orbital Debris Program Office) illustrate the problem. Only 5% of the objects are satellites; the rest are debris. Currently more than 21,000 pieces of debris larger than 10 cm are being tracked, and there are as many as 500,000 additional untracked pieces larger than 1 cm.

[Figure: satellites and orbital debris in low-earth orbit]

In addition, the loss of an important satellite could create or escalate a conflict, especially during a time of tension between states. The US and other countries possess “anti-satellite” weapons (ASATs) and have, or are considering, space-based missile defense systems. Attacks on satellites are a very real possibility, and it is important to be wary of the destabilizing effects and potential for proliferation of such weapons. Moreover, since the Cold War, the US and other governments have considered deploying nuclear reactors on spacecraft, which have proven controversial (such as the dubiously named Project Prometheus, which was cancelled in 2006); an intentionally or unintentionally damaged nuclear reactor in space could have major consequences.

Considering that we are increasingly dependent on satellites and that there are military, commercial, and civil interests in space, how can we attempt to ensure space security and sustainability in the future? In the US, the Obama administration released a National Space Policy in June 2010. The policy mainly aims to: (1) limit further pollution of the space environment; (2) keep objects from colliding with each other and/or exploding; and (3) actively remove high-risk space debris. The policy is a good start, but much more could be done. An emphasis on international cooperation rather than unilateral action would help; space debris is clearly a global problem requiring global solutions. It is also important to negotiate on the control of space weapons. The US and other space powers should declare that they will not intentionally damage or disable satellites operating in accordance with the Outer Space Treaty and that they will not be the first to station weapons in space. Moreover, “space situational awareness” (SSA), which allows for the coordination of space traffic, can be improved in collaboration with other countries, and satellites can be made less vulnerable to collision or attack. Finally, the US should play an active role in negotiations with the international community on space security and sustainability. The United Nations Committee on the Peaceful Uses of Outer Space (COPUOS), with 76 member states, has been working on a variety of programs to improve the long-term sustainability of space activities and, in particular, to develop and adopt international standards to minimize space debris.

The Future of Fracking in California

I attended an interesting forum at UC San Diego on Thursday, and this post is based on that. It was titled, “The Future of Fracking in California: Energy, Environment and Economics,” and the speakers included: Taiga Takahashi, Associate in the San Diego office of Latham & Watkins; Mark Ellis, Chief of Corporate Strategy for San Diego-based Sempra Energy; and Andrew Rosenberg, Director of the Center for Science and Democracy at the Union of Concerned Scientists. I’ll just summarize some of the more important points people made (based on my incomplete notes), and you can decide what you think of them.

[Figure: UCS fracking report, Fig. 1]

Taiga Takahashi described the legal situation in California vis-à-vis hydraulic fracturing (fracking). Governor Jerry Brown supports “science-based fracking” that is protective of the environment. Brown also touts the economic benefits, including the creation of 2.8 million jobs (though this figure was disputed). In contrast, the CA Democratic party supports a moratorium on fracking. The bill SB 4 on well stimulation, passed in September, requires the state Department of Oil, Gas & Geothermal Resources (DOGGR) to adopt regulations regarding water well testing and other tests of air and water pollution. New regulations will be developed by January 2015, while an environmental impact study will be completed six months afterward (my emphasis). California’s fracking restrictions are mostly similar to those in Colorado and much stricter than those in Pennsylvania. Takahashi argued that a “consensus approach” on fracking regulation in CA could be reached, which would include nongovernmental organizations (NGOs), the state, and industry.

Mark Ellis is a representative of industry. Sempra Energy is a major natural gas utility that owns San Diego Gas & Electric and Southern California Gas. Ellis argued that the “shale revolution” (his term) has made gas cheap relative to oil and thereby reduced energy prices. Gas is used mostly for power generation, since many utilities are switching from coal to gas, as well as in industry and residences. There are also opportunities for using gas in transportation, such as compressed or liquefied natural gas (LNG). Sempra is expanding production and building pipelines from Texas and Arizona to Mexico. Ellis argued that the “shale revolution” is being, or could be, replicated in other places, such as the UK, Australia, Brazil, and Russia.

Andrew Rosenberg spoke about a couple of recent Union of Concerned Scientists (UCS) reports: “The Curious Case of Fracking: Engaging to Empower Citizens with Information” and “Toward an Evidence-Based Fracking Debate,” written by Pallavi Phartiyal, Rosenberg, and others. He brought up many issues, such as the use of pipeline infrastructure vs. trains and the relation between fracking, chemical plants, and oil. Importantly, fracking is a many-step process (as you can see in the figure at the top of this post), which includes water acquisition, chemical transport and mixing, well drilling and injection, a wastewater pit, onsite fuel processing and pipelines, nearby community residences and residential water wells, and waste transport and wastewater injection. The most important point he made is that we as a society must decide when particular actions are worth the risks, and to what extent those risks can be mitigated with regulations. There should be as much transparency as possible and plenty of opportunities for public comment. It’s important to close loopholes in federal environmental legislation; disclose the chemical composition, volume, and concentration of fracking fluids and wastewater; establish baseline testing and monitoring requirements for air, water, and soil quality; make data publicly accessible; and engage citizens and address their concerns. (My views were mostly in agreement with Rosenberg’s. Full disclosure: I am an active member of UCS.)

After the speakers, there were a few comments and questions. I was surprised that this was the only time during the forum that climate change issues were raised. The issue of water usage was discussed as well, because of our ongoing drought. (In related news, Gov. Brown and the state Legislature just passed a drought relief package.) It also was clear that Sempra and other companies wouldn’t voluntarily make changes unless industry-wide regulations were applied; Ellis argued that singling out particular companies is counter-productive. It’s possible that there will be new Environmental Protection Agency (EPA) regulations on water and air pollution in the future.

The fracking debates in California continue. For example, the Los Angeles City Council is taking steps toward a fracking ban, and a rally against fracking is being organized at the Capitol in Sacramento in two weeks.

Publish or Perish?

I’d like to add a short post about writing and publishing papers. The phrase “publish or perish” is commonly heard because there is some truth to it. According to Wikipedia, the phrase dates back to the 1930s and 1940s.

There are some advantages to the pressure to publish. It encourages scientists to write up and publish their research, making them and their work more widely known, including among their peers. As scientists, we enjoy and are excited about working on research and publishing interesting new results, and putting the papers out helps to advance the field.

Evaluating scientists for academic posts (or for research grants) can be a difficult and time-consuming process. That’s unavoidable, especially when numerous people apply for small numbers of jobs and grants. One clearly quantifiable metric by which academics are judged is the number of papers they’ve published (and another is the size of the grants they draw in). As pointed out in this recent New Yorker blog post, both the amount and style of academic writing are related to the constant pressure to publish and the tough academic job market. In addition, the job market now appears two-tiered, with part-time and adjunct faculty working long hours for lower pay (see these NY Times and Salon articles). I plan to talk about this issue further in another post.

There are some major disadvantages to the publish-or-perish culture. One problem is that it doesn’t leave time for long-term or risky research on controversial topics. It also doesn’t allow for exploring new ideas, issues, or collaborations that might not pan out and result in something publishable. Yet these things are good for scientists and good for science, and when they succeed, there are huge rewards and discoveries. For example, Peter Higgs, the Nobel prize-winning physicist who predicted the Higgs boson, says that no university would employ him in today’s academic system because he would not be considered “productive” enough. The publish-or-perish culture is dominant not just in the physical sciences, but also in the social sciences, humanities, and law.

A related issue is that of “luxury” journals like Nature, Science, and Cell. According to the Guardian, Randy Schekman, the Nobel prize-winning biologist, is no longer publishing in these journals because he believes they distort the scientific process. He writes: “A paper can become highly cited because it is good science – or because it is eye-catching, provocative, or wrong.” These journals have become brand names, and they prioritize publishing provocative results–perhaps before they’ve been sufficiently tested and vetted by editors, peer reviewers, or the authors themselves. The result is that these journals have a reputation for publishing results that are often wrong. Scientists know this, yet publishing in them still carries prestige.

In addition to publishing papers and books, scientists work on other important things that should be valued too. Key among them is teaching, of course, as well as participating in outreach programs, mentoring students, communicating with journalists and policy-makers, and other academic service to the community. These activities are not as easily quantifiable as scientists’ publications, but we should make the effort to recognize this work.

More from the AAAS meeting

The second half of the AAAS meeting in Chicago was interesting too. (I wrote about the first half in my previous post.)

[Image: Alan Alda]

Probably the best and most popular event at the meeting was Alan Alda’s presentation. You’ll know Alan Alda as the actor from M*A*S*H (and more recently, 30 Rock), but he’s also a visiting professor at the Alan Alda Center for Communicating Science at Stony Brook University. He gave an inspiring talk to a few thousand people about how to communicate science clearly and effectively in a way that people can understand. He talked about how one should avoid or be careful about using jargon. Interaction with the audience is important, and one can do that by telling a personalized story (with a hero, a goal, and an obstacle, which develops an emotional connection), or by engaging with the audience so that they become participants. It’s also important to communicate what is most interesting or exciting or curiosity-piquing about the science, but in the end, the words you use don’t matter as much as your body language and tone of voice. It’s also good to develop improvisation skills, so that when a particular explanation or analogy doesn’t appear to work well with the audience, you can adapt to the situation. He referred to the “curse of knowledge”: as scientists, we forget what it’s like not to be experts in our particular field of research. That can be an obstacle when interacting with most segments of the public, with Congress members and other politicians (most of whom aren’t scientists or don’t have the time to become familiar with the science), and even with scientists in other fields. Most of all, one needs to be clear, engaged, and connected with one’s audience. Finally, Alda told us about the “Flame Challenge”: challenging scientists to explain flames and other concepts in a way that 11-year-olds can understand. (The kids are also the judges of the competition.) If the video of Alda’s talk becomes available online, I’ll link to it here for you.

I attended an interesting session on climate change and whether/how it’s possible to reduce greenhouse gas emissions from energy by 80% by 2050. As pointed out by the chair, Jane Long (who is one of the authors of this report), our energy needs will likely double or even triple by then, while we must simultaneously be reducing carbon emissions. Peter Loftus discussed this issue as well, and showed that primary energy demand as well as energy intensity (energy used per unit GDP) have been rapidly increasing over the past twenty years, partly due to China. But to obtain substantial carbon reductions, the intensity needs to drop faster than it has over the past 40 years! We need to massively add to power generation capacity (10 times more rapidly than our previous rates), and it might not be feasible to exclude both nuclear and “carbon capture” in the process. Karen Palmer gave an interesting talk about the importance of energy efficiency as part of the solution, but she said that one problem is that it’s still hard to evaluate which policies best promote energy efficiency and, ultimately, energy savings and carbon emission reductions. Richard Lester made strong arguments for the need for nuclear power, since renewables might not be up to the task of meeting rising energy demands in the near future. This was disputed by Mark Jacobson, who pointed out that nuclear power produces 9-25 times more pollution per kWh than wind (due to mining and refining) and that a nuclear plant takes longer to construct than the 2-5 years it takes to build wind or solar farms. Jacobson also discussed state-by-state plans: California benefits from many solar devices, for example, while some places in the northeast could use offshore wind farms. In addition, such offshore arrays could withstand and dissipate hurricanes (depending on their strength), and WWS (wind, water, solar) could generate about 1.5 million new jobs in the U.S. in construction alone.
Different countries have very different economic situations and carbon footprints, so different solutions may be needed.

[Figure: per-capita CO2 emissions by country]

I caught part of a session on “citizen science” (see my previous post). Chris Lintott spoke about the history of citizen science and about how the internet has allowed for unprecedented growth and breadth of projects, including the numerous Zooniverse projects. Caren Cooper discussed social benefits of citizen science, and Carsten Østerlund discussed what motivates the citizen scientists themselves and how they learn as they participate. Lastly, Stuart Lynn spoke about how the next generation of citizen science systems can be developed, so that they can accommodate larger communities and larger amounts of data and so that people can classify billions of galaxies with the upcoming Large Synoptic Survey Telescope, for example.

Finally, there was another interesting session on how scientists can work with Congress and on the challenges they face, but more on that later…

Reporting from the American Association for the Advancement of Science (AAAS) meeting

I’d like to tell you about the AAAS meeting I’m attending. (Look here for the program.) It’s in Chicago, which is definitely much colder than southern California! I know it might sound strange, but it’s nice to experience a real winter again.

There were some astrophysics sessions (such as on galaxy evolution in the early universe and dark matter particles) but that wasn’t my focus here. I took some brief notes, and this is based on them…

There were a few sessions about science communication, outreach, and media. These are very important topics: for example, according to Rabiah Mayas, the best indicator of whether people participate in science or become scientists as adults is the extent to which they engaged in science-related activities outside of school as kids. One speaker discussed the importance of fact-checking for producing high-quality and robust science writing, though it takes time; peer review in scientific research is supposed to serve a similar purpose, and it can be time-consuming as well. In any case, many people agreed that scientists and journalists need to interact better and more frequently. (As a side note, I heard two high-profile science journalists mispronounce “arXiv”, which is pronounced exactly like “archive”.) In addition, it’s worth noting that smaller regional newspapers often don’t have dedicated science desks, though this could provide opportunities for young writers to contribute. Danielle Lee also gave an excellent talk, “Raising STEM Awareness Among Under-Served and Under-Represented Audiences,” in which she discussed ways to take advantage of social media.

There were interesting presentations about scientists’ role in policy-making, but I’ll get back to that later. Someone made an important point that scientists should be extremely clear about when they are just trying to provide information versus when they are presenting (or advocating) policy options. I should be clearer about that myself.

I also saw interesting talks about public opinion surveys, in the U.S. and internationally, of knowledge and opinions of science and technology. According to these polls, although some Americans are worried about global warming/climate change, people are more worried about toxic waste and water and air pollution. According to Lydia Saad (of Gallup), 58% of Americans worry a “great deal” or “fair amount” about global warming, 57% think human activities are the main cause, and 56% think it’s possible to take action to slow its effects, while only 34% think it will affect them or their way of life. In addition, she and Cary Funk (of Pew) found huge partisan gaps between self-identified Democrats, Independents, and Republicans. As one person pointed out, climate change is not just a science issue but has become a political one. Americans in these polls had pretty high opinions of scientists, engineers, and medical doctors, but the best views of those in the military. There is a wide range of knowledge of science, especially when it comes to issues such as evolution. (Note that fewer Republicans believe in evolution by natural processes, due to a drop among non-evangelical Republicans; evangelicals already had a low fraction.) Also note that the numbers depend on how poll questions are asked: for example, ~40% agree with “The universe began with a huge explosion”, but when you add “according to astronomers”, the proportion jumps to 60%. (If you’re curious, this image basically describes astronomers’ current view of the Big Bang.)

[Figure: astronomers’ current view of the Big Bang]

There was an interesting session dedicated to climate change science, which included scientists who contributed to the IPCC’s recent 5th Assessment Report (which we talked about in an earlier blog post). Note the calibrated language they’re required to use to quantify their un/certainty: “virtually certain” means 99% certain, and then there’s “very likely” (90%), “likely” (67%), and “more likely than not” (>50%). Michael Wehner discussed applications of “extreme value statistics” (which are sometimes used to analyze extremely luminous galaxies or large structures in astronomy: see this and this) to extreme temperatures. Extremely cold days will be less cold, while extremely hot days will be more common and hotter. For a particular extreme weather event, one can’t say whether it was due to climate change, but one can ask “How has the risk of this event changed because of climate change?” or “How did climate change affect the magnitude of this event?” It seems very likely that heavy rainy days will get heavier, dry seasons longer, and the stretches of consecutive dry days between precipitation events longer too. There will be more droughts in the west (west of the Rockies) and southeast, and more floods in the midwest and northeast.
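The calibrated likelihood language above maps cleanly to probability thresholds, which a small lookup makes explicit; the helper function is my own illustration, not anything from the session.

```python
# The IPCC's calibrated likelihood terms quoted in the session, mapped
# to their (approximate) minimum probabilities, ordered strongest first.
IPCC_LIKELIHOOD = {
    "virtually certain":    0.99,
    "very likely":          0.90,
    "likely":               0.67,
    "more likely than not": 0.50,
}

def strongest_term(p):
    """Return the strongest calibrated term whose threshold p meets."""
    for term, threshold in IPCC_LIKELIHOOD.items():  # insertion order
        if p >= threshold:
            return term
    return "about as likely as not"

print(strongest_term(0.95))  # very likely
```

So a 95%-confident finding gets reported as “very likely”, not “virtually certain”: the strongest claim a scientist may make is capped by the threshold actually met.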

The plenary speaker today was Steven Chu, who was Secretary of Energy until last year, and he gave an excellent talk. He compared convincing people about climate change to earlier campaigns to convince people about the dangers of tobacco use and its connection to lung cancer; both issues have been targets of industry-promoted disinformation. On rising temperatures with climate change, he channeled Yogi Berra: “If we don’t change direction, we’ll end up where we’re heading.” He talked a little about the role of natural gas (see also these NYT and Science blogs), and he discussed carbon capture, utilization, and sequestration (CCUS). Finally, he talked about how one might determine an appropriate price for carbon. He advocated a revenue-neutral tax, starting at $10/ton and rising over ~12 years to $50/ton, with the money raised given directly back to the public. He also talked about wind turbines, which are now more reliable, efficient, and bigger, and he predicted a 20-30% decline in their price over the next 10-15 years. The cost of solar photovoltaic (PV) modules is also dropping, but installation costs and licensing fees (“soft costs”) should be reduced. I definitely had the impression that, now that Chu is no longer Energy Secretary, he could be more frank than before about his views on contentious issues.
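Chu’s carbon-tax ramp is easy to sketch numerically. The $10/ton starting price, $50/ton endpoint, and ~12-year horizon are from his talk; the linear shape of the ramp is my assumption, since he didn’t specify a schedule.

```python
# A sketch of the revenue-neutral carbon-tax ramp Chu described:
# $10/ton rising to $50/ton over ~12 years. The linear interpolation
# is an assumption for illustration; the talk gave only the endpoints.
def carbon_price(year, start=10.0, end=50.0, ramp_years=12):
    """Price in $/ton CO2 in a given year of the ramp."""
    if year >= ramp_years:
        return end  # price holds at the endpoint after the ramp
    return start + (end - start) * year / ramp_years

print([round(carbon_price(y), 1) for y in (0, 4, 8, 12)])
# [10.0, 23.3, 36.7, 50.0]
```

Under a revenue-neutral design, the price schedule sets the incentive while the collected revenue is returned to households rather than kept by the government.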

Jevons paradox: a problem for energy efficiency?

I’d like to discuss an issue that probably isn’t sufficiently studied or addressed. If the Jevons paradox is relevant for today’s energy consumption and efficiency problems (or to other resources), then it is certainly worth further investigation.

The “Jevons paradox” (which I briefly mentioned in a previous post) is the idea that improved energy (and other material-resource) efficiency ultimately tends to lead not to conservation but to increased consumption. It’s named after the English economist William Stanley Jevons, who in his book The Coal Question (1865) observed that the coal-powered steam engine made coal a more cost-effective power source, leading to the increased use of the steam engine in a wide range of industries. This in turn increased total coal consumption and depleted reserves, even as the amount of coal required for any particular application fell. He argued that increased efficiency in the use of coal as an energy source only generated increased demand for that resource, not decreased demand, as one might expect. This was because improvement in efficiency led to further economic expansion.

Jevons was wrong about a few points: he failed to foresee the development of energy substitutes for coal, such as petroleum and hydroelectric power, or that coal supplies would take a long time to be exhausted. But the “Jevons paradox” appears to be a very important insight that has lately become a popular issue again (see for example, this New Yorker piece). Some consider it an extreme version of the “rebound effect”: a rebound of more than 100% of the “engineering savings,” resulting in an increase rather than a decrease in the consumption of a given resource. In other words, the savings from efficiency are used for additional consumption, cancelling out the savings. In light of recent efforts to improve energy efficiency, the significance of Jevons paradox effects has understandably generated considerable debate (such as here and here).
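To make the rebound arithmetic concrete, here is a minimal sketch (all numbers are hypothetical, not from any study): if an efficiency improvement would cut a resource’s use by some “engineering savings,” but behavior rebounds by some fraction of that, the net change is savings × (1 − rebound). A rebound above 100% flips the sign, which is the Jevons case.

```python
# Illustrative rebound-effect arithmetic; all numbers are hypothetical.

def net_consumption(baseline, engineering_savings, rebound):
    """New consumption after an efficiency improvement.

    baseline: consumption before the improvement (any units)
    engineering_savings: reduction predicted from engineering alone
    rebound: fraction of those savings eaten by increased use
             (> 1.0 is the Jevons-paradox case)
    """
    return baseline - engineering_savings * (1.0 - rebound)

# 100 units of fuel, a 20-unit engineering saving:
print(net_consumption(100, 20, 0.0))  # 80.0 -> full savings realized
print(net_consumption(100, 20, 0.5))  # 90.0 -> half the savings lost
print(net_consumption(100, 20, 1.3))  # ~106 -> consumption rises (Jevons)
```

The empirical debate is essentially about how large the rebound fraction actually is in different sectors.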

If the Jevons paradox and rebound effect are real, then this has important policy implications. It does not mean that efforts for improving efficiency in homes, businesses, and vehicles are wasted, but it does mean that those efforts by themselves won’t reduce energy consumption and carbon emissions. Major environmental problems like climate change cannot be solved by purely technological methods in a “free market”. In addition, cap-and-trade systems might not be as successful as hoped in terms of reducing emissions (for example, see this critique of California’s cap-and-trade program, which allows carbon offsets). There is no simple solution, but energy efficiency goals should be combined with other strategies and policies for reducing demand for fossil fuels, such as a carbon tax, requiring utilities to generate a higher fraction of their electricity from renewables, requiring automakers to increase fuel economy standards, etc. Some of these would be less popular with particular industries because they would represent a bigger break from business-as-usual, but business-as-usual is clearly worsening climate change (and other resource-related environmental problems, such as those involving water resources and pollution). Any effective strategy must cut carbon emissions deeply enough to avoid the worst effects of climate change, which means at least 80% below 2000 levels by 2050. The transition to a low-carbon economy will be a difficult one.
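As a rough sketch of what the 80%-by-2050 target implies, assume (hypothetically) that current emissions are still near 2000 levels and that cuts compound at a constant annual rate starting now:

```python
# Rough arithmetic for an 80% cut below 2000 levels by 2050, assuming
# (hypothetically) that current emissions are near 2000 levels and that
# reductions compound at a constant annual rate starting in 2014.

def required_annual_cut(target_fraction=0.20, start_year=2014, end_year=2050):
    """Constant annual reduction rate needed to reach target_fraction
    of today's emissions by end_year."""
    years = end_year - start_year
    return 1.0 - target_fraction ** (1.0 / years)

rate = required_annual_cut()
print(f"{rate:.1%}")  # roughly 4-5% per year, sustained for decades
```

Even under these simplified assumptions, the required pace of sustained cuts is far faster than anything achieved so far, which underscores why efficiency gains alone are unlikely to be enough.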

Citizen Science: a tool for education and outreach

I’ll write about a different kind of topic today. “Citizen science” is a relatively new term though the activity itself is not so new. One definition of citizen science is “the systematic collection and analysis of data; development of technology; testing of natural phenomena; and the dissemination of these activities by researchers on a primarily avocational basis.” It involves public participation and engagement in scientific research in a way that educates the participants, makes the research more democratic, and makes it possible to perform tasks that a small number of researchers could not accomplish alone. Volunteers simply need access to a computer (or smartphone) and an internet connection to become involved and assist scientific research.

[Image: example of a face-on spiral galaxy]

Citizen science was popularized a few years ago by Galaxy Zoo, which involved visually classifying hundreds of thousands of galaxies into spirals, ellipticals, mergers, and finer classifications using the classification tree below. (I am a member of the Galaxy Zoo collaboration and have published a few papers with them.) As a result of “crowdsourcing” the work of more than 100,000 volunteers around the world, new scientific research can be done that was not previously possible with such large datasets, including studies of the handedness of spiral galaxies, analyses of the environmental dependence of barred galaxies, and the identification of rare objects such as a quasar light echo that was dubbed “Hanny’s Voorwerp”. Other citizen science projects include mapping the moon, mapping air pollution, counting birds with birdwatchers, classifying a variety of insects, and many other projects.
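A core step in projects like Galaxy Zoo is aggregating many independent volunteer classifications of the same object into a consensus. The simplest approach is a majority vote, sketched below; the actual Galaxy Zoo pipeline uses more sophisticated volunteer weighting and consistency checks, so this is only illustrative:

```python
from collections import Counter

# Minimal sketch of aggregating volunteer classifications by majority vote.
# Real projects like Galaxy Zoo weight volunteers by reliability and apply
# consistency checks; this illustrative version just takes the most common
# label and reports the vote fraction as a rough confidence.

def consensus(classifications):
    """Return (winning label, fraction of votes) for one object."""
    counts = Counter(classifications)
    label, votes = counts.most_common(1)[0]
    return label, votes / len(classifications)

votes = ["spiral", "spiral", "elliptical", "spiral", "merger"]
print(consensus(votes))  # ('spiral', 0.6)
```

With dozens of independent classifications per galaxy, even this simple scheme becomes quite robust, which is what makes crowdsourced classification scientifically useful.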

[Figure 1 from Willett et al. (2013): the Galaxy Zoo classification tree]

Citizen scientists have many motivations, but it appears that the primary one is the desire to make a contribution to scientific research (see this paper). In the process, by bringing together professional scientists and members of the general public and facilitating interactions between them, citizen science projects are important for outreach purposes, not just for research. In addition, by encouraging people to see a variety of images or photographs and to learn about how the research is done, citizen science is useful for education as well. Many valuable educational tools have been produced (such as by the Zooniverse projects). Citizen science projects are popular and proliferating because they give people at home or in the classroom the opportunity to become actively involved in science. They have other advantages too, including raising awareness and stimulating interest in particular issues. Citizen science is continuing to evolve, and in the era of “big data” and social media, it has much potential and room for improvement.

Scientific Integrity

In this blog post, let’s discuss scientific integrity–specifically, efforts to keep scientific research as independent as possible from political, corporate, or other influence. Such influences matter for a variety of policies including energy policy (especially related to climate change), health and drugs, food and nutrition, education, etc., when particular companies or organizations have a financial or other stake in the outcome. For example, fossil fuel companies support the “denial industry”, claiming that the science of global warming is inconclusive, agribusinesses promote genetically modified crops, and drug companies promote antidepressant and ADHD drugs, while funding scientific research that often supports their campaigns.

Science informs political officials and agencies when they’re designing regulations for air and water pollution, when determining whether a particular drug is safe and efficacious, when assessing whether particular foods or products are safe for consumers, etc. In my opinion, science can rarely be completely “objective” and “unbiased”; scientists are humans, after all, and they have their own motivations and considerations that can affect their work. The important thing, however, is to reduce political and commercial influence as much as possible so that scientists can do their research and then present their results as clearly and accurately as possible.

In all fields of science, scientists to some extent are affected by funding constraints and grant agencies. These constraints can affect exactly what is studied, how it is researched, and how the results are presented in the media and to the public. Nonetheless, scientific research is particularly important–and susceptible to more outside influences–when it is related to public policy, including the topics above. In addition, politically-related work in the social sciences, especially economics, can be contentious as well.

In the US under the Bush administration, many felt that scientists were under attack. For example, a “revolving door” appeared to be in place when former lobbyists and spokespeople for industries later worked at agencies tasked with regulating their former industries; in particular cases, they appeared to write or advocate for policy shifts that benefited these industries. In 2004, the Union of Concerned Scientists (UCS) released a report, “Scientific Integrity in Policymaking: An Investigation into the Bush Administration’s Misuse of Science”, claiming that the White House censored and suppressed reports by its own scientists, stacked advisory committees, and disbanded government panels. There later appeared to be political influence on the Food and Drug Administration (FDA), on researchers working on embryonic stem cells, on sex education (because of arguments about the effectiveness of abstinence-based programs), and on the teaching of biological evolution.

Although the Obama administration appears to have more respect for science and scientists (see this 2013 UCS report), the politicization of some scientific work continues. The assessment of the social and environmental impact of the Keystone XL pipeline may be such an example. The final environmental impact statement, which was released by the State Department yesterday, appears to endorse the pipeline, but the interpretation is unclear (see this coverage in the Wall Street Journal and Scientific American blog).

In any case, these contentious situations will be easier when government agencies have explicit policies for scientific integrity and when the affiliations and employment histories of officials are transparent. It’s also important to keep in mind that the struggle for independent and transparent science never ends. Scientists should always try to be as clear as possible about their views or beliefs when they are relevant to their work (see this NYT blog for useful advice), and results and data should be made publicly available whenever possible.

US Energy Policy (part 2)

Since President Obama will deliver his State of the Union address on Tuesday, I’d like to write a bit more about energy policy, which may come up during the address in the context of the Climate Action Plan that was initiated last summer (when the picture below was taken). In addition, some of the energy policies being advocated would create new jobs, especially in the manufacturing and government sectors, where employment hasn’t improved much yet during the recovery from the economic recession.

[Image: President Obama announcing the Climate Action Plan]

The president can call on Congress to do its part to pass laws that will complement his Climate Action Plan. Some of the recommendations below would be difficult to achieve in the current political climate, but it’s important to at least demonstrate the political will and public commitment to improve energy and climate policies.

1. President Obama could urge Congress to extend tax incentives for renewable energy technologies, in particular for solar electricity and wind power; the incentives for wind power have already expired. These could at least be extended to 2020. This may be politically feasible, considering that some conservatives are now in support of renewable energy. These sources are also popular: wind and solar generation increased nearly four-fold in the US over the past five years, and nine states currently generate 10% or more of their electricity from wind and solar power. The technology already exists to have dynamic electricity grids that are designed to handle variability in supply (such as due to unexpected weather) and demand, making it possible to transition to an increasing reliance on renewables and less on fossil fuels. (See this report for more info.)

2. President Obama could lay the groundwork for eventually rejecting the Keystone XL pipeline (see also our earlier post). He said last year that it would be approved “only if this project does not significantly exacerbate the problem of carbon pollution.” We have to wait for a supplemental Environmental Impact Statement before a final decision will be made.

3. The Environmental Protection Agency (EPA) has proposed a carbon pollution standard for new power plants. These limits, which are required under the Clean Air Act, could be applied to existing plants as well. Meeting the reductions outlined in the Climate Action Plan will require 25% cuts in carbon pollution.

4. The president could outline new energy efficiency policies for homes, automobiles, businesses, and industries. For example, the industrial sector is responsible for about 1/3 of all U.S. energy use. Energy-efficient building designs and investment in high-efficiency combined heat and power systems can reduce these energy demands. For cars and trucks, Corporate Average Fuel Economy (CAFE) standards should be enforced by the EPA and Department of Transportation. In addition, a June 2012 study by the Blue Green Alliance finds that the new round of CAFE standards will create an estimated 570,000 full-time jobs throughout the US economy by 2030. The president could also urge Congress to expand the investment in public transportation infrastructure that was begun in the Recovery Act; this too would create thousands of new jobs.