My Surprising and Exciting Journey from Scientist to Science Writer

I’ve been drawn to science since I was a kid. I had many excellent and creative teachers along the way, including one who taught us to be more observant and to think critically, and another who smashed bowling balls into desks and ran into a wall (wearing pads and a helmet, of course) to demonstrate the conservation of momentum. I grew up in Colorado, and I enjoyed gaping at the Milky Way and the beautiful night sky while in the Rockies, even if I couldn’t name many constellations. Carl Sagan’s Cosmos series and the Star Trek TV shows also inspired me to explore astrophysics later in life.

Milky Way over Great Sand Dunes National Park, Colorado. (Photo by Carl Fredrickson)

But my head isn’t always in the stars. I have many other interests too, including sociology, political science and the philosophy of science, and I’ve always enjoyed literature and poetry. I’m not interested only in doing science and analyzing datasets and phenomena; that, by itself, is not enough. I also want to use science and critical thinking to help people and connect with them. Since science plays such an important role in human society, I’d like to communicate scientists’ research and debates and the scientific process as well as I can. The behavior of neutrinos, ice sheets and red pandas might sound interesting, for example, but we always have to ask: why are they important? What do scientists claim to have learned about them, and how did they learn it? What are the broader implications and context for the research?

Ever the lifelong student, a couple of years ago I thought I might become an absent-minded, nerdy, activist professor, perhaps widening my scope beyond astronomy and physics into interdisciplinary research and public outreach. But then I realized that I wanted to do more. I examined many interconnections between science and policy—often posting about them on this blog—and I investigated ways I could use and develop my science writing skills. I earned fellowship opportunities in both science writing and science policy, and I considered going in both directions. As the head of our astrophysics and space sciences department told me while I mulled over the options, “Those aren’t actually that different. They both involve communicating science to people who might not understand it well.”

In the end, after fifteen years working as a Ph.D. student, teaching and research assistant, postdoctoral researcher, research scientist and lecturer, I decided that I would make the shift and become a science writer! It’s a big step, and I felt a bit nervous about it. Now that I’ve made the decision, I am happy and excited to be trying something new, and I look forward to improving my skills and working on it full-time.

For those of you considering working in science writing or science policy, or for those of you just interested in learning more, I am happy to help. In any case, here are a few suggestions and pieces of advice, which will be particularly relevant for you if you’re coming from a science background as I did.

First, I recommend becoming involved in public outreach and education programs. You may even decide to organize your own events. Just connect to people in whatever ways work well for you, such as speaking in local school classrooms, making demonstrations for students at your university, mentoring prospective students, interacting with members of the public at museums and planetaria, talking to people at cafes and pubs (such as Two Scientists Walk into a Bar, Astronomy on Tap, and other programs), etc.

Second, become more involved in and volunteer for the relevant professional scientific societies, such as the American Astronomical Society, the American Physical Society and the American Geophysical Union. Be more than just a card-carrying member. All of these societies, and especially the American Association for the Advancement of Science (AAAS), offer many useful resources, scholarships and internships.

Third, it is crucially important to talk to a variety of people who work in science writing or science policy (or whatever you might be interested in), get involved and try it yourself. Make sure that you don’t merely like the concept of it but that you actually enjoy and excel at doing it. You will need to make the time to do this. You may find new people in your own college, university or community working in these professions who have much to teach you. Try a variety of media and styles too, possibly including social media, blogs, podcasts, news articles, feature stories, videos, etc. If you’re curious about what I’ve done over the past year or so, look here.

Fourth, check out professional science writing organizations. In particular, I recommend looking up the National Association of Science Writers, the Society of Environmental Journalists and the Association of Health Care Journalists. You might find useful local organizations too (we have the San Diego Press Club here, for example). Science writing workshops, such as those in Santa Fe, New Mexico and in Banff, Alberta, could be beneficial and could introduce you to others who are also just starting to venture into the profession. Finally, if you are interested, the AAAS has mass media and science policy fellowships, and the University of California, Santa Cruz, MIT, NYU and other universities have graduate programs you may consider, though these involve an investment of time and money.

Before diving in, consider the job prospects. Although we have our ideals, we also want to work for a livable salary with sufficient job security. Staff writers, editors, freelancers and public information officers (PIOs) all have pros and cons to their jobs, and it’s important to understand them well.

I’ll make it official: I decided to head to the UC Santa Cruz science communication program, and I’m looking forward to it! In a few days I will be on my way north to Santa Cruz. I plan to try my hand at working with a local newspaper, a magazine and an online news outlet, and this fall I will be working with PIOs at Stanford Engineering. Stay tuned for my new articles!

Coming from a science background, I have many challenging things to learn, but I think I’m up to it. I’m trying to learn to write more creatively and evocatively, and to identify compelling characters. I’m learning to assess which scientific discoveries and developments make for the most intriguing stories. Moreover, scientists and science writers have different ways of thinking, and bridging the gap between them involves more steps than you might think. Perhaps most importantly, after thinking of myself as a scientist for so many years, it’s hard to craft a new identity. It turns out that while I am an astronomer and a physicist, I am many other things too. I’m continuing to explore the universe, just in myriad different ways than before. I’m boldly going where I haven’t gone before, and the sky’s the limit!

8 Ways to Improve the Academic System for Science and Scientists

I’ve enjoyed most of my time working in academic science in the U.S. and Germany as a graduate student, a postdoctoral researcher, a research scientist and a lecturer. I’ve benefited from supportive mentors, talented colleagues and wonderful friends. I think I’ve accomplished a lot in terms of research, teaching, political advocacy and public outreach. Based on my experience and on anecdotal evidence, the system works well in some ways but is flawed in many others, especially those involving the job market and career advancement.

Reflecting on the past fifteen years, here are my current thoughts on problems with the system and ways it could be improved, with a focus on the U.S. and on the physical sciences, though the social sciences and life sciences face similar problems.

1. Let’s be honest: the academic job market is horrible. It was already pretty bad before the recession, and it is worse now. Many scientists move from institution to institution, working through a series of postdocs, fellowships and other short-term jobs while seeking permanent positions or more secure funding, which prove increasingly elusive and competitive. (I have worked in three positions over the nine years since earning my Ph.D.) I’ve seen some tenure-track faculty positions receive well over 400 applications—I don’t envy the hiring committees there—and I’ve seen some grant proposal success rates drop well below 10%.

Note the trends: more and more people with Ph.D.s are going into postdocs or are unemployed. (Credit: NSF, The Atlantic)

This system causes people a lot of stress. From a societal perspective, how well can people work under such pressure and job insecurity, and how much can they accomplish when they must perennially focus on job applications and grant proposals rather than on the work that drew them to their profession? If the scientific community wants to attract the best scientists, then shouldn’t we strive to make their jobs more desirable than they are now, with better pay and security? As Beryl Lieff Benderly wrote in the Pacific Standard, “unless the nation stops…’burning its intellectual capital’ by heedlessly using talented young people as cheap labor, the possibility of drawing the best of them back into careers as scientists will become increasingly remote.” In much the same way, the inadequate job prospects of adjunct faculty make the possibility of attracting and retaining the best teachers similarly remote.

For doctorate recipients who care primarily about salary, their choice is obvious. (Credit: National Science Foundation)

People have been diagnosing these problems for years, but no clear solutions have emerged. In my opinion, the job market situation could be gradually ameliorated if many institutions simultaneously sought to improve it. In particular, I think scientists should have longer-term postdoctoral positions, such as five years rather than one, two or three. I also think faculty should hire fewer graduate students, such as one or two at a time rather than, say, five of them, regardless of how much funding they happen to have at the time.

I also think that colleges, universities, and national labs should allocate funding for more staff positions, though of course that funding has to come from somewhere, and tuition and student debt are already too high. On the other hand, some people argue that university administrations have ballooned too much over the past few decades; others argue that some universities spend too much money on their sports programs. In addition, federal funding for “basic research” (as opposed to applied research) in science should be increased, as such grants often supplement university funding.

Federal funding for non-defense research & development has been pretty flat since the 1980s, except for “sequestration.” (Credit: AAAS, NSF)

2. We can considerably improve the graduate student experience as well. Many university departments and professional societies now give more information about academic career prospects to students than before, and it should be their official policy to do so. Furthermore, students should be encouraged to explore as many of their interests as possible, not just those focused on their narrow field of research. If they want to learn to teach well, or learn about computer programming, software, statistics, policy-making, or the history or philosophy or sociology of their science, or if they want to investigate interdisciplinary connections, or if they want to develop other skills, they should have the time and space to do that. Universities have many excellent resources, and students should have the opportunity to utilize them.

We know that only a fraction of graduate students will continue in academia, and the best scientists will be well-rounded and have a wide range of experience; if they move on to something else, they should be prepared and have the tools and expertise they need.

3. The scientific community can take this an important step further by acknowledging the many roles and variety of activities scientists engage in beyond research: teaching courses, participating in outreach programs, advancing efforts to improve diversity, becoming involved in political advocacy, developing software and instrumentation that don’t necessarily result in publications, etc. Many scientists agree that we do not sufficiently value these kinds of activities even though they are necessary for the vitality and sustainability of the scientific enterprise itself. For example, in a new paper submitted to the Communicating Astronomy with the Public journal, the authors find that many astronomers think a larger fraction of their grant-funded work (up to 10%) should be allocated to education and public outreach (EPO). EPO is included among the “broader impacts” of National Science Foundation grants, but much more can be done in this regard. All of these activities should be explicitly recognized by the relevant federal agencies during the evaluation of grant proposals and by departmental hiring committees when assessing candidates for jobs and promotions.

Distribution of the percentage of their research grants that astronomers currently invest in public outreach and engagement (blue) and the percentage they suggest allocating (yellow). (Credit: Lisa Dang, Pedro Russo)

A corollary follows: if the community appreciates a wider scope of activities as important components of a scientist’s job, then it is not necessary to relentlessly pursue published research papers all of the time. Perhaps this could alleviate the “publish or perish” problem, in which some scientists rush the publication of insufficiently vetted results or make provocative claims that go far beyond what their analysis actually shows. That is, striving for a more open-minded view of scientists’ work could improve the quality and reliability of scientific research.

In practice, how would this be done? Scientists could organize more conferences and meetings specifically devoted to education research, outreach programs, policy developments, etc., and the proceedings should be published online. Another way a scientist’s peers could become aware of the wider scope of her non-research work would be to establish different levels of publication for it, from informal social media and blog posts to peer-reviewed statements and articles posted on online archives or wiki pages. For example, if she participated in an outreach project with local high school students or in Congressional visit days, she could speak or write about the experience and about what worked well with the program, and then publish that presentation or statement.

Furthermore, since research projects can take years and many grueling steps to complete, often by graduate students toiling away in their offices and labs, why not reduce the pressure and recognize the interim work at intermediate stages? Some people are considering publishing a wider scope of research-related work, even including the initial idea phase. A new open-access journal, Research Ideas and Outcomes, aims to do just that. I’m not sure whether it will work, but it’s worth trying, and I hope that scientists will be honorable and cooperative and avoid scooping each other’s ideas.

On that note, as some of you know, I will make it official that I am leaving academic science. (In my next post, I will write about what I am shifting my career toward.) As a result, I will be unable to complete many of my scientific project ideas and papers, and for the few astrophysicist readers of this blog, I will not be annoyed if you run with them (but please give me proper credit). My next four projects probably would have been the following: modeling galaxy catalogs including realistic dynamics within galaxy groups and clusters within dark matter clumps of the “cosmic web”; assessing observational and theoretical problems in the relation between galaxy stellar mass and dark matter halo mass; modeling the mass-morphology relation of galaxies using constraints I previously obtained with the Galaxy Zoo citizen science project; and modeling and analyzing the star formation rate dependence of the spatial distribution of galaxies in the distant cosmic past. I am happy to give more details about any of these ideas.

4. We should also address the problem of academic status inequality. If a person makes it to an elite university, has the opportunity to work with a big-name faculty member or manages to win a prestigious award, grant or fellowship, that is an excellent achievement of which they should be proud. Nevertheless, such a person is essentially endorsed by the establishment and is much more likely to be considered part of an in-crowd, with everyone else struggling in the periphery. In-crowd scientists then often have an easier time obtaining future opportunities, and in a kind of academic capitalism, wealth and capital flow toward this in-crowd at the expense of the scientists in the periphery. On the one hand, the in-crowd scientists have accomplished something, and the community should encourage them to continue their work. On the other hand, scientists are busy people, but they can also be lazy; it’s too easy to give an award to someone who has already received one or to hire someone from another elite institution rather than to assess the merits of the many people with whom they may be less familiar.

According to a recent study in Science Advances, the top ten elite universities produce three times as many future professors as the next ten in the rankings. However, the authors find plenty of evidence that this system does not resemble a meritocracy; in addition, female graduates slip 15% further down the academic hierarchy than men from the same institutions. According to a Slate piece by Joel Warner and Aaron Clauset, a co-author of the paper, the findings suggest that upward career mobility in the world of professors is mostly a myth. Many scientists who were academic outsiders—not from the elite universities—have made important discoveries in the past, but their peers were slow to notice them. As they write, “Thanks to the restrictive nature of the academic system there may be many more innovations that are languishing in obscurity, and they will continue to do so until our universities find a way to apply the principles of diversity they espouse in building student bodies to their hiring practices as well.”

5. As I’ve written before, much more work can be done to improve gender, race, class and other forms of diversity when hiring students, postdocs and faculty and promoting them at universities. Furthermore, when organizing conferences, workshops, meetings and speaker series, diverse committees should explicitly take these principles into consideration. Even the most thorough and attentive committees must also beware of “unconscious bias,” which affects everyone but can be reduced.

6. In a related point, colleges and universities can implement many family-friendly (or more generally, life-friendly) policies to improve and promote work-life balance of academic workers. These include flexible schedules, parental leave, tenure-clock extensions and many others. However, this is not sufficient: scientists who happen to lack the benefits and privileges of white, male, straight people from elite universities seem to have to work that much harder to have a chance of drawing the attention of hiring committees. One should not need to work 100 hours a week to be a successful scientist. Shouldn’t we want more balanced scientists with lives and interests beyond their narrow research field? This means that committees should recognize that sometimes excellent scientists may have fewer yet very high-quality accomplishments and may be under the radar waiting to be “discovered.”

7. The scientific community would also benefit from more opportunities for videoconferencing, in which people remotely present talks and field questions about them. As I’ve written for the American Astronomical Society Sustainability Committee, our biggest source of carbon emissions is frequent travel, and we should try to reduce our carbon “footprint.” Moreover, people at small colleges with small travel budgets and people with families who have a harder time traveling would appreciate this, as it would level the playing field a bit. Of course, there is no substitute for face-to-face interactions, but videoconferencing tools from Skype, Google and many others continue to improve and could be utilized much more extensively.

8. Finally, I argue that everyone would benefit from more and better interactions between scientists, public affairs representatives and government affairs officials at universities. Such interactions would help scientists to present their accomplishments to a wider community, help universities to publicize their scientists’ work, and help political officials to understand the important science being done in their districts, often benefiting from federal and state investment.

These are my current thoughts, and I hope they spark discussions and debates.

Reproducibility in Science: Study Finds Psychology Experiments Fail Replication Test

Scientists toiling away in their laboratories, observatories and offices don’t, as a rule, fabricate data, plagiarize other research or invent questionable conclusions when publishing their work. Engaging in any of these dishonest activities would be like violating a scientific Hippocratic oath. So why do so many scientific studies and papers turn out to be unreliable or flawed?

(Credit: Shutterstock/Lightspring)

In a massive analysis of 100 recently published psychology papers with different research designs and authors, University of Virginia psychologist Brian Nosek and his colleagues find that more than half of them fail replication tests. Only 39% of the psychology experiments could be replicated unambiguously, while those claiming surprising effects or effects that were challenging to replicate were less reproducible. They published their results in the new issue of Science.

Nosek began crowdsourcing the Reproducibility Project in 2012, when he reached out to nearly 300 members of the psychology community. Scientists lead and work on many projects simultaneously for which they receive credit when publishing their own papers, so it takes some sacrifice to take part: the replication paper lists the authors of the Open Science Collaboration alphabetically, rather than in order of their contributions to it, and working with so many people presents logistical difficulties. Nevertheless, considering the importance of scientific integrity and investigations of the reliability of analyses and results, such an undertaking is worthwhile to the community. (In the past, I have participated in similarly large collaboration projects such as this, which I too believe have benefited the astrophysical community.)

The researchers evaluated five complementary indicators of reproducibility using significance and p-values, effect sizes, subjective assessments of replication teams and meta-analyses of effect sizes. Although a failure to reproduce does not necessarily mean that the original report was incorrect, they state that such “replications suggest that more investigation is needed to establish the validity of the original findings.” This is diplomatic scientist-speak for: “people have reason to doubt the results.” In the end, the scientists in this study find that in the majority of cases, the p-values are higher (making the results less significant or statistically insignificant) and the effect size is smaller or even goes in the opposite direction of the claimed trend!

Effects claimed in the majority of studies cannot be reproduced. Figure shows density plots of original and replication p-values and effect sizes (correlation coefficients).
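
To make those indicators a bit more concrete, here is a minimal sketch in Python of how one might compare a single original result with its replication on two of them: whether the replication reaches significance in the same direction as the original, and whether the effect size (a correlation coefficient) shrinks. The function and the numbers are my own illustrative inventions, not the study’s actual code or data.

```python
# A minimal sketch (not the Reproducibility Project's pipeline) comparing an
# original study with its replication via p-values and correlation effect sizes.
import numpy as np
from scipy import stats

def compare_study(r_orig, n_orig, r_rep, n_rep, alpha=0.05):
    """Compare an original correlation effect with its replication."""
    def p_value(r, n):
        # Two-sided p-value from the t-statistic t = r * sqrt((n-2)/(1-r^2)).
        t = r * np.sqrt((n - 2) / (1.0 - r**2))
        return 2 * stats.t.sf(abs(t), df=n - 2)

    p_orig, p_rep = p_value(r_orig, n_orig), p_value(r_rep, n_rep)
    return {
        "original significant": p_orig < alpha,
        "replication significant (same direction)": p_rep < alpha and np.sign(r_rep) == np.sign(r_orig),
        "effect size shrank": abs(r_rep) < abs(r_orig),
        "p_orig": round(p_orig, 4),
        "p_rep": round(p_rep, 4),
    }

# Hypothetical example: a modest original effect that weakens on replication.
print(compare_study(r_orig=0.35, n_orig=60, r_rep=0.10, n_rep=120))
```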

Note that this meta-analysis has a few limitations and shortcomings. Some studies or analysis methods that are difficult to replicate involve research that pushes the limits or tests very new or little-studied questions; if scientists only asked easy questions, or questions to which they already knew the answers, their research would not do much to advance science. In addition, I could find no comment in the paper about situations in which the scientists faced the prospect of replicating their own or competitors’ previous papers; presumably they avoided potential conflicts of interest.

These contentious conclusions could shake up the social sciences and subject more papers and experiments to scrutiny. This isn’t necessarily a bad thing; according to Oxford psychologist Dorothy Bishop in the Guardian, it could be “the starting point for the revitalization and improvement of science.”

In any case, scientists must acknowledge the publication of so many questionable results. Since scientists generally strive for honesty, integrity and transparency, and cases of outright fraud are extremely rare, we must investigate the causes of these problems. As pointed out by Ed Yong in the Atlantic, like many sciences, “psychology suffers from publication bias, where journals tend to only publish positive results (that is, those that confirm the researchers’ hypothesis), and negative results are left to linger in file drawers.” In addition, some social scientists have published what first appear to be startling discoveries but turn out to be cases of “p-hacking…attempts to torture positive results out of ambiguous data.”

Unfortunately, this could also provide more fuel for critics of science, who already seem to have enough ammunition judging by overblown headlines pointing to increasing numbers of scientists retracting papers, often due to misconduct, such as plagiarism and image manipulation. In spite of this trend, as Christie Aschwanden argues in a FiveThirtyEight piece, science isn’t broken! Scientists should be cautious about unreliable statistical tools though, and p-values fall into that category. The psychology paper meta-analysis shows that p<0.05 tests are too easy to pass, but scientists knew that already, as the Basic and Applied Social Psychology journal banned p-values earlier this year.
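
To see why p < 0.05 is such a low bar, here is a toy simulation of my own (not drawn from the Science paper): if a thousand labs each test an effect that is truly zero, roughly 5% of them will still obtain a “significant” result purely by chance, and with publication bias those chance findings are the ones most likely to end up in print.

```python
# Toy simulation: many studies of a true null effect still "pass" p < 0.05.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_studies, n_per_group = 1000, 30

false_positives = 0
for _ in range(n_studies):
    # Two groups drawn from the SAME distribution: the true effect is zero.
    a = rng.normal(loc=0.0, scale=1.0, size=n_per_group)
    b = rng.normal(loc=0.0, scale=1.0, size=n_per_group)
    _, p = stats.ttest_ind(a, b)
    if p < 0.05:
        false_positives += 1

print(f"{false_positives} of {n_studies} null studies passed p < 0.05 "
      f"({100 * false_positives / n_studies:.1f}%)")
```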

Furthermore, larger trends may be driving the publication of such problematic science papers. Increasing competition between scientists for high-status jobs, federal grants and speaking opportunities at high-profile conferences pressures them to publish more and to publish provocative results in major journals. To quote the Open Science Collaboration’s paper, “the incentives for individual scientists prioritize novelty over replication.” Meanwhile, overextended peer reviewers and editors often lack the time to properly vet and examine submitted manuscripts, making it more likely that problematic papers slip through and carry much more weight upon publication. At that point, it can take a while to refute an influential published paper or reduce its impact on the field.

Source: American Society for Microbiology, Nature

When I worked as an astrophysics researcher, I carefully reviewed numerous papers for many different journals and considered that work an important part of my job. Perhaps utilizing multiple reviewers per manuscript and paying reviewers for their time may improve that situation. In any case, most scientists recognize that though peer review plays an important role in the process, it is no panacea.

I am proud of all of my research papers, but at times I wished I had more time for additional or more comprehensive analysis, in order to be more thorough and certain about some results. This can be prohibitively time-consuming for any scientist—theorists, observers and experimentalists alike—and scientists draw the line at different places when deciding whether or when to publish research. I also feel that I have sometimes been too conservative in presenting my conclusions, while some scientists make claims that go far beyond the limited implications of uncertain results.

Some scientists jump on opportunities to publish the most provocative results they can find, and science journalists and editors love a great headline, but we should express skepticism when people announce unconvincing or improbable findings, as many of them turn out to be wrong. (Remember when OPERA physicists thought that neutrinos could travel faster than light?)

When conducting research and writing and reviewing papers, scientists should aim for as much transparency and openness as possible. The Open Science Framework demonstrates how such research could be done, with data accessible to everyone and individual scientists’ contributions tracked. With such a “GitHub-like version control system, it’s clear exactly who takes responsibility for what part of a research project, and when—helping resolve problems of ownership and first publication,” writes Katie Palmer in Wired. As Marcia McNutt, editor in chief of Science, says, “authors and journal editors should be wary of publishing marginally significant results, as those are the ones that are less likely to reproduce.”

If some newly published paper is going to attract the attention of the scientific community and news media, then it must be sufficiently interesting, novel or even contentious, so scientists and journalists must work harder to strike that balance. We should also remember that, for better or worse, science rarely yields clear answers; it usually leads to more questions.