https://twitter.com/PatrickTBrown31/...16555844035045
to:
https://twitter.com/PatrickTBrown31/...18089281573092
[thread archive: https://threadreaderapp.com/thread/1...844035045.html]
{@PatrickTBrown31 | 05 September 2023}
Last week, I described our paper on climate change and wildfires:
I am very proud of this research overall. But I want to talk about how molding the presentation of research for high-profile journals can reduce its usefulness and actually mislead the public.
For climate research, I think the crux of the issue is highlighted here in my thread:
I mentioned that this research looked at the effect of warming in isolation but that warming is just one of many important influences on wildfires with others being changes in human ignition patterns and changes in vegetation/fuels.
So why didn’t I include these obviously relevant factors in my research from the outset? Why did I focus exclusively on the impact of climate change?
Well, I wanted the research to get as widely disseminated as possible, and thus I wanted it to be published in a high-impact journal.
Put simply, I've found that there is a formula for success for publishing climate change research in the most prestigious and widely read scientific journals, and unfortunately this formula also makes the research less useful.
1) The first thing to know is that simply *showing* that climate change impacts something of value is usually sufficient, and it is not typically necessary to show that the impact is large compared to other relevant influences.
In the paper, I focused on the influence of climate change on extreme wildfire behavior but did not quantify (i.e., I “held constant”) the influence of other obviously relevant factors like changes in human ignitions or the effect of poor forest management.
I knew that considering these factors would make for a more realistic (and thus useful) analysis, but I also knew that it would muddy the waters of an otherwise clean story and thus make the research more difficult to publish.
This type of framing, where the influence of climate change is unrealistically considered in isolation, is the norm for high-profile research papers.
For example, in another recent influential Nature paper, they calculated that the two largest climate change impacts on society are deaths related to extreme heat and damage to agriculture.
However, that paper does not mention that climate change is not the dominant driver for either one of these impacts: temperature-related deaths have been declining, and agricultural yields have been increasing for decades despite climate change.
2) This brings me to the second component of the formula, which is to ignore or at least downplay near-term practical actions that can negate the impact of climate change.
If deaths related to outdoor temperatures are decreasing and agricultural yields are increasing, then it stands to reason that we can overcome some major negative effects of climate change. It is then valuable to study this success so that we can facilitate more of it.
However, there is a taboo against studying or even mentioning successes since they are thought to undermine the motivation for greenhouse gas emissions reductions.
Identifying and focusing on problems rather than studying the effectiveness of solutions makes for more compelling abstracts that can be turned into headlines, but it is a major reason why high-profile research is not as useful to society as it could be.
3) A third element of a high-profile climate change research paper is to focus on metrics that are not necessarily the most illuminating or relevant but serve more to generate impressive numbers.
In the case of my paper, I followed the common convention of focusing on changes in the risk of extreme events rather than simpler and more intuitive metrics like changes in intensity.
The sacrifice of clarity for the sake of more impressive numbers was probably necessary for it to get into Nature.
Another related convention, which I also followed in my paper, is to report results corresponding to time periods that are not necessarily relevant to society but, again, get you the large numbers that justify the importance of your research.
For example, it is standard practice to report climate change related societal impacts associated with how much warming has occurred since the industrial revolution but to ignore or “hold constant” societal changes over that time.
This makes little sense from a practical standpoint, since the influence of societal changes has been much larger than the influence of climate change on people since the 1800s.
Similarly, it is conventional to report projections associated with distant future warming scenarios now (or always) thought to be implausible (RCP8.5) while ignoring potential changes in technology and resilience.
A much more useful analysis for informing actual decisions we face would focus on changes in climate from the recent past that living people have experienced to the foreseeable future - the next several decades - while accounting for changes in technology and resilience.
In the case of our research, this would mean considering the impact of climate change in conjunction with proposed reforms to forest management practices over the next several decades. This is what we are doing in the current phase of the research.
This more practical kind of analysis is discouraged because looking at changes in impacts over shorter time periods and in the context of other relevant factors reduces the calculated magnitude of the impact of climate change, and thus it appears to weaken the case for greenhouse gas emissions reductions.
So why did I follow this formula for producing a high-profile scientific research paper if I don’t believe it creates the most useful knowledge for society? I did it because I began this research as a new assistant professor facing pressure to establish myself in a new field and to maximize my prospects of securing respect from my peers, future funding, tenure, and ultimately a successful career.
When I had previously attempted to deviate from the formula I outlined here…
…my papers were promptly rejected out of hand by the editors of high-profile journals without even going to peer review.
To put it bluntly, I sacrificed value added for society in order to mold the presentation of the research to be compatible with the preferred narratives of the editors and reviewers of high-profile journals.
I am bringing these issues to light because I hope that highlighting them will spur reforms that better align the incentives of researchers with the production of the most useful knowledge for society.
I write more about this today in a piece in The Free Press:
I also have more thoughts on my personal blog: