In March 2012, the climate alarmist website Skeptical Science had its forums "hacked" and the contents posted online. In a forum thread titled "Introduction to TCP" (2012-01-19), John Cook laid out the game plan for the 97% consensus study, Cook et al. (2013), 'Quantifying the consensus on anthropogenic global warming in the scientific literature':
It's essential that the public understands that there's a scientific consensus on AGW. So Jim Powell, Dana and I have been working on something over the last few months that we hope will have a game changing impact on the public perception of consensus. Basically, we hope to establish that not only is there a consensus, there is a strengthening consensus. Deniers like to portray the myth that the consensus is crumbling, that the tide is turning. However, our survey of the peer-reviewed literature shows that the opposite is true - the consensus is getting stronger and the gap between those that accept and reject the consensus is increasing. What we have in mind is an extended campaign over 2012 (and beyond).
Phase 1: Publishing a paper on the negligible impact of climate denial in the peer-reviewed literature
TCP is basically an update and expansion of Naomi Oreskes' survey of the peer-reviewed literature with deeper analysis. In 2004, Naomi surveyed 928 articles in the Web of Science matching the search "global climate change" from 1993 to 2003. We've expanded the time period (1991 to 2011) and added papers matching the search "global warming". We ended up with 12,272 papers. I imported the details of each paper (including abstracts) into the SkS database and set up a simple crowd sourcing system allowing us to rate the category of each paper using Naomi's initial categories (impacts, mitigation, paleoclimate, methods, rejection, opinion). We did find some rejection papers in the larger sample but the amount was negligible. The number of citations the rejection papers received was even smaller proportionally, indicating the negligible impact of AGW denial in the peer-reviewed literature. Jim and I wrote these initial results up into a short Brevia article that we just submitted to Science (so please don't mention these results outside of this forum yet, lest it spook Science who freak out if there's any mention of a submitted paper before publication). Of course, Science have a 92% rejection rate so the chances are very slim - we'll try other journals if rejected there.
When the paper is published, we would announce it on SkS as the beginning of the public launch of TCP. It will also be promoted through the communications dept at the Global Change Institute although their press releases only go to Australian media so will have to explore other promotion ideas.
Phase 2: SkS team rates endorsements
For Phase 1, we didn't rate the actual # of endorsements of AGW - the focus was on the proportion and impact of rejection articles. So Phase 2 will be about tallying the # of endorsements and comparing it to the # of rejections in a variety of ways. This is where it gets exciting. A simple comparison of the # of endorsement papers vs rejection papers tells a vivid story of a strengthening consensus. Even more so, the # of citations of endorsement papers vs rejection citations. And this is something I haven't crunched any data for yet but just adding up the # of authors who have written endorsement papers vs rejection authors will, I imagine, tell another interesting tale.
What I'm thinking of doing is crowd sourcing among the SkS team the role of rating the 12,000 papers. By rating, we are actually going beyond what Naomi did. Her rating was one dimensional - just the 6 categories. We decided we wanted to collect more information about each paper and have defined two dimensions or two aspects of each paper that we want to capture - the category (impacts, mitigation, paleoclimate, methods, opinion) and the endorsement level (from explicit endorsement down to explicit rejection). So I'll program up a crowd sourcing system allowing SkSers to rate papers - the goal being every paper gets at least 2 ratings from different people for consistency.
The end goal of Phase 2 is publishing the results in a peer-reviewed paper. As far as co-authorship of the paper goes, I was thinking perhaps a practical approach would be that to be a co-author on the paper, you rate at least 2000 papers - seems a fair requirement to get your name on a peer-reviewed paper. And of course input into the writing of the paper - we'll need to anticipate all the various attacks our results will get as this result will be highly threatening to the denialosphere.
The result is we'll have 12,000 papers with category ratings and endorsement level. We can analyse this data in a variety of ways to tell many interesting stories - but what I'm guessing from what I've rated so far is that we'll find around 50% of the papers are explicit or implicit endorsements and the rest are neutral (with the tiniest fraction being rejection). Note and this is an important note - this result is based just on the abstract text, not the full paper, and hence is an underestimate of the actual number of endorsements.
Phase 3: Publicly crowd source the categorisation of neutral papers
When we publish the Phase 2 paper, it will strongly emphasise that the endorsement percentage is based just on the abstract text and hence an underestimate of the true number of papers endorsing the consensus. I anticipate there will be around 6000 "neutral" papers. So what I was thinking of doing next was a public crowd sourcing project where the public are given the list of neutral papers and links to the full paper - if they find evidence of an endorsement, they submit it to SkS (I'll have an easy-to-use online form) with the excerpted text. The SkS team would check incoming submissions, and if they check out, make the endorsement official. Thus over time, we would gradually process the 6000 neutral papers, converting many of them to endorsement papers - and make regular announcements like "hey the consensus just went from 99.75% to 99.8%, here are the latest papers with quotes". The final result will be a definitive, comprehensive survey of the number of endorsements of AGW in the literature over the last 21 years.
Phase 4: Repeat each year
Fingers crossed, Phase 3 will be complete by the end of 2012. Then in early 2013, we can repeat the process for all papers published in 2012 to show that the consensus is still strengthening. We beat the consensus drum often and regularly and make SkS the home of the perceived strengthening consensus.
In a forum thread titled, "Marketing Ideas" (2012-01-19) John Cook immediately moved onto marketing a "scientific" study before it was even started,
This thread is for general discussions of how to market TCP (begun in this earlier thread) and make as great an impact as possible. Various surveys find that a disturbing proportion of the public don't think scientists agree about global warming, so I suggest our goal be to establish "strengthening consensus" as a term in the general public consciousness (that goal can be a topic for discussion if required).
To achieve this goal, we mustn't fall into the trap of spending too much time on analysis and too little time on promotion. As we do the analysis, would be good to have the marketing plan percolating along as well. So a few ideas floating around:
Press releases: Talked to Ove about this yesterday, the Global Change Institute have a communications dept (well, two people) and will issue press releases to Australian media when this comes out. No plan yet for US media.
Mainstream Media: This is the key if we want to achieve public consciousness. MSM is an opaque wall to me so ideas welcome. I suspect this will involve developing time lines, building momentum for the idea and consulting with PR professionals like Jim Hoggan.
Climate Communicators: There needs to be a concerted effort (spearheaded by me) to get climate communicators using these results in their messaging. I've been hooking up with a lot of climate communicators over the last month and will be hooking up with more over the next few months so will be discussing these results with every climate communicator I can get hold of, including heavyweights like Susan Hassol and Richard Somerville, to discuss ways of amplifying this message.
Also Ed Maibach is doing research on the most effective way to debunk the "no consensus" myth so I hope to contact him and hopefully include our results in his research. The more we can get climate communicators incorporating our results into their messages, the better.
Blogosphere: The usual blogosphere networking. Note - Tim Lambert tried to do a similar crowd sourcing effort a few years ago but didn't succeed in generating enough support for the crowd sourcing - I'm confident we can get it done.
Climate Orgs: Also have been making connections with various climate organisations and occasionally talked about the possibility of collaboration, so will use this project as a focal point for ways to work together. Have to think about this some more.
Google: Coincidentally, started talking to someone who works at Google, specifically the data visualisation department. So I've been working with them on visualising the consensus data in sexy, interactive ways. This will be one of the X-factor elements of TCP - maybe they can even provide an embeddable version of the visualisation which blogs and websites can use.
Video: Peter Sinclair is keen to produce a YouTube video about the TCP results to publish on the Yale Forum on Climate Change.
Booklet: similar to Guide and Debunking Handbook, explaining the results of the peer-reviewed paper in plain English with big shiny graphics (with translations, I suppose - they're a pain for me to convert but worthwhile doing).
Kindle/iBook version of Booklet (can you publish free books on Amazon?).
Embeddable widget: graphic showing the graph of strengthening consensus, updated each year, easily copy and pasteable into other blogs. I like this idea, can make TCP go viral and become ubiquitous on the climate blogosphere!
Other ideas very welcome.
Update - will continue to add to this list as more ideas come along.
"I have to say that I find this planning of huge marketing strategies somewhat strange when we don't even have our results in and the research subject is not that revolutionary either (just summarizing existing research)." - Ari Jokimäki