Is there a scientific aristocracy?

The word aristocracy comes from Greek and means “rule of the best”, but it has also become synonymous with a subset of individuals enjoying an unfair advantage. Do we have an aristocracy in scientific academia? If so, to what extent is it rule of the best, and to what extent is it simply inequality?

In a professional context, just as with other aspects of life, a name means a lot. A researcher’s name is their personal brand, and years of publishing high-quality science can build trust in the brand and thus the name. But does being an established name mean you get better treatment than your more junior counterparts?

Does the influence of luck decline over a career?

Getting high quality research published is a non-trivial task. In fact, I’d go further than that: it’s desperately hard for many good and some bad reasons. You need the right idea, the right data, the right story; and you need this to be read by the right people at the right time. Napoleon Bonaparte’s question “I know he’s a good general, but is he lucky?” seems appropriate for researchers, because luck plays such a pivotal role. Mind you, if you are lucky enough to win some battles/publish some papers in good journals early on, you get the chance to make your own luck. Early winners get to become the aristocrats. They may therefore be more likely to get past publication pinch points.


Sir Arthur Wellesley, 1st Duke of Wellington. Badass (and lucky) General, Prime Minister and first generation aristocrat. Image from Wikicommons.

Publication pinch points

We face two key pinch points when trying to publish a paper: (1) getting the manuscript sent out to review, and (2) getting over the line from “reject” to “resubmit”. Both of these points involve gatekeepers making expert decisions that are nonetheless open to some subjectivity. Is this paper of broad enough appeal? Is it scientifically good enough?

We must ask ourselves whether the likelihood of getting over these pinch points is increased when the author list includes a scientific aristocrat. This is, of course, difficult to unpick because senior researchers will write better papers by virtue of experience – they are just as likely to be “rule of the best” aristocrats as they are “unfair advantage” aristocrats. I argue that double-blind reviewing is the only way to unpick the problem.

A plan for a more egalitarian scientific community

Many people exhibit conscious biases, and everybody exhibits unconscious bias. Everybody. Unconscious bias is an evolved mechanism our brains use to make decisions. If we force ourselves to bring our unconscious biases into conscious awareness, we can challenge them and fight them – indeed we need to do this to fight sexism, racism and other prejudices. Alternatively, we can prevent our unconscious biases from occurring by withholding key information from our brains – this is where double-blind review comes in.

Let us submit manuscripts with no author list, no institution details and no grant information. No more pre-submission emails saying “I’m about to send you something interesting”; no more big-name patronage on papers; no more benefit of the doubt because they’ve published great stuff before. Let only that particular scientific document do the talking. Some journals, for example Nature Communications and American Naturalist, offer double-blind review. This is a great start, but there remains a massive problem: if you’re a scientific aristocrat, you are not forced to relinquish the advantage of your name.

The best science demands strong competition and a level playing field

Competition to publish is intense, and rightfully so. However, in order for our respective fields to incorporate the best new ideas, there has to be a level playing field; we must be sure that early career researchers have the same chance as established researchers of exposing their research to key audiences.

The tragedy of unscientific policymaking from our governments: from pesticides to GMOs

It has been quite some time since I have written a blog post. My last one was on how politicians approach data and my current one is no different (though perhaps a bit angrier). Last week, the UK Government decided to suspend a ban on pesticides that have been shown (convincingly) to harm bees.

My heart sank. Bees are already in a precipitous decline (for many reasons, including pesticides), and they are responsible for pollinating many of our crops. Bee decline will lead to an agricultural disaster.

So as a concerned citizen, I wrote to my MP (Steven Paterson, MP for Stirling) imploring him to fight this government policy. Here it is:

Dear Mr Paterson,

I write to you to express my extreme concern with the Government’s approach to neonicotinoid (neonic) pesticides. The Government claims the jury is out as to whether these pesticides cause damage to key pollinators, but this position is much like claiming the jury is out on manmade global warming. Top quality research from our own university here in Stirling (published in the leading journal, Science) has demonstrated that neonics cause huge damage to bees (that are already struggling). This is not just a conservation concern; bees pollinate our crops, so the agricultural consequences of further bee decline could be catastrophic.

Relaxing the ban on neonics may be of short-term benefit, as they make it easier to grow crops such as oilseed rape; however, the long-term consequences will be disastrous.

Thank you for taking the time to read my short email. I hope you will be able to make representations to the relevant Minister.

Yours sincerely,

Stuart Auld

The response I got was this:

Hi Stuart,

Thank you for taking the time to write to me.

The SNP Government’s Cabinet Secretary for Rural Affairs has urged the UK government to accept EU restrictions on the use of neonicotinoids, but has said the measures should not be implemented until more evidence has been gathered.

The SNP Scottish Government has suggested a precautionary approach with a built‐in breathing space and exit strategy. When it comes to protecting our biodiversity and wildlife, there are times when taking a precautionary approach is perfectly justifiable.

It is in the interests of our environment and our farmers that we have healthy bee populations but we know there are a wide range of factors affecting these valuable pollinators.

Kind regards

Steven Paterson MP

This response did not leave me happy. It seems (after googling some of the phrases in the reply) that the SNP Government entirely agrees with Westminster on this issue – they think the burden of proof should be on showing that the pesticides are unsafe (and not vice-versa). Not happy with this, I sent the following reply (which, in hindsight, was more than a little snobby, but here it is for completeness).

Dear Mr Paterson,

Forgive me, but it took me a while to understand your response – “a precautionary approach with built-in breathing space and exit strategy” doesn’t make a whole lot of sense. On googling that phrase, I found the Scottish Government’s relevant policy document. I am assuming this means that the Scottish Government’s position is that of the UK Government, i.e., to allow the use of neonics until there is more evidence saying they cause bee declines?

I’m sorry, but this policy is wrong-headed for two reasons:

(1) The appropriate “precautionary approach” is to ban these pesticides until there is more evidence they are safe – “Safety First!” is a common phrase for good reason.

(2) The UK Government is gagging its own scientific advisors because they refused to back the NFU’s request to lift the neonic ban. Even if, as both Governments argue, the burden of proof should be on showing that neonics are NOT safe for bees (which it shouldn’t), this will never happen whilst the Government’s own scientists are prevented from publishing results.

The evidence for neonic-induced bee declines is exceptionally compelling, and comes from numerous labs working all over the world. Farmers are having a tough time, but without bees, our agriculture sector will be devastated and we will struggle to feed ourselves. It may sound like I am catastrophising, but I cannot overstate the extent of the problem. I urge you, as an independent-minded local MP, to challenge this dangerous policy.

Yours sincerely,

Stuart Auld

Somewhat unsurprisingly, I have not had any further correspondence from my MP. He clearly does not have a view on the subject and is happy to refer me to the Scottish Government’s official stance. But now I realise things are even worse: the Scottish Government’s agricultural policy is even more unscientific than I originally feared – they are seeking to ban GM crops in Scotland.

That’s right – they want to allow pesticides that have been shown to be dangerous, yet ban a technology that has been shown to be safe.

Whenever I challenge someone attacking GM crops as “frankenfood” and the like, the arguments almost always contain phrases like “it’s just not natural” and “you can’t mess with nature like that”. Getting into a metal tube and flying isn’t natural; engineering antibiotics isn’t natural; our entire agricultural landscape isn’t natural. This argument that we should ban GM because it isn’t natural is a complete and utter red herring.

Selective breeding for preferred traits has been around for centuries. Wouldn’t it be nice to selectively breed only the precise traits we need (and therefore avoid simultaneously selecting for bad traits)? That is GM, and it will help us – no, already is helping us – feed our ever-growing populations.

What I find interesting is that many people who are anti-GM are ardent believers in man-made climate change. They don’t see that their anti-GM stance is just as unscientific as that of climate change deniers. People being unscientific is one thing, but I expect my governments (Scottish, UK and European) to at least try to formulate evidence-based policy. And yes, I will be that incredibly annoying constituent who emails and blogs about it until they do. Trust me, I can be really annoying.

A scientist’s view of the European Parliament: How politicians approach data

The role of politicians is to formulate, debate, and enact policy. For this process to work effectively, they need high-quality, unbiased data. The way politicians choose and use data therefore has a profound influence on the world we live in. Likewise, the way scientists package their data will affect how policymakers view those data. This post discussing science-based policy comes from my time at the European Parliament as part of the British Ecological Society’s Parliamentary Shadowing Scheme.

“Evidence-based policy” is a commonly used term nowadays. I cringe when I hear it, because I think all policy should be evidence-based. The term is an extension of “evidence-based medicine”, which is equally scary (what were we basing medicine on in the past – witchcraft?).

The idea is that there is a single, large pool of data that can be analysed and interpreted to generate policy. However, one can take different samples of the data and come to wildly different conclusions about patterns and processes; this will lead to different policy outcomes (see below).

[Figure: different samples drawn from the same pool of data can suggest different patterns, and hence different policy outcomes.]
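
To make that point concrete, here is a minimal, hypothetical sketch in Python (my own illustration, not something taken from the original figure). Two perfectly honest samples drawn from the same pool point in opposite directions, because the pool mixes two underlying groups:

```python
# Hypothetical illustration: the same pool of data can support opposite
# conclusions depending on how it is sampled. The pool mixes two groups,
# each with a mildly negative X-Y trend, but group B sits higher on both
# axes than group A, so pooling everything produces a positive trend.
import numpy as np

rng = np.random.default_rng(1)

x_a = rng.normal(0, 1, 500)
y_a = -0.5 * x_a + rng.normal(0, 1, 500)

x_b = rng.normal(4, 1, 500)
y_b = 3 - 0.5 * (x_b - 4) + rng.normal(0, 1, 500)

pool_x = np.concatenate([x_a, x_b])
pool_y = np.concatenate([y_a, y_b])

# "Sample" 1: data from one group only -> a negative association.
r_within = np.corrcoef(x_a, y_a)[0, 1]

# "Sample" 2: the whole pool -> a positive association.
r_pooled = np.corrcoef(pool_x, pool_y)[0, 1]

print(f"Within one group: r = {r_within:+.2f}")  # negative
print(f"Across the pool:  r = {r_pooled:+.2f}")  # positive
```

Neither sample is “wrong”; they simply answer different questions – which is exactly why how the data are selected matters so much.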

The way data are selected and used is therefore of paramount importance. The problem is that many policymakers are not scientists and therefore haven’t been trained in how to select and interpret data – it’s not their main focus. Likewise, most scientists are not trained to present data to policymakers – their main focus is to communicate with other scientists. Important aspects of patterns and processes can therefore get lost in translation. This is particularly true when describing the uncertainty and complexity associated with patterns in data.

The difficulties of uncertainty and complexity
Scientist: There is strong evidence that an increase in X leads to an increase in Y.
Policymaker: Can you say for certain that this is the case?
Scientist: No.
Policymaker: So the evidence is inconclusive?
Scientist: There is a 5% chance this relationship could have occurred by chance alone, so the association between X and Y is compelling.
Policymaker: So if we boost X, we’ll get an increase in Y?
Scientist: Well that depends on Z.
Policymaker: ?
This conversation isn’t particularly useful for either the Scientist or the Policymaker. Bridging the gap in approaches requires specific training for both parties. Fortunately, this is starting to happen.
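
As a rough illustration of the two things this exchange trips over – and this is entirely my own toy example in Python, not anything from the shadowing scheme – the “it depends on Z” problem can be simulated directly: the apparent X–Y relationship reverses depending on a third variable, even though each within-context effect is strong and “significant”.

```python
# Hypothetical toy example: why "boost X and you'll get more Y" can fail.
# The effect of X on Y flips sign depending on a context variable Z, so the
# overall association looks weak even though the within-context effects are
# strong and come with tiny p-values.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n = 200

z = rng.integers(0, 2, n)                      # context variable (0 or 1)
x = rng.normal(0, 1, n)
y = (1.0 - 2.0 * z) * x + rng.normal(0, 1, n)  # slope +1 when z = 0, -1 when z = 1

# Overall association: the p-value asks "could this be chance alone?"
r_all, p_all = stats.pearsonr(x, y)
print(f"Overall: r = {r_all:+.2f}, p = {p_all:.2f}")

# Split by Z: strong, but opposite, associations in each context.
for level in (0, 1):
    r, p = stats.pearsonr(x[z == level], y[z == level])
    print(f"Z = {level}:  r = {r:+.2f}, p = {p:.1e}")
```

The scientist’s “5% chance this could have occurred by chance alone” is a statement about p-values like the ones printed here; the policymaker’s “so if we boost X, we’ll get an increase in Y” is a causal claim that a p-value alone does not license, particularly when the effect depends on Z.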

Identifying consensus in the scientific community
Let’s consider the following statement: “Humans are contributing to global climate change”. There are scientists who are sceptical as to whether this statement is valid, but they are in the extreme minority. The vast majority of climate scientists agree that climate change has a significant man-made component. On more than one occasion, I heard MEPs in the European Parliament say “it depends which scientist you ask”, which, while technically true, massively misrepresents the general view of the scientific community. Does every single climate scientist agree that man-made climate change is a thing? No. Is there an emerging consensus? You betcha: we are warming our planet.

Learned societies as couriers of scientific information
Having identified the gap between scientists and policymakers, we must now think of ways to fill it. First, we need scientists to better communicate their findings to non-scientists. More and more scientists are achieving this, but it is a skill that takes time to develop. Second, we need policymakers to think carefully about the evidence they use to formulate and debate policy before that policy has been drafted. This is also a skill that takes time to learn and develop. So what can be done in the meantime?
I argue that learned societies can play an important role in providing scientific advice. For example, if a policymaker wants to understand how to deal with invasive species (which can sometimes, though not always, threaten native species – e.g. North American squirrels transmitting diseases that kill our native red squirrels), they can get in touch with the British Ecological Society and get the evidence*. There are numerous scientific learned societies across the world whose raison d’être is to communicate their science to as wide an audience as possible. Use them.

As ever, comments are particularly welcome.

*This isn’t just a plug for the British Ecological Society – honest!


A scientist’s view of the European Parliament.


As part of the British Ecological Society Parliamentary Shadowing Scheme, I went to the European Parliament to meet Linda McAvan MEP, who represents the Yorkshire and Humber constituency in the UK. There, I learned about her work as a member of the Committee on the Environment, Public Health and Food Safety. I also gained a much better appreciation of the EU’s role in determining environmental and science policy.

People of the UK (and across the European continent) tend to either love or hate the EU. Some see it as a vital union that will help the continent maintain freedom and prosperity in a changing world, while others think it is an expensive way of eroding national sovereignty. Many of these opinions are built on fundamental misconceptions. One thing is for certain: the EU is seriously important. The legislation it creates affects the daily lives of all of its citizens, even when we don’t know it – from regulating the additives in our food to combatting man-made climate change. In this collection of posts, I will be discussing my time at the European Parliament, writing about how MEPs approach scientific evidence, the EU’s strategy for women in science, the next stage of the Kyoto Protocol, and the complexities associated with crafting Europe-wide legislation.

Off to Europe to find out about EU environment policy

I will be taking part in the British Ecological Society Parliamentary Shadowing Scheme. I will be shadowing Linda McAvan MEP and learning about EU environment policy.

Those of you who know me will know that I’m a) interested in how climate change affects the evolution and ecology of infectious disease, and b) a massive politics nerd. So it won’t surprise you that I’m particularly excited to learn how environmental policy is formulated, negotiated and applied in the EU, alongside an MEP with a special interest in climate change.

I shall keep you posted with details about my experience over in Brussels.

On reviewing and being reviewed. Part 2: being a reviewer.

Authors write a grant or paper which is submitted to a funding agency or journal. It’s then sent out to some other academics who anonymously assess it and come back with comments/criticisms/recommendations. The editor/funder then decides whether to publish the paper or give money to do the research. Reviewing is a major component of being a member of the academic scientific community. Nevertheless, there is huge variation both in the way people go about reviewing and the time they invest in it.

I have very little experience of reviewing grants, so I’ll focus on manuscripts here. I like reviewing manuscripts – I really do. Firstly, it’s an extra opportunity to read a paper in depth (i.e., not just scout out the information you want or find interesting); it’s a chance to see new approaches to analysing data and new questions in the field; and it’s an opportunity to influence things a tiny bit. I must confess, there is also an ego component to it: “What’s that, important editor in the field? You want my opinion?” Sometimes I despair at myself, but there it is.

Much as I like reviewing, it takes a lot of time from my schedule.

I read somewhere (and I can’t remember where – please chip in if you know, so I can reference the source) that one should aim to spend two hours reviewing a manuscript. No. Just no. No-can-do. If I limited myself to two hours, I’d have only a superficial knowledge of what the authors were on about and could provide only the weakest of reviews. Likewise, I wouldn’t consider marrying somebody after a speed-date. After submitting a manuscript, it can take 1-3 months for the authors to get the reviewers’ comments back. If one of those comments is “the authors did not consider x” and figure 1 in the paper is “the effect of x on the trait in question”, the authors will, quite rightly, want to do the reviewer a serious physical discourtesy (fists and/or knives may be involved). We all make mistakes, and as a reviewer, I want to minimize the chance of making them. I do this by re-reading everything.

Reviewers are usually asked to submit their comments to the journal within two weeks. I get the sense that a lot of people wait until the end of that period (or even until they are badgered by editors for being late). I’m not judging people for this – some people can write insightful reviews immediately before the deadline. I can’t. I need to do the review, put it away and work on something else for a few days, dig it out and re-read everything. I am someone who very much needs ponder time, and I’m constantly surprised by how much my views can change after this period.

Sometimes, reviewing means I need to learn new stuff.

Another reason why I can’t wait until the deadline to review is that sometimes I need to read up about parts of the paper. This is especially true for statistical analyses sections. If someone is doing an analysis I’m unfamiliar with, I like to see that it makes sense. Selfishly, I find this really useful – there are numerous times I’ve learned new statistical techniques as a result of reviewing. I’ve also had to read up about particular sub-fields, which has given me ideas for my own research – and we all love that feeling, right?

The review process. Part I: being reviewed

Peer review is one of the most important components of science, and it can be the most frustrating.

We write a grant or paper which we submit to a funding agency or journal. It’s then sent out to our peers who anonymously assess it and come back with comments/criticisms/recommendations. The editor/funder then decides whether to publish our paper or give us money to do the research.

The quality of reviews is essential to the proper functioning of the scientific process.

Waiting for the reviews to come back (and the editor’s/funder’s decision) can take anything from one to six months. If you’re an early career researcher like myself, you check for updates on the status of your paper/grant every single day. Why? Because your career depends on the outcome of these reviews. Rejection is by far the most common outcome, and dealing with that rejection is one of the toughest bits of academia. The reviews that are negative and cursory are the absolute worst. To quote Brass Eye, it often feels like someone is writing “You’re wrong and you’re a grotesquely ugly freak”*. These reviews are completely gutting, and they are a common complaint among academics. The trouble is that because the reviews are anonymous, reviewers can get away with this.

However, we rarely talk about the other negative reviews – the in-depth, useful critiques of how to improve a manuscript. I received one of these immediately before Christmas (actually, just as I returned from my work Christmas party full of good cheer and perhaps also full of good beer). After a day of being extremely grumpy that my paper had been rejected (and mildly hungover), I read the reviews properly and realised that the reviewing process had actually worked really well. Ok, so I hadn’t got the paper into the journal I wanted (which was a real disappointment). But on the other hand, people who really know what they’re on about took the time to provide very detailed comments, and I am grateful to them. These critical comments give me confidence: I have the tools to make this paper much better and get it published elsewhere.

So, to the people who reviewed my manuscript: thank you for your time and effort. You could have got away with a few lines explaining why it wasn’t ready for publication, but you instead told me how to write a better paper.

* http://www.imdb.com/title/tt0118273/