It’s publish or perish, Jim – but not as we know it

Or to put it another way: The scientist’s dilemma . . . Where to publish?

Let me explain.

It’s autumn 1982. And just over a year since I joined the faculty of The University of Birmingham. Our department had a new Mason Professor of Botany, someone with a very different academic background and interests from my own.

At one departmental coffee break several of us were sitting around discussing various issues when the topic of academic publishing came up.

“In which journals do you publish, Mike?” the new head of department asked me. I told him that I’d published several papers in the journal Euphytica, an international journal covering the theoretical and applied aspects of plant breeding. It’s now part of the Springer stable, but I’m not sure who the publisher was then.

His next question surprised me. It’s not an exaggeration to say that I was gob-smacked. “Is that a refereed journal?” he asked, and went on to explain that he’d never even heard of Euphytica. In my field, Euphytica was then considered an excellent choice for papers on genetic resources. In a sense he was valuing my academic output based on his ‘blinkered’ view of our shared discipline, botany, which is after all a broad church.

Springer now has its own in-house genetic resources journal, Genetic Resources and Crop Evolution (I’m a member of the editorial board), but there are others such as Plant Genetic Resources – Characterization and Utilization (published by Cambridge University Press). Nowadays there are more journals to choose from dealing with disciplines like seed physiology, molecular systematics and ecology, among others, in which papers on genetic resources can find a home.

But in the 1970s and 80s and beyond, I’d always thought about the visibility of my research to others working in the same or allied fields. My research would be of little or no interest to researchers beyond genetic resources or plant breeding, for example, so my choice of journal was predicated very much on this basis. Today, with online searches making the world’s voluminous scientific literature accessible at the click of a mouse, it’s perhaps less important exactly where you publish.

Back in the day we had to seek out a hard copy of a journal that interested us, or use something like Current Contents (I’m surprised that’s still going, even in hard copy) to check, on a regular basis, what was being published in various journals. And then contact the author for a reprint (before the days of email).

I can remember, way back in the mid-1980s when I had to write a review of true potato seed, having to pay for a special literature search through the university library. Now everyone can do it themselves—from their own desk. Nowadays you just search for a journal online, or tap in a few key words, and Hey Presto! there’s a list of relevant papers, complete journal contents lists, abstracts, and even full papers if your institute has a subscription to the journal or the article itself is Open Access.

So the dynamics of scientific publishing have changed from the days when I first began. In some respects, scientific publishing has never been easier; then again, it has never been more challenging. Not only are scientists publishing more, they are expected to publish more. Sink or swim!

About a year ago, I was ‘invited’ to join ResearchGate, a social networking site for scientists and researchers to share papers, ask and answer questions, and find collaborators. Since then I receive almost daily (if not more frequent) stats about my science publications and who is citing them. It’s obviously quite gratifying to know that many of the papers I published over the decades still have scientific traction, so to speak. And ResearchGate gives me a score indicating how much my papers are being cited (currently 32.10—is this good? I have no idea). There’s obviously no metric that determines the quality of these papers, nor whether they are being cited for good or ill.

In the 1980s there was some discussion of the value of citation indices. I remember reading an interesting article in an internal University of Birmingham newsletter, Teaching News I think it was called, that was distributed to all staff. In this article the author had warned against the indiscriminate use of citation indices, pointing out that an excellent piece of scholarship on depopulation in rural Wales would receive a much lower citation count than, say, a lower-quality paper on the rise of fascism, simply because the former represented a much narrower field of academic pursuit.

Today there are so many more metrics, journal impact factors and the like, that are taken into account to assess the quality of science. And for many young researchers these metrics play an important role—for good or bad—in the progression of their careers. Frankly, I don’t understand all of these, and I’m glad I didn’t have to worry about them when I was a young researcher.
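For readers as baffled by these metrics as I am, one of the simpler ones to pin down is the h-index: a researcher has index h if h of their papers have each been cited at least h times. Here is a minimal sketch in Python, with citation counts that are entirely made up for illustration:

```python
def h_index(citations):
    """Return the h-index: the largest h such that the author has
    h papers with at least h citations each."""
    # Sort citation counts in descending order, then find the last
    # position where the count is still >= its 1-based rank.
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical citation counts for five papers:
print(h_index([10, 8, 5, 4, 3]))  # → 4
```

Note that, as with the citation indices discussed above, nothing in such a single number reflects the size of a field or the quality of the papers themselves.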


Prof. David Colquhoun, FRS

And there are many pitfalls. I came across this interesting article on the blog of Professor David Colquhoun, FRS (formerly professor of pharmacology at University College London) about the use (and misuse) of metrics to assess research performance. There was one very interesting comment that I think sums up many of the concerns about the indiscriminate use of publication metrics:

. . . in six of the ten years leading up to the 1991 Nobel prize, Bert Sakmann failed to meet the metrics-based publication target set by Imperial College London, and these failures included the years in which the original single channel paper was published and also the year, 1985, when he published a paper that was subsequently named as a classic in the field. In two of these ten years he had no publications whatsoever.

Application of metrics in the way that it’s been done at Imperial and also at Queen Mary College London, would result in firing of the most original minds.

We seem obsessed by metrics. And whenever there is a request for publication metrics for whatever purpose, there are always perverse incentives and opportunities to game the system, as I discovered to IRRI’s cost during the CGIAR annual performance exercise in the late ‘Noughties’. And when the submitted data are scrutinized by someone who really does not understand the nature of scientific publishing, then you’re on a slippery slope to accepting scientific mediocrity.

Research impact is all around – or at least it should be.

I believe it was IRRI’s former head of plant pathology Dr Tom (Twng-Wah) Mew who first coined this aphorism to describe IRRI’s philosophical approach to research (and I paraphrase):

It’s not only necessary to do the right science,
but to do the science right.

I couldn’t agree more, and have blogged elsewhere about the relevance of IRRI’s science. But this is science or research for development (or R4D as it’s often abbreviated), best explained, perhaps, by the institute’s tagline or slogan:

Rice Science for a Better World

This is not science in a vacuum, in an ivory tower seeking knowledge for knowledge’s sake. This is research to solve real problems: to reduce poverty and increase food security. I don’t really like the distinction that’s often made between so-called pure or basic science, and applied science. Surely it’s a continuum? Let me give you just one example from my own research experience.

I have also blogged about the problem of bacterial wilt of potatoes. It can be a devastating disease, not only of potatoes and other solanaceous crops like tomatoes and eggplants, but also of bananas. While the research I carried out was initially aimed at identifying better adapted potatoes resistant to bacterial wilt, very much an ‘applied’ perspective, we also had to investigate why the bacterium was surviving so long in the soil in the apparent absence of susceptible hosts. This epidemiological focus fed into better disease control approaches.

But in any case, the only distinction that perhaps really matters is whether the science is ‘good’ or ‘bad’.

Why is rice science so crucial? Because rice is the world’s most important staple food, feeding more than half of the global population on a daily basis, even several times a day in some Asian countries. IRRI’s science focuses on gains for rice farmers and those who eat rice, research that can potentially affect billions of people. It’s all about impact, at different levels. Not all impact is positive, however, so it’s important to think through all the implications and direction of a particular line of research even before it starts. In other words, ‘What does success look like?‘ and how will research outputs become positive outcomes?

Now I don’t claim to be an expert in impact assessment. That’s quite a specialized field, with its own methodologies. It wasn’t until I changed careers at IRRI in 2001 and became the Director for Program Planning and Communications (DPPC) that I fully came to understand (or even appreciate) what ex ante and ex post impact meant in the context of R4D. I was fortunate as DPPC to call upon the expertise of my Australian colleague, Dr Debbie Templeton, now back in her home country with the Australian Centre for International Agricultural Research (ACIAR).


Rice Science for a Better World?

IRRI has a prestigious scientific reputation, and deservedly so. It strives hard to maintain that reputation.

IRRI scientists publish widely in international journals. IRRI’s publication rate is second-to-none. On occasion IRRI has been criticized, censured almost, for being ‘obsessed with science and scientific publication’. Extraordinary! What for heaven’s sake does ‘Research’ in the name ‘International Rice Research Institute’ stand for? Or for that matter, in the name ‘CGIAR’ or ‘Consultative Group on International Agricultural Research’?

What our erstwhile colleagues fail to grasp, I believe, is that scientific publication is a consequence of doing good science, not an objective in itself. Having recruited some of the best scientists, IRRI provides an environment that brings out the best in its staff to contribute effectively to the institute’s common goals, while permitting them to grow professionally. Surely it must be the best of both worlds to have scientists contributing to a worthwhile and important research agenda, but knowing that their work is also esteemed by their scientific peers?

But what is the ‘right science’? Well, it depends of course.

IRRI is not an academic institution, where scientists are expected to independently pursue their own interests, and bring in large sums of research funding (along with the delicious overheads that administrators expect). All IRRI scientists contribute—as breeders, geneticists, pathologists, molecular biologists, economists, or whatever—to a common mission that:

. . . aims to reduce poverty and hunger, improve the health of rice farmers and consumers, and ensure environmental sustainability of rice farming. We do these through collaborative research, partnerships, and the strengthening of the national agricultural research and extension systems, or NARES, of the countries we work in.

IRRI’s research agenda and policies are determined by a board of trustees, guided by input from its partners, donors, end users such as farmers, and its staff. IRRI aims to meet five goals, aligned with the objectives of the Global Rice Science Partnership (GRiSP), which coordinates rice research among more than 900 international partners, to:

  • Reduce poverty through improved and diversified rice-based systems.
  • Ensure that rice production is stable and sustainable, does minimal harm to the environment, and can cope with climate change.
  • Improve the nutrition and health of poor rice consumers and farmers.
  • Provide equitable access to information and knowledge on rice and help develop the next generation of rice scientists.
  • Provide scientists and producers with the genetic information and material they need to develop improved technologies and enhance rice production.

Rice Science for a Better World, indeed.

International agricultural research like IRRI’s is funded from the public purse, in the main, though the Bill & Melinda Gates Foundation has become a major player supporting agricultural research over the past decade. Tax dollars, Euros, British pounds, Swiss francs, or Japanese yen are donated—invested even—through overseas development assistance budgets like USAID in the USA, the European Commission, DfID in the UK, SDC in Switzerland, and several institutions in Japan, to name just a handful of those donor agencies committed to finding solutions to real problems through research. Donors want to see how their funds are being used, and the positive benefits that their investments have contributed to. Unfortunately, donors rarely share the same vision of ‘success’.

One of the challenges facing a number of research organizations, however, is that their research mandates fall short of effectively turning research outputs into research outcomes or impact. But with an idea of ‘what success looks like’, researchers are in a better position to know whom to partner with to ensure that research outputs become outcomes, be they national scientists, civil society organizations, NGOs, and the like.

As I said, when I became DPPC at IRRI, my office managed the process of developing and submitting research project funding proposals, as well as reporting back to donors what had been achieved. I had to get this message across to my research scientist colleagues: How will your proposed research project benefit farmers and rice consumers? This was not something they expected.

Quite early on in my DPPC tenure, I had a wake-up call after we had submitted a proposal to the Asian Development Bank (ADB), at their request I should add, to support some work on rice genomics. The science described in the proposal was first rate. After mulling over our proposal for a couple of months, I received a phone call from our contact at ADB in Manila who was handling the internal review of the proposal. He asked me to add a paragraph or two about how this work on rice genomics would benefit rice consumers; otherwise, ADB would not be able to consider the project in its next funding round.

So I went to discuss this apparent conundrum with the scientist involved, and explained what was required for ADB approval. ‘How will rice genomics benefit rice farmers and consumers?‘, I asked him. ‘I can’t describe that‘, he replied, somewhat woefully. ‘Well‘, I replied, ‘unless we can tell ADB how your project is going to benefit farmers etc., then your proposal is dead in the water‘.

After some thought, and based on my simplistic explanation of the impact pathway, he did come up with quite an elegant justification that we could submit to ADB. Despite our efforts, the project was not funded by ADB. The powers-that-be decided that the research was too far removed from the ultimate beneficiaries. But the process in itself was useful. It helped us understand better how we should pitch our proposals, and what essential elements to include to show we had thought things through.

Now the graphic below is obviously a simplistic representation of a complex set of issues. The figure on the left represents a farmer, a community, a situation that is constrained in some way or other, such as low yield, diseased crops, access to market, human health issues, and the like. The objective of the research must be clearly defined and described. No point tilting at the wrong windmills.

The solid black and dashed red line represents the impact pathway to a better situation, turning research outputs into outcomes. The green arrow represents the point on that impact pathway where the research mandate of an institute often ends—before the outcome is delivered and adopted. How to fill that gap?

Individual research projects produce outputs along the impact pathway, and outputs from one project can be the inputs into another.

Whatever the impact pathway, it’s necessary to describe what success looks like: an increase in production over a specified area, release and adoption of disease-resistant varieties, incomes of X% of farmers in region Y increased by Z%, or whatever.

Impact pathway

Let me highlight two IRRI projects. One has already shown impact after a research journey of almost two decades. The other, perhaps on-going for the same time period, has yet to show impact. I’m referring to submergence tolerant or ‘scuba rice‘, and ‘Golden Rice’, respectively.

For the development of scuba rice it was first necessary to identify and characterize genes conferring submergence tolerance—many years in the laboratory even before the first lines were tested in the field and the proof of concept realized. It didn’t take long for farmers to see the advantage of these new rice varieties. They voted with their feet! So, in a sense, the farmers themselves managed the dashed red line of the impact pathway. Scuba rice is now grown on more than 2.5 million hectares by 10 million farmers in India and Bangladesh on land that could not consistently support rice crops because of flooding.

Golden Rice has the potential to eradicate the problem of Vitamin A deficiency, which can lead to blindness. As I mentioned earlier, rice is eaten by many people in Asia several times a day. It’s the perfect vehicle to enhance Vitamin A intake. Varieties have been produced, the proof of concept completed, yet Golden Rice is not yet grown commercially anywhere in those countries that would benefit most. The dashed red line in my impact pathway diagram is the constraint. Golden Rice is a GMO, and the post-research and pre-release regulatory framework has not been surmounted. Pressure groups have also delayed the testing of Golden Rice lines, even destroying field experiments that would provide the very data they are so ‘afraid’ of. Thus its impact is more potential than real. Donors have been patient, but is there a limit to that patience?

Keeping donors on-side
What I also came to realize early on is how necessary it is to engage with donors on a regular basis: establish a good working relationship, visit them in their offices from time to time, share a drink or a meal. Mutual confidence builds, and I found that I could pick up the phone and talk through an issue, send an email and get a reply quickly, and was even consulted by donors themselves as they developed their funding priorities. It’s all part of research management. Donors also like to have ‘good news stories’. Nowadays, social media such as Facebook and Twitter, and even blogging, also keep them in the loop. After all, donors have their own constituencies—the taxpayers—to keep informed and onside as well.

Achieving impact is not easy. But if you have identified the wrong target, then no amount of research will bring about the desired outcome, or at least is far less likely to do so. While impact is the name of the game, good communication is equally important. They go hand in hand.