2015: a great year for National Trust and English Heritage visits

Steph and I have been members of the National Trust for five years now. We even qualify for the Seniors discount from January! And we’ve been members of English Heritage for just a year.

But we will be renewing our membership of both organizations in 2016. Why? Because they both offer excellent value for money, and certainly give purpose to our trips out, whatever the weather. Be it a visit to a stately home, a ruined castle, a country park, or a beautiful garden, there are so many properties to visit and so many aspects of our cultural heritage to experience.

Looking back on our 2015 visits, we have certainly had our money’s worth: annual membership has more than covered all the entrance fees we would have had to pay in any case. And much more!

So here is a pictorial summary of our great visits this past year, beginning in early April and ending just last week when we visited Charlecote Park to see the Christmas decorations. And there are links to individual posts about each visit.

NATIONAL TRUST

Lyveden New Bield (9 April)

Brodie Castle (National Trust for Scotland – 29 May)

Culloden Battlefield (National Trust for Scotland – 29 May)

Inverewe Garden (National Trust for Scotland – 1 June)

Arduaine Garden (National Trust for Scotland – 7 June)

Rufford Old Hall (8 June)

The main entrance in the seventeenth-century wing.

Tredegar House (18 June)

Tredegar House, near Newport in South Wales

Chirk Castle (1 July)

Hawford Dovecote (9 July)

Wichenford Dovecote (9 July)

Hardwick Hall (12 August)

Hardwick Hall, Derbyshire

Newark Park (28 August)

Croome Park (12 October)

Charlecote Park (16 December)

The entrance hall.

ENGLISH HERITAGE

Rushton Triangular Lodge (9 April)

Rushton Triangular Lodge, Northamptonshire

Stokesay Castle (14 April)

Stokesay Castle, Shropshire

Wroxeter Roman City (14 April)

Kenilworth Castle (21 April)

Goodrich Castle (21 May)

Goodrich Castle, Herefordshire

St Mary’s Church, Kempley (21 May)

Witley Court (9 July)

Hardwick Old Hall (12 August)

Looking down six floors in the Old Hall. And the magnificent plasterwork on the walls.

Wenlock Priory (18 August)

Ironbridge (18 August)

From: http://en.wikipedia.org/wiki/File:Ironbridge

It’s publish or perish, Jim – but not as we know it

Or to put it another way: The scientist’s dilemma . . . Where to publish?

Let me explain.

It’s autumn 1982, just over a year since I joined the faculty of The University of Birmingham. Our department had a new Mason Professor of Botany, someone with a very different academic background and interests from my own.

At one departmental coffee break several of us were sitting around discussing various issues when the topic of academic publishing came up.

“In which journals do you publish, Mike?” the new head of department asked me. I told him that I’d published several papers in Euphytica, an international journal covering the theoretical and applied aspects of plant breeding. It’s now part of the Springer stable, but I’m not sure who the publisher was then.

His next question surprised me. It’s not an exaggeration to say that I was gobsmacked. “Is that a refereed journal?” he asked, going on to explain that he’d never even heard of Euphytica. In my field, Euphytica was then considered an excellent choice for papers on genetic resources. In a sense he was valuing my academic output based on his ‘blinkered’ view of our shared discipline, botany, which is after all a broad church.

Springer now has its own in-house genetic resources journal, Genetic Resources and Crop Evolution (I’m a member of the editorial board), but there are others, such as Plant Genetic Resources – Characterization and Utilization (published by Cambridge University Press). Nowadays there are more journals to choose from, covering disciplines such as seed physiology, molecular systematics and ecology, in which papers on genetic resources can find a home.

But in the 1970s and 80s and beyond, I’d always thought about the visibility of my research to others working in the same or allied fields. My research would be of little or no interest to researchers beyond genetic resources or plant breeding, for example, so my choice of journal was predicated very much on that basis. Today, with online searches, the world’s voluminous scientific publishing is accessible at the click of a mouse, so it’s perhaps less important exactly where you publish.

Back in the day we had to seek out a hard copy of a journal that interested us, or use something like Current Contents (I’m surprised that’s still going, even in hard copy) to check, on a regular basis, what was being published in various journals. And then contact the author for a reprint (before the days of email).

I can remember, way back in the mid-1980s when I had to write a review of true potato seed, having to pay for a special literature search through the university library. Now everyone can do it themselves—from their own desk. You just search for a journal online, or tap in a few keywords, and hey presto! there’s a list of relevant papers, complete journal contents lists, abstracts, and even full papers if your institute has a subscription to the journal or the article itself is Open Access.

So the dynamics of scientific publishing have changed since the days when I first began. In some respects it has never been easier; then again, it has never been more challenging. Not only are scientists publishing more, they are expected to publish more. Sink or swim!

About a year ago, I was ‘invited’ to join ResearchGate, a social networking site for scientists and researchers to share papers, ask and answer questions, and find collaborators. Since then I have received almost daily (if not more frequent) stats about my science publications and who is citing them. It’s quite gratifying to know that many of the papers I published over the decades still have scientific traction, so to speak. And ResearchGate gives me a score indicating how much my papers are being cited (currently 32.10—is this good? I have no idea). There’s obviously no metric that determines the quality of these papers, nor whether they are being cited for good or bad reasons.

In the 1980s there was some discussion of the value of citation indices. I remember reading an interesting article in an internal University of Birmingham newsletter (Teaching News, I think it was called) that was distributed to all staff. In it the author warned against the indiscriminate use of citation indices, pointing out that an excellent piece of scholarship on depopulation in rural Wales would receive a much lower citation count than, say, a lower-quality paper on the rise of fascism, simply because the former represents a much narrower field of academic pursuit.

Today there are many more metrics, journal impact factors and the like, that are taken into account to assess the quality of science. And for many young researchers these metrics play an important role—for good or bad—in the progression of their careers. Frankly, I don’t understand all of them, and I’m glad I didn’t have to worry about them when I was a young researcher.
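
For what it’s worth, the best known of these, the classic two-year journal impact factor, is (as I understand it) simple arithmetic: citations received in a given year to what a journal published in the previous two years, divided by the number of citable items it published in those two years. Here is a minimal sketch in Python, with entirely made-up figures purely for illustration:

# Illustrative only: the classic two-year journal impact factor.
# All numbers below are invented for the example.
def impact_factor(citations_in_year_n, citable_items_prev_two_years):
    """Citations in year N to items published in years N-1 and N-2,
    divided by the number of citable items published in N-1 and N-2."""
    return citations_in_year_n / citable_items_prev_two_years

# A journal whose 2013 and 2014 papers picked up 450 citations during 2015,
# from 300 citable items published across those two years:
print(impact_factor(450, 300))  # 1.5

Simple arithmetic, and nothing in that ratio says anything about the quality of any individual paper.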

And there are many pitfalls. I came across this interesting article on the blog of Professor David Colquhoun, FRS (formerly professor of pharmacology at University College London) about the use (and misuse) of metrics to assess research performance. There was one very interesting comment that I think sums up many of the concerns about the indiscriminate use of publication metrics:

. . . in six of the ten years leading up to the 1991 Nobel prize, Bert Sakmann failed to meet the metrics-based publication target set by Imperial College London, and these failures included the years in which the original single channel paper was published and also the year, 1985, when he published a paper that was subsequently named as a classic in the field. In two of these ten years he had no publications whatsoever.

Application of metrics in the way that it’s been done at Imperial and also at Queen Mary College London, would result in firing of the most original minds.

We seem obsessed by metrics. And whenever there is a request for publication metrics for whatever purpose, there are always perverse incentives and opportunities to game the system, as I discovered to IRRI’s cost during the CGIAR annual performance exercise in the late ‘Noughties’. And when the submitted data are scrutinized by someone who really does not understand the nature of scientific publishing, then you’re on a slippery slope to accepting scientific mediocrity.