
Blood: in short supply?

Two years ago (almost to the day) we called out Blood for the misleading open access options they offered to Research Council and Charity Open Access Fund (COAF) authors. Unfortunately, little has changed since then:

Neither of these routes is sufficient to comply with either the Research Councils’ or COAF’s open access policies, which require that the accepted text be made available in PMC within six months of publication, or that the published paper be made available immediately under a CC BY licence.

At the time, we called on Blood to change their offerings or we would advise Research Councils and COAF funded authors to publish elsewhere. And that’s exactly what’s happened:

Figure 1. All articles published in Blood since 2007 which acknowledge MRC, Wellcome, CRUK or BHF funding. Data obtained from Web of Science.

Over the last two years we’ve seen a dramatic decline in the number of papers being published in Blood by Medical Research Council (MRC), Wellcome Trust, Cancer Research UK (CRUK) and British Heart Foundation (BHF) researchers. The number of papers published in Blood that acknowledge these funders is now at its lowest point in over a decade.
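For anyone wanting to reproduce a count like the one behind Figure 1, here is a minimal sketch. It assumes a tab-delimited Web of Science export of Blood articles in which PY holds the publication year and FU the funding acknowledgement text; the file name and the exact funder strings are illustrative, not the ones we used.

```python
import pandas as pd

# Funders of interest, matched against the WoS funding acknowledgement text.
FUNDERS = ["MEDICAL RESEARCH COUNCIL", "WELLCOME", "CANCER RESEARCH UK",
           "BRITISH HEART FOUNDATION"]

# Hypothetical file name for a tab-delimited Web of Science export of Blood
# articles; check the field tags in your own export before relying on them.
records = pd.read_csv("blood_wos_export.txt", sep="\t", dtype=str)
records["FU"] = records["FU"].fillna("").str.upper()

# Keep papers whose funding acknowledgement mentions at least one funder,
# then count them per publication year.
acknowledges_funder = records["FU"].apply(
    lambda fu: any(funder in fu for funder in FUNDERS))
papers_per_year = records[acknowledges_funder].groupby("PY").size()
print(papers_per_year.sort_index())
```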

It’s important to remember that the 23 papers published in Blood in 2017 are all non-compliant with the open access policies of Research Councils and COAF, and if these papers acknowledge Wellcome Trust funding then those researchers may also be at risk of losing 10% of their total grant. If you are funded by Research Councils or one of the COAF members, please consider publishing elsewhere. SHERPA/FACT confirms our assessment.

Sign the open letter

We’re still collecting signatures for our open letter to the editor of Blood in the hope that they’ll reconsider their open access options. Please join us by adding your name.

Compliance is not the whole story

Today, Research England released Monitoring sector progress towards compliance with funder open access policies, the results of a survey they ran in August last year in conjunction with RCUK, Wellcome Trust and Jisc.

Cambridge University was one of the 113 institutions that answered a significant number of questions about how we were managing compliance with various open access policies, what systems we were using and our decision-making processes. Reading the collective responses has been illuminating.

The rather celebratory commentary from UKRI has focused on the compliance aspect – see Research England’s press release, Over 80% of research outputs meet requirements of REF 2021 open access policy, and the post by David Sweeney, Executive Chair of Research England, Open access – are we almost there for REF?

What’s it all about?

At the risk of putting a dampener on the party, I’d like to point a few things out. For a start, compliance with a policy is not the end goal in itself. While the UK policies of the past five years have clearly increased the amount of UK research that is available open access, we do need to ask ourselves ‘so what?’.

What we are not measuring, or indeed even discussing, is the reason why we are doing this.

While the open access policies of other funders such as the Wellcome Trust and the Bill and Melinda Gates Foundation articulate an end goal – to “foster a richer research culture” for the former and “information sharing and transparency” for the latter – the REF2021 policy is surprisingly perfunctory. It simply states: “certain research outputs should be made open-access to be eligible for submission to the next Research Excellence Framework”.

It would be enormously helpful to those responsible for ‘selling’ the idea to our research community if there were some evidence to demonstrate the value in what we are all doing. A stick only goes so far.

It’s really hard, people

Part of the reason why we are having so much difficulty selling the idea to both our research community and the administration of the University is because open access compliance is expensive and complicated, as this survey amply demonstrates.

While there may have been an idea that requiring the research community to provide their work on acceptance would make them more aware of and engaged with Open Access, it seems this has not been achieved. Given that 71% of HEIs reported that AAMs are deposited by a member of staff from professional services, it is safe to say the six years since the Finch Report have not significantly changed author behaviour.

With 335 staff (at 1.0 FTE) recorded as “directly engaged in supporting and implementing OA at their institution”, it is clear that compliance is a highly resource-hungry endeavour. This is driving decision making at institutional level. While “the intent of funders’ OA policies is to make as many outputs freely available as possible”, institutions are focusing on the outputs that are likely to be chosen for the REF (as opposed to making everything available).

I suspect this is ideology meeting pragmatism. Not only are institutions unable to support the overall openness agenda, but these policies also seem to be further underlining the limited reward systems we currently use in academia.

The infrastructure problem

The first conclusion of the report was that “systems which support and implement OA are largely manual, resource-intensive processes”. The report notes that compliance checking tools are inadequate, partly because of the complexity of funder policies and the labyrinth that is publisher embargo policies. It goes on to say the findings “demonstrate the need for CRIS systems, and other compliance tools used by institutions [to] be reviewed and updated”.

This may be the case, but buried in that suggestion are years of work and considerable cost. We know from experience: it has taken us at Cambridge 2.5 years and a very significant investment to link our CRIS system (Symplectic Elements) to our DSpace repository, Apollo. And we are still not there in terms of being able to provide meaningful reports to our departments.

Who is paying for all of this?

When we say ‘open’…

The report touches on what is a serious problem in the process. Because we are obtaining works at time of acceptance (an aspect of the policy Cambridge supports), and embargo periods cannot be set until the date of publication is known, there is a significant body of material languishing under indefinite embargoes waiting to be manually checked and updated.

The report notes that ‘there is no clear preference…as to how AAMs are augmented or replaced in repositories following the release of later versions’. Given the lack of any automated way of checking this information, the problem is unmanageable without huge human intervention.

At Cambridge we offer a ‘Request a Copy’ service which at least makes the works accessible, but the situation is already out of control and is compounding as time progresses.

Solutions?

We really need to focus on sector solutions rather than each institution investing independently. Indeed, the report’s second-to-last conclusion is that “the survey has demonstrated the need for publishers, funders and research institutions to work towards reducing burdensome manual processes”. One such solution, which gets only a single mention in the report, is the UK Scholarly Communications Licence as a way of managing the host of licences.

Right at the end of the report, in the second-to-last point, something very close to my heart was mentioned: “Finally, respondents highlighted the need for training and skills at an institutional level to ensure that staff are kept up to date with resources and tools associated with OA processes.” Well, yes. This is something we have been trying to address at a sector level, and the solutions are not yet obvious.

This report is an excellent snapshot and will allow institutions such as ours some level of benchmarking. But it does highlight that we have a long way to go.

Published 14 June 2018
Written by Dr Danny Kingsley

How open is Cambridge?

As part of Open Access Week 2016, the Office of Scholarly Communication is publishing a series of blog posts on open access and open research. In this final OAWeek post Dr Arthur Smith analyses how much Cambridge research is openly available.

For us in the Office of Scholarly Communication it’s important that, as much as possible, the University’s research is made Open Access. While we can guarantee that research deposited in the University repository Apollo will be made available in one way or another, it’s not clear how other sources of Open Access contribute to this goal. This blog post is an attempt to quantify the amount of Cambridge research that is openly available.

In mid-August I used Cottage Labs’ Lantern service in an attempt to quantify just how open the University’s research really is. Lantern uses DOIs, PMIDs or PMCIDs to match publications in a variety of sources such as CORE and Europe PMC, to determine the Open Access status of a publication – it will even try to look at a publisher’s website to determine an article’s Open Access status. This process isn’t infallible, and it relies heavily on DOI matching, but it provides a good insight into the possible sources of Open Access material.

To determine the base list of publications against which the analysis could be run, I queried Web of Science (WoS) and Scopus to obtain a list of publications attributed to Cambridge authors. In 2015, the University published 9069 articles, reviews and conference papers according to Web of Science. Scopus returned a slightly lower figure of 7983 publications. Combining these two publication lists, and filtering to only include records with a DOI, produced one master list of 9714 unique publications (that’s ~26 publications/day!).
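As a rough illustration of that de-duplication step, here is a minimal sketch assuming both databases have been exported to CSV with a DOI column; the file and column names are my assumptions, not the actual exports used.

```python
import pandas as pd

# Hypothetical file names for the two 2015 exports; the real analysis used
# 9069 Web of Science records and 7983 Scopus records.
wos = pd.read_csv("wos_cambridge_2015.csv", dtype=str)
scopus = pd.read_csv("scopus_cambridge_2015.csv", dtype=str)

# Normalise DOIs so the same article matches across the two sources.
for df in (wos, scopus):
    df["DOI"] = df["DOI"].str.strip().str.lower()

# Combine the lists, keep only records that have a DOI, and de-duplicate.
combined = pd.concat([wos, scopus], ignore_index=True)
master = combined.dropna(subset=["DOI"]).drop_duplicates(subset="DOI")

print(len(master))  # ~9714 unique publications in the 2015 analysis
```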

In 2015 the Open Access team processed 2746 HEFCE-eligible submissions, so naïvely speaking, the University achieved a 28.3% HEFCE compliance rate. That’s not bad, especially because the HEFCE policy had not yet come into force, but what about other Open Access sources? We know that other universities in the UK are also depositing papers in their repositories, and some researchers make their work ‘gold’ Open Access without going through the Open Access team, so the total amount of Open Access content must be higher.

In addition to the Lantern analysis, I also exported all available DOIs from Apollo and matched these to the DOIs obtained from WoS/Scopus. WoS also classifies some publications as being Open Access, and I included these figures too. If a publication was found in at least one potentially Open Access source I classified it as Open Access. Here are the results:

Figure 1. Of 9714 DOIs analysed by Lantern, 51.8% appear in at least one open access source.
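In code, the “at least one source” rule described above is just set membership. A minimal sketch, assuming each source has already been reduced to a set of normalised DOIs (the variable names and example DOIs are mine, not Lantern’s):

```python
# Illustrative sets of normalised DOIs found in each potentially open access
# source; in practice these come from the Apollo export, the Lantern results
# and the Web of Science open access flag.
apollo_dois = {"10.1000/example.1"}
lantern_dois = {"10.1000/example.1", "10.1000/example.2"}
wos_oa_dois = {"10.1000/example.3"}

open_access_dois = apollo_dois | lantern_dois | wos_oa_dois

def is_open_access(doi: str) -> bool:
    """A publication counts as open access if it appears in at least one source."""
    return doi.strip().lower() in open_access_dois

# Share of the master DOI list found in at least one open access source.
master_dois = ["10.1000/example.1", "10.1000/example.2", "10.1000/example.4"]
oa_share = sum(is_open_access(d) for d in master_dois) / len(master_dois)
print(f"{oa_share:.1%} found in at least one open access source")
```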

It is pleasing that our naïve estimate of 28.3% HEFCE compliance closely matches the number of records found in Apollo (26.2%). The discrepancy is likely due to a number of factors, including publications received by the Open Access Team that were actually published in 2014 or 2016, but submitted in 2015, and Apollo records that don’t have a publisher DOI to match against. However, the most important point to note is the overall open access figure – in 2015 more than 50% of the University’s scholarly publications with a DOI were available in at least one “open access” source.

Let’s dig a little deeper into the analysis. Using everyone’s favourite metric, the journal impact factor (JIF), the average JIF of articles in Apollo was 5.74 compared to 4.33 for articles that were not OA. Other repositories and Europe PMC achieved even higher average JIFs. On average, Open Access publications by Cambridge authors have a higher JIF (6.04) than articles that are not OA, which suggests that researchers are making value judgements on what to make Open Access based on journal reputation. If a paper appears in a low(er) impact journal, it’s less likely to be made Open Access. Anecdotally this is something we have experienced at Cambridge.

Figure 2. Average 2015 JIF of papers classified according to their open access status.

The WoS and Scopus exports contain citation information at the article level, so we can also look at direct citations received by these publications (up to 16 August 2016) rather than relying on the JIF. I found that Open Access articles, on average, received 1.5 to 2 more citations than articles that are not Open Access. However, is this because authors are making their higher impact articles Open Access (which one might expect to receive more citations anyway) and are not bothering with the rest? Or is this effect due entirely to the greater accessibility offered by Open Access publication? Could the differences arise because of different researcher behaviour across different disciplines?
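The comparison itself is straightforward once each publication carries a citation count and an open access flag. A minimal sketch with invented numbers (the column names are assumptions):

```python
import pandas as pd

# Illustrative frame standing in for the merged WoS/Scopus export: one row
# per publication, with citations to 16 August 2016 and the open access flag.
papers = pd.DataFrame({
    "doi": ["10.1000/a", "10.1000/b", "10.1000/c", "10.1000/d"],
    "citations": [5, 0, 3, 1],
    "open_access": [True, False, True, False],
})

# Mean citations per group, and the gap between the two groups.
mean_citations = papers.groupby("open_access")["citations"].mean()
gap = mean_citations.loc[True] - mean_citations.loc[False]
print(mean_citations)
print(f"Open Access articles receive {gap:.1f} more citations on average")
```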

My feeling is that we have reached a turning point – the increased citation rates of Open Access material are not caused by the articles being Open Access, as these articles would have naturally received more citations anyway. Instead of looking at formal literature citations, the benefits of Open Access need to be measured outside of academia, in areas that would not contribute to an article’s citations.

Figure 3. Average citations received by papers according to their open access source.

Breaking it down by the source of Open Access reveals that articles that appear in other repositories receive significantly more citations than those from any other source. This potentially suggests that collaborative papers between researchers at different institutions are likely to have greater impact than papers produced solely at one institution (Cambridge); however, a more thorough analysis that looks at author affiliations would be needed to confirm this.

If we focus on the WoS citation distribution the difference in average citations becomes clearer. Of 8348 WoS articles, not only are there fewer Open Access articles with no citations (14% vs 17%), but Open Access articles also receive more citations in general.

Figure 4. Citation distribution of papers found in WoS depending on their open access status.
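A sketch of how such a distribution comparison can be computed, again with invented numbers standing in for the real WoS citation counts:

```python
import pandas as pd

# Illustrative stand-in for the 8348 WoS records: citation counts plus the
# open access flag derived earlier (the values here are invented).
papers = pd.DataFrame({
    "citations":   [0, 2, 7, 0, 1, 4, 0, 3],
    "open_access": [True, True, True, False, False, False, False, True],
})

# Share of uncited papers and the normalised citation distribution per group.
for oa_flag, group in papers.groupby("open_access"):
    label = "OA" if oa_flag else "non-OA"
    uncited_share = (group["citations"] == 0).mean()
    distribution = group["citations"].value_counts(normalize=True).sort_index()
    print(f"{label}: {uncited_share:.0%} uncited")
    print(distribution)
```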

What can we take away from this analysis? Firstly, Lantern is a valuable tool for discovering other sources of Open Access content. It identified over a thousand articles by Cambridge researchers in other institutional repositories that we did not know existed. When it comes time for the next REF, these other repositories may prove a vital lifeline in determining whether a paper is HEFCE-compliant.

Secondly, more than 50% of the University’s 2015 research publications are potentially Open Access. Hopefully a similar analysis of 2016’s papers will show that even more of the University’s research is Open Access this year. And finally, although Open Access articles receive more citations than articles that are not Open Access, it is no longer clear whether this is caused by the article being Open Access, disciplinary differences, or if authors are more likely to make their best work Open Access.

Published 28 October 2016
Written by Dr Arthur Smith
