Compliance is not the whole story

Today, Research England released Monitoring sector progress towards compliance with funder open access policies, the results of a survey they ran in August last year in conjunction with RCUK, the Wellcome Trust and Jisc.

Cambridge University was one of the 113 institutions that answered a significant number of questions about how we were managing compliance with various open access policies, what systems we were using and our decision making processes. Reading the collective responses has been illuminating.

The rather celebratory commentary from UKRI has focused on the compliance aspect – see Research England’s press release, Over 80% of research outputs meet requirements of REF 2021 open access policy, and the post by the Executive Chair of Research England, David Sweeney, Open access – are we almost there for REF?

What’s it all about?

At the risk of putting a dampener on the party, I’d like to point out a few things. For a start, compliance with a policy is not the end goal of the policy in itself. While the UK policies of the past five years have clearly increased the amount of UK research that is available open access, we do need to ask ourselves ‘so what?’.

What we are not measuring, or indeed even discussing, is the reason why we are doing this.

While the open access policies of other funders such as the Wellcome Trust and the Bill and Melinda Gates Foundation articulate the end goal – to “foster a richer research culture” in the former and “information sharing and transparency” in the latter – the REF2021 policy is surprisingly perfunctory. It simply states: “certain research outputs should be made open-access to be eligible for submission to the next Research Excellence Framework”.

It would be enormously helpful to those responsible for ‘selling’ the idea to our research community if there were some evidence to demonstrate the value in what we are all doing. A stick only goes so far.

It’s really hard, people

Part of the reason why we are having so much difficulty selling the idea to both our research community and the administration of the University is because open access compliance is expensive and complicated, as this survey amply demonstrates.

While there may have been an idea that requiring the research community to provide their work on acceptance would make them more aware of and engaged with Open Access, it seems this has not been achieved. Given that 71% of HEIs reported that AAMs are deposited by a member of staff from professional services, it is safe to say the past six years since the Finch Report have not significantly changed author behaviour.

With 335 staff at 1.0 FTE recorded as “directly engaged in supporting and implementing OA at their institution”, it is clear that compliance is a highly resource-hungry endeavour. This is driving the decision making at institutional level. While “the intent of funders’ OA policies is to make as many outputs freely available as possible”, institutions are focusing on the outputs that are likely to be chosen for the REF (as opposed to making everything available).

I suspect this is ideology meeting pragmatism. Not only are institutions unable to support the overall openness agenda, but these policies also seem to be further entrenching the limited reward systems we currently use in academia.

The infrastructure problem

The first conclusion of the report was that “systems which support and implement OA are largely manual, resource-intensive processes”. The report notes that compliance checking tools are inadequate partly because of the complexity of funder policies and the labyrinth that is publisher embargo policies. It goes on to say the findings “demonstrate the need for CRIS systems, and other compliance tools used by institutions be reviewed and updated”.

This may be the case, but buried in that suggestion are years of work and considerable cost. We know from experience: it has taken us at Cambridge 2.5 years and a very significant investment to link our CRIS system (Symplectic Elements) to our DSpace repository, Apollo. And we are still not there in terms of being able to provide meaningful reports to our departments.

Who is paying for all of this?

When we say ‘open’…

The report touches on what is a serious problem in the process. Because we are obtaining works at time of acceptance (an aspect of the policy Cambridge supports), and embargo periods cannot be set until the date of publication is known, there is a significant body of material languishing under indefinite embargoes waiting to be manually checked and updated.

The report notes that ‘there is no clear preference…as to how AAMs are augmented or replaced in repositories following the release of later versions’. Given the lack of any automated way of checking this information the problem is unmanageable without huge human intervention.

At Cambridge we offer a ‘Request a Copy’ service which at least makes the works accessible, but this is an already out-of-control situation that compounds as time progresses.


We really need to focus on sector solutions rather than each institution investing independently. Indeed, the penultimate conclusion is that “the survey has demonstrated the need for publishers, funders and research institutions to work towards reducing burdensome manual processes”. One such solution, which gets a sole mention in the report, is the UK Scholarly Communication Licence as a way of managing the host of licences.

Right at the end of the report, in the second-to-last point, something very close to my heart was mentioned: “Finally, respondents highlighted the need for training and skills at an institutional level to ensure that staff are kept up to date with resources and tools associated with OA processes.” Well, yes. This is something we have been trying to address at a sector level, and the solutions are not yet obvious.

This report is an excellent snapshot and will allow institutions such as ours some level of benchmarking. But it does highlight that we have a long way to go.

Published 14 June 2018
Written by Dr Danny Kingsley
Creative Commons License

What’s new in OA?

The world of Open Access moves fast and it can be difficult to keep up. We run regular updates for our community here at Cambridge and following a recent webinar, figured a blog about it might be a good idea too. Strap yourselves in, this is a bumpy ride.

Sweden draws the line

After a breakdown in negotiations, the Bibsam Consortium in Sweden cancelled the agreement with Elsevier on 16 May. It is anticipated that after 1 July 2018, Swedish universities will not have access to new articles in Elsevier’s journals. Articles published before this date will remain accessible.

In his blog, The circuitous road towards open access: Swedish universities to pull the plug on Elsevier, Ole Petter Ottersen, Rector of the Karolinska Institute in Sweden, noted: “Almost 600 years ago the development of the printing press led to dramatic changes in how knowledge was spread and communicated. This did not happen without opposition. Today digitalization opens for an equally dramatic and welcome change towards the democratization of knowledge. …  It’s time that knowledge becomes a public good.”

Europe no-deals

Sweden is following a growing European trend in relation to pulling out of publishing deals.

On 30 March this year, the French national consortium representing 250 academic institutions cancelled subscriptions to SpringerNature journals. Despite expectations that the publisher would cut access, Springer is maintaining access to journals for French institutions while discussions continue.

Two weeks earlier, on 12 March, the Dutch consortium VSNU announced that “Dutch universities and Royal Society of Chemistry Publishing (RSC) have been unable to reach a new agreement on access to scientific journals”. In anticipation of losing access to the material, the VSNU has advised that researchers use alternative ways to access materials, including Unpaywall, the Open Access Button and requesting a copy from the author or from the library. At the end of the list, ‘if all else fails’, they suggest using Sci-Hub, noting that “the use of Sci-Hub is considered as an illegal act”.

There were concerns that German researchers would lose access to Elsevier materials from January 2017 after negotiations broke down and subscriptions stopped being paid at the end of 2016. But in January this year, Nature reported that German universities still had access to Elsevier journals as discussions continued. It has been estimated that across the country, libraries are saving more than €10 million (£8.7 million) a year as a result of cancelling the subscriptions.

While not exactly ‘new news’, it is worth mentioning here that in October 2016 the French Law for a Digital Republic came into force, including Article 30, which deals with Open Access and creates a legal right for authors to archive an OA copy of their work, even if they have granted an exclusive right to publish.

Springer sinks

On 9 May, Springer Nature was due to be listed on the German midcap index, offering €1.6 billion in shares. However, the day before, the float was cancelled due to ‘weak demand’.

An analysis of the prospectus, recently published in the Times Higher Education, identified plans for Springer Nature to link the cost of Article Processing Charges for open access to a journal’s Impact Factor. This is interesting, to say the least, and indicates a move by large publishers to consolidate payments for open access into an effective new type of Big Deal. This approach also further cements the current flawed academic reward system, despite Springer recently signing the Declaration on Research Assessment (DORA). The messages being given to the academic/library community are in stark contrast to those being sent to potential investors.

ResearchGate shenanigans

ResearchGate is an academic social networking site on which researchers post copies of their works, often in breach of the copyright agreements they have signed with their publishers. In October last year, Elsevier and the American Chemical Society filed a lawsuit in Germany against ResearchGate, alleging copyright infringement on a mass scale. In November, ResearchGate restricted access to 1.7 million papers on its site. The court case began on 18 April in Germany, with the intention to “establish clarity on the legal responsibility of ResearchGate regarding copyright infringements”.

Not all publishers have agreed with this combative approach to ResearchGate. A day after the court case began, an agreement between Springer, Cambridge University Press, Thieme and ResearchGate was announced. The agreement is to work together on the sharing of articles on the scholarly collaboration platform “in a way that protects the rights of authors and publishers”.

All together now

The 1 April marked several important happenings in the open access space. The former Research Councils UK and the Higher Education Funding Council for England merged under the single banner of UK Research and Innovation (UKRI), with the latter rebranded as Research England. Apart from the intensely irritating breaking of every link to RCUK webpages, this is a very positive step. Even before starting operations, the organisation was flagging a review of open access, including questioning whether it would continue to support the payment of hybrid article processing charges.

The 1 April also marked the first time HEFCE/Research England was implementing the ‘three months from acceptance’ rule for compliance of works for the Research Excellence Framework (REF). This timeframe for depositing works and making them open access was in the original policy, but in the first two years of operation of the policy, HEFCE relented to pressure from institutions concerned they were not prepared and amended the policy to ‘three months from publication’. This has been a tricky balancing act for those working in institutions (such as the Office of Scholarly Communication), with many opting not to inform their research community of the extended time frame from 1 April 2016 because of concerns about confusion. It is a relief to have the policy operating as originally written.

Speaking of REF: in March 2017, HEFCE conducted a consultation on the REF, and the initial outcomes were made available in September last year. The guidance for the 2021 REF is not far from being released for feedback, and signs are that there might be some movement on the question of the eligibility of arXiv as a repository. Those interested in this issue might find worth a read “ArXiv and the REF open access policy” by yours truly and Katie Shamash, which presents the case that articles deposited in arXiv are, in general, compliant with the requirements of the HEFCE policy.

Wellcome Trust consultation

Not to be left out, the Wellcome Trust is coming to the end of a consultation on its open access policy. The Wellcome Trust, along with the National Institutes of Health, led the world by implementing open access requirements in 2005. This is the first wide-scale review of the policy, and the report from the review will be released by the end of 2018.

Questions being asked of funders, publishers and institutions have focused on hybrid journals. There has also been discussion of the merits or otherwise of the Wellcome Trust centralising negotiations with publishers and managing the block grants centrally. Some respondents, such as SCONUL, have already made their responses public.

Responsible metrics?

As we blogged about recently, a considerable amount of open access activity is tied into reproducibility issues in research. Universities UK is involved in a Forum for Responsible Research Metrics in conjunction with HEFCE to address these questions. At an event in February this year: ‘The turning tide: A new culture of responsible metrics for research’ researchers spoke about the impact of metrics on their careers and health. Spoiler alert: it is not good.

The first half of last year saw a spate of organisations signing the San Francisco Declaration on Research Assessment (DORA), including Nature (April 2017), Imperial College London and Birkbeck, University of London (both in February 2017). However, University College London gets the prize, having signed up in 2015. In February 2018 all seven UK research councils signed up to DORA.

Early this year DORA revamped its steering committee. In addition, funders (ASCB, Cancer Research UK, the European Molecular Biology Organization and the Wellcome Trust) and publishers (the Company of Biologists, eLife, F1000, Hindawi and PLOS) invested in DORA to allow the hiring of a full-time community manager.

Data monetisation

There has been an increase in the offerings by publishers to manage data. Mendeley (owned by Elsevier) has long been in this space, but in 2018 Mendeley Data announced a ‘comprehensive research data platform for institutions‘ and ‘superior data management for researchers‘. Note that under the ‘Data Linking’ heading it only offers to link “datasets in repositories with research articles on ScienceDirect”, which suggests limited value of the ‘service’.

Elsevier is experimenting with ways to monetise freely available data. DataSearch allows people to “Search for research data across domains and types, from many domain-specific, cross-domain and institutional data repositories”. The FAQs list the repositories that are indexed, including Cambridge’s very own Apollo. Because our metadata is available under a CC0 licence, there is nothing we can do about this. The FAQs also state: “At the moment, DataSearch is not a commercial product.” There is, of course, no guarantee that this will remain the status quo.

But Elsevier is not the only company moving into this space. Since 22 March, Springer Nature have been offering ‘Research Data Support’, which for £265 + VAT will deposit up to 50GB of data into figshare – a commercial repository owned by Digital Science, which shares a parent company with Springer Nature. The companies insist they are entirely separate organisations.

Ecosystem takeover

If this is all starting to sound a little incestuous, then you are on the right track. As I am arguing in an upcoming Group Editorial for the Journal of Librarianship and Scholarly Communication:

There has been a redirection of business strategy by some academic publishing companies to develop portfolios that address the entire research process. Rather than adjusting workflows and internal processes, several companies are moving away from publishing into scholarly infrastructure: the tools and services that underpin the scholarly research life cycle, many of which are geared toward data analytics. This has been effected through an aggressive acquisition program in the case of Elsevier, and through the development of new products in the case of Digital Science. In both cases, the individual products across the portfolio retain their own distinctive branding.

Possibly the most dramatic way to illustrate the extent of the situation is a graphic showing where Elsevier-owned products sit throughout the research lifecycle, appearing in Rent Seeking and Financialization strategies of the Academic Publishing Industry – Publishers are increasingly in control of scholarly infrastructure and why we should care- A Case Study of Elsevier.

This situation requires vigilance. Infrastructure is the next big battleground.

Stay up to date

Remember, the Office of Scholarly Communication tries to make our work as accessible as possible to all. In addition to this blog, we have a sister blog called Open Research: Adventures from the frontline.

We publish two monthly newsletters – KaleidOSCope is focused on scholarly communication more broadly and the Research Data Newsletter keeps people up to date on data issues and opportunities.

Many of our presentations are filmed and uploaded to our YouTube channel – and there is a list of our recordings of past events including all the presentations from our TDM Symposium, our Open Access Week ‘getting published’ events and Engaging Researchers in Good Data Management.

Our presentations are freely available from Apollo, as are the slides from our training sessions. Our Twitter feeds are very popular, Cambridge Open Access @CamOpenAccess and Cambridge Research Data Management @CamOpenData.

You have no excuse!

Published 4 June 2018
Written by Dr Danny Kingsley
Creative Commons License

Hot topics – research integrity and open research

Research integrity is the current discussion topic at many levels in the research sector. This week, the Council of the European Union will adopt conclusions on the European Open Science Cloud, including the Open Science agenda. To complement this, the League of European Research Universities (LERU) released its advice paper Open Science and its role in universities: A roadmap for cultural change, which discusses the cultural change needed for universities – and other stakeholders – to embrace it.

This is hot on the heels of two events I spoke at last week. The first was “Towards cultural change in data management – data stewardship in practice” held at TU Delft in the Netherlands, and the second was Nurturing a Culture of Responsible Research in the Era of Open Science, coordinated by LERU and held at Geneva University’s Campus Biotech. Both events were attended by a mix of policy makers, researchers, librarians and administrators.

The slides and accompanying tweets for my TU Delft talk: The ‘end of the expert’: why science needs to be above criticism and the slides from my LERU talk: Institutional Framework and Responsibilities: Facing Open Science’s challenges and assuring quality of research are available.

My talks focused on questions of research integrity – that in this period of ‘post-truth’, where “Britain has had enough of experts” and a US lawyer “alone will decide what is and isn’t acceptable science” – science is very much under attack.

In this environment it is important that research remains above criticism, and that opens up the question of reproducibility of research. The late Professor Stephen Hawking noted “When public figures abuse scientific argument, citing some studies but suppressing others, to justify policies that they want to implement for other reasons, it debases scientific culture”.

We have been here before

This harks back to observations made by sociologist Robert K. Merton in a 1942 essay, “The Normative Structure of Science”, in which he noted “Incipient and actual attacks upon the integrity of science” and stated that “An institution under attack must re-examine its foundations, restate its objectives, seek out its rationale”. Now, 76 years later, the world is once again in a similar position.

Discussion of a ‘reproducibility crisis’ has been circulating for some years now. In 2016, Nature published the results of a survey of researchers asking about this issue, where 90% of the respondents said there was a ‘significant’ or ‘slight’ crisis. The study also asked whether people had been able to replicate results (others’ or their own) and whether they had published their inability to replicate the work.

Reproducibility studies – where specific studies are chosen and attempts made to reproduce the results – support the argument that published research is not always repeatable. A reproducibility study of 100 psychology experimental and correlational studies showed a substantial decline in replication effects: 97% of the original studies had significant results (p < .05), but only 36% of the replications did.

This issue is significant enough for governments and large bodies to take notice. A UK enquiry into Research Integrity that was halted for a few months last year during the general election has been revived. In December 2017, the US National Academies of Science established a committee on Reproducibility and Replicability in Science.

However, an opinion piece published in March in the Proceedings of the National Academies of Science put forward an alternative view, asking “Is science really facing a reproducibility crisis, and do we need it to?”. The piece tracks the recent increase in the frequency of use of the ‘crisis narrative’ in published research. It then asks how common fabricated, false, biased, and irreproducible findings are. The conclusion is there has not been a measurable increase, and that “Instead of inviting greater respect for and investment in research, [the crisis narrative] risks discrediting the value of evidence and feeding antiscientific agendas.”

Either way, this is a narrative currently experiencing a high level of interest.

Open Research to the rescue?

One of the proposed solutions to this issue is to increase transparency in the research process – opening up research so that each step of the research process is itself a citable research output that can be recognised. The term ‘Open Research’ – or ‘Open Science’ (in Europe, meaning all types of research) or ‘Open Scholarship’ – has many definitions. A small sample includes those from FOSTER, the Wikipedia article on Open Science, and the Open innovation, open science, open to the world report.

Indeed, there are so many different definitions of Open Research/Science that there is now an attempt to define the definitions. One list attempting to collect the declarations and position statements on Open Research from around the world has over 90 entries. However, common to the majority of them is that they are effectively based on Robert Merton’s 1942 work. The Mertonian norms of science are:

  • Universalism
  • Communalism
  • Disinterestedness
  • Organised scepticism

An example of how Open Research is being put forward as a solution is the recent National Association of Scholars report, “The Irreproducibility Crisis of Modern Science”, which recommends: “Researchers should make their data available for public inspection after publication of their results”.

The European Commission Open Science Monitor has three core principles: Open Scholarly Communications, Open Access publications and Open Data. But this is easier said than done. While making underlying data available has been a requirement of some publishers and funders for some years now, and Open Data is becoming less contentious, other aspects of ‘openness’ are still new concepts to the majority of the research community.

There are plenty of actions individual researchers can take to make their work more open. Bianca Kramer and Jeroen Bosman have published 17 open science practices spanning the whole research workflow, including examples of tools that can help, as part of their 101 Innovations project.

As we have noted for some time at the Office of Scholarly Communication, making data and other non-traditional research outputs available is difficult. This requires training of our research community in how to research openly from the beginning of their research activity, rather than asking them to be open as an afterthought.

The institutional imperative

For all of the government enquiries, funder requirements and changing publisher processes, at some stage research institutions need to be part of the process to implement change. In 2012, the Royal Society Science Policy Centre report Science as an open enterprise noted that “Universities and research institutes should play a major role in supporting an open data culture”. A 2016 paper in Royal Society Open Science titled The natural selection of bad science argued: “Improving the quality of research requires change at the institutional level”.

But institutions are slow to act, and reluctant to step outside the global esteem and reward systems that scaffold the research community. In 2017, a Harvard professor, in a Nature column titled Faculty promotion must assess reproducibility, wrote that “The spectre of irreproducible research haunts the biomedical community”, and argued that “one group that must step up is that to which I belong: academic leadership”.

So are institutions meeting this challenge?

European moves towards Open Research

Well, some are. TU Delft is on the record as leading in this area – their TU Delft Strategic Framework has implications for Open Science. Even more impressive is Utrecht University, which “aims to operate at the forefront of Open Science” according to its University Strategic Plan 2016-2020.

Both universities, of course, are based in The Netherlands, which held the presidency of the European Union during the first half of 2016 with an agenda that included “Europe as an innovator and job creator”, achievable through “better alignment between academia and business through open access and better use of data”. This approach has clearly had deep impacts across the country.

Open Research in the UK

We are behind the curve in the Open Research space in the UK, but there are some moves in this direction.

The University of Reading has recently closed the consultation on its vision statement on Open Research. Discussion with the coordinators of this consultation underlined the question of language in this area. Because many of the terms used in scholarly communication are vernacular, interpretations of them vary (‘publish’, anyone?).

There is considerable confusion amongst the research community over questions of openness, even beyond the language question. It is very common for researchers to throw accusations against open access that reflect problems with the whole scholarly communication system, as I argued in 2015 with my research colleague Mary Anne Kennan in “Open access: the whipping boy for problems in scholarly publication“.

Cambridge is working towards a position statement on Open Research. When preparing for the accompanying consultation we are currently running at Cambridge, we considered the ‘Open Typology’ proposed by Sheila Corrall and Stephen Pinfield in their 2014 paper, “Coherence of ‘Open’ Initiatives in Higher Education and Research: Framing a Policy Agenda”. This breaks the areas of ‘open’ into three categories: Open Content, Open Systems and Open Development. Given that the primary focus, at least initially, is on the overarching approach behind the Open Access and Research Data Management policy frameworks, the Cambridge consultation concentrates on Open Content and Open Systems. Open Development may come later, but the conversations are not yet mature enough to include it at this stage.

The consultation is still underway and to date has attracted a strong response. The outcomes will be written up and shared after analysis.

However, while decidedly a step in the right direction, a position statement is only the beginning, as we have seen already over the past couple of years. The implementation is where the hard work begins. Change is slow in this space.

Published 31 May 2018
Written by Dr Danny Kingsley
Creative Commons License

Post script

I have spoken and written about the Open Research question at length. If you are interested, my keynote to the 2017 Conference on Open Access Scholarly Publishing – Is the tail wagging the dog? Perversity in academic rewards is available on video, as is my keynote to the Munin Open Access Conference in 2016 Reward, reproducibility and recognition in research – the case for going Open.

I have also written a series of blogs on the argument for Open Research:

There are also a couple of blogs about the Open Agenda at Cambridge: