
Disruptive innovation: notes from SCONUL winter conference

On Friday 27 November Danny Kingsley attended the SCONUL Winter Conference 2015, which addressed the theme of disruptive innovation and looked at the changes in policy and practice that will shape the scholarly communications environment for years to come. This blog is a summary of her notes from the event. The hashtag was #sconul15.

Disruptions in scholarly publishing

Dr Koen Becking, President of the Executive Board, Tilburg University, spoke first. He is the lead negotiator with the publishers in the Netherlands. Things are getting tight as we count down to the end of the year, given the Dutch negotiations with Elsevier (read more in ‘Dutch boycott of Elsevier – a game changer?‘).

Koen asked: what is the role of a university – is it knowledge as an end in itself, knowledge in relation to learning, or knowledge in relation to professional skills? He said that 21st century universities should face society. While Tilburg University is still tied to its traditional roots, it is now focused on the idea of the ‘third generation university’ – the idea that its work needs to have an impact on society.

The Dutch are leading the open access fight, and Koen said they may look at legislation to enforce the government’s goal of open access to research articles of 40% by 2016 and 100% by 2024. [Note that the largest Dutch funder NWO has just changed its policy to say funds can no longer be used to pay for hybrid OA and that green OA must be available at the moment of publication.]

Koen noted that the way the Vice Chancellors got involved in the publisher discussions in the Netherlands was that the library director came to him to ask about increasing the subscription budget, and he asked why it was going up so much given the publishers’ profit levels. Money talks.

Managing the move away from historic print spend

Liam Earney from Jisc said there were several drivers for the move away from historic print spend, and that we need models that are transparent, equitable, sustainable and acceptable to institutions. Jisc has been running a survey on acceptable metrics for cost allocation (note that Cambridge has participated in this process) and will shortly launch a consultation with institutions on new proposals.

Liam noted that their research found profound differences, both across and within Jisc bands, in what institutions pay for the same material – sometimes differing by hundreds of thousands of pounds for similar institutions to access the same content.

They also worked out that if they took a mix of historical print spend and a new metric it would take over 50 years to migrate to a new model. This is not realistic.

Jisc is supported by an expert group of senior librarians (including members at Cambridge) who are working on an alternative. Liam noted that historical print spend is harmful to the ability of a consortium to negotiate coherently. Any new solution needs to meet the needs of academics and institutions.

Building a library monograph collection: time for the next paradigm shift?

Diane Bruxvoort from the University of Aberdeen, who comes from the US originally, talked about collaborative collection development – we can move together. Her main argument was that for years we have built libraries on a ‘just in case’ model and we can no longer afford to do that. We need to refine our ‘just in time’ purchasing to take care of faculty requests, while also having another strand working across the sector to develop the ‘for the ages’ library.

She mentioned the Florida Academic REpository (FLARE), the statewide shared collection of low-use print materials from academic libraries in Florida. Libraries look at what is in FLARE and add the FLARE holdings to their catalogues. It is a one-copy repository for low-use monographs. The Digital Public Library of America (DPLA) is open to any organisation with digitised content, which can be put into the DPLA portal; this deals with the problem of digitised items all being siloed.

Libraries are also taking books off the shelf when there is an electronic version. This is a pragmatic decision not made because lots of people are reading the electronic one preferentially, it is simply to save shelf space.

Diane noted a benefit of the UK compared to the US is its size – it is possible to do collaborative work here in ways you can’t in the US. We need collaborative storage and to create more opportunities for collaborative collection development.

The Metric Tide

Professor James Wilsdon of the University of Sussex spoke about the HEFCE report he contributed to, The Metric Tide: Report of the Independent Review of the Role of Metrics in Research Assessment and Management.

This report looked at responsible uses of quantitative data in research management and assessment. He said we should not turn our backs on big data and its possibilities, but we know from our experience of research systems that these can be used as blunt tools. He felt that across the community at large the discussion about metrics was unhelpfully polarised. The debate is open to misunderstanding and needs a more sophisticated understanding of the ways metrics can be used more responsibly.

The consensus is that peer review is the ‘least worst’ form of academic governance that we have. Metrics should support, not supplant, academic judgement. This underpins the design of assessment exercises like the REF.

James noted that the metrics review team was getting back together that afternoon to discuss ‘section d’ in the report. He referred to this as being ‘like a band reunion’.

A new era for library publishing? The potential for collaboration amongst new university presses

This workshop was presented by Sue White, Director of Computing and Library Services and Graham Stone, Information Resources Manager, University of Huddersfield.

Sue talked about the Northern Collaboration of libraries looking at joining forces to create a publishing group, starting with a meeting in October 2014. There is a lot of uncertainty in the landscape, with a big range of activity from well-established university presses to libraries doing no publishing at all. She said the key challenge to the idea of a joint press was the competition between institutions, but they decided the idea merited further exploration.

Discussions were around the national monograph strategy roadmap, which advocated university publishing models. The Northern Collaboration took a discussion paper to Jisc, and Jisc suggested three areas of activity:

  • Benchmarking and data gathering, to see what is happening in the UK.
  • Identifying best practice and possible workflow efficiencies – common ground.
  • Exploring the potential for a library publishing coalition.

The project is about sharing and providing networks for publishing ventures. In the past couple of days Jisc has agreed to take the first two areas forward and welcomes input on taking them forward.

Graham then spoke about the Huddersfield University Press, which has been around since 2007 but was re-launched with an open access flavour. It has been publishing open access materials for three to four years, in three formats: monographs, journals and sound recordings.

The principles governing the Press are that everything is peer reviewed and, as a general rule, everything should be open access; they publish via the (EPrints) open access repository, which gets lots of downloads. The Press is managed by the library but led by the academics. The business model is not-for-profit, as this is scholarly communication; any surplus would be reinvested in the Press. In the last four years they have published 12 monographs, of which six are open access.

Potential authors have to come with their own funding – this tends to be an institution- or funder-sponsored arrangement. The proposal form has a question: ‘how is this going to be funded?’ This point is ignored for the peer review process: having money means a proposal will be looked at, but it does not guarantee publication. The money pays for a small print run and copy editing, not staff costs. A monograph of about 70,000 words costs in the region of £3,000–£4,000.

Seven journals are published in the repository – there is an overlay on the repository, preserved in Portico. The journals are discoverable through Google (via the repository’s OAI-PMH compliance) and through library webscale discovery, and the Press has membership of the DOAJ. Their ‘Teaching and Lifelong Learning’ journal ticks every box on DOAJ.

The enablers for this Press have been senior support in the university at DVC level, and the capacity and enthusiasm of an Information Resources Manager to absorb the Press into an existing role, along with an editorial board drawing people from across the institution. The Press is operating on a shoestring, which is hard: it is difficult to establish a reputation and to convince potential stakeholders of its impact, and a lack of long-term funding makes it difficult to forward plan.

They also noted that there are not very many professional support networks out there and it would be good to have one. They need specialist help with author contracts and licences.

Who will disrupt the disruptors?

The last talk was by Martin Eve, Senior Lecturer in Literature, Technology and Publishing, Birkbeck, University of London.  This was an extremely entertaining and thought provoking talk. The slides are available.

Martin started with critical remarks about the terminology of ‘disruptive’, arguing that often the word is used so the public monopoly can be broken up into private hands. That said, there are parts of the higher education sector which are broken and need disruption.

Disruption is an old word – from Latin, first used in the 15th century. Now it actually means the point at which an entire industry is upended. What we see now is just a series of small increments. The changes happening in the higher education sector are not technological, they are social, and it is difficult to break that cycle of authority and how it works.

Martin argued that libraries need to be strategic and pragmatic. We have had a century long breakdown of the artificial scarcities in trading of knowledge coming to a head in the digital age. There are new computational practices with no physical or historical analogy – the practices don’t fit well with current understandings. He gave a couple of historical examples where in the 1930s people made similar claims.

The book as a product of scholarly communication is so fetishized that when we want the information we need the real object – we cannot conceive of it in another form.

Universities in the digital age just don’t look like they did before. We have an increasingly educated populace – more people can actually read this stuff so the argument that ‘the public’ can’t understand it is elitist and untrue. Institutional missions need to be to benefit society.

Martin discussed the issues with the academic publication market. A reader always needs a particular article – the traditional discourses around the market play out badly. You don’t know if you need a particular article until you read it and if you do need it you can’t replace it with anything else.

Certain publications can have a rigorous review practice because they are receiving high quality submissions. But they only get high quality submissions if you have lots of them and they get that reputation because of a rigorous review practice. So early players have the advantage.

He noted that different actors care about the academic market in different ways. Researchers produce academic products for themselves – to secure an income and promotion. Publishers frame their services as doing things for authors, but they don’t do enough for readers and libraries. Who pays? Researchers have zero price sensitivity. Libraries are stuck between a rock and a hard place: they have the cash but are told how to spend it. The whole thing is completely dysfunctional as a market. In the academy, reading is underprivileged; authorship is rewarded.

Martin then talked about open access and how it affects the humanities. He noted that monographs are acknowledged as different – e.g. in the HEFCE mandate. There are higher barriers to entry for new publishers – people don’t have a good second book to give away to an OA publisher. There are also different employment practices: for example, creative writers are often employed on a 0.5 contract – they are writing novels and selling them, and commercial publishers get antsy about requirements for open access because there is a crossover with trade books.

The subscription model exists on the principle that if enough people contribute then you have enough centrally to pay for what the costs are. It assumes a rivalrous mode – the assumption is there will always be greedy people who won’t pay in if they don’t get an exclusive benefit.

The Open Library of Humanities (OLH) is funded by a library consortium. It is based on the arXiv funding model and Knowledge Unlatched. Libraries pay into a central fund in the same way as a subscription, but researchers who publish with OLH do not have to be at a funding institution – or at an institution at all. There are 128 libraries financially supporting the model (as of Monday it should be 150). The rates are very low – each one only has to pay about $5 per article. They are publishing approximately 150 articles per year.
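The arithmetic behind this consortial model is easy to sketch. The figures below are the approximate ones quoted in the talk (128 libraries, ~150 articles per year, ~$5 per article per library), not OLH’s actual accounts:

```python
# Rough sketch of the OLH consortial funding arithmetic, using the
# approximate figures quoted in the talk (not OLH's actual accounts).

def per_library_cost(articles_per_year: int, cost_per_article: float) -> float:
    """Annual contribution per supporting library."""
    return articles_per_year * cost_per_article

def total_fund(num_libraries: int, articles_per_year: int,
               cost_per_article: float) -> float:
    """Total central fund raised across the consortium per year."""
    return num_libraries * per_library_cost(articles_per_year, cost_per_article)

annual = per_library_cost(articles_per_year=150, cost_per_article=5.0)
print(f"Per-library contribution: ${annual:.0f} per year")   # 150 articles x $5 = $750

fund = total_fund(num_libraries=128, articles_per_year=150, cost_per_article=5.0)
print(f"Central fund: ${fund:,.0f} per year")                # 128 x $750 = $96,000
```

Seen the other way round, each article is effectively funded at 128 × $5 = $640 across the consortium – with no single library bearing a cost anywhere near a typical article processing charge.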

Published 28 November 2015
Written by Dr Danny Kingsley
Creative Commons License

Where to from here? Open Access in Five Years

As part of the Office of Scholarly Communication Open Access Week celebrations, we are uploading a blog a day written by members of the team. Thursday is a piece by Dr Arthur Smith looking to the future.

Introduction

Academic publishing is not what it used to be. Open access has exploded onto the scene and challenged the established publishing model that has remained largely unchanged for 350 years. However, for those of us working in scholarly communications, the pace of change feels at times frustratingly slow, with constant roadblocks along the way. Navigating the policy landscape of universities, funders and publishers can be maddening, yet we need to remain mindful of how far we have come in a relatively short time. There is no sign that open access is losing momentum, so it’s perhaps instructive to consider the direction we want open access to take over the next five years, based upon the experiences of the past.

So how much is the University of Cambridge publishing, and is it open access? According to Web of Science, the University’s publications increased from 3,000 articles per year in 1980 to more than 11,000 in 2014 (Fig. 1). Over the same period the proportion of gold open access articles has risen steadily since gold OA first appeared in the late 1990s. Thus far in 2015 nearly one in ten articles is available gold open access, although this ignores the many articles available via green routes.


Fig. 1. Publications at the University of Cambridge since 1980 according to WoS (accessed 14/10/2015).


The HEFCE policy

By far the most important development for open access in the UK has been the introduction of HEFCE’s open access policy. As the policy applies to all higher education institutions, it affects every university researcher in the UK. Although the policy doesn’t formally start until April 2016, progress so far has been slow (Fig. 2). We believe that less than a third of the University’s articles published today are currently compliant with the HEFCE policy, and despite a strong information campaign, our article submission rate has stagnated at around 250 articles per month, well off the monthly target of 930.


Fig. 2. Publications received to the University of Cambridge open access service. The target number of articles per month is 930.

It’s understandable that some papers will fall through the cracks, but even for high impact journals many papers still don’t comply with the policy. But let’s be clear, aside from any policy compliance issues and future REF eligibility, these numbers reveal that fully two thirds of research papers produced at the University cannot be read without a journal subscription. And if readers can’t afford to pay for access then they’ll happily find other means of obtaining research papers.
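Expressed as a fraction, the gap between the current submission rate and the target is stark. A quick back-of-envelope check using the figures above:

```python
# Back-of-envelope check of the submission figures quoted above.
submissions_per_month = 250   # current rate of articles received
target_per_month = 930        # estimated monthly target for full coverage

compliance = submissions_per_month / target_per_month
shortfall = target_per_month - submissions_per_month

print(f"Share of target being met: {compliance:.0%}")  # ~27%, i.e. under a third
print(f"Monthly shortfall: {shortfall} articles")      # 680 articles per month
```

That 27% of the submission target matches the estimate that less than a third of the University’s output is currently compliant.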

What about inviting authors to make their research papers open access? Since June I have tracked five high impact journals and monitored the papers published by University of Cambridge authors (Fig. 3). Upon first discovery of a published paper, only 29% of articles were compliant with the HEFCE policy, which is consistent with our overall experience in receiving AAMs. But even after inviting authors to submit their accepted manuscripts to the University’s open access repository, the number of compliant articles rose to only 42%. Less than a third of authors who were directly contacted and asked to make their work open access eventually submitted their manuscripts. Clearly, the merits of open access are not enough to convince authors to act and distribute their manuscripts.


Fig. 3. Compliant articles published in five high impact journals. Even after direct intervention less than half of all articles are HEFCE compliant.
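The claim that less than a third of directly contacted authors eventually submitted follows from the Fig. 3 percentages. A small sketch of that arithmetic, using the 29% and 42% figures from the text:

```python
# Sketch of the uptake arithmetic behind Fig. 3 (figures from the text).
initially_compliant = 0.29   # fraction of articles compliant on first discovery
after_invitation = 0.42      # fraction compliant after authors were invited

invited = 1.0 - initially_compliant                  # authors who were contacted
converted = after_invitation - initially_compliant   # those who then submitted

uptake = converted / invited
print(f"Uptake among invited authors: {uptake:.0%}")  # ~18%, well under a third
```

So even among authors who received a direct, personal invitation, fewer than one in five acted on it.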

SCOAP³

The SCOAP3 initiative is a publishing partnership that makes journals in the field of particle physics open access. This innovative scheme brings together multiple universities, funders and publishers and turns traditional journals that are already widely respected by the physics community into purely open access journals. No intervention is required by either authors or university administrators, making the process of publishing open access as simple as possible. The great advantage of this scheme is that authors don’t need to worry about choosing an open access option from the publisher, nor deal with messy invoices or copyright issues. All of these problems have been swept away.

Jisc Springer Compact

Like SCOAP3 the recently announced Jisc Springer Compact is a coalition of universities in the UK that have agreed a publishing model with Springer that makes ~1600 journals open access. Following a similar Dutch agreement, this publishing model means that any authors with qualifying institutional affiliations will have their publications made open access automatically. We’ve already started receiving our first requests under this scheme. However, unlike the SCOAP3 initiative which ‘flips’ entire journals to gold OA, the journals under the UK Jisc Springer Compact are still hybrid and only content produced by qualifying authors is open access. While this is great for those universities signed up to the deal, it still leaves a great many papers languishing under the subscription model.

Affiliation vs. Community

So which of these strategies will prove to be the most successful? Will universities take ownership of open access publishing, or will subject-based communities come together in publishing coalitions?

The advantage of subject based initiatives is they flip entire journals for the benefit of a whole research community, making all the work within a specific discipline open access. However, without sufficient cohesion and drive within an academic community it’s likely that adoption will be fragmented across the myriad of disciplines. It’s no surprise that SCOAP3 emerged out of the particle physics community, given this scholarly community’s involvement in the development of arXiv, but it’s unrealistic to expect this will be the case everywhere.

Publishing agreements based around institutional affiliations will undoubtedly become more common, but until all universities have agreements in place with all the major publishers (Elsevier, Wiley, Springer, etc.) then a large fraction of scholarly outputs will still remain locked down.

What does the future hold?

Ultimately I want to do myself out of a job. As odd as that sounds, the current system of paying publishers for individual papers to be made open access is a laborious and time consuming process for authors, publishers and universities. Similarly the process of making accepted manuscripts available under the green model is equally ridiculous. Publishers should be automatically depositing AAMs on behalf of authors. There is no evidence that making AAMs available has ever killed a journal, and besides, the sooner we can reach agreements with all the major publishers and research funders that result in change on a global scale the better it will be for everyone.

Published 22 October 2015
Written by Dr Arthur Smith
Creative Commons License

Good news stories about data sharing?

We have been speaking to researchers around the University recently to discuss the expectations of their funders in relation to data management. This has raised the issue of how best to convince people this is a process that benefits society rather than a waste of time or just yet another thing they are being ‘forced to do’ – which is the perspective of some that we have spoken with.

Policy requirements

In general most funders require a Research Data Management Plan to be developed at the beginning of the project – and then adhered to. But the Engineering and Physical Sciences Research Council (EPSRC) has upped the ante by introducing a policy requiring that papers published from May 2015 onwards resulting from funded research include a statement about where the supporting research data may be accessed. The data needs to be held in a secure storage facility with a persistent URL, and must remain available for 10 years from the date it was last accessed.

Carrot or stick?

While having a policy from funders does make researchers sit up and listen, there is a perception in the UK research community that this is yet another impost on time-poor researchers. This is not surprising. There has recently been an acceleration of new rules about sharing and assessing research.

The Research Excellence Framework (REF) occurred last year, and many researchers are still ‘recuperating’. Now the Higher Education Funding Council for England (HEFCE) is introducing a policy, from April 2016, that any peer reviewed article or conference paper that is to be included in the post-2014 REF must have been deposited in the institution’s repository within three months of acceptance, or it cannot be counted. This policy is a ‘green’ open access policy.

The Research Councils UK (RCUK) have had an open access policy in place for two years, introduced on 1 April 2013 as a result of the 2012 Finch Report. The RCUK policy states that funded research outputs must be available open access, and it is permitted to make them available through deposit into a repository. At first glance this seems to align with the HEFCE policy; however, restrictions on the allowed embargo periods mean that in practice most articles must be made available gold open access – usually with the payment of an accompanying article processing charge. While these charges are supported by a block grant fund, there is considerable impost on the institutions to manage these.

There is also considerable confusion amongst researchers about what all these policies mean and how they relate to each other.

Data as a system

We are trying to find some examples of how making research data available can help research and society. It is unrealistic to hope for something along the lines of Jack Andraka‘s breakthrough for a diagnostic test for pancreatic cancer using only open access research.

That’s why I was pleased when Nicholas Gruen pointed me to a report he co-authored: Open for Business: How Open Data Can Help Achieve the G20 Growth Target – A Lateral Economics report commissioned by Omidyar Network – published in June 2014.

This report looks primarily at government data but does consider access to data generated in publicly funded research. It makes some interesting observations about what can happen when data is made available. The consideration is that data can have properties at the system level, not just at the level of an individual data set.

The point is that if data does behave in this way, once a collection of data becomes sufficiently large then the addition of one more set of data could cause the “entire network to jump to a new state in which the connections and the payoffs change dramatically, perhaps by several orders of magnitude”.

Benefits of sharing data

The report also refers to a 2014 report The Value and Impact of Data Sharing and Curation: A synthesis of three recent studies of UK research data centres. This work explored the value and impact of curating and sharing research data through three well-established UK research data centres – the Archaeological Data Service, the Economic and Social Data Services, and the British Atmospheric Data Centre.

In summarising the results, Beagrie and Houghton noted that their economic analysis indicated that:

  • Very significant increases in research, teaching and studying efficiency were realised by the users as a result of their use of the data centres;
  • The value to users exceeds the investment made in data sharing and curation via the centres in all three cases; and
  • By facilitating additional use, the data centres significantly increase the measurable returns on investment in the creation/collection of the data hosted.

So clearly there are good stories out there.

If you know of any good news stories that have arisen from sharing UK research output data we would love to hear them. Email us or leave a comment!