As part of our series of posts on the Elsevier negotiations, Dominic Dixon, Research Librarian at Cambridge University Libraries, explains the work of the library’s Data Analysis Working Group to access, understand and analyse the data relating to how researchers at Cambridge use Elsevier publications. These findings are also presented as a series of data visualisations on the recently launched Elsevier Data Dashboard [Cambridge University Raven account required].
Having a strong underpinning of data is critical to strengthening the University's, and the sector's, position in negotiations with Elsevier. This post outlines the Data Analysis Working Group's approach to gathering and presenting the data underpinning the negotiations, looks at some of the questions we have sought to answer, and shares some high-level findings from our analysis.
As with many data science projects, the large majority of our time has been spent on data cleaning. This is partly due to the way the exports from the platforms we used are structured, but it also allowed us to carry out a more fine-grained analysis than would have been possible with the data in its default state. This work involved disambiguating publisher names, splitting and pivoting fields with multiple entries (e.g., funders, disciplines, and subjects), and enriching the records with metadata not included in the original files.
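To illustrate the "splitting and pivoting" step, here is a minimal sketch in plain Python, assuming a multi-value field is exported as a single delimited string (the field names and delimiter here are illustrative, not the actual structure of any platform's export):

```python
# Explode a delimited multi-value field into one row per value,
# so that records can later be grouped by funder, discipline, etc.
# Field names and the "; " separator are hypothetical examples.

def explode_field(records, field, sep="; "):
    """Return a new list with one row per value of a multi-value field."""
    rows = []
    for rec in records:
        for value in rec.get(field, "").split(sep):
            if value:
                row = dict(rec)       # copy the record
                row[field] = value    # keep a single value per row
                rows.append(row)
    return rows

records = [
    {"doi": "10.1000/xyz1", "funders": "Funder A; Funder B"},
    {"doi": "10.1000/xyz2", "funders": "Funder A"},
]
exploded = explode_field(records, "funders")
# Each (article, funder) pair is now its own row, ready for grouping.
```

The same pattern applies to any of the multi-entry fields mentioned above.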
Publishing
To build a profile of research published by Cambridge researchers in Elsevier journals, we experimented with three platforms: Dimensions, Scopus, and Web of Science (WoS). All three are commercial, and each offers a different level of coverage and richness of metadata. A recent comparative analysis of WoS, Scopus, and Dimensions found that Dimensions indexed 82.22% more journals than WoS and 48.17% more journals than Scopus. We compared the coverage in each platform for articles published between 2015 and 2020 by a Cambridge-affiliated author. In this case, WoS (n=59,587) returned 1% more results than Dimensions (n=58,908) and 32% more than Scopus (n=40,385).* However, filtering to Elsevier gave a different picture: Dimensions (n=11,431) returned 16% more articles than WoS (n=9,504) and 44% more than Scopus (n=6,345). Given this, and considering that our primary focus was research published by Elsevier, we opted to use Dimensions.
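For transparency, the percentage comparisons above appear to be calculated as the difference between the two result counts, expressed relative to the larger count. The sketch below is our reading of the reported figures, not an official formula:

```python
# Relative-difference calculation as we read the comparison figures:
# (larger - smaller) / larger, rounded to the nearest whole percent.

def pct_more(larger: int, smaller: int) -> int:
    return round((larger - smaller) / larger * 100)

wos, dims, scopus = 59_587, 58_908, 40_385      # all publishers, 2015-2020
print(pct_more(wos, dims))        # 1  (WoS vs Dimensions)
print(pct_more(wos, scopus))      # 32 (WoS vs Scopus)

dims_e, scopus_e = 11_431, 6_345                # Elsevier only
print(pct_more(dims_e, scopus_e)) # 44 (Dimensions vs Scopus)
```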
Of the 58,908 records exported from Dimensions, 19% were published in Elsevier journals, making Elsevier the single most chosen publishing venue for Cambridge authors. Filtering to articles with a Cambridge corresponding author, Elsevier was again the most chosen venue, with over 34% (n=4,564) of the articles published in Elsevier journals. Having looked at publishing levels more broadly, we then broke down the articles with a Cambridge corresponding author by Open Access category. We found that 22% (n=1,137) were categorised as closed and therefore behind a paywall, 35% (n=1,585) were paid for via different routes including funder block grants administered by the University, 32% (n=1,467) were self-archived (Green OA), and 8% (n=375) were published in journals that do not charge APCs. Thus, the percentage of articles that are either behind a paywall or only available openly because an APC has been paid is significantly higher than the percentage published open access without any associated fees.
Another aspect of publishing we focused on is funding, asking specifically: who is funding the Cambridge research published with Elsevier? Because the Dimensions export includes funder data, we were able to break down the articles by both individual funders and funder groups. Looking at articles with a Cambridge-affiliated author, articles with a Cambridge corresponding author, and articles resulting from grants, we found that in each category over 70% were linked to at least one cOAlition S funder. The wider implication of this – particularly for the corresponding-author articles – is that in the absence of a read-and-publish agreement, many of these funders would not pay the APCs associated with publishing in Elsevier journals.
Reading
To provide a picture of the extent to which articles published in Elsevier subscription journals are read at Cambridge, we gathered usage data from COUNTER reports and the Alma library management system. This allowed us to consider reading over the six-year period between 2015 and 2020, both overall and at a disciplinary level. We found that reading of Elsevier journals was consistently higher in each year than for any other publisher. Reading of Elsevier in 2020 represented 20% of all reading and was at its highest level in the physical sciences and engineering. The single highest total within the sub-categories of each discipline was in biochemistry, genetics, and molecular biology (life sciences), with over 400,000 article downloads in 2020 alone.
Another question we considered is how frequently articles published in Elsevier journals are cited by researchers at Cambridge. To answer this, we used the Dimensions API to gather a dataset of the publications cited by articles with a Cambridge-affiliated author published between 2015 and 2020. The resulting dataset consisted of over 1.2m bibliographic records and revealed that 22% (n=269,917) of the cited articles were published by Elsevier. Interestingly, this percentage closely matches the percentage of articles published in Elsevier journals by Cambridge-affiliated authors (19%), the percentage of Elsevier reading at Cambridge (21.78%, 2015–20), and the percentage of publishing with Elsevier at the national level (20%). Using the Dimensions API to enrich the citation data with Open Access categories, we saw that 66% (over 174,000 publications) of the cited Elsevier content is currently paywalled; Elsevier is both the most cited and the most paywalled publisher. This has wider implications for open research, given that many of these articles are inaccessible to anyone not affiliated with an institution that subscribes to the journals in which they appear.
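The share calculations described above can be sketched on toy records as follows; the field names (`publisher`, `oa_status`) and the records themselves are illustrative stand-ins, not the actual structure of a Dimensions export:

```python
# Compute a publisher's share of a citation dataset and the paywalled
# fraction of that publisher's cited content. Toy data only.

def publisher_shares(records, publisher):
    hits = [r for r in records if r["publisher"] == publisher]
    share = len(hits) / len(records)                        # overall share
    paywalled = sum(r["oa_status"] == "closed" for r in hits) / len(hits)
    return share, paywalled

cited = [
    {"publisher": "Elsevier", "oa_status": "closed"},
    {"publisher": "Elsevier", "oa_status": "green"},
    {"publisher": "Elsevier", "oa_status": "closed"},
    {"publisher": "Springer Nature", "oa_status": "gold"},
]
share, paywalled = publisher_shares(cited, "Elsevier")
# share = 0.75 and paywalled = 2/3 on this toy data; on the real
# dataset the equivalent figures were 22% and 66% respectively.
```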
Paying
One of the main questions we considered when looking at expenditure was how much we pay to publish in Elsevier journals. Our source for this data was OpenAPC – an initiative that aggregates data on open access expenditure and makes it openly available – combined with data from our internal compliance reports. Looking at the overall spend across all institutions that have contributed to the OpenAPC dataset, over €49,000,000 has been paid to Elsevier, representing 19% of the total reported spend on article processing charges (APCs). Looking at the data on Cambridge expenditure, we found that between 2015 and 2020, 30% (over £3,000,000) of our total APC spend from block grants was paid to Elsevier (the highest spend on any single publisher), with a single payment averaging £3,302 and ranging between £450 and £7,320.
Final notes
This post has covered just some of the questions we have been able to answer with the data. Overall, we think we have demonstrated that Elsevier journals are among the most read and the most published in, but are also consistently the most paywalled and the most expensive to publish in of all publishers' journals. This highlights the importance of the ongoing negotiations and of considering other options, such as those explored in previous posts. Our complete findings are presented on a dashboard that is accessible to members of the University. Unfortunately, legal restrictions mean we are not able to share the dashboard or underlying datasets externally; however, we have made the Python code we used to gather the citation data available as a Jupyter notebook on Google Colab. This can be used to retrieve the dataset behind our citation analysis and is easily modifiable for other purposes (see the notebook for full details). We refer the interested reader to the Dimensions API Lab and the ESAC guide to uncovering the publishing profile of your institution. The former was helpful for learning how to take advantage of the Dimensions API (as were the staff at Dimensions), and the latter was useful in formulating our approach to the whole project. We are also happy to answer questions about any aspect of our work.
* The original percentage quoted here was 18%. This was incorrect and has now been corrected to 32%.