Tag Archives: funder

Cambridge RCUK Block Grant spend for 2016-2017

Much to our relief, last Friday we sent off our most recent report on our expenditure of the RCUK Block Grant fund. The report is available in our repository. Cambridge makes all of its information about spend on Open Access publicly available. This blog post follows on from the one describing our spend from 2009–2016, and from the blog on our open access spend in 2014.

Compliance

We are pleased to be able to report that we reached 80% compliance in this reporting period, up from 76% last year. The RCUK is expecting 75% compliance by the end of the transition period on 31 March 2018, so we are well over target.

According to our internal helpdesk system ZenDesk, our compliance is shared between 52% gold (publication in an Open Access journal or payment for hybrid Open Access) and 28% green (placement of the work into our institutional repository, Apollo). We do not have a breakdown of how many of the gold APC payments were for hybrid journals. In the past, 86.8% of our overall spend has gone on hybrid.

Our overall compliance rate increased from 76% to 80%, which is all the more impressive given a 15% increase in the number of research outputs acknowledging RCUK funding. A Web of Science search for articles, reviews and proceedings papers indicated that Cambridge published 2400 RCUK-funded papers in 2016; in 2015 the same search counted 2080 RCUK-funded research outputs.

Headline numbers

  • In total Cambridge spent £1.68 million of RCUK funds on APCs (up from £1.28 million last year)
  • 1920 articles identified as being RCUK funded were submitted to the Open Access Service, of which 1248 required payment for RCUK*
  • The average article processing charge was £1850 – this is significantly less than the £2008 average last year, reflecting the value of the memberships we have (see below)

*Note these numbers will differ slightly from the report due to the difference in dates between the calendar and financial years (see below).

Non APC spend

In total Cambridge spent £1.94 million of RCUK funds in this reporting period, of which £1.68 million was on APCs. Approximately 13% was spent on other costs, primarily distributed between staffing, infrastructure and memberships. The greatest proportion was staffing, at £95,000. Memberships were the next largest category, mostly arrangements to reduce the cost of APCs, including:

  • £42,000 on the open access component of the Springer Compact
  • £22,000 on memberships to obtain discounts – there is a list of these on the OSC website
  • £18,000 on the University’s SCOAP3 subscription

The RCUK fund has also supported the infrastructure for Open Access at Cambridge, with £62,000 covering the cost of several upgrades of DSpace and general support for the repository. This has allowed us to implement new services such as the minting of DOIs and our hugely successful Request a Copy service, which allows people to contact authors of embargoed material in the repository and ask them to send through the author's accepted manuscript. This category also covers the licence for our helpdesk system, ZenDesk, which helps the Open Access team manage, on average, 60 queries a day. We are also able to run most of our reporting out of ZenDesk.

There are some other smaller items in the non APC category, including £1500 on bank charges that for various reasons we have not been able to allocate to specific articles.

Are these deals good value?

Some are. The Springer Compact is shown as a single charge in the report with the articles listed individually. The RCUK Block Grant contributed £46,020 to the Springer Compact and 128 Cambridge papers were published by Springer that acknowledged RCUK funding. This gives us an average APC cost per paper to the RCUK fund* of £359.53 including VAT. This represents excellent value, given that the average APC for Springer is $3,000 (about £2,300).

*Note that in some instances the papers acknowledging RCUK may also have acknowledged COAF in which case the overall cost for the APC for those papers will be higher.

Cambridge has now completed a year of its prepayment arrangement with Wiley. Over this time we contributed £108,000 to the account and published 68 papers acknowledging RCUK. This works out to an average Wiley APC of £1,588 per paper. As with Springer, Wiley's average APC is approximately £2,300, so this appears to be good value.

However, the RCUK has contributed a higher proportion to the Wiley account than COAF, because at the time the account was established we had run low on COAF funds. Because the University does not provide any of its own funds for Open Access, there was no option other than to use RCUK funds. We will need to do some calculations to ensure that the correct proportions of COAF and RCUK funds are supporting this account. This reflects the challenges we face on a rolling basis when the dates of the different funds are fluid (see below).

It appears we need to look very closely at our membership with Oxford University Press. We spent £44,000 of RCUK funds on this, and published 22 articles acknowledging RCUK funding. This works out to £2,000 per article, which is not dissimilar to an average OUP APC, and therefore does not represent any value at all. This is possibly because our allocation of the expense of the membership between COAF and RCUK might not reflect what has actually been published with OUP. We need to investigate further.
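As a quick sanity check, the per-paper costs quoted above can be recomputed directly from the spend and paper counts in the preceding paragraphs (a minimal sketch; the figures are taken from this post, not from the formal report):

```python
# Per-paper cost of each prepayment arrangement.
# (RCUK contribution in GBP, papers acknowledging RCUK funding)
deals = {
    "Springer Compact": (46_020, 128),
    "Wiley account":    (108_000, 68),
    "OUP membership":   (44_000, 22),
}

for name, (spend, papers) in deals.items():
    print(f"{name}: £{spend / papers:,.2f} per paper")
```

Against an average list APC of roughly £2,300, the Springer (£359.53) and Wiley (£1,588.24) arrangements look like good value; OUP, at £2,000.00 per paper, does not.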

Caveat – the date problem

We manage Open Access funds that operate on different patterns. The COAF funds match the academic year, with the new grants starting on 1 October each year.  The RCUK works on a financial year, starting on 1 April each year. Many of our memberships and offset deals work on the calendar year.

To add to the confusion, the RCUK is behind in its payments, so for the current year, which started on 1 April 2017, we will not receive our first part-payment until 1 June. That amount will not cover the commitments we had already made by the end of 2016, let alone those made between 1 April, when this year started, and 1 June, when the money arrives. This means we will remain in the red. Cambridge is carrying half a million pounds in commitments at any given time. The situation makes it very difficult to balance the books.

Our recent RCUK report covers the period 1 April 2016 – 31 March 2017 and refers only to invoices paid in this period. Some dates in the report fall after 31 March 2017 because reconciliation in the finance system sometimes takes longer, so items are logged with later dates even though the payment was made within the period. The publication dates of the articles these invoices relate to vary widely, and many of the articles have not yet been published, because the delay between acceptance and publication ranges from days to years.

This means working out averages is an inexact science. It is only possible to filter Web of Science by year, so we are only able to establish the number of papers published in a given calendar year. This set of papers is not the same set for which we have paid, but we can compare year on year and identify some trends that make sense.

Published 22 May 2017
Written by Dr Danny Kingsley

Creative Commons License

How open is Cambridge?

As part of Open Access Week 2016, the Office of Scholarly Communication is publishing a series of blog posts on open access and open research. In this final OAWeek post Dr Arthur Smith analyses how much Cambridge research is openly available.

For us in the Office of Scholarly Communication it’s important that, as much as possible, the University’s research is made Open Access. While we can guarantee that research deposited in the University repository Apollo will be made available in one way or another, it’s not clear how other sources of Open Access contribute to this goal. This blog is an attempt to quantify the amount of Cambridge research that is openly available.

In mid-August I used Cottage Labs’ Lantern service in an attempt to quantify just how open the University’s research really is. Lantern uses DOIs, PMIDs or PMCIDs to match publications against a variety of sources, such as CORE and Europe PMC, to determine the Open Access status of a publication – it will even try to look at a publisher’s website to determine an article’s Open Access status. This process isn’t infallible, and it relies heavily on DOI matching, but it provides a good insight into the possible sources of Open Access material.

To determine the base list of publications against which the analysis could be run, I queried Web of Science (WoS) and Scopus to obtain a list of publications attributed to Cambridge authors. In 2015, the University published 9069 articles, reviews and conference papers according to Web of Science. Scopus returned a slightly lower figure of 7983 publications. Combining these two publication lists, and filtering to only include records with a DOI, produced one master list of 9714 unique publications (that’s ~26 publications/day!).
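The merging step amounts to taking the union of two DOI sets. A minimal sketch is below; the CSV snippets and the "DOI" column name are illustrative stand-ins for the real WoS and Scopus exports, and DOIs are lower-cased before comparison because DOIs are case-insensitive:

```python
import csv
import io

def load_dois(csv_text, column="DOI"):
    """Extract a normalised set of DOIs from a CSV export, skipping rows without one."""
    return {row[column].strip().lower()
            for row in csv.DictReader(io.StringIO(csv_text))
            if row.get(column, "").strip()}

# Tiny stand-ins for the real WoS/Scopus exports.
wos_csv = "Title,DOI\nPaper A,10.1000/a\nPaper B,10.1000/B\nNo DOI,\n"
scopus_csv = "Title,DOI\nPaper B,10.1000/b\nPaper C,10.1000/c\n"

# Set union deduplicates records that appear in both databases.
master = load_dois(wos_csv) | load_dois(scopus_csv)
print(f"{len(master)} unique publications with a DOI")  # → 3
```

Records without a DOI simply drop out, which is why the master list of 9714 publications undercounts the University's total output.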

In 2015 the Open Access team processed 2746 HEFCE eligible submissions, so naïvely speaking, the University achieved a 28.3% HEFCE compliance rate. That’s not bad, especially because the HEFCE policy had not yet come into force, but what about other Open Access sources? We know that other universities in the UK are also depositing papers in their repositories, and some researchers make their work ‘gold’ Open Access without going through the Open Access team, so the total amount of Open Access content must be higher.
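The naïve estimate is simply deposits divided by the DOI-deduplicated publication count:

```python
deposited = 2746   # HEFCE-eligible submissions processed by the OA team in 2015
published = 9714   # unique 2015 Cambridge publications with a DOI (WoS + Scopus)

print(f"Naive HEFCE compliance: {deposited / published:.1%}")  # → 28.3%
```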

In addition to the Lantern analysis, I also exported all available DOIs from Apollo and matched these to the DOIs obtained from WoS/Scopus. WoS also classifies some publications as being Open Access, and I included these figures too. If a publication was found in at least one potentially Open Access source I classified it as Open Access. Here are the results:

Figure 1. Of 9714 DOIs analysed by Lantern, 51.8% appear in at least one open access source.

It is pleasing that our naïve estimate of 28.3% HEFCE compliance closely matches the number of records found in Apollo (26.2%). The discrepancy is likely due to a number of factors, including publications received by the Open Access Team that were actually published in 2014 or 2016, but submitted in 2015, and Apollo records that don’t have a publisher DOI to match against. However, the most important point to note is the overall open access figure – in 2015 more than 50% of the University’s scholarly publications with a DOI were available in at least one “open access” source.

Let’s dig a little deeper into the analysis. Using everyone’s favourite metric, the journal impact factor (JIF), the average JIF of articles in Apollo was 5.74 compared to 4.33 for articles that were not OA. Other repositories and Europe PMC achieved even higher average JIFs. On average, Open Access publications by Cambridge authors have a higher JIF (6.04) than articles that are not OA, which suggests that researchers are making value judgements on what to make Open Access based on journal reputation. If a paper appears in a low(er) impact journal, it’s less likely to be made Open Access. Anecdotally this is something we have experienced at Cambridge.

Figure 2. Average 2015 JIF of papers classified according to their open access status.

The WoS and Scopus exports contain citation information at the article level, so we can also look at direct citations received by these publications (up to 16 August 2016) rather than relying on the JIF. I found that Open Access articles, on average, received 1.5 to 2 more citations than articles that are not Open Access. However, is this because authors are making their higher impact articles Open Access (which one might expect to receive more citations anyway) and are not bothering with the rest? Or is this effect due entirely to the greater accessibility offered by Open Access publication? Could the differences arise because of different researcher behaviour across different disciplines?

My feeling is that we have reached a turning point – the increased citation rate of Open Access material is not caused by the articles being Open Access, as these articles would have naturally received more citations anyway. Instead of looking at formal literature citations, the benefits of Open Access need to be measured outside academia, in areas that would not contribute to an article’s citations.

Figure 3. Average citations received by papers according to their open access source.

Breaking it down by the source of Open Access reveals that articles appearing in other repositories receive significantly more citations than those from any other source. This potentially suggests that collaborative papers between researchers at different institutions are likely to have greater impact than papers produced solely at one institution (Cambridge); however, a more thorough analysis that looks at author affiliations would be needed to confirm this.

If we focus on the WoS citation distribution the difference in average citations becomes clearer. Of 8348 WoS articles, not only are there fewer Open Access articles with no citations (14% vs 17%), but Open Access articles also receive more citations in general.

Figure 4. Citation distribution of papers found in WoS depending on their open access status.

What can we take away from this analysis? Firstly, Lantern is a valuable tool for discovering other sources of Open Access content. It identified over a thousand articles by Cambridge researchers in other institutional repositories that we did not know existed. When it comes time for the next REF, these other repositories may prove a vital lifeline in determining whether a paper is HEFCE compliant.

Secondly, more than 50% of the University’s 2015 research publications are potentially Open Access. Hopefully a similar analysis of 2016’s papers will show that even more of the University’s research is Open Access this year. And finally, although Open Access articles receive more citations than articles that are not Open Access, it is no longer clear whether this is caused by the article being Open Access, disciplinary differences, or if authors are more likely to make their best work Open Access.

Published 28 October 2016
Written by Dr Arthur Smith


Good news stories about data sharing?

We have been speaking to researchers around the University recently to discuss the expectations of their funders in relation to data management. This has raised the issue of how best to convince people this is a process that benefits society rather than a waste of time or just yet another thing they are being ‘forced to do’ – which is the perspective of some that we have spoken with.

Policy requirements

In general most funders require a Research Data Management Plan to be developed at the beginning of the project – and then adhered to. But the Engineering and Physical Sciences Research Council (EPSRC) have upped the ante by introducing a policy requiring that papers published from May 2015 onwards resulting from funded research include a statement about where the supporting research data may be accessed. The data must be available in a secure storage facility with a persistent URL, and must be retained for 10 years from the date it was last accessed.

Carrot or stick?

While having a policy from funders does make researchers sit up and listen, there is a perception in the UK research community that this is yet another impost on time-poor researchers. This is not surprising. There has recently been an acceleration of new rules about sharing and assessing research.

The Research Excellence Framework (REF) occurred last year, and many researchers are still ‘recuperating’. Now the Higher Education Funding Council for England (HEFCE) is introducing a policy, from April 2016, that any peer-reviewed article or conference paper to be included in the post-2014 REF must have been deposited in the author’s institutional repository within three months of acceptance, or it cannot be counted. This is a ‘green’ open access policy.

The Research Councils UK (RCUK) have had an open access policy in place for two years, introduced on 1 April 2013 as a result of the 2012 Finch Report. The RCUK policy states that funded research outputs must be made available open access, and permits this through deposit in a repository. At first glance this seems to align with the HEFCE policy; however, restrictions on the allowed embargo periods mean that in practice most articles must be made available gold open access – usually with the payment of an accompanying article processing charge. While these charges are supported by a block grant fund, there is considerable impost on institutions to manage them.

There is also considerable confusion amongst researchers about what all these policies mean and how they relate to each other.

Data as a system

We are trying to find some examples of how making research data available can help research and society. It is unrealistic to hope for something along the lines of Jack Andraka‘s breakthrough diagnostic test for pancreatic cancer, developed using only open access research.

That’s why I was pleased when Nicholas Gruen pointed me to a report he co-authored: Open for Business: How Open Data Can Help Achieve the G20 Growth Target – A Lateral Economics report commissioned by Omidyar Network – published in June 2014.

This report looks primarily at government data, but does consider access to data generated in publicly funded research. It makes some interesting observations about what can happen when data is made available. The key consideration is that data can have properties at the system level, not just at the level of an individual data set.

The point is that if data does behave in this way, once a collection of data becomes sufficiently large then the addition of one more set of data could cause the “entire network to jump to a new state in which the connections and the payoffs change dramatically, perhaps by several orders of magnitude”.

Benefits of sharing data

The report also refers to a 2014 report The Value and Impact of Data Sharing and Curation: A synthesis of three recent studies of UK research data centres. This work explored the value and impact of curating and sharing research data through three well-established UK research data centres – the Archaeological Data Service, the Economic and Social Data Services, and the British Atmospheric Data Centre.

In summarising the results, Beagrie and Houghton noted that their economic analysis indicated that:

  • Very significant increases in research, teaching and studying efficiency were realised by the users as a result of their use of the data centres;
  • The value to users exceeds the investment made in data sharing and curation via the centres in all three cases; and
  • By facilitating additional use, the data centres significantly increase the measurable returns on investment in the creation/collection of the data hosted.

So clearly there are good stories out there.

If you know of any good news stories that have arisen from sharing UK research output data we would love to hear them. Email us or leave a comment!