Evaluation and assessment #or2017

Abstracts

Cambridge’s journey towards Open Access: what we’ve learnt by Arthur Smith [slides]

Policies

  • Complicated – lots of funder and government policies, e.g. HEFCE, Research Councils UK, Wellcome, European Research Council, Cancer Research UK.
  • HEFCE runs the REF with a green OA policy (12-month embargo allowed in STEM, 24-month in other fields). Accepted manuscripts must be deposited within 3 months of acceptance. Don’t know what will be used in the REF, so have to make them all OA. Goal of 10,000 manuscripts/year.
  • Research Councils UK: green OA (6-month embargo in STEM / 12-month in others) or gold OA with CC BY and immediate availability.
  • Engineering and Physical Sciences Research Council – all articles must have a data availability statement. Data has to be stored for 10 years, or 10 years from the last request for access.
  • Wellcome Trust – gold OA preferred, with immediate availability under CC BY and deposit in PMC; or green OA with up to a 6-month embargo in Europe PMC. Sanctions for non-compliance.

Hugely complicated flowchart for decision-making; and another one for depositing.

Uni’s position: committed to disseminating its research, but not fond of APCs (especially hybrid), so prefers green.

Response

Promoted message: “accepted for publication? Upload all manuscripts to be eligible for the REF” – and they take care of the rest. Explosion of manuscripts uploaded when the HEFCE policy was announced; similarly when data requirements started being enforced.

Have spent a lot of money (£7 million on APCs just centrally – 75% for hybrid) – average APCs plus lots of paperwork. No reduction in subscription spend. 1,000 submissions a month. Spike whenever uni-wide emails go out, but hard to cope with all the manuscripts deposited in response! Have had to add lots of staff to deal with these and with datasets, coordinate outreach, and provide support. Many with a research background and/or a library background.

Systems

Early on, set up a simple upload website with a form – but it wasn’t connected to anything, so lots of cutting and pasting. Had lots of sites but nothing connected.

Upgraded DSpace to 5.6, rethemed nicely, and rebranded as “Apollo”. Did lots of work to integrate systems: items (publications and datasets) are deposited through Elements (which includes the OA Monitor), which is linked to the repository; the repository mints DOIs automatically using DataCite. Zendesk links to DSpace; ORCID integration.
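The talk didn’t show implementation details, but automatic DOI minting against DataCite’s REST API looks roughly like this sketch (the prefix, publisher string and landing URL are placeholders, not Cambridge’s actual values):

```python
import json
import urllib.request

DATACITE_API = "https://api.datacite.org/dois"

def build_doi_payload(prefix, title, creators, year, url):
    """Build a DataCite JSON:API payload for a 'findable' DOI.

    Leaving out a suffix lets DataCite auto-generate one under the prefix.
    """
    return {
        "data": {
            "type": "dois",
            "attributes": {
                "prefix": prefix,              # repository's DOI prefix
                "event": "publish",            # register as findable immediately
                "titles": [{"title": title}],
                "creators": [{"name": n} for n in creators],
                "publisher": "Example Repository",   # placeholder
                "publicationYear": year,
                "types": {"resourceTypeGeneral": "Dataset"},
                "url": url,                    # landing page in the repository
            },
        }
    }

def mint_doi(payload, auth_header):
    """POST the payload to DataCite and return the minted DOI string."""
    req = urllib.request.Request(
        DATACITE_API,
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/vnd.api+json",
            "Authorization": auth_header,  # Basic auth for the DataCite account
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["data"]["id"]
```

In a repository integration this would typically be triggered by a deposit event, with the returned DOI written back into the item’s metadata.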

Impact

Since 2015 have added 10,000 articles, 1,700 images, 1,600 theses, 800 datasets, 500 working papers, 1,100 ORCIDs. 52% of Cambridge’s output is in an OA source of some kind (may include embargoed items): 26% in Apollo; 13% in other repositories; 26% in EPMC; 13% in DOAJ; 2% on publisher websites.

Papers published in 2015 show the normal falling citation curve. Things that are OA get more citations – even counting only papers less than a year old – and are less likely never to be cited. Possibly authors are self-selecting, depositing their best manuscripts, so new strategies are needed to capture 100% of outputs.

“Request a copy” button – 8 requests per day for embargoed content (3000 requests total); >20% of requests occur before publication (due to ‘deposit on acceptance’ policy).

Future

Soon upgrading to DSpace 6 or 7, and Elements to Repository Tools 2.

HEFCE and Research Councils UK will soon merge into UK Research & Innovation (UKRI); hoping for better alignment of policies.

Researchers equate OA with gold [and the presenter equates gold with APCs], so we need cultural change from researchers and policy changes from funders: stop paying for hybrid! The development of the Springer Compact has been great, but otherwise hybrid is pretty problematic.

Open Access policy 3 years in, has researcher behaviour changed? by Avonne Newton, Kate Sergeant

Uni of South Australia’s OA policy launched in 2014. Relevant OA funder policies come from the NHMRC and ARC. Two key points:

  • post-print has to be deposited within 1 month of acceptance
  • embargo only up to 12 months post-publication

Appointed a research connections librarian: workshops and training on OA compliance, ORCIDs, etc. Developed an OA research guide and a publishing research guide – online, plus paper handouts on submitting, and on postprints and OA. The uni’s copyright officer has also been a helpful advocate, with content on their website.

System and process improvements

  • Developed a system where manuscripts are deposited to the library, then exposed in the repository via OAI-PMH and fed into reporting systems. Did some prepopulation on submission using DOI lookup against CrossRef. Also harvest UniSA-authored outputs automatically. Migrated the repository to the library management system – some weaknesses, but extensive APIs for direct deposit from automated systems, and good searching, workflow management and reporting.
  • Developers have used the APIs to generate handles and DOIs, and to automate the generation of emails requesting accepted manuscripts.
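The CrossRef prepopulation step can be sketched as follows. The `/works/{doi}` endpoint of the public CrossRef REST API returns a JSON `message` object; the deposit-form field names here are hypothetical, invented for illustration:

```python
import json
import urllib.parse
import urllib.request

def fetch_crossref_record(doi):
    """Fetch work metadata for a DOI from the public CrossRef REST API."""
    url = "https://api.crossref.org/works/" + urllib.parse.quote(doi)
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)["message"]

def prepopulate_form(record):
    """Map a CrossRef work record onto hypothetical deposit-form fields."""
    return {
        "title": record.get("title", [""])[0],
        "journal": record.get("container-title", [""])[0],
        # CrossRef splits names into 'given'/'family' parts
        "authors": [(a.get("given", "") + " " + a.get("family", "")).strip()
                    for a in record.get("author", [])],
        # 'issued' holds date parts as nested lists, e.g. [[2017, 6]]
        "year": record.get("issued", {}).get("date-parts", [[None]])[0][0],
    }
```

A submission form could call `fetch_crossref_record` when the author pastes a DOI, then fill the remaining fields from `prepopulate_form`, leaving the author to correct rather than retype metadata.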

Data and findings

  • Of traditional research outputs 2008-17, journal articles are 69%.
  • Open access: books and chapters are 90%+ restricted; conference papers have 25% and journal articles 15% open content; reports are >50% open.
  • Funding: ARC-funded outputs 30% open (now or after embargo); NHMRC-funded 38% open; all publications 20%.
  • Since 2014, getting more content (and higher percentages of embargoed content, per funder policies). Hard to determine the cause among funder policy, uni policy, and a general increase in OA awareness – probably all three.

Researcher feedback and final conclusion

  • Aware of policy but more important to some than others. Like disseminating research, gaining readership, increasing citations.

Self-Auditing as a Trusted Digital Repository – evaluating the Australian Data Archive as a trusted repository for Australian social science by Steven McEachern, Heather Leasor

The Australian Data Archive (ADA) started as the SSDA in 1981; it now holds 5,000 datasets from 1,500 studies from the academic, government and private sectors. Wide variety of subject areas, broadly social science. Serviced by the National Computational Infrastructure.

Were going to do the Data Seal of Approval, but then waited for the new joint DSA/WDS standard, per the November 2016 criteria. Still waiting for the review to come back. The purpose is to give you a seal showing you hold trusted data and are a trusted repository.

Process and Outcomes

  • The change to DSA/WDS enlarged the focus and breadth dramatically, which meant there were no organisations to reference for guidance.
  • More emphasis on IT, security, preservation, risk management; governance, expert guidance, business plans; data quality assurance; whether outsourcing is appropriate.
    • e.g. for “The repository has adequate funding and sufficient numbers of qualified staff managed through a clear system of governance to effectively carry out the mission”, you need to self-assess with stars and write an explanation.
  • Not clear what the minimum requirements are. The guidance sometimes didn’t quite match the question, and it was unclear whether to answer the top-level question or the questions in the supporting guidance. Most items should be in the public domain, so they had to work out how to provide evidence from confidential items. In review, the assessor wants timelines for items “in process”, though the documentation didn’t explain this.

Identified 4 guidelines at level 3 and 12 guidelines at level 4. The assessment is not complete yet, though there has been some feedback from one of the two reviewers.

Assessing the seal

There are 103 Australian research organisations that could benefit from the seal. But there are challenges:

  • does the data seal support the variation of repository types? or the fact that there isn’t a one-to-one match between organisations and repositories?
  • variety of national/funding/infrastructure/governance frameworks

The DSA needs to specify what things need to be in the public domain (many unis won’t make their whole budget public!), or how to explain items outside the organisation’s direct control (e.g. funding). Risk management standards per ISO (which require payment to access) seem overboard for a free self-assessment.

Helped identify things that needed to be updated; info to make public; policies to polish in order to make public; changes to make in practice.

Would like to look at other levels of certification, whether these afford other levels of trust, and whether this makes a difference to their user community.
