Primo out of the box #anzreg2018

Primo out of the box: Making the box work for you
Stacey Van Groll, UQ

Core philosophy – maintain out-of-the-box unless there’s a strong use case, user feedback, or bug. Focus on core high-use features like basic search (rather than browse) and search refinement (rather than my account). Stable and reliable discovery interface; quick and seamless resource access.

Said yes to:

  • UQ stylesheet – one search scope, one view, one tab, their own prefilters on library homepage (a drop-down menu – includes some Primo things like newspaper search, some non-Primo things)

Said no to:

  • Journals A-Z
  • Citation linker
  • Purchase requests
  • Main menu
  • EBSCO API
  • Featured Results
  • Collection Discovery
  • Tags & Reviews
  • Database search (for now)
  • Newspaper search (for now)
  • Resource recommender (for now)

Dev work for some things – eg tweaked the log out functionality to address an issue; then Primo improved something, which broke their fix; fixed the fix; next release was okay; next release broke it again; so have reviewed and gone back to out-of-the-box. An example of the downsides to making tweaks.

But sometimes you really need to make a change – consider the drivers and use cases: who and how many people experience the problem? How much work is it to make the change, and how much to maintain it? Is there existing functionality in the product or on the Roadmap? How do you measure success?

Does environmental scans – has bookmarks of other Primo NewUI sites to see what other people do and how.

Data analysis – lots of bugs in My Account but also very low usage. So doesn’t put much work in, just submits a Salesforce case then forgets.

Evaluates new releases – likes to piggyback on these eg adding OA and peer-reviewed tags to institutional repository norm rules.

User feedback – classify by how common the complaint is and try to address most common.

Feedback:

  • first goes to Knowledge Centre Feedback feature and includes email address which forces a response
  • second listserv
  • third Salesforce, and then escalation channels if needed

Lessons learned: a good Salesforce case covers a single problem, includes screenshots, and explains what behaviour you want.

Ex Libris company / product updates #anzreg2018

Ex Libris company update
Bar Veinstein, President Ex Libris

  • in 85 of top 100 unis; 65 million API calls/month; percentage of new sales in the cloud up from 16% in 2009 to 96% in 2017; 92% customer satisfaction
  • Pivot for exploration of funding/collaboration https://www.proquest.com/products-services/Pivot.html
  • aim to develop solutions sustainably so not a proliferation of systems for developing needs
  • looking at more AI to develop recommendation eg “high patron demand for 8 titles. review and purchase?”, “based on usage patterns, you should move 46 titles from closed stacks to open shelves?”, “your interloans rota needs load balancing, configure now?”, “you’ve got usage from vendors who provide SUSHI accounts you haven’t set up yet, do that now?”, algorithms around SUSHI vs usage.
  • serious about retaining Primo/Summon; shared content and metadata
  • Primo VE – realtime updates. Trying to reduce complexity of Primo Back Office (pipes etc – but unclear what replaces this when pipes are “all gone”)
  • RefWorks not just for end user but also aggregated analytics on cloud platform. Should this be connected/equal to eshelf on Primo?
  • Leganto – ‘wanting to get libraries closer to teaching and learning’ – tracking whether instructors are actually using it and big jumps between semesters.
  • developing app services (ux, workflow, collaboration, analytics, shared data) and infrastructure services (agile, multi-tenancy, open apis, metadata schemas, auth) on top of cloud platform – if you’ve got one thing with them very quick to implement another because they already know how you’re set up.
  • principles of openness: more transactions now via api than staff direct action.
  • https://trust.exlibrisgroup.com/
  • Proquest issues – ExL & PQ passing the customer service buck, so aiming to align this, eg being able to transfer support cases directly between Salesforce instances.

Ex Libris product presentation
Oren Beit-Arie, Ex Libris Chief Strategy Officer

  • 1980s acquisitions not part of library systems -> integrated library systems
  • 2000s e-resource mgmt not part of ILS -> library services platform (‘unified resource mgmt system’)
  • now teaching/learning/research not part of LSPs -> … Ex Libris’s view of a cloud ‘higher education platform’
  • Leganto
    – course reading lists; copyright compliance; integration with Alma/Primo/learning management system
    – improve teaching and learning experience; student engagement; library efficiency; compliance; maximise use of library collections
    – Alma workflows, creation of OpenURLs…
  • Esploro
    – in dev
    – RIMs
    – planning – discovery and analysis – writing – publication – outreach – assessment
    – researchers (publish, publish, publish); librarians (provide research services); research office (increase research funding/impact)
    – [venn diagram] research admin systems [research master]; research data mgmt systems [figshare]; institutional repositories [dspace]; current research information systems [elements]
    – pain points for researchers: too many systems, overhead, lack of incentive, hard to keep public profile up to date
    – for research office – research output of the uni, lack of metrics, hard to track output and impact, risk of noncompliance
    – next gen research repository: all assets; automated capture (don’t expect all content to be in repository); enrichment of metadata
    – showcase research via discovery/portals; automated researcher profiles; research benchmarks/metrics
    – different assets including creative works, research data, activities
    – metadata curation and enrichment (whether direct deposit, mediated deposit, automatic capture) through partnerships with other parties (data then flows both ways, with consent)
    – guiding principles: not to change researchers’ habits; not to create more work for librarians; not to be another ‘point solution’ (interoperable)
    – parses the uploaded PDF for metadata (also checks against Primo etc). Keywords suggested based on researcher profile
    – deposit management, APC requests, DMP management etc in “Research” tab on Alma
    – allows analytics of eg journals in library containing articles published by faculty
    – tries to track relationships with datasets
    – public view essentially a discovery layer (it’s very Primo NewUI with bonus document viewer – possibly just an extra view) for research assets – colocates article with related dataset
    – however they have essentially ruled research administration systems out of scope, starting where their strength is (though they do have Pivot).

EZproxy log monitoring with Splunk for security management #anzreg2018

Ingesting EZproxy logs into Splunk. Proactive security breach management and generating rich eResource metrics
Linda Farrall, Monash University

Use Alma Analytics for usage, but also use EZproxy logs.

EZproxy is locally hosted and administered by library/IT. On- and off-campus access is through EZproxy where possible, and Monash has always used EZproxy logs to report on access statistics. (For some vendors it’s the only stats available.) Used a Python script to generate HTML and CSV files.
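(Not their actual script, but a minimal sketch of that kind of log-to-CSV reporting, assuming EZproxy’s default Apache-style log format – the field layout and file names here are my own guesses:)

```python
import csv
import re
from collections import Counter

# Minimal sketch only (not Monash's actual script): assumes EZproxy's default
# Apache-style log format: host ident user [timestamp] "request" status bytes
LOG_LINE = re.compile(
    r'(?P<host>\S+) \S+ (?P<user>\S+) \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<bytes>\S+)'
)

def usage_by_user(log_path):
    """Tally requests and bytes transferred per authenticated user."""
    requests, bytes_sent = Counter(), Counter()
    with open(log_path, encoding="utf-8", errors="replace") as f:
        for line in f:
            m = LOG_LINE.match(line)
            if not m or m["user"] == "-":
                continue
            requests[m["user"]] += 1
            if m["bytes"].isdigit():
                bytes_sent[m["user"]] += int(m["bytes"])
    return requests, bytes_sent

def write_csv(requests, bytes_sent, out_path="ezproxy_usage.csv"):
    """Write one row per user, busiest first."""
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["user", "requests", "bytes"])
        for user, n in requests.most_common():
            writer.writerow([user, n, bytes_sent[user]])
```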

Maintenance was hard, logs got bigger so execution took longer, the Python libraries were no longer supported, and statistics were skewed by EZproxy misuse/compromised accounts. So moved to Splunk (the university already had the enterprise version) to ingest logs; can then enrich with faculty data, and improve detection of compromised accounts.

EZproxy misuse – mostly excessive downloads, eg using script or browser plugin – related to study but the amount triggers vendor Concerns (ie block all university access) – in this case check in with user to make sure it was them and sort out the issue. Or compromised accounts due to phishing. Have created a process to identify issues and block the account until ITS educates the user (because phishing emails will get sent to the same person who fell for it last time).

Pre-Splunk, it was time-consuming to monitor logs and investigate. Python script monitoring downloads no longer worked due to change of file size/number involved in typical download.

Most compromised accounts from Canada, US, Europe – in Splunk can look at reports where a user has bounced between a few countries within one week. Can look at total download size (file numbers, file size) – and can then join these two reports to look for accounts downloading a lot from a lot of countries.
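(The same join logic, sketched outside Splunk with pandas – the column names and thresholds below are my assumptions, not Monash’s actual Splunk fields:)

```python
import pandas as pd

# Illustrative sketch only: the column names (user, country, bytes, timestamp)
# and the thresholds are assumptions, not Monash's actual Splunk fields.
def flag_suspect_accounts(events: pd.DataFrame,
                          min_countries: int = 3,
                          min_gigabytes: float = 5.0) -> pd.DataFrame:
    """Join 'countries per user' with 'download volume per user' over the last week."""
    last_week = events["timestamp"] >= events["timestamp"].max() - pd.Timedelta(days=7)
    week = events[last_week]
    countries = week.groupby("user")["country"].nunique().rename("countries")
    gigabytes = (week.groupby("user")["bytes"].sum() / 1e9).rename("gigabytes")
    report = pd.concat([countries, gigabytes], axis=1)
    suspicious = (report["countries"] >= min_countries) & (report["gigabytes"] >= min_gigabytes)
    return report[suspicious].sort_values("gigabytes", ascending=False)
```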

To investigate have to go into identity management accounts – but can then see all their private data. Once they integrate faculty information into Splunk they don’t have to look them up so can actually enhance privacy – can see downloading lots of engineering data but are actually in engineering faculty so probably okay.

In 2016 had 10 incidents with resources blocked by vendors for 26 days. In 2017 16 incidents (all before August when started using Splunk). In 2018, 0 incidents of blocking – because they’re staying on top of compromised accounts (identifying an average of 4 a week) and taking pre-emptive action (see an issue, block the account, notify the vendor). Also now have a very good relationship with IEEE! (Notes that when IEEE alerts you to an issue it’s always a compromised account, there’s never any other explanation.)

Typically account compromised; tested quietly over several days; then sold on and used heavily. If a university hasn’t been targeted yet, it will be. By detecting accounts downloading data, are also protecting the university from other damage they can cause to university systems.

Notes that each university will have different patterns of normal use: you get to know your own data.

Lots of vendors moving to SSO. Plan to do SSO through EZproxy – haven’t done it yet so not sure whether it’ll work, but testing it within a couple of months. ITS will implement SSO logging for the university, so hopefully they’ll pick up issues before it gets to EZproxy. Actively asking vendors to do it through IP recognition/EZproxy.

E-resource usage analytics in Alma #anzreg2018

Pillars in the Mist: Supporting Effective Decision-making with Statistical Analysis of SUSHI and COUNTER Usage Reports
Aleksandra Petrovic, University of Auckland

Increasing call for evidence-based decision making in combination with rising importance of e-resources (from 60% -> 87% of collection in last ten years), in context of decreasing budget and changes in user behaviour.

Options: EBSCO usage consolidations, Alma analytics or Journal Usage Statistics Portal (JUSP). Pros of Alma: no additional fees; part of existing system; no restrictions for historical records; could modify/enhance reports; could have input in future development. But does involve more work than other systems.

Workflow: harvest data by manual methods; automatic receipt of reports, mostly COUNTER; receipt by email. All go into Alma Analytics, then create reports, analyse, make subscription decisions.

Use the Pareto Principle eg 20% of vendors responsible for 80% of usage. Similarly 80% of project time spent in data gathering creates 20% of business value; 20% of time spent in analysis for 80% of value.
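(A toy illustration of that Pareto check against vendor usage – the vendors and numbers are invented:)

```python
# Toy illustration of the Pareto check: what share of vendors accounts
# for 80% of total usage? All figures invented for the example.
usage_by_vendor = {
    "Vendor A": 52000, "Vendor B": 31000, "Vendor C": 9000,
    "Vendor D": 4000, "Vendor E": 2500, "Vendor F": 1500,
}

total = sum(usage_by_vendor.values())
running, top_vendors = 0, []
for vendor, count in sorted(usage_by_vendor.items(), key=lambda kv: kv[1], reverse=True):
    running += count
    top_vendors.append(vendor)
    if running / total >= 0.8:
        break

print(f"{len(top_vendors)}/{len(usage_by_vendor)} vendors "
      f"({len(top_vendors)/len(usage_by_vendor):.0%}) account for "
      f"{running/total:.0%} of usage: {top_vendors}")
```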

Some vendors slow to respond (asking at renewal time increased their motivation….) Harvesting bugs eg issue with JR1. There were reporting failures (especially in move from http to https) and issues tracking the harvesting. Important to monitor what data is being harvested before basing decisions on it! Alma provides a “Missing data” view but can’t export into Excel to filter so created a similar report on Alma Analytics (which they’re willing to share).

So far have 106 SUSHI, 45 manual COUNTER vendors and 17 non-COUNTER vendors. Got stats from 85% of vendors.

Can see trends in open access usage. Can compare whether users are using recent vs older material – drives decisions around backfiles vs rolling embargos. Can look at usage for titles in package – eg one where only three titles had high usage so just bought those and cancelled package.
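(The recent-vs-older comparison boils down to bucketing usage by year of publication – roughly what COUNTER’s JR5 report provides. A sketch with invented column names:)

```python
import pandas as pd

# Sketch only: assumes a table of usage broken down by year of publication
# (roughly what COUNTER's JR5 report provides); column names are invented.
def backfile_share(usage: pd.DataFrame, cutoff_years: int = 5,
                   current_year: int = 2018) -> pd.Series:
    """Per title, the share of requests that hit material older than the cutoff."""
    old = usage["publication_year"] <= current_year - cutoff_years
    total = usage.groupby("title")["requests"].sum()
    old_use = usage[old].groupby("title")["requests"].sum().reindex(total.index, fill_value=0)
    return (old_use / total).sort_values(ascending=False)
```

A title where most use falls on older material argues for a backfile purchase; one that’s almost all recent suits a rolling embargo.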

All reports in one place. Can be imported into Tableau for display/visualisation: a nice cherry on the top.

Cancelling low-use items / reducing duplication has saved money. Hope more vendors will use SUSHI to increase data available. If doing it again would:

  • use a generic contact email for gathering data
  • use the dashboard earlier in the project

Cost per use trickier to get out – especially with exchange rate issues but also sounds like reports don’t quite match up in Alma.

Alma plus JUSP
Julie Wright, University of Adelaide

Moved from using Alma Analytics, to JUSP, to using both. Timeline:

  • Manual analysis of COUNTER: very time intensive: 2-3 weeks each time and wanted to do it monthly…
  • UStat better but only SUSHI, specific reports, and no integration with Alma Analytics
  • Alma Analytics better still but still needs monitoring (see above-mentioned https issues)
  • JUSP – only COUNTER/SUSHI, reports easy and good, but can’t make your own

Alma vs JUSP:

  • Alma: much work; JUSP: easy
  • Alma: complex analyses available; JUSP: only simple reports
  • Alma: only has 12 months of data; JUSP: data back to 2014
  • JUSP: benchmarking, works with vendors on issues, quality control of data

JUSP also has its own SUSHI server – so can harvest from here into Alma. This causes issues with duplicate data when the publishers don’t match exactly. Eg JUSP shows “BioOne” when there are actually various publishers; or “Wiley” when Alma has “John Wiley and Sons”. Might need to delete all Alma data and use only JUSP data.
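(One way to flag likely name mismatches before trusting the combined figures – a sketch loosely based on the examples above, with the matching rule entirely my own:)

```python
from difflib import SequenceMatcher
from itertools import product

# Sketch only: flag Alma/JUSP publisher names that look like the same publisher
# spelled differently, so usage isn't counted twice when both feeds are loaded.
alma_publishers = {"John Wiley and Sons", "Elsevier", "BioOne"}
jusp_publishers = {"Wiley", "Elsevier", "BioOne Complete"}

def looks_like_same_publisher(a: str, b: str, threshold: float = 0.8) -> bool:
    a_l, b_l = a.lower(), b.lower()
    if a_l in b_l or b_l in a_l:      # e.g. "Wiley" within "John Wiley and Sons"
        return True
    return SequenceMatcher(None, a_l, b_l).ratio() >= threshold

for alma_name, jusp_name in product(alma_publishers, jusp_publishers):
    if alma_name != jusp_name and looks_like_same_publisher(alma_name, jusp_name):
        print(f"Possible duplicate feed: Alma '{alma_name}' vs JUSP '{jusp_name}'")
```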

Round-up of LIANZA 2017 sessions #open17

Below is a summary-of-the-summary of some of the LIANZA 2017 sessions I attended (some others were too participatory to allow live-blogging, or I ran out of brain) with key points and highlighting of things I particularly want to remember for some reason; no value judgements to be implied by the lack thereof!

Sunday

  • The dangerous myth about librarians – libraries are powerful and words have power so stop with the ‘save our libraries’ rhetoric. Stop relying on how ‘obvious’ our value is; stop being lazy about biculturalism; value ourselves, have courage, be visible.

Monday

Tuesday

  • Huakina te whare ki te ao – background and examples of Ngā Upoko Tukutuku (Māori subject headings)
  • What’s going on with ebook usage? – public library context, did lots of work extracting usage data and combining with patron data, plus surveying satisfaction
  • Games for learning – focusing on the learning around making games rather than playing them, and particularly using the presenter’s Gamefroot platform
  • Opening up licensing agreements – the kinds of terms we should be clarifying with database vendors, and how we convey this to users (particularly in Alma – we could be doing this a bit on the journal level now, though not on the article level)
  • The Future of the Commons – looking at Creative Commons (and the commons in general) from the point of view of the social systems supporting the commons, and in relation to the state and the market.
  • Enhancing library services with a journey mapping approach – a user experience methodology with a focus on the user’s emotions. Looking at what the user does and how they feel at each stage of carrying out a particular task/heading towards a particular goal.

Wednesday

The Anthropologist’s Tale: A Caution – Donna Lanclos #open17

Anthropologists get to do the work they do because someone lets them in. Listen, collect, collate, interpret, and tell stories. Stories are data – ways of representing and interpreting reality. She studies the ‘village’ of academia, investigating the logic behind the behaviours in academia – students, academics, others.

Example of bowdlerised version of Chaucer’s “Wife’s Tale” when she was in high school – she wanted the real story. Also as a folklorist, very aware of different versions of stories. There’s meaning not just in the story but in the fact that there are different versions. Who tells the tale informs how it’s told.

Early anthropology work was literally a tool of the Man. Finding out more about a people in order to colonise and control them. Eg “The Nuer” by Evans-Pritchard. Franz Boas, ‘the father of anthropology’, worked at a time when Native American groups were the object of study because people believed they were ‘disappearing’ (a framing that ignores the agency of colonisation). In WWII armchair anthropology by Ruth Benedict informed the post-war occupation strategy of the US in Japan. Margaret Mead worked in Samoa and with other peoples in the Pacific – many issues around whose stories she told and why. But her purposes shifted from institutional control to understanding. Wanted to make the unfamiliar familiar and relatable. Also to make the familiar unfamiliar – so people can look at things they’ve always done and wonder why.

Moving to libraries. Andrew Carnegie (as a retirement project from his life as a robber baron) founded lots of libraries all over US, UK, NZ, basically everywhere – to impose his ideas of what communities should have. There was an application process – communities wanted to be associated with the respectability and power. Libraries as colonising structures. And assumption that if you don’t put a library there, don’t establish a colonial government, there won’t be anything. It ignores what’s already there. There were people long before there were libraries.

Colonising impulse in libraries:

  • When she presents on student behaviour (googling, citing Wikipedia, not putting materials in the IR) she talks about motivations, the conflicting messages people get around these, and the ways these things make sense to people where they are. And gets the question “So how do we get them to change their behaviour?” Wants the idea of what’s “best” to fall away. Listen to what people do, understand why.
  • When she proposes open-ended investigations, eg day-in-the-life studies geolocating emotions across various institutions to look at the pattern of their lives – no particular question or problem in mind, just wanting to know what it looked like – she often gets asked, “How will that help me solve [very specific problem]?” Exploratory research isn’t about solving problems, it’s about gaining insight.

You don’t do anthropology to shift how they do library things; you do it so the library can shift its practices. How do we listen? How do we change? Study people not to control, but to connect. We don’t want to be the colonising library! We may think we’re powerless, but have so much more power than our users, so have a responsibility to be careful.

Approaches beyond ‘solutionism’:

  1. Syncretism: cobbling together, where you can see the component parts. In libraries, users already have a fully formed set of practices. They’ll make room for new ones if they’re useful. We should expect to be taught by them, as we teach them, what libraries mean for them.
  2. Decolonisation – listen to users, make space so the definition of what a library is emerges from the community. (cf Linda Tuhiwai-Smith’s “Decolonising Methodologies”)
  3. Community – not just responsible to users but to the whole community. (Public libraries are good at this.) Anthropological approaches can help if moving away from colonialism.

“Trying to predict the future is a really neat way of avoiding talking about the present.”

Open data? Perceptions of barriers to research data-sharing – Jo Simons #open17

Many aspects of open data – today focusing on research data, ie created by research projects at an institution.

Research workflow is very complex but to really simplify: researchers start a project, get lots of data, and summarise results in journals.  But it’s not the data – it’s a summary of the data with maybe a few key examples. The rest goes to places where only the researcher can access it.

Why do we care?

  • for the good of all
  • expensive to generate so want to maximise use eg validate, meta-analyses, used in different ways
  • much funded by government therefore taxpayer – so they should be able to access it

Used to work in a group which shared greenhouse space but had no idea what else was in there. Proposed sharing basic information about what was there and what to do in case of emergency – and was shocked when some said no. Supervisor said don’t let it stop you asking the question, but yeah, that’ll happen.

When requesting data, the odds of it being extant decrease 17% each year. (cite: Vines (2013) 10.1016/j.cub.2013.11.014)
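(A rough illustration of how quickly that compounds – my arithmetic, not from the talk, and assuming a starting point of 80% availability:)

```python
# Rough illustration of "odds of the data being extant fall 17% a year".
# Assumes the data start at 80% availability (an invented figure) and that
# the 17% applies to the odds, p / (1 - p), which is how the paper reports it.
odds = 0.80 / 0.20
for year in range(1, 21):
    odds *= 0.83                      # 17% drop in the odds each year
    if year in (5, 10, 20):
        p = odds / (1 + odds)
        print(f"after {year:2d} years: ~{p:.0%} chance the data still exist")
```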

This is where academic libraries come in – getting the data off the USB drives. So need to understand why they might not want to share. Did interviews to inform survey construction to get info from more people. 102 responses from researchers across 10 disciplines; 18 from librarians (about 20% response rate).

Do librarians and researchers agree on the major drivers that determine whether researchers choose to share their data?

Is data-sharing part of the research culture? Librarians: 7% said common/essential; researchers 26%

Factors influencing data-sharing

  • agreement in some areas eg ability to publish, inappropriate use, copyright and IP pretty high; then resources, interest to others, system structure and data access
  • differences: librarians thought institutional policy, system integration very important; funder policy, system usability somewhat important – all very low for researchers. What was important for researchers were: ethics (>40%); culture, research quality (10-15%); data preservation, publisher policy (5-10%)

Are there differences across major disciplines in what those drivers are?

5 disciplines with 10+ responses: business, medicine/health, phys/chem/earth; life sci/bio; soc sci/education. Ethics important for most but not a high-ranking factor for phys/chem/earth due to nature of their data. Whereas data preservation/archiving is more important for them (and med/health), somewhat important for life sci and soc sci, while business barely cared.

Take home

So consult with your community to find out what’s worrying them. Target those concerns in promotion and training. Eg we know system usability is important so definitely fix it – but don’t waste your communication opportunities talking about it when they’re worried about other things.

Learnt it on the grapevine – Pat Mock, Jenny Kirkwood #open17

Lots of e-resources that need a certain amount of skill to use. But they don’t have a trainer, so implementing training isn’t manageable – fitting it into schedules is hard. Training isn’t always motivating – especially hard for the trainer when trainees forget everything they’ve been told – only remember who the expert was “and it wasn’t them”.

Did research and found the brain is designed to shed information. 50% of what you hear will be gone within an hour. Unless you can convince your brain you’re going to need it again – this is the key to their new system, “grapevine training”.

Short 10-15min sessions where person A trains B -> C -> D … -> A. Different topic starting a chain every few weeks. Done for technical issues, work processes, etc.

Staff like the format – get engaged working one-on-one. Often work together longer than the session intended, and the first staff member gets more out than they put in. More confident demonstrating to the public because they’ve already demo’d to each other.

Not perfect each time. One problem is that once a chain sets off it’s hard to track how far it’s progressed – so they created a document where staff tick when training is received and passed on.

Usually reference staff are responsible for training so they started kicking off the training but when they got a bit tired of this, other staff got asked to kick off chains. Staff are now using chains when they want to use a skill.

Takes the expert out of the equation so staff are now more empowered. Doing better with familiarity with resources by engaging staff.

Did they check this doesn’t end up like Chinese Whispers? Actually didn’t. Theoretically the last person gives it back to person A but in practice the chains broke first. But didn’t find that it got distorted. Sometimes you get something different but not wrong – they’d just gone off on a tangent.

May not work in big systems – online document to track helped but easier in smaller organisation.

For a short thing, can have one person teach two and spreads faster – pyramid style.

Who initiates? Still mostly the reference team. But very successful when others start. Requires one of the reference team to push it at the start.

Have considered trying it with school classes too – haven’t had a chance to try that yet.

What about capturing notes from people along the chain?

What happens when the chain breaks? You can prod people. But if people really don’t want to learn, so be it. Has worked better and for longer than anything else.

They set a time limit, not always met.

Is there a structured chain? In the start, yes, but really labour-intensive and would break when someone went on leave. More flexible when there’s an online form as staff can find someone available.

Finding our happy place at work – Cath Sheard #open17

Used to feel pressured, even bullied. Some used to go home and cry. Now happy, confident, prepared to take risks.

When took over as manager instituted regular one-on-one meetings with staff. 95% of time has an open door policy. So staff chat regularly, meaning when it’s time for a harder conversation it’s okay because they already know each other.

Cut back on the number of events because staff were too stressed. Since then the number of events has gone back up – because the pressure is off and they’re enjoying it again.

Stopped being involved in things just because she could. Doesn’t need to check every poster or sign – no need to do quality control because she trusts staff. So she has more time and staff feel trusted. If she needs things changed, staff know there’s a real reason.

Did Myers-Briggs to ‘know yourself’.  Strengths and weaknesses; what makes you happy or frustrated? No use implementing a change that’ll make you cross!

Look for quick wins so you can see it’ll work and there’s benefits to making changes. Acknowledge the wins. Be prepared for multiple iterations.

The summer boutique library – Daille White #open17

The summer boutique library – closing the library but keeping the doors open
Daille White, Jane Brooker, and Lucy Lang – Victoria University of Wellington

Completely refurbishing the library so it closed for 3 months; wanted to create a small library as a temporary replacement to provide resources/services over the summer trimester.

Lots of communication, especially over Facebook and face-to-face meetings, which allayed fears and panic.

(‘Before’ photos of the “dingy, cramped” library actually look pretty nice! ‘After’ photos have replaced brightly painted walls with timber and glass, which is also very smart. Black shelving was the architect’s vision and not their own preference though…)

Various plans were made and changed. Ended up moving to a low-use student common room. Small area but looked clean and bright. Informal atmosphere. Open 9-5 Mon-Fri, which was plenty – being staffed by core people strengthened relationships. Users had to consult for help and discovered librarians are actually really nice. 🙂

Selected high-use material, course materials, and requested material to put into the temporary collection. Also promoted extended loan. No complaints about interloans as delivery was very quick – some even commented it was better than before. Some found items in the smaller collection that they didn’t know the library had. Academics were particularly interested in the new items display.

It had always been planned to change the service model, to be in the space with clients around them – more visible and approachable. The summer boutique library helped.