
Using APIs to enhance the user experience #anzreg2019

Using APIs to enhance the user experience
Euwe Ermita

Went live with Primo and Alma in 2017, and with Rosetta and Adlib the same year. Trying to customise interfaces to fit user needs and reach parity with the previous system.

Adlib (manuscripts, oral history and pictures catalogue) with thumbnails pointing back to Rosetta. Primo doesn’t do hierarchies well, but Adlib can show a collection in context. However, Adlib is a different technology stack – .NET, while their developers were used to other technologies – so they had to bring in skills.

Still getting lots of feedback that the experience is inconsistent between website, catalogue, collection viewer, etc. Viewers would get lost. System performance was slow for large collections, with downtime around many release dates.


Options considered:

  • do nothing (and hide from users)
  • configure out of box – but hitting diminishing returns
  • decouple user interfaces (where user interface is separate from the application, connected via web services)

Application portfolio management strategy

  • systems of record – I know exactly what I want and it doesn’t have to be unique (eg Rosetta, Alma) – longer lifespan, maintain tight control
  • systems of differentiation – I know what I want but it needs to be different from competitors (eg Primo, their own website)
  • systems of innovation – I don’t know what I want, I need to experiment (developing their own new interfaces) – shorter lifespan, disruptive thinking

But most important is having a good service layer in the middle.

Lots of caching so even if Alma/Primo go down can still serve a lot of content.

Apigee API management layer – an important feature is the response cache, so API responses get stored ‘forever’. This cuts response time to 1/180 and cuts load on back-end systems, avoiding hitting the API limit. This layer is also handy if you want to make your data open: whatever system you have behind the scenes, the links you give users don’t change. You can also give users a customised API (rather than giving them a key to your backend system).
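
The response-cache idea can be sketched as a thin mediation layer in front of the back end. This is a hypothetical illustration of the pattern, not Apigee’s actual configuration; the class and parameter names are made up:

```python
import time

class CachingApiProxy:
    """Illustrative API-mediation layer: caches back-end responses so
    repeat requests are served from the cache instead of hitting the
    (slow, rate-limited) back-end system."""

    def __init__(self, fetch, ttl_seconds=None):
        self.fetch = fetch          # function that calls the real back end
        self.ttl = ttl_seconds      # None = cache 'forever'
        self.cache = {}             # url -> (timestamp, response)
        self.backend_calls = 0      # for observing back-end load

    def get(self, url):
        entry = self.cache.get(url)
        if entry is not None:
            ts, response = entry
            if self.ttl is None or time.time() - ts < self.ttl:
                return response     # cache hit: fast, no back-end load
        response = self.fetch(url)  # cache miss: one call to the back end
        self.backend_calls += 1
        self.cache[url] = (time.time(), response)
        return response
```

Because consumers only ever see the proxy’s URLs, the back-end system can be swapped or go down without breaking the links given to users.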

MASA – Mesh App and Service Architecture. Want to get rid of point-to-point integrations: if one point changes, you have to update all your integrations. With a mesh, you just update the single point-to-mesh connection.
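
A minimal sketch of the mesh idea (hypothetical names, not a real MASA product): each system registers one connection with the mesh, and consumers call capabilities by name, so replacing a back end means updating one registration rather than every integration:

```python
class ServiceMesh:
    """Illustrative mesh layer: systems connect once to the mesh
    instead of point-to-point to every other system."""

    def __init__(self):
        self.services = {}  # capability name -> handler

    def register(self, capability, handler):
        # Swapping a back end means re-registering one handler,
        # not updating every consumer's integration.
        self.services[capability] = handler

    def call(self, capability, *args, **kwargs):
        return self.services[capability](*args, **kwargs)

mesh = ServiceMesh()
mesh.register("get_record", lambda rid: {"id": rid, "source": "alma"})

# All consumers go through the mesh, never directly to the back end:
record = mesh.call("get_record", "12345")

# Replacing the back end later only changes the registration:
mesh.register("get_record", lambda rid: {"id": rid, "source": "new-system"})
```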

Have done an internal prototype release, looking at pushing out to public end of this year/early next year.


  • Important to have an application strategy – use systems for their strengths (whether that’s data or usability)
  • Don’t over-customise systems of record: it creates technical debt. Every time there’s an upgrade you have to re-test, re-customise
  • Play with API mediation/management – lots of free tools out there
  • Align technology with business strategy

What do users want from Primo? #anzreg2019

What do users want from Primo? Or how to get the evidence you need to understand user behaviour in Primo.
Rachelle Orodio & Megan Lee, Monash University

Surveyed users about the most important services: #4 is LibrarySearch letting them use it quickly; #9 is off-campus access. Feedback that LibrarySearch is “very slow and bulky”, accessing articles “takes way too many steps”, search results “hard to navigate”, “links don’t work”.

Project with strategic objectives, success factors, milestones, etc.

Started by gathering data on user behaviour – Primo usage logs, Primo/Alma analytics, Google analytics. Ingested into Splunk. Got a large dataset: http://tinyurl.com/y5k4nzr4 

How users search:

  • 90% start on basic screen, and 98% use the “All resources” scope (not collections, online articles, popular databases) – basically using the defaults.
  • Only 15% sign in during a session. 51% click on availability statement, 45% click on ViewIt links. Sometimes filter by facets, rarely sort. Don’t display reviews or use tags; don’t browse, don’t use lateral or enrichment links. Little take up of citation export, save session/query, add to eShelf, etc.
  • Most searches are 2-4 words long: 69% are less than 7 words, but 14% are longer than 50 words! 1.13% of searches are unsuccessful
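
An illustrative sketch (not the actual Splunk query) of how query-length stats like those above could be derived from a set of raw search strings:

```python
def query_length_stats(queries):
    """Summarise word-length distribution for a list of search strings."""
    lengths = [len(q.split()) for q in queries]
    n = len(lengths)
    return {
        "median_words": sorted(lengths)[n // 2],
        "pct_under_7_words": 100 * sum(1 for x in lengths if x < 7) / n,
        "pct_over_50_words": 100 * sum(1 for x in lengths if x > 50) / n,
    }

# Made-up sample data; real input would come from the Primo usage logs.
sample = [
    "climate change",
    "nursing ethics case studies",
    "a " * 60,   # e.g. a whole abstract pasted into the search box
    "primo",
]
stats = query_length_stats(sample)
```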

Two rounds of user testing. Based on the Splunk analytics, designed two views (one similar to classic, one stripped down) and ran think-aloud tests with 10 students using these views, along with pre-test and post-test surveys. Results were classified into: user education, system changes, system limitations. System changes were made and testing was rerun with another group of students. Testing kits at https://tinyurl.com/y4fgwhhx


Findings:

  • Searching for authoritative information – students start at Google Scholar and databases, only going to Primo if they hit a paywall.
  • Preferred the simplified view. Said the most useful features were advanced search, favourites, and citation links to styles – but this wasn’t borne out by observations
  • Liked the “Download now” (LibKey I think) feature and wanted it everywhere


Observations and resulting changes:

  • Users only sign in if they need to, e.g. to check loans or read articles – so want to educate users and enable auto-login
  • Only a few individuals use advanced search
  • Users don’t change the scope – renamed scopes and enabled auto-complete
  • Users prefer a few facets – simplified the list of facets
  • Users don’t change the sorting order – changed its location and educating
  • Users want fewer clicks to get full text
  • Users aren’t familiar with saved queries – needs education

Put the new UI in beta for a couple of months, ran roadshows and blog communications, and added a Hotjar feedback widget into the new UI. Responses average a 2.3 rating out of 5 – hoping that people happy with things aren’t complaining. Can see that people are using facets, EndNote desktop and citation links, and labels on the item page.

Feedback themes – mostly searching, getIt and viewIt access.

Q: You want to do more user education – have you done anything on education at point-of-need, i.e. on Primo itself?
A: Redesigning Primo LibGuide, investigating maybe creating a chatbot. Some subject librarians are embedded in faculty so sometimes even involved in lectures.

Primo out of the box #anzreg2018

Primo out of the box: Making the box work for you
Stacey Van Groll, UQ

Core philosophy – maintain out-of-the-box unless there’s a strong use case, user feedback, or bug. Focus on core high-use features like basic search (rather than browse) and search refinement (rather than my account). Stable and reliable discovery interface; quick and seamless resource access.

Said yes to:

  • UQ stylesheet – one search scope, one view, one tab, their own prefilters on the library homepage (a drop-down menu – includes some Primo things like newspaper search, some non-Primo things)

Said no to:

  • Journals A-Z
  • Citation linker
  • Purchase requests
  • main menu
  • Featured Results
  • Collection Discovery
  • Tags & Reviews
  • Database search (for now)
  • Newspaper search (for now)
  • Resource recommender (for now)

Dev work for some things – e.g. tweaked the log-out functionality to address an issue; then Primo improved something, which broke their fix; they fixed the fix; the next release was okay; the release after that broke it again; so they reviewed and went back to out-of-the-box. An example of the downsides of making tweaks.

But sometimes you really need to make a change – consider the drivers and use cases: who, and how many people, experience the problem? How much work is it to make the change, and how much to maintain it? Is there existing functionality in the product or on the Roadmap? How do you measure success?

Does environmental scans – keeps bookmarks of other Primo NewUI sites to see what other people do and how.

Data analysis – lots of bugs in My Account but also very low usage. So doesn’t put much work in, just submits a Salesforce case then forgets.

Evaluates new releases – likes to piggyback on these eg adding OA and peer-reviewed tags to institutional repository norm rules.

User feedback – classify by how common the complaint is and try to address most common.


When raising issues:

  • first, the Knowledge Centre Feedback feature, including an email address, which forces a response
  • second, the listserv
  • third, Salesforce, and then escalation channels if needed

Lessons learned: a good Salesforce case has a single problem, includes screenshots, and explains what behaviour you desire.

Journey mapping approach – Maxine Ramsay #open17

Enhancing library services with a journey mapping approach
“Journey maps illustrate customers’ stories.” – Kerry Bodine. About user experience – not just the step-by-step process but also the user’s emotions over time. We often make a lot of assumptions; journey mapping is a way to find out what’s really happening from the user’s perspective.
Journey-mapped all 500 students at an intermediate school, especially interested in:
  • taking shoes off at door
  • usage of OPAC
  • use of AccessIt’s OneSearch system for database search

Created a stylised journey map template to prompt where feedback was wanted. Explained to teachers how it’d work. Trialed with one class, then refined it, as they had to explain to students that it wasn’t a test. It’s hard for students in this age group to give their own opinion without knowing what librarians “want them to write”.

As you come into the foyer, thoughts include:

  • too full, smells bad, keen to find a good book, taking off shoes OK, taking off shoes a pain, untidy – note that negative feelings about taking off shoes seem much higher for year 7 than year 8

The exciting part was the actions taken as a result of the report:

  • e.g. scrapped the ‘no shoes in the library’ rule
  • Promoted the IP address for the catalogue as a mural on the wall
  • Found students weren’t confident searching the catalogue, so extended catalogue teaching – now goes into classrooms to teach it
  • Students found it hard to navigate around lots of furniture, so freed up some space
  • Trialed a self-issue desk, but it wasn’t totally reliable, so scrapped that and introduced extra student librarians to reduce queues of students

Lessons learned:

  • Focus on one aspect of student experience / one user goal, not entire experience
  • Good to see what the pain points are
  • Students reacted really well to immediate changes


A process for trying journey mapping:

  • collaborate – who will you work with to trial the approach? Consider working with people trialling it in other sectors
  • decide – which user goal/journey will you focus on, and which user group (or non-user group) will you target?
  • map – what tools and resources do you need? Develop simple templates, or set up video diaries – just think about how you’re going to collate it all at the other end; and think about resources for recruitment
  • analyse – how will you use the data/evidence? How will you present it (and recommendations) to others in the team?
  • act – what resources do you need to implement any changes? When you’re seen to act on feedback it reinforces that you’re user-centred, makes users more likely to participate later, and gives them greater ownership of the library
  • evaluate – the information collected, the process, and the impact of changes

(Or could use Matt Finch’s “Who/What/Where/When/How” process.)

Could also journey map the ideal experience and then identify the gaps.