
You are what you count #anzreg2019

You are what you count
Rachelle Orodio & Megan Lee, Monash University

Very often we count what’s easy to count rather than what’s meaningful. Monash created a project that began by identifying which metrics the library should collect.

Principles: metrics should be strategic, purposeful, attributable, systematic, consistent, accurate, secure and accessible, efficient, and integrated. They wanted the metrics to reflect key library activities.

Identified 35 metrics – 18 were manually recorded into Google Forms, Qualtrics and other temporary storage. All needed to be pulled into one place so the data could be cross-referenced and visualisations created. Data is only valuable if it can be used and shared.

Looked at Tableau, Splunk, Power BI (uni-preferred for use with data warehouse), Excel, OpenRefine, Google Data Studio.

Data sources: Alma/Primo analytics, Google analytics, EZproxy, Figshare, Libcal/LibGuides, the people counter, and custom software, spreadsheets, forms, manual recording. Quarterly email for collection of manual data.

Dashboard in Tableau with eg number of searches in Primo, how many searches produce zero results. Usage of discussion rooms vs availability. Tableau provides sophisticated visualisations, integrates with lots of sources and is great for large datasets. But expensive annual fees, needs a server environment to share reports securely, and not as easy to use as PowerBI.

Power BI example showing reference queries. Easy to learn, and most functionality is available in the free version; full control over the layout; changes in one graph are reflected immediately in the others, eg when you filter to one library. But to share the interactive version the other person also needs a license – or you pay thousands of dollars for a cloud computing license.

Alma Analytics FTP – used for new titles list. Create report, schedule a job, FTP, then process files, upload to LibraryThing to get bookcovers in a carousel.
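The FTP step above can be sketched in Python – a hedged sketch only, since the talk didn’t show code; the FTP host, report path, and the `ISBN` column name are assumptions, and the LibraryThing upload is left out:

```python
import csv
import io
from ftplib import FTP

def extract_isbns(csv_text):
    """Pull normalised ISBNs out of the new-titles CSV exported by Alma Analytics.

    The column name 'ISBN' is an assumption; adjust to match the report layout.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    isbns = []
    for row in reader:
        raw = (row.get("ISBN") or "").split(";")[0]  # keep first ISBN if several
        isbn = raw.replace("-", "").strip()
        if isbn:
            isbns.append(isbn)
    return isbns

def fetch_new_titles(host, user, password, path):
    """Download the scheduled report from the Alma Analytics FTP drop."""
    buf = io.BytesIO()
    with FTP(host) as ftp:
        ftp.login(user, password)
        ftp.retrbinary(f"RETR {path}", buf.write)
    return buf.getvalue().decode("utf-8")
```

The ISBN list would then feed the LibraryThing cover carousel.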

Project is ongoing. Scoping is important. Lots of info you could present, have to select the key data based on target audience, their needs etc.

Harnessing Alma Analytics and R/RShiny for Insights #anzreg2019

Harnessing Alma Analytics and R/RShiny for Insights
David Lewis & Drew Fordham, Curtin University

Interactive visualisation tools are useful as they let the user choose (within parameters) what they want to see. Alma Analytics was a bit limited. Looked at products like Tableau, but it’s mostly for visualisation (and expensive), albeit easy to use. R/RShiny is free to install on the desktop; more of a learning curve, but worth it.

Early successes:

  • in exporting Analytics -> CSV -> clean with R -> reimport into Alma. A weeding project with printouts of the whole collection had been highly manual, error-prone, and seemingly endless. With R they ran logic over the entire collection and could print targeted pick lists for closer investigation. Massively accelerated deselection.
  • Could also fine-tune the shelving roster over the semester, which saved money.
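The export-clean-pick-list step in the first bullet can be sketched as follows. The speakers used R; this is a stdlib-Python stand-in, and the column names and thresholds are assumptions:

```python
import csv
import io

def weeding_pick_list(csv_text, max_loans=0, max_year=2005):
    """Turn an Alma Analytics export into a targeted weeding pick list.

    Column names ('Loans', 'Publication Year', 'Call Number') and the
    default cut-offs are invented for illustration.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    picks = [row for row in reader
             if int(row["Loans"]) <= max_loans
             and int(row["Publication Year"]) <= max_year]
    # Sort into shelf order so staff can walk the stacks with the list
    return sorted(picks, key=lambda r: r["Call Number"])
```

Running logic like this over the whole collection replaces the manual trawl through printouts.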

Refurbishment modelling needed to create a low-use compactus collection. Created model of previous semester as if the collection had been shelved that way, to see what would actually need to be moved back and forth. Let people explore parameters. Ended up deciding that there’d be a lot of movement in and out of the open access collection and would still require a lot of staff effort – so needed to make the compactus open access, not closed access.

Getting started with Alma Analytics and the Trove API: started with documentation, then experimenting. Found the only match point was the ISBN. Record structures are complex, so they needed to know which substructures were relevant. Created a test SQL schema and started trying test queries. Next phase: took 3–4 days to get all their holdings in Trove. Then started importing into an SQL database; views were cumbersome, so they created a table from the view and indexed it, which proved a lot faster.
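The view-to-indexed-table speed fix can be illustrated like this, with sqlite3 standing in for their SQL database (the schema and column names are invented):

```python
import sqlite3

# Materialise a cumbersome view into a table and index the match point.
# sqlite3 stands in for their SQL database; the schema is invented.
con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE holdings (isbn TEXT, title TEXT, shared_libraries INTEGER)")
cur.executemany("INSERT INTO holdings VALUES (?, ?, ?)",
                [("9780143127741", "Title A", 3),
                 ("9780306406157", "Title B", 0)])
# The original approach: query a view, recomputed on every access
cur.execute("CREATE VIEW shared AS SELECT isbn, shared_libraries "
            "FROM holdings WHERE shared_libraries > 0")
# The faster approach: create a table from the view, then index it
cur.execute("CREATE TABLE shared_t AS SELECT * FROM shared")
cur.execute("CREATE INDEX idx_shared_isbn ON shared_t (isbn)")
rows = cur.execute("SELECT isbn FROM shared_t").fetchall()
```

Queries against the indexed table avoid re-running the view’s logic each time, which is where the speed-up comes from.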

Visualisation examples:

  • number of libraries with shared holdings – in WA, interstate, or both; at university libraries, other libraries, or both; not borrowed since [date slider input].
  • usage by call number – user can select call number range, not borrowed since, etc.
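The filters behind these views can be mimicked in plain Python – a sketch only, as a stand-in for the RShiny inputs; the field names are invented:

```python
from datetime import date

def filter_not_borrowed(items, call_lo="300", call_hi="400",
                        not_borrowed_since=date(2015, 1, 1)):
    """Mimic the dashboard inputs: call number range + 'not borrowed since' slider.

    Each item is assumed to be a dict with 'call_number' and 'last_loan' keys.
    """
    return [it for it in items
            if call_lo <= it["call_number"] < call_hi
            and it["last_loan"] < not_borrowed_since]
```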

Expanded professional networks in the process, and their analyses are making a lot of impact.

Creating actionable data using Alma Analytics #anzreg2019

Beyond the numbers: creating actionable data using Alma Analytics dashboard
Aleksandra Petrovic, University of Auckland

Using analytics to inform relegation of print resources (to off-site storage) and retention (on main shelves).

Alma Analytics lets you create very detailed reports, but it’s a fair amount of work, especially the data cleaning and analysis needed to reach 100% accuracy. A lower-accuracy option using the dashboard is much quicker. Visualisations they used included:

  • Overview by subject, showing how many items had no, low, medium, or high usage in different subjects based on checkout history
  • Overview of usage by publication year bands
  • Overview of usage of possible duplicates in different subjects
  • Overview of weeding reports that could be investigated more closely
  • Overview of books needing preservation
  • Quick stats, eg monograph count, zero uses, low uses, over 10 years old, possible duplicates – per library
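The banding in the subject view can be sketched as below. The talk didn’t give the actual cut-offs, so the thresholds here are illustrative:

```python
from collections import Counter

def usage_band(checkouts, low=3, medium=10):
    """Bucket an item's checkout history into the dashboard's usage bands.

    The cut-offs (3, 10) are invented; the Auckland thresholds weren't given.
    """
    if checkouts == 0:
        return "no usage"
    if checkouts <= low:
        return "low usage"
    if checkouts <= medium:
        return "medium usage"
    return "high usage"

def subject_overview(checkout_counts):
    """Count items per usage band within one subject."""
    return Counter(usage_band(c) for c in checkout_counts)
```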

Weeding parameters:

  • publication year
  • Alma usage
  • accession year
  • historical usage
  • possible duplicates

(Other libraries might also consider value, authorship (eg by own institution’s authors), theses (irreplaceable), donations/bequests.)

Different methodology types: eg a soft methodology would give counts of “soft retain” and “soft relegate” items. Could be improved with weighted indexes, among other options.
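A weighted index over the weeding parameters might look like the sketch below. The weights, thresholds, and field names are entirely invented – the talk only mentioned weighted indexes as a possible improvement:

```python
def relegation_score(item, weights=None):
    """Combine weeding parameters into a single weighted index.

    Weights and field names are illustrative, not the Auckland methodology.
    """
    weights = weights or {"age": 0.3, "low_alma_usage": 0.3,
                          "low_historical_usage": 0.3, "duplicate": 0.1}
    score = 0.0
    score += weights["age"] * (item["pub_year"] < 2000)
    score += weights["low_alma_usage"] * (item["alma_loans"] == 0)
    score += weights["low_historical_usage"] * (item["historical_loans"] <= 2)
    score += weights["duplicate"] * bool(item["is_duplicate"])
    return score

def soft_decision(score, threshold=0.5):
    """Map the index onto the soft methodology's two outcomes."""
    return "soft relegate" if score >= threshold else "soft retain"
```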

Q: Will you share reports in community area?
A: Yes, though some are very specific to Auckland so can’t promise they’ll automatically work.

Q: Are you using Greenglass with this approach?
A: Using this by itself.

Q: Ex Libris have released some P&E duplication reports – how do you approach risk if an electronic item is in an aggregator collection (and might disappear…)?
A: Excluded all electronic items from dashboard as it needs more information about subscribed vs owned. This is a next step…


Connecting data to actions for improved learning #theta2015

Connecting data to actions for improved learning
Cathy Cavanaugh, Director of Teaching and Learning, Microsoft Corporation

How are we learning from learning? Educators learn by teaching and by feedback (explicit or body language) as well as by sharing expertise with each other. But still limits on dissemination of knowledge about teaching and learning.

Everything in a student’s life has an impact on their learning. What has the greatest impact? Pre-2010 we got data from:

  • observations, checklists
  • assessments
  • LMS logs
  • SIS -> demographic data

Found that students at a novice level did better if they logged into the LMS less often but for longer sessions. At the advanced level the opposite was true. But when talking to educators in other programs they found the factors were different. Context is vital.

Factors we can use in 2015:

  • observations, checklists
  • predictive analytics
  • voice, gesture, ink input
  • making
  • games
  • badges
  • social network analysis
  • LMS logs
  • self-reports
  • assessments
  • adaptive systems
  • device usage
  • effort

Using machine learning, they were able to predict with high accuracy which students were likely to pass the class.
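A prediction like this might boil down to something as simple as a fitted logistic model – a toy sketch only, since the actual pipeline and features weren’t shown; the feature names and weights are invented:

```python
import math

def pass_probability(features, weights, bias):
    """Score a student's likelihood of passing from LMS-derived features.

    A toy logistic model; weights and features are purely illustrative.
    """
    z = bias + sum(weights[k] * v for k, v in features.items())
    return 1 / (1 + math.exp(-z))  # sigmoid maps the score to a probability
```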

At one high school found that one policy (re dressing for PE) was causing students to fail. Could quickly change the policy.

Doesn’t need to be institutional data – a student wearing a wearable that tracks health data could also be tracking their ability to learn.
Augmented reality -> lots of opportunity to gather much more data.
Internet of things likewise.

(Azure) Machine-learning service can ingest data from a variety of sources, run analytics/predictions, and output to data dashboards. Drag-and-drop interface.

Q: Security and privacy when processing on the cloud?
A: Each institution will need to grapple with it. Accounts are secure. Some aspects accessible to institution, others to student only.

Q: When do analytics become creepy rather than helpful?
A: When something is too detailed it doesn’t feel helpful. Need to move deliberately, pilot projects, maybe with grad students who are thinking about their learning.