
The fight against academic piracy #anzreg2019

UniSA Library and the fight against academic piracy
Sam Germein, University of South Australia

Previous method for monitoring abuse of EZproxy was cumbersome and prone to error.

Next used Splunk. Could get a top-10 downloaders report; do a lookup on usernames, etc. This reduced the time needed to look for unauthorised access, but vendors would still contact them outside of business hours and block access to the EZproxy server for potentially the whole weekend.
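
The talk doesn't show the actual Splunk search behind the top-10 report; purely as a hedged illustration of the kind of aggregation involved, here's a minimal Python sketch over a raw EZproxy log (the log format and field positions are assumptions, not UniSA's setup).

```python
# Hedged sketch only – not the actual Splunk search. Tallies downloaded bytes
# per username from an EZproxy log, assuming an NCSA common-style LogFormat
# (%h %l %u %t "%r" %s %b) where the username is the third field and the
# response size is the last field.
from collections import Counter

def top_downloaders(log_path, n=10):
    totals = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for line in log:
            parts = line.split()
            if len(parts) < 7:
                continue  # skip malformed lines
            user, size = parts[2], parts[-1]
            if user != "-" and size.isdigit():
                totals[user] += int(size)
    return totals.most_common(n)

if __name__ == "__main__":
    for user, total in top_downloaders("ezproxy.log"):
        print(f"{user}\t{total / 1_000_000:.1f} MB")
```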

Splunk has a notification function – looking into how to use this.

Eg a report if a username logs in from three countries or more. (Two countries turned up lots of false positives due to VPNs.) Alerts got sent to Sam by email; he could then block the username.
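
The check itself lives in Splunk; as a hedged sketch of the equivalent logic in Python (assuming the same common-style log format as above, plus the geoip2 package with a local GeoLite2 country database), it might look like:

```python
# Hedged sketch of the 'three or more countries' check, not the actual Splunk
# alert. Assumes a common-style EZproxy log, the geoip2 package and a local
# GeoLite2 country database.
from collections import defaultdict

import geoip2.database
import geoip2.errors

def multi_country_users(log_path, mmdb_path="GeoLite2-Country.mmdb", threshold=3):
    countries = defaultdict(set)  # username -> set of country codes seen
    with geoip2.database.Reader(mmdb_path) as reader:
        with open(log_path, encoding="utf-8", errors="replace") as log:
            for line in log:
                parts = line.split()
                if len(parts) < 3 or parts[2] == "-":
                    continue  # no authenticated user on this line
                ip, user = parts[0], parts[2]
                try:
                    countries[user].add(reader.country(ip).country.iso_code)
                except geoip2.errors.AddressNotFoundError:
                    continue
    # Two countries produced too many VPN false positives, so require three+.
    return {user: codes for user, codes in countries.items() if len(codes) >= threshold}
```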

Looked into other ways it might be made more accurate. There was still the potential situation of a student being in a country where access was blocked and a VPN was needed. Added database info to see if users were hopping between lots of databases, and how much content they were downloading. All this info was built into dashboards, so he needed to reverse engineer them to get the info into his report.
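
Again hedged and not the real dashboards: a sketch of pulling the two extra signals – database hopping and download volume – into one per-user summary, treating the proxied hostname as a crude stand-in for "database" and assuming the same log layout as the earlier sketches.

```python
# Hedged sketch of the extra signals: how many different databases (proxied
# hosts) a user touches and how much they download, summarised per user.
from collections import defaultdict
from urllib.parse import urlsplit

def user_activity(log_path):
    summary = defaultdict(lambda: {"databases": set(), "bytes": 0})
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for line in log:
            parts = line.split()
            if len(parts) < 10 or parts[2] == "-":
                continue
            user, url, size = parts[2], parts[6], parts[-1]
            host = urlsplit(url).hostname
            if host:
                summary[user]["databases"].add(host)  # crude proxy for 'database'
            if size.isdigit():
                summary[user]["bytes"] += int(size)
    return summary
```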

Another issue – on the weekend he'd get alerts on his phone, where he couldn't view the attached spreadsheet. But Splunk could embed the info in the email itself.

Extended emails to other team members and to their help desk software to log a formal job and make it part of the business workflow. Got IT Helpdesk involved.

Still getting false positives, so looked into only sending the alert if more than 25MB had been downloaded. Also refined how the info was displayed for the wider range of people managing it.

Increased frequency to every 6 hours.

Using the API they could directly write the username to the EZproxy deny file – fully automating the block process. Still getting some false positives, but they're much more on the front foot – they see the alerts and contact the vendor rather than vice versa.
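
The talk doesn't show the script itself; as a hedged sketch of that final step (the CSV column name, deny-file layout and '#' note syntax are all assumptions about local setup, and how user.txt consumes the deny file depends on the EZproxy config):

```python
# Hedged sketch of the automated blocking step: read usernames flagged by the
# Splunk alert (assumed here to arrive as a CSV with a 'username' column) and
# append any new ones to the EZproxy deny file, with a note as mentioned in
# the Q&A below.
import csv
from pathlib import Path

def block_users(alert_csv, deny_file="deny.txt", note="auto-blocked via Splunk alert"):
    deny_path = Path(deny_file)
    existing = set(deny_path.read_text().split()) if deny_path.exists() else set()
    blocked = []
    with open(alert_csv, newline="") as results, open(deny_path, "a") as deny:
        for row in csv.DictReader(results):
            user = (row.get("username") or "").strip()
            if user and user not in existing:
                deny.write(f"{user}  # {note}\n")
                existing.add(user)
                blocked.append(user)
    return blocked  # eg log these, or contact the vendor proactively
```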

Still lots more to do. Still implementing EZproxy 6.5 and experimenting with the EZproxy blacklist which helps.

Q: How did you decide the parameters?
A: Mostly trial and error, trying to strike a balance between legitimate blocks and false positives. Decided to be reasonably strict.

Q: Have you had any feedback from vendors?
A: Not specifically, but have had a reduction of contacts from vendors about issues.

Q: Have you had feedback from false positives blocked?
A: No – they put a note in the deny file. [Another audience member has had some conversations; students are usually good about it, and it's a good opportunity to hear how they're using resources.]

Analysing logs #anzreg2018

How to work with EZproxy logs in Splunk. Why; how; who
Linda Farrall, Monash University

Monash uses EZproxy for all access, both on and off campus, and manages EZproxy themselves. They use the logs for resource statistics and for preventing unauthorised access. Splunk is a log-ingestion tool – they could use anything.

Notes that you can't rely just on country changes, though this is important, as people use VPNs a lot. Eg people in China especially appear to be elsewhere; and people often use a US VPN to watch Netflix and then forget to turn it off. Similarly, total downloads on its own isn't very telling, as illegal downloading often happens bit by bit.

Number of events by session ID can be an indicator, as can number of sessions per user. And then there are suspicious referrers, eg Sci-Hub! But some users do a search on Sci-Hub because it's more user-friendly and then come to get the article legally through their EZproxy.
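
These indicators are Splunk searches in practice; as a hedged sketch of the same counts in Python (assuming the relevant fields have already been extracted into a CSV – the column names are assumptions):

```python
# Hedged sketch of the indicators above, not Monash's actual Splunk searches.
# Assumes a CSV export with 'username', 'session' and 'referer' columns.
import csv
from collections import Counter, defaultdict

SUSPICIOUS_REFERRERS = ("sci-hub",)  # substring match; extend as needed

def session_indicators(csv_path, top=20):
    events_per_session = Counter()
    sessions_per_user = defaultdict(set)
    flagged = []
    with open(csv_path, newline="") as fh:
        for row in csv.DictReader(fh):
            events_per_session[row["session"]] += 1
            sessions_per_user[row["username"]].add(row["session"])
            referrer = (row.get("referer") or "").lower()
            if any(marker in referrer for marker in SUSPICIOUS_REFERRERS):
                flagged.append((row["username"], referrer))
    busiest_sessions = events_per_session.most_common(top)
    busiest_users = sorted(((u, len(s)) for u, s in sessions_per_user.items()),
                           key=lambda item: item[1], reverse=True)[:top]
    return busiest_sessions, busiest_users, flagged
```

Consistent with the approach described below, output like this would only flag accounts for human investigation, not automatic blocking.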

https://github.com/prbutler/EZProxy_IP_Blacklist – doesn't use this directly, as she doesn't want to encourage them to just move to another IP.

A report of users who seem to be testing accounts with different databases.

Splunk can send alerts based on queries. Also doing work with machine learning, so could theoretically identify 'normal' behaviour and alert on abnormal behaviour.

But currently Monash does no automated blocking – investigates anything that looks unusual first.

 

Working with Tableau, Alma, Primo and Leganto
Sabrina Alvaro, UNSW, and Megan Lee, Monash University

Tableau server: self-hosted or Tableau-hosted (these two give you more security options to make reports private), and public (free) version.

Tableau desktop: similarly enterprise vs public.

UNSW using self-hosted server and enterprise desktop, with 9 dashboards (or ‘projects’)

For Alma/Primo they can't use the Ex Libris web data connector, so they extract Analytics data manually – but it may be a server version issue.

Easy interface to create report and then share with link or embed code.

UNSW still learning. Want to join sources together, identify correlations, capture user stories.

EZproxy log monitoring with Splunk for security management #anzreg2018

Ingesting EZproxy logs into Splunk. Proactive security breach management and generating rich eResource metrics
Linda Farrall, Monash University

Use Alma Analytics for usage, but also use EZproxy logs.

EZproxy is locally hosted and administered by library/IT. On- and off-campus access is through EZproxy where possible, and Monash has always used EZproxy logs to report on access statistics. (For some vendors it's the only stats available.) Used a Python script to generate HTML and CSV files.

Maintenance was hard, the logs got bigger so execution took longer, the Python libraries were no longer supported, and statistics were skewed by EZproxy misuse/compromised accounts. So they moved to Splunk (the university already had an enterprise version) to ingest the logs; they can then enrich with faculty data and improve detection of compromised accounts.

EZproxy misuse – mostly excessive downloads, eg using a script or browser plugin – is usually related to study, but the amount triggers vendor concerns (ie blocking all university access); in this case they check in with the user to make sure it was them and sort out the issue. The other cause is compromised accounts due to phishing. They have created a process to identify issues and block the account until ITS educates the user (because phishing emails will get sent to the same person who fell for it last time).

Pre-Splunk, it was time-consuming to monitor the logs and investigate. The Python script monitoring downloads no longer worked due to changes in the file size/number involved in a typical download.

Most compromised accounts are used from Canada, the US and Europe. In Splunk they can look at reports where a user has bounced between a few countries within one week, and at total download size (file numbers, file size) – and can then join these two reports to look for accounts downloading a lot from a lot of countries.
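
The join is done inside Splunk; as a hedged sketch of the same idea over two exported reports (column names and thresholds are illustrative assumptions, not Monash's actual values):

```python
# Hedged sketch of joining the two reports on username: one export lists
# distinct countries per user for the week, the other total download volume.
import csv

def suspicious_accounts(countries_csv, downloads_csv, min_countries=3, min_mb=100.0):
    totals = {}
    with open(downloads_csv, newline="") as fh:
        for row in csv.DictReader(fh):
            totals[row["username"]] = float(row["total_mb"])
    suspects = []
    with open(countries_csv, newline="") as fh:
        for row in csv.DictReader(fh):
            user = row["username"]
            n_countries = int(row["distinct_countries"])
            mb = totals.get(user, 0.0)
            if n_countries >= min_countries and mb >= min_mb:
                suspects.append((user, n_countries, mb))
    # Highest-volume suspects first: accounts downloading a lot from many countries.
    return sorted(suspects, key=lambda s: s[2], reverse=True)
```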

To investigate they have to go into identity management accounts – but they can then see all the user's private data. Once they integrate faculty information into Splunk they won't have to look users up, so it can actually enhance privacy – eg they can see an account downloading lots of engineering content, but also that the user is in the engineering faculty, so it's probably okay.

In 2016 they had 10 incidents, with resources blocked by vendors for 26 days. In 2017, 16 incidents (all before August, when they started using Splunk). In 2018, 0 blocking incidents – because they're staying on top of compromised accounts (identifying an average of 4 a week) and taking pre-emptive action (see an issue, block the account, notify the vendor). They also now have a very good relationship with IEEE! (Notes that when IEEE alerts you to an issue it's always a compromised account – there's never any other explanation.)

Typically an account is compromised, tested quietly over several days, then sold on and used heavily. If a university hasn't been targeted yet, it will be. By detecting accounts downloading data, they're also protecting the university from other damage compromised accounts can cause to university systems.

Notes that each university will have different patterns of normal use: you get to know your own data.

Lots of vendors moving to SSO. Plan to do SSO through EZproxy – haven’t done it yet so not sure it’ll work or not but testing it within a couple of months. ITS will implement SSO logging for the university, so hopefully they’ll pick up issues before it gets to EZproxy. Actively asking vendors to do it through IP recognition/EZproxy.