UoL Library Blog

Develop, debate, innovate.

Posts Tagged ‘analysis’

World Cat Local and ILL – a paper review

Posted by gazjjohnson on 11 December, 2009

Thanks to Keith for pointing me in the direction of this article, WorldCat Local implementation: the impact on interlibrary loan.  Following on from a demo of WCL the other week, and with my ILL hat on, this is obviously of some interest to me.  Like us, the University of Washington seems to have had a fairly steady, if high, level of ILL requests to deal with over the years.  Also like us, not all of their information resources were locatable via a single interface, which they suggest meant that most end users didn't get as far as requesting an ILL to satisfy demand.

With WCL it seems that the system searches across all e-only and print collections, and if it is unable to retrieve an item it flags up the ability to place an ILL.  This raises a question for me, knowing that a small but steady stream of users fail to search effectively and place ILL requests for items that are actually available locally (or via e-resources).  I can see that with a WCL implementation like this the big advantage for readers is that they are made more aware, and are able to seamlessly place pre-populated item requests.

This will doubtless lead to an elevated number of requests being placed; my concern is that the same proportion will be for items requested inappropriately.  As we check every item to see if we can satisfy the request from local collections, this might well have an impact on workflows.  Washington reports a 92% increase in loan requests placed, which is by any measure a phenomenal rise, although article requests did not rise significantly.  Interestingly, 17% of items delivered were never actually collected by their requesters.

The university did commit an additional 2.17 FTE staff to deal with this increase (at 52,000 students and staff, UoW is around twice as big as Leicester).  It's also interesting to note that with more items being requested, a broader spectrum of venues was used to source them.  It doesn't take a big stretch of the imagination to see that as more institutions set up WCL (or similar) systems, demand will increase on research-rich non-British Library collections like our own.  One final note from the paper: resolving the extra requests increased overall ILL costs to the institution by 82% in 07-08, and by another 36% in 08-09.
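Those two rises compound rather than add. A quick sketch (assuming, as seems the natural reading, that the 36% applies on top of the already-inflated 07-08 figure):

```python
# Compound effect of two successive annual ILL cost increases,
# using the paper's figures: +82% in 07-08, then +36% in 08-09.
base = 1.0
after_07_08 = base * 1.82   # costs after the first year's rise
after_08_09 = after_07_08 * 1.36  # second rise applied on top

overall_rise = (after_08_09 - base) * 100
print(f"Cumulative ILL cost rise over two years: {overall_rise:.0f}%")
```

In other words, roughly two and a half times the starting cost within two years.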

Overall a very interesting case study, and while one can admire the increased service to the user community of the integrated searching, as an ILL member of staff I must of course reflect on the impact of increased demand on delivery of the document supply service.


Posted in Document Supply, Wider profession | Tagged: , , , , , , , , , | Leave a Comment »

Top of the (Repo)Pops

Posted by gazjjohnson on 21 August, 2009

A week or so ago I went through all the items on the LRA and looked at their usage figures since 1st Jan 09.  Normally I only look at these figures month by month, but it was suggested I do this for the whole of the year, hence the study.  Due to the way DSpace is configured I could only scrape data for items used 20 times or more in a month, so I can't claim any great statistical validity for these figures.  The number crunching took a while as well.  But when I was done I was quite pleased with the overview the data gave me.

What it gave me, once I'd summed the data, was a very clear picture of the items in the repository that are being accessed the most.  We've passed this information on to departments, and to many of the individual researchers themselves, for interest and to reward them in a small way for their compliance in placing items in the LRA.

In terms of the greatest number of appearances in the top 100 (rather than across all 588 items in my list), the top 5 Depts. whose work is most regularly accessed on the LRA are:

  1. Museum Studies
  2. Psychology
  3. Computer Science
  4. Engineering
  5. Education

Interesting.  But how does this stack up when you consider what proportion of the items on the LRA come from each Dept.? Psychology may have 11 appearances in the top 100, but with 241 papers there's more chance of it being up there as part of a critical mass of papers.  So out of interest I decided to divide each Dept.'s number of appearances in the top 100 by its total number of items on the LRA, to give what I'm calling Johnson's Repository Significance Quotient (or JRSQ for short!).  Sorted by JRSQ, how does the top 5 look now?

  1. Museum Studies
  2. Institute of Life Long Learning
  3. Social Work
  4. Computer Science
  5. BDRA

What this does tell me is that these collections contain a higher proportion of papers getting high usage, though remember this only takes the top 100 papers this year into account.  I'm giving serious thought to going through the remaining 488 items in the list and including them in the data set.  If there's enough interest, maybe I will…
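For anyone who wants to replicate the quotient, here's a minimal sketch in Python. Only Psychology's figures (11 appearances out of 241 papers) come from this post; the other departments' totals are invented placeholders for illustration.

```python
# JRSQ: appearances in the top 100 divided by a department's
# total number of items on the repository.
top_100_appearances = {"Psychology": 11, "Museum Studies": 9, "Computer Science": 8}
total_lra_items = {"Psychology": 241, "Museum Studies": 30, "Computer Science": 95}

jrsq = {
    dept: top_100_appearances[dept] / total_lra_items[dept]
    for dept in top_100_appearances
}

# Print the league table, highest quotient first
for dept, score in sorted(jrsq.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{dept}: {score:.3f}")
```

Note how a small department with a handful of heavily used papers can outrank a large one with a big back catalogue, which is exactly the effect described above.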

What does this all really mean?  Well, nothing most probably.  The impact and usage of these items depends on too many variables for this quick and dirty analysis to account for: custom and practice in searching for and using repository-based items, use of personal networks to obtain papers, traditional journal usage, the relative visibility of LRA items on search engines, etc.  Doubtless you'll be able to think of many others.  I've also not separated full-text items in the list from metadata-only ones (this would be possible should it become a worthwhile endeavour).

Posted in Leicester Research Archive, Open Access | Tagged: , , , , , | Leave a Comment »

In the Shadow of Bibliometric Analysis

Posted by gazjjohnson on 19 March, 2009

As anyone who’s following me on Twitter knows, the last week or so has been rather dominated by my work on bibliometrics.  Let me state up front: I’m not a bibliometrician (sounds worryingly close to mathamagician to me) nor a statistician; rather I’m a former scientist who spent a lot of time working with stats in another life.  I sat in on a meeting about statistical teaching last week which served to rather pointedly remind me of all the things I used to know how to do (linear regression, chi-squared, two-tailed t-tests etc.).

On the other hand I’ve always quite enjoyed data collection and simple analysis; when I was a library researcher at Univ Warwick I spent quite a bit of time doing just this.  So while any outputs I produce aren’t going to be stunningly complex, they should help people get a picture based on fact.  This, and my role as the LRA person involved in the Research Excellence Framework (REF) preparations, are doubtless why I was tapped by Louise to run a bibliometric profile of the Chemistry dept.

Bibliometrics, in case you didn’t know, is the quantitative analysis of texts and publication data.  In this case I was asked by the Dept. to run a sample profile of their publication outputs, in an attempt to establish where they stand in relation to the rest of the academic world.  In practice this meant taking a sizeable sample (half the departmental academics) and looking at which journals they’ve published in over the last 9 years (2001 to date).  This is a key range for a number of reasons: firstly the suggestion that the REF will take account of publications back to this date, and secondly the fact that Journal Citation Reports (JCR) only goes back to 2000 online, so it would be harder work to analyse publications from before this point.

Now, whilst the results are naturally confidential at this point, I can tell you in brief what I sampled:

  • Article outputs – the number of articles produced and indexed within the time frame.
  • Citation counts – the number of references to the articles produced.
  • H Index – the Hirsch Index quantifies both the actual scientific productivity and the apparent scientific impact of a scientist.  It is increasingly viewed as a major indicator of academic esteem.  Anything over 100-120 and you’re into Nobel laureate territory.
  • Journal Impact Factors – a measure of how often the articles in a specific journal are cited on average.  Usually the most common “how important is this journal?” value.
  • Cited Half Life – the number of years, going back from the current year, that account for half the total citations received by the cited journal in the current year.
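Of these, the H Index is the easiest to compute yourself: it's the largest h such that the author has at least h papers with at least h citations each. A short sketch, with invented citation counts for illustration:

```python
def h_index(citations):
    """Largest h such that at least h papers have >= h citations."""
    cited = sorted(citations, reverse=True)  # most-cited first
    h = 0
    for rank, count in enumerate(cited, start=1):
        if count >= rank:
            h = rank  # this paper still clears the bar
        else:
            break  # all later papers have fewer citations
    return h

# Illustrative (invented) per-paper citation counts for one researcher:
print(h_index([25, 8, 5, 4, 3, 1]))  # -> 4
```

So a single blockbuster paper doesn't move the H Index much on its own; you need sustained citation across many papers, which is why the 100-120 figure above is laureate territory.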

From these I’ve been able to profile the department’s research publications as a whole, as well as getting an idea of the impact of the individual contributions to it.  I’m quite looking forward to discussing the results with Chemistry in the near future.

The biggest challenge (data collection aside, which is currently very long-winded) is knowing when to stop.  I’m still very new to bibliometrics, and my inner scientist kept suggesting other ways to contrast or analyse the data.  Essentially I could have been at this for weeks.  And since we’re still not quite sure what metrics the REF will be using, there didn’t seem much point in going too far with the first attempt.

There’s also the question of benchmarking.  Raw stats on our depts are all well and good, but where do they stand contrasted with the rest of the world?  That’s something I might need to follow up on, but it would likely be a far more time-consuming operation than a week’s work.  For now the Chemists might just need to trade notes on the H Indices held by comparator academics, in contrast with their own.

Posted in Research Support | Tagged: , , , , , , , , | Leave a Comment »