UC Next Generation Technical Services Initiative

The University of California libraries have started to implement a Next Generation Technical Services Initiative. One of the task groups is called “Transform cataloging practices”, and one of its charges (PDF) is to “define a ‘good enough’ record standard for all UC original cataloging”.

The obvious workflow advantages are quicker discoverability of resources, reduced backlogs, and freed-up staff time. By contributing records to WorldCat, enhancement by others becomes possible. Metadata automation can play an important role in these iterative improvements. The effort at UC will be collaborative: the plan is to survey public service librarians, selectors, and users in order to determine the minimum needs in a bibliographic record.

Already in 2005, the University of California espoused the “good enough” approach, in a report (PDF) entitled Rethinking how we provide bibliographic services for the University of California: “Focus on being good enough instead of being perfect”.

I think it is possible to be “perfect” (or rather as good as we can be) in certain areas of bibliographic description and “good enough” in others. If we know which elements do what in OPACs or discovery systems (indexing, faceting, browsing, or pure display), and if we know the value of fields for users, we can concentrate on those. Our time and energy are well spent on data elements that are relevant for search and retrieval and that have potential in a linked data world (mainly authority data). However, we could cut back on many footnotes, or on the statement of responsibility, without severely harming the user’s ability to find and locate resources. High quality in the right place, and “good enough” where that is sufficient: this balance might be the way forward.
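To make the idea concrete, here is a minimal sketch (in Python) of triaging record fields by cataloging effort. The MARC tags are real, but the tier assignments are my own illustrative assumptions, not UC policy:

```python
# Hypothetical triage of MARC fields: full effort where data drives search,
# retrieval, and linked data; "good enough" where fields mainly support display.
# Tier assignments are illustrative assumptions, not an official standard.

# Fields whose content drives indexing, faceting, and authority/linked-data work
HIGH_VALUE = {
    "100",   # main entry - personal name (authority-controlled)
    "245a",  # title proper
    "650",   # topical subject heading (authority-controlled)
}

# Fields that mainly support display; candidates for "good enough" treatment
GOOD_ENOUGH = {
    "245c",  # statement of responsibility
    "500",   # general note
}

def effort_level(tag: str) -> str:
    """Suggest a cataloging effort level for a given field tag."""
    if tag in HIGH_VALUE:
        return "full"
    if tag in GOOD_ENOUGH:
        return "minimal"
    return "review"  # anything unclassified gets a human decision

print(effort_level("100"))   # full
print(effort_level("500"))   # minimal
```

The point of the sketch is only that the policy question (“where do we invest effort?”) can be made explicit and testable, rather than left implicit in individual catalogers’ habits.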


2 thoughts on “UC Next Generation Technical Services Initiative”

  1. Pingback: UC Next Generation Technical Services Initiative » 编目精灵III

  2. Heather Jardine

    While I’m not opposed to the concept of “good enough” – in fact, I am quite in favour – I think there are a couple of difficulties. The first is with your starting point, “If we know which elements do what in OPACs or discovery systems…” – and the problem is that this changes over time. For example, it is only a few years ago that OPACs didn’t do much with the fixed fields and it was a reasonable saving to stop checking or entering them. Now OPACs are mining them for all sorts of useful stuff, and our records from a few years back are lacking or just plain wrong.
    The second thing I’d take issue with is the omission of notes and statements of responsibility because they don’t severely harm “the user’s ability to find and locate resources”. No – but their omission does harm the user’s ability to select between different resources, to identify what s/he has found. We are all getting used to finding LOTS of information on the Internet, and adding lots of enrichment to bib records (tables of contents, cover images, reviews and recommendations) is becoming the norm rather than the exception. Leaving stuff out goes against expectations.
    On the other hand, I absolutely agree with you about sharing enhancements, and linked data offers the opportunity to bring in data from outside that we couldn’t have afforded to put in ourselves. That’s the way forward, I think.
    That’s my opinion, anyway!

