On Thu, 5 Nov 2009 10:56:35 -0500, Roe,Kevin wrote:
>In the last few years, I would say that it is rare that we do not have to make some corrections in the record. Some are somewhat insignificant in terms of our users’ ability to find the records, while others would create problems if not corrected.
>So the dilemma is to decide which changes are needed and which aren’t. Many libraries are simply accepting copy from their vendors without checking it, but this can and will only result in a massive cleanup job somewhere in the future, when patrons begin to complain about not finding materials that are in their libraries.
Correct, and this emphasizes that library cataloging/metadata *standards* are not at all like standards in other parts of our society. There are standards for water quality, handling electricity and gas, sewer lines; there are standards for butchers and bakers; there are standards for building codes and automobile manufacture.
Those are real, honest-to-goodness *standards,* and they are standards precisely because if you break them, you will be punished, often severely. It has been quite different in the library world, with its “nudge-nudge wink-wink” reactions to these matters. Now that our data is increasingly being used outside of our own world, we are seeing some consequences. The discussion of cataloging quality at Language Log http://languagelog.ldc.upenn.edu/nll/?p=1701 and the reply from Jon Orwant (the manager of Google Books) are quite enlightening. Orwant was critical of the quality of human-created metadata, and although I am personally suspicious of the specific examples he gave, problems of quality in library-created records certainly exist.
The real dilemma as I see it is that we cannot compete with automated methods on the very important issues of price and quantity: one computer can churn out as much metadata in an hour as 100 catalogers produce in a year (I just made that figure up, but it is probably close to the truth). The only advantage that librarians can offer is better *quality.* That’s all.
Scary, but I think people want high quality badly, and issues of “quality” information retrieval will become far more important as the web grows and becomes less and less easy to manage.
What is needed is a general rethinking of what “quality” means in a catalog and in an individual metadata record (which I think may be radically different from what has traditionally been thought); then we must relate this to what libraries are in a position to achieve; and finally, we must *ensure* that the records created meet these standards. Regrettable as it would be, this may mean “lowering” the current standards so that they become something achievable; I don’t know.
But the current system does not appear to be functioning very well, and it seems to require some fundamental changes.