RE: Will RDA make OPAC design better?

Posting to Autocat

On Thu, 23 Sep 2010 14:25:30 -0600, Janet Hill wrote:

>Many years ago, I was working at Northwestern. And they were just organizing a media library. The folk responsible for the media collection wanted Cataloging to create skeleton records so that things could circulate, but they told us that no one ever searched by anything but title, so not to bother.
>But we couldn’t help ourselves, so we provided real records, and traced directors, actors, screenwriters, composers (etc.) that appeared prominently.
>AND LO! Use of the material skyrocketed, AND the materials started being used by all kinds of people outside Film Studies -- English, Drama, History, Art ....
>It turned out that “they” wanted to find materials in all sorts of different ways for all sorts of different reasons.
>Skeleton records waste money. They waste the money we spent on purchasing the materials in the first place, because they don’t exploit the various facets of those materials.

This is correct, but it needs to continue and discuss what is possible today with the new tools. In the card catalog world, and through the days of early computerization, the rule was: "Catalog it once, do it right, move on." This was a reasonable attitude, since there were very few occasions when an institution could actually afford to pay people later to find the skeleton records, pull the books from the shelves, research them, update the records where possible and desirable, and finally put everything back. That required funding projects of significant size, with entire teams of people redoing the work of former days, and all of it, of course, at the expense of the new materials. The emphasis had to stay on the new materials; otherwise you would find yourself in the absurd loop of making skeleton records for the new materials too, then correcting those records in still further projects, and so on ad infinitum. Therefore, if you wanted to maintain high standards and reliability, the above rule made a great deal of sense.

Still, what that meant for the public was that significant amounts of the materials they wanted lay uncataloged in dusty backlogs, waiting for the catalogers to get to them. It is understandable that the public wanted at least some kind of access instead of none at all, and as a result there was a push for skeleton records, etc. etc.

But there are new possibilities today, such as systematic updates of selected records through queries of distant databases, and this age-old conundrum can begin to change. Today it is *possible* to catalog a book in such a way that it is tagged as a skeleton record, and if the local database is configured correctly, it can automatically query outside union catalog(s) to discover whether some other library has updated the record. If so, the new information can be added to the local record, all without any human intervention.
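To make the idea concrete, here is a minimal sketch of the merge step only, assuming the local system represents a record as a simple dictionary of fields with a "skeleton" flag, and that the remote union-catalog lookup (which in practice might go over SRU or Z39.50) has already returned a fuller copy. The field names and the flag are illustrative assumptions, not any particular ILS's API:

```python
def enrich_skeleton(local: dict, remote: dict) -> dict:
    """Fill gaps in a local skeleton record from a fuller remote copy.

    Local data always wins; remote values are used only for fields
    that are missing or empty locally. (Hypothetical record layout.)
    """
    merged = dict(local)
    filled = False
    for field, value in remote.items():
        if field != "skeleton" and not merged.get(field):
            merged[field] = value
            filled = True
    if filled:
        merged["skeleton"] = False  # no longer a bare skeleton
    return merged

# Example: a title-only local record enriched from a remote database.
local = {"skeleton": True, "title": "Citizen Kane", "director": ""}
remote = {"title": "Citizen Kane", "director": "Orson Welles",
          "year": "1941"}
print(enrich_skeleton(local, remote))
```

The design choice worth noting is that the local record is never overwritten, only completed; that keeps the automatic process safe to run repeatedly without human review.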

There is an almost infinite number of ways of accomplishing something like this, but the fact is that today the cataloging process *does not have to end* where it used to, and therefore a catalog record, once it leaves the hands of the cataloger, can remain a dynamic entity, bringing in information from all over the place.
And yet, for this to have even the slightest chance of working efficiently, I believe that adherence to cataloging standards becomes even more critical than it has been in the past. For automatic searching and matching to occur correctly, the information in the local skeleton record must match what is in the remote database. Matching could be done through the ISBN (which has known problems), perhaps through something like the International Standard Text Code (ISTC) (though that is brand new), or, even better in my opinion, through close adherence to ISBD, which is actually designed for optimal matching.
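One of the "known problems" with ISBN matching is purely mechanical: the same book may carry a hyphenated ISBN-10 in one record and a bare ISBN-13 in another, and a naive string comparison will miss the match. A small sketch of normalizing both forms to ISBN-13 before comparing (the check-digit rule here is the standard ISBN-13 calculation from the ISBN specification; everything else is illustrative):

```python
def normalize_isbn(raw: str) -> str:
    """Strip hyphens/spaces and convert ISBN-10 to ISBN-13,
    so that records citing either form compare equal."""
    digits = raw.replace("-", "").replace(" ", "").upper()
    if len(digits) == 13:
        return digits
    if len(digits) == 10:
        # ISBN-13 = "978" + first nine digits + recomputed check digit.
        core = "978" + digits[:9]
        total = sum(int(d) * (1 if i % 2 == 0 else 3)
                    for i, d in enumerate(core))
        check = (10 - total % 10) % 10
        return core + str(check)
    raise ValueError(f"not an ISBN: {raw!r}")

# The same book in two guises now matches:
print(normalize_isbn("0-306-40615-2"))      # ISBN-10 form
print(normalize_isbn("978-0-306-40615-7"))  # ISBN-13 form
```

Of course this only fixes the mechanical mismatch; it does nothing about the deeper ISBN problems (reused numbers, one ISBN covering several manifestations), which is why ISBD-level matching remains attractive.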

The times, they are a-changin'!