A number of people have asked about how we will create a single database for all items and another for all customers. This "part 1" post is to give a high level overview of how things will work regarding the items or "bibliographic database".
Our network has had an agreed cataloguing standard for many years. This standard is updated from time to time through consultation with cataloguers across the network, and we provide it to our suppliers, who catalogue to it. It is based primarily on international and national cataloguing rules, standards, and conventions, with some agreed local state-wide variations.
Over years of running standalone databases, libraries will inevitably have created some additional local variants to our agreed standard.
The challenge now is to merge these databases: to keep local information that adds value, to strip out variants that add none, and to continue using agreed standards wherever possible. To achieve this we have developed a process that we are now implementing.
We will purchase the Libraries Australia authority headings to load into the database, and will also add the SCIS subject authority headings, as these are used by a number of libraries. Having both embedded in the system should help keep us on track to develop a consistent database.
Mapping for consistency: We are currently working with the first two groups of libraries to ensure that where libraries have used MARC tags in unique ways, we either map the data in those fields to the correct locations or ignore it when we harvest records. Where libraries have added unique information, such as additional local history notes, we will endeavour to retain it; to improve the chances of this, the data may need to be mapped to new fields during the conversion.
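To make the idea concrete, here is a minimal sketch of what a tag-mapping step at harvest time might look like. The tag numbers and targets below are purely illustrative placeholders, not the network's actual mapping decisions:

```python
# Illustrative sketch of remapping non-standard MARC tags during harvest.
# The tag numbers here are hypothetical examples, not the real mapping.

# source tag -> target tag; None means the field adds no value and is dropped
FIELD_MAP = {
    "590": "500",   # e.g. a local note remapped to a standard location
    "599": None,    # e.g. a local variant with no value -> ignored at harvest
}

def remap_record(fields):
    """Remap or drop fields in a record given as (tag, value) pairs."""
    result = []
    for tag, value in fields:
        target = FIELD_MAP.get(tag, tag)  # unmapped tags pass through unchanged
        if target is not None:
            result.append((target, value))
    return result
```

In practice this kind of mapping is configured per library, since each library's local variants differ.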
As the first two groups of libraries include a number of large libraries with sizeable collections, we expect to have a very large database by the end of June. From there, as libraries join the consortium, their Item records will be attached to existing Bib records wherever there is a match, and unique titles will be added as new Bib records.
We do have the ability to match records, keep the original record in full, and also bring across data from specific fields where it needs to be retained. We will use this feature when adding some of the local history and other data mentioned earlier.
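A simple sketch of that merge behaviour, under the assumption that records are (tag, value) pairs and the retained tags are nominated in advance (the tag numbers shown are placeholders, not the actual fields we will retain):

```python
# Illustrative sketch: the existing Bib record is kept intact, and data from
# a nominated set of tags on the incoming record (e.g. local history notes)
# is carried across. Tag numbers are hypothetical examples.

RETAIN_TAGS = {"561", "583"}  # fields whose data should survive the merge

def merge_records(existing, incoming):
    """Return the existing record plus retained fields from the incoming one."""
    merged = list(existing)
    for tag, value in incoming:
        if tag in RETAIN_TAGS and (tag, value) not in merged:
            merged.append((tag, value))
    return merged
```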
I can also add that each Item record (i.e. the unique record for each item) will carry that item's DDC call number. This means that if some libraries want to display one, three, or six digits after the decimal point, each library can choose its option and the call number will be displayed that way locally.
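In other words, the full call number is stored once on the Item record, and the decimal precision is a local display setting. A minimal sketch of that display rule (the function name and call numbers are illustrative):

```python
# Sketch of per-library display of a stored DDC call number: the full number
# lives on the Item record; each library chooses how many digits after the
# decimal point to show.

def display_ddc(call_number, decimals):
    """Truncate the decimal part of a DDC call number for local display."""
    if "." not in call_number:
        return call_number  # no decimal part, e.g. "820"
    whole, frac = call_number.split(".", 1)
    frac = frac[:decimals]
    return f"{whole}.{frac}" if frac else whole
```

So a library wanting three digits would see `994.402` for a stored `994.402915`, while a library wanting one digit would see `994.4`, both from the same Item record.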
I know that many people will want much greater detail regarding the match & merge process. This blog is not the place to go into the finer details of why we will be using specific MARC tags etc. This information will be posted on the network's Intranet site in the near future.
In the meantime, please feel free to ask questions or make comments. Depending on how detailed the answers need to be, they may be answered here or posted as part of the Intranet paper.
And look out for Part 2 of this post where I will tackle what it means to have a single customer database!