There is not a macro yet to update the 260 field when upgrading records to RDA. This is something that can be considered in the future.
Records should already have this information coded in the 008.
Yes.
At this time, we will begin with $b eng cataloged records. It is possible that other languages of cataloging will go through the conversion process in the future.
No, since the records that are being targeted are all coded as RDA. Hybrid records will not be considered for this project.
Yes, manuscript materials will be managed separately and coded accordingly.
Yes. Records that cannot be changed automatically will be processed manually by WorldCat Metadata Quality staff.
No. We will not be converting 260 to 264 when records are not coded as RDA.
As long as the record is coded in RDA and has a 260 field, it will be changed through this automated process.
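The criteria just described can be sketched as a small routine. The dict-based record shape, the field names, and the helper functions below are hypothetical illustrations for clarity, not OCLC's actual conversion macro:

```python
# Hypothetical sketch of the 260-to-264 conversion logic described above.
# The record representation and helper names are illustrative only.

def is_rda(record):
    """Treat a record as RDA if Desc (Leader/18) is 'i' and 040 $e contains 'rda'."""
    return record.get("desc") == "i" and "rda" in record.get("040e", [])

def convert_260_to_264(record):
    """Move a 260 to a 264 with second indicator 1 (publication) on RDA records."""
    if not is_rda(record) or "260" not in record:
        return record  # leave non-RDA or 260-less records untouched
    field = record.pop("260")
    # 264 _1 carries publication data; other functions (e.g. copyright notice
    # date) would use different second indicators.
    record["264 _1"] = field
    return record

rda_rec = {"desc": "i", "040e": ["rda"],
           "260": "$a London : $b Penguin, $c 2017."}
aacr2_rec = {"desc": "a", "040e": [],
             "260": "$a London : $b Penguin, $c 2017."}

print(convert_260_to_264(rda_rec))    # 260 becomes 264 _1
print(convert_260_to_264(aacr2_rec))  # unchanged: not coded RDA
```

As in the answer above, the sketch only touches records already coded as RDA; anything else passes through untouched.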
Yes. For Chinese, Japanese, and Korean (CJK) records we will be working closely with the OCLC CJK Users Group to review these records.
Yes, as RDA has become more used in WorldCat records, there has also been an increase in the number of records that have a 260 field rather than a 264.
The 26x approach was done purposefully to allow libraries to follow whatever content cataloging standard they want: continue using 260 if they follow AACR2, or 264 if they follow RDA. At this time there are no plans to change workforms, since OCLC Connexion is no longer being enhanced. You could use a constant data record as a workaround for the workform.
There is a possibility for this as a future project. There are some complexities: a city may have the same name in various countries, so changing the fixed field can become tricky.
You will have to look at the LC PCC PS for this, but past practice would have you ignore the renewable copyright dates.
It does take some time to process due to the number of duplicate requests we receive. Currently our backlog goes back 4 to 5 months, depending on the format.
We currently run macros to merge duplicate vendor records.
If you use the OCLC Connexion function “Report error” it will send you an email with your request and you can track the requests you sent.
We would need the OCLC records that got merged in this scenario to investigate whether a possible incorrect merge occurred. The DDR software that merges records follows a set of rules, and each incorrect merge needs to be looked at on a case-by-case basis. Please report incorrect merges to bibchange@oclc.org.
A monthly process monitors additions or changes to LCSH and makes applicable changes to FAST headings. Because of this, catalogers do not need to edit FAST headings when they change LCSH. If a cataloger would like to change the FAST headings, this is okay, and the monthly process will look at those changes, updating or correcting the FAST headings, as necessary. However, with cataloger entered changes, no attempt will be made to synchronize the LCSH and FAST headings. This message was originally sent to the PCCLIST on 30 May 2017.
Yes, we have a web form similar to what you are asking. You can access it here: https://www.oclc.org/forms/record-quality.en.html
When using the Connexion “Report Error” function, all you need to do is include the duplicate OCLC record number along with the record you have on display. The function automatically shows us the record in question.
These are allowable requests as the WorldCat Metadata Quality staff uses macros to help update headings.
We recommend you follow LC’s practice. LC has not terminated the genre use of LCSH in 650 fields. You can use both the still-correct LCSH genre term and an LCGFT 655.
This is due to the new data sync process, where subject headings from different thesauri are transferred to records if they don’t already exist. We are aware of this issue and are working on getting it resolved.
You can send this type of request, but we will have to manually verify each record to make sure that the genre heading is appropriate to add.
If changes have been made to the LCSH headings in a record, you do not need to do anything with FAST. The monthly process will automatically update the FAST headings based on whatever LCSH the record now has.
At this time, we do not have plans to implement the AAT vocabulary for controlling.
This is another area to stay tuned for as FAST will be changing some of these mechanisms in the future.
We are continuing to constantly add 33x fields to records in WorldCat. These records will eventually be targeted for conversion.
No. The WorldCat database will continue accepting AACR2 records, and we are not requiring libraries to switch to RDA.
You can code the date type as “s” for single date and just have the one date for publication, or you can code the date type as “t” for publication and copyright date and include both dates even if they are the same (e.g., 2017, 2017).
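The two codings described above can be sketched as a tiny helper. The positions and values follow MARC 21's 008 date area (DtSt, Date1, Date2); the helper function itself is a hypothetical illustration:

```python
# Illustrative sketch of the two Fixed Field (008/06-14) date codings
# described above; the helper is hypothetical, the codes are MARC 21.

def encode_dates(pub_year, copyright_year=None):
    """Return (DtSt, Date1, Date2) for the 008 date area."""
    if copyright_year is None:
        # Date type "s": single date; Date2 is left blank.
        return ("s", str(pub_year), "    ")
    # Date type "t": publication date and copyright date, even if equal.
    return ("t", str(pub_year), str(copyright_year))

print(encode_dates(2017))        # ('s', '2017', '    ')
print(encode_dates(2017, 2017))  # ('t', '2017', '2017')
```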
The 084 field may be transferring when records are merged automatically by the Duplicate Detection and Resolution (DDR) software. We have been correcting these records when we come across them by removing the 084 field. You can report them to bibchange@oclc.org as well.
At this time, we do not know when Connexion is going away. The new metadata editor is Record Manager, and if you have a cataloging subscription you can begin using it at your convenience.

I have noticed a merging of OCLC records where the pub. and/or copyright dates do not match under the WorldCat record they are merged into. Can you explain this?
The WorldCat Metadata Quality staff and Data Ingest specialist interact constantly. Whenever we receive requests for corrections to records that may have been loaded through data sync, we make sure to communicate these issues with the data ingest specialist.
Yes, we do have tools that can search for fields that WorldCat does not index.
OCLC has a copy of the Name Authority File (NAF), but the file belongs to the Library of Congress (LC). OCLC cannot integrate ISNIs into name authority records on its own and must reach a resolution with the Library of Congress and PCC members to make changes like these to name authority records.
At this time, no, but we can bring this idea to the VIAF team at OCLC.
We are currently developing practices and policies that will be used with $0 and $1 so please stay tuned!
Bibliographically speaking, a first edition statement is essentially ignored in the cataloging that we do every day. The lack of a first-edition statement and a first edition statement are equal as far as determining whether to input a new record. If there is a bibliographic record that has first edition, but your resource does not say that, and if you have the option, you can edit the record locally if you want to remove that statement; however, a new record should not be created.
Long-standing practice as outlined in When to Input a New Record takes "Book club edition" and says not to create a new record if that is the only difference. There will be cases where a book club edition has other differences (paging, size, etc.) that would have an impact on the content of the item so that page for page it's not the same. When you look at book club editions, in many cases, they are just a cheaper binding and that page for page it really is the same content. That was the original decision that went into putting that criteria into When to Input a New Record, to say discount book club editions.
This is what we have come, incorrectly or too narrowly, to refer to as the Romance language problem. In certain languages, such as Spanish, Italian, and German, statements that look to be edition statements are actually printing statements, often associated with the number of copies printed. These should generally be ignored and not even transcribed, although practices have differed over the years. They are not considered to be real edition statements; they are just printing statements and should not be a factor in whether you input a new record.
That will depend on the resource. There are some cases, particularly with scores, where what used to be a separate edition statement and what used to be called, under AACR2, a musical presentation statement (recorded in the 254 field) are grammatically and intellectually separate; those would legitimately be separate 250 fields. It is a matter of judgment. Field 250 was made repeatable just a few years ago to accommodate multiple edition statements, where previously we were obligated to record them in a single 250 field, separating one edition statement from another by a comma. That no longer must be the case.
If you come across records that have been merged incorrectly, please report them to us. Depending on how long ago they were merged, they can often be recovered. In cases where we can recover an incorrect merge, we or you can often supply something that will help differentiate the records in the future so that DDR won't incorrectly merge them again. In government documents, it is very common to have multiple documents published in the same year. If you can identify a particular month or date in addition to the year and include that information in a supplied edition statement, that is one option to prevent the records from being merged. DDR tries to look for certain quoted information, either specific publication numbers or serial numbers, that may appear in quoted 500 notes.
Subfield $3 in a 250 field is not taken into consideration in DDR.
If the error is reported to us, we will fix it. There is no automatic mechanism to alert us about something like that.
Yes, galley copies are prepublications, different editions. So, if you have a galley version, you can include that in a 250 field and DDR will not merge that to the published version or other similar things. Many of the books that we pick up at ALA that are advance reader's copies, you can use "Advance reader's copy" in a supplied edition statement, or if that is what it says on the item you can use that in the edition statement.
In the United States, dissertations in 502 fields should be used only for the actual unpublished dissertation. References to the fact that it is an adaptation of a dissertation, or along those lines, would be recorded in a 500 note. DDR does try to make that distinction between the published version and unpublished version of a dissertation, including looking for the presence of field 502 in addition to other elements as well.
The 501 field (With note) is generally not used in RDA. It is properly used only when the physical item is published as that conglomeration of things, not for things that your institution has bound together after their individual publications.
We cannot say anything definitively; however, there will be no single day when we map all of our MARC data to BIBFRAME and everyone switches over to use BIBFRAME exclusively. There will be a long period where data is created in one format and needs to be mapped to the other. We will probably see a lot of back and forth, moving data from one format to another. Even though MARC may be on its way out in the long run, it will probably be in use for a good number of years.
The Library of Congress made a decision a number of years ago, widely known as the "LC Series Decision", where they decided as an individual library that they were no longer going to make series authority records. The PCC continues to make series authority records in PCC libraries outside of LC, but there is no requirement within PCC that a series authority be made. The idea being that if you do have a traced series, meaning that you are going to use it as an access point and have an 8XX, you do want to have a series authority record if you are a PCC library. LC just transcribes the series in a 490 field and does not add an access point in an 8XX field.
If it's what we have come to call a "trade publication" (a major publisher), you may want to use an existing record even if the date is a year off. That depends on the individual instance and your own judgment.
No. In theory, if that information is reflected elsewhere in a record such as the edition statement or possibly in a quoted note, DDR would pay attention to that.
We try to identify dates in quoted notes. There are lots of different ways to present a date (mmddyy, yyyymmdd, etc.) and which part of the date should come first, second, or third. We try to interpret and parse those different methods of quoting dates in 500 quoted notes. In many cases that will prevent an improper merge; however, if that information is in a 250 edition statement it is more easily interpreted and seen as a differentiating factor that will prevent an improper merge.
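The kind of date parsing described above can be sketched with a couple of regular expressions. The patterns, the century assumption, and the function name are illustrative only, not DDR's actual parser:

```python
import re

# Hedged sketch of normalizing dates found in quoted 500 notes
# (e.g. "20090615" or "6/15/09") into one comparable form.

def normalize_quoted_date(text):
    """Return an ISO-style yyyy-mm-dd string, or None if no date is found."""
    m = re.search(r"\b(\d{4})(\d{2})(\d{2})\b", text)        # yyyymmdd
    if m:
        return "-".join(m.groups())
    m = re.search(r"\b(\d{1,2})/(\d{1,2})/(\d{2})\b", text)  # m/d/yy
    if m:
        mm, dd, yy = m.groups()
        return f"20{yy}-{int(mm):02d}-{int(dd):02d}"         # naive century guess
    return None

print(normalize_quoted_date('"20090615"'))  # 2009-06-15
print(normalize_quoted_date('"6/15/09."'))  # 2009-06-15
```

Once both notes normalize to the same string, the dates can be compared as a differentiating (or matching) factor, which is the idea the answer above describes.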
Encoding level M indicates that the record was batchloaded without a human comparing it to other records in the database; instead, our algorithms try to find a duplicate and merge to it. Our algorithms are imperfect and don't have the advantage we humans have of being able to interpret the information. We try to bring together records that should be together, but we also do our best to keep apart records that should be kept apart. It is a balancing act. If you find records that are duplicates, report them to us. If you find records that have been merged incorrectly, report those to us as well. We learn something from every incorrect merge, and many incorrect merges allow us to further fine-tune our matching algorithms.
Yes. When we started the RDA hybridization of records to add fields like the 33X fields, and other things like the spelling out of abbreviations in field 300, we started with Books because they are the easiest ones to deal with and then went on to other formats. Serials still need to be done. One of the factors we've considered is how many of the CONSER records will be affected, because the changes that we make are then transmitted in the file that we send to LC. We will continue to make headway on Serials and all of the other records that are in the database that don't have 33X fields.
Incorrect merges can be sent to bibchange@oclc.org.
There is a way to change your validation level check in settings. You may have it set to where it is doing a full validation. That is something that you can minimize.
The process that creates HathiTrust records is out of date and needs work. We have put together requirements for what needs to be done with that, but we do not have a timeline to when that will take place. We have certainly thought about a process to delete 77X references, but we don't have a mechanism in place that can go through the database and detect that those fields exist and that they are pointing to records that no longer exist. It is something that we will keep in mind and try to get something in place if it is possible to do so.
No, not right now.
It would seem that the 006 field does not belong on those types of records. If you are unsure as to whether or not to remove those fields, you can send those to bibchange@oclc.org for further investigation.
In WorldShare and WorldCat Discovery, the Material Type "mix" does retrieve only records coded as Type of Record (Leader/06) "p", according to Searching WorldCat Indexes.
When performing a command-line search in Connexion or an expert search in FirstSearch, WorldShare, or WorldCat Discovery, the Material Type search "mt:arc" (for Archival Material) should retrieve all records coded as "a" (Archival) in Type of Control (Leader/08), according to Searching WorldCat Indexes. Note the difference between Type of Record (Leader/06) and Type of Control (Leader/08).
Although this additional fact is no longer accurately reflected in Indexing, as far as I can tell, the Connexion indexing of "mt:mix" should actually retrieve Type of Record (Leader/06) values p, t, d, and f. A few test searches in Connexion suggest that this is still true.
Type of Record (Leader/06) value "p" should be used for "Collections of materials in two or more forms that are usually related by virtue of having been accumulated by or about a person or body. … This category includes archival and manuscript collections of mixed forms of materials such as text, photographs, and sound recordings." Manuscript and archival collections that are primarily textual should be coded Type of Record (Leader/06) value "t"; primarily cartographic should be coded "f"; and primarily notated music should be coded "d". For all of these sorts of mixed/archival material collections, Type of Control (Leader/08, Ctrl) should be coded with value "a".
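The coding choices above amount to a small lookup. The Leader/06 and Leader/08 values are MARC 21; the mapping function and the category labels are an illustrative sketch:

```python
# Sketch of the Leader/06 choices described above for archival and
# manuscript collections; the function and labels are illustrative.

LEADER_06 = {
    "mixed forms": "p",    # materials in two or more forms
    "textual": "t",        # primarily manuscript/textual
    "cartographic": "f",   # primarily manuscript cartographic
    "notated music": "d",  # primarily manuscript notated music
}

def code_collection(predominant):
    """Return (Leader/06, Leader/08) for an archival collection."""
    # Leader/08 "a" = archival control, regardless of predominant form.
    return (LEADER_06[predominant], "a")

print(code_collection("mixed forms"))  # ('p', 'a')
print(code_collection("textual"))      # ('t', 'a')
```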
We are planning to hold these on the last Wednesday of each month through the end of June this year. We will then evaluate and decide whether to continue beyond that point or not. You don't need to sign up any place, you can just use the login information and log in at 1:00 PM Eastern Standard Time. It will be the same login information that was used for today.
The recording will be posted for this session. We will announce this on OCLC-CAT when the recording is posted. We may also post some notes as well and will announce on OCLC-CAT when those are posted and where they're posted. Today's presentation will be added to the Cataloging Defensively page on the OCLC website as well. We can post the login information at the same place that we post the recording. This information will also be announced on OCLC-CAT and included in the Message of the Day in the Connexion login a few times before each session.
Yes, please send your ideas in to askqc@oclc.org.
No, you do not need to provide proof for that. Whether or not to record illustrations can be considered cataloger's judgment. If the Fixed Field is coded for illustrations, we can add it to the 300 as needed.
Sometimes this doesn't fit into their workflow, it may be easier for them to just report it. Or it could be that the record is a PCC record, and they are not able to make changes to a PCC record and therefore they need to report it.
It is your preference based on what fits best in your workflow. You can send multiple requests periodically or individually as you come across them.
You can use this area to describe what the error might be if you wanted to elaborate on some details. Or just writing the word duplicate would let us know what you are requesting.
We wish we had an answer to that; we would love to have an undo button as well.
Yes. A lot of times we use Amazon, Google Books, HathiTrust, etc. as proof when processing change requests.
No, requests are processed first in, first out.
We would need to see an example of that. You can always save these to your online save file and then send an email or report a change request. We can then go into your online save file and see the record and issue you are reporting.
No. Once the change is made, a NACO lock is placed on the record until it completes the distribution process. We would encourage institutions in this situation to wait until the record has completed the distribution cycle and make any corrections themselves. Optionally they could report this to us, but we cannot stop the distribution or make any changes to the record until the process completes either. One of the features being built for authorities in Record Manager, which will be released in the fall, will be the ability to make changes to an edited authority record before it is locked for distribution to LC.
An example was requested. Due to the number of records in WorldCat, we may or may not have seen the issue before. When it is reported, we will look at more than just that one record; we will look for patterns or any other changes, especially if they all came from one institution. We will look at their other records to see if we shouldn't be cleaning up more than just what was reported. So, if your workflow allows it, please report those.
Yes, we are aware of this, this was brought up via OCLC-CAT as well, so we are in the process of looking into that.
We routinely try to revise proxy URLs to a usable URL structure. We have a macro that can change these. Users can also make this change themselves if they are able, or report them so we can see whether the change could be made to other such records.
Please report them and we’ll take a look at them. We will merge as appropriate.
Standardizing the order of 007 fields is probably not a good idea. Sometimes the 007 relates to the main part of the resource and other times a 007 may relate to accompanying material. There is no way to programmatically determine which is which. There are certainly other built-in problems regarding the 007 as far as MARC 21 is concerned and distinguishing between the main item and accompanying material is one of those things. There is no way to distinguish 007s in that respect. It probably wouldn’t be possible or necessarily a good idea to standardize the order of 007 fields. It sounds like it is something you need to address in your discovery system, it should be determining the material type of your resource by some other means than by looking at the first 007.
We do provide training for Connexion and Record Manager covering the interface functionality. We don’t provide training on how to use RDA or how to use MARC. If you would like to email askqc@oclc.org we can put you in touch with our training team.
Yes, we do. As mentioned earlier, if you come across an error and send it to bibchange@oclc.org or authfile@oclc.org we will look to see if there are other records affected. Also, it’s helpful if you notice a pattern to include that information in your request.
This is something which might need to be reported to LC or a NACO funnel. NACO institutions are supposed to update related name/title records when changing the 1xx form of the name. There may be lots of complications within this question, because there will be situations where we have headings that were never controlled to the authority record, but they don't match the form in the 1xx on the authority record, nor do they match any of the 4xx references that are on the authority record. They are just a little bit different, so they will not get globally controlled on their own and will just sit out there in a different form than the established heading in the authority record unless it's called to our attention. We would do manual follow-up, perhaps with the assistance of macros, to clean up the headings. In other cases, we may have situations where the authority record has changed and headings that were controlled didn't get completely changed. We need to be alerted to that situation so that we can do the follow-up and make sure that everything is in step with the authority file.
Update: Since the WorldCat authority file is a copy of the Library of Congress authority file, we would not run a script to crawl through and make changes. For non-NACO institutions, if you do see a situation where a name authority record (NAR) was updated but the corresponding name/title NARs were not updated, you may report these to authfile@oclc.org and staff will investigate further. For NACO institutions, please contact your NACO funnel or NACO directly for all NACO related issues.
No, you should be able to upgrade encoding level M records. You may run into an issue if another library has locked that record for editing, but otherwise, you should be able to edit these records depending on your authorization level. If you had a Search authorization you would not be able to make the edits, but with Full-level authorization or higher, you should be able to edit these records. If you came across a situation where you were not able to edit the record, send your request to bibchange@oclc.org and we will make the edits for you. This would also be a good opportunity to put the record in your online save file so we could take a look at it and see if there are reasons why that record is not editable for you.
Subfield $0 by itself should not have an impact on your ability to control a heading. What I suspect in this case is that it's not the kind of record you expect; in other words, it's not necessarily cataloged in English, and you're trying to control a name to the LC/NACO authority file, in which case you wouldn't be able to do that. Typically, subfield $0s exist in records that are from outside the US, so it's not uncommon to see a record that is cataloged in German that has subfield $0s on every access point, or records that are created with language of cataloging Dutch that have subfield $0s. But typically for a record that's created in English you don't normally see a subfield $0. Even if the subfield $0 is there you should be able to control the heading, and the subfield $0 will disappear if you are successful in controlling it. Another possibility is that you are trying to control MeSH or some of the other subject vocabularies, which you cannot control in Connexion. You can control them through Record Manager, and in Record Manager you would see the typical blue link that you see with LCSH. Those controlled headings, like the medical subject headings, controlled in Record Manager will display a subfield $0. You may also see a subfield $0 for FAST; those actually aren't controlled in Record Manager or in Connexion, but the subfield $0 is part of our processing when we enter that into the record.
Yes, that sort of thing should be reported so we can investigate and see what’s going on. It may be that there is a disconnect between the text and the authority record that you’re attempting to control to, so it’s not finding it. It could be any number of issues. If it looks like it should fit into the category that should routinely control, by sending it to us as an example we can investigate and get back to you.
If you come across something like this, report it and we’ll look into it to see if there’s a bigger issue at hand with other records. Also, if you were to notice a bigger issue or pattern, do let us know.
What you are probably seeing is a record that came in through our data sync project, and the date you are actually seeing is probably the date from the local catalog of the institution that sent that record. Sometimes those entered dates don't make sense, but that's where they originate.
mau is the code for Massachusetts. So, if the city listed in the 260 or 264 subfield $a was a place in Massachusetts that was the reasoning for the country being coded mau for Massachusetts.
Chat comment: check out https://www.oclc.org/bibformats/en/f...ield/ctry.html and scroll down to Codes. This explains how to code this field for items published in the U.S.
Chat comment: The LCCN structure is controlled by LC https://www.loc.gov/marc/lccn_structure.html - OCLC's search system takes that into consideration, so the space is part of the LCCN structure.
There aren’t any plans to do that right now; we are working on creating a more robust authorities infrastructure that will make adding new authority files much easier in the future. Jody DeRitter, Director of Metadata Frameworks, was hired last August to look into that, and she’s also working on moving FAST into production as well as moving VIAF out of Research and into production. So, there are no current plans for AAT at the moment, but stay tuned as the authorities infrastructure beefs up.
A little bit of history: the value v for DVDs was a later addition to the MARC format. After DVDs began being published, it took months, or possibly even years, before code v for DVDs was defined in MARC 21 and then validated. So, there are probably records that are incorrectly coded in the 007. If you have evidence in the record that it’s a DVD and not an earlier laserdisc technology, it’s perfectly proper to change it yourself and replace the WorldCat record. Otherwise, you can report it to us, and we’ll take care of it.
Yes, we do allow libraries to lock records if they are upgrading them or doing the NACO work necessary for upgrading them. Locking a record may also be used for staff training.
Yes, but it depends. The first OCLCQ could have been because the record was in a group of records a macro ran on, and the second OCLCQ could have been to remove the incorrect field, but then a batch or ingest process reintroduced the error into the record. There is a variety of reasons a library would send us updates to a record, so if we take it out once, it’s possible that if the library resends us the record it will show up again. But you can report those; we do try to work with the specific institution to minimize errors that keep coming back. We do ask that you report these so we know it’s happening.
We are continually trying to clean those records up as they come in. That is a known issue that is being looked at, but there is no specific date for a fix, so we are just trying to stay on top of it until we do get a fix in. We recognize that this is very frustrating to a cataloger; you are more than welcome to take them out as well, but it is a known issue we are working on.
That is another thing we continually clean up; those are coming in through batch load. We don’t know what the “bl” stands for, but we are aware of it and try to stay on top of it and clean those up as they come in.
This particular combination of place with colonies is an issue within the heading control software for subjects. There is a portion of that process that looks at subdivisions and whether they can be geographically subdivided and then moves the geographics to that spot. In the case of these headings with colonies, where you can further subdivide by a continent like Africa or Asia, we have two geographics that are actually separated by a topical subdivision, and the software doesn’t get them into the correct order. We’ve known about this for a while and have a ticket in place with our developers to try to get that fixed, but that has not happened yet. So, we’ve generally just looked the other way on these, looking forward to the time when, once it’s adjusted, we’ll be able to go back through and actually control them correctly. Otherwise, you ought to be able to add such a heading to a record, leave it uncontrolled, and make use of it that way.
Absolutely, which is why we don’t bother to correct them because they will be flipped back to the incorrect form. So, we’re still really waiting on the software to be fixed before we go through and globally fix these.
That would be appropriate when we are asked by that institution to look at a particular record. We don’t go into institution specific online save files on a regular basis, we don’t do that unless we are asked to. It’s a tool that we have to help institutions and we use it for that purpose. An institution may have made changes to a record and they want us to look it over, they have a question about it, a specific situation with a record, we can help them with that. We’ve also run into issues where someone has tried to replace the WorldCat record and are unable to do so, but they are able to save it to their online save file, and then we can go in and try to recreate the problem with that actual record with the edits that they made.
It depends what type of record that 856 field is on. If these URLs are on an electronic record, then both the 856 field representing the table of contents and the 856 field representing the cover image should have second indicator coded 0, because those URLs represent a part of the resource itself. Second indicator 0 should be used for URLs that represent the entire resource itself or a portion of the resource itself.
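The indicator choice above follows the standard MARC 21 values for the 856 second indicator (0 = resource, 1 = version of resource, 2 = related resource). The little mapping function and its category labels are an illustrative sketch, not production code:

```python
# Hedged sketch of choosing the 856 second indicator as described above;
# the indicator values are MARC 21, the function is illustrative.

def second_indicator(relationship):
    """Map the URL's relationship to the resource to an 856 second indicator."""
    return {
        "resource": "0",  # the resource itself, or a portion of it
        "version": "1",   # another version of the resource
        "related": "2",   # a related resource (e.g., a TOC link on a print record)
    }[relationship]

# On an electronic record, the TOC and cover-image links are parts of the
# resource itself, so both get second indicator 0:
print(second_indicator("resource"))  # '0'
```

The same links on a record for the print version would instead describe a related resource, which is why the record type matters before coding the indicator.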
It depends. If an institution specific URL represents the same link as another URL in the record, then you may delete that institution specific URL or report it to bibchange@oclc.org. For example, if the institution specific URL was a Wiley URL but a non-institution specific Wiley URL already existed in the WorldCat record, then you would delete the institution specific URL. If an institution specific URL is unique, then you would convert it to a non-institution specific URL instead of deleting it. If you are unsure, please report the URL to bibchange@oclc.org and staff will decide whether to delete or convert the institution specific URL.
Yes, they will be retained. Some of the links to HathiTrust and Google Books records are freely accessible. Others are only searchable or not accessible yet due to copyright restrictions, but that could change at a later date. If we deleted the URLs that were not accessible, we would need to go back and repopulate those records with the appropriate URLs after they become accessible, so these HathiTrust and Google Books URLs that are searchable only or not accessible are okay to leave on the WorldCat records.
Materials specified is a general caption for what subfield $3 represents. It is used both for specifying the specific material and for differentiating between different providers in a provider neutral record. OCLC discussed the issue of whether subfield $3 covers the name of a provider many years ago at the outset of provider neutral cataloging. While it is a stretch of that definition, there was no other place to very conveniently indicate which URL belonged to which provider. After making use of subfield $3 in that way, there are now millions of records with subfields $3 containing provider names, so usage has since dictated the change and shift in the definition.
Currently the subfield $y is used about 50 million times in WorldCat. The most common use is "View online". If you are interested in seeing how this field is used, click on this link: http://experimental.worldcat.org/marcusage/2018-01-85. Note that this list is a ".txt" file and it's pretty large, so it may take time to download.
The subfield $y is defined as "the text that is used for display in place of the URI in subfield $u." It converts the text in the subfield $y into a clickable link. For example, the phrase "Click here" could be used in the subfield $y so that the URI would no longer display in the library's catalog; instead, the phrase "Click here" would be displayed as a clickable link. The URI can also be masked using subfield $3 and subfield $z, so subfield $y is not always necessary if the text from the subfield $3 or subfield $z is being used by the library's catalog. Years ago, subfield $y was used with the phrase "Click here" pretty frequently, but later on it was not used as much. What works for one library doesn't necessarily work for another library that would prefer some different kind of phrasing. Because of this, subfield $y is not used all that often anymore.
Currently, if a bibliographic record does not have an 856 field, the eBook icon will not appear in any of OCLC's interfaces: WorldCat.org, WorldCat Local, Discovery, or Record Manager. In order for the eBook icon to show, the record currently needs to have an 856 field. This is a known issue and OCLC staff are currently working to resolve it. In the next few months, we hope for a resolution that will change how the eBook icon is generated, removing the requirement for field 856. OCLC will send out announcements when this change takes place.
This would depend on the local practice of your institution. You may delete it from the local copy of the record, but it should remain on the WorldCat record.
We do not know what percentage of PURLs are broken. We know that the way that OCLC PURLs were handled changed over time, but we don't have any way of knowing the percentage of broken PURLs. Please report all broken OCLC PURLs to bibchange@oclc.org.
Whether or not you choose to delete it from your local record would be up to your institution. However, the 776 fields should remain in the WorldCat record.
The 776 field is used to link between two different versions of a record. So, for example, it would be used to link an electronic version record with a print version record. These two records have the same title, the same publisher, and everything is identical except for the version (i.e. print versus online). The 776 field on the print version record would point to the online version record and the 776 field on the online version record would point to the print version record. This links the records together in the library's online catalog.
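The reciprocal linking described above can be sketched in plain Python (an illustrative data model only, not a MARC library or OCLC tool; the titles and OCNs below are made up):

```python
# Simplified model of reciprocal 776 linking entries between a print
# record and its online counterpart. Subfield codes follow MARC 21
# convention: $i display text, $t title, $w record control number.
def make_776(other_title: str, other_ocn: str, note: str) -> dict:
    """Build a simplified 776 linking entry pointing at the other version."""
    return {
        "tag": "776",
        "subfields": {
            "i": note,                    # e.g. "Online version:"
            "t": other_title,
            "w": f"(OCoLC){other_ocn}",
        },
    }

print_record = {"ocn": "100000001", "title": "Example journal", "fields": []}
online_record = {"ocn": "100000002", "title": "Example journal", "fields": []}

# Each record points at the other, so a catalog can link between them.
print_record["fields"].append(
    make_776(online_record["title"], online_record["ocn"], "Online version:"))
online_record["fields"].append(
    make_776(print_record["title"], print_record["ocn"], "Print version:"))
```

The point is simply that the link runs in both directions: each version carries a 776 naming the other.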
Yes. You may catalog the electronic resource record in WorldCat for the online resource and mention the printouts in local fields. It is also okay to create a record representing the printout resource as opposed to the online resource itself; that record would not be considered a duplicate. How you attach your holdings symbol, whether you work with the record for the online resource itself and treat the printout as a copy, or create two records, is up to you locally, but it is possible to have the two records.
You may email any needed BFM to Metadata Quality staff at bibchange@oclc.org. Please include the authority record number (ARN) or Library of Congress control number (LCCN) representing the added or updated authority record and Metadata Quality staff will add your request to their workflow for processing.
In the Connexion client, for example, the only way to do that is through the drop-down menus for guided entry; you can then look at the existing field and potentially edit it using the fixed-field mnemonics. If the record has an 006 field, right-clicking on it brings up the guided entry box, as does the drop-down menu at the top.
It really is the same. It used to be that our Batchload processing had a separate set of validation rules, but over time we have come to use the very same set of validation rules so that we don’t have to maintain multiple sets. The difference comes in how we deal with the errors that are spotted in the records after the fact. In Batchload processing, an error level is assigned to various records and causes them to go through different kinds of processing; it is not as if records have to be absolutely perfect to be added to the database, but we do detect the very same set of errors.
That is a case where the record has most likely come through the Batchload process, because it is impossible to create a record like that through online input. We have a relationship in place between 6xx with the second indicator coded 7 and a subfield $2, but relationships in Batchload are considered a lower-level error, and those can find their way into the database.
This is a really tough thing to go look for and find. This may be different in Record Manager, but in the Connexion client you could input the vertical bar character at different positions in the field as a way to spot where the invalid character is. There may also be macros out there that can help find that spot.
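As a sketch of that hunt for invisible characters, a small Python helper can report the position of any non-printable code point in a field. This is an illustrative local script, not an OCLC or Connexion feature, and what counts as "valid" varies by system; here we only flag characters Python considers non-printable:

```python
def find_suspect_chars(field: str):
    """Return (position, character, code point) for each non-printable character."""
    suspects = []
    for i, ch in enumerate(field):
        # Control characters and other non-printable code points are the
        # usual culprits behind a validation error you can't see.
        if not ch.isprintable():
            suspects.append((i, ch, f"U+{ord(ch):04X}"))
    return suspects

field = "Introduction to catalog\u0000ing"   # hidden NUL character
for pos, ch, cp in find_suspect_chars(field):
    print(f"position {pos}: {cp}")           # prints: position 23: U+0000
```

Knowing the position lets you jump straight to the offending spot instead of bisecting the field by hand.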
This is where you have options that you can set in your holdings so that you don’t require full validation. A lot of libraries would prefer to be able to set their holdings without necessarily fixing everything on a record. If you do an explicit validation command, you will get back full validation with all of the errors listed, but that is not necessarily something you have to fix in order to set a holding.
In a case like this you need to replace the record if you are able to do so. If the error is on a PCC record and you only have a full-level authorization, you can report it so that we can fix it.
The issue here is that the code in subfield $b is actually out of step with the term in subfield $a. At this point we validate the term in $a and we validate the code in $b, but we don’t actually validate the two of them against each other.
This is an error where controlling has transferred from an LC heading in the past; if you see ones like that, report them so that we can investigate.
In part, there was a decision within CONSER in the past concerning the issues raised by the print record carrying so much information about the electronic resource: the record should represent the print and only note the existence of the electronic. We were including elements from the electronic version in the record that could then be confusing when processing the record in the future. So, a decision was made to remove certain things, like 006 and 007 fields relating to the electronic version, at least for serials initially; that conversation then carried over into monographs, where there was a PCC decision to handle it exactly the same way. But you will still find records in the database that represent the print version, with the form coded blank in the fixed field, that nonetheless carry a 006 or 007 with electronic information that may come out.
For 040 subfield $b, which is the language of cataloging, we have discussed before making that a mandatory element, so that you would be required to input subfield $b when creating a new record. Subfield $e is harder to require because there is no other element to link it to: if you created an AACR2 bibliographic record, you would not have a subfield $e in the 040, and Desc in the fixed field would be coded just "a". Desc i without 040 subfield $e is a valid combination, so subfield $e may be something you will always just have to remember; subfield $b may be something we build a relationship for in the future.
Yes, that’s correct, since the same validation rules are used for the various services. When the MARC update is applied, we update the validation database, and it is automatically updated for Connexion and Record Manager.
In most cases a copyright date can be used in RDA to infer a date of publication if there isn’t an explicit date of publication. So, in a 264 1 that inferred date of publication would be bracketed. A subsequent 264 4 could be input with only the $c identified as a copyright date. If you find records without the brackets, that is an error, and it should be fixed or reported to OCLC. Do not input a new record.
This question came up before, when our validation was still based on AACR2. In the context of MARC records, we took this issue to CONSER and had a discussion with them about what this would mean, because a serial could potentially carry all the ISBNs assigned to all the individual volumes. So, this is really a constraint of MARC at this point, and it is a consideration in the decision to continue our past practice, which is to omit the ISBNs for the individual parts.
020 subfield $z should not require a valid check digit. In fact, if you had a number that you were going to include in 020 subfield $a but the check digit was incorrect, that is a case where you would put it in $z, in addition to the cases where the number is not appropriate for the item being described in the record.
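For reference, the check-digit arithmetic behind valid and invalid ISBNs can be sketched in Python. This is the standard ISBN-10/ISBN-13 calculation, not an OCLC validation routine:

```python
def isbn10_check_ok(isbn: str) -> bool:
    """ISBN-10: digits weighted 10 down to 1 must sum to 0 mod 11 ('X' = 10)."""
    digits = isbn.replace("-", "").upper()
    if len(digits) != 10:
        return False
    total = 0
    for i, ch in enumerate(digits):
        if ch == "X" and i == 9:      # 'X' is only valid as the check digit
            value = 10
        elif ch.isdigit():
            value = int(ch)
        else:
            return False
        total += (10 - i) * value
    return total % 11 == 0

def isbn13_check_ok(isbn: str) -> bool:
    """ISBN-13: digits weighted 1,3,1,3,... must sum to 0 mod 10."""
    digits = isbn.replace("-", "")
    if len(digits) != 13 or not digits.isdigit():
        return False
    total = sum((1 if i % 2 == 0 else 3) * int(d) for i, d in enumerate(digits))
    return total % 10 == 0
```

A number that fails these checks is exactly the kind of thing that belongs in $z rather than $a.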
I agree, it could be fussier about bringing in those kinds of headings. There are some issues that are being resolved about the number of headings that transfer and the state they are in when they do transfer. We are also attempting to clean up problems that we know about e.g. certain combinations of headings. We will try to go after them and get them out of the way.
The flip side would be the person who wants to catalog the individual volume in a series, who would then complain to us: "I searched this ISBN expecting to get the one monograph record, but I am also always retrieving the serial, and I don’t want that." It is a really difficult position; the practice can be extremely useful in some cases but not so much in others.
We have validation in place for all languages of cataloging.
They pass validation because all of those characters are now valid with OCLC’s implementation of Unicode. The precomposed characters that have the diacritic combined with the letter are somewhat problematic when using macros because the macro language is not Unicode compliant. There are issues with any macros that are written, including the transliterate macro. We are thinking about possible solutions but do not have any definite plans yet.
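The precomposed-versus-combining distinction can be seen with Python's standard unicodedata module. This is a minimal illustration of the Unicode issue itself, not of OCLC's macro language:

```python
import unicodedata

precomposed = "café"                                    # é as one code point, U+00E9
decomposed = unicodedata.normalize("NFD", precomposed)  # e + combining acute U+0301

print(len(precomposed), len(decomposed))  # prints: 4 5
print(precomposed == decomposed)          # prints: False (visually identical,
                                          # different code points)
print(unicodedata.normalize("NFC", decomposed) == precomposed)  # prints: True
```

A tool that compares strings code point by code point without normalizing first will treat these two spellings as different, which is the root of the macro trouble described above.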
The language of cataloging code in the 040 subfield $b applies to the descriptive cataloging, not to the subject cataloging. So 6xx subject headings can be in any language as long as they are coded correctly.
Yes, we would like that process to change. It is the result of records being built in the data ingest process, batchloading. There was also an issue at one point in Record Manager where these fields were sorted into tag order rather than keeping the most relevant heading first.
Treat the item according to the print-on-demand and photocopy provider neutral guidelines, which you can find on the PCC website. We will be including information about provider neutral cataloging for print-on-demand publications in an update to Bibliographic Formats and Standards, but otherwise you can search the PCC website and find the guidelines. There will be one record for the reproduction. You would have a 533 field that indicates it is a reproduction in print; it would not include any details on its publication.
Validation was designed solely to report back errors. In a case like this, where you have a mismatch between two elements, the question is "which one is really wrong?" It may be that the heading coded 110 shouldn’t be changed to a 100; it may be that the 110 is correct and the name should be coded "n"; or it could be the reverse.
Yes, there is a second cohort that started last year and they're going great guns. We are planning on starting a third group sometime this summer, or at least later this year. That group is still being formed; we are very excited about moving forward with that.
Yes, that was a change in how those FAST headings are generated and applied to existing records. It is no longer a requirement that all the headings have to be convertible to FAST. We will do the ones that we can do.
Yes, there is. The thing to look for is Print-on-demand.
A couple of the topics that we have done so far have been suggested by our members and we have also picked a few topics that we thought were important. A survey will be coming out soon and we are hoping that you will suggest lots of great topics. Feel free to let us know what you want in the survey, through AskQC@oclc.org or write to any of us individually.
Yes, we are aware. We have attempted to put together a macro to try and fix them, but it is a very tricky thing to do. In many cases, the place and publisher combinations are not necessarily subfielded; we have commas and similar cues to rely on, and we don’t get reliable results all the time. We are also aware that, because some of these were corrected in the past while we were still batchloading them, we have 260 fields that did not compare and match correctly, so we have lots of duplicates as well. We are aware of this and working to clean up as much as we can.
This is a case where you could include multiple 33x fields for different aspects of an item. It might be difficult to include this kind of check in validation, and it may be better for us to look for these things in the database. This occurs frequently for HathiTrust and Google Books records because existing records are simply cloned and the 337 and 338 are not removed and replaced with their online counterparts.
Through the Expert Community, libraries are welcome to go back and add non-Latin scripts to any WorldCat records. They will be parallel fields and are a great addition to records. This is an especially good time to add non-Latin scripts since OCLC accepts all of Unicode, which allows more languages to be used in WorldCat.
Yes.
It would be a matter of cataloger’s judgment, but if you are certain that a change needs to be made to correct the record, especially when changing the form of a name or title in an access point to match the LCNAF, it would be right to do so. When you are changing the main entry of a record because a cataloger mistakenly placed an editor in a 100 field, make sure to move that access point to a 700 field rather than deleting it completely.
We do keep monthly statistics and have a record of those. These statistics are broken down by the type of change that was done, such as minimal-level upgrades or database enrichment. We are not able to get too granular on the exact change that was made to a record, and we currently do not have numbers we can share on this.
As part of the Expert Community, you can make this change. If you have the item in hand and are sure that the change is correct, please go ahead and make it. If the record is PCC, then please send us whatever proof you have so we can see whether we can make the change. If we see something that is obviously wrong, like a publication date that precedes the author’s birthdate, we would do some research to see if we can infer the correct metadata for the record.
We think it is appropriate, especially if you check it in the LC/NACO authority file and find an authorized series access point. If it is not in the authority file but your library wants to add an 8xx field, that would be cataloger’s judgment and it’s okay to do so as other libraries would find that helpful.
You can take out that note if you're upgrading the record to RDA. The compact disc note was discontinued in AACR2.
We are aware of the problem with data transfer and we are looking to have it resolved. We are also correcting these records, so if you do come across issues like these, please go ahead and send them to bibchange@oclc.org. We would like to identify the source behind the incorrect data transfer and make sure we are aware of all the records that were affected, so we can later correct them all.
If the record is clearly in one language that is not reflected in the $b of the 040 field, please feel free to go ahead and correct the language of cataloging code in $b. Make sure to consider all the descriptive fields, such as the 300 and 5xx fields; the subject headings being in a different language does not count in this situation. If you think changing the 040 $b would drastically change the record, do not hesitate to report it to us at bibchange@oclc.org and we can determine the best course of action to take.
This would be cataloger’s judgment and if it is too difficult to decide on what is the appropriate action to take it would be best to go ahead and report the record to bibchange@oclc.org to help make the correction.
Correct, even if a record is in a language of cataloging that is not used in your institution you can still use the record by deriving it and making a new master bibliographic record that follows English language cataloging practices.
Ideally yes, a second indicator 0 in an 856 field would indicate that the entire resource is available on the web. The second indicator 0 does not imply anything about the resource being freely available or behind a pay wall. It just indicates that the resource is available at this link. For multivolume sets or serials records that provide separate links to each volume or issue it is still appropriate to have a second indicator 0 because you will be obtaining the whole volume or issue for that multipart or serial.
We would suggest that you try to work with that record if the vendor or library intended to use this record for English language cataloging conventions. You can upgrade them to the guidelines that you are using since these institutions are not required to use RDA.
We try to use the record that needs the least amount of work and is most complete. In most cases, it would be a PCC record, but you do not have to feel obligated to choose a record for us to retain. When we are merging, we follow a hierarchy like DDR that helps us determine which record should be retained. Users can just send us the duplicates and we will make the choice. Generally, we also keep the record with the most holdings.
If it’s just a link to a table of contents or index it would be inappropriate to use a second indicator 0. In those cases, you would use second indicator 1. For a chapter in a book that would be more difficult to determine, and it would be a cataloger’s judgment call to decide if indicator 0 is appropriate.
The definition of second indicator 4 is that it’s a local vocabulary that does not follow a controlled list. If there are duplicates of subject access points, this is not useful, and it would be appropriate to remove the 650s with second indicator 4. If they are different, then it may be helpful to keep them in the record. Field 653 holds uncontrolled terms that are not related to any list and go beyond the topical. 650 is generally reserved for topical headings, while 653 holds anything from subjects to names. A 650 with second indicator 4 is also structured in some way, unlike 653 terms, which are just keywords; you wouldn’t be able to use subdivisions in a 653.
LCSH "Electronic books" doesn't have a scope note to help us determine if it can be used as a genre, so judgment applies, if it falls under the category of "disciplines in which LCGFT authority records have not yet been made." If not, then the local 655 would be appropriate (using _4). This response was derived from LC's Frequently Asked Questions about Library of Congress Genre/Form Terms for Library and Archival Materials (LCGFT).
Unless you have the script and language expertise for that vernacular field, it would be better to leave it alone.
No. LCGFT would be identified by second indicator 7 and $2 lcgft.
It is not recommended to use this field; it is intended for conversion of bibliographic data by machines, not humans. These are not properly differentiated access points as you would see done by human catalogers. If a cataloger sees a 720 field and can appropriately determine what it is trying to represent, then go ahead and change it to the form in the LC/NACO authority file and move it to a 100 or 700 field.
If the record is going to be in your local catalog, this note can be anywhere in the record. We would prefer that you please not add this information to the master bibliographic record. You can put this information in the local holdings record, or in the 956, which is the locally defined equivalent of the 856.
Don’t spend your time correcting it if you know this record is a duplicate. Go ahead and report it to bibchange@oclc.org. If the record is not a duplicate, then please go ahead and correct it.
You can learn more about the different OCLC specific symbols used in WorldCat by checking out the latest updates from chapter 5.4 of Bibliographic Formats and Standards. OCLCQ is for WorldCat Metadata Quality, it appears whenever changes in the record are done, whether automatically or manually by Metadata Quality staff. OCLCA is the automated process where a controlled authorized access point in a bibliographic record is updated to match changes to an authority record (OCLCO is similar to OCLCA).
There is no end-of-life date for Connexion. Someday Connexion will end, and its successor is Record Manager; you can already begin using Record Manager now. When there is an end-of-life date for Connexion, we will make sure to notify you well in advance. Connexion is not being developed anymore, though; all functionality upgrades are being done in Record Manager.
This is an indexing issue. Please go ahead and send us the OCN that is displayed twice in your results list to bibchange@oclc.org. We will re-index the record, which will clear up the issue.
There are some functionalities that are being added to Record Manager one of them being controlling headings.
Record Manager will be able to sustain the work that is done with authorities. Both vendors and libraries will be able to continue doing their work in the LC/NACO authority file through Record Manager once Connexion ceases to exist.
At this moment no, but we are hopeful that a new mechanism will be built in Record Manager to help us modify large numbers of records. There are some functions in Record Manager that mimic current macros in Connexion, such as taking a print version record and converting it to an electronic version record. There are roughly 5 to 7 advanced functions available in Record Manager.
Whenever there are changes in MARC coding, they will continue to be made in our systems, regardless of the interface changes being done on the front end. Jay manages the MARC update, which will be done in the next 6 months (we will be implementing the 2018 MARC update). We will publicize the changes and additions in MARC coding through our technical bulletin.
This question would be better addressed to Customer Support at support@oclc.org. We try our best to answer Record Manager questions but the AskQC sessions are more focused on cataloging questions and Record Manager questions can be sent to Customer Support who will forward it to the appropriate department.
No. Technical Bulletins are valid and do not become obsolete at a certain point in time. Each year when a technical bulletin is issued, we try to update BFAS and incorporate those updates into the document.
We suspect that it probably will, given the way we process information in the 533 in our data sync processing. We pay attention to publishers that are in the 533 so that, for example in the case of microforms with different publishers, we wouldn't want to merge them together. So, given a record that has a 533 including what looks like a publisher versus another record that doesn't, we would probably end up adding that at this point. We also have macros that we run on occasion to go after records for certain online resources, to make them provider neutral. So, it's possible that a 533 that once appeared in the database might be removed as part of the process to make the record provider neutral, and then the record could be subject to being merged after the fact through our duplicate detection algorithm.
Whatever method works best for you and fits into your workflow is fine.
There aren't any specific plans, at this point, to clean up local headings. We encounter local headings as part of all the other work that we do, and we look to make sure that they are correctly formulated. So, if we have a pattern of some local heading that is problematic in that respect, we will sometimes go after it and fix it up. In some cases, if we are encountering multiple forms, staff will do the additional step of establishing an authority record for that name. We know that a lot of local headings have been entering the database through our data sync processes, and when those duplicate subject headings already in the record that are not local, we do have a macro established that deletes those headings. We have not taken a systematic approach, but when we do encounter those, we are deleting the local headings. We also have in process a correction to the way data sync works, so that fewer of these local headings will transfer from incoming records to WorldCat records.
We don't know of any public interfaces that are using them (or specific ones), but they are being used by various institutions. Nathan said that he would have to talk to Jody DeRidder, who is overseeing the FAST process to figure out what specifically they are. We are in the process of creating an editorial board for FAST so that we can go through the process of updating them or adding terms, or various things like that, and not be solely dependent on the conversion of the LCSH. We have some announcements in the works that will go out in the next few weeks, or sometime in the future about those sorts of things. We are looking at creating a sustainable future of the FAST headings.
We presume this is referring to the situation where the character typically shows up as a black diamond with a question mark. Fixing them is certainly an appropriate thing to do. That would involve deleting the fields, because they are often duplicates of fields that are already there. That's because the character is considered a non-Latin character within the OCLC database, which is partly why they have transferred in. We are working on the root cause of this problem before we take an approach to finally cleaning all of them up. We have, on occasion, gone back to get rid of large groupings of these, but find that in some cases they transfer back in again. So, helping us by cleaning them up when you happen to see them is a good thing. Nathan added that the time frame for the fix is within a couple of months, and he hopes to report where we are in the development work during the October or November office hours. The biggest thing is that we don't want them to continue coming in.
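If you want to hunt for these systematically in your own exports, the "black diamond with a question mark" is the Unicode replacement character U+FFFD, which is easy to detect. This is a hypothetical local script, not an OCLC tool, and the sample fields are made up:

```python
# The Unicode replacement character marks text that was garbled in transfer.
REPLACEMENT = "\ufffd"

def fields_with_replacement(fields):
    """Return the fields whose text contains U+FFFD."""
    return [f for f in fields if REPLACEMENT in f]

sample = [
    "245 10 $a Les mis\ufffdrables",   # damaged: é lost in conversion
    "264 _1 $a Paris",                 # clean
]
for f in fields_with_replacement(sample):
    print(f)
```

Fields the script flags are candidates for deletion or repair as described above, since they often duplicate an undamaged field already on the record.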
A request can be submitted to the Bibchange inbox (bibchange@oclc.org) to let us know that an error in merging has occurred. As long as the records were not merged prior to 2012, they can be recovered. Once the records have been pulled apart, we can test them to see if subsequent changes to Duplicate Detection and Resolution (DDR) may have taken care of the problem that allowed them to merge in the first place. If DDR would still merge them for one reason or another, we can often work with the reporting institution to come up with a way to prevent DDR from merging them again. As we have mentioned in previous sessions, our de-duplication process is continually evolving. As we stumble upon an incorrect merge, we do go back and test it and change the algorithms that are merging it in hopes of preventing future cases like that.
As of right now, it's since 2012. This date is fixed, so in 2020 we will be able to go back to 2012. As we look at data retention and the size of this file, because there is a lot of stuff in the journal history file that is kept, this decision may be reversed or changed, or shortened sometime in the future.
Right now, within WMS, there is controlling for multiple sets of authority records: NACO being the prominent one, and the one most used with English language of cataloging records. There is also controlling for LC subject headings, MeSH subject headings, Maori subject headings, Dutch names (mostly used on Dutch language of cataloging records), and German names (mostly used on German language of cataloging records). Later this year we are going to be implementing a French-language authority file from Canada, which will control names in French language of cataloging records. If something is not in NACO, it will not be controlled. If an authority record is needed for a particular heading where one does not exist, one of the things OCLC can do is create an authority record for that heading. Requests for new headings can be sent to authfile@oclc.org, and WorldCat Metadata Quality staff handle those requests.
This is an OCLC thing, not a vendor thing. If a vendor sends us records on your behalf and it's loaded under your symbol or a collection created for your library, our software here at OCLC (DataSync software) changes the 040 subfield $a and subfield $c to your OCLC symbol. This is a decision that was made when the DataSync system was being programmed.
These are reasons why we would want to pay attention to the subfield $a and subfield $c in the 040 field, so we will take that into account as we continue to look at that system.
There are edit restrictions that are built into the system for certain fields and in certain kinds of records that would prevent somebody from replacing a record after making certain kinds of changes. There is no edit restriction on field 015. Libraries can add them or delete them as they see fit. It seems pretty unlikely that an 015 field would be deleted if it's legitimate. And in this case, since we're talking about records that have come from Library and Archives Canada, I would imagine that your 015 fields are safe.
There are roughly two dozen comparison points for bibliographic records in DDR (Duplicate Detection and Resolution). That is misleading, in the sense that many of those comparison points actually draw from various parts of the bibliographic record and not simply one field. In many cases, the information gets manipulated first to see whether things that are transcribed differently or look different are actually the same, and second to see whether things that appear to be the same are really different. There are roughly 300 fields possible in a MARC bibliographic record, and there are over 200 fields that we look at or otherwise consider. Most of those are the things you would expect, such as the title (245 field), places of publication, publishers, dates, and series. But there are also comparison points specific to particular kinds of bibliographic records: for instance, scale in Maps records, publisher numbers in Sound Recordings, and various elements of instrumentation in Scores. So, there are lots and lots of comparison points. Jay has done a defensive cataloging presentation that helps you know which fields play in, so that a record you create is not merged into another record. We can maybe do a session sometime in the spring about what our merging process is.
This seems like a very worthwhile request. What we have in place, in terms of browsing headings in the bibliographic file, is that all languages of cataloging are included with all their headings integrated into the same index. So, sometimes you'll see variations in names that are legitimate names because one is the form that's used by the Germans, the other is the form that's used in English language cataloging and so there's a difference in qualifiers. Or, if it's personal names, one will have a date and the other one doesn't. It's a little bit confusing as you're looking at that display. I presume this is what you are asking about in this question, as opposed to searching related to an authority file where you normally just pick the file that you want to search in. In browsing through headings in the bibliographic file, they are all mixed together. This is something that we ought to keep in mind for Record Manager because it would be a desirable thing to have.
Since you will be using WMS and WorldCat will be your database at Library and Archives Canada, we would hope that you would enhance what is already there rather than create a duplicate record. In many cases, there won't be a record already there when you are doing Canadian CIP.
This is a known problem in data processing. It is on a list of things to resolve, along with other things, but we are not sure where it ranks. It is a problem in that tag 600 is automatically going to float to the top rather than the 650 that a cataloger put there intentionally.
As an LC subject heading, it would be up to LC alone to establish a subject authority record for it that we could potentially control to. The re-positioning of the geographic subdivision is determined by which of the subdivisions present can be subdivided geographically. This heading had History and Press coverage; presumably Press coverage is a subdivision that can be subdivided geographically, which is why Lebanon was moved to the end in this case.
Right now when you merge two records and the OCN that is in the Knowledge Base is not the OCN that is retained in the bibliographic database, there's a brief disconnect period where we have to wait until the OCN gets updated in the Knowledge Base. That will happen automatically eventually but if you need it to happen much sooner, then the OCN needs to be updated in the Knowledge Base. We are working on making that much more streamlined because it doesn't do any good to not have the correct OCN in the Knowledge Base. At the moment there aren't any notifications as to when records are merged, but we should probably look into that more.
Yes, they can. It doesn't matter by which process they were merged; as long as it happened after 2012, we can have them recovered.
GLIMIR stands for Global LIbrary Manifestation IdentifieR. GLIMIR is an effort by OCLC to bring together holdings for the same manifestation that are distributed across multiple parallel language records. GLIMIR began as a project in 2009 and was fully implemented around 2012. When WorldCat was GLIMIR-ized, the same or similar records were clustered together to improve end-user searching. Since the purpose of this tool was end-user searching, it is not very useful in a cataloging context. It is much more useful in WorldCat Discovery or WorldCat.org since it brings together all of the parallel records as well as print and microform records representing the same manifestation. For catalogers, make sure the GLIMIR box is not checked when you search in a catalog interface.
GLIMIR clustering can be disabled. Once it has been disabled, the system will remember that preference. If it is not yet disabled, make sure that only one instance of Connexion Client is open, open the search dialog box, and uncheck the "Display using GLIMIR clustering" option. Search WorldCat, then exit out of Connexion Client. The Client will save the last preferences that you chose before exiting. When you open the Client back up, "Display using GLIMIR clustering" should no longer be selected.
TRCLS is a vendor institution in Japan and their intent is to catalog in English. While their intent is to create an English language record, they do not catalog the same way we would catalog the resource. These should remain in the English language of cataloging records, so please feel free to correct these records as needed.
The OCLC symbol representing the Bibliothèque nationale de France is BDF. To identify the OCLC symbol of a particular library, go to the Directory of OCLC Members and search by the institution's name. As far as the language code to put in field 040 subfield $b, the code would be "fre" whether the language of cataloging is French from Canada or French from France.
If the 520 is a quoted note, then it is perfectly fine to keep in the record no matter what the language of cataloging. If your library has the language knowledge to add a translated summary note in the language of cataloging of that record, you may do so and replace the quoted summary in the other language. If you are cataloging in English then, yes, the summary note in the 520 field is supposed to be in English as well. Allowances are made for libraries serving multiple communities; for example, a library that catalogs in English but also serves a Spanish-language community may keep the quoted Spanish-language summary statement along with the English-language summary note. If you see these translated summary notes in a record, you may fix them to match the language of cataloging of that record.
To limit records to a particular language of cataloging, use the "Apply language of cataloging limiter" option. You may also use ll: if searching in the command line search. For example, ll:eng would limit your command line search to only English language of cataloging records.
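The command-line limiter described above can be sketched as a simple string composition. The `ll:` index label comes from the answer; the `ti:` title index and the helper function are assumptions for illustration:

```python
def with_language_limit(query, lang_code):
    """Append a language-of-cataloging limiter (ll:) to a command-line
    search string, e.g. 'ti:hamlet' -> 'ti:hamlet and ll:eng'."""
    return f"{query} and ll:{lang_code}"

print(with_language_limit("ti:hamlet", "eng"))  # → ti:hamlet and ll:eng
```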
If you notice that correcting the hybrid record will result in a duplicate record for that language of cataloging, we encourage you to report them as duplicates to bibchange@oclc.org. In many cases, DDR will come along a few days later and find and merge these records. For more information on DDR, see Defending Differences from Duplicate Detection and BFAS 5.1 OCLC Member Quality Assurance.
The records may be merged but Metadata Quality staff would first consult with experts in the language before making a final determination on the matter.
The same field 010 may be used in a parallel record since the identifier in that 010 field represents the same resource.
It is common practice for some Dutch academic institutions to catalog certain records in English. If all of the fields except for the access points are cataloged in English, then most likely the intent of the institution inputting the record was to catalog it in English, even if they have used the Dutch authority file. Verify that field 300, the 33x fields, non-quoted notes, and other elements are English language of cataloging; then you may correct the form of the access point to conform to English language of cataloging practice. It’s also worth noting that a number of Dutch academic libraries are planning on joining NACO and will start creating English language authority records. This should make a difference for records contributed by Dutch academic libraries in the future.
When parallel records were introduced in 2003, OCLC also introduced "PR" notes in field 936. This note contained a list of OCLC numbers representing parallel language records for that same manifestation. Because this field was not always used as intended, OCLC stopped using the field altogether a number of years ago and deleted them from WorldCat.
When entering the codes into the 3xx if your institution enters English language records and is following RDA guidelines, you do not need to add the language code after the code in subfield $2. For example, if you were cataloging an item in English under RDA guidelines, you would expect to have English language terms in the 33x fields subfield $a with rdamedia, rdacontent, or rdacarrier in subfield $2. Under RDA guidelines, there is no need to include the slash "eng" ( /eng) after the code in the subfield $2. If you were cataloging in a language other than English, though, you would include the appropriate term in subfield $a for that language and then follow the source code in the subfield $2 with a slash and the appropriate language code.
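The rule above for subfield $2 in the 33x fields can be sketched as a small helper. This is an illustrative sketch of the guideline, not an OCLC tool; the function name is an assumption:

```python
def source_code_for_33x(source, language_of_cataloging):
    """Build the subfield $2 value for a 33x field per the guideline
    above: the RDA source code alone for English-language cataloging,
    or the source code, a slash, and the MARC language code otherwise."""
    if language_of_cataloging == "eng":
        return source
    return f"{source}/{language_of_cataloging}"

print(source_code_for_33x("rdacontent", "eng"))  # → rdacontent
print(source_code_for_33x("rdacontent", "ger"))  # → rdacontent/ger
```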
Yes, please correct errors that you find in these records. If you are unsure or see a pattern of errors, please send these to bibchange@oclc.org for WorldCat Metadata Quality staff to review and correct as needed. Metadata Quality staff correct these errors on a regular basis, but member libraries are encouraged to correct them as well.
No. You will need to contact your specific ILS provider and work with them.
Yes.
OCLC prefers that summary notes added to field 520 match the language of cataloging of the record. While it’s understandable that a library would want to include summary notes in other languages, it's preferred that libraries treat these other-language summary notes as local notes and either add them to their records locally or add them to an LBD record.
We urge you to look carefully at all elements in the records but if you do feel like they are duplicates, please report them to bibchange@oclc.org. We will make a determination whether to merge them or not, consulting with our language experts if needed.
Yes, please feel free to correct errors if you see them.
Yes, you may derive a new record from a parallel record. When you do, be careful not to enter a hybrid record. Make sure to verify the non-transcribed elements to make sure that the language of cataloging matches the language of cataloging of your institution.
It depends on the language of cataloging of the record. Look at the whole record to determine what the intended language of cataloging was before deciding what to do. If the language of cataloging is determined to be Chinese and the summary notes in the 520 fields are in the vernacular with a transliteration, then do not change this record. If the language of cataloging is determined to be English and the summary notes in the 520 fields are in the vernacular with a transliteration, then you may correct the 520 field. Yes, OCLC does allow the practice of adding a quoted summary statement in CJK along with an English summary note in an English language record.
When you send duplicate requests using the OCLC form on the web, the request is sent directly to bibchange@oclc.org. Once it’s sent to Bibchange staff, it's placed into the duplicate workflow. In general, staff trained in each format process duplicate requests on a first in, first out basis. Be aware that there is currently a backlog in duplicate requests. Some formats have a bigger backlog than others, but your requests sent through the online form will be sent to bibchange@oclc.org and processed when staff are able to get to them. For more information on Bibchange Staff workflow, please see Processing change requests, the Virtual AskQC Office Hours presentation given on March 28, 2018.
When you derive a new record, you are creating a new record, so the holdings will not move. Holdings only move if records are merged together. It’s a judgment call whether to fix the record or just derive a new one.
You should determine the language of cataloging based on the intent of the cataloging agency that input the record, the language of the libraries who have attached holdings, and the language of the descriptive cataloging elements in the record. Deciding what language of cataloging should be used on a record needs to be determined on a case-by-case basis. It is worth looking at the hybrid record in WorldCat to see if you can resolve the problem since incorrect coding in field 040 subfield $b may lead to an incorrect merge via DDR. Also, you may have a case where considering the holdings will inform you on the best course of action. If a record was intended to be a French language of cataloging record but only 2 out of the 102 holdings are French libraries, while the other 100 libraries are English language libraries, then it may be that it should remain English and be corrected to reflect English language of cataloging instead. If you can do this, then feel free to modify the record as needed to correct the hybrid record, otherwise you may report the record to bibchange@oclc.org.
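The holdings-based consideration described above, such as the 2-of-102 French libraries example, can be sketched as a simple tally. This is one signal among several, and the sketch is illustrative only; the function name and threshold-free design are assumptions:

```python
from collections import Counter

def predominant_language(holdings_langs):
    """Given the cataloging languages of the libraries holding a
    record, return the predominant language and its share of holdings.
    This is only one signal for resolving a hybrid record, alongside
    the inputting agency's intent and the descriptive elements."""
    counts = Counter(holdings_langs)
    lang, n = counts.most_common(1)[0]
    return lang, n / len(holdings_langs)

# The example above: 100 English-language libraries, 2 French.
lang, share = predominant_language(["eng"] * 100 + ["fre"] * 2)
print(lang, round(share, 2))  # → eng 0.98
```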
If the names in a record are controlled to a non-English authority file, then the language of cataloging ought not to be English. Records cataloged with English language of cataloging are controlled to the NACO authority file. Records cataloged with Dutch language of cataloging are controlled to the Dutch language authority file. The same with German, French, etc. It’s unlikely that you will see an English language record controlled to an authority file other than the NACO authority file, because of how OCLC controls authorized access points. There may be cases where a record coded as English language of cataloging has access points with subfield $0s that link to a non-English authority file. If you encounter this, carefully review the record to determine which access points to retain and edit the record accordingly.
This is a great question and is something that we have been wondering what to do about. So far, we have the French Canadian authority file that we will be using to control access points in French language records. We do not yet have a French authority file from Europe. We are thinking about it and carefully considering what we will do, but in the short term there is no conflicting French authority file to consider.
While there may no longer be a credit incentive as there was in the past, encoding level K and I communicate to other catalogers the record's level of completeness. Updating the ELvl fixed field when upgrading the record will assist others in identifying what your intent was when cataloging it. Please feel free to upgrade minimal-level records to full-level records if you come across them.
Yes. You may delete local URLs from the WorldCat record. For more information on URLs in field 856, see URLs in a shared cataloging environment, the Virtual AskQC Office Hours presentation given on April 15, 2018.
Level M means that the record came in through Data Sync, it doesn't indicate the completeness of the record. We do work to clean these up as we see them. The current policy for vendors is to delete vendor contributed records that have existed in WorldCat for about 4 years and do not have library holdings attached. If you can identify one of these records by ISBN, you are welcome to upgrade it to match the item you have in hand. You may also report these to bibchange@oclc.org.
This is the new OCLC batch load system, and it stands for Data Synchronization. Libraries send us files of MARC records, and those files are taken in through the Data Sync system so that the records are either matched to existing records or added as new records in WorldCat in a batch mode.
Yes, feel free to delete these or report them to bibchange@oclc.org if you see a particular institution's local URLs being added to WorldCat records. Staff are continually working to remove these from WorldCat records and anticipate changes will be made behind the scenes this fiscal year so that fewer of them will get added to WorldCat.
Yes, we do encourage members to report multiple records for merging. Please send all duplicate requests to bibchange@oclc.org.
The Data Sync process does make use of validation; however, it's handled differently than online input. The reason for this is that when you catalog online and receive a validation error, you are able to fix the validation problem at that time, but when records arrive in files through Data Sync, they are run through validation and separated into significant errors and minor errors. The significant errors include a bad tag or an incorrect record structure. These records are set aside and not loaded into WorldCat. The minor errors include relationship errors, such as if you have one of these then you must have one of those. These records are loaded into WorldCat. If these records with minor errors were not added, the copy would not be made available for other libraries to use and the library's holdings would not be added either. We do realize that validation errors are a problem and have change instructions up front where we try to fix some of the errors that are coming in. Metadata Quality staff also have macros and tools to clean up the records as they are loaded. If you are seeing a pattern of a problem, please report it to askqc@oclc.org. We may be able to fix the one record you are reporting but also make the same change across the thousands of records that have the same problem.
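The two-tier handling described above can be sketched as a triage function. The error category names and the record shape here are illustrative assumptions, not OCLC's actual implementation:

```python
# Illustrative significant-error categories: records with these are
# set aside; anything else is treated as minor and still loaded.
SIGNIFICANT = {"bad_tag", "bad_structure"}

def triage(record):
    """Split a record's validation errors into significant errors
    (record set aside, not loaded) and minor errors (record loaded
    into WorldCat and cleaned up later)."""
    significant = [e for e in record["errors"] if e["kind"] in SIGNIFICANT]
    minor = [e for e in record["errors"] if e["kind"] not in SIGNIFICANT]
    decision = "set_aside" if significant else "load"
    return decision, significant, minor

decision, sig, minor = triage({"errors": [{"kind": "relationship"}]})
print(decision)  # → load
```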
No, if you are upgrading a WorldCat record to PCC, you are not responsible for checking the validity of the 856 fields. That being said, we encourage you to look at them and delete any URLs that are obviously local to an institution. You may not be able to check all of the URLs because your institution may not have access to the providers.
No, that is not true. In the past there was a system of credits, but the credits have been discontinued for quite a few years now. So, you are not going to receive a monetary credit for creating or enhancing bibliographic records. For guidelines on when to input a new record or use a record already in WorldCat, see BFAS Chapter 4, When to input a new record.
This is similar to the issue where more 856 fields than we would like are transferring to the WorldCat record. There is an effort underway to address the corrupted copyright symbol problem so that we don't transfer in these fields as often as we have in the past. The problem that we have with some of these corrupted diacritics is that they turn into a character that, while a valid Unicode character, is one that we do not want in the WorldCat record. They transfer in because it looks to the system like it’s a valid non-Latin script. We are currently working to resolve this problem. Once we get to the point where we are no longer transferring them to the extent that we currently do, then we will start the clean-up process in WorldCat. Unfortunately, as we have cleaned up these records, we have seen the same errors reappear on the record from a different library's Data Sync load on the same day. Because of this, we are focusing efforts on resolving the underlying source of the problem before going in and cleaning up the WorldCat records that are currently affected by this problem.
Yes, that is something we do check for in both DDR and manual merging. If you suspect that records have been incorrectly merged, please report them to bibchange@oclc.org and we will look into the records through OCLC's Journal History and if appropriate, we will recover the records.
This is good to know. Staff have gone through and fixed this spacing error before but we appreciate knowing that these are coming back. We'll work on cleaning these up again.
Most likely these are digital gateway records, which map Dublin Core data to MARC 21. This often results in a mixed materials type, so they would not be added as books. Because they are not necessarily constructed according to the same cataloging rules that we would use for other materials, we do not merge these records at this point. If there is a specific institution involved, please contact askqc@oclc.org so staff can look into the problem and see how significant it is.
The Lang fixed field contains the language of the resource itself. If the resource is bilingual then you would use both the Lang fixed field and field 041 to code the languages of the resource. For example, if the DVD of a German film included dialog in both German and French with subtitles in English and Spanish, the Lang fixed field would contain "ger" while the 041 field would contain all of the languages included in the appropriate subfields. For example:
Lang: ger
041 1 ger $a fre $j eng $j spa
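The Lang/041 relationship in the example can be sketched as follows. The field layout is simplified (every subfield delimiter is shown explicitly), and the helper name is an assumption; the use of $a for dialog languages and $j for subtitle languages follows the example above:

```python
def build_041(dialog_langs, subtitle_langs):
    """Assemble a simplified 041 display string: one $a per language
    of the dialog and one $j per subtitle language. The Lang fixed
    field takes the first (predominant) dialog language."""
    lang_fixed = dialog_langs[0]
    parts = [f"$a {code}" for code in dialog_langs]
    parts += [f"$j {code}" for code in subtitle_langs]
    return lang_fixed, "041 1  " + " ".join(parts)

# The German film with French dialog and English/Spanish subtitles:
lang, field = build_041(["ger", "fre"], ["eng", "spa"])
print(lang)   # → ger
print(field)  # → 041 1  $a ger $a fre $j eng $j spa
```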
The presentation given today was an overview of the topic. Please refer to the guidelines for more information and further examples. Metadata Quality staff are currently working to clarify the guidelines for Parallel language records, which is currently located in BFAS 3.10, Parallel Records for Language of Cataloging. These guidelines, once refined, most likely will be moved to BFAS Chapter 2. And, if you have additional questions, please send them to askqc@oclc.org.
Yes, you may send ISBNs as duplicate requests to bibchange@oclc.org.
If a code has been made obsolete by the Library of Congress, it should have a dash in front of it. If it doesn’t have a dash in front of it, it is still a valid code and can be used. If the thesaurus or reference document the code corresponds to has been superseded by a later edition or a more up-to-date list but is still a valid MARC code, you can continue to use the MARC code in the field. Occupation Term Source Codes still has the code dot listed for the Dictionary of occupational titles.
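The dash convention described above lends itself to a trivial check. This sketch is illustrative only; the function name is an assumption:

```python
def is_valid_marc_code(list_entry):
    """In LC source-code lists, an obsolete code is shown with a
    leading dash; an entry without the dash remains a valid code."""
    return not list_entry.startswith("-")

print(is_valid_marc_code("dot"))   # → True  (still valid, per the answer above)
print(is_valid_marc_code("-xyz"))  # → False (hypothetical obsolete entry)
```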
In general, DDR can’t catch everything, and we have designed it to be extremely careful and to err on the side of leaving a duplicate rather than incorrectly merging records that shouldn’t be merged. If you have record numbers that DDR has missed, do report them to bibchange@oclc.org. It may take us some time to get to these, but it does help us to get these reports as they can help us find other patterns of issues that we can address.
For the specific records reported, #1043516446 is a Library of Congress contributed record as indicated by the symbol DLC in the 040 subfield $c. #1061860203 is a member contributed record as indicated by the institution symbol TXN in the 040 subfield $c. This record was contributed on November 8, 2018, which was only 6 days ago. For records contributed through Connexion and Record Manager, there is a 10-day grace period before DDR evaluates them as potential duplicate records.
Field 040 subfield $a is used to record the original cataloging agency, not the transcribing agency which is recorded in subfield $c. For more information about field 040 and what the different subfields are used for, please see OCLC’s Bibliographic Formats and Standards, 040 Cataloging Source.
If you would put something in the subject line to alert us to a possible incorrect merge, that would help, as several hundred requests come into our inbox on a daily basis and we have to prioritize them. If you can give us some sort of indication that gets our attention, then once we see the report it usually takes only a matter of a few hours to recover the merge.
We have no preference; they all come to the same place, the inbox that QC staff work from. But based on what was just said about drawing attention to it, an email might be better since you could put something in the subject line.
That’s a question you should ask the Library of Congress. We check that page on a regular basis to see if it’s been updated. The Library of Congress has not updated the national requirements for almost a decade now; it would be nice if they would. Since then, we have basically made them up as we go along. We have tried to determine in our QC Policy meetings and in documentation meetings for BFAS what would be the most logical requirements for both full-level records and minimal-level records for everything that’s been implemented by OCLC, that is, all the bibliographic elements since 2010. So, we essentially create those requirements, which are in the Input Standards sections of each bibliographic BFAS page.
Yes, there is no reason not to.
This is a known issue that has been reported. It’s happening through some ingest processing where subject headings are being reordered. We have alerted the staff; I believe there is a problem report for it, and hopefully it will be worked on soon.
We are aware of it, it’s not an ideal situation. You are more than welcome if it fits into your workflow to reorder those appropriately. We are actively working to try and prevent that from happening, because as you said yes, it is unhelpful to both WorldCat local users and those who are downloading those MARC records.
We 100% agree with you. I don’t have a good answer to this other than that we are looking at it. The main factor involved is basically whether the record is being brought in through ingest.
That’s part of our development work. We have reports out to our development team, and those have to fit into all the other development processes and priorities that are going on. We’ve outlined the safeguards that need to go in, talked with, and communicated this with our development team. Now it’s just a matter of when that can be scheduled to be put in.
We have quite robust validation rules and processes that are continuously being updated, which is one of the reasons we like you to report any kinds of problems you run into with records: if we can find a pattern, we can add it to our validation rules to prevent those problems from happening in the first place.
As of the time of our implementation of the 647 field, you are allowed to add 647 fields to bibliographic records. Whether there is an LC authority record that corresponds to what would now be a 647 field is a different question and I do not know if there are any yet. Yes, the FAST headings are machine generated although there are institutions that create their own as well. Either manually or through a macro of some kind. You may add 647 FAST headings if they are appropriate, which do not necessarily have to refer to the LC authority file. You aren’t obligated to add them, but you may. The system will regenerate FAST headings once a month, so if you are changing LC subject headings in a record of any kind, you can allow the system to regenerate the FAST headings, you don’t have to remove the FAST headings or do anything with them and they will be taken care of within a few weeks in most cases.
Yes, absolutely a perfect question. Also, you don’t have to wait for one of these to come along since our next one won’t be until January. You can always email questions to askqc@oclc.org, but to answer your question…
You have the choice of cataloging that 4-book set as a set, in which case I would think that the set would have one classification number, and you could do as you wish locally with parts 1 through 4. It’s also legitimate to catalog each of those volumes separately if you wish, and of course that would allow you to also “classify” them as parts 1 through 4. That kind of issue would be a local determination.
Yes. If there are additional questions on that feel free to email askqc@oclc.org. A lot of times it is best for us to look at the actual records in question to give you a more definitive answer.
At the top of the WebDewey page there are a series of orange buttons, one is labeled Updates which may be the information you are looking for. Or, on the upper right-hand side of the page there is a “contact” button where you could ask this question. There is also the Dewey Blog where you may find additional helpful information.
This is a good question. Right now, we aren’t doing anything with validation related to provider-neutral coding, but it is something we could look into.
There are differences in [local] practices and whether you do each record separately or one record for the entire thing.
So we have a lot of variations in cataloging practice here that is not easily reconcilable, and I understand that it’s difficult for different communities when they go in and find 3 different records for the same thing.
Just last week the Library of Congress issued the new MARC21 Update number 27. If you subscribe to the Library of Congress MARC discussion list you will get all of the MARC update announcements, all of the technical notice announcements, and so on. You can subscribe to that via the Library of Congress MARC Standards page.
As far as OCLC is concerned, our OCLC MARC changes are announced via the OCLC-CAT discussion list, lists for specific communities such as the Music OCLC Users Group discussion list, the Online AV Catalogers discussion list, etc. And now there is a page that is on the OCLC website which has the WorldCat Validation Release Notes and Known Issues. This page is fairly new and there does not appear to be a way to subscribe to the page.
Yes.
Actually, they are run through a validation process, it’s just a different validation process that’s not as strict as online validation, because that would possibly prevent a lot of records from being processed through matching and getting added to WorldCat. There are some safeguards in place where really serious validation errors prevent a record from being fully indexed and then they have to be manually corrected in order to be indexed and available in WorldCat. So we have some safeguards in place that keep really corrupt records and records that aren’t structurally correct from being added to WorldCat, but some levels of validation errors that we deem to not be as serious do make it through the batch load process, so those records can be processed and then eventually they can be corrected through other means.
Your options would be to update the record so that it passes validation by deleting those fields, but we hope you aren’t losing data and those fields are really illegitimate.
By all means, if you have any errors you aren’t able to correct you can always send them to bibchange@oclc.org.
Note: In Connexion client, this is located in Options… under the Tools menu option, under the General tab. There is a selection for Validation Level Options.
FAST headings can be reported to fast@oclc.org. Because the x47 is a relatively new field we have not necessarily converted all of the x11s that should now be x47 fields. There are not x47 event heading fields in the LC Authority File, so we also have to wait for that if meeting headings haven’t yet been changed to event headings.
If I remember correctly, indexing of the field 022 subfield $l simply has not been implemented yet, but it is on the list of things that need to be changed within indexing. As mentioned in today’s presentation, the indexing schedule is usually much longer, sometimes long after the MARC update to which it corresponds.
We were able to bring it up, but it locked us up too. This would be something we would need to look into further, but it may be a situation where Record Manager has a better feel for it than Connexion.