
2025 AskQC office hour member Q&A

Review the most recent member questions from AskQC office hours.

March 2025: The mysteries of bibliographic encoding levels

11 March 2025

So, if a novel doesn't have subject headings, it won't be converted as full level, even if it had 655 fields?
That is correct based on the current logic. We could take a look at some examples to see if we would want to adjust the logic. However, you are welcome to upgrade any record that has been converted to a minimal level 7 into a full-level blank record.
Can you please show the slides for conversion logic: 7 and 3 again? I know you will post the slides and video in a day or two.
Looking back at slide 21, which covers the conversion logic for full-level blank records, we can talk through this again since it covered a lot. We are looking for production or publication information in the 245 $f/$g, 260/264, or 773. We then look to see that field 300 is present and whether there are any subject headings present in fields 600-651. We then look to see if any of the following data elements are present: 245 $b, 250 $b, 260/264 $a, 300 $b, 300 $c, 490. This is the logic to code the record as full level.

On slide 22, which covers the conversion logic for a minimal level 7 record, the logic is looking for everything listed on the full level slide except that there are no subject headings (fields 600-651) present. If this is true, then the record is downgraded to a minimal level record.

On slide 23, which covers the conversion logic for an abbreviated level 3 record, the logic will code the record as an abbreviated record if it doesn't meet the criteria for either the full or the minimal level record.
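The three slides above describe a simple field-presence check. As a rough illustration only, the stated rules could be sketched in Python like this (this is not OCLC's actual production code; the record structure, function name, and return values are assumptions for the example):

```python
def convert_encoding_level(record):
    """Return an encoding level based on the slide 21-23 logic.

    record: dict mapping a MARC tag (str) to a list of fields,
    where each field is a dict of subfield code -> value.
    """
    def has(tag, *subfields):
        # True if the tag is present and, when subfields are given,
        # at least one of those subfield codes appears in some field.
        for field in record.get(tag, []):
            if not subfields or any(code in field for code in subfields):
                return True
        return False

    # Production/publication information: 245 $f/$g, 260/264, or 773
    pub_info = has("245", "f", "g") or has("260") or has("264") or has("773")
    physical = has("300")
    # Subject headings anywhere in the 600-651 range
    subjects = any(has(str(tag)) for tag in range(600, 652))
    # Additional descriptive data elements
    extras = (has("245", "b") or has("250", "b") or
              has("260", "a") or has("264", "a") or
              has("300", "b") or has("300", "c") or has("490"))

    if pub_info and physical and extras:
        return " " if subjects else "7"  # full (blank) vs. minimal
    return "3"                            # abbreviated
```

So a record with publication data, a 300, descriptive extras, and at least one 6xx subject heading comes out full (blank); drop the subject headings and it becomes minimal (7); anything failing the basic checks becomes abbreviated (3).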
Are there any plans to convert encoding level 4, and if not, are we able to upgrade them to encoding level Blank?
We do not have any plans to convert encoding level 4 because this is a standard MARC code. We were only converting the ones that were coded with alphabetic OCLC codes. If you are working with one of the encoding level 4 records and want to upgrade it to full level, you are welcome to do so.
How exactly does the conversion process work? Is it just an algorithm that reads the fields present and assigns the ELvl based on that?
Yes, that is basically how it works. We look at the content of the record first to establish whether it should be coded as full, and then, if elements required for full level are missing, we downgrade it to the appropriate encoding level.
If we are upgrading the encoding level to blank, can / should we remove the BATCHLOAD field, the 936?
You are welcome to if you wish, but there is no need to remove it. Many of the encoding level M records are converted to blank and "BATCHLOAD" remains there. It's just an indication of how the record came to OCLC. We are including that in our algorithm because it was something that was requested in the focus groups. So, it is up to you whether you want to retain the field or remove it when upgrading a record.
Is there a way to filter by encoding level when searching in Record Manager? Any plans to add that functionality?
There is a search index you can use that is "lv:" to search for encoding level. For example, if you wanted to search for blank, you would use the search "lv:b". If you wanted to search for any of the other codes, you would just use those codes instead, e.g. "lv:7".
Is a Library of Congress call number required for full/blank level records? I've seen conflicting answers to this.
No, it is not required. We certainly like to have call numbers assigned but it doesn't have to be an LC call number. The presence or absence of a call number does not factor into the algorithm that we are using to convert the record.
Are 6xx indicators considered for full level?
No, they are not. We are just looking at the tag, so anything that falls within the 600 to 651 range is considered a positive criterion for full level. Indicators are not evaluated.
Does OCLC have statistics on the rate of member upgrading of encoding levels?
No, we don't but that would be a really interesting thing for us to take a look at.
Do fixed fields like AccM (008/24-29) not affect encoding level? (Even though the accompanying 504 does)
The 504 doesn't affect the level of cataloging, but Cynthia had shown that as an example of the input standards. Many records would not have a 504 field because the resource does not have a bibliography, so we cannot use this in our algorithm to convert encoding levels in records.
I noticed that the abbreviated record slide doesn't have a 338 field. Is it true that the RDA fields (336, 337, 338) do not have an impact on the level of the records?
That is correct. In fact, I think when we convert the records, we are sometimes adding those 336, 337, and 338 fields to the records as we are doing the conversion.
A chapter in the BFAS was mentioned for what fields are required for an abbreviated record, is there a similar list for what is required to be able to upgrade to minimal/full?
There isn't a list in one spot that I am aware of, but it is available field by field in BFAS, so you can look and see whether a 250 is required if applicable, optional, or whatever it says there. So, you would look at each field to determine if the field is required or optional for the specific encoding level. In BFAS 2.4, there is some information about what is expected in full, abbreviated, and minimal-level records, but it is not field by field.
If we update to full level, should we leave the 588 field with words stating material not examined...ǂ5 TnLvILS
I would recommend removing that if you are inspecting the resource in hand and upgrading the record, because that 588 no longer applies to that record.
If we are editing a record coded as M for another reason, should we edit the encoding level? Or leave it to the eventual OCLC conversion process?
We recommend upgrading the encoding level based on the changes you are making. So, if it meets the full level standards, then by all means, please change the encoding level to full.
Is it preferred that we upgrade a level 3 record instead of creating a new one?
Yes, please upgrade the record. Don't create a new one. Duplicate records in WorldCat are one of the biggest problems that people complain about, so if you add to the problem by creating a duplicate record, you are not doing your fellow OCLC cataloging members any favors. So, please upgrade the level 3 record. More information about when to input a new record can be found in BFAS Chapter 4.
Are there any plans to remove the 375 field from name authority records?
Yes, we have very much wanted to do that project and started it a few years ago; however, the name authority file is the Library of Congress/NACO name authority file (LCNAF), not OCLC's file. OCLC has a copy of this file, which is synced with LC's official copy daily, but it is their file, and they had some system issues and asked us to stop. Then the British Library, which is also a NACO node and syncs with LC's file daily, had their system hacked back in 2023. So, we have been asked not to load huge amounts of bulk updates because it affects the quantity of the daily exchange of records between LC and the various NACO nodes, of which OCLC is one. The quantity can be really problematic if it gets too large, so that is why we are not doing bulk changes to remove the 375 field from name authority records. If you are editing an existing authority record that has a 375 field, feel free to remove it; even if there is no other reason to edit the record, as long as you have it open and are viewing it, you are free to remove field 375 if you wish. The normal updating of authority records is fine; it's just the bulk editing that has caused problems for the NACO nodes.
Why is it that some records don't allow the encoding level to be updated? I have dealt with this a few times, and it is frustrating when the records are incorrect.
I think mainly those are the encoding level 8 records from the Library of Congress or PCC members, where you are not allowed to change the encoding level. You may update the records; you just are not allowed to change the encoding level value. The reason for this is that any record coded as PCC, which all of the LC CIP encoding level 8 records are, requires that all the access points be backed up with an authority record. If you are not a PCC member and you are adding a field that doesn't have an authority record, you aren't able to create the authority record to back up the access point. You are welcome to send a message to bibchange@oclc.org if there is a record that you're not able to correct or modify, and we can take care of that for you. BFAS 5.2, Member Capabilities, lists what you can and cannot change based on the record and your authorization level. We are happy to make any changes for you that you are not able to make.
I'm seeing 600 10 headings with headings linked to other scripts. For example, the name in Arabic script and the romanized form. Since the authorized form is the Romanized form, is this correct?
Yes, if the library wants to enter the script form in an 880 field, that is perfectly acceptable. I know it looks like another 600 field that's linked to it in our displays, but it's really an 880 behind the scenes.
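For context, here is a hypothetical illustration of how such a pair is actually stored in the record (the name and occurrence number are invented for the example): subfield $6 links the romanized 600 field to its 880 counterpart carrying the original script.

```
600 10 $6 880-01 $a Maḥfūẓ, Najīb, $d 1911-2006
880 10 $6 600-01/(3/r $a نجيب محفوظ
```

In displays, the 880 is often rendered as if it were a second 600, but the $6 linkage shows it is the paired alternate-script representation.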
Where are the new 653 fields coming from? I see them more and more lately.
Field 653 is a valid field for a subject term in an alternate, uncontrolled form. Terms that are not LCSH or headings from another subject scheme can be added to the record there. We understand that they can be annoying if they duplicate other 6xx headings, and you are welcome to delete those duplicates.
When you upgrade a 3 level or M level record, is it considered by OCLC that the upgrading library is creating an original record? Often these records are so minimal that it's almost the same as an original.
I don't think we would define it as an original for WorldCat purposes; it's still the same record. However, if you are keeping statistics for your own use in your own library, and that's how you define creating an original record, that is fine. It's really up to you how you define it. Your OCLC statistics, though, will not show that as original; it will show as an upgrade or update of the record.
In an enhanced contents field that contains ONLY titles, what is the logic of putting every individual title in its own subfield $t?
This is a question of indexing. When you have a subfield $t, it allows additional indexing of the titles that are in the 505 field. This is obviously somewhat dependent on which system you are using.
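As an illustration (with invented contents), compare a basic contents note with an enhanced one. In the enhanced form, the second indicator is 0 and each title sits in its own subfield $t, which is what allows title indexing:

```
505 0  $a Introduction -- Glossary -- Index.
505 00 $t Introduction -- $t Glossary -- $t Index.
```

The first line is a basic 505 where everything is in a single $a; the second is the enhanced version of the same note.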
It's frustrating to have to type all of the fields in a level 3 record when there was a previous edition that we can derive a record from and save time doing that.
Yes, I believe there is a macro you can use to derive, but you wouldn't add a new record. The purpose of these level 3 records is sometimes acquisitions. You are welcome to copy and paste the relevant fields from the previous edition into the level 3 record when upgrading.
Is the 245 $h field that is labeled electronic resource going to be eliminated, since it is listed next to the title proper and is redundant?
Subfield $h is not used in current cataloging, but we know we get a lot of records through our batch processes that have a subfield $h. When we run our macros or various scripts against WorldCat records that have the subfield $h, it will be removed. Because we have encoding elsewhere in the record to indicate whether something is an electronic resource, if you see a subfield $h in a record you are editing, you are welcome to delete it.

Subfield $h is still a valid subfield in the MARC format, so you will sometimes see it in records coming from libraries' local catalogs and in older records. If you are updating a record to RDA that has a 245 field with subfield $h, then please feel free to delete it and provide that information in the corresponding 33x fields.
Please explain how to "pin" a record to be able to see two side-by-side!
While a record is open in Connexion, go to the View menu and click on Pinned. You can also use the keystroke Shift+F4. You can also set a button on your toolbar to pin a record. Go to the Tools menu and click on Toolbar Editor. Scroll down the list until you find "ViewPinned", click and drag this to your toolbar. When you want to pin a record, now you can just click on the button on your toolbar. Any of these options will pin the record you are currently on. You can pin more than one record at a time. Also, if you pin a record and then pin it again, the pin will disappear from that record.

With a record already pinned, you can open another record to view at the same time. To view them side by side when both records are open, go to the Window menu and choose either Tile horizontally or Tile vertically. See How can I pin Bibliographic record windows side by side for more information.
Is there a macro that will "unenhance" the 505? I find that often the chapter titles are extremely generic and not helpful to include in the title index.
We do not have a macro that would change the subfielding in the 505 field. We also ask that you do not remove the subfielding if it already exists in the WorldCat record. We understand that when the title is "Glossary", that doesn't seem terribly useful to you but maybe some of the others in the 505 field are. A member library made the decision to enhance the contents note so please respect that decision. You are welcome to unenhance these contents notes in your local catalog if you wish. The subfield $t provides indexing (ti:) for the title portion of the contents notes but 505 subfield $a does not. For more information about the indexing for field 505, see Fields 500-511 and Fields 518-586.
Is there a way to pin in WorldShare? I know you can "compare" or prepare a record to print to have it open in a second browser tab. I'm not aware of how to view two records in a single window and still be able to work on the bib.
None of us in the room regularly use Record Manager; we are all primarily using Connexion, so we don't have an answer for you. Here is the information about how to use the "Compare" feature that displays two records side by side. If you have an enhancement suggestion for Record Manager, you are welcome to suggest it in OCLC's WorldShare Record Manager Community Center.
Will OCLC at any point start refusing the uploading of records from sites not following current cataloging standards? The 245 $h is just one of many examples of outdated practices constantly coming back into the database.
OCLC is a global cooperative. We have libraries coming in from all over the world. Some of those use current practices, but you are right that outdated practices come back into the database because some libraries come to us as new members or want to load their old records. So, we get a lot of records for things that aren't the latest, and they load them into WorldCat. Unless someone could convince each library to spend a lot of time cleaning up its database before sending it to us to load, we don't know how to prevent this. We do try to clean these records up through our tools once they are added to WorldCat, though. So, if you see any patterns we should try to address, please send us a message at bibchange@oclc.org and we will do what we can to take care of it.
If we are a WMS library and want to have a local record with changes to a 505 for instance, how would we go about that? So that it will show a local 505? Do we need to add a LBD?
Field 505 is not supported in LBDs, so you would have to adapt another field for that. However, you could also send an enhancement request to OCLC's WorldShare Record Manager Community Center asking to add field 505 to LBDs.

20 March 2025

If a level M record has 264/300 and non-LCSH subject headings, such as FAST, BNE, or MeSH, is it converted to level 7 or blank?
The record will be converted to encoding level (ELvl) blank.
So full level refers to what is required in each field, not which fields are required?
Actually, it’s both. BFAS lists whether the field is required or not. Once you know that, you will know which fields are required for the whole record.
If we edit these converted records, should we remove the 936 BATCHLOAD field or leave it as is?
That depends on the changes you are making to the record. If you are upgrading the description to fully catalog the record, you can remove it. If you are just making a few corrections, you should leave the 936 field with BATCHLOAD in the record. See Bibliographic Formats and Standards, Field 936 for more information. It contains information about when you can remove the 936 field with BATCHLOAD.
Does OCLC add any new data when converting a record?
No, we do not, other than the 936 field. For the standard encoding level conversion, we don't add any other fields; however, if a record is run through our macros or scripts along with the encoding level change, you will sometimes see more fields added, such as the code in field 043.
The slide for criteria on converting M to blank mentioned the existence of 245 $b. Yet the example record shown did not have 245 $b. Did I misread the slide?
This is just one of the data elements that we are looking for when we are applying the criteria. It does not need to be present, but if it is present, it counts toward the criteria for coding the record as full level.
With the specification for this conversion developed, are there any possibilities of applying this routine to review existing encoding levels in other records?
At this time, we haven't considered that as a project, but we could evaluate it to see if it would be useful. If you have examples to share with us, please send them to askqc@oclc.org and we could look into doing something like this. The conversion process is automated, and we wouldn't want to apply this across the board because there are a lot of records we would consider full level even without subject headings, for example works of fiction or records for Bibles. This would require additional consideration of the logic so that these records would be accepted as full level.
When all the M level records get converted, will OCLC no longer code batchloaded records as M, will those records be converted to the appropriate level on loading?
We do have plans to change the process so that records coming through data sync would receive the appropriate level based on the logic we are using, plus additional logic we need to finesse, but we don't know exactly when that process is going to change. We are hoping that work will be done sometime this year, but we don't have a firm date.
If a record only has one subject heading, is it still a full record? I remember being taught before that one subject heading was a minimal level record.
Yes, according to our logic, one is enough.
We receive updated records through WorldCat Metadata Services for ebooks. Our settings indicate that we should get updated records only when an Encoding Level changes, but I have seen many examples where the Encoding Level does not appear to have changed when compared to the existing record in our database, but we got an updated record anyway. Is there an explanation for this behavior?
We recommend posing that question to our colleagues who work with the Collection Manager service by sending your question to support@oclc.org. We do not work with that service so are unable to answer your question.
I have been seeing records with 653 tags with what appear to be keywords, mostly in all caps. Do you know where these are coming from?
We believe they are transferring in from a data sync project. They appear to be legitimate 653 subject headings, although the all-caps form is probably a bit annoying. Feel free to edit those 653 fields if you have the opportunity or desire to do so.
Can you stop them from appearing in the first place?
We cannot stop them from appearing. A lot of people have said they are annoying. You may delete them if you are editing WorldCat records, but these are not incorrect.
LC has approved the name change headings for Gulf of Mexico and Denali, have those been updated in OCLC, if not, what is the timeline for that?
We received the changed authority records for the Library of Congress Subject Headings (LCSH) on Tuesday and we started receiving bibliographic records with the changed headings from the Library of Congress today so those are in process. We are not sure what the timeline exactly will be, but some of the records will change because of our automated controlling software and some will need more manual intervention and we'll be controlling those by a more semi-automated process.
Will the changed LC headings for Gulf of Mexico and Denali be reflected in FAST?
Yes, but not right away. It takes us about 6-8 weeks to make these changes from LCSH into FAST so it will be a while.
I have recently had a group of records batch loaded into OCLC but they are marked as minimal encoding, when I know they have all the required fields for full. Why is this happening?
If they were added as encoding level M, that doesn't necessarily mean they are minimal level, if that is what you are referring to. It means they were added through a batch load. Most projects that come through data sync go through a flow where they receive the encoding level M. If you are talking about something else, let us know. We will be changing this process in the future.
Okay Cynthia, I was going to ask that question, however, since PoCo has not yet discussed this topic, is it appropriate to change before that discussion?
Adolfo is referring to the PCC Policy Committee (PoCo) and certainly LCSH are Library of Congress's subject headings, not the PCC's subject headings. A lot of thought went into this but yes, it is appropriate to change before any discussions take place. Alternative subject headings are fine to use and to add to the records. For example, the Getty Thesaurus of Geographic Names uses Gulf of Mexico or Mexico, Gulf of so you may want to use that instead but those are not our decisions here at OCLC.
Occasionally we update a portion of a title (say, to correct a misspelling), but we are not authorized to upgrade the record. The next day, I think, data sync has caused a duplicate record to be added to WorldCat with our library code. What can we do differently?
We recommend reaching out to the data analyst assigned to your data sync project. It might be a setting in the profile that they need to change, or they can work with you to ensure this doesn't happen. We've had this happen before, and they can work with you to ensure that the record is overlaid and that your changes get into WorldCat without adding a duplicate record.
I recently attended a session at Code4Lib that discussed adding ISO 639-3 codes to better represent and reflect indigenous languages (MARC language codes often group languages together). Is there any effort to incorporate this in the future in the 041?
We have talked about doing a project to go back and insert those codes, but you are welcome to add them yourself. That is a modification you can easily make, and that coding is something you can do now. We don't have to do any special validation; it's already acceptable in WorldCat.

February 2025: Introduction to fields used in continuing resource records

11 February 2025

If the beginning and ending date of publication of a serial differs from the beginning or ending chronological designation for the serial, do the Dates fields reflect the chronological designation rather than the publication dates?
Yes, that’s correct. The OCLC BFAS page on Dates gives more information.
On slide 44, why is the first indicator in the 245 "1" for the electronic resource?
In the examples, certain fields were left out to make sure all the information fits on the slide side-by-side, making it easier to compare what is the same and different between the two types of records. So not all information in the sample record is present on the slides.
Do you need both a 245 and a 222 if the titles are the same?
You would need a key title (222) when that is assigned by the ISSN agency that assigned the key title. If you were just cataloging a serial and it didn’t have an ISSN key title assigned to it, you wouldn’t have one.
Would every record have a 210 (abbreviated title), 222 (key title), and 245 (title proper) field?
You’ll always have the 245. The others would depend on the type of cataloging you’re doing and what has been provided by an ISSN center relating to that title. So not necessarily.
Would you need to include all the 5xx fields?
No, you would just use the 5xx fields that are applicable to the type of continuing resource you are cataloging. The different situations that might influence the inclusion of a note would just depend on what you’re cataloging.
Should you not use a 580 field if you're using a linking field?
The 580 field is more of a note to explain complex situations. So you would have a 580 if, say, you had two serials that were merged into a single serial: you would have your two 785 fields, coded to show there was a merger, but you might also want to include a 580 field to explain that this title and this title were merged to form this new title. It’s more of a free-text field. You don’t exclude it just because you have linking fields, but you also don’t have to include it if you feel your linking fields adequately explain the relationship.
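As a hypothetical illustration (all titles invented), the record for one of the merging titles might carry a pairing like this, where the 785 second indicator 7 means "merged with ... to form" and the 580 spells the relationship out in free text:

```
580    $a Merged with: Journal of applied widgets, to form: Widget studies quarterly.
785 17 $t Journal of applied widgets
785 17 $t Widget studies quarterly
```

The 785 first indicator 1 signals that no note should be generated from the linking fields, since the 580 already provides the display note.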
For continuously printed books in the library catalog, the material type is often listed as 'journal' or 'magazine,' which can be confusing. How can we change it to 'print books'? Can we globally update these records to reflect 'print books'?
This is a very interesting question that we have not seen before. We would recommend sending us some OCLC numbers at AskQC@oclc.org so we can review the bib records, compare them with how they’re being displayed—it may be only an issue in Discovery, but we would need the OCNs to investigate further.
Could you explain a little bit more about what you meant when you said that "linking fields do not take the place of authorized entry fields?"
The linking entry fields link to bibliographic records; the authorized access point fields link to the authority field. So, the title of the serial that’s used in the linking entry field might not be the same as a title that is used for an authorized access point. It could include both. But just remember they’re linking to different types of records. So, you’d have a bib record you’ve created, you have names, you have corporate names, these are things you would want to put into authorized access point fields. But you may also have linking fields. We just wouldn’t want to see a record that has a title and a bunch of linking fields, but where the record hasn’t actually been fleshed out with authorized access points or applicable entry fields.
If we have other questions related to this presentation, is it possible to get in touch?
If you have any questions about today’s topic, or questions about cataloging policy in general, you may send those at any time to AskQC@oclc.org.

20 February 2025

Does OCLC have plans to allow the use of the 023 field for Cluster ISSNs in Connexion, instead of 022 $l, since 022 $l has been obsolete since 2023 in MARC?
We are waiting for the Library of Congress to implement that in their system before we go ahead and add that to our validation and to the MARC in our database. Once all the ducks are in a row, then yes, we will be implementing field 023.
Do you have any updates on how implementation planning is progressing?
Right now, we’re really just waiting on the Library of Congress to let us know when we can proceed. There is some more information on the BFAS page here: https://www.oclc.org/bibformats/en/about/introduction.html#section1.1.
Can you explain what a linking ISSN is?
A linking ISSN (ISSN-L) is one that the national ISSN center has assigned as a link to group different versions of the same serial publication. Sometimes it’s the print, sometimes it’s the online version, but basically they’ll take one of those ISSNs and use it as the link between the print and the online. It just links the different versions together so a serial title can be identified regardless of format. More information is available from the ISSN Centre here: https://www.issn.org/understanding-the-issn/assignment-rules/the-issn-l-for-publications-on-multiple-media/
For unknown frequency, is the 310 field omitted?
Yes, you do not have to input a 310 field if you don’t know the frequency.
Can a 580 note be used alone, or must there always be a 780 and or 785 field?
I believe a 580 can be used alone. They’re generally used for explaining those complexities in the 780 and 785, but a 580 does not require any corresponding field.
I have some "serial" monographs (Contemporary World Issues from Bloomsbury for example) that do not have an ISSN. Are those not considered serials?
It may be that they haven’t had an ISSN assigned by the International Centre just yet, but the definition of a serial still stands. So if the publication has no predetermined conclusion, has numbering, and otherwise meets the definition of a serial, you would still catalog it as a serial. If it doesn’t meet that definition, then you would want to catalog it as a monograph.
In S/L 0 records, is the 264 first indicator 3 used when publishers change? Is that the recommended practice by CONSER? I think I saw multiple publishers listed in a 5xx field.
I believe that the 5xx field was used under previous cataloging rules. The 260 was not repeatable under earlier rules, and you will see a mix of older and newer rules in the catalog, so you may see 500 fields listing multiple publishers as was done under earlier rules. Generally speaking, now that the 260 and 264 are repeatable, it is preferred to record multiple publishers in those repeated fields with the first indicator correctly coded.
How are serials accounted for with the 335 extension plan (or should we just omit if we don't know better)? Would we say 334 ## $a multiple unit $2 rdami, and then how would 334 be cataloged?
You definitely want to make sure which version of RDA you should be using. I know the Program for Cooperative Cataloging, for example, has not implemented Official RDA yet, since extension plan is an element that’s part of Official RDA. I can tell you that LC PCC practice for this element right now is not to record it, so that would be what most catalogers would be doing if they were using Official RDA.