Planet Cataloging

August 30, 2014

Books and Library stuff


You Who Never Arrived

You who never arrived
in my arms, Beloved, who were lost
from the start,
I don’t even know what songs
would please you. I have given up trying
to recognize you in the surging wave of
the next moment. All the immense
images in me — the far-off, deeply-felt
landscape, cities, towers, and bridges, and
unsuspected turns in the path,
and those powerful lands that were once
pulsing with the life of the gods–
all rise within me to mean
you, who forever elude me.

You, Beloved, who are all
the gardens I have ever gazed at,
longing. An open window
in a country house–, and you almost
stepped out, pensive, to meet me.
Streets that I chanced upon,–
you had just walked down them and vanished.
And sometimes, in a shop, the mirrors
were still dizzy with your presence and,
startled, gave back my too-sudden image.
Who knows? Perhaps the same
bird echoed through both of us
yesterday, separate, in the evening…

Rainer Maria Rilke


by venessa harris at August 30, 2014 07:32 AM

August 29, 2014

First Thus

RDA-L RE: Re: When a new edition is treated as a new work

Posting to RDA-L

On 8/26/2014 3:54 PM, Benjamin A Abrahamse wrote:

While guidelines such as the ones LC has released are certainly helpful, I don’t think we will, or should, see RDA give a simple formulation that  “edition = {work|expression|manifestation}”. The word “edition” on a piece can mean many things, and it is the job of the cataloger to determine how it should be treated.

Ultimately, the question comes down to something very simple: do I add this particular item I am cataloging to this record, to that record, or do I make a new one? This is a matter of *definition* and has been interpreted by different agencies at different times in different ways. For instance, the venerable LCRI 1.0 seemed to be pretty clear: a new record is made when
“anything in the following areas or elements of areas differs from one bibliographic record to another: title and statement of responsibility area, edition area, the extent statement of the physical description area, and series area.”

Of course, in turn there were various interpretations of the meaning of the word “anything”. I always interpreted it quite literally: anything, but others interpreted it in their own ways. I remember when LC changed the rule about plates, which in turn made a difference to rule 1.0.

Then there is ALA’s “Differences Between, Changes Within” (p. 6; PDF p. 12), section A5a:
“A different extent of item, including the specific material designation, indicating a significant difference in extent or in the nature of the resource is MAJOR. Minor variations due to bracketed or estimated information are MINOR. Variation or presence vs. absence of preliminary paging is MINOR. Use of an equivalent conventional term vs. a specific material designation is MINOR. For example:

  • 351 p. vs. 353 p. is MINOR
  • 452 p. vs. x, 452 p. is MINOR”

which is quite different from the LCRI. What does “significant” mean?

Yet one more very important consideration in this brave new world of linked data, where everything is supposed to work with everything else, is: how do other, non-library agencies deal with this issue of “edition/manifestation”? After all, shouldn’t we at least consider what these other organizations are doing, or do we just ignore them?

With books: Rare book cataloging (in libraries) may look at a specific book in a completely different way from regular cataloging. Antiquarian book dealers look at it differently from either. Publishers also have their own needs and look at things quite differently from all the others.

Museums look at things even more differently. For instance, consider a teapot. There is nothing really special about the teapot itself, who made the teapot or where or when; there may be hundreds or thousands of similar teapots. But this particular teapot was used by Bonnie Prince Charlie during the Jacobite uprising, which makes it special.

I have cataloged different materials in diverse ways for different purposes using various rules. One thing I have learned is that before you begin cataloging, you must orient yourself: I am an AACR2 cataloger; I am an indexer using AGRIS rules; I am a rare book cataloger, etc. The item then looks quite different as you (re)orient yourself.

I was hoping that changes in cataloging would allow these kinds of traditional knotty questions to simply disappear. Today, it is very possible to catalog the thing in hand (that is, just copy what you see) and then let today’s very powerful software process your record as it will. In that way, the traditional question:

do I add this particular item I am cataloging to this record, to that record, or do I make a new one?

could change into one of how the software processes it: software for rare book purposes would process it one way; software for publishers’ needs would process it in another; software for library catalogs following LCRI 1.0 in another, and for catalogs following ALA practices in yet another; software could follow other practices used in libraries around the world, and so on. I think this will be the reality no matter what we want as we enter the worlds of linked and open data, as our cataloging information is taken, reformatted and repurposed, sliced and diced; and it is anybody’s guess what else will happen to it.

That is the world we need to consider, not our lonely, shrinking world of library catalogs, if nothing else because it is the world we are striving for (linked data). Our records must fit into the greater whole.

There are many options today that can make our lives (and maybe everyone else’s) much easier.


by James Weinheimer at August 29, 2014 08:48 AM

August 28, 2014

Bibliographic Wilderness

UIUC and Academic Freedom

Professor Steven Salaita was offered a job at the University of Illinois in Urbana-Champaign (UIUC), as associate professor of American Indian Studies, in October 2013. He resigned his previous position at Virginia Tech, and his partner also made arrangements to move with him. 

On August 1 2014, less than a month before classes were to begin, the UIUC Chancellor rescinded the offer, due to angry posts he had made on Twitter about Israel’s attack on Gaza. 

This situation seems to me to be a pretty clear assault on academic freedom. I don’t think UIUC or its chancellor disputes these basic facts. Chancellor Wise’s letter and the Board of Trustees’ statement of support for the Chancellor claim that “The decision regarding Prof. Salaita was not influenced in any way by his positions on the conflict in the Middle East nor his criticism of Israel.” They are somewhat less direct about the grounds on which ‘the decision’ was made, but imply that Salaita’s tweets constituted “personal and disrespectful words or actions that demean and abuse either viewpoints themselves or those who express them,” and that this is good cause to rescind a job offer (that is, effectively fire a professor). (Incidentally, Salaita has a proven history of excellence in classroom instruction, including respect for diverse student opinions.)

[I have questions about what constitutes "demeaning and abusing viewpoints themselves", and generally thought that "demeaning viewpoints themselves", although never one's academic peers personally, was a standard and accepted part of scholarly discourse. But anyway.]

I’ve looked through Salaita’s tweets, and am not actually sure which ones are supposed to be the ones justifying effective dismissal.   I’m not sure Chancellor Wise or the trustees are either.  The website Inside Higher Ed made an open records request and received emails indicating that pressure from U of I funders motivated the decision — there are emails from major donors and university development (fund-raising) administrators pressuring the Chancellor to get rid of Salaita. 

This raises academic freedom issues not only in relation to firing a professor because of his political beliefs; but also issues of faculty governance and autonomy, when an administrator rescinds a job offer enthusiastically made by an academic department because of pressure from funders. 

I’ve made no secret of my support for Palestinian human rights, and for an end to the Israeli occupation and apartheid system.  However, I stop to consider whether I would have the same reaction if a hypothetical professor had made the same sorts of tweets about the Ukraine/Russia conflict (partisan to either side), or had tweeted anti-Palestinian content about Gaza instead. I am confident I would be just as alarmed about an assault on academic freedom. However, the fact that it’s hard to imagine funders exerting concerted pressure because of a professor’s opinions on Ukraine, or a professor’s anti-Palestinian opinions, is telling about the political context here, and I think indicates that this really is about Salaita’s “positions on the conflict in the Middle East and his criticism of Israel.”

So lots of academics are upset about this. So many that I suspected, when this story first developed, that UIUC would clearly have to back down; instead, they dug in further. The American Association of University Professors (AAUP) has expressed serious concern about violations of Salaita’s academic freedom, and of the academic freedom of the faculty members who selected him for hire. The AAUP also notes that it has “long objected to using criteria of civility and collegiality in faculty evaluation,” in part precisely because of how easy it is to use those criteria as cover for suppression of political dissent.

The Chronicle of Higher Ed, in a good article covering the controversy, reports that “Thousands of scholars in a variety of disciplines signed petitions pledging to avoid the campus unless it reversed its decision to rescind the job offer,” and some have already carried through on their pledge of boycott. These include David J. Blacker, director of the Legal Studies Program and a professor of Philosophy at the University of Delaware, who cancelled an appearance in a prestigious lecture series. The UIUC Education Justice Project cancelled a conference due to the boycott. The executive council of the Modern Language Association has sent a letter to UIUC urging them to reconsider.

This isn’t a partisan issue. Instead, it’s illustrative of the increasingly corporatized academy, where administrative decisions made in deference to donor preferences or objections take precedence over academic freedom and faculty decisions about their own departmental hiring and other scholarly matters.  Also, the way the university was willing to rescind a job offer over political speech after Salaita had resigned his previous position reminds us of the general precarity of junior faculty careers, and of the lack of respect and dignity faculty receive from university administration.

A variety of disciplinary-specific open letters and boycott pledges have been started in support of Salaita.

I think librarians have a special professional responsibility to stand up for academic freedom.  

Dr. Sarah T. Roberts, a UIUC LIS alumna and professor of Media Studies at Western University in Ontario, hosts a pledge in support of Salaita from LIS practitioners, students and scholars, with a boycott pledge to “not engage with the University of Illinois at Urbana-Champaign, including visiting the campus, providing workshops, attending conferences, delivering talks or lectures, offering services, or co-sponsoring events of any kind.”

I’ve signed the letter, and I encourage you to consider doing so as well; I see at least one other signer from the Code4Lib community already. I think it is important for librarians to take action to stand up for academic freedom.

Filed under: Uncategorized

by jrochkind at August 28, 2014 04:27 AM

August 27, 2014

First Thus

The Library Herald

I would like to announce that I have created a Library News website that I call “The Library Herald.”

I have tried to add most of the main topics in librarianship, but what I know best are “Technology” and “Cataloging” so they are the best covered at the moment. To find the sources I am using, you can see them under each topic, e.g. Acquisitions has Acquisitions Sources, and so on. Some of these are rather sparse.

I have added the Google Translate widget, plus if you look at the RSS feeds, under Subscribe & More, you can see “Feed RSS”, where there are feeds in several languages. I did this as an experiment. You can see the Russian translation of the Technology RSS Feed. It doesn’t look as if the feeds are complete, but oh well….

I based this site on an older version that I have used for several years now, and I just wanted to try something new to help me learn WordPress.

Please feel free to share this with all and sundry who may be interested. And of course, if you have any suggestions for other sites or any other suggestions and comments, please let me know.


by James Weinheimer at August 27, 2014 11:01 AM

August 26, 2014

025.431: The Dewey blog

WebDewey Number Building Tool: Music, Part 1

Note: The general approach to building numbers described here can be applied in any discipline, not just music.

Are you having problems using the WebDewey number building tool with the add tables in the 780 Music schedule?  

If so, let’s try an example: What to Listen For in Rock: A Stylistic Analysis, to which the LCSH "Rock music--Analysis, appreciation" has been assigned.

Here is a summary of the instructions for using the WebDewey number building tool to build the DDC number 781.66117, the number for Rock music--artistic principles.  (The format of the summary is modeled on the tables used in the WebDewey training modules for the WebDewey number building tool.)

Navigate to this number / span | Number built so far | Caption of last number / notation added
781.66                         | 781.66              | Rock (Rock 'n' roll)
781.63-781.69:11-15            | 781.661             | General principles
781.17                         | 781.66117           | Artistic principles

Does that answer all your questions about how to build the number?  If not, keep reading for details.

First you need to find the record with the base number that you will use and the record with the add note that you will use.  Often—as in this case—the base number and the add note are in the same record.

A quick way to find the number for rock music is to browse the Relative Index for "rock music" and select this entry:

Rock music 781.66

Click the number to see the full record.  In the title bar of the Hierarchy window and in the middle of the hierarchy display, you will see 781.66 †Rock (Rock 'n' roll):


The base number that you will use is 781.66. In the Notes box, you will see the add note (in the form of a footnote marked with a dagger): †Add as instructed under 781.63-781.69


At this point, the Create built number box has no number in the title bar.  Inside the box appear only the number and caption 781.66 †Rock (Rock 'n' roll) plus a Start button:


Click Start in the Create built number box.  The Create built number box changes to show in the title bar that the number built so far is 781.66.  And—important!—the add note appears inside the Create built number box (†Add as instructed under 781.63-781.69). (This is a key to success: clicking Start gets the add note copied from the Notes box into the Create built number box.)  Now there are three buttons: Add, Edit Local, and Cancel. (For the purpose of this exercise, you will use only the Add button.)


If you look in the Hierarchy box, you will see that the number building tool has also taken you to the record for 781.63-781.69 Other traditions of music:


In the Notes box, you see the add table under 781.63-781.69:


How do you express appreciation using that add table? Because the complete add table is displayed in the Notes box, multiple add notes are now visible there. Which of them is relevant?

If you browse the Relative Index for "appreciation," you will find the entry:

Appreciation—music 781.17

You’ll get similar results if you browse the Relative Index for "music appreciation":

Music appreciation    781.17

If you click the number 781.17 in the Relative Index, you will see that the Hierarchy box focuses on 781.17 Artistic principles:


In the Notes box is a class-here note: "Class here aesthetics, appreciation, taste."


You decide to pursue the possibility of adding notation from 781.17. In order to do that, you need to look again at the add table under 781.63-781.69. The Create built number box is the same as you left it after you clicked Start:


To return to the add table under 781.63-781.69, click that number span in the add note inside the Create built number box.  Looking again at the add table in the Notes box, you notice the following entry:

11-15 General principles

Add to 1 the numbers following 781 in 781.1-781.5, e.g., springtime music 15242, melody in springtime music 15242124

The add note under the span 11-15 General principles will allow you to add notation from 781.17 Artistic principles.  Click the span 11-15 in the Notes box.  Then the Hierarchy box changes to focus on that span: 781.63-781.69:11-15 General principles.


Now instead of the multiple add notes in the complete add table, the Notes box displays only the one relevant add note:


Click Add in the Create built number box.  The Create built number box changes to show that the number built so far is 781.661. Inside the Create built number box appears the number 781.63-781.69:1.  And—important!—the add note that you want to follow appears inside the Create built number box:


Now the Hierarchy box focuses on 781 General principles and musical forms, with an outline of its subdivisions, since it is the "numbers following 781" that you can add using the add note in the Create built number box:


Either by clicking down in the Hierarchy box or by searching for the number 781.17, you need to get 781.17 Artistic principles displayed as the focal point of the Hierarchy box:


Now click Add in the Create built number box.  At this point the number building tool will give you an opportunity to add a standard subdivision—but you don’t want to do that.  Look at the updated Create built number box; it shows that you have built the number 781.66117. (Note: you may have to scroll down to the bottom of the screen to see the Create built number box.) The number and caption 781.17 Artistic principles appear inside the Create built number box:


Click Save in the Create built number box.  Your newly built number will appear in the Hierarchy box:


You now have an opportunity to modify or add to the user terms associated with that new number, as explained in the "User Terms with Number Building" part of the WebDewey training modules.  Enough for now!  You have successfully built the number.

A key to success: at each step, find the record with the relevant add note, display the full record so that the add note appears in the Notes box, and click Start or Add to get that add note to appear inside the Create built number box. If more than one add note appears in the Notes box, as when a complete add table is displayed, click to select the entry with the add note that you need. When only the one relevant add note is displayed in the Notes box, then click Start or Add to copy that add note into the Create built number box.
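For readers who prefer to see the arithmetic behind the clicks, the add-table step above can be sketched in a few lines of Python. These helper functions are purely illustrative (they are not part of WebDewey): the base number 781.66 takes the add-table digit 1 plus the digits following 781 in 781.17.

```python
def build_dewey(base, add_digits):
    """Append add-table notation to a base DDC number (illustrative helper)."""
    return base + add_digits

def digits_following(number, stem):
    """Return the digits of `number` that follow `stem`, ignoring the dot."""
    n = number.replace(".", "")
    s = stem.replace(".", "")
    return n[len(s):] if n.startswith(s) else n

# Base number for rock music, from the 781.66 record:
base = "781.66"
# The span 11-15 says: add to 1 the numbers following 781 in 781.1-781.5.
# For 781.17 Artistic principles, the digits following 781 are "17":
notation = "1" + digits_following("781.17", "781")
print(build_dewey(base, notation))  # 781.66117
```

The same pattern covers the add table's own example: "springtime music 15242" is 1 plus the digits 5242 following 781 in 781.5242.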

by Juli at August 26, 2014 06:18 PM

First Thus

ACAT Amazon dot com as a source of information about the date of publication

Posting to Autocat

On 26/08/2014 14.41, Brian Briscoe wrote:
I do not believe that the popularity of keyword searching is based upon a “preference” for that type of searching. I think it is a matter of the gorilla making it so common that it is now expected and what users have become accustomed to. Of course, my opinion is no more based on research than the supposition that keyword searching is what users really prefer.

Keyword searching and relevance rankings based upon imperfect data assumptions has some very serious flaws. As information professionals, we should be working to provide a solution that provides better accuracy as well as recall.

I don’t want to get into a debate because I agree with you. It is just that I remember very clearly how incredibly happy the public was when keyword searching was introduced. Even though I would show people the “hidden pitfalls” repeatedly, they just didn’t care. Today, keyword and relevance ranking are the defaults in most catalogs. That didn’t just happen. They are the defaults because the public wants it that way. People have become used to those methods, and that means they are not used to ours. If we refused to follow along and insisted, “Do not use keyword or relevance. The other ways are much better for you,” the public would of course completely ignore us.

But I agree that there have not been any real options for people. What does this mean? For a long time (decades?) after the introduction of OPACs, there were no authority records available to the public at all. I never understood how anybody could find anything. We couldn’t even put in simple guide cards. Now some catalogs include authority records, but that has turned out to be barely a step forward, because you don’t get them with the default keyword search: everybody has to pretend they are back in the 1950s searching a card catalog, i.e. you have to do left-anchored text browsing! We just have to admit that nobody does that anymore, except for weirdos like me. I have gone to pains to show how and why it just doesn’t work in an online environment, and why I have come to hate it. In the card catalog, it wasn’t so bad. Online it just does not work at all and should be abolished, because the public won’t do it and even mentioning it makes us look “soooo 20th century!”

So what do we do? It’s obvious: we must change the catalog so that its traditional powers can be utilized in a 21st-century information environment. But the problem is that with RDA/FRBR we have been concentrating on the individual records (an overwhelming task!) while the catalogs don’t change at all, and the public thinks we are stuck in the pre-20th century. The only new things the public may notice are rather weird things, such as my current favorite:

100 1_ |a Sturrock, John, |c heckler.

That is a really useful $c.


by James Weinheimer at August 26, 2014 02:36 PM


Library Linked Data Happening

LOD happening

On August 14 the IFLA 2014 Satellite Meeting ‘Linked Data in Libraries: Let’s make it happen!’ took place at the National Library of France in Paris. Rurik Greenall (who also wrote a very readable conference report) and I had the opportunity to present our paper ‘An unbroken chain: approaches to implementing Linked Open Data in libraries; comparing local, open-source, collaborative and commercial systems’. In this paper we do not go into reasons for libraries to implement linked open data, nor into detailed technical implementation options. Instead we focus on the strategies that libraries can adopt for the three objectives of linked open data: original cataloguing/creation of linked data, exposing legacy data as linked open data, and consuming external linked open data. Possible approaches are local development, using free and open source software, participating in consortia or service centres, and relying on commercial vendors, or any combination of these. Our main conclusions and recommendations: identify your business case; if you’re not big enough, be part of some community; and take lifecycle planning seriously.

The other morning presentations provided some interesting examples of a number of approaches we described in our talk. Valentine Charles presented the work in the area of aggregating library and heritage data from a large number of heterogeneous sources in different languages by two European institutions that de facto function as large consortia or service centres for exposing and enriching data, Europeana and The European Library. Both platforms not only expose their aggregated content in web pages for human consumption but also as linked open data, besides other so-called machine-readable formats. Moreover, they enrich their aggregated content by consuming data from their own network of providers and from external sources, for instance multilingual “value vocabularies” like thesauri, authority lists and classifications. The idea is to use concepts/URIs together with display labels in multiple languages. For Europeana these sources currently are GeoNames, DBpedia and GEMET. Work is being done on including the Getty Art and Architecture Thesaurus (AAT), which was recently published as Linked Open Data. Besides using VIAF for person authorities, The European Library has started adding multilingual subject headings by integrating the Common European Research Classification Scheme, part of the CERIF format. The use of MACS (Multilingual Access to Subjects) as Linked Open Data is being investigated. This topic was also discussed during the informal networking breaks. Questions that were asked: is MACS valuable for libraries? Who should be responsible for MACS, and how can administering MACS in a Linked Open Data environment best be organized? Personally I believe that a multilingual concept-based subject authority file for libraries, archives, museums and related institutions is long overdue and will be extremely valuable, not only in Linked Open Data environments.
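The concept/URI-plus-labels idea can be illustrated with a minimal sketch. The URI and labels below are invented for the example (they are not Europeana's actual data): the assigned identifier stays language-neutral, while the display label is chosen per language at presentation time.

```python
# A hypothetical multilingual concept record, SKOS-style:
concept = {
    "uri": "http://example.org/subjects/rock-music",  # invented identifier
    "prefLabel": {"en": "Rock music", "fr": "Musique rock", "nl": "Rockmuziek"},
}

def label(concept, lang, fallback="en"):
    """Pick a display label for the requested language, with a fallback."""
    labels = concept["prefLabel"]
    return labels.get(lang, labels[fallback])

print(label(concept, "fr"))  # Musique rock
print(label(concept, "de"))  # Rock music (no German label, so falls back)
```

The point is that records link to the one URI; the human-readable label is a rendering decision, which is what makes multilingual access tractable.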

The importance of multilingual issues and the advantages that Linked Open Data can offer in this area were also demonstrated in the presentation about the Linked Open Authority Data project at the National Diet Library of Japan. The Web NDL Authorities are strongly connected to VIAF and LCSH among others.

The presentation of the Linked Open Data environment of the National Library of France (BnF) highlighted a very interesting collaboration between a large library with considerable resources in expertise, people and funding on the one hand, and the non-library commercial IT company Logilab on the other. The result of this project is a very sophisticated local environment consisting of the aggregated data sources of the National Library and a dedicated application based on the free software tool CubicWeb. An interesting situation arose when Logilab itself asked if the developed applications could be released as open source by the National Library. The BnF representative Gildas Illien (also one of the organizers of the meeting, together with Emmanuelle Bermes) replied with considerations about planning, support and scalability, which is completely understandable from the perspective of lifecycle planning.

With all these success stories about exposing and publishing Linked Open Data, the question always remains whether the data is actually used by others. It is impossible to incorporate this in project planning and results evaluation. Regarding the BnF data, this question was answered in the presentation about Linked Open Data in the book industry: the Electre and Antidot project consumes linked open data from, among other sources, the BnF.

The afternoon presentations focused on creating, maintaining and using various data models, controlled vocabularies and knowledge organisation systems (KOS) as Linked Open Data: the Europeana Data Model (EDM), UNIMARC, MODS. An interesting perspective was presented by Gordon Dunsire on versioning vocabularies in a linked data world. Vocabularies change over time, so an assignment of a URI for a certain vocabulary concept should always carry version information (such as timestamps and/or version numbers) in order to identify the intended meaning at the time of assignment.
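Dunsire's point can be sketched concretely. The URI pattern below is purely hypothetical (the talk did not prescribe a syntax); the idea is simply that the assignment records which version of the vocabulary was in force, so the intended meaning survives later changes to the concept.

```python
def versioned_uri(base, concept_id, version):
    """Build a hypothetical versioned concept URI: the version segment pins
    the meaning of the concept at the time of assignment."""
    return f"{base}/{concept_id}/version/{version}"

# A subject assignment made today records both the concept and the version:
uri = versioned_uri("http://example.org/vocab", "rock-music", "2014-08-26")
print(uri)  # http://example.org/vocab/rock-music/version/2014-08-26
```

If the scope of "rock-music" is later narrowed, records carrying the 2014-08-26 version segment still resolve to the definition that was actually meant.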

The meeting was concluded with a panel with representatives of three commercial companies involved in library systems and Linked Open Data developments: Ex Libris, OCLC and the afore-mentioned Logilab. The fact that this panel with commercial companies on library linked data took place was significant and important in itself, regardless of the statements that were made about the value and importance of Linked Open Data in library systems. After years of dedicated temporarily funded proof of concept projects this may be an indication that Linked Open Data in libraries is slowly becoming mainstream.



by Lukas Koster at August 26, 2014 01:52 PM

First Thus

ACAT Amazon dot com as a source of information about the date of publication

Posting to Autocat

On 26/08/2014 7.38, Hal Cain wrote:
We need to be practical, and put the practical level of information on the top level, where it’s most visible. The subtleties and complexities do need to be recorded, but they are not necessarily “what matters for cataloguing” where it’s our job to fit them into the bibliographical context.
Hal Cain, who doubts that all the subtleties and complexities called for by RDA matter much to the human users of the product of cataloguing

Hear, hear! To catalog an item is not to turn it into a research project suitable for publication. Certainly, the cataloger may do some bits of research when cataloging an item, but even then it matters mainly when it makes a difference to the public, e.g. when the cataloger finds out that materials published under a bunch of separate names were actually pseudonyms of a single author. But those occurrences are rare.

The simple fact that the public prefers Google-type results, as shown by the overwhelming preference for keyword searching and relevance ranking, even in our own catalogs, speaks volumes and should be a major consideration in our discussions. Google doesn’t have any of the subtleties that we discuss, and nobody seems to complain about it. At present, the public seems to be concerned much more with censorship (e.g. the EU’s “right to be forgotten”) and privacy (e.g. the Edward Snowden/NSA controversy).

There are also concerns, though less pressing ones, over “filter bubbles” and “information overload.” I have not seen that WEMI and FRBR, with their relationships (which are what is really new), deal with any of this.

But the field of librarianship, and cataloging in particular, addresses all of these issues and can offer special insights that, I think, could even point in the direction of some solutions.


by James Weinheimer at August 26, 2014 09:14 AM

Terry's Worklog

MarcEdit’s MARCNext: JSON Object Viewer

As I noted in my last post, I’ll be adding a new area to the MarcEdit application called MARCNext.  This will be used to expose a number of research tools for users interested in working with BibFrame data.  In addition to the BibFrame Testbed, I’ll also be releasing a JSON Object Viewer: a specialized viewer designed to parse JSON text and provide an object visualization of the data.  The idea is that this tool could be used to render MARC data translated into BibFrame as JSON for easy reading, though I’m sure there will be other uses as well.  I’ve tried to keep the interface simple.  Essentially, you point the tool at a JSON file and the tool renders the file as objects.  From there, you can search and query the data, view the JSON file in Object or Plain text mode, and ultimately copy data for use elsewhere.
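As a rough illustration of what such an object visualization involves (this is not MarcEdit's code, just a minimal Python sketch), a viewer parses the JSON text and walks the resulting structure, reporting rather than crashing on malformed input:

```python
import json

def render(node, indent=0):
    """Recursively render parsed JSON as indented object-tree lines."""
    pad = "  " * indent
    if isinstance(node, dict):
        lines = []
        for key, value in node.items():
            lines.append(f"{pad}{key}:")
            lines.extend(render(value, indent + 1))
        return lines
    if isinstance(node, list):
        lines = []
        for i, item in enumerate(node):
            lines.append(f"{pad}[{i}]")
            lines.extend(render(item, indent + 1))
        return lines
    return [f"{pad}{node!r}"]

def view(text):
    """Parse JSON text, reporting (rather than crashing on) malformed input."""
    try:
        return "\n".join(render(json.loads(text)))
    except json.JSONDecodeError as err:
        return f"Malformed JSON: {err}"

print(view('{"title": "Moby Dick", "creator": ["Melville, Herman"]}'))
```

The error branch matters for exactly the reason Terry mentions: data translated from MARC is not always well formed, and a viewer should degrade gracefully.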


Some additional testing needs to be done to make sure the program works well when coming across poorly formed data – but this tool will be a part of the next update.


by reeset at August 26, 2014 04:31 AM

August 25, 2014

Resource Description & Access (RDA)

Temporary / Permanent Date in an Incomplete Multipart Monograph : Questions and Answers in the Google+ Community "RDA Cataloging"

RDA Cataloging is an online community/group/forum for library and information science students, professionals and cataloging & metadata librarians. It is a place where people can get together to share ideas, trade tips and tricks, share resources, get the latest news, and learn about Resource Description and Access (RDA), a new cataloging standard to replace AACR2, and other issues related to cataloging and metadata.

 Questions and Answers in the Google+ Community "RDA Cataloging"


Publication etc., dates (MARC21 264). These conventions do not apply to serials or integrating resources (temporary data not recorded in this field).

Temporary date. If a portion of a date is temporary, enclose the portion in angle brackets.


, 1980-〈1981〉 
v. 1-2 held; v. 2 published in 1981

, 〈1981-〉 
v. 2 held; v. 1-2 published in 1981

, 〈1979〉-1981. 
v. 2-3 held of a 3-volume set

, 〈1978-1980〉 
v. 2-3 held of a 5-volume set

Permanent date. If an entire date is judged to be permanent, record it without angle brackets.


, 1980- [not 〈1980-〉 or , 1980-〈 〉]
v. 1 held; v. 1 published in 1980

[Source: LC-PCC PS for RDA Rule 1.7.1]
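As a worked illustration of the bracketing convention above (the function name and the held/permanent flags are my own shorthand, not part of the PS), a helper that assembles these dates might look like:

```python
def format_dates(begin, end=None, begin_held=True, end_held=True):
    """Assemble a multipart-monograph date per the angle-bracket
    convention: any portion judged temporary (e.g., the volume carrying
    that date is not yet held) is enclosed in angle brackets."""
    if end is None:                      # open entry: beginning date only
        core = begin + "-"
        return core if begin_held else "〈" + core + "〉"
    if begin_held and end_held:
        return f"{begin}-{end}"
    if begin_held:                       # ending date is temporary
        return f"{begin}-〈{end}〉"
    if end_held:                         # beginning date is temporary
        return f"〈{begin}〉-{end}"
    return f"〈{begin}-{end}〉"            # both temporary: bracket the whole
```

For example, a set with v. 1-2 held where v. 2 was published in 1981 would yield 1980-〈1981〉, matching the first example above (the leading comma of the 260/264 $c is left to the caller).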

by Salman Haider at August 25, 2014 02:10 AM

Relationship Designators in Resource Description and Access (RDA) : honouree, host institution, organizer, sponsoring body

Relationship Designators from Appendix I.2.2: Relationship Designators for Other Persons, Families, or Corporate Bodies Associated with a Work

honouree: Use this relationship designator with a person, family, or corporate body honoured by a work (e.g., the honouree of a festschrift or a commemorative volume) (MARC21 tags 700, 710, subfield e)

host institution: Use this relationship designator with a corporate body hosting the event, exhibit, conference, etc., which gave rise to a work, but having little or no responsibility for the content of the work (MARC21 tag 710, subfield e)

organizer: Use this relationship designator with a person, family, or corporate body organizing the exhibit, event, conference, etc., which gave rise to a work (MARC21 tags 700, 710, subfield e)

sponsoring body: Use this relationship designator with a person, family, or corporate body sponsoring some aspect of a work, e.g., funding research, sponsoring an event (MARC21 tags 700, 710, subfield e)

[Questions asked on the use and scope of some RDA Relationship designators, see RDA Toolkit Appendix I for more]

Aaron Kuperman: I have found "honouree" (not just for festschrifts) to be very useful. If they publish someone's essay to honor him, it gets a double $e (author, honouree). --- "Sponsoring body" is good for legal publications where an organization wasn't the publisher and wasn't an author, but is identified with the book.

by Salman Haider at August 25, 2014 02:10 AM



I had some excellent comments on my last post and I wanted to return to some ideas that came from them. One noted that the quality of metadata is a ticking time bomb. In both comments, issues about the place of, and increasing reliance on, technology were raised.

Metadata quality is a hot issue. Where I work, it is a struggle to respond to the number of requests and still provide good metadata, especially if it isn’t there to begin with. Let me explain. Part of my job is to take EAD records and create MODS records for our digital repository. The collection-level portions of these EADs are full and impressive records; the amount of work put in by our archivists, curators and their pages is tremendous. The item-level descriptions, however, are less than full. I have tried to add common subject and genre terms and any other information that can be set to a default, because I typically transform hundreds of EAD records at a time, which in turn create even more MODS records. In the last 3 months, I’ve created over 20,000 MODS records. As the only person on this project, it becomes more than a challenge to go back and enhance all of these records. It is not that these records are less than minimal; they still allow for discovery and accessibility. But they are not really unique: I would say they are average, rather than good or above average, in terms of providing unique descriptions. Now the good news is that help is on the way, slowly but surely. Also, I keep annoying my supervisors for little helper elves to handle enhancement. And this brings me to my second point, about technology.

My main focus is to get these records into the digital repository efficiently and in a timely manner. There are 7 people who work in Archives. Then there is the metadata that needs to be transformed from our partner institutions; here I work with a variety of people. Then there are others around campus, including those who need help with metadata. All of this is to illustrate that I deal in bulk production! I write transformations creating little MODS records by the hundreds. This is an automatic approach that many use. Some do this in Oxygen, the XML editor, others in MarcEdit, and others rely on programmers who have written scripts to create these records. These various technologies have been and continue to be a lifesaver; I and other metadata librarians rely on them every day to do our work. However, when push comes to shove, enhancement is still in the realm of a person adding good quality metadata. A good example is that the transformations transform the data that are present: if that data is wrong or inaccurate, it is brought over into the new XML structure. Recently, I saw one record that I transformed en masse where the title was in German and not English. In the original data file, there was only the English title; the German was never recorded. It was only thanks to a person who looked at the digital resource that this was discovered and the German title added to the metadata.
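The mapping step at the heart of such a transformation can be sketched with the Python standard library. This is a deliberately simplified illustration: the element names are unqualified (real EAD and MODS use namespaces), the field mapping is hypothetical, and production workflows would use XSLT in Oxygen or MarcEdit as described above:

```python
import xml.etree.ElementTree as ET

def ead_item_to_mods(unittitle, unitdate):
    """Build a minimal MODS record (titleInfo + originInfo) from two
    item-level EAD values. Only the data present in the source can be
    carried over -- which is why errors in the EAD survive the transform."""
    mods = ET.Element("mods")
    title_info = ET.SubElement(mods, "titleInfo")
    ET.SubElement(title_info, "title").text = unittitle
    origin = ET.SubElement(mods, "originInfo")
    ET.SubElement(origin, "dateCreated").text = unitdate
    return mods

# Hypothetical item-level values pulled from an EAD container list
xml_out = ET.tostring(ead_item_to_mods("Letter to A. Smith", "1923"),
                      encoding="unicode")
```

Run over hundreds of EAD files in a loop, a mapping like this produces MODS records by the thousands, but it can never add information the source lacks.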

All of this is to say that technology and human knowledge and expertise must work together. Technology certainly can take us far and provide solutions to problems in ways that seem amazing. For the Digital Commonwealth, programmers have written a script that controls place, name and topic headings in spreadsheets. For this script to work, someone familiar with the controlled vocabularies had to inform the programmers so that the requirements of the task were, and continue to be, met by the script. The Digital Commonwealth also has a post-review by people to ensure the script and other automated processes worked as expected.

The discussion can’t be about whether technology will supplant people. Technologies are tools created by us. In a sense, the better we understand these technologies, the better we can leverage them to assist us in our work. We can avoid the ticking time bomb of metadata quality by ensuring that people don’t get lost and that metadata librarians add value and quality to repositories along with technology, and by remembering that, even if all of this metadata is never seen, it is thanks to people and technology that cool visualizations like timelines and mappings are possible.

Filed under: cataloging, Metadata Tagged: metadata quality, technology

by Jen at August 25, 2014 12:42 AM

August 24, 2014

Terry's Worklog

MarcEdit’s Research Toolkit – MARCNext

While developing MarcEdit 6, one of the areas that I spent a significant amount of time working on was the MarcEdit Research Toolkit.  The Research Toolkit is an easter egg of sorts: a set of tools and utilities that I’ve developed to support my own personal research interests around library metadata, specifically around the future of library metadata, including topics such as the current BibFrame testing and linked data.  I’ve kept these tools private because they tend not to be fully realized concepts or ideas and have very little in the way of a user interface.  Just as important, many of these tools represent work created to engage in the conversation that the library community is having around library metadata formats and standards, so things can and do change or drop out of the conversation and are then removed from my toolkit.

While developing MarcEdit 6, one of the goals of the project was to find a way to make some of these tools, or parts of them, available to the general MarcEdit community.  To that end, I’ll be making a new area available within MarcEdit called MARCNext.  MARCNext will provide a space to make proof-of-concept tools available, with a simple interface that anyone can use to test new bibliographic concepts like BibFrame.

Presently, I’m evaluating my current workbench to see which of the available tools can be made public.  I have a handful that I think may be applicable, but I will need some time to move them from concept to a utility for public consumption.  With that said, I will be making one tool immediately available as part of the next MarcEdit update, and that will be the BibFrame Testbed.  This is code that utilizes the XQuery files being developed and distributed by the Library of Congress, with a handful of changes made to provide better support within MarcEdit.  These are my base files that will enable librarians to easily model their MARC metadata in a variety of serializations.  And using this initial work, I’ll likely add some additional serializations to the list.

I have two goals for making this particular tool available.  First and foremost, I would like to give anyone who is interested the ability to take their existing library metadata and model it using BibFrame concepts.  Currently, the Library of Congress makes available a handful of command-line tools that users can utilize to process their metadata, but these tools tend not to be designed for the average user.  By making this functionality available in MarcEdit, I’m hoping to lower the barrier so that anyone can model their data and then engage in the larger discussion around this work.

Secondly, I’m currently engaging in some work with Zepheira and other early implementers to take BibFrame testing mainstream.  Given the number of users working with MarcEdit, it made a lot of sense to provide tools to support this level of integration.  Likewise, by taking the time to move this work beyond the concept stage, I’ve been able to develop the start of a framework around these concepts.

So how is this going to work?  On the next update, you will see a new link within the Main MarcEdit Window called MARCNext. 

MarcEdit Main Window

Click on the MARCNext link, and you will be taken to the public version of the Research Toolkit.  At this point, the only tool being made publicly available is the BibFrame Testbed, though this will change.

MarcEdit’s MARCNext Window

Selecting the BibFrame Testbed initializes a simple dialog box to allow a user to select from a variety of library metadata types and convert them using BibFrame principles into a user-defined serialization. 

BibFrame Testbed window

As noted above, this test bed will be the first of a handful of tools that I will eventually be making available.  Will they be useful to anyone?  Who knows.  Honestly, the questions these tools are working to answer are not ones that come up on the listserv, and at present they aren’t going to help much in one’s daily cataloging work.  But hopefully they will give every cataloger who wants it the ability to engage with some of these new metadata concepts, and at least take their existing data and see how it might change utilizing different serializations and concepts.

Questions – feel free to ask.


by reeset at August 24, 2014 04:36 AM

August 22, 2014

TSLL TechScans

OCLC MARC Format Update 2014, phase 2

OCLC Technical Bulletin 264 describes changes to the MARC 21 formats for bibliographic, authority and holdings data to be implemented in the near future. Things we are most likely to see in our day-to-day cataloging work include:
  • Addition of $q Qualifying information to identifier fields such as 020 (ISBN), 024 (Other standard number) and 027 (Standard Technical Report Number). 
  • Definition of first indicators for field 588 (Source of Description) to provide display constants. First indicator 0 will generate a display constant meaning source of description; first indicator 1 will generate a display constant meaning latest issue consulted. CONSER participants should wait for notification by the Library of Congress and OCLC before using these new indicators.
  • Data recorded in MARC field 265 (Source for Acquisition/Subscription Address) will be converted to field 037 $b (Source of stock number/acquisition). MARC field 265 will be invalidated.
  • $c (Location of meeting) has been re-defined as repeatable for many fields, including 110, 111, 610, 611, 710 and 711
Changes to the MARC format for authority data include:
  • Addition of $q (Qualifying information) in fields 020 and 024
  • Repeatability of $c (Location of meeting) in fields  110, 111, 410, 411, 510 and 511
Additional MARC fields relating to audience and creator characteristics have also been defined. 
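The 265-to-037 conversion described above can be illustrated with a toy record structure. Plain (tag, subfields) tuples stand in for real MARC fields here, and both the sample data and the function name are hypothetical; actual conversions run over full MARC records (e.g., with a library such as pymarc):

```python
def convert_265_to_037(fields):
    """Move the contents of obsolete field 265 $a into 037 $b and drop
    the 265, mirroring the conversion the Technical Bulletin describes."""
    converted = []
    for tag, subfields in fields:
        if tag == "265":
            # Source-for-acquisition data moves to 037 subfield b
            converted.append(
                ("037", [("b", value) for code, value in subfields if code == "a"])
            )
        else:
            converted.append((tag, subfields))
    return converted

sample = [("245", [("a", "Some serial")]),
          ("265", [("a", "Box 14, Cityville")])]
result = convert_265_to_037(sample)
```

After the conversion, no 265 remains and the acquisition source lives in 037 $b, which is what OCLC's batch process will do to existing records.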

by Jackie Magagnosc at August 22, 2014 06:24 PM

OCLC Cataloging and Metadata News

August 2014 data update now available for the WorldCat knowledge base

The WorldCat knowledge base continues to grow with new providers and collections added monthly.  The details for August updates are now available in the full release notes.

August 22, 2014 06:15 PM

First Thus

ACAT An Amazing Record redux

Posting to Autocat

On 8/22/2014 4:13 AM, Daniel CannCasciato wrote:
I wrote something back in 1999 about what would now be called social tagging, etc., and why it’s so different in concept than are the things we do in a library. I think James’ post exemplifies that. We should NOT, in all likelihood, want patron tagging for our catalogs or websites or IRs. We don’t know what they are doing, and especially we don’t know why. And depending on how the system is set up, don’t even know who they are.
That’s not what we do. We don’t know (from what I’ve read) that it *helps* patrons. So why keep it, emphasize it, except to appear trendy?

It’s not that I am “hurt” by a thumbs down. On some of these lists, I have had to:

“… suffer
The Slings and Arrows of outrageous Fortune …”

so a thumbs down counts for nothing.

Nevertheless, I think it is important not only to be able to “like” something but to show disapproval as well. Otherwise, everything becomes much too skewed. The choice of only a like is similar to Soviet elections, where you had a choice of one candidate. (True, you could vote against him/her but that was almost never done) In U.S. politics, where a dualistic structure reigns, there is somewhat more of an option: Republican or Democrat. Even in this case, it more often than not turns into a thumbs down over a specific candidate, i.e. a vote not so much for but against someone–against Jimmy Carter, against Dukakis, against George Bush, against Barack Obama, even against any Republican whatsoever or any Democrat–entire governments can be based on that kind of vague information. In Italy, where you really do have all kinds of choices, it is called chaos! :-)

If it can work for governments (or it’s supposed to), why not for bibliographic records as well?

For at least some of the reasons I mentioned. I suspect the real problem is the complete anonymity, as Mac suggested. We don’t need individual names, but if you knew that the thumbs up/down was given by a professor in the field, or that a high-school teacher thought it was good/not good for someone he or she taught, that might be useful information. Even that a student thought it was good or not could be useful information.

About blog comments, things are very strange. There is a blight of “comment spam”. I have noticed a change in the past few years. It’s a variation on the old ILOVEYOU virus. You get a nice comment, describing how good your post was, how it was so clear and helpful, how smart you are blah blah, and then there are links to websites–and they can be very, very clever at hiding these links. As an example, I just received this comment (unedited) to a posting from 2009! (By adding this link, I just made my own type of spam!) :-)

“You actually make it aplpear so easy with your presentation however I find this matter to be really something which I think I’d never understand. It sort of feels too complicated and extremely wide for me. I amm looking ahead in your next post, I will attempt to get the grasp of it!”

And then the URL links to a page about ways to cure acne problems! The first time I noticed this, I was completely mystified. The reason for this seeming craziness is marketing in Google: the more links a page has to it, the higher it will be in Google’s search results. Therefore, if someone can mask links to specific pages so that you will add them to your otherwise legitimate page (and these comments are sent to hundreds or thousands of others), there can be rewards. This is a simple case, but it is merely one example of what is called “Black Hat SEO”, which can be fiendishly ingenious. In my opinion, “White Hat SEO” is not all that much better, but I guess it’s necessary.

Catalogs are being spammed too; e.g., see “The sports business” (an article from the Economist) with a comment by “cheapshoesonline”, who says, “I like it”, followed by 3 links where you can buy Nike shoes.

What to do? I don’t know, but it is a new world and seems unavoidable as we enter into the “Social Web” and “Linked Data”.


by James Weinheimer at August 22, 2014 10:23 AM

Resource Description & Access (RDA)

Editor of Compilation vs Compiler

The editor of a compilation, as defined in I.3.1, is not a creator of a work and thus has to be treated as a 700, not a 100.

On the other hand, a compiler (for example of a dictionary, a directory, a bibliography, etc.) can be considered a creator (see I.2.1) and thus can be treated as a 100.

Expert remarks by Aaron Kuperman, Law Librarian, Library of Congress: A good rule of thumb is that a compilation needs to consist of works that can (and should) be listed in the contents note.

According to RDA Toolkit: I.2.1 Relationship Designators for Creators: Compiler : A person, family, or corporate body responsible for creating a new work (e.g., a bibliography, a directory) by selecting, arranging, aggregating, and editing data, information, etc.
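The distinction can be sketched as a tiny decision rule. The set of creator roles below is illustrative only (Appendix I.2.1 lists many more), and the function name is my own:

```python
# Roles RDA Appendix I.2.1 treats as creators, so the name can be the
# main entry (100); "editor of compilation" (I.3.1) is a contributor
# and gets an added entry (700). Illustrative subset only.
CREATOR_ROLES = {"author", "compiler", "cartographer", "composer"}

def access_point_tag(role):
    """Return the MARC field for a personal-name access point with the
    given relationship designator: 100 for creators, 700 otherwise."""
    return "100" if role in CREATOR_ROLES else "700"
```

So the compiler of a dictionary lands in a 100 field, while the editor of a compilation of independent works lands in a 700.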

[Blog post revised on August 22, 2014]

by Salman Haider at August 22, 2014 04:57 AM

August 21, 2014

Bibliographic Wilderness

Colombian student faces jail time for sharing scholarly thesis

Colombia strengthened its copyright laws in 2006, basically at U.S. demand as part of a free trade agreement.

As a result, according to the Nature News Blog, Diego Gómez Hoyos, a Colombian student, faces jail time for posting someone else’s thesis on Scribd.

In the U.S., of course, ‘grey’ sharing of copyrighted scholarly work without permission is fairly routine. We call it ‘grey’ only because everyone does it, and so far publishers in the U.S. have shown little inclination to stop it when it’s being done amongst scholars on a one-by-one basis — not because it’s legal in the U.S. If you Google (Scholar) search recent scholarly publications, you can quite frequently find ‘grey’ publicly accessible copies on the public internet, including on Scribd.

What is done routinely by scholars in the U.S. and ignored gets you a trial and possible jail time in Colombia — because of laws passed to satisfy the U.S. in ‘free trade’ agreements.  This case may start going around the facebooks as “copyright out of control”, and it is that, but it’s also about how neo-colonialism is alive and well, how what’s good for the metropole isn’t good for the periphery, and how ‘free trade’ agreements are never about equality.

Student may be jailed for posting scientist’s thesis on web
Posted on behalf of Michele Catanzaro


A Colombian biology student is facing up to 8 years in jail and a fine for sharing a thesis by another scientist on a social network.


Diego Gómez Hoyos posted the 2006 work, about amphibian taxonomy, on Scribd in 2011. An undergraduate at the time, he had hoped that it would help fellow students with their fieldwork. But two years later, in 2013, he was notified that the author of the thesis was suing him for violating copyright laws. His case has now been taken up by the Karisma Foundation, a human rights organization in Bogotá, which has launched a campaign called “Sharing is not a crime”.




Gómez says that he deleted the thesis from the social network as soon as he was notified of the legal proceedings. But the case against him is rolling on, with the most recent hearing taking place in Bogotá in May. He faces between 4 and 8 years in jail if found guilty. The next hearing will be in September.


The student, who is currently studying for a master’s degree in conservation of protected areas at the National University of Costa Rica in Heredia, refuses to reveal who is suing him. He says he does not want to “put pressure on this person”. “My lawyer has tried unsuccessfully to establish contacts with the complainant: I am open to negotiate and get to an agreement to move this issue out of the criminal trial,” he told Nature.


The case has left Gómez feeling disappointed. “I thought people did biology for passion, not for making money,” he says. “Now other scientists are much more circumspect [about sharing publications].”


Filed under: General

by jrochkind at August 21, 2014 08:29 PM

Mod Librarian

5 Things Thursday: @SPLBuzz Online Collections, DAM, Space Age Crowdsourcing

5 Things

Here are five summer things:

  1. Check out the awesome digital collections at The Seattle Public Library.
  2. 10 must read books about libraries and librarians which I have not read.
  3. David Diamond on the sheer value of information professionals.
  4. NASA seeks public help with photo archive.
  5. Using ArchivesSpace as DAM for Austin graffiti.

Thanks for reading, loyal readers. Taking two weeks off and returning…


August 21, 2014 12:20 PM

First Thus

An Amazing Record redux

Posting to Autocat & Radcat

(This does eventually get around to catalogs, so bear with me! It’s my vice)

I thought I would share something I discovered recently. A few months ago, I changed the hosting of my website, and my original intention for my blog (hosted at Blogger) was just to keep using it. This turned out to be complicated, so I decided to change everything to WordPress. That turned out to be more difficult than I thought, but I stuck with it.

I am a novice at WordPress but still find it to be very good. Currently, I am learning WordPress plugins and I recently included the “thumbs-up, thumbs-down” for each post. I find it interesting that people have used the WordPress version much more often than what I had on Blogger.

But I don’t really understand it. For instance, I wrote a post, “An Amazing Record”, about a record I found on WorldCat that has a huge number of authors and won’t even load.

I didn’t make any comments; I just pointed out that this record exists. Then I found the document it pointed to, and discovered that there are several other records of this type in WorldCat. In another post, I discussed it a bit (and there I can understand an up/down vote), but not in this one, where I just mentioned it.

As it stands currently, this post has 2 thumbs up and 3 down. What do the thumbs-down represent? The fact that these records exist? That Worldcat accepted these records? Or that I pointed them out?

Another example is where I mentioned that I knew the writer of a Star Trek episode about libraries (Jean Aroeste). This also currently has 2 thumbs up and 3 down.

Again, what signifies the thumbs-down? I can’t believe that anybody could ever dislike Jean, who was one of the nicest people I ever met. Perhaps it’s people who dislike the original Star Trek series, or this particular episode. Or perhaps it is just that I wrote something about it.

There are many other similar examples I could point to.

Bringing this back to catalogs (I said I would!), it seems that most do not use thumbs up/down but employ a rating system for their records (1-5 stars), e.g. in LibraryThing Hamlet got 4.17 stars, while Harry Potter and the Sorcerer’s Stone got 4.26.

There are “likes” on the author pages. Shakespeare got 70, while Glenn Beck got 73 and Noam Chomsky got 47. J.K. Rowling beat them all easily with 476 likes. Homer got a humiliating 19 likes, but even he did better than Dante, who has a lonely single like. Dislikes (thumbs-down) are not available.

What do all these stars and likes mean? I don’t know.

So, while in itself, a thumbs-down from an anonymous reader doesn’t bother me, I am not at all sure what the likes/thumbs up-down are supposed to mean. I realize I have some ideas that go against the general trends of many people’s opinions, so I understand thumbs-down on those, but a thumbs-down to a simple “This bibliographic record exists” or “I knew this person” is something different. People are communicating something, but what I cannot say.

I watched the Frontline episode Generation Like and while it is very interesting, it doesn’t seem to explain anything either. Maybe I should switch from thumbs to stars! But I don’t know if even that would help me understand. I am sure that this post will get several thumbs-down and I will still be at a loss to understand what they mean.

Perhaps others can help enlighten me.


by James Weinheimer at August 21, 2014 10:15 AM

August 19, 2014

Catalogue & Index Blog


CILIP Cataloguing and Indexing Group (CIG) invites nominations for the Alan Jeffreys Award.

The Alan Jeffreys Award is made by the Cataloguing & Indexing Group Committee in memory of Alan Jeffreys, the former Chairman of the Group, who died in 1994.

The recipient of the award should have made a substantive contribution to the development, teaching or practice of cataloguing or indexing.

Nominations should provide evidence of exceptional achievement in one or more of the following categories:

· Teaching or Professional Development

· Leadership in a changing environment

· Delivery of a project

Evidence should include assessment of the impact the nominee has had. This may include, but is not limited to, improved access to collections, academic performance, or efficiencies.

The award is made annually. The recipient receives a framed certificate and will also win a complimentary registration to the next CIG Conference (or a CPD event of their choice).

Nominations should be sent to the CIG Chair: Robin Armstrong Viner, c/o Information Services, Templeman Library, University of Kent, Canterbury, CT2 7NU, by 5th September 2014.

Please include the name and affiliation(s) of the nominee(s), together with a short indication of why they are being nominated.

by cilipcig at August 19, 2014 07:45 AM

x + 3

How to Homeschool My Children

As we prepare for Titus to enter first grade, a number of colleagues have asked for details about our homeschooling techniques, philosophy, curriculum, etc. Forthwith:


We attended the FPEA convention in May. In addition to having a chance to review materials from a smorgasbord of vendors, we also learned a lot about education philosophies from several of the presentations and workshops. There are five prevailing philosophies (or teaching styles, or learning methods, or whatever you want to call them), and I’m sure each of them has its place in different families. Our experience at the convention was one of discovery. We didn’t so much choose a philosophy; we identified the philosophy we already had. Names are powerful. By identifying the Charlotte Mason method as the one that most aligned with our preferred approach, it’s much easier to find resources we think will be helpful additions to our curriculum.

So what is that approach? It’s largely structured around experiencing the world and responding to it. If you’re studying science, observe nature, do experiments. Don’t just read a book that tells you how things work; figure out why they work that way. If you’re studying the arts, seek out the masters. Live with their beauty in your house; learn why it’s beautiful. If you’re studying grammar, learn through great literature.


We started formal education at age three. At the start, the idea is to introduce concepts that we can build on in the future, and start to introduce some structure and discipline in their learning. School takes maybe 15 minutes a day at the beginning, and gradually expands to a couple of hours each day now that we’re starting first grade.

We have a designated table where we always do our lessons. Both kids will come sit down, and we’ll either alternate between them or, wherever possible, engage them both in the same subject (this is easier for some subjects than others).

Our school year formally starts each September. We don’t take an explicit summer vacation, instead opting to take numerous breaks throughout the year (usually timed to avoid crowds that follow the public school schedule).

Beyond the formal curriculum outlined below, we’re also fortunate to live a few blocks away from O2B Kids, where the kids can do many of their “extracurriculars”. Dance, gym, art, and more; they get the opportunity to have fun with other kids and learn skills we would never be able to teach them at home.



It’s never too soon to start studying scripture. We started on day one at Genesis 1, and have read a chapter together every evening before bed. After a few years we reached the end, at which point we circled back to the beginning (now with two kids listening). As part of our routine, everybody gets a notebook and pencils to write down their thoughts or draw pictures about the day’s chapter.

Language Arts

  • Wordy Worm – Phonograms are the smallest distinct sounds in our language. Wordy Worm teaches reading and spelling by focusing on these sounds first, and then on the letters (or combinations of letters) that can be used to create those sounds. We start by focusing on a few related phonograms each week, learning songs to help us remember them, and finding books from the library where the kids can use their new knowledge to actually start to read a few words.
  • Explode the Code – This is a more traditional approach to teaching reading and spelling through phonics. Each chapter covers a different sound, so one chapter will be “short a” and the next “short e”. Most chapters are about 9 pages long and reinforce the learning using several different methods. The child will copy the word, circle the picture illustrating the word, find the matching words in a list, and more. It also introduces basic literacy by having them unjumble words to make a sentence, read two sentences and select which one describes the picture, and answer yes-no questions. We have encountered a few “problems” that are inconsequential as homeschoolers. The first is that Titus’s handwriting competency was nowhere near his reading skill, making it difficult to fill out so many pages. We compromised: on one type of page he had to circle the letters, but not rewrite them; on the other type, he spelled while we wrote down the letters. The other “problem” was with the jumbled sentences. He had great difficulty figuring out the correct order. One day, on a whim, we tried putting all the words on little squares of paper and he had no trouble putting them in the correct order. So for nearly a year, he would arrange the words and glue them to the page. Now that his skills have improved, he is just writing the words as “expected”.
  • Handwriting Without Tears – This is an excellent series that teaches handwriting skills starting with pre-writing skills and advancing through cursive. In the early years, it focuses a lot on the basics of starting the letters at the top, writing on a line, and word spacing. The curriculum teaches that in the early stages, a child should copy a printed letter every time so they are always looking at a quality example and will produce quality letters. On the other hand, if a child is required to write a row of the letter “B”, the first few “Bs” will look okay, but as they stop looking at the original “B” and start looking at their own handwriting, the letters will get worse and worse until at the end they are nearly illegible. The series also slowly incorporates other aspects of grammar and composition. We found ourselves wanting to create our own practice worksheets on occasion, so we purchased a font to match the workbooks from Educational Fontware.
  • Keyboarding Without Tears – From the makers of Handwriting Without Tears, a new typing curriculum. We’ll be starting it next month, with daily 5-minute typing lessons.


  • Life of Fred – I can’t praise this math curriculum enough. You learn math while reading the story of the life of five-year-old Fred Gauss. The stories are silly and entertaining, and Titus will often ask to do another chapter. We started the series when he turned four. Two years in, and Titus can draw a bar graph, solve for x in 3x=12, find the union of sets, and tell you if three lines are concurrent. After we finish the intermediate series (2nd grade), Titus will likely take a break for a while and become the teacher’s assistant as we teach the books to Tirzah.
    • Year 1 (Pre-K): Apples, Butterflies, Cats
    • Year 2 (Kindergarten): Dogs, Edgewood, Farming
    • Year 3 (1st Grade): Goldfish, Honey, Ice Cream, Jelly Beans
    • Year 4 (2nd Grade): Kidneys, Liver, Mineshaft

    The curriculum goes all the way through high school and college math courses (trigonometry, calculus, statistics), so we have many years of Fred ahead of us.

  • Math-U-See – This math curriculum is structured around visual and tactile learning with colorful plastic blocks. These are great starting around age three, when you can use them to practice counting and start teaching place value. When we started the Primer at age four, I thought that Math-U-See would be our primary math curriculum, and Fred would be helpful to reinforce the concepts we were learning. I had it exactly backwards. The manipulative blocks are excellent tools for helping visualize math concepts, but the workbooks and tests mostly serve to reinforce the concepts we’re learning in Fred. The drill is still helpful, and we usually have Titus do at least a few drill pages from each chapter. But going into first grade, I’m going to focus more effort on finding pages that specifically reinforce the concepts we’re learning in Fred, instead of going through the book linearly.


Before first grade, science is mostly unstructured. Kids are naturally curious about the world, and it’s easy to guide that curiosity into experimentation and introduce the scientific method. In 2012, a couple of early tropical storms left us with an abundance of standing water, yielding a bumper crop of tree frogs. From the thousands hopping around our front yard, we adopted a handful, raising and feeding them for the next year (we learned a lot more about fruit flies than frogs). We also participated in science fairs with homeschool groups to get some formal practice with the scientific method and with public presentations. You can find books of ready-made experiments, but it’s far more interesting to come up with your own and actually learn something.

Beyond experimentation, good books can go a long way to open up the vast world that science covers. The Basher Science series contains a variety of books that introduce topics in bite-sized, anthropomorphized chunks perfect for preschool children. You’ll also find a number of books on any topic imaginable at your library.

With first grade, we’re starting a formal science curriculum. One of my biggest frustrations with most science curricula I’ve looked at is the lack of any science. They’ll teach you facts about the world (or some biased view of them), but that’s memorization, not science. I want to teach the method; they’ll learn plenty of facts on their own. I was delighted to discover the Real Science 4 Kids curriculum at the FPEA convention. It covers chemistry, biology, physics, astronomy, and geology, starting all of them in first grade. Every chapter is built around experimentation, so you’re not just reading and memorizing.

Computer Programming

We’re still not sure exactly how to go about teaching programming. At the moment, we’re really excited to see all the resources coming to Code.org, including their upcoming K-5 curriculum.


Teaching history to preschoolers is, to a large extent, an exercise in futility. They don’t really have a concept of time on any scale larger than “last week”. But this is a good age to go and explore that history in person. We’re blessed to live an hour away from the oldest city in the United States. We’ve made a number of visits to Castillo de San Marcos and toured the many museums in the area, learning about life there over the last half millennium.

We’re now about to start the Story of the World. It’s a four-year world history curriculum that presents the events of history intermingled with short stories to help children visualize and remember those events. It also comes with an outstanding teacher’s guide/activity book, full of related projects, recommendations for more books, and generally helpful resources.

History, of course, intermingles with all our other subjects. A wonderful tool we’ve learned about is the Book of Centuries. It’s a simple book that is empty other than dates in the margins, with a spread for each century. Over time, the book begins to show the relationships among the people, places, and events you mark in it. We couldn’t find one we liked, so Stephanie put together her own Book of Centuries through Lulu.

Music and Art Appreciation

Stephanie is an artist. I’m a classically-trained musician. As much as we want to (and do) encourage our children to learn those same skills, it’s far more important for them to learn to appreciate and understand the beauty and history of the cultural heritage available to us.

The first step is to surround them with that beauty. There is almost always music playing in our house (usually Classical South Florida or a curated Pandora station). We have paintings on our walls. We take every opportunity we can to attend concerts. This year we’ll also start explicitly including music and art studies into our school curriculum.

Each month, we’ll pick a different composer. We’ll learn a little bit about their lives and times, but then focus on listening to their music. We’ll have a composition of the week that we will listen to every day as part of our bedtime routine, and finish up the week talking about what we heard. We get a chance to internalize each composer’s style, and find their place in the Book of Centuries to gain an understanding of the evolution of music over time.

We’ll follow a similar process with artists: study one artist for an extended period, looking at a new piece of art each week. Then we’ll talk about what we saw, learning to appreciate it, but also to analyze and criticize it. Finally, we’ll have an opportunity to create our own art in a similar style.


The government thinks it can do a better job raising my kids than I can, and likely has different goals for their education. There are plenty of undiplomatic things I could (but won’t) say here. Instead, a few anecdotes:

  • I attended public schools for nineteen years, from kindergarten through a master’s degree. I took a lot of history classes. I never finished a history book. U.S. history classes rarely made it past the Civil War (including the ones labeled “1865-Present”). When we did make it into the 20th century, it was usually part of a rushed overview of the 20th century on the last day of class.
  • When I finished kindergarten, my school tried to make me repeat it for another year. My crime: I couldn’t skip. Read that again: I couldn’t skip. You know, hopping on alternate feet. Sure, I was reading at a 2nd-grade level already (as the rest of my class was learning the names of the letters). I had outstanding marks in every subject. But I couldn’t skip. Clearly that’s the most important thing you need to succeed in public schools: physical agility.
  • Stephanie was recently speaking to another mother, whose son was in kindergarten. He had one or two hours of homework every night, on top of the four hours he spent in the classroom every day. If you scroll up, you can re-read where I wrote, “School takes maybe 15 minutes a day at the beginning, and gradually expands to a couple of hours each day now that we’re starting first grade.” She was already homeschooling her son, but to meet somebody else’s goals, somebody else’s curriculum, somebody else’s schedule.

Sometimes the government can be forceful in asserting its supposed claim on your children. To keep matters simple, we follow a few steps:

  • Move to a location where the laws are conducive to homeschooling. Florida requires an annual standardized test or portfolio review, but otherwise is fairly unoppressive. Our county, in particular, is quite welcoming.
  • Avoid interacting with the government. Its goals are not your goals. If you go to the government for advice, it will give the wrong advice, and it will often come with legal entanglements.
  • Know the law, know your rights, and always follow the letter of the law.
  • Join the HSLDA. They do important work fighting the legal battles to keep you free.

Obviously, homeschooling is a large investment in time (but possibly not as much as you might expect). It comes with so many benefits, though, that it’s well worth the cost. We start by giving our children the highest quality education possible. As freelance workers ourselves, homeschooling is a perfect lifestyle fit, giving our kids the same flexibility we have to work when and where we choose. Add to that the opportunity to spend quality time with them, helping them discover the world and discover themselves while we nurture our relationship with them. It keeps us busy, but who isn’t busy? We’re doing exactly what we want to be doing, and we’re loving it.

by Jonathan Brinley at August 19, 2014 03:05 AM

August 18, 2014

Bibliographic Wilderness

Google Scholar Alerts notifies me of a citation to me

So I still vainly subscribe to Google Scholar Alerts results on my name, although the service doesn’t work too well today. 

Today (after returning from summer vacation), I found an alert in my inbox to Googlization of Libraries, edited by William Miller and Rita Pellen. 

Except, oddly, the Google Books version wasn’t searchable, so I couldn’t find where my name was mentioned. (But clearly Google has, or had, the text at some point to generate the alert for me!)  

But the Amazon copy was searchable. Amazon doesn’t let you copy and paste from books, so I’ll retype the passage. 

Of course, some aspects of this comparison do not fit. For example, it is unlikely that the existence of Google Scholar is going to “dumb down” research (It might, however, make possible the distribution of less reputable research, unfinished manuscripts, etc. Scholars like Jonathan Rochkind have explored this concept. [32]).

From “Standing on the Shoulders of Libraries” by Charlie Potter, in Googlization of Libraries, edited by William Miller and Rita Pellen, Routledge, 2009, page 18. 

I don’t actually recall exploring that concept.  Let’s see what the cite is…  doh, the page of citations for that chapter isn’t included in the Amazon preview. Let’s see Google… afraid not, Google wouldn’t show me the page either. 

I wonder how many scholars are doing research like this, using the freely available previews from Google/Amazon and giving up when they run up against the wall.  

Maybe I’ll ILL the book; Amazon search says I’m cited a few more times in other chapters, although it won’t show them to me. 

Filed under: General

by jrochkind at August 18, 2014 01:40 PM

August 17, 2014

Universal Decimal Classification

IFLA General Conference 2014

If you happen to be in Lyon, please join us for our regular UDC Update session at the 2014 IFLA General Conference and Assembly. Members of the UDC Editorial Team will give a brief presentation on current developments in the UDC, ongoing translation projects, and forthcoming publications and events.

Monday, 18 August, 12.45-13.45
Room St. Clair 3b

The UDC Update Newsletter is available.

by Aida Slavic at August 17, 2014 10:41 PM

August 16, 2014

First Thus

ACAT Friday OT : Science fiction fans, I could use some help!

Posting to Autocat

On 8/15/2014 9:32 PM, Blodget, Emily@CSL wrote:

Two original Star Trek episodes spring to mind, because I watched them recently. In “All Our Yesterdays,” the planet Sarpeidon is going to be destroyed by a supernova, but when the Enterprise arrives, there’s no one who needs saving–because everyone already went to the library, picked a favorite historical period from the collection, and hopped back in time. The extraordinarily dedicated head librarian is the only one still there, putting off saving himself until the very last minute in case he needs to help any stragglers. (His trick of making lesser-skilled doubles of himself is also a novel solution to a staff shortage!) Preserving the past saved them from a future apocalypse–albeit in a way that no doubt produces time-paradox headaches for anyone who thinks about it too hard.

“All Our Yesterdays” was written by Jean Aroeste, a librarian who was later at Princeton University and a good friend of mine; I’m sure that Mark remembers her too. I see she actually made it into Wikipedia! That’s good.

When I asked her about it, she told me what it was like to write a TV script and how strange she found it. Then she said, “That really was another life.”

She was a good friend of Seymour Lubetzky also. I have always been a “Trekkie” and this episode has been one of my favorites, so it was a real thrill for me when I found out that Jean was the writer!


by James Weinheimer at August 16, 2014 03:49 PM