Recently a video on linked data was posted, and just the other day a blog post appeared describing linked data as the “new Dublin Core”.
The video by Sandro Hawke is called “An Introduction to Linked Data”. There are also slides to accompany the video at: files.meetup.com/1336198/LinkedDataPresentation-SandroHawke.pdf. Sandro’s talk provides an easy-to-follow introduction not only to the context of linked data but also to what linked data is and how to make linked data happen. For those really interested in pursuing this topic further, take a look at the linked data website at: http://linkeddata.org/ or listen to some of Tim Berners-Lee’s talks.
What’s interesting is that in this discussion, Jeffrey Bealle asks whether linked data will be the new Dublin Core. He first offers a criticism of linked data and then a critical view of Dublin Core. As for the second, criticizing Dublin Core is nothing new, although Jeffrey provides some examples of why using Dublin Core makes searching more difficult. The first point is trickier. I liked hearing a different point of view, and especially a cautionary tale amid the very enthusiastic pushes for linked data.

I was also intrigued to read Jeffrey’s description of linked data, particularly after having viewed Sandro’s video. As a result, I became more aware of information sources and their reliability. This is because Jeffrey combined an explanation and critique of a metadata schema with one of data formats, which are two different, albeit related, things. This was confusing, and in the end I wasn’t entirely sure whether the linked data Jeffrey was talking about was the same as what Sandro described in his video. That aside, Jeffrey’s post begins a needed discussion of just how “live URIs”, to use Sandro’s term, can or even should be used in library metadata now. If you have time, Jeffrey’s post is not long. Have a read and then view Sandro’s video, or vice versa.
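To make the schema-versus-format distinction a little more concrete, here is a minimal sketch of how Dublin Core (the schema) can be expressed as linked-data-style RDF triples (the format), using plain Python tuples. The `example.org` record and person URIs are purely hypothetical illustrations; only the Dublin Core element namespace URI is real.

```python
# Dublin Core is a vocabulary (a metadata schema); RDF triples are one
# data format that vocabulary can be expressed in. The two are related
# but distinct, which is the distinction discussed above.

DC = "http://purl.org/dc/elements/1.1/"  # real Dublin Core element namespace

# Each triple is (subject, predicate, object). Subjects and predicates
# are URIs; objects may be URIs (links out to other data) or literals.
# The example.org URIs below are made up for illustration.
triples = [
    ("http://example.org/record/42", DC + "title",   "An Introduction to Linked Data"),
    ("http://example.org/record/42", DC + "creator", "http://example.org/person/sandro-hawke"),
    ("http://example.org/record/42", DC + "type",    "MovingImage"),
]

def objects_for(predicate_uri, data):
    """Return every object asserted for a given predicate URI."""
    return [o for (s, p, o) in data if p == predicate_uri]

print(objects_for(DC + "title", triples))
```

The point of the “live URI” idea is visible in the `creator` triple: instead of a plain text string, the value is itself a URI that could be dereferenced for more data about the person, which is what makes the data “linked”.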