Category Archives: linked data

Digital BBC World Service radio archives

The BBC World Service Archive Prototype is a website that provides access to the BBC World Service's huge digital archive of radio programs. In a concise article (PDF, 8 pages), Yves Raimond and Tristan Ferne describe how Semantic Web technologies, automation and crowdsourcing are used to annotate the programs, correct and add metadata, and improve search and navigation. Ed Summers has written a blog post about the project, making a comment I wholeheartedly agree with: “… [I]t is the (implied) role of the archivist, as the professional responsible for working with developers to tune these algorithms, evaluating/gauging user contributions, and helping describe the content themselves that excites me the most about this work.” I see this not only as a possible future role for archivists but also for librarians, especially catalogers and metadata specialists working with digital collections.
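To give a rough idea of what such an annotation could look like as linked data, here is a minimal Python sketch using rdflib. The programme URI and the DBpedia topic are placeholders I made up, and the combination of the Programmes Ontology with a DBpedia resource as a tag is my assumption about the general approach, not a description of the prototype's actual data model.

from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import DC, RDF

# Assumed namespaces and identifiers, for illustration only
PO = Namespace("http://purl.org/ontology/po/")            # BBC Programmes Ontology
prog = URIRef("http://example.org/programmes/p0000001")   # placeholder programme URI

g = Graph()
g.bind("po", PO)
g.bind("dc", DC)

# Describe the resource as a programme and tag it with a topic,
# as an automated tagger or a crowdsourced correction might do.
g.add((prog, RDF.type, PO.Programme))
g.add((prog, DC.title, Literal("Example World Service programme")))
g.add((prog, DC.subject, URIRef("http://dbpedia.org/resource/Nelson_Mandela")))

print(g.serialize(format="turtle"))

The point of the sketch is simply that a tag expressed as a URI (rather than a free-text keyword) can be corrected, disambiguated and reused across the archive, which is what makes the combination of automation and crowdsourcing workable.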


Wikidata

Could this become the linked data killer app? Wikimedia Deutschland has kicked off Wikidata, a project that aims to better centralize, structure and type the vast amounts of data in Wikipedia. Information will be extracted from the infoboxes and stored in one central database. Moreover, Wikidata will store “meta-metadata” about who said what when, thus adding an important dimension to DBpedia. The identity management aspect appeals to me most: all language versions of Wikipedia will link to one central point for the same entity (thus reducing redundancy), and a single, language-independent URI will be coined for it. See Daniel Kinzler’s recent presentation (PDF) at SWIB 2012 and this detailed article to learn more about the project (funded by Google, among others), which might demonstrate the concrete usefulness of linked data.
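To make the identity management point a bit more tangible, here is a small Python sketch (standard library only) that asks the Wikidata API for the labels of one item in several languages. The item Q42 and the chosen languages are just well-known examples I picked; the sketch only illustrates how one language-independent URI collects labels that each language edition of Wikipedia can point to.

import json
from urllib.parse import urlencode
from urllib.request import urlopen

# One language-independent identifier (Q42 is used here purely as a
# familiar example item).
item_id = "Q42"
entity_uri = f"http://www.wikidata.org/entity/{item_id}"

params = urlencode({
    "action": "wbgetentities",
    "ids": item_id,
    "props": "labels",
    "languages": "en|de|fr|ja",
    "format": "json",
})
with urlopen(f"https://www.wikidata.org/w/api.php?{params}") as resp:
    data = json.load(resp)

# The same entity URI, with one label per requested language
labels = data["entities"][item_id]["labels"]
print(entity_uri)
for lang, label in sorted(labels.items()):
    print(f"  {lang}: {label['value']}")

Each Wikipedia language version can link its article to this one item instead of maintaining its own web of interlanguage links, which is exactly the redundancy reduction described above.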