Date

2023-09-27
Attendees

  • Melanie Cofield (facilitating)
  • Benn Chang (presenting)
  • Paloma Graciani Picardo (notes)
  • Elliot Williams
  • Katie Pierce Meyer
  • Nancy Sparrow
  • Yogita Sharma

Recording

Agenda

Discussion items

Review discussion and action items from last meeting

Melanie
  • Melanie emailed Xiaoli Li and Laura Akerman of the IGELU/ELUNA LOD Community of Practice/Working Group and invited them to join the August or September meeting. Neither responded. Perhaps Melanie and Paloma can do outreach and interview folks to gather information about use cases and methods for working with linked data via Alma/Primo.
  • Benn will schedule Alma CloudApp workshop time in the coming months, depending on the outcome of the above action item. Benn is prepared to share/lead some AlmaRefine workshop time today.
  • Paloma will experiment with Alma Refine in August. Paloma has some findings to share and questions for discussion.
  • Melanie will add the state of LD functionality in Alma/Primo, a summary of COP activities, and roadmap projections to the draft linked data report. Still pending.
  • Group members will add notes about group accomplishments thus far and their vision for the future to the draft linked data report. Paloma has added content and can give an update.

AlmaRefine demo and discussion

Benn, Paloma
  • Goal – enrich sets in Alma with Wikidata URIs. Release of the Wikidata Primo implementation is now expected at the beginning of 2024.

DEMO AND OBSERVATIONS (BENN)

  • Alma Refine documentation on the Ex Libris GitHub - https://github.com/ExLibrisGroup/alma-refine/wiki/Help
  • Loaded a demo set of architecture monographs
  • Preview option (for context). Sometimes you don't get a preview (why?)
  • By default, the Wikidata service only targets the 100 field
  • Changes can be immediately reviewed in the Metadata Editor
  • Settings
    • User based, not system wide. Persistent across sessions, but will be lost on Alma sandbox refresh unless also configured in production.
    • It is good that they are user based, but this might be a problem for consistent practices
    • You can configure the system to target other MARC fields. This is available after you select a reconciliation service, since you might want different target fields depending on the source dataset
  • Correct term option – might need more exploration
  • You can use Alma Refine with sets, search results, and with single records opened in the Metadata Editor
  • Large datasets can take a lot of time to reconcile

OBSERVATIONS (PALOMA)

  • Main concern about settings has been resolved by Benn's demo. Happy to see that you can add target fields
  • Seems like it also allows you to choose the target subfield for the URI ($1 or $0) which presents opportunity for error if folks are not aware of whether the target dataset is RWO or not. The default settings per reconciliation service seem to be as we would want them.
  • Was impossible to do the testing without constantly comparing with OpenRefine
  • The screen only displays up to 25 items per page, which is an issue for big datasets
  • Reconciliation result has to be manually selected one by one
  • Doesn't seem to allow selection of more than two data points for the reconciliation, and does not show confidence levels the way that OR does
  • Something else that OR does and AR does not is letting you filter labels that have something in common (e.g., that have dates) and then look at just those. No way to select items based on label characteristics
  • The URI is not linkable on the Record view, while the value of the 856 $u is (not sure if that matters in any way, but I thought it was curious)
  • How do we leverage the power of OpenRefine to get a dataset enriched with URIs, and then use AlmaRefine to import the URIs into Alma? Should we try to develop a local reconciliation service, something like what OR does with ReconcileCSV?
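The ReconcileCSV-style idea above could start as a small matching function over our existing names-and-URIs data. A minimal sketch, assuming a hypothetical CSV of creator names paired with Wikidata URIs (the sample rows are illustrative; Python's stdlib `difflib` stands in for whatever fuzzy matcher a real service would use):

```python
import csv
import difflib
import io

# Hypothetical sample of the kind of CSV we already have:
# EAD creator names paired with known Wikidata URIs.
SAMPLE_CSV = """name,wikidata_uri
"Hemingway, Ernest, 1899-1961",http://www.wikidata.org/entity/Q23434
"Woolf, Virginia, 1882-1941",http://www.wikidata.org/entity/Q40909
"""

def load_lookup(csv_text):
    """Read (name, URI) pairs into a dict keyed by name."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return {row["name"]: row["wikidata_uri"] for row in reader}

def reconcile(query, lookup, cutoff=0.8):
    """Return (uri, score) for the best fuzzy match above cutoff, else None."""
    matches = difflib.get_close_matches(query, list(lookup), n=1, cutoff=cutoff)
    if not matches:
        return None
    best = matches[0]
    score = difflib.SequenceMatcher(None, query, best).ratio()
    return lookup[best], round(score, 2)

lookup = load_lookup(SAMPLE_CSV)
# A 100-field name without dates still matches the dated authority form.
print(reconcile("Hemingway, Ernest", lookup, cutoff=0.6))
```

The cutoff would need tuning against real data: too low and undated names auto-match the wrong person, too high and legitimate variants are missed.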

NEXT STEPS

  • Each institution creates a set of bib records that they want to enhance with Wikidata items
  • HRC is going to create the set based on the archival collections. It will require a lot of cleanup, but this has been a pending project for a while. The goal is to cross-reference the 100 field with the EAD creators for which we already have a Wikidata ID. We already have a dataset of names and Wikidata URIs; we just need to put it into the system. Paloma would like to work on a local reconciliation service to use with AlmaRefine
  • Architecture also wants to focus on the archival collections. Katie would like to set up a working meeting before the next meeting to do some testing as a group
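If we pursue a local service for use with Alma Refine, it would need to speak the OpenRefine Reconciliation Service API: a plain GET on the service URL returns a small manifest, and clients send batches of queries as a JSON `queries` parameter. A minimal sketch of the payload handling, assuming a hypothetical in-memory name-to-URI lookup (the HTTP wiring, in any small web framework, is omitted):

```python
import json

# Hypothetical lookup; in practice, loaded from the institution's
# names-and-URIs CSV. Q23434 is Ernest Hemingway on Wikidata.
LOOKUP = {
    "hemingway, ernest, 1899-1961": "http://www.wikidata.org/entity/Q23434",
}

# Manifest returned for a plain GET on the service URL, so the client
# can register the service (per the Reconciliation Service API).
MANIFEST = {
    "name": "Local CSV reconciliation (sketch)",
    "identifierSpace": "http://www.wikidata.org/entity/",
    "schemaSpace": "http://www.wikidata.org/prop/direct/",
}

def handle_queries(queries_json):
    """Answer a reconciliation batch.

    `queries_json` is the JSON string a client POSTs as the `queries`
    form parameter, e.g. {"q0": {"query": "Hemingway, Ernest, 1899-1961"}}.
    The response maps each query key to a list of candidate matches.
    """
    queries = json.loads(queries_json)
    response = {}
    for key, q in queries.items():
        text = q.get("query", "").strip().casefold()
        uri = LOOKUP.get(text)
        results = []
        if uri is not None:
            results.append({
                "id": uri,
                "name": q["query"],
                "score": 100,
                "match": True,  # exact hit: safe to auto-match
            })
        response[key] = {"result": results}
    return response

print(json.dumps(handle_queries(
    '{"q0": {"query": "Hemingway, Ernest, 1899-1961"}}')))
```

Exact matching is only a placeholder here; a usable service would combine this response shape with fuzzy matching and return multiple scored candidates per query.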

SIDE NOTE

  • UTL is starting to think about implementing more efficient workflows for archival collections discovery by harvesting the metadata directly from TARO if feasible.

Summary of IGELU/ELUNA LOD COP WG report on URIs in ExLibris Products (Alma) from May 2022

Melanie, all
  • Useful to understand how this user community is trying to drive development of these features with the vendor
  • It gets down in the weeds and does a good job at articulating the technical details
  • Benn has switched on "Don't do partial matching on subject headings" in the Alma enrichment configuration (yay!)
    • New feature from August 2023 release
    • Part of the authority control functionalities in Alma, which will require further exploring
Alma August 2023 Linked Data feature release

Melanie, all
  • Linked Open Data Enrichment for EuroVoc Authorities

    • UT Libraries and partners don't use EuroVoc, so not relevant for this group
  • New Search Indexes for URI Cataloged in $$1

    • Paloma has tested this; exciting. It brings new possibilities for more specific ID-based searches
  • ORCID URI Enrichment for Bib Records Using the Alma Refine Cloud App

    • Benn has tested this without much luck - it hangs on reconciliation and seems buggy.
  • Only Generate URIs for Authority Records That Fully Match the Subject Heading

    • This is a new feature in response to the issue raised in the URIs report
    • Benn has already switched the Alma configuration to avoid partial matching

NOTE

  • "Linked data enrichment" process in the context of Alma is set up through a profile for data export. This means that the URIs don't get added to the bib records in Alma, but to the dataset when this gets exported out of Alma

Looking ahead to 2023 wrap-up/priorities, and 2024 visioning

Melanie, all

Next meeting

Melanie, all

  • Show and tell of the datasets, testing and use cases
  • Review of the report draft
  • Start talking about a potential Wikidata event for 2024?

Action items

  • Everyone adds to the report
  • Review report draft in the next meeting
  • Everyone identifies specific sets of records to play with in Alma Refine, and prepares use case to share at our next meeting
  • Start thinking about potential local reconciliation service to be used with Alma Refine