EHRI

The mission of the European Holocaust Research Infrastructure (EHRI) is to enable transnational research, commemoration and education in Holocaust studies, and to address the wide dispersal of sources and expertise across many institutions.

Posts

  • The Learning Curve in Sharing Data with the EHRI Project

    EN
A partnership between Kazerne Dossin and EHRI was established to enable the sharing of metadata with a broader audience. This partnership resulted in changes to how archival materials are catalogued at Kazerne Dossin. Using the example of the Lewkowicz family collection, this article focuses on the transformation Kazerne Dossin underwent while standardising descriptions, and on the tools EHRI provided to optimise the workflow for collection-holding institutions.
    Authors
    • Dorien Styven
    • Marius Caragea
    • Veerle Vanden Daelen
  • Using Spatial Data in Tableau

    EN
Tableau is a powerful digital tool for analysing, mapping and interrogating data. In this short guide we focus on an aspect of map-based data analysis that has particular application for Holocaust and refugee studies.
  • Entity Matching

    EN
EHRI (European Holocaust Research Infrastructure) supports the use of digital tools that can assist research into Holocaust- and refugee-related topics. In a continued effort to make these tools accessible to researchers with no prior experience of digital methods, and to encourage them to try new ways of using their data, this GitHub-based lesson showcases the use of entity-matching tools for working with geographic data.
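    To give a flavour of the approach, the minimal Python sketch below matches variant place-name spellings against a small gazetteer using the standard library's difflib; the place names and cutoff value are illustrative assumptions, not material from the lesson itself.

      from difflib import get_close_matches

      # Illustrative gazetteer of authority-file place names (assumed, not from the lesson)
      gazetteer = ["Kraków", "Warszawa", "Bratislava", "Košice", "Theresienstadt"]

      # Variant spellings as they might appear in archival descriptions
      variants = ["Krakau", "Warschau", "Kaschau"]

      for name in variants:
          # Report the closest gazetteer entry, if any clears the similarity cutoff
          matches = get_close_matches(name, gazetteer, n=1, cutoff=0.5)
          print(name, "->", matches[0] if matches else "no match")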
  • What Can I Do With This Messy Spreadsheet? Converting from Excel Sheets to Fully Compliant EAD-XML files

    EN
Many Galleries, Libraries, Archives, and Museums (GLAMs) face difficulties sharing their collections metadata in standardised and sustainable ways, meaning that staff rely on more familiar general-purpose office programs such as spreadsheets. However, while these tools offer a simple approach to data registration and digitisation, they don’t allow for more advanced uses. This blog post from EHRI explains a procedure for producing EAD (Encoded Archival Description) files from an Excel spreadsheet using OpenRefine.
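    The post's workflow is built around OpenRefine's templating export; purely to illustrate the underlying idea rather than the post's method, the Python sketch below turns spreadsheet rows into a bare-bones EAD fragment with openpyxl and ElementTree. The file name, column layout and minimal element set are assumptions, not schema-compliant output.

      import xml.etree.ElementTree as ET
      from openpyxl import load_workbook

      # Assumed layout: one item per row, columns = (identifier, title, date), header in row 1
      sheet = load_workbook("collection.xlsx").active

      ead = ET.Element("ead")
      dsc = ET.SubElement(ET.SubElement(ead, "archdesc", level="collection"), "dsc")

      for unitid, unittitle, unitdate in sheet.iter_rows(min_row=2, values_only=True):
          did = ET.SubElement(ET.SubElement(dsc, "c", level="file"), "did")
          ET.SubElement(did, "unitid").text = str(unitid)
          ET.SubElement(did, "unittitle").text = str(unittitle)
          ET.SubElement(did, "unitdate").text = str(unitdate)

      ET.ElementTree(ead).write("collection_ead.xml", encoding="utf-8", xml_declaration=True)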
  • Using Named Entity Recognition to Enhance Access to a Museum Catalog

    EN
This blog post discusses the applicability of services such as automatic metadata generation and semantic annotation for extracting person names and locations from large datasets. This is demonstrated using oral history transcripts provided by the United States Holocaust Memorial Museum (USHMM).
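    As a rough illustration of the technique rather than the specific services discussed in the post, the Python sketch below pulls person and place names out of a transcript with spaCy; the model name and input file are assumptions.

      import spacy

      # Assumes the small English model is installed: python -m spacy download en_core_web_sm
      nlp = spacy.load("en_core_web_sm")

      with open("transcript.txt", encoding="utf-8") as f:
          doc = nlp(f.read())

      # Collect the person and location entities the model finds in the transcript
      people = sorted({ent.text for ent in doc.ents if ent.label_ == "PERSON"})
      places = sorted({ent.text for ent in doc.ents if ent.label_ in ("GPE", "LOC")})

      print("Persons:", people)
      print("Places:", places)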
  • Spatial Queries and the First Deportations from Slovakia

    EN
In the late 1930s, just before war broke out in Europe, a series of chaotic deportations took place, expelling thousands of Jews from what is now Slovakia. As part of his research, Michal Frankl investigates the backgrounds of the deported people and the trajectory of the journey they were taken on. This practical blog post describes the tools and processes of analysis, and shows how a spatially enabled database can be made useful for answering similar questions in the humanities, and in Holocaust studies in particular.
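    The research described here relies on a spatially enabled database; as a loose illustration of the kind of query involved, the Python sketch below uses geopandas to keep only the point records that fall inside a boundary polygon. The file names and data layout are assumptions, not the project's actual data.

      import geopandas as gpd

      # Assumed inputs: point records of deportees and a boundary polygon in the same CRS
      deportees = gpd.read_file("deportees.geojson")
      boundary = gpd.read_file("boundary_1938.geojson")

      # A spatial join keeps only the points that lie within the boundary polygon
      inside = gpd.sjoin(deportees, boundary, predicate="within")
      print(len(inside), "records fall inside the boundary")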
  • Windows Subsystem for Linux (WSL)

    EN
Many tools and examples of interest to those wishing to explore, experiment and develop projects for digital humanities, data analysis and other tasks are built for a Linux operating system. Apple laptops running macOS can run such tools fairly easily. However, until recently, Windows users have had difficulty accessing programs and techniques that require a Linux operating system. This short tutorial demonstrates a simple way for most Windows 10 users to run Linux programs and systems through the Windows Subsystem for Linux (WSL).
  • Extracting CSV Data from the EHRI Search API

    EN
The EHRI (European Holocaust Research Infrastructure) Search API provides a way to retrieve information about items in the EHRI portal in JSON (JavaScript Object Notation) format by making HTTP requests to particular URLs. This short tutorial shows learners how to use a command-line tool (cURL) to fetch structured data and transform it into CSV (comma-separated values) format for import into a spreadsheet program such as Excel or Google Sheets.
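    The tutorial itself works with cURL on the command line; the Python sketch below performs an equivalent fetch-and-flatten step with requests and the csv module. The endpoint path, query parameters and response fields shown are assumptions about the API, so check them against the tutorial and the portal documentation.

      import csv
      import requests

      # Assumed search endpoint and parameters; see the EHRI portal API documentation
      response = requests.get(
          "https://portal.ehri-project.eu/api/v1/search",
          params={"q": "Auschwitz", "type": "DocumentaryUnit"},
      )
      response.raise_for_status()
      items = response.json().get("data", [])

      # Write a few assumed fields per item into a spreadsheet-friendly CSV file
      with open("results.csv", "w", newline="", encoding="utf-8") as f:
          writer = csv.writer(f)
          writer.writerow(["id", "type", "name"])
          for item in items:
              attributes = item.get("attributes", {})
              writer.writerow([item.get("id"), item.get("type"), attributes.get("name")])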
  • Importing tables from websites into spreadsheets

    EN
Sometimes it can be useful to take information from a website, such as document lists from archives, for future reference. This short resource shows the user how to install a browser extension that copies tables from websites, and then how to import those tables into a spreadsheet program.
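    The resource itself covers a browser extension; for readers who prefer a script, the hedged Python sketch below does a similar job with pandas.read_html (which needs lxml or html5lib installed). The URL and output file are placeholders.

      import pandas as pd

      # read_html returns a list of DataFrames, one per <table> element found on the page
      tables = pd.read_html("https://example.org/archive/document-list.html")

      # Save the first table as CSV for import into any spreadsheet program
      tables[0].to_csv("document_list.csv", index=False)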
  • Using OpenCV for Face Detection

    EN
OpenCV is a very popular, free and open-source software library used for a wide variety of computer vision applications. This article is intended to help you get started experimenting with OpenCV, using face detection in images as a case study.
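    To give a sense of the code involved, the Python sketch below runs OpenCV's bundled Haar-cascade face detector over a single image; the file names are placeholders and the detector parameters are common defaults rather than values from the article.

      import cv2

      # Load an image and convert it to greyscale for the detector
      image = cv2.imread("photo.jpg")
      gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

      # Frontal-face Haar cascade shipped with the opencv-python package
      cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
      faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

      # Draw a rectangle around each detected face and save the result
      for (x, y, w, h) in faces:
          cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)
      cv2.imwrite("photo_faces.jpg", image)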
  • quod: A Tool for Querying and Organising Digitised Historical Documents

    EN
This blog post from EHRI introduces 'quod' (querying OCRed documents), a prototype Python-based command-line tool for OCRing and querying digitised historical documents, which can be used to organise large collections and improve information about provenance. To demonstrate its use in context, the post takes the reader through a case study of the International Tracing Service, showing workflows and the steps taken from start to finish.
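    quod is the post's own prototype; as a much-reduced illustration of the OCR-then-query idea rather than the tool itself, the Python sketch below OCRs a folder of scans with pytesseract (which requires a local Tesseract installation) and reports which files mention a search term. The folder, file pattern and term are assumptions.

      from pathlib import Path

      from PIL import Image
      import pytesseract

      SEARCH_TERM = "Arolsen"  # placeholder query term

      # OCR each scanned page and report the files whose recognised text contains the term
      for scan in sorted(Path("scans").glob("*.png")):
          text = pytesseract.image_to_string(Image.open(scan))
          if SEARCH_TERM.lower() in text.lower():
              print(scan)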