The conference examined ways of connecting information science and computer science with the performing arts, focusing on three thematic blocks: archiving, artistic practices, and scholarly research. This international scientific and professional conference is part of a project of the same name by the DARIAH-EU Working Group Theatralia, which researches digital technology in the performing arts and the digitization of theatralia; it is financed from DARIAH-EU funds.
Data modeling refers to the process of creating a machine-readable conceptual representation of objects.
- What are the differences between a data scientist and a corpus linguist? This course provides an overview of different perspectives on language and the different types of tools that can be used for text analytics. It also introduces topic modelling and sentiment analysis as approaches to analysing textual data.
- This workshop, focusing on "Spatial data medieval to modern", is the first in a series of workshops from the NOS-HS project "Linking, Building, and Sustaining Humanities Digital Spatial Infrastructures for Research in the Nordic Countries". The main aims of this workshop were to define key concepts (spatial infrastructures, Linked Open Data, metadata, ontology), to outline major challenges in the field, and to provide an opportunity to share experiences of addressing these issues in individual and national projects across the Nordic countries.
- This course is designed to develop your knowledge of the theory and practice of digitising material culture by producing computer-generated and 3D-printed models.
- Hosted by King’s Digital Lab (KDL) at King’s College London, the workshop introduced participants to best practices in project management, the Agile Dynamic Systems Development Method (DSDM), as well as various theoretical and practical approaches to digital cultural heritage.
- This workshop introduces the basics of conceptual modelling and ontologies.
- This course is an introduction to the theories, practices, and methods of digitizing legacy dictionaries for research, preservation, and online distribution. It focuses on a particular technique of modeling and describing lexical data using the eXtensible Markup Language (XML) in accordance with the Guidelines of the Text Encoding Initiative (TEI), a de facto standard for text encoding among humanities researchers.
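To make the lexicon-based flavour of the sentiment analysis mentioned in the text-analytics course above concrete, here is a minimal sketch. The word lists and scoring rule are invented for illustration and are far simpler than the lexicons used in real tools:

```python
# Toy lexicon-based sentiment scorer: count positive minus negative tokens.
# The word lists below are hypothetical examples, not a real sentiment lexicon.
POSITIVE = {"good", "excellent", "engaging", "clear"}
NEGATIVE = {"bad", "confusing", "dull", "poor"}

def sentiment_score(text: str) -> int:
    """Return (#positive tokens - #negative tokens) for a whitespace-tokenized text."""
    tokens = text.lower().split()
    return sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)

print(sentiment_score("an excellent and engaging lecture"))  # 2 (positive overall)
print(sentiment_score("dull and confusing"))                 # -2 (negative overall)
```

Real-world approaches refine this basic idea with weighted lexicons, negation handling, or machine-learned classifiers, but the underlying intuition is the same.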
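The TEI-based dictionary encoding taught in the last course can be sketched as follows. The entry content is invented for illustration, and real TEI documents declare the TEI namespace and use much richer markup; here Python's standard library is used only to show how such structured lexical data becomes machine-readable:

```python
# A minimal TEI-style dictionary entry parsed with the standard library.
# The entry below is a simplified, hypothetical example, not a real TEI document.
import xml.etree.ElementTree as ET

entry_xml = """
<entry xml:id="theatre">
  <form type="lemma"><orth>theatre</orth></form>
  <gramGrp><pos>noun</pos></gramGrp>
  <sense><def>a building where plays are performed</def></sense>
</entry>
"""

entry = ET.fromstring(entry_xml)
lemma = entry.findtext("./form/orth")   # the headword
pos = entry.findtext("./gramGrp/pos")   # its part of speech
print(lemma, pos)  # theatre noun
```

Because the lemma, grammatical information, and sense are explicitly marked up rather than implied by print layout, the same entry can be queried, indexed, and served online.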