Building upon the material covered in LEX2: Mastering ELEXIS Corpus Tools for Lexicographic Purposes and Lexonomy: Mastering the ELEXIS Dictionary Writing System, this course will focus specifically on the changes in dictionary production after 2000 and the increasing importance of automation and post-editing in lexicography.
Lexicography is the art and craft of making dictionaries.
- This course introduces the legacy dictionary viewer Publex, a generic, modular dictionary publication tool for retrodigitized dictionaries.
- In this lecture from the Austrian Centre for Digital Humanities and Cultural Heritage (ACDH-CH), Professor Crane discusses the need for a global philology. Combining classical philology and computer science, Crane aims to apply computer-based methods to the study of human cultural development. He discusses the necessity of project-oriented research, reusable code, and the infrastructures that support them.
- This learning resource provides software developers and computational linguists with an overview of the typical computational processing tasks and software tools in the lexicographic workflow. The resource introduces the most widely used custom-developed tools for corpus-based lexicography as well as their functionality.
- The course builds upon Extracting Lexical Data: XPath for Dictionary Nerds and introduces the basics of XSL Transformations (XSLT), a standard language for transforming XML documents.
- This course will help users learn how to use oXygen XML, a versatile, professional-grade XML editor, to edit, validate, query and transform lexicographic data.
- This course will introduce the concept and the ELEXIS implementation of the dictionary matrix, a universal repository of linked senses, and other types of lexical information found in existing lexicographic resources.
- This course explores the principles of open access, open data, the FAIR principles and open science as they apply to lexicography, including the specific challenges posed by intellectual property rights and copyright issues in the context of lexicographic work.
- This course describes the OntoLex-Lemon model, a recent standard for the representation of lexical information on the Web as linked data.
- The course will focus on modeling dictionaries using TEI Lex-0, a subset of the community standard TEI (Text Encoding Initiative).
- This course focuses on the importance of standards to facilitate the cooperation among lexicographers in a multilingual and multicultural context.
- This course will present an overview of resources available from CLARIN that may be useful for the lexicographer, covering not only lexical datasets but also textual resources such as corpora, as well as tools.