DM at the IMC 2019

Session report

At this year’s International Medieval Congress (IMC), which took place from 1 to 4 July, Digital Medievalist (DM) sponsored two sessions and a round table focusing on “Digital Materiality”. The IMC has become the world’s largest annual conference dedicated to medieval studies. This time “materialities” was chosen as the special thematic focus, a topic that proved interesting to tackle from various perspectives and with regard to a wide range of material objects.

The main building on the campus of Leeds University

The first of the DM sessions, organised by Georg Vogeler (Graz) and chaired by Franz Fischer (Venice), was dedicated to “The Digital Edition and Materiality” (#224). After a brief introduction to the Digital Medievalist community, its focus and its work (such as the “gold standard” open access journal), the session started with a paper by Vera Isabell Schwarz-Ricci (co-authored by Antonella Ambrosio, both Naples) entitled “A Dimorphic Edition of Medieval Charters: The Documents of the Abbey Santa Maria della Grotta (near Benevento)”. In her talk, Schwarz-Ricci presented the hybrid approach taken in their project to produce both a print and an online edition. Aiming at two different outputs and trying to accommodate both in the best possible way requires a sophisticated and integrated workflow. The encoding is based on CEI-XML, a TEI derivative developed especially for charters, and the XML data also serves as the basis for the printed edition. Both outputs serve different needs and have their respective strengths: while a printed edition that conforms to the established standards for editing charters offers usability and stability as well as acceptance in the field, the digital version has its benefits when it comes to availability, data integration and analysis.
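To illustrate the single-source principle behind such a dimorphic edition, here is a minimal sketch in Python; the element names are simplified stand-ins for illustration only, not the actual CEI-XML vocabulary used in the Naples project:

```python
# Minimal sketch of the single-source idea: one XML encoding drives both a
# print-style and a web output. Element names are hypothetical stand-ins,
# not the real CEI-XML schema.
import xml.etree.ElementTree as ET

CHARTER_XML = """
<charter id="smg-001">
  <issuer>Abbey of Santa Maria della Grotta</issuer>
  <date when="1150-05-03">3 May 1150</date>
  <abstract>Grant of land near Benevento.</abstract>
  <tenor>In nomine Domini ...</tenor>
</charter>
"""

def render_print(charter: ET.Element) -> str:
    """Compact, apparatus-like rendering intended for a printed edition."""
    return (f"{charter.get('id')} | {charter.findtext('date')} | "
            f"{charter.findtext('issuer')}\n"
            f"  {charter.findtext('abstract')}\n"
            f"  {charter.findtext('tenor')}")

def render_html(charter: ET.Element) -> str:
    """Simple HTML rendering intended for the online edition."""
    return (f"<article id='{charter.get('id')}'>"
            f"<h2>{charter.findtext('issuer')}, {charter.findtext('date')}</h2>"
            f"<p class='abstract'>{charter.findtext('abstract')}</p>"
            f"<p class='tenor'>{charter.findtext('tenor')}</p></article>")

charter = ET.fromstring(CHARTER_XML)
print(render_print(charter))   # output for the print workflow
print(render_html(charter))    # output for the online edition
```

Both renderings are derived from the same encoded source, which is the point of the dimorphic approach: corrections and additions only ever happen in the XML.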

In the second paper, entitled “Artificial Intelligence, Handwritten Text Recognition (HTR), Distant Reading, and Distant Editing”, Dominique Stutzmann (Paris) provided insights into some recently finished or ongoing projects concerned with developments in the fields of handwritten text recognition, natural language processing (NLP), machine learning, distant reading of manuscripts and script identification. The increasing number of interdisciplinary approaches and projects has also led to the inclusion of computer scientists, opening up opportunities for further research. In recent years, computer-aided approaches have made great progress in these domains: HTR has become more accurate and can now be applied to different scripts and hands. The majority of medieval texts handed down to us in handwritten tradition are still unedited, and given the vast amount of material it may not be possible to edit them by manual work alone in the (near) future. Hence, artificial intelligence can become a game-changer for medievalists’ research. Inspired by the term “distant reading”, coined by Franco Moretti for the quantitative analysis of textual data, Stutzmann suggested “distant editing” as a complementary approach, based on databases and search engines to query the source texts.

The final paper of this session was given by Daniela Schulz (Wuppertal), who focused on the potentials and limitations of modelling material features of medieval manuscripts using the CIDOC Conceptual Reference Model (CRM), an event-centric modelling framework. She started with a brief introduction to the issues connected with the term “materiality” in the domain of textual scholarship: although “materiality” has featured prominently for many years, there is apparently still no commonly accepted definition. To narrow down which material features of a manuscript can be modelled and why it is useful to do so, she referred to Jerome McGann’s notion of “bibliographic codes”. Focusing on one specific manuscript (Cod. Guelf. 97 Weiss.), Schulz demonstrated the application of the CRM and some of its extensions to model the manuscript’s material features in connection with the history of the codex. The suggested approach seems promising, although Schulz also drew attention to the additional effort required for proper modelling and encoding, which makes the approach problematic for editorial projects with limited resources (time, money, etc.).
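For readers unfamiliar with event-centric modelling, the following sketch (written in Python with rdflib) gives a flavour of the approach. It is not Schulz’s actual encoding: the class and property identifiers are approximations that may differ between CRM versions, and the example URIs are invented for illustration.

```python
# Hedged sketch of event-centric modelling in the spirit of CIDOC CRM.
# Identifiers are approximate; URIs under example.org are hypothetical.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, RDFS

CRM = Namespace("http://www.cidoc-crm.org/cidoc-crm/")
EX = Namespace("http://example.org/manuscripts/")   # invented namespace

g = Graph()
g.bind("crm", CRM)
g.bind("ex", EX)

ms = EX["cod-guelf-97-weiss"]                        # the manuscript as object
production = EX["cod-guelf-97-weiss/production"]     # the production event

g.add((ms, RDF.type, CRM["E22_Human-Made_Object"]))
g.add((ms, RDFS.label, Literal("Cod. Guelf. 97 Weiss.")))
g.add((ms, CRM["P45_consists_of"], EX["parchment"]))
g.add((EX["parchment"], RDF.type, CRM["E57_Material"]))

# The event-centric part: the object's history is attached to events,
# not asserted directly on the object.
g.add((production, RDF.type, CRM["E12_Production"]))
g.add((production, CRM["P108_has_produced"], ms))
g.add((production, CRM["P7_took_place_at"], EX["weissenburg-scriptorium"]))

print(g.serialize(format="turtle"))
```

The payoff of this style is that further events (rebinding, damage, acquisition) can be added to the same graph without changing the description of the object itself; the cost is exactly the extra modelling effort Schulz pointed to.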

The second DM session was organised by Roman Bleier and chaired by Sean Winslow (both Graz). It was dedicated to the question “How to Represent Materiality Digitally in Palaeography and Codicology?” (#324). It started with a paper by Peter A. Stokes (Paris) entitled “Towards a Conceptual Reference Model for Palaeography”. Stokes briefly introduced the idea of a conceptual reference model and outlined the necessity of defining what writing is. On closer inspection, the answer to the question of what a grapheme is (commonly defined as the smallest graphic unit that differentiates meaning) is not so straightforward, and it becomes even more problematic when we consider the level of shapes. Since a sign can have multiple functions and can be represented by different shapes, modelling multigraphism can help clarify the fundamental concepts on which palaeographic research is based. Whereas linguistics and palaeography have so far largely neglected the meaning conveyed by the use of different letter shapes, the development of a conceptual model for palaeography seems a promising approach to account for these problems.
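As a purely illustrative aside (not Stokes’s actual model), a minimal Python sketch shows why such a conceptual model needs to keep the levels apart: one grapheme can be realised by several shapes, and collapsing everything to the grapheme level loses the information carried by the shape.

```python
# Illustrative sketch only: grapheme (abstract unit) vs. shape-level variant.
from dataclasses import dataclass

@dataclass(frozen=True)
class Grapheme:
    name: str                  # abstract unit, e.g. "s"

@dataclass(frozen=True)
class Allograph:
    grapheme: Grapheme         # which abstract unit it realises
    shape: str                 # conventional shape label

s = Grapheme("s")
long_s = Allograph(s, "long s (ſ)")
round_s = Allograph(s, "round s")

# Same grapheme, different shapes: equating them at the grapheme level
# discards positional or scribal information carried by the shape.
assert long_s.grapheme == round_s.grapheme
assert long_s.shape != round_s.shape
```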

The second paper was given by Caroline Schreiber (Munich). In her talk “Book Covers as Material Objects: Possibilities and Challenges in the Brave New Digital World”, Schreiber reported on her experiences with the digitization of book covers at the Bayerische Staatsbibliothek. In the course of the digitization projects, a modular standard for the description of elaborate book covers such as treasure bindings has been developed. Besides the development of a multilingual thesaurus for iconographical as well as general features, Linked Open Data approaches have also been applied in this context, using LIDO as well as a semantic wiki for documentation. She also provided deeper insights into the analytical methods used and the technical advances made during digitization, and described their respective potentials and limitations.

Some of the DM representatives (back: Jamie B. Harr, James Cummings, Daniela Schulz; front: Franz Fischer, Sean Winslow)

In his talk “On the Epistemological Limits of Automatic Classification of Scripts”, Marc H. Smith (Paris) discussed the consequences and limits of AI-based methods for the classification of scripts. These new digital approaches not only promise to facilitate future research, but also provide an opportunity to rethink the analytical categories on which our research has been based in the past and still is.

The number of papers with the index term “Computing in Medieval Studies” has increased over the last years, as has the interest in such approaches among scholars working in Medieval Studies. This was also evidenced by the fact that the room was packed for both DM sessions and people even had to be turned away because no chairs were left. Given this great success, a continuation of DM-sponsored sessions, jointly organised by the DM board and its recently founded subcommittee, is planned for IMC 2020 and its special thematic strand “borders”. See the CFP here (deadline: 15 September).
