Digital Humanities 2012 Hamburg – Some Thoughts on the Diversity of DH 2012

As a student assistant at the Digital Humanities 2012 conference and, presumably, one of the very first Swiss students to obtain a degree in Digital Humanities (at least under this name), I was pleased by Enrico Natale's suggestion to share some of my experience here. After enjoying the engaging first THATCamp Switzerland, held in Lausanne last November, it seemed well worth attending an international gathering as well, in order to experience the field in its full breadth and to see, hear and talk to some of the people who had already become somewhat familiar from journal and mailing list contributions. The annual conference of the ADHO, the Digital Humanities conference, affords these opportunities very well, and the time at DH 2012 indeed turned out to be very rewarding.

The following are three thoughts that revolve around the central notion of this year's conference theme – (digital) diversity:

  1. Whereas the conference theme often informs the keynotes and a few formal addresses but otherwise does not noticeably resonate with the bulk of workshops and presentations, this was quite different at DH 2012. Here, the conference theme ‚Digital Diversity – Cultures, languages and methods‘ had implications that went beyond the more formal occasions and that promise to have a bearing on the future configuration of the field. The most notable development in this regard is the foundation of the DHD, the regional chapter of the ADHO that still needs to define precisely its inclusiveness and its final name (Enrico Natale has already reported Swiss perspectives on the subject). As Paul Spence, chair of the International Program Committee and local organizer of DH 2010, notes: „the DH community is clearly developing in new and exciting ways, and it is particularly exciting to see non-Anglophone groups develop a stronger presence on the international DH stage.“ Similar sentiments were echoed on Twitter during the conference, and it is great to see this development honored by anglophone members of the DH community.
     
  2. The diversity of Digital Humanities also became manifest in my immediate environment during the week in Hamburg. The local organizing committee was joined by a handful of local students and by international students from Asia, South America, North America and Europe.[ref]On the student assistants' tasks and some technical issues:
     The tasks of the student assistants encompassed live coverage of the sessions via a shared Twitter account, sharing reflections and anecdotes on the students' blog, as well as recording most of the presentations on video. Personally, I had the pleasure of recording Marc Alexander, who was later proclaimed winner of the 2012 Fortier Prize, which is awarded to the ADHO bursary winner whose conference submission is judged by the Bursary Award Panel to be the most outstanding. By perambulating the stage area during his stunning presentation, Marc gave me a hard time keeping the camera on him at a reasonable zoom level. As this recording might figure among the most frequently watched ones, I am glad that the recording of the slides went flawlessly.
     Recordings such as this one were generally made available as streams within a short time after the sessions, which earned a lot of praise from the audience, both those present in Hamburg and those following from remote locations. Occasionally, however, the Lecture2Go system that was used to record the audio and video signals as well as the presentation slides via an Epiphan frame grabber (the Lecture2Go setup closely resembles the SWITCHcast system used by Swiss universities) proved too intricate, resulting either in cropped slides or in no usable output at all. Both the responsible technicians and the recording students identified room for improvement with regard to the technical aspects of the session recordings.[/ref] This brought a notable diversity into the team, relating not only to geographical and cultural provenance, but also to different educational backgrounds and areas of interest, ranging from Buddhist informatics to urban acoustics and typography. Not everyone was equally immersed in the various theories, concepts and technological approaches that were debated during the week, but there was always enough common ground to allow for interesting conversations. In this regard I was reminded of the THATCamp in Lausanne, and it was a welcome change from the discussions with my classmates, which – while no less interesting – are often based on the same or very similar literature and are, in comparison, much more homogeneous. I would encourage future organizers of the DH conference to explicitly invite international applications for student assistant bursaries and perhaps to follow the lead of DH 2012 and allow for a per-continent quota in order to reflect the worldwide DH research community (provided there are enough eligible applications).
     It was nice to learn that some of my international colleagues also took the opportunity to attend other events preceding or following the DH conference, such as the Digital.Humanities@Oxford Summer School or the European Summer School in Digital Humanities in Leipzig, which is certainly a great thing to do. With their emphasis on workshops and practical training, these summer schools are indeed excellent complements to the DH conference.
     
  3. While the disciplinary and thematic diversity of the contributions was remarkable, language- and literature-related fields continue to represent the biggest share of the program. As a historian I tried to follow sessions and presentations that focused on historical research questions and historical objects of research, but this was not always easy to reconcile with my schedule, which usually involved half a day of session coverage or presence at the reception desk. On Thursday in particular, things got complicated for historians: several sessions of interest to them ran in parallel, and a number of historians could be seen quietly packing their things and swiftly changing lecture halls between two presentations. Matters were somewhat different on Friday afternoon, when Dino Buzzetti and Manfred Thaller addressed a large audience with an impressive dialogic presentation (see here, here, and here) that touched on more than a few fundamental aspects of markup theory, textuality, interpretation and semantic relations. I deem it very desirable that presentations and talks highlighting a specific research problem or showcasing an ongoing project be complemented by this kind of encompassing and thought-provoking discussion, which has the potential to move the various disciplines of the digital humanities forward.

Digital Humanities 2012 Hamburg – Poster session (18/07/12)

To get a sense of the diversity of the projects represented at the Digital Humanities 2012 conference, there is nothing like diving into the Poster Session, where, for an hour and a half, some fifty projects were presented by their teams in the hall of the Faculty of Humanities of the University of Hamburg.
This session was included in the program as a plenary session, and most of the participants were present, which added to the liveliness of the moment. Kudos to the organizers, who found the right formula for a successful Poster Session. The projects are presented below in the order in which I encountered them.

Behind the title „Digital Knowledge Store“ lies the answer of the Berlin-Brandenburg Academy of Sciences to the central problem of every research institution that produces digital content: how can this information be made visible, and therefore useful? For a decade or so, universities have been opening institutional repositories on which the information is stored, but these remain too isolated and not very user-friendly. The temptation would of course be to gather all of a university's content on a single server, but every research team and every researcher wants to keep control over their own data. In France, one solution has been developed by a project such as Isidore, a search portal for the humanities and social sciences that aggregates the metadata of a very large number of sources while leaving the data where it is. The same principle underlies the Knowledge Portal of the ETHZ (which is down as I write, a rare enough occurrence to be worth mentioning), which allows a federated search across several catalogues within a user-friendly interface. But wait… Knowledge Portal… Digital Knowledge Store… could the Berlin Academy have taken its inspiration from the ETHZ? It hardly matters, because the Berlin project wants to go further than its predecessors by offering a semantic search interface capable of making intelligent recommendations based on the search history.
What is interesting in this type of project is the sheer amount of disparate information and metadata that has to be harmonized, reworked and categorized in order to provide an effective search service. No piece of cake…
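To give a rough idea of what such harmonization involves, here is a minimal Python sketch; the repository names, field mappings and records are invented for illustration and do not reflect the actual Digital Knowledge Store implementation.

```python
# Minimal sketch: normalizing heterogeneous metadata records into one common schema.
# The source field names ("titel", "dc:title", ...) are invented examples.

# One mapping per source, from local field names to the common schema.
FIELD_MAPS = {
    "repository_a": {"titel": "title", "verfasser": "creator", "jahr": "year"},
    "repository_b": {"dc:title": "title", "dc:creator": "creator", "dc:date": "year"},
}

def harmonize(record, source):
    """Translate a raw record from `source` into the common schema."""
    mapping = FIELD_MAPS[source]
    clean = {}
    for local_field, value in record.items():
        common_field = mapping.get(local_field)
        if common_field:                      # drop fields we cannot map
            clean[common_field] = str(value).strip()
    clean["source"] = source                  # keep provenance for later display
    return clean

records = [
    ({"titel": "Leibniz-Edition, Reihe VIII", "verfasser": "BBAW", "jahr": "2011"}, "repository_a"),
    ({"dc:title": "Corpus Vitrearum", "dc:creator": "BBAW", "dc:date": "2010"}, "repository_b"),
]

index = [harmonize(rec, src) for rec, src in records]
for entry in index:
    print(entry["title"], "-", entry["creator"], f'({entry["year"]}, {entry["source"]})')
```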

In the same style of project, but more specific and more precise, there is the work of The European Library (TEL), which aggregates the bibliographic records of the European national libraries and then feeds them into Europeana. Nuno Freire's job consists, in short, of guaranteeing that there are no duplicates among the author names of the 75 million bibliographic records, in a dozen or so languages, that TEL contains. Sure, no big deal. In this task he can count on the help of VIAF, a multilingual authority file of personal names, and on the authority files established by the national libraries.
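As a toy illustration of the duplicate problem (not of Freire's actual pipeline, which relies on authority files such as VIAF and far more sophisticated matching), one can reduce every author name to a normalized key and group the records that share it; the names below are invented.

```python
import unicodedata

# Toy duplicate detection among author names: reduce every name to a normalized
# key (lowercase, no diacritics, "last, first" order) and group names sharing a key.

def name_key(name):
    # Strip diacritics ("Müller" -> "Muller") and lowercase.
    ascii_name = unicodedata.normalize("NFKD", name).encode("ascii", "ignore").decode()
    ascii_name = ascii_name.lower().strip()
    # Reorder "First Last" into "last, first" so both forms collide.
    if "," not in ascii_name:
        parts = ascii_name.split()
        if len(parts) > 1:
            ascii_name = parts[-1] + ", " + " ".join(parts[:-1])
    return ascii_name

authors = ["Müller, Hans", "Hans Muller", "Hugo, Victor", "Victor Hugo", "Nuno Freire"]

groups = {}
for a in authors:
    groups.setdefault(name_key(a), []).append(a)

for key, variants in groups.items():
    if len(variants) > 1:
        print("probable duplicates:", variants)
```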

Geoffrey Rockwell is an important figure in the Digital Humanities community. He is notably the author of the Voyant Tools, a suite of online tools for text analysis and visualization. But the project he is presenting here is a different one; it concerns the annotation of texts, more precisely markup languages, and more precisely still XML, which underlies the Text Encoding Initiative (TEI), very widely used in the DH community. The tool presented is an online XML editor. Marking up a text simply means annotating its various elements with machine-readable tags so that the computer can find them again. For example, you tag all the chapters of a book, and you can then ask the computer to automatically output the list of chapters. Of course, TEI professionals annotate everything: place names, persons, style, morphology, syntax, text structure, layout, and so on. They can therefore ask their computers more complicated things, such as „Which adjectives are most often used to describe a given character?“ or „How many words are there on average between coordinating adverbs?“
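To make the chapter example concrete, here is a small Python sketch that parses a deliberately simplified, TEI-flavoured fragment (invented content, namespaces omitted, nothing to do with Rockwell's editor) and asks it the two kinds of questions mentioned above.

```python
import xml.etree.ElementTree as ET

# A simplified, TEI-flavoured fragment: chapters are <div type="chapter"> with a
# <head>, and person names are wrapped in <persName>.
document = """
<text>
  <div type="chapter"><head>Chapter 1</head>
    <p>In which <persName>Candide</persName> is brought up in a fine castle.</p>
  </div>
  <div type="chapter"><head>Chapter 2</head>
    <p>What became of <persName>Candide</persName> among the Bulgarians.</p>
  </div>
</text>
"""

root = ET.fromstring(document)

# "Give me the list of chapters": walk the tree and collect the <head> elements.
chapters = [div.find("head").text for div in root.iter("div") if div.get("type") == "chapter"]
print("Chapters:", chapters)

# "Where does a given character appear?": the same markup answers that too.
mentions = [pn.text for pn in root.iter("persName")]
print("Person mentions:", mentions)
```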

As we know, Japan lost the Second World War, knocked out by two American atomic bombs. But the atomic bomb was not the only mass weapon deployed by the Americans against the Japanese. A campaign of cultural propaganda is also said to have been set up in the USA to stigmatize the Japanese enemy. This is what Yu Fujimoto of Doshisha University would like to demonstrate by analyzing the treatment of Japan in the complete collection of the National Geographic Magazine from 1888 to 2009. Using text mining and data analysis techniques, the project Historical Events vs Information Content aims to trace the way this mainstream adventure magazine has looked at Japan. Not yet having received permission to study the content of the articles, the project is concentrating on the images for the time being.
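As a hint of what such text mining can look like once the article texts become available, here is a toy Python sketch that counts, per decade, the words co-occurring with „Japan“ in a handful of invented snippets; it illustrates the general idea only, not the project's actual methods or data.

```python
from collections import Counter
import re

# Count, per decade, the words appearing alongside "Japan" in short snippets.
# The snippets and dates are invented placeholders, not National Geographic content.
articles = [
    (1905, "Japan emerges as a modern naval power in the Pacific"),
    (1942, "The enemy Japan threatens the islands of the Pacific"),
    (1964, "Japan welcomes the world to the Olympic games in Tokyo"),
    (1989, "Japan leads the world in consumer electronics and design"),
]

STOPWORDS = {"the", "of", "in", "a", "to", "and", "as", "japan"}

by_decade = {}
for year, text in articles:
    decade = (year // 10) * 10
    words = [w for w in re.findall(r"[a-z]+", text.lower()) if w not in STOPWORDS]
    by_decade.setdefault(decade, Counter()).update(words)

for decade in sorted(by_decade):
    print(decade, by_decade[decade].most_common(3))
```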

Tools for the automatic editing of audiovisual media are developing rapidly. Unfortunately, the much-desired program for the automatic transcription of audio recordings does not yet exist in a satisfactory form. However, numerous tools can make the task easier when the time comes to process audiovisual material collected during fieldwork. The AV Processing in eHumanities project of the Max Planck Institute for Psycholinguistics can do two things: automatically distinguish the voices of the different speakers in an audio recording, and recognize arm and head movements in moving images.

Digital Humanities 2012 Hamburg – Switzerland takes power (Do it like Mills Kelly / 2)

People will remember Switzerland from this conference.
I am at this very moment sitting in the founding session of the association Digital Humanities Deutschland. The big auditorium of the main building of Hamburg University is filled with more than 200 participants. The discussion is focusing on the future name and affiliations of this association.

People are notably protesting against the affiliation with Oxford University Press, publisher of LLC – Literary and Linguistic Computing. As you know, Oxford is NOT an open access publisher, and they publish only in English. And this is when Prof. Stolz from the University of Bern stood up, followed by Prof. Haber from the University of Basel and Prof. Claudine Moulin from the Universities of Luxembourg and Trier, to declare that the Digital Humanities in Europe should be multilingual and should not adapt themselves to the total dominance of the English language, nor openly support a commercial publisher like Oxford University Press.

Prof. Stolz even said that the title Digital Humanities Deutschland is problematic, because it does not acknowledge the other German-speaking countries, Switzerland among them. An alternative would be DHDR = Digital Humanities im deutschsprachigen Raum. Prof. Stolz is also a candidate for the association's committee, which is about to be elected. We'll see how it goes. Anyway, how come Switzerland has become so sure of its place in the DH community?

 

Switzerland's power move began on Sunday afternoon, when Prof. Clivaz and Prof. Kaplan from the University of Lausanne presented the application of the University of Lausanne to host the Digital Humanities conference in 2014. With all due modesty, I think it was my initiative to suggest that the Lausanne DH team invite the conference to Lausanne in 2014. And we won. The steering committee was convinced by the proposal – which included a boat tour on the lake, a visit to a chocolate factory and, of course, a visit to CERN, where the web was invented in 1989!

To make it even better, the same day brought the news that the Swiss Federal Institute of Technology in Lausanne (EPFL) is opening a Digital Humanities Lab, headed by Frédéric Kaplan, an artificial intelligence engineer with a strong interest in the humanities. This brand new DH Lab will have two PhD students and a postdoctoral fellow, and will be affiliated with the Collège des Humanités of the EPFL. And they are starting their activities as early as the coming weeks. Newly appointed Prof. Kaplan is currently choosing the hardware to equip his laboratory, hesitating between a fully automated book scanner and a 3D object modeler. As the EPFL is not exactly a poor institution, he will probably get both. Hopefully, for once, the humanities will benefit from the financial wealth of the federal institute of technology. The research program of the DH Lab is not fixed yet, but will definitely deal mainly with history!

Digital Humanities 2012 Hamburg – DH Curriculum (Do it like Mills Kelly / 1)


As you may know, one of the biggest conferences worldwide dedicated to the digital humanities is starting today in Hamburg. Digital Humanities 2012 is this year's edition of the annual conference of the Alliance of Digital Humanities Organizations. With more than 600 participants and a 500-page volume of conference abstracts, it is a rather impressive gathering.

Trying to give an account of such ventures is always tricky. One can live-tweet (I will, to a certain extent), take notes (I am doing that too) or write blog posts. During THATCamp Switzerland, we were astonished to see that Mills Kelly (CHNM) had this amazing ability to write blog posts during the sessions, with no delay, so that the post could be published by the end of the session. That's what I'm trying to do here… The account is, of course, as partial as it can be. For a more neutral report, see the slides of Prof. Thaller at the end of this post.

The first workshop I attended this morning was entitled „Toward a Digital Humanities Curriculum“ and was hosted by Prof. Manfred Thaller from the University of Cologne, who is a godfather-type figure in the digital humanities in Germany. Prof. Thaller belongs to the first generation of digital humanists, as he himself stated, telling us an anecdote about the year 1996, when DH – at that time the term was „humanities computing“ – was first acknowledged as a specific research field.

I'm coming to the point. Back in 1996, according to Prof. Thaller, there was a hype in US colleges about teaching students how to use word processing software such as Microsoft Word and how to build their own homepages. At the same time, some researchers were starting to run computational analyses on digital corpora, mainly textual ones. While the former were learning to use newly available technologies, the latter were trying to design new computer-based research methods. Modeling is thus a central DH skill, i.e. representing a humanities problem in such a way that it becomes possible to build a technical solution.

According to Prof. Thaller – or at least according to what I understood from his presentation – that is where he sets the border between DH and „Library Studies“. Prof. Thaller somehow considers DH a humanities discipline, in which the humanities remain sovereign over the technology. The research questions, and the choice of which tools are to be developed, stay in the hands of the humanities scholar. Conversely, library science has a mission to teach how to use existing tools and to strengthen students' traditional research skills, but that is not DH research proper.

This debate is crucial to a definition of DH. Should DH be understood as an augmentation of other curricula or rather as a new professional curriculum?

Everybody here seems to agree that DH is a specific discipline that differs from the other information sciences. The core topic discussed today is which skills should be taught to DH BA/MA students at university. You will find in the slides below an impressive list of standards important to the Digital Humanities (pp. 8-11).

I should add that in Germany today there are ca. 10 undergraduate programs and 12 master's programs in DH, plus several new chairs in DH created over the last few years. We also received a leaflet called Digitale Geisteswissenschaften, which summarizes the state of the art of DH university programs in Germany.