DH2015: Lots of Questions and Not So Many Answers

by José de Kruif & Melvin Wevers

Digital Humanities is still a booming business, even though fewer papers and posters (some 260 in total) were presented at this year’s global DH conference than in previous years. The fact that the conference took place in Sydney, Australia, probably made attendance difficult for some. On the other hand, this location meant more submissions from Asia and Oceania, though the US was still dominant.[i]

With regard to the representation of gender, the DH community still has some catching up to do. As with many conferences nowadays, a parallel conference took place on Twitter. During one of the keynotes, an interesting Twitter conversation arose about the under-representation of women in DH authorship and on boards.[ii] At DH2015, women made up 46% of the attendees, 34.6% of the authors, and 33.3% of the keynote speakers (1 out of 3). Moreover, the editorial board of the newly founded journal Frontiers in Digital Humanities has zero(!) women. These issues were addressed by organizer Deb Verhoeven in an introductory talk on the third day of the conference.

The topics of papers and posters ranged from archives, digitization, and text mining to visualizations and network analysis. Many presenters described work in progress with a promising future for their specific tool. Unfortunately, only a few talked about how computation had actually helped to answer their research question. The papers that did stood out. For instance, Micki Kaufman’s paper Quantifying Kissinger showed how different tools can be used in conjunction to research the paradoxes in Kissinger’s political behaviour.

The Translantis team was represented by José de Kruif and Melvin Wevers. The latter presented a paper he co-authored with project members Tom Kenter and Pim Huijnen. In the paper Concepts Through Time, they showed how distributional semantics can be used for conceptual history. They argued for close collaboration between humanists and computer scientists throughout the entire research process, from formulating a hypothesis relevant to both fields to answering it. All too often, these two research communities only meet tangentially.
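To give a flavour of what a distributional-semantics approach to conceptual history can look like, here is a minimal sketch that trains word embeddings on two time slices of a corpus and compares the nearest neighbours of a concept term. The file names, the example term, and the model parameters are illustrative assumptions; this is not the pipeline used in the Concepts Through Time paper.

```python
# Minimal sketch: distributional semantics for conceptual history.
# File names, the example term, and parameters are illustrative assumptions,
# not the setup of the Concepts Through Time paper.
from gensim.models import Word2Vec
from gensim.utils import simple_preprocess

def load_sentences(path):
    """Read one sentence per line and tokenize it."""
    with open(path, encoding="utf-8") as f:
        return [simple_preprocess(line) for line in f if line.strip()]

# Hypothetical decade slices of a digitized newspaper corpus.
slices = {"1950s": "newspapers_1950s.txt", "1980s": "newspapers_1980s.txt"}

for decade, path in slices.items():
    sentences = load_sentences(path)
    # Train a separate embedding model per time slice.
    model = Word2Vec(sentences, vector_size=100, window=5, min_count=5, workers=4)
    # Shifts in the nearest neighbours of a concept term across slices
    # hint at how the concept's meaning changes over time.
    print(decade, model.wv.most_similar("democratie", topn=10))
```

Comparing neighbour lists across slices is only one simple way of operationalizing conceptual change, but it illustrates the kind of question a humanist and a computer scientist can formulate and answer together.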

Participants also had ample opportunity to reflect on Digital Humanities in plenary sessions. During the meeting Building Communities and Networks in the Humanities, it was mentioned that several Arab countries as well as Taiwan will join the Alliance of Digital Humanities Organizations (ADHO). Moreover, this meeting revealed that similar concerns are popping up everywhere. What kind of science is it that we pursue in DH? Should every digital humanist learn at least some coding to do proper DH research? How do we explain DH to students? And lastly, how can one obtain the facilities to deploy suitable tools within large university structures?

In sum, the conference brought to light that there is much work to be done within the Digital Humanities community, both on an infrastructural level and on the level of the actual research content. After years of making promises, the actual work has only just started. Let’s continue by writing papers that emphasize how computation actually helped to answer research questions.

[i] Some of the figures in this post are taken from Scott Weingart’s blog.

[ii] For more on this discussion, see: Melissa Terras’s blog

Reconsidering Transatlantic Studies: A Digital and Longue Durée perspective

by Jesper Verhoef and Lisanne Walma

The Transatlantic Studies Association’s annual conference, which was hosted by the Roosevelt Study Center in Middelburg, revealed some interesting and promising connections between Transatlantic Studies and our project.

First, there seemed to be a growing interest in looking beyond diplomatic sources (notwithstanding the many panels still devoted to them) towards the public circulation of ideas. In particular, it became clear that interest in Digital Humanities and digitized material for studying this circulation is growing. There appeared to be two approaches to Digital Humanities at this conference. One was the explicit use of digital tools, for which John Corrigan provided inspiration when he elaborated on examples of work in the Spatial Humanities. Many of these papers acknowledged the need to combine the visualizations that Digital Humanities techniques yield (which can be used for ‘distant reading’) with close reading, effectively moving beyond a false dichotomy between the two.

Another way scholars went about it was to include digitized material and digital tools within regular historical argumentation without drawing attention to it: they used plenty of digitized sources, yet did not make this explicit in their presentations. Interestingly, these implicit pleas for Digital Humanities tied in with a call for larger historical projects, i.e. a return of the longue durée, stressed among others by keynote speaker Jessica Gienow-Hecht and by speakers from the opening panel ‘The Transatlantic Paradigm Reconsidered’. This is in line with recent work by, for instance, Jo Guldi and David Armitage, who in their History Manifesto stressed that the digital techniques and data available nowadays invite one ‘to try out historical hypotheses across the time-scale of centuries’.

This leads to a final observation: there were many studies into the workings of references, in particular how imaginings of America and Europe were established across a range of discourses and changed over time. Notable was the wide variety of sources with which the scholars weighed in on these debates, ranging from depictions of the United States in Spanish cultural magazines, to advertising companies in Italy, to writers and others who traveled back and forth to the United States, of whom the American historian John Lothrop Motley, as Jaap Verheul showed, proved to be an intriguing case. The value of the vast Dutch national newspaper corpus was made explicit in our own presentations: Lisanne showed the importance of American drug dystopias in Dutch debates in the 1920s, while Jesper argued that debates on the portable radio in the 1950s and 1960s indicate that claims about the relation between modernization and Americanization need to be questioned, since the two did not necessarily co-occur.

In sum, the presentations highlighted that a vast array of sources can further contextualize discussions on Americanization by showing that, on the ground, it was a complex, multidirectional struggle.

Translantis-related NWO proposal awarded funding

At the end of June 2015, the NWO Free Competition proposal “The Imperative of Regulation: local and (trans-)national dynamics of drug regulatory regimes in the Netherlands since the Second World War” was approved. Among other things, the proposal will make use of the technology developed in Translantis.
Applicants: Prof. J.C. Kennedy (UU) & Prof. T. Pieters (UU)
Start: September 2015
Summary (translated from the Dutch):

Drug tolerance as a myth
The Netherlands has a reputation as a country that is tolerant of drug use. Yet here, too, the use of ‘drugs’ has been increasingly regulated since 1945. This project investigates how this growing interference with drug users relates to public debates about drugs, developments in the drug trade, and processes at the local and international level. It makes innovative use of a combination of conventional and digital research methods, such as those made available by NWO in CLARIAH and Horizon-Translantis, or via Beeld en Geluid and the eScience Center.
Summary:
Dutch drug policies since the Second World War have oscillated between tolerance and repression of drug use. The Netherlands, with their internationally (in)famous reputation for leading the way in the decriminalization of drug use and in public health harm reduction policies, a reputation established in the 1970s-1990s, have since become more restrictive. However, whether tolerant or restrictive, pragmatic or moralistic, from a historical perspective drug policies in the Netherlands have shown a structural undercurrent of increasing regulation. In other Western countries, too, increased institutionalised interventions in drug use have gone hand in hand with oscillations in strategies and approaches.

The development of, and swings within, the regulatory imperative have been inadequately explained, because existing accounts focus too narrowly on a univocal, and unique, national drug policy-making process. This project broadens the scope and investigates drug regulation in the Netherlands as the historical outcome of the interaction between the development of drug economies, shifting public perceptions of drug use, and the dynamics of local drug politics. The key research question is: how can the development and intensification of drug regulatory regimes since the Second World War be explained?
Digital history methods will be employed in combination with conventional methods in a ‘blended’ research methodology. Important digitized databases of textual sources that will be used are Delpher, the online historical newspaper database of the Dutch national library (KB), covering the period 1945-1990, and the LexisNexis database of international documents, periodicals, and Dutch newspapers, covering the period 1990-2015. These will be researched using the following digital tools for data mining: Texcavator and Newsreader.
Furthermore, the researchers will make use of digital tools developed by the Netherlands Institute for Sound and Vision [B&G] (e.g. Polimedia and AVResearcherXL).
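To illustrate the kind of question such data mining supports, rather than the actual workings of Texcavator or Newsreader, here is a hedged sketch that computes the yearly relative frequency of a drug-related term in a hypothetical CSV export of digitized newspaper articles. The file name and the column names are assumptions, not a real Delpher or LexisNexis schema.

```python
# Illustrative sketch only: yearly relative frequency of a search term in a
# hypothetical CSV export of digitized newspaper articles. The file name and
# the "date"/"text" columns are assumptions, not a real database schema.
import csv
from collections import defaultdict

term = "heroïne"
hits, totals = defaultdict(int), defaultdict(int)

with open("newspaper_articles.csv", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        year = row["date"][:4]
        words = row["text"].lower().split()
        totals[year] += len(words)
        hits[year] += sum(1 for word in words if word == term)

for year in sorted(totals):
    # Relative frequency (per million words) corrects for the growth of the
    # newspaper corpus over time, which raw counts would obscure.
    per_million = 1_000_000 * hits[year] / totals[year] if totals[year] else 0
    print(year, round(per_million, 2))
```

A trend line like this is only a starting point; in the blended methodology it would be confronted with close reading of the underlying articles and with conventional archival sources.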

Shoot with Many Guns: How Can the Digital Aid the Humanities?

Report DH Benelux 2015, Antwerp

by Jesper Verhoef and Melvin Wevers

During the annual Digital Humanities Benelux conference, held this year in Antwerp, it became clear that the focus of Digital Humanities (DH) has shifted in recent years. Earlier conferences and workshops on DH were dominated by the question whether tools should be put to use in humanities research in the first place. In 2015, the attention seems to have shifted to how tools can be put to use within the different disciplines of the humanities. The program featured not only humanists who wanted something from computer scientists, but also projects that had successfully combined the skills of humanists and computer scientists, and presentations by computer scientists that inspired the humanists in the audience.

A recurring topic was that researchers should strive to use as many tools and techniques as possible. Antal van den Bosch described this approach as shooting with many guns. After an exploratory phase, one can compare and contrast the outputs, which directs the researcher towards new insights or hypotheses. In our own presentation, we advocated a similar approach. We stressed that there is no “one size fits all.” Every scholar, not least the historian, has access to a freely available digital toolkit with which a corpus can be created, queried, and analyzed. Like a carpenter building a table, a historian needs multiple tools to construct a historical narrative, while at the same time gauging the applicability of the tool (tool criticism) and the historical relevance of the source (source criticism). We coined this process the Digital Humanities Cycle, an updated version of the empirical cycle. Based on experiences from our own Translantis project, we presented how we go about our historical research following an iterative process of heuristics, hermeneutics, tool criticism, corpus faceting, and source criticism.
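As a toy illustration of the ‘many guns’ idea (a sketch under our own assumptions, not Van den Bosch’s setup), the snippet below aims two cheap techniques, raw term frequency and TF-IDF weighting, at the same miniature corpus so that their outputs can be compared. The example documents are invented.

```python
# Sketch: aim two simple "guns" at the same toy corpus and compare the results.
# The documents and the choice of techniques are illustrative assumptions.
from collections import Counter
from sklearn.feature_extraction.text import TfidfVectorizer

documents = [
    "the portable radio became a symbol of modern youth culture",
    "advertisements framed the portable radio as an american invention",
    "critics worried that american culture would replace local traditions",
]

# Gun 1: raw term frequency across the whole corpus.
counts = Counter(word for doc in documents for word in doc.split())
print("most frequent overall:", counts.most_common(5))

# Gun 2: TF-IDF, which down-weights words that occur in every document.
vectorizer = TfidfVectorizer()
matrix = vectorizer.fit_transform(documents)
terms = vectorizer.get_feature_names_out()
for i in range(len(documents)):
    row = matrix[i].toarray()[0]
    top = sorted(zip(terms, row), key=lambda pair: pair[1], reverse=True)[:3]
    print(f"doc {i} distinctive terms:", top)
```

Where the two rankings diverge, frequent everywhere versus distinctive per document, is exactly the kind of contrast that can point the researcher towards a new query or hypothesis.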

Photo by Sally Chambers

DH Colloquium 4 June: Eleonora Maria Mazzoli and Willemien Sanders (Utrecht University)

At the next Digital Humanities colloquium we will have two presenters: Willemien Sanders (Utrecht University), on developing a tool for the online publication of audiovisual research, and Eleonora Maria Mazzoli (Utrecht University), on the online audiovisual essay as a scholarly publication form.

Date: Thursday 4 June, 15.30-17.00
Place: Janskerkhof 13 (room 0.06), Utrecht


DH colloquium on 7 May: Nees Jan van Eck (CWTS, Leiden University)

At the next Digital Humanities colloquium, Nees Jan van Eck (Centre for Science and Technology Studies, Leiden University) will talk about software tools for bibliometric analysis of scientific publications – in particular the tools VOSviewer and CitNetExplorer, which Van Eck co-developed.

Date: Thursday 7 May, 15.30-17.00
Place: Janskerkhof 13 (room 0.06), Utrecht


THATCamp Utrecht, 28 and 29 January 2015

THATCamp Utrecht aims to offer all humanities scholars in the Netherlands a platform to share experiences and questions about the use of digital tools in their research and teaching with data providers, IT specialists, and one another. In line with the ‘rules’ of THATCamp, the two-day programme on 28 and 29 January 2015 will largely be determined by the participants themselves: anyone can propose sessions in the run-up to January and at the start of the meeting itself. The emphasis is on practice: the use of digital techniques in various aspects of research and teaching, and working together on projects defined on the spot or in advance. Unlike many Digital Humanities conferences, the THATCamp format offers a unique opportunity for this: not only humanities scholars are represented, but also experts in data management and programmers with the necessary technical expertise.

The proposed sessions that together make up the programme have a similar set-up. They are chaired by one or more people who present theory, explanations, or experiences, after which everyone (bring your laptop!) can get to work within the theme of the session. Participants can work with freely available datasets, but researchers are also given the opportunity to work with their own data: anyone interested in opening up or analysing their own (analogue or digital) data files can bring them along. (In that case, please contact the organizers in advance to discuss format and goal.) Finally, David Berry (University of Sussex), editor of the standard work Understanding Digital Humanities, will give a keynote on 28 January.

The following sessions have already been proposed:

  • Make your archive useful: from paper to digital data
  • Harvest your own online data: data harvesting via (JSON) REST calls (a minimal sketch follows after this list)
  • Digitally opening up audio files
  • Evernote (and other tools) for humanities scholars
  • Introduction to Python
  • Introduction to programming: Coding the Humanities
  • The possibilities and limitations of topic modeling
  • Network analysis and visualization with Sci2
  • Digital Humanities in higher education
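As a taster for the REST-harvesting session mentioned in the list above, here is a minimal, hedged sketch of paging through a JSON REST endpoint. The URL and the ‘records’/‘next’ response fields are placeholders, not a specific data provider’s API.

```python
# Minimal sketch of harvesting records from a hypothetical JSON REST API.
# The endpoint URL and the "records"/"next" fields are placeholder assumptions.
import json
import time
import urllib.request

url = "https://example.org/api/records?page=1"
harvested = []

while url:
    with urllib.request.urlopen(url) as response:
        page = json.load(response)
    harvested.extend(page.get("records", []))
    url = page.get("next")  # follow pagination links until they run out
    time.sleep(1)           # be polite to the data provider

with open("harvest.json", "w", encoding="utf-8") as f:
    json.dump(harvested, f, ensure_ascii=False, indent=2)

print(f"harvested {len(harvested)} records")
```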

 

Visit Utrecht.thatcamp.org for more information. Participation in the THATCamp is free, but registration via http://utrecht.thatcamp.org/register/ is required. Register in time, because the number of participants is limited and experience shows that interest is high.