All entries for July 2014

July 13, 2014

Applying New Digital Methods in the Humanities, British Library, 27th June 2014

‘Applying New Digital Methods in the Humanities’ was a one-day workshop held at the British Library on 27th June 2014. The dhAHRC organising team brought together a wonderful mixture of librarians, journalists, software engineers and academics, both Digital Humanities experts and researchers applying digital methods. Jumping in at the deep end, the morning’s papers focused on Digital Humanities as a discipline. Professor Melissa Terras (UCL) gave an excellent keynote on ‘Digital Humanities Through and Through’, where she contested the idea that Digital Humanities was a field that was only a decade old, arguing that people have been using quantitative methods in the Humanities for centuries. Terras maintained that the research questions and critical awareness of Humanities scholars remain the same, only our tools and society have developed. Dr Jane Winters (Institute of Historical Research) then spoke of her experiences of ‘Big Data’ from three interdisciplinary projects. Winters was sensitive to the weaknesses of macro data and its ‘fuzziness’ with issues such as spelling, but demonstrated that data on such a large scale reveals changes which might have passed unnoticed, which in turn can inform new research questions.

The morning then took a decidedly medieval turn, with papers by Dr Stewart Brookes (KCL), Dr David Wrisley (American University of Beirut), and Dr Jane Gilbert (UCL), Paul Vetch (KCL) and Neil Jefferies (Bodleian). Brookes presented his team’s DigiPal Database of palaeography in Anglo-Saxon manuscripts. They have classified and logged all the separate elements of a manuscript letter and their variants, which allows scholars to identify a manuscript’s provenance, date and even scribe, as well as to view the evolution of practices. Interestingly, the methods behind DigiPal have now been extended to other languages from other centuries and have even been applied to examine paratextual elements such as illustrations. This was really useful in demonstrating quite how flexible the methods of Digital Humanities can be. Moving towards Francophone culture, Gilbert and Vetch spoke about how Digital Humanities techniques have been employed on the AHRC project ‘Medieval Francophone Literary Culture Outside France’. Their paper was appropriately named ‘I was but the learner; now I am the master’ and really put the accent on collaboration between the research and Digital Humanities teams. Their result was the creation of a novel method to analyse different physical versions of the same text. This was arguably the most helpful session of the day because it really underlined the importance of communication between different specialist skill sets. Remaining with manuscripts, Jefferies talked about the ‘Shared Canvas and IIIF’. This is a DMSTech project with numerous collaborators such as the British Library and the Bibliothèque nationale de France with a two-fold aim: maximum manipulation for the reader, who can re-piece folded manuscripts and review multiple images with their own annotations, whilst maintaining minimum movement of data.

The afternoon started by focusing on the contribution of the ‘citizen’. Martin Foys (KCL) and Kimberly Kowal (British Library) recounted how they use crowdsourcing for Foys’ project ‘Virtual Mappa’. This was followed by Chris Lintott (Oxford) speaking about his experience in creating and continuing Zooniverse.org. Lintott and his team realised the human brain was far more accurate in identifying galaxies than a computer, and that human brain power could be harnessed for the benefit of science through the development of fun activities allowing an individual to contribute to the world of research. This has since been extended to multiple projects from various disciplines and countries.

The last two papers by Jason Sundram (Facebook) and Rosemary Bechler (Open Democracy) moved the day to focus on digital methods and culture more generally. Sundram explained how he combined his passion for programming and classical music to analyse Haydn recordings, which in turn informs the performances of his quartet. Rosemary Bechler then ended the day with a keynote on how digital methods are driving a revolution in journalism that prioritises the audience. She contended that whereas the passive audience was told what to read in the past through the dominance of the front page, this is now being replaced by social media, which create ‘hubs of interest’ and a much more dynamic, dual relationship between journalist and reader, allowing for a transnational flow of information.

Although the day could have benefitted from a reverse programme order, offering a softer introduction to Digital Humanities, it was an incredibly useful experience. The combination of numerous research projects and various standpoints within the field of Digital Humanities was thought-provoking. However, regardless of project or position, four key points were reiterated across the papers:

  1. You do not have to be a programmer to use Digital Humanities; programmers can be built into funding bids. However, make sure that they are employed for the right amount of time (i.e. so they are not overworked or twiddling their thumbs).
  2. Make your digital methods re-purposable for other projects.
  3. Envisage how your data will be stored and archived after the end of the project.
  4. Most importantly, critical research questions continue to drive Humanities research, not the tools.


July 08, 2014

Making social bookmarking that bit more social

I have for a while used Diigo to track and organise my bookmarks, particularly in the digital humanities. My bookmarks are shared with the 'Academic Technology at Warwick' group on Diigo. I wanted to add Twitter as a channel to share these finds with my followers and those tracking the #dhwarwick tag (which also feeds the front page of the Digital Humanities website).

if [diigo] then [twitter]

To stitch these two tools together, I have used a third tool, IFTTT. IFTTT is one of a number of tools that creates 'recipes' allowing activity in one service to trigger an action in another: you grant this intermediary access to both accounts and define the criteria that fire the trigger.

To work neatly, I also had to come up with a vocabulary that will help me organise my bookmarks and automatically generate sensible tweets. This is what I'm using:

  • Anything tagged with #dhwarwick in my Diigo account is the trigger to send a tweet.
  • A tweet is composed of "Just bookmarked this: {{Title}} {{Url}} tagged {{Tags}}" where the curly braces are replaced with the text from Diigo.
  • Because Twitter is going to use the tags, the tweet will include #dhwarwick, which will be picked up by Twitter as a hashtag, and will also feed the website.
  • I'll be making sure that things I bookmark and tag #dhwarwick are succinct enough to fit within the 140-character limit.
  • If I find a link via a prompt from someone, I also have a tag for this: I put 'via @twitterusername' in as a tag. This will mention them on Twitter in the tweet too.
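To make the recipe's behaviour concrete, here is a minimal sketch in Python (not actual IFTTT syntax) of the logic the bullets above describe; the function name and the example bookmark are my own illustrations, not part of the recipe itself:

```python
TWEET_LIMIT = 140  # Twitter's character limit (as of 2014)

def compose_tweet(title, url, tags):
    """Mimic the IFTTT recipe: only bookmarks tagged #dhwarwick
    trigger a tweet, built from the {{Title}} {{Url}} {{Tags}}
    template. Returns None when the trigger doesn't fire or the
    result would exceed the character limit."""
    if "#dhwarwick" not in tags:
        return None  # trigger condition not met: no tweet
    tweet = "Just bookmarked this: {} {} tagged {}".format(
        title, url, " ".join(tags))
    if len(tweet) > TWEET_LIMIT:
        # The recipe can't shorten text for you, hence the advice
        # above to keep titles and tags succinct.
        return None
    return tweet

# A bookmark with a 'via' tag, which mentions the recommender:
print(compose_tweet(
    "DigiPal database",
    "http://example.org/digipal",
    ["#dhwarwick", "via @someone"]))
```

The real work, of course, happens inside IFTTT once the recipe is activated; this sketch only shows why the tag vocabulary matters.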

I've shared my IFTTT recipe if you want to see what's going on and do something similar:

IFTTT Recipe: 'Push selected bookmarks to Twitter' (connects Diigo to Twitter)


July 07, 2014

Conference Report: Applying New Digital Methods to the Humanities

Writing about web page http://www.dhcrowdscribe.com/

This one-day interdisciplinary workshop set out to question how knowledge creation and production could be advanced through employing existing and emerging tools available in the Digital Humanities. Conveniently based at the British Library, one of the most innovative centres of digital research, this event provided the opportunity for doctoral and early career researchers to learn more about the current research being undertaken in the Digital Humanities and how, as scholars, we can use these techniques to advance the creation and dissemination of knowledge. As a doctoral student interested in looking at new directions for my future research, I thought that this would be an ideal opportunity to learn more about the past, present and future directions of this exciting field.

The programme was varied and stimulating, covering a range of topics, including Big Data, mapping and visualisation methods, audience and database development. The highlight of the day was the keynote presentation from Professor Melissa Terras (Director of Digital Humanities, UCL), who offered some practical advice for scholars considering a Digital Humanities project. This was an interesting and thought-provoking summary of some of the issues that digital humanists face and the types of strategies that should be employed in order to ensure a successful project. One of the best pieces of advice Melissa offered, and one which recurred throughout the day, was to know what the end results and outcomes of the project are. The data, as she made clear, must always be the focus of the research, since it will have a much longer lifetime than the tools themselves.

The rest of the event then turned to consider digital research tools and how they had been developed to address specific research questions. Dr Jane Winters (IHR) explored in her presentation the types of methods and tools available for Big Data and discussed some of the projects in which she has been involved, including her interdisciplinary work on the Digging into Linked Parliamentary Data project. Dr Stewart Brookes (King's College London) talked about his work on the DigiPal Database of Anglo-Saxon manuscripts and Dr David Wrisley (American University of Beirut) explored spatial data mapping of medieval literature. Dr Jane Gilbert (UCL) and Dr Paul Vetch (King's College London) presented on how they implemented digital resources for their project on Medieval Francophone Literary Cultures Outside France, Dr Martin Foys (King's College London) and Kimberly Kowal (British Library) spoke about the British Library Georeferencer project and the benefits of crowdsourcing, and Neil Jefferies (University of Oxford) presented on his projects Shared Canvas and IIIF, both of which have been implemented to address specific problems with the presentation of manuscripts in digital environments.

One of the outcomes of these presentations was the clear need to create research tools which were ‘repurposable’, i.e. had a life-cycle beyond the specific project and could be made available for other people to use and adapt. However, one of the gaps in the event was that the presentations focused on a set of tools that had been developed with a very precise project in mind. As a non-specialist, it very much felt as though the focus was on creating a new tool rather than implementing existing software to answer specific research objectives. I therefore felt that some of the discussions would have benefitted from a bit more practical advice about how to source and apply existing research methods. Moreover, whilst these presentations were thoroughly thought-provoking, they did draw attention to one of the big gaps in historians’ knowledge—programming and coding. It would therefore have been helpful to hear more about how to encourage interdisciplinary collaboration with software engineers and programmers and how to get these people involved in a funding bid.

One of the strengths of the event, however, was its broad emphasis on interdisciplinarity and cross-disciplinary collaboration. Some of the most stimulating papers of the day were from individuals not involved in Digital Humanities projects, but whose work with programming and crowdsourcing had specific application to Digital Humanities research. This included Dr Chris Lintott’s (University of Oxford) paper on Zooniverse, which has led to internationally acclaimed digital projects and a stronger awareness of the impact that non-specialist audiences can make on research projects. The idea of ‘connecting communities’ was a theme picked up by Jason Sundram, a software engineer who has worked for Facebook, who spoke to the delegates about how he had been able to combine the performance, analysis and visualisation of Haydn string quartets. The final speaker of the day was Rosemary Bechler, an editor for the website openDemocracy, who provided a powerful closing message about the importance of promoting content and connecting with the audience. The act of publishing, she argued, should be part of a bigger drive to expand and connect engaged audiences.

For a researcher only just thinking about the implications of Digital Humanities, this event was an excellent opportunity for me to explore the different ways in which digital research can make a positive impact on my own work. I found the day thoroughly stimulating and enjoyed hearing about the broad range of scholars currently employing these techniques. Since so much of the event was focused on ‘connecting communities’ it seemed particularly appropriate that one output of the event was the fantastic networking opportunity it provided. I am very grateful to the Digital Humanities / Academic Technology team at Warwick for the opportunity to travel to such an intellectually stimulating and highly-relevant workshop. It has also given me the much-needed opportunity to contextualise and consider my research within a wider interdisciplinary framework.

Naomi Pullin
Department of History
