The Digital Methods Initiative (DMI), Amsterdam, is holding its annual Winter School on Post-Truth Empiricism. The format is that of a (social media and web) data sprint, with hands-on work in telling stories with data, together with a programme of keynote speakers and a Mini-conference, where PhD candidates, motivated scholars and advanced graduate students present short papers on digital methods and new-media-related topics and receive feedback from Amsterdam DMI researchers and international participants. Participants need not give a paper at the Mini-conference to attend the Winter School. For a preview of what the event is like, you can view short video clips from previous editions of the Summer School.
Nearly two years on from the fake news scandal, web epistemology has changed irrevocably. Social media platforms have been revealed as primary vectors for rumour-mongering and conspiracy theory, and Facebook in particular (along with Instagram) is shutting down access to data (through APIs) or deleting it, without committing it to public archives. As researchers we are compelled to rethink how we study these platforms. One conceptual response, the post-digital, became a forceful rejoinder to any naiveté about 'new media' and its playful study left over after the Snowden revelations of widespread digital surveillance by governmental agencies. Another response, post-truth, concerns the new state of epistemological affairs online, a realisation that 'we can't have our facts back,' as Noortje Marres phrased it, for social media (not to mention much of the non-editorial web that preceded it) are 'truthless' media. The web has long been conceived of as a medium of ill repute, populated by pirates, pornographers and self-publishers, later to be cleaned up by folksonomy and sifted through by the wisdom of the crowd. Where are those mass editing publics these days? Have they really been replaced by small fact-checking bureaus?
This year's Winter School is dedicated to the broad post-truth problematic, not only conceptually but empirically. How are online media being adjudicated? Previous work has examined genres of misinformation, their spread as well as their detection, comparing them across platforms. Conspiracy (complicated emplotments) was found to flourish on the web but to be well detected on Facebook, whereas the opposite held for disinformation (hard facts inverted). The ontological work of identifying and classifying post-truth continues. One 'so what' question that has persisted since the beginning of the post-truth period concerns whether consumers of such content are persuaded. Are they 'communities of believers' or 'active filtering audiences'? Another crucial question concerns the availability of online materials, once considered 'big data' but now difficult to access without scraping or other ill-gotten means. Are there materials, or perhaps remnants, still available for study? At last year's Winter School researchers built an archive of traces from some 88 Russian disinformation pages that had been pushed onto other platforms; these remain available for study. There are also homemade collections, some rather large. One is a Twitter archive put out for public use by a US news agency. Facebook's Social Science One project should also be interrogated, for the lengths researchers would go to for Facebook-sanctioned data (and for what kinds of research may be performed with it). Of interest as well are relatively understudied platforms such as YouTube's so-called 'intellectual dark web', the 'manosphere' on Reddit, and fringe platforms such as 4chan, gab.ai, voat, etc.
At the Winter School there are the usual social media tool tutorials (and the occasional tool requiem), but also invitations to think through, and propose, how to work on web epistemology after the fake news debacle. New tutorials include digital ethnography, data journalism and small data work, plus a new variation on the walkthrough method.