After completing the research of the first sprint, we attempted to define the desired features of our tool. How can the tool (which we will call wiki2epub from now on) reflect the ethos of the Wikimedia community? Can it provide an intermediary stage for users’ different levels of familiarity with the software?
To this end, we set up interviews with collectives who had worked with the MediaWiki software and ultimately made publications based on its content.
The MediaWiki software can function in a multitude of ways. Originally designed for the Wikipedia website, it has been implemented not only on Wikimedia projects but also on independent ones.
It has served as a repository, a collaborative writing tool, a blog, or, in some of the cases we looked at, as the back-end of a website. Three of these instances were most relevant to our research, and we have named them according to their particular use cases: wiki-as-documentation, wiki-as-wet-classification and wiki-as-thinking-machine.

Wiki-as-documentation: Hackers and Designers

Hackers and Designers is a non-profit cross-disciplinary community of programmers, engineers, designers, and artists who organise meetups where collaborative and inclusive workshops are given.
We had the pleasure of meeting one of the founders, Anja Groten, who kindly led us through the process of documenting the summer school that H&D has been organising for two years now. Since 2015, H&D have used the wiki in a twofold way: as a central place to store media files for the participants and as the website’s back-end. Using the wiki as central software came with some challenges; for example, bot attacks led H&D to restrict open account creation on the wiki by introducing a confirmation process.
The wiki’s free and open source nature proved to be an important factor in the choice to host the content there. Collaboration is one of the driving forces of the initiative, and the means of communicating with those involved had to reflect and share that spirit. In fact, we encountered this factor in all the projects we interviewed.
With the second edition of the summer school, the wiki again proved to be a reliable archiving system, preserving content from the summer encounters and the workshops that gravitated around the project.
While talking with Anja, an important point came up, namely the critical nature of editing. Gathering content and preparing a book requires the editor to go through a selection process in which they organise information according to a structure defined at that moment.

Wiki-as-wet-classification: The Warp and Weft of Memory

From Renee’s blog:

The Warp and Weft of Memory is a research project with Castrum Peregrini, which will run from September 2016 to September 2018. The work explores the wardrobe of Gisèle d’Ailly van Waterschoot van der Gracht, and the ways in which it reflects her life, work, and various histories through textiles and clothing. The aim is to not only mine the past, but also make connections to the present. As a whole the project will have different public manifestations: public lectures, educational events, an online narrative combining fact, fiction and artefacts.

Discussing with Renee Turner and her team (Manufactura Independente, Andre Castro and Cesare Davolio) while the project is still in its developmental phase gave us the space to consider questions that had more to do with overcoming the constraints of classification. The team is currently establishing a connection between Gisèle d’Ailly’s practice and the interface of the web archive they are working on. Inspired by the artist’s fervent efforts at cataloguing her clothing items, they chose to work with a semantic wiki in order to transpose this process into a digital environment. The challenges then became how to create meaningful semantics through the tagging system and how to maintain the poetics of the materials without reducing their complexity.

In order to overcome the confines of a “dry classification”, a system of “wet classification” was proposed, which would include the non-encyclopedic, the fictional and the fragmentary.
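To make the distinction concrete, here is a minimal sketch (our own illustration, not the team’s actual data model) of how an archive item could carry both kinds of description: a "dry" layer of fixed semantic properties, queryable like Semantic MediaWiki annotations, and a "wet" layer of free-form fragments that resist the property grid. All names (`WardrobeItem`, the property keys, the example item) are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class WardrobeItem:
    """One garment from the archive, carrying both kinds of description."""
    title: str
    # "Dry" classification: fixed, queryable semantic properties,
    # in the style of Semantic MediaWiki property -> value pairs.
    properties: dict = field(default_factory=dict)
    # "Wet" classification: open-ended fragments -- anecdotes,
    # fictions, fragmentary notes -- kept alongside, unreduced.
    fragments: list = field(default_factory=list)

coat = WardrobeItem(
    title="Blue wool coat",
    properties={"Has material": "wool", "Has colour": "blue"},
    fragments=["Worn on a ferry crossing, or so the story goes."],
)

# Dry queries remain possible over the structured layer ...
wool_items = [item for item in [coat]
              if item.properties.get("Has material") == "wool"]
# ... while the fictional and fragmentary travel with the item.
print(wool_items[0].fragments[0])
```

The design choice the sketch tries to capture is that the wet layer does not replace the dry one: classification still works, but it no longer claims to exhaust the object.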
Our encounter with Renee and her team made us consider a different use case, one in which the tool could allow the possibility of fiction. How can we introduce an unpredictable element into the predefined options of the tool?

Wiki-as-thinking-machine: Mondotheque

The interview with Mondotheque took place on an Etherpad, an open-source tool for collaborative real-time editing in which authors are identified by colour and successive versions of the text are saved.
(As a side note, while thinking about how software can change the interaction with Wikimedia content: having the interview on the pad proved to be a very different kind of experience, where instead of following one line of discussion, multiple threads developed, diverged and converged in interesting entanglements.)
Mondotheque is a collective of artists, activists and archivists who have been investigating Google’s appropriation of the Mundaneum in Mons as cultural heritage. The Mundaneum was a project started by Paul Otlet and Henri La Fontaine in response to the universalist quest for knowledge of the early twentieth century. It was an attempt to gather all the world’s knowledge and organise it according to the Universal Decimal Classification, a system that would lay the foundations of data management. It isn’t difficult to spot similarities between the idealistic grandeur of the Mundaneum and Wikipedia.
Mondotheque chose to work with a semantic wiki because they considered it to be the environment that resembled Otlet’s intellectual machine the most:

We installed a Semantic MediaWiki and named it after the Mondothèque, a device imagined by Paul Otlet in 1934. The wiki functioned as an online repository and frame of reference for the work that was developed through meetings, visits and presentations[4]. For Otlet, the Mondothèque was to be an ‘intellectual machine’: at the same time archive, link generator, writing desk, catalog and broadcast station. Thinking the museum, the library, the encyclopedia, and classificatory language as a complex and interdependent web of relations, Otlet imagined each element as a point of entry for the other. He stressed that responses to displays in a museum involved intellectual and social processes that were different from those involved in reading books in a library, but that one in a sense entailed the other. [5] The dreamed capacity of his Mondothèque was to interface scales, perspectives and media at the intersection of all those different practices. For us, by transporting a historical device into the future, it figured as a kind of thinking machine, a place to analyse historical and social locations of the Mundaneum project, a platform to envision our persistent interventions together.

The printed book containing articles from the wiki was seen as a moment of ‘incision’ into the research thinking: not a finale, but a step, and an opportunity to take the research to another level. We found it interesting that the book kept certain qualities of the MediaWiki, such as the disambiguation page and the ‘last edited’ information snippet, and that it created an alternative narrative by showing the images independently of the text, one after another, according to a sort of category-narrative.

We started thinking about the necessity of leaving space for digression inside the tool (through comments on wiki material), about extracting elements that we consider fundamental to the wiki and embedding them into our tool, and about the process of making a book as a moment of reflection that is not typically present on Wikimedia websites. In the next sprint we will list the most important features we would like the tool to have and start working on a prototype.
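One building block any such prototype will need is a way to pull a page’s rendered content out of a wiki. As a rough sketch, assuming a standard MediaWiki installation, the MediaWiki API’s `action=parse` endpoint returns a page’s HTML; the function names and the example wiki URL below are our own placeholders, not part of the actual wiki2epub code.

```python
import json
import urllib.parse
import urllib.request

def parse_request_url(api_base, page):
    """Build a MediaWiki API URL that asks for a page's rendered HTML
    via action=parse. api_base is the wiki's api.php endpoint,
    e.g. "https://wiki.example.org/api.php" (hypothetical)."""
    params = {
        "action": "parse",   # render the page
        "page": page,        # page title
        "prop": "text",      # return the HTML body
        "format": "json",
    }
    return api_base + "?" + urllib.parse.urlencode(params)

def fetch_page_html(api_base, page):
    """Fetch one wiki page and return its rendered HTML as a string."""
    with urllib.request.urlopen(parse_request_url(api_base, page)) as resp:
        data = json.load(resp)
    # In the action=parse JSON response, the HTML sits under parse.text["*"].
    return data["parse"]["text"]["*"]
```

From HTML like this, an EPUB is essentially a zipped bundle of XHTML files plus metadata, so a pipeline of fetch, clean, and package would cover the core of the tool.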