Please note that this is an invitation-only event.
(Short link to this page: http://bit.ly/radtransparency)
King’s College London, King’s Digital Lab
- Clare Birchall (Reader in Contemporary Culture, King’s College London)
- Seda Gurses (Leuven University and Postdoctoral Research Associate at Princeton University)
- Burak Arikan (Artist, Istanbul and New York)
- James Smithies (Director, King’s Digital Lab)
- George Danezis (Professor of Security and Privacy Engineering, UCL)
- Jonathan Gray (Lecturer, Digital Humanities, King’s College London)
- Ero Balsa (Leuven University)
- Brian Maher (Developer, King’s Digital Lab)
- Paul Caton (Senior Analyst, King’s Digital Lab)
- Ginestra Ferraro (UI/UX Designer, King’s Digital Lab)
- Seb Franklin (Senior Lecturer, English, King’s College London)
This workshop will explore the technological possibilities of interrupting “shareveillance” – a distribution of digital data that calls upon citizens to give up, share, view, and act upon different forms of data. While dataveillance reduces the political potential of citizens to a flat data set, the kind of open data transparency on offer from many corporations and states reinforces the neoliberal tendency to responsibilize individuals without offering real power. Both open and closed data, then, work on the assumption that technological fixes can be offered in lieu of ethical commitments or social justice. Under this neoliberal and securitized regime of visibility and sharing, there is little room for much beyond highly circumscribed experiences with data.
The task that participants of this workshop will grapple with is to experiment with ways of interrupting the imperatives and protocols at the heart of shareveillance. Interruptions of shareveillance can take different digital forms – distributed clouds or servers; encrypted or anonymous communication; counter-optimization browser extensions or plug-ins; non-commercial social media platforms; radically transparent blockchain technologies (see Brunton and Nissenbaum, 2015) – and can target different scales of infrastructure: software, platforms, networks, servers. Theoretically, an interruption of shareveillance can move in two different directions.
First, shareveillance can be interrupted by implementing radical transparency: a transparency that does not fall into the trap of reading social problems as information problems, or of simply making inherently inequitable systems more efficient. “Radical” here means not producing more of the same kind of data under the same old regime of shareveillance, but changing the kind of information that is made visible and the conditions of visibility in general. “Radical transparency” might then be envisaged as a mechanism able to challenge the circumscribed role that data transparency has been given. Radical transparency would allow data subjects to position themselves in relation to data (rather than be positioned by it).
Second, shareveillance can be interrupted by secrecy. There is good reason to look to secrecy, rather than privacy, to do this interruptive work. Even when people coalesce around privacy concerns and step into the light of the demos, they do so to insist on their right to step back into the apolitical shadows of individualism, away from the possibility of collective creativity or identity-in-common. Privacy is a concept closely tied to the liberal individual and the bourgeois public sphere, and as such it is ill equipped to challenge the subjectification of shareveillant data subjects. Any tools that aim to interrupt shareveillance must therefore take into consideration an apparent oxymoron: collective and communitarian forms of secrecy.
As well as the links and references offered here, there are many more open-access links at this page of Liquid Books.
If you would like to add a suggestion to these experimental tools and projects, please e-mail clare.birchall [at] kcl.ac.uk. Please ensure that any suggestion is substantially different from the examples offered above. Please do not offer suggestions of state-sponsored open data initiatives.