dan mcquillan

lecturer in creative & social computing

Dec 23, 2015

Citizen Science and Standpoint Epistemology

This was a talk I gave at the seminar 'The Activist, the Academic and Digital Media' at Goldsmiths on 15.12.15. It describes participatory citizen science as the mobilisation of feminist and postcolonial critiques of science, using the concrete example of 'Science for Change Kosovo'.

The theme of this seminar is 'The Activist, the Academic and Digital Media', but as an activist friend of mine said recently "There's more people researching us, than there is 'us' actually doing stuff". She can't even access the papers written about her project because they're locked behind paywalls. There are times when academia can seem a bit like fracking: a fenced-off, extractive activity that drills sideways under other people's lives to extract value. On the other hand, although I've been an activist as well as an academic, I wouldn't unnecessarily valorize activism. The urgency to act can obscure complexity and put a low value on reflection, and the cocktail of personal commitment and external repression can result in an elitist culture. So, do we really want to develop the category of activist-academic? It seems to me like this could become a double-exclusion, and I don't think it's the hybrid I'm looking for. I'll talk instead about the relationship between theory and practice, and the relationship between both of those and our digital environment. I'm going to look at science, which is, after all, the hegemonic hard man of theory and methodology. I want to start by questioning the idea of objectivity, a concept that is core to science but which also permeates a lot of academic frameworks. Just to be clear, I'm certainly not interested in dumping empirical methods; in fact, I'm relying on them. And I'm not going to trash the idea of a 'real world' as, however relativist your ontology, a boot in the head is definitely going to hurt. But I am going to question the orthodox scientific world view by drawing from feminist and postcolonial critiques of science.

Sandra Harding draws these together under the heading of standpoint epistemology. This perspective sees the gendered character of science in the questions it asks, the discursive resources it draws on, and the way it organises knowledge production. The same applies from a postcolonial point of view, with the added dimension that European science disproportionately developed to serve the needs of European expansion. Taken together, this has created distinctive patterns of knowledge and ignorance about nature's regularities and their underlying causal tendencies. It's not saying that science 'makes up' results but that, following Kuhn, it co-evolves with the historical social order. Standpoint epistemology suggests positions of political disadvantage can be turned into sites of analytical advantage, because the prevailing version of objectivity misses the culture-wide assumptions that shape concepts & procedures. So identifying these elements in the conceptual frameworks can potentially bring forth a stronger form of objectivity. This was pithily expressed by Donna Haraway, who denounces 'the god trick' of seeing everything from nowhere, in other words the universalising abstract viewpoint of science. For her, objectivity is about particular embodiment, because an unlocatable knowledge claim is irresponsible; it can't be called to account. In her words, 'only partial perspective promises objective vision'.

Of course, standpoint epistemology is still a theory, so where is its practice? I think this is where citizen science comes in. Now there are many different types of citizen science; often people are collecting data for experiments designed by scientists, or participating in online crowd-sourcing that scales scientific pattern-recognition. Instead, I want to talk specifically about Science for Change Kosovo, which is neither of those. It's the kind of citizen science where participants are involved in every step of the process, from framing research questions to collecting data to interpreting results and deciding on actions. We were able to start doing citizen science in Kosova because of our 'activism'; we had already been on the ground for a few years doing digital Social Innovation Camps with local partner organisations. We also already had a specific approach to this work, which was critical pedagogy. As Paulo Freire put it in Pedagogy of the Oppressed, “Education either functions as an instrument which is used to facilitate integration of the younger generation into the logic of the present system and bring about conformity or it becomes the practice of freedom, the means by which men and women deal critically and creatively with reality and discover how to participate in the transformation of their world.”

The other driving reason for doing citizen science in Kosovo is that it has a lot of pollution; in fact it's in one of the most polluted regions of Europe. Consequently the population has very bad health outcomes, with high levels of morbidity and mortality that can be linked to poor air quality. Kosovo is still struggling to its feet after generations of occupation, a decade of strife and the war in 1999. Unfortunately it has a corrupt and completely captured political system that tries to control even the production of knowledge, as I personally observed when I taught a summer school at the University of Prishtina. But, by contrast, it also has a lot of young people who want to change things. For me, one of the most positive actions of the last year or two was the successful protests against the corrupt rectorate at the University of Prishtina, after it was discovered that the rector & several other academics, who were political appointees, had fake PhDs. Students protested in the face of much pepper spray and police violence and were, surprisingly, successful. The participants in Science for Change Kosovo are drawn from that same young generation; they range from high school age to mid twenties. Our citizen science tries to enact a form of participatory action research. It follows the principle that research and action must be done ‘with’ people and not ‘on’ or ‘for’ people, and seeks to understand the world by trying to change it, collaboratively and following reflection. In our project, the young people are grouped into a measurement committee, an education committee and a mobilisation committee, a way of organising which consciously echoes the committee form of occupy-like structures and their commitment to direct democracy.

One thing about community-driven air quality measurements is that they make literal the concept of standpoint; they depend on decisions about whose standpoint is being investigated. For example, the widely used UK standards for statutory measurements using diffusion tubes say they should be placed at a height of 2 metres. While this is justifiable in terms of avoiding obstruction or vandalism, it immediately excludes the different air quality at the level breathed by children in pushchairs. Over the last year we have ourselves used diffusion tubes in several locations in Kosovo, and identified hitherto unknown nitrogen dioxide hotspots in the capital city Prishtina. Of course, nitrogen dioxide is all over the news at the moment because it is these emissions which the defeat devices installed in Volkswagen cars were intended to obscure. In the coming year, Science for Change Kosovo will also be investigating the levels of particulates, which are parametrised as PM2.5 and PM10, using a semi-professional device called the SidePak. This enables us to measure individual exposure levels such as the pollution on a bus journey to work, the PM levels inside and outside a school, or the air quality in the housing estates in Fushe Kosove that are downwind from the country's only power station.

Citizen science demands careful attention to the specific differences in ways of seeing between itself and status quo science. Our empirical projects must be attentive to Donna Haraway's point about prosthetics; that we are already cyborgs in relation to our ways of knowing the world. "It is in the intricacies of these visualization technologies in which we are embedded that we will find metaphors and means for understanding and intervening in the patterns of objectification in the world, that is, the patterns of reality for which we must be accountable. In these metaphors, we find means for appreciating simultaneously both the concrete, "real" aspect and the aspect of semiosis and production in what we call scientific knowledge." In citizen science this becomes very obvious: we must understand the deviations of our devices so we can ground the meaning of our measurements. This is where the digital starts to re-emerge in my narrative. Both the internet and physical computing are catalysts for the growth of citizen science, and DIY devices from the maker movement, like the Arduino-based Smart Citizen kits which we also use in Science for Change Kosovo, are understood as lowering barriers to general participation in active science. We need to understand both the views they produce and the devices themselves as material objects of translation. Karen Barad talks about post-Kuhnian scientific knowledge as 'constrained constructivism'. In our immediate context, the constraint is the requirement for calibration, so that we can be sure we are probing nature's regularities rather than being lost in the glossolalia of spiking signals from a device that can't be relied on.

Now situated knowledges are about communities, not simply about individuals carrying around particulate detectors. The wider question for a citizen science project is what form of social recomposition becomes possible; what change in the production of knowledge of the physical world can also be, at the same time, a political formation. As Haraway puts it, the only way to find a larger vision is to be somewhere in particular. Our 'somewhere' is Kosovo, a wholly marginalised country which also has its own marginalised communities, such as the Roma living in Plemetina who are also participating in Science for Change. By sharing as much as possible, especially using the internet, we are trying to do citizen science as a form of open source research. The way I understand it is that we're working to mobilise affect, the embodied experience of air quality issues and their urgency, with the aim of developing what we might call a 'democracy of the air'. But we've all just seen the re-affirmation of climate colonialism at COP21 in Paris. As Adrian Lahoud points out, such events are the climate equivalent of the 1884 Berlin Conference that divided Africa between European colonial powers. While the atmosphere is parcelled up at a molecular level and the distribution of costs and benefits sustains the same global inequity, we can observe the irony that orthodox science is still the touchstone of apparently neutral truth, when science is wholly enrolled in the production of this situation in the first place. This seems a good moment to be suggesting the possibility of a more grassroots and postcolonial science. So I want to close by harking back to the very beginnings of modern science, when it was called natural philosophy. In letters in 1646 and 1647, Robert Boyle refers to "our invisible college", the far-reaching group of correspondents whose common theme was to acquire knowledge through experimental investigation. I'm proposing that critical citizen science, articulated through the internet, constitutes a new invisible college that starts, in every sense, not from the centre but from the edges.

Thank you.

Dec 14, 2015

Science for Change Kosovo Year 1

[This reflection on the first year of Science for Change Kosovo was commissioned by Datashift and is released under a CC-BY-SA license.]

Science for Change Kosovo (SfCK) is a radical citizen science project. I will try to explain what our kind of citizen science is and why it's important, and why we decided to do citizen science in Kosovo. I will also talk about our preliminary results, our plans for the next year or so, and what I think the wider implications are for data, communities and democracy.

what is citizen science?

There are many forms of citizen science; in most, the participants are simply collecting data or completing tasks for an experiment designed by and for scientists. I say that we're a radical project because we believe people from the communities should be involved at every stage, from framing the research questions to designing the data collection, analysing the data and interpreting the results. This makes us less like mainstream citizen science and more like the Public Lab's idea of civic science (‘Public Lab’). We are also inspired by the idea of environmental justice; the recognition that the impact of pollution is often worse for people who are already disadvantaged by society, which goes with a commitment to supporting them to do something about it. An environmental justice project which acts as a model for Science for Change Kosovo is Global Community Monitor (‘Global Community Monitor’), who train and support disempowered "fenceline" communities harmed by serious air pollution from industrial sources, whose concerns are being ignored by agencies and by the corporations responsible. Our ethos, like that of Community Based Auditing (Tattersall) in Tasmania, is to be an experiential way for citizens to undertake their own disciplined inquiry into environmental issues affecting them, so that they can assert their rights and obligations as generators of valid knowledge and as agents of change.

why does it matter?

But why should anyone take any notice of bottom-up citizen science? How can it compete in any way with the sophisticated equipment of professional scientists, not to mention their years of training? In practical terms, citizen science can fill critical gaps in knowledge. Official air quality data is often sparse, coming from a limited number of fixed monitoring sites, and has to use mathematical models to fill in the gaps. Statutory data gathering is all about averages; it can't get down to the level of everyday lives and doesn't record the variable exposures of different people, the effects of their choices, or the related impacts on their health. Institutional science also lacks citizen participation and accountability and struggles to work with local knowledge and soft data. It loses out on useful insights by relying on objectivity and distance; a stance which gave science authority years ago but which increasingly leads to mistrust. At a higher level, the concept of post-normal science (Funtowicz and Ravetz) suggests that orthodox scientific method is badly adapted to situations that combine high risk with high uncertainty (such as climate change) and proposes it be enhanced by an 'extended peer review' that includes all those affected by an issue. As we'll see below, Science for Change Kosovo has already started to fill some of the local gaps in data, and is trying to put into practice the idea of a participatory peer review.

why Kosovo?

Why, though, did we decide to try citizen science in Kosovo? There are certainly easier places to start, especially in terms of available resources, and Kosovo is faced by many other equally pressing challenges. One reason is that we were already in Kosovo, doing social innovation hackathons with young people based on the Social Innovation Camp (‘Social Innovation Camp Kosovo’) model. These camps took back-of-the-envelope ideas for digital social change and turned them into working prototypes over the course of a weekend. The camps were hosted by the UNICEF Innovations Lab, the young participants came from the local Peer Educators Network, and many of the coders were part of Free Libre Open Source Software Kosovo. To understand Science for Change Kosovo it's important to know that we had already built credibility on the ground, brought young people into contact with social empowerment through DIY tools, and had working partnerships with these key local groups.

Another reason for starting citizen science in Kosovo is that it's a very polluted country. The ageing lignite power plants are a major source of NO2 (nitrogen dioxide), SO2 (sulphur dioxide) and particulates (dust), and one of the power stations blew up just as we started the project (Bytyci). Some daily pollutant levels exceed EU and World Health Organisation limits by several times. Kosovo's own environmental protection agency says that current data on air quality levels is poor and incomplete, and there's a lack of capacity for environmental protection at a local level. In a poor country, it's hard even for statutory agencies to get the budget for maintenance and training, and all agencies have to deal with basic problems like power cuts. Air pollution in Kosovo causes hundreds of premature deaths and thousands of emergency hospital visits each year due to respiratory tract infections. Kosovo also has the worst health outcomes among the countries of the region, trailing its neighbours, in some cases dramatically, on indicators such as life expectancy, maternal death rates and infant and child mortality.

The constitution recognizes environmental protection as one of the principles on which the Republic of Kosovo is based. The Law on Air Protection (no. 2004/30) assigns responsibility for air quality and emissions indicators and sets obligations for protection. So there's a legal framework for accountability which we work with. Kosovo also has a long-standing political aspiration to join the EU; making substantial efforts to tackle pollution will be a condition for accession. Signing up to EU principles also means signing up to measures like the Aarhus Convention which sets out rights to environmental information and to participate in environmental decisions.

The focus of Science for Change Kosovo is young people. Although half the population is under 25, their current participation in decision-making at all levels is limited. The project appeals to those young people who are hungry for change and who are enthusiastic about the potential for participatory tech innovation. Because of our environmental justice principles we are also trying to involve marginalised communities, which in Kosovo includes Roma, Ashkali and Egyptians. There's a Roma community living in Plemetina, right at the base of the Kosova A and B power stations which are the core of Kosovo's air pollution, and one of our key contacts for year 1 of Science for Change was a Roma youth activist living in this community.

year 1

Our project began in June 2014 with a weekend co-design event at the UNICEF Innovations Lab in Prishtina. Participants included young people from several parts of Kosova that had experienced severe environmental issues, including Plemetina (under the polluting power stations), Prishtina (the capital city, downwind of the power stations and with heavy traffic pollution) and Drenas (near the Ferronickel plant). The participants shared experiences of pollution and quizzed experts on an environmental health panel. There were sessions on methods for air quality measurement, such as diffusion tubes, and we had a member of the Smart Citizen Kit team who introduced their Arduino-based citizen sensing device (‘Smart Citizen’) and trained the young people on how to install it and how to connect it to the online data platform. We discussed the way that science doesn't always have the kind of certainty about environmental impacts that it claims in public, that there are a lot of disagreements inside science and a lot of arguments about what data is valid and what isn't. Participants planned for campaigning, drawing on global examples like the Arab uprisings and local examples like the student direct action that removed a corrupt Rector from the University of Prishtina. By the end of the weekend the action groups had agreed on a plan for air quality monitoring in Prishtina, Plemetina and Drenas.

One key thing we'd understood from researching other DIY air quality sensing projects is the importance of calibration. While it sounds like a pretty unexciting notion, this is one of the main discontinuities between the smoothness of data projects and the awkwardness of material reality. So much of what passes as data journalism and data visualisation takes its data from existing sources, whereas a citizen science project is generating data from interactions with the physical world. And if this data is to be at all meaningful there should be a way to tie it to the material; to set a baseline that's been verified in the lab and in the field. Although the Smart Citizen Kits represented the positive maker-movement trend towards open hardware sensing, they were a concern for us because they came without any calibration. So we decided to co-locate our kits with diffusion tubes, and use the fact that both measured NO2 to calibrate SCK data against the tube analysis from the lab.
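As a rough sketch of what that calibration amounts to in practice (the numbers and field names below are invented, not our actual data), the basic move is a least-squares fit from the averaged kit readings to the lab-analysed tube concentrations:

    import numpy as np

    # Hypothetical co-located measurements over three monthly cycles.
    # Each pair is (mean SCK NO2 signal over the exposure period, lab-analysed
    # diffusion tube concentration in ug/m3 for the same site and period).
    sck_means = np.array([142.0, 210.0, 95.0, 310.0, 180.0])
    tube_ugm3 = np.array([28.5, 41.0, 19.2, 62.3, 35.7])

    # Least-squares fit of a linear correction: tube = slope * sck + intercept
    slope, intercept = np.polyfit(sck_means, tube_ugm3, 1)

    def calibrate(sck_reading):
        """Convert a raw SCK average into an estimated NO2 concentration."""
        return slope * sck_reading + intercept

    # The residual error gives a sense of how far the kits can be trusted.
    residuals = tube_ugm3 - calibrate(sck_means)
    print(f"slope={slope:.3f}, intercept={intercept:.1f}, "
          f"RMS error={np.sqrt(np.mean(residuals**2)):.1f} ug/m3")

A real calibration would also have to account for temperature, humidity and sensor drift, but the principle is the same: tie the kit's arbitrary signal to a measurement that has been verified in the lab.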

The field mobilisation of Science for Change Kosovo was, frankly, impressive. Kosovo can be a very frustrating place, where post-Communist institutional inefficiency overlaps with entrepreneurial corruption, and it's hard to get things done if you don't know the right people. By contrast, the young people in the project self-organised with the help of the UNICEF lab, installing Smart Citizen Kits in houses in each location (‘Deployment of Smart Citizen Kits’) and installing and collecting diffusion tubes (‘Deployment of Tubes in Drenas’) over three monthly cycles of data gathering. This included a large group of young Roma who installed and monitored the collection devices in Plemetina (‘Deployment of Tubes in Plemetina’).

Unfortunately, the data from the Smart Citizen Kits was difficult to use. While the datasheets for the sensors on the boards showed a linear log-log correlation between gas levels and the signal, in practice there were severe spikes in the data which blew holes in our ability to compare an averaged reading with the tubes. This was a real shame because the kits were our route to live data; emitting sensor readings every second, they were connected to the net and held out the prospect of a live pollution map, not to mention ideas around live campaigning (e.g. triggering tweets to members of parliament each time the pollution exceeded EU levels). On the other hand we had significant readings from some of the NO2 tubes in the capital city, Prishtina. Through the dedicated field work of the volunteers, it looks like we've identified local hotspots which were missed by the government's data and which exceed statutory limits by a large margin. We will be following these up in the next phase of data gathering.
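To show what the spikes do to an averaged reading, here's a toy illustration (with a simulated signal rather than the actual SCK data) of how a rolling median can suppress short spikes before averaging:

    import numpy as np

    def rolling_median(signal, window=5):
        """Replace each sample with the median of its neighbourhood, which
        suppresses short-lived spikes while leaving slower trends intact."""
        half = window // 2
        padded = np.pad(signal, half, mode='edge')
        return np.array([np.median(padded[i:i + window]) for i in range(len(signal))])

    # Simulated once-per-second sensor signal with a few extreme spikes.
    rng = np.random.default_rng(0)
    signal = rng.normal(loc=50.0, scale=2.0, size=600)
    signal[[37, 205, 480]] = [900.0, 1500.0, 700.0]

    print("raw mean:     ", round(signal.mean(), 1))                   # dragged up by the spikes
    print("filtered mean:", round(rolling_median(signal).mean(), 1))   # close to the underlying baseline

Whether a correction like this is legitimate depends on knowing where the spikes come from, which is exactly the kind of grounding the kits arrived without.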

next steps

Our next goal is to expand our measurement activities to particulates, i.e. very small dust particles, which are categorised as PM10 (under 10 microns diameter) and PM2.5 (under 2.5 microns diameter). Particulates are a form of pollution where the link to serious and deadly health problems is absolutely unambiguous. We have acquired a semi-professional portable detector called the TSI SidePak which will enable us to take readings at different locations and also on the move. In this way we'll be able to compare the exposure of different activities, e.g. driving / walking / cycling, and the way this varies over the course of daily life for different groups of people.
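Assuming the SidePak readings are logged with an activity label, the comparison itself can be very simple; a sketch with made-up figures:

    from collections import defaultdict

    # Hypothetical one-minute PM2.5 readings (ug/m3) tagged with the activity
    # underway when the SidePak logged them.
    readings = [
        ("walking", 21.0), ("walking", 35.5), ("walking", 28.2),
        ("bus",     55.1), ("bus",     61.7), ("bus",     48.9),
        ("indoors", 12.3), ("indoors", 15.8), ("indoors", 11.0),
    ]

    by_activity = defaultdict(list)
    for activity, pm25 in readings:
        by_activity[activity].append(pm25)

    # Mean exposure per activity lets us compare, say, the bus journey to work
    # against walking the same route.
    for activity, values in sorted(by_activity.items()):
        print(f"{activity:8s} mean PM2.5 = {sum(values) / len(values):.1f} ug/m3")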

Working with the SidePak detector to make localised and journey-based PM measurements will enable us to test ameliorative measures, such as alternative walking routes that can reduce people's exposure. In doing this we're following the Breathe London project and the work of the air quality & health team at King's College London, who have been piloting alternative back-street walking routes to school for children in areas of central London.

From December 2015 we'll be running junior citizen science workshops in high schools. Rather than the traditionalist pedagogy that's customary in Kosovo, these will be non-formal, experiential and practice-based workshops. The students will also get to experiment with the Smart Citizen Kits, as a way to learn about open tech for environmental monitoring. They will deploy the Smart Citizen Kits for indoor air quality measurements in the schools.

data & change

This doesn't mean we're ignoring the need for wider campaigning. The current PM measurements in each locality are tied to holding 'Town Hall' meetings shortly afterwards as a way to inform and engage local people. We are exploring how our citizen science measurements can best be used for advocacy and campaigning, but this needs to be alert to the complexities of the local context. The data we generated from Drenas has already been used in reporting to the Parliamentary Commission, and the Ministry of Environment of Kosovo has initiated a court case against the heavy metal plant “Ferronikeli” in Drenas. However, these developments are heavily enmeshed in party political battles, which generates a certain amount of cynicism in everyone else. All institutions are perceived to be captured by narrow political interests, and NGOs are often seen as internationalised 'do nothings' who complain endlessly from the sidelines. We're hoping to learn from the EcoGuerilla campaign (‘Lëvizja Eco Guerilla’) from neighbouring Macedonia, where leaked information about pollution from a power plant led to mass protests against PM levels.

We are trying to avoid the attribution of agency to data, or an assumption that participation is the same as empowerment. Many data and open data projects in the wealthy West seem to assume that action will inevitably flow from aggregating data and visualising transparency. Other citizen sensing projects assume that participation of communities in gathering data will increase people's sense of responsibility and lead to the generation of solutions. But the idea that collective measurement leads to collective action seems questionable. In fact there may be a tendency for forms of governmentality like the smart city to re-constitute populations as having a duty to measure their environments, while at the same time producing a society that is, overall, less democratic.

It's the relationship between air quality and democracy which underlies Science for Change. Kosovo is democratically challenged, with different forms of corruption alongside political interference in knowledge production. Orthodox political processes are completely captured by elites and oligarchs. The older generation have a hold on power and it's very hard for young people with a more open, socially progressive outlook to make headway. The emphasis of Science for Change Kosovo is on practices, not simply on what is produced; on the construction of data and what that means about subjectivity and agency, not just on the data as such. The interesting thing about air quality is that it is also politics by other means. We know the air is political, that "the air’s chemical composition reveals a history and a politics in itself" (Nieuwenhuis). Most of the time we do not feel this with an intensity that leads to action. What is it in citizen science that causes an 'affective' response, which 'so amplifies our awareness of the injury which activates it that we are forced to be concerned, and concerned immediately'? (Tomkins and Demos) This is a question we hope that Science for Change Kosovo will help to answer over the next two years.


Bytyci, Fatos. ‘At Least One Killed in Kosovo Power Plant Blast, Supplies Hit’. Yahoo News. N.p., 6 June 2014. Web. 26 Nov. 2015. http://news.yahoo.com/explosion-hits-kosovo-coal-fired-power-plant-injuries-095441081.html.

‘Deployment of Smart Citizen Kits | Facebook’. N.p., 2014. Web. 26 Nov. 2015. https://www.facebook.com/media/set/?set=a.1451488458465993.1073741832.1423629027918603&type=3.

‘Deployment of Tubes in Drenas | Facebook’. N.p., 2014. Web. 26 Nov. 2015. https://www.facebook.com/media/set/?set=a.1448248975456608.1073741831.1423629027918603&type=3.

‘Deployment of Tubes in Plemetina | Facebook’. N.p., 2014. Web. 26 Nov. 2015. https://www.facebook.com/media/set/?set=a.1440501616231344.1073741830.1423629027918603&type=3.

Funtowicz, Silvio O., and Jerome R. Ravetz. ‘Science for the Post-Normal Age’. Futures 25.7 (1993): 739–755. ScienceDirect. Web.

‘Global Community Monitor’. 2015. Web. 26 Nov. 2015. http://www.gcmonitor.org/.

‘Lëvizja Eco Guerilla’. 2015. Web. 26 Nov. 2015. http://ecoguerilla.mk/.

Nieuwenhuis, Marijn. ‘Atemwende, or How to Breathe Differently’. Dialogues in Human Geography March (2015): 90–94. Print.

‘Public Lab: About Public Lab’. N.p., 2015. Web. 26 Nov. 2015. https://publiclab.org/about.

‘Smart Citizen’. N.p., 2015. Web. 26 Nov. 2015. https://smartcitizen.me/kits/.

‘Social Innovation Camp Kosovo’. 2013. Web. 17 Aug. 2014. http://sicampkosovo.org/.

Tattersall, Philip J. ‘What Is Community Based Auditing and How Does It Work?’ Futures 42.5 (2010): 466–474. ScienceDirect. Web.

Tomkins, Silvan S., and E. Virginia Demos. Exploring Affect: The Selected Writings of Silvan S Tomkins. Cambridge University Press, 1995. Print.

Nov 07, 2015

Ghosts in the Algorithmic Resilience Machine

My speaker notes from the panel on 'Resilience and the Future of Democracy in the Smart City' at the 25th anniversary conference of the Centre for the Study of Democracy, University of Westminster, 7th Nov 2015.

I want to start by looking at what resilience and the smart city have in common. The idea of resilience comes from Holling's original 1973 paper on ecological systems. He was looking at the balance of predator and prey, and replaced the simple idea of dynamic equilibrium with abstract concepts drawn from systems theory & cybernetics. Complex systems have multiple equilibria, and movement between them is not a collapse of the system but rather an adaptive cycle. So the population of antelope dropping by 80% is not necessarily a catastrophe, but an adaptive shift. The system persists, although in a changed form.

What does this have to do with the smart city? In its current incarnation, the smart city appears as pervasive computation in the urban fabric, driven by the twin goals of efficiency and environmental sustainability. It posits continuous adaptation through a cycle of sensing-computation-actuation. Heterogeneous data streams from sensors are processed into a dashboard of metrics that triggers automated changes; so, for example, speed limits and traffic lights are manipulated to modify car emissions in near real-time. The new model of the smart city explicitly includes the participation of citizens as sensing nodes. Continuous adaptations are made to optimise flows with respect to the higher parameters of smoothness and greenness. The smart city is a multi-dimensional complex system constantly moving between temporary states of equilibrium. It is a manifestation of high-frequency resilience.
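A toy sketch of that cycle, reduced to a single metric and a single actuator (the thresholds and the speed-limit 'actuation' are invented for illustration, not taken from any real system):

    import random

    def read_no2_sensors():
        """Stand-in for a heterogeneous stream of roadside sensor readings (ug/m3)."""
        return [random.gauss(45, 15) for _ in range(20)]

    def compute_metric(readings):
        """Reduce the stream to a single dashboard figure."""
        return sum(readings) / len(readings)

    def actuate(mean_no2, current_limit):
        """Adaptation: nudge the speed limit when the metric crosses a threshold."""
        if mean_no2 > 60:
            return max(20, current_limit - 10)
        if mean_no2 < 40:
            return min(50, current_limit + 10)
        return current_limit

    speed_limit = 50
    for _ in range(5):                       # five passes through the cycle
        metric = compute_metric(read_no2_sensors())
        speed_limit = actuate(metric, speed_limit)
        print(f"mean NO2 {metric:5.1f} -> speed limit {speed_limit} km/h")

The point is the shape of the loop: there is no final state, only continuous movement between temporary settings.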

But resilience means more than systems ecology. It has outgrown its origins to become a governing idea in a time of permanent crisis. As a form of governmentality it constitutes us as resilient populations and demands adaptation to emergencies of whatever kind, whether it's finance, environment or security. In practice, the main engine of resilience is the accelerated conversion of everything to Hayek's self-organising complexity of markets, with military intervention at the peripheries where this is resisted.

If resilience is the mode of crisis governance and the smart city is a form of high-frequency resilience, what does the smart city mean for democracy? To understand the implications for the future of democracy, I want to look at the emerging mode of production through which both wider resilience and specifically the smart city are being produced; that is, through the algorithmic production of preemption.

We're all becoming familiar with the idea that contemporary life generates streams of big data that are drawn through the analytic sieve of datamining and machine learning. Meaning is assigned through finding clusters, correlations and anomalies that can be used to make predictions. While its original commercial application was to predict the next set of supermarket purchases, the potential for prediction has become addictive for sectors whose main focus is risk. While algorithmic preemption drives both high-frequency trading and drone strikes, it has also spread to the more mundane areas of everyday life.
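In the most schematic terms, that analytic sieve works something like this (with invented data standing in for the streams of purchases or movements):

    import numpy as np

    rng = np.random.default_rng(1)

    # Invented records: weekly spend on two product categories per customer.
    spend_a = rng.normal(30, 5, size=200)
    spend_b = 0.8 * spend_a + rng.normal(0, 3, size=200)   # built-in correlation
    spend_b[10] += 60.0                                    # one out-of-pattern customer

    # Correlation: the kind of pattern the mining looks for.
    corr = np.corrcoef(spend_a, spend_b)[0, 1]

    # Anomalies: customers far from the bulk of the data, flagged by z-score.
    z = np.abs((spend_b - spend_b.mean()) / spend_b.std())
    anomalies = np.where(z > 3)[0]

    # 'Prediction': extrapolate category B spend from category A for a new customer.
    slope, intercept = np.polyfit(spend_a, spend_b, 1)
    print(f"correlation={corr:.2f}, anomalous customers={anomalies.tolist()}, "
          f"predicted B-spend for A=40: {slope * 40 + intercept:.1f}")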

In the same way that airline websites use your online data profile to tweak the ticket prices that you see, algorithmic prediction leads to preemptive interventions in social processes. One example is human resources departments, where it's used to predict which employees will be the next to leave. Another is company health insurance, where staff wear Fitbits and pay insurance premiums based on predicted future health. In New Zealand, the government commissioned algorithms to predict which families are likely to abuse their children, based on data available in the first week after birth. And in some US states police stop and search is targeted by prediction software like PredPol.

This preemption forecloses possible futures in favour of the preferred outcome. The smart city will be a concentrated vessel for algorithmic preemption and, because of this, it will be a machine for disassembling due process.

This year in the UK there's been a big fuss about the 800th anniversary of the signing of the Magna Carta ('the Great Charter'). The principle of due process in law is expressed in Clause 39 of the Magna Carta: "No free man shall be seized or imprisoned, or stripped of his rights or possessions, or outlawed or exiled, or deprived of his standing in any way, nor will we proceed with force against him, or send others to do so, except by the lawful judgment of his equals or by the law of the land."

But so much of this is potentially shredded by the smart city; the constant contact with algorithmic systems that can influence the friction or direction of our experience opens the space for prejudicial and discriminatory actions that escape oversight.

The characteristics of algorithmic preemption that disassemble due process include the high frequency and often invisible nature of the resilience adaptations. They also include the fact that, unlike science, algorithmic preemption makes no claim to causal explanation. It simply predicts through patterns, and the derivation of those patterns through abstraction and parallel calculation at scale is opaque to human reasoning. Therefore the preemptions of big data are not understandable as intent nor accountable to 'the judgement of peers'.

Algorithmic productive force avoids causality, evades accountability, and restricts agency to participation and adaptation. To be honest, things are not looking good...

But general computation doesn't predetermine the kinds of patterns that are produced. The network protocols are open, and the ability to take advantage of code is not limited to the powerful. The question is, if there are other possibilities, how can we envision them? If enthusiastic communities participating in bottom-up citizen sensing using accessible tech can be assimilated into the resilience of the smart city, as they can, where do we look for forms of social recomposition that combine community and computation for a real alternative?

I think this is where the ghost of Gustav Landauer arises to guide us. His most famous dictum was first published in “Schwache Staatsmänner, Schwächeres Volk!” in 1910: “The State is a condition, a certain relationship between human beings, a mode of behaviour; we destroy it by contracting other relationships, by behaving differently toward one another… We are the State and we shall continue to be the State until we have created the institutions that form a real community.” You can't smash the state as an external thing; it is this networked relational form.

But the smart city is also a networked relational form. The relations span people, devices and infrastructures, with patterns of relationships modulated by algorithms. Can we use algorithms to contract other forms of relationship? Here, another distinctive aspect of Landauer’s politics becomes applicable. He said that rather than toppling the state, you have to overcome capital by leaving the current order. This is precisely the possibility raised by some current experiments in political prototyping through technology.

The one I want to look at is the blockchain, which is the technology behind Bitcoin. Bitcoin itself dispenses with the need for a central bank by having a distributed ledger of transactions. These transactions can be trusted because of an algorithmic mechanism called 'proof of work' which is basically incorruptible because it's implemented through a cryptographic hashing function. The underlying mechanism is distributed, trustable records that don't require a centralised authority.
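A minimal sketch of the proof-of-work idea (a toy, not Bitcoin's actual protocol): a block of transactions is only accepted once someone finds a nonce whose hash meets a difficulty target, and any tampering with the recorded transactions invalidates that work.

    import hashlib
    import json

    def proof_of_work(block, difficulty=4):
        """Search for a nonce such that the block's SHA-256 hash starts with
        `difficulty` zero hex digits. Finding it is costly; checking it is cheap."""
        prefix = "0" * difficulty
        nonce = 0
        while True:
            payload = json.dumps(block, sort_keys=True) + str(nonce)
            digest = hashlib.sha256(payload.encode()).hexdigest()
            if digest.startswith(prefix):
                return nonce, digest
            nonce += 1

    block = {
        "previous_hash": "00ab12...",     # linking blocks into a chain
        "transactions": [{"from": "alice", "to": "bob", "amount": 5}],
    }
    nonce, digest = proof_of_work(block)
    print("nonce:", nonce, "hash:", digest)

    # Any change to the recorded transactions changes the hash, so the costly
    # work would have to be redone; that is what makes the ledger trustable.
    block["transactions"][0]["amount"] = 500
    payload = json.dumps(block, sort_keys=True) + str(nonce)
    print("tampered block still valid?",
          hashlib.sha256(payload.encode()).hexdigest().startswith("0" * 4))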

Many people are now looking at the role that distributed, trustable records could play beyond cryptocurrencies, through forms of so-called smart contracts. This is where the blockchain could become a protocol for parallel structures.

Smart contracts enable, for example, decentralized autonomous organizations (DAOs). A DAO involves people collaborating with each other via processes recorded incorruptibly on the blockchain. While a lot of the speculation around smart contracts is libertarian, I agree with David Bollier's assessment that they also hold out the prospect for commons-based systems. A smart contract would straight away deal with issues such as the free rider problem, a.k.a. the tragedy of the commons. As the well-known hacker Jaromil, who works on a fork of bitcoin called Freecoin, says: "Bitcoin is not really about the loss of power of a few governments, but about the possibility for many more people to experiment with the building of new constituencies." It seems there could be prefigurative politics in these protocols.

One project implementing Freecoin is the Helsinki Urban Co-operative Farm. This is a community-supported agriculture project, where people collectively hire a grower but where participants can also volunteer to work in the fields. The agreement is that each member does at least 10 hours of work per year, and there are lots of other admin & logistical tasks that have to be done. The complexity and number of these transactions are becoming an issue for the collective, and the plan is for Freecoin to be a decentralized & transparent way to track & reward contributions, maintaining self-governance and avoiding the need to create a centralised institution.
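I'm not describing Freecoin's actual implementation here, but the basic shape of such a record is easy to sketch: an append-only, hash-chained log of work contributions that can't be quietly rewritten and that anyone in the co-operative can total up.

    import hashlib
    import json
    import time

    class ContributionLedger:
        """Toy append-only ledger: each entry carries the hash of the previous
        one, so the record of contributions can't be rewritten after the fact."""

        def __init__(self):
            self.entries = []

        def add(self, member, hours, task):
            previous = self.entries[-1]["hash"] if self.entries else "genesis"
            entry = {"member": member, "hours": hours, "task": task,
                     "timestamp": time.time(), "previous": previous}
            entry["hash"] = hashlib.sha256(
                json.dumps(entry, sort_keys=True).encode()).hexdigest()
            self.entries.append(entry)

        def hours_by_member(self):
            totals = {}
            for e in self.entries:
                totals[e["member"]] = totals.get(e["member"], 0) + e["hours"]
            return totals

    ledger = ContributionLedger()
    ledger.add("aino", 3, "weeding")
    ledger.add("mikko", 4, "harvest logistics")
    ledger.add("aino", 8, "field work")
    print(ledger.hours_by_member())   # who has met the 10-hour commitment so far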

Although this is only one small example of the application of the blockchain to common-pool resources, it is an eerie echo of Landauer, whose practical politics focused on communes for the collective production of food and other necessities. Overall, I'm suggesting that through technologies like the blockchain, Landauer's approach of leaving rather than confronting, reconstituting sets of relationships, and concentrating on common production, could be the Other of the Smart City.

Let me finish by returning to the topic of this panel: resilience and the future of democracy in the smart city. I think the current direction of travel, based on algorithmic preemption, is towards the post-democratic forms of neoliberal resilience. But it may be that the consequent creation of highly computational infrastructures is also an opening for decentralised autonomous organisation, enabling us to 'occupy' computation and implement a kind of exodus (in the spirit of Gustav Landauer) to more federal-communitarian forms supported by protocols of commonality.

Thank you

Jul 15, 2015

Data Science and Phrenology


In this paper I look at data science through the historical lens of phrenology. I take data science seriously in its claim to be a science, and examine its parallels with the methodological and social trajectories of phrenology as a scientific discourse. My aim is not to dismiss data science as pseudo-science but to explore the interplay of empirical and social factors in both phrenology and data science, as ways of making meaning about the world. By staying close to the practical techniques at the same time as reading them within their historical contexts, I attempt some grounded speculations about the political choices facing data science & machine learning.

In contrast to the philosophy and anatomy of the early nineteenth century, phrenology offered a plausible account of the connection between the mind and the brain by asserting that 'the brain is the organ of the mind'. Phrenologists believed that the brain is made up of a number of separate organs, each related to a distinct mental function, with the size of each organ being a measure of the power of its associated faculty. There were understood to be thirty-seven faculties, including Amativeness, Philoprogenitiveness, Veneration and Wit. The operations of phrenology were based on assessing the correlation between the topology of the skull and the underlying faculties, whose influence corresponded to size and therefore to the specific shape of the head. It was used as a predictive empirical tool, for example to assist in the choice of a servant.

The data science that is emerging in the second decade of the twenty-first century offers a plausible connection between the flood of big data and models that can say something meaningful about the world. The most widely used methods in data science can be grouped under the broad label of machine learning. In machine learning, algorithmic clustering and correlation are used to find patterns in the data that are 'interesting' in that they are both novel and potentially useful [1]. This discovery of a functional fit to existing data, involving an arbitrary number of variables, enables the predictive work that data science is doing in the world. While data mining was originally used to predict patterns of supermarket purchases, the potential to pre-empt risk factors is leading to the wide application of data science across areas such as health, social policy and anti-terrorism.
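To make the 'functional fit to existing data' concrete, here is a generic sketch with invented data (not a claim about any particular deployment): a model is fitted to past records and then used to score a new, unseen case.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(2)

    # Invented historical records: two behavioural variables per person and a
    # binary outcome (e.g. whether they bought a product the following month).
    X = rng.normal(size=(500, 2))
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)

    # The 'functional fit': a model mapping the variables to the outcome.
    model = LogisticRegression().fit(X, y)

    # The predictive work: scoring a new case. Note that the fit says nothing
    # about why the outcome occurs, only that the pattern held in past data.
    new_case = np.array([[1.2, -0.3]])
    print("predicted probability:", round(model.predict_proba(new_case)[0, 1], 2))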

The newly developed technique of phrenology was most actively studied in Britain in the years 1810-1840. One of the factors that made it popular was the accessibility of the method to non-experts. For leading exponents such as George Combe it was a key principle that people were able to learn the methods and test them in practice: 'observe nature for yourselves, and prove by your own repeated observations the truth or falsehood of phrenology'. Some historians, such as Steven Shapin, have interpreted British phrenology as a social challenge to the elitist control of knowledge generation, with a corresponding commitment to broadening the base of participation [2]. Shapin saw this as evidence that social factors as well as intrinsic intellectual factors help explain the work done by early phrenology, which 'enabled the participation in scientific culture of previously excluded social groups'.

A stronghold of historical phrenology in Britain was Edinburgh, where it was strongly associated with a social reformist agenda. Phrenologists there believed that the assessment of character from the shape of the skull was not the final word but a starting point for self and social improvement, because environmental influences could be 'brought to bear to stir one faculty into greater activity or offset the undesirable hyper-development of another. Not just the size but the tone of the organ was responsible for the degree to which its possessor manifested that behaviour' [3]. Advocates of phrenology such as Mackenzie asserted that 'until mental philosophy improves, society will not improve' and many felt that their science should influence policies on broad social issues such as penal reform and the education of the working classes.

As it stands now, data science is a highly specialised activity restricted to a narrow group of participants. The fact that data science is seen as a strategic expertise, combined with the small number of trained practitioners, has led to the demand far outstripping the supply of data scientists and its identification by the Harvard Business Review as 'the sexiest job of the 21st Century'. Most data scientists outside of academia are employed either by large corporations and financial institutions or by entrepreneurial start-ups. In terms of its social and cultural positioning, data science as we know it is a hegemonic activity.

Using the predictions of data science to drive pre-emptive interventions is also seen as having a social role. However, the form of these social interventions is shaped by the actors who are in a position to deploy data science. The characterisation of data science as a tool of the powerful derives not only from the algorithmic determination of parole conditions or predictive policing, but from its embedding within a hegemonic world view. The forms of algorithmic regulation promoted by people like Tim O'Reilly have become algorithmic governance. Predictive filtering dovetails with the 'fast policy' of behavioural insight teams, as they craft policy changes to the choice architecture of everyday life.

In the 1840s phrenology ran into problems, with increasingly successful empirical challenges to its validity. In particular, critics questioned whether the external surface of the skull faithfully represented the shape of the brain underneath. If not, as came to be accepted, phrenology could no longer claim a correspondence between observations of the skull and the faculties of the individual. Supporters continued to defend phrenology on the basis of its utility rather than using measurement as a criterion: 'we have often said that Phrenology is either the most practically useful of sciences or it is not true'. But by the mid-19th century both specific objections and the general advance of the scientific method left phrenology discredited.

Unfortunately, phrenology underwent a revival in the late C19th and early C20th as part of a broad set of ideas known as scientific racism. This field of activity used scientific techniques such as craniometry (volumetric measurements of the skull) to support a belief in racial superiority; 'proposing anthropologic typologies supporting the classification of human populations into physically discrete human races, that might be asserted to be superior or inferior'. It was used in justifying racism and other narratives of racial difference in the service of European colonialism; for example, during the 1930s Belgian colonial authorities in Rwanda used phrenology to explain the so-called superiority of Tutsis over Hutus.

In 1950 the UNESCO statement on race formally denounced scientific racism, saying "For all practical social purposes 'race' is not so much a biological phenomenon as a social myth. The myth of 'race' has created an enormous amount of human and social damage." However, the concept of race has been re-mobilised inside genomics, one of the crucibles of data science. Rather than the Human Genome Project closing the door on the idea of race having a biological foundation, as many had hoped, some studies suggest that 'racial population difference became a key variable in studying the existence and meaning of difference and variation at the genetic level'.

The jury is still out on the long term validity of data science as an empirical method of understanding the world. Certainly there is a growing critique, largely based on privacy and ethics but also on the substitution of correlation for causation and the over-arching idea that metrics can be a proxy for meaning. I have written elsewhere about the potential already immanent in algorithmic governance to produce multiple states of exception [4]. However, my purpose here is a different one; to see the unfolding path of data science as propelled by both methodological and social factors and to use the completed trajectory of phrenology as a heuristic comparison.

Instead of being disheartened that, despite the bigness of data and the sophistication of machine learning algorithms, empirical activity is still imbricated with social values, we should recognise this as a continuing historical dynamic. This can be mobilised explicitly to offer a more hopeful future for data science and machine learning than one that derives only from financial or governmental hegemony. Like the phrenologists of nineteenth century Edinburgh, we can choose to see in the methodologies of machine learning the opportunity to increase participation and social fairness. This can be imagined, for example, through the application of participatory action research to the process of data science. As Mackenzie wrote about phrenology, "the most effectual method" (of error checking) was "to multiply, as far as possible, the number of those who can observe and judge". It is as yet a largely unexplored research question to ask how data science can be democratic, and how we can develop a machine learning for the people.

[1] Han, Jiawei, Micheline Kamber, and Jian Pei. Data Mining: Concepts and Techniques. Elsevier, 2011.

[2] Shapin, Steven. "Phrenological knowledge and the social structure of early nineteenth-century Edinburgh." Annals of Science 32.3 (1975): 219-243.

[3] Cantor, Geoffrey N. "The Edinburgh phrenology debate: 1803–1828." Annals of Science 32.3 (1975): 195-218.

[4] McQuillan, Dan. ‘Algorithmic States of Exception’. European Journal of Cultural Studies 18.4-5 (2015): 564–576. ecs.sagepub.com.

Jun 29, 2015

Hannah Arendt and Algorithmic Thoughtlessness


In this paper I warn of the possibility that algorithmic prediction will lead to the production of thoughtlessness, as characterised by Hannah Arendt.

I set out by describing key characteristics of the algorithmic prediction produced by data science such as the nature of machine learning and the role of correlation as opposed to causation. An important feature for this paper is that applying machine learning algorithms to big data can produce results that are opaque and not reversible to human reason. Nevertheless their predictions are being applied in ever-wider spheres of society leading inexorably to a rise in preemptive actions.

I suggest that the many-dimensional character of the 'fit' that machine learning makes between the present and the future, using categories that are not static or coded by humans, has the potential for forms of algorithmic discrimination or redlining that can escape regulation. I give various examples of predictive algorithms at work in society, from employment through social services to predictive policing, and link this to an emerging governmentality that I have described elsewhere as 'algorithmic states of exception' [1].

These changes have led to a rapid rise in discourse on the implications of predictive algorithms for ethics and accountability [2]. In this paper I consider in particular the concept of 'intent' that is central to most modern legal systems. Intent to do wrong is necessary for the commission of a crime and where this is absent, for whatever reason, we feel no crime has been committed. I draw on the work of Hannah Arendt and in particular her response to witnessing the trial of Adolf Eichmann in Jerusalem in 1961 [3] to illuminate the impact of algorithms on intent.

Arendt's efforts to comprehend her encounter with Eichmann led to her formulation of 'thoughtlessness' to characterise the ability of functionaries in the bureaucratic machine to participate in a genocidal process. I am concerned with assemblages of algorithmic prediction operating in everyday life and not with a regime intent on mass murder. However, I suggest that thoughtlessness, which is not a simple lack of awareness, is also a useful way to assess the operation of algorithmic governance with respect to the people enrolled in its activities.

I propose that one effect of this is to remove accountability for the actions of these algorithmic systems. Drawing on analysis of Arendt's work [4] I argue that the ability to judge is a necessary condition of justice; that legal judgement is founded on the fact that the sentence pronounced is one the accused would pass upon herself if she were prepared to view the matter from the perspective of the community of which she is a member. As we are unable to understand the judgement of the algorithms, which are opaque to us, the potential for accountability is excised. I also draw on recent scholarship to suggest that, due to the nature of algorithmic categorisation, critique of this situation is itself a challenge [5]. Taken together, these echo Arendt's conclusion that what she had witnessed had "brought to light the ruin of our categories of thought and standards of judgement".

However, Arendt's thought also offers a way to clamber out of this predicament through the action of unlearning. Her encounter with Eichmann was a shock; she expected to encounter a monster and instead encountered thoughtlessness. Faced with this she felt the need to start again, to think differently. A recent book by Marie Luise Knott describes this as unlearning, "breaking open and salvaging a traditional figure of thought and concluding that it has quite new and different things to say to us today" [6].

I conclude the paper by proposing that we need to unlearn machine learning. I suggest a practical way to do this through the application of participatory action research to the 'feature engineering' at the core of data science. I give analogous examples to support this approach and the overall claim that it is possible to radically transform the work that advanced computing does in the world.

[1] McQuillan, Daniel. 2015. Algorithmic States of Exception. European Journal of Cultural Studies, 18(4/5), ISSN 1367-5494

[2] Algorithms and Accountability Conference, Information Law Institute, New York University School of Law, February 28th, 2015.

[3] Arendt, Hannah. Eichmann in Jerusalem: A Report on the Banality of Evil. 1 edition. New York, N.Y: Penguin Classics, 2006.

[4] Menke, C. & Walker, N.(2014). At the Brink of Law: Hannah Arendt’s Revision of the Judgement on Eichmann. Social Research: An International Quarterly 81(3), 585-611. The Johns Hopkins University Press.

[5] Antoinette Rouvroy. "The end(s) of critique : data-behaviourism vs. due-process." in Privacy, Due Process and the Computational Turn. Ed. Mireille Hildebrandt, Ekatarina De Vries, Routledge, 2012.

[6] Knott, Marie Luise, 2014. Unlearning with Hannah Arendt, New York: Other Press.

Jun 23, 2015

Data Luddism


In this paper I propose Data Luddism as a radical response to the productive power of big data and predictive algorithms. My starting point is not the Romantic neo-Luddism of Kirkpatrick Sale but the historical Luddism of 1811-1816, and the Luddites' own rhetoric regarding their resistance to 'obnoxious machines' [1].

The Luddites' opposition to steam-powered machines of production was based on the new social relations of power they produced, which parallels the present emergence of data-powered algorithmic machines. Drawing on my previous work on Algorithmic States of Exception [2] I outline the operations of machine learning and datamining, and the way predictive knowledge is leading to the irruption of preemption across the social field from employment to social services and policing. I assert that the consequent loss of agency and establishment of new powers unbalanced by effective rights can be fruitfully compared to the effect of new machinery on nineteenth century woolen and silk industries. Based on this I examine key aspects of Luddite resistance for their contemporary relevance.

I compare the adoption of a collective name ('General Ludd'), and the evolution of Luddism as it expanded from the customary communities of Nottinghamshire through metropolitan Manchester and the radicalised West Riding, to the trajectory of the contemporary hacktivist movement Anonymous. I highlight the political sophistication of the Luddites and the way machine breaking was situated in a cycle of negotiation, parliamentary petition and combination, and ask what this means for a contemporary resistance to data power that restricts itself to issues of privacy and ethics.

Most importantly, I assert that the Luddites had an alternative social vision of self-governance and community commons and that we, too, should posit a positive vision against the encroachment of algorithmic states of exception. However, I ask whether (in contrast to the Luddites) we can use the new machines to bring these different possibilities into being. The Luddites saw themselves as a self-governing socius, and I consider recent experiments in technology-enabled self-organisation such as 'liquid democracy' software.

Beyond this, I focus on the Luddites' call to 'put down all Machinery hurtful to Commonality' to ask if we can adapt the machines to support the commons. I examine recent proposals that the blockchain (the technology behind bitcoin) can enable distributed collaborative organisations and tackle traditional issues related to shared common-pool resources, such as the free rider problem [3]. I conclude that if we are serious about resisting the injustices that could come from data-driven algorithmic preemption we have a lot to learn from the historical Luddites, but also that we have the opportunity to 'hack' the machines in the service of a positive social vision.
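As a toy illustration of the kind of mechanism these proposals have in mind (my own sketch, not Bollier's design, and far simpler than a real blockchain): a shared, hash-chained ledger in which each member's withdrawals from a common resource are checked against their recorded contributions, so that free riding is at least visible to everyone.

# A toy, hypothetical sketch: a hash-chained ledger for a shared resource.
# This is not a real blockchain (no distribution, no consensus), only an
# illustration of how a transparent, tamper-evident record of contributions
# and withdrawals could make free riding visible.
import hashlib
import json

class CommonsLedger:
    def __init__(self):
        self.chain = []  # list of (entry, hash) pairs

    def _append(self, entry: dict):
        # each entry's hash incorporates the previous hash, so past
        # entries cannot be quietly rewritten
        prev_hash = self.chain[-1][1] if self.chain else "genesis"
        payload = json.dumps(entry, sort_keys=True) + prev_hash
        self.chain.append((entry, hashlib.sha256(payload.encode()).hexdigest()))

    def balance(self, member: str) -> float:
        return sum(e["amount"] if e["kind"] == "contribute" else -e["amount"]
                   for e, _ in self.chain if e["member"] == member)

    def contribute(self, member: str, amount: float):
        self._append({"kind": "contribute", "member": member, "amount": amount})

    def withdraw(self, member: str, amount: float):
        if self.balance(member) < amount:
            raise ValueError(member + " has not contributed enough to withdraw")
        self._append({"kind": "withdraw", "member": member, "amount": amount})

ledger = CommonsLedger()
ledger.contribute("ana", 5.0)
ledger.withdraw("ana", 2.0)       # allowed: ana has contributed
# ledger.withdraw("ben", 1.0)     # would raise: ben has contributed nothing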

[1] Binfield, K. ed., 2004. Writings of the Luddites, Baltimore: Johns Hopkins University Press.

[2] McQuillan, D., 2015. Algorithmic states of exception. European Journal of Cultural Studies, 18(4-5), pp.564–576. Available at: http://ecs.sagepub.com/content/18/4-5/564

[3] David Bollier, 2015. The Blockchain: A Promising New Infrastructure for Online Commons. Available at: http://www.bollier.org/blog/blockchain-promising-new-infrastructure-online-commons

Jun 16, 2015

Seventeen Theses on DIY Science

An opening provocation for the workshop 'DIY Science: the challenges of quality' at the European Commission Joint Research Centre, Ispra, 16-5-15

  1. diy science doesn't happen inside walls with armed guards [note: this refers to the workshop venue]
  2. the question of quality is really a question of objectivity
  3. leaving the scientific hegemony doesn't mean pure relativism; that is scare tactics
  4. use donna haraway's situated knowledge: objectivity is about particular embodiment, not the god trick
  5. "only partial perspective promises objective vision" (haraway)
  6. the question of DIY science is a question of self-governance: learn from other struggles e.g. luddites
  7. 'open' alone won't save science; learn from open data that does bad as well as good
  8. cultivate disrespect for scientific authority (not dismissal, based on historical contingency of knowledge)
  9. book proposal: "the joy of empirical discovery" modelled on alex comfort's 1972 book "the joy of sex"
  10. science is weak, so it's a good time to be pushing
  11. DIY science is based on social justice - DIY science should always 'punch up'
  12. beware recuperation: we don't do deconstruction to benefit the religious rightwing #1980s
  13. the hack-fab complex could be an opening for neoliberalism cf. squatting. beware assimilation!
  14. DIY science & repression: when you start to make a difference there will be arrests. what are you willing to risk?
  15. DIY science should seek social movements
  16. DIY science is transformative: coming to know should change the knower. find affinity with indigenous communities
  17. the universe is a trickster. "Feminist objectivity makes room for ironies at the heart of knowledge production"

Jan 26, 2015

Auditing Algorithms


Algorithmic Misbehavior and Wider Proactive Engagement

We welcome the Algorithm Audits workshop as an important move towards conceptualising and investigating this emerging area. We also recognise the usefulness of auditing algorithms, as described in the workshop rationale.

However, our own research hopes to extend the scope of investigating algorithms in the world from two directions: the social effects themselves, and the specific algorithmic approaches behind them. In doing so, we also hope to shift the focus from reactive to proactive intervention.

1. In particular, the research problem we would like to tackle is the investigation of emerging algorithmic states of exception [1], where the social action of the algorithms has the force of law while escaping legal constraint. We believe that the topic of 'algorithmic misbehaviour' identified in the workshop proposal is a suitable frame for this research, because it can acknowledge both unintended consequences flowing from the opacity of algorithms and the unethical appropriation of algorithms by institutional actors.

2. We propose a range of interventions to explore this possibility. These include: identifying areas of algorithmic regulation where harmful effects are possible; using critical pedagogy with affected communities to generate data; extending the practices of software engineering to a wider set of stakeholders; and testing the findings through journalistic investigation.

3. Therefore we suggest that three goals for discussion at the workshop should be:

i. How can we audit algorithms which act beyond online platforms?

ii. How can we investigate algorithmic misbehaviours using journalistic techniques, both traditional journalism and the 'social forensics' of Eliot Higgins [2], inverting investigation 'from the outside' by situating it within affected communities?

iii. How can we participate in software engineering in ways that open it up to wider discussions of impact and of 'doing no harm'?

Goal 3 (software engineering) recognises the limitations of audits, including software audits, as a) reactive, and b) unable to encompass all possible outcomes. We believe this connects to a wider debate around computation and ethics, where there are attempts to apply social values to software retroactively. This seems doomed to the same cycle of endless catch-up as we find with legal regulation.

Our proposal is to widen the interpretative community in software engineering, bringing in social science, journalism, big data analytics and the user community at the start rather than afterwards. In software engineering, metrics based on a set of measures are often designed to provide an indication of the quality of some representation of the software. In an approach analogous to the emerging methodologies of citizen science, we suggest that wider communities can be engaged in the following software engineering steps:

i. Derive software measures and metrics that are appropriate for the representation of the software being considered.
ii. Establish the objectives of measurement.
iii. Collect the data required to derive the formulated metrics.
iv. Analyse the metrics against pre-established guidelines and past data.
v. Interpret the analytical results to gain insight into the quality of the software.
vi. Recommend modifications to the software or, if necessary, loop back to the beginning of the software development cycle.

A minimal sketch of what such a measurement loop might look like in code is given below.

Overall, our research approach is one of triangulation through a multidisciplinary methodology and addressing problems through participation. We look forward to contributing to the workshop and to subsequent developments.
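As promised above, here is a minimal, self-contained sketch of steps iii to vi, written as my own illustration rather than anything from the workshop itself. The two measures (comment density and average function length) and the guideline thresholds are assumptions chosen only to show where judgement enters: deciding what to measure and what counts as acceptable is precisely the point at which a wider interpretative community could be involved.

# A minimal illustrative sketch (not the authors' tooling): two assumed
# code-quality measures, comment density and average function length,
# are collected, turned into metrics, and interpreted against assumed
# guideline thresholds, mirroring steps iii to vi above.
import ast

def collect_measures(source: str) -> dict:
    """Step iii: collect raw measures from a Python source string."""
    lines = source.splitlines()
    comment_lines = sum(1 for l in lines if l.strip().startswith("#"))
    func_lengths = [
        node.end_lineno - node.lineno + 1
        for node in ast.walk(ast.parse(source))
        if isinstance(node, ast.FunctionDef)
    ]
    return {"total_lines": len(lines),
            "comment_lines": comment_lines,
            "func_lengths": func_lengths}

def derive_metrics(measures: dict) -> dict:
    """Step iv: derive metrics from the raw measures."""
    lengths = measures["func_lengths"]
    return {
        "comment_density": measures["comment_lines"] / max(measures["total_lines"], 1),
        "avg_func_length": sum(lengths) / len(lengths) if lengths else 0.0,
    }

def interpret(metrics: dict, guidelines: dict) -> list:
    """Steps v and vi: flag metrics falling outside the agreed guidelines;
    the guidelines themselves are where a wider community could intervene."""
    findings = []
    if metrics["comment_density"] < guidelines["min_comment_density"]:
        findings.append("comment density below agreed threshold")
    if metrics["avg_func_length"] > guidelines["max_avg_func_length"]:
        findings.append("average function length above agreed threshold")
    return findings

# Hypothetical source under review, inlined so the sketch is self-contained.
sample = """\
# add two numbers
def add(a, b):
    return a + b

def scale(x):
    y = x * 2
    return y
"""

guidelines = {"min_comment_density": 0.1, "max_avg_func_length": 40}
print(interpret(derive_metrics(collect_measures(sample)), guidelines))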

[1] McQuillan, Daniel. 2015. Algorithmic States of Exception. European Journal of Cultural Studies, 18(4/5), ISSN 1367-5494 (Forthcoming)

[2] https://www.bellingcat.com/

Dr Dan McQuillan, Lecturer in Creative & Social Computing, Department of Computing, Goldsmiths, University of London d.mcquillan@gold.ac.uk
Dr Ida Pu, Lecturer in Computer Science, Department of Computing, Goldsmiths, University of London I.Pu@gold.ac.uk