Image source: Gisela Giardino, Flickr
Concerns about the role Facebook may or may not have played in swaying the outcome of the U.S. election have given the politics of algorithms a renewed platform in public and media discourse. Critical engagements with algorithmic decision making, the lack of algorithmic accountability, and instances of algorithmic discrimination, however, did not emerge with this election, nor are they confined to the echo chambers consumers/voters inhabit on social media.
Global data volume has grown exponentially in recent years, and experts expect this trend to continue. The buzzword character of big data might warrant a few exasperated eye rolls, but the wider trend towards the pervasive datafication of our lives is not one we can just sit out. Big data and the algorithmic decisions it feeds permeate citizenship, healthcare, welfare states, education, finance, and law enforcement, as well as the ways in which we shop, travel, and live our social lives. They can take on a benign air of innovation and efficiency but also carry an intrinsic baggage of surveillance and control.
The notion of algorithmic discrimination describes instances where the outcomes of algorithmic decision making disadvantage, exclude, or disproportionately single out women, racialised groups, queers, trans* folks, religious minorities, the poor, and so on. To be clear, the point is not to debate who is most racist and sexist: the algorithm itself (!), the programmer who wrote it, the data it computes, the company that runs it, or society at large. The point is that, regardless, its outcomes can indeed be discriminatory.
The internet simultaneously serves as a major source of the data at stake, as a tool for social justice and human rights advocacy, and as a tool for surveillance and oppression. A recent UN resolution thus reaffirms the necessity to protect human rights online and recognises the internet’s potential for development, for the empowerment of women, and for the pursuit of all human rights. The resolution calls for a human rights-based approach to security on the internet “so that it can continue to be a vibrant force that generates economic, social and cultural development”.
While the right to privacy has enjoyed the most prominent attention in relation to data and the internet, it is important to note that human rights violations, as well as human rights in and of themselves, are intersectional. As the enjoyment of economic, social, and cultural rights increasingly takes place online, privacy is rarely an end in itself but a prerequisite for exercising other fundamental rights such as education or participation in cultural life, for human rights advocacy, and for any potentially transformative use of big data. In other words, algorithmic discrimination not only disadvantages those at the intersections of race, gender, and sexuality, and thus violates their right to freedom from discrimination, but also (intersectionally) threatens the enjoyment and defence of all human rights.
Of course, once we ditch a blind belief in the objectivity of data and algorithmic analyses, and accept that the work and infrastructures involved in collecting, storing, and computing data are never neutral but manmade (pun intended) and embedded in power relations, it might go without saying that questions of privilege extend to data and the politics of algorithms in more ways than one. In the meantime, the work of documenting and calling out gaps, bias, and the ways in which racism interacts with classism, sexism, ableism, or transphobia to exclude, discriminate against, and further marginalise those underrepresented and otherwise othered in data practices remains essential, and builds the basis for the difficult task of figuring out how to fix the problem(s).
The problem is increasingly framed in terms of the underrepresentation of women in the data: algorithmic discrimination as a result of missing women/data, so to speak. Alongside wider efforts to harness the potential of big data for development and the realisation of human rights, initiatives geared towards closing the gender gap by closing the data gap, that is, improving the quality and quantity of gender data in order to improve the lives of women and girls, are under way. While such projects recognise that big data is no panacea for gender equality, they remain hopeful that inclusion and representation, in the sense of being in the data, hold potential for positive change.
This framing can be read as somewhat at odds with wider feminist and decolonial concerns about the complicity of big data in technologies of surveillance and control, and about the epistemic violence inflicted through the counting, sorting, and managing of populations, of which big data is but the latest incarnation. The resulting tension between counting/being counted/being included in the data and the struggle against non-consensual and disempowering uses of data is not easily resolved, nor should it be. Reflexivity, after all, has long been one of feminism’s flagships for good reasons.
Considering that gender equality and the closing of digital divides remain unfinished projects, yet are essential conditions for the realisation of economic, social and cultural rights, we might frame representation in data as an instance of Spivak’s “what one cannot not want” while we continue to critique and negotiate the terms of that inclusion. The desire for more, better, bigger data, particularly when the data in question is meant to foster equality and empowerment elsewhere, is not so easily disentangled from a development industrial complex riddled with white saviour syndrome. This is not to argue that striving for inclusive data is never in the genuine interest of the equality, health, education, or empowerment of women and other groups underrepresented in the data. An overly celebratory stance, however, risks masking the epistemic dangers that pervade the data/development terrain.
What follows for feminist data practices is that it need not be a contradiction to support the use of data for good, whether through community-driven data projects or the more global endeavours cited above, and to resist the unaccountable and/or non-consensual use of data. On the contrary, it is crucial to question the power relations behind who gets to collect and compute data about whom, and to what ends; the terms of agency, consent, ownership, and access; and the resulting human rights implications every step along the way. As a reflexive practice, this applies as much to data-driven state surveillance, commercial big data practices, and big data for development initiatives as to localised uses of data for social justice, including data-based feminist advocacy projects.
Feminist data practices thus include efforts to expose and redress algorithmic discrimination; increased attention to the rights and agency of those represented in the data; firm opposition to all non-consensual collection and use of data; and adequate safeguarding of the data, privacy, and anonymity of activists and the communities they work with/for. Carefully weighing the data projects we engage in against the risks and benefits to those they seek to support, questioning the power relations such projects challenge and uphold, and learning and sharing knowledge about encryption, privacy, and anonymity tools are prerequisites for all feminist engagements with data big and small.
While the internet, online communication tools, and community data projects remain productive spheres for the advancement of human rights, sexual rights, and the empowerment of women, advocacy work requires a situated and contextual assessment (and mitigation) of the risks its data practices may expose activists, women, queers, and other vulnerable groups to. Similarly, big data projects can promote economic, social, and cultural rights (amongst others) and help detect and counter human rights violations, but they can at the same time pose a threat to fundamental human rights. Where we cannot be confident that the data we collect, use, or provide won’t be appropriated for means and ends that might further harm those it intends to empower, resistance becomes key. Not being in the data, subverting data, not being counted, and not counting others can be a decidedly feminist praxis too.