
Towards Data Justice: Social Justice in the Era of Datafication


Transcript
Hello. So, I'm Arne Hintz, and this is Lina Dencik. We work at Cardiff University, in the beautiful country of Wales, where we are based at the School of Journalism, Media and Cultural Studies, and recently we've launched a new space for research and action, the Data Justice Lab. What we want to do here today is to talk a bit about why we started to think about data in these terms, in the context of social justice, and why we think it's important to advance social justice in the era of datafication, but also to explain a little better what we mean by data justice, and at the end we can have a bit of a discussion about what to do about all this. So, to start: why should we think about social justice in the context of datafication?
There are really two starting points for this. One is the consequences of datafication; the other is the typical responses to data-based analysis and surveillance. Those are the two areas that brought us to think about data justice, to think about it in these terms, and to think about what to do about it. So let's start with datafication. It's a debate that has been had a lot these days, including at this conference. What we mean by it is the transformation of our lives into data points: the collection and analysis of health data, of social media communication, of data about our movements in smart homes and smart cities, about our consumption habits, our networks and friendships, our political preferences. And it's about the way in which the exploitation of the resources of big data has become a key factor for economic success, a means of control, a mode of governance and a new mode of decision-making. Just a few examples to highlight what we mean by this.
In predictive policing, for example, data on neighborhood crime rates, previous crimes of individuals and so on is used to predict who might be a future criminal. Police use programs such as PredPol to tell them where crime is likely to occur and who is likely to be involved in it. Such programs are also used to collect information on, for example, activists and protesters; we've done research on how police in the UK use social media data to, among other things, categorize activists into threats and non-threats. Computer programs calculate a risk score for people who have been arrested, and thereby the likelihood of them committing future crimes, and this risk score is used in court to set sentences for convicted criminals and to decide how long they should go to jail. Supposedly this allows for more accurate, more evidence-based decision-making, but that's not always the case, as we can see in this picture here, from an investigation by journalists at ProPublica about how risk scores are used in the US justice system. Here we see a black woman with a minor offense who was rated higher risk than a white man with a serious criminal record. That's not necessarily because race is included in the system; rather, these programs use factors such as employment and living environment, previous interaction with the police, crime among family and friends and so on, which can then serve as a proxy for race. We know that there is systematic discrimination against black people in the US, that they are stopped and searched far more often than whites and incarcerated far more often, and of course all of this feeds into the program, and then this is what we end up with. One of the problems with these systems is that the algorithms are private, and it's not possible for either defendants or the public to see why they got a particular score.
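
To make that proxy mechanism concrete, here is a minimal illustrative sketch in Python. It is not COMPAS or any real scoring system; the feature names, weights and numbers are invented purely to show how a score that never looks at race can still reproduce racial disparities through correlated inputs.

    # Toy illustration of proxy discrimination in a risk score.
    # All features and weights are invented; no real system is modelled here.

    def risk_score(defendant: dict) -> float:
        """Return a toy 'recidivism risk' score between 0 and 10."""
        weights = {
            "prior_arrests": 1.5,             # number of previous arrests
            "prior_police_stops": 0.8,        # stop-and-search encounters
            "unemployed": 2.0,                # 1 if unemployed, else 0
            "family_convictions": 1.2,        # convictions among family/friends
            "high_crime_neighborhood": 1.5,   # 1 if living in a flagged area
        }
        raw = sum(weights[f] * defendant.get(f, 0) for f in weights)
        return round(min(10.0, raw), 1)

    # Race is never an input, but stop-and-search, arrest rates and
    # neighborhood flags are themselves unevenly distributed, so the
    # score inherits that unevenness.
    minor_offender = {"prior_arrests": 1, "prior_police_stops": 4,
                      "unemployed": 1, "high_crime_neighborhood": 1}
    serious_offender = {"prior_arrests": 3}

    print(risk_score(minor_offender))    # 8.2 -> rated "high risk"
    print(risk_score(serious_offender))  # 4.5 -> rated "lower risk"

Because such weights sit inside a proprietary black box, neither the defendant nor the public can inspect why the first score came out higher, which is exactly the accountability problem described above.
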
But the allocation of services such as insurance is also increasingly based on data. Health insurers are starting to experiment with offering lower rates if customers measure their health with Fitbits and make the data available; car insurers offer lower rates if you install a box that measures your driving. The most far-reaching project is planned by the Chinese government: a kind of social credit score, where every citizen will get a score based on things like criminal records and spending habits, but also their social networks, the kind of information they post, their social engagement and so on. This is then used to decide whether someone gets a loan or a job, better social services, access to good schools and universities, or is allowed to travel, and the idea is that this will be rolled out nationwide in 2020.
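
Again purely as an illustration, and not a description of the actual Chinese system, a cartoon version of such a score-gated scheme might look like the following sketch; the inputs, weights and thresholds are all invented.

    # Cartoon of a "social credit"-style gating rule (invented numbers).

    def composite_score(citizen: dict) -> int:
        score = 600                                            # common baseline
        score -= 100 * citizen.get("criminal_records", 0)
        score -= 50 * citizen.get("late_payments", 0)
        score -= 40 * citizen.get("flagged_posts", 0)          # what you post
        score -= 20 * citizen.get("low_scoring_contacts", 0)   # who you know
        score += 30 if citizen.get("volunteering", False) else 0
        return score

    def entitlements(score: int) -> list[str]:
        granted = []
        if score >= 650: granted.append("fast-track loans")
        if score >= 600: granted.append("good schools and universities")
        if score >= 550: granted.append("air and rail travel")
        return granted

    print(entitlements(composite_score({"flagged_posts": 1})))
    # ['air and rail travel'] -- one flagged post already closes off loans and schools

The point is not the particular numbers but that a single composite score, fed by very different areas of life, gates access to services that have nothing to do with each other.
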
On the one hand, this social credit system is a program for government to predict and prevent risk; we can also call it a program of social control. On the other hand, it is, I think, the most far-reaching experiment of this kind at the moment. In the context of national politics and elections, the use of data has also been discussed a lot recently. Companies like Cambridge Analytica offer voter profiling and targeting services to political campaigns, so that voters can be influenced with messages tailored to particular voter characteristics and particular vulnerabilities, and the role of this in the Trump election and in the Brexit referendum in the UK last year has been widely debated. But more broadly, data also transforms classic notions of national citizenship.
The NSA can legally conduct surveillance on foreign nationals but not on US citizens, so to establish whether a piece of online communication belongs to a US citizen or a foreigner, they identify a number of selectors, such as a phone number, an IP address, language, the degree of interaction with people inside and outside the US, and so on. So if many of your Facebook friends are believed to be foreigners, if you exchange e-mails a lot with someone believed to be foreign, if you talk with them on the phone, if you check international news websites and so on, then maybe you are perceived as foreign according to the data, and you can be monitored. John Cheney-Lippold from the University of Michigan has written about this and has called it an algorithmic form of citizenship, a database version of citizenship. It, he says, quote, "functionally abandons citizenship in terms of national identity in order to privilege citizenship in terms of provisional interpretations of data"; it sometimes aligns with nationality, as in your passport, and sometimes it doesn't. All these examples show that we are categorized according to data assemblages and that our rights and obligations are reconfigured according to these classifications.
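
As a rough sketch of what such a data-based "foreignness" judgement could look like, here is an invented example in Python. It is not drawn from any documented NSA procedure; the selectors, weights and threshold are made up to show how behavioral signals, rather than a passport, end up deciding who counts as foreign.

    # Invented sketch of an "algorithmic citizenship" classifier.
    # Selectors, weights and the 0.5 threshold are illustrative only.

    from dataclasses import dataclass

    @dataclass
    class OnlineActivity:
        foreign_ip_share: float        # share of logins from non-US IP addresses
        foreign_contacts_share: float  # share of e-mail/Facebook contacts abroad
        non_english_share: float       # share of communication not in English
        foreign_news_share: float      # share of visited news sites hosted abroad

    def foreignness_confidence(a: OnlineActivity) -> float:
        """Combine behavioral selectors into a 0..1 'probably foreign' score."""
        return round(0.35 * a.foreign_ip_share
                     + 0.30 * a.foreign_contacts_share
                     + 0.20 * a.non_english_share
                     + 0.15 * a.foreign_news_share, 2)

    # A US citizen with many contacts and interests abroad drifts over the line.
    user = OnlineActivity(foreign_ip_share=0.2, foreign_contacts_share=0.9,
                          non_english_share=0.5, foreign_news_share=0.8)
    score = foreignness_confidence(user)
    print(score, "-> treated as foreign" if score > 0.5 else "-> treated as a US person")

The score, not the passport, determines whether monitoring is allowed, which is exactly the provisional interpretation of data that Cheney-Lippold describes.
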
But these database categorizations don't necessarily correspond, at least not always, to what we experience in our everyday, offline lives, and often we have no way of knowing how we are classified, for what reason, and what we can do about it. There is a new field of study emerging in academia, the field of critical data studies, that addresses these problems of datafication. Scholars in this field look at, for example, data-based discrimination, or at questions of accountability: who is accountable if someone is arrested or sentenced based on a data score? So that's the first starting point for thinking about data justice: the consequences of datafication. The second, for us, was really the set of responses to the Snowden revelations and to the question of protection from surveillance.
The Snowden leaks highlighted the problems of datafication just as digital tools became completely integrated into our lives; we are now talking, of course, about the Internet of Things, about smart devices, artificial intelligence and so on. So it is, in a way, a historic moment in which all of this comes together: the complete datafication of our lives, but at the same time an increased awareness of the problems that come with it, through, for example, the Snowden revelations. At Cardiff University we conducted research on the implications of the Snowden revelations and the responses to them, and there have been mainly two types of responses. There is the technical response: encryption and anonymization tools have been quite successful, there is increased awareness and increased use of these tools, but it is still something that not all people, in fact not many people, are doing, and even civil society organizations are not doing a lot of it. We think one problem with this approach is that technological self-defense is typically an individual act that each individual is responsible for; it puts the onus on me to protect myself, and it doesn't necessarily address the broader transformation of society through datafication. The other approach has been the legal one:
court cases, political advocacy and policy advocacy. Court cases against surveillance agencies have been quite successful; British data retention policy was practically declared illegal through these cases. Policy advocacy has had more mixed outcomes: the UK Investigatory Powers Act was pushed through despite significant lobbying by digital rights groups, while the EU data protection regulation has been a more promising outcome. But again, this response is typically limited to small expert communities, legal experts and rights groups, not necessarily the wider civil society or the broader public. So what about public debate? Well, media reporting has largely focused on justifying surveillance in the context of a broader discourse on national security, and public knowledge and debate have been rather limited. There is unease about being under surveillance, and our research has shown that people are worried, but they largely resign themselves to the fact that data collection remains obscure and seems impossible to avoid. And finally, the main claim by Snowden and by the critics of the surveillance state has been
that surveillance by the NSA and others affects everyone and has to be resisted because of that. And yes, that's a very important point, but at the same time we also know that surveillance affects people with a particular skin color more than others and people of a particular religion more than others, that it affects activists more than passive citizens and poor people more than the rich, and that it may affect those living in one neighborhood more than those living in another, and so on. So we need to look at classic questions of discrimination, power relations and social justice to really understand the implications of datafication. We believe that self-protection and digital rights are important, but that they are not sufficient to address the more fundamental transformation we are witnessing: the shift in the fabric and organization of society through datafication, which impacts people's ability to participate in society, some people more than others, which leads to a renegotiation of civil liberties and democracy, and which shifts power between different forces in society. So it's not merely a question of protecting my messages against state agencies, for example, or of using data ethically; we need to understand these broader transformations in which datafication infuses and changes society, governance and power. So what does that mean, and how do we do that?
OK, so we have started to think about this in terms of data justice. The term data justice has also been used to describe the use of data-driven systems in criminal justice; that's not what we mean here. When we talk about data justice, we are talking about the study and practice of datafication from the perspective of social justice. Social justice here highlights precisely these questions of equality and fairness: how different communities and individual data subjects are implicated in data processes and how they are positioned in society as a result of datafication, as well as questions of who and what it is that drives the infrastructures that shape the way the world is represented and ordered through datafication. An emphasis on social justice, I think, also invites us to consider the nature and role we think datafication and technology in general should or ought to have in society, and what nature and role it should not have. So by data justice we predominantly mean reframing the debate on data, shifting the conversation from concerns primarily with individual privacy to the broader questions of power, equality and fairness that come with it.
As Arne highlighted, this is about dealing with the fact that data are uneven. The idea of mass surveillance somehow suggests that we are all equally implicated in it as individuals; we try to have a more nuanced understanding than that, and to think of it more as what the surveillance scholar David Lyon has described as social sorting, and to emphasize that some groups are more surveilled than others, and surveilled for different reasons than others, that we are not all implicated in this equally. Linked to that, we want to highlight that data processes can discriminate and exclude at all stages: in skewed datasets to begin with, the input; in the actual design of the algorithms themselves, so in how information is weighted, what gets highlighted and what doesn't; and in the outputs, the type of score or profile that gets produced. We also want to highlight that data processes create what we might think of as new stratifications of haves and have-nots, meaning that there is a new power asymmetry between those who are able to access the resources to carry out profiling and those who are subjected to those profiles and are unable even to understand why and how they are profiled the way they are. This raises issues around the sorts of categories that are used to classify citizens in different ways, questions of due process and how you can challenge decisions, issues around transparency, and also the question of who actually owns data, so questions of ownership come up here as well once we start understanding datafication in these terms. And then there is the fact that data processes advance what we might think of as a new politics that is based more and more on prediction and pre-emption, meaning that we are increasingly governed by what we are predicted to do in the future, what we are profiled as intending to do in the future, rather than by what we actually do, how we actually act and who we actually are. This has implications for our understanding of citizenship, and the distance between our data double and who we are in our lived experience becomes incredibly important and very political, and for us a key issue of social justice. So this is what we are talking about; these are the kinds of issues we want to highlight in reframing the debate in terms of data justice. And there are different approaches to this issue.
One approach, and this is the kind of work we have focused on so far, based on our previous research, is about trying to articulate responses to datafication based on social justice, a more systemic and collective form of resistance that feeds into broader social movements than what we get when we talk about encryption and policy advocacy: for example, trying to link up the concerns of different communities. We spoke, for instance, with a community activist in Bristol who deals with questions around fair housing and who feels that surveillance and questions around data aren't really anything to do with them; they are concerned with other things. There is a kind of outsourcing of this issue to technology activists and digital rights groups, and we feel that one response, by reframing it in terms of data justice, is to try to overcome this disconnect that we have in civil society between technology activists on the one hand and social justice activists on the other, and to highlight how datafication also comes to play a key part in the kinds of social justice agendas that community activists, for example, might pursue. A second approach to data justice, particularly in the field of development, emphasizes developing principles that can underpin something like a data ethics framework: Linnet Taylor at Tilburg University, for example, has worked on questions around visibility, representation and discrimination, trying to come up with new principles that can underpin how we should pursue and handle data processes
as well as the uses of data. Then there is an approach to data justice that examines datafication from the perspective of marginalized communities, highlighting points of discrimination and exclusion. For example, Virginia Eubanks of New America has been studying how poor communities in the US engage with data-driven systems, looking at how decisions about who should receive benefits are made through data-driven systems and at how people understand and perceive those decisions; so also trying to understand what interests are at play, and who gets rewarded and who gets punished in these systems, when you take the perspective of already marginalized or disenfranchised communities. We can also think of data justice as applying existing social and economic rights frameworks that are already in place, for example around anti-discrimination, migrants' rights or rights at work, so labor law for example, to datafication, or at least examining how datafication infringes upon, or enables, our ability to enjoy the economic and social rights that we have. And then there is of course the approach to data justice that is about developing alternative data systems, based on architectures that consider social justice in their very design, so practicing computer science in a way that makes politics explicit. Someone like Jeffrey Johnson from Utah Valley University, for example, talks about this in terms of information justice, but there is also the Design Justice Network, and forms of data activism that are about designing data infrastructures that take questions of discrimination and inequality into account in their design, by working with the communities who are already subject to them. So, for example, building cooperative alternatives to various forms of data-driven businesses: there is a cooperative cleaning platform in the Bronx, for example, which is designed together with the cleaners and allows them to be in charge of the decisions that are made through that platform about their work, and about what information should
and shouldn't be there and what data should be collected. So how can we advance data justice? Well, we think we need to continue to do research on how these data processes actually work, because there is a huge lack of public understanding around this, as we have found. We are partly based in a journalism school, and we find that there is also a huge lack of journalistic skill in how to investigate some of these processes in order to hold these algorithms and data-driven decision-making to account, so we also want to advance data literacy amongst journalists and practitioners like lawyers and so forth. We also need to broaden the stakeholders that are involved, to directly connect concerns, for example by including anti-discrimination groups and other social justice activists in the debate, and to bring in historical perspectives on how inequality and unfairness happen in society when we talk about data. And we think it can be advanced, of course, as mentioned, through further
policy development, either by highlighting certain principles that should underpin data processes or by applying rights frameworks that are already in place to some of these issues; and then through further collaborative design, where we actually bring together social justice activists and developers when we think about alternative data infrastructures; and by continuing to question what interests are at play, what forces drive these datafication processes, and to highlight the politics of that. So that's what we want to do with the new initiative we have set up at Cardiff University, called the Data Justice Lab, a collaborative space for research and practice on the relationship between datafication and social justice. We had our public launch in March; it was started by myself, Arne, and Joanna Redden. It is also about trying to advance a European context for some of these debates, because they have tended to be very US-centric so far, particularly around this issue of data and discrimination, to get some context within Europe on this and some European frameworks in place. Among the projects we are developing are things like the use of big data in governance and for social policy, the development of data scores, issues around the datafication of health, issues around the impact on protected communities like refugees in data systems, and the development of alternative forms of smart cities that take social justice into account.
So, to sum up what we want to do with data justice and the Data Justice Lab: we want to reframe the debate, to understand data as a social and economic justice issue. We think we need to think through collective responses to datafication that go beyond individual privacy and what we might call techno-legal solutionism; we think we have to overcome the disconnect in civil society between technology activists and social justice activists and connect their concerns; and for us this is about nurturing alternative political imaginaries for what the role of data in society should be, thinking about how society ought to be organized and about the social organization of technology within that society, beyond the current dominant understandings of datafication as being about efficiency and objectivity. So, thank you.

Metadata

Formal Metadata

Title Towards Data Justice: Social Justice in the Era of Datafication
Series Title re:publica 2017
Author Hintz, Arne
Dencik, Lina
License CC Attribution - ShareAlike 3.0 Germany:
You may use and modify the work or its content for any legal purpose, and reproduce, distribute and make it publicly accessible in unchanged or modified form, provided that you credit the author/rights holder in the manner specified by them and pass on the work or content, including in modified form, only under the terms of this license.
DOI 10.5446/33072
Publisher re:publica
Publication Year 2017
Language English

Content Metadata

Subject Area Computer Science
Abstract We are living in a datafied society in which the collection and processing of massive amounts of data is being used for decision-making and governance across more and more areas of social life. How do we address possible harms and challenges for social justice? Are calls for individual privacy and encryption tools sufficient? This talk will propose a broader agenda to both understand and create social justice in the era of datafication.
