Online platforms as human rights arbiters
Formal Metadata

Title: Online platforms as human rights arbiters
Part Number: 156
Number of Parts: 188
License: CC Attribution - ShareAlike 3.0 Germany: You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor and the work or content is shared, also in adapted form, only under the conditions of this license.
Identifiers: 10.5446/20685 (DOI)
Transcript: English (auto-generated)
00:20
Good morning, everyone.
00:21
Nice to be here. So what I want to talk about today is Internet Giants and Human Rights. It's a research project that I'm currently working on, where I'm looking at what it means for human rights protection that we have large corporate interests, the Googles,
00:41
the Facebooks of our time, that control and govern a large part of the online infrastructure. I want to go through four main themes, hopefully in 20 minutes max, so we have at least five
01:01
or ten minutes for discussion and questions. First, I'll say a few words about the power of these companies, the role that they play. Then I will talk a bit about the challenge from a human rights law perspective of having them held accountable under human rights law.
01:22
Then I'll tell you some of my findings from my current research based on empirical studies of Google and Facebook where I'm looking at the sense-making within the companies, how do they themselves see their role vis-a-vis human rights, and how is this sense-making
01:40
translated into their policies, their specific product features, the governance structures, and then a few words at the end about challenges and way forward, and maybe I would also love to hear some of your comments on that last point.
02:00
So if we start with the powers that these actors have, this is a quote from two Google executives, the vast majority of us will increasingly find ourselves living, working, and being governed in two worlds at once. I think that's a pretty strong quote.
02:23
What Eric Schmidt of Google and Jared Cohen, director of Google Ideas in New York, are basically saying here is that in the future we will basically be governed by two parties. I mean, coming from being a former public diplomat a long time ago, and now in the
02:44
human rights field, this is relatively provocative to me, while at the same time I also understand why he's saying it. And the more I talk to these companies, I get a sense of how they see these issues.
03:01
So this was to give you one appetiser. Then if we go into academia, we have Jack Balkin, an American legal scholar, who states something here that he thinks it's important we remember. We have such a strong narrative of freedom related to the internet, the freedom, the
03:23
way that it has liberated us from classical gatekeepers, and that has been the narrative for a long time, and increasingly we are recognising that we are subjected to structures of control just by different parties and different means.
03:42
And one of these very strong structures of control are the private platforms where we basically conduct a lot of our public life. And the thing about these new platforms, these new infrastructures of control, one of the things I think that's interesting is that it's the same infrastructure that gives
04:06
us the freedom that we so much cherish to express ourselves, to search information, to find like-minded, to counter governments, et cetera. That structure is exactly the same structure that also provides new means of basically
04:25
mapping us, surveilling us, retaining an unprecedented amount of data about us. So there is really very little means of opting out. We are in structures that are both found to be liberating but at the same time also
04:44
entail new means of control. One last quote, also from a legal scholar, Niva Elkin-Koren, who basically states here that it's important to remember that as information becomes crucial to every
05:05
aspect of our life, I guess it always has been, but the conditions under which we process and deal with information, those conditions have changed. And the control over those structures may influence the way we are able to
05:23
participate in modern life. To put it a bit differently, if a lot of decisions about us are being taken within structures where we don't have access, where we're not able to see the points that make up decisions that affect our lives, then it's basically a different
05:44
kind of democratic society or open society from what we're used to. Now, if we turn to Google and Facebook for the matter of simplicity, it could be
06:01
a number of other internet giants, but I'm focusing on those two, and they are very powerful. In terms of economic power, in February this year, Google was the most highly valued company in the world. It has lost that position again now to Apple, but it's just to say that
06:23
it's up there among the three most highly valued companies in the world. I think the net value at the moment is around 550 billion US dollars. That's a company that was only founded in 1998. Facebook, in comparison, is from 2004.
06:43
They've only been on the stock exchange for four years, and Mark Zuckerberg is now the sixth richest person in the world. So we are talking about an immense amount of wealth, and I went to visit both of their headquarters in December last year, and I think it was only when
07:04
I was there in those physical surroundings that I really grasped just how rich they are, just how many people they employ, and more importantly that all this money is basically generated from advertising.
07:22
I mean, it's generated from providing free services. That's really amazing, I think, to think about, and still mind-boggling. In terms of political power, Google now has the strongest lobby presence of all companies in Washington, D.C.
07:44
They spend 20 million US dollars a year on lobbying alone in the US and Europe. This is not meant as a strong criticism of Google. That's not my main message here. My main message is to recognize, or to have us all recognize, that there
08:04
is huge money involved and that there is a strong link to political power. Otherwise, they wouldn't maintain such a strong lobby presence in major capitals. Also, if we look at the flow of executives between these tech spaces
08:23
and government, I mean, people are basically moving back and forth between the US State Department, state departments in Europe, and these companies. So at staff level, too, there's really a great flow. There's a close link between the political power spheres and these companies.
08:46
In terms of social power, they have a huge social power because they have so many users. Basically, the vast majority of us are using their services every day. And by and large, that user base is pretty uncritical.
09:03
We don't have a huge user-consumer movement or anything like that. We have campaigns here and there, but generally, and especially when you move outside countries like Germany or France that are a bit more critical than the average, that's not the narrative.
09:21
When you're in the US, there's not a very critical narrative and in many other parts of the world as well. Finally, I've put down technical power because when you have so much wealth and so much of that wealth goes into engineering, into artificial intelligence,
09:42
into robotics, into algorithm development, et cetera, of course they also have a huge say in what the future of tech development looks like and how it's put to use. So this was to give you a picture that these are not just some companies.
10:02
They really have huge powers and huge influence. And then normally we would think that with great power comes great responsibility. But the tricky thing here is that the human rights treaties that were set out
10:21
after the Second World War to basically protect citizens from abuse of power are all formulated with the state in mind. They were formulated, they were drafted, they were subscribed to at a time where we were imagining power abuse or potential power abuse as abuse by the state.
10:43
Private companies are not bound by human rights law. So they might have taken up human rights a lot in their internal discourse. They are part of a lot of voluntary initiatives and they do good stuff with regard to human rights also
11:01
but they are not bound by human rights law. You cannot put a private company before a human rights court. It's all in the voluntary zone between the legal standards and then corporate social responsibility. That's, you know, a more normative baseline.
11:22
The strongest standard-setting document that we have in the field is something called the UN Guiding Principles on Business and Human Rights, which was drafted by a Harvard professor in 2011. That's the main standard-setting document. It's been widely praised and adopted across the field.
11:42
And it speaks to the company's responsibility to respect human rights and makes a point that all companies should take proactive measures
12:01
to basically mitigate any negative impact they may have on human rights. So they should basically assess all their business conduct and see: is there anything in what we are doing, in our processes, our products, the way we treat our staff, the way that we work in a local community, et cetera,
12:21
that may have a negative human rights impact and, if so, try to mitigate that impact. That's basically the core message with regard to companies. But it's not binding. It's a recommendation. It's widely praised, but it's still a recommendation.
12:41
And then I've also listed probably the most relevant industry initiative related to the tech companies, the Global Network Initiative, that was founded by the Berkman Centre for Internet and Society. A few tech companies in the beginning, it was only like three or four or five.
13:00
I think there are eight now, but all the main ones are in there. And they've also set out a number of baselines and recommendations with regard to how they should ensure that their practices are human rights compliant. However, as I will come back to, there are real limitations
13:21
to the way that they think about and implement human rights within the Global Network Initiative. Okay, then now if we go over to some of the empirical stuff I've done. When I started with this research, I had the promise from the two companies
13:43
that I would get access to talk to key policy people within. However, it has proved quite difficult to get access. It's been a challenge that could, you know, deserve a talk on its own. But I won't go into that here. But I've managed in the end to do around 20 interviews,
14:03
more or less 50-50 between Google and Facebook, a bit more Google interviews. I've also analysed around 20 talks that are in the public domain. So that's the good thing about our age, that you can actually find a lot of the corporate executives and other staff talking about these issues at places such as this.
14:25
And then afterwards, you can basically listen to it. And often, they are actually more frank in panel discussions and stuff like that than when you have them one-on-one. So that has also been very useful.
14:41
And finally, I've attended various policy events around these spaces and also been able to carry out conversations there. And as I mentioned initially, my idea has been to understand, to get a bit away from the naming and shaming discourse, to try to get within, try to understand almost, you know,
15:00
like from an ethnographic perspective, how do they understand and work with human rights? What is their sense-making around these issues? Why is there such a strong disconnect between the way we in my privacy community or my human rights community or a lot of other communities that I know of think about these corporate actors and human rights
15:22
and then the way they think about themselves? What's going on? What's the beef? And how does that understanding then influence the way they work? So, since we don't have that much time, I will go straight to some of the main conclusions that I've found.
15:44
So first of all, there is a strong presumption within these companies of doing good. And that actually makes a critical discourse a bit difficult because they have a strong belief that they are basically liberators.
16:02
They are very much anchored in a narrative of good-doers. And this is not to say that the Google and the Facebookers are not doing good. They also have great potentials in many respects. But it's just that whereas with other older and more established companies,
16:23
there's a more, you could say, mature recognition, a different kind of recognition that, as a company, there are various aspects where you might have problematic issues in the communities where you operate. It seems that within this sector,
16:42
it's really difficult to have that critical discourse. The presumption of being good-doers is so dominant. Also, there's a strong sense of being transformative with the use of technology, really being at the forefront and all the time pushing the limits of what technology can do.
17:02
And that means, for example, that if you raise privacy-critical issues, for example in relation to some of Facebook's practices, one response that you will often encounter would be that, well, we need to push the use of the technology all the time. That's our role.
17:21
And then there's always this sort of reluctance towards new practices, new changes, but gradually this whole discourse, or this whole practice of using technology, of using social networks, is evolving, and we are part of that, and our role is to push the use all the time.
17:41
So a sense of being at the forefront, of being very transformative, yet when it comes to human rights, there is actually a very conservative approach, and by that I mean that there is a sense that human rights threats mainly stem from governments.
18:03
Human rights threats are something that we like to talk about in relation to governments in countries that we don't approve of, the easy cases, so to speak, the Chinas, the Cubas, the North Koreas, et cetera. There are many of these countries, and they can very rightly so be criticized,
18:22
but it's just too simplistic to say that human rights problems and challenges only occur in these places. And especially when we talk about companies that have such a strong impact on users' human rights, it's important to have a recognition of the role they may play, their own negative impact,
18:42
and that recognition is not really there. It's purely about governments. It's about pushing back, holding back against repressive government behavior. So in other words, the Ruggie guidelines that I spoke about earlier, the UN guiding principles
19:02
on human rights and business that speak to the need to assess all your business practices from that perspective, that is being translated into something that looks at business practices in relation to government requests. So there would be a due diligence procedure
19:21
if a government requests the company to shut down a service, but when they take decisions in relation to, for example, their terms-of-service enforcement or their community standards, there wouldn't be the same type of assessment. That wouldn't be perceived as a human rights issue, as a freedom of expression issue.
19:43
So if we zoom in a bit on some of the findings in relation to freedom of expression and privacy, which I've focused on mostly, because they are the two human rights that I think most urgently need to be addressed. There are also other human rights, for sure,
20:00
that would be relevant, but these have been my focus. So a strong free speech identity in both companies. I mean, they're born out of the US West Coast, not surprisingly. They think highly about free speech and they see themselves as true free speech liberators
20:21
and as playing a crucial role in that regard. They have a strong pride in pushing back against government requests, also issuing transparency reports where you can see how many times they have accepted or accommodated a government request
20:42
and under which conditions. At the same time, the enforcement of their own community standards, I've called it community standards here, it's called a bit differently depending on the service. So under Facebook it would be community standards,
21:01
under YouTube it would be community guidelines, under Google Search it's a more narrow regime, so there are slight variations. But for simplicity here, I speak about community standards as the kind of terms-of-service enforcement that the platforms do.
21:20
The volume of things that are removed here are many, many, many times bigger than government requests. Facebook told me recently that they have, I think it was one million items flagged each day,
21:40
each day by users who think that Facebook should look at this specific piece of content and potentially remove it. Yet, the processes whereby these decisions are made, whereby their staff or outsourced staff look at the request, the decisions they make,
22:02
the criteria for making them, which content is removed for which reasons, which content is not removed, all that takes place in a complete black box, seen from the outside. It's simply not possible to get access to that data. So you have a huge amount of content
22:22
that's being regulated in processes that are completely closed to the outside. And more importantly, they are not seen as freedom of expression issues. They are seen as a private company basically enforcing its rules of engagement.
22:41
And from a strictly legal perspective, rightly so, because very strictly legally speaking, freedom of expression is about government restrictions on content on the internet, and even though I think most people, also human rights lawyers, would agree
23:02
that how much content a major platform removes of course has freedom of expression implications, you cannot bring it before a court as a freedom of expression issue unless you could really prove that there were no alternative means
23:22
of expressing that content. I'll have to hurry a bit, I see, given the time. So, very high volume, a mix of norms. By that I mean that the norms that decide which content is removed and which is not are a pure mix of something that's legal standards
23:40
and something that's more stuff that we don't want for other reasons. Not because it's illegal, but because it's found inappropriate or unwanted or goes against the community norms. And it's based on what they call a neighborhood watch program, which basically means that we as users
24:01
are the ones flagging the content that we find problematic, and then the service on the other side makes decisions on what to do with that content. That's also, I mean, from a freedom of expression perspective, that's also pretty problematic,
24:20
because freedom of expression is precisely meant to protect those expressions that the community might not like and nevertheless deserve to be there. Okay, I'll rush through some of the findings in relation to privacy. So, the taken for granted context of these companies
24:41
is what they call the personal information economy. That's a new type of economy that's basically based on personal data as the key source of income. I mean, think about it. All that wealth basically comes from advertisement, targeting advertisement based on all the things
25:03
known about the users. That's what creates the wealth. That's the personal information economy. That's the taken-for-granted context. That's not something that's questioned. And that basically means that when you pose questions about it, the answer will be, well, it's a free service, right?
25:22
Someone has to pay, so the advertisers pay. That's so we can provide a free service to the users. And up till now, alternative business models, for example, where users paid something, a monthly rate or something, hasn't really been in the discourse.
25:40
The default setting is a free service and the personal information economy, and that means that when you talk about privacy, they will list all these increasing measures whereby users can control their privacy settings. And there are increasing means
26:00
of controlling your privacy settings, but privacy control within this context basically means that you can control how you share information with other users. It's what I call front stage privacy control. So I can control which users are to see which information about me to some extent,
26:22
but the backstage privacy control, the flow that goes on behind my back between the company and between other affiliated partners of the company, that's not framed as a privacy issue, that's the business model. So you have the business model, that's the backstage privacy handling,
26:43
and then you have privacy as front-stage user control, the way that we can, yeah, navigate our information among others like ourselves using the service. That's really important to understand, because it basically means that privacy is not about limits on data collection,
27:01
which is a key principle in European data protection. Okay, I'll finish up. So some of the, I've just listed some of the key challenges. One, the business model, that I really think we need to question and to challenge and to discuss with these partners.
27:21
The corporate state nexus, I haven't addressed that very much today, but basically the interchange of data between state powers and corporate powers that we know so very little of still. Then there is, I mean, all these major actors, they are US companies and there is a sense, at least from the people I've spoken to,
27:42
of European privacy, of Europeans being overly concerned with privacy in a way that's a bit incomprehensible to most Americans, at least the ones I've spoken to, because it's just a very different conception of privacy
28:00
and the way that many Europeans see privacy, as something that's really essentially linked to our identity and autonomy. It's quite different from a US perspective, and I also think we need to get that up on the table and address it more openly, because with these global data flows,
28:23
these underlying presumptions, these underlying zones of contestation need to be addressed if we are ever to get some kind of more global agreement on these issues. Then we have the consenting users. Data protection is basically based on user consent
28:41
in the European model, and practically all users consent as a premise for using these services. That also puts some limits on what we can then demand afterwards in terms of data protection. Then there's the very state-centred approach to human rights that is found
29:00
within these corporate entities, and finally what I call the black box, the black box of internal procedures around especially content regulation, which are almost treated as trade secrets, meaning that we can't really get into a dialogue on that. Okay, I think I'll finish here.
29:23
Thank you. Thank you very much, Rikke Frank Jørgensen, for this inspiring talk. We unfortunately don't have the time to ask more questions, but please feel free to find Rikke here at re:publica.
29:45
I think she's going to spend the whole day here, and yeah.