
Speech at scale


Formal Metadata

Title: Speech at scale
Number of Parts: 132
License: CC Attribution - ShareAlike 3.0 Germany: You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor and the work or content is shared also in adapted form only under the conditions of this license.

Transcript: English (auto-generated)
As I said, I'm very much looking forward to the next half hour, which is going to be a keynote conversation.
And this conversation is going to be held between Richard Allan, who's the Director of Policy for Europe, Middle East and Africa at Facebook, and Ulf Buermeyer, member of the Digitale Gesellschaft and creator of our fantastic re:publica app. At this point, maybe a short applause for the re:publica app.
I hope you're all using it successfully. So, the two of them will be discussing for the next half hour the topic of how to develop a policy for a global platform like Facebook with its different target audiences and different cultural contexts. Please give them both a warm welcome.
Yeah, a very warm welcome from my part. Well, hello, Richard Allan. Robert Allan, that's right. Richard Allan, I'm sorry. Hello, Richard Allan, I'm very happy to have you here today to talk about Facebook public policy. You are the Director of Public Policy for Facebook in Europe
and the continents next to Europe. And I'd like to talk in the first place about how the importance of Facebook for public debate, for debate in society, changes your ways of thinking about what people may say on Facebook and may not say. For example, Facebook played a big role in...
Problem? Okay, okay. Yeah, Facebook played a big role in the revolution in Syria, in the revolution in Egypt, and Facebook plays a big role in public debate in Germany as well.
Everybody who is active in politics knows that without a proper Facebook page, it's very difficult to address electors. So, what are the rules that you have about what people may say on Facebook, and how do you develop these rules? So, I'm sorry, can you use this one?
So, I think, just at the outset, I'd like to say that Facebook as such is a neutral platform. So, I think we want to be really clear, when we think about major political events that take place, they take place because individuals, often very brave individuals on the ground, use whatever tools are at their disposal to effect political change.
So, it's been really clear in the context of the Middle East, individuals decided that they wanted to effect political change, and one of the tools that they used was our tool, just as they used all modern communication tools and used them very effectively. In terms of our responsibility, we think it is important that we remain a neutral public space, albeit a public space that's privately managed.
I know that's something we're going to talk about in a little bit more detail, but I think at the outset, we should think about that, that many of the public spaces that we interact within are, in fact, privately managed public spaces, just like this conference here today. If I engaged in hate speech with another individual at this conference,
and that individual objected to it, they would go to the conference organisers, who would then decide what remedy to take against that individual, to tell them to shut up or throw them out or refer them to the police if it was serious enough. And so, in a sense, we believe that we run one of those very important public spaces, like many of the spaces that I move within during my daily life,
and that we want to create a neutral platform within which people can talk about the things that are of interest to them. And in order to do that, we do need to have rules that help us to keep that space orderly, just as the conference organisers here, or the managers of the hotel where I stay tonight, need to have rules to keep those spaces orderly. Yeah, as you said, you do need rules on Facebook.
Not everything is permissible on Facebook, but how do you set these rules? Of course, there are rules set by the states, so there's state legislation, national legislation, that forbids saying certain things publicly. For example, in Germany, we are very sensitive to right-wing activism.
You can't say something about... Don't say the thing you can't say, yeah. Just don't say it. You can't say certain things about the genocide, for example. You can't say them, and they are illegal in Germany. In other countries, they may be perfectly okay as a matter of freedom of speech. So, how do you set these rules? Which statements are okay on Facebook, and which statements do you forbid?
So, I think when we think about internet services more generally, internet platforms like Facebook, there is a sort of popular idea sometimes that the internet is an unregulated space. And I think that's just simply untrue. When you start up a service, you're primarily regulated by the jurisdiction within which the organisation that creates that service,
whether commercial or non-commercial, exists. Which is the US, in this case, in California. And I think in some ways, what you're doing is you're creating a new offshore island. When I was a kid, I was fascinated by, off the coast of Iceland, there was this new island created called Surtsey. A volcano erupted, and this island appeared.
And that island was virgin territory, but it still belonged to Iceland and was within the Icelandic jurisdiction, a country that's very relevant for our media debate today. And so, in many ways, when a new internet platform emerges, it's like a new offshore island, but it's attached to a host jurisdiction. And so, it has to comply with the rules, the jurisdiction,
the framework of that host jurisdiction, and that can lead to differences. I mean, many of the big internet services are offshore islands off the coast of California by origin, and we'll talk about how they've evolved since. You know, if somebody in Germany creates a new service, or somebody in Uzbekistan, or the United Kingdom, or Uruguay, you know, they will have that home jurisdiction as their starting point.
And then within that, they'll create a framework. But the idea that it's completely unregulated, I think, as I say, is mythical. It's that you have a home regulation, and then you need to adapt to global regulation. Okay, so you do have some legal standards from the country of origin, but there are many, many countries in the world
that have even stricter standards. For example, I quoted genocide statements in Germany, very critical. I know that Turkey, for example, has laws that forbid certain statements about Kemal Ataturk, for example. And many, many countries have rules of that kind. How does Facebook react to these rules? Do you only allow statements that are legal all over the world?
Because that would very much limit free speech on your platform, of course. I mean, not the latter. I mean, the way it works is having created this internet service, as soon as you start to become significant in other countries, and of course, remember that an internet service by default is global. Unless you take, you know, a technical action
to stop people everywhere in the world accessing your service, they will by default access your service. So you'll start to gain users around the world by default, in different jurisdictions, and then you may take active steps, you may translate or market into particular countries to gain a user base there. And I think it's fair to say that as soon as you start operating at scale,
the authorities of different countries will take an interest in your platform, and will come to you with requests which relate to their specific local law. And there are different ways of dealing with this. Again, to take the island analogy, one way is to say that I'm going to have a different landing stage for people from different countries. So you have a .de, a .fr, different domains that you use,
and you say, we will restrict the restricted content on those domains only. And of course, that's been used by Google for a number of years, Yahoo, I think, a number of other companies. That's limited, though. I mean, firstly, it doesn't actually stop access to the content, it just stops access to the content via that domain.
But where a country is trying to be reasonable, they may accept that as good enough. But secondly, it's limited in that the whole point of the island, the space you've created, is to have a global conversation. Having people going to separate little cantons on the island is not really the desired outcome. So what you end up doing is saying, look, if you're giving me clear evidence
that a particular piece of content is illegal, I will effectively block that from view from visitors from your country to my island. So they can come, they can enjoy 99% of it, but this small amount of content, and typically it is a small amount of content that is clearly illegal in Turkey or Germany or anywhere else,
you know, at your request, public authorities with due authority, we will make sure that small piece of content is not available to those visitors using a technology known as IP blocking, which I'm sure many people are familiar with. And most of the major internet service providers will now operate on that basis. But as you explain, that requires that inside Facebook
there is somebody who decides if a request is going to be followed or not. So you really judge requests by national authorities. And what I'd like to know is, what are the standards that you follow, and what is the procedure that you follow internally? I imagine there is some kind of team that has a look at these requests and judges them.
Because I don't know, and I think it's not public really. Could you explain a bit more how you address these requests and what are the rules that you follow? So we've published our community standards, those are our global standards on the site. And if you go to facebook.com slash policies,
you'll find our community standards in, I think, around 20 supported languages now that we offer them. And they will tell you the global overall guidance. And so the staff that deal with incoming reports in our company will apply those rules. Where they see something exceptional, the kind of cases you're talking about, where it's not against our general standards, but somebody is claiming that it's against local law,
then they will escalate that small set of claims. And as I say, just a reminder, these typically are at the edges. The good news is that most countries permit most speech most of the time. It's amazing how far we've actually gotten. A lot of countries, even where they have laws that prohibit certain forms of content, don't really enforce them very aggressively.
And I think that's quite a helpful thing, because they may be heritage laws that they've had for some time, but are not particularly relevant today. So if we get one of those requests, if it's valid and they do seriously want us to look at the content, it's escalated, and we will then do legal review. We will get, typically, outside counsel in the country concerned
to tell us their understanding of the law and our obligations under both the local law and international law. And we'll then, having escalated that, we'll then either deal with that specific piece of content or come up with a general rule, a piece of guidance that our staff can have. So, for example, Nazi emblems in Germany is well known,
the law is well known, and our staff can be given general guidance on that. There may be other instances where, you know, we see one of these requests in three years, and therefore it's a one-off item. They have to be escalated, they have to be legally reviewed. Okay, legal review is a good topic, because I'd like to talk a bit more about what individuals can do
if they are confronted with a decision from your side. Let me just quote one example. The fan page of the Left Party in the European Parliament was shut down by Facebook several weeks ago. I don't want to go into the details of the case, because we can't decide if that decision was right. My point is that the Left Party tried to appeal your decision,
and actually they just got an email from Facebook stating, we don't know what you're talking about, we don't know which page you're talking about, and we won't comment on that issue. So, actually, they did not have the possibility to address the issue with Facebook. There was no conversation. And, you know, from my professional background, I'm very much into due process, fair trial.
That's something that's very important to me, and I have to try to do that every day. And to my feeling, that was not fair trial that they got, because they just weren't heard in court. I mean, as you said, you're doing some kind of judicial review, so you're some kind of court inside of Facebook. You decide which blocking request you are going to follow.
So, I'd like you to explain, why is there no due process? Why can't people appeal your decisions? I mean, again, just as a kind of headline warning, Facebook is not perfect, which I'm sure many people in the room would agree with that statement, if nothing else I've said. Facebook is not perfect. We make mistakes.
Everybody does. That's logic. And our processes are not perfect either. We've been improving them over time. And that's really what I and my colleagues work on, is how do we get from something which is really quite simple, operating a relatively restricted community within a relatively restricted number of jurisdictions and a restricted number of languages,
to something which is incredibly complex, operating across tens of countries in multiple languages with a huge array of complexity around the different legal systems and the content that's there. I describe that complexity not to say that that's an excuse, but to say that's what we're trying to deal with. And just to give you a sense of some of those improvements,
I mean, certainly, yes, we get it wrong. And I've described it, again, using that analogy of a privately managed public space. Sometimes people in those spaces make the wrong decision. Every time I've been thrown out of a bar, it was never my fault. It was someone else's fault. But I've been thrown out, nonetheless. And then when I want to come back in and complain about it,
again, I would expect the same. I want to be heard by the bouncers. I want to be able to get back into it. But what may happen is that those staff are then busy dealing with the next set of complaints that come in. And that's very much where we've been, I think, to a certain degree, is where dealing with the complaints, the first round of complaints and issues that people have raised, has been so time-consuming.
It frankly has been hard to put in place appeal mechanisms and things like that that we recognise we also need. We have improved. We now are rolling out a process of page appeals. So for the instance you've described where a page has been taken down, we're putting in place an appeal system for that. So we may take your word on that. Yeah, the process has started rolling out already.
I'm not sure how many languages it's yet in again. You know, the page appeals team need to be able to operate in each language. So we roll them out over time. So the Facebook court is coming. The Facebook... I wouldn't call it a Facebook court. And again, I just want to be careful. When you're dealing with things at scale, all of these principles of fair processing and fair response are there.
We try really hard. We don't want to annoy people. We don't want to annoy our users. And we don't want to make the stupid mistakes that get us these terrible headlines that then stick in people's memories. So we do try hard to avoid it. But at the same time, when you're operating at this kind of scale, you know, to expect the kind of heavyweight process you get in the judiciary, or almost expect the police and judiciary to intervene in every dispute
that you have in a domestic space or, say, one of these sort of public spaces like this, I think is unrealistic. So the police and the judiciary will intervene in the most serious of cases with the kind of process you're familiar with. But in a lot of the sort of more immediate cases that occur on internet platforms,
I think the system of private resolution is here to stay. And the challenge for us is how do we make that work as well as possible? May I ask you to... Okay, there we go. Okay, so the Facebook court is not coming, but you are in the process of establishing some rules
where people can get their appeals heard. Exactly, and doing things like double review, you know, having a second reviewer look at cases as well as a first reviewer, and only taking action where they agree. There are various systems like that that, you know, three or four years ago we weren't doing, we're implementing progressively over time to get as good as we can, frankly.
Okay, so just one more question addressing the issue of content blocking on Facebook. There have been discussions about the rules that have been set, which may sometimes be somewhat incomprehensible, at least for European users of Facebook. There has been discussion about pictures of female breasts
being blocked on Facebook, even if they weren't used in any kind of sexual context. Could you please explain a bit more why Facebook sometimes fails really to distinguish if something should be blocked or not? I mean, just on the nudity thing, some people say it's kind of, you know, it's an American conspiracy to kind of restrict that.
I think it is actually related to the US roots of Facebook, but I don't think it's a conspiracy. It's just, you know, this is how things evolve. And I described at the beginning the way that services are rooted in a particular jurisdiction. And it just seems to me self-evident that, you know, the United States authorities take a particular interest
in the distribution of nude imagery, particularly to children, you know, or in places where children may be present. That's a particular reflection of their concern. And so services like Facebook that originate in the United States will acquire those concerns quite naturally. For example, we have a minimum age limit of 13.
Now, most of Europe doesn't have any minimum age limit for Internet services. Indeed, in many European countries, Internet services like ours sign up much younger children. But the United States has a very clear law, COPPA, which requires that 13 age limit. Actually, the EU is about to, I think, implement a similar law for Europe. But we've acquired those sorts of things into our rules,
and they've now become part of the global rule set. But as you said, sometimes you do accept national laws that restrict content. So why don't you, in this case, for example, accept national laws or European laws that allow much more than the United States law would allow? And that's a live discussion. I mean, just to say, within the company,
people want to go as far as they can, but what we can't do is resolve one problem and cause another problem. And we started off talking earlier about the Arab Spring and the spread of our service within a lot of countries in the Middle East. A lot of those countries in the Middle East will block services that have a large amount of nudity on them. And so we wouldn't necessarily gain something
if we make European users happy by allowing more nudity if the net result is that other parts of the world then have a problem with that rule change. So we're not averse to rule changes. We've updated our rules numerous times. We do allow breastfeeding. We allow photos of breastfeeding women. But to give you an example on that, some of our staff, I think, were overzealous
and, under the nudity rules, removed pictures of women breastfeeding, which they shouldn't have done. If you go on Facebook today and type in breastfeeding, you will see lots of photos of women breastfeeding. Nobody kind of believes us because of that mistake. But last week, somebody posted some images of women using breast pumps. Now, the breast pump images didn't have a baby in them,
so the staff originally thought, well, that isn't a breastfeeding photo, so that doesn't meet our rules. We've now had to adjust our rules to say that breast pumps and breastfeeding are okay. We're constantly having to adjust the rules, but what we don't want to do is adjust them in a way, you know, a thoughtless way that then caused a whole load of knock-on problems. And that's what's happening every week.
Yes, we're talking about Facebook rules. We have already addressed the question that Facebook is of great importance for public debate. The standards for public debate in the really public sphere are usually set by governments, and people elect the government, so they have some kind of democratic authority. The government sets rules, at least basically along with the votes of the people.
But how does this happen on Facebook? I've never heard of Facebook elections. Are there any plans to really listen to your users, to maybe at least consult users which rules they'd like to be set? Again, just to be really clear,
Facebook produces no content. So, all of the content, including the political content on the platform, is produced by individuals. Yes, of course, but you block or you allow, and as we discussed, you have rules, and these rules are at present set by Facebook employees. If I get it right, they have no democratic legitimacy, as they are not elected by anybody.
But to make a distinction, again, you know, I go to my hotel bar this evening, and there are rules. I can't walk into the hotel bar naked without them taking action. But I can say what I like politically to another individual in there. I can't shout out racial abuse in the bar, you know, without them taking action against me. So, there are a whole set of rules that govern the conduct of the public space
that don't necessarily govern the content. And, you know, that's the line that we really try to draw. We want to deal with harmful content, content that actually genuinely causes harm to others, or deal with content that leads to a disorderly space, or a space that the massive community of people of all ages
doesn't want to be in because it feels unsafe to them. But user involvement is not currently on your agenda? Again, just to be clear, users are involved constantly. Yeah, because they create content. When it comes to setting rules, that was the question. What about setting rules? I mean, again, sitting there, you know, when you're in Facebook, in the media storm, every time we make a bad decision,
media, activist groups, individual users themselves, we never get a moment when we're not conscious of what people who use our service feel about the rules we're operating. Again, we may not react quickly enough, we may not react accurately enough, but certainly when you're sitting inside the company,
the user voice is very loud and clear. So, you don't let people vote, but you listen? Exactly, yeah. It's a consultative arrangement. Okay. I'd like to address a third field that is government requests to get user data out of Facebook. So, I imagine that you are confronted with quite a lot of requests
that police agencies, maybe even secret services, address Facebook and would like to have the user profile of user X or Z. How do you react to these requests? Do you follow them or do you evaluate them in any way? So, again, there's a public policy on our site at facebook.com slash policies.
You can look at the law enforcement piece and we're very public about how we operate that. Essentially, any law enforcement agency with lawful authority can submit a request. And what we then do, again, as some of the process I described earlier, is check on the lawfulness of that process, check whether it's consistent with generally accepted human rights standards and norms.
And then our team, we have a dedicated law enforcement team within the company that operates from the United States and from our international headquarters in Dublin. They will evaluate those requests. And if they're lawful, consistent with international human rights standards, then they can... and consistent with prevailing law, because, of course, there is a lawful set of restrictions
on what you can do disclosing data from the United States or from Ireland. If it's consistent with all of those, then they will work with law enforcement in those cases. So, we do already have some kind of Facebook court in place there, at least. So, you do evaluate and you check if the requests are legal in the country where they come from and in the U.S. and Ireland.
Exactly. And consistent, again, I think, across the industry. Yes, the EFF, the Electronic Frontier Foundation, has just evaluated some internet services, including Facebook, when it comes to their policies on whether they inform users and whether they fight against government requests.
And they found out that you haven't fought any requests in court yet, and that Facebook users are not currently informed that their data has been transferred to government authorities. Do you have any plans to be a bit more transparent here? Because people can't fight warrants themselves if they don't know that their data has been transferred,
and Facebook doesn't fight either. So, basically, there seems to be no real protection of user data, either by Facebook or by the users themselves. Do you have any plans to inform users, or to protect your users by fighting warrants in court? Just to be clear, if we had requests that we thought were unwarranted,
inappropriate, then we would fight them. It's a question of whether we're actually getting such requests. What happens in practice, again not just for Facebook but for most providers, is you get a request like that, you say to them, no way, Jose, we're not going to meet this request, and the agency withdraws it. I mean, they don't go to court, and so you have no need to go to court.
That's the typical arrangement that takes place. Of course, we can imagine circumstances where somebody wants to keep pushing that, and if they do, then we would engage in that battle as necessary. In terms of informing users, again, the law varies from jurisdiction to jurisdiction about whether or not that's permissible.
And there's a live debate within the industry about how far one should go. For example, again, to be really honest, most of the requests that we get from law enforcement, if they're escalated to us, usually involve some kind of serious crime. And so there is a very open debate about the point at which somebody should be informed about law enforcement interest in them
if they're in the middle of carrying out serious crime. And we're happy to engage in that debate, but we don't think there's a simple answer. I mean, most requests that come to us are not someone trying to get hold of data about a political activist on WikiLeaks. They really are about somebody trying to get hold of data on somebody who is in the middle of conducting
some horrendous criminal activity. But still, I mean, just compare your policy with the policy of Twitter, who have already fought several warrants in the California courts, for example, and who systematically inform their users. And I'm sure that Twitter as well will respect California law and US law. So I think there is some kind of margin
that you may use in the interest of your users. And that's why I say it's an open debate across the industry. But, you know, the EFF report is about you. I mean, we're not talking about the industry at large, but about the debate inside Facebook: what speaks against informing users? I mean, precisely the situation I've described. If, you know, we get a warrant that is related to an activity that we can see
is clearly an ongoing criminal activity, people use the cliché of, you know, paedophiles and terrorists all the time. But the reality is those are the kind of requests that you typically receive. It's somebody, you know, who's carrying out an offence against children, for example. In those circumstances, you have to ask whether, even if you could legally inform that person about the investigation,
whether or not that would be appropriate. I think that's a very relevant debate to be had. And again, perhaps this is an area where we would much rather see it settled by lawmakers, who can make that kind of judgment, than by us. But at the moment, that's where we are. And finally, are there any plans inside Facebook to publish transparency reports?
It's an ongoing issue of debate. We're looking at it. We've looked at what other companies have done. And of course, it's something that I would say adds to the public debate. It certainly puts more information into the public domain. And so it's on the table, but there's no decision. But isn't that some kind of public responsibility as well?
I mean, you're a platform including about a billion people. So you're really setting standards for the industry. I'm not going to disagree with you. Thank you very much for being with us today. Thank you.