Citizens or subjects? The battle to control our bodies, speech and communications
Formal Metadata

Title: Citizens or subjects? The battle to control our bodies, speech and communications
Series: 35C3 Refreshing Memories (talk 28 of 165)
License: CC Attribution 4.0 International: You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor.
Identifiers: 10.5446/39336 (DOI)
Transcript: English (automatically generated)
00:19
And now to announce the speakers. These are Diego Naranjo, who is a Senior Policy Advisor
00:28
at EDRi, and Andreea Belu, who is Campaigns and Communications Manager, also at EDRi. EDRi stands for European Digital Rights, which is an umbrella organization of European NGOs active in the field of freedom, rights, and the digital sphere. The CCC is actually
00:44
a founding member of it, and they will be talking about citizens or subjects, the battle to control our bodies, speech, and communications. The floor is yours. This one, this one here, this is my phone. There are many like it, but this one is mine.
01:12
My phone is my best friend. It is my life, and I should master it as I master my life. This is my phone, but what makes it mine? I mean, it might be quite obvious right now that I'm
01:28
holding it for all of you. What is not that obvious, though, is that my phone is also holding me. On one hand, we use our phones. We use them to connect to the internet, get online
01:46
with our friends, exchange opinions, coordinate actions. On the other hand, we are used. We are used by third parties, governmental and private, who, through our phones, through our devices,
02:06
monitor. They monitor our location, our bodies, our speech, the content we share. At EU level right now, there is a sort of pattern, this tendency, a trend almost.
02:28
Certain laws, like e-privacy, the Copyright Directive, and the terrorist content regulation, have this very central core that we call body and speech control. It looks like it is really the
02:45
driving force at the moment. So, in the next 40 minutes or so, what we will do is give you short updates about these laws, talk to you a bit about what their impact is on us and what they mean beyond article X and Y, and hopefully convince you to get involved in
03:07
changing how they look right now. We are 39 human rights organizations from all across Europe.
03:20
We work on all sorts of human rights in the online environment, so-called digital rights. We work on data protection, net neutrality, privacy, freedom of expression online, and so on. Andreea and I are glad to be here for our very first time at 35C3. Now, to continue that adapted quote from Full Metal Jacket.
03:46
My phone without me is useless. Without my phone, I am useless. We spend most of the seconds of our lifetime around devices that are connected to the internet, whether a phone, a computer,
04:01
a fridge, or whatnot. This means that these devices pretty much become attached to our bodies, especially a phone. Tracking these devices, therefore, is equal to tracking our bodies, controlling our bodies. For the purpose of this presentation, we will talk about online
04:27
tracking in terms of location tracking, the tracking of our devices; the behavioral tracking of users on websites, how much time they spend on what part of a website, where they navigate next, how many clicks they make; and the tracking of communications
04:43
where do they navigate next, how many clicks do they give, and the tracking of communication sent between two devices. First, location tracking. They are on average more screens on most of the households than we have people. We carry some of these devices in our pockets,
05:06
and they have more personal information than most diaries used to have before. Our phones need to be tracked because they need to be able to receive and send calls, messages, data. But this opens, of course, new ways to use location data for commercial purposes,
05:23
but also for state surveillance. When it comes to behavioral tracking, tracking our behavior online provides a lot more information than just location. It adds on top of it, right? A user can then be targeted according to that tracking, and this
05:46
targeting process, the targeting, not the tracking, basically represents the business model of the internet nowadays. For this reason, the more complex and detailed someone's profiling is,
06:03
well, the more accurately the targeting can be done, the more effective and efficient it is most of the time, and therefore the more valuable the data in the profile is. You can see here a really cool infographic from Cracked Labs on Acxiom's and Oracle's profiling of
06:28
populations. You see the number of variables, the amount of information, and the depth to which it goes, and you get that business model's cash flow.
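As a rough illustration of the mechanics described above, here is a minimal sketch in Python of how tracked behavioral events can be folded into a profile and then into advertiser-facing segments. The event names, the segment rule, and the threshold are all invented for illustration; real brokers such as Acxiom or Oracle work with thousands of variables, as the infographic shows.

```python
from collections import Counter

# Hypothetical tracked events: (user_id, event_type, detail)
events = [
    ("u1", "page_view", "pregnancy-vitamins"),
    ("u1", "page_view", "baby-strollers"),
    ("u1", "search",    "maternity leave rules"),
    ("u1", "location",  "clinic-district"),
]

def build_profile(events, user_id):
    """Aggregate raw tracking events into a per-user interest profile."""
    profile = Counter()
    for uid, _, detail in events:
        if uid == user_id:
            for token in detail.replace("-", " ").split():
                profile[token] += 1
    return profile

def segments(profile):
    """Map a profile onto advertiser-facing segments (invented rule)."""
    segs = []
    if profile["pregnancy"] + profile["baby"] + profile["maternity"] >= 2:
        segs.append("expecting-parent")  # a famously high-value segment
    return segs

p = build_profile(events, "u1")
print(p.most_common(3), segments(p))
```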
06:44
And this business model is quite interesting. I wouldn't imagine a postman going to my physical mailbox at home, going through my letters, and then putting some advertising leaflets in there according to what he reads. Right now, Gmail and many other services,
07:03
that's what they do. They live off, as you well know, reading your emails to sell you stuff that you don't really need. Facebook conversations are now, through the API, an option as well. They want to read those conversations in order to find patterns, for
07:21
example, of intellectual property infringements, especially counterfeiting, but not only copyright. WhatsApp metadata is now also used by Facebook in order to know who your friends are, who you contact, and who your family is, in order for those social media services
07:41
to gain more and more power, more and more data, and more and more profit, of course.
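To make concrete why metadata alone is so revealing, here is a small illustrative sketch in Python, using a made-up message log rather than any real API: it infers who talks to whom, and how closely, without ever touching message content.

```python
from collections import Counter

# Hypothetical message metadata: (sender, recipient, hour_of_day).
# Note: no message content appears anywhere in this log.
metadata = [
    ("alice", "bob", 23), ("alice", "bob", 22), ("alice", "bob", 7),
    ("alice", "dentist", 10),
    ("alice", "mom", 19), ("alice", "mom", 20),
]

contacts = Counter(recipient for sender, recipient, _ in metadata
                   if sender == "alice")

# Frequent late-night contacts suggest close relationships;
# a single daytime contact suggests a service provider.
for person, count in contacts.most_common():
    hours = sorted({h for s, r, h in metadata if r == person})
    print(person, count, "messages, typical hours:", hours)
```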
08:02
The Lives of Others: quite a good movie. I guess everyone has seen it; if not, you should. It's basically about a Stasi agent who follows the life of an individual through a period of time. After the Snowden revelations, this movie has changed in a way from a drama to a soft comedy, because the capabilities of surveillance services to surveil all of us, to get so much data, and of companies, to get so much intimate
08:24
information from us, have doubled, tripled, or perhaps grown exponentially compared to what the Stasi could do back then. All of this, to some extent, is going to be regulated by a regulation with a very,
08:43
very long name. I guess half of you have fallen asleep already just reading it, but I'll go through it quickly to let you know what this is about. Why do we need to know about the e-privacy regulation? Why is it important for the control of our bodies and our devices? E-privacy is about online tracking. You might have heard of the
09:05
cookie directive, the one bringing all those cookie banners onto your screens; in part, they're due to a bad implementation of this directive. It is also about your emails: who is going to be able to read your emails and to use the data from them to sell you advertising?
09:24
How confidential can that information be? It's also about your chats, how you communicate nowadays with WhatsApp, Signal, Wire, or any other app. Finally, it's also about location data. Who can track you? Why can they track you? What are the safeguards that need to be put
09:43
in place in order to safeguard your privacy? I can imagine many of you saying, well, don't we already have this GDPR thingy from all the emails I received in May? Yes, we do have that, but the GDPR was not enough. After more than four
10:02
years of discussions to achieve this General Data Protection Regulation, we have achieved a lot; the GDPR was the best possible outcome in the political scenario of the time, though there's a lot left to do regarding implementation. We have seen problems in Romania, in Spain, and we expect that to happen in many other places, but we still need
10:26
a specific instrument to cover the right to privacy in electronic communications, including everything we mentioned before: metadata, chats, location data, the content of your communications, and so on. So e-privacy is basically meant to complement the GDPR and be more
10:44
focused on exactly the topics that Diego mentioned. What did we advocate for and still do? Privacy by design and privacy by default should be the core principles, the pillars of this regulation. Moreover, politicians need to recognize the value of maintaining and enhancing secure
11:07
encryption. Cookie walls. I mean, we should be able to visit a website without having to agree to being tracked by cookies. This is another topic that we strongly advocated for. And finally,
11:23
content should be protected together with metadata, in storage and in transit.
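For readers who have not seen the mechanics behind the cookie-wall debate, here is a toy sketch in Python of how a single third-party cookie lets one tracker correlate visits across unrelated sites. The requests, cookie value, and site names are fabricated; no real ad-tech API is being modeled.

```python
# Each page embeds a resource from the same tracker, so the browser
# sends the same tracker cookie no matter which site you are on.
requests_seen_by_tracker = [
    {"cookie": "uid=42", "referer": "https://news.example/politics"},
    {"cookie": "uid=42", "referer": "https://shop.example/baby-clothes"},
    {"cookie": "uid=42", "referer": "https://clinic.example/appointments"},
]

history = {}
for req in requests_seen_by_tracker:
    uid = req["cookie"].split("=", 1)[1]
    history.setdefault(uid, []).append(req["referer"])

# One cookie, one browsing profile spanning every embedding site.
print(history["42"])
```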
11:42
And we actually succeeded: at the end of last year, 2017, the parliament adopted a very good, very strong text. It addressed most of the problems that we pointed out and supported the values that we stand for. But this has
12:03
been quite a ride. I mean, it wasn't easy. As Diego said, we're a network of 39 organizations. We're not just legal people or tech people; it's a combination of both. So when we provided our input in the shape of analyses or recommendations, some bulleted there,
12:27
all sorts of skills were combined. And this played a big part in our success: the fact that we were able to provide a comprehensive yet complex analysis of what encryption should look like and how cookies should behave, and also a legal analysis of existing legislation.
12:49
The diversity of our skills became productive. Did we win? Well, we are on our way. After the EU parliament adopted its position, now it needs to sort of enter into a discussion
13:05
with the member states in what is called the Council of the EU. So the parliament, with a strong position, is now to talk with the member states. Currently, the negotiations around e-privacy are not really moving forward. They are being delayed by the national
13:21
governments. They claim that there are issues that need to be tackled, that it is very technical, that we already have the GDPR and need to see how it is implemented. And member states fear that another layer of protection may impede the growth of some businesses in the European Union. And if this were not enough, they're also afraid of getting bad press from the press,
13:44
which right now depends to a high extent on behavioural advertising. They say that without tracking you all over the internet, they are unable to sustain their business model. And of course, since the national governments, the politicians, are afraid of that bad press
14:03
from the press, they are quite cautious about moving forward. Online we exercise our free speech in many ways, but one of those ways is the way we produce, share, or enjoy content online.
14:21
Our opinions and the people with whom we communicate can be seen as a threat at a given time by certain governments; we've seen that trend in Poland, Hungary, and to a certain extent in Spain as well. All of this information can also be very profitable, as we see with the mainstream social media platforms we were mentioning before.
14:46
So there are political and economic reasons to control speech, and that's why the best way to control speech is to control the way content is shared online. Right now
15:02
there are two proposals that raise huge threats to freedom of expression online. Both propose upload filters by increasing liability for platforms, making platforms, companies, responsible for the content which they host. One of them is the famous, or infamous,
15:20
Article 13 of the Copyright Directive proposal, and the second one is the regulation to prevent the dissemination of terrorist content online. Both of them, as you will see, are just another way to make private companies the police and the judge of the internet.
15:40
This is the first one, the proposal for the directive, again with a long name; let's just stick to the short name, the Copyright Directive. And this Copyright Directive is based on a fable. The fable goes like this: there is a wide range of lonely, poor songwriters in their attics trying to produce songs for their audience. Then there are these big platforms,
16:04
mainly YouTube but also others, that allow these uploads and profit from them. And these platforms give some pennies, a small amount of money, to these authors. The difference between what the authors earn and what they should be earning is what they call the value gap. The fable
16:23
conveniently hides the fact that the music industry itself keeps reporting, year after year, that its revenues increase by a high percentage every year and keep growing, especially in the online world. What is the solution to this problem? Well, as you can imagine,
16:44
it's a magical algorithm. This algorithm will filter each and every file that you upload to these platforms, identify it, match it against a database, and block or allow the content depending on whether it is licensed or not, whether they like you or not,
17:05
and according to the terms of service in the end. As we will mention, there are some technical and legal problems with upload filters. In essence, if they are implemented, it will mean that YouTube and Facebook will officially become the police and the judge of the internet. The other big
17:27
fight that we have is around terrorism, to be specific about terrorist content online. After the Cold War, we needed a new official enemy once communism fell. Terrorism is a new
17:40
threat; it's very real to some extent, we lived through it in Brussels recently, but it has also been exaggerated and inserted into our daily lives. We see that in airport controls, surveillance online and offline, and restrictions on freedom of assembly and expression all over Europe. And whenever a terrorist attack occurs, we see pushes for legislation and
18:06
measures that restrict our freedoms. Usually those restrictions stay even after the risk has disappeared or been reduced. Again, here we go with a long name.
18:21
Let's stick to the short name, TERREG. It's the regulation to prevent the dissemination of terrorist content online. This proposal allegedly aims at reducing terrorist content online, not illegal content, terrorist content, in order to reduce the risk of radicalization.
18:46
This ignores what we have seen through experience: that a lot of radicalization happens outside the online world and has causes other than online content.
19:01
It seems that politicians need to send a strong signal before the EU elections: we need to do something strong against terrorism, and the way to do that is through three measures. Three measures, three, as we will see in a minute. First, speedy takedowns, my favorite.
19:22
my favorite. Platforms will need to remove content based, which has been declared terrorist content by some competent authorities. And this definition of terrorist content is of course vague and also incoherent with other relevant pieces of legislation which are already in place
19:40
but not yet implemented all across the EU. This removal needs to happen within one hour; this is sort of fast-food principles applied to the online world, to audiovisual material. And they give you some sort of complaint mechanism, so if you have any problem with your content being taken down, you can go and say: this content is legal, please put it back.
20:05
But in practice, if you read it, you will see that it's likely to be quite ineffective. First of all, overblocking will not be penalized: if they overblock legal content, nothing will happen; if they leave up one piece of content which is illegal on their platforms,
20:23
they will face a sanction. The second issue is those measures for voluntary consideration. According to this second measure, the state will be able to tell platforms,
20:40
I have seen this terrorist content on your platform, this looks bad, really, really bad, so I really feel that I have to ask you: could you be so kind as to have a look? Just if you wish, of course, no worries. And the platform will then decide, according to its own priorities, how to deal with this voluntary request.
21:03
Third, good old upload filters; that's the third measure they are proposing. Upload filters, or general monitoring obligations in legal jargon, are prohibited in EU legislation. But anyway, let's propose them and see what happens. And in order to be able to push them
21:26
into the legislation, let's give our filters an Orwellian twist, let's call them something different. So we call them proactive measures. Platforms will need to proactively prevent certain content from being uploaded. How will they prevent this? Upload filters, of course.
21:47
I meant proactive measures. And whether it is copyright or terrorist content, we see the same trend, this one-size-fits-all solution: a filter, an algorithm that will compare all the
22:03
content that is uploaded, match it against a certain database, and block it or not. We will need many filters, not only one: filters for audio, for images, for text, and also one specifically for terrorist content, however that is defined.
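To make the matching mechanism concrete, here is a deliberately simplified sketch in Python of the compare-against-a-database logic the speakers describe. The database entries and the exact-hash fingerprint are stand-ins; real systems such as Content ID use far more elaborate perceptual fingerprinting.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    # Real filters use perceptual audio/video fingerprints; an exact
    # hash stands in here just to keep the sketch short.
    return hashlib.sha256(data).hexdigest()

# Hypothetical rightsholder database: fingerprint -> decision.
rights_db = {
    fingerprint(b"label-owned hit song"): "block",   # claimed, unlicensed
    fingerprint(b"licensed background track"): "allow",
}

def on_upload(data: bytes) -> str:
    """Decide an upload's fate before anyone else ever sees it."""
    if rights_db.get(fingerprint(data)) == "block":
        return "rejected"        # never published, no human in the loop
    return "published"

print(on_upload(b"label-owned hit song"))   # rejected
print(on_upload(b"my holiday video"))       # published (unknown content)
```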
22:25
So this is basically the principle of lawmaking today: we really want filters, what can we invent to have them? We've got an issue with filters. Well, quite a few issues, but one big issue. First of all,
22:46
they're illegal. The European Court of Justice put it like this: a social network cannot be obliged to install a general filtering system covering all of its users in order to prevent the unlawful use of musical and audiovisual work. That was the SABAM
23:04
versus Netlog case. Despite this, it seems that automated content filters are somehow okay, just not "general filtering covering all of its users". Of course, there are the technical issues.
23:21
Yeah, there are some technical issues. One of my best, most magnificent examples of how filters do not work is James Rhodes, the pianist who, a few weeks ago, tried to upload a video of himself playing Bach in his living room. The algorithm detected some copyrighted content owned by Sony Music and automatically took down the content. Of course,
23:47
he complained and got the content back, but it's a good example of how filters do not work, because a piece by Bach, who died almost three hundred years ago,
24:00
is, of course, out of copyright. If this video by a famous artist is taken down, we can imagine the same for much of your content. So not only do filters not recognize what is actually copyrighted and what is not, they also don't recognize exceptions such as remixes,
24:22
caricatures, or parodies. When it comes to copyright, filters can't tell the difference, which is why memes were a central part of the protests against Article 13, and why, as we will show soon, these filters have huge potential as a political tool.
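The reason filters structurally cannot honour exceptions is that they match on similarity, and a lawful parody is, by design, similar to the original. In this toy sketch in Python, with a fabricated two-row "image" and an invented threshold, the parody lands inside the blocking distance while unrelated content stays outside; nothing in the computation can represent intent or context.

```python
def dhash(pixels):
    """Tiny difference hash: 1 bit per horizontal neighbour pair."""
    return [int(row[i] > row[i + 1])
            for row in pixels for i in range(len(row) - 1)]

def distance(h1, h2):
    return sum(a != b for a, b in zip(h1, h2))

original  = [[9, 5, 7, 2], [3, 8, 1, 6]]   # the claimed work
parody    = [[9, 5, 7, 2], [3, 8, 6, 1]]   # same image, caption area edited
unrelated = [[1, 2, 3, 4], [4, 3, 2, 1]]

THRESHOLD = 2   # invented: "close enough to count as the same work"
for name, img in [("parody", parody), ("unrelated", unrelated)]:
    d = distance(dhash(original), dhash(img))
    print(name, "distance", d, "-> blocked" if d <= THRESHOLD else "-> allowed")
```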
24:48
Another issue with automated content filters is that they don't recognize context either. When it comes to hate speech or terrorist content, they can't tell nuances. A girl decided to share her traumatic experience
25:06
of receiving a lot of insults in her mailbox from a person who hated and threatened her: she copy-pasted them into a post on her Facebook account,
25:25
and her profile was taken down. Why? Because the automated solutions can't tell that she was the victim, not the actual perpetrator, right? And this is very likely to continue happening if this is the solution put forward. Then it's also a problem for SMEs, of course, because
25:44
these filters are very expensive. YouTube spent around 100 million dollars to develop Content ID, which is the best worst filter that we have online now, so we can imagine how this is going to go for European SMEs that will need to copy that model, probably by getting
26:03
a license from them, I imagine, in order to implement those filters online. In the end, this will just empower the big companies who already have their filters in place, so they will keep doing business as usual, while a new company that would like to develop a different business model will be prevented from doing so, because it will need to spend a lot of money
26:24
on these filters. Then there's the issue of privatized law enforcement, the privatization of law enforcement. Attributions change: what used to be state attributions are now shifted over, "can you take care of it?", to entities that are not really driven by the same values that a state
26:47
should at least be driven by. I'll just give you one example from a project called the Mandola project, a study commissioned by the parliament to look at the definition of hate speech
27:01
in different EU member states. Their conclusion: there are huge disparities between what hate speech means in Germany, what it means in Romania, and what it means in the UK. So in this context, how can we ask a company like Google or Facebook
27:27
to come up with the definition? I mean, are their terms and conditions the standard that should influence our legal definitions? Am I the only one seeing a conflict of interest
27:46
here? There's the problem that once we have these filters for copyright infringement or any other purpose, like terrorist content, we will of course have them as a political tool. Once we have this for copyright, why would you not use it to look for dissidents in every
28:04
country? Who counts as a dissident changes very often; I see that in Spain, but I see it all across the EU nowadays. So once we have them in place for one thing, one small thing like copyright, why not for something else, something more political? There's a really interesting example
28:21
coming from Denmark. A year or a year and a half ago, the Social Democrats announced their immigration plan. They made a video in which Mette Frederiksen talked about how great their plan is. Some people were happy, some were sad. Some of the sad ones decided to criticize the plan
28:47
and made a video about it. It was a critique in which they caricatured her, using two audio bits from the announcement video. The Social Democrats
29:06
sent a letter to the NGO accusing them of copyright infringement and threatening a lawsuit. Obviously, the NGO thought, yeah, we don't really have enough money to go through a big court case, so we're just going to take the video down. They took it down.
29:24
Now, why is this case important? If an automated content filter for copyrighted material had been in place, the Social Democrats wouldn't have had to lift a finger. The job would have been
29:42
done automatically. Why? Automated content filters can't tell exceptions such as parodies. This is a very clear case of how copyright infringement claims can be strategically used to silence critical voices in the political sphere. We see a few threats to fundamental
30:06
rights. First, on privacy: they will need to scan every piece of content in order to decide what to discard. Then we will live in a sort of black-box society that will affect freedom
30:20
of speech. We will also face over-censoring, overblocking, chilling effects, and tools that are going to be repurposed politically. In a nutshell, rights can only be restricted when there is proof of necessity, when the measure is proportionate, and when the measure is
30:41
also effective. These filters are not necessary for the ends they want to achieve; they are not proportionate, as we have seen; and they are not effective, as we have also seen. So, in effect, they are an unlawful restriction of freedom of expression and privacy rights. Now, obviously,
31:02
we were also unhappy about this, and I mentioned before how we organized within our network to fight for a strong e-privacy. When it comes to copyright, this fight went beyond our
31:21
network. It got a lot of people mad: librarians, startups, the UN Special Rapporteur, all of those there basically, and more, and in the end even YouTube, which thought about endorsing our great campaign. What we learned from these fights is that we really
31:51
need to share knowledge between us. We need to team up, coordinate actions, be patient with each other. When it comes to different skills, it is important to
32:05
unite them. When it comes to different perspectives, it is important to acknowledge them. If we're separate individuals by ourselves, we're just many, but if we're together,
32:20
we're one big giant. That is where the impact lies. Now, this is basically a call to you. If you're worried about anything that we've told you today, if you want to support our fight,
32:41
if you think that laws aimed at controlling our bodies and our speech should not be the ones that rule us and our internet, I think it's time to get involved. Whether you're a journalist writing about privacy or other topics, whether you're a lawyer
33:03
working in a human rights organisation, whether you have a technical mindset, or whether you have no clue about laws or anything like that, come talk to us. We will have two workshops, one on e-privacy, one on upload filters. We will be answering more questions if
33:24
you have any that you can't ask today, and try to put together an action plan. We also have a cluster called About:Freedom, which you can't see there but is right by the info point. Do you have any questions or comments? Thank you.
34:04
There's ample time for Q&A, so fire away if you have questions. Walk to the microphone, wave your hand. Signal Angel, are there questions from the internet? Microphone number one.
34:39
So the question is: if the content is encrypted, how will companies be obliged to implement these
34:43
filters? Good question, I don't know. I don't think that's going to be possible. They're going to find a way around that: either they ban encryption in the channels, or it doesn't matter because they will make you liable. If you have a platform with encrypted channels and, for whatever reason, they find on it any copyrighted
35:05
content which is not licensed, which you're not paying money for, they will make you liable. Perhaps in practice they will not be able to catch you, because they will not be able to access the content, but if they find a way to do so, they will make you pay.
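The tension in this answer is easy to demonstrate. In the following minimal sketch, using the real Fernet API from Python's third-party cryptography package and an invented blocklist, a platform relaying end-to-end encrypted messages holds only ciphertext, so a database-matching filter has nothing to match; liability, rather than technical scanning, becomes the remaining lever, exactly as the speaker suggests.

```python
# pip install cryptography
from cryptography.fernet import Fernet

# Key shared only between the two endpoints, never with the platform.
key = Fernet.generate_key()
sender = receiver = Fernet(key)

blocked_plaintexts = {b"some copyrighted track"}  # invented database

message = b"some copyrighted track"
ciphertext = sender.encrypt(message)   # what the platform actually sees

# The platform's filter can only inspect ciphertext, which never
# matches the plaintext entries in its database.
print(ciphertext in blocked_plaintexts)          # False: filter is blind
print(receiver.decrypt(ciphertext) == message)   # True: endpoints unaffected
```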
35:22
Okay, microphone number two. Thank you very much for the presentation. You've been talking a lot about upload filters. A lot of the telcos and the lobbyists are saying that the upload filters don't exist. The trilogue mechanism for the copyright reform
35:41
is, as I've heard, ending in January and there will be a solution in the European legislation process. How will we be able to inform this process and influence it to try and make it better before the European elections? Well, we still have time. That's why we're here.
36:04
One of our main goals in being at 35C3, apart from enjoying the conference for our very first time, is to mobilize all of those who have not been mobilized yet. Thousands of people have been active: they have been tweeting, they have been calling their MEPs, the Members of the European Parliament, and they have been contacting the national governments. But we still have time. The
36:25
vote will be sometime around January, February, we still don't know. We are afraid it's going to be sooner than expected. But this is the last push, the last push to say no to the entire directive, to say no to upload filters, and that's why we are here, because we still have time.
36:41
In the worst-case scenario, we will go to the implementation phase, of course: we go to the national member states and say, do not do this, this goes against the Charter of Fundamental Rights, and we will stop it there. But either now, which is my hope, or, in the worst-case scenario, we will stop it for sure in the member states.
37:06
Microphone number one. If these measures are only voluntary, what incentives do the companies have to implement them?
37:20
What do the companies have to do with that? Why should they do that, since it is voluntary? Well, they could do it for different reasons, for example because they could get bad PR. Imagine you are a small company in Hungary and then Orbán comes and tells you: you need to block this, because I think this is terrorist. It comes from a human rights organization.
37:40
What would you do if you were an SME that depends on, perhaps not the government, but the general structure? You could get bad PR from the government; you could perhaps be sanctioned for not acting promptly on this terrorist content. But it's true, that is only for your voluntary consideration.
38:02
Again, microphone number one. When I see a problem, I also think, oh, there is a technical solution, so it's hard for me to admit that maybe there isn't, but it does look like that's the case here. Also, you mentioned the workshop, maybe more for people with a, I mean, anybody can come, but eventually more with a legal background,
38:24
which I don't have; I'm a developer, but I want to understand how a system works, and I understand a little bit about the European process and the regulatory process, but not so much. So what's the most efficient way for me, as a developer, to get a better grasp of how this system works, how all those laws and regulations get implemented, and all the different steps?
38:45
Well, yeah, we didn't come to the Lawyers' Computer Congress, we came to a Chaos Computer Congress, so I hope you can make chaos out of it. We need developers, we need lawyers, we need journalists, we need graphic designers, we need people with all sorts of
39:01
skills, as Andreea was saying before, and we need the developers to develop tools that work. If you are capable of developing any calling tool, tweeting tool, or any sort of tool that we can use to carry our message and take it to Brussels, to the Members of the European Parliament, to the national member states, we really need you. If we need something, it's developers. We have enough lawyers in this world; I think we have too many in EDRi
39:25
with myself already. So we need you tomorrow, and the day after tomorrow. Okay, any other questions? In that case, I'll ask one myself. Andreea, what would be a good start
39:42
at the member state level to start campaigning if you've never campaigned before? What would be a good start if one wanted to campaign at the member state level and has never campaigned before? Yes, campaigning for dummies. Well, we've got a lot of organisations in EU member states. As a person who had never
40:08
campaigned before and was looking for someone to campaign with two years ago in Denmark, I was advised to look for the Danish EDRi member, so I did, and we managed to organise
40:22
a lot of great workshops in Denmark where nothing existed before, because IT-Pol, the Danish member, had a very thorough grasp of the political environment. Most EDRi members understand how
40:41
this works, how the dynamics work, politically but also with journalists, and what the interests in a given country are. So I would say that finding your national EDRi
41:01
organisation is the first step, and then uniting with the rest. And if there's no EDRi member, you can always contact consumer organisations, you can directly contact your members of parliament, or you can organise with two or three friends and make a few phone calls; that alone is already something you can do, so there are many ways for you to help out.
41:25
Of course, make sure you contact your country's MEPs. At European level, we are represented, and we actually get to elect the parliamentarians; they're the only ones
41:41
who are elected by us and not just proposed by governments or other politicians. So if we want to stay connected to our own member state but influence a law at European level, let the
42:03
EU parliamentarians know that we are here and we hear them, and that they came from our country to represent us at EU level.