
Surveillance by Design


Formal Metadata

Title: Surveillance by Design
Number of Parts: 132
License: CC Attribution - ShareAlike 3.0 Germany: You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor, and the work or content is shared, also in adapted form, only under the conditions of this license.

Content Metadata

Abstract:
While the collection, storage and analysis of our data become ever cheaper and easier, governments around the world are eager to make the surveillance of citizens the default setting. It has therefore never been more important to explore countermeasures that would protect our fundamental right to privacy. While the European Union continues to take positive steps to ensure that public and private bodies protect the privacy of citizens (for example through the Data Protection Regulation), much work remains to be done in addressing issues of due process in how governments use, protect, and request user data. Specifically, government requests for so-called "lawful access" to user data are trending in both democratic and non-democratic nations, presenting one of the greatest challenges for the protection of fundamental rights. This talk will highlight this issue and provide a brief overview of the main challenges facing citizens in protecting their privacy, including some recently proposed laws in the EU and the US, showing that bit by bit, our freedoms are being chipped away. The second half of the talk will focus on the need for domestic and international jurisprudence that protects our fundamental rights, and more broadly on what can be done to counter the surveillance state.
Transcript (English, auto-generated)
I'm going to switch to English now, since it's an American speaker. Canadian? Oh, I'm sorry. That's almost an insult, right?
It's just a different country, that's all. Big difference, I know. Her name is Raegan MacDonald, and she's going to talk about surveillance by design. And yeah, I'm going to leave it up to you. Thank you. Applause, please.
Hi, everyone. So the first thing that I'll say, if you were in the discussion before and you did understand German, I will say forget everything that Kirsten and Alex just told you. I'm Raegan. I'm originally from Canada. I work with an organization called Access.
It's an international NGO that defends the digital rights of users around the world. And I head up the Brussels work. So I've been based in Brussels in the heart of the European Union's sausage-making machine. So today I want to talk to you about surveillance by design. And just a note, I know it's a little bit late, and I'm standing between you and beer.
So I will try to be as brief as possible and hopefully allow for some time for discussion and questions. So I will just do a short introduction, and then I want to walk you through the foundational principles of surveillance by design
and then conclude with what we need and what we can do to make the situation better. So it's quite obvious to say that the explosion of electronic communications has affected us and continues to affect us in ways that we're still not even conscious of.
But the pace of change is definitely something to note. For example, the invention of modern electronic communications, the telegraph, came in 1844, and the telephone three decades later. But the second communications revolution, which happened in the 90s and was basically the opening of the internet to commercial traffic,
the massive laying down of fiber optic cables around the world, and the worldwide adoption of mobile phones, took place in roughly 10 years. These massive transformations basically happened overnight. So laws are not able to keep up with these changes, which is a big part of why we found ourselves in this situation.
And then there was you. In 2006, Time magazine made you the person of the year. It was the dawn of the age of Web 2.0. Web 2.0 was a new way of interacting with the internet.
It focused on user power and user-centered design. It was about information sharing, social networking, interoperability, collaboration, and wikis. No longer would the individual be a passive consumer; they would also have a part in production. So there were buzzwords like the prosumer, because Web 2.0 meant that individuals would be both producer and consumer instead of just a consumer.
But when you scratch the surface on Web 2.0, you realize that our participatory element is not actually measured through user-generated content
or our value in producing, but actually through the mining and harvesting of our data. So today, in 2013, we're undergoing another transformation and taking another step forward. We have new buzz words, such as big data, which is the result of the combination of myriad pieces of information,
which continue to blur the line between what is personal data and what is public data. The Internet of Things, where more and more devices are becoming connected to the network, from your fridge to your alarm clock to your washing machine.
By 2020, people have estimated that there will be 100 billion internet-connected devices. This emerging digital power grid will be run on cloud computing services. This means that the origin of the data being processed and the location in which it is stored
might have conflicting laws that govern the protection of that information. When you combine this with the prevalence of mobile technology, what this means is that every minute of every day, we are basically hemorrhaging data.
This also means that it's much easier for third parties to get access to any and all of that information, whether it's companies, some we might know, like the Facebooks and the Googles, and others that we don't know, like large data brokers such as Acxiom, or government agencies.
Last year, the European Commission put forward a proposal to update the privacy rules in Europe through a Data Protection Regulation. This law would, among other things, give individuals much more control over their personal data and encourage accountability and transparency among the public and private bodies that control your data.
My personal obsession in the proposed Data Protection Regulation is privacy by design. This is a feature in the proposal, and it's the concept where governments and public and private bodies basically bake privacy
into the products and services and policies about personal data. This is a very welcome step, but when you take a step back and you look at the big picture and the way our communications infrastructure is built,
you see that there are flaws in the design of our communications ecosystem. And there are two main problems with surveillance by design. Not only does excessive surveillance infringe on our rights and undermine our ability to be able to trust the technology that we have come to depend on for our day-to-day lives,
but a communications infrastructure that is built with surveillance as the default in fact undermines our security. So that's the introduction, and now I want to walk through the six foundational principles. They are proactive and retroactive policing, surveillance as the default setting,
end-to-end insecurity, zero-sum game, opaque and undemocratic policymaking, and no respect for user rights. I'll go through them, and there will be a quiz later. So the first one.
The use of more and more intrusive surveillance techniques by public bodies has resulted in two distinct paradigm shifts in policing, proactive and retroactive policing. The first one, as the collection and storage of data becomes much cheaper and much easier,
law enforcement is increasingly prone to collect first and ask questions later. The Data Retention Directive is a primary example of this. It was passed in 2005 in the European Union, came into effect in 2006, and basically mandates that all telecommunications information is stored from between six months to two years, depending on the member state.
This blanket retention of all of our communications data threatens the backbone of democratic societies by removing the presumption of innocence and treating us all as suspected criminals.
The second point is that a little bit of data actually says a lot about us. In the old days, before the communications revolution, law enforcement could listen in on our phone calls or even read our emails. This took a heavy toll on resources, in time and money,
to pay all of these police officers to listen to certain phone calls. But with the ubiquity of technology, traffic data, which is collected under the Data Retention Directive, which includes the calls that you make, to whom, what time, can in fact reveal an awful lot about us, our habits and our relationships.
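To make that point concrete, here is a minimal sketch, with entirely hypothetical call records and illustrative names, of how much a handful of retained traffic-data entries can reveal using nothing but who called whom and when:

```python
from collections import Counter
from datetime import datetime

# Hypothetical call-detail records of the kind retained under the
# Data Retention Directive: caller, callee, timestamp -- no content at all.
records = [
    ("alice", "clinic",   "2013-05-06 09:05"),
    ("alice", "clinic",   "2013-05-13 09:02"),
    ("alice", "lawyer",   "2013-05-13 18:40"),
    ("alice", "helpline", "2013-05-14 02:11"),
    ("alice", "clinic",   "2013-05-20 09:04"),
]

# Who does Alice talk to, and how often? Recurring contacts map relationships.
contacts = Counter(callee for _, callee, _ in records)

# When is she active? A single late-night call can already be revealing.
late_night = [
    (callee, ts) for _, callee, ts in records
    if datetime.strptime(ts, "%Y-%m-%d %H:%M").hour < 6
]

print(contacts.most_common())   # weekly clinic visits stand out immediately
print(late_night)               # a 2 a.m. call to a helpline
```

Even this toy example surfaces a routine (a weekly clinic call) and an anomaly (a late-night helpline call) without ever touching the content of a conversation.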
In many jurisdictions, getting access to traffic data also requires much less judicial authorization, so it's much easier to get at. Finally, behavioral surveillance: by centralizing surveillance techniques, law enforcement seeks to predict criminal behavior.
Through projects such as INDECT in the EU, which is underway right now, and through combining surveillance techniques and types of data, such as face recognition technology, traffic data, and social networking data, plus CCTV cameras and even camera drones,
we are all increasingly classified as pre-criminals in the eyes of law enforcement. Retroactive policing is what happens when collection and storage of data is the default. This represents another shift, because meaning is derived retroactively from this information.
Where before law enforcement could search for a specific individual or person of interest, they would define what's called a schema and apply this to the search terms. What happens now is that the schema can be determined after the data has been collected and stored for several years.
Considering the vast amounts of data that can be collected, analyzed, and stored for extended periods of time, we are increasingly vulnerable to accusations of future crimes. In essence, it raises this question: how are we to be sure that what is okay today will still be okay tomorrow, and will not be used against us?
In addition to this, there are increasing tendencies to minimize the accountability of both law enforcement and the private bodies which governments use to get access to our information, kind of like proxies for government surveillance.
Many suspicions of civil society were actually recently confirmed when the Electronic Privacy Information Center, or EPIC, based in D.C., discovered that the U.S. Department of Justice has been issuing what's called 2511 letters to AT&T and other large companies
that would basically give them legal immunity to carry out surveillance of communications on their networks that would likely be illegal under U.S. law. Ultimately, this lack of judicial and ethical oversight over communications surveillance is a problem.
On top of that, it is subject to abuse. When it actually comes down to it, the people managing and running these data centers are just humans, and humans are flawed creatures. There are already several examples of abuse of the databases created in Europe under the Data Retention Directive.
Some law enforcement officers might use it to look up an ex-girlfriend, an ex-wife or a potentially cheating husband. And this is apparently a growing problem in Ireland. The Irish Minister recently urged law enforcement authorities in Ireland to stop using the police database
as what he called a social network. So the second one is surveillance as the default setting. Increasingly, the products and services that we use every day are modified in order to allow for surveillance,
which fundamentally weakens the security of the services that we are supposed to depend on. Case in point: the Communications Assistance for Law Enforcement Act, or CALEA. There are a lot of acronyms in my speech, so I apologize.
This is a U.S. domestic law, and I do actually have some U.S. examples, but I only mention this because the U.S. in many ways is a standard-setter, whether good or bad in this sense. And CALEA has implications for all of us in Canada and the EU,
since we're mostly using U.S.-made products and services. CALEA mandates that all telecommunications carriers and equipment manufacturers modify and design their equipment, facilities, and services to ensure that they have built-in surveillance capabilities.
So basically, all of the products are made with backdoors in them. What's worse is that there have been discussions in the United States about expanding the scope of CALEA to CALEA II, which would actually include large service providers like Google or Twitter or Facebook to have them build in real-time wiretapping capabilities
and have them suffer large fines if they do not comply with these wiretapping requests. So we're halfway through the principles. Number three: when we're already starting from a point of insecurity,
using products that are built with backdoors, we are much more vulnerable to attacks such as malicious software and other surveillance techniques that are easily set loose on us, often by our own governments. The first example of this is Deep Packet Inspection, or DPI.
Deep Packet Inspection is a filtering technique that examines the contents of data packages that are transmitted across the network. This is a tool commonly used for traffic management, so to clean up viruses and spam, et cetera, but it can also be tweaked as a tool to spy and surveil on citizens.
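As a rough illustration of this dual use, here is a toy payload scan. The signatures and packets are invented, and a real DPI appliance works inline on live traffic at line rate, but the core operation, matching patterns against packet contents, is the same:

```python
# Toy DPI sketch: the same payload scan that flags a virus signature
# can just as easily flag a political keyword. Signatures are illustrative.
SIGNATURES = {
    b"X5O!P%@AP": "malware (EICAR-like test signature)",  # benign filtering use
    b"protest":   "political keyword",                    # surveillance use
}

def inspect(payload: bytes):
    """Return the labels of all signatures found in a packet payload."""
    return [label for sig, label in SIGNATURES.items() if sig in payload]

# Simulated packet payloads crossing the network.
packets = [
    b"GET /news HTTP/1.1",
    b"Subject: join the protest on Saturday",
]
for p in packets:
    print(p, inspect(p))
```

The technical machinery is neutral; whether it cleans spam or profiles citizens depends entirely on which signatures the operator loads.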
This, for instance, was one of the pillars of the Tunisian Ben Ali's regime, which they used to spy on citizens but also to even modify emails. Another toy frequently used by dictatorships and democracies alike is malicious software, or malware.
These types of software can be used for a number of things: to disrupt computer operations, gather sensitive information, or just gain full access to an end user's computer. In 2011, the Chaos Computer Club in Germany alleged that German law enforcement authorities
were actually deploying Trojans on their own population, the so-called Bundestrojaner. I think I butchered that, but... This is government-designed software that is made to intercept voice-over-IP calls, for example on Skype,
but the software's functionality extends far beyond that and includes even keystroke logging and having access to webcams. Much of the discourse around surveillance laws or national security or terrorism, et cetera,
is often mistakenly framed in terms of privacy versus security, this concept that we have to lose one in order to gain the other. And this highly flawed approach creates a kind of lose-lose situation for us, for citizens, where we're actually left in the end with neither privacy nor security,
and the ultimate sum is zero. A perfect example of this is a Canadian. Everyone says that Canadians are nice, but that one is not. That's Vic Toews.
He's the public safety minister in Canada. The justification for intrusive surveillance laws often kind of blurs the lines between terrorism, national security, fighting paedophiles, serious crime or just crime. The problem, the proposed solution and the scope of the measures
are left undefined. In Canada, there was a so-called lawful interception bill that the government had been proposing since 2009, called Bill C-30, or the Protecting Children from Internet Predators Act, a very controversial bill in Canada.
This would have basically given Canadian law enforcement authorities warrantless access to user information, including IP address, search history, everything. The bill was luckily struck down because there was huge protest from citizens,
not only because the government was calling warrantless surveillance lawful access, which is a big problem, but also because it came with an accompanying report which revealed that the scope of the bill would have gone far beyond paedophiles and would have included terrorism, national security and even a vague term called low-level violence.
I have no idea what that means in Canadian speak. It could be anything. So the fifth point is that the zero-sum approach, where privacy and security are pitted against one another, is what allows a lot of questions to go unanswered.
And this is particularly the case for civil liberties issues. Many laws are therefore able to kind of slip through the democratic process with very little public discourse and care for the actual impacts of such surveillance measures.
Again, take Bill C-30 in Canada: it never once sought consultation from the federal or any of the provincial data protection authorities, not to mention any academics or anyone from civil society. These laws are often proposed during times of crisis
or immediately after some terrible tragedy, which means that the potential collateral damage is often ignored. Some of the most intrusive surveillance bills have passed in this way. We all know that following 9/11 in the US came the Patriot Act,
but also in Europe, the Data Retention Directive, which followed very closely after the Madrid bombings in 2004 and the London bombings in 2005. Now, I have been working in Brussels for the past few years, focusing on Brussels policy. And the Data Retention Directive passed from 2005 to 2006,
which to me is one of the fastest directives that has ever made it through that complex bureaucratic system. And what's worse, to this day, the European Commission has still not been able to provide any real evidence to show the necessity or proportionality of data retention.
So when the debate, or lack thereof, is framed in these terms, laws enabling greater surveillance pass, such as the Data Retention Directive. There are a number of basic human rights
and civil liberties that are infringed or ignored, which among other things greatly reduces the ability of citizens to, one, be aware of the surveillance laws, and, two, challenge them. Once these rights are taken away,
they are very seldom given back to us. So here are some of the civil liberties implications. One example is the right to access information. A very dangerous, or upsetting, standard was recently set in the United States
by the Supreme Court in a case called Clapper v. Amnesty International. Basically, in the US under the Bush administration, the National Security Agency was warrantlessly wiretapping its own citizens. Amnesty International had strong suspicions for a long time,
and perhaps rightfully so, that they might have been wrapped up in this warrantless surveillance; they were worried that they might have been implicated. When they asked for more information, the Supreme Court ultimately ruled in this case that the international human rights group
had no standing to challenge the program, to challenge the NSA, basically because they couldn't prove that they were being tapped or surveilled by a secretive and illegal wiretapping program. This is exactly the kind of example that I mean
when I say that surveillance by design pulls at the fabric of our societies, because it undermines rights, it undermines the trust that we have, not only in the services and the products that we use, but in our own democracies. More and more, we're dependent on these communication technologies,
and it's almost like we're getting accustomed to these surveillance flaws. We're getting accustomed to not trusting the services that we use. For example, a recent study showed that 71% of users of Facebook, which now has 1.1 billion active users, self-censor themselves on the social network,
because they have basically no idea what, how much, and who has access to their personal information. Is this acceptable? No. Absolutely not. The Internet is one of the most fantastic tools that can give us so much potential for liberation,
but it has an equal potential to unravel our free and democratic societies. What we have to do is make sure that we are creating digital societies where the technology actually works for us and not against us.
So how do we do that? This is the hard part. We need four things, basically. Healthy public discourse, research and fact-based policymaking, transparency,
and more targeted and accountable solutions for law enforcement. So the first one, healthy public discourse. The backbone of any sound policymaking rests upon open consultation with all relevant stakeholders. It also depends upon credible research, proven facts,
and, of course, well-defined problems and solutions. These measures that would or could warrant communication surveillance must focus first on the actual efficacy, whether or not this measure or this law
would actually work or be a real benefit to law enforcement before we even get into the civil rights issues. And there's always international standards and national laws, the International Covenant of Civil and Political Rights, Universal Declaration of Human Rights, and in Europe the Charter of Fundamental Rights,
just to name a few standards. The second thing is transparency. As a basic rule, companies and governments must be transparent about requests for user data and the surveillance that they are mandating. Some companies already do produce regular transparency reports,
like Google or Twitter, Dropbox, and Microsoft recently started, and a few others, but this has to become the norm. States as well must be transparent about the use and scope of communication surveillance. This includes publishing these reports on an annual basis
with aggregate numbers of requests and how law enforcement authorities are conforming with domestic and international laws. Increasingly, privacy is power, so if the government must watch us,
then we should watch the government too. This is already an interesting experiment that's happening in some states in the United States. Through this experiment, they have asked law enforcement to start wearing life-logging technologies or tiny cameras, both on their person and on their tasers,
which is the weapon of choice for most police officers. The video footage is actually accessible to the public, to anyone who wants to see it. The results of this kind of two-way surveillance actually show that it has benefited
the relationship between the police and the citizenry. It's kind of like leveling the playing field. The third one. Instead of putting an entire citizenry under full-scale surveillance, other techniques are possible and in fact have shown the ability to meet the needs of law enforcement
without grossly undermining our rights. For instance, data preservation instead of blanket retention. Most telecoms actually do end up storing some information for a few months and then deleting it. In fact, in the Commission's attempts to justify
the Data Retention Directive with examples of when retained data was useful, they were mostly citing information that was not retained under the Data Retention Directive but retained by telcos for other reasons. The second point is data minimization.
This is also a basic principle in the Data Protection Regulation. Companies and public bodies that collect information should only collect what is absolutely necessary, and for very specific purposes. This sounds like a very simple concept,
but currently this is not how public and private bodies function. The third one is to delete data. Just delete it. This should be standard operating practice for companies. Automatically collecting data for which they don't have much use
actually creates risk for companies, because high-profile data breaches are an increasingly large problem, both monetarily and in terms of public image, not to mention for individuals and consumers.
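The minimization and deletion principles can be sketched in a few lines. The field names and the 90-day retention period below are illustrative assumptions, not anything mandated by the regulation:

```python
from datetime import datetime, timedelta

# A hypothetical signup form submission; field names are illustrative.
raw = {
    "email": "user@example.org",
    "birthdate": "1985-03-01",
    "address": "Somestreet 1, Brussels",
    "phone": "+32 2 000 0000",
}

# Data minimization: store only what the stated purpose (login) requires.
NEEDED_FOR_LOGIN = {"email"}
stored = {k: v for k, v in raw.items() if k in NEEDED_FOR_LOGIN}

# Deletion: purge records older than a short, fixed retention period.
RETENTION = timedelta(days=90)

def purge(records, now):
    """Keep only records younger than the retention period."""
    return [r for r in records if now - r["created"] < RETENTION]

now = datetime(2013, 5, 8)
records = [
    {"created": datetime(2013, 5, 1), "data": stored},  # 7 days old: kept
    {"created": datetime(2013, 1, 1), "data": stored},  # 127 days old: purged
]
print(stored)
print(len(purge(records, now)))
```

Data that was never collected, or that has already been deleted, cannot be breached, subpoenaed, or abused.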
So what can you do? Until we can fully change the way that electronic surveillance is regulated in the long term, there are some practical solutions that you can adopt now. The first is to be vigilant and take control.
Part of our responsibility as citizens living in a technologically ubiquitous environment is to ensure that we ourselves are aware of the risks and problems that come with depending on these communications technologies. The second one is using privacy-enhancing technologies.
I don't have time to go into all of these, but I'm happy to talk after, and I can direct you to more resources. But there are ways to browse the internet anonymously, to use off-the-record (OTR) messaging on your phone,
and as well to encrypt your emails. A friend once told me that the internet is kind of dirty and you want to think about it like wearing a condom. You don't want to go bareback on the internet. You need to protect yourself. And these are some of the ideas.
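To illustrate the idea behind encrypting your communications, here is a toy one-time-pad sketch. This is deliberately not a real protocol (for email you would use OpenPGP tooling such as GnuPG, and for chat, OTR), but it shows the core point: with a shared secret key, only the intended recipient can recover the plaintext, and any DPI box in between sees only noise:

```python
import os

def xor(data: bytes, key: bytes) -> bytes:
    """XOR two equal-length byte strings (toy one-time pad)."""
    return bytes(a ^ b for a, b in zip(data, key))

message = b"meet me at the station at noon"
key = os.urandom(len(message))   # shared secret, as long as the message

ciphertext = xor(message, key)   # what the network (and any eavesdropper) sees
plaintext = xor(ciphertext, key) # what the key holder recovers

assert plaintext == message
print(ciphertext.hex())          # random-looking bytes, different every run
```

Real tools add key exchange, authentication, and forward secrecy on top of this basic idea; the condom analogy holds either way.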
The second one, get involved. Birgitta Jónsdóttir was here, and I saw her keynote; it was about participatory democracy and the importance of getting involved and speaking out to politicians,
or even becoming a politician like she did. And she kind of summarized it perfectly: we need less trolling and more collaboration. And I think that's exactly right. Access, my organization, and a handful of other European digital rights groups
have joined a coalition to protect the data protection regulation. So one of the other things you can do is actually get involved. You can right now join the fight to protect your privacy in the European Union. We've set up this site nakedcitizens.eu
and today and the next few days we actually have postcards, naked postcards, that you can send to your representative. So this is a two-benefit thing. One, you will be helping to get involved and to protect your fundamental right to privacy and data protection. But then also you get to send a naked picture to an MEP,
which is kind of exciting. So this box here is where you fill it in. You have two options: you can either write your own, or there's already pre-written text and you can just sign. You fill it in and put it into that white box,
and we will send it to the parliament ourselves. The people doing this are myself, and I will have lots of postcards, and there are also some other people here from Digitale Gesellschaft and Bits of Freedom, and also Kirsten, who was speaking before me.
That's it. Thank you. Thank you. Great. Any questions? Does somebody have a question? Questions?
Then this is it for today on this stage. We'll see you tomorrow at 10 o'clock. Enjoy the party outdoors on the left side. I'll be in front of the box. Questions? No. If there are no questions, that's it for today. See you tomorrow.