Responding effectively to digital emergencies & human rights violations online
Formal Metadata

Title: Responding effectively to digital emergencies & human rights violations online
Number of Parts: 132
License: CC Attribution - ShareAlike 3.0 Germany: You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor and the work or content is shared also in adapted form only under the conditions of this license.
Identifiers: 10.5446/33593 (DOI)
Transcript: English(auto-generated)
00:14
Good morning everybody, welcome to the last day of Republica, to the last day on stage
00:22
three. Today we're again looking at a very interesting day; we have a lot of discussions and panels regarding the sphere of internet, politics and society. So today's session starts already with a very nice expert panel on the topic of responding
00:41
effectively to digital emergencies and human rights violations online. We will have 60 minutes to discuss, and Ben Wagner, who will moderate, already told me that it's totally okay, if you have any questions during the monologues or something,
01:02
to just ask, because the topic is a very broad topic, so maybe there will be some basic questions, just raise your hands and we will get you the mic and you can ask. And yeah, I hope it will be a fruitful discussion, have fun. Thank you and thank you for coming despite the parties last night and the difficulty
01:22
of getting to Republica at 10 in the morning, I realize it's a struggle to be here, especially including coffee. And so I'm grateful that there will be free coffee here afterwards, so if you'd like to tell anybody else to come in, that'll clearly fill up the stage. We have a fantastic panel here, I think, from all areas of digital emergencies that you can imagine, and of course, the title itself, when I first suggested that the people
01:43
come here, they're all like, digital emergencies, what? So the topic itself is one of both contention and different meanings, and I think it's important to see that there's a very broad scope of things happening here, so this may also mean that in between you'll be hearing things that are from different areas, some of which you know about, some of which you've never heard about, some of which
02:01
are very different. And as a result of that, just feel free to raise your hand at any point to wave, to jump up and down, to shout at me, whatever, and then you can happily ask a question and get involved. So on my very far left, we have Stephanie Hankey from Tactical Tech, then a little bit closer to me we have Fieke Jansen from Hivos, closer further on the right we have
02:24
Claudio Guarnieri from Rapid7, and then to my left directly we have Jillian York from the Electronic Frontier Foundation. All of them have a very broad range of expertise in this area, and all of them in some way or another deal with digital emergencies. I'd like in their presentations if you could also just mention specifically what
02:40
you think your direct relationship is to this panel as well, and then sort of move through all of the things we've discussed. Fieke, would you like to start? Hi. Do you want me to start with what my direct relationship is, or to start first with digital emergencies? So why I'm on this panel is that I'm setting up a new program that supports organizations
03:03
working on digital emergencies, because what you will see, also I think throughout the talks, that there's a lot of threats happening right now to people, especially political activists or journalists, on all different levels, from hacking their computers to internet blackouts, and that there's fairly limited things we can do so far.
03:25
So I'm actually working for a donor organization, and I will be talking a little bit more about on a meta level what is digital emergencies and what are actually happening, and then the other three, four experts will go more in depth on things.
03:41
So actually, I don't know, it was a weird coincidence, but last night I got an email from a friend of mine in Jordan saying that the internet was cut in Syria, and they also suspected that the mobile network was taken down, and how they found out was that Google and Cloudflare, which is a security company, they actually saw a rapid decline
04:02
in the amount of requests coming from Syria to the rest of the internet, and it really went from something to zero within about 15 minutes, so you saw a huge, well, not a spike, but a drop, and I think the question is, is this rare, and why, and what is actually happening?
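An illustrative aside: the sudden traffic cliff described here is straightforward to spot programmatically. The following sketch is hypothetical, with invented request counts, window and threshold, but it shows the basic idea of flagging a country whose request volume suddenly collapses, similar in spirit to what the Google and Cloudflare dashboards showed for Syria.

# Illustrative sketch only: flag a sudden collapse in per-country request volume.
# The data source, window size and threshold are assumptions, not real values.
def blackout_suspected(counts, window=4, drop_ratio=0.05):
    """counts: chronological per-interval request counts for one country.
    Returns True if the latest interval fell below drop_ratio of the recent
    average, which may indicate a national blackout."""
    if len(counts) < window + 1:
        return False  # not enough history to judge
    baseline = sum(counts[-window - 1:-1]) / window
    if baseline == 0:
        return False  # nothing to compare against
    return counts[-1] < baseline * drop_ratio

# Example: steady traffic, then an abrupt fall to near zero in one interval.
syria_requests = [10_500, 11_200, 10_900, 11_050, 120]
print(blackout_suspected(syria_requests))  # True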
04:23
Is it rare for the internet to be turned off completely? It is quite rare; I think we had two separate events prior to last night, and one was Egypt, when Mubarak pulled the internet kill switch, and the other was, I think, in December or January, when the internet in Syria was also cut. So what we see is that it has huge implications, because it leaves a country completely deprived of internet or mobile communication, and I think, when people think of an internet
04:48
or a mobile blackout, they sort of forget that it also affects basic daily life. If you are having a baby, or if you've broken a leg, you can't call emergency services, so nobody can come and actually help you, specifically when the mobile networks are turned
05:04
off, because in countries like Syria or Pakistan, there's very limited landline penetration, so mobile is very important in this one. And the reason why most governments won't take down the internet as a whole is because it has quite severe economic impacts on the country, because companies can also not function,
05:24
but this doesn't mean we're not seeing other trends. I think governments and non-state actors around the world are more and more targeting mobile and internet connections, not only by taking them off, but you have theft of laptops and mobile phones, for instance, of activists, this is happening quite often
05:43
in Mexico, and the reason is independent journalists or activists who are critical of government, they have a lot of information stored on their mobile phones or their laptops, and it's usually not very highly protected. So a couple years ago, I was in Guatemala, and I was talking to some journalists, and
06:01
they had their sources sort of locked in a vault, but they had their laptops without a password on it, so if then a security force takes it, they have access to all the data. One of the other threats we see is that it's really targeted surveillance, so people are installing malware on computers and mobile phones to really find out what people are
06:24
doing, but it goes as far as, and Claudio will talk more about this type of malware, but it can go as far as turning on your webcam when you're having a meeting and turning on the microphone of your computer or your mobile phone and listening in on your conversation, so it's really breaching the offline world that we're living in, it's not only looking
06:42
into your data, but it's actually looking at what we're doing in our daily life. One of the threats, and it's something a lot of people are not thinking about, is also the breach of pseudonymity. At Hivos, we work with a lot of LGBT groups in Africa, and in Uganda, where there's a death penalty on it, people are exposing them on Facebook
07:01
for being a gay activist, which has severe implications, and what can you then do to help this person, because it's really a one-off exposure, and how can you then help them. So I think there's a whole range of threats going on, and there's different ways to mitigate them.
07:21
One is by being preventative, so really starting before something has happened, because usually when a digital emergency happens, you're too late, so it's better to work prior to it, and I think Steph will be talking more about that. During the crisis, how can you respond to it? If there's a mobile blackout, are there other ways to form communication channels?
07:44
We don't have an expert on this panel on that, but there are people working on whether, with the Wi-Fi of your phone, you can create your own network, and then, if you link that up to a radio channel, you can communicate with other countries.
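An illustrative aside: the simplest building block behind such ad-hoc approaches is discovering and messaging peers on the same local Wi-Fi without any internet connectivity at all. The toy sketch below only shows that local broadcast idea; the port number and message format are invented, and real mesh-networking projects are far more sophisticated (routing, store-and-forward, radio uplinks).

# Toy sketch only: send and receive short messages on the local network
# segment with no internet connectivity. Port and format are made up.
import socket

PORT = 50505  # arbitrary port chosen for this sketch

def broadcast(message: str):
    """Send a short message to everyone on the local Wi-Fi segment."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(message.encode("utf-8"), ("255.255.255.255", PORT))

def listen():
    """Print messages received from nearby peers."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.bind(("", PORT))
        while True:
            data, addr = s.recvfrom(4096)
            print(f"{addr[0]}: {data.decode('utf-8', errors='replace')}")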
08:00
And then after the crisis, there's also a lot of work that's being done, but I think the three others will go more in-depth about it. Fantastic. Thank you very much. Just before we move on, are there any questions for clarification, or was any of that unclear? Were you unsure of any of the terms that were used? Do you completely disagree? Not so far. That's good.
08:21
Would you like to continue? I disagree with one point. There were actually four internet blackouts before Egypt, so I actually have a website about this, so I'll tell you later — which countries shut down the internet dot com, very creative. Yeah, so no, but I absolutely do agree with everything that Fieke said on the subject, and I think that just before I talk about some specific examples, I wanted to really
08:44
agree with the point that a lot of the preparation that is needed isn't happening, and so we're not, in our countries, in the countries in which we work, and in countries where we have contacts, I think most people are generally not thinking about preparation
09:01
for some of the types of emergencies that we've discussed, and so I know my organization, and I think other folks here also all work on these types of things too, so hopefully we'll hear about all of them, but there's a lot of resources out there. My organization does work on surveillance self-defense, so preparing yourself against potential surveillance, and then also mitigating against attacks like distributed denial of service
09:24
attacks, but I'm gonna talk about a couple of very specific examples just to give you kind of a range of the types of things that I do, so I'll start with the sort of least scary. Everyone remembers a little Facebook page called We Are All Khaled Said from Egypt, sort
09:42
of, yeah? Not everyone, okay, so this was the Facebook page where people said, you know, let's go out on the streets on January 25th, and they had one million responses, and people went out, you know, and this was, the Facebook page was actually created in 2010 in response to the murder of Khaled Said at the hands of police in Alexandria, and this Facebook
10:05
page had been around for a while, and in November of 2010, it suddenly went down. It was just taken offline, and through some other contacts, we heard that the reason it was taken offline was that it had been a terms of service violation on Facebook, and
10:22
I thought, huh, what could possibly be the terms of service violation, like I'd followed the site, you know, and it turned out that a person named Wael Ghonim, who has now become quite famous, had been anonymous on his Facebook, or pseudonymous, rather, he'd been using a pseudonym on his Facebook account, and this meant that his Facebook account was then taken down.
10:40
Now, this is not necessarily what you would think of as an emergency. Your Facebook page not being up may not be that important, right? But this happened to be during an election period, and it was intentionally sort of targeted by people who didn't agree with his views, and so this is something where, you know, if you, sure, yes, he violated the terms of service, perhaps unknowingly, but if you're
11:04
in that sort of situation, what paths of recourse do you have, and so that's one way that, you know, that's one of the things that I do is try to contact those companies and help people on the other end. And to give an example, which may actually seem a lot more like an emergency, there are
11:21
some companies that have recognized that having an account on their services and being asked for the password to it by security services, if you're arrested, can get you in quite a bit of trouble. There have been lots of cases where security services, for example, in Tunisia, have infiltrated
11:40
people's Facebook pages, kind of taken them over, et cetera, and so if you're arrested and you're using a social network and you're asked for your password or tortured for your password, that can, obviously, you're probably going to hand over that password and that can cause far more damage, and so one thing that some sites have been willing
12:00
to do is kind of work with us on that and maybe take down the page in advance. Another example is when you, so this, that sort of thing happens to a lot of activists who are sort of accidental activists who don't realize that their Facebook page is going to cause them threat, but then you have more, I don't want to say professional
12:21
activists, that sounds wrong, but experienced activists who are aware of the risks that they're taking but who still want to put some sort of measure in place for an event like that. So I'll give an example of a Syrian activist named Razan Ghazzawi — she's written about this and she's allowed me to use her name on this. A couple of years ago, she knew that she was under threat, this is, I think, November
12:44
2011, she was in Syria, and she knew that she was at risk. She contacted me and another friend, so she had two different people, and sent us, encrypted, a list of her passwords on various social networks and instructions on what she wanted us to do in the event that she was arrested.
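An illustrative aside: the talk does not say which tool was used to encrypt that list, so the following is only a minimal sketch of how such a contingency file could be prepared, assuming the Python cryptography package and a key shared with the trusted contacts over a separate, safer channel. The file name and the instructions are placeholders.

# Minimal sketch, not the method actually used: prepare an encrypted
# contingency file (passwords plus instructions) for trusted contacts.
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # share this key separately, never with the file

instructions = (
    "If I am arrested:\n"
    "- shut down my Facebook account\n"
    "- keep my blog up and post updates about my situation\n"
    "- passwords: ...\n"
).encode("utf-8")

token = Fernet(key).encrypt(instructions)
with open("contingency_plan.enc", "wb") as f:
    f.write(token)

# A contact holding the key can later recover the instructions:
print(Fernet(key).decrypt(token).decode("utf-8"))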
13:03
The next day she was arrested, and so we knew that she wanted her Facebook account shut down, her blog kept up and used to update people about her situation, and her email, you know, she had left specific instructions on that as well. And so that was the type of thing where, you know, I thought through it afterward and
13:20
sort of came up with some ideas for creating a contingency plan, which is something you may or may not want to do depending on your situation. And the last one, because I think I'm getting over time, is just, there have also been more extreme examples, and this I'm sort of putting on, I wear a few different hats, I work at EFF, but I also work with an organization called Global Voices, and Global
13:42
Voices has bloggers who write for the site located all over the world, and some of them do come under threat, and so we're beginning to, well not beginning, but in the middle of thinking through how we can best support those people when they come under threat, whether that means helping them with technical means, helping them keep their website up
14:00
during an emergency, or rather, helping them to get out of the country, and this is something that very few organizations do, I mean, granted there are some doing an excellent job of it, but it's not available on scale at this point, and so this is something that's a real challenge, I think. So I'll pass it on. Wonderful, thank you very much. Are there any questions or specific clarification points on this?
14:23
I know there's lots of other experts in the audience, so if you'd like to weigh in on any of the points mentioned, you're more than welcome. If not, I realize that it's still very early in the morning for Republica, and I believe the next presentation may include a film, which may allow you to see some of these things slightly more interactively, and also wake people up to feel more interested in participating.
14:47
So yeah, before going into the video, a little bit of introduction on what I'm actually going to talk about. So I'm a security researcher, kind of outside of this space. I'm kind of the nerd of the group, I would say. I specialize in malware research and botnet research,
15:02
and you know, threat research in general, and I've been doing that for lots of years in the commercial space, and until recently I've been involved kind of incidentally in investigating and researching the use of surveillance technologies all over the globe. Have any of you ever heard of FinFisher before? FinFisher? One, two, three, four, five, six.
15:25
Okay, a few of them. Okay. So I've been involved in researching and investigating the nature of surveillance technology, including FinFisher, and exposing its use on a global scale. And I published, together with other researchers through Citizen Lab, a collection of research reports
15:44
that you can find online, with more details on what it actually does and how it's being used, and I'm going to talk a little bit about that later. I actually have three copies of that research here. If you guys want to pick it up, or, I don't know, distribute it some way, you can find everything there. But since most of you actually don't know FinFisher,
16:02
I'm going to show a very short introduction video that will give you kind of an idea of what we're talking about, what is surveillance malware, what kind of capabilities it has, and what relevance it had on everyone's life and civil society and activists in general.
16:27
Is this working? Yeah.
16:41
FinSpy is a commercial hacking tool. It gets into people's computers. The Gamma Group sells this to law enforcement, spy agencies around the world. And what we found by being able to take a closer look inside the guts of it is that it operates pretty much as advertised. It can listen to your Skype calls. It can intercept your emails. It's there to take over
17:01
your computer secretly from the inside and broadcast all of the contents that it wants. If it decides, I'm going to turn the microphone on, it's going to do that. And it broadcasts it back to these operating centers that are listening in and recording every bit of data and storing it. The FinSpy spyware product is produced in Germany,
17:22
and it's become a political issue partly because the same sort of products were being used domestically by law enforcement in Germany. And now that it's come out that they're exporting a similar type, there's a real conflict between people who are supporting the industry of Germany and their need to sell stuff and those who think that human rights concerns
17:41
should hold sway. A lot of the human rights activists have voiced concerns about how this is being used, who it's being used against. Once it's sold, it's in the hands of whatever country, whatever government, whatever spy agency it's in the hands of. So the human rights activists and a lot of politicians in Europe now, where this is
18:02
exported from, are calling for restrictions on how it gets exported. After revelations in the last year about FinSpy and this German software being used in countries like Bahrain and being pitched in Egypt, there has been a little public pressure after the UK launched an initiative to have export restrictions for spy software like that.
18:26
The German foreign minister also said, oh yeah, they're part of the sanctions that we use against countries like Iran and Syria. But we think that's not enough to have a blacklist of countries that include export restrictions for such software in their sanctions.
18:44
We should treat the software like weapons and restrict the export of such software to any country that is known for violating human rights. Let's say a government somewhere in the Middle East or Asia or the Americas
19:03
buys one of these systems, does it have to use it just in that country? No, I mean the system is specifically, and by its own advertisement, made for targets who reside in foreign countries or who travel, which means — I also have no idea how they want to
19:22
justify this being legal because it might be legal for a government within their country, so within their jurisdiction, to say with a court decision or whatever, well, we suspect this guy to be involved in some criminal activities, we want evidence on that, so we target his computer. But the moment he travels, he is already like in other jurisdictions
19:45
where this might be totally illegal to spy on his computer, and also the way it is made is not exactly for controlling who uses the computer. So they do advertise, for example, that you can pre-install the tool in an internet cafe on each and every machine,
20:04
so when the people use Skype to make their phone calls, they can listen to all the phone calls. I have no idea because this is not the normal sense of lawful interception, where a court decides someone is a suspect. This is like a strategic approach, intercept them all, and then sort
20:20
out later who is of interest for us now or later. For Gamma Group, they are very clear. They operate within the law. They say that they obey the export restrictions of the UK, the US, and Germany. They say that the system itself has built-in controls that are good for making sure that it tracks any changes. If they listen to a Skype call and if someone tries to
20:43
edit that to make some funny business with what's in it, there will be a record of which seconds were edited out and which agent did that. So their position is they're acting within the law, and that what they do as far as the world of hacking goes is one of the most responsible products if you're in the business of intruding on people's computers.
21:07
So the video is kind of self-explanatory. What we're talking about is basically lawful interception software, which is being produced by mostly European companies but also American companies and companies from other countries as well, from the Western world mostly.
21:23
And these are sold, generally speaking, to law enforcement agencies and other government agencies of pretty much every country in the world. And they should be, in theory, being used for criminal investigations, organized crime, terrorism, and so on. What the research that we've
21:41
published and covered shows is that it's actually not completely true. It's obviously being used for those purposes as well, but it's also being abused, or used in other ways, in other countries where it's probably not supposed to be used. Some of the features that these technologies have are listed here. Actually, I think this was taken from documentation of FinFisher itself.
22:02
So you can see that it pretty much can do anything, bypass antivirus software, have covert communication, full Skype monitoring, intercepting Skype calls, intercepting instant messaging, stealing credentials for Gmail and Yahoo mail, and so on. It can do live surveillance through using the webcam and the microphone, so basically
22:22
environmental surveillance. It can steal files, drop files, install key loggers to get the actual keystrokes from the user, and do live remote forensics and all those sorts of things. It's being produced for all operating systems, Windows, Mac OS X, Linux, and we also found
22:40
instances of these products for iOS, Symbian, Android, and Windows Mobile as well. So it's very invasive technology. I mean, it's technology that is particularly invasive for an individual's privacy, and in some situations, such as the ones that I'm going to show you in
23:03
specific social and political circumstances. So what we basically uncovered was that, I don't know if you can actually see this properly, but it all started last year, basically: a Bahraini activist realized there was something dodgy
23:22
going on with her computer, and she received some phishing emails. She started realizing there was something suspicious with them, and then I got in contact with another researcher, and from digging into the computer of the victim, basically, we realized that it was actually FinFisher. So that was the first instance, I believe, that these technologies were
23:43
actually found being used in the wild, and specifically being used against political dissidents or activists, and that was just the start. From there, we uncovered lots of different incidents in several different countries, where there was basically, again, activists or opposition political groups and journalists and so on being targeted with these technologies,
24:04
allegedly by their local governments or foreign governments and so on. And that obviously has a strong impact on their daily life, because they operate through the computer, they operate through the internet. Everything they believe in and work on is expressed through the use of computers, and that basically nullifies every sort of
24:20
privacy they might have when doing their work. And we also found ways to identify where the backend servers of these technologies are actually located. So you can see on the map, it's pretty much scattered all over the place: North America, lots of countries in Europe, as well as Bahrain, Qatar, the Emirates, Ethiopia, Nigeria, Pakistan, Turkmenistan,
24:45
Indonesia, Malaysia, and so on and so forth. So you can see that it's proliferating all over the place. It's completely uncontrollable at this point.
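An illustrative aside: the talk does not detail how the backend servers were located, but scanning research of this kind generally works by probing candidate hosts and matching a distinctive response fingerprint. The sketch below only shows that general pattern; the probe, port, fingerprint string and example addresses are placeholders, not the real FinFisher indicators.

# General pattern only: probe candidate hosts and look for a distinctive
# response fingerprint. All values below are hypothetical placeholders.
import socket

FINGERPRINT = b"EXAMPLE-DISTINCTIVE-BANNER"   # hypothetical C&C fingerprint
PROBE = b"GET / HTTP/1.0\r\n\r\n"

def looks_like_cnc(host: str, port: int = 80, timeout: float = 5.0) -> bool:
    """Return True if the host answers our probe with the known fingerprint."""
    try:
        with socket.create_connection((host, port), timeout=timeout) as s:
            s.sendall(PROBE)
            reply = s.recv(4096)
        return FINGERPRINT in reply
    except OSError:
        return False  # unreachable or unresponsive hosts are skipped

candidates = ["192.0.2.10", "198.51.100.23"]  # documentation-range example IPs
print([h for h in candidates if looks_like_cnc(h)])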
25:04
The research that we've done was mostly covering the technical perspective of these technologies: how they operate, how they're being used, how they can potentially affect an individual. But we obviously realized over time that it also had a political impact, because we were finding instances of these technologies being used in countries with very bad records of human rights abuses. And after we started exposing these
25:28
incidents, the response that we got was obviously one of complete denial from the vendors, as well as from the governments that were actually being exposed, and sometimes even dismissal. So this is a quote from an interview that the spokesperson of Gamma,
25:46
which is the company that produces these technologies, gave to a journalist from Bloomberg, I think. So you can see that they actually don't realize the impact that they're having on the victims of the customers that they sell these technologies to. They also openly admit that they have absolutely no control over these technologies.
26:07
Once they're sold and accessible to a country, it's up to the country how they're actually used. And the problem is that they comply only with export regulations. So they can't sell, as the video was saying, to Syria and Iran and so on, but they can sell to Bahrain, they can sell to other countries where there is a
26:22
critical political situation where citizens' safety is actually at risk. It also has some pretty funny personal outcomes. So this is again with Martin Muench. After one of the releases that we've done, he was asked if the result of this publication actually had a personal impact on him, and he basically said, yeah, if I meet a girl and she Googles
26:43
my name, she'll never call back. So he's really sorry that he cannot get laid anymore. But then, after there was pressure from the media, they also started saying, okay, we're going to change things. He's been saying, okay, we're going to create a human
27:02
rights officer position to take care of these situations. And this would have been a positive move forward if he hadn't appointed himself to it, especially since, further down in the interview, he actually gives a comment saying that, you know, the police do their job no matter who the actual target is.
27:22
So I hope that this kind of gives you an idea of why this is a critical emergency for individuals all over the globe. I mean, it's not something localized to particular countries; it's something happening pretty much everywhere, and it involves everyone. But we're on this panel to talk about how to respond to these incidents. So I'm going
27:41
to try to give kind of my perspective on these things very, very briefly, because I think I'm running out of time as well. I would like to be able to give an answer on how you can actually respond to these incidents, but the reality is that I can't. The reason is that the problem runs very deep. As long as we keep operating on the internet the way our adversaries, which in some cases might
28:06
be local governments, in other cases foreign governments or even corporations going after investigative journalists, expect us to, it's very difficult to challenge them. And they have lots of resources. These technologies get sold for hundreds of thousands of euros. They have these things at their disposal. They have
28:29
weaponized known and unknown vulnerabilities in applications and operating systems. They can buy exploits from third-party vendors. They have the resources to evade security mechanisms that security vendors like my company and other companies provide. And they can even, in some
28:45
situations, silently and transparently interfere with network activity. And in some cases, even security-savvy people can find it problematic to identify and respond to these situations. But there are a few things that we can actually do, at least to make the situation a little bit
29:00
better. In the commercial security industry, there is a fundamental concept of trying to minimize the risk of compromise by making it more expensive for the attackers to be successful. In commercial organizations and government agencies, that basically translates into having
29:24
multiple layers of defenses — network defense, workstation defense, different products, different technologies in place — that should create more and more obstacles for an attacker trying to compromise a target and fulfill its goals. We can achieve
29:41
something like that as well in the civil society. And I think that could be done in two main steps. The first one is keep researching and understanding how these technologies work, how these adversaries basically operate, and trying to keep up to date on what we're actually facing. If we don't realize, if we don't understand deeply what we're actually trying
30:02
to protect the people from, the activists and dissidents and journalists from, we cannot effectively do it realistically. And secondly, we have to stop operating on the internet as they expect us to do. Once we know and realize how they actually operate and how they actually expect us to be compromised and infected or tracked in other ways, we have to change that. We have to
30:24
be dynamic and flexible. And that's not an easy thing. There's lots of groups like the ones from this panel that advocate and train people in using cryptography and using security tools and so on. But that's quite not enough because in lots of cases, these technologies actually don't help when there's this type of surveillance and these type of technologies which are so
30:44
invasive and transparent. What we need to do is start to adopt alternative technologies. Stop using Internet Explorer and use Google Chrome; stop using Windows and use Linux; stop using Microsoft Word and use Open Office or something similar. We need to start
31:02
using disposable hardware and use virtualization to isolate critical applications. And that's something really complicated. But if we want to get to a stage where we actually want to provide some sort of guarantee that they can operate safely on the internet, these are steps that need to be taken. When these countermeasures actually do not work or they're not in place,
31:24
then we need to be prepared to respond to these things. And the security industry has been developing capabilities and procedures to do this pretty effectively. The problem is that these capabilities, generally speaking, are available only to commercial organizations and big corporations and government agencies as well. A civil society has kind of a lack of these
31:43
capabilities. And that should probably change. So I actually hope that the security industry will start moving forward to also serve civil society and individuals. But we can do something as well. I mean, it's very, very difficult, as I say. And we're trying to address an issue that is on a global scale. We're trying to help people in critical conditions in
32:05
different geographical, political, economic, and social situations. And that doesn't really help. In the case of these incidents, for example, if we want to respond to these things, there are different things that have to happen in order to be successful. The first thing is that the victim actually has to realize that something wrong is going on with his computer or
32:23
with his network or whatever. And that's not trivial. I mean, that's something that most likely will have to involve a security expert to identify. After this actually happened, he has to know who to contact and who to go to ask for help and to ask for expertise to fix this issue. And that might not be trivial either. Third thing, when actually able to communicate with
32:46
these organizations or individuals that might be able to help them, then an infrastructure issue comes in. They have to be able to send relevant data that could allow an investigation. And trust me, it's not easy to get gigabytes and gigabytes of these memory
33:01
images out of, I don't know, Bahrain or Africa or Ethiopia or other countries. It's very, very complicated. And from our own experience, that's not really scalable at all. And even in these examples, obviously, the contacts that are jumping into the situation have to have the technical expertise to handle these things. And, you know,
33:23
I would actually like to hear what the opinion of the other panelists is, but I feel that, coming from the security space, that doesn't really happen a lot. The hacking community is even reaching out a little bit, but the security industry is still kind of closed in on itself. It's a market that revolves around itself and its customers and doesn't look beyond that. You know, it's
33:43
very narrow. So there are lots of points of failure in these things. And when Ben invited me to this panel and told me, okay, we're going to discuss how to respond to digital emergencies, I was trying to get my head around it. And I asked my contacts what they thought about that — both the other researchers that I work with,
34:02
as well as some of the victims of these surveillance technologies that we exposed, so the ones in Bahrain and, you know, the UAE and so on. And I couldn't actually get a unified answer; everybody has different opinions on it. For instance, the victims that I was in contact with all said, okay, we want to have some easy-to-use identification and
34:23
removal tools, which is not easy to do, obviously. The other researchers that I work with said that we need a local presence in those countries in order to be able to actually perform these investigations successfully. And that's even more difficult. What I believe is that what's really needed is kind of a global cooperation network where individuals like us
34:44
that actually have the technical expertise could, you know, offer and provide it to organizations like the ones on this panel and in this audience, for sure, which can coordinate, provide deeper visibility and reach, and act as points of contact for
35:01
people at risk so that we can more effectively, you know, be present and respond more quickly and provide some solutions to this issue. So that's it. Thanks.
35:21
Okay. Thank you very much. I realize that was quite a technical way of looking at those issues, although there were very important institutional issues in there as well. Are there any specific questions or things that need clarifying? Please, over here. Sorry, say again? Yeah,
35:41
it's actually pretty accurate. So almost all of the things that they actually mentioned in the features list is true. Thanks. Any other questions in the room? If not, then we'll move on to a slightly less technical presentation by Stephanie. So the good thing about going last is we
36:03
have more people. The bad thing, after the talks we've already had, is: what does that leave me to say? Let's see. So I'm from an organization called Tactical Tech and we work more on the training and capacity-building side, developing materials to help people do self-learning. So it's a slightly different perspective than you've heard from the other
36:22
speakers. So I'd like to start by inviting you to imagine what it's like to get an emergency response call for an organization like ours. So one instance that happened to us not that long ago is, you know, we get an email in the afternoon saying that there's an organization with 25 staff in Russia which has just been closed down and is under financial investigation,
36:47
and they're looking into the organization. Can you help the director and can you help the staff? So the question is how do you do that when you're sitting, for example, in our case, in an office in Berlin? Training people under that kind of pressure, when they're dealing with
37:03
a lot of different things that are going on and they've just been arrested or some of their staff has been arrested is something that's very hard to do if not impossible. You know, teaching them how to switch to new types of software or how to install things, this is the furthest thing from their mind. It's not something they want to go through at that point in time. You could also think about bringing them out of the situation. So bringing them to the
37:24
place you are to be trained, but usually they can't travel in that situation and usually you going in and being associated with them would actually put them under greater risk. You could also think about perhaps asking them to find local technical expertise, but they may have a trust problem with who they're talking to, who's the person that's giving them the advice
37:43
if they don't already know them. And then the people giving them advice may also have a problem: by association — by training them at that point or by giving them technical support — they themselves may get into trouble. You might not know the context that they're working in as well. You know, the organization providing support from the outside may not understand the ins and outs of what's happening
38:03
in that particular country. And even just establishing a first safe connection, you know, on the phone, by Skype, by email, just that very first connection even may be a problem. So I just wanted to kind of quickly go through a list of why what sounds like, oh, emergency response, you should just go in and help sounds really easy. But when you're
38:20
trying to transfer skills under those kinds of circumstances, it's very, very difficult, if not impossible. So, you know, in those circumstances, you do your best: you basically find ways around it and you try to help them out of that situation. But for organizations like ours, who are trying to deal with these problems in the long term, trying to develop a
38:40
program that responds in that kind of environment is a mistake, we think. So we've been working with helping activists use technology for over 10 years now. And one of our approaches is to think about these things in the much longer term, you know, how can we help people before they get into that situation in the first place? So how do we reduce the risk before
39:01
they get into that kind of extreme situation? And how do we make sure that they're prepared in advance? And one of the things I think is important to remember is that a lot of the information that's used against people when they're in trouble is actually from the past. So telling them what to do at that particular moment with their communications is very helpful. But remember that the records online or on their computers or on their mobile
39:22
phones from two, three, four years ago are the things which will probably be used as evidence against them. So this is another reason why preparing in advance is really important. So I'm just going to show you some things I hope if it works, just to not make this completely abstract. So I'm going to just mention three different approaches that we take.
39:48
One of them is looking at obviously transferring skills, technology skills and strategies. Jillian already mentioned this example of somebody who in advance told somebody, these are my passwords, this is what I want to happen if I'm arrested. That kind of forward
40:01
thinking is quite rare. And it's sometimes at this kind of strategic level that we need to act. And you just heard in the presentation before this one about these more technical questions. But some of these things are no longer just technical issues. So for example, it may be technically possible to lock down your computer so that if it's taken away from you, it's protected. But
40:25
is that a good idea in a situation where that just moves the threat to the physical? So if you're in a situation where your computer's locked down and you're not going to open it for somebody and you hold the password, you move the threat to the physical in those extreme situations. So we need to look for strategies, basically. We need to look for
40:42
ways to help people think through in that situation. Completely locking down a laptop probably isn't a good idea, but giving some information away and keeping some completely hidden would be a better strategy. So that's what we mean by strategies. And this is just an image from a toolkit that we have called Security in a Box, which is essentially looking at different
41:01
kinds of technical solutions for some of these problems. But increasingly, I think that, sorry, technical person can't use the PDF.
41:22
Thank you. Too busy concentrating on what I'm saying. So anyway, some of the, thank you, some of the other levels is about the user choices that people are making. So how can we help people think about not necessarily technical problems, but also the information that they're giving away in the first place? And this is a project that we are involved in called Me and
41:42
My Shadow, which is helping people think about the digital traces that they're leaving online. And one of the reasons I'm showing this is because, you know, talking to people about security and technology, as I can also see from some of your faces, is quite dull. So finding ways to talk about this and engaging in an interesting way that brings people into the question can be really useful. And so in this project, we, you know, look at the question of how can we
42:05
bring to life these issues in a less technical way and in less overwhelming way. And the last thing I wanted to mention is the third approach, which is looking at this question of soft skills. So looking at the question of what information is being gathered
42:22
in the first place, looking at the question of the strategies that organizations use. So, for example, if I'm collecting corruption cases, for example, from individuals, do I really need to write down the name of the person who reports that corruption if it's not going into a legal case, for example? So how can we help organizations think about the question of
42:42
what data they're collecting in the first place, knowing that at some time in the future this may be an issue for them too? I guess I'm going to cut this short by just saying one more thing, which I think might be useful to bring up in this context:
43:01
is this a European problem or not? Because I think we're all sitting here in Berlin, many of us are not from these countries, perhaps don't even work in these environments. And I think one of the questions is, you know, what does this have to do with people working in Europe? A lot of the work we do is also with intermediaries. So these are funders or
43:21
organizations, international organizations based in Europe or North America, who are providing support to these types of organizations. So just by being in touch with some of these organizations, or by being a funder who transfers money to some of these organizations, or by traveling to those countries and working with these people, you're often exposing networks and exposing people. And there's not enough responsibility taken by those who have the luxury of living
43:46
and working in London or Amsterdam or Berlin or any of these places, and don't have to worry about their own physical and digital security. But they are by association often putting other people in danger. So some of the European organizations working to support activists worldwide
44:03
and journalists really need to think harder about their own digital security practices. I think you've already heard mentioned before about European technology companies. You know, what responsibility do they have? That was exactly the subject of the last presentation as well. But increasingly we're seeing that these kinds of techniques are also
44:23
being used by organizations and by governments in the European and North American context — not in these extreme ways, you know; closing down accounts, maybe, but not people being arrested and so on — but by other means. And I think we need to be realistic
44:40
about the fact that there's an increase in activities to discredit activists who challenge the status quo or who act differently. And in these kinds of so-called semi-democracy or democracy environments, the threat is around things like litigation. So just to make that less abstract and sound less paranoid, this is what you see happening in the biggest scandals we've
45:01
seen over the last few years with Assange, Aaron Swartz, and so on. That there are reasons why high-profile activists and journalists, even working in Europe and North America, need to start thinking about the digital traces that they're leaving behind online.
45:25
Fantastic. Thank you very much for that wonderful presentation. Since we have several interesting people in the audience, we've talked a lot about sort of European technology companies, but also German technology companies. Are there any questions specifically on the presentation or specifically on these European technology companies? We've been presenting about
45:46
Gamma, about all sorts of other companies who are making the lives of activists more difficult, but I don't see anybody specifically wanting to talk more about it. Is there anybody on the panel that would like to respond to that? How to respond more effectively to the dangers created specifically in Europe? Clearly they're confused. Oh, there's two people in
46:10
the audience. Please, you're my guest. There's a microphone coming to you. I just have a question. Have there been any efforts
46:22
made to contact the coders or programmers of this kind of software, spy software? FinFisher, for example? I mean, is it possible to trace down the coders, the programmers, and talk to them and say, hey, what you're doing here is pretty fucked up?
46:44
So I'm not sure why anyone would do that, but I mean, it's a public company. It's transparent. I mean, it's officially based in the UK, and the one that actually produced the software, though, is based in Munich in Germany, I think. I never really spent much time
47:02
investigating the details of how the company is organized and how it operates. Personally, I'm not interested in going after the individuals who actually produced these things, because I don't think it would have much effect. What needs to be done is, first of all, to regulate more strictly how these things get exported, because they will always exist.
47:24
They exist now, and they will always exist in the future. This is something demanded by law enforcement, so it's not going to cease to exist. The problem is to try to control more tightly who is actually going to use it and what they're going to use it for. And therefore, I'm not sure whether tracking down the programmers would be very effective
47:46
in that sense. Thank you. There's another question in the back, I believe. Hi. I've got a question about rumors, and about the interests behind those companies selling FinFisher and products like that. There are rumors that, for example,
48:03
if you take Germany, not only the German economy and the German Economics Ministry are interested in Gamma or Siemens selling this kind of software, but also the secret services, in order to obtain information from the countries they're selling this software to. I mean, I've only heard this as rumors. I just wanted to ask if you've heard this too, and if you could say
48:23
anything about that, and whether you know if that really affects the distribution. Because then we not only have to deal with economic interests, but probably also with the security interests of our own countries in selling these kinds of trojans. Thanks.
48:42
Well, I'm not sure if I got your question completely because the microphone is echoing quite a bit, but since last Thursday, in the Netherlands, there's been a law proposed by our Ministry of Justice and Security that would also allow them to install viruses on the computers of terrorists and child pornographers, which is basically the same as
49:03
what Iran or Bahrain or other governments are doing with FinFisher. So I think a lot of security agencies are already using it. The only thing is that within, I think, the European context, they have to have a court order signed to be able to use it.
49:22
This room has some pretty bad noise issues, so I'm also not sure I understood completely, but I mean, in the US, we recently had, I can't remember which state it was in, but a police force trying to get authorization from a judge to use FinFisher. In that case, the judge turned it down, thankfully, but that's the sort of thing, that's the sort
49:41
of thinking that's going on. I mean, in the US, we've had our own problems with the NSA spying on millions of AT&T customers, completely warrantless, you know, and the Obama administration has done nothing except stamp big black lines all over any type of freedom of information request on that. So if you think that surveillance is something that happens in Iran and Syria,
50:03
no, it's something that happens in probably all of our countries, I don't know where everyone's from. But I think, you know, this is far beyond rumor at this point; this is something that affects people that we know, and not just outside of our countries. And just a quick follow-up question to Fika as well, because I'm not sure it was quite clear,
50:21
this law is a Dutch law, and it's only relevant in the Netherlands? Oh, no. Please explain. Well, we still have three months to sort of kick it out, to get enough members of parliament to vote no to it. But no, it can actually also be deployed abroad. So it's not only national law; it's sort of also in conflict with
50:42
international law. Because the reason behind it, and I think a lot of European countries are thinking in the same way, we're just the first ones to put it on paper, is that if you look at data storage, and it's more and more happening in the cloud, they can't be sure where it's actually stored. And then if certain things are, well, encrypted, or certain things are very
51:05
negative, we had a big case in Amsterdam with a child pornography network, and they then want to delete those files to prevent further harm to those children. So I think this was the reasoning behind it. But of course, it's very scary. So then basically,
51:22
I would be subject to this law anywhere in the world, regardless. Okay. Maybe I should be nicer to Dutch people in the future. Are there any further questions? Please, Hauke. So yeah, my name is Hauke, and I work for Reporters Without Borders here, in
51:40
technology. And maybe to answer your question about trying to contact those people: there's one thing, it's called Bugged Planet. It's a very good glossary, basically, of all the stuff that's out there. And they have the private addresses of some of the developers or maintainers. So I don't
52:01
think that, yeah, this is unproblematic. But to add to that, I've been in contact with Martin Münch from Gamma once or twice. And if you talk to him, he says, basically, it's to hunt down criminals, and it's a good thing, and we're not killing anyone,
52:20
and this is not mass surveillance. And so I don't think it really helps, because those people, they sit there and say, okay, well, this is a good thing, and we help fight bad people. And also, after we put out a report earlier this year about the Enemies of the Internet, the guys from Trovicor said, we build relational databases, which is kind of
52:48
true, but of course only a small part of it. So yeah, I don't think they really have the mindset to see that this is a problem. And so we have to keep up the public pressure
53:02
so that they have to respond to media inquiries and state why they're doing this and why it's not problematic in their view. Since the panel's coming to an end, are there any quick responses to that, very briefly? Okay. Okay. I'd also just like to ask the panelists, before
53:20
we close for one last response, which is specifically, if I'm in a situation of a digital emergency, or I'm trying to prepare for a digital emergency, what should I do? Who should I contact? And just very briefly from each of you, so we have a clear message coming out of this, what should be done in the situation of a digital emergency from your perspective? Can I comment quickly? Well, I'm not too sure about that. I mean, I'm sure that
53:46
some of them have good intentions in developing and selling these things. But I also think that, in lots of situations, they might be very opportunistic in how they handle their market and how they handle their sales. Because they clearly know that these things get
54:04
into the hands of people who use them in the wrong ways against the wrong people. And we've been publishing details and data that actually support these claims. And the result has always been denial, and sometimes even mocking of our results. The problem is that we expose things with data and transparency
54:23
and they don't. And that makes me think that it might not exactly be true that they have the best mindset about how they produce these things. But that's my personal opinion. And then one more comment, about these sorts of specific surveillance companies:
54:41
I wonder to what extent you can actually impact them with public pressure, because they're usually not dependent on consumers buying their products. The public pressure should actually be directed towards the legislators, because they can put pressure on these companies, since the companies are on their territory. Yeah, so about Ben's last question,
55:04
about what to do if something like this happens. I think what we're seeing increasingly when people don't know what to do is that they just stop using technology. If you're suddenly in that situation, we've heard a lot of instances of people just dumping their phone, dumping their laptop, the same with everybody else in their family. And this is the situation
55:22
of what happens when you don't know what to do, which obviously makes people further isolated. That's one of the problems with responding like that. I mean, it's good in a way because it shuts down some of the possible vulnerabilities, but it doesn't actually solve the problem. I hate to say this, but the answer to the question of what to do is that there's no single answer, because it's completely context-specific. The reason why somebody's under surveillance,
55:44
the kind of trouble they're in, how high-profile they are, the context in which they're working, the legality of the environment in which they're working, all totally dictate what they should do, unfortunately. And one of the ways that we look at this question is by actually encouraging people to answer the question about digital
56:01
security concerns, especially in a high-pressure and short-term environment, by looking at their general approach or policy to security outside of the digital. People tend to think that it's just a digital issue. Somebody's getting an emergency response call in the front row, I think.
56:21
But anyway, sorry, just to say that I think that, sorry, this is actually my colleague giving me a comedy moment in the front row. Thank you. So anyway, just to say, I think these things are about looking at the offline reality of security. How do you deal with that anyway? What's in your bag? Who are you speaking to?
56:44
These are all skills that activists had before the internet. And the truth is, I'm not underestimating the problem. Many of the problems are about risks that people don't know they're taking. But in extreme situations where you don't know where to look for help, we often try to get people to use the same strategies that they're using offline and apply that thinking to what they do online. And to comment on what was just said, that when people are in an
57:06
emergency, one of the reactions is to stop using technology. The other response, especially when, for instance, journalists or activists think that their laptops or mobile phones are infected, is that they sort of don't know what to do and they just continue what they're doing.
57:24
Or once an organization is under attack, they panic, and once the attack is over, they don't think about the future. So I think once you're in an emergency and you've sort of dealt with it, also take the next step and sort of start planning for future attacks. Because once you've
57:41
been under attack, the chance that you will get attacked again is quite high. Yeah, just a quick comment. I actually agree with Stephanie. And especially in the case of surveillance technologies and spyware, like the one that we've talked about, I'm always very reluctant when I find myself in the situation where I have to
58:01
help someone out with disinfecting and responding to these situations, because they're under such heavy surveillance and such close monitoring from their adversary that, in some situations, it might expose them even more. Because if, while being
58:24
monitored, they actually go out and contact someone, and the operator of the surveillance technology understands that they've been discovered and that they're trying to remediate it, that could be even more problematic for their safety. And without a local presence, it's very difficult to actually do something through the internet and
58:44
online. There's a lot that's been said, and I know we've got to leave. So I guess the one thing I'd say, in terms of things you can do with public pressure, is that I think Fika is right that you can't really pressure a lot of these surveillance companies effectively. But you know who you can pressure? Social networks. And a lot of them are not doing enough
59:03
to protect their users. And so I'm talking about your Facebook, your Google, your Twitter, et cetera, et cetera. Pressure them, talk back to them, and tell them what they need to be doing to better protect their users. And I think that's something where we as the public can have an impact, even when we can't in other spaces. But also,
59:21
digital emergency-wise, always feel free to contact the EFF. Wonderful. Thank you very much. I believe there's no time for further questions. So I'd just like to thank you all very much for coming, and please spread the word. There's a lot more that needs to be done, both towards companies and towards governments in Europe. So thank you for your time.