
NOTH1NG T0 HID3: go out and fix privacy!


Formal Metadata

Title
NOTH1NG T0 HID3: go out and fix privacy!
Series Title
Number of Parts
254
Author
License
CC Attribution 4.0 International:
You may use, adapt, and copy, distribute and transmit the work or content in changed or unchanged form for any legal purpose, as long as the work is attributed to the author/rights holder in the manner specified by them.
Identifiers
Publisher
Publication Year
Language

Content Metadata

Subject Area
Genre
Abstract
After the highly successful presentation "Toll of personal privacy in 2018" at Chaos-West at 35C3, where I talked about my personal experiences with trying to protect my privacy, this year I return with a completely* different talk that tries to convince the audience that you should care about privacy too! This talk revisits the theme of personal privacy in the digital world, this time centring around the "I've got nothing to hide" argument. A beam of intensive light is shed upon the motivation behind caring about one's privacy. We go in depth into what we can do to stay private and whether we should even try to do it at all. We talk about where we as a global society were able to fix privacy and where we have failed. New topics previously not covered are discussed, such as herd immunity and certification programs. \* 97%+
Transcript: English (automatically generated)
So, I'm very happy to announce, for the second talk of the day, Kirils Solovjovs. He's a lead researcher at Possible Security, he's a bug bounty hunter, he's an IT policy activist and a white-hat hacker from Latvia, and he's talking today about nothing
to hide: go out and fix privacy. Kirils, the stage is yours.
Some citizens complain about being under surveillance, but they are told that if they have nothing to hide, they have nothing to fear. Still, news media regularly cover cases where citizens with unusual behavior are put on suspicion lists, even though they have broken no laws.
Now, this is a quote from a news article from the European Digital Rights newsletter, EDRi-gram, number 300. So it's news from the future, fictional, but it clearly paints a picture of this topic in the future
we are all headed to. Now, anyone here took math in college? Mathematical logic? Some? So, you can tell me what this means. Basically, what it says here on the screen is: for every person that belongs to the group
of criminals, that person also belongs to the group of people who are hiding something. That funky sign in the middle means that from this statement it follows that for every person that has something to hide, that person also belongs to the group of criminals.
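As an illustration, the claim on the slide amounts to something like the following (a reconstruction of the idea, not necessarily the exact notation used):

\forall x\,\bigl(\mathrm{Criminal}(x) \rightarrow \mathrm{Hides}(x)\bigr) \;\overset{?}{\Longrightarrow}\; \forall x\,\bigl(\mathrm{Hides}(x) \rightarrow \mathrm{Criminal}(x)\bigr)

The conclusion simply swaps the two sides of the implication, the classic illicit conversion of "all criminals hide something" into "everyone who hides something is a criminal."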
How many of you think this is correct? No one? Great. It's of course wrong, but it is a really common fallacy. I hear that a lot. Now, what even is privacy, though?
Privacy first of all is the autonomous right to choose who will work with my data. So maybe I'm fine with company A working with my data, but I'm not okay with company B working with my data. Also, how the information is processed.
So maybe I am okay for Amazon to process my home address to send me a package, but I do not consent to Amazon sending goons to my house to get money out of me. And of course, what information is processed?
Even though I'm okay with giving my address, maybe I'm not okay with giving my phone number. But it's not only that; it's also the right to decide who I interact with. That includes the right to be left alone. So in a sense, privacy is also about consent in a lot of ways.
Now, to illustrate this a bit better, to try to transfer my feeling to you, I created this concept of Schrödinger's video camera. Now, you may have heard of Schrödinger's cat. Schrödinger's video camera goes something like this. Imagine you just bought a new apartment somewhere in Paris.
Great view, nice building. And then one day you notice that a security camera has shown up outside your window, on the opposite side. It has one of these non-transparent domes, and you cannot see inside it. You cannot see if it's looking at you or not. You can't even tell if there's a camera inside or if it's completely fake.
But for me, it doesn't matter. The feeling I get is terrible either way. Even if someone told me that there's no camera in there, even if someone showed me that there's no camera in there, I wouldn't feel at ease.
So that's why, for me, privacy is, in many ways, the feeling of privacy. Now, exactly one year ago on this stage, I gave a talk, Toll of Personal Privacy in 2018. I just want to acknowledge some reactions that I've been getting since the talk.
So some of the reactions are funny. Some of them are affirmative. Others are insightful. Of course, others are just plain uninformed, because you do not have to actually use a phone,
even though it's hard, but you do not have to. Even more so, you don't actually have to use Facebook. But these two remaining arguments, I want to talk closer about these two. So here's a short video. Some of you may have seen it. So it's a video of a lady trying to unlock the man's phone with a face ID.
Now, it's clear from this that it's fake, at least. I mean, it's a sketch, right? It's clearly visible. It's a sketch. But what I want to talk about is the reactions.
And I'm outraged by the reactions. I mean, all the main topic of the video aside, all these people here assume that since he has something to hide, he did something wrong, which is unacceptable to me.
So what if you really have got nothing to hide? Maybe we have people in the audience here, even though I doubt it, that have nothing to hide. I'm sure we have some people like that watching the stream, not here at the Congress. So what about data hoarding? What if someone, well, say, a government or a large corporation these days,
maybe that's more likely, collects enough data about you that allows them to more easily blackmail you, that allows them to impersonate you? Even if that's not the case, think about herd immunity. That's a concept used in vaccinations. You should get vaccinated because some people medically can't,
and you will also protect them by providing a human shield, if we can so say. The same applies for privacy. There's herd immunity in privacy. So many people not hiding anything will make it really hard for the few who actually have something to hide.
And it doesn't have to be anything criminal or nefarious. Many people have legitimate reasons to hide things. And by accepting that you're okay with not hiding anything, by giving up your right to privacy, you're also helping those people that actually need privacy to give up their right as well.
And of course, remember that today's authority might become totalitarian or inhumane. We see the transformation process starting in a couple of Western countries right now, and I don't know where it's going to lead us in the next years, but it may be the case. Just imagine how much easier it would have been for Adolf
to have committed his atrocities here in Germany if he had had Facebook, if he had had access to all the data. I hope that's not coming back. Never. Now, what I want to talk about is the state of privacy. I want to take a look at what's happening around the world.
And what's happening around the world, of course, is there are these protests in Hong Kong going on. So this is an article from middle of this year. And people in Hong Kong, they're really aware of tracking, that their transportation cards can be used to track where they move. So they try to avoid that.
We are afraid of having our data tracked, is what they say. So that's good. People are getting more aware. But it's also happening in the West. Los Angeles decided, passed a law basically, that all the scooter sharing companies
have to share real-time scooter location data with the government. And Uber, even though I dislike their methods of trying to disobey the law wherever possible, I think they took the right stand. All the other companies are actually ready to give the data
and are giving the data to the government. Uber is the only one who questioned this and tried to fight against that. I mean, the idea behind that is a good idea. So the government wants to make sure that the communities, the geographical communities where underrepresented people live or underprivileged, I shall say,
that they also get the scooters. They don't target just the rich neighborhoods. So that's good. But do they really need real-time tracking for that? I doubt that. Now, my favorite topic of CCTV, of course, this is the only slide from the last year that I included here. These are actual posters for those of you who haven't seen them from the UK,
how the government is telling you that CCTV is good. So what's new? Well, this happened. Beginning of this year, so in May, this year, a man was stopped by the police. All he did, he was walking by and someone warned him that there is facial recognition going on over there.
So what he did, he pulled over his sweater, he pulled it over his face, and walked by. Police stopped him, forced him to scan his face, found nothing wrong, and fined him 90 pounds for trying to evade the facial recognition. So it's super disturbing and super creepy. But I mean, there are ways around that, right?
We could use this cap, for example, and have a face of, I don't know, a general secretary of a communist party somewhere. So then you're not trackable, not attracting attention, right? There is this law in Hong Kong that doesn't allow you to use masks anymore in the protests.
And it's problematic because they track everyone through facial recognition. So that's why the mask was allegedly created. The good news, and it's relatively recent news, is that in November (the political system in Hong Kong is complicated) Hong Kong's court ruled that it's illegal to ban wearing masks.
And the full hearing in the next level of court is still going to happen in January next year. But currently, the law is suspended. Currently you can actually wear masks. I mean, I don't know if police actually are okay with that, but by law you can.
Now, privacy advocates, me included, have always been complaining about not being able to use public Wi-Fi without a phone number. Finally we can. We have an option to take a selfie and upload the passport, which is terrible. Oh my God.
Let's get back to facial recognition for a second. So European countries are trying to copy and paste what China is doing. It's not just China, it's not just Russia, but the difference is they are asking for permission. So why are they asking for permission?
Well, now that they've got a negative answer, the question is, will they listen? And yes, they will, because the European Data Protection Board fined a school in Sweden for using facial recognition.
I mean, thanks to whoever made it, but we have GDPR, so we do have some protection in Europe. Unfortunately, not all of us here and not all of us watching the stream are so lucky to have that kind of regulation, but it does actually work, even though I've been hearing bad things about GDPR. Yes, there are things to improve, but it does work.
It does help us. Now, as you were told, I come from Latvia, and I want to share something from Latvia. This is a picture I took at a press event in Riga. So police were presenting their new vehicle, this one over here. And this vehicle, the idea is it will automatically find people
for not wearing seat belts, talking on the phone, not showing turn signals. So it has a bunch of 360-degree cameras in there and does some fancy stuff. I mean, I'm all for traffic safety, but check this out. I mean, I like that they do have a sense of humor.
I guess BB was taken, so they decided to go with GC, which is okay, which is the one that the police use, but still, I don't know. Orwell would be proud. So, another slide from my previous presentation. People's Daily China was touting how cool it is that in classrooms you can actually now use surveillance cameras
to track the progress of students. Are they learning? Are they focusing? And so on. Well, what's new? What happened in 2019? Anyone know? This happened. Brain scans. So let's take a look at this short video here.
Teachers at this primary school in China know exactly when someone isn't paying attention. These headbands measure each student's level of concentration. The information is then directly sent to the teacher's computer and to parents.
China has big plans to become a global leader in artificial intelligence. It has enabled a cashless economy where people make purchases with their faces. A giant network of surveillance cameras with facial recognition helps police monitor citizens.
Meanwhile, some schools offer glimpses of what the future of high-tech education in the country might look like. Classrooms have robots that analyze students' health and engagement levels. Students wear uniforms with chips that track their locations. There are even surveillance cameras that monitor
how often students check their phones or yawn during classes. These gadgets have alarmed Chinese netizens. Now, that's screwed up beyond repair, if you ask me. But luckily, it's just happening in China, right? No. U.S. America.
So there's a great article. I invite you to read it in full in the Guardian, published on October 22 this year, about how they use digital surveillance for American kids, or against American kids, I should say. I will read you some of the quotes from the article.
And I divided this article into multiple categories so it's easier for you to understand. First of all, the reason. Why is anyone doing that? I mean, it's not China. There's no communism. You're not supposed to spy on people, at least unless you're the government in the U.S. So the reason is that lawsuits by parents of students who have committed suicide,
or parents of children who have been cyberbullied, are a problem for the school. So they see this as an easy solution. They track everything students do, and then they are off scot-free. I mean, from the perspective of the school's lawyer, even if the kid does commit suicide, their asses are covered
because they have this great system and they did everything they could. So it's kind of a liability shield there. Now, what about reaction time? As the article says, it's not "I sent this email two days ago." It's "You sent this email three minutes ago, come to my office, let's talk." That's the speed, that's the latency, that's the reaction time of the system.
In Weld County, Colorado, a student emailed a teacher that she heard two boys were about to smoke weed in a bathroom. And the school is proud of this.
Within four minutes of sending this email, troops were deployed to the bathroom. Scope. So, I mean, it's getting worse and worse, so get ready for that, prepare for that. It's not just about what they do at school. 24 hours a day, whether students are in their classrooms or their bedrooms,
the monitoring is going on. And of course, I'm not talking about video cameras here, but content monitoring. Tech companies are also working with schools to monitor students' web searches and internet usage. And in some cases, track what they are writing in their private social media accounts.
Gaggle, which is the name of one of the companies providing the service in the US, also automatically sends students a scolding email every time they use a profanity. How is that for a chilling effect? Now, with all that, what's the justification?
Some proponents of the school monitoring say that the technology is part of educating today's students in how to be good digital citizens. What does that mean? Well, allegedly, it helps train students for constant surveillance after they graduate. That's the actual quote from the justification of this system.
And here's another quote from Bill McCulloch, a Gaggle spokesperson. Take an adult in the workforce. You can't type anything you want in your work email. It's being looked at. So their idea is, you know, let's do this to our kids in schools and then prepare them for that.
What are the effects? Of course, there are chilling effects. The ACLU said schools don't post on a bulletin board: "Here are the words we are going to be searching for." Of course, it forces students to be careful and to self-censor. They might not write about things or talk about things that are not, in fact, being monitored.
The idea that everything students are searching for and everything that they're writing down is going to be monitored can really inhibit growth and self-discovery. That's a quote from Natasha Duarte, policy analyst at the Center for Democracy and Technology.
And finally, it's military technology. In the United Kingdom, school surveillance technology has been already tested for use in counterterrorism efforts. Again, I don't want to get blown up by terrorists, but I don't like all these safety processes that we have either.
I mean, even here at Congress this year, we are starting to have signs, Don't leave your bags unattended. I'm not sure how I feel about that. I mean, I thought it's a safe space here. The ACLU expert that I referred to previously said, It's certainly fair to ask to what extent we feel comfortable with technologies first developed for use in war being used against our children.
Now, let's take a moment to talk about the company name, Gaggle. According to the Merriam-Webster dictionary, "gag" is a verb that means to prevent from exercising freedom of speech or expression.
And the other definitions for the verb, to me, only emphasize the non-consensual nature of the interaction between a student, the kids, and the school. They don't have a say in that. They're being gagged, not only technologically, but also psychologically. That's unacceptable.
Okay, let's talk about something else: the end of end-to-end encryption, or, as U.S. Attorney General William Barr called it, warrant-proof encryption. GCHQ has suggested that tech firms' communication services should be able to
silently add intelligence agents to conversations or group chats. This is still an ongoing discussion, but this is where this is going. So I've been looking at this problem, and I've been thinking, I've been trying to predict: how will secure encrypted communications look in the future?
Because the government really has a strong incentive to try and access that kind of communication, because of terrorist content, because of child abuse material. So I think, as a community, we managed to convince them that backdooring the crypto part is not going to work. I mean, it would work, but it's not the worst of the ideas.
So what I think is actually where this is going to go: public clients, I mean WhatsApp, Facebook, and so on, are going to be able to add a third party to your encrypted communication channel without you knowing, and it's going to happen in your client software. So that's what I think is going to happen.
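To make the concern concrete, here is a tiny toy sketch in Python (entirely hypothetical, not based on any real messenger's code) of why the client is the dangerous place: if the client controls both the member list and the UI, it can encrypt every message to an extra recipient without ever showing that recipient to the user.

```python
# Toy illustration of the "ghost participant" concern (hypothetical, not any
# real messenger's code): the client encrypts every message to each member's
# key, but only displays the members it has marked as visible.

def fake_encrypt(plaintext, key):
    # Stand-in for real per-recipient public-key encryption.
    return "enc[" + key + "](" + plaintext + ")"

class ToyGroupChat:
    def __init__(self):
        self.members = []  # list of (name, public_key, visible)

    def add_member(self, name, public_key, visible=True):
        # A coerced or modified client could pass visible=False here,
        # adding an extra recipient the user never sees.
        self.members.append((name, public_key, visible))

    def visible_members(self):
        return [name for name, _key, visible in self.members if visible]

    def send(self, plaintext):
        # End-to-end encryption stays intact: one ciphertext per recipient key.
        return {name: fake_encrypt(plaintext, key) for name, key, _visible in self.members}

chat = ToyGroupChat()
chat.add_member("alice", "keyA")
chat.add_member("bob", "keyB")
chat.add_member("agency", "keyG", visible=False)  # the silently added third party

print(chat.visible_members())  # ['alice', 'bob']: what the user is shown
print(chat.send("hi"))         # three ciphertexts: what actually goes out
```

Note that the encryption itself is never broken in this sketch; the deception lives entirely in what the client chooses to display.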
By the way, Jim Baker, the FBI's general counsel, who has been working with William Barr on that proposal, had a change of heart. This is a cool article from October this year. You can take a look at it, and the guy finally understood
that what he's trying to do is not the right direction to go in. Okay, let's talk about those client apps. So let's take WhatsApp as an example. If WhatsApp were doing something shady on your phone, you could stop it by rooting your phone, right?
That would help, because then you can install background apps that monitor the traffic, that monitor the file interactions, that take a look inside WhatsApp. Amazing. But you cannot do that. They've had that rule for some time, and it also applies, of course, to iPhones, to jailbreaking iPhones as well. But they tend to have these waves where they enforce the rules.
So it's been there for years, but they enforce it again and again, right? So, okay, I can't root my phone. What about third-party apps? Well, nope. This is actually a bit newer. It hasn't been there for that long, but if you install a third-party WhatsApp app,
you're going to get banned, right? So the only question for us, the technological nerds here, is, is it going to be legal to install our beloved secure apps? Now, but I want you to think about the other people. I want you to think about non-technological people. What are they going to do? How are you going to communicate with them?
And actually, not less importantly, how will they communicate between each other? Let's take a look at another important aspect of everyone's everyday life, watching pornography online. The Australians want to use facial recognition to verify that the people who are watching porn online are the actual people.
I mean, how short-sighted have you to not see how this can go wrong, right? Those fake emails everyone's getting, we filmed you watching porn and we filmed your face, those are going to turn real if this is actually enforced.
But another thing, right, is of course online dating. It has launched in the US this year, and it rolls out in the EU next year. And I actually have a couple of things about Facebook online dating. It actually does provide you more privacy, which is good. So when you opt into Facebook dating,
it not only makes sure that your dating profile is anonymized in a way and limits access to your actual data, it also does something to your actual Facebook profile: it tweaks the privacy settings a bit so that you are a bit more private. But there's a catch.
In order to opt in for Facebook dating, you have to enable location on your phone. You have to physically confirm your location. So Facebook isn't going to give us privacy for nothing. They want something in return. So not good again. Now, suicide prevention is an important topic, and Facebook is doing their share and I feel quite okay about that, that they're doing that.
That's good. And here is the algorithm from their official spec that's available publicly. So basically, they monitor everything. By the way, people with knowledge on the subject have told me, have informed me, that even if you do not post the message, even if you do not post the comment,
if you decide to write that message and then delete it before hitting submit, Facebook still gets that text. And they still launch it through this process here. So they use a classifier, they use some neural net to try to understand what's happening. And the last step, well, not the last step,
the step before the last step is that it's reviewed by a human, which is the part that I dislike about this idea. So, I mean, given that, taking action over here actually means popping up with this. So the user basically gets this message here.
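As a rough sketch (a simplification, not Facebook's actual published spec), the pipeline being described is essentially "classify, escalate to a human above a threshold, then act":

```python
# Toy sketch of a "classify, escalate, act" pipeline (a rough simplification,
# not Facebook's actual implementation).

def classify_risk(text: str) -> float:
    # Stand-in for the real classifier / neural net: returns a risk score 0..1.
    keywords = ("hopeless", "end it all", "goodbye forever")
    return 1.0 if any(k in text.lower() for k in keywords) else 0.0

def human_review(text: str) -> bool:
    # The step the speaker objects to: a person reads the (possibly never posted) text.
    print("escalated to a human reviewer:", text)
    return True

def show_support_popup() -> None:
    print("Popup: 'Is everything okay? Here are people who can help...'")

def handle_text(text: str, threshold: float = 0.8) -> None:
    # Reportedly runs on posts, comments, and even drafts that were never submitted.
    if classify_risk(text) >= threshold and human_review(text):
        show_support_popup()

handle_text("goodbye forever, everyone")
```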
I mean, I think it'd be okay to have more false positives and not have it reviewed by a human reviewer. That would be better. Even though some people are more creeped out by robots reading their stuff than people reading their stuff. I'm one of those guys, actually. Still on Facebook.
Data reuse. So there's this article. A Facebook lawyer was forced to testify in court. They do that all the time this year; this one is from June 2019. And what they basically said is you have no expectation of privacy. There's no privacy interest, because by sharing with 100 friends,
you have published, you have shared with everybody. And then they go on to compare it to a birthday party in the article, where you invite a couple of your close friends, like 20 friends, and you have no expectation of privacy because any of those 20 friends could go ahead and tell your stuff to anyone else. So I'm not okay with that. Remember, privacy is also about consent, and that's not fucking consent.
Okay. Let's talk about something more down to earth, more technical. Web browsing. Specifically, JavaScript. The technology that fuels the modern web, from dynamic web pages to tracking. This here is an interesting message that I got when trying to search for some parts on Mouser.
So it says that JavaScript is disabled, so you can either enable JavaScript or log in. I mean, if that's not an admission of why JavaScript is being used, then I don't know what is. We also have this article here. And I'm not good with German, but it's funny.
It kind of loads, but then it doesn't. So I don't know what the point of that is, again. They're just screwing with people like me. I mean, I used to have to browse with JavaScript disabled to have the web not work for me. Now all I have to do is browse from Europe. That's what I get. One of the comics that I read: that is the actual comic,
that part on top. Everything else is trash on my screen. And it's not just that. Open any page on your mobile, and your screen is full of garbage, not the actual text that you want to take a look at. So this is interesting here.
If you take a look at the Washington Post a bit closer, this is what happens when you open it from Europe. You have this nice blah, blah, blah, and then you can click I Agree and Continue. And your only other option, if you don't agree to tracking, to giving up your privacy according to GDPR, is Back to All Options. And if you click Back to All Options,
what you get is you can pay to access the content. So that may be legal. I mean, Washington Post is a relatively large organization, so they probably know what they're doing here, but it's not ethical at all. Now, I'd like to spend the next 10 minutes
to talk about why I do all of that. Why do I try to stay private in my everyday life? I mean, I tried to convince you at the beginning. Personally, for me, it's care for others. Even though I don't have that much to hide, I like to provide that shelter, that herd immunity for the vulnerable people that really do.
But me hiding stuff, me not disclosing as much as the normal person, does tend to create some curious situations. So I have a bunch of certificates. This is not the crowd I should advertise that to. I was kind of forced to get them.
Anyway, so after every time that I take an exam, I have to write them a message, because every time when I take an exam, I show them my ID, my governmental ID, my secondary ID, and they still take my photo and store it. So obviously, every time after the exam, I write a polite letter to them saying, thank you for the exam, please delete my stuff.
And they do. But one time, they also said, we did. And then I asked, why the hell are my certificates gone? Why can't I verify them? And this is what they said: sorry, we misunderstood what you meant, so we deleted your whole account and all your certifications.
And the funny part: when we were trying to resolve that, multiple times they basically asked, so tell us which certifications those were, because we deleted them all. I could have claimed any of them. At one point, they said, okay, we restored them all. I took a look at my list, took a look at their list, and one was missing.
So I told them, nope, look more closely here. So another thing happened to me. I got an SMS. Anyone still gets SMS here in the audience? Yeah, about half the people. So I got an SMS. It didn't come from a number. It came from a spoofed source, so ASCII-based.
Crediton.lv. And what it says in Latvian is basically, hello. Unfortunately, your credit request has been denied. Crediton.lv. I've never, ever applied to any kind of credits or even credit cards in my life, so I was confused.
Obviously, a normal person gets an SMS, what do they do? They try to phone the fuckers. They try to understand, why the hell are you spamming me? Because you cannot reply to that number. So I go to the web page, which is that one, and I want to ask them, so what's up? So I phone them, and they don't pick up. They don't connect,
because my outgoing number doesn't exist. It's set to private. I use caller ID barring, so they don't connect. They don't want to talk to me. So, okay, the only thing I can do is hop on my bike and ride over to their office. And everything is fine. Everything is solved.
I arrive there. I show them my phone. They ask my phone number. I write my phone number on a piece of paper. They take it over backstage to some IT guys, and the IT guys come back saying, nope, it's not our system. We didn't send it. And I'm okay with that. I mean, I could easily have sent that myself, right? Or you could have done it, huh?
But, I mean, it's fair. Someone can spoof that. It happens. So I wave goodbye, and I'm out. So that's that. A week later, same number: win a shopping cart worth 50 euros if you go into your profile and renew your information. So what do I do? Call them up.
Doesn't connect. Look at their webpage again. Take my bike. Go there about one hour before closing time. It's not open. Apparently, the opening hours on the webpage are the opening hours for the phone, which doesn't work, not for the actual office. So 10 p.m., no one was there. People were there, I saw them, but they didn't open the door for me. So on my way back, I go into the petrol station.
I fill the bike up a bit, for about one euro, and I keep the receipt. So in two days, I ride back again. They're open. On my way there, I fill the bike up a bit, take the receipt, go there. And I show them this again. And you know what they told me? That, yeah, we looked more closely here,
and we found your number. Sorry. Our bad. So I told them, okay. Well, it's a bit strange that you do not verify the number. Don't you have, like, requirements by law to verify these things? And they replied, yeah, we do, but only if we approve the credit. So you can apply with a bunch of random guys' numbers
and get them spammed. So I said, okay, that's dumb of you, but I have these two euros that I need you to compensate, because, you know, because of you I had to go over there. You lied to me, and now I have to make these trips. And so they told me to send an email. So I got the email address. I went home. I sent them an email, attached the scans,
and they replied, please provide your bank account number to transfer the money. I politely and honestly replied, I do not have a bank account number that I can provide to you at this time. So that was that. They never replied. A couple weeks later, I look in my bank account,
and there's the fucking money. Many people have asked me at this point, why haven't I looked deeper into that? I'm too afraid of what I'm going to find out. I don't know how they got my account number. I don't have the slightest clue. Now, from the privacy perspective, there's one more thing you can do. If you still use post, at the post office, you can use a post office box.
That means if you give someone your address, they cannot abuse it, right? If I consent to them sending me stuff, they can send me stuff, but they cannot break down my door, because that's not my door. That's my PO box in the post office. So I don't know the prices here in Germany, but in Latvia, they're cheap.
I actually have two PO boxes this year. One box costs 12 euros per year, so it's super, super cheap. Speaking of Congress, C3 Post is also quite good. It's anonymous, if you want it to be, and it's right behind the stage. So go send a postcard to someone you want to send a postcard to after the talk. Let's talk briefly about mobile apps.
So this is Socratic. It's a mobile app that kids use to talk about homework, right, to learn. And Apple, and also Google right now, have actually created a good system that allows you to control, granularly control, what apps can do and what they cannot do. And in this case, you know, they ask to access your contacts.
And the thing that Apple has done is they allow the developer to actually specify whatever text they want in here. They cannot delete or change anything, but they can add a text. So let's say it's only about chatting. It's only for chatting about homework. Don't worry. So naturally, you press Don't Allow, and this happens, yeah? Sorry, we kind of need that. So I hope the day comes when Apple forbids apps like that,
apps that block your experience fully just because you haven't given them a permission that should actually be optional. Now, I used this taxi app called Taxify. Then they changed the branding to Bolt, and this is how it looks now. I can still use it, but if I want to press on the button,
which you don't see on the screen, it's super, super dim over there, where all my settings were: my name, my phone number, my previous rides. I cannot access it unless I have GPS enabled. So why do I need GPS to access my history? I have no idea. But that's how they do it now. But still, at least it was functional, until they deleted my account
because I wrote a GDPR request to them to explain what's going on and why I can't access my data. They basically said, since you have lost your trust in us, sir, we will terminate your account starting next month. I still got to use it for a while, and then I switched to this other taxi app, Yandex. It may or may not be run by Russian special services. I don't know.
I was using that. I took one ride. Coincidentally, it was to the airport to go to Ukraine. Coming back from Ukraine, I enabled my phone connection again. This is what I got: your next ride can happen in 2023. Welcome. Speaking of Yandex, they have a privacy policy, of course.
All companies doing business in the EU need to have that now. And it's quite okay. So they actually have the type of data, like location. They explain how they use it, and they explain how you can change your data or how you can withdraw your consent of them using the data. And it's all fine here. But if we scroll down further, we see those kinds of categories,
and we see the reason is to improve app performance quality. And we see that you cannot use the same app without giving the data. And what does that include? Amount of disk space on Android. List of installed apps on Android.
Device technical characteristics, like model, manufacturer, operating system. Sensor information. Anyone have a camera sensor on their phone? Jeez. Okay, back to WhatsApp. I was forced to use WhatsApp because my friends use WhatsApp, and I have all the apps. I'm like that guy. But I don't want to be that guy that shares your phone book with the company,
because then you betray your friends. So I never do that. I always click deny. So I was using WhatsApp like that, and I want to show you how I found a way to create new conversations in WhatsApp without giving access to your phone book.
So here, I use my regular dialer. I dial the number I want to chat with in WhatsApp. I press the green button. I press the red button. I go to recents. I press on the "i" over there. Then I hold the message button, then I choose WhatsApp. The call button, then I choose WhatsApp.
Then I call in WhatsApp, then I press the red button in WhatsApp. Then I go to calls in WhatsApp. Then I open up here, and then I press the chat. And I'm in the chat. So that's how you could create a chat in WhatsApp. And I say "could" not because you can't do it anymore, but because they fixed it.
I mean, I still don't love WhatsApp, but now they actually fixed the button for those of us who do not have the address book. So that's something. I mean, it doesn't make them perfect, but now I don't have to do that. So that's cool. I also use these things. Do you have them here in Germany? The prepaid anonymous cards, you can just buy them in the shop. Yeah? Okay. So I use these a lot, since I don't have any other kind of banking cards.
And if you notice, it doesn't have a name. This just says, term of use, one year, maximum. So I decided I have to be fair. And when I buy something online, I have to write as it is. So I tried to do a bunch of stuff by leaving the name on the card blank.
That's just the placeholder. That's not what I wrote. And it gives nice errors, like Porkbun, a DNS registrar, here. It gives a kind of technical error, so you can work from there. But for these cards, of course, you can write any name that you want, and it just works. Now, if you take a look at this picture here, I took that in Vienna.
Cameras everywhere. I don't like to waste time, so I always, on longer rides, I try to work. I open my laptop, but I cannot work on a bus. They will see my passwords. I don't know what kind of resolution they have. I don't know if it's a 30 FPS camera or it's a 240 FPS camera.
Who the hell knows? So you can't work there. And the private information on my screen, information of my customers, it's also at risk. So basically, the only place you can still work are airplanes, even though I did see one airplane that has a camera already from the cockpit to the cabin on the door
instead of the usual analog systems that they use. So luckily, airlines want to save weight, so they probably are not going to install cameras in all seats, even though I've seen news about some low-budget carriers installing cameras in the entertainment systems on every seat. And they actually have been confronted by that, and they told everybody that, you know, we want to evaluate.
Basically, they use different wording, but they say we want to track people like we do online. We want to see how they interact with our product. Now, speaking of airports, right? Airplanes are mostly good, but they have these things. In some airports, it's actually considered a privilege. I was coming here from Riga through the fast track,
and they referred me to secondary. And the position of the thing in the Riga airport, and me going there multiple times a month, actually suggests to me that they only use this for fast-track travelers. It's a feature, not a bug, right? Everyone else gets the magnetic gate and a pat-down, and here you get this. But you can still opt out, of course. Even in the UK. The UK tried to ditch that.
The European court ruled that it's not allowed. Now, until Brexit happens, we're good, including in the UK. Around the world, it differs. My main concern is not that I'm going to be seen naked. It's the artificial intelligence, the robots, taking decisions in a non-transparent way about me. I mean, it's going to beep or whatever anyway.
Just pat me down. I'm not even talking about transgender people, because for these things, the first thing the operator has to do, they have to select the... I don't know if it's meant to be sex or gender, but they have to select a pictogram of who's going in the system. You know. Boarding passes. Boarding passes are cool, except totally insecure.
Then again, except USA. They do have a signature part where you can sign it. But, you know, they're insecure. But my problem with boarding passes is shopping. Especially in Germany. They fixed something two years ago, and now you cannot buy a single thing in the airport without showing the boarding pass after you've gone through security.
So I hate that. And what I'm going to do is I'm going to spend the Congress working on an app. I'm going to present it in March at InsomniHack in Switzerland. I'm going to have a talk called Travel for Hackers. I'm going to talk about how to travel safely, what you can take and what you cannot take to different countries. And I'm also going to have this app. But you have to promise to only use it for shopping, not for boarding or accessing airport lounges. Anyway, using this app you'll be able to anonymize your boarding pass, like point and click. And then you can go shop, right? Don't use it for bad stuff or I might get thrown in jail. We don't want that, do we?
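For what it's worth, the core of such an anonymizer is tiny. Here is a rough sketch of the idea (an illustration, not the actual app), assuming the common IATA BCBP "M1..." barcode layout, where the passenger name occupies a fixed 20-character field right after the format code and leg count:

```python
# Rough sketch of boarding-pass anonymization (not the actual app).
# Assumes the common IATA BCBP layout: format code and leg count at offsets 0-1,
# then a 20-character passenger name field at offsets 2-21.

def redact_bcbp_name(barcode_data: str, placeholder: str = "XXXXXXX/XXX") -> str:
    if barcode_data[:1] != "M":
        raise ValueError("not an 'M'-type BCBP string")
    name_field = placeholder.ljust(20)[:20]   # keep the field width intact
    return barcode_data[:2] + name_field + barcode_data[22:]

# Made-up example data, constructed to match the assumed layout.
sample = "M1" + "DOE/JOHN".ljust(20) + "EABC123 RIXFRABT 0123 359Y012A0001 100"
print(redact_bcbp_name(sample))
```

A real tool would also have to re-render the barcode and decide what to do with the booking reference and frequent-flyer fields, which can identify you just as well as the name does.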
Another thing. What is secure? I mean, at airports you have boarding passes, we have these scanners. Fingerprints are secure. I mean, unless you use them as your password; that's dumb. But fingerprints are good. So back, I think it was 10 years ago, Latvia started to enroll in this ICAO program for biometric passports.
And before that they just told everybody, we don't need your fingerprints because we had no biometric passports. That's fine. Now, that changed to we only store fingerprints in your passports. And that's an acceptable compromise. I mean, I see how that can improve travel safety
as opposed to the dumb limitations we have on the liquids that we can carry. By the way, have I told you the story of how I tried to carry ice through airport security in Belgium? They told me it was a liquid. We argued with them for about 30 minutes, and then they won,
because by then it had become liquid. True story. I gave up. I just threw it away. They have their own physics. So back to passports. So, we only store fingerprints in passports. But then I somehow learned that they might be saving them after all.
I talked to some friends in the Ministry of the Interior, and that's what they told me. Well, that's what someone hinted to me. And what happened then is there was a rush to create a law on biometric data storage. So they wanted to legalize storing a hash in the database.
That's their current answer. If you ask them what they do with a fingerprint: we store the hash of the fingerprint in the database. And a hash is safer. Why? Because if I do stuff that annoys my government, they cannot download my fingerprint from the database and put it on a dead guy somewhere.
A hash is a hash, right? So what I did is I used the local equivalent of GDPR. We have had a similar thing in Latvia since 2001. It was basically the same; only the fines were capped at 1,000 euros. But everything else was the same. So I used that to request my data from the government to take a look at the hash. So they sent me the hash.
WSQ. It's the FBI's Wavelet Scalar Quantization algorithm, an algorithm that can be used to compress black-and-white images. I got these two files, left and right finger. And I did manage to find the only resource on the internet, and it was on GitHub, actually, that contains an algorithm to open it up. And, yeah, my fingerprint is in there. That's not my actual fingerprint on the slide, but you get the drift. So my whole damn fingerprint was there. That's the repo I used.
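The distinction being drawn here is easy to show (a generic illustration, nothing specific to the Latvian system): a real cryptographic hash is a short, fixed-length, one-way digest, while a WSQ file is just compressed image data that a decoder turns straight back into the fingerprint.

```python
# Generic illustration of the difference (not the Latvian system itself):
# a cryptographic hash is short, fixed-length and one-way; a WSQ file is
# merely compressed image data that decodes back into the image.
import hashlib

fingerprint_template = b"...raw minutiae or image bytes..."   # placeholder data

digest = hashlib.sha256(fingerprint_template).hexdigest()
print(digest)   # 64 hex characters; cannot be turned back into the fingerprint

# A WSQ "hash", by contrast, is kilobytes of data that a WSQ decoder
# (for example, the GitHub tool mentioned in the talk) reconstructs as the image.
```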
Let's summarize. So what's the status quo right now? I want to talk about multiple aspects.
First of all, user demand. As you see, I marked it with a frowny face. Users, and I'm talking about people outside this conference, do not really care about privacy. They don't need private apps. They're okay with being filmed. They're okay with their fingerprints being taken.
I think most people would be okay with their palm print being taken. Maybe even a blood sample if it only happens once in four years. So that's problematic. The cookie law. Someone remembers that? We have that in the EU. We've had it for a couple of years now. Well, it did nothing.
It basically did this, that every site had to open one more banner and inform us that, hey, cookies are being used. And we did have DNT, the DNT header in HTTP. It was a great idea. I took a look at what happened, and it basically just randomly died within some discussion group. So maybe we should revive that, because that's perfect.
Inform your browser if you want to not be tracked, and the website should be mandated to not show you the banner, but to actually take care of honoring that header.
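Honoring the header server-side would be trivial; a minimal sketch (using Flask purely as an example, with a hypothetical tracking cookie) could look like this:

```python
# Minimal sketch of server-side DNT handling (Flask used purely as an example;
# the tracking cookie name and value are hypothetical).
from flask import Flask, request, make_response

app = Flask(__name__)

@app.route("/")
def index():
    resp = make_response("hello")
    if request.headers.get("DNT") == "1":
        # Visitor asked not to be tracked: set no tracking cookie, show no banner.
        return resp
    resp.set_cookie("tracking_id", "generate-or-load-an-id-here")
    return resp

if __name__ == "__main__":
    app.run()
```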
GDPR for big data? That's actually crap. Big companies have ways, both legal and technical, to get around GDPR currently, and GDPR enforcement cannot reach them. I mean, if something happens that can be proven, fines are big, and those fines are going to be paid, I hope. But GDPR can still be improved there. Now, GDPR in general, that's great. That actually allows us to stand up for our rights.
Go and ask them: what data are you holding on me? Go and ask them to delete your data, to correct your data. They need to be on their toes. They need to know that we are looking at what they're doing. But the problem is, all these things are EU only. I mean, the cookie law is, but the other things, GDPR, it's EU only, so we have to make sure that other governments around the world
adopt something similar. Surveillance technology is getting worse and worse. We have different technology being advertised both as a military tool and as a tool to track your kids and wife. That's not okay. And for encryption, I used a neutral face here,
because encryption is good, right? Anyone who knows how, say, AES works, or whatever algorithm, you can take a look at that, and it's great. It's unbreakable if you implement it correctly and use the right key sizes. There are two problems.
It's not being implemented correctly in some places, and most importantly, as a user, you have no idea if the app is actually using encryption for that part, or bypassing encryption entirely, or using something else. So that's problematic. Would any of you notice that your favorite chat app sends a copy of your message somewhere, even though the original copy is encrypted?
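For reference, using a modern primitive correctly is not the hard part. A minimal sketch with the Python cryptography package and a 256-bit AES-GCM key looks like this; the hard part, as a user of a closed app, is knowing whether your message only ever takes this path:

```python
# Minimal sketch of correct authenticated encryption (Python 'cryptography'
# package, AES-256-GCM). The primitive is fine; the open question in a closed
# app is whether your message only ever takes this path.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # the right key size
aesgcm = AESGCM(key)
nonce = os.urandom(12)                      # must be unique per message

ciphertext = aesgcm.encrypt(nonce, b"meet at 18:00", associated_data=None)
plaintext = aesgcm.decrypt(nonce, ciphertext, associated_data=None)
assert plaintext == b"meet at 18:00"
```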
So what do we do? How can you fix it? Now, let's not talk about the cookie law. That's not that interesting. For user demand and GDPR, it's the same thing. We have to inform users that there's GDPR,
and, first of all, we have to tell them that it's good. You can use it, and privacy is good. This is how it will help you. This is how it will help your friends. This is how it will help other people online. GDPR for big data: lots of work needs to be done. If there are any lobbyists in the room, that's where you go in.
If you have contacts in the European Parliament, that's where you go in. We need to fix GDPR so it works better for big data, because big data is supposedly anonymized, and you cannot do anything with it, right? Because it's "safe". But it's not. Now, for surveillance technology and encryption, this needs to be fixed by us here.
We are the only ones with the technical expertise to actually try and fix these things. Privacy and indeed human rights are a relatively recent invention. They've been among us for 100, maybe 200 years, which is why, at least in my eyes,
it's even more deplorable to see corporations and governments alike hastily eating away at our right to privacy for their own benefit. Privacy shouldn't be a luxury that only the rich and powerful can afford. Privacy is for everyone. Privacy is a fundamental right.
And like with all fundamental rights, any encroachment on them needs to be aggressively and decisively terminated. Thank you so much. Thank you, Kirils, for your great talk.
Unfortunately, we don't have time for questions, but I think there was a lot of content in that. And if you have any questions, contact him. All the details are here, so enjoy all the interaction and enjoy the rest of the Congress. Thanks.