IoT ethics
Formal Metadata
Title: IoT ethics
Series: FrOSCon 2017, talk 38 of 95
License: CC Attribution 4.0 International: You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor.
Identifier: 10.5446/32309 (DOI)
Transcript (English, auto-generated)
00:07
So, thank you. It's really cool to be here. I came here by plane, which I wasn't very happy about, because I'm terrified of flying, and as a security analyst I know this
00:23
doesn't make sense at all. I know this, but it doesn't matter; I'm still terrified. The last time I gave a talk like this, I went by high-speed train, which was much, much nicer. It has the same probability of dying, so to speak, but
00:43
it still feels so much better. Also, I could use this chip implant that I have in my hand as a ticket on the high-speed trains in Sweden, which is kind of cool. So how come planes are so safe? How come trains are so
01:08
safe? Well, I'm just thinking: what if we made a startup for planes on the same basis
01:21
that we do startups in IT? So I'm thinking, well, I don't really have a license to build planes, but I think it will work out fine anyway, and I'm gonna focus on one
01:41
thing, and that's gonna be the interior design. And aerodynamics? I don't know, we can figure it out as we go, I think. I think we're gonna make the body of this plane out of plastic, because that's light and it's cheap, and yes, it's proven not to withstand lightning, but we put that in
02:05
the manual and also a skilled pilot would avoid lightning so it doesn't matter. Is there anyone who would board this plane?
02:21
Still, this is how we do it in IT and in the IoT. So since this is an IoT talk, I of course have to start with some kind of definition of the Internet of Things, and I'm gonna use a very, very broad definition.
02:42
We can talk about sensors, moving parts, and computers, and they all need to be connected. Most of these devices also collect private data, which is one of the biggest issues. The business model is often built upon collecting that private data instead of, for example,
03:06
selling products because at the moment there is obviously no money in selling the products. Bruce Schneier, who is an IT security guru, so to speak, he is claiming that we are
03:23
building a world-size robot with the Internet of Things. He is claiming that the sensors are the eyes and the ears of this robot, the moving parts are the arms and legs of this robot, and the connectivity and computers
03:42
are the brains. And then we're giving that world-size robot all of this data about us. Does it sound like a perfect idea? I don't know. I promised you that I'm gonna talk about things that I have found out living in IoT
04:10
labs and using my own body as an IoT lab, basically. For me it started some years ago, when I moved into an apartment that had an alarm
04:25
that would give a push notification to my boyfriend every time that I would leave the house. Of course, the first red flag in my head was, well, this can be used for domestic violence
04:42
against any partner or any kid. What beautiful ways you can have to control your partner. When I'm talking about beautiful, I mean the opposite of beautiful. You can use it as a very efficient way of oppressing your near and dear.
05:07
His argument against this was, of course, that the tool doesn't make the use case. Guns don't kill people. People kill people. Sometimes things happen by accident.
05:22
Of course, in this project, it wasn't a problem. It never came to domestic violence in this case. It was never an issue really. I think the biggest issue was possibly that when I was out and I could see that,
05:44
oh, he hasn't left the house until two in the evening. He's wasting his life. Is this really something that we want to know about our partners? Maybe it is. Maybe it isn't.
06:03
I think that IoT is an extremely interesting and powerful tool. Like any tool, it can be used for good and for bad. It can also be used for accidents. Let's take another example. We have nuclear fission, which can give us cancer treatments.
06:24
It can also give us the North Korea-American situation. It can also give us Sellafield, Harrisburg, Fukushima, and Chernobyl.
06:43
This is a very strong tool, and it's not necessarily so that we actually have control over it all the time. We have to be aware of this. The aforementioned Bruce Schneier claims that there are two paradigms of how we make security in the IoT.
07:05
The first one is the paradigm that comes from the physical world of highly regulated dangerous things. These things are highly regulated because they have historically been very dangerous.
07:21
If we're using plastic in the body of the plane, people will die. So you're not allowed to do that. Or you can do that, but you can put some aluminum mesh over it to make it into a Faraday cage. But you can't cut corners in the same way, because people will die.
07:41
And the way that we ensure that people don't cut corners is through regulation. On the other side, we have the other paradigm of agile, patchable security, of things that have thus far been fairly benign, like computers, or just tracking my bodily functions in order to be better at biking or running,
08:08
which is what I mean by biohacking. And in the IoT, these two worlds clash together so that we, for example, have medical devices with connectivity
08:27
that if we find a security issue with them and we need to update it like we usually do in the agile security paradigm,
08:41
that may void the certification that that medical device has received from the FDA. It may or may not. We're not totally sure about this. But it's not very nice to put millions and millions into a device,
09:04
and then all of a sudden, because you did the right thing, you're not allowed to sell it anymore. So what Bruce Schneier is saying is that we need to regulate the Internet of Things.
09:20
He's also saying that this must come from us as a community, because otherwise regulation will happen to us, and it will happen to us in a very unpleasant way. And I think that people like Bruce Schneier, or people that were around in the 90s for the crypto wars, know that if you leave regulation to the policymakers,
09:45
they will probably make something that isn't very useful and isn't very efficient. But what he's claiming is that we need to find a way to make this work, and we are the only ones who can do that.
10:05
I'd also like to take a quote from him. The market can't fix this issue with... So the context here is the Mirai botnet from November, where badly secured devices were used to orchestrate a DDoS attack against a security researcher,
10:29
but also against Twitter and stuff. And we see this more or less every month nowadays that badly secured IoT devices are used in order to take down other sites.
10:44
So in this case, there is an externality created. So the market can't fix this because neither the buyer nor the seller cares. And this issue is what economists call an externality. It's an effect of the purchasing decision that affects other people.
11:03
And we can think of it as invisible pollution. Before I went into the cybers, I was going to be an environmental economist. And the first, second, and third thing that you have to learn in environmental economics is we need to internalize the externalities.
11:27
So this is something that we can learn from other fields about. And there are, I can see at least three ways of internalizing these externalities,
11:46
making the industry pay for itself. And the first one, I think, so there's the issue that neither the buyer nor the seller cares about IoT security.
12:06
But I would disagree about this. I think that the buyer does care. But the buyer has no idea how to differentiate between a device that is somewhat secured,
12:22
a device that has some kind of agile practices and a vulnerability disclosure policy, and one that doesn't. So what I and others would like to propose is simply to have a voluntary seal.
12:41
And then we can look back at the environmental movements again. We have lots of different seals here. And this one is very well known to the Swedish population; we all know this. We know that if you want to make the ethical choice when it comes to ecology, you look for this.
13:03
And the same here with the eco label. If we want to make a more ethical choice, if we care about the externalities that are created by us buying these products, we look out for this, and we know what this is. This is something that is communicated to everyday people,
13:25
and you don't have to be an environmental scientist to understand some of the basics of this. And to my knowledge, there is not yet a seal for IoT security out there, and I think it must come soon.
13:44
I know there are initiatives for it. The second way that I think this market will change is the threat of ransomware. If you have a very badly secured device, you are vulnerable to ransom.
14:07
And this is something that is very bad brand damage to the seller and to the vendors. This is something that people actually have in their mind and they don't want.
14:22
They obviously don't want their Roomba to be ransomed. Of course not. And there is another very interesting version of a crypto locker that is called BrickerBot, which functions in the same way that normal ransomware does,
14:46
but instead of leaving a ransom note of give me this and this many bitcoins, it just bricks your device. End of story. You will never get the key back.
15:02
The person who is claiming to be the creator of this BrickerBot says that what he is doing is internet chemotherapy. So no one in their right mind would go through chemotherapy if they weren't severely sick.
15:21
And he is claiming that the internet and the Internet of Things are severely sick, and he is taking it into his own hands. Disclaimer: I of course don't think that anyone should brick other people's devices. It's illegal. But threats like this will make the world better.
15:47
I'm sorry. It's the truth. The third way that I think we will internalize the externalities of the IoT is through the General Data Protection Regulation, which is a regulation, of course.
16:05
There are different kinds of regulations. We have the CE mark, for example, that is a certification that you need pre-approval and it costs money to do. And when you have gone through there, you are approved to sell your electronic devices in the EU.
16:26
And then five years later, even though your product hasn't changed, you still need another one of those approval processes and it actually needs to go through and it's very costly and it takes a lot of time.
16:40
The General Data Protection Regulation works in kind of the opposite way: instead of pre-approval, instead of asking for permission, you are asking for forgiveness. So you can do whatever you want, but if personally identifiable information is leaked,
17:01
you are fined, and you are supposed to be severely fined. We don't know if this is actually what's going to happen, because it only comes into force in May 2018. So it may very well be another one of those regulations that never work.
17:22
And that's the issue about regulations that we often think about the successful ones and say, hey, regulation works, we should totally do regulation. But then we forget about those that don't work or have the opposite effect. Think EU cookie law.
17:41
The EU cookie law doesn't make sense at all. You can basically use it as a phishing technique. It's a mess, you know. There are so many regulations that haven't worked out before. So why would regulation work this time?
18:02
Maybe it does. And I think that the General Data Protection Regulation with the right push from the communities like this, people that care about privacy and know about their rights, it can make a very big difference
18:24
not only in the EU but in the global market. So one of the things that you can do, for example, is that you can say today already, hey, I want you to erase my data. And I'm doing that nowadays.
18:41
Every now and then I'm like, oh, I don't want this account anymore. So I contact their support, and it may take between one and ten minutes to find the email address. Obviously I'm not calling but mailing. And I'm saying, hey, I'd like to use my right to be forgotten. And most of the time I get the answer: what?
19:03
And so in eight months or seven months, I can't count at the moment, they will be fined if they can't erase you and erase you soon after you ask for it.
19:21
But most of the industry still isn't even aware of this. Another great thing about the General Data Protection Regulation is that if your personal identifiable information is leaked, they have to tell the authorities and they have to tell the authorities within 72 hours.
19:44
And if you're negligent and have no clue about what infrastructure you have or what application you have, that's tough luck. You still have to report and you still have to do this. You still have to comply all the time.
20:01
But they are not checking you beforehand. So I have very high hopes in this and I would love all you guys to join me in the quest for our personally identifiable information, making it actually owned by us because it is very clearly owned by us.
20:26
The second part of this talk starts with this chip implant that I have in my hand. Almost two years ago, I was tipped off to go to an after-work seminar in Malmö, the town where I live,
20:46
where you can get a chip implant into your hand. That's a tiny bit different. But for me it felt like it would be into your head
21:01
because I thought that anyone who would do this lacks the basic risk assessment capability and must be really, really stupid. Who here thinks that I must be really, really stupid to do this? Oh, there's only one, two, okay.
21:22
Liars. Yes, you're all polite. So I went there to that party saying no one in their right mind could do this. And then I saw the presentation and people were doing it and I felt the instant urge to do this.
21:47
So in 40 minutes I had turned 180 degrees, and this was so interesting to me. What had just happened? Why did it work out like this?
22:04
So I started to research this and some months later at the next chip installation party, I got the deed done. It's on YouTube, I think.
22:21
And what I realized is this, that me as a security geek, I stand on the side and I see a technology and I'm saying, hmm, interesting. There's a weakness, there's a weakness, there's a weakness, there's a weakness, there's a weakness.
22:44
And the entrepreneur is like, this is so cool. We can do this, we can do it. Oh, and we don't think, we don't need, you know, we figure it out as we go. And then this thing becomes a huge hit and all those weaknesses that we pointed out are still in there
23:06
and they haven't been addressed because we stood on the side and said, really? You know, I think to some extent this is an attitude problem from the security geek side.
23:21
And I say that as a security geek. So when I was thinking of this chip implant, I thought of the science fiction that I was just recently watching. This guy Ronan here, he has the equivalent of a GPS tracker in his back
23:43
and he's being tracked through the universe by life-sucking aliens. And Seven of Nine, my childhood hero: as a plot device, every now and then a tracker in her head turns on,
24:03
and, you know, then Janeway saves the day, but you never know that until 40 minutes later. So this is what I was thinking of when I was thinking of chip implants. It's trackers.
24:22
So just step back one step and think of what a cyborg is. You know, the keynote speaker yesterday, Karen Sandler, she was speaking about how she really didn't want to become a cyborg. When she got the pacemaker into her chest,
24:42
one of the issues for her was, I don't want to become a cyborg. But for me it was like, I really want to become a cyborg. This is so cool. I don't know why, but it just feels good. So the original definition of a cyborg is a human being with bodily functions aided
25:04
or controlled by technological devices. So in this case we have the pacemaker or defibrillator up there. We have an insulin pump that is steadily attached to the body. This is an intrauterine device that you use so that you won't get pregnant. This is a schematic for dialysis.
25:24
And does anyone know why Malala is in this picture? Malala is the one who got shot in her head in Pakistan by the Taliban. And when she got shot in the head, she lost her hearing on one side,
25:41
and then she got a cochlear implant, which is a hearing aid that you implant into the head. So I think that she's one of the most famous cyborgs out there. The interesting thing about this cochlear implant too is that there is a possibility to get superhuman hearing with a cochlear implant.
26:06
It depends on the patient, but there is a possibility to, for example, hear things 200 meters away or something like that, but it's not deemed ethical to tinker with this. And you're not allowed to do it yourself.
26:21
You have to do it with your doctor, and your doctor is saying, sorry, this is against the code of ethics. We can't do it. So another definition of a cyborg is that you have something implanted, and by this definition, I am a cyborg.
26:41
And the pacemaker and the intrauterine device, the hip, breast implants, whatever, all of these can be counted as cyborg. And there's, of course, a third definition. A cyborg is a human being with an electronic device implanted in
27:01
or steadily attached to the body with the purpose of increasing individual senses or abilities beyond the occasional use of tools. And also by this definition, I would be a cyborg since I was 12. So I didn't have to do the chip implant.
27:21
I could just have gone for the glasses. I'm just wondering, who in here doesn't have a smartphone? We have two and a half people in here who don't, three and a half people who don't have a smartphone. Interesting.
27:43
So what people often ask about this is: what kind of chip implant is this? It's the kind of chip that you would have in your bank card or in your passport. It's an NTAG216, complying with ISO 14443.
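As a hedged illustration of what talking to such a tag involves (not something shown in the talk): reading an ISO 14443 Type 2 tag with the Python nfcpy library and a USB NFC reader might look like the sketch below. The 'usb' device string and any stored records are assumptions.

```python
# Minimal sketch: reading an ISO 14443 Type 2 tag (e.g. an NTAG216 implant)
# with the nfcpy library. Illustrative only; assumes a supported USB reader.
import nfc

def on_connect(tag):
    # tag is e.g. a Type2Tag; identifier is the tag's unique ID (UID)
    print("Tag:", tag)
    print("UID:", tag.identifier.hex())
    if tag.ndef is not None:          # the NDEF data area is tiny (< 1 kB)
        for record in tag.ndef.records:
            print("NDEF record:", record)
    return False                      # return after a single read

clf = nfc.ContactlessFrontend('usb')  # assumed reader device path
try:
    clf.connect(rdwr={'on-connect': on_connect})
finally:
    clf.close()
```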
28:03
There's a tiny amount of storage on there; it's less than a kilobyte. These have historically been used on cats and dogs, but those have had some kind of coating that grows into the tissue of the cats, because for some reason veterinarians aren't as good as piercers.
28:26
I'm not sure. I had a real piercer do this implant for me. It's 12 millimeters of biocompatible glass; it's like a grain of rice, basically.
28:42
It's made to be easily taken out of the body without issue, because it's not going to grow into the body in any way. It has a one-and-a-half-millimeter antenna, and the ISO 14443 standard calls for a credit-card-sized antenna.
29:01
So this antenna makes it really hard to read these and also since my body is mostly water based, it's even harder to read them. The mobile phone that I would usually use this with has an NFC reader that doesn't get very much power,
29:22
so it's very, very hard to use it. Anyway, it's a passive technology. In order for me to be tracked, we would have to have an infrastructure of very, very powerful antennas every meter basically,
29:45
which is doable. I'm not sure. It's not the most practical way of tracking a person and this is the main reason why I decided to do this. Also because I felt I needed to do a risk assessment of this
30:03
and I needed to do it by immersion. Since I realized that in order to succeed as an IT security person, I need to be able to talk to normal people and I need to be able to talk to entrepreneurs
30:22
and I need to be able to get them to understand what I'm saying. So that's one of the reasons why I did this too. So this, I'd say, this is not a dangerous tracking device. But you know what is a dangerous tracking device?
30:46
Richard Stallman calls this the governmental tracking device and some years ago people were like, oh my god, you're so paranoid. Well, of course, yeah, theoretically that can happen, but would they really do that? Yes, they would. We're tracked all the time with these governmental tracking devices.
31:07
It's not ideal from a personality identifiable information perspective or a privacy perspective. And the thing is this happened by accident because we had this really nice technology of mobile phones
31:23
and in order for mobile phones to work, you always need to triangulate the position of the person. And then that really smart technology merged with another really smart technology of the smartphone. And all of a sudden we have unique identifiers
31:42
that are continuously tracked and yes, you can turn off the tracking, but then it doesn't work. The whole idea of having a smartphone is to have the database of all of the world's knowledge
32:01
in your pocket at all times. This is why we want it. It's amazing. It's an amazing technology, but it is tracking us all the time. And it just happened. So we don't want the IOT to be something
32:25
that happens to track us all the time. Think of the whole idea that, when in doubt, we would rather let someone go free than put them in prison. The issue here is that that principle came along
32:44
at a time where there was very little data about people and you couldn't know whether or not someone has done it. So it's better to let someone go than let them into prison, even if it's a murderer. But nowadays we have so much data on people.
33:04
There are incredible amounts of metadata about me created all the time. I mean, one of the really strange things is my phone telling me, oh, you're in Vohwinkel, Wuppertal. Would you like to know where the bus goes?
33:22
And it's doing that all the time. And this is something that we're just used to. And we will come to a point when turning off my phone is a reason to be suspicious of what I'm doing.
33:43
So with this chip implant, I came into a scene that's called the biohackers. And they are doing different things. They are doing do-it-yourself DNA labs in the same way that people were doing do-it-yourself computing in the 70s. They are doing chip implants.
34:01
They are doing quantified self. They are tinkering with food and stuff. They are tinkering with drugs. But basically, in its essence, it's a cool new name for good old keeping healthy. So for example, fitness trackers is something that I've been doing.
34:21
I've been doing some glucose metering to see what happens to me when I eat sugar. I pricked my finger like 200 times in two weeks. And I found out really interesting stuff about myself with this, without a physician or a doctor.
34:42
But all of a sudden, we now have so much data about ourselves. And when I was doing fitness tracking, I realized I was getting too much data about myself. What was happening was that I was getting injured, because instead of listening to my body telling me I have a bad knee,
35:02
I'm listening to, oh, I'm going very slow. I need to pace up. I've been talking about this already. I'll skip it. So maybe you've already heard this part.
35:20
Big data is the new oil. I've been really wondering, what do they really mean about this? What's the point? What we are right now doing is like we're collecting lots and lots and lots of data in order to sell ads. And the ad economy is basically a dying economy.
35:42
I'm not sure about this; I don't really believe in it, but it's an axiom that is told over and over again, and this is also the reason why we want to know so much about people.
36:02
And going back to the general data protection regulation and the privacy issues, I think it's kind of really easy. I think when it comes to Internet of Things, when it comes to biohacking, when it comes to tracking people, there's a very simple principle. Do not collect data without informed consent.
36:23
Informed consent does not mean an end user license agreement that is 25 pages. Informed consent does not mean if you want to use this app that you need to give me access to all of your data.
36:40
And when there is only one way of choosing here, that's not informed consent. I think when it comes to privacy data, we can think about it in the same way as violence: by default, violence is illegal, but under certain circumstances we can say,
37:02
you are allowed to punch me in the face because we're in a boxing match. Or, like, we don't have any rules here because this is MMA. But just because MMA does exist doesn't mean that we are allowed to kick each other in the face on the street.
37:23
I think that if we adhere to this small principle, we will come a very long way. Thank you.
37:40
I hope you have a lot of questions for me. Do you have a product that you use that you think is a good example? I find that I've never seen any product that strikes the right balance, and I'm wondering what you think is ideal. Okay, so do I have an example of informed consent in a very good way?
38:06
Not out of the top of my head. And that's also interesting. Like, eight months from now we're going to have a regulation that says that you have to pay 4% of your revenue if you're not acquiring informed consent from all of your customers.
38:24
But still, we don't have a very good idea about what this informed consent would look like. The 4%: in the GDPR, the highest penalty that you would have to pay
38:42
is 4% of your company's revenue annually. Is it gross revenue? Yes, gross. I think that there are some VPN services out there that are specializing in privacy,
39:03
where they are really good at communicating what they are collecting. I don't know, the privacy bear? The tunnel bear? It's called TunnelBear. So I think that there are some special cases where this is done in a good way.
39:30
Actually, yesterday I was talking to a guy who has a Tesla, and in the Tesla, every now and then, apparently, he is presented with,
39:40
by the way, we're tracking you in this way. Which is nice, because it's good that they are trying to inform people, but if they are trying to inform people in a way that is boring, that's also a convenient way of saying, well, we did inform you.
40:01
Yes? I think you were looking for a definition of a cyborg, and you showed several of them. I think maybe a more relevant one would be someone who relies on computers, which have executable code inside of them, inside of your body. There's a bigger distinction there, because the technology
40:20
is an executable that can accept instructions that can be reprogrammed. So you're coming here with an alternative definition of a cyborg that has to do with executables inside of your body. Yes, this is also another definition of cyborg. I think that both the Internet of Things and the cyborgs
40:41
and all the interesting stuff are really hard to coherently define. So that's why I used three definitions, and we can use more of them. At the beginning, you said that the consumers do not really care
41:02
about the security of IoT devices, or they do care, but they do not really understand what's going on. The problem is we're looking at a price-sensitive area. So when a consumer goes to the supermarket and sees an Internet-enabled baby cam,
41:23
for example, and reads the statement, military-grade security on the packaging for $30. So he's going to say, well, that has military-grade security. But if you look at the proper IoT device, like, for example,
41:40
the Ring video doorbell, which is actually the only device, I think, that is a properly designed IoT device, it costs at least $200. So what you're saying is basically that there is a price issue in the IoT, that people are not prepared to pay the price that you would actually have to pay for a device with a bit more CPU power, maybe,
42:07
that will actually be able to have reasonable key management. Yeah, it's absolutely an issue that the market is simply, the cost of these devices are simply too low at the moment.
42:25
Sorry, there's another. So are there initiatives to educate consumers on the issues of IoT security?
43:08
Yes, of course there are. I think that we're in the beginning of realizing that talking to users and not saying, well, you used it the wrong way, it's not a bug, it's a feature.
43:26
I think we're in the cradle of creating something, of understanding that user interaction and user experience are actually important, and that they are also important for security.
43:41
Until now, there are so many secure messaging apps and secure messaging services that are really secure and have focused on this, but that are unusable for normal people, because normal people, they don't want it to take half an hour to,
44:05
they don't want to understand the OSI model. They don't want it to take half an hour to configure this. They don't want to Google to get it to understand this. They want it to work intuitively.
44:24
So basically, before we had secure things and usable things. And at the moment, these are slowly starting to merge, because the security people are finally understanding that, well, people are using Snapchat.
44:41
They are sending dick pics and they are thinking, hey, this is secure now because it's only shown for 10 seconds. Yes, but it's logged and it's retained on a server where you will never reach it. So all of your dick pics are in the cloud. But you have a feeling of security because you don't see it
45:02
and no one else can see it until that server gets hacked. Whereas we now, for example, with Signal, have the disappearing messages where you can send your dick pics nice and they will disappear. And they will actually disappear. And Signal is usable and Signal has GIFs and Signal has lots of stuff.
45:25
We finally have many secure apps that actually are a bit more fun. And I think that's the key to the issue. So we had you before there, maybe. I'd like to ask: are there initiatives to do some kind of rating or seal for IoT?
45:50
Well, there is one initiative in London that I know of that's called iot.watch, I think. It's sponsored by Bosch, but it's a grassroots movement.
46:05
And that's basically the one that I have off the top of my head. When you talk about Signal, the problem with Signal is that by default it has some key management built into the application,
46:21
so it's not 100% secure. You have to take an extra step to verify the key over another channel. But if you look at an application like Threema, which won't work unless you do the offline key verification, that's a bit more secure.
46:41
I think that the key management issue is something that will follow me through the next 40 years of my career. So this is hard. The same problem presents itself in WhatsApp and even iMessage.
47:01
Those applications are secure enough, but they're not really secure because if you want real security, it's very inconvenient. Well, if you want real security, you will have to watch The Americans on Netflix to get some tips on how you can do that. You have to kill people.
47:21
You have to meet in a park, naked, without anybody around. So perfect security is often not even desirable.
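To make the out-of-band key verification idea concrete, here is a hedged, illustrative sketch: both parties derive a short fingerprint over the two public keys and compare it over another channel, in person or by phone. This is not Signal's actual safety-number algorithm or Threema's scheme; the keys are placeholders.

```python
# Illustrative only: both parties compute the same short fingerprint from
# the two public keys and compare it out of band.
import hashlib

def session_fingerprint(pubkey_a: bytes, pubkey_b: bytes) -> str:
    # Sort the keys so both sides compute the identical value.
    material = b"".join(sorted([pubkey_a, pubkey_b]))
    digest = hashlib.sha256(material).hexdigest()
    # Chunk the first 32 hex characters for easy reading aloud.
    return " ".join(digest[i:i + 4] for i in range(0, 32, 4))

# Both parties run this with the same two (placeholder) keys and compare.
print(session_fingerprint(b"alice-public-key", b"bob-public-key"))
```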
48:02
Well, you're saying that whenever data is collected, it will be misused. Well, maybe it doesn't have to be misused by the company in question, but then they file for insolvency and they sell whatever asset they have and their assets happen to be big data.
48:22
Or they get hacked, because they don't know the first thing about security and they don't know how to Google.
49:16
You're saying that the German law requires the electricity company
49:21
to collect your electricity consumption data once a month, but they're doing it every minute. And this is also the issue about regulation. There are so many rules. We're breaking rules every day.
49:42
And still, for some reason, regulation kind of works most of the time or sometimes. And it's better to have a society that is built on explicit regulation than to not have it. But just because a regulation exists doesn't mean that it's followed.
50:21
For this reason, I would say that we must regard power consumption as personally identifiable data, because you can learn a lot about a person from it. For example, in the hotel where we're staying, there is most probably air conditioning with a humidity sensor and a carbon dioxide sensor.
50:43
That can sense whenever we're sleeping, whenever we're showering, whether or not we're having sex. And this is something that could be used. You can see a spike when someone wakes up. And then you can make some more breakfast,
51:02
because you see that there are many people who just woke up and many people who just took a shower, so why don't we do that? That's a use case for that IoT device and for the side channel of an air conditioner that is IoT connected.
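As a hedged sketch of why minute-level metering is so revealing (the readings below are invented for illustration), even a naive threshold over consumption data flags when someone gets up:

```python
# Naive sketch: minute-level power readings (invented values, in watts)
# reveal household activity, e.g. a wake-up spike from kettle and shower.
readings = [
    ("06:40", 120), ("06:41", 125), ("06:42", 118),   # household asleep
    ("06:43", 950), ("06:44", 1800), ("06:45", 2100), # someone gets up
    ("06:46", 600),
]

baseline = sum(watts for _, watts in readings[:3]) / 3  # quiet-hours average

for minute, watts in readings:
    if watts > 3 * baseline:                            # crude spike threshold
        print(f"{minute}: {watts} W - someone is probably awake")
```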
51:22
The question is, is this something desirable? I don't know. In the back, we had a person. Who should do what, you said?
51:56
Who should regulate? If the government doesn't regulate the IoT, who should?
52:01
I think that first and foremost what really needs to be tried is business, is CSR, corporate social responsibility: taking privacy matters and making them into something marketable.
52:21
Saying that, hey, you're paying, it makes a real difference if you're buying this thing for $30 or for $60 because what we are doing is that we're selling a product. What they are doing is that they are selling you. And I think it needs to come from many, many different places.
52:41
I think that the open source world is really beautiful in this way, that it can both be monetized and also used by people who are idealists at the same time. It doesn't necessarily need to exclude each other. Speaking about regulation, how will it be possible at the world level to make regulation?
53:07
We will have more than 2, 3, 5, 10 countries who will never make regulation. And they will make it too strong. Global regulation is always an issue, is what you're saying.
53:21
Nine years ago now, I started to be, when I was in the environmental movement, I started to go to these UNFCCC talks in Poznan and then in Copenhagen. And what I realized there is that there are two types of global regulation.
53:40
There's the kind that is good and strong, like the Convention on the Rights of the Child, which no one is following. Or there's the kind that everyone is following, which is so watered down that it doesn't make any sense. So I quickly became disillusioned in that situation in 2008.
54:01
And the rest of the environmental movement came after me in 2009, when they also realized that the Copenhagen treaty wasn't going anywhere. So I think that's also a reason why I feel strongly about this: regulation can be used for good, but frankly, how often is it used for good?
54:28
How often does it have the intended purpose, the intended outcome? To follow up on the question, what we can do is we can educate people because we cannot make a regulation that would span the world.
54:43
So let's say, the more people who have incidents because of IoT devices, the more people will learn that they need to raise the bar, so to speak, on the devices they buy.
55:01
So it's enough to have a certification in one country, and that stamp would be put on the IoT devices that have a minimum set of security features, or at least are not made without security in mind, so to speak, and let the people vote with their wallets.
55:24
So basically what you're saying is that we need first and foremost the market to mature, and secondly, that we don't need a global regulation
55:40
because if we have the GDPR, for example, that only applies to 500 million people in the world, but we have a global market and everything is made in China, so everything in the world will be made to comply with this, ideally. In the same way that the FDA has a very strong power over how medical devices
56:02
and medicines look in the rest of the world, too. So I don't know how we are with time. Three minutes. Three minutes. Is there anyone who hasn't spoken who has a question? Come on, I don't buy it.
56:20
[Audience question, largely inaudible, about implicit trust in the government in Sweden.]
57:11
So do we have an extreme implicit trust in the government in Sweden? Yes.
57:21
Well, I think that Sweden has been used by many companies as a place to test things, because people tend to be very trend-sensitive, extremely trusting, and keen to try new things. So one of the use cases that they are trying for their chip implants is to use them as payment, as a wallet,
57:47
which is a bad use case, because these NFC MIFARE chips are not made with any kind of security in mind, so they are easily cracked, and there are hacker kits that you can get for 40 euros
58:04
that are made here at the university in Bochum, actually. It's a very bad idea to make a new infrastructure that is built on that NFC technology that I have in my hand, however, the next ones that actually have some kind of security in mind,
58:22
like the Java card, might be a way to do this. But I'm digressing because what I wanted to say is that there tend to be a lot of articles done about the cashless society in Sweden, and I didn't understand this. Like, of course, you can use cash all over, except when was the last time I used cash?
58:45
The last time I used cash I was in Germany. Talking about like baselines and diverting from baselines and that being a suspicious thing, I always go to the same shop, I always shop the same things, and I always pay with the same card.
59:04
So I think that one can read so much interesting data out of what I'm buying and when. And this is something that I'm readily giving away and that we all are giving away so easily because we are implicitly trusting the banks and we are implicitly trusting the state
59:23
and it's no issue for the Swedes that all of a sudden there is no money that isn't tracked. This just happened.
01:00:06
Yeah, you can learn so much about people from what they are buying and what they are buying with cards. So I know that I should use cash, but the thing is, interestingly enough, increasingly
01:00:24
there are signs saying no cash, we only take credit card and Swish. Swish is a way of paying with your mobile phone, your mobile number, and your Mobile BankID. So I think that's all, thank you all, guys.