Social Cooling - big data’s unintended side effect
Formal Metadata
Title: Social Cooling - big data's unintended side effect
Number of Parts: 167
License: CC Attribution 4.0 International: You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor.
Identifiers: 10.5446/34945 (DOI)
34th Chaos Communication Congress
Transcript: English (auto-generated)
00:02
Welcome to our next talk, social cooling.
00:21
You know, people say, I have no problem with surveillance, I have nothing to hide. But then, you know, maybe the neighbors, and maybe this, and maybe that. So tonight we're going to hear Tijmen Schep, who is from Holland.
00:41
He's a privacy designer and a freelance security researcher. And he's going to hold a talk about how digital surveillance changes our social way of interacting.
01:00
So please, let's have a hand for Tijmen Schep. Really cool that you're all here, and I'm really happy to talk here. It's really an honor.
01:21
My name is Tijmen Schep and I am a technology critic. And that means that it's my job to not simply believe what technology tells us. And that's really a lot of fun. The main goal is: how do I get a wider audience involved in understanding technology and the issues that are arising from technology?
01:42
Because I believe that change comes when the public demands it. I think that's really one of the important things when change happens. And for me, as a technology critic, for me, words are very much how I hack the system, how I try to hack this world. So tonight I'm going to talk to you about one of these words that I think could help us.
02:04
Because framing the issue is half the battle. If we can frame the problem, if we can explain what the problem is in a certain frame that already makes a certain position visible, that's really half the battle won. So that frame is social cooling.
02:24
But before I go into it, I want to ask a question. Who here recognizes this? You're on Facebook or some other social site, and you're about to click on a link, but you think, ooh, I could click on this, but it might look bad.
02:43
It might be remembered by someone, some agency might remember it and I could click on it, but I'm hesitant to click.
03:02
Is that better? Can everyone hear me now? No? Okay, yeah. Should I start again? Okay. So you're on Facebook and you're thinking, ooh, that's an interesting link, I could click on that, but you're hesitating.
03:21
Because maybe someone's going to remember that and that might come back to me later. And who here recognizes that feeling? So pretty much almost everybody. And that's increasingly what I find when I talk about this issue that people really start to recognize this. And I think a word we could use to describe that is click fear. Like this hesitation could be click fear.
03:44
And you're not alone. Increasingly we find that research points out that this is a widespread problem, that people are hesitating to click on a lot of links. For example, after the Snowden revelations, people were less likely to research issues about terrorism and other things on Wikipedia because they thought, well, maybe the NSA won't like it
04:01
if I look that up. Okay, I'm not gonna move. And we see it in Google as well. So this is a pattern that researchers are pointing to. And it's not very strange, of course. I mean, we all understand that if you feel you're being watched, you change your behavior. That's a very logical thing that we all understand.
04:22
And I believe that technology is really amplifying this effect. I think that's something that we really have to come to grips with. And that's why I think social cooling could be useful for that. Social cooling describes, in a way, how in an increasingly digital world, where our lives are increasingly digitized, it becomes easier to feel this pressure,
04:43
to feel these normative effects of these systems. And very much you see that because increasingly your data is being turned into thousands of scores by data brokers and other companies. And those scores are increasingly influencing your chances in life. And this is creating an engine of oppression,
05:01
an engine of change that we have to understand. And the fun thing is that in a way, this idea is already really being helped by Silicon Valley, who for a long time has said data is the new gold. But they've recently in the last five years changed that narrative. Now they're saying data is the new oil.
05:21
And that's really funny because if data is the new oil, then immediately you get the question, wait, oil gave us global warming, so then what does data give us? So, and I believe that if oil leads to global warming, then data could lead to social cooling. That could be the word that we use for these negative effects of big data. In order to really understand this and go into it,
05:42
we have to look at three things. First, we're gonna talk about the reputation economy, how that system works. In the second chapter, we're gonna look at behavior change: how is it influencing us and changing our behavior? And finally, to not let you all go home depressed, I'm gonna talk about how we can deal with this. So first, the reputation economy.
06:01
Already we've seen today that China is building this new system, the social credit system. It's a system that will give every citizen in China a score that basically represents how well behaved they are. And it will influence your ability to get a job, a loan, a visa, and even a date. For example, the current version of the system,
06:22
Sesame Credit, one of the early prototypes, already gives everybody that wants one a score, but it is also connected to the largest dating website in China. So you can find out about this person that I'm dating: what kind of person is this? Is this someone who's well viewed by Chinese society?
06:42
This is where it gets really heinous for me, because until now you could say, well, these reputation systems, they're fair. If you're a good person, you get a higher score. If you're a bad person, you get a lower score. But it's not that simple. I mean, your friends' scores influence your score, and your score influences your friends' scores. And that's where you really start to see how complex social pressures arise
07:01
and where we can see the effects maybe of data stratification, where people start to think, hey, who are my friends and who should I be friends with? Okay, now you could think that only happens in China. Those Chinese people are different. But the exact same thing is happening here in the West except we're letting the market build it.
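(Aside, not part of the talk itself: a toy Python sketch of the friend-score feedback just described. The names, "behaviour" values, and weights are entirely hypothetical and are not a description of how Sesame Credit actually works; the point is only that once scores feed into each other, one person's low score pulls down their friends' scores.)

```python
# Toy illustration of mutually dependent reputation scores (hypothetical data).
friends = {
    "ana": ["bob", "cem"],
    "bob": ["ana"],
    "cem": ["ana"],
}
behaviour = {"ana": 0.9, "bob": 0.4, "cem": 0.7}  # invented "own behaviour" scores

scores = dict(behaviour)
for _ in range(20):  # iterate until the mutual influence settles
    scores = {
        person: 0.7 * behaviour[person]
                + 0.3 * sum(scores[f] for f in friends[person]) / len(friends[person])
        for person in friends
    }

print(scores)  # ana's score ends up below her own 0.9 because of bob's low score
```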
07:21
I'll give you an example. This is a company called Deemly, a Danish company. And this is their video for their service. She's renting apartments from others and she loves to swap trendy clothes and dresses. She's looking to catch her first lift from a rideshare app, but has no previous reviews to help support her. Luckily, she's just joined Deemly
07:42
where her positive feedback from the other sites appears as a Deemly score, helping her to win a rideshare in no time. Deemly is free to join and supports users across many platforms, helping you to share and benefit from the great reputation you've earned. Imagine the power of using your Deemly score
08:02
alongside your CV for a job application. Like in China. Perhaps to help get a bank loan. Like in China. Or even to link to from your dating profile. Like in China. Sign up now at deemly.co. Deemly, better your sharing. Thanks.
08:21
There is a difference though. The funny thing here is that it's highly invisible to us. The Chinese government is very open about what they're building, but here we are very blind to what's going on. So mostly when we talk about these things, we're talking about these systems that give us a very clear rating, like Airbnb, Uber, and of course the Chinese system.
08:40
But the thing is, most of these systems are invisible to us. There's a huge market of data brokers who are, you know, not visible to you because you are not the customer. You are the product. And the data brokers, well what they do is they gather as much data as possible about you and that's not all. They then create up to 8,000 scores about you.
09:03
In the United States, these companies have up to 8,000 scores on you, and in Europe it's a little less, around 600. These are scores about things like your IQ, your psychological profile, your gullibility, your religion, your estimated lifespan. 8,000 of these different things about you.
09:21
And how does that work? Well, it works by machine learning. So machine learning algorithms can find patterns in society that we really cannot anticipate. For example, let's say you're a diabetic. And well, let's say this data broker company has a mailing list or an app that diabetic patients use.
09:42
And they also have the data of these diabetic patients, what they do on Facebook. Well then you can start to see correlations. So if diabetic patients more often like gangster rap and pottery on Facebook, well then you could deduce from that that if you also like gangster rap or pottery on Facebook, then perhaps you also are more likely to have or get diabetes.
10:03
This is highly unscientific, but this is how this system works. And this is an example of how that works with just your Facebook likes. You see, accuracy was lowest, about 60%, when it came to predicting whether a user's parents were still together when they were 21. People whose parents divorced before they were 21
10:21
tended to like statements about relationships. Drug users were ID'd with about 65% accuracy. Smokers with 73% and drinkers with 70%. Sexual orientation was also easier to distinguish among men. 88% right there. For women, it was about 75%. Gender, by the way, race, religion and political views
10:43
were predicted with high accuracy as well. For instance, white versus black, 95%. So the important thing to understand here is that this isn't really about your data anymore. Like often times when we talk about data protection, we talk about, oh, I wanna keep control of my data.
11:00
But this is their data. This is data that they deduce, that they derive from your data. These are opinions about you. And these things are what make it so that even though you never filled in a psychological test, they'd have one. A great example of that and how that's used is a company called Cambridge Analytica.
11:21
This company has created detailed profiles about us through something, through what they call psychographics. And I'll let them explain it themselves. By having hundreds and hundreds of thousands of Americans undertake this survey, we were able to form a model to predict the personality of every single adult
11:41
in the United States of America. If you know the personality of the people you're targeting, you can nuance your messaging to resonate more effectively with those key audience groups. So for a highly neurotic and conscientious audience, you're gonna need a message that is rational
12:00
and fear-based or emotionally based. In this case, the threat of a burglary and the insurance policy of a gun is very persuasive. And we can see where these people are on the map. If we wanted to drill down further, we could resolve the data to an individual level, where we have somewhere close to four or five thousand data points
12:20
on every adult in the United States. So yeah, this is the company that worked both on the Brexit campaign and on the Trump campaign. Of course, a little after the Trump campaign, all the data was leaked. So data on 200 million Americans was leaked, and in it you can see this data described as modeled voter ethnicities and religions.
12:42
So this is just derived data. You might think that when you go online and use Facebook and all these services, advertisers are paying for you. That's a common misperception. That's not really the case. What's really going on is that, according to FTC research, the majority of the money made in this data broker market is made from risk management.
13:02
So in a way, you could say that it's not really marketers that are paying for you. It's your bank. It's insurers. It's your employer. It's governments. These kinds of organizations are the ones who buy these profiles the most, more than the others. Of course, the promise of big data
13:21
is that you can then manage risk. Big data is the idea that with data, you can understand things and then manage them. So the real innovation in this big data world, in this data economy, is the democratization of the background check. That's really the core of this market: that now you can find out everything about everyone. So, yeah,
13:42
in the past, perhaps only your bank could know your credit score, but now your greengrocer knows your psychological profile. That's a new level of, yeah, what's going on here. This market is not only invisible, but it's also huge. According to the same FTC research, this market was already worth $150 billion in 2015.
14:04
So it's invisible, it's huge, and hardly anyone knows about it. But that's probably gonna change. That brings us to the second part, behavior change. We already see the first part of this, how behavioral change is happening through these systems,
14:21
and that's through outside influence. And we've talked a lot about this at this conference. For example, we see how Facebook and advertisers try to do that. We've also seen how China is doing that, trying to influence you. Russia has recently tried to use Facebook to influence the elections, and of course companies like Cambridge Analytica try to do the same thing. And here you can have a debate on, to what extent are they really influencing us?
14:42
But I think that's not actually the most interesting question. What interests me most of all is how we are doing it ourselves, how we are creating new forms of self-censorship and are proactively anticipating these systems. Because once you realize that this is really about risk management,
15:00
and this is about banks and employers trying to understand you, when people start to understand that, this will go beyond click fear, which you might remember. This will become, you know, when people find out, not getting a job, for example. This will be about getting really expensive insurance. This will be about all these kinds of problems, and people are increasingly finding this out.
15:21
So for example, in the United States, the IRS is now using data profiles to find out who they should audit. So I was talking recently to a girl and she said, oh, I recently posted a negative tweet about the IRS, and she immediately grabbed her phone to delete it once she realized that, you know,
15:42
this could now be used against her in a way. And that's the problem. Of course, you see all kinds of other crazy examples that the wider public is picking up on, like, ooh, so we now have algorithms that can find out if you're gay or not. These things scare people, and these things are something you have to understand.
16:02
So chilling effects, this is what this boils down to. For me, more important than the influences of these big companies and nation states is how people themselves are experiencing these chilling effects, like you yourselves have as well. That brings us back to social cooling. For me, social cooling is about these two things combined.
16:22
On the one hand, this increasing ability of agents and groups to influence you. On the other hand, the increasing willingness of people themselves to change their own behavior, to proactively engage with this issue. There are three long-term consequences that I wanna dive into.
16:41
The first is how this affects the individual. The second is how it affects society. And the third is how it affects the market. So let's look at the individual. Here, we've already seen there's a rising culture of self-centership. It started for me with an article that I read in the New York Times where a student was saying, well, we're very, very reserved.
17:01
She was talking about going to things like spring break. She said, well, you don't wanna have to defend yourself later, so you don't do it. And what she's talking about, she's talking about doing crazy things, letting go, having fun. She's worried that the next day it'll be on Facebook. So what's happening here is that you do have all kinds of freedoms. You have the freedom to look up things. You have the freedom to say things, but you're hesitating to use them.
17:22
And that's really insidious. That has an effect on a wider society. And here, we really see the societal value of privacy because in society, often minority values later become majority values. An example is weed. I'm from the Netherlands.
17:41
And there you see, you know, at first, it's something that you just don't do and it's, you know, a bit taboo. But then, oh, maybe you should try it as well, and then people try it, and slowly, under the surface of society, people change their minds about these things. And then after a while, it's like, you know, what are we still worried about? And this same pattern happens, of course, with way bigger things, like this.
18:02
I must honestly say to you that I never intend to adjust myself to racial segregation and discrimination. This is the same pattern that's happening for all kinds of things that change in society. And that's why privacy is so important. And that's why it's so important that people have the ability to look things up
18:21
and to change their minds and to talk with each other without feeling so watched all the time. The third thing is how this impacts the market. And here we see very much the rise of a culture of risk avoidance. An example here is that already in 1995, doctors in New York were given scores. And what happened was that the doctors
18:40
who tried to help advanced-stage cancer patients, complex patients, who tried to do the difficult operations, got a low score because those patients more often died. Well, doctors that didn't lift a finger and didn't try to help got a high score because, well, people didn't die. So you see here that these systems bring all kinds of perverse incentives.
19:03
They lower the willingness of everybody to take risks. And in some areas of society, we really like people to take risks, like entrepreneurs and doctors. So on the whole, you could say that what we're seeing here is some kind of trickle-down risk aversion, where the way that companies and governments
19:23
want to manage risk, that's trickling down to us. And we, of course, want them to like us, want to have a job, we want to have insurance. And then we increasingly start to think, oh, maybe I should not do this. It's a subtle effect. So how do we deal with this? Well, together, I think this is a really big problem.
19:41
I think this is such a big problem that it can't be managed by just some hackers or nerds building something or by politicians making a law. This is a really society-wide problem. So I wanna talk about all these groups that should get into this. The public, politicians, business, and us, finally. So the public. I think we have to talk about
20:01
and maybe extend the metaphor of the cloud and say, we have to learn to see the stars behind the cloud. That's one way that we could, that's a narrative we could use. I really like to use humor to explain this to a wider audience. So for example, last year, I was part of an exhibit, I helped develop an exhibit about dubious devices.
20:21
And one of the devices there was called Taste Your Status, which was a coffee machine that gives you coffee based on your area code. So if you live in a good area code, you get nice coffee. You live in a bad area code, you get bad coffee. I won't go into it, but these,
20:41
like oftentimes you can use humor to explain these things to a wider audience. I really like that approach. We've got a long way to go though. I mean, look at how long it took for us to understand global warming, to really come to a stage where most people understand what it is and care about it, except Donald Trump. Well, with data, we've really got a long way to go. We're really at the beginning of understanding this issue.
21:03
Okay, so the second group that has to really wake up is politicians. And they have to understand that this is really about the balance of power. This is really about power. And if you'll permit me, I'll go into the big picture a little bit as a media theorist. So this is Gilles Deleuze. He's a French philosopher.
21:21
And he explained in his work something that I find really useful. He said, you have two systems of control in society. And the one is the institutional one. And it's the one that we all know. You know, the judicial system. So you're free to do what you want, but then you cross a line, you cross a law, then the police gets you, you go in front of a judge, you go to prison.
21:40
This is a system we all understand. But he says there's another system, which is the social system. This is the social pressure system. And for a long time this wasn't really designed, but now, increasingly, we are able to design it. So this is a system where you perform suboptimal behavior, and then that gets measured and judged, and then you get subtly nudged in the right direction. And there are some very important differences between these two systems.
22:01
The institutional system, you know, has this idea that you're a free citizen that makes up your own mind. And, you know, the social system is like, that's working all the time, constantly. It doesn't matter if you're guilty or innocent, it's always trying to push you. The old system, the institutional system is very much about punishment. So if you break the rules, you get punishment.
22:21
But people sometimes don't really care about punishment. Sometimes it's cool to get punishment. But the social system uses something way more powerful, which is the fear of exclusion. We are social animals, and we really care to belong to a group. The other difference is that it's very important that the institutional system is accountable, you know, democratically to us.
22:40
While the social system at the moment is really, really invisible, like these algorithms, how they work, where the data is going, it's very hard to understand. Of course, it's exactly what China loves so much about it, right? You can stand in front of a tank, but you can't really stand in front of the cloud. So, yeah, that's great. It also helps me to understand when people say I have nothing to hide. I really understand that,
23:01
because when people say I have nothing to hide, what they're saying is, I have nothing to hide from the old system, from the classic system, from the institutional system. They're saying, I want to help the police. I trust our government, I trust our institutions. And that's actually really a positive thing to say. The thing is they don't really see the other part of the system, how increasingly there are parts that are not controlled, that are not democratically checked,
23:21
and that's really a problem. So the third group that I think has to wake up is business. Business has to see that this is not so much a problem, perhaps, but that it could be an opportunity. I'm still looking for a metaphor here, but perhaps if we, again, compare this issue to global warming, we could say that we need something
23:41
like ecological food, but for data. But I don't know what that's gonna look like, or how we're gonna explain that. Maybe we have to talk about fast data versus ecological data, like fast food, but we need a metaphor here. Of course, laws are also really helpful. So we might get things like this.
24:07
And, I'm actually working on this, it's funny. Or if things go really out of hand, we might get here. So luckily we see that in Europe,
24:21
the politicians are awake, and are really trying to push this market. I think that's really great. So I think in the future we'll get to a moment where people say, well, I prefer European smart products, for example. I think that's a good thing. I think this is really positive. Finally, I wanna get to all of us, what each of us can do. I think here, again, there's a parallel to global warming, where at its core, it's not so much
24:40
about the new technology and all the issues, it's about a new mindset, a new way of looking at the world. Here, I think we have to stop saying that we have nothing to hide, for example. If I've learned anything in the past years of understanding and researching privacy and this big data market, it is that privacy is the right to be imperfect. Increasingly, there's pressure to be the perfect citizen,
25:01
to be the perfect consumer, and privacy is a way of getting out of that. So this is how I would reframe privacy: not just as being about which bits and bytes go where, but as a human right to be imperfect, because of course, we are human, we are all imperfect. And sometimes when I talk at technology conferences, people say, well, privacy was just a phase. It's like ebb and flow,
25:21
and we had it, and it's gonna go away again. I'm like, that's crazy. You don't say women's rights were just a phase, we had them for a while, and they're gonna go away again. And of course, Edward Snowden explains it way better: arguing that you don't care about the right to privacy because you have nothing to hide is no different than saying you don't care about free speech because you have nothing to say.
25:40
What an eloquent system admin. So I think what we have to strive for here is that we develop a more nuanced understanding of all these issues. I think we have to go away from this idea that more data is better. Data is automatically progress. No, it's not. Data is a trade-off. For example, for the individual, more data might mean less psychological security,
26:01
less willingness to share, less willingness to try things. For a country, it might mean less autonomy for citizens, and citizens need their autonomy. They need to know what's going on. They need to be able to vote in their own autonomous way and decide what they want. In business, you could say more data might lead to less creativity, less willingness to share new ideas,
26:21
ways to come up with new ideas. So that's, again, an issue there. So, in conclusion, social cooling is a way of understanding these issues, or a way of framing these issues that I think could be useful for us, that could help us understand and engage with these issues. And yes, social cooling is an alarm. It's alarmist.
26:40
It is, I'm trying to say, this is the problem, and we have to deal with this. But it's also really about hope. I trust, not so much in technology, I trust in us, in people, that we can fix this once we understand the issue, in the same way that, when we understood the problem with global warming, we started to deal with it. It's slow progress, but we're doing that. And we can do the same thing with data.
27:00
It'll take a while, but we'll get there. And finally, this is about starting to understand the difference between shallow optimism and deep optimism. Oftentimes, the technology sector is like, ah, cool, new technology, and we're gonna fix this by creating an app. And for me, that's, you know, we have to be optimistic, but that's very shallow optimism, that TEDx make optimism. True optimism recognizes that
27:22
each technology comes with a downside, and we have to recognize that it's not a problem to point out these problems. That's a good thing. Once you understand the problems, you can deal with them and come up with better solutions. If we don't change this mindset, then we might create a world where we're all more well-behaved, but perhaps also a little bit less human.
27:43
Thank you.
28:02
You're welcome, you're welcome. That's the shit out of it. We still have five more minutes. We'll take some questions if you like. First, microphone number two. Hello, thanks, that was a really interesting talk.
28:23
I have a question that I hope will work. It's a bit complicated. There's a project called Indy, by a foundation called the Sovrin Foundation. Do you know about it? Okay, very great, perfect. So just to quickly explain, these people want to create an identity layer that will be self-sovereign,
28:41
which means people can reveal what they want about themselves only when they want, but it is one unique identity on the entire internet. So that can potentially be very liberating, because you control all your identity and individual data, but at the same time, it could be used to enable something like the personal scores you were showing earlier on. So it made me think about that,
29:01
and I wondered if you had an opinion on this. Yes, well, the first thing I think about is that, as I tried to explain, you see a lot of initiatives that are about, oh, you have to control your own data, but that's really missing the point that it's no longer really about your data. It's about this derived data. Of course, it can help to manage what you share, so then they can't derive anything from it,
29:22
but I see too little of that awareness. Second of all, this is very much, for me, an example of what nerds and technologists are really good at. It's like, oh, we've got a social problem. Let's create a technology app, and then we'll fix it. Well, what I'm trying to explain is that this is such a big problem that we cannot fix it with just one group alone, not the politicians, not the designers, not the nerds.
29:42
This is something that we have to really get together on, grasp together, fix together, because this is such a fundamental issue. The idea that risk is a problem, that we wanna manage risk, is so deeply ingrained in people, it's so based in fear. It's fundamental, and it's everywhere. So it's not enough for one group to try to fix that. It's something that we have to come to grips with together.
30:03
Thanks a lot. Okay, the Signal Angel has a question from the internet, I think. Yes, Barking Sheep is asking, do you think there's a relationship between self-censorship and echo chambers, in the sense that people become afraid to challenge their own beliefs, and thus isolate themselves in groups with the same ideology?
30:24
There's a really big answer to that one. I was actually emailing Vint Cerf, and miraculously, he responded, and he said, what you really have to look at is not just the reputation economy, but also the attention economy, and how they're linked.
30:41
So for a while, I've been looking for that link, and there's a lot to say there, and there definitely is a link. I think what's important to understand, or where I want to add nuance here, is that I'm not saying that everybody will become really well-behaved, gray, bookworm people. The thing is that what this situation is creating
31:00
is that we're all becoming theater players. We're all playing an identity more and more, because we're watched more of the time. And for some people, I think most people, that will mean being more conservative and more careful. Some people will go really all out, and, oh, enjoy the stage. We have those people as well. And I think those people could really benefit, and that the attention economy could really
31:22
give them a lot of attention through that. So I think there's a link there. I could go on more, but I think that's where I'm at for now. Okay, we're short on time. We'll take, I'm sorry, one more question, at microphone number one. So I think the audience you're talking to here
31:41
is already very aware, but I'm asking for tactics or your tips to spread your message and to talk to people that are in this phase saying, ah, I don't care, they can surveil me. Like, what's your approach? Like, in a practical way, how do you actually do this?
32:02
Yeah. So I'm really glad to be here, because I am, yes, I am a nerd, but I'm also a philosopher, or a thinker. And that means that for me, what I work with is not just Arduinos, but words and ideas. And those, as I've been trying to show, can be really powerful. Like, a word can be a really powerful way to frame a debate or engage people.
32:26
So I haven't yet found a way to compress this whole talk. Like, I was making a joke that I can tell you in one sentence what privacy is and why it matters, but I have to give a whole talk before that. Privacy is the right to be imperfect, but in order to understand that, you have to understand the rise of the reputation economy
32:42
and how that affects your chances in life. The fun thing is that that will happen by itself, that people will become more aware of it. They will run into these problems. They will not get a job, or they might run into other issues. And then they will start to see the problem. And so my goal is not so much to help people understand it, but to help them understand it before they run into the wall, right?
33:02
That's how society at the moment usually deals with technology problems. It's like, oh, it's a problem. Oh, well, now we'll try to fix it. Well, I believe you can really see these problems coming way earlier, and I think the humanities, where I'm from, are really helpful in that, like Deleuze really clearly explaining
33:21
what the problem is, in 1995. So yeah, I think that, I don't have a short way of explaining why privacy matters, but I think it'll become easier over time as people start to really feel these pressures. Sorry, thank you very much for the question.
33:41
I think we all should go out and spread the message. This talk is over, I'm awfully sorry. When you people leave, please take your bottles and your cups and all your junk, and thank you very much again. Thank you, Tijmen Schep.