Freiheit und Vorhersage (Freedom and Prediction) [English Translation]
Formal Metadata
Title: Freiheit und Vorhersage
Title of Series: re:publica 2014 (Part 7 of 126)
Number of Parts: 126
License: CC Attribution - ShareAlike 3.0 Germany: You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor and the work or content is shared also in adapted form only under the conditions of this license.
Identifiers: 10.5446/33444 (DOI)
Transcript: English (auto-generated)
00:15
So, about freedom, and prediction, and the ethical boundaries of big data, and it's
00:25
a serious topic. For that reason, I have brought you 620 slides; we will talk for about four hours, then take a short break to go to the loo, and then
00:41
we will continue. I've already cleared that with the conference organisers. Freedom and prediction. Freedom and prediction in terms of big data: I want to tell you two stories. The first story is ... it seems my computer is set up wrongly
01:08
here. Sorry.
01:26
So, that's what I need. The first story is about a Canadian psychotherapist, Andrew Feldmar. The
01:47
man, who lives in Vancouver, wrote an essay in 2001 about having taken LSD in the 60s. In summer 2006, he wanted to cross the Canadian-US border
02:09
to pick up a friend from Seattle. The border police googled Feldmar and found the article from 2001, and because Andrew Feldmar had admitted his drug consumption in the 60s in writing,
02:32
he immediately admitted his drug consumption. He was interrogated for three hours, then his fingerprints were taken, and then he
02:41
was barred from entering the US forever. The second story. Every day, every week, commissions decide about inmates, whether they
03:03
are released on probation. There is an automatic prediction method that determines whether a certain inmate should be released, by estimating whether this inmate will be involved in a violent crime
03:33
within the next twelve months. Both stories rely on what we call big data, and both change what we understand
03:48
when we say time. They differ only in the attention we give them.
04:01
All our attention so far has gone to cases like the first one, the border patrol, and far less to the second one; and exactly that could destroy our society, our way of living, our freedom, and we should get active against it.
04:20
This is my thesis; allow me to explain. Let's begin with the first similarity: big data. Feldmar was googled, and that is big data. A prediction determines whether an inmate is released on probation.
04:43
This prediction, too, is made not by a criminologist or a social worker, but by big data. But what is big data? Intuitively, when we talk about big data we think of sheer masses of data, an absolute quantity
05:06
of data, and there is some truth to that. It began one or two decades ago in the natural sciences, for example astronomy. In the year 2000, the Sloan Digital Sky Survey started running.
05:28
It collected more astronomical data within its first couple of weeks of running than had been collected in the whole history of astronomy before.
05:41
Across the last 15 years, this telescope has collected more than 200 terabytes of data about astronomical phenomena, and its successor telescope (shown here as a digital rendering), which is planned to start in 2016, will collect those 200 terabytes every five days.
06:05
Or think about ourselves, about our DNA, which is different for each of us. Every one of us carries DNA with three billion base pairs, and about 11 years ago,
06:29
the world celebrated a huge success: the human genome had been completely sequenced. Sequencing this one genome cost $1 billion, took more than ten
06:50
years, and needed a consortium of international researchers. Today, it takes two to three days and costs less than $2,000. Internet companies collect huge amounts of data every day.
07:05
There are more than 400 million tweets per day, 800 million YouTube users who together upload one hour of video every second, 10 million photos uploaded to Facebook per hour,
07:22
and Google handles one petabyte of data per day. What did you have for breakfast? I had one and a half petabytes. So how much is a petabyte of data? How many petabytes does the world have?
07:42
A petabyte of data is as much as all the books, magazines and films of the largest library in the world, the Library of Congress, contain in total. If we look at the amount
08:01
of data and how it has evolved within the last 25 years, our best estimate is that in 1987 we had less than three billion gigabytes of data in the world.
08:21
In 2007, it was about 100 times more than in 1987, and, if we try to find a time in history
08:47
when there was a comparable rise in the amount of data in the world, we have to follow the historian Elizabeth Eisenstein, who says it was probably 1453 to 1503, just after Gutenberg introduced his printing
09:08
press, and the amount of information in the world roughly doubled. Now we have a factor of ten.
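As a quick sanity check on these growth figures, here is a few lines of Python using only the numbers just quoted (roughly three billion gigabytes in 1987, about 100 times more by 2007); they show the implied doubling time, next to the roughly fifty years the stock of information took to double after Gutenberg:

```python
import math

# Figures from the talk: ~3 billion gigabytes in 1987, ~100x more by 2007.
factor, years = 100, 20

# Implied doubling time of the world's data stock.
doubling_time = years * math.log(2) / math.log(factor)   # ~3 years
# Growth per decade -- the "factor of ten" mentioned above.
per_decade = factor ** (10 / years)                      # ~10x

print(f"doubling time: {doubling_time:.1f} years")
print(f"growth per decade: {per_decade:.0f}x")
# After Gutenberg, by contrast, the stock of information took
# about fifty years to merely double.
```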
09:28
So, you can see here: in 1980 everything was still analogue; even in 2007 there were still about 8.6 billion gigabytes of analogue data, but by now this
09:44
is less than one per cent of the total, as opposed to 50 per cent 20 years before, which demonstrates the transition to what we can describe as a digital society. So, why is this rise in quantity important?
10:04
Because it can lead to a change in quality, a transition. Take this example: if you take a single photo of a galloping horse, it comes out still, but
10:20
if you take a lot of photos, say one every sixteenth of a second, and edit them into a montage, a new quality arises just through the sequence of data being put
10:44
together in this way: a film. So, big data is determined by the following factors; my keywords for this are more, messy, and correlation. Let me explain. More just means we're collecting more data in relation to the phenomenon we're examining, as opposed to the world
11:09
of small data, where we worked with samples, a lot of smaller samples. This is an advantage in the sense that we can ask questions we didn't yet know
11:22
were interesting, that we can go into detail and actually let the data speak for itself. So, the guy in the second row: he's taking a photograph. If I were to take a photograph of you now (this is my audience, finally),
11:49
I would have to decide whom I want to focus on. Now I've found someone in a yellow polo shirt, and I disappoint someone in row six, so I'm
12:07
making my choice now, and obviously the selection takes place the moment I take the photo, the moment I press the shutter, and I cannot reverse this decision afterwards
12:27
and get the people in the background unblurred. This is a nice photograph I made of my four-year-old son. I cannot un-blur it, because it is a normal photo; it was not made with big data.
12:52
Well, this camera is a special big data camera.
13:04
With it, I can go in the opposite direction: even though the original photograph wasn't in focus, I can click on the toothbrush afterwards and bring it into focus, a decision we usually have to have made already while we're
13:24
collecting the data. So, the second point, messy: our data is not as sharp or as clearly bounded as we may be used to at smaller scales, so we have a lot of garbage data, but,
13:48
of course, as you increase the quantity, the weight of errors changes in relation to it accordingly; in a sample of three billion, a single wrong value
14:02
carries far less weight. So, my next point is correlation. Correlations are apparent connections between data points. They don't actually replace causality: things that correlate may really be connected to each other, or may merely
14:25
seem connected, and we have to discern correlation from causality. Think of Amazon's and Google's product-recommendation algorithms: they're based on correlations.
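To make that concrete, here is a minimal sketch of such a correlation-based recommender, with invented purchase baskets; it merely counts which items are bought together and knows nothing about why:

```python
from collections import Counter
from itertools import combinations

# Invented purchase baskets -- stand-ins for real transaction logs.
baskets = [
    {"camera", "sd_card", "tripod"},
    {"camera", "sd_card"},
    {"novel", "bookmark"},
    {"camera", "tripod"},
]

# Count how often each pair of items is bought together.
pair_counts = Counter()
for basket in baskets:
    for a, b in combinations(sorted(basket), 2):
        pair_counts[(a, b)] += 1

def recommend(item, n=3):
    """Suggest the items that most often co-occur with `item`."""
    scores = Counter()
    for (a, b), count in pair_counts.items():
        if a == item:
            scores[b] += count
        elif b == item:
            scores[a] += count
    return [other for other, _ in scores.most_common(n)]

print(recommend("camera"))  # ['sd_card', 'tripod'] -- correlation, not causation
```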
14:41
These are not causal connections: the suggestion that you might like something expresses a probability, a seemingly close relation between different points of data. In the 1950s, the American secret services collected
15:06
a lot of writings in Cyrillic, but often didn't have enough translators. What did they do? Instead of hiring more translators, which would have been the easiest option,
15:22
they contacted computer scientists, and the computer scientists smelled research funding and said it would be very easy: we can write an automatic machine translation system, it will be finished in three months; we just feed the machine with
15:45
200 grammatical rules and a dictionary, and it will be done. Thirteen years and, in today's money, $1 billion later, the project was declared a failure and stopped.
16:01
Nothing happened in machine translation for a while. This was, by the way, not the first time the Pentagon wrote off $1 billion. Then, in the 1980s, IBM had a completely different idea. They said: we don't want to teach the computer to speak Russian or German or French;
16:25
why don't we just build statistics on which word is used in which context in one language, and how it is rendered in the other language?
16:42
To compute these statistics they needed training texts, and they used the speeches of the Canadian Parliament, three million of which exist translated
17:03
from English into French. From these they derived the statistical data, and what came out was a machine-learning translation programme that didn't even understand
17:21
the language, and yet the result was way better than the rule-based approach of the 50s and 60s. IBM then gave up the project, but roughly 15 years later a small start-up that
17:41
everybody knows, based in Mountain View, approached the same problem. Google didn't have the parliamentary protocols; it had the whole world wide web. Everything they have is text on the internet: every website of the institutions of the European Union, in all the languages of the European
18:07
Union (finally those translations are worth something), and they were harvested, along with downloadable manuals for video recorders and washing machines and phones, and everything else.
18:22
When I look at the manual for my digital video recorder, I don't understand a thing, and yet it was used to train Google Translate, because the sheer amount of data simply absorbed the
18:47
fuzziness, and what fell out was a usable product called Google Translate. A lot of data can make a huge qualitative difference. Google found out, as IBM
19:00
had too, that data is more important than algorithms; much, massively more important. When IBM started its project in the 80s, they tried to improve the algorithm, but that didn't lead to better results.
19:22
Google's chief engineer responsible for this said: we don't care about the algorithm. The algorithm can be very primitive; the point is the data. A million times more data, and the algorithm plays almost no role.
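Here is a toy illustration of that statistical idea; this is not IBM's or Google's actual system, just a naive co-occurrence count over an invented three-sentence parallel corpus, which nonetheless "learns" word translations without understanding either language:

```python
from collections import Counter

# An invented mini stand-in for the parliamentary parallel texts.
parallel = [
    ("the house is green", "das haus ist gruen"),
    ("the house is old",   "das haus ist alt"),
    ("the tree is green",  "der baum ist gruen"),
]

cooc = Counter()     # sentence-level co-occurrence of word pairs
de_freq = Counter()  # overall frequency of each German word
for en, de in parallel:
    de_words = de.split()
    de_freq.update(de_words)
    for e in en.split():
        for d in de_words:
            cooc[(e, d)] += 1

def translate(word):
    """Pick the German word that co-occurs with `word` more often
    than its overall frequency would suggest (a crude association score)."""
    scores = {d: n / de_freq[d] for (e, d), n in cooc.items() if e == word}
    return max(scores, key=scores.get)

print(translate("green"))  # -> 'gruen'
print(translate("old"))    # -> 'alt'
```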
19:41
More, messy, and correlation: those are the core qualities of big data. So how has big data been used in the two examples I showed you, and with what consequences?
20:03
Let's take Google and the example of Andrew Feldmar. The border patrol googled him, and Google became an infrastructure of surveillance in which things from the past suddenly come back.
20:24
About this type of big data, this quality of surveillance, we have learned a lot (shockingly much) in the last couple of months, and it amounts to nothing less than the end of forgetting, something that I
20:48
have been thinking about for years now. The second story is about something a little different. The crime-forecast systems that I told you about, which decide whether someone is released
21:03
on probation or not, are used in 30 American states. This is not an infrastructure of surveillance; it's an infrastructure of prediction. It can predict in better and better detail what is going to happen.
21:25
It's a time-travelling device: from the past and the present we infer the future, just as from the weather of the past we want to predict the weather of the future.
21:41
Predictions lower risks and make the world more calculable. They take away the surprise, we can prepare, and society can protect
22:02
itself, ourselves and others; that's why predictions have become so important in our risk-minimising society. Big data predicts future human behaviour surprisingly exactly.
22:23
It predicts how we will behave in the future, not just how we have behaved in the past, and that would allow the state to punish us for what we will be doing,
22:40
even before we have committed a crime. If you think of Minority Report, that is exactly the kind of prediction-driven future I mean.
23:01
And that, you see, is quite a tempting thought, really: isn't it always better to prevent a crime than to punish someone afterwards? If the murder never happens, there is no victim. Wouldn't that be desirable?
23:27
And yet this temptation leads us down the wrong path, because at its end
23:40
we would be punishing people who might never actually have committed the crime: false positives. We would have the opposite case; we would have punishment without any guilt.
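The scale of this false-positive problem follows from simple base-rate arithmetic. The figures below are invented purely for illustration: suppose one per cent of inmates would actually commit a violent crime, and the predictor is right 90 per cent of the time on both groups:

```python
# Invented, illustrative figures -- not from any real system.
base_rate   = 0.01  # share of inmates who would actually reoffend violently
sensitivity = 0.90  # probability of flagging a future offender
specificity = 0.90  # probability of clearing a non-offender

true_pos  = base_rate * sensitivity              # correctly flagged
false_pos = (1 - base_rate) * (1 - specificity)  # wrongly flagged

wrongly_flagged = false_pos / (true_pos + false_pos)
print(f"{wrongly_flagged:.0%} of flagged inmates would never have offended")
# -> 92% of flagged inmates would never have offended
```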
24:03
On a more general scale, we would be denying the individual their free will, because we would just be assuming, predicting, the way this person is going to act in the future.
24:24
But how could this person, here and now, ever prove that they would not have committed the deed?
24:41
In a situation like this, the suspicion, the prediction itself, becomes synonymous with the punishment.
25:01
This touches the very core of human responsibility. If we are punished based on prediction alone, it would be the end of responsibility in the relation between the individual and the state.
25:21
And if we take free will out of the equation, if I don't have free will, if it is denied to me, then I can't be guilty. Guilt only makes sense if I can decide, if I can make a free decision,
25:48
including a decision for the wrong thing. So, in a way, with this prediction-based attitude we're also reaching the end of the whole concept of guilt, crime and punishment.
26:04
We would basically have rendered human responsibility obsolete. And this is, in a way, the end of human freedom of action, at least in our relation to
26:27
society. And obviously, on a legal scale, on a governmental scale, to put it in
26:44
Merkel's words: this is something we cannot really desire on a societal or political level.
27:02
The question is how people would react if we suddenly had someone talk to a person who is about to commit a crime, until the risk has passed. What would the neighbours think? What would the others think if a police car were parked in front
27:25
of your house until you can't commit the crime any more? The question is where our freedom of decision ends, and how our freedom is curtailed.
27:41
If the police stand in front of my house, I can no longer decide for or against committing the crime; I have no chance to decide whether
28:00
or not to commit it. If, instead, a social worker or a police officer talks to me, I can still decide to commit the crime afterwards, even though I might consider the visit itself a punishment. The other question is: which measures by commercial companies amount to a punishment?
28:28
Maybe someone will not be allowed a driver's licence, because big data says that they are going to be a bad driver; that's clearly
28:40
a punishment. But what if the person gets a driver's licence but no car insurance for the same reasons, and so cannot drive a car? That is still a punishment. And what if the person does get insurance, but the premium
29:02
is twice as high, even though the person has never actually caused an accident? The core question is what consequences are attached to big data, and how the freedom
29:23
of action of the individual is hindered by them. The question is not what big data can do, but how it is used. To punish, to hold people accountable for actions they have not yet taken, means that we are misusing correlation
29:56
as a stand-in for a causality that does not exist, because correlations cannot tell us anything about
30:06
causes and reasons; and that makes it really difficult to use big data correlations to tell who is guilty.
30:20
Unfortunately, we humans are prone to confuse correlation and causality, and unfortunately big data is bound to be abused for exactly that. This brings us to an important interim conclusion.
30:45
Big data, as our examples showed, doesn't only give us advantages, but also really grave disadvantages, and while surveillance and prediction might feel different,
31:04
their effects are directed at the same thing: time, or, to phrase it better, the way we understand and feel time.
31:24
Thousands of years ago, people understood time in a circular way: constantly repeating cycles. If time is circular, then we humans don't have a proper role in this world, because
31:44
whatever we do, the cycle of time will repeat again and again, every year, every life. Then it is not worth shaping our world; what happened yesterday is guaranteed for tomorrow.
32:04
That suits especially the powerful. Linear time, by contrast, is revolutionary: it gives us humans a role. Linearity of time is not only
32:24
remembrance, it is what we remember; it is the foundation of the Enlightenment and of revolution. And yet we never fully embraced the linearity of time.
32:46
For many years, we enjoyed a time that would forget, where people could leave
33:03
their past behind them by emigrating across the ocean, and then westwards, and time forgot. At the same time, our future opened up more and more, through new ways of working and producing. Exactly this is closing again with
33:31
the change to big data. Big data binds us to our past; we can't flee from
33:41
our past, and we can't reshape our past in retrospect. The mere passing of time no longer frees us: we remain who we were for much longer than before,
34:01
and prediction removes that chance too; it takes from us the opportunity of acting according to our own wishes and then being judged by society for our actual actions. So this is the time factor as I see it in the age of big data, and we
34:24
have to rethink the way we talk about time as well. So now I would like to come to the question: what can we do? I would like to make some quite concrete suggestions for how we can actually control big data, how
34:42
we can reconquer time. First, I suggest we create forgetting tools. Think of the success of Snapchat: that is a forgetting tool; that is what I mean.
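A minimal sketch of what a forgetting tool could look like at its core; this is purely illustrative, not how Snapchat or any real product is implemented: every entry carries an expiry time and vanishes afterwards.

```python
import time

class ForgettingStore:
    """A data store whose entries expire -- data with a built-in end."""

    def __init__(self):
        self._data = {}

    def put(self, key, value, ttl_seconds):
        # Remember the value together with the moment it should be forgotten.
        self._data[key] = (value, time.time() + ttl_seconds)

    def get(self, key):
        # Return the value, or None (and purge it) once it has expired.
        value, expires_at = self._data.get(key, (None, 0.0))
        if time.time() >= expires_at:
            self._data.pop(key, None)
            return None
        return value

store = ForgettingStore()
store.put("photo", b"...", ttl_seconds=2)
print(store.get("photo"))  # b'...' -- still remembered
time.sleep(2.5)
print(store.get("photo"))  # None -- forgotten
```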
35:02
But, at the same time, we have to strengthen privacy rights and data security, far beyond what we have at present.
35:28
So, we have a right to digital self-determination. This is basically equivalent to a human right, and I would like to tell you an anecdote.
35:42
When I wanted to open a bank account in Berlin, I went to the bank and sat down with them, and they showed me several forms, among them the data protection agreement.
36:01
I would like to amend one point, I said, and changed article 18, and the bank clerk looked at me as if I had just made an obscene joke and said: do you want to open a bank account? Then sign, otherwise you will have to leave.
36:23
But I do want to open a bank account, I said; I am just exercising my right to digital self-determination and privacy. And she repeated: do you want to open a bank account? So much for the reality of deciding for yourself. Well, not really.
36:46
That's the situation in this country. We need privacy rights to be really, seriously strengthened in that direction.
37:02
We also need responsibility on the part of data users: those who are actually producing and adding value by working with data should carry a corresponding responsibility, I think.
37:23
Secondly, a digital ... we need an agenda for the 21st century. We need to protect and strengthen fair trials and independent courts.
37:43
We need new fundamental rights that protect us against prediction, and we need protection of human freedom of action. What I would like as a very concrete measure is a red line:
38:06
no punishment without any form of guilt. And in every area that isn't directly related to crime and punishment, decisions like
38:22
whether someone gets a loan, a medical treatment, a bank account, I think we need very different approaches: transparency and accountability, certificates for example.
38:45
Data analyses could be certified, could receive a kind of quality seal. And we need falsifiable predictions:
39:01
other people have to be able to contradict a prediction, and the makers of predictions have to be answerable for them. And for this we also need a new set of people,
39:22
a group I would like to call the algorithmists. These experts should be able to help people who have been victims of the negative effects of algorithms.
39:45
Without these algorithmists, we will be helpless. We're almost at the end: big data helps us understand the world in a better way,
40:00
but it comes with a series of really dangerous challenges. We need to be able to retain control of what happens with big data,
40:28
not only as individuals, but also as civil society. And those who use big data have to keep to the rules. It is equally important not only to learn from the data,
40:45
but also to guarantee and create room for the human: for originality, creativity, the irrational, for being able to decide against the prediction,
41:03
and being allowed to do so. Because, in the end, data is nothing but a reflection of reality, a shadow of reality, and therefore always incomplete, always leaving something out.
41:23
Therefore, we have to approach this big data era with a certain modesty and humility.
41:43
Thank you very much. Thank you for listening. This was Julian and Anwen. If you have any critique, anything to say, even a thank you, we are glad to hear from you, either on Twitter using the hashtag RP14EM,
42:03
or just come to the translators' booth, knock, give us a thumbs up or thumbs down, show us your middle finger or give us a beer or something. We hope you enjoyed it, or at least understood whatever we translated.
42:21
So, we have time for questions and answers. You do the questions, and I shall do the answers.
42:41
There's a mic in the middle. First question from the floor. Thank you, that was great. I was especially interested in your point about guilt, as this was a sort of constant in your equation.
43:06
Isn't it more probable that we will just change the kind of crimes we punish, that we will move into other areas, just like we have something like thought crimes, which actually don't exist now?
43:20
You may be right; we are on thin ice here. Until now, we have always said that thoughts are free. I have killed my maths teacher in my thoughts, killed my maths teachers many times,
43:42
because thoughts are free, and I was never punished for that. Thoughts are a channel for experiencing and coping with reality, and if we reach into that via big data,
44:01
we would close this channel, one of the last channels of human freedom, and I fear that we will go in this direction. I have told you about the probation decisions, the question of whether someone is released on probation,
44:22
and about the data behind them. And consider this: in a number of cities in the US, and also in Europe, there are analytics the police use to predict where the next crime
44:43
is going to happen. The police then go there, and, interestingly, they find crime. If you look, you find something. Of course you do. So, if the police go somewhere, they will find something,
45:00
and then people conclude that with this predictive policing, crime is being dealt with better. We need to find a way not to be gullible towards those
45:24
who claim that this is a success. We have to ask ourselves what prediction can actually achieve, and we need a discussion about the boundaries of its usage,
45:42
and we need to find a way of dealing with this now, before it is too late.
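The self-confirming effect described here is easy to reproduce in a toy simulation with invented numbers: four districts with identical true crime rates, where police only discover crime where they patrol, and patrol where the most crime has been discovered.

```python
import random

random.seed(1)

# Four districts with the SAME true crime rate (all numbers invented).
districts = {"A": 1, "B": 1, "C": 1, "D": 1}  # discovered crimes so far
true_rate = 0.3  # daily chance of a crime in any district

for day in range(1000):
    # "Prediction": patrol where the most crime has been discovered.
    patrolled = max(districts, key=districts.get)
    # A crime is only recorded if it happens where police are looking.
    if random.random() < true_rate:
        districts[patrolled] += 1

print(districts)
# -> roughly {'A': 300, 'B': 1, 'C': 1, 'D': 1}: the statistics "confirm"
#    the hotspot, although all districts are identical.
```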
46:00
I like the idea about circular and linear time, and I'm interested also in the effect not only on time but also on space, on public space for instance.
46:22
So, if I'm just walking around in public and, say, wearing a hoodie, would I have to change my behaviour in any way? I'm just interested in what your thoughts are on the whole space matrix.
46:42
I have to answer that; I've written a book about this, about forgetting. It is called Delete, and I'll sum up what I wrote there. Regarding public space, it's a little like the panopticon.
47:00
The mere idea that I am constantly surveilled has a disciplining effect on me and my behaviour. We don't need a policeman to tell me how to behave; I already think of the policeman and discipline myself. And we have to reclaim public space.
47:26
In England, they set up hundreds of thousands of CCTV cameras, and people, graffiti sprayers for example, found out
47:44
that nobody actually looks at what happens on these cameras, and after a while they continued to behave quite normally. But now Scotland Yard has developed technology
48:00
with which they can automatically analyse old recordings and, for example, establish identities via the labels on T-shirts
48:21
of people who move from camera to camera, instead of recognising faces. If that were deployed, we would have the panopticon; what never materialised in the analogue world would be realised after the fact. What would that mean for robust discourse in our society?
48:40
You know, when I was a student, I was a student politician, and I organised protests and things like that. No-one holds me accountable for that today. Maybe it's a little embarrassing when I think of what we protested against,
49:01
yoghurt, for example, because... But I did this, and no-one is going to remind me of it. I'm the last generation for which that is true. You're not. You are going to be remembered.
49:20
What does it mean if we start to censor ourselves, to restrain ourselves? As a civil society, we should get up and do at least one sort of borderline-illegal thing per day.
49:41
Still legal, you know, but on the border. Not in order to be illegal, but in order to claim our space, to prevent our space from closing in more and more.
50:04
Just to jaywalk once a week, for example. Thank you very much.
50:30
You told the story of the bank account, where you have to sign what's put in front of you.
50:41
May I ask, do you work for the bank? No, luckily I don't. But this consent, just the act of saying, okay, I'm all right with your terms of service,
51:00
is under discussion at the moment. I recently saw a talk where someone took the position that, actually, we shouldn't require this consent and should rather
51:28
place it with the people who work on the data. I'm not sure what the question is; we didn't understand it properly, but I will tell you my perspective. I don't know what Ken said.
51:42
My perspective is this: if we just wait for the acceptance or denial of the
52:01
person in question, we won't help anybody; where there is a position of power, consent will be given anyway. And it won't help society either, because we shouldn't forget that the idea of data protection is that we enact our data protection
52:31
by getting up and exerting counter-pressure, and the reality is that most of us are not doing this. This is why we should get better and more tools,
52:47
and this is why we need this responsibility; my book argues for responsibility and accountability of the users of data. This
53:00
means that those who do analyses of big data, or bring a big data application online, need to do a risk assessment for the data protection of the people in question, and have to apply prevention
53:29
and protection techniques. If they do not implement these prevention and protection techniques, they would be accountable
53:43
in civil and criminal courts. And, on the other hand, because data get reused, one should not have to ask the person for permission
54:10
every time the data is used, as we formally do at the moment. Instead: the risk assessment has been done, so you don't have to do it yourself;
54:23
you aren't responsible for checking the further implications. That's how we handle medicine, food and cars: safety measures are implemented, and then you don't have to verify everything yourself. You don't go to the supermarket with a chemistry kit
54:40
to test whether the food is okay; we trust the experts. I believe that this is the central way forward. And even though I've been thinking about big data for years, I don't think I could myself assess a big data analysis
55:07
if it's technically complex. I have another question, very short. Thank you for a very good talk. It's a personal question.
55:21
As you mentioned, we're kind of the first generation that will never be forgotten out of this data pool. And we have no idea what happens with our accumulated personal data when we die, for example.
55:40
That's something which is going to become quite interesting for social networks and so on in the next few years, I think. This is a good question, and one that interests me. I had a student two years ago who wrote an essay on death and mourning on the internet, and what I found emotionally interesting
56:08
when I thought about the question of forgetting was this: I talked to a cognitive psychologist who said, I can only forgive people if I forget.
56:24
By forgetting, I forgive. In our cognition, the two processes are connected. At first I didn't accept this, because I always thought that I want to forgive
56:51
the crimes of the Nazis, but I will never forget them, to quote the book of a famous person.
57:03
But this is also connected to how we deal with grief. If we are constantly reminded that someone has died, we can't finish the process of coping with the death.
57:25
Therefore, it's important for our own psychological hygiene that, when it comes to death, there are elements we remember,
57:41
but also elements about which we can say: we want to forget this; we don't want to be constantly reminded of this traumatic loss. So, that was it. Thank you very much again; my name is Julian.