It's not a fax machine connected to a waffle iron
Formal Metadata

Title: It's not a fax machine connected to a waffle iron
Author: Cory Doctorow
License: CC Attribution - ShareAlike 3.0 Germany: You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor and the work or content is shared, also in adapted form, only under the conditions of this license.
Identifiers: 10.5446/33476 (DOI)
Language: English
Series: re:publica 2013, Part 124 of 132
Transcript: English (auto-generated)
00:16
We are here today to talk to a man who hardly needs an introduction.
00:24
He has spoken here before. As the title of his talk suggests, his subject is not a fax machine connected to a waffle iron.
00:43
He is Cory Doctorow. Good evening. My name is Dominic Heron. Like so many dumb Anglos before me, I am going to go to a foreign country and speak English.
01:06
Hi. When I start speaking too fast, please do this and I'll slow down. I have a story to tell you to start this off that a friend of mine told me. He used to work for one of the big packaged goods companies.
01:20
These are the companies like Unilever and Procter & Gamble that really perfected packaged product marketing. And life cycle marketing where they sell you a bunch of stuff every time your life changes. When you get born, when you go to school, when you get married and so on. And those are very marketing driven companies as you might imagine. And so the marketing departments, they go off and they have little retreats where they kind
01:45
of try to figure out what they're going to, the next products are going to be. And they test them in the market. They do research and then they come back to their engineering force and they say, okay, here's what you're going to make next. So one day the marketing department came into the chemical engineering department. They said, we've got it.
02:00
We've done a lot of surveys. People want laundry detergent that makes their clothes newer. We've done all the testing. Everybody likes this idea. So the engineers, they tried unsuccessfully to explain the second law of thermodynamics to the marketing department. And after a while they realized that entropy was something lost on these people.
02:26
And that moreover, these people were their bosses. And so they were going to have to figure out how to make detergent that made your clothes newer. But the more they thought about it, the more they realized that the marketing people were under a delusion about the definition of newness.
02:45
What does newness mean? Well, if you're a technical person, newness is this localized state of reversed entropy where things are more orderly than they were before. If you're a marketing person, newness is the property of looking or feeling newer.
03:01
And that's where the solution came from because they realized that they had been playing with this enzyme that gets activated in hot water. And that enzyme attacked and ate fiber ends, the ends of fibers. Now, a broken fiber has twice as many ends as an intact fiber. And broken fibers are what make clothes look old.
03:22
So by putting this enzyme in the washing powder, they could make your clothes look newer when they came out of the wash. Now, they weren't actually newer. They were much, much older when they'd gone through the wash because they had been partially digested by the wash water. Every time you washed your clothes in this enzyme, it would be partially digested.
03:45
And after not very many washes, you would have just a rag with holes in it. But it satisfied the delusional non-technical definition of newer clothes. And since in the era of globalization, many of us replace our clothes before we wear them out, even with the help of these hot water enzymes, nobody was hurt.
04:04
It was a harmless delusion. But not every delusion is harmless. And that's what I want to talk to you about today. Now, it cannot have escaped your notice that we increasingly live in a world that is made out of computers. Most of our houses are computers that we put our bodies into.
04:24
They're buildings where if you remove the computers from them, they would cease to be habitable in short order. They are functionally just giant, elaborate cases for computers that we happen to live in. Likewise, your car is a computer that hurtles down the Autobahn at 120 kilometers an hour with you trapped inside of it,
04:45
surrounded by other people in their own computers, likewise hurtling down the road. The 747 that the Americans in the audience flew from the United States to Berlin to get here is a flying Sun Solaris box in a very fancy aluminum case attached to some tragically badly secured SCADA controllers.
05:07
We live in a world made of computers. But we don't just put our bodies into computers. We increasingly put computers into our bodies. Those of you who are younger than me in your 20s, you're members of the iPod generation.
05:21
Those of you who are my age or older, we're members of the Walkman generation. But no matter what our age, we have all logged enough punishing earbud hours that come the day if we live long enough, we will all have hearing aids. Now, it's vanishingly unlikely that those hearing aids will be hipster, retro, beige, analog, plastic devices with a transistor.
05:43
They will be computers that we put in our bodies with the power to hear what we hear, to stop us from hearing things selectively, to make us hear things that aren't there, and even to tell other people what we're hearing. Everything we do in our world involves computers today, and that means that every problem we have in our world also involves computers today.
06:05
And that means that increasingly our policymakers, our regulators, our politicians, and our police are going to look to regulating computers and controlling computers as a way of solving social problems. Now, as it turns out, we are in the midst of a long-running 15-year experiment in achieving policy goals through the regulation of computers.
06:28
And if we examine that experiment in detail, we can get a glimpse of what the future of that might look like. Now, of course, I'm talking about the copyright wars. Digital rights management has been around for a long time.
06:41
It predates the use of the term digital rights management. That is to say, digital rights management is older than digital rights management. And back in the 1980s, we had lots of DRM schemes in the field that we tested in the name of preventing copying of floppy disks that were distributed through retail channels, or in the name of protecting satellite signals, and so on and so forth.
07:04
Now, if you can remember those days, you will remember that none of that DRM actually worked. It was just rubbish, because conceptually it was trying to do something impossible. The idea was that you would have a supplier, a record company, a software company, a satellite company, and they would make
07:21
a scrambled message available to you, a scrambled movie, game, a song, and then they would give you a program to descramble it. And they would trust that the program would not tell you the secret key that it was using to descramble. And they would trust that when the program was done descrambling the file that they'd given you, that it would throw that file away.
07:43
And they would trust that neither the key nor the file would ever be stored somewhere where you could get at it. Otherwise, you might just decrypt the file, throw away the encrypted file, throw away the stupid player they gave you, and just pass around the file.
08:00
Now, this is kind of like a cargo cult version of security, but it doesn't really stand up to inspection. In real cryptography, we have the idea that there is a sender and a receiver, and somewhere in the middle of them is an adversary who can't be trusted, who's eavesdropping on them. That is, you have Alice and Bob, who have some kind of shared secret.
08:22
They have a key that they share. And in order to communicate in secret, they use that key. And Carol, who's in the middle of Alice and Bob, is assumed to be able to intercept the message they're sending and to know what system they're using to send it, but not to have the key. The key is something that only Alice and Bob know, it's their shared secret.
08:43
But in the voodoo cryptography world of digital rights management, you have a decoding app that you give to the adversary that has the key in it. Otherwise, the app can't decode it. And it has to make the message, the file, it has to make it into an intelligible file, a clear text
09:02
file, an unscrambled file on hardware that belongs to the person who you don't want to have access to the unscrambled file. That is, you're giving the bad guy the means to unscramble the file with the expectation that he will unscramble the file and then hoping that he won't do anything you don't like with it.
09:21
So it's not Alice and Bob versus Carol, it's just Alice versus Bob. You've got Bob sending Alice the file that's been scrambled, the keys to unscramble it, and the system to use those keys to unscramble it. But he assumes that Alice is too stupid to figure out how to save that file onto
09:41
her hard drive in the clear or get the keys out of the app he's given her. This is Bob engaging in what we might call wishful thinking. Because in reality, there's not just one Alice. Everybody in the world who has some interest in getting at the stuff that's been locked up in Bob's dumb DRM is Alice.
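The broken threat model Doctorow describes can be sketched in a few lines of code. This is a toy illustration only, using a deliberately trivial XOR "cipher"; it is not the actual scheme used by any real DRM system, and all names and keys here are invented for the example.

```python
# Toy sketch of the DRM threat model: the "adversary" (Alice) is handed
# the ciphertext, the key, and the descrambler all at once.

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher: XOR each byte with a repeating key (illustration only)."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# Classic crypto model: Alice and Bob share a secret; Carol in the
# middle sees only ciphertext and never learns the key.
shared_key = b"secret"
ciphertext = xor_cipher(b"hello Alice", shared_key)

# DRM model: Bob ships Alice the scrambled file AND the key AND the
# code to unscramble it, then hopes she won't keep the plaintext.
drm_player_key = b"burned-into-the-app"  # hypothetical embedded key
locked_file = xor_cipher(b"the movie", drm_player_key)

# Nothing stops Alice from decrypting once and saving the result:
plaintext = xor_cipher(locked_file, drm_player_key)
assert plaintext == b"the movie"

# And once any one Alice extracts drm_player_key, every file ever
# scrambled with it is open to everyone: break once, break everywhere.
```

The point of the sketch is structural, not cryptographic: however strong the cipher, the recipient necessarily holds everything needed to produce and keep the cleartext.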
10:03
And as soon as one of them extracts the keys because that person happened to have an electron tunneling microscope or decompiler or some kind of wizard-like facility with technology, then everybody in the world gets access to all the files that have ever been protected with that system.
10:27
And if Alice isn't that smart, she can just share the keys she's extracted or the files she's extracted or the instructions for sharing the keys she's extracted. It all comes out to the same thing. From the beginning, it was a fool's errand.
10:42
This is a break once, break everywhere exercise in futility that can't prevent copying. Sometimes when you talk to people who still advocate this weird idea, they'll tell you, oh, no, it doesn't stop copying, but it's a speed bump, what we call in Britain weirdly enough a sleeping policeman. That's what they call a speed bump. I love that.
11:01
They say it's just a speed bump, but it presents no speed bump to pirates. In fact, it's just the opposite. The speed bump sits between the people who want to do the right thing and their enjoyment of the media. Because if you've got a player and you've got the encrypted media and you want to play it, you've got to use that slightly broken, slightly weird player to get at it, and you can't do anything it prohibits.
11:23
Whereas if you are someone who doesn't care about doing the right thing, you've got access to these files that are infinitely reproducible. They are fecund. So the company releasing the official files neuters them before it sends them out, right? So they can't reproduce. And the pirates, they have ones that can breed like rabbits.
11:43
So it's not a speed bump. It's just the opposite. As anyone who, like me, has ever gone out and bought a DVD intending to rip it to their hard drive, then realized the optical drive was at the office and thought, ugh, I've already paid for it, I'll just download it from The Pirate Bay, knows: there's no speed bump there.
12:01
The speed bump is only there if you're doing the right thing. So that's what this talk is about. The harm that arises from trying to selectively break our technology to solve a social problem with reference to the copyright wars. Now, when we have discussions about digital rights management, we often get distracted with pointless arguments. Arguments like if you put DRM on a file, will that generate new sales or will it cost you sales?
12:25
Or is DRM fair? Does it prohibit things that would otherwise be permissible under fair dealing and fair use? Or is DRM a good business strategy for the entertainment industry? Or even does DRM work? Does it prevent copying?
12:41
But these are the wrong sorts of questions to ask. I mean, they're good questions to ask if you're me. If you're an artist who's trying to pay your rent with copyrighted works, it makes sense to know the answers to those questions. But from the perspective of a policy maker, from the perspective of a computer scientist, from the perspective of someone who's outside the tiny little rump
13:02
that is the entertainment industry against our wider society, these questions are totally beside the point. Now, as it happens, I happen to think that DRM reduces sales, can't accommodate fair use and fair dealing, sabotages entertainment companies and their interests, and doesn't prevent copying. But like I said, that shouldn't matter to anyone except for the tiny minority of people who happen to be in my industry, the entertainment industry.
13:27
For the wider society, DRM's effects are independent of those questions. That is, even if DRM was good for sales, even if it was totally fair, even if it was great for the entertainment industry and wholly effective,
13:42
we should still be against it on grounds that are far more important than any of those. And to understand why, we have to unpack some of our nerd complacency and realize that it's impossible to talk about technology questions without examining and weighing legal code at the same time that we consider software code.
14:03
Now, earlier, I alluded to the rules about DRM. The World Intellectual Property Organization makes those rules, WIPO, or in French, OMPI. WIPO is the UN specialized agency in charge of global copyright treaties. They have the same relationship to bad copyright law that Mordor has to evil.
14:20
And in 1996, WIPO passed a pair of treaties, the WIPO Copyright Treaty and the WIPO Phonograms and Performers Treaty, the WCT and the WPPT, the two most important treaties you've never heard of. And these created a whole bunch of special protections for digital rights management. They made it illegal to break digital rights management.
14:41
They made it illegal to take the keys from a digital rights management system. They made it illegal to tell someone how to get the keys out of a digital rights management system. They made it illegal to host those keys or host the instructions for extracting them. And they made it illegal to make tools that automated extracting keys. And moreover, they made rules that said that web hosts and other intermediaries, from YouTube to Twitter to the ones that were around back in the 90s
15:08
(I think in the 90s we had bulletin board systems and tin cans and string), must unquestionably and immediately remove any files that are claimed to be in violation of those rules, or they face being named as parties to the infringement in any eventual lawsuit that arises.
15:24
And those lawsuits carry statutory damages that can be piled up to farcical heights: $250,000 per act of infringement in the United States. Now where did that get us? Well, it made it illegal to reverse engineer or interoperate with any technology that had any DRM in it.
15:41
Now that's a pretty sweet setup if you're a commercial company. Let's see how that plays out in the real world. Let's take DVDs as an example. Now DVDs hit the market in 1996. They are not hard to interoperate with. They're made with the same presses that make our CDs. They're read with the same drives.
16:01
You can get software that allows you to read and write those files, rip them and do new things with them. But if you want to legally do anything with a DVD, you have to license the keys from the motion picture studios that control those keys. Because it's illegal to extract the keys, to share the keys, to host the keys, to use the keys. Even though they've been extracted a long time ago by a Norwegian teenager and his friends.
16:23
And they've been floating around the internet for a better part of a decade. You have to pretend that those keys aren't there. Otherwise, you're breaking the law. Now as a condition of licensing those keys, the companies that control them make you sign a document in which you promise to do a lot of other things that have nothing to do with piracy.
16:43
Instead, they have everything to do with profit maximization. For example, you have to implement unskippable ads. If you've got kids and you've started buying Disney DVDs, you'll know that there's sometimes 20 or 30 minutes worth of ads at the start of the DVD. Skipping those ads is not piracy, but implementing the unskippable bit is mandatory for all DVD players.
17:04
You have to implement region checking. Now I know in Europe it's very common to go out and buy a region-free DVD player. What you may not know is that all of those companies that make region-free DVD players are breaking the law, violating their contracts. And it's only because the people who control the keys forgot to ask their stakeholders for enough money to litigate that they're not suing them.
17:23
Blu-ray, on the other hand, when that consortium was consolidated, did raise a fund explicitly to sue people who made region-free players. And you will not see region-free Blu-ray players in the high street any time soon. And finally, if you're going to implement the DVD keys, you have to promise to implement something called robustness.
17:45
Now robustness means that your system has to be designed to be robust against user modification. It has to be designed so that users can't change the system. Now what does it mean to make a DVD player that is resistant to user modification?
18:03
After the keys have been in the wild for a decade, how can you make something resistant to user modification? Well, we can argue about what is or isn't resistant, but one thing we can be sure of is that if you make a technology under a license from something like the Free Software Foundation,
18:23
a license whose condition is that the code must be released in the preferred form of the work for making modifications, that is to say, a license that requires you to make the work modifiable by users, then it is not resistant to user modification. Those two things are incompatible, which means that digital rights management effectively bans free and open source software.
18:48
You cannot implement digital rights management in a platform that allows the user to go in and set the allow this to be copied zero times bit, and change it to allow this to be copied 20 million times, otherwise it ceases to be an effective system of controlling use.
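The "flip the copy bit" argument can be made concrete with a short sketch. This is a hypothetical open-source player, not any real DRM implementation; the class and field names are invented for illustration.

```python
# Why open-source DRM is a contradiction: if the user can read and edit
# the player's code, the usage restriction is just another variable.

class OpenPlayer:
    """Hypothetical player whose source the user fully controls."""

    def __init__(self, copies_allowed: int):
        self.copies_allowed = copies_allowed  # the "restriction" bit

    def copy(self) -> bool:
        """Honor the restriction: refuse once the allowance is spent."""
        if self.copies_allowed <= 0:
            return False  # DRM says no
        self.copies_allowed -= 1
        return True

player = OpenPlayer(copies_allowed=0)
assert player.copy() is False  # the rule holds...

# ...until the user, who controls the code and its state, changes one value:
player.copies_allowed = 20_000_000
assert player.copy() is True   # and the "effective" control is gone
```

Any enforcement that lives in code the user may legally inspect and modify can be edited out, which is exactly why DRM licenses demand "robustness" against user modification.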
19:06
Now why does it matter if you can't interoperate with a system? Well, there's lots of good reasons to interoperate, but one crucial one is innovation. There are lots of companies that don't see value in adding features to their products. They may decide that a certain market is too small for them.
19:22
This is something that people with sensory and physical disabilities know all too well. You don't present a market to the company that's big enough for them to add the accessibility feature, and in the absence of robust regulation forcing them to do it, you are left as a second class computing citizen. Unless you or your friends or the Royal National Institute for the Blind
19:44
or some other institution can get together and add that feature for themselves, but of course they can't when there's a ban on interoperability. Or sometimes companies don't want to implement a feature because omitting it makes them more money than they would make by selling something that they could otherwise get for themselves,
20:03
selling something to users that users could otherwise take for free. Back when I was working on European digital rights management for televisions at the DVB, you know, that DVB logo on the side of your TV, they're a standards body, and they standardized something called CPCM, which is a DRM for televisions.
20:20
And one of the flags that the pro DRM people wanted was a flag that said if you were watching TV in one room, the receiver had to be in the same room as you, otherwise it wouldn't work. So you couldn't run a wireless retransmitter from your sitting room to your bedroom or into another room. And when I asked about this, I asked what's the use case for this?
20:42
I mean, surely there's no law that says you're not allowed to watch TV in one room if the receiver is in another. The representative from the Motion Picture Association of America said watching TV in one room when the receiver is in another room has value, and if it has value, we want to charge money for it.
21:00
This is the urinary tract infection business model. Right now, all of the features for your TV come in a kind of healthy, robust gush. But under this proposal, everything would come in a painful drip. Every button on your remote would have a price tag on it.
21:20
So again, back to DVDs. DVDs have been out since 1996, and not one feature has been added to them since 1996. The suite of lawful activities you are allowed to do with your DVDs in 2013 is the same as it was nearly 20 years ago. You are legally allowed to watch your DVDs, period.
21:43
Now have a little Gedanken experiment. See, I've got a German word in there. Have a little Gedanken experiment. Go back to 1996 and walk into something suitably anachronistic, like a Tower Records, and buy 1,000 euros worth of DVDs and 1,000 euros worth of CDs and stick them in a vault.
22:02
Wait 10 years. Open the vault, 2006. Well, with the CDs, they have matured. Your investment has matured. It's not usual for a technology investment to mature. You know, I spent the dot com years unwisely investing all my spare money in laptops. They did not mature in value.
22:22
But these CDs have matured in value because now you can't just listen to them. You can rip them. You can put them on your MP3 player. You can make ringtones out of them. You can make alarm tones out of them. You can stream them. You can mash them up. You can put them on a remote server. You can stick them in the cloud. There are so many things you can do with those CDs. That was the dividend that interoperability got for you.
22:43
The DVDs, nothing. The missing dividend for DVDs is something that was taken from you and transferred to large entertainment companies. And that is what you get when you add DRM to any technology. All of the stuff that you might seize for yourself or that other industry parties or interested public interest groups might add to that technology,
23:06
those are entirely in the scope of one company, the company that originated the DRM. But interoperability is only the first-order casualty of DRM. And I've saved the most dire consequence of DRM for last.
23:20
And that's transparency. How do you make DRM work? Effectively, to make DRM work, you have to have an anti-feature in your device. Your beautiful device with a glowing apple on the front of it has to watch what you do and wait for you to do something that you're not supposed to do, like install software that didn't come from the Apple store. And then some process that has been lurking in the device unbeknownst to you has to float up to
23:45
the surface and say, instead of "yes, master," it has to say, "I can't let you do that, Dave." Right?
24:02
Because there is no other way to do it. You can't remove the "run code that I can compile" feature. You have to have a program running that has a check to see whether all code has been signed by Apple before it's run. That program, we can call it, like, HAL9000.exe, has to be living somewhere on your computer.
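What such a gatekeeper has to look like is easy to sketch. The following is a minimal stand-in in Python, assuming a hypothetical allow-list of approved code hashes; a real system like iOS verifies cryptographic signature chains, not a bare hash set, and every name here is made up:

```python
import hashlib

# Hypothetical allow-list standing in for vendor signatures.
# A real platform checks a signature chain; a hash set is the
# simplest stand-in for illustration.
SIGNED_HASHES = {
    hashlib.sha256(b"print('approved app')").hexdigest(),
}

def run_if_signed(code: bytes) -> bool:
    """Refuse to run anything the vendor hasn't blessed."""
    if hashlib.sha256(code).hexdigest() not in SIGNED_HASHES:
        return False  # "I can't let you do that, Dave."
    exec(code.decode())  # approved code runs normally
    return True

assert run_if_signed(b"print('approved app')")     # allowed
assert not run_if_signed(b"print('my own code')")  # blocked
```

The design consequence follows directly: for the check to mean anything, the checking program itself has to be somewhere the user cannot simply delete it, which is the talk's next point.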
24:23
And moreover, if it's living somewhere on your computer where you can find it, it has to be somehow protected against you dragging it into the trash, because there's no market for it. Nobody woke up this morning and said, I wish I had a device that won't let me run software from the stores I want to buy from, but only from one store.
24:42
People may buy it in spite of that, but no one's bought it because of it. And so that program, that HAL9000.exe, has to be part of a larger system in your computer designed to lie to you about your computer. So when you go into a directory and you list that directory out and say,
25:01
is there a program called HAL9000.exe in this directory, the computer has to say no. Now, you may remember there was an old Zork-style Infocom game that Douglas Adams made, the Hitchhiker's Guide to the Galaxy game. And there was one room in that game, it was the most frustrating thing. There was one room in the game where you'd go in and you'd say, look, and it would say, there's nothing here.
25:20
And you'd type, look again, and it would say, there is nothing here. And you'd type, look again, and it would say, really, there's nothing here. And you'd type, look again, and it would say, OK, there's something here. And unless you did that, you couldn't possibly win the game. Now, that was a very funny thing for Douglas Adams to do in his text adventure, but it is not how our devices should work by default.
25:41
Not only do your devices need to lie to you about what files are in their directories, they also have to lie to you about what processes are running. Because even if you can't find the file, you may look at the process monitor and say, is there a program running on my computer called HAL9000.exe? And if so, kill it with fire.
26:01
You know, sudo kill -9, that son of a bitch, right? Nobody wants HAL9000.exe running on their computer. And so you also need a computer that when you say, tell me about all the processes running on you, blinds itself and says, I can't see this process. And that's how DRM works fundamentally.
26:20
That's what Sony did in 2005 when they shipped the Sony rootkit. This was an audio CD. You put it in your computer. They shipped out six million audio CDs, 51 titles, one from a Canadian artist, Celine Dion. I apologize on behalf of Canada both for Ms. Dion and for that CD. And they shipped them out. When you put the CD in your drive, it had a little autorun file that would spark up a program
26:45
that would patch your computer's kernel so that it wouldn't see any program whose name started with $sys$. So program file listings wouldn't list it out.
27:00
Process listings wouldn't show it. And they wrote onto your computer a program that started at boot time that looked for you ripping CDs. And if it saw you trying to rip a CD, it would shut it off. Okay, so, you know, we can argue about whether or not ripping CDs is a legitimate thing or not. But here's the thing. They put a mote in your computer's eye.
27:22
They gave it a blind spot because it could no longer see programs that started with $sys$. It could no longer detect them if they were running. So virus writers immediately started to prepend every virus they shipped with $sys$.
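The hiding trick itself is trivial. Here is a hedged sketch in Python, with a plain function standing in for the kernel hook the rootkit actually installed and invented file names, showing why a name-prefix filter hides opportunistic malware exactly as well as it hides the DRM:

```python
def rootkit_filter(names):
    """What a directory or process listing returns once the
    hook is in place: anything prefixed with $sys$ vanishes."""
    return [n for n in names if not n.startswith("$sys$")]

# What is actually on disk (names invented for illustration):
on_disk = ["notes.txt", "$sys$drm.exe", "$sys$somevirus.exe"]

# What the user is allowed to see:
assert rootkit_filter(on_disk) == ["notes.txt"]
# The DRM component is hidden, and so is any virus whose
# author prepended the same magic prefix.
```

A real rootkit patches the operating system's file and process enumeration calls rather than filtering a list, but the blind spot it creates is exactly this one.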
27:40
I mean, why wouldn't they, right? They were the opportunistic infections rushing in to fill the hole that Sony had punched through your computer's immune system. Dan Kaminsky, the security researcher, did some back of the envelope work using DNS calls from this piece of software. And estimated that it had been installed on 300,000 U.S. government and military networks, right?
28:04
So this stuff, that's what you get when you build this stuff: a mote in your computer's eye. And that is the true cost of DRM. When you add DRM to a system, you create a legal requirement for opacity and an injunction against reporting weak security.
28:21
Because any time you tell someone about how a DRM works and what it's doing to your computer, you run afoul of the rule that says you're not allowed to defeat DRM. And that matters for reasons that are much more significant than the future of the entertainment industry. Now, this is not about whether information wants to be free or not.
28:42
I sat down with information last week. We got a box of Kleenex, we sat down somewhere quiet, we had a glass of Chardonnay, we cried, we hugged, we talked it out. And in the end, information confessed something to me. It only wants one thing from us, and that's for us to stop anthropomorphizing it.
29:01
Information wants nothing, but people want to be free. The computers in our pockets, the computers we insert into our bodies, and into which we insert our bodies, have the power to liberate us or to enslave us. When computers don't tell us what they're doing, they expose us to horrible risks.
29:23
And when the law prohibits third parties finding out what our computers are doing and telling us about it, those risks are magnified. To get a sense of what those risks are, in 2012, the Federal Trade Commission, which is the U.S. regulatory body that deals with consumer rights in relation to companies,
29:43
entered into a settlement with seven companies in the hire purchase, or rent-to-own, business. That's where you buy something on installments. You pay for it four or five times over, but if you can't afford to pay for it in the first place, it's worth doing if that thing is very important to you. These companies, mostly what they did rent-to-own on was laptops,
30:03
because it's very hard to be a citizen of the 21st century without a computer, without a laptop. And so that was their business. And the eighth company that they settled with was a company called DesignerWare in northeast Pennsylvania that made laptop anti-theft software that ran a secret process the computer was blind to
30:23
that allowed them to remotely operate the camera, the microphone, to read the hard drive, to read the keystrokes, and to read the screen. And in the settlement that these seven companies and the eighth software company entered into with the Federal Trade Commission, they admitted that they had used this software routinely and deliberately to record their customers having sex,
30:44
to video record their children in the nude, to eavesdrop on their conversations, to gather their confidential banking information and passwords, their medical information and passwords, their confidential conversations with their lawyers, and to just kind of root around on their hard drives looking for interesting stuff.
31:04
The Federal Trade Commission in the settlement, just this is neither here nor there for this talk, but to give you an idea of how badly wrong we get it, the Federal Trade Commission told these eight companies you have to stop this and can't do it anymore unless, does anyone want to guess what the unless was?
31:22
Unless you put it in the license agreement. If somewhere in that long hairball that says by being dumb enough to do business with us you agree that we're allowed to come over to your house and punch your grandmother and wear your underwear, make long distance calls and clean out your fridge, you stuck also we're allowed to video record your children in the nude and make pornographic movies of you having sex with your partner, then it's okay.
31:43
But it's not just private companies doing this, it's governments. Companies like FinFisher in the UK and Vupen in France do business with some of the most horrifying dictatorships in the world, taking versions of this software and selling it around so that it can be used to infect computers of everyday people
32:03
and help dictators control their populations. And governments not only create the market for security vulnerabilities here because instead of having those vulnerabilities disclosed to the vendors who then patch them, the companies that discover them instead sell them to governments for lots of money,
32:21
but they eliminate competition for these companies by making it illegal to report on security vulnerabilities when they form part of a DRM system. And so one of the things we've seen, for example, is this kind of malicious software disguised as iTunes updates, which is doubly problematic because any disclosure of the workings of iTunes is likewise likely to attract legal attention.
32:45
And lest you think that all of this stuff, particularly the hearing aids and computers in our bodies, sounds like science fiction, let me remind you that last November a security researcher named Barnaby Jack gave a presentation in Australia on the work that he had done on implanted defibrillators.
33:02
Now, implanted defibrillators are amazing technology. If you have a problem with your heart, if it loses the rhythm and puts you at risk, your doctor, she can anesthetize you, cut you open, spread your ribs, reach into your chest cavity, and attach a computer with a battery directly to your heart, and it listens to your heart beating.
33:22
And if your heart loses the rhythm, it gives you a little shock and puts your heart back in rhythm and saves your life. Amazing technology. Now, doctors want to read the telemetry off of these things after they put them in your body, and they want to install new firmware for them. And it's a bit messy to do that by cable because this thing is inside your chest cavity.
33:43
So they've got a wireless interface. Everything has a wireless interface. Everything in this room has a wireless interface. You are basically in a microwave oven now. That's where Barnaby Jack comes in. From ten meters away, he could detect this implanted defibrillator over its wireless interface and hijack it and reprogram it to locate other wireless defibrillators and reprogram them
34:06
and then give lethal shocks to the people who have them. Now, these wireless defibrillators run on the embedded controllers that governments are looking for vulnerabilities to in order to engage in bizarre cyber war exercises like Stuxnet.
34:23
So they are actively suppressing the disclosure of vulnerabilities in embedded controllers and instead building a market for secrecy in these vulnerabilities, making it harder for people to save their own lives. Now, at the start of this talk, I said policymakers are going to make this mistake forever because everything we do will have a computer in it and every problem will involve a computer
34:45
and every problem will arrive at the same solution. Make me a computer that does everything but doesn't run the program that causes this problem. Now, we don't have a model for that. There is no theoretical model in computer science for Turing complete minus one.
35:01
A computer that can run all the code that we can compile except for the program that the voters don't like. The closest we can come is a computer with spyware on it out of the box. Now, some people say why does this matter? After all, the success of tablets means that users don't want general purpose computers.
35:21
They only want appliances. But you can make an appliance that isn't built with spyware out of the box. A computer that's designed only to do one thing, to be like your cable box. But if that business model depends on that device having a rootkit on it so that the users can't decide to use it for something else or add a feature to it, there's something wrong.
35:41
People do want appliances, of course. But they don't want their Kinect spying on them on behalf of a skeezy sextortionist or an authoritarian state. Now, as the masters of the technological universe, we are contemptuous of DRM. We know that we can defeat it with a wave of our debugger.
36:00
But that doesn't make it harmless. Our own complacency about this is the most dangerous thing. The coming century holds a thousand fights over DRM, over computers that say yes, master, or say I can't let you do that, Dave. Now, I'm one of those cyber utopians, and this is supposed to be one of those cyber utopian gatherings.
36:23
And so you may be tempted to dismiss all of this as the folly of someone who thinks that computers solve problems. After all, the entertainment industry wanted copy-proof bits. They wanted computers that could somehow work without copying data from one place to the other, an idea so stupid that it would have made both Alan Turing and Claude Shannon laugh until they wet themselves.
36:45
And we gave them clothes that dissolved in the wash, and it seemed to make them happy. So what's the harm? But here's the thing about cyber utopianism. From the beginning, people who believed that computers and networks could solve problems also saw that they had the potential for terrible oppression.
37:04
My own journey on this stuff, the place where all of this started for me, where I stopped being someone whose main activism was in things like nuclear disarmament and started to be someone who really started to worry about this stuff, began with the second issue of Wired magazine, 1993, which had a cover story by Steven Levy called Crypto Rebels about EFF, the legalization of crypto,
37:26
and the danger that the internet would become a universal tool for surveillance. These early people were mobilized by intense optimism about the power of crypto to enable cheap organization and secrecy from oppressive regimes
37:40
and by stark, frank terror over what would happen if that crypto layer, the freedom layer, was not included in our devices and networks. There's a lot of loose talk these days about cyber realism and that being an alternative to cyber utopianism. But cyber realism says the internet isn't a thing, and if it is a thing, it's not a thing that's important to the struggle for justice,
38:05
except inasmuch as it is a barrier to those struggles, a siphon for feeding activist signals intelligence to dictators' spooks, a warm, distracting bath that takes real activist energy and diffuses it through meaningless clicktivism.
38:20
At its core, realism seems to be saying that the means of information are irrelevant to the reality of the world. That is, it turns its back on the whole of human strategic history that said that coordination was the key to victory and that communications were the key to coordination, a process as old
38:41
as the Caesars tattooing secret messages on the shaved scalps of their messengers, waiting for the hair to grow in and sending them across enemy lines, right through Trotsky sending troops out on the eve of the revolution to seize the post and telegraph office and up to this very moment where organizations with goals as diverse as Mexican drug cartels
39:03
find themselves kidnapping Motorola engineers and forcing them to build private cellular networks before murdering them and the FBI revealing that it is listening in on every single voice call in America. The internet is not nothing, nor is it irrelevant except as a means of buying and selling things,
39:23
nor is it the world's greatest pornography distribution system, nor is it the second coming of cable television, nor is it the world's best video on demand service, nor is it the next kind of telephone, nor is it a waffle iron connected to a fax machine. If it does any of those things, it is purely incidental to what the internet is, the nervous system of the 21st century
39:47
where everything we do today involves the internet and everything we do tomorrow will require it. A network that will either be a nexus of control or of liberation. Thank you.
40:08
Now, as an activist who spent the 1980s devoting 98% of my time to stuffing envelopes and putting stamps on them and 2% of my time figuring out what to put in the envelopes, I am delighted that we got the envelopes and the stamps
40:22
and the address books for free in the 21st century. And as for clicktivism, it is the greatest boon to activist organizing in history, a gateway drug to deeper forms of engagement. Increasingly, savvy activist organizers are offering a smooth gradient of engagement from clicking on a petition
40:41
all the way up to turning it into your life's work. Now, this is a marked contrast to the earlier activist world, the offline activist world, where your engagement was either total or nonexistent, which meant that your activism usually ended with employment or dire unemployment or having children, which meant that in particular disenfranchised people, parents,
41:04
and especially women who were mothers, were denied participation in struggles for their own liberation. What a privileged thing it is to sneer at clicktivism, at the idea that you can only be an activist a little of the time.
41:21
It's the statement of someone who's not worried about his next meal, someone who isn't stuck at home looking after the kids, someone who doesn't have to worry about losing his job in order to attend a protest. An inclusive movement cannot be made of those with nothing to lose and those with the privilege of being able to afford to lose something.
41:43
It's up to us. We are building the future. We can build spyware and rootkits into our computers. We can put a mote in their eyes. We can allow our governments to betray us into the hands of phone companies that want to end network neutrality and replace the internet where when you click on a link,
42:02
you get the thing that was on the end of the link, with an internet where, when you click on the link, you either get the website that has paid a bribe to the carrier to be delivered, or a bill. Now, the carriers say, well, we're just running a for-profit entity, and if you don't like it, you can run your own for-profit entity and offer the internet that you want.
42:23
But there is no carrier that is a creature of the marketplace. Not only do these carriers have their origins in the publicly funded, publicly built state telcos that used to be here, but every single one of them to this day enjoys a public subsidy without parallel in any industry,
42:41
and that's the right of way that their wires go on. Imagine if you were starting a new carrier from fresh, Deutsche Telekom 2, and you had to go and dig up every road and put a wire into every basement and put a pole on every lawn in order to get your network there, and you had to pay the free market cost for every meter of that right of way, you would be spending trillions.
43:07
There is no carrier that could ever do that. The only way to do that is to allow the government to pass rules giving you a subsidy that is so valuable we can hardly even calculate it. Now, the carriers who say, well, we don't want to deliver neutral networks,
43:22
we want to deliver networks that are more profitable to us, that treat you as the product that's being bought and sold instead of our wires as the product that's being bought and sold. We can say to them, fine, if you want a free market network, go and build one. Go and pay the clearing cost of every meter of right of way across all of Europe,
43:43
and in the meantime, you have 60 days to get your copper out of our dirt. And after 60 days, we'll pay you the scrappage cost of it, which is very generous.
44:03
Commodity prices are peaking, thanks to our friends in China, and we will find someone who's willing to take the multi-trillion euro subsidy that we give to you and operate a network with that subsidy that we give to you in our interests, and not merely in the interests of your shareholders.
44:21
We can build a network that is part of our freedom or part of our oppression. We can make a future that makes computers into a lever that turns penny tyrants into global monsters, or we can resist. We can refuse to wave our hands at those silly people in the marketing department, the deluded politicians, the coked-up Hollyweird fat cats who threaten that they will abandon the web
44:45
and take their precious content back to AOL if they don't get DRM in HTML5. It's okay. I'm an artist, and my livelihood depends on the sale of my entertainment product, on my ability to extract meaningful sums of money from the world in exchange for my amusing made-up fairy stories
45:06
that help you pass those dismal hours between birth and death. And I think DRM is rubbish and of no help to me for reasons I've gone into at considerable length in this talk. But even if I wasn't convinced of this,
45:21
even if I did think that DRM was the only way to earn a living telling my funny little stories, I wouldn't be up for it. I would go and get a real job, because as much as I want to take my family for nice weekends in Disneyland Paris, or buy my daughter nice clothes, or pay our mortgage, I want a free and fair world for my daughter even more.
45:41
And I think you should too. There is no way to fight oppression without free devices and free networks. So it is up to us to build the freedom layers onto our devices and networks that enable that struggle. To be cyber optimists, to secure the network and to use it to coordinate our struggle for freedom,
46:00
to jailbreak every device, to crack every censorwall, to out every astroturfer, to seize the means of information and use it to liberate the planet. Thank you.
46:22
Thank you.
46:46
Thank you very much. Thanks. So we have about 15 minutes for Q&A. I'll remind you that a long rambling statement followed by what do you think of that is technically a question but not a good one. So I don't know if there are microphones.
47:02
There's microphones. So put your hand up. Unless all of that was so non-controversial and self-evident that no one disputes a word of it.
47:24
Okay. Cory? Yeah. It's a pretty complex talk, so it's difficult for me to invoke a Morozovian critique of what you just gave, but I'm reading his book and I find it very compelling. And to use an analogy, I don't think I would be ready to become a Barnaby Jack experiment.
47:48
I wouldn't want to attach my heart to the internet. I wouldn't migrate there until I knew that it wasn't hackable. And so I think that there are some things that I would not want to go with in terms of the internet to solve my problem.
48:05
So there are some things that are not quite ready, obviously, because we have predatory corporations determining the fate of so many things. I think I would decide not to attach certain things to the internet right now.
48:22
So I agree that it's a bit weird that there's an IP-based device in people's chests, but it's kind of easy to see why when you think about it. Because if the argument is, well, I mean, I assume that you don't mean we shouldn't have implanted defibrillators, that all the people who have bad hearts should just die until we fix that. Implanted defibrillators are okay, right?
48:41
So if implanted defibrillators are okay, they've got to have something running them. And it seems pretty obvious why we end up with commodity hardware running them, because on the one hand it's cheap, but on the other hand, ironically, it's because it's secure. Although manifestly it wasn't secure enough, the best systems we have are the ones that are widely tested.
49:03
And so the reason the manufacturer put it in there is not a crazy one. It wasn't that they wanted to put on the box for the defibrillator comes with internet connectivity. I mean, that may be true of our refrigerators and stuff, that there's a lot of silly season stuff. But nobody actually is a consumer for an implanted defibrillator.
49:22
Your doctor recommends it, and your doctor recommends it on the basis of things that are presumably mostly medical, unless she's getting bribes from a company, but mostly medical, right? And there is a really good medical reason to get telemetry off of a device. So it's going to have a wireless interface, because as people who have insulin pumps will tell you,
49:44
having something like a machine that is partly on the inside of your body and partly on the outside of your body is very problematic in its own right. And so the reason this thing has a commodity controller in it running a commodity operating system is because the alternative would be, like, I made up my own controller with my own operating system,
50:05
and I swear to you that it's secure, and I can tell because I can't think of any way of breaking it. And that's actually even riskier. You know, go ahead. I'd like to hear the other side of that.
50:23
Well, you're taking a very specific case, a reason for it to be online, but there are many other situations you could look at that don't need to be online. And so this very enthusiastic cyber-utopian kind of push is saying that all things should go in that direction,
50:47
that the solutions are with the internet. So it's a little hard to have this dialectic right now, but there's many different things that we could bring up that don't seem to make sense to try to solve as an internet problem.
51:02
Sure. I mean, the implanted defibrillator was your example, which is why I was using it. But the question of whether things should or shouldn't be connected is, I think, slightly misguided inasmuch as people form a high-latency, low-reliability internet link between devices
51:23
inasmuch as, for example, people who are using the internet can be socially engineered using the internet and using badly secured machines that look like they're presenting authenticated communications into typing things in the consoles of devices that aren't connected to the internet. And they essentially become malware vectors, and that's a common vector for malware.
51:43
If there was ever a doubt that there is no such thing as an air gap in a world in which human beings spend 99 percent of their time facing this direction connected to the internet and 1 percent of the time facing this direction connected to a device that isn't connected to the internet, Stuxnet answered that question, right, that we will vector information,
52:02
either in the form of thumb drives or whatever, into those devices. I used to be a CIO, and I spent a lot of time trying to talk to users about connecting outside hardware, outside storage, and outside networks, and found that the optimal strategy for getting my network secure was not to rely on abstinence,
52:24
that no matter what I told my users, when they had a problem to solve, they would solve it by whatever means they could. It was to try and make sure that when they had a problem to solve that I was the first line of defense. I talked to Genevieve Bell last year from Intel, Intel's anthropologist,
52:46
and she cited some research she'd done on air-gapped networks, so networks that aren't connected to the internet in defense contractors, hospitals, in governments, in spy agencies, all these places Intel has its networks,
53:00
and what they found was that every single one of these air-gapped networks was cross-connected because at some point somebody on the clean side of the network really needed something from the public side of the network, and they literally did things like plugged a modem in or secretly ordered a DSL line from the phone company or VPNed in from one to the other and left the VPN tunnel open
53:21
or ran an ethernet cable from one device to the other. I mean, I agree that it's a bit weird that the public internet that everybody in the world is on is connected to people's cars, but I don't know how you stop them from being connected to people's cars, and what I think we need to do is start understanding that our information ecology is as interconnected as our water ecology.
53:46
I mean, it's true that as a Londoner most of the water I drink has its immediate origin in the kidneys of people who live in London, but ultimately the destiny of all the fresh water in the world is intermingled with my destiny because I may pick up microbes from somewhere else and add them to the water in London.
54:04
And as a result, for example, a rule that said it was against the law to tell people if a company was polluting the water would be a very, very bad law even if we wanted the systems where people polluted and the systems where people drank totally air-gapped and separated. It should always be legal to blow the whistle.
54:21
It should always be legal to know things about your water. It should always be legal to add things to your faucet to find out what your water is doing. We should regulate water with the gravitas of something that is literally a matter of life and death, not just for us, but for everybody in the world whose destinies we're intermingled with. And this is true of networks and computers. When Stuxnet moves on and mutates and starts attacking nuclear reactors that aren't in Iran,
54:44
we see that not only is the air gap a fallacy, but so is the idea of directed programs, programs that only attack one air-gapped device, that information may or may not want anything, but it has a tendency to continue to migrate around the internet, especially if it's attached to something that people want.
55:06
And so I think there's a real politic here that if you want to secure people's cars or computers or refrigerators, you can't start by saying they shouldn't be connected to the internet. You should start by saying now that people have connected their computers and phones and everything else to the internet,
55:22
what do we do to make them secure, not because cyber utopians told you to connect them to the internet, but because there are lots of economies of scale and good reasons that people want their devices connected to the internet, and that's why they do it. I mean, I have devices that aren't connected to the internet, like old cameras, that are all but unusable now because I can't update their firmware and they're buggy,
55:42
and reconnecting them to the internet involves like loading an SD card and flashing them, which is really hard. And people like me and everyone else who has a problem with the device, when the opportunity arises to connect that device directly to the internet, we'll take advantage of it, regardless of whether it's a good idea or a bad idea. Abstinence doesn't work.
56:00
As the father of a soon-to-be teenage girl, I'm keenly aware that abstinence is not the full solution to getting people to do the right thing. Other questions? Yeah. Is it working? Hi, Cory. I'm Stefania. Thank you very much for the talk. You just mentioned your daughters.
56:20
So I wanted to ask you, how do you think children will actually contribute to our definition of freedom, complex networks, contribution of collective intelligence, freedom within a framework, and so forth? Thank you. What can kids do to contribute to this and how can we make kids aware? Well, I mean, I think that the generational advantage that I enjoyed was growing up in an era in which computers came with the requirement
56:49
that you do something with them before they could do something useful. So, you know, when I got an Apple II Plus in the late 70s, my dad was a computer science teacher. It came with like two or three very minimally functional programs, and then a subscription to a magazine that had BASIC programs you could type in
57:04
in order to get it to do other stuff, because it did almost nothing out of the box. And while it took a very special kind of person to be willing to spend a summer holiday entering BASIC instructions from Byte Magazine, at the same time it meant that I always perceived the computer as something that was there to say yes, master,
57:23
and not to say I can't let you do that, Dave. And, you know, the good news is that we are developing, at the same time as our devices are becoming in some sense more locked down and appliance-ized, we're also developing increasingly powerful tools for computer literacy. The Mozilla Foundation has done so much amazing work in this.
57:40
Popcorn and some of their other work, and the MIT Media Lab's Lifelong Kindergarten Scratch project. I just saw a demo of the next version of Scratch that is intended for pre-literate kids, so it's a fully drag-and-drop programmable interface for writing games and simulations. And, you know, it's the difference between your kid running home, grabbing the tablet and watching My Little Pony on YouTube all afternoon
58:06
and running home, grabbing the tablet and making awesome pony simulations that lead to her having an innate understanding that the device is there to dance to her will and not the other way around all afternoon. So, I mean, every parent makes it up as they go along.
58:20
I've never done this before. And no one has ever parented a kid in 2013 before. So we're all making it up as we go along. But, you know, I really firmly believe that a sense of agency, control, and the right to tinker is at the core of raising a generation that will not allow their computers to become tools of oppression.
58:50
Thank you very much. I will try to make this short.
59:03
You talked a lot about regulating technology. A lot of people that I think are smarter than myself spend a lot of their time thinking about regulating technology when it comes to military drones. And, personally, I cannot wrap my head around this issue because at the core I think regulating technology is a dead end
59:25
and it's a bad thing in the long run. However, if you are a pacifist, which I'm not sure that I am, this being May 8th, I'm thankful for things that happened on this day in Europe. But if you are a pacifist, of course you should be against unmanned vehicles in the sky shooting people down.
59:48
So what is your take on drones? Is that a technology we should regulate? Gosh. Well, I think there's a difference between saying technologies may or may not be built or computers must...
01:00:00
or must not contain these instruction sets or this software out of the box, or operating systems must include this or that, and saying governments and their militaries may not kill people in certain ways. And so, I mean, I'm not a military expert, and like you, I'm something of a pacifist. I grew up in the anti-nuclear proliferation movement.
01:00:22
I got thrown out of school when I was 12 years old for founding an anti-cruise-missile-testing group in Canada. So the question of what the legitimate uses of military hardware are is not in my DNA; it's not something I have a lot of direct experience with. Mostly I have direct experience with saying stop giving so much money to the goddamn military.
01:00:43
But I think that they are totally separate questions. The question of, like, should there be rules demanding that drones' operating systems come with rootkits that prohibit them from running anything that isn't signed by a certain agency? I think the answer is no, we shouldn't have those rules.
01:01:02
I think a separate question is, should you be allowed to kill people with drones, and under what circumstances? And the answer to that is, I'm not entirely sure. You know, I think probably not, but this is not an area of my expertise. All right, then. Thank you very much.
01:01:20
Thank you. Good to see you. Cory Doctorow, thank you very much.