Beyond The Camera Panopticon
Formal Metadata
Title: Beyond The Camera Panopticon
Part Number: 8
Number of Parts: 177
License: CC Attribution - ShareAlike 3.0 Germany: You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor and the work or content is shared also in adapted form only under the conditions of this license.
Identifiers: 10.5446/31859 (DOI)
Production Place: Berlin
Transcript: English (auto-generated)
00:01
Hello, everybody.
00:22
Just waiting for my presentation to come on. Very snazzy graphics throughout the whole day. It's lovely to be here in Berlin. Thank you all for showing up. I know they also asked 14 of my best friends to give a talk at the same time, so I appreciate your coming to this one. So we all know that government surveillance is bad, right?
00:45
It's got this scary, scary symbolism attached to it, so I don't have to go into detail about why it's bad. We already know that it's probably something we don't want, most of us, right? And if you didn't know, it's bad, so let's just leave it there. But what if surveillance didn't look like this?
01:03
What if it was hard to identify? What if maybe it looked cute or lovable, more like this? Then would we still be able to identify it as malware? Or what if it looked like nothing at all?
01:22
What if it was invisible? Now, a lot of us already subscribe to the mass psychosis of a magical white man who lives in a cloud that knows and sees everything about us, right? So would it be perhaps easier for us to accept
01:42
another somewhat less magical white man who lives on a cloud that knows and sees everything about us? Perhaps. So to explore this topic, let's start somewhere simpler. Let's start with a concept we're all familiar with, privacy.
02:01
So if I say to you, let's have a private conversation, you probably inherently understand what that means. It's just you and me talking, no one else, right? So let's take a look at what Facebook mean when they say private, right? You might think it's the same thing. It's just you and me and we're having a conversation.
02:22
Who uses instant messenger here? Facebook instant messenger, right? Yeah, private conversation, right? It's just you and me. Well, you'd be mistaken. Because for one thing, you're not having the conversation in a public space like a park. You're having it in someone's home.
02:41
It's a private space. And it's the home of a creepy uncle that pays the rent by knowing as much about you as he can because that's what he sells to his actual customers, the people who actually pay him. Because remember, you don't pay him, right?
03:00
So here's how a private conversation takes place on Facebook. First of all, you tell this stranger what you want him to relay to your friend. The stranger takes notes, and he never forgets. That's important. And then he relays your message to your friend.
03:21
So when Facebook says private, what they really mean is public. In the novel 1984, George Orwell had a name for this. He called it doublethink. So to understand what Facebook says when they say private,
03:42
we have to engage in Orwellian doublethink. Now, we already said we're pretty OK, a lot of us, with one magical white man in the sky who knows everything about us, right? So maybe we'll be OK with this, you know, slightly less magical white man who knows everything about us, right?
04:01
So let's take a look at Mark Zuckerberg. What kind of man is he? Maybe we can trust him. So in 2010, Mark Zuckerberg gave an interview with The New Yorker in which he admitted to having had an instant messenger conversation when he first set up Facebook, when he was at Harvard, when it was tiny.
04:20
Here's how the conversation went. Zuckerberg says to his friend, yeah, so if you ever need info about anyone at Harvard, just ask. I have over 4,000 emails, pictures, addresses, SMS messages, 4,000, wow! It was tiny, right?
04:40
And his friend goes, what? How did you manage that one? And Zuckerberg says, he says, people just submitted it. I don't know why. They trust me. And here's the best bit. Dumb fucks.
05:02
Isn't that a beautiful phrase? Dumb fucks, right? And of course, you might think, Aral, you know, this was when Facebook was young. And of course, the intervening years of astronomic fame and fortune have probably only made him more contemplative and nuanced
05:20
in his opinions towards us. Or, if you've been watching what's happening with Facebook's emotional contagion experiment that they published as an academic study recently, maybe not, right? Who's aware of this? Who's seen the, yeah, some of you? So basically what they did, for those of you who don't know,
05:41
they took a group of people, about 680,000 people, and for one group, they only showed negative posts from their friends, only bad things. And they wanted to see how that affected them. Did they get depressed? Did they kill themselves? Maybe, who knows? It's fun, right? And then for another group, they showed only positive posts.
06:02
And what did that do to them, right? And they published this as a study, and people read it and they went, what? Oh no, we can't believe you're doing this. We're very angry at you, Facebook. Why did you do this? And Facebook said, sorry. Sheryl Sandberg said, this was part of ongoing research companies
06:21
do to test different products. In other words, this is what we do, guys. This is our business, right? And that was what it was. We're going to keep doing it. It was poorly communicated. And for that communication, we apologize.
06:41
She might as well have ended that sentence with dumb fucks, right? And maybe we deserve it. Because even the tech press or the mainstream press who should know better about what they're writing about, they wrote articles like this. Should Facebook manipulate its users?
07:00
I don't know, should it? What a stupid question to ask. The dumbest question you could ask. The only reason you would ask this question is if you have absolutely no idea what business Facebook is in. It is their business to manipulate users, right? Your behavior. That's how they make money. It's their business to understand you
07:21
because that's what they sell to their customers. It is their business to spy on you, to get this insight. And that insight is what we talk about when we talk about data, right? Just like the mutant plant from outer space in the musical Little Shop of Horrors that starts out as a little sapling
07:42
and it needs drops of blood to grow and then ends up eating whole human beings, Facebook needs your data to grow. That's what it feeds on. That is Facebook's food, right? So that's not just Facebook, that's Twitter, that's Google,
08:00
that's nearly every mainstream Silicon Valley technology company's model today. And if it's so widespread, we have to call it something so we can talk about it. And in the old days, if you installed something like on a floppy disk, anyone remember floppy disks, and it didn't do exactly what you thought it was going to do,
08:23
maybe it did more, maybe it took some of your data and used it for something else, we identified it as malware and we called it spyware. But this was cloak and dagger, right? It was like government surveillance. It was so easy to identify as malware.
08:40
We're not talking about this type of spyware anymore. What we're talking about, ladies and gentlemen, is spyware 2.0. And spyware 2.0 is beautiful. It is cuddly, cuddly Google doodles. It's privacy dinosaurs on Facebook. It's all colors of the rainbow.
09:01
How could anything bad come from this, right? So this is what Shoshana Zuboff calls surveillance capitalism. So call it spyware, call it surveillance capitalism, call it corporate surveillance. Whatever we call it, there's an undeniable fact at the heart of it,
09:21
and that is that it is malware. So how does a company like Facebook, or let's take a look at Google for example, same business, right? How do they gather this information about us? Well, there are a plethora of ways. It starts with services. Remember that Google itself began as a sapling.
09:41
It was just one service. It was a search engine, right? But today, it couldn't be further from that. It is a plethora of services. So do you want a place to put your photos? Use Picasa. Will Google be running facial recognition algorithms to understand who you are,
10:01
who your friends are? Will they probably run a further analysis to see how you're looking at your friend? What does that mean? Of course, the more they can, the more they will. Do you want a place to put all your files? Then put them on Google Drive. Will Google look through all of your files to understand you better? Yes, of course, that's their business.
10:22
Remember, there aren't people doing this. There are algorithms doing it, but who writes the algorithms and who benefits from the algorithms? So take Gmail, for example, right? You might say, Aral, okay, I use Gmail. Who uses Gmail here? Just a show of hands. Don't be shy.
10:40
It's fine. I'm not going to do anything. Okay, so some of you use Gmail, right? It's great. It's convenient. It's awesome. We all love it. And you might say, Aral, gosh, you're so anal. It's okay. Don't worry about it, right? It's just a selfish decision I'm making. It only affects me, right? I'm willing to give up some of my privacy. You know, I have no idea how much I'm giving up, of course,
11:02
but I'm willing to give it up because I get a valuable service back, right? It's my selfish decision. So, Aral, just take a step back, right? Well, I beg to differ. It's not just your selfish decision, because remember, as much as you send people emails, people send you emails as well.
11:21
And Gmail and Google read their emails also. So the decision that you're making is not just that it's okay for Google to read what you write, but that it's okay for Google to read what anyone who wants to communicate with you writes. And if you have a custom domain attached to your Gmail account,
11:41
they may not even know that you have made that decision for them. So in this sense, it's not just a selfish decision, it's much more like second-hand smoke. It also affects the people around you. So how do they gather this information? These are services, there are games as well. Who's filled out a form and had a reCAPTCHA on it?
12:02
And you had to fill out a word, yeah? And maybe you had a photo of a street sign or another word, right? And the reason the other word is there is they've either scanned a book or they've taken a photo of a street sign and they can't read it. So they're saying, we're going to give you some utility, some use, right? By protecting your form, but you help us as well and let us read these street signs.
12:22
That sounds like a fair trade. And there are full games, though. There are actual games. Who's played Ingress here? Anyone played Ingress? Yeah, some of you. It's a free game, right? Free? We love free. It's a free game. Android game, of course. You download it and there's been an alien invasion. And what you have to do is you have to go from one landmark
12:43
to another landmark in the city as quickly as you can. And then you need to hack into these landmarks that you've gotten to and then stop this alien invasion. And what you're really doing, though, as you're playing this really cool game, is you're giving Google very hard to come by data on pedestrian walking patterns.
13:04
Even Google doesn't have the resources to send that many people into the street to find out how they will go from one place to another, and that's why they need you. They go, go on, go find us the best routes, and here's a free game to help you do it.
13:21
So you might say, Aral, OK, these services don't sound that great, but I can just stop using them, right? I beg to differ: if every other service that's an option has the same model, do you really have a choice? But let's say you can. And it's easy to stop using a service. Let's say that.
13:40
And Google's lost, right? But Google can't lose, right? So what's the next step? If services are somewhat easy to stop using, what's the next step? Well, what if I give you devices? What if I give you beautiful, gorgeous devices, like these Nexus phones over here?
14:00
They're gorgeous, aren't they? Google understands user experience now. They didn't always, but they do now, right? And these phones, I don't know if you guys heard, they're like half the price of an iPhone. That's such a steal. I wonder how they do it. Because I think, if I'm not mistaken, Tim Cook is a supply chain guy, right? Is he asleep at the wheel?
14:22
Does he not have the same economies of scale that Google has? Or are these perhaps subsidized, beautiful, gorgeous data entry devices? Because think about it, what do you do with these devices? The first thing you do is you sign into your device with your Google username and password.
14:42
So at that point, it doesn't matter what service you use, we will get some valuable data from you, because we've made the login for your device the Google login, right? But you might say, we just won't buy the devices, it's fine, dude. Problem solved, let's all go home.
15:01
And Google loses. But Google can't lose, right? So what's the next step? If you make money from people's data, what's your end game? Well, the end game, of course, is to connect everyone to the Internet, right? In the West, we do that through things like Google Fiber, right?
15:22
Look at that, it's beautiful. Create a Google account, your one Google account for everything, including signing into the Internet, right? Because what have we done there? Oh, if I'm connecting you to the Internet, then it doesn't matter what device you use. Go ahead, use an iPhone, I'm still going to get some valuable data from you, right?
15:42
And that's really all I need. Just don't lock me out, man. I love you, all right? Just don't lock me out. Just let me in, somehow. And this is for the West. But what about the next five billion? Have you guys heard of this term? In Silicon Valley, they're very excited about the next five billion. The next five billion are these poor people in poor parts of the world
16:05
who can't connect to the Internet, and not only can they not connect to the Internet, there's no one within that five billion that can do anything about it in their own way. So what do we have to do? We have to, as the white man, bring them the Internet
16:23
using balloons and satellites and drones, because they couldn't possibly do it themselves, right? Come on, right? Yeah, am I right? I'm right, right? And how else do we do it? We do it with zero rating services. We say, here, internet.org is here,
16:42
we're going to bring you the Internet, you poor people in parts of the world that we've never actually visited, right? We're going to bring you the Internet with terms and conditions attached. The white man always has terms and conditions attached. Right now, they're just a bit longer, nobody reads them. What are those terms and conditions? Well, it's not the full Internet.
17:01
It's Facebook plus a few other things, but you get that for free. Isn't that great? Free. I love this quote by Desmond Tutu. He says, When the missionaries came to Africa, they had the Bible and we had the land. They said, let us pray. We closed our eyes. When we opened them, we had the Bible and they had the land.
17:23
It's very similar to this. It is a form of colonialism, right? It is a digital imperialism. We are bringing you the fire. The fire happens to be the Internet. So, that's information about us. That's very valuable data, right?
17:41
But there's more information and data in the world. There is information about the world, right? And Google needs that as well, as do Facebook, et cetera. How do they gather that? Well, through satellites. They have satellite imaging of the earth, of course. They buy and gather mapping information from countries that they operate in, right?
18:03
And then there's also Google Street View. You guys are all familiar with Street View, right? There's a Street View car. Who's seen the Street View car? Yeah, some people. Who's made a funny face or something? Yeah, no one. OK, yeah, there's someone who's made a funny face. But there are places you can't go to with a car, but you really need the data.
18:20
So, there's the Google Street View trike for those instances. But if you can't get there with a trike, how about a snowmobile? Because you need the data, right? You're getting the idea that you need the data, right? But you can't go with a snowmobile indoors, you know, without causing a scene. So what about the Google Street View trolley to get the data from places that you need?
18:41
And if you can't get there with a trolley, people, that's all right. Because there's the Google Street View backpack, the trekker, you can take that with you and you can get the data. And I just saw this recently. I shit you not, this is real. This exists. That's a camel, if anyone can't see it.
19:01
So, even if Google were to show up with a camel or a backpack, or maybe because of it, there are probably two places you won't let Google into, right? One is your workplace, and the other is your home. Now, I say probably because I watched a comedy show,
19:24
I think it was German, actually, where people pretending to be Google Street View went to people's homes and they said, hi, Google Street View, we're here to take photographs of your home. And some people let them in. But I'm going to assume that no one in this audience would have let them in, right?
19:42
And that's why they need you. Do you see a pattern forming here at all, right? Who's heard of Google's latest phone project? Project Tango. So Project Tango, thank you, some people have, is a phone with a 3D depth-sensing camera
20:00
and a motion sensor and object detection. And what you do is you buy this phone and you take it home with you, and you walk around your home. And as you're walking around your home, it maps your home in three dimensions, and it starts recognizing what objects there are. Oh, yes, he has an Xbox, and oh, there's that magazine that we recognize from somewhere. OK, great, and here are the DVDs that he has
20:21
if he's very old-fashioned or chic. So basically, because they can't come into your home, they need you to do the work for them. And of course, all of this information is sent to Google, right? It's the same thing. There's a pattern happening here. And it's not just the data.
20:40
The data that we gather is just atoms. What's really interesting is when we start combining that data together, because that's when we start getting a profile of you, when we start building your digital self, when we start building a simulation of you.
21:00
And that is really valuable, because think about it. I can't take you, and I can't lock you up in a room somewhere in my basement and perform experiments on you all the time, because we have laws against kidnapping, right? Our corporeal selves, our physical selves, are protected by a body of laws that we've built over the years.
21:23
We didn't always have those, right? But what if I have so much information about you that I can create a simulation of you? Well, I can take that simulation, and I can put that simulation in my lab, and I can study it, and I can psychoanalyze it, and I can prod it to see how it reacts
21:42
24 hours a day, and there are really no laws to speak of that protect our simulations. And this is what needs to change. For this to change, our understanding of our relationship with technology needs to change. We labor under this understanding that technology is a butler.
22:01
So there we are, right? And there's our phone, and we say to our phone, do something, and the phone says, sure, I'll do it. It's the butler, right? Steve Krug, Don't Make Me Think, beautiful little book about design. This is the analogy that we use. We need to move beyond this. We need to move beyond this to understanding that, for example,
22:21
when I use the notepad, the simplest thing on my phone, and I write down a thought, what I'm really doing is extending myself using this technology. And that's really, really important, because if this is an extension of myself,
22:41
we can start to change where we draw the boundaries of the self. And if we draw the boundaries of the self to include the technology, then when we have things like surveillance, it's not merely signals capture while there's a communication happening between two parties. It is a violation of the self,
23:01
and that's a very different place to be. I think we need to start thinking about it along these lines, because we are, in effect, becoming cyborgs. Not that we implant ourselves with technologies; we don't have to. We extend ourselves using technologies. So maybe it's time that we started to extend person rights
23:22
to the technologies that extend our persons, so that we, as individuals, can start to ask for individual ownership and control over what is essentially ourselves. Because we call data exhaust,
23:41
we call data the new oil. What a horrible way of thinking about it, right? We call it the new fossils. I don't know, I've heard so many things. But really, what data is, is people. When we think of server farms, have you ever thought what we're actually farming?
24:01
What is in those farms, if not people? You are what we're farming. Google, Facebook, and these companies are factory farms for human beings. You are what we sell. You are our product.
24:21
Our industry is in the business of selling people. In the past, and in some places in the world, unfortunately, still today, there was a very lucrative business of selling people's bodies. We used to call it slavery.
24:40
We don't do it anymore in a lot of places, and we don't like to talk about it in polite conversation. But maybe the time has come for us to ask ourselves the very uncomfortable question of what we call the business of selling everything else about a person that makes them who they are apart from their body.
25:03
What do we call that? Because that's the business that most of us are in, especially in Silicon Valley. And what does it mean if we combine data about everyone in the world with data about the world in real time continuously?
25:21
What happens when we combine that together? We get what I would call a camera panopticon, an all-seeing camera. In the past, when we hear stories about how we went with photographic cameras to native people in different parts of the world, and some of them thought that it would steal their souls,
25:42
and we laughed. What do we think about it if we have a camera that can actually simulate you? It becomes a little less funny. And what happens if you own a camera panopticon? Well, maybe you start thinking you know what true is.
26:01
Google recently announced that they're looking into switching from using PageRank to using truth as the rank for web pages, which of course begs the question: true according to whom? So if I were to create a website,
26:21
and on it, it said Google is malware, I have a slight suspicion that this is not going to rank very high in Google's truth algorithm. Just a hunch. Because it's truth according to Google. And I've been going on about Google a lot, so I couldn't help but notice that they have a stand on site.
26:41
I believe they're one of the sponsors. So I approached and I said hi, because they're really lovely people, and of course they are. People who work at these places are mostly very lovely people. But I couldn't help but notice their wall. Did you guys see this wall? I thought this would be a great chance to play some bullshit bingo.
27:01
You guys want to play bullshit bingo? So let's start with the most apparent bit. We have this beautiful screen here, and it says, Encryption. Unencrypted communications can be surveilled. Encryption is the best way we know how to protect people's communications. Wow, that's great. That's true. That's really helpful encryption.
27:20
So let's ask Google the question. What would happen, Google, if you end-to-end encrypted all of your services? And the only answer they can give you is, well, Aral, we would go bankrupt tomorrow. So if your business model is fundamentally incompatible with it, then maybe you shouldn't have it up on a screen there. So here, I'm going to call bullshit on that one.
27:43
And I'm so glad. Thank you for letting me use that transition for the first time in Keynote. Let's keep reading. Keeping your information safe, secure, and always available when you want it are among our highest priorities at Google. We work continuously to ensure strong security. You know what, guys? That's not bullshit. That's true.
28:03
Google has great security. They have probably the best people working for them. But let's not confuse security and privacy. Remember that the mafia is very secure. They keep you secure from everyone except the mafia. So Google is very secure. Once they have your data, they will keep it secure from everyone else.
28:23
Just not from themselves. So the next bit where they say, protect your privacy, now that might be true in some alternate universe, but in this one, that is absolutely, unadulteratedly false and bullshit. So we'll have a bullshit there.
28:42
And what's the last bit? Make Google even more useful and efficient for you. Now, this is true. Making Google even more useful and efficient is very important for them. But it becomes false when they add for you. When Google or Facebook improve their products, that's like the massage we give to Kobe beef, right?
29:03
It's not for the benefit of the cow. It's so that we have a better product. Thank you. So I'm going to call bullshit on that one as well. One out of four, not bad, guys. Thank you. Thank you for that stand.
29:21
So you might say, Aral, OK. But at least these companies don't share what they know with the government, right? Well, of course, we know that that also is bullshit, right? So after 9/11, in the United States of America, they founded the Information Awareness Office with the publicly stated aim of gathering all information about everything and everyone.
29:43
And this was their actual logo. You know what? If that's your aim, if that's your goal, like publicly stated goal and you're a government agency, don't make this your logo. Not with an all-seeing eye and a pyramid shooting laser beams at the world. People get scared. And so people got scared.
30:02
And they said, oh, actually, sorry, we were kidding. We're going to stop doing all of those things now, sorry. That was just like a prank. But of course, as Edward Snowden's revelations have shown, the NSA really needs to hire a PowerPoint designer. But apart from that, they did not stop the program. All of these companies that we trusted with all of our private information
30:22
were sharing that information with the government. Why? Why were they doing this? For two reasons. Because it was convenient and cheap. Because the information that you voluntarily provide to a third party is not under the same protections under the law. And it's in one place.
30:41
The NSA or GCHQ or BND or whatever has to go to one place and say, hey, give me all the data. It's like a drive-thru, like a fast food drive-thru for data, for people, right? It's convenient and it's cheap. It's cents per person. Now, if we can take that data and distribute it back to all of you
31:00
or maybe even store it there, that's going to be more expensive. They're going to have to go out and find you and have to surveil you directly. That's expensive, right? And they're going to have to target you. And very few people have a problem with a spy agency doing targeted surveillance. Because that's what they do, right? It's when they start spying on all of us that we have a problem.
31:22
It's like Bruce Schneier, the cryptographer and security specialist says, the NSA didn't wake up and say, let's just spy on everybody. They looked up and said, wow, corporations are already doing our job, right? They're spying on everybody. Let's just get a copy because it's easy and cheap to get a copy, right? So if this is the case, I'm sure that the people who run these corporations
31:42
who are at the head of them really take this seriously. So let's hear what Eric Schmidt has to say. He's the ex-CEO and current executive chairman of Google. He says, there's been surveillance for years. I'm not going to pass judgment on that. It's the nature of our society. So, Eric, what should we do?
32:01
If you have something that you don't want anyone to know, maybe you shouldn't be doing it in the first place. If you have something you don't want anyone to know, maybe you shouldn't be doing it in the first place.
32:24
Now, of course, when people published photos of Eric's house, this wasn't true then, right? So it must be true in some other instances. This wasn't true when Mark Zuckerberg signed non-disclosure agreements with the people working on his houses
32:41
so they couldn't even say that they were working at Mark Zuckerberg's houses, right? Was that something to hide? Is it true when we're in the toilet? Do we have something to hide? Or is it maybe just something we don't want to share? Of course, this is patently false, because privacy is not about whether you have something to hide or not. It's about having the choice to determine what you share
33:03
and what you keep to yourself. It's a fundamental human right that we've seen fit to enshrine in the Universal Declaration of Human Rights, in Article 12, to be precise. So when companies like Microsoft and Google say they're leading coalitions demanding an end to government surveillance,
33:25
when they form bodies like Reform Government Surveillance and they say, Yeah, the government's terrible, man. The government's terrible. Look at them. They're horrible. Or when they sponsor conferences like RightsCon, right, to say we care about human rights and your privacy, et cetera, right?
33:42
There is a technical term for this. It's called bullshit, because that is fundamentally their own business model. It's a very old trick. It's a very old trick in magic.
34:00
We call it misdirection. Don't look at what this hand is doing so I can do something with this other hand, right? Look at how horrible the government is. Forget the fact that that's our business, right? Privacy is fundamentally incompatible with the business model of these companies. All they can compete on is creating the illusion of privacy,
34:22
and they spend a lot of money and a lot of effort doing just that, whether it's lobbying in D.C., lobbying the European Council and the European Parliament, lobbying at the state level, or creating stories for you. Google actually has people they employ who are called storytellers to create that illusion.
34:41
But the best quote that sums this up for me comes from a product manager at Facebook, Michael Novak. He says, Now we're thinking of privacy as a set of experiences that help people feel comfortable. Comfortable with what? With the fact that you don't have any privacy on these platforms.
35:03
So this business of free is a lie. But it's more than a lie, because it involves a con. We have to con you. We have to lie to you. When you hear that people don't care about their privacy, they do. They're being actively lied to. When they find out exactly what's being done,
35:21
they are not happy about it. And that's what research paper after research paper that's coming out now is showing us. So what's the con? Well, it starts at the very beginning, when we have a startup. Mark's young. He's enthusiastic, right? He creates a startup, but he needs money. So he goes to a venture capitalist. That's not an illustration. That's actually a photo.
35:40
They all look like that. And he says to the venture capitalist, hey, in four years' time, I'm going to have a hundred million users if we're successful. How much money will you give me for those hundred million users today for when we sell them in four years' time? And the venture capitalist says,
36:00
well, I'll give you five million for those today, because the venture capitalist knows that if Mark is successful, he will get 50 million, right? This is the point where if they agree, the sale has happened. With venture capital, the sale happens at the beginning. It just takes a few years to mature and to close. You've already sold the users that you don't even have.
36:23
Now, here's where the hard part comes in. You've just sold all your future users. But now you need to convince them that you're never going to do that and that they should help you build value into your platform, because at this point, we have the power. We could kill Facebook right now by not joining and not using it, right?
36:43
By the time enough of us have joined and we're getting more value from the network than we're giving to it, that's where the network effects kick in, and we lose power at that point. But at this point, Mark is very persuasive. So he says, come on, come on, let's all build Facebook together. It'll be great. It'll be like this one big party, man.
37:02
And we go, sure, Mark, let's do it. And Mark is successful, whereas the nine other people that that venture capitalist invested in will fail, and that's why Mark has to be a billion-dollar success, right? Mark is successful, but that's the con. And so we all go, and we help build Facebook together, right?
37:20
And then the venture capitalist comes back after a few years and says, OK, it's time to go to an initial public offering to sell to the public or to sell to another company with this business model. And the sale is complete. It started from day zero, right? The sale is now complete. So when we talk about startups, the startups are the long tail of malware. A startup is not just any new company.
37:42
It's a very specific type of new company that takes venture capital and sells its users and loses control from day zero. So if you build a company that doesn't do that, don't call yourself a startup because you're not doing yourself any favors. You're better than a startup if that's not your business model. So in Europe especially, we're trying to be all-American and trying to be all-startup.
38:03
Let's not. We can be better than that. Because what we're really doing is we are throwing away the potential of these technologies. I met this guy who runs Spritz. Who's heard of Spritz? Anyone? It's a new way of reading. So watch the screen. And if you can read what it says, you'll be reading one word at a time at 150 words per minute.
38:23
It's probably the first new way of reading since reading was invented. That's pretty big, right? And when I was talking to their CEO at a conference, he said, Yeah, you know what we found? People with dyslexia, this is really good for them. They can read better with this. And I thought, that's really great. Wow, people with dyslexia, it helps them. That's really great.
38:41
And I asked a question I always ask. It's a really lovely question you should ask people when they come to you with an idea. What's your business model? And he looked at me like I was stupid. He said, we know what you're reading. Because it's a software development kit that other developers take and they make products with it. And at the conference, he told the audience, We really hope you make email applications with this. We really want to see more email apps so they can read your emails.
39:03
You're following along here, right? But what are we actually saying to someone with dyslexia? We're saying we've just invented a new way of reading. And the cost is having us read everything that you read, or you can keep reading badly. I think humanity deserves better than this.
39:22
I think we can do better than this. I think we can do better than tying our future to this one toxic myopic business model that is running the show right now. Because we're shackling our infrastructure to a business that sells people. And we have to understand the ramifications of that going forward.
39:40
And if we take all of these startups, and if we take all of the monopolies of Google, Facebook, et cetera, and we add them together, what do we get? We get a monopoly of a business model. And that's what we have. This business model of selling people. Now, that's today, right? But what about tomorrow?
40:00
There are things that we're excited about right now, right? The Internet of Things, have you heard about that concept? All the things are going to be connected to the Internet? There's no Internet of Things. There's only the Internet of Data, right? This started with, I don't know if you remember, Web 2.0? It was a great time to be alive, wasn't it? We believed this man, Tim O'Reilly, when he said, look, open APIs are beautiful, right?
40:24
Build Twitter applications, build Facebook applications, build value into these beautiful closed silos. And by doing that, we will build the open web together. That turned out well, right? Because where free is the lie told to regular people, open API is the lie told to developers.
40:43
And we fell for it. Hook, line, and sinker. I did, I made a Twitter app. I'm not proud of it. Because an API key, when you think about it, is a key to a lock that you do not own and which can be changed at any time at the whim of the person or company that does own it.
41:02
There's nothing open about it, right? Our whole Web 2.0 thing was a con. Open APIs were a con. Open itself as a word has lost its meaning. I don't understand what you mean when you say open to me. Because it can have two very separate meanings. It can mean open as in an open field.
41:22
There's not even the suggestion of a fence. It is a public space, right? I like that definition of open. But the definition that we normally see is that of an open gate. And the purpose of a gate, as much as it is to be open, is to be closed or locked.
41:42
And it can be closed and it can be locked. And more importantly, it implies that there is a fence to begin with. That there is private property that we are being allowed selective access to. And that right can be taken away from us. So open, I don't know what that word means when you use it. And I've seen it abused so many times that it has no meaning.
42:02
Maybe we should reclaim it, I don't know. But there is one phrase where when you use it, there is no suggestion in anyone's mind of anything else, right? And that is the commons. When you tell me that something benefits the commons, I know exactly what you're talking about. So maybe let's start using that instead.
42:23
So Web 1.0, we gathered data directly. It was the old school way. We didn't know what we were doing, right? Web 2.0, we got developers and lots of other people to gather data for us and build our silos. Web 3.0, we call it Internet of Things. Now we're putting those data gathering devices inside of your own homes
42:43
and we're getting developers to do the same thing, right? So this has gone from gathering impersonal data to very, very personal data. So the Internet of Things, unless we as individuals have ownership and control, is the Internet of Things that spy on us, right?
43:03
Like Nest, okay? Who's heard of Nest? Yeah, Nest is great. It's a thermostat, smart thermostat, knows when you're home, knows when you're not home, right? And last year, Google bought it in January for 3.2 billion. And they said, look, guys, we know you might be scared about this. Don't worry.
43:20
The customer data will only be used by Google Nest, not by Google. So they paid 3.2 billion because they liked how shiny they were, right? That's why they paid that money. And then Google Nest buys DropCam, a camera you put in your home that watches everything you do. But the CEO of Nest said, like Nest customer data, don't worry.
43:43
DropCam's data will not be shared with Google. Don't worry about it, right? That was on, what is it, there we go, June 20th, 2014, right? Just four days later,
44:01
Nest to share user information with Google for the first time. That headline might as well finish with dumb fucks, right? Because what were we doing? Thank you.
44:20
Because what are we doing? We're thinking, hey, I like this. I like this pot they gave us because the water is really nice. It's like a jacuzzi. This is a really cool invention. Not aware that the fire continues to burn underneath and pretty soon it's going to be too hot for us to jump out of it, right? So the internet of things, if we don't have ownership and control,
44:42
are the internet of things that spy on us in our own homes. What kind of things? Like the Samsung TV that listens to everything that you say. And when people said, we don't like that, they said, well, just turn it off. It's like, okay, but if we turn off that feature, it's not a smart TV anymore, right? Or how about a Barbie doll that you give to your child and it listens to everything that your child tells it, right?
45:03
And that's looked at by people at Barbie. Isn't that kind of cool? And who's heard of wearables? Who has a Fitbit or something like that on them? You know, the quantified self? Yeah, we love the quantified self. Again, unless we have ownership and control, what we're talking about is the surveilled self.
45:24
And we're going further than wearing things. I saw a company when I took part in a Nesta-sponsored event in the UK that makes a pill that you swallow that broadcasts data about your insides, right? And the reason they do it is for compliance, for medicine-taking,
45:41
because people don't take their medicines properly. And so at this event, I raised my hand and I asked my favorite question. I was like, so what's your business model? And they were a Silicon Valley company and they were like, we can't get into the national health system. You guys are so backwards here in Europe. You ever heard that? Yeah, and I was like, yeah, there's a good reason you can't get into the national health system and I hope you don't. Because when I asked him, what's your business model?
46:01
He said, well, we don't know, which means it's selling your data, right? And then he followed it up because he thought it was like a lovely kind of, you know, finger and mouth kind of friendly meeting of business people. He said, but we're pretty sure it's about the data. Yeah, right?
46:21
So these people are going to be selling data from inside you. And already, I mean, health is the next big thing, right? And already we see how these health sites are behaving. Over 90 percent of 80,000 health-related web pages exposed user information to third parties. One company, MedBase 200, reportedly used proprietary models to generate
46:40
and sell marketing lists of rape victims, domestic abuse victims, and patients with hundreds of different illnesses. But you might be saying, Aral, hey, with your American accent, I'm not American. My parents are Turkish, grew up in Malaysia. I'm French, live in Brighton. I like to think of myself as human from Earth. But you might say, surely Europe is different, right?
47:03
Well, unfortunately, no. Remember that digital imperialism thing that we had? This is an actual website called EU Startups. I couldn't help but notice how they were trying to get you to join their job board. That's a very interesting symbol there, right? So why is it not different?
47:23
Because we're in the same sort of rah-rah business headspace, right? We have people like Neelie Kroes who was at the European Commission and was basically a cheerleader for start-ups and for American imperialism in technology. Now she's doing the same thing for the Netherlands government. We have people at the Commission today, like Oettinger,
47:44
who said that net neutrality was a Taliban-like development. I mean, seriously, these are the people who are leading the show right now. There are people like Janice Richardson, whom I met recently. She runs InSafe and European SchoolNet.
48:02
Now these are both European Union funded initiatives and their mandate is to protect children. We were recently on a show together and I couldn't believe my ears. This is what she said. If we were to use Facebook at school or any other social network at school as a learning tool, young people would really learn how to use them
48:25
in a much better way, a much more meaningful way and wouldn't waste so much time on them. So I think that there are very valuable tools online. All right, thank you. Thank you for turning it down. That was, wow, nothing happened there. That's awesome.
48:41
Thanks, tech people. So she said, we should be using Facebook in schools. And my response was, well, why don't we get McDonald's to teach nutrition classes? If we did that, we would be normalizing surveillance for a whole new generation of people, right?
49:01
So I am desperately trying to get to my next slide here, but it is not letting me do that. Give me one second. Technology, right? It's like when it works. Nah, I don't think Keynote likes me right now. So we're going to do this horrible thing. You're going to pretend nothing's happening right now?
49:22
There we go. All right, awesome. Thank you. The tech folks are so lovely, aren't they? Okay, so we call these things public-private partnerships, right? We call them multi-stakeholderism. We call it co-regulation. But what it really is, is institutional corruption.
49:43
It is the influence of corporate finance in public decision-making. And we're seeing these things get worse and worse with trade agreements, secret trade agreements, like TTIP. These are agreements that are being discussed in secret, which have clauses that are investor-state dispute settlement clauses,
50:01
which mean that investors and companies can sue governments when they take decisions that are against the interests of the companies, but in the interests of their citizens. So this is a really, really big problem that we have, and it's not getting any better. In fact, I would go so far as to say that this is a war.
50:22
It's a war on the public sphere. It's a war on the commons. It's a war on human rights and individual freedoms. And it's a war on democracy. We keep hearing about other wars. I think these are the ones that actually matter. In fact, we could probably say that we no longer live in a democracy.
50:42
We live in a corporatocracy. And technology itself is not the problem. We're very quick to blame technology. But as Melvin Kranzberg says, technology is neither good nor bad, nor is it neutral. Technology, as I see it, is a multiplier. It's an amplifier. It amplifies whatever you feed it.
51:02
You feed it bullshit, you get more bullshit back. You feed it meaning, you will get meaning back. So unless we change the nature of our technology, throwing more technology at the problem is not going to fix it. We need to change its nature. And the real problem that we're actually battling here, and you heard it in some of the other talks as well,
51:22
including the one right before this one about diversity, is systemic inequality. Systemic inequality is probably the greatest challenge our species has. It has the potential of making us extinct. And if you're wondering what it is, it's this. It's summarized by this one statistic from Oxfam.
51:41
The 85 richest people in the world have as much wealth as the poorest half of the world's population. So 85 people, that's what 85 looks like. 85 people can fit in one double-decker bus in London, right?
52:00
And they have the same wealth as 3.5 billion people, the poorest half of the world's population. Now try to imagine 3.5 billion as a number. You can't. It's so far past Dunbar's number that we can't even visualize it. That is the problem. So if I left it here, though, guys, you'd be like,
52:20
Aral, you have depressed the shit out of us. Thank you very much. Right? I don't know why you're clapping me, depressing the shit out of you, but what is the solution? I heard Andrew Keen recently say, the solution is stronger regulation, it's government. I'd like to take that a bit further and say it differently.
52:41
I think the solution is stronger democracy. It's strengthening our democratic institutions. If our species is to survive the era of the human, where we have the technological means to destroy our species, we have got to strengthen our democratic institutions. So it's happening, right? Iceland, the Pirate Party is the most successful,
53:03
most popular party in Iceland right now. How cool is that? That gives me hope, right? We need to get rid of institutional corruption to begin with. Unless we can remove corporate finance from public policy decision making, we're not going to be able to tackle any of the other problems.
53:21
So this does mean that we need to move beyond capitalism. Capitalism is not giving us free trade, it's not giving us competition, it's giving us monopolies. Maybe we move to something like a universal living wage. There are other options. We move to something that is sustainable. But this is not going to cut the mustard anymore.
53:41
And technology can be part of the solution if we approach the design of it properly. When I say design, I don't mean aesthetics, I don't mean art. That's not what I'm talking about. I'm talking about holistic design. This is how Silicon Valley does design. They design great experiences for now at the cost of your human rights
54:01
and your freedoms in the future, right? This is where we are in design thinking today. We've taken Maslow's hierarchy of needs and we've mapped it so that we can create pleasurable products at the best of times. But this is really decoration. Design alters the status quo. Decoration simply makes it more palatable, right?
54:22
And sometimes we go beyond decoration to things that are really quite evil. This is Ryanair. This is when you're trying to book a flight. And over there it says, please select your country of residence, right? And you're like, oh, nice, let me select my... No, no, they're asking you if you want to buy insurance. And the answer, "No", is between Latvia and Lithuania,
54:41
where "No" normally lives, right? This is dark design. That's what we're practicing today, right? And we love it. We make these people heroes. Nir Eyal wrote a book called Hooked. The subtitle is "How to Build Habit-Forming Products". There are only two professions in the world
55:01
that call the people who use their products users. One is drug dealers and the other is us, right? And if you look at what they're saying, how to get people addicted, this is evil. Or, in the words of the website, the book everyone in Silicon Valley is talking about. Because design without ethics is manipulation.
55:23
The alternative today is free and open source. But what do we do in free and open source? We say we're building products that protect your freedoms and democracy. Yeah, but the cost, oh, well, yeah, it's a shitty experience, but you can send pull requests, right? That's great. We can't do this either going forward. We're doing it because we are enthusiasts
55:43
creating products for other enthusiasts, right? You could have a car that you drive to work every day; you want it to work really well, and you get pissed off when it breaks down. Or you could have a classic car that you love to work on. When it breaks down, you love it, because it means you spend the weekend working on it. We're that group building solutions for ourselves
56:00
and wondering why people don't drive it to work every day. That's why. We need to move beyond this. This is trickle-down technology. We think we can build products for ourselves, for other enthusiasts, and that they will magically trickle down into things that regular people want to use. No. Trickle-down technology, just like trickle-down economics, doesn't work. That's why we've been giving people personal computers for 30 years
56:24
when all they really wanted were iPhones, right? And what we don't get in free and open source is experience. Apple, Google, they get experience. They get holistic experience because hardware, software, services, connectivity, that's the experience. And we have to compete on this ground.
56:42
One of these companies gets it, right? We need to start implementing ethical design, where we design beautiful things, convenient things for the here and now that don't then violate your human rights and that don't erode democracy in the future. And the cost? Well, we're going to have to charge for it,
57:00
or we're going to have to get our national institutions or supranational institutions to pay for it, right? And here's how it works. We start by taking human rights and respect for human rights at the bottom of our pyramid. That's what we're missing. Then we build things that are functional. Then we build things that are delightful. And it's all about respecting the human.
57:22
It's the three R's of ethical design. And here's where we're failing today, because we build things that don't respect human rights. And if we do that, we're not building products for people. We're building products on the backs of people. So what we need to do is we need to move beyond the clouds, right?
57:43
It's time to move beyond the web. We think the web is the solution. The web is the problem. Because you might say, hey, the web was decentralized originally, we just need to re-decentralize it, right? No, because decentralized doesn't mean there are no centers. It means there are many centers. And if there are economies of scale,
58:00
those centers will grow and they will coalesce and they will become the monopolies that we know today. We need to move beyond decentralized as an architecture. We need to build a future that is distributed, or we won't have much of a future to begin with, or to talk about, right? Both in our society and in technology. And this is what a distributed topology looks like.
58:21
Every node is equal and connected to every other node. It's not hard. Where Facebook says that to share your photo with your friend, you must share it with us, we say no. If you want to share your photo with your friend, share your photo with your friend directly. This is independent technology. It's ethically designed technology.
58:41
It's technology that is independently funded. We can't take venture capital or we lose control. It has to be sustainable. It needs to be designed for the whole term, not just the short or the long. It needs to have a distributed topology, and it needs to be free, as in liberty, not as in cost, and made by teams designing for themselves, not practicing a colonialistic kind of design.
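To make that topology point concrete, here is a minimal illustrative sketch, not something shown in the talk: hypothetical HubNode and PeerNode classes, where one share of a photo is routed through a central platform and another goes node to node. The only thing the sketch demonstrates is who ends up holding a copy of the photo in each case.

```python
# Illustrative sketch only; class and method names are hypothetical.

class HubNode:
    """The centre in a centralized topology: every share passes through it."""
    def __init__(self, name):
        self.name = name
        self.seen = []  # everything the centre has observed in transit

    def relay(self, photo, sender, recipient):
        self.seen.append((sender.name, recipient.name, photo))  # the centre keeps a copy
        recipient.inbox.append((sender.name, photo))


class PeerNode:
    """A node in a distributed topology: equal to every other node."""
    def __init__(self, name):
        self.name = name
        self.inbox = []

    def share_via_hub(self, photo, recipient, hub):
        # "To share with your friend, you must share it with us."
        hub.relay(photo, self, recipient)

    def share_directly(self, photo, recipient):
        # Node to node, with no centre in the middle.
        recipient.inbox.append((self.name, photo))


if __name__ == "__main__":
    hub = HubNode("platform")
    alice, bob = PeerNode("alice"), PeerNode("bob")

    alice.share_via_hub("holiday.jpg", bob, hub)
    alice.share_directly("holiday.jpg", bob)

    print(bob.inbox)  # Bob receives the photo both times...
    print(hub.seen)   # ...but only the hub-routed share left a copy with the platform.
```

In both cases the friend gets the photo; the difference is purely topological, which is the distinction being drawn between a centralized service and a distributed one.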
59:01
And this is where diversity is so important. You heard about diversity in the last session. It's so important that we build diverse teams, because you can't compete with a great design team designing for itself, right? And I can't make myself more diverse, but I can make my team more diverse, and then, by designing for ourselves,
59:20
we're designing for a diverse audience. That's why diversity matters. Do not support diversity because you think you're doing anyone a favor; you're not doing anyone any favors apart from yourself, right? It is in all of our interests to support diversity. There's a manifesto for what we do that you can read more about. But basically, we have to say no to a lot of things: to venture capital, to the Silicon Valley model,
59:42
to colonialism as a means of design, to privilege, to inequality, because that is the greatest problem, and fundamentally, to bullshit. Because here's what we've done, people. We have taken a bullshit seed, and we have planted it. And we have got
01:00:00
a bullshit tree. And then we climbed into the branches, and now we're wondering why we're eating bullshit fruit. Because it's the only type of fruit you can get from a bullshit tree, OK? So here's what we need to do, all right? What we're trying to do today, the companies that have created this system,
01:00:22
they're decorating the tree, and they're saying, hey, isn't it nice? No, the fruit still tastes like bullshit, right, even if we color it and we decorate it, right? And some of us see the problem, and we say, maybe if we prune the branches. No, you don't change the nature of the tree by pruning the branches. So here's what we do. We get down from the tree, we plant another seed, right?
01:00:42
We plant a seed that has reason and human rights and democracy and diversity at its core, and when that grows into a tree, we build a bridge, and we allow people to conveniently join us on our tree. That bridge will be the bridge between a centralized world
01:01:02
and a distributed world that we need to live in. And our tree will be beautiful, we won't have to decorate it, because it will be diverse from its core, and the fruit we get will be diverse as well. And if we can do this, we can build a future where we start people off in their own homes, in a safe space, with no creepy uncles in the middle.
01:01:24
We can plant a new tree together. We can build a new future together. A future where reason and human rights and democracy are not merely fleeting footnotes in our history, but the defining characteristic of the human species.
01:01:45
A future that is beyond the clouds. A future without mass surveillance. A future that is sustainable and diverse. This is the future that I want to live in. This is the future in which humanity can fulfill its potential
01:02:04
and take its place among the stars. And we can build this future, starting here, today, together. Thank you.
01:02:22
Wow. Thank you. Thank you. Thank you so much, Aral. You're welcome. Thank you. You're very kind. Thank you.
01:02:43
Thank you very much. Thank you. You're very, very kind. I know that I've taken up a lot of time, so I don't think we have time for questions. Yes. But I'm here all three days. If you'd like to talk to me, please come up. And please, a round of applause for everyone
01:03:02
who's made this possible as well. And thank you so much for having me. Thank you all. Thank you, Aral.