How Democracy Survives the Internet: The lessons that Wikipedia can teach.
Formal Metadata

Title: How Democracy Survives the Internet: The lessons that Wikipedia can teach
Number of Parts: 19
License: CC Attribution 4.0 International: You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor.
Identifiers: 10.5446/51162 (DOI)
Transcript: English (auto-generated)
00:07
So, I'm going to take it that we have some common ground here. I'm going to take it that we have common ground on the idea that democracy is on shaky ground.
00:23
Not just because of the white male crazies in the Anglo-democracy world, but in the 200-plus years of this great experiment of democracy, what's striking is that we're still not certain
00:43
that it delivers what it promises it will deliver, that there's a deep uncertainty about its place in our future. Now, it's had a pretty good run. After many, many years of basically no democracies,
01:01
the last 70 years saw an explosion of democracy around the world till we come to a place where basically only the rogues deny the idea of rule by the people. But in fact, rule by the people is not so much the way rulers work.
01:27
Everywhere there is the view that society is divided between an elite and the people; everywhere the view of the people is that democracy represents the elite; and everywhere the view of the elite is:
01:44
How can the people be so stupid? Now, we here, I think we should call ourselves technologists, not in the sense of being coders necessarily, but technologists in the sense that we think and engage about the question of technology
02:05
and recognize how technology has affected democracy. We need to recognize the way that it's affected democracy and ask the question how we can build democracy better given the technology,
02:23
or how can we change technology given the ideals of democracy. Now, whenever I say something like that, especially in Germany, I always get slammed, so I have to offer this caveat so that I don't get slammed,
02:43
probably get slammed anyway, but when we think about technology like this, we don't have to think about it in a deterministic way. Nothing I'm saying means that it's determined that one technology produces what we're talking about. Here there are obviously a million possibilities,
03:00
but we need to recognize the way some paths are made more likely than others given a technology so that we can recognize that more likely path and respond, even if it is not determined that this likely path will occur.
03:24
So if we take that as the objective and that as the caveat and we say we have this common ground as technologists, I think we should recognize as technologists that technology has in an important way rendered democracy vulnerable.
03:44
How and why I want to talk about today, but I talk about it to lead to this question of whether we're going to accept the consequences of this technology or reject the consequences. And then ask how could we change or how could we accommodate the technology
04:03
to reject those vulnerabilities that technology has produced. Okay, some background. This idea of the people is central to what I want to say, but I want to insist that the people were born in 1936.
04:23
The people were born in 1936. What I mean by that, and I've said it twice, so I must mean it to be important, I do, I mean it to be important. What I mean by that is our ability to see the people in a scientific, empirical,
04:42
and reliable way gets born in 1936. Of course, since time immemorial, people have been talking about what the people believe. James Bryce wrote an extraordinary book in 1888 called The American Commonwealth where he talked about how we can understand what the people believe even though we had no way to actually measure what the people believe
05:05
in a reliable scientific way. He fantasized about a, quote, final stage in the evolution of democracy if the will of the majority of citizens were to become ascertainable at all times. This was the end where we could push a button and know what the people believe,
05:23
but he was far from that end, he thought. Turned out he was just 50 years from that end. Because 50 years after he wrote that, in 1936 George Gallup demonstrated that technology in an absolutely convincing way,
05:40
he delivered us to this final stage of democracy, an empirical way to see what the people believe through a technology of random representative polls. The context for this was a contest in the American political fight for president between Franklin Delano Roosevelt and a man named Alf Landon.
06:05
The then-current technology for measuring public opinion was a kind of straw poll where people sent in their ballots. More than three million ballots had been collected, and those ballots had predicted that Alf Landon was going to beat Franklin Roosevelt by 20 points.
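The contrast at the heart of this story, a huge but biased straw poll against a small but random sample, can be sketched in a toy simulation (all numbers invented for illustration; this is not Gallup's actual method):

```python
import random

random.seed(1936)

# Hypothetical electorate of one million voters; roughly 61% back
# Roosevelt (close to his actual 1936 popular-vote share).
electorate = [1] * 610_000 + [0] * 390_000   # 1 = Roosevelt voter

# Gallup-style poll: small, but a genuinely random sample.
gallup = random.sample(electorate, 5_000)
print(sum(gallup) / len(gallup))             # close to 0.61

# Straw-poll style: enormous, but drawn only from a skewed subgroup
# (say, the wealthier minority of voters, where Roosevelt polls at
# just 45%). Sheer size cannot fix the bias.
skewed_pool = [1] * 180_000 + [0] * 220_000  # 45% of 400,000
straw_poll = random.sample(skewed_pool, 300_000)
print(sum(straw_poll) / len(straw_poll))     # close to 0.45, badly wrong
```

The point of the sketch: the 300,000-response straw poll estimates its own skewed subgroup very precisely, and the electorate very badly, while 5,000 random draws land within a couple of points of the truth.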
06:20
Now you know, because you don't know who Alf Landon is, that Alf Landon didn't beat Franklin Roosevelt by 20 points. Indeed, what George Gallup did was to both predict that Franklin Roosevelt would win and predict precisely by how much the then dominant polling method
06:41
of the Literary Digest would be wrong. And when he said that, people said, you're nuts, you're just totally nuts. And Franklin Roosevelt went on to win the largest victory in the history of contested elections in the United States. So he had a technology, he made a really outrageous claim,
07:02
he proved why the claim was true and thus was polling born and it became a central part of how we understand the people in the context of democracy. But here's the critical point, in the background of this project of polling, there was another technology, technology of broadcasting.
07:24
A technology which was at the same time that polling was born, just getting going in a really significant way. A technology to assure that many could hear at one time the same thing. Now what they heard was not always wonderful (that was the experience here),
07:42
but it was sometimes wonderful as America was terrified during World War II, it was Franklin Roosevelt's fireside chats that knit the nation together. People's consumption of radio news went from 60% in 1939 to 74% by 1944.
08:03
But the most important of these broadcast technologies was television, which over the course of the period of 1950 until the late 1970s, concentrated an extraordinary percentage of the public on a very narrow and dominant news segment.
08:26
I'm sorry, 90% of Americans in 1977 got their news from three television networks. Networks that told the story in what they perceived to be a neutral, down the middle of the road way.
08:41
That wasn't the truth, the story they told. It was biased in all sorts of ways, it was incomplete, it didn't consider all sorts of subjects it should have, but the point is everybody heard it and it built a certain community. Who we were was determined by this dominant presence.
09:01
And that then determined how we were polled, because as polls asked us questions, we repeated what we had learned. How we were polled was a function of this broadcast technology. It was a function of broadcasting. And how we spoke, how we were heard was a function of this technology.
09:24
You could call this the age of broadcast democracy, and when you put it in a font like that, it suggests a book which Markus Prior wrote, Post-Broadcast Democracy, which distinguishes between this bizarre period in the history of humanity,
09:41
1950 to 1980, when we were all focused on the same stuff, from the period that began in 1980, when concentrated delivery of news disappeared, as broadcasting disappeared. How are the people constructed
10:04
when the stories they are told are not concentrated? How are the people made by a world of many different sources? This is post-broadcast because the technology of information
10:21
had become radically more efficient, giving people choices that they didn't have before. In 1970, when everyone was watching the news, it wasn't because they wanted to watch the news, it was because they wanted to watch television, and the only thing on television for a certain slice of the day was news.
10:42
But when the technology for choice became more efficient, people could choose to watch what they wanted to watch. They don't want to watch the news, they want to watch the Home Shopping Network, or the History Channel, or sports. They could watch whatever they want, and this more efficient choice meant fewer and fewer were watching the same news.
11:05
This is the picture in America of this dynamic. This is the big three television networks, the black bar. You can see it goes from 1985, covering about 70% of the market of America, to 2002, less than 40%.
11:22
Then the white bar is cable penetration, the percentage of homes that have cable in their household. So this goes from just about 40% in 1985 to about 84% in 2002. And then this line is the average number of channels on those cable systems.
11:42
So going from about 14 to over 100. So you see, we go from a world where we're all watching just networks, to a world where we're choosing among 100 channels to watch. And so obviously as we enter the world of 100 channels,
12:02
given the free public, free to choose efficiently what they want to watch, what we produce is a fragmented public. And the question we have to ask is, how does that matter? How does it matter to who we are in a democracy?
12:22
Okay, as technologies divide us, the markets get built on top of the technology. Markets. As market size shrinks, brands within these markets become more important, more significant. The number of brands competing of course explodes,
12:41
but in the context of news in America, the number of brands begins to concentrate into relatively few. The brand has a business model. The business model is tribalism. The business model is to practice the politics of hate,
13:00
to teach us to hate the other side. To teach us liberals that conservatives are Nazis. To convince the conservatives that liberals are crazies. That's what they do. That's their message. That's how they build loyalty, brand loyalty, by building hate.
13:26
This is the most extraordinary graph I think I've read in the last five years. So this is an effort to measure the ideological content of these three major cable networks. CNN is purple.
13:40
Fox News is red. MSNBC is blue. Until 2001, there's basically no difference in the ideological content of these three networks. But as you go over time, the ideological content begins to radically separate because the most efficient business model for delivering cable news
14:02
in a fragmented, highly competitive market is the business model of exclusive ideological delivery to rally the base in this divided public. And so we have a divided public that is now all seeing a different world.
14:21
The world we liberals see is not the world the conservatives see. Barack Obama recently said, if you watch Fox News, you're in one reality. If you read the New York Times, you're in a different reality. The epistemological realities are fundamental to the story
14:41
of whether democracy can work. And it's not just that they happen to become different. There is an incentive in the media to make them different, to make them more different. So for example, it turns out that if you polled, not all science would be thought of in a partisan way.
15:04
This extraordinary work by my colleague Dan Kahan. You can't really understand these graphs except in the following sense. These are different domains of science, so nanotechnology or artificial food coloring or global warming. Where you see a line that's flat, what that means is your ideological commitments
15:22
don't affect your judgments in that scientific field. So this is saying around the questions about radio waves from cell phones, it doesn't matter whether you're a conservative or a liberal, you're going to think about it in the same way. Or the fact that you're a conservative doesn't affect how you think about it.
15:43
So this is a non-partisan domain of science. And if you look at the domains that he considers, this looks kind of optimistic because six of the nine are not partisan. Only three are partisan. So it's in those three domains you can expect the networks to begin to play off the partisan differences,
16:01
but here there's no return unless they can render those six as partisan. If they could take these domains of science and make them into polarized debates, then this would be yet another field to sow
16:22
and then harvest in this effort at making us hate each other. If the conservatives can make cell phone radiation concerns, the sort of thing that only crazy liberals think about, they not only would find another fight to rally their team on,
16:41
their tribe for, they'd probably get more ads from the cell phone companies. The point is there's an ongoing interest in making us stupid, giving this tribal nature of media and the incentives built into that platform. Okay, now that's just the story with television, which still is the most important platform, at least in America,
17:03
for affecting public opinion. But the reality is the internet only makes this worse because we've made a choice, someone did. I don't know who that person is. I'd like to have a conversation with him or her, but likely him. Someone made the choice that the internet was going to be ad driven.
17:24
It wasn't always going to be ad driven. When these guys, the Google boys, looked like that, they wrote an important paper that said advertising funded search engines will be inherently biased towards the advertisers and away from the needs of the consumer. They were going to build a search engine that was not biased in that way
17:43
until they realized they could be billionaires if they changed their view about this fact. That was their view, but their view now is advertising is at the core of everything commercial in the internet environment.
18:04
And the reality is better ads need better data. This is the insight and the focus of this extraordinary book by Shoshana Zuboff, The Age of Surveillance Capitalism. Surveillance capitalism describing capitalism enabled by surveillance
18:21
or the capitalism in surveillance, the persistent ability of the platform around us to know everything there is to know about you so they can sell you better ads. Now, of course, Zuboff is against it. I fear she's oversimplifying the story a bit.
18:40
I love this book. It's the best, most important thing to be read if you read anything in the next six months about this problem. But she writes as if all of this surveillance is awful. She invokes this idea of surveillance since it has a kind of Soviet or maybe a Stasi-like overtone to it. And the point is the surveillance is not all quite like that.
19:03
Some of it's bad, no doubt, but some of it's not bad. Some of it's quite good. And instead of thinking about surveillance in general, I think we should think about the uses of data and ask ourselves the question which uses we like and which uses we know we should not allow and which uses produce the most trouble for society.
19:24
So we can think about that. Of course, I'm an academic, so it has to be a matrix somewhere. Here it is. Think about uses that benefit the user, benefit society, or harm the user, harm society, and the ones in between. So benefit user, benefit society. If I go to my Amazon page, it knows me.
19:42
It's known me since 2003. I search for something. It tells me a whole bunch of books I might be interested in. I love that. I love that it's trying to figure out what books I want to read because it's usually pretty good. And so in my view, you can have lots of arguments about whether Amazon's good in society.
20:00
But if you like the idea of being able to get access to books, you can say this benefits users and it benefits society. It's an OK kind of use. Here's a different kind of use. Imagine that Facebook, and now, just to be clear so I don't get sued by Facebook: Facebook doesn't do this, but just imagine. You know what the word imagine means, right? It's hypothetical.
20:21
Imagine that Facebook monitors the way you type and compares the way you type over time. And based on an algorithm, it's able to figure out whether you have some weird neurological disease based on the way you've typed. So in this hypothetical, it compares 2009 to 2019,
20:41
the same thing being typed, and it's determined. That's what the HAL 9000 eye here is signaling. It's determining that this person has this neurological disease. And imagine then Facebook sends that information to the insurance companies. Just want you to know Lessig has this weird neurological disease,
21:01
just in case that's important to you. Now, from an insurance perspective, that's good information because it makes it easier for them to discriminate against me because they wouldn't want to insure me with this weird disease. But from my perspective, the idea that there's a snitch in my computer reporting to the insurance company is a really bad thing.
21:21
I should be against that. So in some sense, it benefits at least the insurance companies in society. In other sense, it harms users. Or here's an easier case for that. Imagine the technology made it possible for us to identify predators. Like the predator is against that, okay, harms the predator, but you see it can benefit society even though it harms the predator.
21:43
That use has that character. Okay, then the other case is this case of harms user and harms society. We all know about the incredible development of technology for exploiting addicts in the context of digital technologies.
22:01
Addicts, we call them whales. Very close to some in my own family. We have to regulate the whales very carefully. And this is a feature of the way the data gets used to find and exploit these whales. But the interesting category for us to think about is this category.
22:23
Uses that benefit the user but harm society. Okay, what could those be? Okay, shift for a second to food. This extraordinary book, Salt, Sugar, Fat, tells a story of food science, which is science, science where they learn to engineer food
22:41
to overcome the natural resistance you might have to eating bad food. You could call it body hacking. And what this science does is it exploits evolution, the evolution of the way our bodies have been built, with the aim to sell food, or maybe we should put that in scare quotes, sell food.
23:01
Now because most of this food is really pretty bad for us, we can say it benefits the user in the sense that the user wants to be eating the potato chips or the buffalo wings. But the externalities of this for society are pretty severe because it produces great health consequences for everyone.
23:23
Okay, the problem with this category is it's super difficult to regulate. Competition among these food providers makes it really difficult for the corporations to do the right thing. They do the right thing and then they have their lunch eaten by a competitor. When the government tries to regulate it,
23:40
the claim is it's a nanny state trying to regulate the size of your Coke bottles and freedom becomes the chant that we have to utter in response to that regulation. So it's difficult to imagine how you solve it in the context of food. Alright, now shift from the context of food to the context of the digital. You might know this man, Tristan Harris,
24:01
started something called the Center for Humane Technology, which he started after leaving Google where he was an architect in the business of addicting people to using the Google technologies. And what he describes in his speeches and writing is a science that has developed inside of Silicon Valley, a science to engineer attention,
24:25
to overcome the resistance that we would naturally have to these technologies. Think of it not as body hacking but as brain hacking. Now here too, the scientists are exploiting a certain kind of evolution,
24:42
the evolution that slot machines have traded on forever where we like random rewards, that gives us a real hit. The evolution that says we like the endless bottomless pits of content that we never can stop, like popcorn or potato chips.
25:01
All of this engineering with the aim to sell ads. Right? That business model creates an incentive to know more about you. Not just by watching you, but by poking you, by tweaking you, by asking you questions, by rendering you vulnerable. Reaching down the brain stem to leverage the insecurity you express
25:26
to sell ads to you better. This has effects, this technology. The Google News Feed has individual effects, whether it's addictions or depressions,
25:42
or as some scholars suggest, an increase in suicide. It has social effects. It is architected to isolate, to make us vulnerable. Zeynep Tufekci's extraordinary book describes companies in the business of monetizing attention and not necessarily in the ways that are conducive
26:01
to health or the success of social movements or the public sphere. That's what they're doing, monetizing attention, driving the politics of hate, building polarized and ignorant publics as they do that. This is the externalities to social data. It is why it fits in this category of benefiting the user,
26:23
we're getting what we want, but harming society as it undermines the capacity for democracy. We like it individually, but we should collectively hate it. Now, you might look at this and say, why would you make Facebook responsible? Facebook is just giving you what you want. Well, compare with some real world analog.
26:40
Imagine the responsibility we hold bartenders to. When are they responsible for the harm caused by people drinking in their bar? Well, we wouldn't say just when they open and sell drinks. That's what we expect bars to do. Maybe happy hours create a little bit of tension, a little bit more responsibility when you've got happy hours going on.
27:00
Maybe if they have no limit, like they'll sell you as much drink as you want, never will they say no to you. Then we say, in the United States at least, the bartender is responsible if you go out and kill somebody with your car. But imagine the bartender is a pharmacologist who has figured out how to spike the drinks
27:21
so that you want to drink more and can't stop yourself from drinking more. There's no doubt we would say to that bartender, that pharmacologist, you can't do this in the context of a bar. Facebook is spiking the drinks. And in this sense, it's responsible,
27:41
at least if the spiking can be shown to make democratic culture worse by rendering democracy vulnerable, rendering democracy polarized, simply to sell ads. You know, this statement, to sell ads, is so extraordinary and it's astonishing to me we just don't stop and scream about it more. If you told me you broke democracy to end global warming,
28:04
I'd say, okay, I get it. Or if you said I blew up democracy to end world hunger, I'd be, I'm not sure I'd accept that trade-off, but I understand the trade-off. But if you tell me you broke democracy to make Mark Zuckerberg richer, I don't understand what you're saying anymore.
28:22
Yet that is what we have allowed to happen. Okay, so now what do the wiki cultures have to do with this? There are obvious differences with the wiki cultures. The first difference is the difference in incentives.
28:43
Whether accidental or just the insight of genius, there was a decision made in the early evolution of this culture not to render the content vulnerable to the incentives of ads. I think billions of dollars was left on the table because of that decision.
29:01
But hundreds of billions of dollars was given over to the public, given over to the public by a technology that renders itself trustworthy or at least not focused on turning you into a crazy because that's the best way to sell ads. If you remove ads from the environment,
you remove the incentives against the mission of the platform. You allow it to stay focused on mission. Unbiased knowledge to all is the mission. That's number one. Number two, governance. Wikipedia doesn't just happen. It happens within a structure, a structure governed by norms
and understandings and practices, norms that get enforced, communities that practice an enforcement of those norms. You just can't export Wikinorms to CNN. It can't work like that. We won't banish ads in the real world.
We won't kill capitalism as much as the democratic socialists would love. The point is the world is filled with rough and ugly stuff out there on the internet, most of it pretty ugly in the context of this type of content, but there are diamonds in this rough. And we can craft diamonds in the rough because if we do,
as Wikipedia did, the world will find these diamonds. So then the final question is this, and I'm going to go a couple of minutes over, but I've got talk time here, so I'm taking it out of my own talk time. She's really quite ferocious here. I'm afraid it's a weapon she's pulling.
Oh, it's not America, so you can't have a gun, so that's fine. So the question is, how do we craft diamonds here? Or at least how do we craft democratic diamonds here that can compete with everything else that's out there but give us a way to have faith once again in this democracy? An extraordinary book by David Van Reybrouck, Against Elections,
which reminds us, as if we knew it, but reminds us that in the history of democracy, there is an extremely important tradition that says we select representatives, not through elections, but through sortition, random selection of representatives.
And that for most of the history of democratic theorizing, theorists said, if you select through elections, you will produce an aristocratic government. If you select through sortition, you will produce a democratic government. Now what Van Reybrouck is talking about is adding the process of sortition
into the process of governance. Not so much to end elections, we're not going to get rid of elections, and we shouldn't get rid of all elections, but at least to create the project of sortition as a shadow, a kind of jury service or civil service that helps inform government about what it can or should do,
a kind of hybrid. A way to construct a we, the people, that we are proud of. So for example, Jim Fishkin at Stanford has developed something called Deliberative Polls, and for the last 30 years he has gone around the world running these Deliberative Polls. These are regular polls like George Gallup would have recognized,
plus something. The plus is that they are representative, but they bring the people together and they inform them about the subject. They give them a chance to deliberate in small groups and in large groups, and they watch how their views evolve over time. So I saw an example of this in this extraordinary place.
This is Mongolia. This is not a painting, this is a picture in Mongolia. That's the nature of Mongolia. Here is a picture of Mongolia, in the sense that this is a random representative sample of Mongolia. Like the people in the teal, these are people who come from the outskirts of Ulaanbaatar.
Most of them had spent two nights on a bus to get to Ulaanbaatar to participate in this Deliberative Poll. And in this Deliberative Poll, they met in the parliament and they deliberated on proposed amendments to the Constitution of Mongolia.
They deliberated in small groups and in large groups, and in the context of that deliberation, they eventually changed their views about these proposed amendments and that became an input into the process of amending. Now, these people are not elected, but they're not ignorant. Indeed, I went in as a Harvard law professor thinking,
there's no way they can work out constitutional questions. And by the end of that weekend, listening through a translator, I was convinced that they were geniuses about constitutional questions. There's a deep sense, a reality here: it turns out constitutional law is not rocket science. But the point is, ordinary people reflecting on those questions could produce something worthy, something we could be proud of.
Now, my view is we need a million experiments like this, everywhere. Here in Germany, the Democratic Innovation Project is producing citizen councils in many different contexts. There's a More Than Voting Project, which is experimenting
with different ways to bring out the ideas of people that aren't captured in simple voting. The Democracy R&D site lists an extraordinary number of projects around the world that are multiplying and experimenting around this basic model, all of them seeking experiments for a better we.
A we that we could respect because it's a we that is informed and reflective, and the judgments it produces are judgments that are real. Here is where innovation in democracy must be. And we have to elevate that innovation to make it central to the debate about democracy,
to show all that there's a we that we can like. So you can imagine a world, not where we've replaced representative democracy, but where we complement representative democracy with a regular opportunity
to reach out on issues that politicians can't resolve and ask questions in an informed and valuable way. And like the Wiki culture, it would do it in a way that has no improper dependencies. These are not elected people. They're not selling out to their campaign funders.
They're not trying to get reelected. But it's not laissez-faire in the sense that there is a process that these people go through for building understanding, a process that is built and crafted and enforced with the objective to create an exception to the craziness that fills most of the public airwaves
about the views of what we the people are. Kind of a diamond in the rough to produce a people to be respected just as Wikipedia produced free and open knowledge that could be trusted.
If the people were born in 1936, then we can grow up with these experiments and finally speak in a way that we should respect. Okay, one final idea and then I'm going to stop. So you might have seen this movie, The Favourite. It's an extraordinary movie. Here's a...
Dearest Queen, you are mad. Giving me a palace. It is a monstrous extravagance, Mrs. Morley. We are at war. We won. Oh, it is not over. We must continue. Oh. Oh, I did not know that. The Queen is an extraordinary person. They were all staring, weren't they?
I can tell even if I can't see and I heard the word fat. Fat and ugly. No one but me would dare and I did not. She's been stalked by tragedy. Everyone leaves me and dies. Okay, so the general genre that The Favourite is within is the genre of rendering monarchs absurd.
Like, the reality is that many of them were absurd, embarrassing. But their embarrassment was hidden from the public. So the public didn't think of them as absurd. The public thought of them as God's chosen. It was hidden until it was not. And when it was not, monarchies across the world, except in a couple places, were destroyed.
Because when the people realized the monarchs were absurd, there was no reason to continue to trust the monarchs. Democracy faces the same fate. Because the demos right now, we the people right now, as we are represented, are embarrassing.
The demos, us, are absurd, ignorant about all sorts of questions that we are summoned to answer. All of us, at the same time, are ignorant about all of these questions we are called upon to answer.
But it could be otherwise. We could imagine multiplying the number of examples where it's just some of us who are summoned. A representative sample of us, representative of the public, informed and reflective and inspired. It must become otherwise if this democracy, this ideal of democracy, this support for the ideal of democracy is to survive.
Thank you very much.
Thank you very much. So we still have some time for questions, which is amazing. Thanks for making your point.
So, we reserved a little bit of time for questions if any of you wants to question anything. And otherwise, I see one here and maybe others in the back and I will come to you and give you the microphone for a second.
I'm a member of parliament for the Left Party, so these are questions I'm concerned with too. And I've always liked the idea of selecting random people and having them take part in decision making and politics.
But how do we get from the status quo we have to this idea? And should it be 100% randomly selected people, or maybe 30%, 50%, or are there stages in between? Okay, so we don't leap from the status quo to randomly selected groups of people legislating for nations at a time.
That's not possible. We take it in steps and we develop a bunch of experiments and see how people respond to them based on the confidence that we have in what they've done. Now, my own preference is yes, depending on the question, they should be totally random.
But Ireland has demonstrated a pretty good example of why they shouldn't just be a purely random selection. So Ireland had amazing citizen councils that were considering some fundamental questions that the Irish parliament was never going to answer in a different way.
Like, should there be gay marriage in Ireland and should they continue to criminalize the termination of a pregnancy? And these councils were made up of 100 people, 66 randomly selected people and 33 representatives from parliament. These weren't the only questions they addressed but these were the two that were most significant.
And they met over many months. They would meet one weekend a month, and they deliberated, and it became a focus of Ireland. People watched the conversations, and the results astonished people, because the results were that there should be decriminalization of the termination of a pregnancy and there should be support for gay marriage.
And this was affirmed by the Irish parliament and it was affirmed by the parliament because the 33 parliamentarians felt they were connected to the project. So they took the ideas, not just these but the others too, back into the parliament and made them real.
That might be a necessary hack because the other example here is Iceland. So Iceland had a crisis in 2008 like everybody did. They tied that to the absence of a constitution. They started a crowd-sourced project to write a constitution.
The government decided it was getting out of control, so the government took it over. So basically it worked like this. They had a random selection of a thousand people who met for the purpose of identifying the values a constitution should have. And then they elected members to draft the constitution. So they had an election in Iceland where 524 people ran to be on this little council.
You know, it's a population of 300,000, and 524 ran. They elected 24 or something like that to serve on this council. They met for four months to draft a constitution, which is about the time it took to write the American constitution.
But unlike the Americans, they posted their draft constitution to Facebook every single week, so that comments could be taken from people around the world. And they improved the draft on the basis of those comments, and eventually they took the draft to the public and asked: should the parliament amend the constitution on the basis of this draft?
More than two-thirds of those voting said yes and then the parliament ignored it. And still four years, five years later the parliament has done nothing with this. So this might be the insight that the Irish experiment shows. You need to bring the parliament in if the parliament is going to feel connected. And so I'd love to see 50 of those kinds of experiments.
But what we need to do is to give people a sense just like Wikipedia. If you had said 20 years ago we're going to open up a free online encyclopedia, anybody can contribute. The view would be, the view was, that's crazy. That's crazy talk. It's going to be crap.
And the project which many of you have been involved with for the full life of this project proved that was wrong. It didn't prove that every open content project doesn't produce crap. Most of them do, right? It proved that this one could be different because of the discipline and norms and principles that were built within it.
And people come to it because they understand it's the diamond in the rough. They see it and they've learned it's like that and they begin to steer to it. I think that's the same thing that could happen with these different modes of representing the public.
There will always be a Gallup telling us nobody believes there should be thorium reactors, or that we should go to war with Iraq, as the United States was led by those kinds of polls to believe we should. That's never going away. But if you could begin to build things in which people could have the same trust they have in Wikipedia, in the project of understanding what the public thinks, I think there's a chance we won't seem so ridiculous.
And that's the chance democracy needs. So we have two remarks still, and I would ask you to keep each remark short, and then we can take more questions outside so we don't go over our time here.
Heiner Benking. Larry, a thousand and one things. I want to ask you, have you heard about the election circus? Election circus? Yes. It was done here 20 years ago by Christoph Schlingensief. It was called Chance 2000.
And what you should look into is that we in Germany have the Planungszelle, that citizen jury, and there has been much work done on the wisdom of the people, and actually there are great videos about it. Yeah, there's a great site, esketlos.something, I don't know if it's com or org.
Esketlos, which collects some of this; it collects one of the two examples that I pointed to. But yes, there's a huge number of experiments being done here. But I think what needs to happen is that people from your world need to begin to help elevate the significance of this,
because if it's just seen as counter or alternate, it's not going to have the presence that's necessary for it to begin to take on a life of its own. And my commitment, and this is a really important part for people in your world, is not to say that this should replace representative democracy. Representative democracy is really hard. You've got to balance a whole bunch of things together.
You know, should we have more money for schools, more money for hospitals, more money for roads? Those are hard questions. You can't answer them one at a time. These deliberative polls are good at one-at-a-time questions. And so it's not to say that they could ever be something instead of a parliament, but I think they can help a parliament know what a people, properly structured, could think.
Wonderful, thank you. So we invite you all if you have more questions or more comments to get outside in the lobby and to have a few minutes of chat still.
Thank you again very much Larry that you came here.