
Resisting the Surveillance State and its network effects


Formal Metadata

Title
Resisting the Surveillance State and its network effects
Part Number
7
Number of Parts
72
License
CC Attribution - ShareAlike 3.0 Germany:
You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor and the work or content is shared also in adapted form only under the conditions of this license.

Content Metadata

Abstract
Re-contextualizing our social interactions in the face of the privatisation of data leads us into a space of social responsibility. The impact of our permissive data-sharing habits and the economic models that incentivize them is not yet fully understood. How may we ensure that we're fully informed about, and consenting to, information released or sold about us? How may we try to ensure that consent is required? How can we re-contextualize and better come to a shared understanding about the transitive risks posed by the surveillance state?
Transcript: English (auto-generated)
Hello everyone, I can't actually see you now with these lights, but I assume you're still there. I'm Dmytri Kleiner, this is Jacob Appelbaum, and thanks to everyone for coming out today at 12:30.
That's much better. Hi. We're going to try to make this as much of a discussion as possible, and try to get the audience involved and talk with each other a bit. So Jake and I are just going to start off with some opening statements that will be rather short.
And then we'll take your questions, and we'll see if we can go further. So, my background is in the political economy of information. I wrote the Telekommunist Manifesto, and I look at networks in a very political way in terms of topology.
So when we're talking about the surveillance state, I don't understand the state to be simply just the government, like the way you might imagine some kind of state surveilling its citizens, but I imagine the state as being an institution that is actually formed with the public and private sector,
and with the private interests are always contesting in the state. So you can consider the state to be the institution that mediates among the classes, but always on behalf of the ruling class. So the private interests drive the state,
whether the private interests of the people and their social power, or the private interests of corporations and their money and financiers, the state is always driven by private interests. So if you want to understand its surveillance, or if you want to either resist it or comment on it at all, you have to look at it from the point of view of the private interests,
not necessarily the point of view of the state as some kind of autonomous organization. And now, the thing about surveillance is that surveillance and behavioral control, actually, have always been the economic basis of media. If you look back to the writing of the economist Dallas Smythe in the 50s about network television,
he talks about something called the audience commodity and the consciousness industry. And this is how media is funded and how it's always been funded. It's not in media, the commodity is not the media being delivered. So for instance, you're talking about network television,
the commodity in question is not the television show. The television show is just simply used to attract the real commodity, which is the audience. And the audience is not the customer, not the user in any kind of sense like in the market. The audience is actually what is being sold, and it's being sold to advertisers and lobby groups, what Smythe calls the consciousness industry,
the industry that's interested in controlling your behavior. And this has always been how media is funded, right? So what's different about modern media and communication platforms is that the social aspect of them actually enables more surveillance. So whereas in the broadcast era, behavioral control was the economic basis of it,
but the surveillance was not quite as important because, well, it was not so much that it was unimportant as that it was not as available. The technology of surveillance was not nearly as developed as it is now. And if you look at the social media platforms and what makes them special
and what makes them economically viable, it is exactly that surveillance. So we built communication tools not individually but as a society. In the philosophy of technology, there's a concept called the paradox of the frame, right? Which is kind of interesting to understand.
People often assume that success follows efficiency, that if you have a technology that's efficient, it will become successful. But in actual fact, efficiency follows success, right? Our technology becomes efficient because it's successful for social reasons and therefore has social investment.
Therefore we use our social capacity to develop further and by iterations of developing it further, it becomes successful, right? So if you look at platforms like the social media platforms that we all know, like Twitter, like Facebook and things like that, they didn't become successful because they were efficient. They became efficient because they were successful.
They became efficient because they provided a business model that could be funded by capitalism and that business model is surveillance. So it's not an accident, it's a negative consequence of social media. Surveillance and behavioral control is the purpose of it. It is the reason they get funding, it's the reason they get investment and it's the means by which the investors can get a return on their investment, right?
So investors invest money into a startup and they hope to get more money back. And the business model here is based on surveillance and behavioral control. That's what funds the internet and that's what always has funded its further development.
Now we've heard a lot of times here on this question of surveillance and one of the solutions that's proposed is that we need to build a different social media, a new social media, one based around distributed servers and peer-to-peer software and all this kind of stuff. And this is actually a really funny point of view
because this sort of assumes that we have something now called social media and we can reinvent it in a peer-to-peer form, right? Well this is actually very strange because of course we've already had a distributed social media platform. Some of you may remember it, it's called the internet, right? The internet was a social media platform that was fully distributed from its earlier days.
Technologies such as Usenet, email, IRC, and finger enabled you to have address books of your friends, share photographs, post citizen journalism, put up status updates; all of that was there. So the question that we should be asking is not how we can move from centralized, surveilled social media funded by surveillance and behavioral control to open and free social media.
The question should really be how is it that we went from free and open social media to centralized platforms? That should be the question because that's a transformation that happened. We already had distributed federated social media. It's not something that we needed to invent, it's something that we actually abandoned. And why did we abandon it?
The reason we abandoned it is because, once again, what is the business model, right? When you look at the classic peer-to-peer platforms, a peer-to-peer platform is based on the idea that individual users can directly communicate with one another. When individual users can directly communicate with one another, the thing you don't have is exactly the surveillance and behavioral control that capitalist media funding depends on.
So when you're talking about resisting surveillance, you're entering a question that's not simply a matter of like outsmarting the government or creating some kind of technology that will let you evade certain kinds of surveillance. The question that you're asking is a social question. How can we build the social capacity for communication without depending on surveillance and behavioral control
as the primary funding method? I'll leave it at that for now, but we can go on further and I'll pass it over to Jake. Thanks, that was awesome. So, I was sort of invited here and brought Dima on at the last second
because I thought that it would be important to have him say exactly what he just said. Because I've heard quite a lot of people that talk about post-privacy and they talk about it in terms of feeling like, you know, it's too late, right? We're done for. There's just like no possibility for privacy left anymore and we just have to get used to it.
And this is a pretty fascinating thing because it seems to me that you never hear a feminist say that we're post-consent because there is rape, right? And why is that? The reason is because that's bullshit, right? We can't have a post-privacy world until we're post-privileged, right?
This is what Dima is talking about. So, when we cave in our sort of, I guess you could say, our autonomy, then we can sort of say, well, okay, we don't need privacy anymore. In fact, we don't have privacy anymore. I'm okay with that. Realistically though, I think most people are not comfortable with that because if you only look at it from a position of privilege like, say, white men on a stage,
then yeah, post-privacy maybe works out okay for those people. But if you've ever not been or if you are currently not a white man with a passport from one of the five good nations in the world, it might not really work out well for you. And in fact, it might be designed specifically such that it will continue to not work out well for you
because the structures themselves produce these inequalities. So, when you hear someone talking about post-privacy, I think it's really important to engage them about their own privilege in the system and what it is that they are actually arguing for. So, for example, they might say something like, well, it's a utopian world that we're talking about.
We want to live in a utopia where it's possible to have this kind of post-privacy world where we don't have to worry about this, where we can have cameras everywhere, everything we do can be recorded. We'll have lots of good data about what it is that you want to purchase when you go to a bookstore before you've even arrived. And that's maybe their version of utopia.
But I think that that sounds like a dystopia actually. I don't want to live in a world where people spy on each other all of the time. That sounds like a total nightmare. And so, if that is the utopia that they're talking about, the good news is that we can't ever reach it because we will not ever be post-privileged with the structures that exist in our society right now.
But it's still important to say that if for some reason, in theory, we could arrive there, we need to know that that's actually not the end point. Part of, at least for me, about philosophical and political anarchism is this notion that there actually isn't an end point. And as a result, this means that when people have these utopian ideals
and they discuss them as if that's the place in which we must arrive and then stay there, they're not really thinking, I think, quite clearly about the fact that for some people, we are at the end of their utopian idea and we continue to want to change it because it turns out it's actually not as good as we thought it might be or as people previously thought that it might be.
And so, I think it deserves some concrete examples here, but secrecy of the vote in democracy is extremely important. I think for representatives who are representing you, secrecy in their vote is actually about accountability, so they don't really, in many democracies, they don't actually get this secrecy
and that's maybe a good thing for accountability. Now, the structure itself is still not necessarily just, but for us to be able to have secrecy in our vote, it allows us to keep our jobs, it allows us to be able to live in the world with people that do not entirely agree with us and to have some amount of dignity. So, for example, the naked scanners that exist all across the world,
but especially in Europe and in the United States, some of which have ionizing cancer-causing machine parts, right? They're like pretty scary. Even if they were completely safe, they would still be undignified. And that in itself, regardless of whether or not you get cancer from it, is actually as important because in fact, living without dignity,
living without privacy is a cancer in itself to society, which in itself also eats away at the human and I mean, I'm an atheist, but I really think it also eats away at your soul. When you live under that kind of surveillance, like total absolute surveillance where every time you pick up the phone, you know that it's being recorded, you say things differently.
So, as we sort of move into this nightmarish world, we have to decide if that's actually the world we want to live in and at least for myself and I think for Dmytri and I would hope for one or two people in the audience, that's also you. And in fact, in Germany in particular, I find it amazing that there could be a post-privacy person at all,
like even one, especially when you consider the history of the 20th century. And you know, without bagging too much on Germany, the United States seems to have not learned very much from it, which is kind of surprising to me. We do incredible amounts of surveillance, some of which is actually illegal, in fact, most of which if you talk about it in bulk, is illegal.
And I think that if we're going to be post-privacy, we should start with the fascists first and we should do that by bringing them to justice for their crimes, especially their crimes where they violate the democratic principles of our societies and use their surveillance and use their privilege against us,
especially for political ends, but also for basically the dignity of the people that live in these regimes. And so whenever someone suggests that we should have so-called lawful intercept so that they can catch terrorists, we need to remember that for every terrorist that they've ever caught, there are 15 or 20 or more police officers that have beat up people at protests
and are still police. We don't ever hear people talking about surveilling the Internet in order to catch the people that commit physical violence on those that express their political will nonviolently, directly and legally. Why is that? Why is it that we always hear about child pornographers and terrorists
and yet at the same time, we know concretely in my country, for example, that if you happen to be a person of color riding a train, your chances of getting shot in the back, dying, and then the police officer being let go after maybe no trial at all, maybe a small trial, maybe a small punishment, is basically non-zero.
That should be something we aim for as an actual zero thing in our society. You should not have to fear people that may or may not be necessary but certainly do exist. The boogeyman of state terrorism, in fact, is sort of the lesson of the 20th century, and we're sort of growing into that in the 21st century, which is that state terrorism is as real as this so-called asymmetric threat,
like the fear of Muslims, basically, is what most people are talking about, which is pretty disgusting. And we have to ask ourselves why it is that we buy into it, and part of it is the emotional argument, and part of it is what Dmytri is talking about, which is the actual social structures that produce this. But the other reason is because people feel like they don't have agency
and they feel like they've already lost. They don't contextualize the fact that the curtains that they have on their windows are a privacy-enhancing technology or that just because you use the Internet, you can use something like the Tor network, which I and other people work on, so that sites you visit actually don't have all of the information about you
when you're using the Internet. So there are things we can actually do, but if you don't know those things, you will feel like you have no choice, and so you will rationalize, and I've seen many people do it, and they'll say, well, I'm not important, so it doesn't matter, or, well, you know, yeah, they've got it, but it doesn't matter because I'm never going to attract any attention,
which that must be really lucky of you. And, you know, even the people that get it, they realize, yeah, but there's just nothing that I can do. But the reality actually is that there is something that you can do, and the first thing that you can do is actually care about the things that are wrong, and this is a thing that is wrong, that affects the ability to maintain a republic or a democracy. When you have, for example, representatives that are under total surveillance,
intelligence communities, or even just people with, like, $20 cell phones can know how they're going to vote and influence them, can know who is important to them, can map out their social networks, can do all kinds of terrible things. For example, the Zeta Cartel, post-privacy people, I think,
that suggest that they don't need to worry about privacy at all. I think it would be really fascinating to send them to, like, you know, Juarez, Mexico, with, like, a giant wad of money in their pocket and tell them to talk about post-privacy, and how, like, we can all get along, kumbaya and everything is great, right? I don't think that that would work out very well,
and that's because, contextually, their white privilege won't serve them there. They'd be shot in the head in minutes, if not before they even got off the airplane. And I'm not saying that that's the ideal state and that should happen to them, but I think it's important to acknowledge that the reality that people talk about is not everyone's reality, but it is everyone's reality, at least in the current European and North American state,
and in lots of Asia and in lots of South America and lots of Africa, that it's possible to use, for example, strong encryption to reduce the amount of data that the surveillance state produces about us. So, with that said, I think we've both sort of covered our different angles,
and I think it would be fantastic to ask anybody in the audience if they actually are a post-privacy person. I just want to say a couple of things, I don't know if I've broken my microphone there, you can still hear me. I just want to say a couple of things on the whole public-private divide, because I think that's kind of important to take a closer look at as well.
Because, I mean, there is surveillance by the state, there are data retention laws, there are all kinds of laws that people here have known about, from the DMCA to SOPA and PIPA, that force more surveillance and control over the Internet.
And the funny thing is, as important as it is to fight them, these laws still exist expecting there to be some kind of a public space called the Internet, like a public park. And in these laws, because it's in a public space, there's still a social contestation, we can still fight, we can fight against those laws, and sometimes we even win, rarely.
Usually the corporate interests win, but sometimes we even win with social power, through social movements. Now, however, when we change the context from the public park to the shopping mall, then all of a sudden we never win, because that's a private space, right? So when we're moving from this Internet, this original decentralized system that we used to have, that's being attacked by all these horrible laws,
we're still fighting in a public space. When we move to social media monopolies, like Facebook, like YouTube, etc., we have no rights there at all, it's private property. Our rights there are simply based on the privileges granted by the operator, which means we can use the platform only in ways that benefit the operator.
So if you look at things like the DMCA, which is one of the first of the big draconian laws that came in to fight perceived damages of piracy and stuff like that, of copyrighted material, if you look at the DMCA, the DMCA assumes a coercive relationship between the rights holder and the publisher.
It assumes a situation where somebody has a website or something that is hosting some kind of copyrighted material, and the rights holder needs a coercive law that they can take to the courts to coerce this person to take it down, right? This still grants both parties rights, right?
So even the DMCA, draconian as it is, still grants some rights to the publisher. The rights holder has to file a court order, and there has to be a certain timeliness to it. The court case has to be filed within a certain amount of time, or else the content goes back up. Because it's a public space, the rights are still going both ways.
But now, because they're private spaces, places like YouTube, when the rights holder wants to take material down, they no longer need to go to the courts at all. They already have a relationship with the platform that came as a condition of their funding that allows them to simply take down whatever contents they want, and when there's a dispute, the dispute is routed not to the courts,
not to any third party, not even to the publisher themselves, but to the rights holder. So when you put something up on YouTube, for example, and it gets taken down, and then you complain about it, say, no, actually, I do have the rights to this, the person who actually decides whether or not that's true is actually the rights holder, not the platform, not you,
not any judiciary. So in some ways, the centralization of social media, this privatization of this kind of online public space has almost made these battles over the DMCA and SOPA and PIPA kind of irrelevant, because we're talking about our rights in the public park, but our social space is no longer the public park.
Our social space is now the shopping mall in which the security guards have absolute and total power. And so when you're talking about privacy in this context, it comes back to what he was saying about privilege. Privacy is ultimately a relationship. It's a relationship that we have with our society. And there might be advantages to me wanting other people to know
certain things about me, and maybe yes, maybe even vendors, there might be better services. But it's a matter of control. And if this was a public sphere, maybe we could even have some control. We can fight over data retention laws. But when it gets moved to the private sphere, we lose all control. And one answer that people often have to this is,
well, we just shouldn't use these Facebooks, these social media platforms. We should use, like, you know, Diaspora or Status Net or X number of various other kind of alternative social media platforms. And in some cases, this is a valid point, and I certainly support efforts to develop those platforms.
But in other cases, it's a fantasy. We're selling a fantasy. The reality is that our social capacity to build communication platforms has to work on a very, very large scale. And we need to have social processes that actually provide for the funding, sustenance, development of communication platforms, in the same ways we need it for roads and train systems.
Social media platforms, communication platforms don't build themselves. They're expensive to build and operate. And to imagine that a few hackers can build something in their spare time that can function for the six billion people on the planet, is strictly delusional. It's strictly delusional. It functions for us. I have no problem using it. Many people here use it.
It can be our own kind of private side channel, but if we're going to look at how the masses communicate, how society itself communicates, what our capacity for communication is, we have to ask these questions socially, and we have to ask, is capitalism, is private profit, the right model to build communication platforms? And until we ask that question, we can't really deal with surveillance or privacy.
I think it's also worth noting on some of the laws that Dima is talking about here. There's a law that was recently proposed. It's called CISPA. Have any of you heard of this at all? It's basically a surveillance law that says that all the privacy laws in the United States
don't apply anymore to corporations or the US government's NSA, and that they would be able to surveil anything. And there's this fascinating notion about private property here, where they say, well, companies can choose if they opt in or out. The funny thing is that when AT&T, who collaborated with the NSA,
also known as AT and Treason, when they got together and exported people's data, there were a lot of companies that probably were not involved in that and didn't want to be involved in it, but because they used AT&T or communicated on the internet through AT&T's networks, which is basically everyone, those people, they lost.
They lost because someone else with more privilege, instead of having a duty of care to protect the rights of people, and instead of having a duty of care that comes from certain, basically, monopolies that were broken apart, the duty of care was flipped, and they no longer had a duty of care.
They had a duty of patriotism to surveil everyone. And so CISPA is a law which is very similar to data protection, so-called data retention and protection laws here, which is about fighting criminals and so on. But the fascinating thing about it is that the UK and Canada and other places around the world are all talking about
having these massive data retention stores. And some people would say, well, that's okay, they can't really figure anything out from it. Like there's this spy that was just killed in the UK, they can't even figure out what happened with him. It's interesting because it's true that they will be incredibly incompetent at doing the things that they're selling it for.
But it is also the case that that's not the only reason that people who build these systems, these surveillance systems, want them. What they say in public and what they actually want in private are not necessarily perfectly matched. And if anybody here has ever been the subject of intense police or state surveillance, you probably know that they're a little duplicitous at times,
even if they're the best of people, which generally they are not. So we can look at this net neutrality debate as well and we can sort of look at it in terms of privacy. And we should say, I think, a big part of it is that it makes it very difficult to even have private space when we have this kind of surveillance because everyone's autonomy in the end-to-end sense
can be violated by anybody that's in the middle. And since the state generally has special relationships with all of these companies, especially companies that have fiber optic cables that are landing there, if we don't take serious, basically the threat to our privacy, I think the threat to our liberty will simply not even occur to us
when it comes down to it because we will miss that battle entirely. In fact, the medium will not be able to be used in defense of itself because the medium will in fact be subverted by these economic interests, but in addition to the economic interest, just quite simply the fact that this privilege
is in itself obviously an economic interest, but it's also in fact an interest that is worth preserving that is not necessarily about a direct dollar or euro transfer. It's very, very useful to have that relationship. So I wanted to say before we took it out to the audience from the stage,
people talk about censorship as if it is somehow independent from surveillance. And one of the things that we need to remember is that censorship is a byproduct of almost all surveillance. And so with the internet, what we've done is we've taken the sum total of all human communications and we have put them into protocols which are very easy for machines to parse.
So for example, when we speak, this is a really hard problem to solve to transcribe what we're saying, and there's a human that's doing it in the background right now. And what the internet has done is it has changed all of us into using a protocol or a series of protocols electronically
that are easy not only to understand in terms of the actual communication that is transmitted, but also in some cases semantically, like defining these friend of a friend protocols or defining these corporate social networks or having state-based ID systems that are mandatory or passports and so on. We start to build a thing that allows us to use machines to correlate
and to put all of this data together. And historically, even simple stuff like a census was enough to wipe out an entire society. So we have to ask ourselves why it is that we are building these systems now, knowing what we know about what people do. Even if the people at Google and Facebook and so on are the best people in the world,
and some of them are really amazing, both technologists and morally they're incredible people, what happens when those systems are not under the control of the moral people anymore? In theory, one could argue that capitalism is the best system that exists or democracy is the best that exists in some way,
and there are many people that have done that, and that's fantastic. But what safeguards exist to make sure that when that stops being the case or that people in the position of power, when they abuse it, that they're held to account? In my country, that has not happened in the last 10 years. And that's very scary when you consider that we have drones that fly and kill children, right? That's a really weird thing, because it seems to me that we cannot ignore that these things are related.
For example, Anwar al-Awlaki is a guy who was a big homophobic, super scary, kind of like guy in every way that you can imagine that pushes all the buttons for Americans with regard to terrorism. And he was killed by a drone in Yemen.
And many people, I would say, did not shed a tear for him. But his crime was basically his associations, and he was associated, as far as I can tell, with some pretty bad guys. It is alleged. He had no trial, of course. He posted a lot of stuff on YouTube, and he was a total asshole, as far as I could tell. And he sent a bunch of email, again, associations.
And for this, he was killed by a flying robot. And that, okay, maybe you're okay with that, because he's a total asshole, right? But his son, who is 16 years old, born in the United States, two weeks later was killed by a drone, and his crime was being the son of a total asshole.
So that kind of internet freedom is maybe not the kind of internet freedom we want. I want the freedom for total assholes like that to not be killed by drones without a trial, and for his children, for example, to not meet the same fate. And it is connected, in fact, when we talk about surveillance, and we talk about privatization,
and when we talk about privilege. It's also important to note that the drone might not be going for what you consider to be a total asshole next time. And the same is true with surveillance and with censorship, which is a product of that surveillance. Because what is a drone killing of a person who publishes things,
if not the most extreme form of censorship? So, with that said, on that cheery topic, I was hoping we could take some questions from the audience? Yeah, definitely. Let's get the audience involved.
Calisthenics. Pardon? Calisthenics. Before we take questions, can everybody stand up? Because I know there are some people that want to leave. Just stand up. Come on. Follow orders. Nobody? Come on. Just stretch out. So now it's your turn. Now it's your turn to... I'm so glad so many of you didn't stand up, but...
Okay. Who's back there? We see you back there. Yeah. Lazy. I was curious. I wanted to hear your opinion. Can you speak a little louder? Yeah. I would like to hear your opinion on this. I think a lot of these control mechanisms, they backfire on the people in power by means of law of unintended consequences and things like that.
So, for instance, I think the whole surveillance thing could die as soon as we have leaks about people in power themselves. Do you understand what I'm trying to say?
Yeah. I think that... Yeah, go ahead. I think it's true that there are unintended consequences, but I do not believe that a system will dismantle itself. Right? I think there are lots of good people that work in governments and in corporations who they contribute a very small part and they feel like the part they're doing is actually doing more good than harm.
And they're not willing to, for example, leak a document about something inside that's happening that's wrong because they know that will be the last document they ever leaked potentially. And so this causes this interesting feedback loop where people ask themselves, is this the thing that's worth throwing myself on the railroad and being run over by the train for?
And, of course, the answer is almost invariably no. So I met a man whose name is Bill Binney. He is probably the most intensely brave person I've ever met from the US government. I have to qualify that. I think Julian Assange is still the bravest person I've ever met. But Bill Binney is incredible because he worked for the NSA for 40 years.
And in 40 years, he fought the Cold War. He probably spied on your parents. I don't know what he did. But he quit the NSA as a whistleblower recently and disclosed, in fact, in public that the NSA has been spying on Americans. An interesting thing that happens at the NSA, though, is that those people often don't care about people that aren't Americans.
Like, so there's this constitutional right for Americans to not be spied on. It's this American exceptionalism. So you're not going to be saved by this American exceptionalism, right? So even the best people. Now, Bill didn't have that exact same thought process. He actually sort of said, I think it is, his exact words are sort of fuzzy to me.
It was a late night last night. But basically what he expressed was that this is the type of thing that is sort of a necessity in our modern world. And it is unfortunate that it is this way. So he sort of viewed it not in an American exceptionalist way. Other people that I've met and talked to from the NSA, they view it in this sense.
Like, it's okay to spy on everybody but not Americans. And they don't actually think it's wrong at all. There's no question of morality for them, even a little bit. Like, that's their job. That's the charter. Legal and illegal are the same as right and wrong. So that's all there is to it. And, you know, for example, that's two people from the NSA.
Well, they're a large organization. In fact, they have a lot of space around the world, like more than almost any other American government agency. And they're tasked with spying on everybody on the whole planet. And if the best you get after about 40 or 50 years is like two or six people and they leave,
I mean, it doesn't really bode well for the system changing itself from the inside, even with the most moral and ethical of people, even people who really believe that the actual core thing is wrong. Even if the people think it's not wrong but they still wish it wasn't happening to Americans, for example. In both of those cases, you're still fucked.
And that's the really sad thing because the structures themselves create that inequality and they make it such that people that are on the inside are not willing to just hang themselves in order to basically protect people they've never even met. I'd also like to add, I mean, you also have to be a little bit careful about when you say the people in power because real power in our society is held by accumulated wealth, right?
So when you're talking about power and holding power, you're talking about holding wealth. So even when certain unexpected consequences and public outcries over leaks or whatever do have leaders that face consequences, those are just representatives of wealth. They're not wealth itself.
So wealth is still held by the same classes and the relations of wealth hasn't changed. And so we have a communications infrastructure in which surveillance and behavioral control is the logic of investment. It is the business model of the platform. Then, you know, firing a few officials now and then in outrage is not ultimately going to change that, right? Unless we have a different business model for our communication platforms,
it simply can't change because it's not structurally possible to change. I mean, it's very clear, I mean, everybody here knows that social media platforms like Facebook are being valued now in the tens, hundreds of billions of dollars, right? So we're talking about a tremendous amount of accumulated wealth, right?
that is available to these platforms. Whereas in decentralized platforms, from things like the very classic ones that built and sold the Internet originally, like email, Usenet, all these kinds of things, they're basically uninvested in completely except to like lock down certain extreme spam. Nobody's really spending any money at all on these things because there's no business model there.
What makes Facebook worth hundreds of billions of dollars is not anything to do with the software or the servers. It has to do with their capacity for surveillance and behavioral control. And so if that is the business model, then changing of representatives or elites is not going to make a significant difference.
So the next question is over here. Sorry. Hi. My question is mostly to Mr. Kleiner. What you described in your opening statement was basically an advertisement-financed business model which is, of course, the business model of companies like Google and Facebook.
But the one thing I'm thinking about was, you know, when I go to a supermarket and pay with my credit card and I actually pay for the service that I get there from the credit card company and from the supermarket, my data is still going to be sold to marketers.
That's something that he didn't mention. The problem is that we have a big hole, basically, which is open for marketers or any kind of company to access my data even though I am paying for the service. Just because you're their customer doesn't mean you can't also be their product.
It's not an either-or situation. But you're right. Social media is only the most obvious example. This business model of data collection permeates all kinds of aspects of modern commerce.
Hi. I came here today, Jacob, because I saw you before and I think you're a brilliant speaker. Listening, I felt increasingly uneasy because the way you describe the world is a division between good and evil. For example, in the beginning you portrayed the transparency world
where you say you don't want to live in a fully transparent world. Could you just define transparency? I actually said I don't want to live in a world where I don't have privacy and I'm spied on. I actually think transparency is good. Transparency is for institutions, privacy is for people. I think this is really a simple thing. Institutional transparency is good.
Personal privacy is also good and they're not in conflict. You can have an institution where you understand how it works and is democratically accountable, but I don't need to know how long your dick is. Sure. And maybe you don't want. That's a fair. That's not a good and evil thing.
It's understanding what's going on and then correcting as it goes. Or maybe it's also use of vocabulary where you can say, okay, when every private person is fully transparent, you can say it's a good thing because you know something is evil or you can say everybody is spying on everybody, like you said, which is a different view on the world when you say if you're transparent,
everybody is spying on each other. Or the way you look at Facebook where you say, okay, you portrayed them as the privileged or the people in power. You can say that. It's fair to say that. But you can have a different view and say if you want to start a social network and you need to fund it somehow, you can ask people for money and say, okay, everybody pay one dollar if you want to be on Facebook,
but nobody is going to do that. So you say, okay, I take your personal data and everybody knows Facebook takes personal data and you take it as a business model. And I think what came with the internet is a business model of collecting data and everybody knows it. And as soon as everybody knows it, it's okay. So you can either pay or you can give your data.
It's the same for gaming where you have games where you pay or you have games where you don't pay but you give data. Can you give me an example of a social platform that's widely available, used by millions and billions that I can pay for? No. And I don't think it's a viable business model. That's it. So then this idea that I have a choice is a false statement, isn't it?
I don't really have a choice. And maybe some business models just don't work and I think business models for social platforms where you have to pay are not going to work. Maybe the point is that society doesn't need to just be focused around business as the only thing that matters and the only measure of what is good. I wouldn't call it evil.
I mean, for example, I think it's really interesting that in Europe welfare is a thing that society has and in America it's a thing that basically Republicans say poor black people are on. Really big difference there, right? Like I really like the European model where welfare isn't the world's worst word ever. And I think it's incredible actually to hear a sort of liberalization argument
from someone who's obviously benefited from a thing that has no business model. Right? How did you get an education? I mean, I assume you did. But it's really frustrating to me to hear this because you're basically fucking your own society over and saying, well, this is like the rational thing, the economic thing to do.
You can actually have privacy and anonymity and the ability to do social networking but you have to have people putting in for it but not by asking them to pay a dollar but by actually showing them the true value of their own social networks. And in fact, just because everyone knows a thing is happening it doesn't actually mean that they're going to be okay with it.
There were actually people that shouted out no when you were making your statement. And you disregard them when you talk about how the business model is more important than their voice. Right? In a democracy, the idea is that each person separated from their wealth is equal in their ability to say things and to be able to affect society. And we have to consider for a moment that this business model
in some cases is totally anti-democratic and it's even built from your own efforts. The telecom lines that are laid across the countries of the whole world, usually related to the ITU come from socialized investment in telecommunications infrastructure and now it's being privatized and sold off
and the argument comes down to, well, it's a viable business model so just get over it, that's the way that it is. And that's bullshit. It's a bullshit argument and I'm sorry to be dismissive of it and we could talk about it for hours afterwards but really when it comes down to it we have to decide what is important to us. And if the most important thing is making Mark Zuckerberg rich
then I think it is inherently an anti-democratic thing that you propose. And we do not need to accept it. There's a Confucian idea that what is inevitable is immoral to resist. This is a Chinese idea. And I think actually, first of all, we have to decide what is inevitable and we have to decide if that philosophy even works at all.
And I would say that it is not inevitable that this is the case because as Dimitri stated earlier we actually started out with business models that were totally viable. They survived on their own. In fact, they evolved to meet this. Maybe we didn't need to go that far. You used to, when you built an activist group
you would have to worry about infiltration. You don't have to worry about it anymore. It's guaranteed when you build it on Facebook. So-called lawful intercept ensures that you don't need to compromise people anymore. The networks do it themselves. But in the event that the network doesn't do it perfectly enough you actually have, in Facebook, the largest database of Jews ever built.
You have StasiBook. You like things, you don't like things and you report on your friends, you tag them in photographs. Who fucking cares if the business model is solid? It's the wrong world to live in. And the result will be devastating.
Hi, my name is Peter and I appreciate very much your talk here. For me it fits quite well with the opening speech by Moglen on Wednesday. He was talking about the change from consuming media
and media consuming us basically the same we are talking about. So if you want to overcome this, we have to understand what is the reason it made us this. I'm a 50-year-old so I remember quite well the time, let's say 20 years ago where it was a statement of progressiveness
to keep your privacy, to be an opponent of central structures. And now it's a statement of progressiveness, obviously, to talk about post-privacy. So what made us do this? Because if we want to overcome it, I think we have to go deeper into the psychology which is behind that.
So maybe it's just convenience. It's so convenient to listen to music via Spotify. Actually, I don't do it because I don't want a central structure to understand what type of music I'm listening to, even if it's convenient. So it's not the reason alone, the convenience, I think.
Again, we have to go back to the paradox of the frame. The reason it's convenient to listen to music on Spotify is not that a centralized platform is inherently more convenient. It's not. The reason is because it was chosen and therefore invested in and therefore iterated towards efficiency. The earlier platforms like Usenet, email, IRC and so forth,
the original decentralized internet platforms that were developed before the commercialization of the internet could also have been iterated. But we didn't. And the reason we didn't is because the social question of how media should be built was answered by private interests. And private interests will build things in certain ways
that meet the business models that capitalists require to capture value. So when the developers of the internet had to pander for funding to the capitalists that were commercializing the internet, they needed to sell them centralization in order to fulfill their business model. Otherwise, they simply would not get the investment.
If you proposed a peer-to-peer Spotify, right, based on Creative Commons music or something like that, you simply would not get any interest from investors because it wouldn't be clear to them where the profit capture is, right? The business model that Mark Zuckerberg explained to the investors
makes a lot of sense to them because it's exactly the same as the business model that funded network television. The business model of surveillance and behavioral control is already known. So that is why. So we had a decentralized network, as you remember, right? And it became centralized because we as a society decided that we were going to build our communication platforms
not socially, not mutually, not publicly, but privately, for the gain of profit. And this is the kind of network that profit will build. And additionally, I would say that there's a legal angle to this. So for example, in some countries, streaming music, or competing with something like Spotify, is actually directly controlled by the state through agreements where the rights holders, often the artists for example,
maybe weren't even at the table together, right? And so you have this interesting situation where, if you wanted to build a distributed Spotify, you know, it would probably look a lot like BitTorrent and the Pirate Bay. And you'll note that it's under attack.
And in the UK, it's now blocked by law. So in fact, part of the problem is actually the state. And the so-called solution comes from private enterprise, and their collusion creates a false choice of centralization. And it creates this situation where we actually have to have these methods of resistance
to be able to visit a website, which is totally ridiculous. I mean, the Tor network, I think, which I work on, that helps provide anonymity for people, you know, it wasn't designed for over-there-a-stan, right? It was designed for everyone to have anonymous speech. It's ironic because I think that as time goes on, people will end up using it for all kinds of things that we never even imagined
because we couldn't imagine that in so-called democratic states, we would have this kind of overt fascist censorship, blocking of things, usually without any consent from the public, often entirely because someone with like a $20 million business really lobbies quite hard. And so this false choice creates a problem where people who want convenience,
who, you know, you're talking about left-wing people, and I don't really know what that means anymore, but this kind of choice that people are making comes from the fact that there's a gap in education about how the technology works. And it's okay not to know how the internet works.
Pretty much nobody knows how the internet works as a whole. And they shouldn't have to, and maybe they don't need to, yeah. I think that's okay, but I think that's part of the reason: convenience is actually a matter of education. And when you tie it back to the funding of the state, you notice that education in a lot of places is being cut. Like, for example, even computer science departments are being pulled from universities.
So the one thing that makes people more literate in this field is even, in some cases, being restrained. There are people obviously working counter to that. I would also say the other reason is sex, which is maybe a little bit of a, you know, slight diversion there, but who here has ever used Facebook to get laid or gone to a party that resulted in that?
Anybody? Not one fucking person in this room? You're all fucking lying. That's ridiculous. What? Seriously, I don't believe it. Well, I'll do it. I don't have a Facebook account anymore, but for sure, right? And you know, that's the network effect,
exerting itself in really interesting, unintended ways. It used to be when you signed up on Facebook and you added a friend, you could say you hooked up with them. And they captured that metadata. And you know, that's a really fascinating thing. It's like, I'm going to add my friend from the party last night. Yep. What kind of world are we building there? And it turns out that you're incentivized to build that kind of network
where you have friends and then you get stuck in it. And you have all of these different returns from different people, and you start to measure those relationships, and, I mean, I think it's a sociopathic way to live, this "this is my friend" thing. I mean, Facebook actually destroyed the word friend, which is kind of nice, right? It's like in America, everyone's your friend.
And actually none of them will be there for you probably. But there's a difference between Americans and Germans that I've noticed actually. Germans never say they're your friend until they're going to take a bullet for you. And Americans are your friends instantaneously and they would never take a bullet for you. I mean, it's a cheap shot, but Facebook's friending is sort of like that.
But the reason you will participate in something like that, I think, is because of things in some cases that are not based on money. So you do trade your data, which is a currency. I mean, Eric Schmidt talked about it as sort of like you bring money or you bring data, you know. That is why people from all sides of the political spectrum use it.
And it's not just the left that is betraying humanity in this case. It's everybody, all of us, because we're incentivized to do it: by our own communal spaces and our own funds being taken from us in some cases, and by the duty of care being rejected. And then we each get our own special deal with these companies, or with the state, and in return we feel okay about it, or we feel like we have no choice.
And there we are. And it makes sense that we would get there. But we have to remember that it doesn't have to be that way just because it sort of is going in that direction right now. And I think it's important to acknowledge that we have the agency for change. You can still go to parties without Facebook. You can still have a social group without it. Riseup, for example, has a service called Crabgrass,
which is sort of like a distributed Facebook application, but unlike Diaspora, it's distributed into islands. So you have, like, a Riseup or Crabgrass instance, and then you can share files and photos and have some social networking aspects, but it's all private by default. Obviously the server can still see everything because of the architecture,
but you have to take the server to be able to get the data. So you can build privatized islands if you want, as in private, not necessarily a company, but affinity groups. And you can do that. And it actually already exists. And a lot of people do that, but you don't really hear about it in the same way because it doesn't advertise itself.
It doesn't buy a telephone and give you service for it, but it doesn't mean that it doesn't exist. So we have to use those things. There's a guy in the audience, a bald guy with glasses back there who's touching his head, if you raise your hand. He's working on building distributed social networking tools and code and stuff like that, which doesn't require massive amounts of capital.
It only requires that each person asserts that they value their privacy and that they want to be able to communicate with each other in a peer-to-peer fashion. And it's not even technically complicated. I mean, it took a lot of effort. But this kind of stuff, he's building an alternative. And now, who's going to fund him? Well, probably nobody is going to fund him fiscally,
but all of you can fund him with your social energy, by rejecting the surveillance state and talking to that guy. So maybe stand up. Say that. So that guy, he's the guy that's going to get you laid at parties. Yeah, so. I mean, just riffing off of what you said,
that if you were to imagine a kind of peer-to-peer Pirate Bay, sorry, a peer-to-peer Spotify, it would look a lot like the Pirate Bay. I think that's very true and that's very clear. One thing that strikes me is that, of course, the media is a great interpreter of things and really creates the context that we understand things in.
And if anybody remembers the early to mid-'90s, peer-to-peer was about the sexiest word in technology magazines. Everybody was talking about distributed systems, distributed computing, intelligent agents that would be filtering and cooperatively doing things for us on this vast, complicated distributed network.
And peer-to-peer was really exciting. It was a really positive word, right? And now, if you think about the way peer-to-peer is used today, all of a sudden peer-to-peer is a sinister word. It's something that's used to describe technologies used by thieves. It's contraband used by terrorists or pirates, right? The very concept of peer-to-peer has shifted from being something really exciting
to being something very dodgy, something really to be avoided, right? And this is because it was completely incompatible with the needs of finance capital to actually build out a system, and with the security needs of the state, right? So when the Internet was actually taken over from the kind of academic, military and hobbyist world
that it was born out of, when it was taken over by the dot-com boom, which we all remember, when capital actually took over this Internet to build it up further, the first thing they wanted to do was get rid of peer-to-peer. And the reason is not so much that they're terribly worried about pirates and terrorists, as you can all imagine.
It's because it evades, again, it evades surveillance and behavioral control, which they need for their business models. Are there any women in the audience that want to ask a question? Any? Well, Jacob, I think we're a bit late. Sorry. Oh, I can't see you. It's me. Where?
Oh, it's over there. Oh, you're over there? All right. I'm sorry, but I think we have to finish it right now. Thank you very much. It was very nice. Thank you.