Your Body is a Honeypot: Loving Out Loud When There’s No Place to Hide
Formal Metadata
Title: Your Body is a Honeypot: Loving Out Loud When There’s No Place to Hide
Number of Parts: 234
License: CC Attribution - ShareAlike 3.0 Germany: You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor and the work or content is shared also in adapted form only under the conditions of this license.
Identifier: 10.5446/33084 (DOI)
Transcript: English(auto-generated)
00:17
Hello. So when we thought about this theme,
00:22
we were both at re:publica in Dublin last year, where we got to see the announcement of this year's re:publica theme, and the first thought that we had was: how can you love out loud in a time of ubiquitous surveillance? And so as we've created this talk, we've set out to answer that question. I should say I'm not representing my organization today, I'm just speaking from my personal perspective,
00:42
and I think, yeah, let's take it away. So, oh, go for it. And we have a lot of content we're going to cover quickly. If anybody wants to discuss things afterwards, I'm more than happy to talk. We've built kind of a long presentation, so we're going to try to fly through, be as informative and deliberate as possible.
01:01
So one of the things that we wanted to discuss is the way in which facial recognition technologies are surrounding us more and more. It's a number of things: one, the retrofitting of old existing systems, like CCTV systems in subways and other transportation systems,
01:21
but also new and miniaturized camera systems that now feed into a backend of machine learning, neural networks, and new AI technologies. There's a combination of data right now: data that we're handing over voluntarily to the social networks that we take part in,
01:40
to the different structures in which we participate, be it medical records or anything across that spectrum. And then there's data or capture that's being taken from us without our consent. So this slide is a quote. Our actions, captured on film, paint a startling, complete picture of our lives.
02:01
I'll give you an example. In 2015, the New York metro system, the MTA, started installing around a thousand new video cameras. And when we think about what this looks like on a daily basis, the things that can be gleaned from this: transportation patterns, which stations you get on and off at. If you are able to be uniquely identified by something like a metro CCTV integrated system,
02:26
even when you wake up, like I said, the stops that you go to, whether you are going to a new stop that you may not normally go to. Even in this one system, a clear picture starts to emerge of our lives and the passive actions that we take on a daily basis
02:44
that we're not even really aware are being surveilled, but they are. And then when that data is combined with all of the other data, both what we're handing over and what's being captured about us, it all comes together to create this picture that's both distorted, but also comprehensive in a way.
03:01
So I think that we have to ask: who's creating this technology and who benefits from it? Who should have the right to collect and use information about our faces and our bodies? What are the mechanisms of control? We have government control on the one hand, capitalism on the other, and a murky gray zone in between over who's building the technology, who's capturing it, and who's benefiting from it.
03:23
This is going to be one of the focuses of our talk today, this interplay between government-driven technology and corporate-driven, capitalist-driven technology. And one of the interesting crossovers to bring up now is the poaching of talent from universities and other public research institutions into the private sector.
03:46
Carnegie Mellon had around 30 of its top self-driving car engineers poached by Uber to start their AI department. And we're seeing this more and more, in which this knowledge capacity from the university and research field is being captured by the corporate sector.
04:02
And so when new advances in technology happen, it's really for-profit companies that are at the tip of the spear now. And one of the things that they do is suck us in with all of the cool features. So just raise your hand real quick if you've ever participated in some sort of website or app or meme that asked you to hand over your photograph
04:21
and in exchange get some sort of insight into who you are. I know there's one going around right now where it kind of changes the gender appearance of a person. Has anyone participated in these? Okay, excellent. I have too. I'm guilty. Does anyone remember this one? So this was a fun little tool for end users, for us to interact with this cool feature.
04:42
And basically what happened was that Microsoft, about two years ago, unveiled this experiment in machine learning. It was a website that could guess your age and your gender based on a photograph. Matthew thought that this was kind of silly to include, like, oh, this isn't a very common example, but I was actually reminded of it on Facebook a couple of days ago
05:01
when it said, oh, two years ago today this is what you were doing. And what I was doing was this, and what it was telling me was that my 25-year-old self was, like, 40 years old. So it wasn't particularly accurate technology, but nevertheless the creators of this particular demonstration had been hoping, they wrote optimistically, to lure in 50 or so people from the public to test their product.
05:23
But instead within hours the site was almost struggling to stay online because it was so popular with thousands of visitors from around the world, and particularly Turkey for some reason. There was another website called FaceMyAge that launched more recently. It doesn't try to just guess at age and gender, but it also asks users to supply information like their age,
05:42
like their gender, but also other things like their marital status, their educational background, whether they're a smoker or not, and then to upload photos of their face with no makeup on, unsmiling, so that they can try to basically create a database that would help machines to be able to guess your age even better. And so they say that, okay, well, you know, smoking, for example,
06:01
ages people's faces, so we need to have that data so that our machines can get better and better at learning this, and then better and better at guessing. And, of course, because this is presented as a fun experiment for people, they willingly upload their information without thinking about the ways in which that technology may or may not be eventually used. So, Matthew, I'm going to hand it over to you for this next example
06:21
because it makes me so angry that I need to cry in a corner for a minute. So, FindFace. VKontakte is the Russian Facebook clone, a large platform with hundreds of millions of users. 410 million, I think. What someone did was basically get access to the photo API.
06:44
So they had kind of the fire hose API; they had access to all the photos on this rather large social media platform. Two engineers from Moscow and St. Petersburg wrote a facial recognition script that is, to date, one of the top facial recognition programs,
07:04
and they were basically able to connect this to a front end, an app that users can use to query the entire VK photo database and return results in a matter of seconds. So it gives you as a user the power to take a photo of somebody on the street,
07:21
query against the entire social media photo database, and get matches that are either an exact match of a person, or people who look similar to the person that you were trying to identify.
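To make that concrete, here is a minimal sketch of how an embedding-based face search like this generally works in principle. This is not NtechLab's actual pipeline; the embed_face helper and the example database are hypothetical placeholders.

```python
# Minimal sketch of embedding-based face search (illustrative only, not
# NtechLab's actual pipeline). Assumes some face-recognition model sits
# behind the hypothetical embed_face() helper.
import numpy as np

def embed_face(image) -> np.ndarray:
    """Hypothetical placeholder: run a face-recognition model and return
    a fixed-length embedding vector (e.g. 128 floats) for the face."""
    raise NotImplementedError("stand-in for a real embedding model")

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def find_similar(query_embedding, database, top_k=10):
    """Rank every stored profile by similarity to the query face.

    database maps profile IDs to pre-computed embedding vectors."""
    scores = [(profile_id, cosine_similarity(query_embedding, emb))
              for profile_id, emb in database.items()]
    scores.sort(key=lambda pair: pair[1], reverse=True)
    return scores[:top_k]

# Usage sketch: photograph someone on the street, embed the face, and
# look up the closest profiles in a pre-embedded photo database.
#   matches = find_similar(embed_face(street_photo), database)
```

At the scale of a platform like VK, a real deployment would presumably swap the linear scan for an approximate nearest-neighbor index, which is what makes returning results in a matter of seconds plausible.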
07:41
organized by the University of Washington. Do you see the interplay already happening between academia and corporations? The challenge was to recognize the largest number of people in this database of more than a million photos, and with their accuracy rate of 73.3%, they beat more than a hundred competitors, including Google.
08:02
Now, here's the thing about FindFace. It also, as Matthew pointed out, looks for similar people. So this is one of FindFace's founders, that is a mouthful of words, Alexander Kabakov, and he said that you could just upload a photo of a movie star you like, or your ex, and then find ten girls who look similar to her and send them messages.
08:23
I'm really not okay with this. And in fact, sorry, I like my joke, but I have to go back a slide to tell you this other story, which made me even more sad: a couple of years ago, an artist called Egor Tsvetkov highlighted how invasive this technology could be. He went through St. Petersburg and photographed random passengers on the subway,
08:42
and then matched the pictures to the individuals' VKontakte pages using FindFace. So in theory, he said, this service could be used by a serial killer or a collector trying to hunt down a debtor. Well, he was not wrong, because what happened was that after his experiment went live and the media covered it,
09:01
there was a lot of coverage of it in the media as an art project, and another group launched a campaign to basically de-anonymize pornography actors in Russia, by using this to identify them from their porn and then harass them. And so this has already been used in the kind of way that we're pointing out as a potential.
09:20
This is already happening. Stalker tech. But I think that one of the really interesting things that Matthew found is that as we were looking through these different companies that create facial analysis technology and emotional analysis technology, the way that they're branded and marketed is really interesting. We can go through some of these examples, even just kind of the slide to slide,
09:41
and see that these are some of the top facial recognition technology companies out there. And what's interesting: we're not going to be talking so much about Google, Facebook, or Amazon, although these companies are important. We're highlighting one of the understated facts of the facial recognition world, which is that there are small companies popping up that are building incredibly powerful
10:01
and sophisticated algorithms that can find facial recognition matches even in low quality and low light, re-rendering and converting from 2D to 3D, and these are companies whose names we don't know. And we can sit back and kind of demonize the large technology companies,
10:21
but there is a lot to be done to hold small companies accountable. And I think you can see the familiar thread through all of these, which is what? Anyone want to wager a guess? Smiling happy faces, usually beautiful women smiling and happy, as we saw back on the contact us page. So rather than focusing on the bad guys, they're focused on this,
10:43
oh look, everyone's really happy when we use these facial recognition technologies. But what a lot of these technologies are doing is, in my opinion, dangerous. So for example, Kairos, which is a Miami-based facial recognition technology company, also owns an emotional analysis technology company that they acquired a couple of years ago
11:00
and wrapped it into their core services. And their CEO has said that the impetus for that came from their customers. Some of their customers are banks. Specifically, when you go up to the bank, maybe a bank teller could use facial recognition technology to identify you, and that would be a better way than you showing your ID or signing something,
11:21
or maybe even than entering a PIN. But sometimes you could have someone maybe that day who comes in to rob the bank, and their face is kind of showing it, and so with that emotional analysis technology, the bank teller could have it indicated to them that today is a day that they will refuse you service. But my immediate thought when I read that was, what about people who live with anxiety?
11:42
What about people who are just in a hurry that day? So you could literally be shut out of your own money because some algorithm says that you're anxious, or you're too emotional that day to be able to do that. Another example that I found really troubling was this one, Cognitech, which is, I think, a Dutch company. Theirs does gender detection.
12:01
So this is used by casinos in Macau as well as in other places. I thought gender detection was a really funny concept because as our societies become more enlightened about gender and the fact that gender is not always a binary thing, these technologies are basically using your facial features to place you in a gender.
12:22
I've tested some of these things online before, where it tries to guess your gender, and it often gets it wrong. But in this case, where does our gender autonomy even fit into this? Do we have gender autonomy if these systems are trying to place us as one thing or another? So one of the things that is really quite startling about this
12:44
is the accumulation of data points. We're talking about the way in which young people are having their faces scanned earlier, in which identification software based on facial recognition technology is being used,
13:03
so that no single action is held in a static kind of container anymore. We're now dealing with dynamic data sets that are continuously being built around us, and one of the issues is that there is an asymmetry in the way in which we're seen by these different systems.
13:27
As Kate Crawford has said, and she presented last year on Stage 1, some of you may have caught that, a lot of her research has gone into the engineering side, looking at the ways in which discriminatory bias replicates inside of new technologies,
13:42
and facial recognition technologies are just one vector of these discriminatory algorithms. So it's all the more important that we take a step back and ask: what are the necessary requirements and protections and safeguards to make sure that we don't end up with things like this,
14:02
with the Google algorithm labeling a black couple as gorillas. And there are a lot of other famous examples of this, but if engineering teams and companies are not thinking about this holistically from the inside, then there may be not only PR disasters like this, but real-world implications of these technologies that are very unsettling and quite scary.
14:26
Now, Google was made aware of this, and they apologized, and they said that there's still clearly a lot of work to do with automatic image labeling and that they're looking at how they can prevent these types of mistakes from happening in the future. And they have done great work on this, but I still think that part of the problem is that these companies are not very diverse,
14:40
and that's not what this talk is about, but I can't help but say it. It's an important facet of why these companies continue to make the same mistakes over and over again. Let's talk a little bit about what happens when it's not just a company making a mistake and identifying people in an offensive way, but when the mistake has real-world implications. So, next slide.
15:03
So, one of the things that is really quite startling and interesting: we were talking about some of the different algorithms for VK, but here is another example. Scientists and technologists around the world are now training facial recognition algorithms on databases.
15:26
In this case, a database of a group of innocent people and a group of guilty people, using machine learning and neural networks to try to discern from the test data who is guilty and who is innocent. Incidentally, their accuracy rate on this particular test was 89.5%.
15:43
So, 89.5%. I mean, for some things, almost 90% is not bad, but we're talking about a roughly 10% rate of either false positives or just errors. And if we're thinking about a criminal justice system in which one out of every 10 people is sentenced incorrectly,
16:03
we're talking about a whole tenth of the population that in the future may not have access to due process because of automated sentencing guidelines and other things. And it's happening now. Nearly one half of American citizens have their face in a database
16:24
that's accessible by some level of law enforcement. And that's massive. That means that for one out of every two adults in the US, their face has been taken from them; their likeness now resides in a database that can be used for criminal justice investigations and other things.
16:46
And so you may not even know that your face is in one of a number of different databases, and yet on a daily basis, these databases may be crawled to look for new matches for guilty people or suspects. But we're not aware of this.
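As a rough, back-of-the-envelope illustration of why a roughly 10% error rate matters at that scale, here is a sketch with invented numbers: the database size loosely follows the one-in-two-adults figure above, the number of genuine suspects and the error rates are hypothetical, and the quoted 89.5% accuracy is used loosely as a hit rate.

```python
# Back-of-the-envelope illustration with invented numbers: what a ~10%
# error rate means when a search runs against a very large face database.
database_size = 125_000_000   # rough, per the "one in two US adults" figure
actual_suspects = 10_000      # hypothetical number of genuine suspects
false_positive_rate = 0.10    # the ~10% error rate discussed above
hit_rate = 0.895              # the quoted 89.5% accuracy, used loosely

false_alarms = (database_size - actual_suspects) * false_positive_rate
true_hits = actual_suspects * hit_rate

print(f"innocent people wrongly flagged:  {false_alarms:,.0f}")
print(f"genuine suspects correctly found: {true_hits:,.0f}")
print("chance that a flagged person is actually a suspect: "
      f"{true_hits / (true_hits + false_alarms):.4%}")
```

Even with these made-up numbers, the pattern is the point: when almost everyone in the database is innocent, a 10% error rate produces millions of false matches for every handful of real ones.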
17:01
And it's not just our faces. I'm going to skip through this part just a little bit because I see our time limit. It's not just our faces. It's also other identifying markers about us. It's our tattoos, which I'm covered with and which now I know. I didn't know when I got them, but now I know that that's a way that I can be identified by police. So I'm going to have to come up with some sort of thing that covers them with infrared.
17:21
I don't even know. It's also our gait. It's the way that we walk. And one of the really scary things about this is that while facial recognition usually requires high quality images, gait recognition does not. If you're walking on the subway platform and the CCTV camera picks you up, and you know that the U-Bahn stations are covered with them, you're walking that way. A low bandwidth image is enough to recognize you by your gait.
17:44
Yesterday, my boss from across the room recognized me by my gait. Even our eyes can do this, so it's possible. The machines have eyes and in some ways minds, and as more of these systems become automated, we believe that humans will be increasingly placed outside of the loop.
18:02
So just another example, going back to New York, to show how we're now in this interconnected world. I don't know if you all are familiar with stop and frisk. It was an unpopular policy in New York City which allowed police officers to essentially go up to anyone
18:21
they might be suspicious about, ask them for ID, and pat them down. Why was this eventually pulled by the police department? There were challenges in court saying it was unconstitutional, and the policy was rescinded before it went to court. But there are now records from the time that it was in place
18:43
of the people who were charged under this policy. It was found to be very discriminatory in the sense that young men of color were disproportionately targeted by this program. So if we're looking at crime statistics, let's just say a readout of the number of arrests in New York City,
19:03
and if we were to put demographic data with that, and then feed this into a machine learning algorithm, if the machine learning algorithm sees that a large percentage of individuals are young black and brown men, what is the machine learning algorithm to think except that these individuals have a higher likelihood of committing crimes?
19:24
In the real world, it was real-world bias by police officers targeting minority communities. But for a machine algorithm, if we're not weighting this sort of information in the test and training data, there's no way for the machine to logically or intuitively see a causal relationship
19:43
between segregation and bias in the real world and the crime statistics that result from that. So, with that in mind, we want to talk a little bit about how we can love out loud in a future where ubiquitous capture is even bigger than it is now.
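A minimal, hypothetical sketch of the bias-replication problem just described: the arrest records below are invented, and the "model" is nothing more than a per-group arrest rate, but it shows how a naive algorithm trained on enforcement data learns where the police looked rather than who offended.

```python
# Hypothetical sketch: a naive "risk" model trained on biased arrest
# records reproduces the enforcement bias baked into its training data.

# Invented records: (neighborhood, arrested). The over-policed area
# generates far more arrests regardless of the true offense rate.
records = (
    [("over_policed", True)] * 800 + [("over_policed", False)] * 200
    + [("lightly_policed", True)] * 50 + [("lightly_policed", False)] * 950
)

def learned_risk(records, group):
    """The 'model': simply the arrest rate the data shows for each group."""
    outcomes = [arrested for g, arrested in records if g == group]
    return sum(outcomes) / len(outcomes)

for group in ("over_policed", "lightly_policed"):
    print(group, f"-> learned risk: {learned_risk(records, group):.0%}")

# Prints roughly 80% vs 5%. Nothing in the data tells the algorithm that
# this gap reflects where officers chose to look, not who committed crimes.
```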
20:03
I'm just letting him have a sip of water. There we go. So Matthew, tell me a little bit about this example from Sesame Credit. Has anyone heard of Sesame Credit? Okay, so we've got a few people who are familiar with it. So Sesame Credit is a system that's now been implemented in China. There's an ongoing roll out. It's a social credit rating, essentially.
20:24
It uses a number of different factors. It's been pioneered by Ant Financial, which is a large financial institution in China. It's a company, not technically a state-owned enterprise, but the line is fuzzy when it comes to its relationship with the ruling Chinese Communist Party.
20:42
The interesting thing about this new system is that it uses things like your WeChat account to see who you are talking with. And if people are discussing sensitive things, your social credit rating can be docked.
21:02
So we're seeing this system being developed right now in China that brings in elements from your social life, your network connections, as well as things like your credit history, to paint a picture of how good a citizen you are. Alipay is now launching in the US, it was announced today, and they're now pioneering some very sophisticated technology like iris scans.
21:29
And the contactless payment market in China has exploded from a few hundred million dollars to hundreds of billions of dollars. And so this technology, even in the last 24 months, has become much more prevalent
21:44
and has become much more ubiquitous for its capacity for individual surveillance. And this is where life begins to resemble an episode of Black Mirror. So what we'd like to remind you is that digital images aren't static. With each new development, each sweep of an algorithm, each time you put something there, you've left it there.
22:03
I know that I've got hundreds, possibly thousands, of images sitting on Flickr. With each new sweep of an algorithm, these images are being reassessed. They're being reconsidered and re-evaluated against other data that this company, or X company, or a government might have on you. What you share today may mean something else tomorrow.
22:23
So right now we feel that there's no universal reasonable expectation that exists between ourselves and our technology. The consequence of data aggregation is that increased capture of our personal information results in this more robust, yet distorted picture of who we are that we mentioned at the beginning. And so I think that we want to take the last few minutes, and we'll try to leave a few minutes for questions,
22:44
just to talk about the emerging social contract that we would like to see forged between us and technology companies and governments. We can't see behind the curtain. We have no way of knowing how the collection of our visual imagery is even being used, aggregated, or repurposed.
23:02
And we want to remind you also that these technologies are mechanisms of control. And so the first question that I want to ask personally is: what kind of world do we want? I think that's the starting point, asking: do we want a world where our faces are captured all the time, where I can walk down this hallway and have different cameras that
23:22
are attached to different companies that have different methods and modes of analysis, looking at me and trying to decide who I am and making determinations about me. But perhaps we're past that point, and so we've decided to be pragmatic a little bit in trying to formulate some things that we can do. So in terms of what we want, we want an active life that's free from passive surveillance,
23:43
we want more control over our choices and over the images that we share, and we want a technology market that isn't based on selling us out to the highest bidder. Luckily there are some people working on all of these things right now, not just us, and so we feel really supported in these choices. I'm going to turn to you for regulation. So I've had the opportunity to sit in on a couple of sessions, smart cities sessions yesterday,
24:06
talking about the development of smart cities in Barcelona and Berlin, as well as smart devices. We are, I think, seeing to some degree the development of a global set of best practices, but it is very piecemeal and fragmented. And I think that if we think about image recognition technology as kind of a layer that sits on top of many of the other modes of technological development,
24:29
then it becomes clear that we actually need to have some sort of best practices as biometric databases continue to be aggregated.
24:41
It's very difficult for one person in one country to have reasonable expectations of what the best practices are going forward. I mean, maybe today it's possible, but 20 years from now, what does the world look like that we want to live in? So these are some of the areas where we think governments, and particularly local governments, can intervene.
25:01
We also think that we can have better privacy standards for technology, and we know that there are a lot of privacy advocates here and that those things are already being worked on too, so we want to acknowledge the great work of all the organizations that are fighting for this. But we see one way to do this: segment privacy by feature, by location, visual representation, search, social, and movement, all of the different areas in which our privacy is being violated.
25:22
We also want user-centric privacy controls. Right now, most privacy controls on the different platforms that you use are not really user-friendly. And trust me, I spend a lot of time on these platforms. And then another thing that's really important to me (in the rest of my life I work on censorship) is forward consent. I don't feel that I am consenting to these 15-page terms and conditions documents
25:44
that companies try to make as confusing for me as possible. And so I think that if companies keep in mind forward consent every time that you use their service, that's one way that they can manage this problem. But also, we think that you have to keep loving out loud, that you can't hide, that you can't live in fear, that just because these systems are out there,
26:01
yes, of course we have to take precautions. I talk a lot in my day job about digital security, and I think that this is that same area where we have to continue living the way that we want to live. We can take precautions, but we can't sacrifice our lives out of fear. And so one thing is photo awareness.
26:20
We're really glad to see that at a lot of conferences recently, there have been ways of identifying yourself if you don't want to be photographed. But also in clubs in Berlin, if anyone's ever gone clubbing, and if you haven't, you probably should, usually they put a sticker over your camera when you walk in the front door. And that, to me, the first time I saw that, I was so elated that I think I danced till 7 a.m.
26:41
I mean, that's not normal. We also think that, you know, you should regulate your own spaces. I don't want to get into this division of public versus private property; that's not what I'm here for. But I think that we have our own spaces, and we decide inside of those what's acceptable and what isn't, and that includes conferences like this. So if we feel in these spaces, and I don't really know what's going on with cameras here, whether they exist or not,
27:01
but if we feel in these spaces that that's something we want to take control of, then we should band together and do that. Put static in the system. This is a general point, but for any system that is focused on us, how can we find ways to put static in the system,
27:21
whether this is tagging a photo that's not you with your name on Facebook, or other strategies of that sort, right? How can we think a little more outside the box to actually confuse the algorithms or to make their jobs a little more difficult? And there are different ways to do this, whether it's wearing kind of reflective clothing,
27:44
anti-paparazzi clothing in public, or tagging things under one label that may not actually be that. It also means wearing those flash-proof garments, covering your face, going to buy your burner phone in the store wearing a Halloween costume. I'm not saying you should do that, but you should totally do that.
28:03
And continuing to love out loud and not live in fear. And I can see that we've completely run out of time because I think the schedule is a little bit behind, but there's some good news. If you want to keep talking to us about this, we're both pretty easily accessible, but we're also going to take advantage of that sunlight that didn't exist yesterday and go out back for a celebratory beer. So if you want to keep talking about this subject, you're welcome to join us out there.
28:23
Thank you so much.