Your Body is a Honeypot: Loving Out Loud When There’s No Place to Hide

Transcript
Hello. So, when we thought about this: we were both at re:publica Dublin last year, where we got to see the announcement of this year's theme, and the first thought we had was, how can you love out loud in a time of ubiquitous surveillance? As we created this talk, we set out to answer that question. I should say that I'm not representing my organization today; I'm just speaking from my personal perspective. We'll move through this quickly, and if anybody wants to discuss things afterward, we're more than happy to talk. We don't usually give slide-heavy presentations, so we'll try to fly through these as informatively as possible.

One of the things we want to discuss is the way in which facial recognition technologies are surrounding us more and more. It's a combination of things: the retrofitting of old existing systems, like CCTV and transportation systems, as well as new, miniaturized camera systems that are now connected to a back end of machine learning, neural networks, and other new AI technologies. It's also a combination of data: data that we're handing over voluntarily to the social networks we take part in, to the different structures in which we participate, to medical records, anything across the spectrum, and data that is being captured from us without our consent.

Together, these captured actions paint a startlingly complete picture of our lives. For example, in 2015 the New York metro system, the MTA, started installing around a thousand new video cameras. Think about what that looks like on a daily basis and what can be gleaned from it: your transportation patterns, which stations you get on and off at. If you can be uniquely identified by a system like this, and CCTV systems are increasingly integrated, it can reveal even when you wake up, where you start your day, whether you're going to a new stop you don't normally go to. Even from this one system, a rich picture emerges of our lives and of the actions we take every day, patterns we're not really aware are being revealed.
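As an aside, the kind of inference described here is trivial once camera sightings are tied to an identity. A minimal, purely illustrative sketch: the sightings, station names, and the home/work heuristic below are invented, not anything the MTA system is documented to do.

```python
from collections import Counter
from datetime import datetime

# Hypothetical sightings of one uniquely identified rider: (timestamp, station).
sightings = [
    ("2015-03-02 08:05", "Borough Hall"), ("2015-03-02 18:40", "14 St-Union Sq"),
    ("2015-03-03 08:12", "Borough Hall"), ("2015-03-03 18:55", "14 St-Union Sq"),
    ("2015-03-04 07:58", "Borough Hall"), ("2015-03-07 23:10", "Bedford Av"),
]

morning, evening = Counter(), Counter()
for ts, station in sightings:
    hour = datetime.strptime(ts, "%Y-%m-%d %H:%M").hour
    (morning if hour < 12 else evening)[station] += 1

home_station = morning.most_common(1)[0][0]   # most frequent morning sighting
work_station = evening.most_common(1)[0][0]   # most frequent evening sighting
unusual = [s for _, s in sightings if s not in (home_station, work_station)]

print("likely home station:", home_station)
print("likely work-side station:", work_station)
print("out-of-pattern visits:", unusual)      # e.g. a late-night trip to a new stop
```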
When that data is combined with all the other data we're handing over and that's being captured about us, it comes together to create a picture that is both distorted and comprehensive in a troubling way. So I think we have to ask: who's creating this technology, and who benefits from it? Who should have the right to collect and use information about our faces and our bodies? What are the mechanisms of control? We have government control on the one hand and capitalism on the other, and a murky gray zone in between, where it's unclear who is building the technologies doing the capturing and who is benefiting from them. That interplay between government-driven technology and corporate, capitalism-driven technology is one of the focuses of our talk today, and we want to look at some interesting crossovers.

One to bring up now is the poaching of universities and other public research institutions by the private sector. Carnegie Mellon had around thirty of its top self-driving-car engineers poached by Uber to start its AI department. We're seeing more and more of this knowledge capacity being captured from the university and research field by the corporate sector, so when new advances in technology happen, it's really the for-profit companies that are driving the field now. And one of the things those companies do is package all of this up as cool features.
Just raise your hand quickly if you've ever participated in some sort of website or app or meme that asked you to hand over your photograph and, in exchange, gave you some sort of insight into who you are. There's one going around right now that changes the apparent gender of a person's face; has anyone participated in that one? OK, excellent. I have too, I'm guilty as well. Do you remember this one?

This is one of those little tools for end users, a cool feature for us to interact with. What happened is that about two years ago Microsoft built a machine learning experiment, a website that could guess your age and your gender based on a photograph. He thought this was kind of a silly example to include, like "oh, this is a very common example," but I was actually reminded of it on Facebook a couple of days ago, when it showed me "two years ago today, this is what you were doing," and what I was doing was this, and it was telling me that my 25-year-old self looked about 40. So it wasn't particularly accurate technology. Nevertheless, the creators of this demonstration had optimistically hoped to draw in 50 or so members of the public to test the product; instead, within hours the site was struggling to stay online because it was so popular, with thousands of visitors from around the world, and particularly from Turkey for some reason.

There's another website, called Face My Age, that launched more recently. It doesn't just guess at age and gender; it also asks users to supply information like their age, their gender, their marital status, their educational background, and whether or not they're a smoker, and then to upload photos of their face with no makeup on, unsmiling, so that it can build a database that helps machines guess your age even better. Their reasoning is that smoking, for example, ages people's faces, so they need that data so that the machine can get better and better at learning this, and better and better at guessing. And of course, because this is presented as a fun experiment, people willingly upload the information without thinking about the ways in which that technology may eventually be used.
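Under the hood, demos like these are ordinary supervised learning: labeled photos in, an age estimate out. A minimal sketch under assumed inputs; the embeddings, the smoker flag, and all the data below are synthetic stand-ins, not Face My Age's actual pipeline.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Assumed inputs: 128-d face embeddings plus a self-reported "smoker" flag,
# with the user-supplied age as the label. Everything here is synthetic.
n = 1000
embeddings = rng.normal(size=(n, 128))
smoker = rng.integers(0, 2, size=(n, 1))
age = 25 + embeddings[:, 0] * 8 + smoker[:, 0] * 5 + rng.normal(scale=3, size=n)

X = np.hstack([embeddings, smoker])          # extra attributes become extra features
X_train, X_test, y_train, y_test = train_test_split(X, age, random_state=0)

model = Ridge(alpha=1.0).fit(X_train, y_train)
print("mean absolute error (years):",
      np.abs(model.predict(X_test) - y_test).mean().round(2))
```

The point is less the model than the pipeline: every playful upload becomes another labeled training example.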
So, brace yourselves for this next example, which makes me so angry that I need to cry in a corner for a minute.

VK is the Russian Facebook clone, a large platform with hundreds of millions of users, 410 million I think. What someone did was get access to its photo API, which gave them access to essentially all of the photos on this rather large social media platform, and they connected that to NTechLab's facial recognition, which is one of the top face recognition programs to date. On the front end is an app, FindFace, that lets users query the entire VK photo database and get results back in a matter of seconds. That gives you, as a user, the power to take a photo of somebody on the street, query an entire social media photo archive, and get matches: either the actual person, or people who look similar to the person you were trying to identify. NTechLab, which built this, won one of the major benchmarks in the field, the MegaFace challenge, a sort of world championship in face recognition organized by the University of Washington (the interplay between academia and corporations is already happening here). The challenge is to recognize a large number of people in a database of more than a million photos, and their accuracy rate was 73.3 percent, beating around a hundred other competitors, including Google.
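Mechanically, a service like this reduces to computing a face embedding for every photo once, then answering each query with a nearest-neighbour search. A toy sketch: the embed() function below is a random placeholder standing in for a real face-recognition model, and nothing here reflects FindFace's actual code.

```python
import numpy as np

def embed(image) -> np.ndarray:
    """Placeholder: a real system would run a face-recognition network here
    and return an L2-normalised embedding, e.g. 512 floats."""
    rng = np.random.default_rng(abs(hash(image)) % (2**32))
    v = rng.normal(size=512)
    return v / np.linalg.norm(v)

# Index the whole photo collection once (the expensive, offline step).
gallery = {f"user_{i}": embed(f"photo_{i}") for i in range(10_000)}
names = list(gallery)
matrix = np.stack([gallery[n] for n in names])   # shape (10000, 512)

def top_matches(query_image, k=5):
    q = embed(query_image)
    scores = matrix @ q                          # cosine similarity (unit vectors)
    best = np.argsort(scores)[::-1][:k]
    return [(names[i], float(scores[i])) for i in best]

print(top_matches("street_snapshot.jpg"))
```

The per-query work is just the matrix multiplication; at a much larger scale an approximate index would replace it, but the shape of the system is the same.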
Now, here's the thing about FindFace: as Matthew pointed out, it also looks for similar people. One of FindFace's founders, Alexander Kabakov, and this is a mouthful, said that you could just upload a photo of a movie star you like, or your ex, and find ten girls who look similar to her and send them messages. I am really not OK with this. And in fact, if you liked my joke, I can go back a slide to tell you this other story, which may be even sadder. A couple of years ago an artist, Egor Tsvetkov, highlighted how invasive this technology could be: he went to St. Petersburg, photographed random passengers on the subway, and then matched the pictures to the individuals' VK pages using FindFace. In theory, he said, the service could be used by a serial killer or a collector trying to hunt down a debtor. Well, he was not wrong, because after his experiment went live and got a lot of coverage in the media as an art project, another group launched a campaign to basically deanonymize pornography actors in Russia, using this technology to identify them from their films and harass them. So this has already been used in exactly the kind of way we're pointing out it has the potential to be; it's already happening.

One of the really interesting things that Matthew found, as we were looking through these different companies that create facial analysis and emotional analysis technology, is the way they're branded and marketed, and we can go through some of these examples.
Even just on a slide like this, you can see some of the top facial recognition technology companies out there. What's interesting is that we're not really talking so much about Google, about Facebook, about Amazon. All of those companies are important, but one of the understated facts of the face recognition world is that there are small companies popping up that are building incredibly powerful and sophisticated algorithms, able to find facial recognition matches even in low-quality, low-light footage, doing re-rendering and conversion from 2D to 3D, these sorts of things, and they are companies whose names we don't know. We can sit back and demonize the large technology companies, but there is a lot to be done to hold these small companies accountable too. And I think you can see the familiar thread in their marketing; anyone want to guess? Smiling, happy faces, usually beautiful women, smiling and happy, as we saw back on the contact page. Rather than focusing on the bad guys, the marketing focuses on how almost everyone is really happy when we use these recognition technologies. But what a lot of these technologies are
doing is, in my opinion, dangerous. For example, Kairos, a Miami-based facial recognition technology company, also owns an emotional analysis technology company that it acquired a couple of years ago and has wrapped into its core services. The CEO has said that the impetus for that came from their customers. Some of their customers are banks, and the idea is that when you go to the bank, a teller could use facial recognition technology to identify you, which would be better than you showing ID or signing something, or maybe even than entering a PIN. But sometimes someone might come in to rob the bank, and their face kind of shows it, so with that emotional analysis technology the system could indicate to the bank teller that today is the day they should refuse you service. My immediate thought when I read that was: what about people who live with anxiety? What about people who are just in a hurry that day? You could literally be shut out of your own money because some algorithm says you're anxious or too emotional that day.

Another example I found really troubling was this one, which I think is a Dutch company that does gender detection; it's used by casinos in Macau, among other places. I used to think gender detection was a really funny concept, but as our societies become more enlightened about gender, and about the fact that gender is not always a binary thing, these technologies are basically using your facial features to place you in a gender. I've tested some of these things online before; they try to guess a gender and often get it wrong. But in this case, where does our gender autonomy even fit in, if these systems are trying to place us as one thing or another?
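It helps to write the hypothetical decision rule down to see why the anxious customer and the robber look identical to it. This is not Kairos's product, just an illustration of how any fixed threshold on an "agitation" score behaves.

```python
from dataclasses import dataclass

@dataclass
class Customer:
    name: str
    anxiety_score: float   # assume some model emits 0.0 (calm) .. 1.0 (agitated)

ANXIETY_THRESHOLD = 0.7    # made-up cut-off a bank might configure

def teller_alert(c: Customer) -> str:
    # The rule cannot tell *why* someone is agitated: robbery nerves,
    # an anxiety disorder, or simply running late all look the same to it.
    if c.anxiety_score > ANXIETY_THRESHOLD:
        return f"flag {c.name}: consider refusing service"
    return f"serve {c.name} normally"

for c in [Customer("calm regular", 0.2),
          Customer("person with an anxiety disorder", 0.85),
          Customer("someone late for work", 0.75),
          Customer("actual robber", 0.9)]:
    print(teller_alert(c))
```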
One of the things that is really quite startling about this is the accumulation of data points: the way in which young people are having their faces scanned at school, and the way in which more and more identification software is being layered onto everything. With facial recognition it's no longer one action held in a static container; what we're dealing with now are dynamic datasets that are continuously being built around us. And one of the issues with this is that there is an asymmetry in the way we are seen by these different systems. Kate Crawford, who presented last year on Stage 1, as some of you may recall, has put a lot of her research into the ways in which discrimination and bias replicate inside new technologies. Facial recognition is just one vector of these discriminatory algorithms, so it's more and more important that we take a step back and ask what requirements, protections, and safeguards are necessary to make sure we don't end up with things like this: the Google algorithm labeling a black couple as gorillas. There are a lot of other famous examples of this, but if the engineering teams at these companies are not thinking about this acutely from the inside, there will be more PR disasters like this one, and the real-world implications of these technologies are very unsettling and quite scary. Google was made aware of this and apologized, saying there is "still clearly a lot of work to do with automatic image labelling, and we're looking at how we can prevent these types of mistakes from happening in the future," and they have done real work on it since. But I still think part of the problem is that these companies are not very diverse, and that's not talked about enough. I can't help but say it: it's an important facet of why these companies continue to make the same mistakes over and over again.

Let's talk a little bit about what happens when it's not just a company making a mistake in identifying people in an offensive way, but when a mistake has real-world implications.
Excellent. So, one of the things that is really quite startlingly interesting: we were talking about some of the different algorithms built on VK, and here is another example. Scientists around the world are now training visual recognition algorithms on databases of, in this case, a group of innocent people and a group of guilty people, and using machine learning and neural networks to try to discern, from test data, who is guilty. The accuracy rate on this particular task was 89.5 percent. I mean, for something like this, almost 90 percent doesn't sound bad, but we're talking about roughly a ten percent rate of false positives or other errors. Do we want a criminal justice system in which one out of every ten people is sentenced incorrectly? We're talking about a whole tenth of the population who, at some point in the future, may not have proper access to due process because of automated sentencing guidelines and other things like this.
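The worry is easy to put in numbers: even a seemingly high accuracy collapses once it is applied to a large, mostly innocent population. A back-of-the-envelope sketch; the 89.5 % figure is from the talk, while the population size and base rate are invented for illustration.

```python
population = 10_000_000        # people scanned (assumed)
guilty_rate = 0.001            # assume 0.1 % of them are actually "guilty"
accuracy = 0.895               # figure quoted in the talk; treated here as both
                               # the true-positive and true-negative rate

guilty = population * guilty_rate
innocent = population - guilty

true_positives = guilty * accuracy
false_positives = innocent * (1 - accuracy)

print(f"flagged in total:        {true_positives + false_positives:,.0f}")
print(f"correctly flagged:       {true_positives:,.0f}")
print(f"innocent people flagged: {false_positives:,.0f}")
# Chance that a flagged person is actually guilty:
print(f"precision: {true_positives / (true_positives + false_positives):.1%}")
```

Under these assumptions, over a million innocent people are flagged for roughly nine thousand correct hits; fewer than one flag in a hundred points at a guilty person.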
And it's happening now: over half of American adults have their face in a database that is accessible by some level of law enforcement. That's massive. It means that one out of every two adults in the US has had their face, their likeness, taken from them, and it now resides in a database that can be used for criminal justice investigations and other things. You may not even know that your face is in one of a number of different databases, and yet on a daily basis those databases may be crawled to look for new matches, for suspects, without you ever being aware of it. I'll skip through this next part a little bit, since we're running up against our time limit.
And it's not just our faces; it's also other identifying markers about us, like our tattoos, which I have. I didn't know it when I got them, but now I know I can be identified by police through them, so I'll have to come up with something that covers them. It's also our gait, the way that we walk, and one of the really scary things about this is that while facial recognition usually requires high-quality images, gait recognition does not. If you're walking on the subway platform and a CCTV camera picks you up, and you know the U-Bahn stations are covered with them, a low-bandwidth image of you walking is enough to recognize you by your gait. Yesterday my boss recognized me from across the room by my gait; even our eyes can do this, so it's certainly possible for machines to do it, and in some ways better than our minds. And as more of these systems become automated, we believe that humans will be increasingly placed outside of the loop.
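One reason gait survives poor footage is that the classic approach needs no facial detail at all: it averages a walker's silhouette over a gait cycle into a single "gait energy image" and compares those. A toy sketch with synthetic silhouettes; a real pipeline would first extract them from video.

```python
import numpy as np

def gait_energy_image(silhouettes: np.ndarray) -> np.ndarray:
    """Average a stack of binary silhouettes (frames, height, width)
    over one gait cycle: the classic Gait Energy Image."""
    return silhouettes.mean(axis=0)

def match(query_gei: np.ndarray, gallery: dict) -> str:
    # Nearest neighbour by plain Euclidean distance between GEIs.
    return min(gallery, key=lambda name: np.linalg.norm(gallery[name] - query_gei))

rng = np.random.default_rng(1)
# Synthetic stand-ins: each "person" is 30 frames of a 64x44 low-resolution silhouette.
gallery_walks = {name: (rng.random((30, 64, 44)) > 0.5).astype(float)
                 for name in ["alice", "bob", "carol"]}
gallery = {name: gait_energy_image(frames) for name, frames in gallery_walks.items()}

# A new, noisy observation of "bob": the same walk with 10% of pixels flipped,
# standing in for a grainy, low-bandwidth camera feed.
noisy = gallery_walks["bob"].copy()
flip = rng.random(noisy.shape) < 0.10
noisy[flip] = 1 - noisy[flip]

print("matched as:", match(gait_energy_image(noisy), gallery))   # -> bob
```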
So, just another example, going back to New York, to show how these things interconnect. Is anyone familiar with stop-and-frisk? It was an unpopular policy in New York City which allowed police officers to essentially go up to anyone they were suspicious about, ask them for ID, and pat them down. It was eventually pulled: there were challenges in court saying it was unconstitutional, and the policy was rescinded. But there are now records from the time it was in place, and the records of the people charged under this policy show it was very discriminatory, in the sense that young men of color were disproportionately targeted by the program. So if we look at crime statistics, say a readout of the number of arrests in New York City, combine demographic data with that, and then feed it into a machine learning algorithm, what happens? If the algorithm sees that a large percentage of those individuals are young, black, and brown, what is it to conclude, except that these individuals have a higher likelihood of committing crimes? In the real world it was real-world bias by police officers who were targeting minority communities, but unless we weigh that sort of information in the test and training data, there is no way for a machine to logically or intuitively see the causal relationship between segregation and bias in the real world and the crime statistics that result from it.
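The feedback loop is worth spelling out: a model trained on arrest records learns the patrolling pattern, not the underlying behaviour. A deliberately simplified sketch on synthetic data; no real statistics are used here.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 20_000

# Synthetic world: two neighbourhoods with the SAME true offending rate,
# but neighbourhood 1 is patrolled far more heavily.
neighbourhood = rng.integers(0, 2, size=n)          # 0 or 1
true_offending = rng.random(n) < 0.05               # identical 5% everywhere
patrol_rate = np.where(neighbourhood == 1, 0.9, 0.1)
arrested = true_offending & (rng.random(n) < patrol_rate)

# Train on what the records actually contain: arrests, not offending.
X = neighbourhood.reshape(-1, 1)
model = LogisticRegression().fit(X, arrested)

for hood in (0, 1):
    p = model.predict_proba([[hood]])[0, 1]
    print(f"neighbourhood {hood}: predicted 'risk' {p:.1%} "
          f"(true offending rate is 5.0% in both)")
```

The model is perfectly "accurate" about arrests while being wildly misleading about behaviour, which is exactly the trap described above.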
So with that in mind, we want to talk a little bit about how we can love out loud in a future where ubiquitous capture is even bigger than it is now. Can you tell us a little bit about this example, Sesame Credit? Has anyone heard of Sesame Credit? OK, a few people are familiar with it. Sesame Credit is a system that is now being implemented in China; there's an ongoing rollout. It's essentially a social credit rating, and it uses a number of different factors. It's run by Ant Financial, which is a large financial institution in China; it's one company doing this, an Alibaba affiliate if I'm not mistaken, but there are others, and it gets fuzzy when it comes to the role of the ruling Chinese Communist Party. The interesting thing about this new system is that it looks into things like your purchases and your accounts, and who you're talking with, and if the people you're talking with are discussing sensitive things, your social credit rating can be docked.
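The actual scoring formula is not public, so the following is only a caricature meant to make the structure visible: several unrelated aspects of a life collapsed into one number, including what your contacts do. Every field and weight here is invented.

```python
# Purely illustrative weights and fields; the real Sesame Credit formula is not public.
def social_credit_score(person: dict) -> float:
    score = 600.0
    score += 50 * person["on_time_payment_rate"]          # financial history
    score -= 30 * person["purchases_flagged_frivolous"]   # consumption profiling
    score += 20 * person["verified_identity"]             # how much you have disclosed
    # The part the talk emphasises: your network is scored, not just you.
    score -= 40 * person["contacts_discussing_sensitive_topics"]
    return max(350.0, min(950.0, score))

print(social_credit_score({
    "on_time_payment_rate": 0.95,
    "purchases_flagged_frivolous": 1,
    "verified_identity": 1,
    "contacts_discussing_sensitive_topics": 2,
}))   # the same person scores lower purely because of who they talk to
```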
So we're now seeing this system being developed in China that brings in elements of your social life and your network connections, as well as things like your credit history, to paint a picture of how good a citizen you are. And it's now launching in the US as well; that was announced just today. China is also pioneering some very sophisticated technology, like iris scans: the market for iris scanning in China has exploded to some two billion dollars. In the last 24 months this technology has become much more prevalent and much more ubiquitous, with far greater capacity for individualized surveillance, and your life begins to resemble an episode of Black Mirror.

So we'd like to remind you that digital images are not static. With each new development, each sweep of an algorithm, each time, think of everything you've left out there. I know that I've got hundreds, possibly thousands, of images sitting on Flickr. We cannot rule out that these images are being reassessed, reconsidered, and re-analyzed to match other data that this company, or some other company, or a government might have on you. What you share today may mean something else tomorrow.
So right now we feel that there is no universal reasonable expectation that exists between us, ourselves, and our technology. The consequence of data aggregation is that increased capture of our personal information results in a more robust yet distorted picture of who we are, the one we mentioned at the beginning. So I think we'll take the last few minutes, and we'll try to leave a few minutes for questions, to talk about the emerging social contract that we would like to see forged between us, technology companies, and governments. Right now we can't see behind the curtain: we have no way of knowing how the collection of our visual imagery is being used, aggregated, or repurposed, or who ultimately controls these technologies and mechanisms of control. And so the first question I want to ask, personally, is: what kind of world do we want? I think that's the starting point, asking whether we want a world where our faces are captured all the time, where I can walk down a hallway and have different cameras, attached to different companies, with different methods and modes of analysis, looking at me, trying to decide who I am and making determinations about me. But perhaps we're past that point, so we decided to be a little pragmatic and try to formulate some things that we can do.

In terms of what we want: we want an active life that's free from passive surveillance, more control over choices about the images we share, and a technology market that is not based on selling us out to the highest bidder.
Luckily, there are people working on all of these things right now, not just us, so we feel really supported in these choices. Some of this is regulation. I had the opportunity to sit in on a couple of smart-cities sessions yesterday that talked about the development of smart cities in Barcelona and Berlin, and about smart devices. We are, I think, seeing to some degree the development of a global set of best practices, but it's very piecemeal and fragmented. If we think about image recognition technology as a layer that sits on top of many of the other modes of technological development, it becomes clear that we actually need some sort of shared best practices. As biometric databases continue to be aggregated, it's very difficult for one person in one country and another person in a different country to have reasonable expectations of what the best practices are. Maybe today it's still possible, but what will the world look like down the road? So these are some of the areas where we think governments, particularly local governments, can intervene.

We also think we can have better privacy standards for technology. We know there are a lot of privacy advocates here and that these things are already being worked on, and we want to acknowledge the great work of all the organizations fighting for this. One way we see to do it is privacy by feature: location, visual representation, search, social, movement, all these different areas in which our privacy is being violated. We also think about user-centric privacy controls: right now, most privacy controls on the platforms you use are not really user friendly, and trust me, I've spent a lot of time with them. Another thing that's really important to me, since in the rest of my work I focus on censorship and I think a lot about consent, is that I don't feel I am consenting through these fifteen-page terms and conditions documents that companies make as confusing as possible. If companies kept consent in mind every time you use the service, that would be one way to manage this problem.

But we also think that you have to keep loving out loud, that you can't hide, you can't live in fear just because these systems are out there. Yes, of course we have to take precautions. I talk a lot in my day job about digital security, and this is the same idea: we have to continue living the way we want to live. We can take precautions, but we can't sacrifice our lives out of fear.
One thing is awareness, and we're really glad to see that at a lot of conferences recently there have been ways of identifying yourself if you don't want to be photographed. Also, in the clubs in Berlin (if anyone's ever gone clubbing; if you haven't, you probably should), they usually just cover your phone camera when you walk in the front door. The first time I saw that I was so elated that I think I danced till 7. We also think about our own spaces. I don't want to get into the whole division of public versus private property, that's not what I'm here for, but I think we have our own spaces, and inside of those we decide what's acceptable and what isn't, and that includes conferences like this. I don't really know what's going on with the cameras here, whether they exist or not, but if we feel in a space that there is something we want control over, we should come together and do that.

And then there's static. This is a general point, but in a system that is focused on us, how can we find ways to put static into the system, whether that's tagging a photo that isn't you with your name on Facebook, or other strategies along those lines? There are ways in which, if we think a little outside the box, we can actually confuse the algorithms, or at least make their jobs a little more difficult, and there are always ways to do this: wearing reflective, anti-paparazzi clothing in public, or tagging things under a label that may not be accurate. It can also mean wearing those reflective garments, covering your face, going to buy your burner phone in a store, wearing a Halloween costume. I'm not saying you should do all of that; but do things out loud, and don't live in fear.

I can see that we've now completely run out of time, I think the schedule is a little bit behind, but there's some good news: if you want to keep talking about this, we're both pretty easily accessible, and we're also going to take advantage of the weather that didn't exist yesterday and go out for a celebratory beer. If you want to talk about the subject, you're welcome to join us out there. Thank you so much.

Metadata

Formal Metadata

Title Your Body is a Honeypot: Loving Out Loud When There’s No Place to Hide
Series Title re:publica 2017
Part 152
Number of Parts 235
Authors Stender, Matthew
York, Jillian
License CC Attribution - ShareAlike 3.0 Germany:
You may use, adapt, copy, distribute, and make the work or its content publicly available for any legal purpose, in unchanged or modified form, provided that you credit the author/rights holder in the manner they specify and that you pass on the work or content, including in modified form, only under the terms of this license.
DOI 10.5446/33084
Publisher re:publica
Publication Year 2017
Language English

Content Metadata

Subject Area Computer Science
Abstract What does it mean to love out loud in a time of ubiquitous capture? Our physical selves are being recorded by proprietary image-capture systems that are used to infer behavioral traits and construct identities, challenging our notions of individual agency and a sovereign self. How humans live and love in the 21st century will be decided by how we balance governance, ethics and oversight of emerging technologies.
