
The business of privacy by disaster

Speech Transcript
I don't understand why so many people are leaving, because I'm also going to talk about stupidity, a different kind of stupidity, and it's related. So, the title of my talk is "The Business of Privacy by Disaster". It was originally called "Privacy by Disaster", but then I thought I should make a full disclosure: I run a company that pays wages and works on privacy, so you could say we are making money off privacy. So I thought, OK, let's call it "the business of privacy by disaster".

As I said, we are a small company; there's a few of us. We started as a university spin-off. We thought we would just do research, get some studies and reports to write, and that would be it. But the world has been changing as we've been growing, and so we've been able to experience recent events, like the Snowden revelations, from quite a different point of view than most people, because we've gone from expecting to do just academic research to working with private clients. We were very surprised to see that pretty soon after we started, people began to call us and say: actually, we need your services in the real world, in the private world, in companies. So what I want to share with you is how we've experienced this from the business perspective: how the world and privacy have been changing in the last few months and years.

So if I ask you: in the last few years, who did the most to raise awareness about privacy, who would you say it was? Edward Snowden? Jennifer Lawrence, who had her nude pictures stolen from iCloud storage? Or the people who campaigned against full body scanners being installed at airports? How many of you would say Snowden? How many of you would say Jennifer Lawrence? How many of you would say full body scanners? Almost none. Jennifer Lawrence was the winner right away. It's funny, because in the activist world people would say Snowden; in the real world people would say Jennifer Lawrence, because she's a pop star; in our world, it was full body scanners. Let me remind you what happened:
In 2009 a guy got on a plane to fly from Amsterdam to Detroit with explosives. He only managed to set fire to his own underpants; apparently people on the plane intervened, and there was a problem with the liquid he was carrying, so in the end nothing happened. But it was a massive scare, the first of its kind many years after 9/11, and it reopened the debate about security in airports: what can we do? What we did after 9/11 is clearly not enough, because people like this guy can still get on planes. As the days, weeks and months passed, we learned that the guy was actually on a watch list; his own father had reported him for his contacts with al-Qaida. So he was somewhat being monitored, but no one had run any investigation on him. He was in a large database, never investigated, so he was not on a no-fly list, and so he was able to get on that plane. If you ask anyone who thinks rationally what the problem was, they would say: well, with enough intelligence, traditional police intelligence, someone should have checked this guy out and found out whether, beyond what had been reported, he was an actual danger.

But that would have been the rational response, and in security in Europe there are not many rational responses. So we had a problem with police intelligence, and the response was full body scanners. How would full body scanners have stopped that guy? We don't know. There is no relation between the problem and the technological solution, and that happens, as I said, very often: if we can buy a machine, why would we bother exploring other options? But what I guess went pretty much unnoticed for you really changed the privacy world. Once full body scanners were installed, people rejected them. People refused to go through the full body scanner. There was a massive campaign in the US, and people in Europe complaining, saying: this is an invasion of my privacy, I want to opt out, you cannot tell me I have to go through a machine that can see me naked. Also, images of famous people ended up on YouTube, so it was clear that the security measures were not very good. The full body scanners were discarded; they were never installed on the full scale envisaged initially. This meant that many companies lost lots of money. No company wants to create just the pilot; they lose so much money with just a pilot. The idea is that you create a pilot at a loss and then make a lot more money by selling hundreds or thousands of the machine you've created. So companies lost money, governments lost reputation and trust from their citizens, and something began to change. We could notice that people in government and in private companies had begun to realize: we need to understand a bit better what is happening with privacy, help people understand it, and get privacy respected. For us, what has been a game-changer has been this shuffle around body scanners. Just a year and a half after this, the European Commission organized a societal impact working group to look at how to better assess the societal impact of data-intensive technologies, of surveillance technologies. Right now the European Commission does not fund research that does not look into this issue. So there has been a lot of policy change, which has not made the news, but it has definitely changed the world that we work in. But there are other cases; there were cases before.
In 2005 the Dutch got smart meters installed in their homes. I guess you all have them by now; by law, by 2018, every European home should have a smart meter. The Dutch said: well, the smart meter can read all the electricity I consume in my house at any given moment. Every few seconds the smart meter would send your electricity company information on how many lights you have on; it can even know what appliances you are connecting, and whether you are home or not. If thieves hacked into it, they would be able to know whether you have gone on holiday or whether anyone is at home. So people realized there were privacy problems with this, and they said: hold on, I want to be able to say whether I want this installed in my house. And they won the battle. So they still have smart meters, but now they are opt-out, and the readings are not as frequent. The interesting thing is that one of the arguments used by the consumer associations that campaigned against smart meters in the Netherlands was this: the house everyone recognizes, Anne Frank's house, where the Frank family hid in the Second World War. The argument was: if that house had had a smart meter, the family would not have lasted two days, because you cannot hide in a smart-metered home; as soon as you connect something, someone knows. That argument was pretty powerful. It didn't make the news, but it changed something in the Netherlands, and that was before the full body scanners case.

Other things have happened since. A year and a half ago, London installed what they called spy bins: rubbish bins with sensors that could read the MAC address of your phone, and so know where you were going, how fast you were walking, whether you went into one store or a different store. They could connect all this data about you over several days, and the idea was that they would then be able to sell you targeted advertising; there was a business model behind this. It was installed with no communication to anyone, and when the media picked up on it, people were like: what are you telling me? When I walk down the street there are bins collecting data that is sensitive, that can tell a company or the government or whoever where I am walking around? Are you serious? I haven't been consulted, and there's nothing I can do about it? The bins had to be removed. So, as with full body scanners: loss of money, loss of reputation. There are other cases.
This lovely young lady in the UK was appointed the first Youth Commissioner at a local police force, in Kent in the UK, I think it was. She was 16 when she went through the interview process, and she was the one selected to represent young people before the police, the interests of young people before the police. Quite a nice development, for the police to want to know how young people feel about how the police operate in our cities. She was appointed when she was 17, and as soon as she was appointed, the media revealed tweets she had written when she was 14. Some people said she was talking about drugs; she said she was talking about cartoons. She lost her job. A lot of young people heard about this, and the media coverage had a chilling effect: I tweeted something when I was 14, and that can come back to haunt me later on.

Another example is inBloom, or was, because it has gone bankrupt now. It was one of the most promising start-ups working on data in education in the US. It was created about three years ago, and it provided services to primary and high schools, giving them their own cloud to store information about their students. Really useful for schools. But as soon as it began to spread, parents began to ask: how are you protecting my kid's data? Where is it going? Will you be able to do things in the future with this data that I'm not aware of? How can I stop you from using my kid's data? How can I stop my kid's data from following my son or my daughter down the line? How can I control that data? How will my children be able to control that data? When these questions began to be asked, the company had to close shop. They could not answer; they hadn't done much in terms of privacy or security. They went bankrupt, one of the most promising start-ups in education, bankrupt. And right now the US administration, Obama, has announced a Student Privacy Act. Again, privacy by disaster: a privacy failure that made people react, in this case the administration.
One more example, the last one. New York's taxis started to publish, as open data, the routes that people took. And they said: oh, don't worry, we anonymized it; you just see people going from A to B, but there's no way you will know who they are. Well, guess what: if I get a taxi every day in front of my house and I go to work, it's quite easy to separate me from the crowd and to identify me. Not only that: if I am famous and someone takes a picture of me getting into a taxi, it's very easy to link that picture, published in the press, to the specific trip, and know exactly where I was going. So, anonymized open data, the future? Well, not really, if you care about your privacy, if you don't want others to know where you go by taxi. So again, privacy by disaster: all of a sudden we were like, oh, this open data business might not be all that safe in the end.
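The re-identification the speaker describes can be sketched in a few lines. Everything below, locations, hours, and trips, is invented for illustration; real attacks on the NYC taxi release worked the same way, by joining "anonymous" records with outside knowledge of a person's habits.

```python
# Trip records with names and plates stripped, as in an "anonymized" release.
trips = [
    # (pickup_location, pickup_hour, dropoff_location)
    ("Elm St 12",  8, "Financial District"),
    ("Elm St 12",  8, "Financial District"),
    ("Elm St 12",  9, "Financial District"),
    ("Main St 4",  8, "Airport"),
    ("Elm St 12", 20, "Theatre District"),
]

# Auxiliary knowledge an attacker might have: a neighbour is picked up at
# Elm St 12 every weekday morning. That habit alone filters the data down
# to one person's trips and reveals where they go.
neighbour_trips = [t for t in trips
                   if t[0] == "Elm St 12" and t[1] in (8, 9)]

print(len(neighbour_trips))             # the morning trips are isolated
print({t[2] for t in neighbour_trips})  # and the destination is exposed
```

No names were ever published, yet the habitual pickup point acts as a quasi-identifier; this is why stripping direct identifiers is not the same as anonymization.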
All these cases have a chilling effect: they make the victims, but also those around them, realize that we have problems, that there are things we need to react to. And some of the people who react become our clients. Our clients are either eager early adopters or they are scared, and oftentimes they are both. The early adopters are the people who call and say: can you please help me make sure I don't have the same problem London had, when I install sensors in my city? Can you please help me make sure people don't get angry at my open data scheme, when they realize there is a chance of re-identifying personal data in those open data websites or projects? And this is great; it means we've gone from being a very non-aspirational company that just wanted to do research, to working in a really exciting field, coming up with new methodologies and helping people with real-life problems. But it is also problematic if the only way we can relate to privacy is after the fact, and that has a huge influence on what kind of solutions we can find.

The first problem is that people only come to us after the disaster. Most likely they have already collected the data in an unsafe way, without any concern for privacy; they have already stored the data; they have already analyzed it; usually the problems surface at the data-sharing stage. They have lost so many opportunities to do things well that our chances of intervening in an effective way are very much diminished. Also, when we work on the basis of privacy by disaster, the media usually intervene, and it is hard to do awareness-raising when the media are
in the middle, because they tend to distort the issues. There is also an incentive to only work on privacy disasters and technology fiascos, and an incentive to hide failures: no one wants to be in the media as the ones who disregarded their clients' privacy. That is a problem. And sometimes our clients just want a quick fix; they say: can you provide me with a communication strategy? And yes, we can provide a communication strategy, to communicate the things they are doing well, not to hide the things they are failing to do. So we do not like working in the field of privacy by disaster. We would like to see that change; we would like to be able to intervene from the very beginning.

So what we have been doing in the last few months is developing a methodology that people can understand, to grasp what the challenges are and how to address them from the beginning. We started to look at all the methodologies out there: responsible research and innovation, technology assessment, privacy impact assessment. We were sure that lots of smart people had come up with a solution for this, and we found that, not really: there are many small initiatives here and there, people trying to answer these problems, but not in a very effective or coordinated way. So what we did was put everything together, all the methodologies that had any relationship to what we were doing, including cost-benefit analysis and environmental impact assessment, everything we thought could contribute even marginally, and ask: what is important here? Because we cannot tell people to be wary of everything all the time; we wanted to end up with a few keywords that could convey the messages we want to convey. These are the four keywords we found were necessary for anyone to take into account if they want to innovate responsibly and deal with data in a way that does not harm people's privacy.

First, desirability: whatever you want to do, do you need it? Is it socially desirable? Are there alternatives? What if you don't buy the new machine? Would it really improve your processes? Maybe you are just fine as you are, and oftentimes we find that is the case. Desirability is a clear thing for us; it is linked to policy. Second, acceptability: oftentimes you may do everything right, but if you don't communicate properly, people react as if you did everything wrong. I told you about the spy bins in London, which had to be removed. Only a year later, New York installed sensors in telephone booths, and those sensors were opt-in beacons, in a way far better than the London development, far more privacy-enhancing and responsible. But people reacted in the exact same way. If people do not understand and acknowledge what it is you are doing with the data, they will perceive that you are stealing their data and doing something irresponsible, because everybody else is doing that. So social acceptability is a key thing: you need to look at it, try to predict how people will react to a given development, and communicate how you are dealing with data. Third, ethics, really important: fundamental rights and values, freedom of expression, autonomy. So many fundamental rights are involved when data-intensive technologies start to collect your data, so many dangers that need to be put on the table, and we deal with that under ethics. And finally, data management, and here is where anonymization plays a key role. We need to anonymize a lot better, and we need to develop better anonymization techniques; we are not there yet. We do not have all the technological solutions; oftentimes we are not doing things well because we can't, because we don't have the technologies and techniques yet. But we are working on that, and we can work with different kinds of approaches: we can use encryption, differential privacy, different approaches to adding noise to databases. We can do different things at different points and come up with a tailored solution for any given problem.
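One of the noise-adding approaches the speaker mentions, differential privacy, can be sketched with its most basic building block, the Laplace mechanism. The scenario (releasing a count of households) and all numbers are invented for illustration; this is a minimal sketch, not production-grade privacy code.

```python
# Minimal Laplace-mechanism sketch: release a count with noise calibrated
# so that no single person's presence or absence is revealed.
import math
import random

def dp_count(true_count: int, epsilon: float) -> float:
    """Release a count with Laplace noise calibrated to sensitivity 1
    (adding or removing one person changes a count by at most 1)."""
    scale = 1.0 / epsilon
    # Sample Laplace(0, scale) by inverse-CDF from a uniform on (-0.5, 0.5).
    u = random.random() - 0.5
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise

random.seed(0)  # fixed seed so the sketch is repeatable
released = dp_count(true_count=412, epsilon=0.5)
print(released)  # close to the true count, but noisy enough to hide any one person
```

Smaller epsilon means more noise and stronger privacy; the point is that the trade-off is explicit and tunable, rather than relying on stripping identifiers and hoping for the best.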
After coming up with these four keywords, we combined them with the normative framework, the socio-economic context, everything we thought was useful and interesting, and with the tools we can use. The tools come from the social sciences but also from engineering and computing: everything from focus groups to anonymization techniques; some come from economics, cost-benefit analysis, etcetera. And basically this is what we do. An exact example is the project we call ABC4EU, where we are using this methodology, tailored, as for every problem, to the project. We work on different kinds of projects: social media mining in disaster management, forensics, smart city solutions. We work a lot on border crossings and biometrics in border crossing, and on how we can create technologies that respect people's rights when they cross a border. It is not easy, and this is one of the projects we have, ABC4EU. ABC stands for automated border control gates; I guess some of you may have gone through them. These days, when you land in an airport, sometimes there is no physical person checking your passport; it is a machine. Right now you will see them installed mainly in Europe, in airports like Paris, etcetera. Basically they want to speed up the queues: you introduce your passport, the machine reads your biometrics, checks them against a database or not, and says whether you represent a threat and whether you are who you say you are.

Let me tell you about some of the challenges we found, because it is a fascinating project. When you read the Schengen Borders Code, it talks about proportionality, it talks about people's rights, it talks about autonomy, lots of really good things. But how do you translate that into the technology? And don't forget that the way we understand borders is very much linked to our conception of democracy. Why do you think you have two queues in airports, one for Europeans and one for non-Europeans? If you think it is just to speed things up: no, it would be illegal to mix them. Because as a European citizen you are entitled to all your rights; you can enter your country, or your area, the Schengen Area, even if you are a criminal. No one can stop you from entering your space of sovereignty. If you are not an EU citizen, you have basically no rights, or very few rights, and so the control, the check against the databases, is completely different in one queue or the other. Of course, if you commission a technology company to develop an ABC system, they will say: we'll make everyone go through the same gate and check everyone's data against all the databases, because if they fail, we can stop them. Well, that is illegal. You cannot do that, because, as I said, you have the right to enter your country, and then you'd be subject to the laws of your country. Entering your country is not a police check; it is an administrative check. How do we translate all this wealth in our laws and our current systems into the technology? That is what we try to do with our methodology. Of course there can be other ways of approaching this, but it is really important that we start thinking very deeply about how we translate the values we have created, our fundamental rights, into technological solutions, and stop giving up our fundamental rights and values for the sake of technologies that we don't even know are taking us to a better, or a more efficient, future.

So, that's it. Thank you. Any questions? No questions? I convinced them all, great. Otherwise, we will be around for a few minutes and you can find us. Thank you very much, thank you so much.
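The two-queue logic the speaker describes can be sketched as a small decision function. This is an editorial illustration of the legal distinction, not code from the ABC4EU project; the watch list and passport IDs are invented.

```python
# Sketch of the legal distinction at an automated border gate: an EU citizen
# entering the Schengen Area has a right of entry, so the gate only verifies
# identity (an administrative check); a third-country national is additionally
# screened against watch-list databases (a police-style check).
WATCH_LIST = {"P900111"}  # hypothetical database of flagged passports

def border_check(passport_id: str, is_eu_citizen: bool,
                 face_matches_passport: bool) -> str:
    if not face_matches_passport:
        return "refer to officer"  # identity could not be verified
    if is_eu_citizen:
        # Right of entry: even a flagged citizen may enter and is then
        # subject to the laws of their own country.
        return "entry"
    # Third-country nationals: database screening applies here.
    if passport_id in WATCH_LIST:
        return "refer to officer"
    return "entry"

print(border_check("P123456", True, True))   # EU citizen: entry
print(border_check("P900111", True, True))   # flagged EU citizen: still entry
print(border_check("P900111", False, True))  # flagged non-EU: referred
```

Checking every traveller against every database, as a vendor might propose, would collapse the two branches into one, which is exactly the design the talk says would be illegal.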

Metadata

Formal Metadata

Title: The business of privacy by disaster
Series title: re:publica 2015
Part: 103
Number of parts: 177
Author: Clavell, Gemma Galdon
License: CC Attribution - ShareAlike 3.0 Germany: You may use, change and reproduce the work or its content in unchanged or changed form for any legal purpose, and distribute and make it publicly available, provided you name the author/rights holder in the manner they specify and pass on the work or this content, including in changed form, only under the terms of this license.
DOI: 10.5446/31887
Publisher: re:publica
Publication year: 2015
Language: English
Production location: Berlin

Content Metadata

Subject area: Computer Science
Abstract: 'Privacy by disaster' is currently the main driver behind real-world adoption of privacy-enhancing solutions. This session will go over some of the disasters that have raised awareness between public and private actors of the need to take privacy into account, and will present how a small start-up based in Barcelona, Eticas, is managing to seize the moment and translate societal concerns into responsible socio-technical architectures that companies want to use.
