Research on the Machines: Help the FTC protect privacy & security
Formal Metadata
Title: Research on the Machines: Help the FTC protect privacy & security
Number of Parts: 93
License: CC Attribution 3.0 Unported: You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor.
Identifiers: 10.5446/36293 (DOI)
Series: DEF CON 24 (18 / 93)
Transcript: English (auto-generated)
00:00
Thanks, everybody, for coming out for this talk. The FTC is kind of the federal agency everybody can actually love, yeah? The FTC's been doing really cool stuff, and they're here to give everybody some really good news and talk about some new programs they've got going on. So let's give them our support and give
00:22
them a big round of applause. We have some PowerPoint, but it's not coming up right now, so maybe it'll come on during our presentation. If you could see it right now, it would say
00:42
that the title of our talk is Research on the Machines: Help the FTC Protect Privacy and Security of Consumers. And then the next slide would say who we are, but we'll just cover that. I'm Terrell McSweeny, I'm a Federal Trade Commissioner, I'm an attorney, and I'm really interested in protecting consumer privacy and data
01:02
security. And I'm Lorrie Cranor, I'm the Chief Technologist at the FTC. I've been there since January and I've been doing a lot of security- and privacy-related work. So, the machines. You know, estimates vary, and I see this wide range of different statistics; it
01:22
looks like we have about 25 billion connected devices right now, and we're on our way to about 50 billion consumer-facing connected devices in 2020. You know, some people call this the Internet of Things; I think that term is a little bit overused, I think it's the internet of a lot of stuff, but really what's going on here is that we are
01:43
connecting ourselves and the stuff in our lives in new and exciting ways. That's bringing a huge amount of innovation to consumers and we wanna make sure that consumers get the advantage of it but I don't need to tell anybody in the room that it's also creating a huge amount of insecurity for consumers and raising a lot of privacy
02:01
issues. So one of the things that's also happening, and we saw this on display terrifically yesterday in the DARPA competition, is that the machines are getting smarter as well. So at the FTC we're really worried about trying to protect consumers in this increasingly interconnected environment. One of the things that we're very focused on is the potential security and privacy risks to consumers, and I'd also note
02:25
that I think uh increasingly consumers themselves are very very concerned about trusting these devices. So we see some survey data that really indicates oh slide! Yay! Yes! There's even a slide ahead of where I am. Okay well. There we go. Machines, right?
02:43
Okay. Now um and and so we see some consumer survey data that really indicates um that consumers themselves are maybe not adopting some of these new technologies because they're worried about the security of them. Uh you know I've been uh attending DEFCON for
03:00
the last 3 years and I I see a lot of really creative, really interesting uh work presented here and I think you know consumers are right to be a little bit concerned about the security of of these devices. So we're starting to see that reflected as well. Um so what we're gonna cover today um is really how we're trying to approach this
03:22
challenge of protecting consumers in this environment. Um it's easy to sort of adopt this attitude of like abandon all hope ye who enter here. There's like no way we're ever gonna fix this. It's just a disaster for privacy and security. But I really prefer to approach this issue using the teachings of another great master. There is do or do not. There is no
03:47
try right? So I we're gonna talk a little bit about the do part of this and what we are trying to do at the FTC today. Uh so quick overview um we can go oh right I'm sorry
04:01
issues of the day. Um we also um in addition to bringing a bunch of enforcement cases we're also really trying to focus on the broader policy debate. And we're gonna talk about how we need your help. We're gonna talk about some of the events that we're holding and some of the ways that you can help us. So how do we respond to the rise of
04:20
the machines when the machines are everywhere? Well the FTC and and we we're using its acronym the Federal Trade Commission um actually has almost nothing to do with trade policy thank god. And everything to do with being a consumer protection and competition enforcer. So primarily what we do is bring cases against companies. These
04:40
are civil cases; it means we sue people, we get settlements, we put them under order, and then we make sure they comply with the orders that we put them under. That's really different than other parts of the government that are more focused on writing rules or regulations, which isn't so much what we do, with the exception of writing rules about children's privacy online under COPPA. So we
05:02
primarily bring cases involving privacy and data security, and by last count we've actually brought more than 400 of these cases since we began bringing privacy-related cases almost 25 years ago. So it's not a new issue for us at the Federal Trade Commission. We do it by using two authorities in the FTC Act. First we look at
05:24
whether a practice is unfair, and it can be unfair legally if it's going to create a substantial injury to consumers that's not reasonably avoidable by them and not outweighed by some other procompetitive or consumer benefit. Or we bring cases in
05:40
situations where something has happened that has deceived consumers in a meaningful and material way. And and so those are the two primary ways in which we have really engaged in an active enforcement mission to protect consumer privacy and data security. So what does this mean? Example. Yes um so we're gonna tell you about a few
06:01
cases here. Uh so Facebook uh had had settings for users to control their privacy settings and they promised that if you limited access to some of the personal information you posted that um that it would not be viewed by by people that you did not grant access to. It wouldn't be shared with um third parties. And they also said that if
06:23
you uh deleted your account then the photos that you had posted would no longer be accessible. But as it turned out that some of the information people posted was accessible to other people and third parties beyond the settings that they had set. And some of the photos were accessible even after people deleted their accounts. So it turned out that uh
06:47
we brought an unfairness count in that case because some of the data that had been designated private uh Facebook sort of retroactively changed how it was handled and made it public. And we said wow that's super unfair. Again consumers can't avoid that and it can cause them a real harm. So that's that's the legal theory for that kind of case.
07:05
So uh in the case of Google they had promised people that their Gmail contacts wouldn't be used for anything other than as part of Gmail. However when they launched their new Buzz uh social media service they populated Buzz with the Google
07:22
contacts from Gmail. And so that exposed people's contact information on Google Buzz. Yeah, actually a broader case as well, involving a number of counts, but mostly they're all deception-based counts there. So the
07:42
misreps are, you're sharing information under one set of terms but actually they don't live up to that set of terms, so that's a misrep case, a misrepresentation case, for us; it's deceptive to the consumers. Oh, and I guess I should note here as well that these are cases from like 2011; they're a
08:02
misrep case, but this was the first case where the FTC remedy actually requires a comprehensive privacy program to be implemented by the company. And the result of these cases are orders, we call them consent orders, resolving the claims, that put these companies under 20-year orders, and then we go back every couple of years and look at how they're doing. It also gives us an additional way to make
08:24
sure they're complying with the orders uh because sometimes uh things happen and uh if they um are in violation of the order they're in contempt of it we can then uh penalize them monetarily which can be meaningful in some cases. So Snapchat uh had
08:42
promised that the images that you send on Snapchat would disappear after a short period of time and that if somebody tried to take take a screenshot of them you would get a notification. But actually there were a number of ways that you could save a Snapchat image um and you could also circumvent the notification feature.
09:02
Yeah, so it doesn't disappear: deceptive. So Wyndham, the hotel chain, had three data breaches that unfairly exposed consumers' payment card information. They had a number of security failures that led to these data breaches, including storing payment card
09:25
information in the clear, and not patching and not monitoring their systems for malware. Yeah, so this is an important case because actually we proceeded using our unfairness authority, saying that data security practices were unfair to consumers. Wyndham
09:41
disagreed with us. We engaged in extensive litigation and this year uh we won at the circuit court level the use of our authority to bring data security cases to protect consumers. So that's a really important validation of the Federal Trade Commission being in this space and using our authorities. Okay Oracle provided a Java update to correct
10:00
important security flaws and they promised consumers that if they installed this update they would be safe and secure. However the installer did not automatically remove all of the old versions of Java leaving users vulnerable. Uh again um an important data security case and I and I think we'll transition now into uh another really important
10:23
data security area for us, and that's the Internet of Things. Yeah, so ASUS made routers and they promised consumers that their routers would protect consumers' local networks. However, the routers were vulnerable to an exploit that provided complete access to a consumer's connected storage devices without needing any credentials. They
10:45
also did not address security flaws in a timely manner which allowed attackers to change router security configuration without a consumer's knowledge. And I just note here I mean routers are just an incredibly important feature protecting all the connected devices that you might have on your home network. So making sure that the
11:04
companies that are making claims about the security of them are actually making valid claims is really really important. So I think this was a super important case. Another feature of it was um and we've seen this in a couple of our other enforcement actions, configuration of encryption whether it was properly used and properly
11:21
configured or not, and when it's not, we've actually brought cases as well; Fandango is another one of those.
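To make that encryption-configuration point concrete: the Fandango matter involved a mobile app that disabled SSL certificate validation, which leaves TLS connections open to man-in-the-middle attacks even though the traffic looks encrypted. The minimal Python sketch below is not from the talk; the endpoint is hypothetical and the requests library is just one common client, but it contrasts that misconfiguration with the default, validated behavior.

```python
# Illustrative sketch (not from the talk): "encryption that is present but
# misconfigured," the class of problem the speakers describe.
import requests

API_URL = "https://api.example.com/checkout"  # hypothetical endpoint

def insecure_post(payload: dict) -> requests.Response:
    # Anti-pattern: verify=False accepts ANY certificate, so an attacker on
    # the same network can impersonate the server and read the payload.
    return requests.post(API_URL, json=payload, verify=False, timeout=10)

def secure_post(payload: dict) -> requests.Response:
    # Default behavior: the certificate chain and hostname are checked
    # against trusted CAs; a forged certificate raises an SSLError.
    return requests.post(API_URL, json=payload, timeout=10)
```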
11:42
There are several other examples that we could use, but for the sake of time I think we'll just say these are examples of how we use our authority, and we thought they were important to share so that, as we have a conversation about how we can work with you to help bring cases, you understand the kind of legalese that goes along with them. So how do we bring cases? Well, we rely on researchers and research, so that's going to be an important part of our talk today. We also read media reports and find those very interesting a lot of the time. And we actually get cases through consumer
12:04
complaints and other complaints that are filed with us. We have a whole network, actually, it's called the Sentinel Network, and it helps us bring in complaint data from consumers, also from state law enforcement agencies, from Better Business Bureaus, and from a variety of places. This network actually works for our whole mission; the bulk
12:22
of what we do is also protect consumers from scams and frauds and things that are very low tech. Uh but I but it has a has a tech component of it as well. So you know I think uh we've been spending the first part of our talk talking about enforcement. It is one of the most important things that the FTC does but we're
12:41
really mindful that all of this amazing connectivity in consumers lives um is raising a host of issues that are that go far beyond simply whether the security practices are unfair to them or whether they're being deceived about what the products are actually doing. So the FTC is not just an enforcer it's also kind of an
13:01
advocate and we're trying to work with other government agencies and with other communities to make sure that we're we're putting in place the strongest possible policies and responses both to help keep consumers informed but make improvements to our laws as well so that as all of this great uh tech kind of cascades over us in our
13:23
daily lives um we we have uh better and stronger protections for consumers. So I'm gonna talk about an example of uh of something that we worked on and this started with a personal incident that happened to me uh shortly after I started working for the FTC my uh mobile phone account was hijacked and I discovered this when my phone
13:45
stopped working and on the same day my husband's phone stopped working and we called our carrier and um our carrier said oh is that your new iPhones that stopped working? And we said we don't have new iPhones and they said well in our database it says you have new iPhones. Uh so they sent us to uh the phone store to get new
14:04
SIM cards and eventually um uh they figured out that there had been fraud on our account. It turns out that somebody went into a phone store with a fake ID claimed to be me asked to upgrade the phones um and the phone company uh happily gave them two new
14:21
brand new iPhones uh charged them to my account and put our phone numbers on them. Uh so uh when this happened to me I I clean cleaned up the mess but I was really interested in how often does this happen to to other people and what could be done to prevent this. Uh so I talked to all of the major carriers about what they were doing uh to prevent it. Um
14:44
and and the type of authentication uh procedures that they're using. Uh they they are relying mostly on that driver's license um and uh a phone store employee who's not necessarily well trained in how to spot fake IDs. I looked at our consumer sentinel um
15:03
database to try to understand how often this was happening. Now, Consumer Sentinel, you know, gets all these reports that people send in, and in this case these are mostly reports that come in through identitytheft.gov. And we know that this is just
15:21
the tip of the iceberg because most people don't even know that they can submit their identity theft complaints. We're trying to get the word out so tell your friends. Um but we we we expect that this is um only maybe one percent or so of the total identity thefts that are happening we see. So I went back through this data and if you look
15:42
three years ago, in a typical month, say January 2013, we got about a thousand reports of this mobile phone hijacking or a similar thing called SIM swap. So we had about a thousand reports, and that made up about three percent of all of our identity theft reports that month. Then we looked three years later and we find twenty-six
16:05
hundred reports, and that is about six percent of all identity theft reports that month. So we're definitely seeing a trend here of this becoming an increasingly large problem.
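For context, a back-of-the-envelope calculation using only the rounded figures just quoted makes the trend easy to see: hijacking complaints grew about 2.6 times over three years, while the implied total of identity theft complaints grew only about 1.3 times. A small illustrative Python snippet (the numbers are the talk's approximations, not exact FTC statistics):

```python
# Rounded figures from the talk: ~1,000 hijacking reports at ~3% of identity
# theft reports in January 2013; ~2,600 reports at ~6% three years later.
hijack_2013, share_2013 = 1_000, 0.03
hijack_2016, share_2016 = 2_600, 0.06

total_2013 = hijack_2013 / share_2013   # ~33,000 identity theft reports/month
total_2016 = hijack_2016 / share_2016   # ~43,000 identity theft reports/month

print(f"Hijacking reports grew {hijack_2016 / hijack_2013:.1f}x")  # ~2.6x
print(f"All identity theft reports grew {total_2016 / total_2013:.1f}x")  # ~1.3x
```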
16:23
I also did a lot of looking for media reports and saw that there were a lot of reports of people having similar things happen to them. Perhaps even worse, besides just using this to get free phones, some of the attackers are using this to get access to the victim's phone number so that they can intercept their two-factor authentication. So,
16:41
shortly after this happened to me it happened to DeRay McKesson who is a well known Black Lives Matter activist. He has something like four hundred thousand Twitter followers and somebody wanted to get into his Twitter account so they could tweet as him. Um and this is something that is becoming increasingly common. Um I understand that in
17:01
Europe uh they're doing this to get access to people's bank accounts and um and the attackers are successfully able to get in and actually clean out people's uh bank accounts. So is it any wonder that consumers have trust and security issues right? Uh I will note that our consumer consumer sentinel data the complaint data that we've been
17:21
talking about reflects that identity theft is the number one consumer complaint for the last five years. We get hundreds of thousands of these complaints so it's not just this kind of spoofing but it's a wider problem as well. Uh it doesn't show any signs not surprisingly of of lessening um unfortunately. Um so obviously there's a huge amount of
17:43
defenseless data out there. We have this alphabet soup approach to our privacy protections in the US. Many of you in this room are probably familiar with it. The TL;DR version of this is like FTC Act, FERPA, COPPA, HIPAA,
18:00
Communications Act, GLB uh state laws right? But there's no comprehensive privacy law. There's no uh comprehensive data security law. So that's the the atmosphere that we're operating in. Which is why uh the FTC doesn't just do enforcement. It does a tremendous amount of education convening and trying to work broadly uh to address these issues. One
18:25
of the initiatives that we've had in the last year is something called start with security. Uh which is really trying to get our message out about what good security practices look like. I probably don't need to tell anybody in this room that a lot of consumer facing technology uh um is pretty porous and and um in fact many of the people
18:46
who are creating it probably have no idea uh what starting with security actually looks like. So we're trying to get that message out as broadly as we possibly can. Some of the the biggest problems we're continuing to see are ignored reports of vulnerabilities, uh slow response time to vulnerability reports, lack of data
19:05
minimization where appropriate, failure to store passwords securely, lack of training of employees, lack of proper configuration. You know, so we continue to see a host of problems in that space as well.
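As an aside on the "store passwords securely" item, here is a minimal sketch of what that typically means in practice: never keep the plaintext, and hash each password with a per-user random salt and a deliberately slow key-derivation function. The parameters below are illustrative assumptions, not FTC guidance.

```python
# Minimal password-storage sketch: salted, slow KDF; parameters illustrative.
import hashlib
import hmac
import os

ITERATIONS = 600_000  # tune to your hardware; higher means slower brute force

def hash_password(password: str) -> tuple[bytes, bytes]:
    salt = os.urandom(16)  # unique random salt per user
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest    # store both; never store the plaintext password

def verify_password(password: str, salt: bytes, stored: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, stored)  # constant-time comparison
```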
19:22
We're also trying to increase our in-house capabilities and our in-house expertise to understand how the technology is working and to be a better environment for people like you to bring research to us. So actually we have some of our awesome Office of Technology Research and Investigation folks here today. Joe and Erin, if you want to raise your hand. And
19:44
if you want to do like. Shirts like this. Yeah yeah so shirts like this. You want to do an IOT deep dive uh Joe and I are actually going to be in an IOT village later on uh this afternoon at 4 o'clock so we would love to talk to you then and also hear of any issues and research that you you've already been doing in the IOT space. Uh so we
20:01
also have an internship program and we're trying to bring more technologists in through that as well. One of the things that the Office of Technology Research and Investigation, OTech, is doing is putting together a fall technology series, and so we have coming up in September a workshop on ransomware, in October we have a workshop on drones, and in
20:24
December a workshop on smart TVs. Uh there's information about all of these workshops on our website. We're very interested if you have expertise in these areas, you have research, reports, anything you'd like to share with us, there's information on how you can share that with us either before or after the workshops. Uh if you're in the DC
20:45
area please come uh the workshops are free and open to the public. If you're not in the DC area um or even if you are you are welcome to watch our um live webcast of the workshops um and the videos will also be archived. Uh so these are are good ways for us to
21:03
uh collect information on these topics focused on the security and privacy issues and to better understand what consumer protection issues uh there are in these spaces. Another workshop we have coming up and this is one that I've been working a lot on is putting disclosures to the test. Uh so my interest in this started when I was doing work
21:24
on privacy policies which are a type of consumer disclosure. Uh but I realized that there are a lot of other types of disclosures um which the FTC is interested in which have some of the same problems that privacy policies do as far as being long and hard to understand and we would really like them to be more effective. And so uh the purpose of
21:46
this workshop is to bring in researchers who do usability, user studies and evaluate disclosures to try to figure out how to make them actually communicate well with consumers. And so uh we'll be hearing from uh people who have done work on privacy
22:03
notices but also nutrition labels and drug facts and all sorts of other types of disclosures. I thought this was covered so incredibly well this morning by um Sarah and Mudge in their talk about the cyber independent testing lab that they're putting together. This need to have consumers have more transparency so that they can make
22:21
educated choices about the products that they're buying, the software that they're buying, the apps that they're buying, and just to understand what some of the risks might be associated with them. So we're trying to really improve and expand our knowledge about the kinds of communications that work with consumers and are effective with them. Oh, PrivacyCon too. So this is our second annual
22:45
PrivacyCon this year. This is also a forum for researchers, especially researchers whose work is of interest to us. We had an incredibly successful first PrivacyCon last year. We're going to do it again this year. And I, first of all, learned a huge amount, which was
23:01
great. Um it definitely affects our enforcement but it also I think really affects the broader policy discussion that we're having on privacy in the country as well. Um so that's coming up in January. There will be information about how to participate um that is actually currently on our website. It's currently on our website and we are seeking research papers uh in the privacy area right now. Um the deadline I believe is in
23:25
October sometime. So definitely think about submitting things, and think about coming or tuning in; this should be a really great event. So, research: this talk, talking about the research wish list that Lorrie has been putting together, which I'm
23:43
really excited about because I feel like sometimes we have a very abstract conversation about what it is that would really help us to understand better with the academic community, with the researcher community. So this is our attempt um and it's going to be uh uh an attempt that we keep pursuing right to sort of refine the kinds of
24:01
issues that we think are really going to be helpful to us to understand, and to really solicit research in academia and elsewhere for these kinds of topics. We also are going to make sure that we have time for questions too, so I'll let you run through it quickly. Yeah, so I spent some time working with the OTech folks and we
24:22
went and talked to people in every division of the agency about their research needs um so that we could then go out and talk to researchers about ways you might be able to help us. Um I don't have time to go through the entire wish list but we're going to focus on some of the security and privacy items here. Um so we're very
24:42
interested in research on how to assess the risks um that are posed by breaches and vulnerabilities. You know we we know that there are risks um but we want to look at exactly what metrics can we use to assess them. Um we also are very interested in protecting consumers from ransomware, from malvertising and and other risks and so um
25:05
we're interested in research that helps us protect consumers. Um we're also interested in being able to trace exposed data to specific breaches and we're looking for research to help us do that. Uh we're looking at research that is at the intersection of
25:21
economics and security. How can we make certain types of attacks less profitable and therefore less um less desirable for an attacker to pursue? Um and then we're also very interested in protecting consumers from fraud and so we're interested in ways that we can automate the process of spotting fraud um detecting fraud quickly. Um IOT
25:46
devices is an emerging area um and we're very interested in in research related to that. Um we would like to help IOT device manufacturers and platforms have better security and so we're very interested in research along those lines. Um we're also
26:02
interested in defensive measures so that if there is a problem with an IOT device it won't compromise the entire network. Other emerging trends um uh there are increasing uh devices that have sensors in them including devices for children, Barbie dolls that talk to you and things like that. Uh we're very interested in how to prevent these
26:23
devices from compromising consumer privacy and children's privacy. Um we're very interested in how to isolate critical systems for example in connected cars. Um bots that's a new thing um increasingly we have uh bots, other artificial intelligence um and
26:44
when consumers interact with these bots we wonder do they even know that they are interacting with a machine? Um and so we want research on how consumers can uh become aware of that and what they know about this. Um virtual reality is a new area that uh
27:02
we've seen a lot of progress in lately a lot more consumer devices available in the virtual reality space and there hasn't been a whole lot of discussion of the security and privacy issues. You know it's fun, it's entertaining but we want to stay out ahead of that and try to make sure that we protect consumers as well. Um new tools
27:23
and techniques, we're very interested in a variety of different types of tools. We're interested in hearing about tools that consumers can use to control their personal information, especially across contexts, as personal information is now increasingly shared across contexts: your phone versus, you know, what it shares with your TV
27:43
and what not. Um we're also um uh interested in tools that help consumers observe what data their devices are sharing. Um we are interested in uh tools that allow us to analyze apps and to understand the type of data that they are sharing and um that are
28:03
associating with third party libraries. Um we're interested in algorithms that are used um uh to make decisions about people and may actually um either on purpose or inadvertently discriminate against people. Uh we're interested in identifying when cross device tracking is occurring um and we're interested in tools that will help us
28:25
identify vulnerabilities in IOT devices and many many more. This is uh just a a quick sampling of some of the research areas that we're interested in and that you'll we hope you will come talk to us about uh if you have insights. Um so what happens if you do
28:43
come talk to us about it? So our OTech folks will take a look at the research that we receive. They will look across the agency to find people for whom this is relevant and try to direct it to them, so that we can see if it's going to
29:02
be um of use uh to to the work that we're doing or whether we should start a new project in an area that somebody brings something to our attention. Um sometimes you bring something to us and then we actually end up bringing a case so it can result in a in a lawsuit against a company as well. Um that's happened uh some of the time. So
29:22
actually segues well into the we want you slide from the creepy Uncle Sam. Um you know I think what we're trying to what we're trying to really uh if you have one take away from this talk made clear here is that uh we actually can't solve all of the challenges that are going to be confronting consumers in a hyper connected
29:43
environment without a lot of partnership particularly with the security researcher community. So we're trying to do uh the most that we can do to try to uh develop those partnerships and have it inform not only our enforcement mission but also the the research that we do, the studies that we conduct, the workshops that we conduct and the
30:04
ways in which the FTC tries to actually make sure that policy makers and others in the broader space are seeing these issues that might harm consumers. Yeah, so we have set up the email address research@ftc.gov; please use that to
30:21
send us research, and that will be examined by our folks in OTech. And for pointers to all of the workshops that I mentioned and all the other things that I've talked about here, please take a look at ftc.gov/tech; that's the tech blog. Lots of other interesting stuff there too, so check it out. I think we're ready for
30:42
questions. Yeah, and you can follow us on Twitter too. You're @lorrietweet. Yeah, I'm @lorrietweet. And I'm @TMcSweenyFTC. So, thank you. Alright, so we left plenty of extra time for questions, so we'd love to field some questions, and I
31:02
think we have like what 5 or 10 minutes? Hi uh great brief. You mentioned discriminatory algorithms that you are concerned about. We know in the news I think it was uh 2 or 3 months ago with Facebook and their news feed. There's also been recently in the uh the other uh major social media site Twitter banning people because they
31:23
did a bad movie review of a Ghostbusters movie and they had their account banned; it was Milo from Breitbart News. Also there's been other censorship against talk show hosts for mentioning, repeating what happened in Germany. I won't
31:43
mention religion because I don't want to be censored here. Uh an attack that he lost his um Facebook account. What are you doing for situations like that or is that in your um swim lanes? Thanks. So some of that um raises a host of really interesting sort of broader first amendment concerns. Um you know I think one of the
32:04
things that we're trying to really focus on when we're thinking about algorithms and data and especially like machine learning on top of all of that is the extent to which uh choices are being curated and offered to consumers in such a way that might limit their choice or even result in a disparate impact on them. So one of the
32:23
things that we haven't really gone into yet is the extent to which those algorithms are are kind of manipulating the the overall news that they're getting which is I think your your question. Um but but we are interested in the extent to which um it might be impacting the credit offerings that consumers are getting, the housing offerings, employment offerings, some of those core economic choices. Now we
32:44
do have laws on the books uh comfortingly, civil rights laws, equal opportunity laws right that already protect people in the brick and mortar world from this kind of discrimination. But one of the things that is really hard in the increasingly digital world is figuring out when that's even happening at all. Um and that's some of the
33:02
work that we really need help with right now. Yeah. I heard a lot of emphasis in this presentation on on regulation basically or actions against companies and consumers but I feel that more and more government is becoming a servicer of consumers and it used
33:21
to be you go into an office and deal with someone and that was a a real interaction but now the services are so broad and dynamic because the government is trying to offer electronic services on the forefront and I'd argue that they're not necessarily the most expert at it and data breaches and such these are all things that apply to this to government as much as they do as private companies so what's your
33:43
regulatory or involvement with government services? Yeah well uh we are the government but we don't actually regulate the other parts of the government so that's actually good for us because that's as you point out a big challenge. I mean I think you see this administration taking action to really try to improve both the privacy um and
34:03
security uh talent and and policies throughout the government. Um so the question for people in the back was uh you know what are you doing FTC about the government and it's problems. Um and the short answer is we we're focused on protecting consumers. But you know I think we are collaborating with the other parts of the government. We
34:21
have our own chief privacy officer, our own chief security officer, we're very mindful of these issues um and I and I think one of the other things that you really see happening in this administration is um uh government wide emphasis on bringing technologists into government. That's something the FTC has been a real leader on. We actually have been doing this for a number of years because what we recognize is that
34:41
when we're dealing with protecting consumers in an increasingly digital world we need technologists to help us understand what is even happening in that world which is why we have people like Lori uh but why we've also expanded to develop an entire office that is staffed by researchers and technologists as well. I think we need to grow those resources uh but we need to do it throughout the government as well. And when
35:04
we're having big debates like encryption debates we need to make sure that technologists are at the table for those debates because uh a lot of the time the policy talk in Washington isn't so well informed. Probably no one here is surprised to hear that. Yeah and there there is now a government wide privacy council which the FTC participates in actively and is helping to educate other
35:24
agencies. Early in your talk you mentioned Google and Facebook and how you caught them for something, changing their end user agreement. In these EULAs it often says that we can change the EULA terms anytime we want to. So then how can you kind of accuse them of unfairness if a user has agreed to these terms?
35:43
Yeah, so this is a great question. The question is, if you have a user agreement that covers everything, how can you then come back and bring a deception case about something that's sort of covered in the sixty- or ninety-page user agreement? Well, the answer is context matters. And I think what we're trying to make very clear in our FTC
36:03
enforcement is if users share information under one set of rules and in a way that makes sense given the kind of stuff they're doing with an app right? Then uh then you know that that's covered by the user agreement. If if you do something that's super
36:20
tricky, right, or really impossible for consumers to figure out, or you change how that information is being handled without really giving them a clear explanation of what those changes are, or if you set up your thing to defeat what their settings were to begin with, right? That's a case we just brought called InMobi. We actually can bring a deception case in that situation. Last fall you had a workshop about
36:47
cross device tracking and I know you sent some warning letters to developers this spring uh that were using a tool kit that might be used for cross device tracking. What additional is it seems like this is an area that is probably going to grow rather than
37:04
shrink uh is there additional activity uh going on at FTC to to continue to track this and and what are you doing in the future? Yeah so thank you. This is a question about cross device tracking which is an issue that we're definitely trying to understand a lot more clearly. Uh it's already informed a little bit our enforcement
37:22
efforts, right? So the InMobi case, which I was just talking about, is a case where we actually had a mobile ad company, an incredibly widely used company, that said it would only track if you opted in, and in fact it tracked whether or not opt-ins were set, and had really created a whole system to kind of go around the
37:41
opt-in to begin with in order to track consumers using geolocation and other things. So we said that's unfair and deceptive. We also, you noted the SilverPush letter, so the SilverPush technology, which, for those of you that didn't see our letter, because I get it, we're, you know, out there in Washington: we issued a warning letter to app developers saying that you installed SilverPush,
38:03
which is a piece of software that can monitor device microphones and listen to the audio beacons that are coming off of advertisements on TV. So basically it is technology that allows them to gather what someone's viewing habits are based on what their telephone's microphone is picking up from these audio beacons that are
38:23
embedded in the advertisements. We said that we were very skeptical that this kind of technology should be included in apps, so I think that should serve as a pretty bright-line warning letter that we're worried that consumers really don't have adequate notice and transparency about what that tech is.
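For background on how such audio beacons are typically described: they are reported to sit in the near-ultrasonic band, roughly 18 to 20 kHz, where a TV can emit them but people rarely notice. The sketch below is purely illustrative (it is not the FTC's, SilverPush's, or any app's actual code) and assumes NumPy and 44.1 kHz microphone capture; it simply flags audio frames whose spectral energy is concentrated in that band.

```python
# Illustrative sketch: flag frames with unusual energy in the near-ultrasonic
# band (~18-20 kHz), where cross-device "audio beacons" are reported to live.
import numpy as np

SAMPLE_RATE = 44_100  # Hz; assumed microphone capture rate

def ultrasonic_ratio(frame: np.ndarray) -> float:
    """Fraction of a frame's spectral energy between 18 and 20 kHz."""
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / SAMPLE_RATE)
    band = (freqs >= 18_000) & (freqs <= 20_000)
    total = spectrum.sum() + 1e-12  # avoid divide-by-zero on silent frames
    return float(spectrum[band].sum() / total)

def looks_like_beacon(frame: np.ndarray, threshold: float = 0.2) -> bool:
    # Ordinary speech and music put very little energy up here, so a large
    # share of near-ultrasonic energy is a crude but useful red flag.
    return ultrasonic_ratio(frame) > threshold
```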
38:42
We're also looking at many of the ways in which people are being passively, I could say surveilled, it's a bit loaded, but passively having information gathered about them. Last year we brought a case called Nomi, which was a company that was tracking people's locations in retail locations, and they said they would offer an opt-out in retail
39:03
locations but they didn't in fact uh compel the retailers using the technology to offer the opt out so there was no opt out and there's no way a consumer can know that's happening really unless you have some kind of uh clear notice that it's occurring and some kind of choice. Um so there we said look if you're gonna say you're gonna offer an opt out you have to really offer the opt out. Um now again there's no
39:22
comprehensive privacy law in this country, so there's nothing that says that that kind of thing can't happen without consumers choosing or having a choice about it, so it's an area that we're continuing to monitor very carefully. Springboarding off of the previous questions about consumer privacy and the transparency that goes on
39:43
between other government agencies um is your commitment to transparency uh documented if say an exploit is discovered and say the NSA wants to hold on to that exploit for some use? You want me to take this one probably. Go for it. Um so um we are a civil
40:04
enforcement agency and um I could imagine that there would be situations in which uh we wouldn't be in that in that dialogue um for a variety of reasons. Um if if something is disclosed to us uh we what we then do is try to understand whether um we have
40:24
enough facts to actually bring a case using our existing authorities about um the practices that led to uh especially if it's exploited or or whether um in some cases we have brought cases when uh something was disclosed and then um the recipient of that
40:42
didn't really react at all right? So if you don't have a mature uh disclosure program in your company to receive exploits and respond to them that can be a factor in our analysis about whether you have reasonable data security practices in your company. But I'm not really answering like your direct question um which is which is the broader like national
41:02
security question because we're a little bit less than that. The broader civil liberties question as well too because what if I discover an exploit and then I get slapped with a notice to not mention it because the government wants to use it for something else? What's my protection or the protection of consumers? So this is an area that I you know I personally think is um one that we really need to work on the maturity of
41:24
our laws in the US and how we're handling it because uh the FTC thinks that we need to have really good clear partnerships with security researchers so that uh people who are doing the work on behalf of consumers to help us understand how the technology is actually working um are able to do that work without uh you know fear of reprisals. Now
41:45
understand this is a balancing act right that there are bad actors out there we want to protect against the bad actors but um yeah I think it's a part of the broader conversation that we need to have and the FTC um maybe not all the FTC I'll say at this point I'm speaking on behalf of myself right but uh you know I think some of us
42:01
really feel strongly that we need to modernize uh how we're handling uh computer fraud and abuse act and some other things so that uh we can have a more mature system in place for handling um how research is handled and how exploits are handled when they're disclosed. Hi it's great that the FTC is trying to get ahead of privacy
42:25
risks in IOT and virtual reality which are new technologies but can you talk more about what you're doing with routers I know about the Asus case but routers are so important to consumers many of them don't realize it it's the gateway into their private networks and where everything's shared and the practices the security
42:44
practices with router vendor or router vendors have been so bad for so long and many of the same vendors are now doing IOT as well so what are you doing there to convince vendors to improve those practices that have been going on for so many years? So for
43:00
starters, we're bringing cases. I don't want to talk about any pending cases, but I would say that we take the security of routers and the... I don't know if you want to add to that. Yeah. Alright. Alright, well, I think we're out of time, so again, if there's one takeaway, it's that we really want to
43:24
forge a good partnership we want to hear from you we want to participate with you uh if you think there are things um that were were missing uh we would love to hear about it and add it to our call to research list so thank you for your attendance and time and happy DefCon this is awesome. Yeah thanks for coming.