How to Hack Government: Technologists as Policy Makers
Formal Metadata
Title: How to Hack Government: Technologists as Policy Makers
Number of Parts: 109
License: CC Attribution 3.0 Unported: You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor.
Identifiers: 10.5446/36396 (DOI)
Transcript: English(auto-generated)
00:00
So this morning we're talking about how to hack government, technologists as policy makers. I'm Ashkan Soltani, currently the chief technologist at the FTC. You might know me from my work with the Wall Street Journal or the Washington Post. I was one of the reporters that worked on the Snowden documents that brought you that smiley face. And I'm Terrell
00:22
McSweeny, I'm a commissioner with the Federal Trade Commission. And I'm an attorney, I'm not a technologist. And I've been in the policy space for a long time at the FTC, the White House, and DOJ. As you can probably tell by the difference between these pictures, I'm the type of person that really relies on technologists like Ashkan to
00:42
help me do my job, which is a little bit about what we're going to be talking about today. Just a word about the disclaimer that's at the bottom of this slide. Both Ashkan and I work for the Federal Trade Commission. So do a lot of other people and I actually have four other colleagues who are also commissioners and they don't always agree with everything we say. So we're here talking
01:02
today about our experiences, and we're speaking individually and not on behalf of the entire Trade Commission. Or the government. Or the government. So this talk, we're going to talk about what is tech policy? What are the big debates right now? And why do technologists, you in the audience, why do you guys
01:21
matter? And what this talk isn't: it's not a tech talk. We're not dropping any 0-days. We're talking about some vulnerabilities, but they're the human kind. They're people and process. Just to start, does that work? The Internet is not something that you just dump something on. It's not a
01:42
big truck. It's a series of tubes. And if you don't understand, those tubes can be filled. And if they're filled, when you put your message in, it gets in line, it's going to be delayed by anyone that puts into that tube enormous amounts of material. So this is Senator Ted Stevens and that's his relatively famous quote during the 2006 net
02:03
neutrality debate where he's trying to describe the infrastructure of the Internet. And we included it just because it sort of underscores our message, which is that people who make laws and who make policy do need to have technical experts that help them understand the technology that they are impacting. And to be fair, he
02:23
wasn't totally wrong. It is kind of tubes. It's more tubes than trucks, for example. So not bad, right? So there are a number of policy issues that are hotly debated in D.C. right now. This is kind of a word cloud of many of them. Privacy, data security, cross-border data flows, the
02:41
right to be forgotten. You have computer fraud, which you guys are probably very intimately familiar with. There is cyber security legislation currently on the floor of the Senate, data security. There's a ton of debates. But I want
03:00
to start here at the FTC because every day you take the lead in making sure that Americans, their hard-earned money and their privacy are protected, especially when they go online. And these days that's pretty much for everything. Managing our bank accounts, paying our bills, handling
03:20
everything from medical records to movie tickets, controlling our homes, smart houses from smart phones. Secret Service does not let me do that. But I know other people do. So this is President Obama actually at the Federal Trade Commission earlier this year talking about a lot of the
03:43
tech policy issues that we at the Trade Commission spend a lot of time on, consumer data privacy, data security, the Internet of Things. And this was actually kind of really memorable for the FTC. President Obama was the first president since President Roosevelt to visit the Trade Commission. But I think it really underscores how important a
04:03
lot of these technology policy debates are right now and how they've risen up literally to the highest levels of government. Just to note also, I think it's important to explain that at the Trade Commission we protect consumers and we're focused on practices in the private sector that are impacting them. So that's what we're going to
04:22
focus on in this talk today. So when I'm talking about consumer privacy, I'm talking about privacy as it relates to the commercial space. We're not the part of the government that gathers information about people and things like that. That's a different debate, very important one, but we're going to stay out of that for now because that's not in the FTC jurisdiction. And he makes an important point that you guys all know which is that nearly
04:43
every part of society now has some technological component. Nearly everything we do is online or technically mediated by an app or computer of some sort. And that's where I think we play. We come in as technologists. So just to level set, there's a kind of
05:01
alphabet soup of laws that protect consumer data and privacy in the United States. Sometimes we refer to this as a sector based approach. And this is just a slide that runs through many of those laws. We protect children under 13 in COPPA. We have special protections for financial
05:20
information and health information, and for certain kinds of student records, that's FERPA. We have obviously privacy protections in the Telecommunications Act as well. That's the jurisdiction of the FCC. And several state laws as well. So that's the kind of landscape that we're operating in and the kinds of protections that are out there.
05:43
And then we also have the FTC. And one of the things to think about is that, for the most part, there aren't a lot of restrictions in law on what information companies can gather about consumers or how they must protect it. We have a patchwork, as Terrell said, of laws and regulations
06:02
that essentially achieve that effect. But there's nothing specifically prohibiting companies from gathering your information, for example. So the FTC. How does this fit into this space? The FTC was created in 1914 by President
06:20
Woodrow Wilson. That's him on the left. It was actually part of a policy debate that was really focused on trying to combat the economic power of the trusts. That's like a cartoon about Standard Oil and all that stuff. So it was created because there was a lot of concern about the power that these trusts had in our economy. And that was 100
06:41
years ago. But it was given this relatively broad authority to protect consumers from unfair or deceptive acts and practices. And that's the next slide. And as Terrell said, we're -- I'll go ahead. You go ahead. So we're kind of -- you can think of us as the white hats of government. We're here to protect consumers. We're here to
07:02
promote good practices by companies, get them to fix their shit when it's broken. We're kind of, as far as government goes, on the consumer side of things. And so the authority that we have, we use essentially in a number of ways. And we're going to spend a little time talking about that today. Mostly to check to make sure that privacy
07:23
promises that are being made to consumers are being actually adhered to. And that consumers are protected from unfair practices, especially when it comes to securing their data. And unfair, it took me a while to get, but unfair doesn't mean, hey, that's unfair. There's a specific legal
07:42
definition of unfairness under FTC law. And that is really important to understand as you try to -- or as we try to highlight practices that we think might be problematic. There are restrictions: the injury can't be offset by countervailing benefits, and there has to be a likelihood of causing an injury or harm in some way. And so I'll let Terrell explain
08:04
more about the law. I don't know anything about this. Maybe you know something about this. It has to do with the Federal Trade Commission. And apparently there was a product being marketed in the United States. It was caffeine-laced undergarments. The idea was that with the caffeine in the
08:22
undergarment, the wearer could lose weight and have less cellulite. By putting caffeine in your underpants. That seems like a really cool idea. It turns out this product doesn't actually work, which is a bummer. And we,
08:41
literally, right, we at the FTC protect consumers from a bunch of products and marketing claims for products that don't do what they claim they do. That's a really low tech example. And, you know, we also -- that's like our
09:01
bread and butter. We also do a lot of high-tech products as well. It wasn't just undergarments, underpants. Consumers in 1914 were adopting technologies like the phone and rail. In fact, one of our first cases was against
09:20
this company that produced a calculating machine. We brought a case alleging it did not live up to its claims. There we had to understand what a calculating machine, an early computer, was, what the capabilities were and what claims they were making, and understand the technology underlying their claims. It's
09:42
always been a part of our work. We, in the 60s or 70s, for example, would build systems to automatically test tar content in cigarettes. This was a parallel smoking machine that would just inhale cigarettes and measure the ingredients. It became the FTC method. And there, again, we're building
10:00
technology to measure things. If folks remember the Droid Army project that would measure vulnerable libraries on multiple Android devices in parallel, these are the types of things that we've always been engaged in to help measure and understand practices in particular spaces. So the FTC at
10:22
100 is increasingly engaged in looking at technology and high tech products. This gives you a sense of some of the cases that we've had involving huge tech companies in the last five years. And sometimes the FTC is even referred to now as the Federal Technology Commission because of our role increasingly
10:43
looking at the practices of these companies as it relates to the impact that they're having on consumers, especially their privacy and their data security. And we're going to step through a couple of these cases, but to start? So to start, one important thing, and we've been talking about it a little bit, is we have the ability to try to
11:02
make sure people are marketing their products fairly to consumers and not deceiving them. So this case actually is currently in litigation. And here the issue is, does an unlimited plan that is throttled, is that actually an unlimited plan? And our allegation in this case is that it is
11:23
deceptive to consumers to say that they are getting an unlimited plan if in fact they're going to be throttled over certain thresholds. Right, and here we have to understand things like how network routing works, what is congestion-based throttling versus just network management, you know, what are, for example, SS7 and weird
11:42
networking and GSM protocols that some people in this audience know, but obscure technical underpinnings of old telco systems, in order to bring this case, right? So companies would make arguments that this is how it needs to be managed, and we would say, for example, if you're capping people at 5 gigabytes
12:02
regardless of network congestion, that might not be congestion throttling, and it becomes a core element of our case.
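To make that distinction concrete, here's a minimal sketch of the two policies; the 5 GB figure comes from the example above, and everything else, including the 90% utilization threshold, is hypothetical:

```python
# A minimal sketch, not the carrier's actual logic: the 5 GB cap is the
# figure from the example above; the utilization threshold is hypothetical.

CAP_GB = 5

def cap_based_throttle(usage_gb):
    # Fires on cumulative usage alone; congestion never enters into it.
    return usage_gb > CAP_GB

def congestion_aware_throttle(usage_gb, cell_utilization):
    # Only slows heavy users while their cell is actually congested.
    return usage_gb > CAP_GB and cell_utilization > 0.90

# Under the first policy, a user is throttled at 3 a.m. on an idle tower,
# which is hard to defend as congestion management:
print(cap_based_throttle(6.0))               # True
print(congestion_aware_throttle(6.0, 0.10))  # False
```

So privacy promises, this is a big piece of what we do on the enforcement side, and the reason we're spending a little time talking about a bunch of the FTC enforcement cases is that enforcement is actually one of the strongest tools that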
12:23
we have in our toolbox to help shape policy and practices in the private sector. Google Buzz, for example, this case involves a broken privacy promise, which is that Google made to Gmail users about how they would use Gmail information and whether they would use that in a way that
12:43
was different than the Gmail users expected, and so it's relatively important because we allege that, in fact, the Gmail users couldn't avoid, couldn't have anticipated having their information used for Buzz and then didn't have adequate notice to be able to opt out. And once the
13:02
settlement is reached, the company is often put under order, for example, Google is under order for 20 years to maintain their privacy policies, sorry, privacy promises, and in fact have regular privacy assessments, and we then use that authority to later bring other enforcement actions, so you might recall Google was also found circumventing Safari
13:24
browser settings, basically respawning cookies that Safari's settings should have blocked, and we later then brought an enforcement action with monetary penalties because they were already under order. So similarly, Facebook, and then, you know, the Facebook case again, we took a hard look at whether the representations that
13:43
Facebook had made to users about how they could restrict their information were actually true and determined that they weren't, and brought a case there. This is also pretty important because we looked at how the retroactive change to
14:01
how users' information was handled was actually deceptive and unfair to consumers, so we looked at a change in policy and tried to hold Facebook accountable for the promises that they had made. And this is one of my first cases, I was a staff technologist at the time, in 2010,
14:24
and there was a lot of technical work that needed to be done. For example, we alleged that Facebook apps would have access to more information than what users were told, so you could restrict your information sharing to private, for example, via Facebook settings, but as folks
14:42
in this audience know, via the Facebook API you could pull a lot more information, whatever information the user had shared. So we had to do things like understand the API and write apps to demonstrate and verify our claims. There was a claim regarding deletion of photos, so there was the need to
15:02
understand how CDNs work, so when you delete your photo on Facebook, if you deep link to that photo, it's still available on the caching network because they did not delete the photo off their CDN. There was another claim with regards to sharing information, and folks here know how advertisers get
15:22
information via referrer headers, so Facebook claimed that they would not share information with advertisers, but in fact, your Facebook ID and perhaps the page you're on would be sent to advertisers embedded on the Facebook page because they didn't properly iframe the advertisement, and so on.
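As an illustration, checking the photo-deletion claim comes down to something like this minimal sketch; the deep-link URL is hypothetical:

```python
# Hypothetical sketch of verifying the photo-deletion claim: save the photo's
# direct CDN URL while it's live, delete the photo through the site, then
# re-request the deep link to see whether the cache still serves it.
import requests

cdn_url = "https://cdn.example.com/photos/12345_original.jpg"  # hypothetical deep link

resp = requests.get(cdn_url, timeout=10)
if resp.status_code == 200:
    print("photo still served from the CDN after 'deletion'")
else:
    print("CDN returned", resp.status_code, "- photo appears gone")
```

So ultimately the case was a legal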
15:42
finding, but we had to essentially technically demonstrate what our claims were with regards to information sharing. If I could just sort of underscore that for a second, I'm a lawyer, not a computer scientist. Half the time I might not even understand some of the terminology Ashkan was just using, but when I can work
16:01
with a technologist who does and they explain it clearly, then it helps inform our mission. So it's like a core part of why this partnership is so important. Nomi. So this is a relatively recent case, again, privacy promises. This is a company whose technology allows their clients, retailers, to track users as they're coming in and out of their retail
16:22
locations, and again, we brought a case here because the company said that consumers would have the option to opt out in retail locations that were using the technology, and that they would have notice in the retail locations using the technology, neither of which were
16:42
happening. So we brought a case saying that was deceptive. And do folks know what this is, the retail tracking? So this is basically promiscuous mode Wi-Fi sniffing, right? So the retailers and malls will install essentially access points that are passively monitoring for Wi-Fi or GSM beacons, and
17:00
they will track you, you know, whether you return to the store or where you walk through the mall, and again, there were a lot of technical claims. For example, a lot of the companies in this space will argue that the information is anonymous. They might collect MAC addresses, but they use cryptographic functions to anonymize the MAC
17:20
addresses, and in fact, as you guys well know, they're hashing the MAC addresses with known hash functions. It's like 6-byte MAC addresses, and the first part is essentially the manufacturer code, so the space required to actually brute-force a hash is quite small, something like 2 to the 30th, and you can do that on a regular laptop or you
17:42
can download a rainbow table. So a lot of the argument was that when companies make claims that this information is anonymous, we would demonstrate that in fact it's not really, right? The current status quo is you can reverse the hash and get back to the MAC address.
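Here's a minimal sketch of that brute-force argument; the OUI prefix and the use of SHA-256 are assumptions for illustration, since vendors varied in which hash they used:

```python
# A minimal sketch of the brute-force argument. With the 3-byte manufacturer
# prefix (OUI) known, only the 3-byte device suffix (2^24 values) has to be
# searched to reverse an "anonymizing" hash.
import hashlib
from itertools import product

OUI = "a4:5e:60"  # hypothetical manufacturer prefix

def crack(target_hash):
    # Enumerate every possible suffix and hash each candidate MAC address.
    for a, b, c in product(range(256), repeat=3):
        mac = "%s:%02x:%02x:%02x" % (OUI, a, b, c)
        if hashlib.sha256(mac.encode()).hexdigest() == target_hash:
            return mac
    return None

# An "anonymized" record, as a tracking vendor might store it:
target = hashlib.sha256(b"a4:5e:60:12:34:56").hexdigest()
print(crack(target))  # recovers a4:5e:60:12:34:56 in minutes on a laptop
```

Similarly, with regards to claims about notice, companies would make claims that, for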
18:00
example, the tracking only occurred within the mall, but as you all know, Wi-Fi signals can traverse walls, right? So visitors to the store next door would also have their information captured by this location tracking technology. So informing the pros and cons and the pitfalls was
18:22
critical to actually bringing this case. I think this is where we as a community can really contribute. Finally, Snapchat. This is a case that's part of our ongoing effort to make sure that apps are being marketed truthfully to consumers, and in this case, we alleged it was deceptive
18:40
when the company claimed that messages would disappear forever when in fact it was relatively easy to capture them. We also looked carefully at data security practices and some of the other practices and found that the claims being made were also similarly misleading. And this is another one where, folks in this community, there was a talk in 2013 by the straws guys about, you know, ephemeral apps and
19:05
how they weren't in fact ephemeral. And this is an area where this community has contributed quite a lot in research and demonstrating that when companies make privacy and security claims, which is something we want, right, we want privacy preserving apps, but when they're not true, it actually harms consumers and harms trust in the industry,
19:22
right? So if you Snapchat, you know, some sensitive photos to someone and they can easily scrape that information, consumers are effectively harmed because they were sending those photos under an assumption of trust. And so in addition to the deceptive cases, we also bring enforcement actions
19:42
against companies that fail to meet reasonable security practices. So one of the cases that I worked on was HTC. Folks remember there was this hubbub around Carrier IQ, which was a telemetry app inside multiple smartphones. The carrier would essentially get the OEM to load this kind of
20:05
logging feature to help understand how people use the phones or monitor the devices. And in fact, in this case, Carrier IQ, sorry, HTC integrated the Carrier IQ system as well as their logger in a way that, for example, broke the Android permissions model. They allowed unsigned code. They
20:22
um, the daemon was using an inetd-style listener bound to essentially all interfaces, so any other app could also pull information from the daemon. They enabled debug options in the production code, so it was logging all sorts of things like key presses and SMS. And so here we made the case
20:43
that by this poor integration, by not having proper procedures to review code and test and do code signing, they were not essentially maintaining reasonable practices. They broke the Android permission model and would effectively leak information to other apps.
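As a sketch of the listener problem (not HTC's actual code), compare binding a diagnostics service to all interfaces versus loopback only:

```python
# A minimal sketch: a diagnostics daemon bound to all interfaces is reachable
# by any client on any network path, while binding to loopback would at least
# keep it off the network. The port number is hypothetical.
import socket

srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)

# The pattern described in the complaint, in spirit:
srv.bind(("0.0.0.0", 5555))    # any interface; any client can connect

# The safer pattern for a purely local service would be:
# srv.bind(("127.0.0.1", 5555))  # loopback only

srv.listen(1)
print("listening on", srv.getsockname())
```

And just a word about these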
21:00
security cases. We've been talking before about privacy cases that usually involve some sort of deception or a promise that's made that isn't kept. These security cases, we also use our unfairness authority to bring them, which means essentially, and we talked about this a little bit in the beginning, that we're looking at whether consumers can avoid the harm to them and whether they
21:25
are in fact harmed. And that's sort of the way we look at it. In the security context, that leads us to, excuse me, take a look at whether the practices that are being used by the company are reasonable. We don't believe that they have to be, that there's such a thing as perfect security or
21:41
that what we are expecting here is perfect security, but what we do start to enforce here are reasonable security measures in place to protect the consumer data that the company has. And that's where the technologists are absolutely vital to our mission because we have to understand and have expertise to understand the security practices and
22:00
procedures that are in place. We have a number of these app cases. This is another case where multiple apps, in this case Fandango, the ticketing app, and Credit Karma, an app you can use to pull your credit score, were essentially not validating certificates, allowing an SSL man-in-the-middle attack and defeating the whole purpose of SSL. And so we brought a case
22:23
against them, alleging that by breaking cert validation they were in fact not engaging in reasonable security practices. And these cases also involve a deception element as well. So promising you're securing something when you're not can be deceptive.
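For illustration, the anti-pattern at issue looks something like this with Python's requests library; the endpoint is just a stand-in:

```python
# A minimal sketch of the anti-pattern at issue. Disabling verification
# leaves the connection encrypted but unauthenticated: any man-in-the-middle
# can present a forged certificate and read the "secure" traffic.
import requests

url = "https://example.com/"  # stand-in endpoint

requests.get(url, verify=False)  # the broken pattern the apps shipped with
requests.get(url, verify=True)   # proper validation; also the library default
```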
22:40
In one of the recent cases, TRENDnet, which is an IoT camera, this is one of our first IoT cases, folks are probably familiar with this. It was kind of a webcam, and they would advertise it could be used to monitor your baby or it could be used in banking environments. It was a secure camera, except
23:03
the SecurView functionality allowed any user that could pull the IP address of the camera to pull the video feed, even if the user had marked the video as private. And this is one of these cases where, again, you have to understand for example how to connect and how the network works. This was a really easy one, you
23:21
can connect directly to the camera, but some of the newer cams for example will have either bad defaults or the way they do port negotiation will allow any attacker to pull video for a number of cameras without even needing to go to the admin interface. So these IoT cases are really interesting
23:40
for us because we're trying to understand this space. There have been a number of talks here informing some of the problems in IoT and I think this is a critical area for us. And just a word about the harm in this case. So you know it's obvious if the harm is credit card information, financial information, that kind of thing. Here the harm though is exposing video feeds from people's homes to the public
24:01
internet. And we think that's a violation of privacy that is deeply harmful. Yeah, I think in the case there were examples of people in various states of undress and engaged in sexual activity, which we would argue is somewhat harmful. If an attacker can review that, or not even an attacker in this case,
24:20
someone can punch in the IP address of the camera and watch that feed. And then the last case we'll talk about is a case called DesignerWare. This is an older case, I didn't actually work on this case, but it was essentially, I'm sorry, I got that mixed up. The nudity was in this case, not the earlier case. This was essentially a RAT, right? So
24:43
DesignerWare made software that allowed rent-to-own companies to monitor devices that they rented to consumers. And so the software could be enabled to monitor keystrokes, take screenshots every two minutes, take webcam shots every two minutes, and there was no notice to consumers. And so we argued
25:03
that this was an unreasonable practice, especially since, as I said, in some cases pictures of children, individuals not fully clothed, and couples engaged in sexual activity were captured by the software. And this was for people that were late on their payments, right? The rent-to-own companies would enable this for people
25:24
that were late on payments and would capture this data without the users knowing. So again, unfair and deceptive, harmful to consumers, because it's exposing them to things that they can't avoid, they're not aware of, and they're really very vulnerable if your computer is turning on and
25:41
taking screenshots, sort of like Mr. Robot, of your home or your personal life. And I just want to highlight, we just covered a few of our cases, but there are a ton that are technical and informed by this community. We've brought cases against companies using Flash cookies to circumvent browser
26:02
settings and privacy controls. We recently brought a case against a mobile app that had a Bitcoin miner enabled in it that would essentially, kind of like SETI@home, use your phone to mine bitcoins when you were not using the phone. We've brought cases against companies using CSS history sniffing, which is again an academic
26:22
paper that informed us of this practice, but we then verified and demonstrated the problem ourselves, but again, a very technical concept. And then we have a number of cases in data security. In addition to enforcement. Yeah, so we've been talking about our big stick, which is our enforcement cases, and there's a bunch more of them,
26:41
as Ashkan just said, so we can provide that information for you, and it's on our websites. But we also shape policy and the public policy debates in a variety of other ways. We convene workshops, for example. We've been very focused on the internet of things. We think the innovation and the potential in this space is absolutely terrific for
27:00
consumers, but as a bunch of the folks in this room know, and as has been demonstrated throughout Defcon this year and last year, there are a lot of potential pitfalls and vulnerabilities in these products, so we're looking carefully at that. We're looking at data discrimination. We're looking at health and fitness wearables and some of the practices there that are impacting consumers. Again, if
27:22
you're generating your own health information and sharing it, it's not HIPAA protected, and a lot of people may not totally understand that, so we're trying to understand how that information is being collected and shared. And cross-device tracking. We're also doing a workshop in November on cross-device tracking. Do folks know what this is? Cross-device tracking. It's an industry
27:41
term. Essentially, advertisers want to know, for example, when you see an ad on your mobile device and you later purchase it on your tablet or on your home computer, that you're the same user and they can attribute the impression or the conversion correctly, right? But the technology behind it is quite interesting, so we're having a workshop to kind of discuss some of the concerns and some of the
28:02
consumer notice and some of the choice functionality. For example, the technology works either through logged in services, so you could be logged into one of the big companies, Google, Facebook, Twitter, and they're able to know you're the same user. But for a good portion of the practice, a lot of companies will try to behaviorally model
28:21
that two devices are related. So you might have a burner phone and you might have your regular phone and you try to keep those separate, but in fact the technology will try to fingerprint and then identify that the same phone is connecting from the same location or the two phones are connecting from the same location or maybe they visit the same websites and it's using browser fingerprinting or
28:42
some sort of behavioral or statistical correlation. One of the more interesting technologies, for example, is something that might resonate with the badBIOS crowd. One company might use audio beacons, ultrasonic audio beacons that we can't hear, but the ad library in one phone will emit this pulse and an ad
29:03
library in the other phone will pick up that these two phones are in proximity to each other, and then they will link that together. So that's a current practice and something we want to learn more about.
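A minimal sketch of the receiving side, assuming a hypothetical 18.5 kHz carrier and simulated microphone input; real implementations modulate an identifier onto the tone, but the detection idea is the same:

```python
# A minimal sketch (hypothetical parameters): look for a spike of energy in
# a narrow near-ultrasonic band that most adults can't hear but phone mics
# still pick up.
import numpy as np

SAMPLE_RATE = 44100
BEACON_HZ = 18500  # hypothetical carrier frequency

def beacon_present(samples, threshold=10.0):
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / SAMPLE_RATE)
    band = (freqs > BEACON_HZ - 100) & (freqs < BEACON_HZ + 100)
    # A strong spike in the beacon band relative to the background noise floor.
    return spectrum[band].max() > threshold * spectrum[~band].mean()

# Simulated one-second microphone capture containing a faint 18.5 kHz tone:
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
mic = 0.05 * np.sin(2 * np.pi * BEACON_HZ * t) + 0.01 * np.random.randn(SAMPLE_RATE)
print(beacon_present(mic))                                  # True
print(beacon_present(0.01 * np.random.randn(SAMPLE_RATE)))  # False
```

So we also put out material. Really excited right now about our Start With Security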
29:22
initiative. We're going to be convening workshops around the country starting in the fall, but we've already released a report called Start With Security. It's on our website. If this is your field, I encourage you to take a look at it. It's based on the more than 50 cases that we've brought in this space and really gets into sort of lessons learned and
29:41
best practices that we're drawing from our enforcement efforts. So one is happening September 9th and the other is happening November 5th in Austin. We're trying to bring startups and researchers like yourself and VCs and others together to say how can we think about security from the
30:00
get-go? How can we prioritize and what are those best practices? I also have a Tech@FTC blog, and there I try to highlight some of the more technical best practices or some technical concerns in the space. Again, it's very much informed by this community. They tend to be me or other technical lawyers
30:22
at the FTC. There are actually quite a few. This is a blog post on the principle of least privilege and how to access APIs securely. So there's a bunch of work that we're trying to do in addition to enforcement to inform the community on what should be best practices. Again, it's very much informed by the work you all do as well. We also have been really excited to use the America COMPETES Act to run
30:43
contests to harness the technical know-how of this community and others. So this week we've actually been running our Humanity Strikes Back contest, which is helping us bring new tools to consumers to block robocalls and then report them into a crowd-sourced honeypot. Pretty
31:02
excited about this contest. It builds on our effort here last year, which was called Zapping Rachel. Rachel is that annoying robot customer service voice, like "I'm Rachel from customer service." This is really important to us because we operate the Do Not Call list, but we get about 170,000 complaints a month about robocalls from people who are
31:22
on the Do Not Call list. It's really hard for us to try to, you know, enforce our laws and protect people from these really annoying calls. So we're trying to develop new tools and harness new technology that will not only help us with our enforcement effort but also give consumers new tools.
31:41
So contests are awesome. And the contest winners will be announced today at 2 here at the award ceremony. And this is a great example of the harmony we need between technology and law, right? So a lot of what happens in policy is the lawyers are like, technology will fix it, and us techs are like, someone should make a law against that, right? But in fact you need the two working together. So we have the
32:03
Do Not Call list, but as we know, robocalls can jump on PBXs and make calls from any number, and it's hard to do sender reputation, right? So what are some other tools we can employ to essentially protect consumers or kind of enact the Do Not Call mission, which is to protect
32:21
consumers from robocalls. We also have, I'm happy to announce, we're kicking off an Office of Technology Research, and this is essentially an in-house research group that I'm helping put together. I actually have some interns in the audience here that managed to wake up at 10. Nice work. And
32:42
they're doing ongoing kind of proactive research into emerging technologies, right? So we're black-boxing things ourselves. We're poking at things. We're looking for vulnerabilities. We're looking at data issues. We're looking at data discrimination. Some of the topics of interest for me personally are the Internet of Things, obviously connected cars,
33:01
which is a hot issue, and this idea of algorithmic transparency. And by that I mean, so in my past work I've helped highlight companies charging different prices to people based on your zip code or based on some referrer headers or what cookies you might have. One way I like to drive the message home is how many folks here in the audience use
33:20
some sort of mapping software, Google Maps or Bing or whatever else, right? Almost everyone. But how many actually know whether you're being served the best routes, right? So we assume the system is routing us based on shortest distance or traffic or congestion. But how do you know it's not routing you in front of a storefront or a billboard the company receives
33:42
kickbacks for? We don't. We don't have a way to look into algorithms currently and know what biases are inherent. And driving is one example, but there are issues with regards to discrimination and gender biases and a lot of problematic areas in society that algorithms
34:01
can help either directly or inadvertently contribute to bad practices or biased practices. So that's an area that I'm very interested in: black-boxing.
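In that spirit, a black-box probe can be as simple as requesting the same page while varying one signal at a time; the URL, cookie names, and referrer below are all hypothetical:

```python
# A minimal black-box probe (hypothetical URL, cookie, and referrer): fetch
# the same product page while varying exactly one signal, then diff what
# comes back. Price differences across probes suggest differential pricing.
import requests

URL = "https://shop.example.com/product/123"

probes = {
    "zip_90210":  {"cookies": {"zip": "90210"}},
    "zip_63106":  {"cookies": {"zip": "63106"}},
    "aggregator": {"headers": {"Referer": "https://deals.example.net/"}},
}

for label, kwargs in probes.items():
    resp = requests.get(URL, timeout=10, **kwargs)
    # A real harness would parse the quoted price out of resp.text here.
    print(label, resp.status_code, len(resp.text))
```

Can I just say a word about this OTRI office, which I think is really, really important, and in fact Ashkan's role and the role of technologists at the FTC. What we see is increasingly the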
34:24
need for us to protect consumers in an increasingly wired and connected world. We need to have our own technologists helping inform our law enforcement and policy mission. And so we're expanding the role that the technologists are playing and I think it is vitally important so that we can understand exactly what's happening in the marketplace and
34:43
keep up with a very dynamic and exciting innovative marketplace. And we've gotten to hog the stage talking about the FTC's work, right? So this is all about what we're doing now at the FTC, but there's a ton of other technical debates that are currently going on in DC that a
35:01
lot of us are not engaged on, but they're in fact critically important to technology or require a deep understanding of technology to facilitate, right? So data security, export controls, drones, student privacy, patents -- I think this was your slide -- facial
35:22
recognition, trolls. These are all really important tech policy debates. The FTC plays a role in them, and so do a lot of other government agencies, and so does Congress, and we are here to make a plug for you to bring your technical know-how into that debate and have a voice in it as well, because we
35:41
think having technologists at the table is absolutely vital to getting these policies right. So this is Call of Duty, it's really fun, but this is another call of duty, right? This is actually a lot more fun, right? Telling high-powered policy makers that their understanding of the world is completely wrong is really great actually, it's quite fun, I've done it, I've testified a few times, but it's also
36:00
critical, right? Otherwise, if we're not engaged in these debates, if we're not trying to inform how things work and what the policy impact of the laws that will be proposed are, then people that don't have any technical background or brag about the fact that they don't use email or have a telephone, in Congress there's a flip phone caucus, and I don't
36:23
want to pick on particular members, but there are people that say we don't need to know about technology to make the right law, we think the internet should work like this or we think technology should work like this, and if we don't actually engage and inform this debate, other people will make these laws that affect us, affect you, without any kind of understanding of the
36:41
underlying technical impact. So, main takeaways, in case you missed it: we need your help, your work has a lot of impact on what we do, and it matters, and we want you to come help us. We've been talking a little bit about some of the more formal ways we engage, but I also want to put
37:02
a plug in for just coming in and having a brown bag and talking to us about your research, we do that regularly at the FTC and it's really really helpful. Yeah, so if you want to contact us, ftc.gov slash tech is my blog, tech at ftc.gov is also how you can reach me, if you're coming to DC
37:21
and you want to talk about, if you want to give a presentation on something that we might be interested in, shoot me a line. Check out the Office of Technology Research and Investigation, OTRI. In fact we are hiring, right, we're currently looking for white-hat researchers, research coordinators, people to do research in-house in our Office of Technology Research, so I urge you to either email me or
37:44
check out the USAJobs posting under FTC, it's a horrible website, so you might want to actually just email me. But you can come work with us, and I won't argue, it's not necessarily the most glamorous job, there's a bunch of constraints, but
38:01
again it's a tour of duty, so just if you come for one or two years and help work on some of these issues and help do research and you get to work on pretty much really fun stuff, you can poke apart any new technology and then make policy recommendations, so in addition to like poking apart stuff, you can say well you know what, this is how things should work, so you can make suggestions for policy and
38:21
how companies might want to implement things, or just ideas, so you can create a solution to these problems. So it's pretty fun even though it's government and your friends will make fun of you and play Spot the Fed. So that's our talk, we have a few minutes for questions, I'm happy to take questions, I'm also happy to kind of talk about a few tips and
38:43
tricks on what I've found with regards to talking to policy makers, so it's not always, I think a lot of it is being able to communicate things effectively and something that our audience, not that we don't do a good job of, but there's a particular way you need to engage policy makers that I've found to be particularly effective, so I'm
39:01
happy to talk about some of those tips and tricks, whatever you guys want, so there's going to be a mic going around if anyone has a question, raise your hand, otherwise I can ramble on for another 10 minutes. So right here, fourth row. Good morning, thank you for coming, one of the big
39:22
stumbling blocks in consumer privacy right now is how to demonstrate cognizable privacy harms, and that's been a stumbling block for the private plaintiffs' bar, whereas in the enforcement actions you mentioned, the commission doesn't have to show actual harm to consumers. I was wondering if you could speak to the commission's efforts, if
39:42
any, to bridge that gap or provide any sort of guidance or assistance to private plaintiffs who do have to show actual harm to themselves. Yeah, this is a really important question and as you point out, we continue to use our authority to really try to make sure that privacy promises that are being made are being adhered to, but I think again,
40:04
I would underscore the fact that we really look at the kinds of information that are exposed for example in data security cases and look very carefully at the promises and commitments that were made and whether they were adhered
40:22
to or whether people were misled. And it's important to stress that there are currently no baseline privacy laws in the U.S., right, there's currently some bills to provide those laws, but right now a lot of the issues, if companies say they want to, you know, pwn you in some way and they say clearly and consumers understand, for the most part,
40:43
with the exception of, you know, if you can demonstrate harm or there are other concerns, they get to do that, right, and that's part of the issue is the way the policy is written. And that might change over time as people's understanding of privacy and its impact changes, but right now, you know, a lot of our authority is based on this unfair and deceptive authority, which we've been trying to use
41:01
to protect consumers, but, you know, the laws are different than they are in Europe, for example. Yeah, my question is, and you guys have done a great job explaining the cases and why it matters, what happens to the companies whenever they're found to be in violation?
41:21
So we generally, in these consent orders we've been talking about, have the company under a 20-year order. They contain these two-year independent audits and requirements to have privacy and security policies. So, as a part of that process, we can hold them in contempt if
41:40
they're in violation and have actually done that. Right, and we can also say stop doing that, that's bad, or we can fine them as well, depending on the type of issue. So we can say, like, if there are particular practices that are problematic. A lot of our other cases are like fly-by-night scammers, so people that, you know,
42:00
scam your grandmother out of, you know, her savings, we go after those companies and shut them down as well. So we have a bunch of different authorities, and it's pretty fun, I will say. We can also, one of our authorities, and it comes up more in the scamming space, is we can get consumers money back. So, in our cramming, mobile cramming cases, for example, in in-app purchases for kids that were misleading,
42:22
we got hundreds of millions of dollars for consumers that goes directly back to consumers in the form of redress, which is like a check in the mail. There's some questions over here, too. Maybe speak loudly and I'll repeat your question. How about that? Actually, there's one with a mic right there. Sorry. Can you
42:42
detail the process of what happens when a tech policy law comes in, like the SOPA, TPP, CISA, when a policymaker is more or less out of their depth, how are they getting that technical information in order to make an informed decision? Sure. So, you know, it's definitely not the
43:02
Schoolhouse Rock, how a bill becomes a law, right? I will say some policy makers have very technically adept staff. There are folks in Congress that really are CS majors, or their staff are, but some don't, right? And some will, you know, oftentimes, and this
43:21
goes into some of the points I want to make, oftentimes they don't know what they don't know and in fact they don't care, right? Even though these are important issues, so are starving children and, you know, nuclear holocaust and whatever else, right? So there are a lot of issues in Congress and so to make this stuff kind of tractable or
43:42
important, you have to highlight what are those critical points and make sure that in the talking points, in the debates, the point is being made, right? So if it's car security, people have to demonstrate that it is in fact an issue and people should consider automobile safety in addition to, you know, increasing jobs, for example,
44:01
right? And so -- so can I jump in? As a nontechnical policy maker, you know, I think when you're talking to us, metaphors matter, images are good, pictures are good, acronyms are bad, we don't understand them anyway, you know? So you have to really kind of back out a little bit and sort of speak plain English, use pictures and
44:24
diagrams and try to make it real right at the beginning and then get technical, that's good. So don't assume your audience knows what the hell you're talking about just because they nod. In the same way that you guys saw all those
44:40
acronyms, CFAA and COPPA, and you just nodded, don't assume that the people you're talking to even know what TCP/IP is or what HTTP is or any of these ports. You have to find -- you have to always back up and then go forward. But also don't make us feel dumb for not knowing. Right. Definitely don't
45:00
condescend. Metaphors are critical and metaphors -- so metaphor is a really powerful tool because you can help someone understand the concept, but the metaphor has to be intact, right? So as you say, it's more like a phone call than postal mail. As you start -- if the thing in fact is store-and-forward, then maybe it's not like a phone
45:23
call. Maybe it's cached. You have to make sure you pick a metaphor that's maintained but also it helps a lot even to just do a little bit of homework on what the law is around the previous metaphor. So law often works on precedent. So there's a law regarding phone calls and wiretapping. So if you understand those laws, then you
45:43
can use the metaphor and build on it in a way that resonates with the person who is working on the law. Don't -- so we have a tendency, I have a tendency to make sure that you know every frickin' detail about the thing and all my findings. Right. Realize that, again, people have limited time and you want to start with the crux. I got this a
46:03
little bit from my work as a journalist as well. What resonates with people and what is the turning point that this decision hinges on. If it's policy regarding where data is stored at international borders, first focus on just that piece, not the transits and not all the other details and then
46:21
expand. Again, you need to hit home with the one piece and not go into too much detail, because otherwise you'll lose your audience. And then finally, just be patient. A lot of the issue, I think a lot of -- so this video when we started with Ted Stevens was kind of funny, but a lot of what
46:40
happened there, and I feel bad because we contributed to it, a lot of what happens there is now the technical debate doesn't happen on the Senate floor. People are not willing to engage in technical conversation because they don't want to feel like noobs or they get made fun of for saying something wrong. So it's a lot of our -- you know, it's a lot of this community's work to say, like,
47:00
these are important issues, here, I'll help you understand, I'll help bring you along, you're not dumb for not understanding it, in the same way that you guys aren't dumb for not understanding the law when your lawyer starts giving you a bunch of acronyms and your eyes glaze over, at least mine do. So we have to have that same level of respect and patience as well. Hi. I don't want you to
47:24
take it as a personal attack. Sure. But I think it's important for full disclosure. Ashkan, I assume that you were going through procurement when you received your position at the
47:40
FTC. However, Terrell, if I understand correctly, as a commissioner, you're an appointee, right? So at least you have a perceived allegiance that would allow you -- that, you know, people would think that you would make decisions,
48:04
you know, of prioritizing what to investigate and so forth, based on, you know, whoever appointed you. Even, you know, maybe fire Ashkan if he, you know, voiced opinions that you don't fully agree with or where you want to take things. Can you
48:22
address that a little bit? Whether I'm going to fire Ashkan or not, just to be clear. So I think that the point is that this is a really cool one, actually, because you're talking a little bit about how our government works, which is
48:40
that people who are political hacks -- well, that's pejorative. People who are political are deeply engaged in making very important policy choices and hopefully we are using experts to help us do that. I am definitely political.
49:03
I'm appointed by President Obama. I'm actually filling a Democratic seat on the commission. The way we're set up is it's two Republicans, two Democrats, and a chair who's appointed by the president. So there's no question that there's sort of a divide there that's along some sort of political and partisan philosophy, right? And that
49:22
does reflect a little bit. We don't always agree. For example, sometimes three of us agree and two of us don't or something like that. We try to explain where the differences are. But all of us are really committed to using experts. So I would say also we use economic experts, we use technical experts and it's a really, really important part
49:41
of our work. And we're out of time, but just to jump in, this is actually -- so there are political agendas for sure, and this is how government works, but you can't underestimate the power of truth or information. If you can demonstrate that you can pop a system, or if you can demonstrate that the information is accessible, or if you can essentially use science to prove a point, then
50:03
the politicians or the people with an agenda have to ignore you or silence you. But basically highlighting the realities of the world, at least in the security world, is very powerful and kind of goes above the politics. And the state of security is kind of a bipartisan issue, which is great, nobody wants to get popped in Congress. That's the
50:25
point. And so you might find that by highlighting and just kind of trying to speak the truth and working with the press or working with people to highlight these issues and shine the light, you can kind of cut through a little bit of that political bullshit. Sweet. I think that's our talk. Thank you very much for getting up early and
50:42
coming.