Pervasive Cloaking
Formal Metadata
Title: Pervasive Cloaking
Author: Bill Manning
Number of Parts: 122
License: CC Attribution 3.0 Unported: You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor.
Identifiers: 10.5446/40527 (DOI)
DEF CON 19, talk 115 of 122
Transcript: English(auto-generated)
00:00
Thank you very much for having the perseverance and the gonadal fortitude to stick around. It's the end of the week, you've had lunch, and this is the talk that's going to put you to sleep. Or not. I will ask some questions, and I am going to encourage audience participation.
00:22
I was left ammunition up here, so I may have to throw things. I had to give this a title, and I put parentheses around pervasive, and the idea of cloaking. How many people in the room have a nick, or some sort of handle that they use because they don't want to use their real name?
00:43
We're going to talk a little bit about why you do that. Actually, why do you do that? You don't get caught?
01:01
Okay, so we want to be protected against physical harm. Plausible deniability. Lawsuits. Shorter than typing your own name. Man after my own heart. The only problem is that when you create these digital identities, you create a chain.
01:26
A chain of custody. We're going to talk a little bit about that, and what that means. First of all, the disclaimer. My name is Bill Manning. It is not my nick. That's my real name. If you want to find me, you can Google me, and you'll find five or six other guys named Bill Manning.
01:41
Make sure you pick the right one. I'm currently employed by Booz Allen Hamilton in their analytics group, and these are my opinions. I have not actually had these vetted by the company, so hopefully they won't fire me when I go to work Monday. I look at this as people have digital identities because they want to be in control.
02:04
You want to control your own identity, your own destiny. I, me, I'm in control. It's my identity. I'm entitled to privacy and anonymity. I want to be able to be safe from physical harm when I say things that people don't like.
02:21
I want to do things where I have plausible deniability. I want to be called anonymous. How many of you in here are called anonymous? Everybody raise your hand. You're all anonymous to some degree. And so what you have is you have this sort of cloak that you wrap around yourself to hide.
02:42
And that's what I mean by cloaking, is that you hide your true persona, and you project another persona out there. Unfortunately, in this digital or in this cyber domain, it's a chain of trust. You can't unilaterally and independently create an identity and have it meaningful unless you share it with somebody.
03:09
So the question is, how long is your chain? You're going to be tethered to somewhere. So we're going to look a little bit in the back about how these chains were formed with crypto identity.
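The "chain of custody" idea can be sketched as data: a minimal, hypothetical model in which every persona is vouched for by somebody else, and resolving an identity means walking the chain back to whatever anchors it. All of the names here are invented for illustration.

```python
# A hypothetical vouching chain: each identity maps to whoever vouches
# for it. None of these names are real; this is illustration only.
attestations = {
    "handle:cl0ak": "employer:ExampleCorp",  # the nick is tied to a job
    "employer:ExampleCorp": "state:CA",      # the employer is chartered by a state
    "state:CA": "nation:US",                 # the state sits inside a nation
}

def custody_chain(identity):
    """Walk the chain from a persona back to its root anchor."""
    links = [identity]
    while identity in attestations:
        identity = attestations[identity]
        links.append(identity)
    return links

# The "cloaked" handle is still tethered all the way back to a nation-state.
print(custody_chain("handle:cl0ak"))
```

However many hops you add, the chain terminates somewhere, which is the point: the tether exists whether or not you think about it.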
03:20
Does anybody remember the crypto wars of the last century? What was the result of the crypto wars? Phil Zimmerman. We love Phil Zimmerman. What was the result? The result turns out to be that there were some regulations that said you cannot export cryptography source code outside your country.
03:44
This was actually pretty much applicable across all countries. Import and export of cryptography was considered bad. It was an artifact of war, it was a munition, and therefore was controlled that way. So there's sort of this blanket, you can't do it.
04:01
After the crypto wars, partly because of Phil Zimmerman and PGP, and partly because of the DNSSEC work, those regulations changed. And now it's possible to actually move crypto software around. The problem here is, what is this threat to anonymity if you have pervasive cryptography?
04:25
How many people need to know if you actually encrypt something? In the previous talk, it was, I encrypt something with a password, and then I want to share that encrypted data, somebody else needs the password. And then public key, private key kind of get around that, but you have to share.
04:43
And as soon as you share, what happens to privacy? Anybody build tools to do eavesdropping so nobody knows what's going on? Liars. Or maybe they've already left town, maybe they're the smart people.
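The sharing distinction here can be shown with toy code: a symmetric scheme forces both parties to hold the same secret, while a public-key scheme lets you publish the encryption key and keep only the decryption key private. This is a sketch using a toy XOR cipher and textbook RSA with tiny, well-known demo parameters; none of it is secure.

```python
# --- Symmetric: both sides must know the same secret key. ---
def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy XOR "encryption": decrypting is the same operation with the same key.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

shared_secret = b"hunter2"            # has to travel to the recipient somehow
ct = xor_cipher(b"meet at 9", shared_secret)
assert xor_cipher(ct, shared_secret) == b"meet at 9"

# --- Asymmetric: only the public half is shared. ---
# Textbook RSA with classic demo primes (61, 53); NOT secure, illustration only.
p, q = 61, 53
n, phi = p * q, (p - 1) * (q - 1)     # n = 3233, phi = 3120
e = 17                                # public exponent, safe to publish
d = pow(e, -1, phi)                   # private exponent (modular inverse)

m = 65                                # a message encoded as a number < n
c = pow(m, e, n)                      # anyone holding (e, n) can encrypt
assert pow(c, d, n) == m              # only the holder of d can decrypt
```

The symmetric half is where the privacy question bites: the secret itself has to be shared. The asymmetric half only shares the public key, but the moment you publish it, that key is itself a stable identifier tied to you.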
05:03
So the responses are, very briefly: all of this crypto, prior to about 1998, was considered an artifact of war. And they had the laws that covered most of this.
05:21
From the internet perspective, the Internet Engineering Task Force, the IETF, has regular meetings. And they had a meeting in Danvers, Massachusetts in the mid-90s. And I remember in that meeting, a guy got up on the podium in his camo, and with his M16, and he said, they're going to pry cryptography out of the internet over my cold, dead fingers.
05:46
Great guy. He's dead now. They didn't pry it out of his fingers. What we have is we have strong crypto for authentication, identification was allowed and encouraged. That's why things like DNSSEC made it out into the world. And even the source for encryption.
06:01
This little quote here is from the Department of Commerce, from February 18th, 2000. And it basically says, we throw in the towel, if it's open source crypto, even for encryption, you can send it, you can export it, we can't touch you.
06:21
This is the general rule, except for some people. And this is where it gets dicey. These are the current regs. If you fall into one of those five categories, they can detain you, they can harass you, they can arrest you. And it basically says, if you're somebody we don't like or we're suspicious of, we can hold you.
06:46
We can make your life miserable. So previously it was, nobody can do it, and now it's, if we think you're suspicious, you can do it. What's going to trigger the US government's suspicion?
07:02
Well, that's one side. Let's move on to this next thing. There is a thing called NSTIC, the National Strategy for Trusted Identities in Cyberspace. This is an aspirational document from the White House that says, privacy and anonymity are important.
07:23
They are core principles of this document. We think everybody in the United States is entitled to anonymity and privacy. Except, there has to be a trusted third party.
07:41
Previously the United States did this, and they did this with a project called Clipper, which was, you have to give your keys to the US government and trust them, they'll take care of you, right? If I'm doing this wrong, throw something at me. Yes, you. You know this. No, no, no. Someone further back.
08:04
So that was called Clipper, and basically people kicked at this and they said, your escrow policies are not sufficient, not good, we don't really want this. And so the government backed off. And now they're coming back and saying, let's try this again. And this time you don't have to use the government. You just have to use somebody that we trust.
08:22
And a few people looked at this and they said, wait a minute, this is a national ID. And they came back and said, no it isn't. Not really. It's not a national ID because there are a bunch of people in this ecosystem that we're going to create. And they're going to be the ones that hold your keys.
08:42
They just have to meet our guidelines, whatever those guidelines happen to be. And then everything will be hunky-dory because we have these organizations, services, devices, individuals that can trust each other because we have these third-party authoritative sources. This morning's talk with Whitfield Diffie and Moxie.
09:03
Moxie is really hot on this idea about these trusted authoritative sources. It's a little fuzzy on how that works, and we'll see that there's some problems there. The end result is if we got this ecosystem defined by the US government, it's kind of barren because there's only a few players that are actually going to meet,
09:25
potentially, what the regulations require for a trusted third party. It really boils down to who vouches for you. In real-world space, the people who vouch for you, how many people got a passport?
09:43
People have a driver's license, work ID, work badge, school ID. All of those institutions vouch for you. That's your nation-state, the nation you're in, the state you're in,
10:05
the organization you work for, the school that you attend. All of those things vouch for you, and they assemble this little chain that says, that's really who you are. You pile all this together. Phil Zimmerman took a slightly different approach,
10:20
and I think we see some of that in this community, which is that circle of friends. Who do you hang out with? Who do you share things with? Who do you swap keys with? Then there are a few people who do self-assertion. I don't really need anybody else. I don't need to have anybody tell me who it is.
10:41
I am Prince. I am Madonna. Self-assertion. Important people really want to do that by themselves. And the social graph really says, you can do this. From an NSTIC expectation, whatever they pick as trusted third parties have to meet their guidelines, which are not defined.
11:05
The question becomes, can bottom-up groups become approved NSTIC-branded, if you will, parties? Don't know. So the idea here is that self-assertion is sort of right out unless you are actually really famous anyway.
11:23
The problem here is that that aspirational document for NSTIC doesn't actually have the backing of any regulation. This is where the trusted third parties break down. Does everybody use the same rules for keeping your private data? Do you have any rights to manage the data that you've been given?
11:47
Or that you're giving to somebody else? Can you control the distribution of that data? Do you have any feedback? Is there any sort of watchdog agency over the folks that manage your identity?
12:03
How do you do this transient trust problem? And how do we have this idea about having meaningful voice in the development of policy for this stuff? None of that's really there yet. Those are open questions.
12:21
So I looked at this and I said, wait a minute, there's already an ecosystem out there that manages identity. And it's pretty robust. And it's got all this good stuff. So what the US government is really talking about is that we're going to approve this little corner in the lower right-hand side. That's going to be our little domain of trust.
12:41
Well what happens if you're outside of the US? You've got a whole other part of that ecosystem that you can wander in and still not have a problem. So if I'm a multinational corporation, if I'm based in Bangalore, India,
13:02
do I care about NSTIC if I don't do business in the US? Probably not. But I do care about tracking my employees. And it's probably associated with whatever regulations are inside India about managing digital identities.
13:23
Similarly, if I'm in South Africa. Or if I'm in Taiwan. Or Japan. So if I'm outside, I'm either a nation state or I'm a multinational corporation, the US rules are only a part of the landscape that I have to work in.
13:42
And then there are these people that I call hobos. I used to hang out with a lot of people who spent more time on airplanes, in more different places, and had multiple passports from many different jurisdictions. Who do they report to? I don't know.
14:02
Which rules do they follow? And it really boils down to who are the trusted third parties? Trusted by whom? Do you trust your government? Raise your hand. Do you trust your government? All in favor, say aye. All against, say aye.
14:23
I trust my government to not have my best interests at heart most of the time. You can put it on a t-shirt, I'll take 10% off the top. And then there's this idea of self-identification.
14:41
I told you, or somebody else told you, or the program told you, that my name was Bill Manning. Am I really? How do you know? There's this idea about trusted by whom. If there was a trusted third party, and this woman had given her digital identity, her credentials, her secret keys to this trusted third party,
15:07
would the U.S. Department of Justice be asking these questions? Does anybody know what's going on here? Basically, what they're doing is saying we want a trusted third party,
15:22
and we think that probably one of the rules of being a trusted third party under our program is if there's a warrant, you have to give us somebody else's data that they've escrowed with you. Do I want that?
15:41
Maybe. Maybe not. You don't want that. You really don't. You don't want me to hand over to my fed buddy over here. I mean, you don't want me to hand that data over to them, right? Just because you told me something?
16:03
Don't give it to me. If I don't have it, I can't give it to somebody else. So from a law enforcement perspective, though, they have some real legitimate reasons for tracking people and identities and stuff, right? And if there are multiple digital identities, and you hand them out to intermediaries who manage that stuff for you,
16:27
you're making their job a little bit more difficult. Because then they have to correlate and data mine many, many more sources of data to try and actually track the patterns that identify you.
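The correlation step is mechanically simple once two data sources share any attribute. A hypothetical sketch (every record here is invented): a pseudonymous forum dump and a billing leak, joined on a reused email address, are enough to put a real name on a nick.

```python
# Two independently "harmless" datasets; all records are invented.
forum_posts = [
    {"nick": "cl0ak", "email": "throwaway@example.org", "post": "nothing to hide"},
    {"nick": "mose",  "email": "mose@example.net",      "post": "who, me?"},
]
billing_leak = [
    {"name": "Bill M.", "email": "throwaway@example.org"},
]

# Index one dataset by the shared attribute, then join the other against it.
name_by_email = {r["email"]: r["name"] for r in billing_leak}
deanonymized = {
    p["nick"]: name_by_email[p["email"]]
    for p in forum_posts
    if p["email"] in name_by_email
}
print(deanonymized)   # the reused email links the nick to a real name
```

Spreading identities across more intermediaries raises the number of joins an investigator has to do, but any single reused attribute collapses the whole cloak.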
16:41
But they'll do it, right? And then there's this other part that we haven't really talked about much, which is market forces. If I'm going to become one of these trusted intermediaries, or I'm going to become this notary, unless there are some explicit provisions for me to not share your data,
17:04
I'm going to collect this big database, and the first thing I'm going to do is I'm going to go to my board of directors and say, see, we've got all of these clients. They're going to go, how do we monetize this data? They're going to want me to sell the data, either individually or in aggregate, to other people so we can make more money.
17:23
And so what I'm doing now is I'm actually creating a two-phase thing. I talk to my people that say, register with me. You can trust us. And out the other side of my face, I'm saying, look at all this nice, interesting, aggregated data that talks about people that use this service.
17:43
Right? That's a false sense of security on the part of users, because I think I'm protected. But my intermediary, my trusted third party, isn't. So they're going to collect my data, they're not going to tell me about it,
18:01
and they're going to commoditize that data with aggregates with everyone else. Just for grins, what if somebody collected all the data for all the attendees at DEF CON? And sold that. Bad idea. And then there are always, and this came up this morning again, is the budget trusted identity ecosystem provider.
18:25
We're going to hold your personal identity, your data, your shop, 1024-bit keys, using high security. Actually, it's kind of like, high security, we're over here, you're over there. We're going to put it in this nice little digital locker. We're going to protect it with really, really strong security.
18:45
ROT13. My dog can decrypt ROT13, and my dog is dead. We're going to put it in an Excel spreadsheet on this XP box directly connected to the internet for 20 bucks a month. What a deal.
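ROT13 is a fixed 13-place rotation of the alphabet: applying the same transform twice returns the original text, and "decrypting" needs no key at all, which is the joke. Python even ships it in the standard library's codecs module:

```python
import codecs

# ROT13 rotates each letter 13 places; there is no key, so anyone can undo it.
scrambled = codecs.encode("My secret identity", "rot_13")
print(scrambled)                          # "Zl frperg vqragvgl"

# Decoding is the same rotation in reverse...
assert codecs.decode(scrambled, "rot_13") == "My secret identity"
# ...and encoding twice is the identity. That is the entire scheme.
assert codecs.encode(scrambled, "rot_13") == "My secret identity"
```

Anything whose inverse is public knowledge is obfuscation, not encryption, which is why the "budget identity provider" pitch above is a punchline.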
19:04
So the real problem is that when you actually see the emergence of multiple providers, people wanting to be your trusted third party, there's no regulation, no information about how these people should operate and protect your data.
19:21
There's nothing. That still needs to occur. So I'm going to ask the question. Is anonymity analog or digital? Digital, I mean, is it binary? Am I either anonymous or am I completely exposed? Or is it more of an analog thing?
19:41
I can show you a little bit, but I'm going to show these guys more. Which is it? We have a vote for analog. All in favor of analog, say aye. Why are you raising your hands? Can't you follow instructions?
20:01
Digital, binary. Both. How is it both? Why not? Okay, so what I'm telling you is that I have exposed to you a couple of bits of information about myself.
20:23
My employer, my name, and the fact that I'm sitting here at DEF CON. I have given you nothing else. I haven't given you my date of birth. I haven't given you my mother's maiden name. I haven't given you my social security number. I haven't given you my private keys.
20:41
I haven't given you my credit card information, my bank account number, any of that stuff. I've only given you two bits of information. Exposed a little bit. Other people over here know more about me than that. So I think it's kind of a continuum when you think about anonymity.
21:01
You cannot be completely anonymous, I don't think. And the real kicker for me here is that we are known by the company we keep. I can try and be anonymous. But unless I'm actually using something like ROT13 where I have plausible deniability.
21:22
If I'm using strong crypto, I'm trackable. Or my strong crypto is trackable. And as soon as you can track me, all you have to do at one point in time is to correlate that digital identity with me. And then you've got my entire history of what happened.
21:42
And I have no longer any sort of plausible deniability. This is a weakness with people using strong crypto. And then there's this question, what does it mean to be or to remain anonymous? In the 1930s, there was a cartoon called Betty Boop.
22:05
Anybody remember Betty Boop? That would be the old school people. Remember Betty Boop, right? And Betty Boop had, in one of the things in 1930, actually 1929, there was a character that emerged called Mysterious Mose.
22:20
Which turned out to be an artifact that frightened the socks off of this poor lady. And everybody else, including the guy who was running it. It turns out it was a fabricated identity that nobody really knew anything about. And so, if we attempt to be anonymous or attempt to maintain our level of privacy,
22:44
we need to be a little bit worried because sometimes those things can turn on us. And at the end of the day, the problem is how do you manage your digital identity trust chains? Can you do it alone? How much information have you shared with others and can you be assured they haven't shared it with other people?
23:04
And how do you revoke information that's bad? Help me. Give me an answer. I'm sorry?
23:22
Whatever I put out there is out there and I can't recover it, right? You know what's scarier? Whatever you put out there about me is non-recoverable. And you may be lying through your teeth about me and I have no idea, right? So there's this whole idea about actually being able to manage the digital identities that you create.
23:47
And I don't know that we should create too many. So I'm going to start wrapping up here. The digital identity, or identity ecosystem, or online persona, or however you want to call it, all of those things map back to meatspace identity,
24:04
where existing law and rule applies. I'm sorry, John Perry Barlow, there is no independence in cyberspace. It all maps back to meatspace.
24:22
Then that would be an admission that I attended the California State Public Schools when I was little, and I didn't spell check. Please feel free to correct the spelling errors. So nobody is really isolated after this stuff.
24:42
You are all, at some place, grounded in a rule of law. And everyone, to some degree or other, is partially cloaked. Nobody knows everything about you. Conversely, everybody is partially exposed. Somebody knows something about you.
25:02
Maybe many people. I hope that no one goes and knocks on my parents' door and asks them about me because they're going to get all kinds of stories about how I was as a little boy. Those stories are getting out. The point being is that there is a trail. Everybody leaves a trail.
25:22
The more you work in the digital space, those trails are actually more persistent than the ones in the real world because nothing is ever really erased or gotten rid of. There are always copies around some place, and people make money going and investigating those trails.
25:43
Additionally, if you build tools to try and hide yourself, those tools will be distributed. If they ever show up on the internet, they will get copied. And they will be used contrary to your interests. This is a war of escalation.
26:01
The more we try and hide, the more people try and uncover what we're doing, and they use the same tools against us that we use against them. So the advice here is to choose your traveling companions wisely. Be very comfortable in who you trust with your digital identities.
26:22
Anchoring this back to NSTIC, it's still being formulated. There are some discussions going on, and it's not too late to influence the outcome, particularly when it comes to the sets of regulations that are going to describe and regulate these trusted third parties. So if people in this community want to influence
26:41
what the U.S. government is likely to do, there's an opportunity to do that. And it really boils down to, this is something I mentioned this morning, is the idea of influence ripples. As soon as you do anything, it affects somebody else. And that affects how they interact with other people.
27:03
And so your influence, the things that you do online, ripple out and affect a larger and larger community. And you cannot recall that information. So it's important to be very careful in what you do. And that's my presentation.