Protecting Data with Short-Lived Encryption Keys and Hardware Root of Trust
Formal Metadata

Title | Protecting Data with Short-Lived Encryption Keys and Hardware Root of Trust
Author | Dan Griffin
Number of Parts | 112
License | CC Attribution 3.0 Unported: You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor.
Identifiers | 10.5446/38912 (DOI)
Language | English
DEF CON 21, 92 / 112
Transcript: English (auto-generated)
00:00
Thank you very much, everyone, for coming. Thank you very much to DEF CON for having me. This is a nice, big crowd, and so I'm really grateful for this opportunity. This presentation is on protecting data with short-lived encryption keys and hardware root of trust. My name is Dan Griffin. I'm the president of JW Secure. We're a
00:22
Seattle-based consulting company that specializes in custom security software development. Today, JW Secure released a tool for experimenting with short-lived encryption keys and a white paper discussing secure time and mobile computing. The tool and
00:42
white paper are linked in blog posts that we put out this morning, and the URL of my blog is on the slide. This presentation will talk about the context of that work, including why we need high-assurance data protection, what are the foundations for achieving it, and how it can be undermined. General Alexander, the head of the National
01:08
Security Agency, was at DEF CON last year to give a recruiting talk. Did everyone go to that? All right. Did everyone sign up? Are there any feds in the room right now? I
01:24
actually don't see any hands. I guess General Alexander's message was a little too controversial for DEF CON, which is saying something. In other more open and accepting venues, the NSA has discussed the need to be proactive about secure mobile computing.
01:43
Does that strike anybody else as hypocritical? All right. But the big takeaway for the rest of us is that the NSA is working on mechanisms for trusting their content from the cloud to mobile devices. They've stated as much. So there must be ways that the rest
02:03
of us can do the same thing. Let's review the checkered history of mobile devices. When laptop computers first appeared, they were awkward and obviously from the early '80s. Early PDAs were also awkward, although I think the Treo was kind of cute. Now half
02:27
the people sitting at Starbucks are on a laptop and everyone has a smartphone. And now that we can work and play in the digital world anywhere and at any time, we actually do. But as work moves to less secure computing platforms, hackers like Elizabeth
02:44
here have a rich new world of opportunities. After all, hackers and spies are just like everyone else. They look for the maximum return for the minimum investment. So their focus is now on mobile devices. What can be done about this? The first thing we want
03:04
to do is define some characteristics of a trustworthy machine. But if there's one thing we learned from Terminator 1 and 2, it's that machines are sometimes trustworthy and sometimes not. Likewise, people talk about identifying the person in an Internet
03:23
transaction. But it's not a person. It's a machine and it may or may not be properly expressing the user's intentions. Given all of that, how can we be sure that a remote party is telling the truth? Desktop PCs and laptops can be configured to be secure. And
03:48
users can be trained. Some of this applies to phones and tablets and some of it doesn't. For example, it's hard enough to enforce complex password policy when users have full-size keyboards. But on phones, the keyboard experience is built around autocomplete
04:07
and it's not consistent across apps or devices. Likewise, many mobile platforms still lack the system level extensibility required by the good third-party antivirus systems. Still,
04:24
mobile devices have lots of untapped functionality that can be used for increasing security. So let's see how to use it. Relatively low rootkit risk is perhaps the only aspect of the phone that is still more secure than the typical PC, at least for now. But
04:46
application-level attacks have become much more prevalent. When a mobile app or game costs 99 cents or less, you can bet that zero attention has been paid to security in that supply chain. You get what you pay for. And in mobile, there's a whole new
05:09
range of attacks: theft of data, remote theft of service, including by the telcos. In short, it remains difficult to get the device to reliably report on its current state. And that presents a
05:25
risk both to back-end services and to sensitive data. To get reliable information from the device, we need to get an authenticated report from a tamper-resistant root of trust. This is the purpose of the Trusted Platform Module, or TPM. The TPM is a crypto processor
05:44
typically implemented as a tamper-resistant chip on the motherboard or as a secure execution environment in system-on-a-chip firmware. The latter is becoming the norm.
06:00
Device integrity is the point of the exercise, but how can that help protect the stuff that the user and the custodian of the data care about? First, we want to measure that the device is running the way that it's supposed to be running. That's the first consideration. The second consideration is that we want to ensure that only on a
06:21
compliant device can I decrypt sensitive data and only while the device remains compliant. By issuing encrypted content bound to a specific TPM and to a specific state of that device, as reflected in the TPM, we can really lock down the lifetime of that
06:44
content. So how can we determine if the device is telling the truth about its state given the risk posed by installing bad apps, if not rootkits? Remote attestation uses the TPM, root of trust, to determine if the firmware, bootloader and operating system are
07:04
known good. I talked about remote attestation in detail in my DEF CON presentation last year, and I'm not going to go into as much detail this time. You guys probably know, DEF CON takes really high quality videos of these talks, and so they're all available.
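For readers of this transcript, here is a minimal sketch of the round trip being described, in Python with made-up names rather than any real TPM API: the verifier sends a fresh nonce, the device answers with a quote over its PCR values signed by an attestation key, and the verifier checks freshness, the signature, and the measurements before trusting the device. An HMAC stands in for the TPM's asymmetric signature only to keep the example self-contained.

```python
# Minimal remote-attestation round trip (illustrative only; not a real TPM API).
import hashlib
import hmac
import os

def sign(key, data):
    # A real TPM signs with an asymmetric attestation key; HMAC keeps this self-contained.
    return hmac.new(key, data, hashlib.sha256).digest()

def device_quote(attestation_key, pcr_values, nonce):
    """Device side: produce a 'quote' binding the PCR digest to the verifier's nonce."""
    pcr_digest = hashlib.sha256(b"".join(pcr_values)).digest()
    return {"pcr_digest": pcr_digest,
            "nonce": nonce,
            "signature": sign(attestation_key, pcr_digest + nonce)}

def verify_quote(attestation_key, quote, expected_pcr_digest, nonce):
    """Verifier side: check freshness, the signature, and the known-good measurements."""
    if quote["nonce"] != nonce:
        return False    # stale or replayed quote
    expected_sig = sign(attestation_key, quote["pcr_digest"] + nonce)
    if not hmac.compare_digest(quote["signature"], expected_sig):
        return False    # not produced by the trusted attestation key
    return hmac.compare_digest(quote["pcr_digest"], expected_pcr_digest)

# Example run: known-good firmware/bootloader/OS measurements, a fresh nonce.
attestation_key = os.urandom(32)
known_good = [hashlib.sha256(s).digest() for s in (b"firmware", b"bootloader", b"os")]
nonce = os.urandom(16)
quote = device_quote(attestation_key, known_good, nonce)
assert verify_quote(attestation_key, quote,
                    hashlib.sha256(b"".join(known_good)).digest(), nonce)
```

Only after checks of this kind would a server hand out a measurement-bound key, which is where the talk goes next.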
07:21
What this means is that determining the health of a mobile user device requires some infrastructure, and it requires a little cryptographic dance, but it's achievable. Nevertheless, I'm not suggesting that these techniques be applied to every Internet transaction. They're too heavyweight for that. But for securing high value data on
07:43
consumer class devices that are sometimes disconnected, this is currently the best foundation we have. So let's dig in. With measured boot, starting with the BIOS, before each component in the boot chain is loaded, the previous component computes its hash on
08:02
disk and sticks that hash in the TPM platform configuration registers, or PCRs. After boot, a boot log can be retrieved from the TPM. The log includes the boot image hashes that I just talked about. It also includes some code signing information, as in
08:21
who signed the binary. And it also includes other boot metadata. For example, was disk encryption used to unlock the boot volume? Importantly, the TPM can sign that boot log with a special-purpose key that you can, of course, determine downstream whether you trust or not.
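As a rough illustration of those mechanics (a sketch with simplified structures, not the TPM's real command set): a PCR is never written directly, only extended, so its final value commits to the entire ordered chain of measurements, and a verifier can replay the boot log to recompute what the PCR should be.

```python
# PCR extend semantics, simplified: new_pcr = SHA-256(old_pcr || measurement).
import hashlib

def extend(pcr, measurement):
    return hashlib.sha256(pcr + measurement).digest()

def replay_boot_log(boot_log):
    """Recompute the expected PCR value from the hashes recorded in a boot log."""
    pcr = b"\x00" * 32              # PCRs start at a known value at platform reset
    for entry in boot_log:
        pcr = extend(pcr, entry["hash"])
    return pcr

# Hypothetical boot log, in load order; real entries hash the on-disk images and
# also carry the signer and metadata (e.g. whether the boot volume was encrypted).
boot_log = [
    {"component": "firmware",   "hash": hashlib.sha256(b"firmware image").digest()},
    {"component": "bootloader", "hash": hashlib.sha256(b"bootloader image").digest()},
    {"component": "kernel",     "hash": hashlib.sha256(b"kernel image").digest()},
]

expected_pcr = replay_boot_log(boot_log)
# A verifier compares expected_pcr against the value in the TPM-signed quote;
# changing any component changes every value later in the chain.
```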
08:44
The server can then issue content decryption or authorization keys bound to those measurements. The server encrypts the key to the manufacturer endorsement key, and that endorsement key is unique to each TPM. So when the device
09:00
state changes, the measurements change, and the TPM will refuse to use that encrypted key thereafter. So we have a pretty powerful mechanism here. Let's see how to wire it together. Trust starts with the TPM and the key distributed with it. As I mentioned, this key is set by the manufacturer, or the OEM, along with a PKI
09:25
certificate that's signed by that manufacturer. And there's a pretty short list. So if you keep that set of issuing certificates, you can determine across a variety of devices whether you trust that chain. Thus, the TPM is as protected as any hardware or firmware can be.
09:47
In other words, electron microscopes are a problem, as is an insecure supply chain. Importantly, TPM 2.0 includes a secure monotonically increasing counter. This at least
10:03
is a more secure option than the standard PC real-time clock when it comes to enforcing policies that are time sensitive. For example, they need to start now, they need to end at a certain time, they have a limited window, they can only be run once, or various combinations of all those things. This counter on the TPM is the foundation of our
10:22
work on short‑lived keys. Once the client device has received a measurement bound key from the remote attestation service, how can we use that key? Well, consider that constant reauthorization is expensive and that users hate it. Kerberos, for example, uses a
10:44
token that is good for, say, eight hours. And Kerberos has been able to mitigate some of the burden of reauthorization that way. The point is that this validity period is a policy setting that can be ratcheted down for the truly paranoid or increased for usability. Short-lived, measurement-bound keys are a
11:07
technique to protect mobile data. So again, for the gory details of how to implement a data loss prevention or digital rights management solution using TPM measurement-bound keys,
11:23
please see the white paper that I mentioned at the beginning of this talk. To run your TPM through its paces, please check out the time key tool that I mentioned as well. I'll warn you that currently the prerequisites for running that tool are relatively steep
11:42
because unless you have a system on a chip‑based Windows 8 tablet or laptop from the past couple of months, you probably do not have a TPM 2.0 system, which I admit is a little lame. But bear with me. Obviously this is a particular concern if you're planning
12:02
on building solutions based on that capability like my company is. But we believe it will make sense in the near term, and especially for small high‑value deployments. Also, if you're serious about developing custom TPM middleware, you'll want to join the Trusted Computing Group, and specifically the TPM subgroup, because that's how you can download
12:26
their full TPM simulator reference implementation. So that will save you months of engineering. But as an additional warning, the ability to download that code requires a premium‑level membership because they know the value that they're giving you. So you
12:42
won't be able to do that. You won't want to do that as an individual unless you're a dot-com millionaire. My blog provides a trace of running the time key demo using this tool. These traces are sufficiently verbose to allow you to infer the TPM command sequence. In summary, you can use the first three
13:05
commands listed on the slide to do the following sequence. And you can do other things as well, as you can see. There are a variety of commands. But let me run through what we consider to be the default case. First, you use the tool to create a 2048‑bit RSA key,
13:20
and you specify policy to limit the lifetime of that key to 60 seconds. Note that another major improvement in TPM 2.0 is that it supports both elliptic curve and symmetric cryptography. So that raises some very interesting scenarios around, of course, streaming. The caveat being the TPMs are not fast. But there are some
13:45
opportunities there. Anyway, TPM 1.2 only supports RSA. And TPM 1.2 does not support the time bound keys thing. So again, this tool is not going to run on a 1.2 device.
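Since the tool's actual commands live on the slide rather than in this transcript, here is a hedged, standard-library-only Python sketch of the behavior the demo exercises (the class names and the tick-based counter are invented; a real TPM 2.0 policy session enforces this inside the chip): a key carries an expiry derived from a monotonic counter, and any use after that point fails with an out-of-policy error, which is the failure the remaining demo steps below produce.

```python
# Simulated time-bound key policy, imitating the 60-second demo (illustrative only).
import time

class PolicyError(Exception):
    pass

class MonotonicCounter:
    """Stand-in for the TPM 2.0 secure monotonic counter (here: seconds since start)."""
    def __init__(self):
        self._start = time.monotonic()

    def read(self):
        return int(time.monotonic() - self._start)

class TimeBoundKey:
    def __init__(self, key_bytes, counter, lifetime_ticks=60):
        self._key = key_bytes
        self._counter = counter
        self._expires = counter.read() + lifetime_ticks

    def use(self):
        """Return the key material only while the policy window is still open."""
        if self._counter.read() > self._expires:
            raise PolicyError("key is out of policy")   # the expected failure after expiry
        return self._key

counter = MonotonicCounter()
key = TimeBoundKey(b"\x01" * 32, counter, lifetime_ticks=60)
key.use()   # succeeds inside the 60-second window
# After the window has passed, key.use() raises PolicyError instead of returning the key.
```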
14:01
The next step then is running the tool to encrypt some sample data. And third, still within that 60‑second window, decrypt some data. Finally, after 60 seconds, do the decryption again. You'll get the expected failure saying that your key is out of policy. With these capabilities in mind, again, how can we use measurement bound keys
14:31
to protect mobile data in the real world? Consider data access by trusted insiders. In this case, we want to ensure that only users with trusted client machines and
14:43
encrypted disks can download sensitive files from the document repository, such as SharePoint or Box.com. By using platform attestation to enforce PIN-protected disk encryption, hardware identity, and a limited key lifetime, we decrease the chance that the
15:04
data can be recovered from a lost or stolen device. And arguably, we decrease the chance that the data can be stolen even with the device still in the hands of the customer.
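To make that trusted-insider gate concrete, here is a hedged server-side sketch (the claim names, allow-lists, and ticket shape are invented for illustration, not taken from the talk or any product): the repository front end refuses to issue a download key unless the verified attestation claims show known-good measurements, an encrypted boot volume, and an enrolled hardware identity, and the key it does issue is short-lived.

```python
# Attestation-gated issuance of a short-lived download key (illustrative sketch).
import os
import time

KNOWN_GOOD_PCR_DIGESTS = {"known-good-digest-1"}        # from your attestation service
ENROLLED_DEVICE_IDS = {"ek-fingerprint-1"}              # e.g. endorsement-key fingerprints

def issue_download_key(claims, lifetime_seconds=300):
    """Issue a content key only to compliant, enrolled devices with encrypted disks."""
    if claims.get("pcr_digest") not in KNOWN_GOOD_PCR_DIGESTS:
        raise PermissionError("device failed platform attestation")
    if not claims.get("boot_volume_encrypted", False):
        raise PermissionError("disk encryption not enforced at boot")
    if claims.get("device_id") not in ENROLLED_DEVICE_IDS:
        raise PermissionError("unknown hardware identity")
    return {"key": os.urandom(32),                      # would be wrapped to the device's TPM
            "expires_at": time.time() + lifetime_seconds}

# Example: the claims are assumed to come from a verified, TPM-signed attestation report.
claims = {"pcr_digest": "known-good-digest-1",
          "boot_volume_encrypted": True,
          "device_id": "ek-fingerprint-1"}
ticket = issue_download_key(claims)
```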
15:24
When you deploy such a system, you should understand what can go wrong. The BIOS integrity measurements, or BIM, model was published by the National Institute of Standards and Technology to protect computers from rootkits. BIM basically boils down to measured boot plus TPM remote platform attestation. In other words, BIM is the NIST
15:43
model for everything we've just been discussing. Based on this, my company implemented a solution called BHT, or BIOS integrity measurements heuristics tool, for DARPA Cyber Fast Track last year. We included this data flow diagram along with a
16:01
threat model as part of that deliverable, and we like it for obvious shock-and-awe purposes. Looking at the diagram, security increases from the left, where you have the user, to the right, where we have the TPM root of trust provisioned by the
16:22
manufacturer. So: insecure on the left, more secure on the right. Note that it is only the two rings on the right, the TPM and the supply chain, that we're depending upon for measuring the trustworthiness of the device. Of course, this assumes that your remote attestation solution, your
16:43
middleware, has been implemented correctly, which can be a big or small assumption depending. Given that assumption, two other questions arise. First, can you trust your supply chain? And two, are your adversaries sufficiently well funded to attack the
17:01
TPM directly? We believe that properly engineered middleware can significantly narrow the window of attack on mobile data. But there's nothing that middleware can do if your chip set is owned. And in order for users to be able to interact with plain text
17:22
data within the allowed policy window, there is necessarily an increased risk of attack during that time. In any case, remember that the goal is to prevent the device from lying about its integrity. TPM 1.2 is widely available on PCs, but to date not widely used. As a
17:49
result, the user experience around initialization and provisioning is quite poor. And this can necessitate shortcuts when you're trying to deploy a solution based on these
18:01
capabilities in a typical heterogeneous environment. And of course, shortcuts are the enemy of security. This problem is essentially solved on the latest hardware. So hopefully adoption of TPM 2.0 will continue to increase. As I mentioned a couple of times,
18:21
we recommend that measured boot be used to enforce disk encryption, if possible, as part of any data loss prevention solution. But on Windows, that combination pretty much implies that you're going to be using their BitLocker feature. BitLocker has been the subject of a number of published attacks. Nevertheless, when properly configured, we
18:43
think it's good protection. Note that for other TPM‑based encryption solutions, such as those you might implement using this measurement‑based keys capability, many of the same types of attacks may apply. So you need to do your research and do what you can to
19:05
prevent, or at least reduce, exposure of your keys in main memory. This gets back to the streaming opportunity I talked about before.
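One hedged way to picture that streaming idea (a sketch, not the talk's design; it assumes the third-party 'cryptography' package, and the key-encryption key here merely stands in for a TPM-protected key): encrypt content in chunks under per-chunk keys, so that during playback only one small chunk key is ever unwrapped and live in memory at a time.

```python
# Streaming-style decryption with per-chunk content keys (illustrative sketch).
# Requires: pip install cryptography
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def wrap(kek, key):
    nonce = os.urandom(12)
    return nonce + AESGCM(kek).encrypt(nonce, key, None)

def unwrap(kek, wrapped):
    # In the design being discussed, this unwrap would be done by the TPM.
    return AESGCM(kek).decrypt(wrapped[:12], wrapped[12:], None)

def encrypt_chunks(kek, data, chunk_size=4096):
    for i in range(0, len(data), chunk_size):
        chunk_key = AESGCM.generate_key(bit_length=128)
        nonce = os.urandom(12)
        yield (wrap(kek, chunk_key), nonce,
               AESGCM(chunk_key).encrypt(nonce, data[i:i + chunk_size], None))

def decrypt_chunks(kek, chunks):
    for wrapped_key, nonce, ciphertext in chunks:
        chunk_key = unwrap(kek, wrapped_key)
        yield AESGCM(chunk_key).decrypt(nonce, ciphertext, None)
        # chunk_key goes out of scope here; only one chunk key is live at a time

kek = AESGCM.generate_key(bit_length=128)   # stands in for a TPM-protected key
plaintext = os.urandom(10_000)
chunks = list(encrypt_chunks(kek, plaintext))
assert b"".join(decrypt_chunks(kek, chunks)) == plaintext
```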
19:21
That could be a really compelling way to mitigate the exposure of keys in main memory, at the risk of having to figure out the throughput you're going to need to make streaming interesting today. Can you decrypt HD content fast enough on a TPM? I don't know. But when somebody figures it out, that's going to be cool. Looking forward from my perspective as a security software integrator, there are three points to consider. Intel has excellent developer support for
19:46
the technologies we've been discussing, particularly on the Atom chipset. But will their mobile platform adoption strategy for Atom be successful? ARM has been slower to embrace
20:01
TPM, presumably because no consumer cares. But if Intel makes TPM a differentiator, will ARM respond? I think they would. I hope they would. Finally, can software companies successfully integrate these capabilities in a way that is both secure and usable, that being
20:21
the usual tradeoff? Either way, as I've demonstrated, short‑lived keys are a great tool for mobile data protection. And support for TPM 2.0 is increasing. So learn to trust machines, if only for short periods of time. Thanks. I left tons of time for questions,
20:54
if anybody has any. Yeah, the question was about the prerequisite I mentioned: our
21:13
current demo tool depends on TPM 2.0 as well as 32‑bit Windows. The point being that that's a rare combination these days. Yes, the one test device we've been able to get our
21:24
hands on is the Acer Iconia, the new one, Acer Iconia W3, which happens to only be 32 bit. That is not going to be the norm, I can guarantee. If anybody else has questions,
21:42
you're invited to use the mic or I can repeat it. I'll go to the Q&A room, which is down the hall. Thanks, guys.