We (the government) are Here to Help: How FIPS 140 Helps (and Hurts) Security


Formal Metadata

Title
We (the government) are Here to Help: How FIPS 140 Helps (and Hurts) Security
Title of Series
DEF CON
Author
Maresca, Joey
License
CC Attribution 3.0 Unported:
You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor.
Release Date
2013
Language
English

Content Metadata

Abstract
Many standards, especially those provided by the government, are often viewed as more trouble than actual help. The goal of this talk is to shed new light on one such standard (FIPS 140) and show what it is intended for and how it can sometimes help ensure good design practices for security products. But everything is not roses, and there are certain things that these standards cannot help with or may even inhibit. By examining these strengths and potential weaknesses, the hope is everyone will have a new opinion of this and similar standards and how they are used. Joey Maresca is a security analyst/engineer with a background in computer hardware and software, including a BS in Electrical and Computer Engineering from The Ohio State University. In a past life, he worked at the US Patent Office; while not the most exciting job, it was an informative experience. Over the past five years he has worked in the security field with a primary focus on FIPS 140 testing and validations. This has allowed him inside access to dozens of commercial products.

Good afternoon. I'm here to present the talk "We (the government) are Here to Help." What we're going to do is talk about the FIPS 140-2 standard and take a look at it. So, the agenda, as you can see: I want to give you a little bit of who I am, then we'll talk about the background of the standard, getting into FIPS 140-2 and the validation process, and then we'll look at some of the requirements; it's kind of hard to talk about the good and the bad of the standard without actually knowing what's in there. We'll also take a look at the future, to give you an idea of what's to come. I've also allotted a little bit of time, hopefully, for some Q&A here if necessary; there's also the Q&A room after the talk. So, who am I? I am Joey Maresca.
Some people may know me as "lost knowledge," my handle on Twitter and what I go by in a couple of other places. I work directly with FIPS 140-2 and have experience in the area over the last five years. I've been doing FIPS validations, I presently run a FIPS validation lab, and I have seen hundreds of validations in that time period, so I've seen a lot of different products and how a lot of different vendors implement the requirements. Outside of work I have other interests: I do programming, I'm a lock picker, and I'm just a general security enthusiast. I think security is one of the areas that gets overlooked a lot by some people, because it's hard to convince people they need it until it's usually too late.
One disclaimer, which I added at the last second because I realize there are a few people here who probably know who my employer is: the standard disclaimer that any views and opinions in this talk are my own, not those of my employer or anybody else for that matter.
So the question is, why am I here? I wanted to shine a new light on security standards, particularly some of the government standards. I hear a lot of people talk negatively about these standards; they complain about the processes, they complain about the requirements and having to implement them, without having a lot of knowledge about what's actually within those standards and what goes into them. So the hope is to give a new look at these standards. The important thing to remember is that no standard is going to protect against everything. Standards become dated, because they all take a long time to develop; their enforcement is always going to vary, and that has a great impact on security as well. The last thing is that they sometimes provide a false sense of security, and that can also be a negative; no matter how much I talk about it, that is always going to exist.
So what is the FIPS 140-2 standard? It is Federal Information Processing Standard number 140, which defines the requirements for cryptographic systems that are to be used within the government for the protection of sensitive information. The program is actually managed both by NIST here in the US and by the Canadian government through CSEC, the Communications Security Establishment of Canada. There are two groups at NIST that handle the management of the program: the CMVP, the Cryptographic Module Validation Program, and the CAVP, the Cryptographic Algorithm Validation Program, which is the group responsible for all the algorithm testing that's required as part of the FIPS process. The last thing, as I mentioned, is that it is largely a federal government standard, though it does have acceptance in other areas; we've seen cases of other standards bodies that have required FIPS validations as a prerequisite. (Yes, I can't speak this morning; I need a real drink.)
So, the past, present, and future of FIPS. The original version of the standard was FIPS 140-1; it was originally published in 1994 and was eventually replaced by FIPS 140-2. There is currently a new revision of the standard in development, FIPS 140-3, which has been in draft since 2005. I always tell people it's going to be coming, we're going to see it eventually; I've been saying that since I started doing this work.
So how does the validation process itself actually work? There are three main parties involved. There are the product vendors; somebody has to create the product to be sold that is going to be validated. There are the accredited labs; there are over 15 at present, I think the actual number may be 18 or 19. And then there's the CMVP, who are responsible for issuing the actual validation certificates. What's important to remember is that the government doesn't do much beyond review reports from the labs, so there's a lot of variance in testing: each lab, even though they're following the standard, may have different methods of actually performing testing to verify requirements.
So we're going to start taking a look into the actual standard itself. There are three key components to FIPS 140-2: the standard itself, which is the core document as originally developed, and two ancillary documents that they've created, the derived test requirements and the implementation guidance. The requirements are divided up into 11 sections, and I'll take a look at each of those a little later on, and there are four increasing levels of security; you definitely see the difference within those four levels, and a level 1 validation, from a security perspective, isn't gaining you a lot. All three of these documents are available on the CMVP web page; I've included a link in the slides. I don't know for sure whether the slides made it onto the DVD or not, but I will post them on my web page after DEF CON, so they'll be there even if they're not on the DVD.
So, the FIPS 140-2 standard. As I mentioned, this is the original, core document from which the other two documents are derived. It defines all the requirements, and it also defines all the terminology. One thing you'll learn, not just with FIPS but with most standards, is that they like to come up with their own words, so it can be a little bit of a pain sometimes to correlate that terminology to the words we all normally use. The document can be vague; this has been one of the issues, I think, with implementations: the standard will read one way, and then it's up to an individual to actually interpret that requirement.
The derived test requirements is a much longer document, and it's actually the one that breaks out what needs to be tested by the labs as well as what needs to be provided by vendors. It's organized, as I've shown here, using assertions, which are direct quotes out of the standard, and then underlying vendor evidence items, which you'll hear me refer to as VEs, and tester evidence items, referred to as TEs. These two are usually counterparts to each other: for a vendor evidence item there's usually a matching tester evidence item that we have to perform to verify it.
So, the implementation guidance. This is the smallest of the documents, and it's intended to clarify the requirements in the other two documents. It's said that it's not supposed to introduce new requirements; however, that's not really always the case, and there are times where the implementation guidance can be said to extend or even introduce new requirements that don't currently exist in the standard. But it does tie back into both the standard and the derived test requirements.
As you can see on this slide, I've tried to provide a mapping, and I hope it comes out well. The top portion is taken from the standard; I've tried to underline the quote from the standard, and as you can see, that same quote appears in the assertion in the line below. The middle picture is taken from the derived test requirements, and on the bottom is an implementation guidance. Again, you can see the relevant versioning and information that is provided to identify which assertions and which sections of the standard these requirements apply to.
As I mentioned, there are 11 areas of security, and they're all outlined here. I've actually taken out the ones that are largely documentation; as you can see, there are four sections that are, for the most part, documentation, where most of the requirements simply boil down to verifying that something's being done properly. For the other sections, there are actual requirements that need to be tested and that aren't really covered anywhere else within the standard besides those sections. I'll briefly go through each of these, and there's more detail on each of them as we go. The cryptographic module specification is the section that defines what the product is, what's being tested. Cryptographic module ports and interfaces is a very black-box look at a validated module. Roles, services, and authentication is pretty straightforward: who can use the module, how do they authenticate, and what can they do. Physical security is just as it sounds. The operational environment covers operating system requirements, which are really only applicable when validating software. Cryptographic key management deals with key life cycles, from the time keys are generated to the time they're destroyed. And the self-test section is probably the one that's missing from most modules that haven't been FIPS validated, because there are very FIPS-specific requirements for self-test implementations.
As I mentioned, the cryptographic module specification defines the validated module. The biggest weakness in this section is at lower levels, where configuring and using the module in an approved manner is dependent on user configuration; for low-level validations, at levels 1 and 2, it's up to the operator to ensure they're not using weak algorithms that might be detrimental to their security. At higher levels this gets a little stronger, because at levels 3 and 4 it's all enforced by the module: the product either always has to operate in a compliant manner, or it has to have a switch that turns on the FIPS mode of operation, thereby preventing users from misconfiguring the device.
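As an aside, here is a minimal sketch of what that module-enforced switch might look like; this is my own illustration (the class and the algorithm lists are hypothetical), not code from the standard:

```python
# Sketch of a module-enforced "FIPS mode" switch: once the switch is on,
# the module itself rejects non-approved algorithms rather than trusting
# the operator's configuration (the level 1/2 weakness described above).

APPROVED = {"AES-128-CBC", "AES-256-GCM", "SHA-256"}   # illustrative list only

class CryptoModule:
    def __init__(self, fips_mode: bool) -> None:
        self.fips_mode = fips_mode

    def select_algorithm(self, name: str) -> str:
        # In FIPS mode, policy is enforced here, not by the user.
        if self.fips_mode and name not in APPROVED:
            raise ValueError(f"{name} is not approved in FIPS mode")
        return name

module = CryptoModule(fips_mode=True)
print(module.select_algorithm("AES-256-GCM"))   # allowed
try:
    module.select_algorithm("DES-CBC")          # rejected by the module
except ValueError as err:
    print(err)
```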
The cryptographic module ports and interfaces section is black-box testing; it defines the requirements for the types of data that flow. At levels 1 and 2, really the only difference is that there's no need to have physical or logical separation of critical data, so essentially keys, passwords, and other sensitive information can go through the same physical ports as normal data. At levels 3 and 4 there has to be some physical or logical segmentation of this data, and also, if there's plaintext critical data, then you have to have what they call a trusted path, or a directly attached cable. The trusted path is a little vague, and it actually creates a bit of a catch-22: one thing that was originally viewed as a trusted path was a mutually authenticated TLS session, but the problem with that was that now your data is encrypted, so you're no longer actually sending anything in plaintext. You had a trusted path, but you also no longer had to meet that requirement.
As I mentioned, section 3 is roles, services, and authentication, and the name pretty much says it all. At level 1 there is no authentication requirement, so basically there's no need for an operator to authenticate; this is probably the most common level for software validations, because for the most part software applications, and particularly software libraries, don't implement any sort of authentication. At level 2 there's what's referred to as role-based authentication. The downside to this, obviously, is that there's no real accountability: you basically have a set of authentication roles, and 20 people may all have that login information, so you don't particularly know who's actually logging in or who's actually making the changes. The other thing is that password lengths can only be enforced by policy, so again it's up to the operator to ensure they're using properly strong passwords. At levels 3 and 4 it gets better again; this is a common theme of the standard, that higher levels gain you a little more security. They have identity-based authentication at these higher levels, which includes some user accountability, because now each user is uniquely identifiable; what roles they can assume is based on credentialing, so you would have individual username and password combinations, or individuals would have their own unique certificates, and each person would be uniquely identified when accessing the module.
The password requirements are relatively weak, and I think this is actually a result of age more than anything else; as I mentioned, the standard is now almost 10 years old. The strength requirement is that there be no more than a one-in-a-million chance of an attacker brute-forcing the authentication information, and a simple four-character alphanumeric password is enough to meet that. Obviously nobody's going to think a four-character password is secure; even a six-digit PIN is enough to meet this requirement, so it's a very weak requirement in that sense. There are no restrictions on the types of passwords either, so dictionary words, or all your colorful four-letter words, could technically be legitimate passwords that meet the one-in-a-million chance. The second requirement, a one-in-a-hundred-thousand chance, is based on multiple attempts within a minute; in addition to preventing a single brute-force guess, you have to find a way to limit people from just pounding away until they successfully get a password that matches. This is normally enforced via lockouts, the most common method. That kind of ignores long-term attacks, though; but I think one of the reasons they've focused on a single minute is the idea that, for a long-term attack, if somebody has that much access to your hardware, you're pretty much out of luck anyway. The future might be a little better, and that all depends on what the final version of 140-3 looks like, something we can discuss; I don't know if we'll have time to get through it today, but if you have questions about that, we can look into it a little deeper.
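A quick back-of-the-envelope check of those numbers (my own illustration; the lockout threshold is a hypothetical example):

```python
# Why a 4-character alphanumeric password meets the 1-in-1,000,000
# single-attempt bound, and a 6-digit PIN does too.
alnum = 26 + 26 + 10                    # a-z, A-Z, 0-9: 62 characters
print(alnum ** 4)                       # 14,776,336 possible 4-char passwords
print(1 / alnum ** 4 < 1 / 1_000_000)   # True: below the required bound

print(1 / 10 ** 6 <= 1 / 1_000_000)     # True: a 6-digit PIN exactly meets it

# The per-minute bound: success probability over a one-minute window must
# stay below 1 in 100,000. A lockout after a handful of failures does it:
attempts_per_minute = 10                # hypothetical lockout threshold
print(attempts_per_minute / alnum ** 4 < 1 / 100_000)  # True
```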
As I mentioned, physical security is not really applicable to software modules, because they're all reliant on computers. The requirements are broken down not only by level but also by the type of product being validated: single-chip modules, products like integrated circuits or smart cards, have a different set of requirements from multi-chip standalone or multi-chip embedded devices, where you might have routers and switches, or hardware security modules like PCI crypto accelerators. At level 1 there essentially is no physical security; the requirement is essentially that the module be made of production-grade components, so the testing is limited to ensuring that it's not just a bunch of components soldered together in somebody's garage, and that it was actually commercially produced. At level 2 they move to opacity and tamper evidence: at this level what they're looking for is that you can't view the internals of the module, and that if somebody does open up the enclosure, you can actually tell they've done so. Level 3 is tamper response: in this case, if you have an enclosure that's removable, they're looking for a response to be taken when somebody opens the enclosure, and that response is zeroization of the keys within the module. At level 4 they have tamper detection, and level 4 is very rare, in part because of its requirements: tamper detection essentially means you have an envelope around the product such that any attempt to puncture, drill, or mill the enclosure results in the zeroization of the keys within the module, thereby securely destroying any plaintext keys that might be used by that module.
So what is opacity? It's actually a very subjective requirement of the standard, and that's part of the reason I wanted to take a deeper look at it. Ventilation can be a little tricky, especially for networking modules, which are the most common level 2 validations, and the interpretation has changed over time; I'll show you an example of that in the coming slides. Previously, in order to actually fail this requirement, you had to be able to make out manufacturer information and part-number information off the integrated circuits and components within the module. That's no longer required; now they will even take outlines, or silhouettes, of components as being sufficient to fail this requirement. So it is very subjective, and it comes down in part to the government reviewer as well as the testing laboratory.
In this example I've shown an obviously failing case, and it actually shows up pretty well. As you can see, there's one circuit you can make out; it's kind of blurry, but it's actually an Intel chip, and it was a surprisingly good picture considering I had to use my cell phone to take it. Typically we would use a better camera than a cell phone camera to capture these images, and obviously what you can make out with the human eye is a little better than what you can get with a lens, because we can't really poke a lens through those grates that you can clearly see inhibiting the camera. So in this case we have an obviously failing case under both the old methods and the new.
In this example we have a case that's a little more vague. It's the same device, but taken from a different angle and a different location. In this case, really the only integrated circuit you can make out is the side of one here; you can't really tell what it is, and actually, even when holding it up and using the human eye, it's very difficult to make out any part information from that chip. Under the old system that would be acceptable; under the new one it's a little more vague, and typically this would be considered a fail, because you can see the component even if you can't necessarily make out what it is. One of the big complaints about this requirement, obviously, is: what does that really gain you? If an attacker really wanted to know what's inside the product, they could go buy it; pretty much every hardware device that's been validated is the same off-the-shelf item that the vendors sell.
This last one, which looks like a giant gray box, is the top of the case. This is a pretty apparent pass; if you can see through that, then you have x-ray vision, and good for you and your superpowers. So obviously in this case there is no opacity concern.
So, tamper evidence and response. At level 2 it's tamper evidence: it just needs to be apparent that an attacker has attempted to compromise the system. There's a little bit of weakness here, because there's limited testing that can be performed by the labs; in particular, they're not allowed to add new materials. Now, obviously, if labs could sit there and spend 10 hours taking off a label and painting it back on, that might be a little excessive for testing, particularly for what they're trying to capture at level 2. However, there are some modules that honestly could probably be failed with nothing but a Sharpie, filling in a little bit of space with a matching color; since you aren't allowed to do that as part of testing, it's a little disingenuous to think you're actually gaining much from the tamper evidence. Obviously, sometimes the labels fail so magnificently that it's not an issue. At level 3, as I mentioned, you have to respond to tamper events: when a door or cover is removed, you have to zeroize all the keys, essentially destruction of the keys. Zeroization is one of those nice terms they decided to invent; it's all about key destruction.
For the operational environment, at level 1 a single-user mode is defined, and this is a definition that's changed over time. About four or five years ago, the definition used to be that, in order to be validated as a software module at level 1, you had to be tested and validated on a system that could only log in a single person at a time; they went so far as to provide guidance to the individual labs on how to configure a Unix-based system to operate in single-user mode. That was a little ridiculous, because it ignores the entire client/server architecture and the fact that most servers are going to have more than a single user at one time. This definition has changed a little; they've vaguely hand-waved it away by saying that single-user mode means only a single instance of a software module can be accessed by a single user, under the thought process that when you load an application into memory, different operators loading that application would supposedly each have their own instance. Obviously this has its own weaknesses, but it has satisfied the server-architecture concerns; and again, for level 1, what you're trying to get from a validation is a little different than at levels 2, 3, and 4. At level 2 and higher, there's a requirement for a Common Criteria evaluated operating system, and in these cases it actually limits the platforms that can be validated, because you have to tie to specific validation platforms: for each Common Criteria validation of an operating system, there's a list of hardware platforms on which the testing was performed, and you have to match that. That greatly limits the usability and portability of some of the level 2 software modules, which is why you don't see a lot of level 2 software modules, and why you don't see any level 3 or level 4 software modules.
So, cryptographic key management. This is probably one of the two sections of the standard with the most requirements related to it. It deals with everything to do with the key life cycle: this includes random number generation, the key generation itself, how keys are entered and output as well as their establishment, the key storage requirements (which, as I say here, are mostly meaningless), and key zeroization.
Random number generation and key generation are both required to use approved standards. For random number generation, there's a set of approved deterministic RNGs, and symmetric keys simply must be generated using these approved RNGs. Asymmetric key generation methods have to follow the approved standards as well, and which standards apply varies based on which asymmetric algorithm you're using: for DSA, ECDSA, and RSA, the methods are defined in FIPS 186-2 and 186-3, and additionally, the key generation method for RSA keys in ANSI X9.31 is also acceptable.
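Purely as an illustration of the shape of this (my own sketch, not the standard's text), here's key generation with the third-party `cryptography` package; whether any of it counts as approved depends on the validated module underneath, not on these calls:

```python
# Sketch: symmetric and asymmetric key generation via the third-party
# "cryptography" package (pip install cryptography). FIPS approval
# depends entirely on the validated module underneath (e.g., a
# FIPS-validated OpenSSL build); these are just the usual library calls.
import os
from cryptography.hazmat.primitives.asymmetric import rsa

# Symmetric key: in a validated module this must come from an approved
# deterministic RNG. os.urandom() is the OS CSPRNG, shown for illustration.
aes_key = os.urandom(32)   # 256-bit AES key material

# RSA key pair: FIPS 186 constrains the generation method and key sizes.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()
print(public_key.public_numbers().n.bit_length())  # 2048
```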
So, key establishment and key entry and output; these are really the only requirements that vary based on behaviors the products don't necessarily control. Manual distribution methods are, I'd say, largely impractical, but they're considered relatively secure: essentially, if I wanted to assign the same key to a series of devices, I would walk around and punch the same key into each device, either using a key loader, or verbally telling it to somebody else (passing around hex characters that way would be real fun), or using tokens, which is probably the most practical of the manual distribution methods. Electronic distribution is keys over what they call unsecured media, and that covers your local networks, your wide area networks, the internet; any time you don't have real verification of the network, these are considered unsecured methods, and in those cases the keys have to be encrypted. You have to use methods like TLS, SSH, and Diffie-Hellman in order to actually protect the keys.
For key storage and zeroization: as I mentioned, the key storage requirements are pretty weak. There are no real requirements on the form of stored keys, so it's not required to encrypt keys, even at higher levels. The other storage requirements are at best a little vague; they have what they call an association of a key and an entity, and a lot of times that boils down to "the key is associated with the product it's in," or maybe it's tied to a specific user, but there's no real requirement on how that key is stored. Zeroization, which boils down to just the destruction of keys, involves actually overwriting keys, and that matters a lot of times for software validations in particular, because most of the time they just do simple free operations, which may not necessarily destroy the data; so the standard actually requires that keys be overwritten with zeros, ones, or random data. Zeroization needs to exist for all plaintext keys, but it can be done procedurally, so there's no requirement for when it's performed or how; it doesn't need to be automatic for any reason at all, except for tamper response with physical modules at levels 3 and 4.
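To make the free-versus-overwrite distinction concrete, here's a minimal sketch (my own illustration; a `bytearray` is used because immutable `bytes` can't be overwritten in place, and even this is only a best effort in a garbage-collected language):

```python
# Sketch: "free" vs. zeroize. Dropping a reference does NOT destroy key
# material; the bytes may linger in memory until something reuses them.
import os

key = bytearray(os.urandom(32))   # mutable buffer holding key material

# Anti-pattern: "del key" here would only drop the reference; the
# underlying bytes can survive in freed memory, which is exactly the
# failure mode the zeroization requirement addresses.

# Zeroization: overwrite in place with zeros before releasing it.
for i in range(len(key)):
    key[i] = 0
assert all(b == 0 for b in key)
del key
```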
So, the self-tests. There are two categories: the power-up self-tests, which basically boil down to a series of known-answer tests for the approved algorithms plus a firmware integrity test, and the conditional tests, which are done for specific services. There's a continuous random number generation test, designed to ensure you aren't generating the same value over and over again; the funny thing is that there's still some probability you could legitimately generate the same value twice, which you technically can't with a continuous RNG test in place. There's the pairwise consistency test, designed to check the validity of asymmetric key pairs. There's the firmware load test, for when you have hardware modules that can load and upgrade their firmware; the bypass tests, which we'll go into a little more because they're a little more complicated; and the manual key entry test, which is essentially there to ensure integrity when you're doing manual key entry. That's very rare; the only time you do manual key entry is if you're actually punching keys in through a touchpad on the product itself.
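Here's a minimal sketch of two of those self-tests, my own illustration of the general idea rather than anything out of the standard: a SHA-256 known-answer test using the well-known test vector for "abc", and a continuous RNG test that rejects a repeated block:

```python
# Sketch: a power-up known-answer test (KAT) and a continuous RNG test.
import hashlib
import os

def sha256_kat() -> None:
    # Known-answer test: hash a fixed input and compare to the expected
    # digest (this is the standard SHA-256 test vector for "abc").
    expected = "ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad"
    if hashlib.sha256(b"abc").hexdigest() != expected:
        raise RuntimeError("SHA-256 KAT failed: module must not operate")

class ContinuousRngTest:
    # Conditional test: each new block must differ from the previous one.
    def __init__(self) -> None:
        self._last = os.urandom(16)   # first block is consumed, never output

    def get_random(self, n: int = 16) -> bytes:
        block = os.urandom(n)
        if block == self._last:
            raise RuntimeError("continuous RNG test failed: repeated output")
        self._last = block
        return block

sha256_kat()
rng = ContinuousRngTest()
print(rng.get_random().hex())
```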
So, as I mentioned, let's take a little more of a look at the bypass tests. There are two types: an exclusive and an alternating bypass. The exclusive bypass is basically having a switch where encryption is either on or off, so you're either encrypting all data going through the product or letting it all pass through without any processing. In an alternating bypass you have differing channels, where some data might be configured to go in plaintext and other data configured to go encrypted; the most common example would be a router, where you might have data going in multiple directions, some of it through an IPsec tunnel, and the configuration determines which flows are actually encrypted and which are plaintext. The bypass test has one common theme: don't accidentally pass plaintext data. There are a couple of steps to the process and a couple of requirements related to it, but it's most commonly accomplished using a test packet for network devices, usually through a loopback: you send a test packet and verify that it's being encrypted, that you're not just spitting out plaintext data when you should be encrypting it.
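And a toy version of that loopback check (my own sketch; the `Channel` class is a hypothetical stand-in for a real device's send path, with Fernet from the `cryptography` package standing in for the device's cipher):

```python
# Sketch: a bypass test in the spirit of the loopback check described
# above. The "channel" is a stand-in for a real device's encrypt path.
from cryptography.fernet import Fernet

class Channel:
    def __init__(self, encrypting: bool) -> None:
        self.encrypting = encrypting
        self._cipher = Fernet(Fernet.generate_key())

    def send(self, data: bytes) -> bytes:
        return self._cipher.encrypt(data) if self.encrypting else data

def bypass_test(channel: Channel) -> None:
    probe = b"BYPASS-TEST-PROBE"
    out = channel.send(probe)
    # If the channel is supposed to encrypt, the probe must not appear
    # verbatim in the output; otherwise plaintext is leaking.
    if channel.encrypting and probe in out:
        raise RuntimeError("bypass test failed: plaintext leaked")

bypass_test(Channel(encrypting=True))   # passes silently
```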
So, the best and the worst of the requirements. From the good side, there's the enforcing of stronger algorithms: one thing I was very surprised about is the number of people who, when you talk to them, still use algorithms like MD5 and DES to do data protection, and that's one thing the standard does do; it tries to ensure that only strong algorithms are being used. The physical security at higher levels is actually very good: at levels 3 and 4 you're getting what you would expect for physical security, a module that is destructible and that's not going to retain information after it's been tampered with. The bypass test, while it can be a little cumbersome, I actually think is a very good policy in and of itself, even without the standard; you never want to accidentally send data in the clear that you don't want to, so it's a very common-sense test. And then there's encrypted electronic key entry: passing keys in plaintext over wires is never a good idea, but again, you'd be surprised how many people do it.

The worst? Well, the limitations on physical security testing, as I mentioned: the limits on what the labs can actually do from a testing perspective on the tamper evidence are bad, because they don't take into account the level of skill an attacker would have. The limited zeroization requirements: not forcing the module to actually perform zeroization at set times is not going to improve security at all, so that's one of those items that's not really addressed well, namely how to protect the keys when somebody gains access to the product. The third is something I didn't jump into too much, but it's how hardware-centric the standard is: if you've ever read the standard, or if you ever do decide to read it, you'll notice that a lot of the requirements are very hardware-centric, and when it comes to software, some of them just don't make sense and there's no easy translation. This has actually created issues sometimes for software products, because they're trying to meet requirements that don't necessarily fit the mold of the standard. Another item is the lack of protection for key storage: while it's somewhat understandable at lower levels, I do believe that higher levels of security should come with at least some requirements that certain keys be protected, not only while they're being distributed but also while they're stored in the module; obviously, if you have plaintext keys within a product, then they're vulnerable to some level of access if somebody can get physical access to the system. The last item is that the standard is largely ignorant of side-channel attacks, and again, this is largely based on the age of the standard; like I said, it's 10 years old. When they developed the standard, they created a documentation-based section as the approach to these requirements, allowing vendors to say "yes, we protect against side channels" without necessarily requiring any testing of it.
That's something they're potentially going to correct in 140-3, but how many types of modules, and what products, it will apply to is still kind of up in the air.
As I've mentioned a couple of times, there is a new revision of the standard being drafted. The new revision has been in development for almost seven years now, and it's almost become a little bit of a wait-and-see. There are some new requirements in the current draft, particularly authentication requirements that need to be enforced by the modules themselves: instead of telling a user "you have to use an eight-character or twelve-character password," those rules would be enforced within the device, which would actively reject small passwords. Early revisions included specific strength increases, and I'm hopeful those will remain, or that they'll be added back in some way, shape, or form. There are, as mentioned, side-channel testing requirements, particularly at higher levels; at a minimum these would be for single-chip modules, and there's been talk of expanding that out, or optionally allowing multi-chip modules to also be tested against these side-channel methods. There are also improved zeroization requirements that limit procedural zeroization: it's no longer as easy as saying "well, when you want to zeroize a key you can call this command, but it's up to the user to perform it." They've tightened the circumstances under which those types of zeroizations can be performed, to prevent overuse of procedural key destruction.
So when is the future coming? Well, that's a good question. As I've mentioned, the standard has been in development for seven years; currently the best guesses are by the end of 2012 or early 2013. Pessimist that I am, I've actually started taking a very negative approach, which is that I'll believe it when I see it. What I'll say is that I think a large part of the reason for the time it's taken to develop the standard is that they feel what they have is relatively good for what they're trying to do, so they've taken their time in developing the new version. The current public draft is actually rather dated; it was released in December of 2009 (it was a Christmas gift, a very bad Christmas gift, nonetheless). From conversations I've had with people at NIST, they have internal drafts that have made some changes and corrections based on publicly provided comments, but nobody outside of NIST knows for sure what's actually in that standard. It should be, and will be, an improvement over the FIPS 140-2 standard, but it's still by no means perfect; if somebody came out with a perfect standard for security, then I guess we'd all be out of jobs.
So, to summarize: the FIPS 140-2 standard does provide some good requirements that can be improvements over baseline security. As I mentioned, there are a lot of products I've had a chance to look at over the years where simple changes would greatly improve the security of the product. While it is a good first step, it obviously doesn't guarantee that you are any safer, particularly since a lot of times it's up to you to make sure you're implementing it properly. I recommend incorporating some of the good into projects you might be working on; if you're doing cryptographically relevant development, I recommend implementing some of those items, for two reasons: one, some of the requirements are actually good and can improve security, and two, you never know; someday a federal government agency may actually want to use that product, and you'll have to implement it anyway if you want them to procure your product.
I have included a couple of links in the slides. The first two are to the NIST pages for the Cryptographic Module Validation Program and the Cryptographic Algorithm Validation Program. I didn't speak a lot about the CAVP, but one thing to note is that, if nothing else, they're good for testing your algorithm implementations: if you're ever in doubt whether your implementation is sound, they have links on their page to test vectors that can be used for verifying your implementations, and you can always go to a validation lab if you want to get official algorithm validation certificates. I've also included a link to the subpage on the CMVP's website that has the FIPS 140-2 standard as it's currently drafted and used, and a link to the NIST page for the development version of FIPS 140-3; as I mentioned, the public draft is a little dated, but it's the best that's available right now for public consumption.
I do have a little bit of time left over to answer any questions, if you have any; if we run out of time, we can always move to the Q&A room. You can yell really loud and I'll repeat the question, as long as I hear it. Right, so the question is about OpenSSL. They do have a validation for what they call a FIPS Object Module, and as you mentioned, it's not updated as frequently as the main OpenSSL path. What they've actually done is what a lot of labs consider a software-based validation, which is kind of contrary to what everyone had deemed acceptable in the past. Essentially, you can take the OpenSSL FIPS Object Module, build it, and compile it into a version of OpenSSL; so long as it's built according to the directions in their security policy and compiled into a version of OpenSSL, or into your own application, it would be acceptable and would be FIPS compliant. One key thing to note, though, and something I will admit a little bit of ignorance on, is whether the Object Module will work under both the old 0.9.8 trains and the 1.0 ones; I do not know if it works with the newer OpenSSL or not. I do believe they're going through a new validation for a new version of the Object Module, and that may be for a version that's compliant with the new versions. Yes? So the question was basically about the requirements in 140-3, about the use of password enforcement within the module, and whether that's a change from the industry standard of requiring those through policy rather than as part of the product. They actually are trying to push away from policy-driven items like that, in part because when enforcement is left to the user, a lot of times those policies aren't being implemented, or users aren't creating strong enough passwords. So the thought process is that if you had, say, a network appliance with an administrator password, the device itself would say, "when you set up your password, you only gave me a six-character password," and reject it outright, instead of just letting it go and putting it on the end users to enforce. One of the original requirements that got a little bit of flak was that an early draft specifically called out preventing dictionary-based passwords, and one of the concerns, obviously, was that all the vendors said, "well, we don't want to have to implement that"; and from a testing perspective it's like, well, are we supposed to attempt to enter a couple hundred passwords, all using dictionary words, and see if we get rejected or not? So there's definitely a scoping issue with testing on that; it's one of those ongoing issues with the validation process. Okay, so we're out of time. If you have any other questions, we'll be in the Q&A room; it's in Miranda 1, so it's right back where we all came from.