TLS Canary: Keeping your dick pics safe(r)


Formal Metadata

Title
TLS Canary: Keeping your dick pics safe(r)
Number of Parts
109
License
CC Attribution 3.0 Unported:
You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor.

Content Metadata

Abstract
The security of SSL/TLS is built on a rickety scaffolding of trust. At the core of this system is an ever-growing number of Certificate Authorities that most people (and software) take for granted. Recent attacks have exploited this inherent trust to covertly intercept, monitor and manipulate supposedly secure communications. These types of attacks endanger everyone, especially when they remain undetected. Unfortunately, there are few tools that non-technical humans can use to verify that their HTTPS traffic is actually secure. We will present our research into the technical and political problems underlying SSL/TLS. We will also demonstrate a tool, currently called “Canary”, that will allow all types of users to validate the digital certificates presented by services on the Internet. Speaker Bios: Evilrob is a Security Engineer and Penetration Tester with over 14 years of experience with large network architecture and engineering. His current focus is on network security architecture, tool development, and high-assurance encryption devices. He currently spends his days contemplating new and exciting ways to do terrible things to all manner of healthcare-related systems in the name of safety. Twitter: @knomes xaphan is a "Senior Cyber Security Penetration Testing Specialist" for a happy, non-threatening US government agency. He has been a penetration tester for 17 years, but maintains his sanity with a variety of distractions. He is the author of several ancient and obsolete security tools and the creator of DEFCOIN. Twitter: @slugbait
Transcript: English (auto-generated)
How are we doing? Do we have? Yes, we do. How many people have been watching the closed captioning all weekend? Yeah? We like that? I've had a lot of people ask me, what software is that? That's not software. That's meatware. There's
actually a person on the other end of the wire that is typing that in. And they've been obviously very patient and probably doing the typing things that they normally don't type out in a regular day to day. So why don't we take this opportunity and ask our person doing the transcriptions, what have you thought of the conference so far? Well, thank
you very much. Since this has been going on, it's really
helped out and it's really added to the conference experience for everybody and we're really glad that you guys have been putting in the time to help us out. So let's give them a round of applause one more time. Great. Next, Jeff and Rob are going to talk about web browsers and encryption.
And we're at a spot where a lot of people, certainly our muggle friends at home, still look for the little green lock. If it's got the green lock, it has the same security banks use. Therefore it's secure, right? Military grade.
Yes. And we're going to talk about why that may not be the case and maybe some things that we can do to make sure that when you see that little lock that it might mean a little bit more that it's locked. Let's give Jeff and Rob a big hand. I hope that's not premature. I'd hold your
applause until you see what happens. First, I'd like to say I'm sorry to the person typing everything I'm about to say. So, you know, welcome to the TLS Canary, which is our mission of keeping your dick pics safe(r). I don't mean to look relaxed, but this is really the only way I can talk into this mic. So we're going to have a little intimate conversation going on here. So a little bit about us. I am Evilrob, Rob Bathurst. You may see me walking around. I'm the director for healthcare and biomedical security for an antivirus company called Cylance, except it's not McAfee, Symantec or any of those other things. I don't mean to insult your company, but your product kind of sucks. So I will turn it over to Jeff real quick. I'm Jeff Thomas. I do stuff with computers. And he is barely awake, I assure you. So
I'm glad you all decided to make it here in a semi-conscious hungover or fully awake kind of state. This is one of the greatest quotes in all of reporting I think that there could be. How many of you saw the actual interview that this was taken from? Was it not amazing? It was truly the best
thing. I was riveted the entire time. Because aside from the quantum project and everything else the NSA runs, I truly want to know whether my dick pics are being collected. The answer is always yes. The answer is always yes. So, you know, we want to talk about, you know, better ways to take
those dick pics that are being collected so analysts aren't bored and are generally intrigued by what angle you should take them at, if it's better from the side or up top. Always put a penny or something next to it just to, you know, enhance that. So realistically, you know, why this? Why this project? You know, things we like: ourselves, obviously. Some of you, a very small percentage, but, you know, you're out there. And systems that are not built on blind trust. I mean, the whole reason we started looking into
this and the whole reason that we built this project is because a lot of the infrastructure behind the Internet was an add-on and in addition well after the fact they're like, oh, yeah, okay, so I can see that webpage, but I really need to transmit something securely. Well, we could re-engineer it or just tack it on. It's fine.
It's fine. So we'll just use the same communications mechanisms and protocols and we'll throw some things in it that will sign some stuff and, yeah, it's all great. And, you know, I'm sure there's no other system out there like that other than, you know, the PKI. I was trying to give
them the benefit of the doubt. You know, things we don't like: ourselves, which is why we put on this talk, more of you and, you know, things that fail without warning. You know, I'm okay with systems that have failures and flaws and errors as long as they warn and yell and scream when those errors actually occur, unlike most things. So why? You know, I'm sure a large percentage of you are familiar with DigiNotar. An entire registrar was compromised by a lone Iranian hacker, one guy. He's just like, you know what, I'm
bored today. I could look at 4chan or I could go hacking and issue a bunch of certificates for no purpose other than boredom. You know, 531 of them I believe was the count that we came up with and it was used to intercept Gmail users in Iran and only Iran. You compromise an entire
registrar or entire certificate authority and that's all you do with it. I'm not sure what happened. We'll call that technical failure. Try again. Because about that time frame there's some things going on and intercepting Gmail is
what hackers do when they're really bored. This was on a bored Saturday night. He's like, you know what, I hate these people and I just want to read their email. Second thing, TURKTRUST. Two certificates were issued out of this. They were used to man-in-the-middle wildcard google.com. Wildcard google.com. How could that ever be a bad thing? Obviously Google doesn't mind. It was detected by some of the technology built into Chrome, with the pinning they do with their certificates. Only their applications will trust it. And talking about pinning, just a
quick precursor to what we'll be talking about. Google has done a lot of work in this and what we're doing is not to detract from anything they're doing. It's just their project is going to take a long time to get to the end. And obviously issuing the man in the middle certificates for Google with a wild card was for totally legit purposes. Like
this. Another instance we were able to pull up is the Brazilian Ministry of Mines and Energy. Allegedly, allegedly,
it was used to target individuals and organizations with the same thing. The same theme we keep going over. Your entire certificate system is based on trusting someone. Seriously. Like that right there is the epitome of everybody, don't worry, it's safe. It's full of kittens and
there's nothing that could possibly go wrong. And what we like to point out is these are just the attacks that are public, ones we know about because someone detected them, Google detected them, Microsoft detected them or there was some system out there, some organization looking out
for realistically their own self-interest. They weren't looking out for you, they were looking out for reputation damage, they were looking out for protecting their systems. They're going to say they're looking out for you but realistically there's not a lot of systems out there to protect you as an individual and give you the power to figure out what's going on. CAs are run by companies, agencies,
people, organizations, staffs. How many of you work in a corporate environment where we're going to load our certificate on it and it's cool? You'll get some e-mail from it, it will be great, we can contact you whenever we want and don't worry, we're not doing anything we shouldn't. But all people regardless and all organizations are
vulnerable to corruption, common mistakes, issuance of bad certificates because they got lazy that day. So those are things you really need to look at with this type of infrastructure. And it's just computers and software and a
bunch of signatures and hashes and things that get moved around and the way the system is built is we just blindly accept that. And from an attacker's perspective and I'm not saying the bored Iranian hacker by himself but from an attacker in general that may want to do a mass collection or government agency or something like that, CAs
are high value. They are absolutely high value because if you can get to the root CAs or even the intermediary CAs you have a tremendous amount of power over anyone who has to connect and authenticate through those CAs which is basically everyone. We've also extended certificates to be used for other purposes such as software signing and things that
can actually be embedded in your operating system. Being able to subvert those has a proven track record of allowing you to do some very nasty things. Or even as we're going to bring up an example, your citizenship to a country is controlled by a PKI certificate. So what is a
certificate? I'm not going to bore you with all the details but in general it verifies ownership of a key associated with a resource. It's issued by a certificate authority at some level that is trusted. And they are used on your
computers. They are used in applications. They are used on your cell phones. They are used for software updates. They are used for basically anything that we need to establish a unique identity of some kind at a high level. So how does it work? Well, again, it's all built on trust. It doesn't
actually work and it's super broken, but I can't go out and say that right away because then I sound like a pessimist. So legit certs are signed by a trusted CA. In this case, like a GeoTrust or a VeriSign or the large ones that collect lots of money for doing this time and time again. Our
browsers trust anything that is signed by these CAs by default because they are inside of our root of trust inside the browsers, cell phones, et cetera, et cetera. Sessions get negotiated by magic. I say that because my talk is not about how broken TLS and SSL are as a protocol, just the infrastructure around them.
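(As a minimal sketch of what that default trust amounts to on the client side, using Python's standard library and example.com as a placeholder: the default context loads whatever roots the platform vendor shipped, and any chain that terminates at one of them passes, no questions asked.)

```python
import socket
import ssl

# Sketch only: "trust" here just means the chain ends at one of the couple
# hundred roots the OS/browser vendor shipped. example.com is a placeholder.
ctx = ssl.create_default_context()              # loads the platform root store
with socket.create_connection(("example.com", 443)) as sock:
    with ctx.wrap_socket(sock, server_hostname="example.com") as tls:
        issuer = dict(rdn[0] for rdn in tls.getpeercert()["issuer"])
        print("vouched for by:", issuer.get("organizationName"))
```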
And then after you get that lock symbol, or you get that it's all good, you're free to send your very secure dick pics through them. And that's really what we're here for. From the user's perspective, they've been trained. If you see the green lock, this is considered progress. If they see the green lock, they know it's
okay to type in their password, they know it's okay to do their banking or whatever else they're doing online. Yeah, so if you get on the DEF CON open network and you see the green lock, you're good to go. Nothing bad is happening. Bally's just wants to make sure that you have the room number and credit card associated with wherever you happen to be all the time. So we call this the chain of
fools, right? It's I'm trusted by the world as the root CA for whatever I'm doing. You pay me some money. It's a lot of hard work to sign those certificates. So I sign your cert. I take your money. Your cert is now trusted by anyone who trusts me, which by the way is everyone because I'm in
your root store doing whatever I do. And now you can say you're whoever you claim to be. In the case of the man-in-the-middle wildcard cert for google.com, a computer that uses that as a reference for trust, or if you're being funneled into that proxy, you're not going to know. Your browser is going to give you the same hey, it's all good as the certs that Google issues. And that's because in your browser, any of the 200 certs that it has loaded in its root store is legitimately allowed to say that Google is this definite computer over here not collecting your data whatsoever. So this is the example I brought up earlier. You know, I got to credit
Citrix for this picture that kind of proved my point. I think they were trying to sell one of their products to the Belgian government to help them do the CA infrastructure and verification. But you have the GlobalSign root CA, you have the Belgian intermediary root CAs and you have a bunch of these citizen CAs down here that
verify the citizenship of the individual down at the very bottom, the army of people. So if you were to take control of one of those Belgian CAs or one of the citizen CAs that has the ability to say this person, this EID is legitimate and they are a citizen. What implication does that have for
border control? What implication does that have for actually verifying whether or not somebody is who they say they are? Because you know you've been trained not to be able to trust the physical picture. You've been trained not to be able to trust the paper in front of you because how easily forged it is. If you had control of one of these
CAs, it's like six lines of code and you've generated a citizen. I say this in a very broad kind of way. But that's the world is on fire perspective in this type of example. This also highlights a level of trust that we put in technology in this system, this very complicated system of
systems, in that they feel they can rely on it to make high-level decisions or to do a high level of verification of someone's identity and someone's citizenship. It's not like you can harden these CAs and stick them away and put them somewhere and no one can touch them, because at some point it has to connect to another system to another system to another system that's going to tell you where the one with the keys is hiding. And then you can target that organization and, you know, I'm sure there's been 50 other talks already about that type of stuff. That is the root and the sequence of trust that is the world we
live in today. So, you know, again, I'm going to do a basic overview of things. We have HTTP, POP3, SMTP, FTP. They're not an encryption mechanism. They were never designed to be encryption systems. They were communications protocols. At the
beginning, everything is all about how do we talk. Any time an application developer builds a program, it's like how do I get A to B? Okay. I got A to B. My design goal is done. Security is here. I need to go A to B in a somewhat secure way. So we'll just use SSL certificates and TLS and, you
know, we'll just forget about it. There's some security strapped on. It's great. But at the core of the problem, at the end of the day, everybody is just concerned with how do I get A to B? How do I get the dick pic from here to here and how do I make it look good? That is really the concern people have. So how do the secure tubes work? It's SSL. It
came from Netscape. Seriously. This stuff was tacked on. If you watch Moxie's talk back in 2011, he found the actual doctor from Netscape and not the MD kind, the PhD kind that actually wrote the original SSL libraries and
everything else. And he admitted, yeah, we just tacked it on. Somebody was like, hey, we need some security. We'll just band-aid it on the side. And that was 1.0. Since then, you've had RFC 6101, RFC 6176 and TLS 1.3, which is the new new draft coming out. And if you really want to read them and need some bedtime reading, by all means, go
ahead. It's not hyperbole to say that SSL is the reason we have the Internet we have today where business can be conducted because people feel safe putting their credit card online and giving it to a website because they know, they know, they see the green lock and that their credit card can't be stolen. Yeah. Anyone who says that marketing doesn't
deserve money has not been paying attention. Seriously, you can sell ice to an Eskimo if you have the right marketing guys. So, you know, why is it old and busted from a protocol perspective? SSL has a problem with using identical
keys for messages and encryption, lack of protection for the handshake. I'm not going to list out specifically what version uses what. But it's vulnerable to truncation attacks, cipher suites, problems with RC4. Something about a POODLE. Something about a POODLE and a BEAST and a something, Firesheep. Creative names, really. TLS is the solution. It's not SSL, and it took researchers, I don't know, like a year to be like, and it's just as broken as it was before. But it's broken in new and interesting ways. Ultimately it still rides on the same certificate authority
infrastructure. Right. It doesn't matter if we create TLS 55, it's still using the same certificates issued by the same things with the same problems that we were discussing before. So, interception. I can't play the music. I know. We were
going to do like the serious gopher, but unfortunately it's not quite conducive to what we're working with. But I did an inventory of iOS, it trusted about 226 certificate authorities. About. I didn't get an exact count at the
version I last looked at. You can go to Apple, they list all the ones they look at. My favorite is the Hong Kong post office. So if the Hong Kong post office wants to be Google one day, they can be Google. Firefox trusts about 180. You can go to that link. Again, they will list all of them. It is nice from a transparency perspective that most of these manufacturers of various systems actually list out the certificate authorities in their root store. Unfortunately, it's still a shit load. And there's more that get added every day. Countries can be like, you know
what, I need you to trust these five organizations and the certificate authorities. It's very rare and I think Microsoft was one of the first people to do it to actually pull certificates out during a refresh. You know, one of the
things to consider, too, from an issuance of certificate authorities is they're subject to many laws. They're subject to local laws, state laws, federal laws, international laws, what have you. If an organization decides that for lawful purposes they want to do some form of interception, they do all their government forms and you
tell me what CA in the world is going to be like, no, I'm just not going to comply with this. And it only takes one. It only takes one globally trusted root authority to issue a bad certificate for lots and lots and lots of people to have a very bad time. If you never looked at the list and you want to feel a little more paranoia, I do recommend going to one of these links. Just Google it. Look for yourself. You can see which certificates Chrome trusts, which Firefox trusts, which ones are embedded in Windows and the other various OSes. You can see just how many root authorities there are and the global span. They're in so many different countries under so
many different jurisdictions. It will make you worry. Something to also consider that we didn't actually bring up because it's kind of a corner case, the computers you're using, the manufacturers can put their own certificate on it and basically tell that system to trust it no matter what for
updates and everything else. But that type of activity, whether malicious or not, if implemented incorrectly, can wreak havoc in terms of what gets trusted if you can be intercepted while not actually realizing what's happening. So
there's a subversion of secure communications in the sense that there's basically two categories. There's the legal reasons to do it. You all go to work and you decide I want to use a work computer and they make you sign some forms and they're like okay, we can monitor everything you're
doing. You hit a Blue Coat, you hit some other appliance, it proxies it out, yes, you realize you're being watched. Doesn't stop people from browsing porn. I mean, seriously. But I spent the past five or six years of my professional life recommending that a lot of places do this because it's a good thing if you have to protect a
corporate network and you have good policies in place, you're subverting the security that's inherent in the communication protocol. Right. I mean, it would be ideal to have it end to end, but that doesn't work for most corporations. Which is why you get the load balancers, you front-end offload everything out, and now all of a sudden it's only secure from this point to this point, or it's only secure until it hits the perimeter and now it's no longer secure. So while you're browsing Gmail or anything else, it's all in clear text on the inside of the network. The last potentially reasonable expectation is a government request, although, I mean, that has its own geopolitical issues that are not really
something I want to get into. Which leads us to the next category. Which is the next category, the not so good but maybe secretly legal. You know, the government request versus government demand, criminals, the lone Iranian hacker that wants to break into something and my personal favorite, advertisers. It's much easier if you're not
secure to target more ads to you which makes more money for other people and so on and so forth. But they have a secret agenda or not so secret agenda to make sure they're able to serve ads. Well, they like to play it off as it's better for
you and the freemium economy. But a good example of this is actually NoScript. NoScript, I believe, or Adblock Plus, one of the two, is working on what constitutes a good advertisement.
And what can go through because it's not obtrusive. And these are systems that we rely on to protect ourselves from malicious drive bys and all kinds of other attack techniques that we're hoping these block but then we're like, well, it's okay because this advertiser is only it's a static content. There's nothing bad that can occur here. And that's the same type of approach we're taking
with the PKI and everything else. It's okay because somebody else has our best interest in mind. And the depressing reality check, most people are just conditioned to click okay and then when something bad happens they look like this guy over here. You know, their credit. You all
know this, that people are the weak link always. I mean, this poor sheep's credit card was used in Bulgaria and the bank won't give him back his money and that's how he looks in the end. So solutions. I stand up here and it's a doom and gloom perspective for everything and anything. The convergence
project which is what we mentioned earlier, Moxie did back in the day. I highly recommend going to look at it. It's some great work. It takes a different approach to what we were really attempting to do in our band aid. And then, you know, the more encompassing project, the
certificate transparency project. Google, you know, they have some power, I'm just going to put it out there. But certificate-transparency.org, while not fully implemented or public in any way, in the sense that they have something you can just grab and easily insert into everything you're doing, they do have it built into Chrome. They do
use a certificate extension for things they control called HSTS which allows them to do a form of pinning that allows them to say this is definitely a trusted certificate that I issued and I know nobody is subverting it and it belongs to me. I highly recommend you go look at it. What's that?
Thank you. What was that? It was HDKP. HPKP. Thank you. Sorry. I got all my stuff stolen at Mandalay Bay by a very
enterprising thief. I haven't gotten to that slide yet. I'm a little frazzle that has had to re-write a bunch of plug-ins in the past 24 hours. One of the problems with the current infrastructure is that a certificate can be issued in secret. There's no way to see that a certificate has been
generated. So the 531 in the previous examples and God knows how many others that have been generated exist and they won't be noticed until something goes wrong. A user notices that the certificate is by the wrong authority which
just doesn't happen, or they try to hit Google, which uses certificate pinning, which is a great way to detect when some chicanery is going on. The cool thing about certificate transparency is that nothing happens in a vacuum. All the peers involved in the system, the transactions, are visible and
public. And because they're public, they can be verified or noticed when something bad happens. Exactly. Go look at it seriously. In the end, it's going to be that or a project like that that really fixes a lot of the issues associated with the infrastructure we're using. So
pinning. Basically, every application should be using a form of pinning, but it's cumbersome to get done. But at the core of it is: trust this and only this certificate to say that I am who I am. This model works really well
with mobile apps that are dedicated to your Facebook app on your phone or Twitter. It's easy to pin a certificate in the app because the app developer is the one providing the service so they can embed that and it's truly end to end. It's great. When it works, it works. Unfortunately, it's hard to
configure. If you go out to OWASP, they tell you how to do the pinning, and it's this whole web page. From a normal administrator's perspective, whose day job is not to deal with this type of thing, it's going to turn into one of those "I could, but" type of situations, which is what in the end we really do not want. Implementation varies. Obviously, the Google one I had mentioned and incorrectly acronymed. And then the example I had mentioned before, the Hong Kong post office cannot issue a cert for your send-my-dick-pic application, which is really what we're here for in the end.
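(Here is roughly the pinning idea as a standard-library Python sketch; it pins the SHA-256 of the whole leaf certificate rather than the SPKI hash that HPKP actually uses, and the pin value and hostname are placeholders, not real values.)

```python
import hashlib
import socket
import ssl

# Placeholder pin; a real app would bake in the known-good value for its server.
PINNED_SHA256 = "0" * 64

def presented_fingerprint(host, port=443):
    """SHA-256 of the DER leaf certificate the network path actually presents."""
    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE        # we only want the raw bytes here
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return hashlib.sha256(tls.getpeercert(binary_form=True)).hexdigest()

if presented_fingerprint("example.com") != PINNED_SHA256:
    raise RuntimeError("certificate does not match the pin; assume interception")
```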
So, caveats. We're not developers, we're hackers. I write scripts to achieve an end goal. I don't write code to do enterprise products, which is why we are putting this out there in an open format, so others that may want to contribute to the project can really contribute. We have no
power. I do not represent Google or Microsoft or the IETF or anyone like that. We are simply trying to find a temporary solution to what seems to be a permanent problem that we can go with. Canary, our application and our project, is not the solution. But it is a far sight better than what is
currently out there in the silently failing, you never know who you're trusting type of approach we're using today. And yes, I got robbed. And there are some various ideas as to how that occurred. But the end result is that stuff got
stolen out of my room and I lost my laptops and had to rewrite like three or four hundred lines of plug-ins last night. I obviously should have backed it up to Git or a repo of some kind prior to not realizing my stuff would be stolen. So, if it looks a little hodgepodge in some of
the upcoming things, it's because I was up most of the night writing them. So, our goals. We want to protect your dick pics from bored analysts. If you need dick pic consulting, it's not us, but we'll try to protect them if somebody were to get ahold of them, in transit only. User awareness. We want to make sure that you realize the site you're going to and the certificate chain you're looking at is actually the legitimate one, the one you're going to and the one you want to be at. And really, it's stopping shady shit. I mean, come on, like this, the whole infrastructure we're on is meant to be a good idea and turned into a horribly abused idea. It's meant to be
trustworthy. Yeah, it's meant to be trustworthy. So, what the tool does. The tool is built in kind of a client-server model. We built it that way so you could specifically leverage our infrastructure and build your own plug-ins and do whatever you want to do, and we'll show that here in a second. At the base, it does a cert comparison between what you see on your end, in the browser plug-in, and what the distributed system of servers sees. So, you go out and it basically will collect your cert chain locally, send it to our server, compare the cert chains; if they match, hey, you're good. If it doesn't match, hey, you might have a problem. That is version one. That is what it does. It tells you, hey, you have a problem because what you're seeing is not what the rest of the world sees. Maybe that's where the name came from. If the canary dies, then you have a problem. Yes, no one wants to kill the tiny, fat, cute bird, but if he dies, he dies.
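(A sketch of that client side in Python; the canary URL, the JSON field names, and the zero-or-one response are illustrative guesses, not the project's actual schema.)

```python
import hashlib
import json
import socket
import ssl
import urllib.request

CANARY_URL = "https://canary.example.net/check"     # hypothetical canary server

def local_fingerprint(host, port=443):
    """What my own network path presents for this host (SHA-256 of the DER leaf)."""
    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return hashlib.sha256(tls.getpeercert(binary_form=True)).hexdigest()

def canary_agrees(host):
    """Ask a canary outside my network path whether it sees the same certificate."""
    body = json.dumps({"host": host, "port": 443,
                       "sha256": local_fingerprint(host)}).encode()
    req = urllib.request.Request(CANARY_URL, data=body,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.load(resp)["mismatch"] == 0      # 0 = world sees what I see

if not canary_agrees("example.com"):
    print("canary died: your view of example.com differs from the world's view")
```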
So, what it doesn't do: protect you from compromised sites. I cannot tell you that Facebook.com is jacked up if Facebook.com is jacked up around the world. It does not protect you from subpoenas, warrants, people kicking down your door, someone hacking, you know, someone controlling the AS from a BGP routing perspective, and it will
not encrypt or protect your dick pics at rest. That is not what our application does, but I'm sure someone else is willing to sell you something that will. All canary does is provide multiple perspectives. So, if your link is being hijacked or intercepted or routed through a transparent proxy, you can submit a query to canary, where one of our distributed servers will be outside of your jurisdiction and compare what you see to what the world sees. And if they're different, then there's probably something going on, something wrong. And we are going to release the server code onto our repo. So, if you want to spin your own up, you can spin your own up. And you might be like, well, obviously some nefarious organization could just spin up a bunch of these servers and put them somewhere they control. But realistically, I'm not that interesting. So, I doubt they would do that. But if everyone adopts it, maybe they will. Who knows? So, the goal is a globally distributed network outside the jurisdiction of one individual
organization, power, agency, control area, you know, what have you. It is scalable. It's designed to be able to accept many, many, many requests per second and the return is basically a one or zero, it's good or it's bad. That's V1. We're going to eventually hopefully build in, you know,
certificate chain history, so it keeps track of whether any changes occurred anywhere in the chain, or, you know, some of the more features around round-robining across the globe and stuff like that. Have any of you heard of the SSL Observatory? A very cool project. One of the developers
actually, I contacted them about using some of their code. We didn't end up using it. But what they do is they scan the internet looking for SSL web servers and gather those certificates and keep them in a database. And they've been doing that for a long time. So, they have multiple snapshots and running history of SSL certificates. And the
canary has the potential to do the same thing. Rather than doing the active scanning, it relies on user queries to build this database. It's going to be focused on the most commonly used, the most popular websites. It's not meant to be a scan over time. As Jeff was saying, it's meant to be as it happens. I don't want to compare against a month ago. I want what's presented at this particular moment. So, this is the JSON structure. This is basically it. And again, we'll release this. But this is all it takes to submit a query to our server
and get a response back, whether or not it matches. I'd also like to take this time to thank Red Beard. He is our silent third presenter here. He was instrumental. I'm learning Go because of him. It is a very cool language. So... I am not learning Go. Yeah. He was instrumental in
helping us get this done on time. Yes. I cannot thank my friends enough. Jeff and Red Beard and forkus and various other people who helped recover from the enterprising thief. And helped actually make this possible. So, I just want to say thank you to the man with the beard up there. I was just... If you look at the JSON structure, it's very straightforward. We don't want your information. We don't want to log. And if you look at the code, there
is no logging other than that a query was made. But all you have to do, in whatever language or tool you want, is construct this JSON structure that just has these fields in this form. We go out and grab, from the same website that's presented in the query, the TLS certificate, compare it to what you gave to us. If it matches, we return a yes, or a zero. If it doesn't match, we return a one. It's as simple as that. The only IP or name it ever keeps is the target that we have to pull the certificates from. It doesn't keep yours. It doesn't keep a history on anything else. Once we've pulled it, we just keep the host name and the certificate chain, and we don't really care about the IP anyway.
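(And the server side of that exchange, sketched with Python's standard library: take the query, fetch the certificate ourselves, compare, answer zero or one. The field names match the made-up client sketch above rather than the real schema, and a real deployment would of course sit behind TLS itself. Something this small is also roughly all you would need to spin up a private canary of your own.)

```python
import hashlib
import json
import socket
import ssl
from http.server import BaseHTTPRequestHandler, HTTPServer

def fingerprint(host, port=443):
    """SHA-256 of the DER leaf certificate as seen from this server's vantage point."""
    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE        # raw bytes only; no trust decision here
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return hashlib.sha256(tls.getpeercert(binary_form=True)).hexdigest()

class CanaryHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        query = json.loads(self.rfile.read(int(self.headers["Content-Length"])))
        seen = fingerprint(query["host"], query.get("port", 443))
        mismatch = int(seen != query["sha256"])            # 0 = match, 1 = mismatch
        body = json.dumps({"mismatch": mismatch}).encode() # no client info is logged
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8443), CanaryHandler).serve_forever()
```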
So this is essentially version one; this is the plug-in I was writing last night. It's a Firefox plug-in. It turns out the NRO does not have a very secure website, apparently. So your dick pics could potentially be stolen. Mostly because our
system also detects HTTP versus HTTPS. It's going to warn you the same way, hey, this isn't secure. But it basically throws a red banner up around it, gives you a JavaScript alert. I don't know how many of you have programmed Firefox extensions, but it's essentially JavaScript or Python or C depending on what you're trying to do with it. This particular add-on uses JavaScript and is like, hey, this doesn't match. There's no cert attached to it. Hey, your dick pic is going to get taken by these guys. Although I really want to meet you if they redirect a billion-dollar intelligence satellite to get your dick pic. I think you would be a very interesting person to have a beer with. Just saying. And essentially, the next version, we said it's a binary comparison. It goes up, hits it, it says yea, it says nay. If you hit the next one, this is a good
place to submit your dick pics. And the only difference being the green border in this case, again, I wrote the plug-in to do red or green. Green border says, yes, we've taken the certificate chain, it matches what I see, it matches what the server sees. And when it did the comparison, it's like, cool, if you want to put your dick pic here, it will
be protected by the same certificates that you're seeing that we're seeing. And this is where real developers can actually contribute because we know Firefox and we can see there's hooks for intercepting SSL calls. We can intercept the web request before they go out. And if the certificate chain
doesn't match, then we can block them. But when you're dealing with an Android or iOS, it's a lot more difficult to intercept. So if you open up your browser and try to log into your bank or whatever, your password could be transmitted before canary knows that something has gone
wrong. So your credentials have been compromised, your activity has been logged, and it didn't serve its purpose. Yeah. The browsers are okay. There's stuff actually built in from Netscape libraries that can allow you full control over some of the stuff with Chrome, some of the stuff with
Firefox. But it really is, as Jeff was saying, the applications, where iOS has tight control over execution and network hooks, where it becomes more of an after-the-fact warning that we're trying to work with and get around. So why use it, right? The whole of everything, why use it? It's here to
provide awareness. It's here to provide, hey, something isn't right. I'm looking at this site and it's not matching. Is it just me? Is it the rest of the world? What's going on? We don't want silent failures. We want bright, screaming, red, flaming, lights-on-fire failures where it becomes really obvious. It's super lightweight. The whole point is for
you to take the code examples, take whatever we have, take it, throw it in, do it yourself or use what we have. We don't cache data requests. We don't, we might start caching observed certs, but that really has less to do with you and more to do with the general infrastructure of the Internet. And we will never store your personal requests.
This is the SSL Observatory-lite part of it, where it's more of an intellectual curiosity and to be able to speed up, if it does start becoming popular, God forbid, being able to speed up the queries and also do correlation and history. When
something changes, if it changes before the certificate expires and just general tracking. Yeah. So why not use it? I mean, really, come on. I mean, that's what we're here for. That's why I spent all last night writing this crap, right? And he spent all week. No reason I can think of not to use it unless you don't trust us. I wouldn't really
trust us, but that's why we're making it open source. That's why we're putting it out there for everybody. Go trust yourself. Look at the code. Figure it out. Does it work for you? Does it work for what you're attempting to do? Does it work for the goals you're trying to achieve? Some people just don't care and I can't change that opinion. Other people like myself like to know when their stuff is being messed
with. We didn't write this for just the general masses of users, but if you are truly concerned, if you have a reason to be concerned, like if you disagree with your government and want to change it somehow, some governments don't like that and some governments take affirmative action in that to
limit your capability and limit your ability to communicate in private. Or if you're just on the DEF CON network. Yeah. There's a lot of places where you can find it. tlscanary.net is where we're going to end up tossing, you know, instructions, kind of user guides here and there, what it's meant to do, some observations. Right now it's just some plain text, and then GitHub is where you can pull the source from, which we'll be uploading probably tomorrow or Tuesday. But that's going to have the server. When I get my new laptop shipped in. But it should... it will contain client code, server code, the Firefox example, hopefully our Chrome example, and some of the code we were working on to try to do the applications on the iPhone and Android platforms. It is designed to be very lightweight. If you want to spin up an Amazon instance, an EC2 instance, and run the code yourself and use your own
private canary, great. Have fun. It works. The request structure is so light and what it's pulling is not even HTML content. It's just literally certificates. So it's a very low transaction overhead if you want to run it yourself. So, scotch and questions. It will say that a self-signed cert does not match what I see globally, unless the server you're hitting is also the one that has the self-signed cert. If you present it with a self-signed certificate and we see the same one, we tell you it matches. We don't care. That's not our problem. If you trust it, all we're saying is it is a one-to-one match. And that's what matters from the application perspective. So this is pretty cool stuff, guys. I was wondering, this kind of goes along with OCSP, this is kind of more toward gathering intel across the community and sharing that information, I was
wondering if you guys thought about publishing this as part of an RFC to kind of standardize this as a method for additional checks. Because it would be really cool if this information, this intel starts getting out there to where people can start making decisions on certificate authorities and go wait a minute, those certificate authorities are bad. We need to start disabling those
certificate authorities and looking at those, monitoring those guys and say which ones are doing the proper things and which ones are not doing the proper things and those are not doing the proper things, we just get rid of them. Sure. The potential is out there and it's all going to be open source. So the query structures and what we observe
and everything like that is meant to be wholly transparent. So if somebody with that authority wanted to go through and do that effort, we'd be more than happy to support that and provide whatever information we have to provide. This is a band-aid. It is a band-aid and will require someone with power to actually implement those types of processes and protocols and procedures. I personally would love to see something like the Google transparency project move forward. Something that solves the problem. We're happy if someone from Google is here and wants to integrate this, we're cool with that, too. Something that actually solves the problem and doesn't just put a band-aid on it that lets
you detect when the problem is being exploited. Firstly, thank you for helping to protect my dick pics. That's what we're here for, right? If your schlong traverses the internet insecurely, it hurts all of us. So my question is, did you guys contemplate doing revocation checking also as
part of the canary exercise? So I do have revocation checking actually built into the extension. The canary project on its own is meant as a comparison tool. The revocation checking could be built into the server, but it's one more check that would slow it down. Whereas if you're already utilizing the extension libraries, it can do the revocation check before it pulls the cert chain. In the example of the Firefox extension, it would just check it, say it's bad and not even bother sending the thing up. We're not replacing, we're just augmenting. I'm curious,
in a compromised case where you have a mix of good state and bad state, how do you ultimately know what reality is? Are you just going by numbers? Seems like since you're not logging any kind of identifying information or tracking information for users, potentially maybe your lone Iranian hacker can go and send you 5X the amount of data
from the botnet that they personally write. So if, for instance, in the case, I think I'm hearing you correctly, in the case of a mix, some things are bad and some things are good type of scenario, you have to have basically a weighted algorithm that says 65% of the world sees it this way. And if I have 300 sensors, what are the odds that 65% of my 300 sensors are bad? And there is something we're considering putting in, like the confidence of how good this is from what you're seeing. It's just that in V1 it's not something we put in. But we were thinking about that problem.
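(Purely hypothetical, since it is not in V1, but the weighting being described could be as simple as a quorum check across sensors, something like this.)

```python
def consensus(client_fp, sensor_fps, threshold=0.65):
    """Hypothetical quorum check: what fraction of canary sensors saw the same
    fingerprint the client saw, and does it clear the agreement threshold?"""
    if not sensor_fps:
        return False, 0.0
    fraction = sum(fp == client_fp for fp in sensor_fps) / len(sensor_fps)
    return fraction >= threshold, fraction

# e.g. with 300 sensors, require 65% agreement before calling it a match:
# ok, confidence = consensus(my_fingerprint, fingerprints_from_sensors)
```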
It's just one of those questions of at what point and what sample size do I need to actually achieve confidence. There's also the implications of two jackasses being the arbiter of what's good and what's bad. Other than that, you send a query, we do our verification. And it's very simple. This JSON struct is very simple. It just presents the signature, the serial number, we verify if they
match and send it right back. I think my point is just that fundamentally in saying you will not log session information, you're limiting yourself. There's a lot of nasty tricks you can do. We won't log your queries. I don't care where those queries come from. But when we do our own, we grab the certificates on our own, those will probably end up being cached at first just for
efficiency and for load balancing and for spreading the load. But in the long term, I'd like to do something like what SSL observatory has already done and start building our own database and looking at how certificates change over time. It's just purely academic and also to enhance the service and make it better. All right. Well, we're
out of time. Come find us in the bar if you want, if you're still here. I appreciate your time and hopefully your dick pics will be safe in the future.