
Questions and Answers


Formal Metadata

Title
Questions and Answers
Series Title
Number of Parts
109
Author
License
CC Attribution 3.0 Unported:
You may use, modify, and reproduce the work or its content in unaltered or altered form for any legal purpose, and distribute it and make it publicly available, provided you credit the author/rights holder in the manner specified by them.
Identifiers
Publisher
Year of Publication
Language

Content Metadata

Subject Area
Genre
Abstract
Bruce Schneier Talks Security. Come hear about what's new, what's hot, and what's hype in security. NSA surveillance, airports, voting machines, ID cards, cryptography -- he'll talk about what's in the news and what matters. Always a lively and interesting talk. Speaker Bio: Bruce Schneier is an internationally renowned security technologist, called a "security guru" by The Economist. He is the author of 12 books, including the New York Times best-seller Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World, as well as hundreds of articles, essays, and academic papers. His influential newsletter Crypto-Gram and his blog Schneier on Security are read by over 250,000 people. He has testified before Congress, is a frequent guest on television and radio, has served on several government committees, and is regularly quoted in the press. Schneier is a fellow at the Berkman Center for Internet and Society at Harvard Law School, a program fellow at the New America Foundation's Open Technology Institute, a board member of the Electronic Frontier Foundation, and an advisory board member of the Electronic Privacy Information Center. He is the CTO of Resilient Systems.
Transcript: English (automatically generated)
count off. Good. Thanks. Anybody with an empty seat next to you, raise your hand. Looks pretty full. Anyone know if this place has a fire code or not? Don't start fires. If you start a fire, you have to share. Some seats over there. People who have their hands up have a seat next to them. So the smart money goes that way. The smart money is not in Vegas. All right. So hi
there. I'm Bruce Schneier. Are there any questions? It's good.
That was the easiest talk I ever did. So if there are any questions, I'm happy to take them. Yes? There are actually
mics here. There's one there and one I can't see behind the camera guy there. We'll find out, right? Yes, they work. That one works. First off, thanks for coming out. Always a pleasure to see you here. The question is, it's an old question and I'm
wondering if maybe you have any new insight into an answer on this. With cryptography becoming more in the collective consciousness, especially with people who are less technically savvy, there has been an argument for a long time trying to explain to people that encryption is not security. So
it's very common for people who are not technically savvy to say, oh, we'll just encrypt this shit and then we're secure, which obviously is total bullshit. Do you have any insight on how to better explain to those people why that's fundamentally flawed? It's interesting. I think you're
right that a lot of people think of crypto as a panacea, where in fact it is just a tool, and a very powerful tool for a bunch of reasons I'll talk about in a second, but it doesn't automatically make security. And data has to be used. One of the things that the Snowden documents have really brought forward, which I think is a good thing we're just talking about, is metadata: data that has to be unencrypted and sensitive in order to operate. So this cell phone is a great surveillance device, because the metadata, where this phone is, has to be in the clear. Otherwise it can't ring. I should turn the ringer off. So there's a lot of things encryption can't do. Encryption can protect data at rest, but if you are, to make this up, Target Corporation, and you have a database of credit card numbers that you're using, it can't be encrypted. The key has to be there. I talk about encryption as a tool, not as security, just like your door lock is a really important tool but doesn't magically make your house secure. What encryption does, and I think this is real important in NSA surveillance, is it forces the listeners to target. What we know about the NSA is that they might have a bigger budget than everyone else on the planet, but they're not made of magic. They are limited by the same laws of physics and math and economics as everybody else. And if data is unencrypted, if they can tap a transatlantic Internet cable, they can get everything. But if stuff is encrypted, they have to target who they're listening to. If the NSA wants into your computer, they are in. Period. Right? Done. And if they're not, one of two things is true: one, it's illegal and they're following the law; or two, you're not high enough on the priorities list. So what encryption does is it forces them to go through their priorities list. They can hack into your computer and that's no problem. They can't hack into everybody's computer. So encryption is just a tool, but it's actually a really powerful
tool because it denies a lot of the bulk access and forces the listeners to do targeted access. And there's a lot of security benefit in that. Are you first in line? Okay. So I sort of see where that mic is. I wanted to see your opinion on the back door that Obama wants. Which one does he
want? I'm not sure Obama personally has an opinion here. It's interesting. This is the same back door that the FBI has been wanting since the mid-90s. Now we call that the first crypto war, which makes this the second. Number three, I'm done. It is you guys. I only do two crypto wars per lifetime. It's interesting. FBI director
Comey gave a really interesting talk at the Aspen Security Forum. I recommend listening to these talks. This is a very high-level, mostly government discussion about security, cybersecurity, national security. Really interesting stuff. He was interviewed by, I think, Wolf Blitzer, who asked a great question, saying: this is kind of personal, but why don't you like the term lone-wolf terrorist? That was kind of
funny. Anyway, he was talking about the going dark problem and the need for back doors. And this is the scenario he is worried about. And he is very explicit. It is an ISIS scenario. ISIS is a new kind of adversary in the government's eyes because of the way it uses social media. Unlike
Al Qaeda, which is like your normal terrorist organization, would recruit terrorists to go to Afghanistan and get trained and come back. ISIS does it with Twitter. And this freaks the government out. So this story, and they swear up and down this happens, is that ISIS is really good at social media,
at Twitter and YouTube and various other websites. They get people to talk to them who are in the U.S. like you guys, except a little less socially adept and maybe kind of a little crazier. But they find these marginal people and they talk to them and the FBI can monitor this and go FBI, rah, rah. But
then they say go use this secure app. And then this radicalized American does. They talk more securely and the FBI can't listen. And then dot, dot, dot, explosion. So this is the
scenario that the FBI is worried about. Very explicitly. And they use this story again and again. And they say this is real, this is happening. It's sort of interesting. If this is
true, let's take it as read that it is true. The other phrase they use is actually a new phrase I recommend: the time between flash and bang. Flash is when they find the guy; bang is when the explosion happens. And that time is decreasing. So the FBI has to be able to monitor. So they
are pissed off that things like iMessage and other apps cannot be monitored even if they get a warrant. And this really bugs them. I have a warrant, damn it. Why can't I listen? I can get the metadata. I can't listen. So if you think about
that as a scenario and assume that it's true, it is not a scenario that any kind of mandatory back door solves. Because the problem isn't that the main security apps are encrypted. The problem is there exists one security app that is
encrypted. Because the ISIS handler can say: go download Signal, go download some other secure app, go download this random file encryption app I just uploaded to GitHub ten minutes ago. So
the problem is not what he thinks it is. The problem is general purpose computers. The problem is an international market in software. So I think the back door is a really bad idea for a whole bunch of reasons. I've written papers about this. But what I've come to realize in the past few weeks is it's not going to solve the problem the FBI
claims it has. And I think we need to start talking about that, because otherwise we're going to get some really bad policy. So, the question there. Good morning. So this will probably go less in the
direction of, for instance, crypto. My question is somewhat twofold. I'm going to focus more on the first one. In the course of day-to-day interactions, both with security people and with less security-minded folks, I've come to the point where it's very difficult to instill good habits. From your experience, is there an easier approach to getting the understanding of OPSEC through to lay people? So I think OPSEC is pretty much impossible. Even General Petraeus got screwed up with his OPSEC dealing with his mistress. And if the director of the CIA can't get OPSEC right, we're all done. We see hackers screwing up OPSEC again and again. I'm not sure there is a solution. Because good OPSEC is really and truly annoying. Mediocre OPSEC means leaving your cell phone at home. Who's going to do that? It means not using e-mail for certain things. I've come to the belief that we're not going to be able to train people in good OPSEC.
That the best security is going to be pretty good security that's ubiquitous. And so we saw some of this in some more of the Snowden documents. A lot of people read the recent article that came out of Germany on XKeyscore. Really, an okay article, but a really great dump of documents on how XKeyscore works. This is one of the NSA's very flexible databases for monitoring the Internet. And you can read their documents, and they talk about how they can find people using encryption and roll up the networks. They can't read the traffic. Remember metadata: encryption doesn't solve everything. I'm reading this and it's clear you can do this with PGP as well. So if you want to find out who's using encryption, it's easy if you monitor enough of the Internet. And what that tells me is that someone would be better off not using the great encryption program they wrote, or the really powerful one they just downloaded, but the average one that everyone else is using. That you actually are better off using an iPhone with iMessage. Even though I'm pretty sure the FBI can get at it individually, you can hide using it because we're all using it. You don't stand out. So I think there is a lot of power in that. You got a second part? Make it quick. Excellent. I think we'd like OPSEC invisible. Good security works if people don't even know it's there. The encryption from the handset to the base station: great if it were better, but it's working because nobody knows it's there. So in thinking maybe not
as thoroughly or deeply as I should about like cyber terrorist threats and bad actors that want to do corporations or infrastructure harm, these sorts of things, or just the public, right? It seems like all the ingredients are there for people to do really bad things and there are
a lot of holes and security flaws. What keeps there from being enough motivated bad actors, what keeps them at bay? I think fundamentally people are good. Society works because most of us are honest. You're kind of looking at me funny, but none of you have jumped up and attacked the person sitting next to you. You laugh, but if this was a room full of chimpanzees, that would have happened. We are the only species that can get away with this: a room full of, what, a lot of strangers sitting quietly listening to me. So I mean, this sounds weird, but I
think a lot of what keeps the really bad things from happening is most people don't want to do really bad things. If that wasn't true, society wouldn't work. So I think you're right that all the pieces, a lot of the pieces are there. There's a couple of things. Terrorism is
harder than you think. Yes, technically it can be easy but the whole operation is actually harder which is why you don't see a lot of terrorist attacks. What you do see tend to be these lone wolves that wake up one morning and say I'm going to do something bad. There's no
conspiracy to detect. There are no mistakes you can make over the course of the planning. That flash-to-bang time is so short. So I really do think that's why. Something interesting I ought to mention, as long as we're on this topic: there's a new tactic we're seeing more of. We've seen it for a few years. Now we're seeing it, I think, a lot. We're going to see a lot more of it: this notion of institutional doxing. You can go into a company, take all of their stuff, and publish it. It's freaking people out. This happened to Sony. This is what happened to Hacking Team. The guy who did that is in this room, thank you very much. It's what might have happened to Ashley Madison, which is a little more awkward for some people. If you remember, a few years ago it was HBGary Federal. I think this is a really interesting tactic because it empowers individuals against very powerful organizations. And it is the first, I think the first real counterargument I've heard to the increasing practice of hiring a sociopathic CEO. That indeed, if you are worried about everything your CEO says becoming public in three to five years, you might not want to hire a jerk. But I expect to see more of this. People are noticing that WikiLeaks is publishing Saudi diplomatic cables. Man, are they a corrupt country. This is again someone who hacked in and is just dumping all this stuff. So that's an interesting example of a new bad thing that's being enabled by technology, and it's happening more and more. But in general I do worry
about the bad things happening. But I think it's less common than we think because most people don't do them. It's the systemic stuff that bothers me. The Internet of Things and being able to hack cars and planes and heart monitors and other stuff. And the interconnection of those. I think we're going to see more unanticipated
vulnerabilities. Because remember: complexity is the worst enemy of security. And it's not just any complexity; it is nonlinear, tightly coupled complexity. And that's really what the net gives us. So we've got to be real careful there.
Yes? I got a ton out of reading Practical Cryptography years ago. I had occasion to go back and look at it recently, in the hash section. I noticed that at that time you guys had assessed that our ability to analyze hash functions was a good ten to twenty years behind our ability to analyze the other primitives. So I was wondering if you think that gap has closed in the last decade. I think we're much better at understanding hash functions now. We're still implementing bad ones, but that's more legacy. But I do think we are better. It's very hard, mathematically it is hard, because your assumptions are squirrely, and I'm not going to bore everybody with it. I think I revised that in the revised edition of that book, which is Cryptography Engineering. But I do think we understand encryption primitives better than hash function primitives. Even though, there's an interesting thing, you can make one from the other. Which is why, as enough people remember, when there was the hash function contest that NIST ran five years ago, I built a hash function on top of a symmetric algorithm, because I felt I understood that better than doing a hash function natively like SHA was. I'm not even convinced the NSA understood hash functions very well. SHA, they had a vulnerability they fixed in SHA-1, which still was a little dicey, and it's been updated. Because a hash function is not something they were using in military applications much until recently, they didn't have the rich history there like they had with encryption. The code in Rijndael is not the code that's in AES, and I use Twofish because I trust you more than I trust the feds. Do you think that AES is actually a trustworthy cipher? I think AES is. I trust AES. It is Rijndael. There were a bunch of tweaks in the parameters, but they're totally above board and kosher, and everyone's happy with them. It is weird because you can actually describe the algorithm in an equation that fits on one page. It's kind of small type, but it fits on one page, which kind of freaks people out a little bit. But I do trust it. I think it is secure. I do not think there is a back door or anything snuck in. I truly don't. NIST did a great job with the AES process, and with their SHA-3 process as well. NIST unfortunately got tarred with the Dual EC DRBG generator, and they're trying to rebuild their trust, but they've done a fantastic job with crypto primitives by and large. I like AES. Thanks for using Twofish. I like it too. Kind of wish it had won, because that would have been cool. But no, I use AES without reservation. Don't worry about it. As disturbing as the current crypto war is,
something that actually scares me a lot is stories like Lavabit, or even the larger tech companies getting national security letters. I'm wondering what we can do as tech companies, or just the infosec community in general, to defend against governments secretly ordering companies to put back doors into their products. So this is actually, I think, the thing that should freak us out the most. And to me this is the biggest revelation of Snowden and all the stories around it, and Lavabit especially. It's not that we believed that encryption was perfect and nobody could break it. But we did believe that the tech rose and fell on its own
merits. And the idea that the government can go into a company and say you have to break your encryption and then lie to your customers about it is terrifying. The law can subvert technology. And we cannot as a community, as a
society truly believe anything is secure as long as that's true. I just talked about iMessage and so we don't know. And I blogged about this a couple days ago. It didn't get enough play. It was kind of the last paragraph of a post. Maybe no one reads that far. There is a persistent rumor
going around right now that Apple is in the FISA court fighting an order to back door iMessage and FaceTime. And Nicholas Weaver, I don't know if he's here this week, has written about how they could do that. How they can modify their protocols to make that happen. And we don't know.
That is fundamentally terrifying. And I don't know how to fix that. We have to fix that through the legal system. There's no tech fix. There are kinds of things you can do, though. I think that if we thought about it, we could rewrite the Apple protocols such that if they did have to put a back door in, we would notice. If they did have to make a change, we would notice that a change was made. We'd say, why did you make a change? They would give some bullshit answer. We would know something was up. So maybe there's something in making your algorithms not back-door proof but back-door evident.
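One simple, well-known ingredient of that back-door-evident idea is checking what you were shipped against a digest published where everyone can see it. Here is a minimal sketch; the artifact bytes and names are invented for illustration, and this is not a description of Apple's protocols:

```python
import hashlib

def verify_artifact(artifact: bytes, pinned_digest: str) -> bool:
    # Recompute the artifact's SHA-256 digest and compare it with a
    # value published out of band (a website, a transparency log).
    # A silently modified build no longer matches, so any change,
    # coerced back door included, is at least evident.
    return hashlib.sha256(artifact).hexdigest() == pinned_digest

# Toy demonstration with made-up bytes standing in for a binary.
release = b"messenger-app-build-1.0"
pinned = hashlib.sha256(release).hexdigest()

assert verify_artifact(release, pinned)             # unchanged build passes
assert not verify_artifact(release + b"!", pinned)  # any change is noticed
```

Reproducible builds and transparency logs extend the same idea: you may not be able to prevent a coerced change, but you can make it very hard to ship one silently.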
So maybe just think more about that. But this is a hard one. And I don't have a good answer. And it is one that I think really should disturb us. More open source is going to be good here. More sunlight, harder to subvert. But
as long as the government can issue secret orders, through secret courts, based on secret laws, we have a problem. And it is not a tech problem. It's a legal problem. I think I'm on that side now. We seem to be in a situation where the
software industry can release software that runs on billions of devices and it's completely insecure and badly written. And there's no consequence whatsoever to those companies for the problems that they create. Just recently what comes to mind is the MMS hack on Android. Can you just
discuss generally what you think about this from a legal perspective, and software companies being held liable or accountable for the bad software that they write? So I've always been a fan of liabilities. I wrote my first thing about it maybe in, like, '02 or something. Actually even before Y2K. And so here's the basic argument: right now, as you say, there are no costs to writing bad software. You read a software license and it says pretty much explicitly: if this software maims your children, and if we knew it would do that and decided not to tell you because it would hurt sales, we're not liable. Those shrink-wrap licenses; even security software, read the license, it will say no claims about security are made, even though they are. So liability changes that. It adds a cost to not designing software properly. It adds a cost to insecurity. It adds a cost to non-reliability. And that has real value. We are already paying these costs: we're paying it in losses, we're paying it in aftermarket security devices, we're paying it in the entire industry that has sprung up around dealing with the fact that the software sucks. But with liability we would pay anyway; the cost would be passed on to us, of course, but at least we'll be getting more secure software out of it. So I see a collective action problem, I see the market failure here: the
market is not rewarding good security, the cost of insecurity is too low, the value of insecurity is high, and liability changes that. It is a lever we can use to rebalance this cost-benefit ratio. And I think it's a powerful one. It is not a panacea; there are lots of ways liabilities go wrong. But liabilities do a lot. They really provide value. And I think they would here in software, 100%. So I want to see liabilities. I mean, we know why the Android vulnerability isn't being patched. They designed their system so that, even though Google produced the patch, it won't go down to the phones. Because the phone manufacturers don't care very much, and you don't have that tight connection between phone and OS like you have in the iOS world. So the patch doesn't go downstream. If suddenly the phone manufacturers were liable, I assure you the patch mechanism would work better. And that's a lever we have as a society, and we should use it. I think it's a better one than regulation here, because it's one that's dynamic and tends to seek its own level. But that's why you'd use it, and I'm a big fan of it. Actually
thinking about this, hang on. Everybody smile. There's more of you than fits on the screen. It's not going to work. Hang on. People at the edges, you don't have to smile. All right. Thanks. Who was next? It was you, right? Bruce, it seems like fewer and fewer surveys show that Americans are concerned about the privacy of their information. Often you hear things like: I'm not hiding anything; I don't have anything to hide, so I'm not worried. And it seems like people my age and younger don't have much of an understanding of Edward Snowden or the relevance of what he released. What would you say to those perspectives? So, as people know, I had a
book published in March called Data and Goliath and it talks about surveillance, government and corporate. And I spend a lot of time, I just spent a whole chapter on privacy and why it's important. And it's not true that people don't care about privacy. And it's not true that young people
don't. All surveys show that they do. They're very concerned about it. And you know this is true. You remember being a teenager. You're concerned about privacy a lot. From your parents, from your teachers, from your friends. You don't care about the government because who cares. But you're concerned about the privacy in your world. And people who are fluent, kids, teenagers who are fluent in the net are very
fluent in how to maintain their privacy. They might not do a good job but they try a lot. I argue that privacy is fundamental to human dignity, to individuality, to who we are, that without privacy we become conformists. We don't
speak out. And I think there's a really interesting argument in social change. So we're in an extraordinary year where gay marriage is legal in all 50 states. And that issue went
from impossible to inevitable with no intervening middle ground. It's amazing. But what it means is, and you can take legalization of pot, a lot of issues you can take this way: at the start, something is illegal and, let's say, immoral. It goes from illegal and immoral, to some cool kids are doing it, to illegal and we're not sure, and then suddenly it becomes legal. But in order to get from here to there, there's got to be a point where the thing is illegal and people do it anyway. You've got to do it and say, you know, that gay sex wasn't that bad. That was kind of okay. You know, I tried pot and the world didn't end. And it might take 40 years and a couple of generations, but then you get to the point where it's legal. Interracial marriage, any of these issues. But if you have surveillance here, if you can stop people from trying the thing and saying, you know, that's not that bad, maybe we're wrong, you never get to the point where the majority of us believe we're wrong. So I think broad government surveillance will really have a stifling influence on social progress. Because it won't let experiments happen. Now, it's not an argument you can make to
anybody, right? But I think it is probably the most important one. But really, anyone who says I have nothing to hide, you know they're lying, right? I mean, there aren't cameras in Scott McNealy's house because he has nothing to hide. So I
think you really have to point out that those arguments aren't true and that privacy isn't about something to hide. It is about maintaining your sense of self in a public world. I get to determine what I tell you people and what I don't tell you people. And that is empowering. And if I lose that,
I am fundamentally a prisoner of society. So attaching privacy to something to hide, to secrets, is just wrong. It's about human dignity and it's about liberty. And I do. I do it better in the chapter. So I offer that up. Yes?
Most people seem to be more worried about back doors and forced government back doors. But I'm sort of more worried about a Sneakers, "no more secrets, Marty" type of deal. What is your opinion on quantum computing and current encryption, and also quantum encryption as a rebuttal to quantum computing? So quantum encryption has nothing to do with quantum computing. They're completely separate. Let's do quantum computing first. Quantum computing is going to become real, probably not in our lifetime, but it will become real. It's a technology. It's advancing. I think we can factor like 24 now. But it will get better. It has the potential to change crypto but not destroy it. It will break all of the common public-key cryptography algorithms, the ones based on the factoring and discrete log problems. So RSA, Diffie-Hellman and those. It will break those in polynomial time and be very nasty. But we do have public-key algorithms that do work. Even original Merkle, the first public-key algorithm, not the knapsack, his early puzzles scheme that had a work factor of a square instead of an exponential. That still works. There are coding-theory algorithms that still work. They're less efficient, but they still work. We still have public-key cryptography, and we still have symmetric cryptography. In theory, the best a quantum computer does against symmetric cryptography is halve your effective key length. It reduces your brute-force search by a square root, so a 2^256 search becomes a 2^128 search. So double your key length and you're done. And NIST is actually hosting conferences on post-quantum cryptography. Go download the papers. People are thinking about this: how can we build secure systems that are resilient in a world where quantum computing is real? So
that's the breaking. That's quantum computing. Quantum crypto is really quantum key exchange, invented in the 80s. Research continues; I think there's a product now. And it is a clever way to exchange keys using quantum properties. Really neat. Great science. Great physics. Something I as a cryptographer have absolutely no need for. I can exchange keys just fine, thank you; the math works. So I think it is kind of pointless from a practical point of view. Great science. I love reading the papers, but I would never buy such a device, because I would use one of the math systems and they work just fine. So that's sort of my quick
quantum primer. But it's great science and I love the research. And eventually, yes, we'll be able to factor numbers very quickly, which will be cool. Yes. So I definitely want things simple. The more you make it invisible, the more you make it transparent, easy to use, no work, even sacrificing some security, I think we do better. I'm really liking my Signal right now on my iPhone. It's a great program. It just has a really clean interface. It works. All the key exchange happens in the background. It's well designed.
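The check he's describing works by having both phones independently derive a short, human-comparable fingerprint from the conversation's public keys, which the users can compare out of band. Here's a toy sketch of that idea in Python; the SHA-256 hash and the digit-grouping scheme are made up for illustration and are not Signal's actual safety-number construction:

```python
import hashlib

def fingerprint(pub_a: bytes, pub_b: bytes) -> str:
    """Derive a short, human-comparable fingerprint from two parties'
    public keys. A toy illustration only -- NOT Signal's actual
    safety-number algorithm."""
    # Sort the keys so both sides compute the identical string,
    # regardless of who is "a" and who is "b".
    digest = hashlib.sha256(b"".join(sorted([pub_a, pub_b]))).hexdigest()
    # Turn the first 60 hex characters into 12 five-digit groups,
    # which are easier to read aloud than raw hex.
    groups = [f"{int(digest[i:i + 5], 16) % 100000:05d}"
              for i in range(0, 60, 5)]
    return " ".join(groups)

# Both phones compute and display this independently; the users
# compare the strings in person or over another trusted channel.
alice_key = b"\x01" * 32   # placeholder identity keys, not real ones
bob_key = b"\x02" * 32
assert fingerprint(alice_key, bob_key) == fingerprint(bob_key, alice_key)
print(fingerprint(alice_key, bob_key))
```

If a man in the middle substitutes his own key on either side, the two displayed strings no longer match, which is exactly why being able to check, even if you rarely do, acts as a deterrent.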
I can actually confirm there's no man in the middle. I don't have to, but I can. The fact that I can is enough of a deterrent for anyone trying it. So I really like simple stuff, simple stuff that's easy to use, because I want everyone to use it. There's value in it being ubiquitous. So expert-only encryption is much less effective. One last comment to the quantum guy: one of the things we know from the NSA documents is they have a budget line for quantum computing, and it's not very large. Which means they're doing research, but they don't have the quantum computer in Utah. I'm pretty sure that's not something they can do. Yes? First of all, Bruce, you're my security crush, mind if I take a picture with you after the show? I don't, but you guys all have weird pie plates on your chest, I'm just saying. You look like some embarrassing cult. It's Flavor Flav. Bruce, so with the explosion of software-defined networking and enterprises looking to use it and deploy it quickly, do you have specific concerns around the security of such a bleeding-edge technology and this virtualization of routers, switches, firewalls, et cetera? You have some thoughts on that? I don't have any specific concerns, just the general ones of more complexity, more things to be insecure, and
another layer of organization means someone else to serve a warrant to. So those are my concerns. There's huge value in this. I'm a big fan of security outsourcing for organizations, because it's very hard to do right, and the more you can consolidate the expertise, I think the better you'll do. So I tend to like those, but there are legal risks. We've been seeing some court cases establishing that the FBI can serve a warrant on Facebook for your stuff, bypassing you, that they can do that. And that does cause problems. But in general, I think the value of outsourcing, the value of software-defined this and that, is great. There are security risks, but on balance, I tend to like that technology. So on balance, no major concerns over shared control planes? Yeah, that's it. You got it all. Those are the things to be concerned
about. But are they major concerns? They're like regular-size concerns. All right. Thanks. Yes. First of all, thank you, Bruce, for everything you do. With the pie plate also, I'm saying. My question is, even if they wanted to, would policymakers be able to stay current with the pace of technology? You know, it's interesting. I've come to the belief that the United States, and probably other countries, are fundamentally ungovernable, and that's one of the reasons: technology is moving so fast that the people who understand it can run rings around policymakers. You see it in laws where, five years later, in retrospect, you realize, whoa, they understood that and put that sentence in and we didn't notice. This is hard. I like seeing laws that are technologically invariant. Instead of writing laws to keep up with the technology, write laws that don't have to. And
there are examples. I mean, you know, laws about assault and murder don't really care about the weapon. I could write a law about privacy for communications that doesn't care if it's e‑mail or voice or voice over IP or something. I can do that. I think that's better. I'm not sure it will happen. I mean, there's so much co‑option of the legal and policy process by people who stand to make and lose a lot of money. Right now, the cybersecurity bill that's probably going to get signed has got all sorts of amendments and riders, and what it actually does isn't what they say it does. And that's an easy one. You start doing something like health care or climate change? Forget it. So I'm not optimistic about lawmakers staying current with technology. I think we're going to have to go
through some bad times where we figure out how to create law in a society where tech moves this fast. There's an argument to be made that the modern constitutional democracy is the best form of government mid-18th-century technology could invent: travel and communications are hard, so we've got to pick one of us to go all the way over there and make laws in our name. It made sense in 1782. It makes a lot less sense now. And there are a lot of ways that our systems, which were designed when nation-states started becoming a thing, are sort of breaking now because things are different. Things are moving too fast. The communication is different. It's all different. And I think we have to redesign democracy. This of course is ridiculous; it'll never happen, right? But I think we kind of need to. That wasn't
an optimistic answer, was it? Sorry. Yes. So a few months ago there was news about Chris Roberts being detained at the airport after he posted a tweet about the security of United Airlines. Does anyone know if he's here? Okay. And then your blog post said that maybe the FBI knew that Chris Roberts works in the field of avionics, and that's why he was detained. And then recently Wall Street went down, and there was this news that Anonymous had posted a warning about it, even though Wall Street claims it was a minor IT issue. So what do you think? Is the emphasis on offensive security right? Like, is the issue similar with Chris Roberts? So we didn't know in the Chris Roberts case that he was actually being at least somewhat watched by the FBI, that he had talked to them before. And the Chris Roberts case is actually very complicated. I stopped commenting when I realized it was way more
complicated than I understood. For people who don't know, this is the case of him being on a plane and saying something about getting into the avionics bus via the USB port in his seat. Which would be crazy if you could, but it wouldn't surprise me if Airbus forgot about that. It really seems like every time you take physical security people and give them an IT security problem, they completely forget that they should talk to an IT security person. Anyone follow the hack on the Brink's safe? Completely embarrassing. It's like they never even opened an IT security book. Oh, yeah, we can do this. No problem. I don't know how much of it is proactive. It does seem like the FBI is monitoring more groups and more individuals. And we see them monitoring the Occupy movement or
Black Lives Matter. Real social-change movements that might not be as, I don't know, as mainstream as they could be. So there's a lot of that going on. How much in those cases, I don't know. The Wall Street case, I have no idea. Certainly there's always a lot of bragging that might not be true. I don't know the details. It's hard to speculate. I think there is more monitoring than we all think. This is the point of fusion centers. This is the point of information sharing between the NSA and others. I think a lot of it is going on. Yes? So do you trust the elliptic-curve-based cipher suites or non-elliptic-curve, and why? So I've been skeptical of elliptic curves for a bunch of
years. I'm in the minority here; most cryptographers do trust them, so I'm giving you a minority view. My feeling is that there's a lot we don't know about elliptic curve mathematics, and there certainly could be classes of elliptic curves that have hidden trap doors we don't know about. The NSA uses elliptic curve crypto, I can tell you that. So in general, or at least in some instances, it is believed secure by the NSA. But I also know that they have in some cases tried to influence curve selection. Now, for good or for bad, I can't tell you. So I worry about elliptic curves where the curve selection process isn't transparent. Now, if you want to use elliptic curves in your system: Dan Bernstein, a guy we all trust, or at least I do, ran a public process, and they're called the Bernstein curves. They're available, and I would use those without reservation. The NSA says, here, here's some great curves? I would say, you know, huh? So that's my feeling on that. I think they're sound, but I think there are gotchas we don't fully
understand. All right. I'm getting the "get off stage soon" signal. So I'm going to take one more. Do you have a yes-or-no question? No. All right. Short answer. Go. Okay. So you got me into security. I've been in this industry ten years. Thank you so much. So my question is, now that I find myself in a position, you know, hypothetically, where I'm working with a government agency, where they use the same shitty software as everyone else and they've got the same problems as everyone else, and I'm convinced that stockpiling zero-days, weakening crypto, all that stuff is just as harmful: what can I do to convince or show these people that these other arms of the government are doing things that hurt us? This is hard. And a lot of us are trying to do this. Really, just keep saying it. This is the political process, not the tech process. It's not clean. All right. I have to leave. I'm doing a book signing at the bookstore at 4. So come by and say hi. Not all of you
at once. I will be around. I'm going to go outside there right now assuming there is like space to breathe. Happy to sign books and say hi. Thank you very much for coming. Thanks for sitting. Have a good weekend.