ETHICS VILLAGE - Responsible Disclosure Panel

Video in TIB AV-Portal: ETHICS VILLAGE - Responsible Disclosure Panel

Formal Metadata

Title: ETHICS VILLAGE - Responsible Disclosure Panel
License: CC Attribution 3.0 Unported. You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor.

Content Metadata

Abstract: In today's climate of data breaches and information leaks, how do we in the infosec community disclose the vulnerabilities we discover responsibly? Who are we responsible to? Can we set a standard practice that is ethical, fair and effective? These and other questions will be discussed by some familiar faces on our Responsible Disclosure Panel.
I'm not important, they're important. Hey, I'm [inaudible]. I am Katie Mo, if you want to know who I am; I don't know, I'll tell you later. [Music] Hello, I'm in the room: Chris Wysopal, CTO and founder of Veracode, but I've been doing disclosure stuff since the L0pht days at the end of the 90s; I was part of RFPolicy 1.0. And I'm RenderMan. I hack stuff, lately sex toys, but I also recently decided not to disclose to certain agencies and countries and such. All right, I accidentally ceded my time because they were like, serious introductions, so here we go, being super serious as I always am. Katie Moussouris, founder and CEO of Luta Security. I helped the Pentagon launch Hack the Pentagon, made it so that our people were no longer thrown in jail for that and instead paid some money, launched Microsoft's bug bounty programs, wrote Microsoft's vulnerability coordination policy, created Microsoft Vulnerability Research, created Symantec Vulnerability Research, wrote Symantec's vulnerability disclosure policy, co-author and co-editor of the ISO standards for vulnerability disclosure and vulnerability handling processes, former pen tester at @stake. Okay, there will be a test later. Katie, or Weld, you've testified before Congress? Weld for sure, 1998. Yeah, he was 1998, I was 2018; it only took 20 years for them to invite me. Okay, we're making some progress. It's progress. Okay, so: the cat ears, they're vestigial. So in deference to your employer, would you like one of the questions that were approved by your employer, or should I just shoot from the hip? Yeah, okay. So it's only us and Bruce and a camera. I know a couple of these folks up here, and they're my friends; I hang out with them, we see each other at conferences and things like that. To start off the Ethics Village I wanted to have a conversation, not necessarily about responsible disclosure; I wanted to have a conversation about the ethics of responsible disclosure. So when we're talking about this, we're not necessarily talking about responsible disclosure itself; we're talking about whether or not it is still ethical to responsibly disclose. That is the question that we would like to explore, and if anybody on the panel would like to step down now... Can you define what you mean by responsible disclosure? Is that
like, tell the vendor and allow them to fix it before you tell the world? That's right, and I think that it is no longer ethical to maintain this practice. As a security researcher... I'm an ICS/SCADA security researcher, I work at the University of Illinois, I've been there for seven years; before that I did financial. I've been a part of responsible disclosure and a lot of things ICS/SCADA, and I'm not speaking for my employer or anybody else; I'm on my own time. The reason I wanted to have a panel was to talk about this issue. Microsoft has a bug bounty program; I don't work there, I never worked at Microsoft, but I'm using that as an example. Cisco might have a vulnerability disclosure program; big companies have these programs. And I can remember going back to the 90s: I started one of the first internet service providers in the state of Kentucky, and back then, when people hung up on the modems, we used to just knock them off with the Blue Screen of Death. I remember when responsible disclosure first came up, and we were hoping to try and get somebody like Rain Forest Puppy to come for this; maybe next Ethics Village we can get him out to discuss this. But I really started to think about this as I invited panelists, and then I saw RenderMan last night, and luckily we were both still sober enough to remember that this panel exists. I had to shake him a little bit: you really should be here at 3 o'clock, because I caught your take on Twitter a few weeks ago about responsible disclosure. I really think the question is: as security researchers, should we still responsibly disclose? Okay, Katie looks like she's really anxious to talk, so go ahead. I really am. "Responsible": we keep using that word. No, seriously, we stopped using it in the ISO standard, we stopped using it at Microsoft, and that is thanks to a conversation with Jake Kouns, who was formerly Open Security Foundation; now he runs Risk Based Security. Jake Kouns came up to me after I was on a responsible disclosure panel at RSA in 2010 and he was like, can we stop using that word, it has moral judgment, blah blah blah. Okay, so with that word in mind, I know what you mean: it's that whole thing of letting the vendor take their time to patch before you go public with it. Having had to write Microsoft's policy, we wrote it with three roles in mind: the finder of the vulnerability, the coordinator of the vulnerability (in multi-party instances like Meltdown/Spectre type of things), and the vendor receiving the vulnerability report. It was very important to me, as the creator of Microsoft Vulnerability Research, that there was no language in there saying we wouldn't release details of the security vulnerability before a patch was ready, because it was important to be able to pull the trigger, pull the ripcord, if there is evidence of attacks and whatnot. That is a nuanced difference between Microsoft's policy and Google's policy, which has a strict deadline for disclosure. So there are ways you can deal with this, and I absolutely made sure that Microsoft never used that word while I was there. Okay, thank you, Katie. So, anything to say, Bruce? Katie was able to strike that word from my side, right? I mean, the responsibility, if you're going to use the word, lies on the people who wrote the code in the first place; that's the way I hold it, your responsibility. I'm sorry, can you slow down your words just a little bit? We've got some lag. Real slow. That's perfect, that's perfect. Talk like you're from South Carolina, or from the South. I'm from New Orleans and we have a drawl. You need to drawl because we're lagging a little bit. All right,
well, I will keep it brief. I think the responsibility lies with whoever wrote the code: you're responsible for your stale code, responsible for maintaining your code, and at the end of the day you bear that burden. So I'm all for reframing the disclosure debate and trying to enforce some accountability on the vendors to do the right thing and push the onus onto them. Okay, so what about the right side over there, do you have anything to add? I think the big thing right now is that we all consume all of these services and such that we're finding vulnerabilities in, and we have a stake in this. Previously I found a bunch of stuff with air traffic control systems, and it's really weird giving a talk about that when you have to fly home afterwards. So my ass is in the game; I have a stake in this. But I think lately, and this is where my position comes from, you could be helping various authorities or regimes, whatever, to facilitate things that you might have other ethical issues with. It becomes this thing of: yes, you could be saving life and limb, but you could also be making something more secure that could then be turned around and used against people. Well, it's tricky. When I look at it, I think there's a short-term benefit to disclosing issues and there's a long-term benefit. The short-term benefit is that one little bug gets fixed, and there's certainly some benefit to that; the product gets a little bit better. But the long-term benefit is that companies will start to realize that if they're going to get all these whack-a-mole vulnerabilities to deal with, they're going to have to come up with a better process for dealing with them, and then they're going to realize it's even cheaper to use the techniques the researchers are using themselves and fix the things. So, like, with Parisa's keynote talk at Black Hat, she talked about the 90
day deadline, why they had that 90-day drop-dead date where they disclose everything, and the reasoning behind it was that it would just force vendors to get better at responding. Right. Well, a lot of companies have lawyers too. I mean, yeah, individuals, yes, but the compliance rate of fixing within ninety days is now at 98%. So over the time that they've done that, they've gotten a whole set of vendors to be able to respond within 90 days, and the fact that Google did that means that you could do the same thing too, and they'd probably be able to respond within 90 days. So I look at it as a long-term game as well: getting vendors to have an expectation of doing the right thing. The second part of my question... would you like a chance to answer the first one before I go on? All right, this whole thing is irrelevant, because it assumes two factors: one, it assumes that the vendor is ethical; two, it assumes that it is always legal to act subjectively ethically as the finder. And I can sit here and tell you all kinds of scenarios where that is horseshit. I work in M&A: if we don't buy the company, the bugs get burned, and I end up sitting on a bunch of 0-day, because if we don't do that, everybody goes to jail for insider trading. I've also done pen tests where we found a bug in a vendor and we turned the bugs over, and then that client chose to sell the bugs, and a nonzero number of those bugs have actually killed human beings. These are things I've seen. So the concept of debating a generic ethical approach to discovery and/or disclosure, as if doing the right thing is always legal, is fundamentally flawed. And that is absolutely correct. Thanks. But the second part of my question, and the reason I am trying to forward the idea that it's no longer ethical to use the disclosure process, is what vendors are doing with automatic updates. I have no choice as a user but to accept the automatic update, and I have no idea what other feature changes the vendor is pushing on me. You have an important security update, and now all of a sudden you've lost a lot of features in the software that you originally bought, and you've got a lot of features you didn't want, like: oh, I'm going to start reading your email and targeting advertisements based on the content of your email. Oh, and you just signed off on that in the EULA. Vendors are using this automatic update channel to push features out to you, or rescind features. Or rescind features, exactly. And I think that's why, as researchers... what does a researcher do when I submit and the process is fundamentally dysfunctional? The update system is one of the symptoms. The vendor is like, oh, I can use this mechanism to change the product to suit my needs economically, and not necessarily just for security. You always see these things: you have a security update and all of a sudden things change fundamentally in the software that you use, to where it's not the same anymore. I think the fact is that they're bundling these things so that in order to get the security update you have to accept this other stuff that you may not want. Exactly. I don't understand how that's the discloser's problem, because you have no control over that. Some companies are going to do it and some are not. What are you going to do, research how they're bundling anti-features with fixes before you disclose? You're making it too hard for the discloser to even make a decision. But you could also decide that if the company is a dick about it and does stuff like that, you can say: I don't want to deal with you. I mean, the only way I could see what you're talking about being relevant is from an extremely activist perspective, where you're saying: all right, I'm just going to drop 0-day time after time on this vendor without telling anybody, to force them to release updates that only fix the security flaw and don't package other stuff. Which, okay, I like hostile play, I'm cool with that. Hey, what do we think? Everybody's got cards. What do you think: is what vendors are doing with the process ethical or unethical? So you're advocating splitting the security updates from all of their other updates, right? That's kind of where you're going. Yeah, I think that's
one thing: I don't want to see feature changes in security updates. Okay. So, something that actually freaked me out when I was working at Microsoft: a major, not a utility company, but let's just say a very important company, basically said, look, we'll do automatic updates for our corporate IT for Windows, but there's an XP controller on a smelting device, and the investment basically has to last us for 50 years, so we are never patching that in any way, shape, or form. They air-gapped it; good luck with that. But essentially the customer was like, we don't apply updates in certain scenarios. Another weird thing I learned was that some customers basically wanted to do Update Tuesday quarterly instead; they were asking because it cost them to do the testing of the fixes. So it's weird what people will actually accept in the end-use scenarios, and that was something that even as a vendor surprised us about our customers. We were also surprised that they were not willing to give up XP. We basically kept trying to kick them off and make support that much more expensive, and they just went: oh, thanks for telling us that instead of a twenty-five million dollar extended support contract it'll be fifty million dollars next year; thanks for telling us, we'll just write it into our budget. Super weird stuff like that. Anyway, this is all beyond the control of the discloser, right, the person who found it? Yeah. And can we bring open source into this? Are you going to make open source teams do separate updates too? They've been working on an update and they get a security fix in there, and you're going to make them do extra work? No, I'm just saying this is more work for the development organization: if they're working towards a new release, it's easier to stream in the security fixes. So there's going to be a higher cost for security fixes if they need to be separate. If you find a bug in something open source, what's stopping you from also writing the patch? Well, not every vulnerability researcher is a good software engineer. It doesn't have to be a good patch. It'll be a shitty patch. Under the guise that anything is better than nothing? If you find the bug, by definition, if it's open source, there's nothing stopping you from fixing the bug. So can I ask a question of the panel, since people here have been around doing this a long time, and I'm not trying to call anyone old: what was the first real auto-update that was released? By my recollection, I worked for Symantec under contract to evaluate the auto-updating of virus definitions, I think in 2000. When did auto-updating come into vogue, and has there ever been a separation between functionality and security updates in auto-updating? Windows 98 is the earliest one I can remember. Yeah, that's pretty close, I think. And Windows, or actually Microsoft, had several different updating mechanisms. It's not like each individual team there was at war with each other and just wrote their own or anything; nothing like that ever occurred. But at some point there were, I think, 16 different updaters from different teams, so weird stuff was happening, and then they finally unified under one system. But even that... anyway, we have some pretty badass panelists; does anybody have any questions about the topic?
Speak up for the video, thanks. So you were talking about responsible, or whatever we want to call it, coordinated disclosure. If we're saying coordinated or responsible disclosure is unethical, are we talking full disclosure, or what sort of alternatives would you or the panel recommend or suggest? Full disclosure policy: I am an arms dealer. An armed society is a polite society, and I have yet to hear a policy that makes the society unarmed. Okay, so now this Canadian needs to speak up. For me it's a case of: okay, if I find something, I can't leave it alone; I have to at least tell somebody, you know, do best efforts. Sometimes that takes an ungodly amount of effort just to find out who the hell to talk to. Like finding vulnerabilities in air traffic control: there's a global standard, but ICAO doesn't have teeth to enforce anything, and it's got to be adopted by everybody, so where do you even start with something like that? So I said, okay, I'm giving a talk at DEF CON where I'm saying: here's my evidence, I can't prove that they've mitigated any of these threats, please prove me wrong. Six years later DHS actually proved I wasn't crazy, which is a weird position to be in. But at the same time, I'm looking at it now and thinking: I potentially helped them secure things that could now be used against us, because yes, my ass is on a plane flying home, but similar kinds of systems are used to drop bombs on people. That's your quandary. So you also have to look at what the system is. If it's some little IoT gadget, maybe a little bit of PII or something like that, that's one thing. But as you said, everything is dual use. So if there's a potential that the technology, or the owner of the technology, will use it negatively, I'm going to reconsider. So, yeah. I think I want to use the word "responsible" on the vendor side, because I don't think it's talked about enough in that context. As an example, vendors have been getting more onerous about trying to report a bug. Some of them make you go through the bug bounty program, if they have one, with all these terms and conditions. There was a good blog post by someone from Project Zero about trying to report a bug to Samsung, and the number of different click-throughs they were supposed to go through in order to report a bug: you had to agree not to ever disclose the bug until they fixed it. That's ridiculous, right? So they decided, we can't sign off on that, because our policy is 90-day disclosure; how can I submit a bug saying I won't do that? Eventually they went around the whole forum-based system and found someone to send an email to. But that's not always easy when it's a foreign company; finding the right person to speak to when it's a Korean company is not that easy. So I think vendors are making it increasingly difficult for people to disclose to them. They're starting to game the system more, or they just don't realize they need to take reports. Like what I've been doing with these sex toy vendors: they've basically failed to realize that they were hardware manufacturers making a manually operated device, and now that they've added connectivity, they're a software company; they don't think of themselves that way. So you're trying to wake them up to this process, because, like we've all found at one point or another, they withhold your email. Yeah. I mean, information is valuable. But so, the question was: what's the alternative, if coordinated
disclosure is unethical? So there isn't any one answer to it. I think the thing is flexibility in the process, and being able to gauge what your principles are as the finder, and what your principles are, ideally, as a responsible vendor. Because I agree with Weld here: the responsibility for dealing with these vulnerabilities, short term and long term, is absolutely on the vendor. The vendor has had a huge advantage legally, with lobbyists and the Computer Fraud and Abuse Act, and before the exemptions to the Digital Millennium Copyright Act they had a huge advantage in being able to threaten and intimidate and silence researchers. One of the main things when I was first asked to be involved with the ISO standards: it started out actually trying to define the roles of researchers, and I was like, excuse me, I don't know any hackers who strive for ISO compliance, so can we please make this about the vendors and what they should be doing? So that's why the ISO standards are like that. I do think that deadlines are actually important. The original vulnerability disclosure policies set expectations with deadlines; that is the norm. And what I've seen, especially in media coverage, is people freaking out about researchers disclosing the presence of a bug, not even the full technical details, confusing that, and then again blaming the researcher. It's like, kill the messenger a little harder, why don't you. I think that's part of my whole life's mission. I mean, this is my 19th year coming to this town for this purpose. Yes, I am old. This is what happens when your hair goes gray. Where do you plug in an AC charger on Bruce, right? But the point here is: is the alternative something else? Is it always better to do full disclosure without waiting? Is it always better to wait forever? I don't think either of those is the answer. I think reasonable deadlines are important, setting expectations is important, and then no matter what you do, reasonable people are going to disagree about it. I mean, so many bugs I've been a part of, where the parent organization that owns the company that I find the bug in decides: you know what, this bug adds too much risk, we're going to sell the company. Okay, well, I'm under NDA because it's a pen test, so I can't tell anybody, but that company doesn't exist anymore for me to pressure to fix it. What happens? Or when I'm on an M&A gig and we don't buy the company, and now we're sitting on a bunch of silent 0-day, some of which is in very large companies, the kind that get targeted by spy agencies. What happens to those bugs? Or bugs where you legitimately disclosed to a vendor, the vendor is forthright and says they're going to take care of it, and then they issue a fake patch and sell their own bug to an intelligence organization, also a scenario I've been through. What is the response to these gray spaces? I'm curious. There's always a whistleblower-style drop. Yeah, okay, so the response to these gray areas is that I have to put myself personally at risk simply because I know something and I owe it to the world to know? I say it's an option; it's still up to you to decide what you do. But, fair enough, dropping details to a reporter or some other interested party, yes or no, it's an option. I will never take it, but it exists. That's not in the ISO standard. A question from the audience: he said that there was a security researcher who contacted him with a supposed bug, of which he never disclosed any details, and asked for Bitcoin, lots of Bitcoin, which even though it's going down a little bit today is still a lot, probably. Yeah, so I
mean, that's just... yeah. Can I just say: that's not a researcher, that's a criminal. Well, that is obviously literal extortion; that is the definition of extortion. Is extortion unethical? I think so, right? Yeah. But you bring up a point: with the increase of bug bounties, a lot of researchers are asking, and some of them are doing straight-up extortion like that, while some of them are simply asking if there is a bug bounty present, not accompanying that with a threat. What I deal with a lot is organizations that confuse those two. They're like, how dare this researcher ask if we have a bug bounty program? And I'm like, well, they did this work and they're just asking, so do you or don't you, you know, type of thing. Did they threaten to do anything with the information? No? Well, then they're not threatening you; that's not extortion. I'm seeing a language and culture issue in the way... there was an interesting graphic I saw the other day about where the bug bounties were coming from and who was fulfilling them, and there are a lot of overseas bug hunters. I've had numerous exchanges where it was clear we were having a language breakdown, and a person could read it as extortion, or you could read it as "do you have a bug bounty that I can access", so it's a pretty nuanced line. Thank you. Yeah, I had to testify before Congress about this a little bit, you know, the Uber data breach, just a little testimony. So the
whole Uber data breach: 50 million records downloaded by a Florida man. Why is it always a Florida man? Exactly. But what happened was, this guy emailed them; he didn't actually know about the bug bounty program. They referred him to the bug bounty program, saying, we have one over here, friendly researcher who's telling us about a flaw. The maximum payment for the bug bounty was ten thousand dollars. Literally, in the emails that were released, he's like, yeah, I was thinking more six figures. So he successfully extorted them for ten times the amount of their regular bug bounty, and Uber, during that hearing, actually said: yes, that was an extortion payment, and we should not have laundered it through our bug bounty program. So absolutely, there are ethical lines that were breached in that, and I think Uber took responsibility for it. I think that was
the right thing for them to do I have a question now I'm gonna change the subject just a little bit maybe turn this around I had a career at a power utility before I was at the University of Illinois and I had the you know a really cool job where I hung out with hackers and a hack this out of my SCADA system so I got to disclose a lot of faux date some of it which is still out there a decade later now the question because I heard the word criminal earlier when a vendor comes in and threatens the career of everybody that is working at a place since is oh if you want to be a consultant when you retire you should really forget about this and just let this go should that be criminal and is a is that illegal I don't know coercion from people in authority to suppress bugs if we're gonna talk about criminalization in policy versus private how about we talk about why the hell is it a standard practice that does it so awful up it's decide they get to move into private sector afterwards ok talk about that level yeah well I mean it's you know personally as a disclosure of
many vulnerabilities that I know are still out there: from an ethical perspective, how long do you wait when civilization teeters in the balance? Don't, don't, don't. I made this panel to ask this question. Can you live with it? I think it's harder. Fifteen years. Well, I'll have a much better talk for ShmooCon, Bruce. So I disclosed a bug to Microsoft, this was around 2003 or something, and they took a whole year to fix it, because it was in their file auditing system, which helps them get C2 compliance. So the existence of the bug made everyone who had to use a C2-compliant Windows non-compliant, which is why C2 compliance is ridiculous. But it was a bug where, if you used hard linking in NTFS, you bypassed the auditing system. It seemed to me like an easy thing to fix. They said, "we had to completely rewrite the auditing system to have new flags on every single file; it wasn't something we could just easily update. We'll do it with Windows 2000 Service Pack 3 or 4," I don't remember which. And they gave me a good explanation, a good technical explanation, and they gave me status every couple of months along the way showing they were actually working on the problem, and they said there was just no way to fix it faster. The fact that they gave me confidence that they were actually acting in good faith matters; some bugs just do take that long, and so I wait. Right? But if they had been completely silent with me, if they hadn't even acknowledged that I gave them the bug, I would have no idea whether they were ever going to fix it, so why not disclose? So I think that's part of it: vendor-finder communication can give you the confidence to wait. That is, in the ISO standard, it's all about good faith. Did you try your best to disclose the vulnerability? Did you bang on doors, ring phones? Sometimes...
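The hard-link bypass described above can be illustrated with a minimal sketch. This is a hedged simplification, not the actual NTFS auditing internals: it uses POSIX-style hard links via Python's `os.link`, and the file names are invented for illustration. The point it demonstrates is that a hard link is a second directory entry for the same underlying file, so any monitoring keyed to one path name never sees access through the other name.

```python
import os
import tempfile

# Simplified illustration (not the real NTFS bug): a hard link is a second
# directory entry pointing at the same underlying file data, so auditing
# keyed to one path name misses access made through the alternate name.
workdir = tempfile.mkdtemp()
audited = os.path.join(workdir, "audited.txt")   # the path an auditor watches
alias = os.path.join(workdir, "innocuous.txt")   # attacker-created hard link

with open(audited, "w") as f:
    f.write("sensitive contents")

os.link(audited, alias)  # second name for the same file

# Both names resolve to the same underlying file object:
assert os.path.samefile(audited, alias)

with open(alias) as f:      # read via the alias; a rule watching only
    data = f.read()         # audited.txt would never see this access
print(data)  # -> sensitive contents
```

The fix Microsoft described in the anecdote, rewriting the auditing system to attach flags to every file rather than to names, matches this picture: auditing the file object itself, rather than one of its names, is what closes the gap.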
Yeah, you hit a wall, and it's really difficult, and you need to just say, "there's a problem here that needs to get fixed, and if the only way I can do this is to send up a flare and set something on fire, it may be necessary." But again, if you've got a case where, yeah, they seem to be a little slow, but it's because they're having to gut half the system and start fresh, then you have to take that into consideration. I think it would be unethical on the side of the researcher to say, "no, no, only 90 days, triple shift." It doesn't work that way. Why would the researcher have to, in your words, try their best to resolve these things? The onus should not be on the researcher, but before they drop a public 0-day, I want to have at least given it a fair shot at trying to report it. And so many organizations actually don't have any clear way to report; you were giving the example that even ones that have bug bounty programs don't have a clear way to report. So yeah, it shouldn't be on the researcher to try and find a contact via Twitter. I'll put myself on the spot. I'm sitting on a bug right now in the largest health care management software in the world; I have actual confirmation that 93.1 percent of the entire industry uses it. And it was a pen
test for a company that owned a company that owned the company, and about halfway up the chain, like day two, at my job they said, "you know what, this is actually a massive, massive risk; we're just going to sell the company so we don't have to deal with this." So they did. They never told the vendor. I can't legally tell anybody. I'm just sitting on this bug. And this is money, this is millions, plural, of dollars. What is my ethical response? What is the threat to life and limb? Do you have anything, Bruce, on this? I was just going to ask: is it okay to drop 0-day? I mean,
why is it on the researcher? Like, is it
okay just to drop everything? I think there are some nuanced answers, but in general, can I just drop it and feel okay about myself? Yes. And it was ruled that code is speech, so at least in the United States they can't go after you for that. I think where some problems may arise is that the laws, especially in the United States, will allow companies to go after researchers for doing so regardless, right? And even the legal threat, and the threats to the employer, which you brought up as well. But who in this room has dropped 0-day? Put your hand up. Yeah, my hand. Katie freaking did it, right? We dropped 0-day, and the thing was, it was so good. No, what happened was, this was a carryover from the @stake days. We had an advisory; we were trying to contact the vendor for, I think, four months. No response. We called them on the phone, and I hate phones, and we had email threads going for four months. Then, in the transition, we get bought by Symantec. Eventually we're like, we're going to publish a non-detailed version of this just to warn users of the threat, because especially as a security defense company, we had the right to protect our customers from a vulnerability we knew about, right? So we dropped 0-day, and oh my god, the angry mails. I wish I had saved that mail, because there were swears in it I did not know, and that is unusual for me. Well, I think part of it was that this was the first time this company had to deal with this type of issue, and there's always a first time for every company. They called us irresponsible. So that's the thing: if you're the finder dealing with a company for the first time, it's going to be a lot more work, because they're not going to have any way to engage and communicate with you. And think about that: at the time, Symantec was the largest security
software vendor in the world, and we were threatened by the vendor for dropping a non-detailed advisory to let people know that something they had in their possession was insecure and that the vendor hadn't responded. So think about how hard it is for an individual researcher to deal with this. The onus is not on the researcher. This brings up a question that came from the back: medical devices. So you have 0-day on a medical device, like a pacemaker or an insulin pump or something like that, and you try to responsibly disclose this vulnerability to the manufacturer. Where does the line get drawn if nothing is done about it? I think you can send that to the FDA and they're going to take care of it. Not true, from experience. Really? Yes. Somebody who actually has relevant, current experience in that exact area needs to talk. I can do what I want for the next 15 minutes. So, my name is Steve Christey Coley. I work at MITRE supporting FDA, providing subject matter expertise in the area of medical devices when they receive vulnerabilities. FDA has regulatory authority over medical devices when it comes to safety, and of course cybersecurity can have an impact on safety. I've seen them wield their influence, which is much easier than in the apparently unregulated world of software. So anybody who's had any difficulties, certainly come to me, or you can reach out to the FDA as well; they are literally here, you can go to the Biohacking Village and meet them. Yes, there has been some critique of their practices in the past, but they've been doing as much as possible to make it better. And I think that's an important point: everybody was really, really terrible, or so terrible they didn't even register on the scale from terrible to not-terrible, and I think we should be encouraging especially the regulatory authorities who regulate vendors when they're making progress. Agreed. Not a good process before, not good
outcomes, but we should be encouraging those regulatory authorities to come down on the vendors they regulate. Yeah, I mean, it's a case of: if you are making something that has the ability to affect life and limb, that's going to have to have some responsibilities attached to it. Okay, so I talked to some Congress people last night, and they asked, "how can we have greater engagement with the researchers, how can we help them?" And I said: go after the vendors, Congress critters. This is what you should be doing, right? Some more laws that actually apply to the vendors and regulate the vendors, and reform the Computer Fraud and Abuse Act. And beyond all these sticks, there have got to be carrots. A no-fault disclosure mechanism, that's a great item. I mean, I'm a sitting example: yes to all the things you said, but also, none of those help me, right? I'm putting myself out there. I think maybe vulnerability researchers should be treated like a common carrier, like in telephone systems: agnostic, this is nothing other than what has been discovered, and they shouldn't be faulted for anything they've discovered, because they're not the root cause of it; it was some mistake made at the vendor. And you were tugging at me about a question. So, some people on the panel will know this is probably a question I would have been uncomfortable asking maybe six or seven months ago, so I apologize, it might be controversial. But from a regulatory perspective, when something affects life and limb, I'm curious what role the government should play, not only just for vendors but also for researchers, because not all speech is protected, right? So if you're disclosing something you found that could cost someone their life or cause them serious personal
damage, not companies but individuals, what role does the government play in that? Well, one, I think you're assuming that a researcher is necessarily going to know all the uses of the code they found the vulnerability in, right? They are not necessarily going to know that something they found a vuln in over here is also code reused in some kind of life-and-limb scenario. So there's a degree of, to put it this way, the researcher already did a bunch of free labor, right? Having them understand all the users of the code, I don't think that's in scope. And frankly, I think what we need to think about is: what is the real threat to life and limb? Is it full disclosure, or is it non-disclosure of discoverable bugs? I think that's the biggest danger. It's probably nuanced, right? I think we're making this black and white, but the question I'm really interested in is: does the role of the government include any regulation of researchers? I think it gets at public safety. It's illegal to walk into a room and yell "fire" if there's no fire, because you'd incite a panic; if there is a fire, you're not lying, so you're fine. But there's no rule that says you have to yell "fire" if you see fire, right? So just because we may subjectively decide that ethics exist on both sides, it doesn't mean it's the job of policy to police both sides. Where I think the government needs to step in more: there was a talk on the underground track about the FBI "cyber ninja" program, basically providing access to FBI people in their technical operations, so that before you pull the trigger on something and go all the way, you can let them know. They're actually working with the EFF; Kurt from the EFF was there with Russ from the FBI, and the analogy they drew was: you find a big bag of drugs on the street, and you're like, "I should probably drop these
off at the police station," said no one Black, ever. [Laughter]