Ethics Village - Teaching Consulting Pentesting and Ethics
Formal metadata
Title | Ethics Village - Teaching Consulting Pentesting and Ethics
Number of parts | 335
License | CC Attribution 3.0 Unported: You may use, adapt, copy, distribute, and publicly transmit the work or content for any legal purpose, provided the author/rights holder is credited in the manner they specify.
Identifiers | 10.5446/48338 (DOI)
DEF CON 272 / 335
Transcript: English (auto-generated)
00:00
Alright, welcome everyone to Ethics Village. My name is Shane, I'm one of the founders of the Village, and our final talk of the day concerns teaching consulting, pentesting, and ethics: lessons learned from running a national penetration testing competition. I will let the guys introduce themselves, but this is intended to be an interactive
00:22
session. If you look near you, you should have an ethical/unethical card. They may ask questions as to your opinion, and if you don't have a card after I'm done announcing them, you can raise your hand and I will hand some more out. What they'll do is pose a query to you and ask whether or not you think the scenario is
00:41
ethical or unethical. If you do have questions or comments about the scenario, I ask that you step up to the microphone, because we are recording this for posterity. And so, with that, I will let the guys take it away. Okay, so good evening, good
01:01
afternoon everybody. I suppose it depends on what time zone your body, or in my case my stomach, is still in, but we are here to talk about the ethics of what we have seen interacting with collegiate-level students doing a competition. So we've got a really short introduction to give you all some background on what the competition is, we're
01:20
going to introduce ourselves as we get to some components of that, and then for the most part we just have some scenarios. These are things that we've experienced in the last five years of doing the competition, and this is a good-sized crowd, so expect to see everybody coming up to the microphone. We can't really just have you yell it out, because we are recording it, but we really want to get everyone's
01:41
involvement, and I know a lot of people in the crowd, so we might be picking on you and making you come up too. Next slide. So the agenda here, like I said: we're going to talk through what the competition is, we're going to go through the ethics, and then we have some scenarios. So who are we? My name is Lucas Morris. At my day job, I am a
02:05
senior manager at Crowe Horwath, or Crowe now, doing penetration testing and information security; I've been doing it for about 15 years. Under the auspices of the competition, I am one of the members of the black team that actually builds the infrastructure, builds the world, and runs the competition itself, and also a member of the
02:24
advisory board. And I am Tom Kupchak. In my real job I am a director of technical operations at Hurricane Labs; we basically do Splunk work, and I manage the team that does our Splunk implementations, and I pretend to do other things as needed. Within the competition I am on the white team and handle a lot of the rules and
02:45
the operations of the event, basically working with Lucas and other members of the advisory board to give the students the experiences that we're hoping to achieve. Hi, I'm Jason. Day job: principal consultant for NCC Group, breaking things all day
03:04
long. Within the competition I'm on the black team. I help do app dev, so a lot of times we write super custom apps, and I think we're going to talk a little bit about some of that in a bit. I also am part of the pentesting advisory board to help set that direction. Hey guys, I'm Dan Borges. For my day job I'm an
03:24
internal red teamer, but for CPTC I help put together all the OSINT and world scenarios. That's the context and the flavor of the game: the fake employees, the operations of the company, the docs, and a lot of the storylines. And I just wanted to emphasize that it takes a
03:42
village to make this competition; we're just four people representing a lot of people that put in a lot of hard work. Yeah, and every year the volunteers grow; there are actually some other volunteers in the audience. So let's talk about the competition here very quickly, because we really want to get to our scenarios. So what is it? It is an annual collegiate-level penetration testing competition. So
04:02
what does this mean? Well, we're not a CTF, we're not a defense competition; those are also very good, but the gap that we saw several years ago, as consultants, was that people come out and they're very technical: they know how to use the tools, they know how to build a network, they know how to break into AD, Linux, things like that. But where they were lacking was interacting with
04:20
clients, interacting with a business. In internal jobs they were having a hard time communicating risk to their superiors. So what did we do? We put together a competition where teams from various universities around the country, and actually this year around the world, come together into various regions and then a national competition, and perform a penetration
04:41
test. They are given time inside an environment where we build a fake company: apps, hundreds of emails, OSINT profiles, fake servers, an entire internal environment to attack. But the important thing is they're not scored only on their technical competency; there are no flags in the environment. It is basically a corporate environment, and they are
05:05
scored based on a report, and at nationals also on a presentation to a board of executives from our sponsors, who rate them. So if they don't put something in the report, and if they don't communicate it effectively in that report, they get no credit, even if all of the technical details were there. And I really like the last line on here, where there's basically the
05:25
formula for CPTC. We take offensive security; pen testing is the vehicle through which we teach offensive security. We build a custom environment, especially Jason and Lucas working together to lead that operation. We throw in the business, and you get CPTC. So it's definitely an equation that we have
05:44
going on. So this isn't really a competition; we treat it like an engagement. We are actually in character most of the time, so if a team comes to us and says, "Well, how does this work for the competition?" we're like, "What do you mean? This isn't a competition between your folks and our IT team. What do you think this is, some kind of game?" We act like we are an executive or engineer or someone at this company. And
06:05
so they do test it for technical issues, and we have a ton of technical issues, but you won't see a ton about that here, because a lot of the ethical issues we see come with the interaction and the decisions that the teams make. And it's great, because this is a learning ground and a testing ground for them
06:22
where they can make, and you'll see, some serious mistakes, frankly ones that at least at my job we would fire someone for immediately, and then start to talk about them. We have a couple of those, but we actually focused more on some of the gray areas, because we wanted to get some good debate going. But they do have to communicate. So from a technical perspective, each year we have a different theme. This year
06:42
actually we're doing a bank, so I think there are going to be some really interesting options there. But every year there's a lot of environment. The apps are wholly custom. We'll have some off-the-shelf stuff as well, but for example, when the theme was a hospital we had an EMR, and we tried to simulate imaging systems and other things, and custom apps, and
07:01
certainly our sponsors also will throw software at us, which really works out for them, getting it tested. For the autonomous vehicles theme we actually had car simulators, and Uber was a big sponsor. So there's a big technical component to this, but as I said, there are no flags; we just try to make it a company. So with that, we're not here to talk about the
07:20
competition itself; if you want to volunteer, we'll have some stuff we can talk about at the very end. We want to talk about ethics. So we are here to teach these students; that is what we are there to do. And we are teaching them, realistically, three major objectives. The first is technology: there's a skills gap in cyber, so we need it. The second is soft skills, interaction. And the third is
07:42
decision-making and team interaction, because there are six students per team, with one captain who's a student, and a coach, but the coach is not allowed to be engaged during the competition. In fact they can't talk to them for most of that time. So the teams have an opportunity to be dangerous. It's a simulated environment. Each team actually gets a separate environment, so we do actually spend like 20 grand during the weekend of regionals for
08:03
our cloud provider, who thankfully is a very positive sponsor of ours, because everyone gets their own mirrored copy of the company. So they can be dangerous. They will take things down. They will change environments. They will actually, as you'll see in our first scenario, leave red herrings for other teams during the OSINT process and during a bidding process that we
08:23
make them go through. We don't really give them a lot about scope; they have to discover that themselves, and for those of us that do pen testing, you know scope is important. Taking systems down is very bad. So if they go into consulting, which a small number of the students do, they
08:41
will have the direct skills. But for those that go into an analyst or an engineering role or something in-house, they now also know how do I be a better customer and how do I be a better communicator to the people that I'm working with. So what do we learn through this? Well, a lot of stuff.
09:02
Technically minded college students are really bad at interaction the first time through. I know everybody's probably real surprised about that. And they treat it as such: they will regularly plow through limits that we've put together. And I know every single person at this table has said, "But I'm a
09:23
hacker, rules don't apply to me." So just to point out, we dock them points if they go out of scope. And we will actually dock them more or fewer points depending on their reaction. If we walk into a room, sometimes we'll channel angry clients that we've had, and we'll start
09:40
screaming at them. Not angrily, but we'll be making it a tense situation. Lucas has a ton of experience with that sort of thing too. And I'm generally a very nice person. He's good at being an angry client. But if they say, "We're very sorry, let us look into this," they may get points back, especially if they come back with a good answer. Not all of the points, but most. If they say, "That's not us," and we
10:02
have logs, we monitor, we generate 400 gigs' worth of data in just the national competition every year; we know. They lose all the points. So are there boundaries for hackers? I think that is a big piece when we think of the ethics of cybersecurity. And what are the boundaries that we have working with
10:21
our clients, working with our companies, both in responsible reporting and in how we scope and structure our engagements to make sure they're the best? Yeah, I think a lot of times hackers think that if something isn't explicitly denied, then they can get away with it. But when you're working in a client engagement, there are unspoken rules of how you interact with
10:42
the clients. And this is a super interesting change of perspective for them, because as a community we've built up this great environment where people can come in and just crap all over everything all the time in CTFs, right? And so they come into this with that mindset of, I'm just going to bang away and break all the things, and that's not a professional environment at all.
11:04
So it's a very big mind shift for them to come into a competition where that doesn't work. And we have the rules written in such a way that they are broad. So there is a board of industry professionals that not only makes decisions on what we do, and certainly has very little free time putting together this competition throughout the year, but also weighs in whenever
11:22
there's a potential rules violation. Some of them are very clear-cut; that's not what we want to talk about here today. We want to talk about the ones that are gray, because we get together and build consensus, not just this group but often 10 or more of us, before we declare a rules violation and flag it as such. Our rules are broad because we want to try to give them a real-world interaction, also knowing that
11:43
frankly, sometimes if you piss a client off, it doesn't matter what you did, you are in the wrong. And we've given ourselves the flexibility to reflect that. And we firmly believe that teams, you know, treat this as a competition even though we don't want it to be a competition. So they look for nuances in the rules, or things that we haven't considered, and
12:03
we don't consider that professional. And we want to make it clear that if that comes up we have the ability to deal with it just like a client would. So the way this is going to work now that we've given you about 10 minutes worth of background here is there are a lot of challenges that we deal with.
12:21
So we have 11 stories. Our hope is to not get through all of them, because we want to hear your thoughts, whether it's raising your card or, more importantly, your comments at the microphone as we go through these. So the four of us are going to tell a story. We've worked through which ones we wanted to tell, but it'll probably be a little bit of communal storytelling. We're going to keep that short.
12:41
Then we want to have a discussion: we'll talk about how we responded, but we also want to hear how you would have responded and where you think things belong. And feel free to tweak the scenarios to talk about something a little different if you want. We definitely want to make this creative and get into some good conversation about what is acceptable and where we want to take this competition in the future.
13:03
Because the idea is that these are things that, as industry professionals, you've probably run into yourself, either with clients or personally as a red teamer. So the things we'll be talking about are generally going to be things that you'll have experienced as well. So I think, Dan, you're probably the best one to start this one off.
13:27
This one was pretty interesting. As part of the OSINT for the competition we had set up a fake company website. The company website links to multiple social media profiles for fake employees and then from there it started going into this crazy social network.
13:44
At one point a bunch of these fake employees were part of a group where they shared an open Slack invitation. So anybody that found this public group on say Facebook could now join a private Slack chat which was the company Slack chat. And this was OSINT before the competition started and it's public so anybody can join.
14:03
Then we seeded this chat with a bunch of conversations. So we had students log in, and where the real issue came is that, instead of just logging in, learning this information, and doing the OSINT, they started to make fake profiles of the company
14:22
and seed false data to the other people that were getting the OSINT to throw them off the trail. So I think we want to stop right there and just take a poll of the audience with your cards. How do you feel about that? I'm seeing a lot of black. This first one we felt like it was maybe a little more black and white
14:41
at the beginning, but we got the easy one out of the way first; we want to make sure everyone understands the system and everybody's in practice. So then we had another team end up reporting it, because they found a bunch of information, started doing research on it, and then found it was open information and not part of our environment.
15:01
So then they came to us and said, "Is this you?" And we told them no, this is not us. Then we started the investigation and found out it was another team, at which point we had to talk to both teams: the team that found the false information and the team that submitted it. So the information itself was something that was not copyrighted, but owned.
15:25
It was proprietary, thank you. So there were two issues. Again, I'd like to see the cards before we start the discussion. So the team that found this information, not the team that planted it, but the team that found it immediately recognized where it came from.
15:42
So their first response actually was to come to us but in parallel also start going to that organization before they had any information from us and report us for using that information. So I would like to hear from everybody on this as well. What are your thoughts on that? Was that the ethical decision to immediately go to the proprietary company
16:04
or should they have waited until they got information from us? Seeing some black, but it feels questionable. I saw some white too. Yeah, there was a little bit of white. I think you may have set a false dichotomy in your question.
16:20
Oh, did I? Somebody said should they have immediately reported it to them or should they have come to us and asked us about it? But what I feel they should have done was find out information as to whether or not it was something they should report. Absolutely. It was poorly worded on my part. So somebody with white, come up and tell us why you think that's ethical.
16:47
There was ambiguity for me because I was like, well, depending on what kind of information, this could have been like, holy shit, this is a really bad, you know, is this bad? Guys, this is bad too. And sort of like reporting it to both parties,
17:00
whether or not they get you in trouble is an aside to, oh, by the way, I found child porn on this website, I reported it to the FBI and to you. Let's just clarify, this was intellectual property. Whatever the intellectual property is, it could be a "holy crap, I'm going to tell both" situation. Well, ethically, we're trying to be a little vague on this.
17:22
Innocent parties, innocent. But we will say intellectual property, not that other stuff. So just to make everyone clear. I want to take that example, though, because I've actually run into that example in a pen test where I have found child pornography on my client's network. So let me ask this question. Outside of this context with CPTC, what's the ethical responsibility there?
17:40
Because you're under an NDA, right? So am I allowed to go to the FBI, or do I just report it to the client and allow them to escalate it? You're forced to go to the FBI. Right, correct. Do not pass go and do not collect $200. Right. So from that perspective, they kind of did the right thing, right?
18:01
Like, they reported to the body that owns the information and then sussed out how it got there and who owns it. But from one perspective, right, there's multiple perspectives here. If they were claiming attribution. We're recording, so if you wouldn't, sorry. Yeah, on the other hand, if they were claiming a specific attribution to you of this violation, that is the unethical thing.
18:21
Reporting it is completely ethical. So then we started the investigation. We found out it was a team that was doing this to seed false information for other teams. So we had the IP address of where it came from and it was an IP address that was most assuredly attributed to a university.
18:43
And it happened to be a university that was competing. So circumstantial evidence, but we called. They came forward when we asked them about it. Yeah, so we called the coach. The coach has to be a member of the faculty or an employee. And so we called the coach and through that process,
19:02
we actually met with the team. And we met with the team in two ways. So the first ten minutes or so of the meeting, we just met with their coach and their captain. So it would be akin to saying, maybe we're meeting with the partner and the project manager or something like that in a consulting engagement. And we were entirely in character and asked them,
19:21
so why is it that we found you planting false information for our other people that are bidding on and working in our environment? And they did a very fantastic job of saying, you know what, we did not do that. I did not know about this, but if you believe it's coming from our IP address. I think that's a different scenario.
19:41
Was it? Yeah, they admitted fault when we did this. My bad, I'm combining a couple. Needless to say, over the years, I've had to have a few of these meetings. So apologies. So they immediately came forward and said, it was us. Yeah, and they removed the data. And they removed the data. So our response was to dock them points, basically to assess a penalty.
20:05
Do you want to talk about that? I would just say that we held that as unprofessional behavior, and it's no different than if your client discovers you doing something unprofessional, they might fire you, or they might not renew your contract in the future.
20:22
So very similar to how we held the team accountable in that case. That said, I think once they removed the data, even the group that was going to report to the organization, which theoretically would have kicked them out of whatever they were trying to accomplish with that organization, didn't. It all resolved kind of peacefully, and I don't think anyone lost their access to the thing.
20:41
That's correct. The offending university had uploaded a port scan of a private, proprietary testing network that is used for studying. So that's pretty vague. Next one?
21:01
So next. Okay, so this is very similar. In this one, we set up LinkedIn profiles for all of our employees, and we set up a fake company. Then we had student teams create fake employees of our company and send friend requests to our employees on LinkedIn.
21:23
We didn't really appreciate this from that perspective because let's say we were working at that company, that's not a real employee, and we'd kind of recognize that. So how do you guys feel about ethical or not ethical for someone to do this in a pen test? I think we need to clarify that phishing was not part of the scope. Yes.
21:41
It's just a regular network pen test. Hold them up high. All right, seems pretty universally unethical. Now what about if phishing, social engineering, security awareness testing was in scope? Pretty ethical. Pretty ethical. Okay. So that really shows the importance of scope there.
22:03
And one of the things that, as Lucas was saying, that we try to teach is the importance of sticking to your scope and understanding it. And I would say especially that's one of the boundaries that college students like to push. And this kind of gets back to the hackers, no rules thing. Unless we explicitly deny it, they will try it.
22:23
Specifically, they hadn't even been brought in for their intro interviews yet. This was pre-pen test. This was still doing OSINT, and they were impersonating employees of the company. So one of the things we do actually ask them to do is to put together a proposal and bid on the work. It does have a minor component in the score, but in reality, we are not looking.
22:42
We don't expect them to know how to bid on work, put together proposals. In my professional life, we don't really ask people to do that until they've got a couple of years of experience. And so what happens is it's more to get them in character. They put together a proposal, and we get some wild bids. Like this is a $2 million pen test. We've also had a $150 pen test before.
23:02
I really want to go with that one in the future. And for that component, we actually give them a lot of guidance on the items we expect to see. They get a bulleted list to help. Then, actually the Friday of the competition, we have them sign an engagement letter. What that actually is, is a whole set of non-disclosures and other things for other sponsors and the software they're providing,
23:25
but also photo release and other minor things that you get at any one of these competitions. So they were doing this before the official start date of the engagement. So I'd also like to see people's thoughts on ethical, unethical first for simply creating the account,
23:44
but not asking anyone to friend you or do anything. What are your thoughts on that? Is that ethical or is that unethical? It was positioned as an employee of the company. Okay. So I see lots of black, but let me ask this question.
24:01
How many of you have had a start date on a pen test where phishing is included, and to prep for that pen test before the start date, you go out and create profiles for the company because you know you're going to need them next week when you start the engagement? That's ethical, but it's not ethical for the college kids to do it before that. Why is that different?
24:21
Please step up to the microphone. Thank you. Quick question. What would they gain from that, from starting early? Well, so I know when we do this at work, certainly some efficiency, but the other example I was thinking of is we will also create domains and we will list them and we will start to get them filtered a week or two before we start so that they sit in a filter.
24:46
Two words, account age. It's very easy to see if an account is two days old versus two years old. We rename accounts and continue to reuse them for that purpose. Very cool. So I think that tells you what you get out of it, and I don't disagree with you,
25:03
but then I guess my question still stands. Why is it not okay for the college kids to do that? Everybody held up black for that, but white for pen testers to do that. Why is that different? So the question was engagement letter being signed versus start date.
25:20
Fair enough. If you know your start date is two weeks out, you know your start date is two weeks out, so you're prepped for it. I like that. And, yeah, you're very correct. We often have our engagement letters signed months in advance or weeks in advance, so that's a good point. I guess I just wanted to comment. I don't know if it's a sentiment that others share, but there's a difference at least for me between ethical and unethical
25:41
and then allowed or not allowed, so I guess of the same hacker mentality, if it's not explicitly said, then I probably would do that as well, but I'd also acknowledge if I was caught on it that it's unethical. Okay, very cool. And I know that's actually a very good point. One of the things that I think it's a lesson that we've debated a lot internally to our firm
26:04
is allowed, unallowed, ethical, unethical, and then also the third level is how upset will the client be versus not and who is our client because that, their personality, may greatly sway how we approach something.
26:20
So let me ask this question. Should unethical behavior be allowed in pen testing? Contract. Real world should because we're designing the game. Sorry, I should use the mic for the speaker. Since we designed the game to reflect real-world contracting,
26:41
that's our whole goal, we want to get kids exposed to what contracting life is like, so we'll say: real world, should that be allowed? Under the auspices of this game, those are the same. I'm going to say if you talk to the client and you agree with the client on the specific thing that you're emulating, then in that scenario I think there's value in testing unethical techniques.
27:03
But I also think the client needs to be aware of that and also give you approval and sign off on that. Yeah, that's definitely... Because that makes your approach at least ethical in terms of working with the client. Exactly, exactly. In this context, would you say that it's more important to be ethical
27:21
or professional with a client, and what's the difference between the two? I would say yes. I mean I feel like at least, and I'm sure this may be different for different people, but for me and for what we do at our firm, those are the same, right? So something unethical I do not feel like can be professional.
27:42
I wanted to make this differentiation earlier when we said rules versus ethics. I think a lot of times when there's a break of rules, we address that out of character and out of the game. We say should we pause the game? Should we adjust scores? Should we correct this somehow? And if it's an unethical decision, we handle that in character in the game and kind of just run it to ground and play with it.
28:05
So I have a comment about some of the terms you've been using and then a question. As you describe this, you use the words game and you use the words competition, and competition is actually in the title. So words are really important, especially for undergrads who don't have that experience of here's what consulting,
28:23
here's what contracting is. So as soon as you start using the words game and competition, they're like, oh yeah, points, we need to win this. So I think unfortunately some of these situations are set up because of the terms that are being used. So I know you probably do a lot of discussion with the teams afterwards.
28:42
There's a lot of lessons learned, and they're probably like, yeah, you know, that does make sense now that I know about it. What are some of the universities or what are you all doing beforehand to teach them some of these lessons before it happens so that they have that knowledge so that they can run through it on their own as they're considering doing some of these scenarios?
29:02
So I actually think there's a fair amount that we try to do, but it is actually a very good point that there is a lot more that we can do. So at the beginning of the competition, as I mentioned, we will have them do a bid and a proposal, and we take time out of character, and almost every meeting that we have up until the competition begins
29:22
to remind them and explain to them, look, here's the differentiation between game and competition and real world. Here's what consulting looks like, and there's actually some competitors in the room, so if we're not doing this, please tell us. But I feel like we make a best effort to explain the difference to them in as simplistic of terms as possible, but I know we can grow.
29:45
Yeah, I would say one of the emphasis we're doing this year is making sure that there's a conference call that every team can be on in advance so we set these ground rules in advance. Also, by doing this, we are providing OSINT about the environment and making this recording publicly available,
30:01
and we expect teams who are interested in competing to watch this and learn about what they're going to be doing. I think also we're meeting with the coaches. We've been meeting with the coaches for months, even though the competition is still months away, and I don't think we ever clearly set this expectation, and maybe we should, but our hope is that the coaches would be going back to the teams and providing that guidance
30:21
based on the discussions that we're having in those calls. To add into the comment, though, when you're looking at unethical situations, and let's say if we're looking at a company and you're doing your work both nationally and internationally, some international laws could actually put those people into prison,
30:42
so that would be a real-life scenario where being unethical in that scenario would be bad for them. Sure, so just to put it out there, the international component this year, I'm really excited for this because it's a Middle Eastern region, and it's going to be interesting to see how they approach things perhaps differently than U.S. students.
31:01
I'm really kind of excited about this. And certainly all of our infrastructure is run locally, so we're not exposing our students to that liability, but I mean, yeah, we at work have contracts and terms in our engagement letters around that exact component. But that all said, talking about liability and doing things,
31:21
we have a later scenario where teams are coming up with creative ways to potentially cause real risks to them if they were an organization, and actually causing risk for us, which we unfortunately have to deal with. But yeah, that's all in the joys of running this thing. Ready to move on? I think so. Cool.
31:41
So yeah, this one's good. You want to take it? Yeah, take it, man. So this scenario was really funny. Basically, we started doing the OSINT, and we released an e-mail address at one point, and somebody e-mailed us a fork bomb. And I don't even know what they were thinking because what do I do, pipe my mail to my command line?
32:03
No. So they e-mail us a fork bomb, and I'm just like, okay, who did this? And they use an anonymous mail service. When you look at the headers of the anonymous mail service, it records the IP address that it was sent from. It was sent from the public IP of a university that was competing in our competition. Yeah, spoiler alert.
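The tracing step described here, pulling the submitting client's IP out of the anonymous mailer's trace headers, can be sketched with Python's standard library. The message below is fabricated for illustration, using RFC 5737 documentation addresses; real remailers differ in which hops they record:

```python
import email
import re

# Illustrative only: anonymous mail services often preserve the submitting
# client's IP address in the Received: trace headers they add. This sample
# message is fabricated, with documentation IP ranges standing in.
raw = """\
Received: from relay.example.net (relay.example.net [203.0.113.7])
 by mx.example.org; Mon, 1 Jul 2019 12:00:00 -0400
Received: from [192.0.2.55] (unknown [192.0.2.55])
 by relay.example.net; Mon, 1 Jul 2019 11:59:58 -0400
From: anon@example.net
Subject: hello

body
"""

msg = email.message_from_string(raw)
ip_pattern = re.compile(r"\[(\d{1,3}(?:\.\d{1,3}){3})\]")

# Each hop prepends its own Received: header, so the list runs from the
# server nearest the recipient down to the one nearest the sender. The
# last entry with a bracketed address is usually the originating client.
hops = [m.group(1)
        for h in msg.get_all("Received", [])
        if (m := ip_pattern.search(h))]
origin = hops[-1] if hops else None
print(origin)  # 192.0.2.55
```

In the incident described, that final hop was what pointed back at the competing university's public IP.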
32:21
University IPs are tracked to you pretty well. It's hard to deny that. So we sent an e-mail to the school at the thing, and we're like, did you send us a fork bomb? Like, hello? And they replied, and they were like, no, bro, not us. So stop real quick.
32:40
Cards again. Ethical or non-ethical to send a fork bomb to someone. No, this is not a joke. This actually happened. It gets better. Try to do it as a joke. Oh, it gets better. No, it gets better. Keep going, Dan. So they deny it.
33:02
It's not them. And we're like, OK, we have logs. We'd like to see your logs from that time period, and hopefully we can see if there's any connections or anything. So we start to run it down with the school's IT team. This is a computer security and forensics competition. We're going to find out.
33:21
We start to run it down with the IT team, and they're like, yeah, dude, it was their freaking lab. Here's the packets. At this point, the competition keeps going on, and this is one of those things where they're not really breaking rules, but it's kind of unethical. So we're like, whatever, we'll just deal with this in character. The competition will continue,
33:41
and we just had these random meetings with this team to resolve this issue that we were having with them. So by the end of the weekend, we have this final meeting with them where we have all of this data that shows these packets were coming from their logs. They had done an independent analysis of their own logs. During the competition. This was before the competition. This one was before. Sorry, we have a lot of these.
34:03
They prepared a presentation for us to let us know what happened, and they came in to give this presentation, and they basically doubled down on their lie, and they said they had forensic logs that showed UDP traffic from Mexico accessing their lab at the time of this fork bomb.
34:21
So they were like, we basically had these actors in there sending you fork bombs. Yeah, somehow Mexico is targeting this competition that their team happens to be participating in. Even though none of the logs from the institution say this, but whatever. So ethical or unethical, doubling down on your stories.
34:41
Yeah, black and white, right? So there's a couple other components to this, though. So as Dan mentioned, during the competition, they provided a nice report for us that essentially doubled down.
35:00
Ahead of the competition, we had another meeting with them, and this is the one I was thinking of earlier, where the first 10 or 15 minutes or so were with their coach, who already knew. We provided him the courtesy ahead of time of just professional courtesy of, hey, we've seen this happen. He's the one that got us in touch with their IT team to do the investigation. But with the coach and their captain, which was a student,
35:21
and the captain actually did a very good job of saying, we are going to do an investigation. Give us a little bit of time to figure this out. We need some time to determine, with the additional information you've provided us and our IT team has provided us, to really determine what happened. And that is an excellent response for that sort of scenario. Independent investigation, ethical or unethical?
35:46
Pretty straightforward, yeah? Okay. So we then had an out-of-character meeting to discuss this with them as a teachable moment, right? This is an educational piece, so we wanted to just explain from the perspective of, actually to your comment of professionalism and game
36:02
versus competition versus consulting. Let's talk about the consulting component of this and no matter what's happening, right, you have created an issue with your client. So there are some potential ramifications for that. One of the students doubled down in a different way than we expected.
36:22
So the first thing they said was, something I mentioned earlier, almost verbatim, I'm a hacker, the rules don't apply. So I would like to hear, maybe this is more of a broader comment with the slides, as a hacker, do the rules apply to us in all of the things that we've talked about?
36:42
There's probably a better way to word this. As a cop, the rules don't apply? I mean, this is fucking legit. So we all say that, but I have people at my company that I work with that maintain that same attitude, and maybe it's just because they're fresh out of college, we do hire a lot of young kids straight out of school,
37:01
but that's the attitude that a lot of people are coming through four years of school in security and they come out with that attitude. It doesn't matter, I can do what I want because I'm a hacker and that's what you're paying me for, that mindset, which is true. We are paying them for that mindset. I mean, I'm going to go with that's completely unethical, but it's also a flaw in your hiring process.
37:21
I don't disagree with you. Don't disagree there. So should we ask ethics questions as part of our hiring process? Yes. You should. How many of you do? Nice. Good. Good. And I was just going to point out, because I've been competing in defense competition and so on, but they're not kids, they're students.
37:44
I'm sorry. I'm sorry. Thank you. Sorry about that. So the other piece of this was during this meeting, a member of their team, even though the captain was still leading most of the conversation, which again is from a professionalism perspective
38:01
what we would expect, a member of their team very loudly spoke up, spoke over their captain and said, it's someone else, I'm not going to name who, that has access to our lab that did this. So I think there are two things here I'd really like to see people's comments on.
38:21
The first is outing someone like that, so comments on that directly. But also, are you as a consultancy still responsible for the things that someone else on your team may do? I feel like yes.
38:41
If someone broke into my company and used our network to attack our clients, contractually, we are most definitely liable. There's no getting around that. Next? Do you have a question? Hopefully start to get to a little more
39:01
some gray stuff here, I think. This one was really funny. Do you want me to describe it? So this was during our healthcare scenario, and basically we had a bunch of systems, and our emphasis was on high availability. These were systems that were part of healthcare monitoring systems, and we really needed to make sure they were up at all times.
39:22
Anyway, one of these systems was susceptible to the Dirty COW privilege escalation. I forget the exact CVE thing, but it was pretty popular the last few years as a Linux LPE. So the students got access, and we actually had the gamut of responses on this one. We had students that didn't talk to us, and they were like, oh, sweet, LPE.
39:41
They ran the exploit, and the system crashed, and it wouldn't come back up on its own, and then they had to come talk to us. We had students that first reached out to us and said, hey, this is vulnerable to dirty cow, but there's a risk of it not coming back up. Can I run this? We said go for it, and then we had students that just decided not to run it at all due to the risk.
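For reference, Dirty COW is CVE-2016-5195, a race condition in the Linux kernel's copy-on-write handling. A minimal sketch of the kind of pre-flight triage the careful teams did, checking whether a target kernel even falls in the vulnerable mainline range before raising the crash risk with the client. The branch cut-offs below cover mainline releases only; distro backports mean a real check must consult vendor advisories:

```python
# Not an exploit, just triage: compare a reported kernel release against the
# first mainline releases that carried the CVE-2016-5195 fix. Anything that
# predates its branch's fix is worth flagging to the client before running a
# risky LPE on a high-availability system.

def parse_kernel(release: str) -> tuple:
    """Turn a release string like '4.4.0-62-generic' into (4, 4, 0)."""
    base = release.split("-")[0]
    return tuple(int(p) for p in base.split(".")[:3])

# First mainline release on each stable branch that carried the fix.
FIXED = {(4, 8): (4, 8, 3), (4, 7): (4, 7, 9), (4, 4): (4, 4, 26)}

def maybe_vulnerable(release: str) -> bool:
    v = parse_kernel(release)
    fixed = FIXED.get(v[:2])
    if fixed is not None:
        return v < fixed
    # Branches not listed above: treat anything older than 4.9 as suspect.
    return v < (4, 9, 0)

print(maybe_vulnerable("4.4.0-62-generic"))  # True: predates 4.4.26
print(maybe_vulnerable("4.9.1"))             # False: post-fix mainline
```

A "maybe vulnerable" result here is exactly the moment to have the talk-to-the-client conversation discussed below, not a green light to run the exploit.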
40:06
So I guess the question here is we'll start with the ones that find the LPE and don't talk to the client. Is that ethical or unethical? Assuming no crash.
40:23
There's a possibility of a crash. Correct. There's a possibility, but we're not assuming there was or was not. What's the SOW saying? It says high availability healthcare systems. What's the SOW for the pen testers? Are we testing denial of service? No.
40:41
No, no, no, no. It's explicitly out of scope. So I see there's actually some different colors coming up here, some whites and blacks, so I'd like to hear from both. Anybody want to take a shot at running the exploit? This almost just feels less like an ethics question and more being bad at your job.
41:03
Like, that is just contractually bad that you crashed a system when you signed a thing saying that you would maintain high availability. Just talk to the client. But there is always a chance that we will crash a system. There's just some things that have a higher chance than others. Sure, but what's the threshold?
41:24
Okay, so let's say you ask to run it, and it's a system that tells you that it's critical for you to get root on to pivot into another network, and the healthcare system leaves the determination up to the pen tester. Should the pen tester exploit the system
41:41
with a chance of taking it down to further their access? Yes, because they can notify the client. So I'm going to speak for possibly half of the room. I have no idea how many people agree with me, but I would say yes, definitely, because you've notified the client. You can tell someone, hey, make sure if you see this system's light stop blinking,
42:02
go turn it back on. But once you've notified them and they know that it might crash, and they say, all right, then it's up to your decision, then it's totally ethical, at least. It might not be the best idea, but I would say it's at least ethical to do it now. Does it matter what the target is? Say that's a life support machine specifically.
42:21
Well, that would be the idea of, like, that would be something you would have to talk through with the client and say, do you have someone that's there on that person making sure that this won't, like, cause loss of life or loss of, I don't know, brain function for someone who's, like, on life support. So are you saying that ethically it changes if that is a life support machine hooked up to a patient
42:43
versus one where someone isn't hooked up? Yeah, like a dev or a pool. Well, for sure, like, if loss of life is involved, that's a whole other level of issue than, like, losing money in the middle of, say, like last year's thing. Actually, that was a perfect example. Last year's thing is a whole different case of,
43:02
oh, yeah, like, we're really, really sorry. This was very bad. We crashed your database that stored, like, banking information. Or, yeah, we're really, really sorry. I think we're going to go to jail. We just killed one of your clients. Okay. That's two totally different ethical issues. It is. So let's say instead of a life support machine,
43:21
it's the stock exchange's central broker component that manages all the trades. So you're losing, like, tens of millions of dollars every second that it's down. How do you feel about that one? I'm speaking for a substantially smaller number of people in the room as we go along.
43:41
I'm going to throw out an idea, right? Like, if you get into a scenario like that and you talk to the client, you can explain the risk of the exploit. You can maybe try it on a dev system, and then they could just put you at the level of access that you need. I think fundamentally the thing we're discussing here is talking to the client and getting their approval and feedback is really one of the most important parts of it.
44:02
Because you as a pen tester aren't going to know if that system is hooked up to a human being that you could kill versus it's their dev. Not that there's ever a hospital that has a dev life support machine. But if there was such a thing, that would be you would essentially make an ethical decision as a pen tester to have the client tell you the right feedback.
44:22
Now if the client's wrong and that results into that, you still did something bad, but from an ethical perspective, I think you've made the best effort to make the right decision in that case and have the client give you the guidance and the client is the one that screwed up in that case. So communication is critical and that's something that we are terrible at as an industry, both in teaching people
44:41
and in just fostering that expectation as you kind of go through things like Def Con and everything else. We don't talk about that much. The thing to point out, or at least bring up, is if you are getting that agreement from the client, make sure to get it in writing
45:01
and not verbally over the phone. That is an excellent point because if something is oral, they can easily deny that. If it's written and their headers in the email come from their mail server, that definitely makes it a lot more solid.
45:20
Yeah, I'm just going to confirm, yeah. Come a little closer to Nick. Yeah, there has to be the, I guess, terms of what? I'm sorry. The rules of engagement need to be defined and there is an operational risk and it's up to both parties if they want to engage or not. But when you're testing like the OT side of things or something critical where there might be loss of life,
45:42
like that is an option that you're dealt with and sure, it's an ethical dilemma but you're going to do your best. I guess I wanted to add to this. I think I kind of disagree because it seems like the focus here is more on passing liability or just contractual responsibility
46:02
that if the other party acknowledges the risk, then we can proceed. I'd actually like the industry to approach it with more rigor in that you do end up playing a part whether it's acknowledged or not and you should be able to assess for yourself whether or not if this down the line could end up harming someone even if the contract that you have with the company
46:23
you're working with gives you the okay. We should still take that for ourselves as pen testers or people within the cyber security community and be able to address that separately from just passing on the liability. I think that's an amazing point. I see a lot of times in my business, people defer risk.
46:41
They will keep asking up the chain what is this decision, what is this decision and they don't often add some kind of analysis of like hey, maybe we can come together on this as a group or like this is my opinion and this is how I vote in this situation and then send it up the chain. A lot of times I just see people deferring that responsibility,
47:01
that risk to the next person so they don't have to. Yeah, a lot of times you'll see they don't necessarily want to have, they want to have a scapegoat pretty much as opposed to a solution. Yeah, we see that. That's a very good point. We see that a lot. And a lot of times the pen test firm is the scapegoat.
47:23
Yes, and so, I mean, yes. And then another thing that comes into play in the actual competition but you don't really capture it during these questions is there's so much politics usually involved in a pen test, right? And like maybe the organization won't give them that access
47:40
because they don't want them to see what's on the other side of that machine. So I've seen that too where the organization is resistant to the pen tester continuing to get access, which might cause them to try an exploit like this to find out what's out there. The other thing I'd say is this goes beyond just pen testing. And even in general consulting I've run into this sort of thing where like I've been blamed for taking down websites
48:02
after doing a firewall upgrade when the firewall upgrade was canceled. So it's a whole thing where everyone is trying to look for someone to defer the blame to and we really shouldn't be looking at just blaming. We should be just trying to solve problems.
48:21
But what do I know? Just a consultant. Next one? Before you go, you mentioned that you do role-playing as part of the competition or game. Is there any time that you've done this or plan to do it in the future to test ethics to maybe put a trap door in there, a scenario where you may come across in character to try to test that?
48:41
And if so, if you're thinking about in the future, do you have an ethical obligation to let them know that that may be coming as a real world experience that could come if they're pen testing for the more ornery organizations that are out there? That's good news, everyone. You want to take it, Tom? Yeah, sure. So we have many things planned for the coming year
49:02
and the following year. OSINT, OSINT, OSINT. Yeah. A little hint there. This year, this year, this year. I would say that we do not necessarily plan on telling the teams that they will be subject to this sort of thing. We expect them to operate professionally and handle things in a way that they think is best
49:20
given their role as a pen tester working for the company that has hired them. So without delving too much into what we're planning on doing and what we aren't planning on doing, I would say that's absolutely something that we want to teach. And one of the things I will add as we're doing this,
49:41
we are not trying to set them up to fail. So in the process of that, one of the things we have been discussing at length is how do we do this in a way that gives them every hint and every chance to succeed, so that it's not a, ha ha ha, we got you, did you learn? It's more of an opportunity at every step
50:03
to get something out of it. Right, and I think that's key. Our goal as the Pentest Advisory Board in particular, like we want students to come out of this because we want to hire them, right? And we want students that have these skills because right now what we've seen, and the reason this all got started is because we're seeing that gap.
50:22
People can come out and they can break stuff all day. I have people on my team that I can throw anything at them and they will be able to break it and it's amazing to watch, but I cannot put them in front of a client at all because they have no concept of how to interact with the client in a professional way. So we do a lot of coaching before the game on professionalism and what that means
50:42
within the context of this competition, and I appreciate that comment earlier about our language. I don't know what a better way to define this is, so suggestions afterward would be awesome, but we spend a lot of time with them, coaching them, what does this mean, and then repeatedly throughout, we expect you to behave professionally, professionally, professionally, but the problem is a lot of these students
51:02
have never been in a professional environment, so they have no idea what that means, really. And that's another thing that goes beyond pen testing, where if you are in a position where you're working with clients, you could be the best technical person in the world, but if I can't trust you to not be an idiot in front of a customer, I can't have you be the first impression.
51:23
So that's something that we have to deal with beyond just as a pen tester, but just technically in general. So just to explicitly call it out, yes, we have ethical human interaction challenges this year. Yeah, he's not wrong.
51:43
And we now need to send an email to all the teams about that. Because we're ethical. Because they may not see this. Actually, that's a good question. No, we're going to tell all the teams to watch the video for this. Yeah, so I guess that's maybe a good question here, is from an ethical perspective, do we need to email or provide direct,
52:02
because there are competitors in this room right now, I won't call them out. I'm a competitor. No, you don't need to. So I guess that is an ethical question of should we, is it unethical for us to just say leave it and not say anything and say the people that were here
52:22
are in a better shape. I think it's there, but I actually think you could play the devil's advocate and make an argument of. Did you invite them all here? No, we did not. We have told everyone that follows our Twitter that we are going to be here, and we have told everyone that has registered for the competition
52:43
to look at our Twitter for news. And I plan on sharing this as an awesome thing. So as someone else who puts together competitions, social media is a very poor mechanism to share information,
53:01
especially in the context of being ethical. If you're going to say you get an advantage and you get a disadvantage for being able to have access to something, you're raising the bar of entry to the competition. So you've said you can do this, but now you actually mean you need to go all the way up here
53:21
because you have to monitor social media accounts in addition to the social media accounts that are being given for the competition. So whether or not that's realistic, it's fun to put it out there, here's this hint, but if those hints are going out there for something at such a big level, in my opinion at least, for teams who are playing,
53:41
they need those bits of information to help build the strategy. You're saying it's too disparate, too hard to access? Yeah. I think that's a great comment. He's saying the information is too disparate, too hard to access, and we almost need to provide these lessons learned directly and I think do a thinner version of the OSINT.
54:05
So do you think that if we were to, say, send an email to all the competitors with a list of things that they might want to consider looking at, would that be? Would that resolve that? That was my question also. Do you think we need to directly say, hey, there will be this type of challenge to be ethical,
54:22
or can we just say, hey, you might want to check out this talk? So I don't think spelling everything out for a team helps them. After all, the idea of a competition is to help build the skills of a student participating. So no, you don't have to spell it out, but giving everyone the same access to the information sets the entry level,
54:41
and that's where a competition takes it from there. I agree with that. I would say this talk aside, step out of character here for a second. We completely agree, but it's a good discussion. As an additional point, yes, access, but also inform them of the availability of it.
55:03
Yes, it's available for anyone who happens to know about the video, but I'm not going to put it out on Twitter, I'm not going to send an email to the teams because they've got to find it themselves. That becomes really hard, not good. But sending them out and saying, hey, here's the URL, and if they don't look, that's their damn problem.
55:20
So we don't have to provide the information directly, but we should provide the accessibility directly. Fair enough. One of the other things that's actually been coming up for us a lot as a board this year, now that we're in several years of this, is how do we also level the playing field with new schools versus those that have played before? And that's a very serious consideration that we've had,
55:41
and actually a future slide that we'll hopefully be discussing, but how do we deal with cases where some schools have more experience competing, plus there are people who have recognition of schools. You're going to know. I'm looking in the audience. I know there are people that have competed, and believe it or not, we have an opinion of you.
56:04
We judge you. Oh, no. We have a black flag being held up on that. I think that this bit quickly kind of devolved into the first chapter of Hitchhiker's Guide to the Galaxy.
56:21
Like, oh, yeah, the information's been on display in Alpha Centauri for the last four years. Why haven't you checked? But it goes a little bit back to the core idea. You started off by saying, oh, yeah, this is a competition, but we don't want people being competitive. We want this to be a learning experience, and that creates an interesting dichotomy
56:41
where you can say if you wanted to provide out-of-the-way information that only some people knew about and create, like, that insider's club, that's really good from a competitive standpoint to have more people hunting for these things, but it's not as good for learning and creating a broader base of professionals.
57:00
So that's interesting, and I agree with you. I guess the question that I have, and since we do have so many competitors or previous competitors, would you be interested if this wasn't a competition? It was just billed as learn how this works. Was that something that would still hold your interest? Or no, is the competitive part of it a key part? Do both.
57:22
Having done the cyber defense side of things, yes, pen testing, good, we also had to publish curricula and say, like, look, this is what you need to know. This is, you know, here's all the information, and we published about six or seven weeks' worth of curricula for them to go through for all schools.
57:40
So that was our attempt to also level the playing field. So I don't know if you guys do something similar, but I'd suggest it. That's a good idea. I think the networking opportunity between the student teams is one of the most fantastic aspects of the entire, I mean, probably the pinnacle of the entire competition.
58:02
What about getting them to work together? I mean, like, so maybe it's still a competition, but maybe you split the teams and, you know, two, two, and two, or whatever, and now they have to work together. We were talking about some curveballs like that. I would love that.
58:20
I would absolutely, because then you're working with, you know, these are the people, these are going to be your clients, these are going to be your customers, these are going to be your coworkers, this might even be your boss. This is a very small community despite having 30,000 people here at this conference. You know, let's catalyze that opportunity to mix and match.
58:41
Let me just say, maybe not within the scope of the actual competition, like, scoring points aspect of it, but there are things very, very similar to what you're just saying in a Google Doc that we're using for planning. Oh, great. Can you share that with me? No. Okay, we will share it with you in November.
59:06
I absolutely want to highlight that comment. These students are going to be working with each other in the industry, as much as us, but, like, they are peers, and it's a shame because I feel like the competitive aspect of it stops them from interacting as much as I wish they did.
59:21
I really, I would love it if teams sat with each other, shared techniques, like, kind of shared the stuff, but there's that competitive aspect where they don't want to reveal their hand, so, yeah, I'd like to review that. There's a very firm plan for us to have more time to interact with the students, not only in a competitive, but also in a professional manner. And I know you're going, and I'll give you a chance in just a second,
59:40
but also some of the things that we've done to try to kind of prevent that competitive nature is require things like you can't have private tools repositories. You can use whatever you want, but it's got to be publicly available so you don't have a separate advantage from other teams. So we try to even the playing field that way a little bit. And I also want to bring up, that's a really good idea,
01:00:00
happens in the professional world. I've been on incident response engagements where they've brought in my company and another company and we have to work together to find the data. So it's a real world example that you're working with competing companies. And to Jason's point, one of the rule additions that we've made for 2019 is allowing teams to use a repository that we provide
01:00:22
for any tools that they develop. That is something that has to be public. They have to document it. They have to know how the tool works and be able to explain it to someone else. But it's our belief that if you create something for the competition that's going to be a contribution to the rest of the information security community,
01:00:41
that shouldn't be something that's secret. So we wanna make that something that we can share under the umbrella of CPTC but further the community as opposed to just having teams hiding in their own circles developing awesome tools that they don't wanna share. And we figured the best way to do that was to open that up
01:01:01
and make it something that was allowed with certain restrictions. So personally, I come not from a CPTC background but from the CCDC background, but I think the idea still relates, which is imagine that I'm playing in like a Southeastern regional and somebody else is playing in a different region, there is a very, very low percent chance
01:01:22
in which I am going to interact with teams that do not qualify from another region. And even the teams that don't even qualify for their region don't interact with anybody. So one of the things that we've been thinking about and this is coming from previous work experience and we've done this before where we said, we're gonna work on a project with our entire group,
01:01:40
the whole group, this whole department, the large department. Instead of it just being your little department that's in one location, you're gonna work with, you have to work with at least another group that's in another location. So not only do you have to work with like your, in this case, it'd be your region, but also you have to work with people from other regions. So you have to learn, here's how I work remotely even, like how do I work with people
01:02:01
that don't even live anywhere near me, may work in different time zone, as well as possibly saying like, I need to work with people from other teams within my region, just to like increase the diversity of thought within like inter-regional ideas. That is an interesting idea. It's a really good idea. We'll have to figure out how to make that work, but I like your thoughts.
01:02:21
So hopefully not going too far off topic, but- That's just the whole point of this, so that's no problem. Just trying to go from the focus of ethics, and I'm gonna throw this back here, but if you are, as competition organizers, as competition organizers, you say, okay, now you have to, you no longer have your team.
01:02:43
You no longer have those people who you trained with, you studied with, however you prepared. You are now split amongst 10 teams. Congratulations. Is that something that is, to the context of a competition, is that something that's ethical? Is that to rip all the strategy and all that stuff
01:03:01
and force teams to do stuff that they don't want? Is that something that is actually ethical? So we just got a card for ethical and unethical. Come on up. I think, before we do, I'd like to see. See everybody. Let's see everybody. Including us. Yes? Uh, hm? So can you just restate the question, please? Would you wanna restate the question?
01:03:23
Actually, here, I can do it. Is it ethical, under the auspices of a competition, to force everyone to randomly work with other people knowing that they have trained together? We even have some disagreement up here.
01:03:43
How would you score? Well, I'd like to hear from the audience, from one of each. Matt, get up there. Like, a lot of us are millennials. We just think everyone wins, right? Well, bad joke. Whatever. If it's the NFL All-Star Game, that's one thing. If it's competition, you're training as a team.
01:04:03
The team is competing, not the individual. So, all right, now you get to work with different people that you've never worked with, but you're still being judged as a team. You have no control over who the other people are. You've never worked together. You might, out of random luck, you might get the lame-ass team.
01:04:20
So, to clarify, for those that don't see your card, you're holding unethical cards. Yeah, I'm holding unethical. Okay. Just like the real world. Yeah. That is why I held unethical, because I agree with that. I think if you train with a team, you should play with that team. So, if it's a different thing, like the All-Star Game, which is not part of the regular season, it's like, hey, let's have the Christmas.
01:04:41
And then they do a, it's kind of a pickup game. That becomes ethical, because it's not counting. It's more fun. So, basically, the difference between the World Series and the All-Star Game, or? Bingo. So, I'd love to hear from someone that said ethical. Yeah, so it's still in the context of, in the competition, we are still, our university team is this random team,
01:05:02
or it's just like, we are brand new teams that we just, that we are all. Brand new teams. So, if we took one person from every team, and I'll put them on this collective team. Yeah. We shuffled all the teams. So, I don't see an issue with that, because it's not like we're representing our university. We're representing this team that was just formed. So, we're pretty much all starting off
01:05:21
the same level playing field. So, there's no, yes, you may get someone that isn't as good, but basically, everyone has that same percentage of risk, where they may get someone that's really good, someone that gets someone that's really bad. And overall, since we're not representing our official university, or whatever you're doing, I see no problem with it, especially since this competition is for learning. So, I don't see, just because you train with someone
01:05:41
doesn't mean that you can't work with another group, especially if you're not officially representing university, I see no issue with it. And I've done the competition before, so. Yeah, and that's why I said ethical, too, because I kind of agree with that. I don't train as a team with my colleagues, right? It's all just a random grab bag. So, it doesn't make a difference if I'm training with, working with somebody at my company, or some other company.
01:06:01
I've never worked with them before, anyway. And so, the competition environment, like CCDC, where you're like, okay, we need, somebody that knows Linux really well, we need somebody that knows Windows really well, and the teams do come in with that kind of breakdown, but that's not how it works in a company. That's not how that works at all. To add to that, in real companies,
01:06:21
are you gonna be pulled into, you have other teams pulled into your work, and you guys need to go, we have a detection monitoring team, and I work in our team for a company I currently work at. We have instances where we're both working together, but even though we're still relatively close, we still have to interact with someone else that we usually don't, we don't do day-to-day work with, we don't know what their skill level is. Do I know if this person knows what Plaso is,
01:06:42
can they do forensics, can they do anything, can they read PCAP, do they know what logs are? So you don't know who you're working with, which is really sad. I wonder if it would change the internal dynamics of the team, because then you would get one person from each team at the school that won, versus a team that won. You could potentially have a scenario
01:07:02
where the team that won represents three or four different universities. Right, it would be one person from every university. Yes, and it would be a new team. But imagine the opportunity for cross-pollination between these programs. Many of these programs are focused in very specific areas, they have a very specific methodology or strategy.
01:07:23
If we can remove the barriers to sharing information and change the, I mean, ultimately, somebody has to win one of these competitions by points, but if you look at these competitions, CPTC, CCDC, ultimately you're actually not competing against each other,
01:07:40
you're actually competing against some external entity, right? In CPTC, you're competing against a red team. There is nothing in the rules, I'm not giving you guys any ideas here, but. CPTC or CCDC? CPTC, there is nothing in the rules that says that we can't collaborate. Sure, yeah. And the same is true for, in CCDC,
01:08:01
you're essentially going up against a red team and you're all kind of at the same starting block. There's nothing that says that blue teams couldn't collaborate together to, yeah. But we approach this from a PVP point of view, we need to switch, I think, to PVE. Right, and I think the opportunity there
01:08:20
would be something really special. One of the reasons I really like that idea is because some of the schools that come into us don't have security programs, they've just got students that enjoy security and there's somebody on the faculty that is willing to work with them and kind of foster that. And so they come in and they don't have that. Stanford has no undergrad security program at all.
01:08:43
And you're not unique amongst other teams that are similar, so. And we've actually seen schools that have strong business programs do well in the event because they have the technical skills that people get out of passion
01:09:01
just from working with it. And then they also have solid business skills that the technologists kind of suck at. So that's worked out really well. We've got this incredible diversity of, you know, outstanding polytechnical institutes, outstanding theory institutes. Let's get these people together and start working together. Yeah, that could create lifelong friendships too.
01:09:21
Like those people will probably trade information, you know what I mean, stay in touch. The people that I've met through this event and this competition are, you know, they're gonna be with me for the rest of my life. We love curve balls too, so I would like getting ideas. I'm the director of the Michigan Cyber Engine. We've struggled with kind of leveling the playing field.
01:09:42
I've hosted numerous competitions and, you know, how do you really level the playing field across teams? You can either, one, randomize the teams so you'd have a diverse team, or you can say, hey, pick your own team, kind of force that diversity upon them. You should, if you want to win, you need a diverse team, right? We all know that. So it's kind of, you choose your poison there
01:10:03
and we've hosted a bunch of exercises, as I said, and one of the things that we've done is kind of leveled the playing field by giving everybody the exact same operating system with all the same tools, right? I mean, that's another way of doing it. So it's really, you know, it's an interesting problem. In the end, you know, it's all about skill
01:10:23
and experience for each team, right? So you can never really, truly have this perfect level playing field, right? That's why you have winners, right? Anyway, thanks. No, and yeah, I mean, we do, so, the teams are all provided, all of the systems that they perpetuate their attacks from and start from are inside the virtual environment we build
01:10:43
so everyone does have the same image, but you're correct, right? There's a wide variety in skill sets, in creativity, in business, in communication skills, in writing, and it's very tough to take that diversity at a level of where you have smaller teams.
01:11:01
Do you give the teams access to the material? We haven't because they're branded, but they're branded by team, and we are working to provide some anonymity so that we can. And we also want to be able to release some of the things that we develop for the organization
01:11:21
so that there's a library that other teams can use to prepare and... And other competitions. Our tooling's public, but our secret sauce right now isn't, and we are working on making it so that we can do that, but it's one of those things that happens in our free time, free time.
01:11:42
Which is even smaller than our free time. Oh yeah, yeah, so I'm glad that a lot of people seem to be bringing up concerns about the whole kind of information siloing aspect of this, because when you've got a team that's got years of a legacy with success at not just CPTC,
01:12:00
but other competitions as well, you end up in a position where they can just kind of get the snowball of success to such a size that it just steamrolls everyone else. And what I've heard so far being brought up as far as forcing teams to kind of integrate together with their rosters, I think that's an interesting concept,
01:12:22
especially since that's something that people would see in a real world scenario in different parts of a company. But I also think that there are ways for you guys to attack this from kind of the competition organization standpoint as well. For example, with releasing old materials from teams,
01:12:41
obviously that puts them at a disadvantage, because their strategies for success, those are going to get released, but at the same time, that puts them in a position where they have to conscientiously decide if that's a trade-off they want to make. Do they want to put that forward in a scenario where it's going to be eventually released
01:13:00
for the sake of winning this competition this year? Or in another example, maybe you want to split up the network environment where it's like, you'd get assigned to sort of collaborate with another team, you're not necessarily integrating throughout the entire rank of one team. This is team A, this is team B. Team A is going to look at one part
01:13:22
of the network environment that might have certain information pertinent to the other part, and then when it's time for the two teams to trade off, they would then share different tools and techniques and information that they've found over the course of their assessment, at which point they gain exposure to the other teams, like TTPs and things, and relevant information,
01:13:43
and they're actually collaborating and working together. That's a really interesting idea, and that correlates to when I go into a company and do a pen test, and they provide me the last pen testing report that they had, and it was from a different company, right? So that'd be pretty easy to work into a real-world scenario. Let me ask you this question, though. We've never previously, that I'm aware of,
01:14:01
told teams that we would make their reporting, for example, even anonymized, public. Would it be ethical for us to do that, or is this something we should look at in the future, and we make, from here forward, we put something in that says, this is something that will happen? We'd love to see the cards. I think to retroactively go back and do that
01:14:22
without their permission would be unethical, but if you get their permission, that'd be fine, and definitely, going forward, having a clause in that. Totally fine. I would have to say that I don't know if there's anything in the rules that would prevent us from doing that. Yeah, but as we've been talking about. I just wanted to add a quick note.
01:14:41
That would take it from PVP to PVE to cooperative. Co-op, they need to work together to solve the challenge. I really like that. Two things. I guess on the reporting side, I guess companies with pen test reports, that is part of their identity. That's their intellectual property, so if you were to actually publish that, that would be pretty bad. But another thing is, one thing that they did in CCDC,
01:15:02
at least starting a few years ago, is that we have a threat intelligence exchange where the captains are people from each of the teams. They sit in a room for an hour, and they go and they talk about, hey, what'd you see? And you basically give the other blue teams things like, oh, I saw this on this box doing this, et cetera, et cetera.
01:15:22
And that was something they implemented last year. But that could be really cool if you guys could think of how you would implement that, at least in terms of CPTC, some sort of techniques or procedures, tactics, like, hey, we discovered some sort of old building on this box. What about a forum? A forum would be good. I think a forum would be good.
01:15:41
Like an open forum for like 30 minutes where whatever the product manager or whoever the lead consultant is goes and talks about what they found, or it kinda gives like a short summary, like a high-level review. Very good ideas everyone has. First, good to hear that someone actually liked that implementation that we have put in.
01:16:02
We never really hear feedback. But the other thing is, with a similar topic, if you, we all are talking about whether it's okay when things aren't documented, and for ethics of competitions and things like that, is it ever okay,
01:16:20
and obviously the whole room can answer this one, but is it okay that for us when, as a competitor, you, your strategy is to find the little details, and everyone who's competed, you know you're looking for those in the rules where it misses that one word, where it's this little piece of information
01:16:41
that you can just sneak by so you can get a strategy that is successful. And everyone's laughing because they know it's true, but that's how a competition works, at least these competitions work. Now. Hold on. I'm gonna find something real quick. Okay. There you go. Oh, I thought there was a bunch of screens that were. No, no, no, no. This, this is what we've added to the rules.
01:17:05
Good job keeping that. Can I take a picture of it? Yeah, it's on our website. I get what you're saying, Joe. I competed in an event in college. It was like my last week of college. It was called the Great Race. We had to build these robots, and I hacked the competition. We couldn't change the motors,
01:17:21
and we were fixed on these axles, and I changed the wheel size, so my robot went way faster than everybody else's robot. It's basically mechanical gear ratios, right? Anyway, we won on a technicality because we hacked the rules, and I felt bad afterwards because I looked at everybody else, and it wasn't fair, you know what I mean?
01:17:41
We found a flaw in the rules, and we took advantage of it, and it wasn't the same competition. Yeah, the CCDC team I was on at RIT is responsible for some rules at the national level. So that's part of the things that are going through our heads when we're actually trying to develop this sort of thing, and that's honestly the reason
01:18:02
we have something like this that we've added. So those nuances we can say, we know what you're trying to do. It's not something that's professional, and we don't want that to happen. There was a second half to this, and that is that we know as organizers that teams are doing this. We know that they're trying
01:18:20
to find a way around the rules. When we do find those, I know there's no clear-cut answer, but as organizers, do we, if we're going to be completely ethical, then the rules must state. If you have violated the rules, and we've deemed you violating the rules like what you say,
01:18:41
then you must be disqualified or punished in some way, because that is how the rules work. However, in the reality, it's hard, because sometimes it's unknowing, it's accidental, and that's what this whole talk is about, so I'm just throwing that out there. How do we as organizers find that balance in our own policies to update them
01:19:00
and make them better before we make something that's 100 pages of rules? So just a thought. I was just gonna say it's a gray area, and that's why I like to take the soft approach, because it's a learning opportunity for us even as we go through these scenarios. I like to give people the benefit of the doubt, and I don't always assume malicious intent,
01:19:22
and I try to make it a learning opportunity rather than a you're kicked out thing. I would agree, oh, go ahead. I would agree on the professionalism side. It's how the team interacts with us in character when something like this would come up. And that's an advantage that we have that maybe CCD does not have, because our focus isn't just on technical,
01:19:42
can you keep it up, can you find the holes, can you do all this stuff? In fact, that's not even the bulk of the scoring. The bulk of the scoring is your report, how you're interacting with the CPTC team during the engagement, and it's those soft skills. And so it's a little bit,
01:20:01
we have a little bit less of a problem there, I think, than some other schools where it's kind of, I wanna find the technical rules, because you can't find technical problems with soft skill, right? It's a little harder to do. And the other thing that I would add, at least what we have tried to do, our approach thus far has been that we look at rule violations by committee,
01:20:20
by advisory board, and sometimes, in fact, we'll even bring some of our sponsors in, maybe not read them into every intricate detail or who's responsible, but to ask their opinion, and almost go a yes, no, where are we at? And I think that's probably one of the stronger pieces we have to this as well, is just, and there have been some times
01:20:40
we have been vehemently disagreeing with each other, but we generally come to a consensus. I can't think of a situation where we voted in the end. But there may be very strong, aggressive debate. And I guess just to directly answer your question, it has resulted in rule changes so that we can address it.
01:21:00
Yeah, because we want to make it a valuable educational experience. We don't want people looking for things that we screwed up by not accounting for in the rules, like a comma in the wrong place or something like that. We're not lawyers. But realistically, if you're actually doing this in the real world, whatever the customer wants is what is the end game.
01:21:22
So if the customer decides they don't like you or didn't like the work that you did, or you did something that was against whatever they decided they cared about, that doesn't matter if you're right or wrong. It's what the customer thinks. And we're the customer ultimately in this event. I just want to say rules of engagement
01:21:42
is super important, and you have to be explicit the entire time. And one of the things I've started to dabble with is kind of putting in a disclaimer at the end saying, my rules of engagement may evolve during the course of the exercises because of unforeseen bullshit that can happen, right? So you can't just say, here are the rules of engagement.
01:22:02
You need to phrase that a little bit differently in professional terms. Exactly. If a team put that in their report, we might dock a few points. So we just call it technical terms. So that's interesting though, because as a pen tester, I hate when a client does that. Right. I come in, and those rules of engagement are what I'm coming in under, and I can't stand it when a client does that.
01:22:22
So I don't disagree with you, and it's not a professional environment. It is a competition, so it's not the same thing. But in a world like this where we're trying to emulate the real world, I almost feel like we shouldn't do that. Let's take that to the audience real quick. Let's take that to the audience. So you're contracted for a pen test,
01:22:42
and then they change the rules of engagement mid pen test on you, ethical, unethical. There we go. Ethical or pain in the ass. There's a big difference between, yeah, I agree. Okay, now. Change it without agreeing.
01:23:02
You wanna go up and talk about that really quick? Yeah. Because this is. We've done pen tests, and the customer, halfway through the pen test, they're like, oh shit, don't do that network. Because we forgot that that shouldn't be there. That subnet's not owned by us. And also, as part of our pen test, I mean, previous question, we popped a fetal heart monitor
01:23:22
that was connected to a patient. We freaked the hell out the moment we knew what it was. Talked to them, and they're like, oh no, no, it's okay. We're like, mm-mm, we're not gonna do it. And pulled that out of the statement of work, because we're not gonna do that, and made it a separate statement of work, not connected to a patient kind of thing, because it terrified us.
01:23:40
Yeah, that's awesome. Yeah. So one of the other scenarios we actually had was, that I think is very pertinent to this, is client asks you to remove something from a report. Not just change in scope, but. First, before we put that out there, how many of you have been in that scenario in your professional life? Yeah, change of scope happens.
01:24:01
No, not change of scope. You've got a vulnerability on a report, and the client comes back and says, we don't want that on the report, take it out. You almost need an oath as a pen tester, or even as a doctor, like it's gonna affect tens of thousands of people. What are you gonna do? Yep, yep. But what if it's not, we want you to remove this, what if it's, I have all these mitigating factors that we haven't tested, and I think it should be reduced
01:24:22
from maybe a 10 to a five. That's what we're here for. What if we didn't test that? That's a very good, very good. Oh, so the, thank you, the comment was,
01:24:41
what if they fixed it after we tested it? Yeah, what if they fix it, we retest it, and it's completely fixed, do we still report on it? So before the test is over. Before the test is over, that's good, I like that. They're doing their due diligence, was it too late, you know,
01:25:02
The stakeholders need to know that they weren't doing their due diligence in time. So let's see cards: remove it from a report, ethical, unethical? Unethical. It almost depends. Okay, so let's go with some of these others, reduce the risk rating. Quick question, does it depend who the report is for?
01:25:23
Like if it's just for the client, just for a. Or public. Yeah, I actually don't think so because it would be used internally. But that's my opinion. My opinion is similar to that though. So what about reducing the risk? What about removing it after we retest it
01:25:41
and find it has been resolved? So I can tell you how I handle that on my reporting, is we leave the finding. When we find a finding, the finding's there, period. If you mitigate it or if you have other factors that reduce the risk, we'll note that in the documentation.
01:26:01
We've actually got a special section for client response and we also have like fixed things. So we'll add notes and we'll change the status from like uncategorized or whatever to fixed. But it's on the report because it was a finding when we were there. At the point of time that you did the test. Correct. Can I just make a comment about the,
01:26:20
going back to the rules? So I think this violating the spirit of the competition is the key here. And there have been things that we've found or done where we internally voted it down. That it was not something we were gonna do because it violated the spirit.
01:26:41
Even though it might have been something we could have gotten away with. So I think there's some expectation that teams self-police on this. I know this isn't the real world, but we're all adults, and there should be an expectation that you're a professional and that you self-police on these things. That said, there is a certain beauty
01:27:01
in creative approaches. And I would just encourage you guys to not write so many rules or become so draconian or stringent that it weeds out that creativity. And in some of the competitions, some of the most interesting work that we've done
01:27:21
in preparation for a competition and during the competition is creativity around gaining a competitive advantage and going right up to the edge of the rule but not crossing over. And they learn more from doing this than they would learn from learning how to instantiate local firewall rules as quickly as possible
01:27:40
or change all the passwords as quickly as possible and remove everybody from domain admins. I mean, we all know how to do that. We can all script that. That's kind of passe. So don't lose the creative aspect of this that forces people to think about things in new ways and iterate and evolve.
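The kind of defensive scripting the panel calls passe, rotating every password and pruning the domain-admins group as fast as possible, can be sketched roughly as follows. This is a hypothetical Python illustration against a toy in-memory user table; a real team would script this against Active Directory or local accounts, and all names here are made up for the example.

```python
import secrets

# Toy stand-in for a domain's user database; a real script would talk to
# Active Directory or the local account store instead.
users = {
    "alice":   {"password": "Winter2019!", "groups": {"domain admins", "staff"}},
    "bob":     {"password": "hunter2",     "groups": {"staff"}},
    "svc_sql": {"password": "sa",          "groups": {"domain admins", "services"}},
}

KEEP_AS_ADMIN = {"alice"}  # the one account the blue team actually controls


def harden(users, keep_admins):
    """Rotate every password and strip unexpected accounts from domain admins."""
    new_creds = {}
    for name, acct in users.items():
        acct["password"] = secrets.token_urlsafe(16)  # random replacement password
        new_creds[name] = acct["password"]
        if name not in keep_admins:
            acct["groups"].discard("domain admins")
    return new_creds  # hand these to the team out of band


creds = harden(users, KEEP_AS_ADMIN)
admins = sorted(n for n, a in users.items() if "domain admins" in a["groups"])
print(admins)  # ['alice']
```

The point of the speaker stands: this is mechanical and easily automated, which is exactly why it teaches less than the creative, rules-edge work described above.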
01:28:03
Yeah, I totally agree with that. And at the end of the day, as pen testers, that's what we're paid for: our creativity. When I'm hiring people, that's what I'm looking for: can you think broken, and can you apply that creativity in a professional environment? And that's what we're trying to foster, and I think most of the teams here are similar. They all do a really good job self-policing
01:28:21
and one thing that I really enjoy about this competition in particular is that it has a very strong sense of community. All the teams competing have that kind of community, at least from my perspective, where it's like, we enjoy doing this, and we're here together to learn, not kill each other. I think the openness and flexibility
01:28:42
in approaches is a key reason why any team can be successful in this engagement. There are many possible ways to win this competition, and I would say don't create so many rules that it takes away that incredible facet of it.
01:29:01
Definitely. Definitely. And I would say part of the reasoning for the rule that allows teams to use publicly available tools that they create is to kind of address that. Yes, we are seeing scenarios where some of that creativity would be, for example, the intent of not making it clear that they're using something that they pre-staged.
01:29:22
And this is actually an area that you and I debated quite a bit on how we wanted to approach it. But our philosophy was, yes, you would be potentially running into a scenario where teams are being very creative and getting around potential detection mechanisms
01:29:40
or something in order to get an advantage, kind of scooting around what the rules might allow. And our thought was to make that a much wider scope of what was allowed, to remove some of the barriers to entry, remove some of the secrecy, and increase the collaboration. Yeah, I mean, in the professional world, if you're hiding your defensive expertise
01:30:02
because you think everybody's gonna learn your secrets, you don't know that much, I mean, so. Just like you're not making your own encryption. Yeah, exactly. So open this up, level out the playing field, and let the best team win. I do wanna just add, I think, we have all these issues up here and we've had issues with teams,
01:30:22
but at the end of the day, I can't think of a single team that I really had a negative overall experience with. They do self-police, they are amazing. Yeah, it's been a great experience. And there have been so many times, myself, Lucas, everyone here have been walking around
01:30:41
and we've been approached by students who have gone through this program and are just raving about how much it has been a positive impact on their career and experience. And that is the best, most touching thing for me to have that positive impact on people's lives. Okay. With regards to what the last gentleman said about the fetal heart rate monitor,
01:31:02
would you guys, as the clients, ever ask the students, as the pen testers, to perform something unethical? Like, say phishing was allowed. Who are they allowed to phish? Like, what does that mean? Is it just company email addresses, personal Gmail accounts, their kids?
01:31:21
How far do you go? So, last year, we had an insider threat scenario, and the insider threat actively tried to get people to delete data, destroy data, to kinda cover his tracks. Many of the teams reported this to a, we had this other investigator,
01:31:40
this third-party investigator that we were bringing in, many of the teams reported this. Some of the teams took the data back to the guy. So, we've asked them in non-direct ways to do unethical things through this insider threat guy. But in the case of, let's just say, there was an environment where there could be something that could be completely unethical
01:32:02
that could be somehow within scope or be considered appropriate for the pen test, we may totally ask the teams to do that. And unlike some other events where, you know, saying no is frowned upon, we would actually appreciate if the teams were to say no about that.
01:32:21
And this is not ethical, we don't want to do this, this is why. And that would be actually something that would result in the teams getting points. And doing what we ask would be putting the teams in a position where they're doing the wrong thing and losing points. See, I was just gonna say the exact opposite to that. Okay. I was gonna say, like, specifically that scenario
01:32:41
of you've got something in this scope and you find out, oh, I am not okay testing this specific thing. That's an interesting idea to put that in. I think we could, and I think we maybe should put it in just as an exercise, but I don't think I would want to score that. I really, I do like it from the perspective
01:33:02
of somebody in a position of authority, right? Somebody that's your boss telling you to do something you are not comfortable doing. I really like that scenario. Because that happens to us professionally. Where clients try to say, you know, I'm the client, I'm important, you're doing this. And we have to push back on it. And that is where one of the things
01:33:21
that we have to do in the competition, though, is to determine, as Dan said, is this scorable or not? If we are setting up a teachable moment that is, we feel like the majority of the teams are gonna have a difficult time with, the way we score it may be very different. Because our intention is to teach, but we still are running in a competition, a game.
01:33:42
And although, I know we've talked a lot about how we message this, what words do we use. So I was trying to think, as I'm sitting here, of a better way to say this. We do keep that in mind as we're going through this process as well. Yeah, and I think, in some cases, it's pretty black and white, right? You went outside of scope, and we have the scope defined.
01:34:02
That's pretty black and white. We can score that, because it's pretty clear what the ethical implications are there. Fetal heart monitor, I feel like that's pretty black and white. I think most of us here felt like that was pretty black and white. But again, as soon as we flipped that scenario a little bit, and now it's just money that's involved, suddenly we started getting differing opinions. And I think if you put the students in that position,
01:34:22
that's why I don't think I would want to score their responses, but I think it is a useful teaching experience. And something like that, where we're trying to teach something, and almost expecting the students to not pick the right thing, that is one of those things where we have a duty to make sure we communicate that teaching experience afterwards,
01:34:42
as part of the wrap-up of the event, or something like that, where everyone gets to see, this is why we did this. This is what we wanted to teach. This is how we think you should handle that professionally, because they're going to face that later in their careers.
01:35:00
Cover one more scenario. Yeah. We've got about 10, 15 minutes left here, so. So do you like this one? No, malware triage. Malware triage? Okay. Yeah. Okay. You want to do that, Dan? So this was a really interesting event. It was during our healthcare thing. We had some sponsors build part of the infrastructure, and they, in this infrastructure,
01:35:22
they planted malware, and they had a breach, and they didn't tell us about it as the competition organizers. And in the role that I was playing, I was playing the director of incident response. And basically I had multiple student teams bring this compromise to me, at which point I didn't think we put this in the environment. So originally I didn't believe them. I said, you need to bring me evidence
01:35:40
and convince me of this, otherwise you're kind of wasting my time. And specific teams, they didn't just bring me evidence. They showed me the evidence, but they had a conversation that was just different than the way other teams handled it. They would show you the evidence, they sat down, they explained what they found, and it was undeniable. It was really a good experience for me,
01:36:01
because then I was like, holy crap, this is here. I got to go tell every other team that found this that they're right. So the idea is, how do you tell a client something when you have evidence, and they're convinced you're wrong, but you're right?
01:36:21
So just to get some feedback from those of you who do this, have you run into a situation before where you found something, the client doesn't believe you, and how do you handle that? I mean, this is a regular thing that we deal with, which is, well, we issue a report,
01:36:41
and it's, you didn't do that. Well, we did, we have the evidence for it. No, you couldn't have done that. We would have caught you. So, and this also goes back to some of my firewall admin days, where half of my job was fixing firewall problems, and the other half was proving it wasn't a firewall problem.
01:37:00
And it's the same kind of thing. You can't just walk in to a client and tell them, no, believe me, I'm right, you're wrong, you're an idiot, because they're not gonna be a client anymore, even if you're right. So I think Lucas provided the key there, taking it outside of the context of malware and just putting it in the context of providing a client with evidence
01:37:21
that whatever you say you did, you did, and how do you handle that when they're coming back and saying, no, you didn't do that? How do you handle that? I mean, for me, it seems pretty clear, because I just, I document it and say, there you go. Here's your CEO's email, yes, I did.
01:37:41
Yeah, the team that had convinced me of this, like, they basically sat me down in the room, and they were like, yeah, check it out. It's like right here, and I was like, oh, crap, and then they're like, this is the system it's on, whereas every other team just told me, right? They were just like, yeah, you know, you're owned. There's malware in the environment. I'm like, where, you know what I mean? This team, they're like, hey, come here. Check this out, sit down. So I think this may go into the bigger question
01:38:00
of what is the actual role of the offensive team? Is it to document the results, or is it to convince the company that their results are correct, or rather that their results are fully actionable? So is their job to push change within the company or to show that there is a problem and that the company should enact change? I think it's another one where the answer is yes.
01:38:22
I mean, from my perspective, I feel like it's our responsibility as consultants, right? Because as pen testers, we are consultants. So it is our responsibility, at least in our work, for my part of the world to raise the bar and increase the maturity of my client and help them understand their risks. So if I am not doing that, I am doing them a disservice.
01:38:43
And I have had multiple engagements over the years where a pen test has become incident response. I think that was the subtle difference here, where the finding of the malware would trigger an incident response process that would kind of stop the pen test. I get what you're saying, though. Normally, you would just put it in the report,
01:39:01
and you're just presenting the evidence and leaving it up to the company to believe it or do something about it, right? But here, as the director of incident response, I said, if you find a compromise, let me know immediately so that we can do something, right? And basically, multiple people were like, hey, this is a compromise, and I just didn't believe them, I didn't have the evidence, et cetera, which happens all the time in IR, right?
01:39:22
Like, you get people saying, this is weird, this is funny, this is an alert. You dig into it with the things they give you, and it might not be, right? I mean, I've had also situations, it's very situational, right? What do we put in the report and leave for later, and what do we run into the client's room screaming and yelling, hey, you need to fix this? And it might be, I've had a client
01:39:41
where they've messed up their NATs, and every single port on every single host that they have in their DMZ is wide open to the internet. Yeah, we tell them about that pretty quick. But if it's some sort of injection on an internet-accessible website that really doesn't get me to any information or data, then, well, that one is a tougher call.
01:40:03
And maybe at that point, the question at least coming from the competition side would be how much do you want to push the team to try and argue that their point is correct? So if you're having a, if you, as the organizers say, like, we're gonna make them argue that this thing is true,
01:40:20
and we're not gonna believe it for a while, so how far do you push them? Do you push them like a little to say like, oh, no, we don't believe it, and then they come back with some evidence, and okay, we're starting to believe this, or is it like, we don't believe this, come back with like your dictionary of everything that you come up with, and. And it's.
01:40:40
That's the line, and it's not how much do they push us, it's how do they do it, right? And it was the elegance of the soft touch of sitting us down and showing us, rather than pushing evidence on us and having it to be adversarial. And this comes down to language, too, where you're using the word argue.
01:41:00
There's different definitions of argue, or nuances to that, and argue as in, you're wrong, believe me, damn it, is not the argue we're looking for, but argue in the sense of providing evidence and helping the client follow the logic and reach the conclusion that we're hoping to achieve.
01:41:21
That is exactly what you're going with, Dan. And I think from a teachable perspective, because I think there was a component of that in your comment as well, I really feel like, so as everyone has said, I play the pain in the butt part of the client a lot or the aggressive portion or the angry bad cop, whatever you want to call it.
01:41:41
And there I may push really hard, but then someone else from our team may come in and provide some support. So again, there's teaching, and then education. There's the expectation, here's a hard lesson, and then someone immediately comes behind, or 20, 30 minutes later, to say, hey, we need to get this together,
01:42:01
the boss was upset when he came in, or hey, the engineer was upset, but let me talk to you about what we actually need. So there is certainly some lines, and we push them in different ways, and I'll be honest, a lot of times we are doing and making those decisions on the fly, and that's why, again, we don't really go into rooms alone, we're talking with people,
01:42:21
and then we actually do gut checks as we leave, and we're walking from team to team to say, okay, we need to keep this consistent amongst all the teams, now what do we need to do to go back to maybe change how we approach this for everybody? It's very dynamic, how we do this, and I think the competition and the process allows us to do that, which is cool,
01:42:40
but we have to be very self-aware in it as well. I think ultimately you're right. If you believe something, you should push for it, and it's our responsibility to run it to ground. I think we have time. We could probably do a quick scenario. We got one more, I think. Which one do you wanna do? Let's do the next one.
01:43:01
Okay. So I'll talk a little bit about this one. So one of the things, this isn't something that's actually occurred for us yet, but it's something we've been talking about a lot. So we have these coaches. They're full-time employees, professors, engineers, administrators. We have a lot of different folks that are the representative from the university that comes with their team.
01:43:21
You have to have an FTE come and join them, certainly from a liability perspective, but we also then have this unique opportunity to have people and leaders from around the country and from these universities on site with us together. And so the first couple of years, we actually tried to do a little conference for them
01:43:40
and have everybody, each one of the coaches, bring something to present to the others. And one year, actually, they just said, can we just work? We just wanna work. We're here, let us hunker down and just work. But what we actually did last year is we gave them access to a part of the environment, basically the regional environment, while the students were doing nationals.
01:44:02
Team 11, coach team. Yep, we had team 11, the coach team. And they actually really enjoyed that. But we've had a lot of internal debate on, from the perspective of a coach, are we creating an ethical dilemma on the coach where maybe they identify something? And even though we are all professionals,
01:44:20
they are now in the dilemma of, could they inadvertently, maybe not maliciously, but inadvertently, help their team as they're at dinner or as they're talking about things? So from our perspective, what we are going to do is we're gonna try and make it so that the coaches now get a year delay. So they will get last year's environment. But I think there's some good conversation around,
01:44:42
even if it's accidental, what are some of the ethical dilemmas in that? So first, let's see, coach team, if they have an older environment, is that ethical or unethical to the competition? Pretty easy, hopefully. What if they have the same environment, ethical or unethical to the competition?
01:45:01
Would somebody who marked ethical and somebody who marked unethical want to talk? Let's start with ethical. Ethical or unethical? So what's the delay between you guys giving them the environment and then us actually competing? Real time. Let's say, yeah, we gave them
01:45:21
the same time you all started. All right, so I personally don't think it matters because when you go to write the report, you have to have support of your evidence anyway. And so by the time we finish our day, even though they may know something, we still have to write the report. And if we don't have evidence for it, it didn't happen. So but we don't control, okay, I see your point.
01:45:41
Yeah, so even, let's say we go through our days and our coach is like a super guy, whatever, hacking, which is not true, by the way. Let's put it out there, for the record. Yeah, let's say he's like, you know, he turns into like a super hacker guy, you know, like 31337, you know, zero-days all day. And he's like hacking.
01:46:00
Don't forget, we're going to email this out now. That's fine. Okay, all right, he goes and hacks everything, and he's like, all right, guys, I got all the cheat codes, here's what happened, everything here. And he tells us, you know, the night of, we're writing the report, well, it doesn't really matter, because we didn't screenshot any of the evidence. We can't prove that we did it. And we can't tell the client, like, hey, we know there's a vulnerability in your environment. Sorry, we can't tell you how we did it,
01:46:20
how we found out about it or prove that we actually exploited it, but it's there. It's awesome. But what about if you have evidence of something but you don't have a great way to substantiate that? What do you mean? Well, not evidence, they write something and they don't substantiate it. Well, what if, no, let's just say you were competing. You found that there was something there
01:46:41
but you couldn't necessarily explain why. But your coach figured it all out. We found something and we couldn't explain why. Yeah, yeah, like there's evidence of something or you have a hunch. You have some screenshots that could be somewhat crafted to potentially indicate that. Well, then that would, us being like us,
01:47:00
that would be unethical to file. You know, you can't write a hunch in a report. You can write, like, we believe X, Y, and Z happened. I can say that teams have definitely tried to write hunches. Yeah, and they're not gonna get points for that. So the technical scoring happens with just a couple of us, and it's those of us that are,
01:47:21
not that we all don't do pen testing, but there's a couple of us that are super hardcore, ridiculous people. And the scoring comes down to us and I can tell you, there's a whole lot of hunches in those reports that we go through and none of them get scored. And so what happens a lot of times is the teams will see their scores and they see the technical and they're like, we had way more findings.
01:47:41
And we're like, yeah, but none of them counted, because you didn't put any evidence in, and it does not count as a finding if you don't have that evidence. Yeah, so, I mean, at least on our end, we only put in things that we have 100% verifiable proof and screenshots for. There are even things that we redact out of our reports because we don't have enough supporting evidence for them. Thank you. So yeah, so if you don't have the proof for it,
01:48:02
it doesn't really matter, because you can't put something that you just believe happened in your report. It may not be true, and no one's gonna like that. I think we have like one minute. So we've got about one minute left. So I know, sorry, Alex. We can talk after. We can talk after. So there's a couple of real world lessons
01:48:21
we wanted to talk through here. I think, you know, bringing this all together. The human element is very important. How we handle some of these disagreements, and our attitude, are probably just as important for those of us in the real world as the technical information and how we approach it. More importantly, we as the educators of this competition
01:48:42
are trying to teach people speaking skills, interpersonal skills. And so we've talked about this a lot, right? It's not a competition, except it is. And so this is something that we are constantly, constantly debating. I don't know if you guys have any other closing comments
01:49:00
on this otherwise? I really do want to call out that seventh bullet specifically, about remember who you're working for. We'll often see teams present and say, your IT staff sucks. And like, for me, role playing the customer,
01:49:22
and you're just telling me that I hired shitty people, you're fired. You know, we don't want to see that sort of thing. And that is a very valuable learning experience to have. How do you approach those sorts of situations where yes, maybe your client, your customer isn't all that good and they have poor security practices,
01:49:40
but how do you present that in a way that's appropriate for the audience and the management of those people that are making poor decisions? Just closing remarks, I wanted to thank everybody that came here. I got a ton of great notes that I'm going to incorporate back into our program. So thank you for your thoughts and your comments
01:50:00
because I thought we had a great discussion and I'm going to digest them and actually apply them. Yeah, this is a phenomenal audience and we really appreciate your participation. So last couple things here. If you want to get involved, we'd love to have you. We need to build profiles, we need to write,
01:50:20
we need to grade reports in the fall. We are in the process of building the 2019 infrastructure. We're in the process of planning the 2020 competition. So if you want to get involved, this is a call to action. If you want to go to the next slide, be an ethical influence. This is our really final comment here,
01:50:41
is be the influence that we want to be. The whole purpose of this competition, and this is our, minus the little edit here, is essentially our mission statement is to be an influencer and an educator to create the next generation of cybersecurity professionals. That is the goal of CPTC. Now, we also need to be ethical about it
01:51:00
and that is really what we want to do. So we'd really like to hear from you on either ideas that you've got or if you'd like to get involved. For prior competitors, we love you the best. You know what you've been through. If you are no longer a full-time student, we'd love to hear from you as well. But anybody in the audience or anyone that you know that you think might want to get involved,
01:51:20
we have regionals all over the country, but we actually run the infrastructure and we run everything for them and we create all of that. So we need people in person. We certainly can always use money, but we're actually pretty good there. Really right now, we just need people to help us build. We're building a bank and you should see what people's reaction is when I say, yeah, I'm just building a whole bank today.
01:51:41
It's pretty fun. Yeah, between the core group of probably 20 people or so, we actually end up spending about 10,000 hours a year building this. We can use people across the board: if you're technical, if you're non-technical, if you're a project manager, if you like marketing, whatever. We have a job for you. Yeah, the thing Lucas and I were joking about is
01:52:01
that the core group that runs a lot of this is smaller than a lot of the teams, which is insane. But without the support of everyone else who donates time and their talents and all that, we can never make it happen. So thank you all for coming. We really appreciate it and we'd love to talk to you. So we'll be outside, I think, yes? And grab stickers before you leave.
01:52:23
All right, thank you very much, yes.