Red Team Village - Trust, but Verify: Maintaining Democracy in Spite of Information Countermeasures
Formal Metadata
Number of Parts: 374
License: CC Attribution 3.0 Unported: You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor.
Identifiers: 10.5446/49193 (DOI)
Transcript: English (auto-generated)
00:01
Hello, everyone, and welcome to Trust, but Verify: Maintaining Democracy in Spite of Information Countermeasures. My name is Allie Mellen. I'm a security strategist in the Office of the CISO at Cybereason. A little bit about me: I'm a computer engineer at heart. I've been in various engineering and development roles for the past 10 years.
00:25
I was previously a security researcher and am now a security strategist at Cybereason. I am a frequent presenter on different topics, including election security. And before we jump into the voting aspects of this talk, what I really want to talk
00:43
about is something very important and related, which is what is truly critical to our daily life. And the way that I immediately think about this is through Maslow's hierarchy of needs. At the base of Maslow's hierarchy of needs,
01:02
you have the physiological needs, things like air, water, and food, which are critical to daily life. At the next level, you have things like personal security, employment, and resources, things that are a complete requirement if you're going to continue to survive. And then you get into the more psychological side of things with things like belonging.
01:26
In the majority of first world countries, we have safety and the physiological needs relatively covered. But once you get to this level of belonging and that psychological element, you start
01:40
to get into something that I think a lot of people are constantly working on. And it's not a given that you're going to have these things just because you are in a first world country. Similarly, esteem: things like respect, self-esteem, and status. These are things that you need on this path towards what is ultimately self-actualization and the desire to become the most one can be.
02:05
And you can be taking steps at each of these levels. But the idea is that in order to reach the top of the pyramid, you need to have all of these different levels met. So the reason that I talk about this is because I think that it plays an important role in voting and in election security as well.
02:26
And it raises the question of where voting fits into this. Now, a little bit of background. The start of this idea came from a series of exercises we did at Cybereason called Operation Blackout: Protect the Vote,
02:45
where we worked with local and federal law enforcement officers to host tabletop exercises where we had one red team of hackers and cybersecurity experts and a blue team of local and federal law enforcement from DHS, FBI, and the Secret Service.
03:02
And then we also had a white team set up for adjudication. And it was basically a turn-based tabletop exercise so that we could ultimately identify gaps in law enforcement's ability to protect Election Day against a fictional adversary country known as Adversaria.
03:22
And we actually held these exercises in Washington, D.C., in Boston, in San Francisco, in London, in Tel Aviv, and Paris. And what we learned from them is that even when you're not actually hacking, even when this is just a thought experiment,
03:40
you can see a lot of potential for different ways to attack an election that aren't just limited to those voting machines. And so that's what I'm going to be talking about today. We're going to start with the defender perspective and really getting into the daily life of a voter before jumping into the attacker perspective. And then we're going to start brainstorming what you could do to stop an election in order to understand better how we can actually defend it.
04:07
And so throughout this talk, I want you guys to be brainstorming and thinking about the different things that you would do to stop an election or to cause chaos during an Election Day. So let's jump into the defender perspective.
04:25
According to the DHS, election infrastructure includes things like voter registration databases, IT infrastructure, voting systems, storage facilities, and polling places. This is a very classical definition, totally understandable. These are all the things that have to do with actually submitting your vote.
04:45
But I don't think that that's all that we need to consider. I think that there's a lot more that we need to consider before the voter is actually able to go to the polling place and put in their ballot. And the way that I like to talk about this is through the voter hierarchy of needs. These are
05:03
much like Maslow's hierarchy, a set of different things that voters need before they're actually comfortable going out to vote. We can see the parallels with Maslow's hierarchy of needs with things like the physiological needs, life and death, very important.
05:23
Property, having a place to live, having a place to do your daily activities: those things are automatically going to come before voting. If you are afraid for your life, you're probably not going to take the time to vote. And the number who do is going to be much smaller than it would be if they weren't afraid for their lives.
05:43
And that also plays into the safety concept, but it can be extended to things like family health, job safety, and financial security. When people don't feel financially secure, are they going to want to spend the extra money to get to the polling place? Some will not.
06:04
And then also that belonging aspect. If you don't feel like you're a part of a country, a part of a culture, you're not going to want to contribute to it by giving back your vote. And then esteem, feeling like your vote matters, feeling like you matter in the grand scheme of
06:21
things, feeling like it's worth it to vote because you think that you're making something a better place. And then, of course, being all that you can be and really being able to feel good about that as you move forward. And also feel good about your voting and that contributing to your country being the best that it can be.
06:43
So this is how I like to talk about the voter hierarchy of needs and all of the levels that we have to consider as different places that can be attacked. And that's why I question if election infrastructure is really just limited to the ways that you submit your ballots.
07:01
And I think it's important to say that I'm not suggesting that election commissions need to control all of these levels, but they need to be considered as we look to secure our elections and protect our democracy. Because these systems that we've identified that we need to protect are also systems that can become a target.
07:27
And you might think that this would be a small impact on an election, but take the United States where 40% of eligible US citizens do not vote at all. And any kind of voter suppression will have an impact on the election.
07:42
So when we're thinking about election security and election infrastructure, we need to think about things that relate to voter suppression and faith in the government. There are ways to influence an election outside of attacking the actual polling places.
08:05
So let's jump into the attacker perspective and talk about a more historical approach and modern day what we're seeing, as well as motivations and things like that. So, on the motivation side, these nation state attackers, they're looking to either gain power, spread a particular ideology, or maintain global recognition and support.
08:26
And for any of you who are able to attend some of the Black Hat keynotes, we saw this as a common thread. These countries are looking to make sure that their particular belief system is spread across the world, that they maintain
08:41
their status as a superpower, whatever the case may be, or that they reach that superpower status. And when I'm thinking about the different types of attacks, the different types of cyber attacks that we can see, I like to use a way of describing it that our CISO, Sam Curry, talks about,
09:00
which is three different layers: the infrastructure layer, the information layer, and the ethos layer. The infrastructure layer is the physical objects, like the electric grid, things that you can take out. The information layer is how you spread things like misinformation or disinformation. And the ethos layer, that's all about the core belief system of the country.
09:25
So we're going to start with that belief system and delve into some historical examples, and then get to some modern day examples that incorporate a lot more of the cybersecurity aspect. The historical examples don't as much; however, they give us a really good baseline to understand that this is not a
09:43
new problem, and that the way that we're seeing people do it digitally is just a new approach to that problem. So let's take the Italian elections of 1948. At the time, right after World War Two, the US government was intent on psychological warfare.
10:02
They were obsessed with spreading democracy, and they were putting millions of dollars towards Christian democratic and right-wing socialist parties. In effect, they were running a massive propaganda campaign against communist-socialist coalitions. And the goal was to change the perception in that election specifically to a conversation about democracy versus totalitarianism,
10:26
Christianity versus atheism, America versus the Soviet Union, and, of course, abundance versus starvation, trying to draw that contrast. And they ended up successfully swaying the votes in the election.
10:41
And so they repeated that process in places like Guatemala, in South Vietnam, in Afghanistan, and Indonesia, as well as many, many more. And this was without having the digital component at all, without having to do any hacking. But it was the start of something that we're seeing take shape even more now.
11:05
Another example: Moscow founded the Communist International group, the Comintern, in 1919, and it urged the American Communist Party to pursue revolutionary regime change. This was very similar to democracy promotion, which was, again, later pursued by
11:25
Washington, all with the purpose of spreading their point of view within America. And, of course, the 1996 Chinese influence on the US election, where a Chinese general sent $300,000 to influence a US presidential election by funneling money to the Clinton campaign.
11:50
So let's look at the cyber equivalent of these things that's happening right now, and has been happening for some time. A good example of this is social media attacks. So on the left, you can see this ad, which is meant to persuade voters
12:07
to vote ahead by voting from home, and saying that you can text Hillary to a certain number to vote, which is obviously not true. But especially right now, it might be something that people would find really appealing, not having to worry about
12:25
any mail-in voting, just having to text a name to a certain number, but of course it doesn't work. So that's kind of a double-edged attack, because it's not just targeting that belief system, and
12:42
that belief in your government and their ability to actually protect you from these types of things. But it's also targeting that information layer, and making it so that your vote doesn't actually count, even though you think that it did. And then on the right hand side, you can see an advertisement, also not from someone in the US, but aimed at highlighting divisions within the US, and also creating them, depending on the case.
13:09
And an important part of this talk is recognizing that there are organizations other than the government that have a role in election security, even if they choose not to take it on.
13:25
And that through actions that certain social media organizations have taken, they have made it easier for governments to spread this type of attack.
13:40
And that's not to say that these organizations should be controlled or anything like that, but to some extent they do have a responsibility as American citizens, as people in the United States, to consider this and to consider the implications of these types of attacks, and to try to build a stronger foundation for democracy within their own organizations.
14:07
So now let's look at the information plane. And remember, this is all of the communication, this is where we get into the real misinformation and disinformation. And I want to start with a really baffling example that highlights that, you know, this is not a new thing.
14:27
And as much as we are recognizing disinformation and misinformation right now in American culture, America was really founded on this. It was founded on the premise of a conspiracy theory where the British were trying to enslave us. Sam
14:43
Adams argued that Britain's taxes before the Revolutionary War were part of an elaborate conspiracy to eventually enslave American colonists. He actually spread that and other early disinformation through pamphlets and speeches that he would hand out, containing information that he knew was not true.
15:04
And he used the Boston Massacre as a tool in order to continue to spread that disinformation and to continue to cause chaos within America, leading to the American Revolution. So we have this foundation in the US, for those of you who are from the US, of disinformation and of misinformation from our very earliest days.
15:30
And then of course you can consider things like the CIA and the infamous propaganda asset inventory. During the Cold War era in the 1950s and 60s, the CIA had a huge number of radio
15:48
stations and newspapers and magazines across the globe that they were using to push the United States agenda. In one very interesting instance, they had control of a program called Radio Free Asia,
16:05
but they realized that the average Taiwanese person would not actually have a radio. So they strapped small radios to balloons and sent them flying to try and get them to Taiwan.
16:20
Needless to say, the wind took them in the wrong direction, but the idea was very interesting nonetheless. In another interesting little tidbit, they had control of a magazine, but the magazine became so popular that it was actually also being distributed in the United States.
16:42
And so they had to be very careful to only put the propaganda in the ones that were not going to the US. Similarly, in the 50s, 60s, and 70s, left-wing newspapers in Europe were pretty much all financed directly from Moscow, pushing Moscow's agenda.
17:02
And then there was US radio in Moscow as well, which would play rock music like Elton John and the Beatles that listeners couldn't get anywhere else, always followed by short segments of what they called editorial content, which Soviet authorities considered disinformation, as it was giving that American perspective.
17:25
One that's more focused on the cyber realm is the 2014 Ukraine elections. On May 21st of 2014, attackers compromised the network of the CEC, Ukraine's Central Election Commission, and actually disabled the vote counting. Now, this would automatically attack that ethos layer, and you start to lose faith in your
17:44
government's ability to actually count your votes accurately, and in the validity of the election. But then they managed to get it back up after, I think, about 12 hours. And four days later was the actual election day. At the time, the same CEC website was under constant DDoS attack. And 12 minutes before the polls closed, attackers
18:10
actually posted a picture on the CEC website of the leader of the far-right Right Sector party, claiming he had won the election. And what's really interesting about this is that it was immediately shared by Russian media. Almost as if it was coordinated. Almost.
18:29
And let's also look at the communication plane. This is very interesting and it really makes it so that people are confused about what to do, where to go. It's a direct line to the voters.
18:44
And here are two examples of that. On the left, you have an example of a text that is supposedly from President Trump shaming someone into voting, saying your ballot hasn't been submitted. What are you still doing? In reality, this is not from the campaign. This is not from Trump. And they would not be able to track this type of information.
19:03
But they even got an interesting URL, vote.gop. And then on the right-hand side, you can see an example that's supposed to be a very helpful message from the government telling you your polling location and what the hours are and where to go.
19:21
In reality, that's not a real polling location. That's someone sending a message that purports to be from the government as a helpful reminder to go vote, when in reality, there's nothing there, and the voter will get there and not be able to vote. And at that point, you're either, one, late for work if you go into work, or two, frustrated and less likely to vote.
19:48
Now let's look at the infrastructure plane. And this one is really interesting because it's that physical layer, which I think that a lot of times, especially more broadly, we don't think about or more broadly, I mean,
20:02
the world, not just the cybersecurity community, doesn't think of cyber attacks in this infrastructure plane as much. And we're going to go back to those historical examples that don't incorporate a cybersecurity component, but that lay the groundwork for where this can go in the future. Starting with the assassination of Lumumba.
20:23
In 1960, the Democratic Republic of Congo gained its independence, and Lumumba was elected Prime Minister of the DRC that year. He was then assassinated one year later, in 1961.
20:41
Now, this was immediately politically fracturing for this very, very new independent nation. And it turns out that the groups behind it were American and Belgian. And they were actually from the US government and the Belgian government, and they had worked together to plan this assassination.
21:01
And this is the type of thing that can really be destructive to such a new nation. The equivalent in the US would be George Washington being assassinated a year after American independence, and who knows what would have happened in that instance.
21:23
And then there are also the 1965 Indonesian elections and the subsequent massacre. The Communist Party finished fourth in the Indonesian elections and was offered proportional representation in the government.
21:41
Now, this was not aligned with US interests, needless to say. For fear of how this would impact US interests in the region, the US secretly supported the purge of suspected communists, causing thousands to millions of people, no one is entirely sure how many, to die over the course of months, and the military took over as the most powerful institution.
22:06
So this is not only a direct impact, because they supported and pushed for the death and massacre of these many, many people. It's also something that I deeply believe would terrify the public. Who is going to
22:24
be voting and speaking about their beliefs freely after this type of thing happens in their country? And now, this one isn't an example of election day, but it's a good representation again, and it does involve that cyber component. So, the Bronze Night in Estonia in 2007.
22:45
This was really a combination of an infrastructure attack and an information attack. In 2007, Estonians made the decision to move a Red Army soldier statue to a Soviet cemetery because needless to say, they did not want a Red Army soldier in the middle of their country.
23:04
Not exactly a fan favorite there. So, the night that it was being moved, fake news started to spread with Russian news reports claiming that the statue and the Soviet war graves were being destroyed.
23:20
And this resulted in two nights straight of riots and looting. 156 people were injured, one person was left dead and 1000 were detained. But at the exact same time, there were a ton of denial of service attacks going on inside Estonia, across banks, media outlets, governments, cash machines,
23:41
things like online banking were out of service, government employees couldn't communicate over email, and newspapers and broadcasters couldn't deliver the news. So all they were left with was the Russian news reports that were spreading this fake news.
24:00
And I really like this quote because it talks a little bit about this: in all of these historical examples, these campaigns were done covertly, and cyber aggression really gives attackers the ability to run these types of campaigns very covertly.
24:22
And it leaves us really vulnerable to these types of attacks, and to chaos really being sown internally in the country, with people believing conspiracy theories about the situation and not knowing what's true and what isn't. And that's why I think the potential is so scary, not just on the misinformation
24:42
and disinformation side, but also on that belief system layer and even that infrastructure layer. So, some examples of things that we talked about during these tabletop exercises that I think could have a huge impact on Election Day come down to things like the electric grid.
25:05
So in 2003, a four-day power outage left at least 100 people in the United States and Canada dead. Now imagine there's a power outage on Election Day, and you have yet to leave your house. Are you going to
25:21
be worried about voting, or are you going to be worried about making sure that whatever's in your fridge doesn't go bad? Are you going to be worried about your family? Maybe you know someone who needs the type of care that requires electricity to be available. There are a lot of factors here that, just like with that voter hierarchy of
25:44
needs, will immediately come before you're actually willing to vote when something unexpected happens, like losing power. And we're also going to talk about transit. So, in 2016, San Francisco's transit system was infected with ransomware.
26:04
Now, imagine a scenario, a lot of people go to vote after work. So you go to work, you spend your day on Twitter, probably seeing some disinformation, you get to the end of the day, you've taken the train into work, and you go to take the train to the polling station and then back home, and it's down.
26:25
And you can't access it. Are you going to be thinking about voting now or are you going to be thinking about how you're going to get home and potentially being a little annoyed that the train is down? So we can see that these attacks go across layers, but also across all the layers of the pyramid.
26:46
And these are just a few examples; there are way more that we can look at through our own tabletop exercises. So let's jump back into the defender perspective before we really get to that brainstorming part.
27:00
Where does this leave us? We have this small view of election security, just with voter registration, just with databases, IT infrastructure, things like that. And we end up with quotes like these from leaders in different countries: "We cannot exclude such
27:22
activities in Germany either. In the election campaign, we'll also have to confront distortions and fake stories." Now, it's great that they're being honest about it, but where does this leave a citizen when it comes to knowing what's fake and what's real, and knowing how to actually combat this?
27:41
Similarly, we have representatives saying that people are trying to steal another election, that it's all rigged, that it's a scam, pushing this same narrative that these countries want other countries to believe in. And then, of course, just from a couple of weeks ago, congressional Democrats
28:02
talking about how they were gravely concerned about foreign interference in an election. Now, again, I think the recognition is good. It's just a matter of giving us the tools to combat what they're seeing. That's what we're missing here: we're missing the leadership part.
28:21
So where should we really be going in order to create a different reality for ourselves, where we have a more well-rounded view of election security, where we understand the problem, and where we, not just the government, but also the citizens and the private sector, can effect change?
28:42
And this is what it should look like, where we have all of those different aspects that are front and center, and that we're thinking about, and that we're making clear these are the things that are going to be under attack, these are the things that you need to look out for. Having that communication, because in reality, as I've mentioned before, a lot of these channels are not government-owned entities.
29:04
A lot of them are privately held, which means that election security cannot just be the government's problem. It has to be something considered by the private sector. It has to be something that we work together as a community, as a society, to help stop.
29:24
The goal of this, if we were in person, was to do a small tabletop exercise together. But since we can't really do that, I want you guys to imagine how this scenario would go and what you would do from either perspective.
29:44
And just start by thinking about, do you vote before work? Do you mail in your ballot? How do modern events, like the pandemic that we're going through right now, change all of that? Are you on Twitter? Do you fact check? I typically do or would vote before work if we are still going into the office.
30:04
I am doing a mail-in ballot. I think that mail-in ballots are critical right now, given the current situation, and because they really take away those time-based attacks that we're seeing. If, on the day of the election, you lose power, but you already voted three weeks ago, then you're not actually going to be
30:25
part of the voter suppression problem, and it's already taken care of. And that's a huge positive. That's a huge benefit. Not to say that mail-in voting doesn't have its own issues, but that's at least a start to removing some of those time-based barriers.
30:43
How do modern events change this? Are you considering things like mail-in voting? Are you on Twitter? I know a lot of us are. InfoSecTwitter is great and terrible. But it raises the question of, do you fact check? Are you checking the things that you see? Are you sharing articles without actually reading them? These are important, and they tend to get lost in the shuffle, even though people talk about them.
31:06
And then take a look at how you would attack your life as a voter, your typical day. What would you do that would make you stop and think, I'm going to take care of this problem, and I'm not going to go vote today? That's what's critical.
31:24
And some of the things that we saw in these tabletop exercises to affect an election were things like attackers creating deepfakes of certain candidates. And the thing is, if you release a deepfake of a candidate on the day
31:41
of the election, on that morning, there really isn't enough time for the candidate to react. Maybe they'll issue a statement, but the damage will have been done. And it's not something you can fix over the course of eight hours. So these are the types of questions to ask: how does that time-based element affect things? If you took
32:01
out the electric grid exclusively in regions that are known to be very conservative, how would that look to people? What would they take away from that? Would they take away that it was just a random power outage? Or would they suspect that there was foul play from the other side? And then take those ideas and think about what you can do for your country instead. Because at
32:27
the end of the day, as we know with red teaming, all of this is working towards stronger security. And what's important is that we come out of this considering how to make security stronger, how to make election security stronger, the whole thing.
32:41
So how do you defend? Are you involved in any of these sectors? Are you in the transit sector or the communication sector? Do you work at a social media company? How can you effect change within these organizations? Because at the end of the day, we all have to do our part in order to protect our system.
33:04
So this kind of left me with a lot of questions, because it feels like an insurmountable problem. It's this huge, huge problem. But at the end of the day, what I think is most important, and what we can take away from this, is first: whether you work at a private company or a government entity,
33:23
work to improve your security, your personal security, your enterprise security, whatever the case may be. We need it because all of these different elements can be used in a way that they're not supposed to be used and you just don't know what creative method an attacker is going to use in order to affect the belief system of the country.
33:48
So really focus on, of course, having good security measures. The second is to work with the government. I know, build partnerships with the government. I'm a part of InfraGard. It's the government group for infrastructure security. I think
34:04
that that's great. It's a great way to share and give back, not just to the community, but also to the government. So I highly recommend just finding different avenues and different partnerships that you can have to make security stronger in the government space, even if you don't work there directly.
34:22
And also to spread awareness. And then the last one is to work to fight misinformation, because I think that's a continuous struggle that we all have to consider. And the more that we work at it, the better off we will be inevitably. I mean, it's a lot of work, but that's how it is.
34:43
I also wrote a white paper on this. Feel free to download it. You don't need to put in an email or anything like that. It's totally free if you go to this link, and it elaborates on a lot of what I've talked about here. So feel free to download that, and also message me if you have any questions. I'm happy to help.
35:03
Thank you so much for attending. I hope you guys have a great rest of DEFCON Red Team Village. And if you have any questions, feel free to reach out. I will respond to all the comments in Discord, and feel free to ask me any questions. I'm hackerbella on Discord. So thank you so much. Awesome. Thank you. Amazing presentation. Thank you so much for supporting us, supporting DEFCON, the Red Team Village.
35:26
And again, as we said before, please look at all the talks and activities that are happening right now. At the bottom of the screen, you should have a link to our website. We're streaming on Twitch and YouTube and so on. And please, as Allie mentioned, please join the conversation on Discord. We're going into
35:45
a break, a really short break, and then you'll be hearing from the next presenter about...