ETHICS VILLAGE - Ethics of Technology in Humanitarian and Disaster Response
Formal Metadata
Number of Parts: 322
License: CC Attribution 3.0 Unported: You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor.
Identifiers: 10.5446/39886 (DOI)
Transcript: English (auto-generated)
00:00
So, I'm a little creaky this morning, I'll just warn you. Something about a little bit of cold and a little cigarette smoke out there and a little too much karaoke, and I sound like I gargled gravel. And I'll apologize to you: I throw things when I talk, and I have a microphone and a clicker to throw. So, at some point, I will also try to click with the microphone and talk into the clicker.
00:22
It just happens and I know this, so feel free to make fun of me when it happens. So, good morning. I'm Sarah. I am an emergency manager. I am a technology enthusiast and have been for a very long time. I'll just put this out here. My first DEFCON was DEFCON 4, so yeah, for those of you who may not have been on the
00:40
planet at that time, I apologize, but I do a lot of disaster response and I see a lot of technology in there. I also teach ethics in the graduate program at Georgetown University and in an undergraduate program at Pierce College and it's ethics for disasters. And I've been looking at this and I was super excited when I saw this village pop
01:04
up because we talk about ethics, we talk about disasters, we talk about technology, but nobody ever talks about all three together and so that's what I'm going to do. So I've got some examples. I tend to illustrate things with stories. I have a couple of really ugly slides.
01:20
I've got a lot of slides that are just pictures, but I just want to walk us through some stories of how decisions get made and why they get made and then go to maybe a model for how we can do it better in the disaster realm. So we're here because bad things happen. Bad things happen all the time. Some of them are really, really big and people need help.
01:42
And the people who need help are also the people in charge sometimes. And people want to help, and they kind of converge, but there's not a good understanding of what help is needed and what help can be accepted. And if we make the wrong decisions, people can die. We don't want that to happen.
02:01
We want to be able to help as many as possible in ways that make sense. So this is an ugly slide. This is the what is ethics slide. So I want to just set sort of a framework for what we're talking about ethics and where they come from. Obviously, ethos is the Greek origin of ethics.
02:22
And it talks about a construct of rules that are commonly accepted by society as being right and wrong. There are generally three different frameworks. There's consequentialist, which is what we talk about when we talk about doing the greatest good. So you'll hear somebody say, we're going to do the greatest good for the greatest number of people
02:43
in the least amount of time. You'll hear that in the disaster world. They're talking about consequentialist. What are the consequences of the actions that we're taking? There's non-consequentialist. So what's the intent? Did we intend to do the right thing? And then there's agent-centered.
03:00
Did I do a thing that builds my character? All of these are valid models of ethics, and all of them have a place in this entire process. But when you hear people argue about the ethics of something, it's typically because they're coming at it from a different framework than the people around them.
03:21
It doesn't mean that either is wrong. It means that they have the opportunity for healthy discussion and learning. Whether they take that opportunity or not is another thing. So how do we determine what is ethical? What do you, how do you decide what is ethical as a person? Anybody? I know it's early.
03:42
Oh, respective traditions. That's very good. That's a very, very good point. And I'm going to illustrate that point just a tiny bit because we often have laws and rules around ethics. You cannot do this. You must do that. If you work in government, you cannot accept a gift that's worth more than $25 if you go to attend a thing.
04:01
But there are traditions and cultures where that is offensive. And then you have the choice. Do I break the law because the ethics law says I can't take the thing? Or do I not offend my host by accepting the gift that they are giving me? And there's two right answers to that, depending on the choice you make and where you're grounded in that.
04:25
But it is an excellent example that often gets overlooked. What is right for the people that I'm interacting with? What are some other ways that you determine what is ethical? Your upbringing. That one sort of fell off the bottom of the screen, but it goes to morals as well.
04:43
We're going to talk about that a little bit. But yes, our morals very much inform the ethical stance that we take on things. Anything else that pops to mind? Yeah, back there. Your emotions. Yes, your emotions absolutely come into play in very big ways, especially if you don't take a moment to stop and go,
05:03
okay, what is it that we're trying to accomplish here other than, oh, my God, I feel horrible about the thing and I want to fix it. I saw a hand up here. Peer pressure. Yeah, peer pressure can have a huge role in how we address and embrace ethics. Anybody else?
05:20
Yeah, education. Absolutely. Typically, the more education you have, the more time you spend thinking about ethical dilemmas. You have more information roaming around in your head, and you're exposed to different ideas the longer you stay in academia. That's why a PhD is a doctorate of philosophy: it's called philosophy for a reason, because they talk about a lot of things and they discuss the consequences of those.
05:47
Anything else? Culture. Yep, culture is also a huge one. Different cultures have different ethics for different situations. So what's the difference between ethics and morals? You kind of touched on it when you said upbringing.
06:03
What is the difference? Yeah. Morals are a personal choice. Yeah, morals talk about the character of a person and your basis for decision making. And you'll see, there are two basic moral systems. There's the normative, consequences-based model, which ties a lot into a consequentialist ethics base.
06:28
And there's deontology, which is rule based, which goes very much to taking the right action. So they're very closely linked. But most people come at morals from one of these: what are the consequences, or what are the rules?
06:42
Some people, though not most of the ones at DEF CON, are very rule based. Like, there are the rules that we must follow. I am that person, usually. But you find, when you get into disaster situations, that a lot of times that doesn't work anymore. So you have rule based, where there's a rule for the thing, whatever it is, and we're going to follow the rule and everything's going to work out.
07:05
But that does not happen in disasters. So disasters, I'm going to give you a definition of disasters for our working purposes today. And a disaster for us is any situation that overwhelms the ability of the local community to deal with it.
07:24
So there's tons of different definitions of disaster, but that's the one we're going to use. If the local community, whatever that looks like, is overwhelmed, it's a disaster to them, which by default kind of tells you that small communities can have disasters that would not be disasters if they happen in a larger community, simply because they have resources.
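That working definition can be captured as a one-line predicate. This is purely an illustration of the point above; the quantities are hypothetical stand-ins, not real emergency-management metrics:

```python
def is_disaster(event_demand, local_capacity):
    # Working definition from the talk: an event is a disaster for a
    # community when its demands overwhelm that community's own ability
    # to deal with it. Both arguments are made-up scalar stand-ins,
    # used only to illustrate that the definition is relative.
    return event_demand > local_capacity

# The same event can be a disaster for a small town but routine for a
# larger community, simply because the larger one has more resources:
shelter_demand = 50  # households needing shelter (hypothetical units)
print(is_disaster(shelter_demand, local_capacity=10))   # small town: True
print(is_disaster(shelter_demand, local_capacity=500))  # large city: False
```

The point is that "disaster" is defined relative to local capacity, community by community, rather than as an absolute threshold.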
07:47
And disasters impact everybody. But one of the things to bear in mind that often gets overlooked is that disasters don't affect every person equally. And so a lot of times when you talk about the greatest good mentality, there's often
08:06
a subset of the population that has historically been overlooked in that, over and over again. If you look through disaster research, it tells you again and again that marginalized populations are left out of the greatest good
08:21
for the greatest number of people because, well, there's a lot of reasons for it, usually because we made bad decisions beforehand. But disasters impact everyone in the community and they impact them in different ways. And so it's important to keep that in mind as we roll through how we help with these problems.
08:41
So here's my little bit of a bash on my emergency management family: we suck at technology, historically. Emergency and disaster response in the United States is heavily government centric. If you haven't noticed, it's very, very government centric. And government, if you've ever looked at Everett Rogers'
09:02
diffusion of innovations, the innovation adoption model, you've got the early adopters, and there at the end are your grandparents, the laggards. Government is always the laggards, by and large; not the military, but local government. You've still got a whole contingent with flip phones, and they're happy with that.
09:20
And so we also have this disparity issue within government where we have, I call them the haves and the have nots. You have very well funded government entities at the local county state level and you have very, very poor ones. And this affects their ability to use, understand, or even be exposed to technology.
09:42
So what you get is a mixed bag. You get all kinds of things that are new and exciting to some people and old, old news to others. One of the things I do is I'm the chair of the International Association of Emergency Managers Emerging Technology Caucus. I know, it's huge; we go by IAEM-ETC, just seven letters, much easier.
10:02
But we run the Crisis Technology Center at their annual conference every year. And part of that is we have our technology petting zoo, which we just bring in things, 3D printers, basic AR, basic VR, how to put email on your smartphone.
10:23
Like, I am dead serious, it runs the gamut from how do I set up my email, can you, here, take my phone, make it happen, yeah, to people who are using AI. And so we try and expose the emergency management world to all these
10:42
concepts, but exposing them to them doesn't help them understand them or use them. It just lets them know that they're there. And so we still end up with this problem: they don't understand the technology, and they don't have access to it, for a variety of reasons. Disaster response is also highly, highly supplemented by NGOs and private industry.
11:02
So you've got a laundry list of non-governmental organizations who help in disaster response. You have a lot of private industry that does a lot of different things in disaster response. And you have general random people who just want to be useful and helpful and will show up and try and be useful and helpful. And a lot of them bring technology.
11:22
The problem becomes when the government you're interacting with has absolutely no idea what to do with the cool toy that you're talking about or doesn't even understand it. And so there's this frustration because you work for somebody who has the cool toy and you're like, I can solve your problem with my cool thing. And they're like, I have no, I don't understand the words that you're saying and I don't understand how it would solve my problem.
11:45
So go away. And unfortunately, that happens a lot. So disasters and humanitarian crises create situations for which there's no rules or it seems to us that the rules no longer apply.
12:01
We had rules. Now we're looking at this vast whatever it is, and the rules don't make sense anymore, because we didn't think about this type of scenario. And these situations usually create a profound sense of compassion and responsibility towards people that we perceive to be suffering immensely. A lot of that is fueled,
12:21
and I'm not saying this is a bad thing, by what you see in the media. The media typically will pull the worst-case scenarios and show them, and people respond to that, because we're humans. By and large, we want to help other humans and animals. I will not leave out animals. But there's this disconnect then, from everybody wants to help, to what do we actually do with that?
12:46
Because we don't have a standard across the spectrum that helps us get to a good resolution. So this is a very basic, like, what kind of stuff gets used in disasters? There's basic stuff.
13:01
And some of my emergency management peers can't even handle the basics. Phones, computers, radios, Internet. I didn't put Internet on there. Some of them are not able to use these things in a disaster. Sometimes it's because they lost their capacity because of the disaster itself. Sometimes it's because they just never had those things to begin with.
13:21
They had a computer and when the disaster happened, the computer died or the person who knows how to log into it isn't there anymore, can't find them. And I've been to disasters where that was an actual thing. And then there's data analysis, GIS stuff that more advanced or larger organizations use.
13:43
And then you've got AI, you've got UAV, UAS, drone stuff. You've got 3D imagery going on in the larger organizations that have funding. So it runs the spectrum and it can be used in a lot of different ways. We rely a lot on the basics and yet we don't necessarily have enough of them redundantly built in some communities for them to actually work.
14:08
And I'm going to show you a couple of examples of how some of these things just sort of fall apart when the rubber hits the road. I will preface this by saying the three actual disasters I'm going to talk about are three that I was at, because I try to only talk about disasters I was there for.
14:25
So I don't armchair quarterback somebody else's disaster, because that's rude. But so the SR530, or Oso, landslide happened in March of 2014 in Snohomish County, Washington. And where it happened is a very rural area to the east.
14:41
Snohomish County is part of the Puget Sound region. So it's east or north of Seattle. The landslide happened east on a mountain road that connects east to west, connects some mountain villages with the rest of the world. Forty-three people were nearly instantly killed. Forty-nine homes and structures were destroyed in a very, very short order.
15:04
The search for victims lasted a month, knowing that they were not going to find live victims; they were going to find the remains of victims. The final victim was found in July, months later, as part of the recovery and rebuilding process. That's how long it took.
15:24
And so that's sort of the backdrop for this. An interesting side note: you'll see I call it the SR530 landslide; a lot of people call it the Oso landslide. Oso is the tiny little town there. The people of Oso don't want to be known for this landslide, and they were very vocal about that.
15:41
And so collectively within the emergency management community we said, okay, we're going to honor that. We're going to call it the name of the road that was destroyed instead of the name of the town. But it didn't stick beyond our community. So that's one of those things where you're honoring that local culture, that local request, except that that doesn't trickle up necessarily to others.
16:00
So this is what it looked like. It's a little bright in here. So this is what it looked like before. So you can see, if you've ever seen LIDAR images, you can see there's an old landslide here. Anytime you see this horseshoe shape in a hillside, that means that at some point in time there was a landslide. There were many here. The history of this particular canyon is mind-boggling.
16:23
So here's the hillside, here's some streets and houses, a little subdivision thing going on here, the road, the river, and here's the landslide. Here were the houses, here's the landslide. That entire side of the mountain fell off in a very, very rapid instant.
16:44
And it was devastating. There were issues that led up to this, and I actually think I put that, yeah. So there's a whole ethical thing behind this ever happening.
17:03
And it's not a technology ethical thing, but it's important to recognize how these become cascade failures. You make one wrong choice and suddenly you have a whole cascade failure of wrong choices. And this was unpermitted building. They never had permits to build it.
17:22
It was built and then at some point they were given occupancy permits. And when you look at the lidar images of this canyon, just with the naked eye you can count 13 different previous landslides. And it's not a place where you would build a home if you knew that the hillside was going to fall down on you at some point.
17:44
But what was done was done, and they just rolled with it, and that led directly to this. There are some great timelines about this, going back almost 100 years, of the decisions that were made in this canyon, in this valley, and the things that happened over time. But then we jump to the EOC; the EOC is the Emergency Operations Center.
18:05
That's where the organizing, the coordinating, all of that happens in a disaster. This was by far the largest disaster we had seen in Washington state in a very long time. We used resources from, I say we, it wasn't my disaster. I was one of the resources that went to help.
18:22
They used every resource in the state at some point. And so all of the people working in there were from somewhere else, by and large. They had a system, I think, but it didn't work. We reverted to paper. And paper is an ugly, ugly thing in this kind of environment.
18:42
Things were getting lost, so we were doing resource orders and they weren't getting tracked, right? There were people putting them on sticky notes, we'd find them a week later. It was an absolute nightmare. But what happened here, there were multiple people in that room who said, I have a fix for this. Technology fix. And we were just going to throw it on a SharePoint list because that was better than paper.
19:06
But we couldn't, and there were multiple, like we had this little organizing meeting in the corner and said, we can do this. Like we can do this and convert this in like less than an hour probably. Multiple people made the pitch to the people in charge and were told no.
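For context, the fix being pitched amounted to a shared, append-only request log with status tracking, rather than one spreadsheet file that many people edit simultaneously. Here is a rough sketch of that idea; the class, field names, and example order are my assumptions, not the actual SharePoint list that was proposed:

```python
import itertools
import threading

class ResourceOrderLog:
    """Append-only log of resource orders with status updates.
    Every change is a new record, so nothing gets silently
    overwritten the way concurrent edits to one workbook can be."""

    def __init__(self):
        self._lock = threading.Lock()
        self._ids = itertools.count(1)
        self._events = []  # (order_id, field, value) tuples, never mutated

    def open_order(self, description):
        with self._lock:
            order_id = next(self._ids)
            self._events.append((order_id, "description", description))
            self._events.append((order_id, "status", "requested"))
        return order_id

    def update_status(self, order_id, status):
        with self._lock:
            self._events.append((order_id, "status", status))

    def current_state(self):
        # Fold the event history into the latest value of each field.
        state = {}
        with self._lock:
            for order_id, field, value in self._events:
                state.setdefault(order_id, {})[field] = value
        return state

log = ResourceOrderLog()
oid = log.open_order("Type 2 excavator to west staging area")  # hypothetical order
log.update_status(oid, "approved")
log.update_status(oid, "delivered")
print(log.current_state()[oid]["status"])  # delivered
```

Because every change is a new record rather than an edit in place, a request can't vanish the way a sticky note gets lost or a shared workbook gets clobbered.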
19:21
Because, and I think really, they didn't understand what in the world we were saying. They understood Excel. It was in Excel, which led to a whole other brand of chaos. If you've ever used Excel with multiple people simultaneously: it's a no-no. Yeah, I spent a good chunk of an evening rebuilding Excel at one point
19:42
because something horrible happened. But we could not convince the right people to just give us SharePoint. They had it. We couldn't do it. And so there was that decision point where we were like, we felt like ethically we had an obligation to make this better
20:03
and make it sustainable. Because what we were doing was clearly not sustainable. It broke. Somebody blew it up accidentally. It just didn't work. And yet it didn't jive with their idea of how it could be better. And so for us, it was a very, and by us, I mean the rebels in the corner
20:23
who were trying to do this. For us, it felt very wrong to not be able to do this. We actually talked about just doing it anyway. But we realized we didn't have the right access to things because we didn't work there. But the other thing that happened in this, and I call them technology tourists.
20:41
We have this thing in disasters where we call them disaster tourists. Random people just show up because they want to see what a disaster looks like. And I don't like them, just so we're clear on that. If you show up and you bring supplies, fine. And you take a look. If you just show up to show up, go away. But we had the same thing happen in this room.
21:04
Part of what happened when we said, hey, we think we can put this on SharePoint and make it better for everybody. Oh no, a very large technology partner from the Puget Sound area is going to come and save the world. And we went, yeah, huh? We had what I now call technology tourists. We had at least one, maybe two large technology companies
21:23
who may or may not be located in the Puget Sound area roam through with a tour group to say they were going to fix the problem. And we never heard from them again because they didn't understand the problem. The people there couldn't articulate the problem. And we were well entrenched in paper at this point.
21:41
There was no way that we were going to something else. But it was the first time I had seen technology tourists. And now that is also a thing. You see them pop up. You'll see YouTube videos of people who made a great cool toy for some disaster. And they were probably a technology tourist
22:01
instead of going with what was already available. They reinvented the wheel. They made a new thing. It's a growing issue. And then we have stuff in the field. So the field was interesting because we had aerial imagery that had come in. They did LIDAR. I mean, that was a critical piece of rebuilding.
22:21
Part of the issue with this is that the slide cut off the main route for the people who lived in one of the towns beyond it, giving them an 80-mile trip to and from work over a route that was impassable in winter. So this happened in March. They absolutely had to rebuild that highway before winter.
22:41
Like, there was no way they couldn't do it because those people would have been cut off. And there was no fix for that. And so there was a big rush. And it's not a bad rush, but there was a huge rush to solve this issue, clear the highway, get it rebuilt in time for winter. And roads don't get built that quickly. They had to reengineer the entire road structure,
23:02
elevate it, do all this stuff. But a lot of offers for help came in. All kinds of nonprofits, mostly with drones: yeah, we've got the thing, we can bring this technology, we can help you solve your problem. They weren't ready for that. They didn't understand the technology. They didn't understand what could be done with the technology. They did say yes to some of it.
23:21
Obviously, Lidar was a huge thing. They had to do that. They had to look at that landscape. But then somebody's like, oh, yeah, we're going to get different imaging with drones. And we're going to 3D print those models, which is super cool, but not in the least bit helpful to this effort. It was slow. It was cumbersome. It became a proof of concept for Lidar,
23:42
but it wasn't useful here. So they were probably right to reject some of those offers. But of course, the NGO that was all excited about their new cool proof of concept, they didn't understand why they weren't being allowed to help. And it's because there wasn't really a need for it. And in the end, it didn't become all of that useful.
24:01
Ultimately, what happened, where they did have a huge success in technology and math, they, and this was a hard thing to explain, at about a month, they cut off search operations. They're not looking anymore. And yet, using math and some really good models, they said, here's the people who are still missing.
24:21
Here's where they were when it hit. Here's where we're pretty sure you're going to find them as you're excavating the road. That was really hard for them to convey to the crews that were going to do that excavation. So there was this huge, like there was a huge public, I'll say, ethical battle from the construction crews who were like, no.
24:42
We are not clearing that road. We're not doing it. There are still bodies in there. We're not doing it. And the big brains who had mapped it said, no, here's where you're going to find them. So you dig, dig, dig, then slow down here in this general area. So there had to be a melding of the minds there to convince the construction guys and people
25:03
that this was the right thing to do. And it turned out the math was right. They found every one of them roughly where they were going to. But they had to come up with a system that allowed the construction workers to fulfill their obligation without thinking that they were intruding on someone else's space. I mean, there was huge concern about that.
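The modeling approach described here, projecting each missing person's last known position through the slide's estimated movement to get a zone where excavation crews should slow down, can be sketched very roughly as follows. The linear displacement, the coordinates, and the function names are all illustrative assumptions, not the actual SR530 model:

```python
import math

def predicted_search_zone(last_known, displacement, uncertainty):
    """last_known: (x, y) position when the slide hit, local grid (metres).
    displacement: (dx, dy) estimated debris transport at that point (metres).
    uncertainty: radius (metres) of the zone around the projected point.
    All values are made up for illustration."""
    center = (last_known[0] + displacement[0], last_known[1] + displacement[1])
    return center, uncertainty

def cut_enters_zone(cut_point, center, radius):
    # True when an excavation cut comes within the search zone,
    # i.e. where crews were asked to dig slowly and carefully.
    return math.dist(cut_point, center) <= radius

center, radius = predicted_search_zone((120.0, 40.0), (310.0, -15.0), 25.0)
print(cut_enters_zone((430.0, 25.0), center, radius))  # True: slow down here
print(cut_enters_zone((200.0, 40.0), center, radius))  # False: normal pace
```

The real value was translating model output into a simple operational rule for the crews: dig at normal pace until the cut enters a zone, then slow down.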
25:22
And it worked. And they found everybody eventually and built the road by winter, which in itself was a Herculean effort. So, the Okanogan Complex fire: huge fire in 2015. And if you have questions as we go, just wave something in the air. Biggest fire in the state of Washington,
25:42
although there's been some argument over that now. Was it the biggest? I don't know. Did the fires merge? If they merged, it was the biggest. If they didn't, it wasn't. Who cares? It was a huge, damn fire. Huge, huge fire. And the scary thing: Okanogan County, Washington, Eastern Washington, very rural, very, very poor.
26:02
Lots of separatists out there. Uh-huh. That was an interesting thing. It was the highest priority fire in the country at one point in time. There were three firefighters killed, which may or may not have been avoidable. Containment took just about a month, and over 300,000 acres were burned.
26:22
So there's weird things that happen, and some of these aren't on the slides, because I'm going to publish the slides publicly, and I don't need the separatists showing up at my house. Yeah, no. But there is a very obvious, very off-the-grid community out there, and with National Guard assets out there. And the sheriff, who was elected, had to approve every movement of the National Guard,
26:43
because in some places, they would have been shot on sight, and it was not a joke. Uh, yeah, oh, the government's finally come to take my things. I'm making my stand was sort of the mentality of some of the people. Some places, they had to go with a sheriff's office escort. Some places, the sheriff could call ahead and say, hey, I'm sending some National Guard troops.
27:03
Please don't shoot them. And that was okay. But they couldn't go anywhere without sheriff's direct approval. Did they know about the fire coming? Did they know about the fire? Everybody in multiple states and two countries knew about the fire. Like, yeah, it didn't matter.
27:20
If you've ever interacted with a true separatist group, they really don't recognize government. They don't recognize the federal government. Well, they recognize them as the sworn enemy of themselves. Like, it's a weird deal. But yeah, the fire was,
27:41
they knew. We had firefighters held at gunpoint. Imagine this: here's a firefighter. They're structural. They're from somewhere else. They're like, we're here to protect the structure. We gotta go over there. And some guy with a gun says, nope, you stay here and protect my stuff. All right, we're gonna stay here and protect your stuff. There's a decision point in there.
28:00
It's a really easy decision point. There is an ethical decision. Do I live or do I die? That changes every other rule that might be out there. Like, for most people. Hmm, if I leave, I get shot. If I stay, I'm still technically doing my job and I don't get shot. I think I'll stay. Those are things we don't typically expect to find in disasters.
28:21
We expect that people are going to be happy we're there. They'll take advantage of whatever it is we're offering. And be happy about it. But that's not always the case. And none of that has anything to do with technology. But it's the weird ethical stuff that happens in disasters. So part of that fire, so part of the reason National Guard was there
28:41
is that communications were out. So these are pictures I took. This is the top of the power pole. This is the bottom of the power pole. It burned through the middle. In entire swaths of that county, there was no communication to some communities. None. The only communication they had was somebody, the fire chief usually, would drive out of the community
29:01
to the EOC once or twice a day to get information and then go back. That was the way information was exchanged because there was nothing there. Like an entire roadway with every pole that looked like this. We lost cell towers. We lost all communications in some places. And this picture is just there.
29:22
That was the view from my hotel the night that we got there. So I'm in a hotel and that's the, that's the hill. I took that with my cell phone. So if you blow it up anymore it gets really fuzzy because it's pretty dark. But, but that's the kind of thing you're looking at. To get to this fire we had to drive through another fire. It was, it was weird and creepy
29:42
and humongous. And there were interesting technology issues. And I meant to say, when I was talking about Snohomish County: Snohomish County is what I would call a have county. They have resources. They have money. They have assets. They also came very, very close to bankrupting their county over an incident that was one square mile.
30:01
The state was actually looking into what they do with an insolvent county because they were spending more money than they ever would have. And Okanogan County is what I would definitely call a have-not county. Very low resources, very low funding. As a result, yeah? Sorry, at the beginning you said
30:21
there's usually a disparity between funding. So where does the primary funding for a disaster come from? Like in New Jersey? [Rest of question partly inaudible.]
30:43
No, it's okay. So, now I have to condense that down for the camera guy. Sure. Where does disaster money come from? Is that a good condensation? So, they're saying in emergencies, all disasters are local disasters. Which means ultimately the local community has to pay.
31:02
Some states have funds to make up the difference. My state, Washington, has a, we have a fund. It's empty. There's not a dime in it. This particular one was interesting because the state appropriated money to help this county
31:20
and then after the fact discovered it was illegal, and the state had to pay itself back. Yeah, that was weird. So, ultimately it's local; you should be planning. Like, you know, you should have a savings account. You should plan for disasters and emergencies just like you do in real life. Then you've got state government, which may or may not have a rainy day fund
31:41
or a disaster fund and it may not have actual money in it. And when you reach a certain threshold you hit what's called a presidential disaster declaration. And there's a, these are the big ones. These are the ones where you see FEMA went to a thing. That was a presidential disaster declaration. There's a threshold and that threshold is based on a per capita number.
32:02
So, you have to meet it within the county and then with the state as a whole. And often these small counties can't meet the threshold because they have to meet the threshold for the whole state. They can meet their local threshold but the state as a whole has to expend a certain amount of money. So, it's a complicated thing.
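That two-level per-capita test can be sketched in a few lines. The dollar rates below are illustrative placeholders, not FEMA's actual published indicators (those are adjusted annually), and the function name is my own:

```python
# Rough sketch of the two-level per-capita threshold logic described above.
# COUNTY_RATE and STATE_RATE are illustrative placeholders, NOT FEMA's
# actual published per-capita impact indicators.

COUNTY_RATE = 3.50   # hypothetical dollars of damage per county resident
STATE_RATE = 1.50    # hypothetical dollars of damage per state resident

def meets_declaration_threshold(county_damage, county_pop,
                                state_damage, state_pop):
    """A county qualifies only if BOTH its own per-capita damage and
    the statewide per-capita damage meet their thresholds."""
    county_ok = county_damage / county_pop >= COUNTY_RATE
    state_ok = state_damage / state_pop >= STATE_RATE
    return county_ok and state_ok

# A small rural county can clear its local threshold while the state
# as a whole falls short -- the situation described in the talk:
print(meets_declaration_threshold(
    county_damage=500_000, county_pop=40_000,      # $12.50 per capita
    state_damage=5_000_000, state_pop=7_000_000))  # ~$0.71 per capita
```

This is also why a big statewide fire season can push a small county over the line: the county's own numbers don't change, but the statewide total does.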
32:23
And Snohomish County, where the landslide happened: had they not gotten a presidential disaster declaration, it would have bankrupted the county, but it did not. They got a presidential declaration, which opened up federal funding for it. Okanogan, however, very small.
32:41
The thing that saved them in this is there were fires all over the state and ultimately they did meet the threshold in the state. Yeah, go ahead. What are the consequences of a county or city going bankrupt? Well, that depends on where you're at. California has had this issue.
33:01
California for a while was issuing IOUs to employees. I think it was Orange County. Yeah, ooh. How long are you going to stay working for a place that owes you money? So, I worked for an ISP a long time ago and they still owe me money, but I don't work there anymore. Like, they quit paying me and I quit going to work, because why would you stay? So, it really depends
33:20
on state laws. Part of this has happened in Michigan. Michigan has had all of these weird insolvencies. They've had all of these Detroit-area bankrupt towns, and they assigned financial managers to them, which is exactly how you ended up with the Flint water crisis. Exceptionally poor decision-making
33:41
because of money. So, there's no one answer to that. It really depends on how each state is structured and what kind of government they have. Sorry, I'm a government nerd. I could go all day just on that, but I will not, because you'll go to sleep. So, Okanogan County,
34:02
very small. Large geographically, more cows than people. They had, wow, somebody let that guy in. They had a huge problem with information dissemination. With lifelines cut, phones not working, people couldn't get messages that were going out. They didn't have a reverse dial system.
34:22
So, we use these a lot in disasters. We have a system that we pay money for. It calls you to tell you things you don't know or things you need to know. They didn't have one of those in Okanogan County. It never dawned on them. They use Facebook a lot, actually, to very good purpose, but
34:41
one of the things that happened is one of the responding entities said, hey, I'm going to call my vendor for the company that I use and I'm going to ask my vendor if they will just one time let us use our system to make a call in your county. The vendor said yes because they were good guys and they're like, yeah, sure, that's fine. But unfortunately,
35:00
that kind of set a precedent. Now, a call has been made in this county and now people are like, oh, we have this thing. But they don't have the thing. When the visiting county left, that technology left with them. It was not sustainable. And so, this information dissemination piece, we had to go old school. It was very complicated.
35:21
Well, it's not complicated. You put up A-frame signs at the grocery store and the library and you hang flyers on them. But that doesn't get everyone in a county like that, where there are multiple languages spoken. Most of the stuff was being done only in English. Finally, they got a partnership with public health for their Facebook stuff, and they made two Facebook pages,
35:41
one in English, one in Spanish. And literally, some volunteers from public health took every message we posted in English, translated it in real time and put it out in Spanish. And it kind of became a model for our state and how that was going to work. But it was unsupportable long term because once the crisis was over,
36:00
the public health volunteers went back to doing whatever they were doing and there was nobody left in the county who could do that work. So, they created something that was not sustainable and yet which now maybe some of the population expects to find there in future disasters and they've got fires going on there right now. It actually produced a new change in our state law which is,
36:21
it's a good, it's well-intentioned but it's really hard to meet. And then we had GIS. GIS was hugely useful. We had a GIS guy come in from another agency. He was doing the thing because they had, they had a GIS person in that county who was totally overwhelmed, had never done disasters and our guy was just cranking out stuff. And they finally came in and said, dude,
36:40
you've got to quit printing maps. That's all the paper we have for the whole year. We don't have budget for more. You have to quit printing maps. That was interesting; this was definitely the epitome of a have-not county. They didn't have the money to buy more paper for the plotter. When we used what they had,
37:01
that was all they had. And it was a great effort but again, it was unsupported locally. The point in this is decisions were made about the use of technology that could not be supported after the people who were helping went home. And that is an issue. It's a huge issue
37:21
in disasters when you create a system and it's awesome, it works. And then the people who created it went home and now there's no system. It doesn't solve a long-term problem. And it's a short-term fix that now sets an expectation that cannot be lived up to by whoever stays behind
37:41
because it remains a local incident, always. So, I'm going to talk about Hurricane Harvey, a very, very, very small sliver of Hurricane Harvey. In case you missed it, it was last year. It was very, very large. It was this big red blob and the eye of it went over Rockport, Texas. And it was category four. I ended up in Rockport, Texas
38:01
with an organization called the Field Innovation Team. I don't know if anybody has heard of them. They're a nonprofit. They do exactly what it sounds like they might do. They do innovation. Some of it is technology, some of it is not. So, this was Rockport when we got there. There was nothing. This slide I borrowed from another presentation I did
38:21
but there's no power, there's no water, there's no sewer. There were no kids because school had started already. Every family with a kid relocated to someplace that had schools. It was weird. There were no animals because they took their animals, except for wild pigs. There were occasionally wild pigs that would go barreling through things.
38:41
There was almost no healthcare. There was a curfew. There was debris everywhere. There were lots of sunken boats. And this last one that's somehow fallen off. There were resources everywhere. This is not debris. This is donations. It became debris because after a few days of sitting in Texas,
39:00
now there's snakes and other things that have moved into it and it's kind of damp because, yeah, gross. So, resources everywhere. There were all kinds of groups that are helping. NGOs, private companies, and there was no organization around it. We obviously couldn't
39:21
tackle that problem. When we looked around, we thought, okay, there are some things that we can help with. We can help them with their donations management if they'll let us. We can help them with their community health issue if they'll let us and we can form some partnerships that can let these things continue after we leave.
39:41
And we got permission. I invited myself to a meeting. Yep. Well, actually I got myself invited to a meeting, and it was the only meeting I needed to go to. I'm an emergency manager, and the local emergency management guy looked at our group of people and said, you can go.
40:01
All right. And it worked out, because I talked to the two people we needed to talk to and got permission for our group. We actually did six things, but we only needed permission for two. So, we did donations management in an old parts store that, we didn't know, there was no contract for
40:21
that had been severely damaged by the storm that was running on a generator which if you look closely at this picture, so this is an electrical panel, this is an extension cord wired to the electrical panel. That extension cord goes out around a corner and outside to a generator. If your butt hit that cord,
40:41
all the power went out. Uh-huh. There are so many issues with this building, so many, but it's not my disaster. I'm there to help the local community and this is what they're doing. So, you have a choice. Do I support them knowing that what they're doing
41:01
is maybe illegal? Probably. They're probably, they don't have permission to be in this building. This building is absolutely going to be condemned. Like, I looked at it and went, ooh, how are we using this building? Well, somebody had the keys. Like, a property manager dude who's known in the community, it was still standing,
41:21
he had keys, he found two random dudes and said, hey, you want to set up donations in there? And they said, sure, who are you? And keys were exchanged and things happened because disasters sort of break our rules sometimes. This is what they needed to do. And we had a choice as a group. Like, we knew, we knew eventually things
41:41
were going to go weird here. But, so we could either support what they were doing or we could leave. But we went there to help, so we went there and we supported what they were doing, helped them organize. But what they really needed, in addition to organizing, they needed an ability to publish their needs. They had real needs and they had things that they had to get to support their community.
42:00
And so they were doing the best they could but they didn't have technology. I mean, they did. Cell phones were working, which was good. Cell phones worked great. Best data service I think I've ever had. It was amazing. But nothing else worked. And so we said, we can help you.
42:20
We can help you organize. Like, we, I'll say we hacked a solution to this donations management. We scoured through rubble and built signage and somebody had spray paint and we made signs. Like, we did all of it with the things we had on hand. We didn't bring in new stuff. We just used what they had. And, but we said, we can help you organize this a little bit. So we set up a system for them
42:40
with a partnership and I'll talk about them. Anybody familiar with ITD? ITDRC. ITDRC is an amazingly wonderful nonprofit organization. Information Technology Disaster Resource Center. They're a huge nonprofit. Like you might guess from their name, they take technology of all kinds
43:01
to disasters and they support it for as long as it needs to be supported. They're a great, great, great organization. But we, we ended up partnering with them because they were there and I know them. And so, here we all were together and I said, we need to set up internet for this donation center because they need to be able to reach out and tell the world what they need.
43:21
We need internet. And they said, we can do that. So they called their friends Dish. Dish sent us a VSAT which was an amazing day and it worked great until anybody would bump the generator and then we had to restart everything. I know. It is what it is. But this happened because of partnership because we didn't want to build this and then say,
43:41
okay, we're all going home. You're back on your own again. The very cool thing about this donation center, eventually, yes, the property owner is like, no really, you got to get the hell out of my building. My altruism is done. My insurance needs to pay me out. But my insurance won't pay me because the building is occupied so why should they? So that became a whole thing. But this operation for donations management
44:01
turned into their own local nonprofit organization doing long-term recovery. And it started right here with a laptop and a VSAT and a bunch of people just trying to make the community better. And so, and it was important for them to be able to do that. We helped them figure out how to set up a Facebook page. Like this is
44:20
one of ITDRC's laptops, which unfortunately decided it needed a Windows update over a VSAT. So that was kind of, I know, I know. I was like, yep, it's going to happen, I'll see you in a couple hours. Yeah, couldn't stop it. It happens. But they were able to leave that for as long as the community
44:40
needed it. And then we helped teach the people there how to do this. They didn't know how to set up a Facebook page or any of that. But it was useful, because that's where people were looking for information. So they set up a Facebook page. And they started pushing out the information that they needed. And people started responding appropriately.
45:02
It's interesting because, as an emergency manager, I say: don't show up without being asked, and don't send donations without being asked. And this changed my mind, because Rockport was overlooked by everything. I said there were all kinds of people there, but there weren't resources. There wasn't money. There was no Red Cross shelter. There was no shelter at all.
45:21
There was nothing. Well, there was food. Some of the best disaster food I have ever eaten in my life. Mercy Chefs. If you ever go to a disaster, make sure it's one where Mercy Chefs are, because they kept everybody fed. But nothing else was there. So this helped them then become their long-term recovery organization. Their Facebook page
45:41
has changed names. But it's the same page they started with and it chronicles all the way from when they set it up to now doing their long-term recovery. And that, I mean, it's been a year, not quite a year, and that they're far, far, far from recovery. They lost a lot. But one of the other things we did,
46:00
there were huge, huge, huge health concerns. They don't have public health in Texas, or at least in Aransas County, in the way that I'm used to public health happening. Meaning they don't actually have a public health department; there's an authority that does health things for people. And I come from Seattle, I know. They check on sewers and stuff like that. But they don't have public health services like I'm used to.
46:21
So there was no communication to the public about public health. There were still people living out there. This picture: we found a Buddhist monastery. Some of the locals didn't even know there was a Buddhist monastery there. But we sort of did a survey and we're like, hey, there's a Buddhist monastery, and this guy did not know that there was a boil water order.
46:42
It got to him. There were boil water orders for days, meaning the water's not safe to drink. You need to boil it. He didn't know. Other people didn't know. So they had limited health care availability and information was not getting out. And that's an issue. I mean, that can lead to a whole secondary disaster.
47:01
And for me, like, for me, this was a huge ethical issue on the emergency management side. Not the technology side, but the emergency side. Like, in my heart, I don't know how this happened. I don't know, but I can't get over why it happened. We did a couple of things. We did a survey of where all the resources were in the community. Where is all of
47:20
the stuff going? And we put it on a map, a Google map, and just labeled it. And we distributed that to the community. And now they knew where they could go to find stuff. And if you wanted to bring donations, you knew where to go to drop them off. And then we took that
47:41
and somebody came up with a QR code for it and a way to get to the map. And we posted these flyers all over town so that people could get information about their needs and their resources and what to do with them. And then a little bit of mental health stuff because there was no mental health going on in that community. None. We were deployed with a partner team that was doing mental health
48:01
and they were the only ones there. And this was a week and a half, almost two weeks in by now. Nobody was there. And then we set up a health information line. This was a thing where we had to partner again. Someone donated a phone line, and FIT set it up,
48:21
a call-in line. And we found a random person to run it. She lived locally. She'd finished med school but hadn't taken her exams yet. She was bilingual in English and Spanish. She was exactly who we needed. And I called the guys at ITDRC and said, hey, can you get her a computer? Because hers had been damaged.
48:40
They said sure. We'll give her a computer for as long as she needs it. And so she was able to keep that phone line updated with information so that people could call a phone number and get in English or Spanish the status of pharmacies, health clinics, hospitals, basic, whatever basic stuff they needed. And it could go
49:01
for as long as was needed. And because we had set up Internet, and we got her a computer, she kept it going well after we left. And so this whole thing, I think, is a good example of making sure that what you're doing is sustainable and works for the community and with the community.
49:21
Not going in and doing it, but helping them do it. So whatever it is, continues long after you leave. And I threw in this one about the Coast Guard. The Coast Guard, the official messaging from the Coast Guard was if you need help, call 911 or call the EOC or call the Coast Guard. Don't put it on
49:40
social media. But the reality is the Coast Guard knows people are putting it on social media. And this is public knowledge now, so you can Google it and read the story. They partnered with the Coast Guard Academy and assigned a couple of cadets to interface with a couple of nonprofits, the Standby Task Force and Humanity Road, who did social media monitoring for them,
50:00
found people who needed to be rescued, put it in a Google spreadsheet, which was monitored by a Coast Guard Academy cadet, who put it into their database, cross-referenced it, and started saving people's lives. Very, very basic stuff. But now they have a policy change. This is what they do now. The ethical thing in here is,
50:20
and it's one of those like, oh, they're saying one thing and they're doing something else because they couldn't go against official policy and yet they knew that they needed to be able to save lives. Go ahead. Yeah.
50:48
Cajun Navy. Cajun Navy. Yes. Yeah. Yeah, and they were, so Cajun Navy did a lot of rescue stuff and they used Zello
51:01
for that. Yeah. They should stick to boat rescue. I'm just going to leave that right there. I had an interaction with them which was most unpleasant; it had to do with water and boats, and they just needed to get the hell out of my town. Yeah. It was, I think, a rogue person, but, you know,
51:21
sometimes a rogue person speaks for the whole group and it's not good. So, yeah, they did and so there was a little bit of that cross-referencing too because now you've got four different groups potentially that are trying to rescue the same person. And so long term they're working on fixes for that because you don't want to duplicate that effort. You want to be able to cross-reference all that information
51:41
and send one group to rescue Granny off the roof, and not four groups. But we're working on that. Like, that's part of what the Coast Guard was doing. They were cross-referencing some of each of these things. There was duplication, but the important part was people got saved. The amazing thing
52:00
to me is nobody died in Houston, at least in that initial thing. Like, nobody drowned and that's huge and people were rescued. So, official change. Now Coast Guard does that. That's part of their MO now. And this I throw in there as a reminder of what happens when we don't do things right.
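As an aside, the cross-referencing the Coast Guard and the volunteer groups were doing, collapsing duplicate rescue requests arriving from several feeds, is at heart a keying problem. Here's a minimal sketch; the field names and the round-the-coordinates matching rule are my own illustrative assumptions, and a real system would need much fuzzier matching:

```python
# Minimal sketch of deduplicating rescue requests gathered by several
# monitoring groups. Keying on a normalized name plus coarsened
# coordinates is an assumption for illustration only.

def dedupe_requests(requests, precision=3):
    """Collapse requests that share a name and roughly the same location.
    Rounding lat/lon to 3 decimal places groups points within ~100 m."""
    seen = {}
    for req in requests:
        key = (req["name"].strip().lower(),
               round(req["lat"], precision),
               round(req["lon"], precision))
        if key in seen:
            # Duplicate report: remember the extra source, send one boat.
            seen[key]["sources"].append(req["source"])
        else:
            seen[key] = {**req, "sources": [req["source"]]}
    return list(seen.values())

reports = [
    {"name": "J. Smith", "lat": 29.7604, "lon": -95.3698, "source": "Standby Task Force"},
    {"name": "j. smith", "lat": 29.7601, "lon": -95.3701, "source": "Cajun Navy"},
    {"name": "A. Jones", "lat": 29.8000, "lon": -95.4000, "source": "Cajun Navy"},
]
unique = dedupe_requests(reports)
print(len(unique))  # two distinct people instead of three reports
```

The design point is the one from the talk: the goal isn't clever matching, it's making sure one group, not four, goes after each person, while keeping track of everyone who reported them.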
52:20
So, Hurricane Katrina, Memorial Hospital. Anybody familiar with this? They lost all communication with the outside world. All of their technology failed. And they euthanized people. They killed people because they didn't want them to suffer. It's a valid response.
52:41
It is an absolutely valid ethically based response to hopelessness. You have no hope of rescue. You don't think anybody's coming. But it was completely avoidable. Completely 100% avoidable. And one of my graduate students just wrote his thesis paper on technology
53:00
in hospitals and he said something that stuck with me. He said if they had even done the bare minimum, those people would still be alive because bare minimum technology, they could have had a ham radio and talked to somebody. They could have had any number of things. They could have had a satellite phone and talked to somebody and known that help was coming. And ethically,
53:21
the failure didn't happen in the response. I see two of you. I'll get to you. Ethically, there was not a failure in the response. It was in the planning. They met the letter of the law and the letter of accreditation, but that did not solve the problem. Go there.
53:44
So five. So the question is how many and why did I say euthanized? Five is the number, if I'm remembering correctly. And I use that word because though they were charged legally with manslaughter, they were not convicted
54:01
because they were found to have acted appropriately. And so, by legal standard, they were not murdered. So, go ahead. Do hospitals currently carry radios in case of communication failure? So, hospitals are required by standards to have redundant
54:21
communication systems. Check the box. Move on. They're not required to test them. They're not required to know how to use them. They're not required to do anything except have redundant communication systems and that is in the eye of the assessor. So if you have a phone
54:40
and you have internet, that might be your redundant communication. Now what they should have, yes, absolutely, they should have radio systems that can talk to the outside world. They should have ham radio systems. They should be doing, but they're not required to. And healthcare in particular, if they're not required to do a thing, they often don't do it because it costs money.
55:01
So this is just a reminder of how technology should have been used in advance and the decisions that led them to not do it led to five people losing their lives, ultimately. There's a laundry list of things that are coming in disaster management. Self-driving cars.
55:20
For me, that solves a huge problem. It helps us get people out of harm's way if there's an evacuation, but they also just randomly hit people because self-driving cars every once in a while are like, hey, screw you and hit somebody. Automated translations. That's a thing that's becoming more and more urgent that state laws are requiring that all things be translated.
55:41
We don't have a capability to do it well yet. Big data. There's huge implications in machine learning and AI. There's some group out there that claims that they are predicting earthquakes. And they got up on a stage at a really important conference and they got challenged by multiple people and they just basically said, shoo, we're smarter than you.
56:01
And it was ugly because they're like, well, we know, but we don't know what to do with the information. And I called bullshit on that. I said, of course you know what to do. If you know that something's going to happen, you call the authorities and they didn't really like that answer because they can't really do this. They can't do earthquake prediction yet.
56:21
And there's tons of things. Like, there's things we've never thought of, all of which are going to lead us to more problems. So this is a little bit on ethical decision-making. You probably recognize the frameworks: consequentialist, what are the consequences? Deontological, what's my duty? Virtue ethics, what's the virtuous path to take? All of them have this similarity.
56:40
They're all a deliberative process. They focus on their core principles. They each have a definition of ethical conduct and they have a clear motivation. And in disasters we have to use all of them. In disasters, we have to merge these things together to go through, I'll call it a checklist. This is adapted
57:00
from something written by Naomi Zack. Which I think, when we're making decisions about technology in disasters, we can use these questions to help guide us to the right answer, or a right answer. And it's important that we do this not alone. You can't make these decisions alone. So we have to ask things like: what's our moral obligation?
57:21
What are we morally obligated to do and provide? Is our solution adequate? Does it actually solve a problem? And is it fair? Does it unreasonably exclude some portion of the population either because we didn't think about them or we didn't know about them? Are we,
57:40
as individuals, ready and able to take care of ourselves? I can't even tell you the number of times people show up to help in a disaster completely unprepared to take care of themselves. First responders do this. The best guy I met in Hurricane Harvey, though, he had come from California. Great, great, great dude. And I put him to work
58:01
directing traffic the instant he walked up. I'm like, are you afraid of cars? No? Good, put on this vest and go help that guy in the street. And that was how he started doing disaster response. But he came prepared. He flew all the way from California, rented a car, threw in living supplies, and he bought beer to share. So I have pictures.
58:21
They're not in here. I have pictures of people sitting around at the end of the day drinking his beer. But he came prepared to help with whatever. He didn't care what he did, just help. And he was useful. But he was able to take care of himself. Who's obligated to do something and how can we support them? And I think this gets overlooked a lot.
58:41
We don't say, who's actually responsible for doing this, and how can I help them? We just go, oh, I see a problem, I'm going to fix it. Well, now you've got 15 people trying to fix the same problem, and none of them are communicating with the people who are actually responsible for it. [Audience question] Where do you see the dividing line for, say, a small town
59:01
on a port that maybe every three or four years gets nailed by something like this? The big obligation I see is, shouldn't they have the responsibility to at least have some basics, not everything? When they cry foul
59:20
when weather happens... I was in Texas for quite a while, and I think some small towns... Yeah. I'm going to condense that question: I said people have an obligation, but organizations also have an obligation. They do. Towns, cities,
59:41
they do have an obligation, and sometimes they think they've met that obligation. And that's part of the issue: how do those organizations meet those obligations in a way that's effective? They absolutely have an obligation. They usually have a legal obligation, and they often think they're meeting it.
01:00:00
And that's what I've usually found with small towns: they think they're meeting their obligation because they don't understand the problem. And that's a huge, huge issue on the disaster side: not understanding the issue. So, does our solution ensure safety and security? Does it make things safer and more secure,
01:00:21
in terms of the people, but also in the solution itself? Did we build a solution that we've secured in a way that's reasonable? Or did we just put it on a Google Sheet for everybody to see? And people do that. I have found appalling things on Google Sheets that were compiled in disasters and just left out there as now-public information
01:00:43
that should never, ever be there. Very personal information that should never have been on Google. Does our solution ensure dignity for survivors? There are lots of ways you can look at that, but are they involved in the solution? Are they involved in the planning of the solution?
01:01:02
Are they involved in carrying out the solution? That really helps ensure dignity. People generally don't want you to come take care of them. They want you to come help them fix the problem. And I think that gets to this: does our solution address an actual need? Or do we just have a cool toy
01:01:21
that we want to try out, and we're going to try to convince these people at this disaster to let us try it? If there's not an actual need that's articulated, then we shouldn't be there. And this question: who is not served by our solution? If we've identified a problem, and we've identified who's affected by it, have we identified who is not served by that solution?
01:01:43
And I think this is the piece that often gets left out. Who did we skip? And who did we make things either worse for, or just not better for?