Ethics Village - Law Professor Round Robin
Formal Metadata

Title: Ethics Village - Law Professor Round Robin
Series Title: DEF CON 27
Number of Parts: 335
License: CC Attribution 3.0 Unported: You may use, modify, copy, distribute, and make the work or its content publicly available for any legal purpose, in unchanged or changed form, provided you credit the author/rights holder in the manner specified by them.
Identifiers: 10.5446/48336 (DOI)
Language: English
Transcript: English (automatically generated)
00:00
First up, we have a law professor round robin for you. We couldn't find any robins that were round. So instead, we've brought some sparkly other birds that we'll pass around to create some visual interest, in case we get boring.
00:20
So behind me is a banner for a policy lab that I run at Penn State. And up here, I have a stack of stickers that feature a cow wearing a wig, hopefully in an aesthetically pleasing fashion, that has the handle for the Twitter feed for the lab.
00:41
So if you're interested in the intersection of security, policy, and law, please do grab a sticker. Or if you just like cows, that's cool too, that's fine. Grab a sticker, follow us on Twitter, and we look forward to your thoughts in the discussion section today.
01:01
So first, we have this law professor round robin. After that, we have a one-on-one with me and former FBI member and former National Security Council cyber lead, I'm not sure what his official title was
01:21
during the Obama years, Anthony Ferrante. After that, we have Dr. Suzanne Schwartz from the FDA talking about the new medical device guidance that's coming out. Also one-on-one in conversation with me, in each case followed by open season for questions, of course. And then after Dr. Schwartz, we have Josh Steinman.
01:48
Josh Steinman from the White House, who is the current National Security Council cyber lead. And then after that, we have Erie Meyer,
02:02
the chief technologist for Commissioner Chopra from the FTC. So please stay for as much of the afternoon as you are able to, and give us your best questions, because we're hoping to trigger an engaging conversation.
02:22
And thanks for being here. So without further ado, first up, we have a discussion among three opinionated law professors. I'm Andrea Matwyshyn. I am a professor of law and a professor in the engineering faculty at Penn State, the associate dean of innovation
02:42
at Penn State Law. This is my 16th DEF CON, so I've been around a while. And to my immediate right is Professor Stephanie Pell from West Point. And to Stephanie's right is Professor Margaret Hu from Washington and Lee Law School.
03:01
So what we're going to do is we're just going to sort of start talking about some of the key issues that we see happening in the intersection of security and law, and interrogate each other a little bit, and hopefully trigger some questions for you all. And then we're going to open it up for discussion.
03:23
Since this is the ethics village, can I give my ethical disclaimer? Please. OK, so as Andrea said, I work at West Point. And in doing so, every time I open my mouth, I have to say these are my personal views and do not reflect the views of West Point, the US Army, or the United States government.
03:40
So thank you. Sure. I'm ethically compliant now. You can complain to my deans about my views if you want. It's OK. I'm tenured. It's fine. So Andrea and I actually met almost a decade ago at a privacy-related conference. But what I learned about Andrea over the years
04:03
is that a lot of her scholarship interests, her policy interests, and her longtime interaction with the security community are based on her interest in security, information security,
04:22
and what we now have to, I guess, say cyber security. But she was doing this well before the cybers came along. So can you talk to us, first of all, about how, as you approach various policy questions, you distinguish security from privacy?
04:43
Sure. So I'll start with fleshing out the cyber allergy that you've referenced. So here are the reasons why I tend not to use the word cyber security. So first is just sort of a logistical one,
05:01
that in law, we've had a generation of courses called cyber law that referred to internet law. So by calling security courses cyber law, you end up with cyber cyber law. And that just sounds like I'm stuttering or something. But in reality, part of the concern is that when we talk about cyber security,
05:23
we seem to be emphasizing an internet component. But of course, security is physical as much as it is digital. And so think about the fact that if you have physical control over a box, you can do as much or more damage than you can over a remote connection.
05:41
So the digital and the physical need to be interwoven when we're thinking about the attack surface of different situations and the possible harm that attackers can cause. So with that, I'll shift to the question about the distinction between privacy and security. In policy and legal circuses, I guess
06:00
that's not wrong, but that was a Freudian slip, circles. Those two issues get confounded. And why does the distinction matter? Well, it matters partially because of the unit of analysis. So in security, we're talking about whether systems can successfully defend against attacks on the confidentiality, integrity, and availability of the system.
06:22
In privacy, the unit of analysis is an individual, an individual's expectations, and what the negotiated deal was around those expectations in light of a particular party's data collection, storage, handling, and sharing practices. So security is about successful defense
06:43
against an attacker, Mallory. And the privacy side is about what Alice and Bob negotiated with each other, whether Alice kept her promises to Bob, and what consequences follow if Alice doesn't keep her promises to Bob.
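To make those two units of analysis concrete, here is a minimal Python sketch; the names, key, and policies are entirely hypothetical. The security question asks whether the system resists Mallory's tampering; the privacy question asks whether Alice honored the deal she negotiated with Bob.

```python
# Minimal sketch: security vs. privacy as different units of analysis.
# All names, keys, and policies here are hypothetical.
import hmac
import hashlib

SHARED_KEY = b"demo-key-not-for-real-use"

def message_is_authentic(message: bytes, tag: bytes) -> bool:
    """Security question: did Mallory tamper with the message (integrity)?"""
    expected = hmac.new(SHARED_KEY, message, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

def handling_kept_promises(promised_uses: set, actual_uses: set) -> bool:
    """Privacy question: did Alice use Bob's data only as negotiated?"""
    return actual_uses <= promised_uses

msg = b"transfer $10 to Bob"
tag = hmac.new(SHARED_KEY, msg, hashlib.sha256).digest()

# Mallory alters the message in transit: the security check fails.
print(message_is_authentic(b"transfer $99 to Mallory", tag))  # False

# Alice shares with an ad network Bob never agreed to: the privacy check fails.
print(handling_kept_promises({"billing"}, {"billing", "ad_targeting"}))  # False
```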
07:02
So what that means is that from a policy standpoint, you can see many security win-wins in ways that you can't as neatly frame win-wins on the privacy side. And it means that security challenges are, by design, more tractable when they're framed that way in policy conversations and in legal conversations.
07:21
But this frame is often blurred or blended, and that's why sometimes we don't get the progress on security issues that those of us in this room would probably almost all agree are kind of low-hanging fruit issues about encouraging companies to patch known vulnerabilities that
07:41
have a significant negative impact on their users or on consumers or on national security, et cetera. So there's the security-privacy divide. The last piece that I'll highlight that comes from my scholarship is the recognition that is intuitive in this world, but not
08:02
intuitive in policy and legal worlds, that vulnerability is reciprocal. It never stays in just the private sector or the public sector. The vulnerability exists wherever the code exists. So you can't think about policy approaches or regulatory approaches in a truly segmented way
08:20
because you're not going to address the problem as a whole. The attacker's going to go wherever the easiest point of entry is, whether that's a private system or a public system. Similarly, what we see from the national security compromises that we've encountered during the last 10 years, it's often private sector contractors
08:41
that are the point of entry, or where the insider attack happens, rather than the public sector organization itself under a strict framing. So the blending of private sector and public sector in performing public tasks, and the fact that
09:00
the code is going to be vulnerable wherever the code resides, regardless of whether it's a private sector or public sector entity, I call that the problem of reciprocal security vulnerability. And again, while that's intuitive here, it's not intuitive in policy circles. So with that, maybe I'll shift over to Margaret
09:22
and ask Margaret, tell us a little bit about the privacy side and the kind of stuff that you research. Yeah, great. Thank you so much because I am relatively new to the security world and it's such a pleasure and privilege to be here today. As Andrea said, a lot of my research focuses on the privacy side and in particular, my research focuses
09:41
on the constitutional law privacy rights that I think can be brought to bear on the way that these technologies now influence what I consider our core fundamental freedoms and liberties that are protected under the Constitution. So I'm working on a book that is called
10:01
The Big Data Constitution. And the thesis of the book is that our Constitution was forged in a small data world. And so our founders thought about small data power that was available to the government and what kind of restraints and constraints did you need to build into the Constitution
10:21
to try to preserve forms of democratic governance. But now that we have big data power, we have to really reconceptualize what does it mean to interpret our Constitution in order to protect those rights, liberties, freedoms, and privileges. So I wanted to quickly summarize a few pieces that I've published.
10:40
And I think that that helps to contextualize what I mean by reinterpreting the Constitution to encompass these new types of algorithmic decision-making tools that the government has, big data cyber surveillance, machine learning, et cetera. So one of the first pieces that I wrote on this topic was called Big Data Blacklisting.
11:01
In this article, I said that, look at what the government is doing with things like the no-fly list. They're creating forms of governance that have huge amounts of asymmetric power. So there's no way to interrogate exactly what are the algorithms that are being used or the data that's being collected, but suddenly you're being told you can't fly.
11:22
And what kind of constitutional remedies do you have if you find yourself on the no-fly list? Right now, in the no-fly list litigation, for example, it's procedural due process. You can say, I was denied due process under the law. I don't know how to challenge the process in which I was nominated to the no-fly list,
11:41
and I don't have a way to challenge how to get off the no-fly list. So what I said was, we actually probably need to switch it over to something called substantive due process. And under substantive due process of the Constitution, you basically are arguing that no amount of process by the government can cure the violation, that once the violation has occurred,
12:02
there's no way for the government to cure it at all. And I said that these types of big data systems and the types of harms that they incur have to fall within a different type of remedy, such as substantive due process, because what's happening with these big data systems is a lot of the harms are correlative.
12:21
They're stigma harms. They're harms that focus on basically harming our digital selves or our cyber self or our digital footprint, not necessarily even targeting a human or a person. And that's how our Constitution first envisioned our rights, that it would impact us as humans,
12:42
not our cyber selves or our digital selves. Another piece that I wrote was called Algorithmic Jim Crow. And in this piece, I look at the need to vindicate equal protection rights in light of the fact that you can have disparate impact or targeting of minority communities through algorithmic decision-making.
13:01
But you can't challenge that under the 14th Amendment Equal Protection Clause, because a lot of these systems are applied equally to everyone, yet you still might have the same types of harms flowing that we saw with the Jim Crow regime. So Jim Crow regimes were set up on the basis of classification and then sorting.
13:21
So race-based classification and then sorting systems, or segregated systems. In Algorithmic Jim Crow, I said now we're gonna flip it around: you're not going to have a race-based classification system, but you're gonna have a risk-based classification system. And you're still going to have sorting. But because it's not race-based, you're not gonna have the Equal Protection Clause.
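A toy illustration of that flip, on entirely synthetic data: a risk-based rule applied identically to everyone can still sort people along protected lines when its inputs are correlated with those lines.

```python
# Synthetic example: a facially neutral, risk-based threshold applied to
# everyone equally can still sort along group lines when the input
# feature is correlated with group membership.
people = [
    {"name": "A", "group": "majority", "zip_risk": 0.2},
    {"name": "B", "group": "majority", "zip_risk": 0.3},
    {"name": "C", "group": "minority", "zip_risk": 0.8},  # proxy feature
    {"name": "D", "group": "minority", "zip_risk": 0.9},
]

def flagged(person):
    """The same threshold for every person: equal application."""
    return person["zip_risk"] > 0.5

for p in people:
    print(p["name"], p["group"], "flagged" if flagged(p) else "cleared")
# Equal application, unequal outcome: only the minority group is flagged,
# because the "neutral" input encodes historical sorting.
```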
13:42
Then in another piece that I wrote, called Orwell's 1984 and a Fourth Amendment Cyber-Surveillance Non-Intrusion Test, I look at the fact that, as some of you are aware, there was a warrantless GPS case, United States v. Jones, argued before the Supreme Court. And during that case, 1984 was brought up half a dozen times.
14:03
Why? Why would the Supreme Court justices talk about 1984 in light of the Fourth Amendment, which prohibits unreasonable searches and seizures? And I argue that we're now in the realm of not having the proper legal vocabulary. We have to reach to science fiction or dystopian literature
14:22
in order to try to preserve our constitutional rights. And so I think that that summarizes some of what I'm trying to achieve with my research. Thank you. So I'll jump in here because some of the questions or issues that interest me most have both privacy and security elements.
14:45
And if you don't separate them and think about the kinds of harms that can flow separately, I think you miss the bigger picture or problem. And to backtrack a bit: prior to teaching at West Point, for several years
15:03
I was a federal prosecutor. Being a federal prosecutor, you obtain orders from a court, issue subpoenas for various kinds of collection, various kinds of information. Law enforcement, as you're all well aware, uses various kinds of technologies
15:22
to surveil and collect information. There was one kind of technology that I didn't come to know about until after I stopped being a prosecutor that, frankly, I have to credit this community with. For a long time, there have been presentations
15:41
about IMSI catchers. And it wasn't until a longtime member of this community, someone named Chris Soghoian, who is currently working for Senator Wyden, explained it to me. We had collaborated, after I was no longer a prosecutor, on another piece about location data. And we were sitting on the steps of a yogurt shop
16:02
in Dupont Circle, having some yogurt on a hot summer day, and he started telling me about what IMSI catchers do. And it floored me a bit, I gotta tell you. The fact that this particular piece of equipment could impersonate a cell tower,
16:21
and in doing so, collect all sorts of information, spoof another phone. And at the time that this technology was known generally in this community and as Chris was doing his due diligence and investigating it,
16:42
it was certainly being used in the government, federal, and then it trickled down to state and local law enforcement. But again, I had never encountered it in my work as a prosecutor. And what really struck me, as it started to become more of a public conversation,
17:01
there was, I think, the first big article on it in a national newspaper, if you will; the Wall Street Journal, I think it was. Jennifer Valentino-DeVries did a story on a case about a very smart defendant who figured out that the way law enforcement located him
17:23
was with the use of an IMSI catcher. This is Daniel David Rigmaiden. He is now, happily, a free person. But he figured out that the only way they could have tracked him down to his data card, if you will, was by using an IMSI catcher.
17:42
So as this conversation started to develop in a more public way, what became interesting is that, appropriately, some people who normally sort of focus on privacy interests were talking about, well, wait a minute, how is the government
18:02
using the IMSI catcher, more commonly termed a stingray? What kind of court order, if any, is it getting? In other words, they were concerned about whether there was potentially a Fourth Amendment violation. And that's a very, very important conversation,
18:21
and of course, one that should occur because potentially Congress or the courts need to step in and say, look, if the government is gonna collect location data or various other kinds of non-content data that the stingray can collect, law enforcement perhaps needs a warrant.
18:43
And just to jump a little bit to the end of the story, DOJ has now issued a policy, it is not the law, but that any time law enforcement uses an IMSI catcher, it is supposed to get a warrant. But what was more interesting to me and what my now longtime friend, Chris Sagoian,
19:04
explained to me was that IMSI catchers exploit vulnerabilities in our cellular networks. So dealing with the privacy problem wasn't gonna fix the security implications of using stingrays.
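A toy simulation of the protocol weakness being described here; this is not baseband code, and every name and number in it is hypothetical. A handset attaches to whichever tower advertises the strongest signal, and in legacy protocols it identifies itself before the network has proven its own identity, which is what a rogue tower exploits.

```python
# Toy model of the cell-selection behavior an IMSI catcher exploits.
from dataclasses import dataclass, field

@dataclass
class Tower:
    name: str
    signal_dbm: int          # higher (less negative) = stronger signal
    legitimate: bool
    seen_imsis: list = field(default_factory=list)

def attach(phone_imsi, towers):
    strongest = max(towers, key=lambda t: t.signal_dbm)
    strongest.seen_imsis.append(phone_imsi)  # identity sent pre-authentication
    return strongest

towers = [
    Tower("carrier-cell-A", signal_dbm=-95, legitimate=True),
    Tower("rogue-imsi-catcher", signal_dbm=-60, legitimate=False),  # loudest
]
chosen = attach("310150123456789", towers)
print(chosen.name, chosen.seen_imsis)  # the rogue tower now holds the IMSI
```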
19:21
And I didn't personally attend Black Hat this year, but I understand from reading a piece in Wired that indeed security researchers have now found vulnerabilities in the 5G network that stingrays can penetrate. So this is all just to say
19:40
that when we approach surveillance issues, I like, and find it most interesting, to think about both the security and the privacy implications. So I will toss it back to you on that point. So now let's talk about the next generation of technologies that cross the civil
20:00
and criminal boundaries. And I call this the problem of the internet of bodies. So we know the internet of things connects all of the things to the network, from our talking toasters to our refrigerators that self-order food, or the apocryphal story of the refrigerator spamming people.
20:22
But what we also see, and all you have to do is walk into the biohacking village and you see this, is that there is a host of new technology that is embedded in and connected to the body. So we're all comfortable with the Fitbits and the Apple Watches and the, some of us Google Glass
20:41
that films things and stays always on. But when we're starting to talk about an artificial pancreas that needs security updates, when we're starting to talk about digital pills that talk to your phone with Bluetooth from the inside of your stomach, things that have already been FDA approved. There's a whole generation of these technologies,
21:02
some of which are medical devices, some of which probably are not going to be classified as medical devices. So what that means is that all of the challenges that we've seen on the security side with the internet of things, and some of you undoubtedly know a Twitter handle that has a colorful renaming of the quality
21:23
of the internet of things. We're gonna have those same problems in the context of the internet of bodies. So extreme interconnection, sometimes gratuitously so, high levels of security vulnerabilities, the problem of what I call builder bias, in other words, shipping fast without securing adequately, and a problem of impoverished choice,
21:43
that is, basically, the lack of choice in the degree of technological connectedness. And so I view this as a competition problem, basically. And so when we look at the question
22:02
of what we can buy in the marketplace, if you try to buy a car now that isn't powered by hundreds of millions of lines of code, good luck, right? Everything is now so code reliant that you don't functionally have the choice to pick the degree of technology connectedness that you want in whatever your estimation
22:24
is the appropriate degree of connectedness. So if you think about Battlestar Galactica, there's the story of the Galactica surviving because it was not connected to the internet, right? But when we're building, whether it's the internet of things or it's the internet of bodies devices,
22:41
and we aren't thinking through the extent of connectedness, we can sort of predict, in this community at least, how that's gonna play out, right? Things won't end ideally. And legally speaking, there are gray areas as to whether these kinds of devices are going to be classified as medical devices or just as internet of things
23:00
consumer devices, which means in one case, they would be held under closer scrutiny by the FDA. In the second case, it's only the FTC that would functionally be looking at these devices. And it's a much smaller agency with limited resources and the extent of enforcement would probably be significantly lower. We have an identified regulatory gap there,
23:23
but the fun doesn't end there. Fun. So think about every end user license agreement you've ever clicked on. They've become progressively more draconian over time. They've become longer. They're written by lawyers for lawyers; as a former corporate lawyer who used to write them, I can say with certainty
23:41
they're written by lawyers for lawyers. And so imagine that instead of a random website you're clicking on, it is the interface for your internet-reliant enhanced vision from the injected contact lenses,
24:02
an invention already patented by multiple companies. Your injected contact lenses, whether it's for AR gaming or for vision correction or for archiving what's happening in your life. So with the embeddedness of that device,
24:22
first of all, if it's relying on the internet, suddenly these secondary issues start to come into play. For example, which data plan did you pick? Is your provider going to be able to push you that critical update wirelessly to your eyes when you need it? What happens when you click yes and it says that they disclaim any liability
24:41
for any malfunction in the code of your injected contact lenses? What happens when the provider of those services needs a new line of funding and the venture capitalists that invest decide to change the terms of the data aggregation that you thought you were agreeing to?
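As a hedged sketch of what pushing that critical update wirelessly would minimally have to involve, assume a hypothetical vendor and implant: the device refuses any firmware it cannot verify against a key it already trusts. This uses the third-party Python cryptography package.

```python
# Hypothetical sketch: an implanted device accepts an over-the-air update
# only after verifying the vendor's signature with a key it already trusts.
# Device, vendor, and firmware strings are all illustrative.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

vendor_key = Ed25519PrivateKey.generate()        # held only by the vendor
device_trusted_pubkey = vendor_key.public_key()  # provisioned into the implant

firmware = b"lens-firmware v2.1: fixes critical parsing bug"
signature = vendor_key.sign(firmware)

def apply_update(blob: bytes, sig: bytes) -> bool:
    """Flash the update only if the signature verifies."""
    try:
        device_trusted_pubkey.verify(sig, blob)
    except InvalidSignature:
        return False  # refuse tampered or spoofed firmware
    return True

print(apply_update(firmware, signature))                       # True
print(apply_update(b"attacker-modified firmware", signature))  # False
```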
25:00
What happens when the company goes bankrupt? The bankruptcy court doesn't consider the interests of consumers generally. The bankruptcy court considers the interests of the creditors. Consumers frequently aren't even represented at the table in a bankruptcy proceeding. So all of those data aggregation repurposing issues that we see now in the first generation of bankruptcies of internet of things devices,
25:21
including bricking sometimes for patent reasons, those are all going to show up, one might predict, in the internet of bodies world, except the consequences are going to be physical harm to human bodies. And that's where the legal mess is gonna get really interesting, because courts won't be comfortable with the same kind of power balance
25:40
that we have now in the world of software in terms of as is, where is, you build it, it doesn't work, oh well, your operating system crashes, you lose the document you've been working on for four hours, it sucks to be you. When it's your eyes no longer working or it is your robotic arm, if you're say a veteran with a prosthetic
26:01
that is internet reliant, suddenly things start to look very connected to the physicality of humanity. And so courts are not gonna be comfortable with seeing plaintiffs come in and saying, your honor, my arm doesn't work anymore or I can't see out of my right eye because my left injected contact lens had a patent problem
26:24
and they wouldn't pay the money for the license fees. While this sounds like it may be sci-fi, although I think many of you are gonna be along for the ride with me on this because you know exactly how IoT security works, but we've had a similar problem in the context of medical procedures before and patents.
26:44
So in the 1990s, doctors started patenting medical procedures and because of those patents, other doctors started to shy away from performing certain kinds of surgeries that they considered to be in the best interest of the patient for fear of being held liable
27:01
for patent infringement. And to make a long story short, it is primarily because the doctors had a professional association, the AMA, that lobbied Congress, and they got a change in the law that said that no one could recover patent damages on those medical procedure patents.
27:23
So there are a few interesting tidbits in that story. One is the fact that a concerted community that cared about patient safety managed to get a change to the law; that was a big deal, and getting intellectual property law changed is an extra big deal.
27:40
It's really, really hard. Any of you who've dealt with patent or copyright or trademark disputes know that it's a free-for-all with bare-knuckle lawyering at the extreme. So that's one part of the story, the success story. The other part of the story is that it required a really concerted push in an organized way. And so when I take a step back
28:01
and I look at our world here, I see the same level of care about what happens to humanity and what happens to people getting hurt. But in this community, there isn't the same level of organization to be able to ensure that when bad things start to happen
28:20
we're there to nudge and to help correct the course. And to talk to the companies that are building some of these technologies about making sure that they're building things in the safest way possible and that the regulators are minding the store in the optimal way. So that's the world of the Internet of Bodies and I'll leave you on a particularly dystopian note
28:41
and then turn it over to Margaret because this was all the happy part by the way in case that wasn't clear. So any of you who've been reading Wired or the Wall Street Journal or following the adventures of Elon Musk know that there is a company that he's involved with called Neuralink and they have a cortical interface technology
29:05
and they're not the only company in the valley that's doing this. It functionally has a live read-and-write feed back to the cloud that is intended to augment the processing capability of the wetware of the human brain.
29:20
And these are healthy bodies that they're talking about, that are choosing to augment the processing power of their brains. And the way that the video that Neuralink released recently described it, there is going to be a component that sits behind your ear, and then there's a Bluetooth connection.
29:40
Footnote, Bluetooth, problems with Bluetooth. So they're going to build this read and write technology. So now let's shift a little bit and let's think about what it means to have other people being able to write things to your brain. When we think about what it means to be a human,
30:04
a citizen, a voter, in order to process our opinions about how we should be governed, how we should live our lives, we rely on not only the exercise of autonomous choice in voting but there's actually a precondition.
30:21
So Kant called this heautonomy. He talked about autonomy, which is going out, doing things, and demonstrating what your opinions are. But Kant had this thing called heautonomy, and it was the inside voice. It was you talking to yourself, just trying to figure out what you think. So in other words, the acts of autonomy that we all exercise in choosing courses of action,
30:41
choosing our leaders, those acts of autonomy require the precondition of a hermetically sealed thought process where we talk to ourselves about what we think. In a world where other people can write things to our brain, how does that self-contained process
31:01
of thinking actually work, right? Are we sandboxing pieces of our brains off? How are we doing this? And these technologies that are already being built, are they thinking about this enough? Are we potentially undermining the future of liberal democracy by running too fast, too hard in trying to augment human bodies that are healthy
31:24
with extra capabilities in ways that may bring unintended negative consequences, particularly for security and privacy. And with that, I will turn it over to Margaret with some other thoughts about elections, I think.
31:41
Yes, definitely. And I think that that dystopian segue is really very appropriate because as you point out, Andrea, I think a lot of these harms that we're now addressing with the challenges of these technologies are really societal-wide harms. And going back to the theme of what our Constitution was intended to do
32:01
is in many ways intended to protect individual-based rights against individual-based harms. But what if the harm is a harm to society? What if the harm is the harm to democracy? What if the harm is the harm to how we conceptualize human rights? What do we do then?
32:21
And because we're in the ethics village, I think it's really important for us to think about whether or not the law is even capable of protecting some of these rights and privileges and core fundamental values, or whether we need to turn to something like ethics, or whether we need to turn to things like art and literature.
32:41
So I think talking about dystopian literature, I think is really important because to the extent that we maybe lack the imagination as lawyers, maybe we need to turn to Philip K. Dick and Minority Report as a way to really understand what's at stake and what we need to do moving forward with these technologies.
33:01
So I wanted to talk a little bit about the foreign interference of US elections because I think that that was what bridged my interests between data privacy and cybersecurity. When you have something like Cambridge Analytica and you have the allegation that this contractor
33:22
for a campaign was capable of collecting anywhere between 2,000 and 7,000 data points on every US voter, over 200 million voters in the United States, and was able to aggregate that data and build psychographic profiles in order to influence those voters in campaigns.
33:41
And then you have the other allegations of the foreign interference with the cyber propaganda and other commandeering of our social media platforms and being able to spread disinformation, et cetera. What can we do under the law, especially in the United States when we have the First Amendment, when we rely on these new modes of communication
34:03
and digital economy? What is the role of the government to then step in and regulate? And then what do we do when we rely so heavily on the private sector, not the public sector, to secure those forms of communication and economy? And I think that it poses unprecedented challenges.
34:25
There was an excellent discussion yesterday here on, for example, cyber offensive measures as a deterrent against future cyber attacks, or trying to prevent and preempt future forms
34:41
of cyber warfare or cyber disinformation and cyber propaganda campaigns. But to what extent do we swallow up democracy by trying to protect democracy? And I think that that's a core ethical question that spans the worlds of both law
35:01
and cybersecurity and data privacy. And so I think that that's not something that I necessarily have an answer to, but I would love to invite a discussion on it here. So I'll end on an ethical point, too, I think, particularly on an unfortunately ongoing debate that is near and dear to this community's heart,
35:22
the going dark debate or whatever iteration of the crypto wars we're in. I find it interesting and honestly troubling that a number of government officials, when they talk about a need for back doors
35:42
or law enforcement access to our systems, frame it as a privacy issue. And what I mean is they invoke the language of the Fourth Amendment and talk about, but wait a minute, we're going to a court to get an order requiring a company to disclose information
36:04
or to be able to have their systems built in such a way that it can disclose that information. When you frame this, and I don't think I need to name names, there's a former FBI director and a current attorney general who is doing this,
36:22
but when you frame this in a Fourth Amendment context and try to shift this to just, well, this is how our Constitution balances government power against citizen privacy, you lose the fact that you are exposing and damaging information security
36:42
in ways that we probably all can't, well, you all get it, but I think we can't even appreciate the full iteration of how those vulnerabilities will manifest as different technologies interact. And so one of the things that I'm working on
37:00
with a colleague who is a philosophy professor is to try and talk about the ethics of framing this debate in the right way so that the security issues are better understood and protected. Stephanie, how would this play out in a world where we could read and write information from brains directly?
37:21
Oh my goodness. Well, I think we certainly have privacy, security, and literally body integrity security issues. Botnets of body parts is, you know, it's your term.
37:41
And on that happy note, we will open it up to questions, discussion, thoughts. Yeah, please. So you were talking about how to frame this debate. What about a Second Amendment challenge to some of these crypto and cyber laws? Meaning, we hear cyber arms control, we hear export controls on encryption.
38:01
It's very clear that the government views these as some kind of weapon. It's also very clear that the Second Amendment says the right to keep and bear arms shall not be infringed, and the Heller decision says that there is an individual right to keep and bear arms. Therefore, do I not have an individual right to strong crypto? So I'm gonna do a lawyer thing
38:21
and partly a DC policy thing. So that's an interesting theory. Given though what is going on with the Second Amendment and gun control issues right now, I'm not sure that the issue, at least in the way I see it,
38:41
in wanting to prevent back doors, I don't know that it is best framed as a Second Amendment issue. Perhaps as a legal challenge, but insofar as, at least, it is my hope
39:01
that we get better gun control laws. And, you may differ on this, but maybe not, I don't think that the Heller opinion prevents gun control in the way that some argue,
39:22
but I am slightly concerned, just living in crazy DC. I live in DC, actually on Capitol Hill, within spitting distance of the House of Representatives buildings. I'm sort of concerned, at least in some circles, about couching the discussion that way.
39:40
I don't know if that answers your question. I'll throw in one line. So whenever we're talking about legal strategy, you always have to think about the interaction of different strategies. So you don't want one strategy to damage the protection offered by a different strategy. And in particular, when you consider the interaction of the First Amendment with this kind of a legal argument, you may end up at the end of the day
40:02
with a less protective approach in the aggregate if you bring in a novel legal approach kind of from left field. So you wanna game it out, and so the First Amendment questions to the extent they've been developed, I would say probably will get us
40:21
to a greater level of protection than a novel Second Amendment argument would. Yeah, and I think the Heller opinion is very controversial in part because of the way that it divided the text of the Second Amendment into parts and ignored the first part of the Second Amendment. So the Second Amendment says,
40:40
a well-regulated militia, comma, being necessary to the security of a free state, comma. The right of the people to keep and bear arms shall not be infringed, and so the focus in the Heller opinion on that second part of the Second Amendment on the right of the people to keep and bear arms was the focus of that opinion, but I think that there's a lot
41:00
of constitutional law scholars who would question whether or not you can really cleave the Second Amendment in half and just focus on the second half and not engage with the first half. You guys mentioned... Wait, really quickly, if you don't mind trying to get to the microphone, just so we can all hear you.
41:21
Ross, yes. Thank you. You guys mentioned Fourth Amendment legislation that would potentially harm information security. I didn't quite follow that. Could you guys go a little more in depth?
41:41
So it was more that if an issue is framed simply as a Fourth Amendment privacy issue, the Constitution has struck a balance, basically saying that law enforcement
42:02
cannot search persons, houses, papers, or effects without a warrant. So what law enforcement would say, when it goes to a company wanting information, and the company says, look, we don't have access to it. We encrypt it in a way that we don't keep the keys.
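A minimal sketch, with illustrative names, of the design the company in this exchange is describing: the key lives only on the user's device, the provider stores ciphertext, and so a warrant served on the provider can compel only ciphertext. It uses the third-party Python cryptography package.

```python
# Illustrative sketch: the provider stores only ciphertext; the key never
# leaves the user's device, so a warrant reaches only opaque bytes.
from cryptography.fernet import Fernet

device_key = Fernet.generate_key()  # generated and kept on the user's device
f = Fernet(device_key)

ciphertext = f.encrypt(b"meet at 7pm")  # what the provider actually stores

class Provider:
    """Holds ciphertext only; has no key material at all."""
    def __init__(self, blob):
        self.stored = blob
    def respond_to_warrant(self):
        return self.stored  # everything the provider *can* disclose

blob = Provider(ciphertext).respond_to_warrant()
print(blob == ciphertext)  # True: opaque bytes, no plaintext
print(f.decrypt(blob))     # only the device-held key recovers the message
```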
42:22
Law enforcement would say, well, wait a minute, that's not necessarily in line with the way our Constitution balances privacy rights. If law enforcement comes with a warrant, then it should be able to get the information. What I would respectfully submit is that
42:41
that framing of the going dark debate as a privacy issue misses the big pink elephant in the room, and that is: if companies are required, via a statute or otherwise, to build their systems in a way that always gives law enforcement access to data,
43:03
the Fourth Amendment doesn't consider the security implications of all of that. The premise is the warrant, whether or not the warrant can reach that information; the Fourth Amendment looks to a warrant as the right way to strike that balance.
43:22
And so law enforcement says, and again, I'm not sticking this on all of law enforcement, we've had public statements by high-level officials, they would say: well, in this going dark debate, or the crypto wars, let's look at how the Constitution would talk about law enforcement access to data.
43:43
And I would just say, that's probably the wrong way to frame it. These are competing visions of security. On the one hand, you have public safety concerns that law enforcement traditionally investigates.
44:01
But on the other hand, you have these information security or cyber security issues that the Fourth Amendment just doesn't seem to be the right tool to address.
44:20
Are you aware of any cases where companies had to decrease the security level or encryption level because of legal requirements? So there have been some cases where there's been a fight with law enforcement. I mean, you're aware of, maybe, the Apple iPhone debate,
44:45
which sort of petered out in some respects because what happened? Law enforcement, as it is reported, I don't have inside information, was able to find a third-party vendor that could access the phone.
45:02
There's another case, and I don't know that I'd quite put it in that category, but the Company case, going back maybe 15 years, where law enforcement wanted to compel a car company
45:21
that provided various onboard services to wiretap the individuals in the car. And they were using this language in the Wiretap Act that basically talked about technical assistance
45:41
that companies had to provide to law enforcement. But the problem was, and what ended up preventing law enforcement from being able to compel the company to use the microphone is that when the microphone was used, it disabled the sort of emergency component of the car
46:04
that would allow a user to tell this third-party company, I need help, my car's broken down, I'm in danger. And so the Wiretap Act's language sort of limits that kind of technical assistance based on that kind of concern.
46:23
But presumably today, such things are not so interconnected, and perhaps the microphone feature could be enabled without harming the security features of the car. But again, the company had access;
46:43
it wasn't a matter of them not having the ability to wiretap, if you will. The issues that are coming up now, and there was some reference in a news story, I believe, about WhatsApp fighting a request
47:06
to compel encrypted data. In fact, I believe that the ACLU and Riana Pfefferkorn from Stanford are actually fighting to get those pleadings unsealed,
47:24
because they're currently under seal. But there are companies that are considering the extent to which they really need to be collecting certain information, because to the extent they collect that information, they are subject to a warrant to share it. And all of the user agreements give them the right
47:41
to share that information even without a warrant. So we consent, at least in theory, every time we click yes on one of those end user license agreements that none of us reads very carefully. And so the principles of data minimization, while a good security principle, also play into this conversation about the extent of information that a company can be compelled to share with law enforcement.
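A small sketch of that data-minimization point, with hypothetical field names: if the service only ever writes down a coarse, derived value, then that derived value is all a warrant, or a broad user agreement, can ever reach.

```python
# Hypothetical sketch: store a pseudonym and a coarse region instead of
# the raw identifier and precise track, so only the minimized record
# exists to be compelled or shared.
import hashlib

def minimized_record(user_id, lat, lon):
    pseudonym = hashlib.sha256(user_id.encode()).hexdigest()[:12]
    return {"user": pseudonym, "region": (round(lat, 1), round(lon, 1))}

# Precise coordinates are used transiently and never persisted.
print(minimized_record("alice@example.com", 38.9072, -77.0369))
# {'user': '<12 hex chars>', 'region': (38.9, -77.0)}
```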
48:01
Hi, first, this was fascinating. Thank you so much. First, quick comment. I think we definitely need a season six of The Wire where the whole Barksdale group is using Signal and VPNs,
48:21
and the Major Crimes squad launches a spear-phishing campaign or something. So my more serious question: I'm just curious about your thoughts or opinions on the balance between security and privacy around required disclosures of breaches, particularly the 24-hour disclosure requirement in GDPR.
48:41
Me as a consumer, yes, I want to know as soon as possible so that I can take my own protective action, but me as an incident responder, that first 24 hours, I might not even know if there's a persistence in my network yet, and how do we balance that? Is there some way that we can notify law enforcement, but it's not publicly announced yet or something like that?
49:02
I'm just curious what your thoughts on that are. So the approach that we're taking in the US is really one that varies state to state. Whether the extent of the breach needs to be disclosed, what constitutes a carve-out, what constitutes encrypted data that's not within the purview, it's all varied. So we have a bunch of different approaches. The thing about GDPR is that it objectively
49:22
just does raise that level. Now, we can debate whether the 24-hour period is an adequate turnaround period, but the most interesting part of GDPR, the thing that is an important and, I think, positive shift, is that in the preamble there is an imposed duty to stay up to date
49:43
with the state of the art of security. So it shifts the whole conversation from being one around throwing a bandaid on the severed arm to trying to prevent the children from playing in the street in the first place. It's trying to shift to a proactive patching,
50:01
maintenance, investing adequately in security teams kind of model. So while it's fair to debate the efficacy of the particular formulation of certain pieces of GDPR, that overarching paradigm shift is important, and I think ultimately a positive one
50:21
from the way that we shift the conversation toward a more proactive risk modeling, threat assessment, attack surface analysis conversation.
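A minimal sketch of that proactive posture; the advisory list here is hypothetical, standing in for a real vulnerability feed. The idea is simply to compare what you actually run against known advisories on a schedule, rather than after an incident.

```python
# Hypothetical audit step: flag installed packages whose versions appear
# in a known-advisory list. Run it in CI or on a schedule.
from importlib.metadata import distributions

KNOWN_BAD = {"examplepkg": {"1.0.0", "1.0.1"}}  # stand-in advisory data

def audit():
    findings = []
    for dist in distributions():
        name = (dist.metadata["Name"] or "").lower()
        if dist.version in KNOWN_BAD.get(name, set()):
            findings.append(f"{name}=={dist.version} has a known advisory")
    return findings

for finding in audit():
    print("PATCH NEEDED:", finding)  # e.g., fail the build on any finding
```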
50:45
Okay, so we'll cut it off there. Give us a few minutes to reconfigure for the next discussion, and stay with us for Anthony Ferrante. If you guys do wanna talk, I think the speakers are available to talk, but we'll have to kick you out of the room. We won't say anything about talking. You guys can do it, just we'll have to kick you out if you wanna continue the conversation.
51:00
Thank you. And take stickers, because I have a whole stack here, and they have a cute cow on them.