Biohacking Village - Infodemic: Threat Models for Patient Communities on Social Networks.
Formal Metadata
Number of Parts | 374
License | CC Attribution 3.0 Unported: You may use, adapt, copy, distribute, and make the work or its content publicly available in unchanged or modified form for any legal purpose, provided the author/rights holder is credited in the manner they specify.
Identifier | 10.5446/49886 (DOI)
Transcript: English (automatically generated)
00:00
Hello, DEF CON. My name is Andrea and today I'm going to be talking to you about threat models for patient communities on social networks. Not gonna spend too much time about me. This is my second year at DEF CON. I am a BRCA1 community data organizer,
00:22
a mutant turned security researcher. Last year, I presented on a major security flaw in Facebook's group architecture. I started a nonprofit called the Light Collective, and enough about me. I wanna start with this photograph of Portland. It is a community under siege.
00:42
And when we look at this photograph, depending on your politics, your ideology, your education level, your hopes, your fears, you might see different things from this photograph. But one thing we might agree on is that Portland is a community in turmoil.
01:04
I look at this photo and I see thousands of beautiful points of light generated from cell phones, from people with varying levels of technical literacy, and I hope to God they understand how their data at this protest may be used against them.
01:24
They are vulnerable and in raising their voices, their data can be something that is weaponized. Well, how do we think about this in context of a digital community for healthcare or for health?
01:42
Well, I'd like to start with just a quick framework for what I'm gonna cover today on threat models and health social networks from a community's perspective. We often think about protecting systems and data and not necessarily about the communities
02:00
or digital spaces where we reside. So in this talk, I'll cover how the nature of the adversary, when we think about threat models is becoming difficult to detect. I'm gonna talk about how nobody is immune to an infodemic and especially as this COVID pandemic rages on. I'm gonna talk a little bit about how influence
02:21
can be deadly or it can save lives and finally ask what is a path forward from here? How do we survive an infodemic? Well, let me start with my village. This is a quick network graph of my own community and just like that picture in Portland, I would kind of explain to everybody
02:43
who isn't involved in patient communities or e-patient social networks, we are right now also a community under siege. We are losing access to care. We are losing access to meds. We are high risk communities who have adverse
03:01
or underlying conditions that make us more at risk of potentially dying from COVID. And this extends to a much larger scale. In fact, there aren't a lot of statistics on this in the pandemic, but I'll point you to a 2018 survey from Hope Lab
03:25
that shared 51% of young adults have tried to find people online with health concerns similar to their own. What we call this phenomenon is peer support and there's a whole body of evidence around how peer support and health
03:42
can have really beneficial effects. It can also have harmful effects when the tech platforms where we reside or the knowledge that we share can be weaponized against us. Further, it can be much more difficult when the nature of the adversary
04:02
is becoming harder and harder to detect. Well, what has this looked like over the past couple of months? We have physicians at the steps of the Supreme Court in their white lab coats advocating for the use of hydroxychloroquine, which we know is not an evidence-based treatment
04:24
for COVID-19. We have doctors flocking to TikTok. A lot of ways these doctors have the best of intentions, but when we encourage the sharing of knowledge in a health community,
04:41
we are inevitably exposing the people who engage at that level on these platforms to have those data weaponized against them. Well, the nature of the adversary is also becoming difficult to detect because the leaders and scientists
05:00
who have traditionally been in these positions of power are in some ways enabling disinformation. This is a picture of George Church who recently launched a dating app based on genomics, and here he is snorting his own vaccine that is not FDA approved,
05:22
and it begs the question, how are we replacing science with ideology? How are these disinformation narratives targeting vulnerable groups? Well, more and more, it's starting to feel like no one is coming to save us.
05:42
Well, further, no one is immune to ad targeting or disinformation on these tech platforms where we reside. I highly encourage you to take a look at this recent news article about Facebook and direct-to-consumer pharmaceutical ads.
06:03
And here's just one great example of a direct-to-consumer ad. I have to laugh at this one. For anybody who knows about GINA, the Genetic Information Nondiscrimination Act, here we have a direct-to-consumer ad that is advertising life insurance based on your genome.
06:23
Well, there's a problem with that. GINA, the Genetic Information Nondiscrimination Act, has one loophole that allows companies to discriminate. And that one loophole is for life insurance companies.
06:41
So it begs the question: how far have we gotten from serving the people who reside on these platforms with good knowledge, in a way that is going to protect them instead of using health or genetic information against them?
07:04
Influence can be deadly. I'm gonna give you a couple of examples here. One is making the rounds lately in Facebook groups. It's a black salve treatment, a fake cancer cure. And this is what happens when you apply
07:22
a snake oil treatment that essentially burns your skin and is being peddled by marketers in these different groups. Some of them are people joining closed groups under the guise of being a person offering support
07:43
when really they have an interest in peddling snake oil or other types of treatments. Parents are poisoning their children with bleach in order to cure autism. And we could debate the anti-vax movement all we want.
08:02
I'll just offer up this one example of a mom not giving her son Tamiflu and he later died. There are more and more examples popping up like this all over the place and I could go on and on. So in one aspect, there is a bright spot here
08:23
when we think about social networks coming together and doing so in a way that is evidence-based. I wanna give this one example of a community within my own ecosystem of breast cancer social networks
08:41
that actually came together in a good way. This was a group of women who organized around a rare disorder called BIA-ALCL, breast implant-associated anaplastic large cell lymphoma, a rare cancer that was being caused by a certain type of textured implant
09:01
that a lot of women who are going through breast reconstruction or bilateral mastectomies were opting to have. Well, as it turns out, the data on adverse events for this particular type of implant were not being reported back to the FDA.
09:21
And so these women banded together in very large Facebook groups, they worked with physicians, and the outcome of that was that Allergan was cited, there was an FDA warning, and there was more transparency and different processes around post-approval study requirements for breast implants.
09:42
So there can be good outcomes here. When we think about how social networks come together, it's just a double-edged sword. Peer support and the lifelines I've seen over the many years that I've been on social media can be life-saving. They can change things, but we have to recognize
10:04
that there are good effects and bad effects. We have to bolster the good while really acknowledging the harm and asking ourselves how do we reground in ethics and how do we first do no harm? Well, what does an infodemic look like
10:21
when we zoom out and take a look at how social networks, bots and disinformation campaigns target vulnerable communities at scale? Here's a quick snapshot of known conspiracy theories
10:42
and disinformation hashtags. And I'm just gonna give this one example and move my cursor over here at the right so you can see QAnon in red. This is a cluster of the QAnon hashtag tweeting about COVID. This comes from a really great open-source project
11:03
called Project Domino. I invite you to reach out to Leo Meyerovich, who is the co-founder of Graphistry. I'm on their COVID hunting team in Project Domino, and it's just a really fantastic group that has banded together and started visualizing
11:22
what these disinformation networks, what these bot networks look like, and thinks about how their behavior can be clustered by the types of language being used or the number of tweets per day that might be a pattern
11:40
that is statistically significant. Well, I look at this and I think, well, my gosh, this is a snapshot of the infodemic. This is what a biological process looks like on a social network. To me, it looks like social networks not being able to detect and respond effectively
12:03
to these campaigns in a way that's getting ahead of the infodemic, and that's one of the reasons why we're not flattening the curve. Here's another picture that I think is really important. This is another one from Project Domino.
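The tweets-per-day signal mentioned above can be sketched as a simple outlier test: an account whose posting rate sits far above the population mean is a candidate bot. This is only an illustrative heuristic, not Project Domino's actual pipeline, and the account names and rates below are hypothetical:

```python
from statistics import mean, stdev

def flag_outlier_accounts(tweets_per_day, z_threshold=2.0):
    """Flag accounts whose daily tweet rate is a statistical outlier.

    tweets_per_day: dict mapping account name -> average tweets per day.
    Returns the set of accounts more than z_threshold standard
    deviations above the mean rate -- a crude bot heuristic.
    """
    rates = list(tweets_per_day.values())
    mu, sigma = mean(rates), stdev(rates)
    return {acct for acct, rate in tweets_per_day.items()
            if rate > mu + z_threshold * sigma}

# Hypothetical sample: most accounts tweet a handful of times a day,
# one posts at machine speed.
sample = {"alice": 4, "bob": 7, "carol": 5, "dave": 6,
          "eve": 3, "mallory_bot": 900}
print(flag_outlier_accounts(sample))  # -> {'mallory_bot'}
```

Real detection efforts combine many such signals (posting cadence, language patterns, follower graphs) rather than relying on rate alone.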
12:21
This is roughly 211,000 tweets from 50 COVID-related misinformation hashtag campaigns. I wanna give a shout-out to Cody Webb, who helped generate this. And once again, this is what pollution of information on social networks
12:43
when people are going through trauma looks like at scale. When we have bot networks, when we have sock puppets attacking and just spewing out the wrong information to vulnerable people who are seeking knowledge,
13:01
evidence-based knowledge, and they don't know who and what to believe anymore. So where does this leave us? It leaves us leading the world in not flattening the curve. Here is daily confirmed cases, a five-day moving average of new cases
13:22
where we are hitting between 60 and 70,000 on our five-day moving average of new cases. Users really don't have rights when it comes to health privacy on social networks. And that in and of itself is a threat model we need to think about.
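The five-day moving average mentioned above is just a trailing smoothing of the raw daily counts; a minimal sketch (the case numbers below are illustrative, not the real data):

```python
def moving_average(values, window=5):
    """Trailing moving average: entry i averages the last `window`
    values ending at i (shorter windows at the start of the series)."""
    return [sum(values[max(0, i - window + 1):i + 1]) /
            (i - max(0, i - window + 1) + 1)
            for i in range(len(values))]

# Illustrative daily new-case counts for one week.
daily_cases = [58000, 61000, 65000, 70000, 66000, 63000, 68000]
smoothed = moving_average(daily_cases, window=5)
print(round(smoothed[-1]))  # -> 66400, the average of the last five days
```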
13:40
Health information can be used to deny jobs, can be used to deny healthcare. And the one agency that we have put a complaint forth to is the FTC. Well, I think it's important to think about this really great paper from Nature Medicine
14:01
called Privacy in the Age of Medical Big Data. It shows or paints a picture of the big data policy landscape as an iceberg. Above the water at the tip of the iceberg, we have all HIPAA-covered entities where so much in cybersecurity, when we talk about protecting devices,
14:22
when we talk about health data breaches, that's above the iceberg. Well, below the iceberg is a lot more. Not only has the FTC failed to enforce or protect the health privacy on social networks, and I know I'm blocking this,
14:41
so I'm gonna move over here. There we go. The FTC had a settlement back in 2019, a $5 billion settlement. And we brought a complaint to the FTC under this PHR breach notification rule. It's the one rule in the one agency
15:00
outside of Health and Human Services that has authority to enforce any kind of consumer protection for health information. And so we went to them and said, Facebook has a major data breach that has to do with health information. And there was basically no response
15:21
in this $5 billion settlement. Meanwhile, health insurers are vacuuming up details about us. It can raise our rates. Any health information that you share on a social network can be used by data aggregators and packaged up to basically be used
15:42
to discriminate against you. And I want everybody just to be very careful about that. For me, as somebody who's been on social media for 10 years, the genie is already out of the bottle. And I recognize that when people go through a new diagnosis, they're seeking support and information,
16:02
but we don't have any safe harbors. We don't have any safe spaces anymore to talk about our health. And that's a problem. Where do we go from here? I think we need to lock arms. And I'm gonna take a page from I Am The Cavalry and say, no one is coming to save us.
16:23
I've tried, nobody's coming to save us. The only people who are coming to save us are the ones directly affected. And I really hope that I can give a meaningful call to action to the folks who are listening today. I need the cybersecurity community.
16:41
I need the national security community. I need healthcare leaders and experts to come and lock arms with these patient communities and lift our voices up. If we don't do that, if we don't meet people where they are and start giving them meaningful rights and protections,
17:00
this harm and damage is gonna continue and we are not gonna flatten the curve. Well, what does that look like, and how are we doing this through the Light Collective? We have a very ambitious roadmap. We are working on a framework for collective self-governance that is driven by patient communities that reside on social networks.
17:20
We are developing best practices to protect patient support groups that already exist on Facebook and Twitter and asking ourselves, well, if we are in such a hostile environment, maybe we need to leave the platform. How do we do that? We're looking at legal frameworks like a data trust.
17:40
We're looking at cyber hygiene best practices, onboarding mentors, and I invite you to get involved, to donate. We have weekly events and we would love to see you there. Finally, thank you for your time. Join us, you know where to find me. Come follow Be Like Light
18:02
and we will see you on the internet. Bye for now.