Data Responsibility on the Front Lines
Formal Metadata

Title: Data Responsibility on the Front Lines
Title of Series: re:publica 2018 (part 10 of 21)
Number of Parts: 21
License: CC Attribution - ShareAlike 3.0 Germany: You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor and the work or content is shared also in adapted form only under the conditions of this license.
Identifiers: 10.5446/36102 (DOI)
00:00
Information securityDependent and independent variablesLine (geometry)Flow separationInsertion lossTrailElectric generatorSoftware developerDigitizingGoodness of fitDigital signal processingVector potentialView (database)Online helpLecture/ConferenceMeeting/Interview
02:19
Point (geometry)Address spaceInformationSelf-organizationState of matterVideo gameInformation privacyDomain nameCybersexDependent and independent variablesSoftware developerIntranetLecture/ConferenceMeeting/Interview
05:31
Sensitivity analysisProjective planeSet (mathematics)Expert systemDirection (geometry)Perspective (visual)ReliefTouchscreenLecture/ConferenceMeeting/Interview
06:44
Digital mediaGoodness of fitWordGroup actionSlide ruleLevel (video gaming)Presentation of a groupDifferent (Kate Ryan album)Lecture/ConferenceMeeting/Interview
07:41
RootArmVideo gameSelf-organizationRevision controlPresentation of a groupBitMultiplication signCore dumpBuildingIntegrated development environmentMeeting/Interview
08:54
CodeOffice suiteGroup actionNumberSelf-organizationBuildingOrder (biology)Information privacyDistanceRule of inferencePhysical systemRevision controlMultiplication signNoise (electronics)Text editorBitOffice suitePoint (geometry)Task (computing)Wave packetPredictabilityInteractive televisionProcess (computing)MeasurementGoodness of fitMeeting/Interview
10:29
Digital signalData managementDevice driverProcess (computing)Control flowStandard deviationMultiplication signOperator (mathematics)InformationNumberVideo gameIntegrated development environmentRule of inferenceInformation privacyMeasurementProcedural programmingAlgorithmKey (cryptography)Hard disk driveBitElectric generatorLecture/ConferenceMeeting/Interview
12:01
Computer programmingHeat transferData miningGroup actionDigital signalIntegrated development environmentInformation privacyMobile WebInformation privacyAdditionMultiplication signRegulator geneTraffic reportingExpert systemSelf-organizationStandard deviationSeries (mathematics)Fundamental theorem of algebraHeat transferNumberCartesian coordinate systemPlanningMereologyElectronic program guideAuthorizationCloud computingPersonal digital assistantMathematical analysisContext awarenessInterpreter (computing)Ocean currentReliefArithmetic meanDecision theoryInternetworkingAnalogyLocal ringSuite (music)Lattice (order)Electronic mailing listComputer animation
14:47
Information privacySocial classUniqueness quantificationRevision controlMultiplication signTheoryDivisorInformation securityCollaborationismPrisoner's dilemmaDigitizingSeries (mathematics)Universe (mathematics)ArmAreaNeuroinformatikHypothesisClassical physicsRight angleLecture/ConferenceMeeting/Interview
16:29
Set (mathematics)TheoryObservational studyArchaeological field surveyRevision controlField (computer science)Execution unitAdditionComputer animationMeeting/Interview
17:28
Channel capacitySeries (mathematics)DialectResultantFrequencyLevel (video gaming)Field (computer science)BuildingAuthorizationInformationChannel capacityStaff (military)PhysicalismVulnerability (computing)CollaborationismPresentation of a groupForm (programming)Integrated development environmentLie groupDivisorState of matterGraph (mathematics)Prisoner's dilemmaCASE <Informatik>Cross-correlationMeeting/Interview
19:26
TelecommunicationData managementProcess (computing)Information securityGraph (mathematics)Process (computing)DivisorOperator (mathematics)Information securityStaff (military)Data managementTelecommunicationField (computer science)Computing platformSelf-organizationCartesian coordinate systemPlanningINTEGRALXMLProgram flowchartComputer animation
20:20
Order (biology)Channel capacitySystem callBitProcess (computing)Video gameDatabaseComputer programmingDecision theoryRevision controlComputer architectureReliefComplete metric spaceWhiteboardInternet service providerService (economics)Perfect groupQuicksortDistanceInformation managementMultiplication signMedizinische InformatikOperator (mathematics)Information privacy2 (number)Business modelStaff (military)File archiverClient (computing)Term (mathematics)Personal digital assistantInformationExtension (kinesiology)Information systemsData structureGoodness of fitImage registrationRandom matrixTouchscreenSeries (mathematics)Instance (computer science)Civil engineeringSet (mathematics)Standard deviationPhysical systemType theoryComputer fileFamilyAreaAbsolute valueProcedural programmingDifferent (Kate Ryan album)Element (mathematics)AnalogyProjective planeShape (magazine)Data managementMenu (computing)Asynchronous Transfer ModeInformation Technology Infrastructure LibraryParallel portMechanism designReflection (mathematics)DigitizingBranch (computer science)WordMoment (mathematics)State of matterLine (geometry)AdditionLecture/ConferenceMeeting/Interview
29:52
Digital mediaSlide ruleVideo gameLevel (video gaming)Order (biology)Multiplication signGoodness of fitSquare numberSpring (hydrology)Digital photographyMoore's lawLecture/Conference
30:53
Digital mediaCASE <Informatik>Moore's lawGoodness of fitTotal S.A.Validity (statistics)Normal (geometry)Type theoryMoment (mathematics)Physical lawVector spacePartial derivativePhysical system3 (number)Independence (probability theory)Rule of inferenceBackdoor (computing)Lecture/ConferenceMeeting/Interview
32:24
CybersexMatrix (mathematics)CASE <Informatik>DivisorDirection (geometry)Raw image formatFacebookGame controllerTraffic reportingPoint (geometry)TelecommunicationLecture/ConferenceMeeting/Interview
33:21
Digital mediaDivisorSoftware frameworkSpacetimeOpen setDigital photographyNormal (geometry)Moment (mathematics)Civil engineeringOptical disc driveCodeGroup actionIdentifiabilityPattern languageInformationLecture/ConferenceMeeting/Interview
34:46
Surface of revolutionPersonal digital assistantDecision theorySoftwarePoint (geometry)Right angleFocus (optics)Line (geometry)Computer programmingCodeLecture/ConferenceMeeting/Interview
35:39
Digital mediaTransmitterNichtlineares GleichungssystemPhysical systemSelf-organizationWordInformation and communications technologyCausalityRight angleInformationRectifierInformation privacySelf-organizationProcedural programmingMultiplication signTransmissionskoeffizientPhysical systemSymbol tableSummierbarkeitWordNichtlineares GleichungssystemTelecommunicationLecture/ConferenceComputer animation
36:34
Right angleSeries (mathematics)Physical systemWeightLecture/Conference
37:22
BitChemical equationNavigationContext awarenessAreaSelf-organizationComplex (psychology)CASE <Informatik>Process (computing)Prisoner's dilemmaData structurePoint (geometry)Table (information)Arithmetic meanInformationData managementResultantIntegrated development environmentInformation privacyStaff (military)Element (mathematics)Information Technology Infrastructure LibraryInternet service providerOperator (mathematics)Functional (mathematics)Decision theoryInformation technology consultingMultiplication signReflection (mathematics)Execution unitWhiteboard3 (number)Meeting/Interview
40:43
Information privacyDecision theoryTouch typingField (computer science)Functional (mathematics)Software frameworkOffice suiteBitMeeting/Interview
41:44
Complex (psychology)Information privacyObservational studyElectric generatorDatabase transactionAreaHeat transferProcess (computing)MetadataContext awarenessCoprocessorSeries (mathematics)Self-organizationService (economics)Line (geometry)Order (biology)Revision controlOperator (mathematics)Internet service providerKey (cryptography)Digital mediaProcedural programmingMeeting/Interview
43:45
Mobile appExtension (kinesiology)Information privacyDigital mediaElectric generatorMetadataDifferent (Kate Ryan album)Dependent and independent variablesMechanism designRule of inferenceComputer programmingCASE <Informatik>Term (mathematics)Key (cryptography)Message passingContext awarenessBitSelf-organizationChannel capacityMeeting/Interview
44:45
Data managementInformation securityProcess (computing)Constraint (mathematics)TelecommunicationMoment (mathematics)SoftwareMetadataServer (computing)Proxy serverOrder (biology)Combinational logicEncryptionDifferent (Kate Ryan album)Information privacyInternetworkingFlow separationMeeting/Interview
46:09
Channel capacityMeasurementSelf-organizationCommitment schemeDependent and independent variablesMeeting/InterviewLecture/Conference
46:55
Information securityDirectory serviceLevel (video gaming)Operator (mathematics)Dependent and independent variablesRight anglePoint (geometry)Self-organizationWave packetLatent heatTerm (mathematics)Type theoryMeeting/Interview
47:51
Type theoryInformation privacyIncidence algebraTraffic reportingSelf-organizationCore dumpInformationCASE <Informatik>BiostatisticsCodeSign (mathematics)Meeting/Interview
48:54
Absolute valueBlock (periodic table)SoftwareAmenable groupGroup actionIdentity managementDigitizingLecture/Conference
50:13
Self-organizationFundamental theorem of algebraComputing platformBasis <Mathematik>Pairwise comparisonTerm (mathematics)Meeting/InterviewLecture/Conference
51:05
Cache (computing)Physical systemPersonal digital assistantRight angleSource codeFrequencyMultiplication signService (economics)MeasurementTerm (mathematics)Suite (music)Proof theorySound effectFile archiverCASE <Informatik>Operator (mathematics)InformationNetwork topologyHeat transferComputer programmingElement (mathematics)Staff (military)Constraint (mathematics)Point (geometry)Public domainType theoryRow (database)FamilyBitHypothesisSelf-organizationLecture/ConferenceMeeting/Interview
55:33
Perspective (visual)Information privacyPhysical systemTheory of relativityType theoryComplex (psychology)Personal digital assistantPoint (geometry)Category of beingLevel (video gaming)Latent heatComputer programmingAuthorizationMeeting/Interview
56:22
AuthorizationTheory of relativityEndliche ModelltheorieCASE <Informatik>Category of beingInformationRevision controlType theoryCondition numberPersonal digital assistantAreaPressureCovering spaceAsynchronous Transfer ModeIndependence (probability theory)Meeting/Interview
57:41
Server (computing)Physical systemData managementSelf-organizationMeeting/Interview
59:01
SubsetArithmetic meanSpacetimeArmContext awarenessCASE <Informatik>Field (computer science)Exception handlingVector spaceFundamental theorem of algebraData conversionMoment (mathematics)CodeIndependence (probability theory)Basis <Mathematik>Type theoryThermal conductivityDependent and independent variablesComputing platformCurveShared memoryRevision controlLecture/ConferenceMeeting/Interview
01:01:00
TrailPretzelMereologyNumberMeeting/InterviewLecture/Conference
Transcript: English (auto-generated)
00:15
Thank you very much for coming, despite the warm weather outside and the cold drinks,
00:21
being here with us for this panel discussion within the track Tech for Good. My name is Daniel Braun, I'm with the German Federal Ministry for Development Cooperation, and before we start this panel, let me just give you an idea of what we are doing and why we are here. Just like in previous years, the German Development Ministry supported and supports
00:46
re:publica with several sessions, but this year is something special. We are really proud that we can be one of the main partners of re:publica and have this track Tech for Good as one of the main tracks here at re:publica.
01:02
So thanks to all the people at re:publica for making this happen. This gives us a nice opportunity. We want to bring in the views, the ideas from the developing world to the panels here,
01:20
to the debate, to the discussion. And we want to also have innovative debates for us for the development community to ask ourselves the question, how can we use the potential of the digital world, of the digital transformation for our work?
01:41
For example, can e-health help us in humanitarian situations? Can e-learning help us to not lose whole generations of refugees? Or can we use blockchain for transparency in developing countries?
02:01
So we have a common interest with re:publica, and hopefully with you, to seize the opportunity of the digital transformation, and that's combined in our track Tech for Good. But on the other hand, we really have to increasingly discuss and address also the
02:23
downsides, the risks, the dark sides of the digital world, and that brings me to this panel here. They're not the dark sides, they're the good guys. But we have to ask ourselves several questions, and we can draw lessons from things that
02:41
happened recently. For example, data scandals — I won't name a certain social network — or cyber attacks — I won't name a certain German government intranet which was attacked by cyber-terrorists. So the main lesson we draw from this is: data is a target, definitely.
03:08
And the second lesson is data can be us. Personal data is us. So personal data can reveal our whole personality, our interests, our likes, our needs, our
03:28
dreams — in short, our whole personality. So while we are concerned about our private data, the international community, the development
03:42
community, the humanitarian community also has a responsibility for the data of others, for the data of the most vulnerable people on earth, be it refugees, be it people in war situations, be it people in problematic situations that come from climate change.
04:06
And this gives us a catch-22 situation. On the one hand, data from these people have to be collected by our partners from the humanitarian side, Red Cross, UNRWA, et cetera, to be able to deliver aid to these
04:26
people. And the recipients of aid don't have any choice but to give away their personal data to these organisations. So what should we do?
04:41
The problem is, if we lose our own personal, private data, it can be a problem of losing information about your favourite movies or your tax returns; but for these people, if we expose their personal data, we put their lives at risk.
05:02
Sometimes it's enough to reveal the name of a refugee for him or her to be connected to a tribe, to a region, to an ethnicity, and we put their lives at risk if we reveal
05:21
their personal data to third parties who have other interests than us. So the question to this panel is, what's the state of play? What can we do to secure and protect these highly sensitive sets of data of millions
05:43
and millions of people? And we as a ministry are really proud to support not only this panel but also a project behind it, and we're really happy to have some of the most knowledgeable experts here with us: on one hand, our direct project partners, Nathaniel and Stuart,
06:07
from the Signal Program at the Harvard Humanitarian Initiative; Dorothy — you will see her later on the screen — from UNRWA, the United Nations Relief and Works Agency, in the Middle East; and also Stevens Le Blond from the École Polytechnique
06:27
Fédérale de Lausanne, and Massimo Marelli from our friends at the ICRC, the International Committee of the Red Cross, to give us their perspective on this really important issue.
06:43
So I'm looking forward as well to this panel. Thank you all for coming here, and let's hope we can not only use tech for good but also use tech and policies to prevent bad things from happening.
07:02
Thanks a lot, and I give the word to you. Thank you. All right. Thank you all for coming. We realize that there are more compelling places to be and perhaps more comfortable ways to be spending your evening, but very briefly, I just want to tell you how we're going to run this session.
07:20
We're going to have quick five-minute ignites from each of the speakers. That will include Dorothy, so we'll beam her in as soon as the slides are gone. And then we'll move into a discussion, and that discussion is as much about us on stage as it is about you in the audience. So thank you for being here, and please keep questions that come up during the different presentations in mind. Take note, and be ready to engage.
07:40
We're going to see if I can manage to switch the slides, and then we'll get started with Massimo. All right, Massimo. My German is limited, I think. Bildschirmpräsentation — presentation mode — is where we want to be.
08:02
All right. You've got five minutes, Massimo. Thank you. And first question: is anybody here familiar with the International Committee of the Red Cross? Quick raise of hands. Wow, everybody's already familiar with the International Committee of the Red Cross. That's a very pleasant surprise. So we'll not have to spend too much time explaining what it is that we do,
08:20
but just as a quick reminder, because it's really important and it lies at the root of why we care so much about these kinds of issues. The International Committee of the Red Cross is a neutral, impartial, independent, exclusively humanitarian organization that has the mission to protect the life and the dignity of people that are affected by armed conflict and other situations of violence, and to provide assistance.
08:43
So a bit of a mouthful, but really the core is the protection of the life and the dignity of the most vulnerable people, as I mentioned earlier. This means that we're actually taken to work in places like this. To work in places like this — as you can see, or probably cannot see — there's no armed escort, there are no armored vehicles; probably the colleagues
09:03
in the car there are being watched by snipers on the roof of that building, and their access to the most vulnerable people is based on the fact that the people accept the International Committee of the Red Cross as a neutral organization and trust it. So it is, in order to ensure that we can have access to those places,
09:22
that the International Committee of the Red Cross is acknowledged to have international legal personality and privileges and immunities — to ensure, as a basic first measure of protection of the people we're there to protect and assist, that no party to the conflict can interfere with these people.
09:42
There's a bit of background noise. It's a train, I think. It's a train. Okay, good. So this is a point that my friend Stevens will be touching on later, which is very important also when we look at new technologies, because there are some implications when we look at technology and when we look
10:00
at situations where we need to engage third party processors, but we'll look into this in a minute. So as an international organization, we have our own system of rules and compliance with personal data protection. I will not spend much time because time is running fast, but just to mention, the rules on personal data protection, the creation of a data protection office,
10:20
and a data protection commission that has the task of providing an effective remedy to people that might have complaints about the way in which their personal data has been handled. Why do we do this? I mentioned it earlier: because our mandate is to protect the life and the dignity of people. Protecting the life and the dignity of people in very sensitive environments means protecting their information.
10:41
We've been doing this for a long time, and a number of data protection rules and measures have long been integrated into our standard operating procedures, guidelines, and policies. It's about doing no harm, and this becomes increasingly important when we look at new technology, and we'll see why. Perhaps also, we'll have a little bit more time in the discussion
11:02
to be beneficiary-centric and so on. So if this has been integrated for a long time in the way in which we do things, why is it that it's so important today? And why is it that everybody's talking about data protection in the humanitarian sector, but also outside? Well, it's because of technology.
11:22
Technology has been developing ever since the beginning of humanity, but what is very peculiar about the last few years is the fact that technology enables us to generate enormous quantities of information, to collect it, to store it. It's becoming cheaper to store it. What we have on a USB key today is probably what we could have
11:43
on a hard drive a couple of years ago. And not only that, but we can make sense of that information. While before you probably couldn't even make sense of so much information with millions of people going through documents, now we can with an algorithm.
12:06
to enable the humanitarian sector to be more efficient and effective in the work that it does. At the same time, it brings a lot of risks, a lot of additional risks that the regulators outside are trying to regulate in the non-humanitarian world,
12:20
but that we need to address as well. Privacy International in 2013 published a report that was really very critical of the humanitarian sector and highlighted how actually by adopting certain new technologies without understanding the implications of it, humanitarian actors were actually enabling additional surveillance
12:41
and potentially causing harm to people. The International Conference of Privacy and Data Protection Commissioners also looked at the challenges that are faced by the humanitarian sector and the importance of additional guidance when it comes to adopting new technologies. And so together with Privacy International, with data protection authorities and a number of other humanitarian organizations, we engaged in a working series to look
13:04
at certain new technologies in the humanitarian sector to look at how they could be adopted in a way that does not cause harm to people and that is respectful of data protection. Protecting individuals' personal data is an integral part of protecting their life and dignity.
13:22
This is why personal data protection is of fundamental importance for humanitarian organizations. The International Committee of the Red Cross teamed up with the Brussels Privacy Hub to bring together leading experts on data protection to produce a practical guide to the issues facing humanitarian organizations.
13:44
The Handbook on Data Protection in Humanitarian Action seeks to raise awareness and assist humanitarian organizations in ensuring that they comply with personal data protection standards when carrying out their activities. After an initial analysis of the application of data protection principles
14:03
to humanitarian action, the handbook considers the data protection implications of using specific new technologies in the humanitarian sector. These include data analytics, drones, biometrics, cash transfer programming,
14:21
cloud computing services, and messaging apps. So my time is up now, but I just wanted to highlight a couple of things and I will not do it now, but I hope there will be an opportunity afterwards to discuss about what are the next steps. It's not sufficient to just have some guidance.
14:42
What is it that we're planning to do to put that guidance in practice? Thank you, Massimo. And all kidding aside, if you read one handbook on data protection in the humanitarian sector this year, this is the one to read. Genuinely the best resource out there, and there are many.
15:02
You'd be surprised. We love documents. Instant classic. All right. Is there a USB port? I think let's not add another layer of technology here. There you go. Your time is up there.
15:23
Hi, re:publica. My name is Stevens Le Blond, and it is my great pleasure to address you today to present this ongoing collaboration between the ICRC and EPFL, my university. So the initial hypothesis of this work is that the mandate of the ICRC — in particular its global presence,
15:42
with an emphasis on areas of armed conflicts and other situations of violence, and the privileges and immunities, or P&I, that Massimo mentioned — is that they create a unique need for security that is unaddressed by existing technology.
16:01
And the promise of our work is that if we can characterise those needs sufficiently well, then perhaps we can propose better technology to address them. So essential to our work is the notion of digital immunity, which is computer security combined with organisational factors
16:25
as well as privileges and immunities. And so if we can combine those three, then the hope is that we can propose better designs. To carry out our study, we used an inductive approach,
16:42
which means that we started with a set of theories that we refined throughout the course of our studies. To acquire the data, we used a qualitative approach comprising semi-structured interviews, surveys, review of internal documents, as well as a visit to an ICRC delegation bordering an armed conflict.
17:04
We acquired data through 27 interviews until we reached topic exhaustion, and pretty much our interviews covered all the operational units of the ICRC. As you can see here, the combined experience of our interviewees amounted to over a quarter of a millennium in the field.
17:27
And in addition to this extensive experience, you can see on this map the regions of the ICRC where our participants have operated. So pretty much all of them are covered here.
17:44
Arriving at the results: we identified four main practical factors that constrain the use of technology by ICRC field workers today. The first one is the vulnerability of beneficiaries. In prison environments, for example, technology is banned,
18:04
which means that pretty much the best that you can do in this environment is to use a paper form and to pseudonymize the name of a detainee to protect them from harm.
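To make that technique concrete: below is a minimal sketch of keyed pseudonymization, assuming only Python's standard library. The key, the normalization, and the token length are illustrative assumptions, not the ICRC's actual procedure. The idea is that the same name always yields the same token, so records can still be matched across visits, while the mapping cannot be reversed without the secret key, which is kept separately from the forms.

```python
# Minimal keyed-pseudonymization sketch (illustrative only, not ICRC practice).
import hmac
import hashlib

SECRET_KEY = b"example-key-stored-separately"  # hypothetical; never kept with the forms

def pseudonymize(name: str) -> str:
    """Map a name to a stable, non-reversible token via HMAC-SHA256."""
    normalized = name.strip().lower().encode("utf-8")
    digest = hmac.new(SECRET_KEY, normalized, hashlib.sha256).hexdigest()
    return digest[:12]  # short token that can be written on a paper form

# The same detainee always gets the same token across visits:
assert pseudonymize("Jane Doe") == pseudonymize("  jane doe ")
```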
18:23
Second, capacity building is also problematic, as are collaborations with National Red Cross Societies, because they can lead to coercion as well as physical attacks. The coercion risk lies in sensitive information that might be disclosed to staff who may not be covered by privileges and immunities,
18:43
or who may be more susceptible to being coerced. And this is particularly problematic in state-owned facilities, such as hospitals. And so there is a need there to provide technology that mitigates the coercion of staff
19:02
or the unauthorized disclosure of data. With respect to physical attacks, technology may be placed in untrustworthy environments where it is susceptible, again, to physical attacks. And in that case, there is a need for technology
19:23
that resists partial compromise. This graph illustrates the impact of these operational factors and others on the satisfaction of ICRC employees with respect to communication, data management, and processing.
19:42
What you see here is that there is a gap between the need for secure technology and its satisfaction among ICRC field staff. So to summarize, we identified a strong need for secure technology
20:00
that is currently not satisfied due to the practicalities of the field work. And we are planning to close this gap by collaborating with the ICRC and other humanitarian organizations to integrate those secure applications into a usable technological platform.
20:20
And finally, I believe that there is a lack of a vehicle to implement and deploy such technology today. And to that end, we are working on a foundation that combines the neutrality of academia with industrial capacity in order to hopefully create such a vehicle.
20:48
Thank you for your attention. This is the end of my talk. Awesome. Not yet. I'm going to keep you in the corner for a little bit longer, Natty.
21:00
You shifted. Dorothy, this is when the technology fails. So let's get you on the screen here. That's me. All right, someone who speaks German. How do I go into full screen mode? On this menu, though? Yeah.
21:21
Vollbildmodus — full-screen mode. All right. We're in good shape. All right, everybody meet Dorothy. Yeah, we're on it. We're good. All right, Dorothy, can you hear me? Absolutely. Perfect. So Dorothy's joining us from Amman in Jordan. Yeah, Dorothy, over to you. Correct, from the UNRWA headquarters.
21:41
Well, I'm very happy to be with you despite the distance. So let me tell you a little bit about our experience here at UNRWA. Maybe I do need to explain briefly who we are and what we do. UNRWA stands for the United Nations Relief and Works Agency for Palestine Refugees in the Near East. We were created in 1949, given a mandate by the UN General Assembly to
22:06
support the populations that were displaced and affected by the Arab-Israeli conflict of 1948. And ever since, for the past 70 years, we've been operating in Jordan, the West Bank, Gaza, Lebanon, as well as in Syria until today.
22:22
We're catering for probably one of the most disenfranchised populations in the Middle East, given the absence of a Palestinian state until today. At the moment, we're providing about 500,000 children with basic education. We see 2 million persons coming to our health centres, and 1.7 million people
22:44
are actually relying on the food assistance as well as cash assistance that we provide. All of this is done through about 30,000 staff, most of whom — 98% — are Palestine refugees themselves. And this creates a very specific dynamic across the agency, which is 150% service-delivery-oriented; every
23:07
single penny, of which we don't have too many, goes into the frontline staff and the frontline services. Now, how does this setup relate to our ability to manage and govern data? Because we do have quite a lot.
23:22
Since 1950, we basically collected information, documentation, civil registration information about the people we serve. So right now, we hold 5.4 million active data files of refugees in our system.
23:41
But in addition, of course, we've also been creating databases and management information systems to support our education and health systems as well as our vast relief operations. How does that work? We have an information management department full of very committed, enthusiastic staff that come to the program departments and say,
24:04
hey, guys, we've got our technical business solutions ready to support you, make your programs more effective and efficient. So just imagine here's the medical doctor who says, oh, great, we want to digitize our patient and our health information management system. Can you help us?
24:21
And the information guys say, okay, great, we're on to it. Now, what the people on the program side — the medical doctors — have probably underestimated is actually the need for them to engage in these processes to make sure that the technology-based business solutions don't come into
24:42
conflict with the way they actually carry out their services on the ground. And they may also have overestimated the ability of the IT guys to actually understand the business processes. So in the worst of all worlds, we end up with something like a patient coming to one
25:01
of our health centers, and here he has to come and give all the information for the patient panel, and it takes about two and a half minutes just to get in place the basic data, just to get everything going. Now, in our health centers, which are usually quite overloaded, a doctor has about three minutes to serve a patient.
25:21
So that leaves you with about 30 seconds for the doctor to engage with a patient, whereby they should look a little bit more into the family health history, maybe at psychosocial issues that possibly affect the morbidity of the patient. And all that won't be possible. What happens is, the doctor probably goes and prescribes a drug or an antibiotic, and off the patient goes.
25:45
Who loses out? The refugee — our client. Why is that? Because our program side probably entrusted too much to the IT side to actually provide business solutions, which makes it look like we're providing more effective services. But in the end we're losing out, because we have a disconnect between the service providers, the clients, and the business solution side.
26:09
And here we haven't even spoken about data protection issues and how we take care of that. Now UNRWA certainly has a whole series of policies and standard operating procedures for data protection
26:23
but what we did realize is that all of these were largely framed in an era where we were still working around an analog business model, not a digital business model. So what we did last year when we saw a proliferation of databases and an IT architecture that was managing in parallel our refugee information,
26:46
we said we need to just put the foot on the brake and do a little bit of a reflection exercise. We identified three main risks. One of course is related to the situation that I've just described
27:00
where we said we have to update our policies of course and make them more adjusted to a digital world. And at the same time the processes whereby we develop business solutions, IT solutions in support of the programs need to be vetted step by step from the data collection through the processing, the archiving of it and as well as the disclosure.
27:27
And we need to put in place a mechanism, a system within the staffing structure and the accountability structure to make that happen. Secondly, another risk area was research and disclosure of data. Because UNRWA holds such vast data across different programs,
27:46
we're quite interesting to all sorts of research institutions but also other humanitarian partners and sometimes even the governments that would like to get hold of the data that we host. We realized that we're not prepared to actually make informed decisions under which
28:04
circumstances and to what extent we support research which could contribute to public good and to what extent in some instances we actually might put our refugees at risk by the type of information that we disclose. So thirdly, we identified another element for improvement in the way we operate and that is the transparency of our data.
28:28
It's very difficult for refugees to actually access the information that we store including the historical family archives that date back to 1948 and sometimes even before. What did we decide to do? One, on the research element, we've instituted a
28:45
complete research moratorium until we have in place an updated data protection and access policy, and a research review board that will vet research according to predefined criteria, in light of the protection needs of our refugees as well as basic research ethics guidelines.
29:05
Secondly, we are in the process now of reviewing our operations in terms of providing business solutions, IT based business solutions to our programs so that we're actually able to vet again against a set of risks the entire process and that is ongoing.
29:27
Thirdly, in terms of enabling our refugees to access the data that we store on them, that is a larger project that will probably take a year, or two, or even three, and for which we will have to find the necessary resources to make sure that our refugees have access to the data that they've entrusted us with.
29:50
Thank you, Dorothy. I gave you a few extra minutes because life's not fair, gentlemen, so I apologize.
30:02
We're going to just put Nathaniel's slides up, Dorothy, and then we'll bring you back for the discussion so bear with us. How's everybody doing? Are you awake? Good. It's an honor to be here on stage with people who I consider to be heroes at a time when heroes are in short supply.
30:30
So anyone know what this is? This is Tahrir Square back in 2011 during the halcyon days of the Arab Spring. When I saw this photo, I thought of a quote from a book I read in college by a guy named Don DeLillo who said, the future belongs to crowds.
30:49
I used to believe that. I don't believe that anymore. I believe the future belongs to those with the computing power to predict them.
31:00
Think about that. The future belongs to those who can predict crowds. It doesn't belong to the crowds themselves anymore. Hannah Arendt. Anyone know who Hannah Arendt is? Okay, some people still read. Good. In The Origins of Totalitarianism, she said that basically totalitarian attacks on freedom begin with dictators undermining the validity of facts.
31:30
And if we look at the history of how freedom dies, it dies when facts are under attack, when trust itself is under attack. And I would say now we are at a moment where we have entered a third world war. And the war is on trust.
31:44
Trust in systems, trust in data, trust in institutions. For humanitarian actors, trust is our only currency. The humanitarian principles of independence, neutrality, humanity and impartiality depend on the trust that we have with affected populations, with governments.
32:04
And that trust is baked into norms and into the rule of law. The technology we thought would liberate us, that would set us free, is in fact the vector for a new type of threat. This is Picasso's Guernica on an iPhone. Ladies and gentlemen, this is where we live now.
32:25
The Guernicas of today are happening on social networks, they are happening on devices as much as they are happening on battlefields. And welcome to the great oppression. These liberating tools are the pathway by which now through things
32:40
such as misinformation and in some cases direct cyber attacks, we have entered a new threat matrix. Quickly, the Rohingya in Myanmar have been displaced in an act of ethnic cleansing. Facebook, we now know, has been an inflammatory factor in the attacks on this population.
33:04
Once again Facebook: in South Sudan, the UN has put out a report on the role Facebook is playing in communal violence — as I trip over the speaker. The role Facebook has played has at some points even extended to evidence of command and control of attacks.
33:22
And you all know this one, the 2016 election in the United States. Social media misinformation was a critical factor in how that played out. So, another example of misinformation, the white helmets in Syria.
33:41
So, we're looking at the decline and fall of normative frameworks and that's bad for humanitarians. But meanwhile, what are we all doing in the civil society space? This is the opening of Epcot Center in 1982. I find this photo to be ridiculous. I also find the way that civil society has responded to this moment ridiculous as well.
34:04
Because we have focused on the tech as if code can save us. When really, we're not about building the world of tomorrow, we should be about preventing Jurassic Park. So, PII, you heard from Massimo, personal identifiable information.
34:22
It's not PII anymore, it's DII — demographically identifiable information, information about communities. And it's ABI — action-based information about the movement patterns of populations. To close up here, this is the plumb line. Data is people.
34:40
When we handle data, we're handling the lives of people. These are pictures of the world we live in now. We now have a networked diaspora, the largest in history. The Syrian refugee population are using mobile devices to make decisions about finding aid, about receiving assistance.
35:05
And at this point, we're not equipped for this revolution. We do not yet have an agreed duty of care, a duty of care for how we handle data as people. Very quickly, we're looking down the wrong end of the telescope. We've focused on the promise of technology, and that's actually made the population very small.
35:25
But when we look at rights as the first lens down the telescope, the population comes into focus. At the Signal Program, we wrote something called the Signal Code, published in January last year. We said there are five rights that already exist: a right to information during crisis; a right to protection in how that information is provided;
35:45
a right to privacy, a right to data agency, and a right to rectification and redress when information communication technologies and their use cause harm. To end, I'm just going to end with this wise woman.
36:02
Let's look at her again, the late great Ursula Franklin. She said: technology is not the sum of the artifacts, of the wheels and gears and rails and electronic transmitters. For me, technology is a system. Technology involves organization, procedures, symbols, new words, equations, and most of all, it involves a mindset. What I want to leave you with: it is time for our mindset to change.
36:23
We're doing it wrong. It's not about what tech can do. It's about a rights-based approach to ensure that we prevent what tech should not do. Thank you.
36:40
All right. Dorothy, we're going to get you back up here. There we go. Fluent in German. All right. Natty's always a lighthearted end to a series of talks, so that was why we left him until the end, but it segued almost like we planned it.
37:00
So now I have a few questions for the panel, and we'll start with Dorothy. Dorothy, just want to make sure you can still hear me. Can you? Yeah? We're audio up. Can we hear Dorothy? Say something? Yes, I can hear you. All right. So thinking about technology as a system and this provocation that we need to really look at the mindset,
37:24
both individually, institutionally, as well as culturally — one of the big areas for international organizations thinking about how to mitigate risks relates to finding a balance between policy and practice, figuring out where to intervene to have an impact, and finding a balance
37:43
between policy, practice, and culture. So Dorothy, in the context of UNRWA, could you share a bit more about how you're navigating that pretty complex landscape? Yeah, thanks for the question. Certainly extremely relevant, and it hits actually at what we're currently struggling with.
38:01
So at the point when we realized that the existing policies that we have to govern our data management are pretty much outdated — as they do not respond to the day-to-day issues that we're being faced with, in the sense that our staff don't even relate to these policies anymore and are making ad hoc decisions, as they are very ingenious at doing in UNRWA in any case.
38:27
So as we're in that situation, we said, okay, we need to go back to the drawing board and we're going to rewrite a data protection and data access policy. Again, of course, we have our lawyers sitting around the table and drafting something,
38:42
then everybody comments on it, which may sometimes lead to suboptimal results. But we did agree that we now have a better engagement process between the lawyers that are drafting this and the practice side, which is very important. At the same time, it's very clear that no policy,
39:00
given also the speed with which technologies are developing and the complexities of our operational environment that give us challenges every day from our operations in Syria down to Gaza City, no policy will be able to actually capture all of the situations and provide sufficient guidance to make decisions.
39:22
So I guess what we're going to settle on is a broad policy that definitely highlights non-negotiable principles but also might highlight some of the ethical dilemmas we might be faced with. And this shall then be a sounding board for decision-making. So I guess what we're looking at is we're going to create a policy layer
39:44
which is a reference document for us to rely on. But then we also need to put in place structures for consultation, for reflection, and for accountable and informed decision-making as we move along and making sure that the technologies we use
40:02
and the data that we host is not going to become a risk to the vulnerable refugees that we serve. And as a third element, of course, we need to spread the awareness across the agency of what Nathaniel just mentioned, data as people, because we are so very much focused on the service delivery
40:23
and get things done that sometimes the protection elements fall a little bit too short. So that is something that we increasingly are communicating through various means of how we actually operate and do things.
40:41
Brilliant, thank you. Many organizations haven't yet established the function for data protection and so in many ways ICRC is unique in having a data protection office established and that's the team that Massimo leads. The handbook that we rushed him through showing you is seriously a tremendous resource and it falls between policy and practice.
41:01
It looks at providing a framework for how decisions can be made in the field but making that a reality is complex. So maybe from things you might have shared in your talk if you had a few more minutes, do you want to touch briefly on how, now that the handbook is in place and launched, you're planning to socialize it across the ICRC? So thanks very much for giving me the opportunity to expand a little bit on the next steps
41:24
and certainly it helps nobody to have a good resource on the shelf gathering dust. So why is it that sometimes principles and policies do not translate into practice? Well, it's because they remain on those shelves, and the only way to translate them into practice is really to ensure that the principles that are in there
41:45
are really broken down and drilled into the operating procedures, the guidelines, and the policies, so that people that are having to deal with an emergency don't need to look very far. They just need to apply their own practical policies and guidelines, and they will know that they are in line with the policy.
42:03
Data protection impact assessments. This comes up at all privacy conferences. Whenever there is a complex question that nobody knows the answer to, that's the answer that you can give, and everybody will be nodding.
42:23
and to make sure that you are clear as to what the risks that you are generating are going to be and that you are accountable if you are taking them and that you mitigate them as much as possible knowing what they are. So awareness of risks is really key and it's particularly with these technologies and new technologies that awareness of risk is becoming so important.
42:44
People in humanitarian organizations are actually good people and they want to do the right thing. The problem is that sometimes it's not easy to understand what the right thing is. Take a cash transfer: a voucher that pops up on your phone might look like an innocuous cash voucher.
43:00
You don't actually realize that in order to get to somebody's phone, it has to go through a whole series of sub-processors and processors and stakeholders: a financial institution, a financial service provider, a mobile phone operator. In the process, those stakeholders might actually be generating a lot more data and metadata about the transaction than you are actually aware of.
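As a purely illustrative sketch of that point — every name and field below is hypothetical, and a real payment chain records far more — here is how each intermediary can end up holding its own metadata record about a single voucher:

```python
# Purely illustrative: one voucher delivery, and the separate metadata record
# each intermediary along the chain can observe and retain. All names are
# hypothetical; real chains also log timestamps, device IDs, locations, etc.
from datetime import datetime, timezone

def deliver_voucher(voucher: dict, intermediaries: list) -> list:
    trail = []
    for hop in intermediaries:
        trail.append({
            "observer": hop,
            "recipient_msisdn": voucher["msisdn"],  # the beneficiary's phone number
            "amount": voucher["amount"],
            "seen_at": datetime.now(timezone.utc).isoformat(),
        })
    return trail

trail = deliver_voucher(
    {"msisdn": "+000000000000", "amount": 50},
    ["aid_organization", "financial_service_provider", "bank", "mobile_operator"],
)
print(len(trail), "metadata records generated for one 'innocuous' voucher")
```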
43:23
And so awareness is the key. What we are doing now with Privacy International is a study on the possible implications of humanitarian metadata generation. A famous former director of an intelligence agency publicly declared, "we kill people based on metadata."
43:42
Well, they do that in conflict areas, which is where we work. It is our responsibility to understand to what extent the generation of metadata in humanitarian programs — including cash, including using social media messaging apps — might actually be contributing to that. So awareness, integrating the rules in the day-to-day practices and policies,
44:01
and data protection impact assessments are key, as is cooperation with stakeholders like, in this case, our friends at the École Polytechnique Fédérale de Lausanne, who actually have very sophisticated technological knowledge that they can share with us and can help us solve certain technical problems. That's a great segue, Massimo.
44:20
So while obviously culture and governance and different mechanisms are critical, there is of course a role for technology to play. So Stevens, can you tell us a bit more about the technological interventions that you and your team explored in your research, in the context of an organization like the ICRC having the capacity, competency, and capability
44:40
to actually introduce more robust technological safeguards? Yes, of course. So as I mentioned, we identified that there was a need for secure communication, data management, and processing, and we wanted to look at the intersection of those technologies with the organizational constraints of the ICRC as well as the specific P&I
45:02
that they have in certain countries. And so at the moment we're starting to work on something that is called an anonymity network that prevents the creation of metadata. But as opposed to networks like Tor that work over the internet,
45:22
what we're trying to do is to see whether we could deploy such a network within the ICRC. For example in the secure server rooms that exist within delegations and sub-delegations in order to enforce the physical security that I brought up during my talk. So that as long as one server room in a country
45:41
or a combination of different server rooms in several countries remains legitimate — that is, as long as their security has not been violated — then we can enforce privacy-enhancing communications as well as end-to-end encryption within the ICRC in a first step, and perhaps extend this to ICRC beneficiaries in a second step.
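For readers wanting a feel for the layered encryption such anonymity networks rely on, here is a minimal sketch, assuming the third-party Python `cryptography` package. It is not the EPFL/ICRC design, just the core idea: each relay (say, a delegation server room) holds one key and peels exactly one layer, so no single relay sees both the full route and the plaintext.

```python
# Layered ("onion") encryption sketch: each relay holds one key and peels
# one layer. Requires the third-party `cryptography` package; all names
# here are illustrative, not an actual deployment design.
from cryptography.fernet import Fernet

# One symmetric key per relay, e.g. one per delegation server room.
relay_keys = [Fernet.generate_key() for _ in range(3)]

def onion_wrap(message: bytes, keys: list) -> bytes:
    # Encrypt for the last relay first, so the first relay's layer is outermost.
    for key in reversed(keys):
        message = Fernet(key).encrypt(message)
    return message

def relay_peel(ciphertext: bytes, key: bytes) -> bytes:
    # What each relay does: remove exactly its own layer and forward the rest.
    return Fernet(key).decrypt(ciphertext)

packet = onion_wrap(b"field report", relay_keys)
for key in relay_keys:          # the packet traverses the relays in order
    packet = relay_peel(packet, key)
assert packet == b"field report"
```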
46:02
Great. Now, Natty, we talk a lot about the importance of having — again, what we refer to as the competency, capacity, and capability — to understand not just the technologies you're using but also the risks that they pose.
46:23
What are some of the ways you're seeing that the measures that Dorothy, Massimo and Stevens have mentioned are effectively being absorbed or starting to be absorbed into humanitarian organizations and perhaps also the community at large? Well, one piece of evidence is the fact that we're here today
46:41
with the support of GIZ and our colleagues here in Germany. The commitment of funders to the professionalization of not just the tech side but of the broader data-responsibility-as-a-profession side is starting to happen.
47:02
We're still a long way away, but if you look at where we were, say, 10 years ago, during the great Ushahidi crisis-map explosion of 2007-2008, we've now moved to a point beyond a lot of the fetish about volunteer organizations
47:21
being the tip of the spear. They're important, but now we're seeing the beginning of actual professional training — the type of training we do at Harvard on remote sensing and response, just as an example — and the beginning of professional-level staff who are responsible for these operations, not just on the tech side but on the ethics and rights side.
47:46
Some organizations are doing that really well. To be specific, the World Food Program recently released an audit of its privacy policies and found that it didn't live up to them. That type of transparency, what we call critical incident reporting, is essential.
48:04
We haven't wanted to tell the truth about when things went bump in the night or when the wheels came off. But if we don't have the negative examples, if we don't have the courage to tell the truth when it doesn't work, we can't develop an evidence-based approach to professionalization and countermeasures.
48:21
So another organization, just to end here, International Organization for Migration, IOM, we've been working with them on our forthcoming core obligations for humanitarian information activities, the first attempt at an ethics code for this work and to see organizations like IOM, which handles biometric data in many cases of refugees crossing borders or internally displaced,
48:44
to see the IOMs, the WFPs taking that lead, that's a great sign. It's not enough, but it's significantly better than where we were a decade ago. Absolutely. I want to gauge how many questions there might be in the audience, because I have plenty, but I want to turn to you all and see what questions you might have.
49:05
I realize it's warm. We've got about 10 minutes left. Oops, I'm blocking. Yes. Hi. I would like to ask whether you're looking into ledger technologies,
49:24
like digital identities or blockchain-based cash flows — whether that's relevant, like the other technologies and anonymity networks; that's one question. And the other: are you actually able to protect your data from the countries where you work?
49:46
I mean, if somebody comes and has a legal warrant — I don't know if you can talk about this — but is that an issue? Maybe Dorothy: I don't know if that's an issue, if Jordan or somebody
50:02
comes up to you and says, actually, we would like to look into your data and we want it for whatever reason. How are you able to protect your data, and how do you do it? Thank you. We'll take a few more and then turn back to the panel. Any other questions out there? Yes, over here.
50:23
Hi. Thanks, all of you. Really interesting. A lot of commercial organizations are trying to address some of the humanitarian issues we're seeing, as alluded to there, looking at blockchain-based platforms and otherwise. It would be interesting to know whether you consider those organizations to be kind of fundamentals-driven,
50:45
on the basis of the research you're doing, or whether maybe the drive to achieve profitability might create platforms that, whilst they kind of work in the short term, are maybe not what you're looking for, for the long-term benefits you're targeting.
51:03
Thank you. Maybe one more? There's one over here. Just a question on cash assistance. So, I would imagine you do an assessment for refugees who qualify for cash assistance,
51:22
whether they have sources of income, and then obviously you gather data on that. So, do you give them the right to be forgotten, to have their data erased after a certain period of time, for example, when the cash assistance finishes? Thank you. So, let's turn back, and Dorothy, maybe we can start with the questions directed at you,
51:43
if you're happy to answer a completely hypothetical question about governments potentially asking for data. So, the question was, how can we ensure that you're able to protect data from countries in theaters where you're working? Do such requests occur, and if so, what type of measures might organizations like UNRWA or the ICRC take
52:05
to ensure that they can hold and protect the data that they are the custodians of? Right. So, do you want me to respond? Yes, please. Okay. I guess the first question, I didn't completely understand it, but I guess it relates to how we see the relevance
52:26
or effectiveness of applying blockchain in terms of our operations specifically to cash assistance. Admittedly, we did look at the possibility of introducing blockchain, and we continue doing so.
52:41
WFP is somehow leading the herd on this. Again, I mean, apart from the fact, of course, that UNRWA right now is in its most severe financial crisis in its 70 years, and we don't even know how to open up the schools as of September, so there is a resource constraint here
53:04
to introducing very sophisticated new technologies and having staff available to really run this, I must admit on one side. But we are nevertheless following up and exploring whether it makes sense. But I just want to highlight that, again, this is, you know, a case in point where the question is: is blockchain the right way of collecting and storing the data,
53:26
is it the right thing? And it seems to be obliterating some of the much more pressing issues that we have when we look at the way we design our cash assistance programs, which is are we really targeting the right people?
53:40
You know, is the value of our transfers accurate? How do we access the populations? All of these much more pressing issues are then obliterated by the question: is blockchain the right way just to process the data in itself?
54:03
And we're right now more concerned, you know, with other elements in terms of, you know, huge humanitarian crisis in Syria, running out of funds, Gaza, where we provide a million people with food and we don't have the money for this. So the blockchain issue is a little bit on the back burner. Do we erase data? The way UNRWA operates is that you need to prove your service eligibility.
54:26
We do not indiscriminately provide services to populations. People have to come and register with us. So they either prove their Palestine refugee status according to the criteria that are available in the public domain
54:42
or otherwise being service eligible. And their records remain with us. Sometimes we have records, as mentioned, that go back down the family trees 70 years and we maintain them, we keep them. And if you wish, this also constitutes something like a national archive or an archive of the Palestine refugee population.
55:05
And our refugees are very much interested that we preserve these archives and that we preserve their information, which in some cases now is also very handy to them. We have a lot of requests from governments in Europe asking us to basically document whether person X or Y is actually registered with UNRWA and is a Palestine refugee because they're claiming asylum.
55:31
Thanks, Dorothy. On the question of privileges and immunities, very briefly, because we've got about two and a half minutes left, Massimo and then Stevens, so from the more strategic approach and then the technological side,
55:43
this question of if governments request for data of beneficiaries who you have registered in your systems, how do privileges and immunities apply from a data protection perspective and then in turn how are you thinking about the technological safeguards that should be in place for that? So that's a very complex question, actually, because it depends.
56:02
And I'm a lawyer, so that's the answer that you're going to hear from a lawyer: it depends. It depends on the specific type of activity. It depends on the level of threat and the type of threat. It depends on the type of request. There are many programs where, in fact, the whole point of the program is to enable the authorities to perform their duties to protect or to provide assistance to certain categories of people.
56:27
So they must be informed about certain things. When there are threats, however, this is where the privileges and immunities become key. And so the fact that, for example, in relation to certain types of activities that are particularly sensitive,
56:43
we don't work with partners. It is us going directly into conflict areas, into places of detention, to collect and to document the conditions of detention. That's because we can be covered by privileges and immunities, and if authorities come and seek to have access, we can say we have immunity from jurisdiction. You know it very well.
57:02
This is why you have accepted us being here, because we're a neutral, impartial, independent actor and you have accepted this mode of supranationality. I don't want to paint an ideal scenario. We also know that in some cases, that is not sufficient. And this is why it's not something new. It's something that dates back to the very early guidelines of our protection work.
57:26
If we fear that the pressure that can be put on in relation to some cases is too high, and that the danger for the individual concerned is too high, then actually the guideline is not to collect the information.
57:41
Very briefly, Stevens, on the technological side. Then we'll close, Natty, with a comment on the question about commercial actors and the role that they play. So Stevens, then Nathaniel. Yes, so again, I think there is an important distinction to make between technology that is customer- or beneficiary-facing as opposed to internal technology that would be used only by ICRC employees.
58:03
And with respect to the P&I, I think there is an advantage to leverage when deploying infrastructure over a system of jurisdictions and delegations. And so perhaps some delegations are more trusted than others.
58:23
For example, the ICRC's headquarters are based in Geneva, and there is a strong relationship between the organization and the Swiss government. And this is where perhaps there would be an opportunity to deploy a blockchain system, for example, for access management for data.
58:44
So that as long as a server that participates in this blockchain system is not compromised, the entire system can remain legitimate.
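A minimal sketch of the tamper-evidence idea behind such a system — using a simple hash chain rather than a full blockchain with replication and consensus, and with hypothetical field names — might look like this: altering any earlier access-log entry invalidates every later hash.

```python
# Tamper-evident access log sketch: each entry commits to the previous one,
# so altering any earlier record invalidates every later hash.
# Field names ("who", "action", "file") are hypothetical.
import hashlib
import json

def entry_hash(record: dict, prev: str) -> str:
    body = json.dumps({"record": record, "prev": prev}, sort_keys=True)
    return hashlib.sha256(body.encode("utf-8")).hexdigest()

def append_entry(chain: list, record: dict) -> None:
    prev = chain[-1]["hash"] if chain else "0" * 64
    chain.append({"record": record, "prev": prev,
                  "hash": entry_hash(record, prev)})

def verify_chain(chain: list) -> bool:
    prev = "0" * 64
    for e in chain:
        if e["prev"] != prev or e["hash"] != entry_hash(e["record"], prev):
            return False
        prev = e["hash"]
    return True

log: list = []
append_entry(log, {"who": "delegate-1", "action": "read", "file": "F-001"})
append_entry(log, {"who": "delegate-2", "action": "update", "file": "F-001"})
assert verify_chain(log)
log[0]["record"]["action"] = "delete"   # tampering with history...
assert not verify_chain(log)            # ...is immediately detectable
```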
59:02
Great, thank you. Nathaniel? So when I was but a young aid worker in the early 2000s, the big issue then was civil-military space, in the context of Iraq and Afghanistan — meaning how much should aid workers accept protection from, or engage with, armed actors.
59:21
In many cases the debate was around whether we should receive U.S. and NATO protection in the field — a very complex, contentious issue. But now, at this current moment, we have a new type of space question, which we call civil-corporate space.
59:41
And it's been the exact opposite. We as the humanitarian community have been so excited to hop into bed with large corporations and large platforms, sharing data in an often experimental way.
01:00:00
And there are moments when that can be really important and good, but we haven't asked the broader question: how do you put a Red Cross on a server? How do you impartially share data? How do you have independence from a corporate agenda, which is not about
01:00:20
the needs-based ethic and needs-based code of conduct that we follow as humanitarians, but is about shareholder value? Those are fundamental conflicts, and in many ways we haven't had that conversation. And so it doesn't mean don't work with corporations.
01:00:41
We have corporate partners that we work with constantly, but it means how do we maintain the people and their dignity and their needs, which is the basis of humanitarian response, rather than get co-opted by a corporate agenda?
01:01:00
Thank you. It all comes back to trust, and this is really about understanding how we continue to build and maintain trust so that as we introduce technology and data more and more into the way we deliver aid, we ensure that we're protecting the people that we're trying to serve. I want to thank the panel and our colleagues at BMZ and GIZ for organizing this track.
01:01:20
We're really excited to be a part of it. I suspect the next panel is about blockchain, given the increased number of people in the room. So next year I will think about the title of the talk, including something about foresight and blockchain, but thank you all for coming. We really appreciate it, and we're around. If you want to continue the discussion, Dorothy, we will send you a beer and a pretzel in Amman.
01:01:40
Thank you.