
Privacy by design & remote tools


Formal Metadata

Title
Privacy by design & remote tools
Subtitle
Privacy standards during the current crisis and beyond
Number of Parts
49
License
CC Attribution 4.0 International:
You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor.

Content Metadata

Abstract
Privacy is just an additional burden, especially these days? The GDPR requirements should be relaxed, is what lobbying associations are currently claiming. This talk makes the case that the privacy principles, in particular the concept of privacy by design, also support and protect companies' information and business secrets. Privacy by design - meaning taking privacy into account throughout the whole engineering process - is a concept that was developed 10 years ago by the former Information and Privacy Commissioner of Ontario, Ann Cavoukian. In 2018, when the GDPR became effective, privacy by design became for the first time an enforceable concept in the European Union (EU) and for the member states of the European Economic Area (EEA). Violations can be punished with fines of up to EUR 10,000,000 or up to 2 % of the total worldwide annual turnover of the preceding financial year, whichever is higher. Nevertheless, the obligation of controllers to comply with the principle of privacy by design seems either to fly under the radar of many companies or simply not to be as prominent as popular and visible obligations like information requirements or data processing agreements. The talk engages with why the principle of privacy by design shouldn't be underestimated.
Transcript: English(auto-generated)
Thank you so much for joining this talk and let me get started. So today I would like to talk about privacy by design and remote tools and what triggered me was the start of the corona crisis. And in particular, all these news around companies who said that privacy is just an additional burden and that GDPR requirements,
the General Data Protection Regulation requirements should be relaxed. In particular, lobbying associations were claiming that and pleading for weaker privacy standards.
And I truly believe that this is neither necessary nor a good idea, because privacy is not only about protecting the personal data of data subjects, which is, of course, the most important thing.
Privacy can also protect business secrets, trade secrets and other things, and it ensures security, which is an integral part of privacy by design. And that's why I would like to get back to the principle of privacy by design, which was introduced in the General Data Protection Regulation.
But one step after the other. So first of all, maybe I'm starting with introducing myself. So who am I? I'm basically a lawyer. So I started as a lawyer and I worked for international law firms.
Then I moved to a medium-sized bank, and all that time I practiced data protection law and also information security and technology law. But the bank was more like a governmental institution: very strict, a lot of processes, and not very dynamic.
So I decided to go back to a law firm, to an international law firm, because in the law firms, I always used to advise small companies, medium sized companies, but also big companies, but always in connection with new techniques, ideas. And I was deeply involved in the processes, which was pretty exciting.
But half a year after I returned to a law firm, I got hired by eyeo, my current company, where our CEO asked me if I wanted to join a really flexible new startup dealing with some great ideas and changing the Internet.
I was really excited. So I joined eyeo, and I've been there for over four years now. I started as the data protection officer and legal counsel. Currently, I'm still the data protection officer, and I will be for the future as well.
And I'm also leading the corporate affairs department, which is a mix of lawyers, security and privacy people, but also public affairs people and people dealing with corporate communications. And I believe that even if it's a diverse mix, it's a good mix, because we are
looking at privacy, law, public affairs and communication from very different angles and try to help each other. And that's pretty exciting. So what is eyeo? eyeo is the company behind the ad blocker Adblock Plus. You may have heard of that.
So what we are basically doing is providing an open source extension which blocks ads, but is also configurable in a way that it can block cookies, tracking and other things. Currently, we have more than 230 employees. We are located in three different cities: Cologne, Berlin and Malmö in Sweden.
And our employees are working out of more than 25 countries. So we are a remote company. And when
Corona kicked in, it was not that hard for us to switch from office work to full remote work. It was just a bit of a change for people who really like to work in an office, like me, but for most of the other people it didn't change anything, because they were already working remotely.
Yeah, that's briefly me. Let's now dive into the talk itself. I would like to start with what privacy by design actually is. I already said I want to talk about the General Data Protection Regulation, where privacy by design is now implemented.
But the journey of privacy by design started way earlier. The first famous concept of privacy by design was introduced around 10 years ago by the former Information and Privacy Commissioner of Ontario.
That was Ann Cavoukian. She introduced the seven foundational principles of privacy by design, which basically means that privacy shall be taken into account throughout the whole engineering process of software.
So as soon as you start to develop, privacy shall be considered. When I first heard of this principle, it reminded me of the professor I used to work for when I was a student, back in 2004.
So way earlier than Ann Cavoukian introduced privacy by design. Why is that? He is a German professor who has always worked in the field of privacy, looking at privacy from the perspective of our fundamental rights.
So he was always looking at the fundamental rights in which the right to privacy is rooted. And he said our fundamental rights must always be considered when technical developments start. So actually, he was already referring to considering privacy and fundamental rights throughout the whole engineering process in the early 2000s.
It was pretty early. And he also tried to lobby for a law introducing this principle, but it was never actually implemented.
Then came the GDPR, the General Data Protection Regulation. Actually, even the time before it mattered, because there were about two years between its adoption and its application, time meant for adapting processes and things like that. But the basic regulations were already in place.
So from 2016 up to 2018, a lot of people, including me, were pretty excited that the principle of privacy by design was now finally introduced in the General Data Protection Regulation.
And what I did now, because it is pretty important for what I would like to talk about in the next minutes, I added a screenshot of the basic principle, that is article 25 of the General Data Protection Regulation.
And it is split into two sections. Section one is privacy by design, section two is privacy by default. In Ann Cavoukian's concept, privacy by default is part of privacy by design.
The law now distinguishes between the two of them. What the law does not do is distinguish between who can actually control privacy by design and privacy by default. So I've highlighted for you who the addressee of Article 25 is. And
here you can see the controller shall implement appropriate technical and organizational measures, which are designed to implement data protection principles. What is the controller? The controller is the one who is using the software or the technique to collect and to process personal data.
That is usually not the one who is developing software. And I will get back to that in a few minutes. But that is a huge issue.
For privacy by default, we have the same: the controller shall be the one responsible for privacy-friendly default settings. Violations of both principles can be punished with fines of up to 10 million euros or 2% of the total worldwide annual turnover of a company.
And it's not only the single company; the whole group is what the 2% is measured against. So that can be quite a high fine.
So let me get back. I said the controller is responsible for implementing data protection principles. I assume that if you're not a lawyer, you don't know what is meant by data protection principles.
So one last time, I would like to show you the law, because this is also pretty important for your understanding. What are data protection principles? Data protection principles are laid out in Article 5 of the General Data Protection Regulation. You don't need to read everything; I've highlighted the important parts.
So there are several principles, which in particular say that personal data shall always be processed in a lawful, fair and transparent manner. Lawful means you can only process personal data if there is a statutory legal permission or prior consent.
It must be transparent. That means the data subject, the person the personal data relates to, must be informed about what is happening with their data. You may know that from all the privacy policies you find on the Internet.
Then there is a clear purpose limitation, meaning personal data can only be processed for the purpose for which it was originally collected. There are a few exceptions, but never mind, those are very rare. So in general, we have a clear purpose limitation.
So if you're collecting, for example, an email address to provide a newsletter about your products, you can't use this email address to send information about other companies or whatever.
So it's clearly limited to the purposes you originally stated when collecting the data. Then we have the principle of data minimization, which was already part of Ann Cavoukian's privacy by design concept.
So that means: do not collect more data than is really necessary for fulfilling the purpose you want to achieve. That is something which admittedly doesn't fit well with big data, but it is a basic principle and should always be considered.
Data need to be accurate. So as soon as data is no longer relevant or no longer correct, it must be corrected or even erased. Then we have the principle of storage limitation. Storage limitation means that as soon as data is
no longer necessary for the purpose, and there are no legal retention obligations, it must be deleted, ASAP. You can't store personal data forever. That is simply not permitted.
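The storage-limitation principle lends itself to automation. As a minimal sketch, assuming a hypothetical record layout and an invented 24-month retention rule (neither comes from the talk or the GDPR itself, which sets no fixed periods):

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention rule: records are kept at most 24 months.
RETENTION = timedelta(days=730)

def purge_expired(records, now=None):
    """Keep only records still within the retention period, or records
    under a statutory retention obligation (flagged as legal_hold)."""
    now = now or datetime.now(timezone.utc)
    return [
        r for r in records
        if r["legal_hold"] or now - r["collected_at"] <= RETENTION
    ]

records = [
    {"email": "old@example.org",
     "collected_at": datetime(2018, 1, 1, tzinfo=timezone.utc),
     "legal_hold": False},
    {"email": "recent@example.org",
     "collected_at": datetime.now(timezone.utc) - timedelta(days=30),
     "legal_hold": False},
]
kept = purge_expired(records)  # only the recent record survives
```

The `legal_hold` flag mirrors the exception the speaker mentions: statutory retention obligations override deletion.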
And the last thing: data must be kept confidential, and there is also the principle of integrity. So it must be ensured, with appropriate technical and organizational measures, that
data is protected against loss, data breaches, destruction, unauthorized access and the like. That again goes into the basic principle of privacy by design. I hope that was understandable. Let me get back to the initial slide.
So privacy by design means the controller shall implement appropriate technical and organizational measures which are designed to implement data protection principles. So there shall be technical and organizational measures which are ensuring that
the data protection principles I just explained are implemented in an appropriate manner. What are appropriate technical and organizational measures? The law generally does not explicitly state that you have to do X, Y and Z.
Why is that? Because techniques and measures can change, so the law is deliberately technology-neutral, as we call it. What the law says is that the controller has to take into account the state of the art, the cost of implementation, and the nature, scope, context and purposes of processing.
Also the risks which are related to the processing of the specific personal data and things like that. And only such measures which are appropriate in that manner must be implemented.
So there is no clear right or wrong. You always have to figure out, for the specific case, what kind of measures are appropriate. And as you can see, the more sensitive the data is, the stronger the measures to protect it should be.
So you can always start with a risk matrix: first check how sensitive the personal data is. The less sensitive it is, the fewer measures you need to implement. In addition, you also need to check how likely it is that personal data can be exposed, given the way it is processed.
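The sensitivity-times-likelihood idea can be sketched as a tiny risk matrix. The scales, category names and thresholds below are my own illustrative choices, not something the GDPR or the talk prescribes:

```python
# Illustrative risk matrix: both axes scored 1 (low) to 3 (high).
SENSITIVITY = {"public": 1, "personal": 2, "special_category": 3}
LIKELIHOOD = {"unlikely": 1, "possible": 2, "likely": 3}

def risk_level(sensitivity: str, exposure: str) -> str:
    """Combine data sensitivity and exposure likelihood into a rough
    risk class that drives how strong the measures should be."""
    score = SENSITIVITY[sensitivity] * LIKELIHOOD[exposure]
    if score >= 6:
        return "high"    # e.g. encryption at rest, strict access control
    if score >= 3:
        return "medium"  # e.g. access control, pseudonymization
    return "low"         # baseline measures only

level = risk_level("special_category", "possible")  # health data in a cloud tool
```

Here `risk_level("special_category", "possible")` lands in the "high" class, matching the point that more sensitive data demands stronger measures.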
And both need to be taken into account, and then you can start thinking about appropriate technical measures. So that's the basic principle. Getting back to what I already hinted at:
Article 25, so the privacy by design principle only addresses the controller. And I already said that is usually not the developer or manufacturer.
And also the data protection principles say that the controller is responsible for the compliance with data protection principles. It's always a controller. It is not the developer. You may already see the problem.
We do have a recital. Recitals are designed to help with interpretation of the law. And this recital 78 says, of course, developers shall be encouraged to enable the controller to fulfill their responsibilities of privacy by design or compliance with a general data protection regulation.
As you can see, that is very weak. First, it is not law itself; it just helps with interpretation. And it does not clearly oblige a developer.
And then there's another article, Article 32, which gives examples of what technical and organizational measures can be. And it also addresses only the controller and, sometimes, the processor.
So if you're using a cloud service provider who stores data on your behalf, then they must also ensure technical and organizational measures. But the developer itself is not addressed. So we have started with the law, and now I would like to get back to the corona crisis and why I would like to talk about privacy by design now.
And then we will try to bring both perspectives together. And by the way, at the end, we will have a lot of time for questions and answers. So if you have anything in your mind, make a note and let me know at the end.
It's easier for the moderators that way. So what is the current situation? If you look at the graph on the right side, it gives a brief overview of the period from September last year to March this year.
And it shows how many business apps have been downloaded during that time. And you can see it's pretty stable until March.
And in mid-March, around 90% more business apps were downloaded than the average before. That's quite a lot. And I can totally relate to that, because that was exactly what we saw at eyeo.
So I will show you at a later stage that we introduced a so-called security and privacy review process, meaning no third-party tool can ever be introduced without going through that review. And these reviews tripled.
That was crazy. We received so many questions and requests for so many tools that we weren't able to handle them. So in the end, we introduced a stop on new tools, but that's a different story. And I guess it was basically the same for almost all companies.
Why is that? Everybody started to send their employees home and let them work remotely, even if the company hadn't been a remote company before. Of course, only where it was possible; we're talking about office workers. And what kind of tools did these companies choose?
Not being experienced, having no big IT department and no developers able to deal with, for example, open source tools, they of course went for proprietary software and cloud tools. So what's the issue with that?
All these tools process personal data of our employees, partners, customers and many more. The providers can access this data; they provide maintenance, and by providing maintenance they can access such data, and so on.
The majority of these tools is not hosted in the European Union. And as we are talking about proprietary software and cloud solutions, the company itself, the controller who is actually processing personal data, does not have full control over what happens with the data processing handled by these cloud providers.
And now, thinking about what I said in connection with Article 25 privacy by design: the responsibility for privacy by design lies with the controller.
But the controller does not have full control over that. Here you can also see a graph of the biggest tech giants.
And as you can see, they are mostly American or, like Alibaba, Asian, but certainly not European companies. They are almost all cloud providers, as I already said, and the graph proves that.
And they are all proprietary. So that is exactly the issue. I would now like to give you one example, trying to connect the law with the issues such providers have.
Some of you may roll your eyes now, because Zoom is a story which has already been extensively discussed in the press, and a lot of people were bashing it. I don't want to bash Zoom, but I would like to use it as an example.
So why can not considering privacy by design, or lobbying for more relaxed privacy regulations, become an issue for your business? I said at the beginning that privacy by design violations can be punished with pretty high fines.
So 10 million or 2% of the global annual turnover. Nevertheless, privacy by design is somehow under the radar of many companies. It's not popular, it's not prominent, it's not visible.
I really don't know why, but it seems not to be an issue at all. But now look, getting back to Zoom: what happened there? Zoom is a proprietary video conferencing tool from the US, and it has provided video conferencing in a pretty new way.
So it really thought from a user perspective what is needed to make video conferencing convenient and to provide an experience which is somewhat comparable with in-office meetings.
Like raising the hands, showing a lot of people at the same screen at the same time, and things like that, having the breakout rooms and stuff like that. That is basically what all the other video conferencing tools are now doing as well.
But they were always a step behind Zoom. So Zoom was really innovative. But their default settings have never been very privacy friendly. They are now trying to fix that: they had a 90-day feature freeze to implement a bit more privacy and security.
And they even admitted in meetings that their focus was not on security and privacy, which from a business perspective is even understandable.
Nevertheless, they had a lot of data breaches in the past and major security risks. For example, the one for Windows users, where links caused Windows to send the person's Windows login name and their NTLM password hash, which made it possible to access almost everything.
And that's a big security risk. It's a risk not only for personal data of participants, it's also a risk for the company's business secrets which are accessible on the computer or the device of the user of Zoom and things like that.
So Zoom bombing had been an issue, of course. But there are so many more things. Zoom itself was never focused on privacy, and obviously they haven't implemented privacy by design. So the only angle a company using Zoom has is to use the options Zoom provides to change the default settings.
Which is then privacy by default, the second paragraph of Article 25. Looking again at the engineering process and the life cycle of designing software and services,
you can see here, as I already mentioned, that privacy by design is something which is in the control of the developer. It must be implemented in the engineering itself.
At a later stage, the only chance the controller has is to either do customization, if that is possible, or change default settings, provided the developer and manufacturer offer the option to change them. Not every tool provider does.
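What privacy by default could look like from the developer's side can be sketched with a small settings object. The field names below are invented for illustration; they are not any real tool's actual options:

```python
from dataclasses import dataclass

@dataclass
class MeetingConfig:
    """Hypothetical video-conferencing settings. Privacy by default means
    the out-of-the-box values are already the most protective ones; the
    controller may consciously loosen them, never the other way around."""
    waiting_room: bool = True        # strangers cannot just drop in
    require_password: bool = True
    recording_enabled: bool = False  # recording is opt-in, never opt-out
    share_telemetry: bool = False    # no analytics unless enabled

default = MeetingConfig()                        # protective without any action
relaxed = MeetingConfig(recording_enabled=True)  # an explicit, conscious choice
```

The design choice is the direction of the defaults: a controller who does nothing still ends up with protective settings, which is exactly what Article 25(2) asks for.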
So the options when it comes to privacy by default are very limited. But now we have the question, how can a company comply with privacy by design if they are not the developer of the tool?
And they don't have the capability to use open source tools and customize them to fit their needs. Not every company can do that, from a personnel or headcount perspective,
and some also don't have enough money to just hire an external company to do the customization. So what can a company do, and how can it indirectly force cloud service providers and
proprietary tool providers to implement more privacy by design in their products? Because that is the only angle a controller has to change things. So what I would suggest is that every company thinks wisely before choosing a new tool,
meaning they have to invest in checking whether a tool provider meets the requirements of the general data protection regulation and also fulfills basic security needs.
and whether a tool provider allows default setting changes and things like that. So now you can see our tool review process, which looks pretty complicated but basically is not. Looking at the beginning, you can see the red dot here.
We receive a tool request, where we require quite some information: what is the name of the tool, who provides it, where is the provider seated, is it a cloud tool or can it also be hosted on premises, what is the purpose, what kind of data will be processed, and so on.
Based on that, we start two processes. First, we start the privacy review process and check whether a data processing agreement is required. That is always the case when the tool provider has access to personal data.
And that is already the case if they provide maintenance or hosting, so basically almost all the time when we're talking about software as a service. If so, we start a contract review and check whether the
documentation the provider has provided to us is compliant with the applicable laws. Basically, the general data protection regulation. Please remember that one, because I will get back to that at a later stage. If it meets the requirements, we are requesting signature and we would approve the privacy review.
Of course, that's now a very short version. Then it goes to the security review and in the security review, we will check all the documentation the provider is providing.
We will do our own research about whether the provider suffered from any vulnerabilities, data breaches in the past and check all information which is somehow available anywhere. Also checking forums and things like that. What kind of controls have they implemented and other things.
If everything is sufficient, the security review is also approved, and then the security and privacy review as a whole is approved. The respective team can then ask our operations team to implement and acquire the tool.
If not, let's take the privacy review as an example: if the terms are not compliant with the law, we start to renegotiate with the provider. Either the provider implements provisions compliant with the law, and then we
get back to the usual process and approve the privacy review. Or the provider does not; then we won't have an agreement and we won't check the security settings. We will just say the privacy review is rejected, and please check whether there is another tool.
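The review flow described above can be sketched as a small decision function. The checks themselves are manual in practice, so they appear here only as stubbed boolean inputs, and the outcome strings are my own simplification of the stages mentioned in the talk:

```python
def review_tool(request, privacy_ok, dpa_compliant, security_ok):
    """Simplified sketch of the security & privacy review flow.
    `request` holds the tool metadata (name, provider, hosting, purpose);
    the three booleans stand in for the real, manual checks."""
    # Privacy review: does the tool meet basic GDPR requirements at all?
    if not privacy_ok:
        return "privacy review rejected"
    # Contract review: is the data processing agreement compliant?
    if not dpa_compliant:
        # In practice this branch first triggers renegotiation with the provider.
        return "renegotiate terms with provider"
    # Security review: documentation, past breaches, implemented controls.
    if not security_ok:
        return "security review rejected"
    return "approved"

outcome = review_tool({"name": "ExampleChat", "cloud": True},
                      privacy_ok=True, dpa_compliant=True, security_ok=True)
```

Note the ordering: a failed privacy review short-circuits everything else, which mirrors the talk's point that a non-compliant provider never even reaches the security stage.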
You may now be wondering what all that has to do with privacy by design. There are two stages where we look a bit deeper than I just mentioned: the contract review and the security certification documentation. In both parts, we look deeply into what kind of measures the provider has implemented regarding privacy by design.
Are there any measures in the documentation? Can we see them in the certificates? We also start testing, as far as that is possible, whether privacy by design is actually there.
We also check how the default settings can be influenced, how much customization is possible, and things like that, asking our IT people for help, of course. Only if all of that is sufficient do we go for approval of the product.
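Checking the default settings is really a check for "privacy by default" (GDPR Art. 25(2)): the most privacy-protective value should be the one a user gets without doing anything. A minimal illustration, with entirely made-up setting names:

```python
# A minimal illustration of "privacy by default": the safest values are
# the defaults, and sharing more requires an explicit opt-in.
# All setting names here are invented for the example.
from dataclasses import dataclass

@dataclass
class ToolSettings:
    analytics_enabled: bool = False  # no tracking unless opted in
    profile_public: bool = False     # profiles start private
    retention_days: int = 30         # shortest retention the tool supports

def defaults_are_privacy_friendly(s: ToolSettings) -> bool:
    """True if the settings do not share or keep more than necessary."""
    return (not s.analytics_enabled
            and not s.profile_public
            and s.retention_days <= 30)
```

A reviewer would then ask whether `ToolSettings()` as shipped, with no user action, passes such a check.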
We always tell the teams why we have rejected a tool, in particular when privacy by design has not been implemented sufficiently. And we ask the teams either to get back to the tool provider themselves and give
them the feedback, or we help with that and provide the feedback to the provider ourselves. Of course, when we are talking about the really big players, they do not really care. But as soon as the company is not one of the large giants, they are actually interested.
At least that is what I have observed. They ask: why? What exactly is your problem? What can we do differently so that you will use us? So at some point they are flexible.
So I have the feeling that the feedback really makes a difference. In addition, if every company did the same and simply rejected tools that do not provide sufficient measures to implement privacy by design and by default, nobody would use non-compliant tools,
and the providers would indirectly change their habits. You may remember when the GDPR was introduced two years ago. There were so many issues around privacy and non-European players, but they changed a great many things just to become compliant with the GDPR.
The companies here in Europe were scared of being punished with fines and told their providers: we can't act this way anymore, you have to change that. And they actually did, slowly but surely. So it can have an impact.
So I can highly recommend ensuring that only tools are selected which comply with privacy by design. And here again, the design aspects we look at are
the basic principles: the data minimization principle, which I already mentioned in connection with the data protection principles; then data should be hidden, in particular pseudonymized or anonymized, where possible; we check whether the provider separates data and whether data can be aggregated,
meaning that data would be grouped; whether the information policy is very transparent about what is actually processed and collected; then the control over the default settings, which I already mentioned a couple of times;
what kind of policies the provider has implemented to comply with the GDPR; and also how they demonstrate compliance in their documentation, and things like that. If you are interested in that, feel free to check out the ENISA privacy and data protection by design principles from 2015.
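The checks just listed map closely onto the eight privacy design strategies described in that ENISA report (minimise, hide, separate, aggregate, inform, control, enforce, demonstrate). As a sketch, a tool review could be reduced to a simple checklist; the dictionary and function names are my own invention, not part of the report:

```python
# The eight privacy design strategies from the ENISA 2015 report,
# each with a one-line reading of the corresponding check in the talk.
DESIGN_STRATEGIES = {
    "minimise":    "only data needed for the purpose is collected",
    "hide":        "data is pseudonymized or anonymized where possible",
    "separate":    "data is kept in separate compartments",
    "aggregate":   "data is grouped at the least detailed level that works",
    "inform":      "the policy is transparent about what is processed",
    "control":     "default settings can be influenced and customized",
    "enforce":     "internal policies to comply with the GDPR exist",
    "demonstrate": "compliance is shown in documentation and certificates",
}

def privacy_by_design_ok(review: dict) -> bool:
    """A tool passes only if every strategy was found to be satisfied."""
    return all(review.get(strategy, False) for strategy in DESIGN_STRATEGIES)
```

This mirrors the process above: a single missing strategy is enough for a rejection.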
But they are still very relevant and up to date, I would say. The link is provided at the bottom. So that's it. We have four minutes left, and I would like to open the floor for any kind of questions.
Yeah, thank you very much for your very interesting talk. If you have questions, please write them in our chat so that I can unmute you and you can ask them. Or, if you don't feel confident asking them yourself, I can read them out loud so we can all hear them.
It doesn't look like anybody has any questions. Or can you see anything, Judith?
No, I can't see anything. So either we made up for the time we lost at the beginning, which would also be a good thing, or everybody is too shy, or everything is clear. Or I confused everybody too much with all my talking.
Yeah, I hope it's not the last one. And I still don't see any questions. There is one: "The problem is more the data retention; third parties are another problem vector. Self-hosting services is a way towards privacy by design.
What do you think about the shady market around seal info?" What exactly do you mean by seal info? Ah, now it makes sense. Okay, yeah. What do I think about that? I have a very clear opinion on that.
I have always told everybody: stay out of that, don't buy any of this kind of data. And if I can see a pattern, whether from the press, from any hints, or from their policies (usually you can already tell from the very vague wording in their policies),
I always say no, don't use this provider, go away. That leads to huge discussions, of course, but yeah. So I can't support it in any way. "So you are saying the problem is not the developer, but the bad enterprise policy, capitalism?"
Yeah, you're right. I don't want to blame the developers themselves. When I said developer, I meant the producing company behind them, telling its developers how to do things. So I fully agree, it's not bad intention on the developers' part; rather the opposite.
What I can see from our own developers is that they are even more privacy-conscious than I am. That is, first of all, very surprising, but also a great thing. But you're right, it's capitalism.
Yeah, thank you very much for answering these questions. I thought there was enough time for another question, but I still don't see any. Oh, there is one. Do you want to read it, or should I?
I can. "How does your company implement the GDPR requirements in product development?" Yes. So we have introduced a process similar to the tool review process I showed you. In general, whenever we start to develop anything, the data protection officer,
so myself in person, either delegating to the team or doing it on my own, is included. We are included from the first idea through the first prototype,
structuring the data flows, and then on to the first beta version and so on. So we are actually involved at all stages; we basically do privacy by design. You're welcome.
So maybe there is one last question. I think we can wait half a minute; we have the time, since we started a little bit late.
But yes, it doesn't look like there are any more questions. And I hope a question doesn't pop up right now, after I've said that.
So I would like to thank you for your talk. Thank you very much. And we would be happy to see you in one of our community rooms. We wish you all a nice FrOSCon.
Thank you very much, everyone, for listening, and also to you as the host, for making this happen and for helping me so much in advance.