Who will control Big Data, the currency of the digital age?

Formal Metadata

Title: Who will control Big Data, the currency of the digital age?
Number of Parts: 21
License: CC Attribution - ShareAlike 3.0 Germany: You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose, as long as the work is attributed to the author in the manner specified by the author or licensor, and the work or content is shared, also in adapted form, only under the conditions of this license.

Content Metadata

Abstract
Facebook and Google suck up our personal data and turn it into fake news, engagement algorithms and “psychographic messaging” fed by Russian bots and digital stealth operatives trolling for Brexit and Donald Trump's election. A battle is looming over who will control this sea of data and artificial intelligence, with the needs of the public interest and private, Internet-based businesses often in conflict. What can be done to promote innovative policy approaches about the use of Big Data, and to develop a social economy for the digital age?
Transcript: English (auto-generated)
Welcome to this afternoon session on the data economy and the consequences we will draw from it. This session will be in English. As we have just discussed, we have some very German words, so we may not translate everything correctly, but since the audience is mainly German, you will understand what we are trying to say. It is also not a scientific panel; we are a very diverse group today, and so our aim is to bring multiple perspectives together on this topic.
You can join in with your perspective or your expertise in the last 15 minutes, when we will open the room microphones to the audience, and you can join the discussion. So I'm very honoured to be here today, since the real host of this event is Stephen Hill
from San Francisco, who brought us all together. Welcome, Stephen. And one panellist who is still in the programme is missing, since he injured his leg in Sweden. We will miss Malte Spitz today, and I promised him to bring some of his arguments into our debate, so, wherever you are: "A new commodity spawns a lucrative, fast-growing industry, prompting antitrust regulators to step in to restrain those who control its flow. A century ago, the resource in question was oil. Now similar concerns are being raised by the giants that deal in data."
Data is often referred to as the oil of the 21st century. This was Malte's opening argument, which I just brought in, and it is also kind of the unifying metaphor for what we try to bring together today in this panel about the rise of the data economy: how it is designed, who holds the power, how it impacts us as a society, our economy, labour, and not least ourselves. How can we shape it, legislate it, or regulate it?
Do we have to? And overall, how can it enable us in a way that serves the public interest and invests in the common good, as well as companies, for a better future? Facebook, as you read yesterday, I guess, just announced some new standards on data protection and, at the same time, introduced a new data-harvesting tool for matching people's love destiny. There's one thing: you cannot date your friends in this new system, which I think is very interesting, but the stock market cheers. The scandals of the last week seem almost forgotten, and this is just one example. We will come up with several different examples that highlight this power.
This morning, Chelsea Manning said about artificial intelligence and machine learning: it's not a hype, but it is dangerous, and we have to discuss it. And that's why we are here. Americans often use the phrase in their politics about the balance of power, which matches very well, because we do need some checks and balances for the rising data economy and the use of algorithms and big data. What are these challenges, and how can we face them? That is a key question for all of us here who represent, in different forms, civil society in Germany. Claiming the motto of this year's re:publica, we are, or we have to be, some kind of popular pressure group for data fairness. This is a cause we also want to support as a side foundation, and that's why I'm honoured to be here in the role of your moderator today.
But you are the stars of this panel, and I would like to introduce all of you first, and then we will start the discussion with a brief opening impulse from Steve. But from the very left, or right from your perspective, there's Lena Ulbricht.
Lena is a post-doc researcher at the new Weizenbaum Institute for the Networked Society here in Berlin, at the WZB Social Science Centre. She obtained a PhD in political science, and her research focuses on the regulation of technology, among many other things, but that's why you are here today.
Thank you, Lena. Then, new on the panel, you're not Malte Spitz, you're Matthias Spielkamp. A very warm welcome to Matthias, co-founder of AlgorithmWatch, a non-profit advocacy and research organisation on algorithmic decision-making.
And, of course, one of the co-founders of iRights.info, which won the Grimme Online Award in 2006, a member of the IGF and several other committees, and a senior fellow at the side foundation at the moment.
Thank you for being here and jumping in. And we have Annette Muehlberg, who works for the United Services Union, Verdi, a trade union, as I just learned. She is head of the working group on digitalisation there, and her focus is on digital labour, net politics, e-government and the common good. And recently, she became a member of the Platform Cooperativism Consortium, a discussion board on new forms of ownership, which you will talk about a little bit more later.
Annette, welcome to the panel. And, of course, Stephen, who is more than just being from California: he's a political journalist and lecturer on both sides of the Atlantic, and that's why he's our sparring partner today, to see the challenges of both worlds coming together here, which he also does in his current research at the WZB Social Science Centre, as journalist-in-residence. And he's the author of several books on this topic, the last one also published in German: Die Start-up-Illusion: Wie die Internet-Ökonomie unseren Sozialstaat ruiniert (The Start-up Illusion: How the Internet Economy Is Ruining Our Welfare State). He publishes in many magazines and newspapers all around the world. I will not mention them all; you can read them, and you can read his new book. And welcome, Stephen, and thank you for the idea for this great panel.
So this was mainly my part. Now I hand over to Stephen to give us a short introduction. As we discussed, and as you read in the introduction on the web page, he will talk about the challenges he perceives from Silicon Valley and its companies, and how they are challenging us in Europe. Stephen.
Great. Are we working? Are we live? Thank you, Daniel. It's a great pleasure to be here, and to my fellow panellists, it's going to be a great panel,
because these people all have amazing things to say about this subject. So, you know, as a journalist based in Silicon Valley, I see the pros and the cons on a daily basis: the moonshots to nowhere, the companies that get launched and never amount to anything, and then those that go from small little start-ups in the room of a Harvard dropout, Mark Zuckerberg, and balloon into something like Facebook. So you see it all, and I think that a lot of our perspectives are shifting quite quickly as some of these companies do go from small start-ups to what are approaching,
if not already have reached, monopolies. And in fact, they are in some ways redefining what we think of as monopolies in the world today. And so you basically have the norms and the rules of the digital age that are being established by Silicon Valley and by China and the black box of China.
And so it raises a question for myself as someone who goes back and forth across the Atlantic, what is the German and what is the European alternative to the Silicon Valley and the China rules of the Internet and our future, really?
Is it going to just simply be one of tweaking Silicon Valley a little bit when it steps out of line a little too much? Or is it going to be a unique German and European alternative? President Macron from France is talking about injecting European values
into the development of artificial intelligence. What does that mean? What are the European values? Something to do with social values, something to do with political economy that in the United States, Donald Trump doesn't care about at all. And in fact, Silicon Valley and Wall Street don't really care about it all.
Right now in Silicon Valley, there's a little bit of an opening for discussing the social impacts of what they're creating there. Even some of the Ayn Rand, the most neoliberal CEOs are starting to say like, maybe we've gone too far in this disruption stage.
So what would the European contribution look like? That's kind of what I'm interested in. And the side question to that is, who's going to control the data of the 21st century? Because the more and more that I look at it, that is really going to be the key question. Who controls the data? And I'm not just talking about our personal data
and what Facebook does or doesn't do with it, or whether it's used in some way for advertising, or for fake news, or, you know, to manipulate elections. That's important stuff, but it doesn't end there. It's also a matter of more and more workers working online for labor platforms.
So there's a platform in Silicon Valley called Upwork, and they claim they have 10 million freelancers all over the world, including tens of thousands of them right here in Germany. And you know, when they hire a German, they don't pay into the social security system on behalf of that worker. And the worker may not be paying either. The worker may not even be paying income tax
because the German government doesn't necessarily know about that relationship between these international online labor platforms and the worker. So if people, as this becomes the future of work and more and more people are gaining access to work through this, that could threaten the tax funding for the welfare state.
So that's another use of it. You know, data in terms of, for example, Airbnb. The city of Berlin passed a law to try and, you know, not crack down on Airbnb, not get rid of it, just to rein it in a little bit. And the law has completely failed.
A year later, the number of hosts that are renting out entire apartments, which is sort of the indicator for professionals who are renting out a place 365 days a year, not just a spare room every now and then, has gone up by 45 to 50 percent in the year since that law went into place.
So you realize that without the data from this company, you really can't regulate it. And a lot of these online platforms, they sort of exist everywhere and nowhere. You know, if you think about, say, Ford Motor Company. Ford Motor Company wanted to come to Germany and set up an auto plant, right?
They have to get a whole bunch of licenses and permits. And they have to get permission. And they can't just set it up and just do whatever they want. And if they don't follow the licensing permits, they can be fined. They can get in a lot of trouble. They can have their license revoked and shut down. Look what happened to Volkswagen in the Dieselgate scandal.
They got in a lot of trouble because they broke the rules. Well, with these online platform companies, you know, let's say, I mean, the European Commission fined Google $2.4 billion for manipulating search results. So when you get on Google and you search, they were steering you to companies and results that they wanted you to see, not necessarily just an open, you know, open platform in that way.
So they fined them for that. Well, Google, if they wanted to, could say, look, we're not paying your fine. We can take our servers and we can move them out of Germany. We can move them out of Europe. We can move them to an island off the coast of Germany in international waters. And there's nothing you can do to regulate us.
And we would still be able to, you know, people in Germany would still be able to access our servers. So these types of companies present challenges. So the question is, what do we do about it? Do we pass new regulations? The General Data Protection Regulation, for example, is a start towards that.
Do we let these companies self-regulate? Of course, they've promised up and down. If any of you saw Mark Zuckerberg in the congressional hearings, he said, oh, yes, you know, we're going to do better. We promise, we're sorry. We've heard this before from Mark Zuckerberg. Some of the proposals that have been put out there are things like regulate these companies as public utilities, you know.
I mean, Facebook is not a small social networking platform anymore. It has two billion users. It's a media platform, the largest media platform in the world, much larger than any European or German media company or any American media company. In many ways, it is replacing television for a lot of people
as their primary source of news, entertainment, and these sorts of things. So should we just let Facebook self-regulate? Or should we pass regulations to tell Facebook how we want it to work? What would those regulations look like? And others have proposed, as I said, public utilities, other proposed,
break up these platforms as monopolies and turn them into smaller companies. That's being discussed among certain people in the United States right now. Should we have a collaborative multi-European state organization for development of AI,
sort of like an Airbus for AI development or CERN for AI development? And so that way you can feel confident that the data sets that are being produced for developing artificial intelligence are being used for the public good and are being used in a proper way, and not in a way that's going to potentially damage you as an individual.
Because artificial intelligence takes a lot of data to develop. I mean, AI is basically machines looking for patterns in huge amounts of data. And that's how it figures out what to do, from the formulas it's told to apply to huge sources of data.
Well, if Europe can't trust the source of this data, are you going to be willing to allow your data to be used in this way to develop AI? That's a big question. I mean, will the General Data Protection Regulation, which is going to put a lot of rules around data protection,
is that going to hurt Europe's ability to have enough data available to develop artificial intelligence? That's a real question that has to be thought through. Should we become data shareholders of our own personal data? So one of the proposals coming out of Silicon Valley
is that we will sell them our data. Each one of us as individuals will own our own data and sell it to them, and we'll be able to, quote, monetize it. Is that the way to go? I mean, how much will you as an individual really be able to negotiate a good deal for your data with these big, huge companies? Not very much. The amount you'll get from it will be probably pretty small.
Would it be better to have a conception of our data as social data? As I said, it's needed to use for the development of artificial intelligence. So if we conceive of it as social data rather than as personal individual data, I mean, is conceiving it as personal individual data just like the extreme end of neoliberal ideology
instead of conceiving it as social data? So this is another question that we really have to talk about. So anyway, there's other ideas that we'll get into in the panel, things like digital licenses, digital borders that are being discussed. Essentially, it's a whole new world. We have to figure out a way either we either let these companies do what they want,
and we see some of the results of that already. And keep in mind, we're just at the beginning of the development of these technologies, and they're just going to get more powerful and more pervasive throughout our societies. Or do we take steps now to start thinking about how to harness the power of artificial intelligence, harness the power of the products that these companies are producing
without getting so many of the downsides? That's really the challenge before us. And so I look forward to joining with all of you during Q&A and my fellow panelists as we debate and discuss these ideas. Thank you. Thank you, Stephen.
Lots of perceptions, lots of ideas, lots of concepts. Before we dig into these concepts, I would like to ask you, Lena, as an opening: as Stephen says, it's a whole new world. How do you approach this whole new world from a scientific point of view? Or, to put it more basically, do we have enough data about these trends to deal with them in a scientifically grounded way?
Okay, there are two remarks I want to make as an answer. The first is that, from a social scientist's perspective, much of the debate that we're having about the potentially positive and negative effects
of datafication, of big data, are not more than plausible theories. We know very, very little about the impact of big data practices on individuals, on groups, on vulnerable groups, on societies as a whole.
We know very little about medium-term impact and long-term impact. This is due to many reasons. First of all, we lack access to the whole system of practices based on big data. We do not have access to the data that is being used.
We know very little about the algorithms that are being used. But more importantly, we know very little about the whole context where these practices take place. Who uses this kind of analysis? With what aim? Who takes part in this? And again, what is the impact on individuals, groups, etc.? So this is one point from a scientist's perspective that we need to raise again and again,
is that much of what we hear in public discussions about the threats or about the chances of big data are plausible, but we do not know much about it. Another point I want to raise from a social science perspective is that
you've rightly addressed many, many problems or challenges. And there's not one regulation for everything. Societies are very complex, and regulation is complex. So when we discuss all these problems, it is important to keep in mind that we have very different regulatory approaches for different risks and for different problems.
So some of the problems we have due to big data concern privacy, right? Another problem concerns discrimination against vulnerable groups or protected groups. Another risk is manipulation.
Another risk is disinformation. Another risk is simply to increase social inequalities that are already there. Another problem is the one of big monopolies, and so on and so on. So it is very important to keep in mind that we should not look for one regulation or for one approach, but there are many available.
And the way I view the discussion about the regulation of big data is that it is not true that there are not enough ideas around. When you study different proposals for how we can react to these problems, you will realize that there are more regulatory suggestions and ideas
than you can possibly think about. We have a lot. So what we lack is the will to actually regulate. And what we lack is also, as I said, knowledge about how different regulation can impact the system, how successful it can be.
So what is important on the one hand is to try to understand why, despite having many ideas of how to break up Facebook or of how to protect privacy or citizens' rights, why many of these proposals are not even discussed or discussed in very small circles.
And as a last point, answering this question of what could be a European approach, I want to stress that from a social science perspective, we realize that much of the regulation that is being introduced right now, for example, with the GDPR and much of the debate centers very much
on the rights of the individual, for example, the right to explanation or the rules for consent given to data collection and processing. But we have a long, long history of collective power. And this is something that I would really like to see more of in our debates and our regulation: to think about how organizations, civil society groups, NGOs, labor unions, the churches and various state institutions can come into the game
so that we do not have a discussion that centers merely on the relationship between the user and Facebook with all its incredibly big power asymmetries, but that we think more about how intermediate groups and civil society organizations that already exist and they have some kind of power that is superior
to the power of the individual user can be harnessed for such a vision. Just to give a very brief, concrete example, in Germany, we have recently introduced the possibility of class action lawsuits in the area of data protection.
That means that it is not the responsibility, not only the responsibility, of the individual to bring a lawsuit against a company such as Facebook; an NGO or a federation can take on this process, because we know that for individuals it is very difficult and very costly to engage in a lawsuit. So this is an attempt to try to harness the power of collective organizations, and I think there are many ways we can go forward in this direction. Thank you, Lena.
Let's just switch to Annette. You represent one of these interest groups Lena just mentioned. How is Verdi, yourself, perceiving the challenges that Stephen just mentioned and which is maybe the most important and how can we address them maybe with this kind of European perspective in your work?
Okay, lots and lots of challenges. As trade unionists, we are citizens and workers and I think both perspectives are really important and I'm trying to pick just two issues out of this bunch of issues.
What are the most important issues right now? I think talking about civil rights, talking about human dignity, talking about the risk of the loss of privacy, which is also a threat to the freedom of expression,
to the freedom of opinion. I think there is a democratic issue here and there is a structural issue of working world, which is a different issue. There is a different issue of new dependencies. The issue of safety is another issue.
So I think we should make it a little bit more clear. Do we talk about AI and automated decision processes that are based on big data and we have no clue what comes out of that decision.
It's rather a game. Or do we talk about general digitalisation and the use of data and who owns data and how do algorithms work and all this. I think we should make this a little bit more clear. As a trade union, we also fight for democratic rights,
democratic rights for society and for the working world and big data is a big challenge for both. We want to use the digitalisation in the working world
to make the working world better, to make it healthier, to make it more enjoyable, to make it better paid, more democratic, more co-determined, less precarious and more meaningful. There are many chances, and just two weeks ago we organised a conference on the issue of common goods in the digital world and how we can contribute to this as workers. So meaningfulness means that work is more satisfying when you actually work for something that contributes to the common good.
And so we have a look at what are the special challenges and chances created by big data. For example, we can look at water supply. Just a few years ago, water supply was connected to the internet which is a real danger and they stopped that.
They look for more technological sovereignty on a local level, on a national level, on a transnational level, so we have to check the technology, we have to have informed workers but we also have to have this connection to society as such.
So I think looking at both sides, we have a democratic part, we do not want to live in a, it just has been announced, the next speech is going to be about the credit score system in China which is a huge threat for society as such
as well as for the working world. We do not want to have a system established of total surveillance, so we have to make sure that this does not take place in the working world and we want to make sure that it does not take place in society as such
but we have to make some decisions about how we deal with these issues. Therefore, to narrow it down a little bit, I think we should focus on the issue of democracy on the one hand, in society and the working world, and on the issue of security, because I think it's a huge threat to security if we have automated decisions on a data basis we cannot control.
That's the perfect cue for Matthias. But before I hand over to Matthias, I think you mentioned one very important thing: the challenges that Steve mentioned are on very different levels. We have the antitrust question: what if a company gathers so much data through network effects that it has an advantage nobody else can keep up with? We have the personal data question, and we have the AI question of what comes out of using the data; maybe you can integrate these in your opening statement. Matthias, you can help us frame these different levels a little bit and introduce what AlgorithmWatch is concentrating on.
Well, integrating all of this, I will probably not be able to achieve that, but I was going to comment on a couple of things that were already said on the panel, and one of these things is that there's been a lot of discussion
about the European aspect in all of this, and the European values, and Macron in his interview referred to the European values, and so far I have difficulty seeing anyone spelling out what these European values are. I've never heard anyone say, what exactly is it that we are talking about?
Is it individual liberty? I thought the US were making a big deal out of that. Is it freedom? Is it equality? Probably that equality thing is getting us somewhere because the United States as a society seemingly is willing to accept a lot more inequality than, for example, many European countries do.
So is that something that we could work with? And if so, then we would, of course, have to spell out what it means in dealing with the challenges we are facing. But at the same time, I really disagree with the assessment that there is an unwillingness, for example, to regulate.
We've had a lot of regulation in the last 10 to 15 years that was concerned with the challenges of digitization, and there are many examples of badly done regulation. We've been talking about the Google tax here in Germany.
I mean, that's what it's termed internationally, because "Leistungsschutzrecht für Presseverlage" is not really something that translates well into English, right? So this is not a law that anyone in Germany, from the academic point of view, from the civil society point of view, from the economist point of view,
was really in favor of. Still, the government enacted it, and now the European Union is trying to put it on the European level. That is a hurdle to the development of ideas in Europe that deal, for example, with new approaches to journalism
that could be quite useful in combating the monopolistic tendencies that we see in companies like Facebook or probably Google. But these companies are, first of all, usually very willing and able to comply with these laws,
because they have tons of lawyers and even more money to throw at these issues, in comparison to many small and medium-sized companies in Europe. So what I think is that we need to, and that was already said, look at this entire issue on a very differentiated level.
And that is difficult, because everyone is expecting answers to the big questions. Is Facebook destroying democracy? Well, if you ask me, no, not really, at least not in the next two to three years, right? I mean, democracy has been here for quite a while, and there are challenges to democracy,
and there have always been challenges to democracy, probably starker ones than we are facing right now. But yes, of course we need to look at this, and suggestions like defining these companies more as public utilities or infrastructure are, in our opinion at AlgorithmWatch, much more...
I don't want to call it realistic, you know, because it's quite a long way before we can achieve that, but they are more pertinent and actually address the problems that we're having. Because if you, for example, argue, as many people do, that Facebook is a media company,
we would disagree. We would say, no, Facebook is not a media company. Facebook is not just a technological tool that distributes information, because we all know that the information there is curated, and that is a very powerful function that Facebook exerts, but at the same time, it's not a media company.
It's not creating content the way the New York Times or Spiegel Online is creating content, right? And as a matter of fact, we don't want them to get the rights of a media company. I mean, it's not just about the liabilities and the duties. It's also about the rights of media companies that we do not want to assign to a company like Facebook. So, you know, we're always looking at this huge question
of how to regulate this whole field of digitization, and then, when we look at it more closely, we realize that we have to take a more micro-level perspective to be able to address the problems. I mean, as Lena already said, we've had regulation of algorithms and digitization in many, many different sectors,
from the financial system to, for example, public utilities and the medical sector. What other options or measures do we have beyond, or as an alternative to, regulation? What can we talk about in the next 30 minutes? Well, there is regulation beyond laws,
because laws in a democratic society should be the last resort, you know? We should only enact laws if we have no other way to mitigate a situation or to solve a problem that we see as a society. So if we look at different kinds of regulation,
we have technological standards. There's a lot of standard-setting activity around the world; the IEEE and the German DIN, the standards organization in Germany, are looking at standardizing AI procedures. Will we be able to also figure in ethical concerns there?
Well, not in the sense that we'll have ethical machines, because there is no such thing as an ethical machine. But will we be able, by using these processes, to raise awareness that there needs to be ethical guidance in designing those procedures,
in a way that we will not, in the end, have unduly discriminating procedures, procedures that take away all the money from the have-nots and redistribute it to the rich people of the world? Yeah, right.
I mean, this is what we are trying to achieve, but as I said, before we make any laws, especially not laws like the ones we've seen in the last couple of years, we should find other paths to success. Just one follow-up. You had a conversation about this with Stephen in Die Zeit
last fall, about algorithms, and you answered his point, very condensed: rather than regulating or forbidding certain technologies, we need to enable or empower people to use them better. Could you elaborate a little bit on this idea?
Yeah, well, you know, we just had (I'm not advertising this) a new essay by my colleague Lorena on our website where she gives the example of regulating the car. I mean, the car is a technology that's been around for quite a while, right? To us, it's completely normal the way
that traffic functions and so on and so forth. It wasn't like that at the beginning. There were tremendous fears associated with cars. There were people assigned to walk in front of cars with a red flag so that the car would not run someone over, right? And it took a while to grapple with this new challenge
and to make sense of it. And then, in the end, we made regulation that said: you can drive a Porsche instead of a Volkswagen Beetle, but that doesn't mean you can go 150 miles per hour in town, right? You have to abide by the same rules as anyone else,
no matter what technology you're using, right? So we were not limiting technology in that sense. We were giving guidelines to people as a society. We were saying, this is how you need to use this technology. And this is exactly how we need to deal with the technology that we are confronted with now.
This is not so much about regulating algorithms. It's about regulating how we as a society can employ, and in some cases must not employ, these technologies. And what this means in a specific case we can only discuss with a specific example, not in general. You know, let me just jump in real quick.
I mean, I agree with much of what you say, Matthias, except I also disagree on a rather fundamental point. And it comes down to: is this time fundamentally different from when we invented the car? I would assert that it is, in a number of ways, but the one I'm thinking of right now is that in the example your colleague used
of developing cars, the cars were right there. The companies producing them were right there. The government overseeing that development had the ability to monitor its domestic market. And at some point, when it decided something about the domestic market wasn't working right,
it was easy for it to correct that. With the companies being produced now out of Silicon Valley, and out of China, we don't even know what China is at some point going to come rolling into the global markets with, whatever they're developing there. These companies, as I said, exist everywhere and nowhere.
The ability of a nation like Germany or a union like the EU to monitor and regulate its domestic markets is under threat, because these companies, as I said, can do what they want. You can try to regulate them, and they can pay attention if they want, or they can say: sorry, we're not going to do that.
I mean, you know, right now Silicon Valley has decided to sort of go along with Europe. Yeah, Europe, they like to regulate us more than the United States does. Okay, you know, it's a huge market, we don't want a big fight with them right now, we'll go along with things like the GDPR, with the 2.4 billion fine, because for Google, that's nothing.
And with the fine for Apple in Ireland, because that was ultimately nothing for Apple. But they could just as easily say: we're not going to follow any of your rules and any of your laws. And what would Europe do at that point? Annette first, or just a short answer? Yeah, maybe I can jump in,
because I want to stress the point that it's very important not to have to choose between different kinds of regulation, be it through ethical standards, through laws, through digital self-defense, through professional codes, et cetera. So when I said before
that what we lack is not ideas for regulation but implementation and re-regulation, I mean exactly this. We have first attempts to regulate, but we might realize that they solve just a tiny fraction of the problem. And we might realize that Facebook sometimes complies and sometimes doesn't. But it's not that our toolkit is exhausted.
There are so many more things that you can do. You could prevent Facebook from operating on the European market. The thing is that nobody at the moment dares to do that, and that policymakers and the data protection authorities that have the obligation,
or at least the task, to exercise control lack staff and lack resources. So the laws we have are not even really implemented. And I think this will go on with the new GDPR. The way it will actually be implemented
is going to be negotiated, and these negotiations are starting right now. So we still don't know how the GDPR is going to change the picture, right? But there are so many more things that you could do. The thing is that decision makers are really reluctant to do so. Why? Because they're kind of afraid that Facebook and other companies
will actually leave the European market. Political parties and bureaucracies use Facebook a lot. And another thing is that, at least until recently, there was some kind of belief that the US market is a model. Everybody wanted to have their own Silicon Valley.
So this is probably what you mean when you say we need new visions, but until recently there was really this idea that what we need is more entrepreneurs and more startups, that they should somehow look like Silicon Valley, and that we need more of the Silicon Valley mentality. So when you see it that way, you will not throw out the leaders of the market.
Another thing is that this whole problem of companies leaving the market could create unemployment. These are very common narratives
that have been used to justify not regulating big companies, and you see that with Facebook and Google too. So I think it's not that politicians or lawmakers are powerless; they just choose not to really use their power. And it doesn't help that the European Union
is not a real team, right? There's no real ambition. Now a comment. Annette, it's your turn. But first, just one simple question to the audience: who of you would like to see Facebook, Twitter, and Google banned from the European market? Hands up. Good question. Is there an alternative?
Well, there isn't, at the moment. I saw about five hands, okay? Wait, wait, wait. Now, Annette, your turn. No, certainly not Stephen, now it's me. I think we can work on alternatives, and we should work on alternatives.
Facebook has been violating German and European law since its existence. So there should have been much more law enforcement, and more standing up to demand standards and to see them respected. And if that does not work,
and as we can see that social media is a global communication tool, something like a global communication market, then we have to promote other alternatives. That does not mean it has to be state-run
or something like this, but alternatives can be promoted, they can get financial help, and there have to be criteria for it. So we have to talk about design. We have to talk about technical design, and we have to talk about ethical design,
and I think there are many chances for what we can actually do. We should decide which infrastructures we really need, and where there is a lack of an alternative, we have to look for an alternative and we have to create an alternative.
The other part is a legal question. Talking about what should be integrated and what could be done differently, this is something where we should focus
on our ethical standards, like privacy, like the possibility of co-determination. And as a design question, you could say that it has to be a technology that is open,
with open standards, a technology that you can still shape, that you can control. At the workplace, for example, we have a legal right of co-decision if a technology can be used for surveillance,
so we have to make sure that we can analyse whether a tool can be used for surveillance. This is also a design question, so it's an organisational question and it's a design question. And that is the point. Thank you so much. Before we go on,
we have a time problem, because I promised you that we would let you join in. We have seven minutes left. Are there questions, very important questions, we have to take in? Maybe just a very short and direct question,
because I want to leave time for a last closing round. We always try to run behind Silicon Valley; with regulation it will be the same thing. We will never reach them, they will always be ahead. So we should rather look into something where we could take the lead,
and that could be that we look into the missing layer of the internet, the identity layer, so that we would work on the technologies which help create this layer that is missing at the moment, and this would also help
in bypassing all these roadblocks from the big companies. What do you think about that? Real quick: when you think of Silicon Valley, I've heard so many Germans in particular say, how do we copy Silicon Valley?
You cannot copy Silicon Valley. When you think of Silicon Valley, the first thing you should think of is military contracts, because for decades that has been the basis of Silicon Valley's success, and from there they get venture capital funding that they can throw at different ideas. Seven out of ten startups in Silicon Valley fail; nine out of ten never make any money.
It's basically a casino, gambling on companies most of which are going to fail, but you have that stable military funding, so it continues. So Germany and Europe have to find a completely different way, unless you're going to really boost your military spending and set up something like a DARPA for Germany, which I don't see happening.
The challenge for me is, as I said, that these companies are fundamentally different from traditional brick-and-mortar companies. They can set up anywhere in the world and they can beam anywhere in the world. The only country that has really stopped Facebook and Google from doing what they do is China. They shut them out,
which brings us to Matthias's question, and as a result, Chinese versions of Facebook and Google have now started, and they're Fortune 500 companies. I mean, I look at Europe, where you have Airbus, for example. You have CERN. You have these multi-state international collaborative efforts to do really high-quality science and research,
and it seems to me that this is the direction for Europe and for Germany: creating a kind of Airbus for artificial intelligence development, for algorithm development, where you're sharing resources. It makes no sense for Germany to be competing against France when it comes to AI development, and yet, if you're not careful, that's where things are going to go.
The other thing that I'll just throw out quickly is that we need to create what I call digital licenses. As I said, the Ford Motor Company has to sign up to licenses and permits when it comes here. The companies coming to Germany and Europe should also have to sign on to a digital license that says: here are the rules, here is what you may
and may not do with our data. Maybe we want a copy of the data, because we want to be able to create public-interest data sets for AI development. So, Facebook, we want a copy of all your data. There are lots of things you can put into these digital licenses, and that could make them enforceable in the way that China has enforced its rules on Facebook. So, to sum up, before I give the last round:
we are lacking some kind of consensus about regulation, we are lacking enforcement of existing regulations, and we are lacking working alternatives. What do we need? Do we need something like what Stephen just proposed, or what is your vision,
your positive vision, for us in Germany and Europe, or maybe universally? Okay, the positive vision is to orient the design process towards human dignity, to take care of privacy and taxation while doing this design process, and to create new commons.
I think there is a big chance. We have an open government, open data law, but this open data law lacks criteria on what is going to be collected. It lacks clear rules about privacy, about what to use and what not to use,
and it has no rules whatsoever for the private sector, which tends to gain more and more data that is really important for the public. So there is a new initiative right now at the EU level, at least a small start, a new directive to obtain data
from the private sector too, and I think it makes total sense to have new public commons of data. Thank you. Some people might be surprised if I say I completely agree with what Annette just said, because I don't think there is a juxtaposition with what I said before,
because I do think that there needs to be a lot more initiative from the public sector in Europe to help design solutions that we like from a value point of view, and this can be based on data that we share and on data that we force others to share.
But we need grounds for that, and this is the challenge we are facing right now. I'm a little uneasy, Stephen, I have to admit, with the example of China being able to keep out the American companies and design their own alternatives. I don't think that's the way we want to go, but yes, they were successful.
We'll need to find some other option, and it would hopefully look very different than in China, right? Lena, you have the last word. So my answer will be very brief: I would never go for either-or. I think we need everything.
I think we need attempts to somehow break up Facebook, rein it in, implement our laws, find new laws; we need professional codes, we need ethics, we need individual self-defense. We need all of these, and I think we should go forward and not try to play one off against the other.
Thank you so much. Dear audience, I think there's a lot to do, and we know what to do, maybe. Thank you for all these impressions.