
Man versus machine: Who controls the game?


Formal Metadata

Title: Man versus machine: Who controls the game?
Number of Parts: 234
License: CC Attribution - ShareAlike 3.0 Germany: You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor and the work or content is shared, also in adapted form, only under the conditions of this license.

Content Metadata

Abstract: 2016 will be remembered for painful disillusions, featuring technology as an important agent, but for once not a positive one. A critical approach to what can be achieved with it, speaking of political and social changes, exploded and moved the mainstream debate to the other extreme. Media around the globe did not think twice before blaming 'algorithms' and 'data' for major political failures. As if it were technology driving us, not the other way round… So who is responsible, human or artificial intelligence? Can humans regain control in this game? Let's talk!
Transcript (English, auto-generated)
Hello. You may also know me as Katarzyna Szymielewicz, I know it's difficult, and I work at the Panoptykon Foundation in Poland, which is the NGO I founded eight years ago. So for the last eight years, I was often the one out there in public who was more critical about technology than the
mainstream, who used to see potential dangers in how technology shapes our lives, who often said that it is far from neutral. But I'm also not one to blame technology for our failures in political, economic or social life. So when the terrible 2016 happened, and many people got disappointed, mostly with the results of political processes, and the thinking about technology moved from optimism to some kind of dystopian idea that, with data and the internet on the other side, we as humans are losing the key game, the political game, which has always been, and I think for a long time will be, the key game we play with those who hold power, I didn't feel relieved. I must say I felt a bit concerned, because the more we stress the agency of machines, technology and algorithms, the less we see humans in it. That is not to say I don't appreciate how big an impact algorithms have on our lives, including micro-targeting and the use of data analytics, be it for the results of an election, as we have seen in the Trump
presidential race, or in the situation in a country after a new government is established. There is good reason why we appreciate the role of bots as new, important agents in the political sphere, where they might very effectively influence the narrative that spreads around social media and the temperature of the debate that we sense as human beings. But this is taking agency in a very simplified way. What I would like to do today is look a bit deeper underneath those layers of technology, to question for a while who is behind it, who is designing it, who is manipulating it or gaming it, and who are the people on the other side. Who are we? And what is our role, our potential agency, in changing a power balance we might not like so much looking backwards? So take a look at this picture. I believe many of you have seen it already. It's a quite famous graph prepared by Craig
Silverman for BuzzFeed in November 2016, right after Trump's victory, suggesting, as you can see, that in the run-up to the Trump election, especially in the last days, people on Facebook interacted much more with fake news than they did with so-called real stories. That is a matter of fact, but how do we interpret it? Looking at that scenario, is what we really see here a suggestion that people did so because Facebook-owned algorithms showed them mostly fake news, and this is why the fake spread so well? Or is it mostly because humans chose to interact most with those stories that triggered emotions, that were better linked to their expectations or their needs? Who was the agent? That is the answer we don't get from just looking at the facts. It needs much deeper research, which we might not necessarily have. But one thing we know, I think, for now, is that whether a piece of news was fake or real was not so meaningful for those who clicked on it, yes? We rediscovered that there is a different wiring of our political brain. It's not so much about
technology, it's about us not being so attached to facts any more, and actually openly saying that these are less relevant in a world dominated by political emotions and narratives that are no longer fact-based. And this is the phenomenon that obviously many of you are discussing, and rightly so it takes a lot of space in the public debate. It could be the phenomenon we should be looking at much more deeply than at the algorithms. And, of course, there are filter bubbles, which have been with us for at least ten years, but suddenly they got rediscovered in the context of fake news as the phenomenon that has a key impact on how people think and what people see. But again, what we don't know, and what has been said today many times by other speakers too, is to what extent the filter bubble effect really is shaped by mere technology, designed for us without transparency; how much it's affected by us, by our choices, by what we choose to look at; and how much it might be gamed by third parties, like botnets designed by somebody else. So what really is the power balance behind the filter bubbles? We don't know, but there is the
reality outside. This is one of thousands of pictures of trees shared on social media in Poland after the Polish government adopted a controversial new law allowing every private owner of land to remove trees that disturb him or her. So the trees were disappearing, and thousands of pictures of them were posted on social media. Not everybody saw them there, because not everybody belonged to this government-critical bubble, but I believe a big majority, or even all the Polish people, saw one or two disappearing outside of their social media. So the trees kindly remind us that there is a reality outside our filter bubbles, the political, economic, social reality we live in, and that reality is not without an impact on how we think about politics and how we behave. So how will this year and the next look for us? What can we change in our behaviour online and offline to correct the power imbalance and give ourselves back a bit more control? Again, if you look at Poland, with all the technology that is controlled by the government, with all the media which is effectively controlled by the government these days, we see some hope. We see that the Poles are changing, and PiS, the ruling Law and Justice party, which started with great support and a majority of people behind them, is now feeling that there is somebody running after them. On the other hand, we have those players; many of you might recognise in this picture a transformed Cambridge Analytica logo, the company that still claims to be able to control the voters, to control how people behave in a political context, mostly using their data
and algorithms and data analysis. So, for people like Cambridge Analytica, who are excellent at marketing their own marketing, it seems to be mostly a question of money and data being available. But what do we say? What is the response? If I look at the internet, if I look at what is going on around me, not only in my bubble (I do try to go beyond it), I see that the response is far from game over. It is mostly that people say no to that feeling of being manipulated. And, obviously, there are plenty of initiatives emerging in the US, in Europe, in Russia, that respond to the recent buzz around fake news. These are human-driven initiatives that use technology to trace what is credible and what is not, and to enable people to make their own assessments more effectively. A big part of those assessments, of course, comes through technology again. So, to be able to judge fake or real news more efficiently, you can install a plugin in your browser that will give you a warning if the trusted news site of your choice signals that a story might not be credible. So we are back in the world of technology, with slightly
different concepts of actually escaping the filter bubble or questioning what we see online. And there are initiatives like Sleeping Giants on Twitter, where people, mostly in the US, organise to exert pressure, real economic client-driven pressure, on companies that put their advertising on fake news sites. So there is good thinking here, driven not by technology itself but by the understanding that there is a market behind this, and that without the money in that market even the fake news phenomenon wouldn't exist, because if fake news were not funded so easily, it couldn't spread and couldn't be produced that effectively. So those initiatives bring us back to human agency. And then, not just for the last year but for much longer, for at least the eight years of my work, I have witnessed a strong, consistent call for transparency of technology, which is again a call of humans, people like us, who have always been concerned with not really understanding the black boxes and with questioning the black boxes. They're not only hackers; sometimes they are activists, sometimes they are artists. This is a piece of excellent research done by Share Lab, who presented it yesterday here in
Berlin, showing how Facebook does its tracking, exactly what kind of data Facebook collects beyond our control, and how that data is processed in Facebook's algorithmic engine to produce knowledge that might then be sold to such agents as Cambridge Analytica or any other advertising or marketing entity that wants to try manipulating us. And there are initiatives like the one by Claudio Agosti, who started digging into the Facebook algorithm from the side of the news stream. What really is it in Facebook that dictates how we perceive the news stream? How many elements of the news stream come from your friends? How many of them are paid content? How many of them come as the result of some non-transparent operation? Those things, if we dig deeper, can be found out, and there are already people who have been trying for a while to find this out, exactly for the reason of regaining control. Of course, transparency is not everything. The big term, the term I appreciate very much, is the call for much more, for algorithmic accountability.
MIT published a paper, again in November 2016, stressing that there is a need to redesign the way we think about algorithms themselves, and coining five principles for people designing algorithms to increase accountability. And this year, we have seen that voice, and many similar voices, putting even more pressure behind a very simple message: there is always a human behind it. There is always a human designing the algorithm, and we need to engage with those humans and their logic before we start asking for the code, and before we even start discussing the technology, because the idea behind it might not be as complicated as the code itself.
There are always people behind it, and some of them might be like Mark Zuckerberg, who believes that he can shape global democracy in a better way than we might shape it as people, as citizens. His response to the whole fuss around fake news was obviously: give me more control. I will redesign my algorithms so that you will no longer see the fake; you will always see the real, the way I define the real. Those ideas, again, are open ideas. We don't have to dig into the code to understand the very logic behind the Facebook empire. The Facebook empire collects data, processes data, and makes algorithmic analyses in order to produce profiles, marketing profiles for those who want to control us, but it can also itself become a political control machinery, and Zuckerberg is becoming very open about it, which is, I would say, like a red alert for all of us who are still there and who are feeding the empire with data. So in the end, it comes down to people, to the people behind the empire, behind every technology that is driven by personal data. In the end, it comes down to our choices: how we use Facebook, what we click on, whether we use plugins like "kill the newsfeed", which is available, so you don't have to look at the feed at all, whether we migrate out of the service altogether, whether we do whatever we think is adequate in the situation. So my main message today, my own take on this, is a matrix. It's not a yes or no.
It's not black or white, technology or humans. It wouldn't be right to say that we are winning the game, and it wouldn't be right to say we are losing it, but certainly we can understand the game in a much more nuanced way than has been suggested to us, by mainstream media lamenting over botnets and algorithms as if those phenomena never existed before last year, or by companies who say it's up to you people, you are sharing the content, you are the ones who actually drive this machinery. Neither of those responses is really true. So my understanding is that we are in a much more complex situation, one that is not that novel in terms of the agents and not that novel in terms of the phenomena we are talking about. There has always been debate about illegal or harmful content online and how we deal with that content, how it gets deleted and removed. The conversation about who has control over what we see is as old as the notice and takedown debate, as the censorship and hate speech debates. And there is another layer, filter bubbles, which have been around us as long as we have had social media, which were built exactly to filter the whole of the content that we cannot process and provide us with a selection, a very personalised newspaper. Around those two phenomena, we have seen the development of legal regulations, like data protection regulations or notice and takedown regulations. We have seen a lot of business-driven solutions, like the terms and conditions that
every online company has, saying what can stay and what should disappear, and we have also seen a lot of flagging mechanisms and the algorithmic decision-making that I mentioned, using Facebook as an example, in response to the fake news panic. And in between those, we have, of course, the layer where we operate, where all the civic actors can come up with their own ways. They can come up with digital education, they can come up with simply changing the service they use, they can come up with many community-driven solutions and initiatives. So the middle ground is up to us, and it is extremely important that the middle ground exists, because this is how we can mediate between what business thinks of as the right solution, which is mostly driven by money, since business is about making money and there is nothing strange about that, and the political sphere, which is where the power lies and operates, and where many agents might have different interests, and us, being citizens and being, in a way, controlled by those who have the power. So this is the way I see it,
and I would really love to engage in a little discussion with you, if we still have time, about how you feel in the matrix, how you find your place between the human and the tech, how much you feel that we can be in control, and whether you can think of any initiatives, especially civic-driven initiatives, that might offer some responses in a world where the narrative is shaped by non-transparent mechanisms, but where we are the ones who might be working on opening them, questioning them, or demanding more accountability for them. If we find the right place for ourselves in this matrix, the game will be far from over, and there is good hope that we can actually find a good spot for all of us. Thank you.
Actually, we now have 10 minutes for questions, so does anybody have a question? Or comments; I mean, actually, I would love to hear your opinions and views on that. Okay, short comments then, please. Come on up front.
Hi, thank you for the wonderful talk. I wanted to ask, it's like a comment-question: where do you see plugins and other services that use an obfuscation approach, where the idea is to make you invisible? So, for example, a service called AdNauseam, which clicks on all the ads on a website and thus messes around with the algorithm; is that something which is sufficient? Because that used to be the pipe dream of the cypherpunk movement, right, in the mid-90s: if we're all invisible, then no one can see us and we can do whatever we want. Is that still a goal? Is that important in your eyes? Is that enough? You mean obfuscation in the sense of disturbing the system with more data, rather than hiding, right?
Exactly, and is that disturbance enough to enact a change, or is it just a tool for individuals to hide from specific services or organisations? Yeah, I wouldn't say this is a tool to hide; I would say this is an excellent tool to confuse the system and make the work of those organisations harder. I don't think we can fully assess the real capacity of the leading companies to differentiate which metadata, which behaviour, which activity of ours is the real one and which is the fake one, so just relying on our ability to obfuscate, to produce noise, could be slightly too optimistic. That's also why I'm trying to show that this phenomenon, this problem of political struggles over data and over us as actors, is much more complex, rather than escaping to one solution, be it law, be it tech, or be it our own activity. I do believe that all of these factors have to come together, and only then can we arrive at some balanced situation. I would say it's a great tool to use, don't stop, but also don't think that it solves all our problems. Okay, thank you. Any other questions? Last chance, last call. Any comments? Okay, thank you very much. I close this talk now.
Thank you.