
Fake, hate, and propaganda - What can technology do?

Transcript
[...] Thank you so much - it's great being here today, and thank you all for being here at ten o'clock in the morning; I commend you, and I hope you've got coffee in your hands, because I think this is going to be a really interesting discussion. We chatted a bit in preparation for this, and I think you'll find that we'll really try to tackle some of the big issues today. I want to start out with what I think is one of the defining questions, something Frank Pasquale talked about yesterday in his talk on Stage 1, for those of you who saw it: the role of a corporation in the 21st century with respect to human rights, freedom of expression and privacy. Full disclosure: in my work I look at a number of different companies, including Google and its various global properties, in terms of how individuals feel when their content has been taken down on social networks - that's Onlinecensorship.org, which we've been working on for a few years now. But I think this is a really big question that covers all of these different issues, so just to start off, I'd be curious to hear what you feel the role of a corporation is today.
It is a huge issue, as you point out, and I think the role of a corporation is also constantly changing, because of changing laws, changing balances and so on. At the heart of it, the role of a corporation is to provide as good a service as possible to its users - that's the first baseline expectation. Then, on top of that, it is to try to find a way to maximize and encourage free expression, creativity and the open debate that public discourse requires on top of that platform. Now, as you well know, that responsibility is influenced by a number of different factors, and the ability to manoeuvre is influenced by a number of different factors. So I think it's a good question to be asking, but it should be asked in context: what is the shared, distributed responsibility for open debate in our societies? If you ask about the role of a corporation, you necessarily also have to ask about the role of citizens, the role of the state and the role of the regulator, and try to figure out how they all fit together. That's a negotiation we've been in for the last 20 years, ever since we were founded back in 1998, and I think it's one that's evolving - sometimes in a good way, sometimes not so much.

Let me ask more specifically with respect to Google: where do you see Google fitting into that landscape? When it comes to, for example, Facebook, they've been really staunchly opposed to being seen as a publisher - they say, no, we're a neutral technology company - and I would vehemently disagree with that personally. So I'd be curious to hear where you see Google's role in particular.
I think we have a number of different roles, because there are a number of different properties that we offer. One is YouTube; another is Search. They are also different in terms of how they relate to the overall ecosystem. I think it's too simple to say that it's a question of being a publisher or not being a publisher. If you look at the way the public sphere has changed with the introduction of technology, you realize that entirely new roles have actually emerged. One of the things I find interesting is that when you have a situation of enormous information wealth - when you have so much information out there - what is needed is somebody to curate, certainly somebody to host, somebody to publish with a sense of liability and editorial responsibility; but you also have a vast space that is information discovery. Search, for example, sits solidly in information discovery, which means that Search needs to find the best way to construct the public sphere's information discovery mechanism - and that is by maximizing access to knowledge, to information, to whatever people are looking for, not necessarily by becoming a publisher. Information discovery and publishing are different things. As we start understanding and studying this ecosystem and the different ways in which we interact with the wealth of content out there, we see that different rules apply. For Search that is very clearly the case. YouTube has more of a community approach: it is essentially trying to create a platform for users to create, to speak in their own voice, and within that mission it is also trying to create community guidelines that allow the users to shape the platform in different ways. So it's going to be different for different kinds of platforms. I don't think it's a binary choice of publishing or not publishing.
I really like that answer, and it moves me to what we were chatting about just before we started, which is Search. If you don't mind, I'll tell a quick story. A few days ago a friend of mine asked, how many countries are there in the world? A couple of us gave different answers, and I thought, OK, well, the good thing is we have Google. I searched it, and it came back with what I now know is called a knowledge panel - those boxes you see at the top of search results - and it had a particular answer that was not incorrect, but not necessarily comprehensive. Obviously, how many countries there are is a complex question that we're not going to answer today, but I think the knowledge panel is a really interesting thing. I liked what you were saying before about this question of the objectivity of information, and whether it even can provide a complete answer. So what's your take on that? How do you see these panels serving people? And do you see it as a kind of policy issue when the questions are more complex?
It is, at this point, a question of what a good response to a search is. When you're asking a question, what's a good answer? That's the kind of thing our engineers are always thinking about and trying to find new ways to address. One of the things we have certainly seen over the last ten years or so is that there is no natural necessity that the answer to a search is ten blue links. In fact, this needs to evolve, because people become more advanced, they become better at using information, and they want different kinds of responses. On top of that, the vast amount of information you have to search through increases all the time; according to some figures - and this is quite interesting - the amount of information that is stored doubles every 12 months, which is a staggering observation if you think about it. If you believe that search quality is in some way related to the set of information you have to search over, you would then say that if you do nothing for that period of time, the quality of search will decline. So the essential challenge for anyone involved in search is to figure out ways to deal with the enormous information explosion and get better answers to the questions people are asking. One way we've done this is through the knowledge panels you're describing, and the kind of technology you can describe as a knowledge graph: we try to map out what we know and how it relates to every single other thing out there. This is an enormous undertaking, because you have to start answering questions like: how many facts are there, and how do you determine them? What is a good, solid basis for determining a fact? It is tremendously hard. There are a couple of startups working on this - I think the mission of one of them, as reported in the New York Times about a year ago, was to determine all the facts in the world, and then you can ask how many there are. I'm somewhat skeptical of that, because I think it's really hard to do. Just to take one example: 15 per cent of the searches we get every single day are searches we have never seen before - entirely new questions. So that's a real challenge, and I believe that a search engine engaged in information discovery has to pursue several different avenues of innovation in order to become better in a space that's expanding this quickly.
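As a rough illustration of the structure a knowledge panel draws on - facts stored as subject-predicate-object relations and queried on demand - here is a toy triple store in Python. The data, predicate names and helper functions are invented for this sketch; Google's actual Knowledge Graph is vastly larger and works quite differently in detail.

```python
# A toy "knowledge graph": (subject, predicate, object) triples. Real knowledge
# graphs hold billions of such facts, plus provenance and confidence scores.
triples = {
    ("Germany", "type", "country"),
    ("Germany", "capital", "Berlin"),
    ("Berlin", "population", "3700000"),
    ("Sweden", "type", "country"),
    ("Sweden", "capital", "Stockholm"),
}

def lookup(subject: str, predicate: str):
    """Return all objects asserted for (subject, predicate)."""
    return [o for s, p, o in triples if s == subject and p == predicate]

def count_entities_of_type(entity_type: str) -> int:
    """Count entities of a given type recorded in the graph."""
    return sum(1 for s, p, o in triples if p == "type" and o == entity_type)

print(lookup("Germany", "capital"))       # ['Berlin']
print(count_entities_of_type("country"))  # 2 -- complete only w.r.t. this toy graph
```

The point of the toy count query mirrors the point made in the conversation: the answer to "how many countries are there?" is only as complete as the facts, and the definitions, that were admitted into the graph in the first place.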
I think the knowledge panels also have the possibility of returning incorrect information, and I know that's something Google has worked on and is always actively working to fix. But I think it's a good segue to the issue that everyone's calling fake news. Full disclosure here: I personally feel that this binary of fake news and real news is part of the problem - I've heard people say that here in the past day as well - and that by drawing that line in the sand, saying this is correct and this is not correct, we're really just perpetuating the issue. So first, just for those in the room less familiar with it: where does Google see its role in tackling the fake news crisis? And I'd also love to hear your thoughts on how you see this as an issue.
Fake news, in a sense, is not new for a company that has been engaged in working on information quality for most of its existence. Fake news, or other kinds of low-quality information, has always been a problem for search. The key thing to observe is that the Web is an adversarial environment. Everyone is not working together collaboratively to get to the best possible environment; people are working to further their own interests. We've had, for the longest time, these enormous problems with search spam. If you go back to the beginnings of search - to give a brief overview of the history of search - the first search engines out there were very simple: they did linguistic frequency analysis. They would look at a page, count how many times a word actually occurred, and use that as the basis of the ranking. Once that method was tried in an adversarial environment, it completely broke down. Go into an archive and look at old web pages and you'll find pages that are this much text and this much scrolling - white text on a white background, "Britney Spears" repeated across the entire page to attract people searching for that - and it completely broke the linguistic frequency search model. What we've seen over time is that this adversarial tension has been moving from spam up into more and more semantic layers. People are constantly competing for attention in different ways, because attention is something that can change societies, that can change purchasing patterns, and so on, and in the midst of all that you have to figure out how you retain information quality, which is an increasingly hard problem. So the way we see it is that this is an extension of the information quality problem, but with an added order of difficulty, because you're trying to decide semantically whether something is fake or not, whether it's valuable or not, and that is really, really hard to do. There are some simple things we can do: when somebody is obviously misrepresenting themselves as a news organization, you can shut off the ads for that organization - follow the money, and stop the money flow to those who are willfully spewing noise into the system. Those are certainly things we do. The other thing we're trying to do - and this is exactly your point - is to show that this is not something a single actor in this ecosystem can solve. Just as we came back to the question of responsibility, it's a question of how you build this ecosystem out: we are, for example, enabling different kinds of standards and information markup for fact-checkers, so that they can work within this ecosystem to mark up information quality as well. So it's a complex problem, it is not new, and it stems from the fact that the Web is a deeply adversarial environment.
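A minimal sketch of the linguistic-frequency ranking just described, and of the keyword-stuffing trick that defeated it. The pages and query are invented; this illustrates the historical technique, not how any modern search engine ranks.

```python
from collections import Counter

def term_frequency_score(query: str, page_text: str) -> int:
    """Rank a page purely by how often the query terms occur in it --
    the naive early-search-engine approach described above."""
    counts = Counter(page_text.lower().split())
    return sum(counts[term] for term in query.lower().split())

honest_page = "Britney Spears released a new album this week. Review and tour dates inside."
# Keyword stuffing: invisible white-on-white text repeating the query hundreds of times.
spam_page = "buy cheap pills here " + "britney spears " * 500

query = "britney spears"
print(term_frequency_score(query, honest_page))  # small score
print(term_frequency_score(query, spam_page))    # huge score: the spam page "wins"
```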
"The Web is a deeply adversarial environment" - that's quite quotable, and I don't think anyone here will disagree. Great. Well, then I'd like to follow up on that. I'm starting to feel like we live in a society where evidence-based thought is kind of on the decline - it feels that way, at least - and with the polarization of politics that we're seeing in Europe, in the United States and in a number of other places in the world: is fact-checking even going to be enough? Or are facts dead? Do facts still carry the same meaning if everything is a construct and targeted? What more can we, as humans - not just as companies, not just as rulers of the world, but as humans - do to solve this issue?
It's such a good question, because at the end of the day - we spoke at the beginning about there being a responsibility for citizens as well. I recently finished, and I may be misquoting this, a biography of Justice Brandeis, and in it there's a fascinating set of quotes in which Brandeis says that leisure time is the time you spend becoming a better citizen: that's how you engage in society, how you strengthen democracy, how you invest in the institutions of democracy in order to strengthen them, and it is your duty as a citizen to do that. It's a way of viewing how we participate in democracy that is almost completely outdated in a sense; it's very hard to find corresponding views to those of Justice Brandeis today. But I think his point is quite right: we do have an individual responsibility when it comes to the determination of facts, to participation in democracy, to the production of knowledge. And there is no simple fix. There is no technical fix for what you're describing, and I think that's really important to remember: there is no dial to turn up or down, no switch to flip, that will make this kind of problem go away. This is about how we view our democracy, how we want to participate in it, and how we invest in it, in time and effort - and that is going to be up to us. If you look at Aristotle, the ethics precedes the politics in Aristotle's work, and there is a very real reason for that: you have to have your ethics right first, your personal responsibilities, and then you can start with the politics.
I like that as well. And the question of participating in democracy, and what sorts of things we expect from democracies at this point, in this current era where I think we're all feeling this kind of tension, brings me to something on which I think Google and I - and my organization - probably share an opinion, which is the hate speech bill here in Germany. I'm just going to pull out my phone to make sure I get this quote right. There is a recent civil society letter that, as I understand it, some of the industry associations you're part of signed on to, but which was also signed by the Chaos Computer Club and a number of other German organizations. In that letter it says that Internet service providers play an important role in combating illegal content by deleting or blocking it; however, they - meaning Internet service providers - should not be entrusted with the governmental task of making decisions on the legality of content. What are your thoughts on that?
I agree with it. It's also a question of how you actually want the information flow in a democracy to evolve over time. I don't think it's a good idea to put that decision over on corporations, specifically if you look at the way the hate speech law is construed, with enormous penalties and very short response times. If you have enormous penalties and short response times, it is quite clear that you're creating a risk that people who don't have a lot of resources, or who don't put a lot of thought into this, will simply push their reviewers towards deleting by default, which means that the overreach that such a legal construction leads to will be quite significant. I think that is a real risk to free expression. Some of these institutions, some of these decisions, the state needs to retain, and needs to keep as part of what it means to be a democratic core institution. So I think the letter is very, very accurate.

I absolutely agree with that. And when it comes to companies engaging with states on censorship, I want to - not really push back, but broaden this out a little bit - and say that a lot of companies, I think YouTube in this case and Facebook, do comply with laws that the people in those companies do not necessarily feel are just. I think Turkey is a really good example of this; I know there are a couple of talks on that here this week. So where do you feel a company's role is in compliance, in that sense? Where do you feel they should draw a line and say, OK, enough, we're not going to be complicit in this kind of censorship?

I think in many ways what you're asking is: what do you, as a company, say about a legal decision in a country where you are present? If you are present in a country, you made the choice to be present in that country, and you also implicitly made the choice to abide, to a high degree, by that country's laws and rules. I think that is a basic consideration that all companies need to bring into this discussion. Now, there are some decisions we can certainly push back against - you can push back against the legal basis of a decision and challenge it in court, and we've done that in several cases in several different countries. What you do at that point is try to combat the decision within the legal framework in which it was taken.
I think that's something companies actually should do to a large degree: they should challenge these decisions within that legal framework. At the end of the day, if you look at the way democracies are expected to work, a democratically elected government has the right to restrict information to its citizens according to its constitution, its legal framework and its laws. So what I think would be good, given that there is so much discussion about this, is much more accountability and transparency. We have a transparency report - we were the first company to launch a transparency report - and we report on all of these different things. But it's not only companies that should do that; governments should too. Governments should say: here are the different ways in which we restrict information, here are the different ways in which we request information about our citizens, here is how often we used them, and here are the different ways in which you can change those rules should you disagree with them. There should be a national reporting duty for governments, because that would provide transparency into those regimes and give citizens a chance to discuss them and, should they so choose, to change them.
I have to agree - I'm always excited to talk about transparency. I agree with you that governments should do this, but we also know that in a lot of cases they won't. To give a story about a different company: there was a scenario a couple of years ago where EFF saw that a company was complying with orders from prisons in the United States, and that was a real problem, because there was no transparency around it - it was not the same kind of legal request that had previously appeared in that company's transparency report. Eventually things were settled and the policy was changed, but I think it shows that even the US - which seems less and less democratic these days - even if the US is not complying, it's probably unreasonable to expect Turkey to do so at this point. In that sense, I agree: Google has been an absolute pioneer in transparency reporting - in our Who Has Your Back report, which we've been putting out for quite some time, I believe Google has all the stars, though I didn't check the most recent edition. But now that companies are expected to take on more when it comes to content moderation, to regulate content that is not necessarily illegal - and I know Google doesn't do this quite as much as some other companies, but there are still areas where it happens - do you see more transparency in these transparency reports around that kind of Terms of Service censorship?
Yes, I do. I think that's a very reasonable question, and I will not be revealing any state secrets by saying that we've been discussing for a long time the right way to do it, so that it is meaningful and informative - something users can actually use to make their own decisions about whether or not it's right or wrong. This is an evolving topic, and I think that if we have this discussion again in a couple of years, you will have seen new governance institutions like that evolve. That is probably the right way to go about it: companies can only gain from providing more transparency into how these decisions are made, because at the end of the day users should judge them on their past behaviour and decide whether or not to trust them based on that past. That is the only way to generate the kind of trust you need to continue delivering these services. And again, I think that should be complemented by transparency from governments about how these different rules are applied - which is also why, when it comes to things like the German hate speech law, I think it's a bad idea to push all of this over to companies: the state then doesn't need to take responsibility or accountability for the kinds of decisions being made, which again undermines the process of changing those decisions.
Absolutely. Beyond the issue of speech regulation, I think there's something here on hate speech and on countering extremism, and I wanted to raise Jigsaw, which is a part of - I feel like I'm getting this wrong - Alphabet? That's my understanding of the structure.

Exactly, that's right.

OK, I apologize - I think I need the flow chart behind us; it's complicated. So, in this whole space: I know Jigsaw is focused on countering extremism, that's one of its areas of focus, and I think this is a place where companies maybe feel they have to step up in some way, beyond just regulating content, beyond just taking down the videos - because that may not even be the best solution. For example, we've heard from some police departments in a number of places in the world that taking down the content without any further context is not necessarily even helpful: it doesn't allow them to track it, it doesn't allow archiving, and so on. I think this is admirable work, but at the same time, given the rise of white supremacy and fascist ideologies in Europe, the US and a number of other places right now, where does Jigsaw - where does Google - see its role in countering and combating those ideas as well? Not just Islamic extremism, but also these other really prevalent forms of extremism.
That obviously isn't something a company can do alone - even Jigsaw acknowledges that radicalization is a complex, multi-layered process. It doesn't happen - we have no evidence that it happens - because somebody sits in front of a screen, sees a number of videos or a number of posts in a social network, and comes out an extremist. Usually it is about identity-building, and part of the radicalization happens offline, in a small community where somebody is recruited. So radicalization processes are deeply complex. Jigsaw has chosen, for now, to focus on the Islamic radicalization process, and there's a reason Jigsaw was set up the way it was: they want to try all these different things, and it was decided early on that it's better for them to do that as a free-standing company. They should speak for themselves - they have their own ideas about how to push this forward, and Jared Cohen and Scott Carpenter are more than happy to talk about these issues. The reason we did the de-radicalization project with them from Google's side was that we felt there was a clear use case for the Redirect project: when somebody is looking for information that is essentially going to be radicalization content, they are redirected to counter-speech instead. We felt that was a clear case in which we could test counter-speech in this particular environment. Whether we will do that for other kinds of extremism as well, I don't know - we're still evaluating the results of the Redirect experiment and what we believe it led to. We do know that it led to a lot of exposure for people who were looking for radicalization content, but we don't necessarily know what its impact was. As you point out, it's a deeply complex process, because if you're feeling that the entire world is conspiring against you and stopping you from getting to the content you really want - and there is this paternalistic element to it - maybe you actually end up making the situation worse. So you want to be very careful, you want to look at limited cases, and you want to understand how they work. That's essentially where we are: very early in the process. Jigsaw may well have another answer to that question; I think they should be asked it in their own right.
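To make the redirect idea concrete, here is a deliberately simplified sketch: match a search query against a keyword list and surface counter-speech links instead of, or alongside, the usual results. The keywords, URLs and function names are placeholders invented for illustration; the actual Redirect Method relied on curated keyword research and ad targeting and is considerably more involved.

```python
# Hypothetical keyword list and counter-speech links: placeholders only.
RISK_KEYWORDS = {"example-recruitment-phrase", "example-propaganda-term"}
COUNTER_SPEECH_LINKS = [
    "https://example.org/testimony-from-former-members",
    "https://example.org/debunking-recruitment-claims",
]

def maybe_redirect(query: str):
    """If a search query matches known risk keywords, return counter-speech
    results to show alongside (or instead of) the normal results."""
    tokens = set(query.lower().split())
    if tokens & {keyword.lower() for keyword in RISK_KEYWORDS}:
        return COUNTER_SPEECH_LINKS
    return None  # fall through to normal search results

print(maybe_redirect("example-propaganda-term latest video"))
```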
Excellent - I will do my best to ask them in the future. I'm also really curious about your thoughts on where the Internet is headed. I know this is a really big question, but we've been talking around these ideas - hate speech, the different kinds of knowledge people have access to, censorship - and I think we're now seeing increasing, I don't want to use the word balkanization, fragmentation of the Internet. What do you see as your role in ensuring that people have some sort of universal access to information? From my perspective, governments often work together to collude in oppression, and I think people also have a responsibility to work together across borders and boundaries, in solidarity with each other - but when our access to information differs, and sometimes even with the help of companies, that can be a difficult task.
Our mission statement is to organize the world's information and make it universally accessible and useful, and we are committed to doing that as best we can within existing laws - and, where we can, to having an open policy debate about how those regulations could be changed in order to achieve that goal. That is very clear to us, and I think universal information access is actually a very important point when it comes to how we can change democracies and how we can get to a shared world view. On where the Internet is heading: there are a couple of different things happening that I think are really difficult. Take hate speech as just one single factor we can study in order to understand what's happening. There is a push in a lot of western democracies - for good reason, because this is felt to be a real problem - to regulate hate speech, and to regulate it in quite a heavy-handed way, which means you end up with regulation that might deter companies from allowing any kind of spirited debate, if you will, on their platforms - speech that is very different from hate speech, but which may be hit by a chilling effect. And what will happen is not that hate speech disappears. There was a research report published recently, on the 29th of March, in which a number of experts from the Internet community were asked what they think the next steps are, and they said: we think we will see more hate speech, but we also think it will move from the public stage to semi-public stages, into other kinds of networks, and it will fragment the Internet, because different kinds of content rules will be applied differently in different places. I think the content regulation we're now discussing in several parts of the world actually has the ability to create content islands - you get an archipelago of different kinds of discussions, separated off because of regulatory concerns. You can see that happening, and then you can ask yourself: if you have a guy who says, in an open arena, "I really dislike Swedes" - that is not nice, and I think it's pretty bad - but if he goes into a small room with like-minded people who also dislike Swedes, which is more dangerous for society: that closed room of like-minded people, or the open arena? We haven't had that discussion. We just assume that we can remove hate speech, which I think is a little bit like fighting a symptom rather than thinking about the core causes.
I absolutely agree - it's like putting a Band-Aid on an open and terrible wound and thinking, OK, this is going to go away, but it doesn't. I think we've seen this in the way imagery is regulated: I know that when I go to Budapest, for example, I can see the stand-in symbols that indicate a shop is owned by Nazis - the American Confederate flag is commonly used as a stand-in symbol because the other one has been made illegal - and so I might not recognize it immediately. I think there is a real danger in siloing off networks of hate. But I want to change tack just a little bit - that was a superb answer - and move toward automation. Maybe I shouldn't be advertising that I'm running off to my next talk after this, where I'll be talking a lot about automation and algorithms, but I have to take this opportunity and ask for your thoughts, particularly on the role of automation in content moderation. There's a lot of talk right now about the labour involved, and the post-traumatic stress disorder involved, for content moderators who have to look at this terrible imagery every day. There's a lot of empathy for them, and a lot of attempts to create something different, but at the same time I'm concerned that automation will bring in the same sorts of human biases we see now. I'm curious where you see this headed.
First and foremost, you're absolutely right: the people who have to review this material are doing a really, really difficult job, and any company has a responsibility to make sure they get the kind of support they need in that very difficult work. I think that's important to say. Then I think this problem separates into two different questions. The first is: is it possible, within existing computer science and knowledge, to build a system that can automatically do content regulation and decide that this piece of content here is hate speech and that piece of content over there is not? I think we are still far, far away from being able to do that. The notion of context - the ability to detect context - is really hard; it's an open computer science problem, if you want. So from the "can we" perspective, I don't think it's possible to do within any kind of confidence interval we would feel even roughly comfortable with. Now assume, just for the sake of argument, that we had a system that had learned to identify some content as hate speech and some as not. Would we still want to delegate this decision to a system? That is a much more difficult question, and it might be true for some kinds of content - this is where we need to get nuanced, and where it gets messy. If you run a platform and you don't want pornography on it, you might be able to build a system that detects pornography with a certain, high level of confidence, and the false positives of that detection are maybe, you know, people on a beach, or people participating in some kind of nudist event. Those false positives may not be high-value content; if they are removed from the public debate you may not think that's a huge loss, and you can correct it with an appeals process and a reinstatement process. But if you go to something like hate speech and you start looking at the nature of the false positives - which is the research problem you have to engage with - you realize that the false positives there are much more difficult to address, because you may be filtering out counter-speech, for example, or satire, things like that. So I think the "can we" question is, for now, settled: we can't. Moving ahead, the question is whether we will ever be able to. I don't know; the answer depends on how complex this problem is, and it may well be really complex. I personally think it is, because, again, the Web is an adversarial environment - you made a really good point about that. If I deploy a system today that allows me to detect certain kinds of markers for white supremacist speech, for example, and that system picks out all that speech, what will they do? They will change the words. This is the exact same kind of fight you see in spam: you end up with codewords that change over time, the system is again a system in evolution and competition, and you end up chasing boundaries that keep moving. If that change is always fast enough to keep confidence intervals lower than we want them to be, we are never going to be able to do it. So there's a "will we ever be able to" question too. And assuming that we will be able to - should we? I think that is still a really big issue, and I don't have a good answer for you. Let's not pretend we're not all discussing this in the Internet community. I had a discussion last week with a professor about this, and he essentially said that this is a decision that also needs to be informed by the risk of future discovery of bias: assume a system like this had been put in place 60 years ago, and everybody at the time agreed it was not biased and it was working really well, sorting out all of the hateful stuff - and then you look at it now, 60 years later, and you realize, my God, it was so terribly biased, because our view of bias changes over time. So there is a whole set of issues here that are unresolved, and I think we should, as a community, continue to debate them, because I think we will soon see somebody out there say, oh, it's quite possible to do this, because they've built a system they have faith in and that they want to be able to deploy.
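A toy illustration of the trade-off described above, assuming (purely hypothetically) a model that emits a "hate speech" probability per post: the choice of removal threshold trades missed adversarial codewords against false positives that tend to be exactly the counter-speech and satire the speaker worries about. Scores and labels are invented; no real classifier behaves this neatly.

```python
# Invented scores and ground-truth labels, for illustration only.
posts = [
    {"text": "slur-laden attack on a minority", "score": 0.97, "truth": "hate"},
    {"text": "news report quoting that attack",  "score": 0.81, "truth": "counter_speech"},
    {"text": "satire mocking the attackers",     "score": 0.74, "truth": "counter_speech"},
    {"text": "coded attack using new codewords", "score": 0.32, "truth": "hate"},
]

def removals(threshold: float):
    """Everything the hypothetical model would remove at this threshold."""
    return [p for p in posts if p["score"] >= threshold]

for threshold in (0.7, 0.9):
    removed = removals(threshold)
    false_positives = [p for p in removed if p["truth"] != "hate"]
    missed = [p for p in posts if p["truth"] == "hate" and p not in removed]
    print(threshold, "->", len(removed), "removed,",
          len(false_positives), "false positives (often counter-speech or satire),",
          len(missed), "missed (e.g. adversarial codewords)")
```

Lowering the threshold catches more hate but removes more counter-speech; raising it spares the counter-speech but lets the codeword-evading post through - which is the "can we, and should we" dilemma laid out above.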
Is there anything right now for which Google is using algorithms to take things down, or even just to flag them, rather than having human moderators?

It's mostly human moderators right now. There are a couple of very simple ways of identifying when flags accumulate - you know, enough flags on a particular piece of content - and, as I implied before, there is some automation there, but nothing on the scale we're discussing, this idea of applying it to content moderation as such.
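A minimal sketch of the flag-volume signal just mentioned - counting user flags per video and queueing anything above a threshold for human review rather than removing it automatically. The threshold and data are invented for illustration.

```python
from collections import Counter

# Stream of (video_id, flag_reason) pairs from users; data invented for illustration.
flags = [("vid_123", "hate"), ("vid_123", "hate"), ("vid_456", "spam"),
         ("vid_123", "harassment"), ("vid_789", "hate")]

REVIEW_THRESHOLD = 3  # arbitrary: enough independent flags to involve a human

flag_counts = Counter(video_id for video_id, _ in flags)
review_queue = [vid for vid, n in flag_counts.items() if n >= REVIEW_THRESHOLD]

# Flags only prioritize human review; they do not remove anything by themselves.
print(review_queue)  # ['vid_123']
```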
Absolutely - no, I agree that it's a really tricky problem, and I don't have the right answer either. I want to go to the audience in just a moment; I thought I had one last question to ask first, and, well, I have so many, but I won't ask them all, because I want to make sure I give plenty of time to the audience. So do we have any questions yet? Otherwise I can keep going up here. We do.

I was interested in what you were saying about this responsibility to discriminate, essentially, between content, and that it's our responsibility as citizens to engage. I just want to point out that the education we have received as citizens - even amongst people much younger than me - is not in step with the technological change. So are we, as citizens, equipped to engage in a society that has changed this much? And is there a responsibility on the producers of the technology that is changing our society - so, people like yourselves - to help equip citizens to make those choices? Thank you.
We are engaged, as you probably know, in a number of different digital literacy projects, and those are addressed at kids but also at adults. We're doing something called the Growth Engine project, which teaches digital skills across Europe, and I think that's part of it. But I actually think it's really important to talk about education overall, how education is changing in our society, and why it might be important for us to constantly re-educate ourselves as technology shifts. We provide a lot of online content and so on, but at the end of the day that also depends on the individual's will to seek that content out and use it. The key, I think, is, as you say, figuring out how you fit this into basic education. Now, there are some signs of hope. I'm a native of Sweden, and my kids have this enormously annoying thing that they do through classes 4 to 6 called "check the source": every time something is asserted, every time a proposition is put forward, they're taught to say "check the source", and then they go off and turn into fact-checkers, checking a lot of different sources and comparing them. They do this maybe 150 times over the course of a year, so it really is ingrained in them. The annoying part is that every time I tell them to clean their room they say "check the source", which is a bit of a bother, but the overall sentiment is really, really good. I do believe this is classical critical thinking, and something that needs to be embedded in education. I don't know if you read science fiction, but Vernor Vinge wrote a really interesting book called Rainbows End, in which a man who is cured of Alzheimer's has to go back to school because he has to relearn everything, and in this version of the future there are only two subjects in school: one is search and analytics, which is essentially critical thinking, and the other is visualization, in order to be able to visualize complex information. Only two school subjects. Maybe we should think about education reform in order to equip ourselves with better tools to deal with this.
I want to go back to the new German law on hate speech that you've been talking about. I think the gist of what you were saying is that it might force companies with fewer resources to adopt a default solution to this problem: let's just get rid of the content, even if it's merely controversial and not really hate speech, because that's much easier than risking those very big fines. The problem is - and I have been following this closely from here in Germany, through the publications and so on - the process has been ongoing, especially with Facebook, for more than two years. Self-regulation was the government's first choice - there was a big push for self-regulation - but it didn't work at all; it didn't work at all for two years. And I'm saying this because the numbers show that Google actually did a great job - and not because you're here: the Justice Ministry, the Justice Minister, just published numbers showing that your numbers were good, whereas Facebook's numbers on actually addressing illegal content - we're not even talking about material that would merely be deemed controversial; it was clearly illegal - showed they were not able to pursue it properly. The percentage of reported illegal content that was removed just went down. So what is the government to do? At some point there is political pressure that something must be done, and the answer the politicians gave was regulating in the way they did. So what do you think about this bigger issue?
I know that you can't say this, but I can: I have to think that at some point, with Facebook, it's wilful, because they are excellent at taking down any image of a woman's breast, but they cannot seem to deal with hate speech. And I can assure you - I've been banned for posting breast cancer campaigns on Facebook before - it's unfathomable to me that they cannot tackle this issue. From my perspective, since Google - and I have seen this as well - was capable of it, I think companies are capable of it, and I don't think companies for the most part should be forced into this situation. I do understand where you're coming from with respect to this one company, but I see this as a problem with one company, not a problem across the board that needs to be solved with this sort of rapid law. But I'll turn it over to you, if you'd care to comment on Facebook.
I would - but you're not allowed to legislate for one single company, which I think is good. I do think there are a couple of different things here that I believe are really important. One is that the entire industry has said: we are willing to come to the table and discuss what can be done and how this can be improved; we believe this particular piece of legislation is not necessarily going to solve the problem you're looking at, and it's going to have a lot of adverse effects - so let's have that discussion, and let's not push a law through very quickly. I understand the sense of desperation among politicians, also because many of them are on the receiving end of this: they're standing up for democracy, they're trying to participate, and if you look at what politicians are doing, they expose themselves to this problem a lot, so I can understand the sense of desperation. But I think there are things we can do that don't have to entail pushing through an overreaching law. One thing that can be done - that actually should be done - is to look at the existing enforcement of the hate speech laws already on the books: what kinds of resources are being poured into this? How much are we actually trying to take these laws, take these cases, and bring them to court? What resources does the court system have to deal with this? Are there special prosecutors who can work on this, for example, with transparent means and with review, and so on? All of those things should and can be discussed. We have said, several times, in meetings with anyone who will listen, that we are happy to come to the table and discuss what can be done, and there may be other things we can do in terms of shared responsibility, where we discuss what the roles of companies and governments are. But this law is not it: it's not going to have the kind of effect you're looking for, and it would be much better to look at enforcement and at alternative solutions.

Do we have more questions? Yes, go ahead.
Hi. If I'm not mistaken, Google actually makes about 90 per cent of its revenue from advertising - is that correct?

That is correct.

OK. So basically Google is an advertising company whose business is to influence people on behalf of its advertisers - which gets back to Jillian's original point about the role of a corporation. And I have to say I am very troubled by the idea of experimenting on people, even if it's for good, you know, to try and move them away from radical sites. I find that this is human experimentation - and Google is already doing it on behalf of advertisers.
It's a valid point. We make our money from advertisers, and we're not shy about it - it's not a secret. One of the things I believe is really important - and we talked about the Jigsaw experiment - is that this is done in a structured fashion, that we are very clear about what is happening, and that we are very transparent about it. You may disagree; you may feel that this is the kind of thing we should never do. I think that's a valid standpoint to take; it's one that we don't share - we disagree with it, and we think we can do this within the remit we have. On advertising generally: the idea of advertising, and of funding what we do with advertising, is that we help people find what they're looking for - we help them do it through organic search, and we help them do it through advertising. Now, you can quite legitimately take the position that all advertising is a bad thing and there should be no advertising in society. I disagree with that too: I think advertising is a really powerful way to reach consumers, and it's part of a market economy. We may have a fundamental disagreement about whether or not a market economy is a good thing, and then we could go on and discuss that and the basic tenets of how we build our society. But we are a company, we provide advertising technology, and the revenue from it we pour into a lot of different projects. In many senses we believe we are doing a lot of good with those projects, and we will continue doing so. I do, however, respect your position.
The two big players, Google and Facebook, have for many people pretty much replaced the Internet - or want to become the Internet - and Facebook especially tries really hard to be the Internet for millions and billions of people. These companies have moved past anything we have experienced from a company before, and yet they still claim something like: we as a company need to have a moral background, we need to have moral guidelines. No, you don't - you are beyond the role of picking and choosing moral rules to implement, because, looking at Facebook and the ban on women's breasts, that's a good example of the way this goes. You shouldn't be in the role of stopping or enforcing or doing anything about any part of free speech, because this really gets out of hand - you are already far too powerful.
We are not the Internet - and that's a very good point, and we realize it internally: we are absolutely not the Internet. We provide certain services, and on those services we do believe we have the right to determine, roughly, what goes and what doesn't. You could say, well, you are so large now that I think you should not be making decisions about what goes on your platform or not. We disagree with that perception: we believe that we should be able to decide what goes, and does not go, on our platforms. We don't believe that extends to all of the Internet, and we do think the Internet is changing quite fast - there will be new services, there will be other competitors, and it will go on. And all of those values can be questioned: if you disagree with our values, you should definitely let us know, because the way we try to shape our guidelines on YouTube, for example, is by listening to users - what they want, what they don't want, what kinds of content they don't like. That's why the flagging system is so valuable to us: we see people flagging and we try to respond to it. Now, we don't always act on the flags, and we're not completely governed by them - if we were, quite a few pieces of music by Justin Bieber would be taken down, because people flag Justin Bieber all the time, which is very confusing. But this is part of running a service: you have to decide what you want to have on that service and what you don't. There is a living debate - representing exactly what you're saying - in which people say, you have now crossed a threshold, you are so large that you should not be able to do that anymore, and you should refrain from any moral judgment. I don't think that's possible; I think it is a moral judgment, or even a moral decision, to provide a service at all. So that is a point where we disagree.
Can I follow up on that and say: I think a really interesting example of this, which I've been thinking about recently, was YouTube's restricted mode - adding a restricted mode that segmented some LGBTQ videos off to the side. YouTube has apologized for this, and most of those videos are now available again, but at the same time that is a company passing judgment on what is, let's say, acceptable for adults versus children, or for people who want adult content versus those who don't - and to be clear, these videos were not all sexual in nature; some were, some were not. From my perspective, there is perhaps a diversity issue within companies when these decisions are made. I know that Google does pursue diversity - in terms of global diversity, at least, it ranks much higher than a lot of the other companies - but do you still see this as perhaps an issue coming out of Silicon Valley? I'd be curious.
Diversity is certainly important, and I think diversity in making these decisions is absolutely essential. We did apologize - I think we made a mistake, and when we make mistakes we back off them. This was discussed at the very highest levels: Susan, who leads YouTube, was deeply involved in it, and of course she was also deeply involved in walking the decision back. We will make mistakes, we will own up to them, and we will try to fix them. That is part of the reality of running a fairly complex system.
Hi, thanks. I would like to come back to the technology aspect. In the announcement there was some mention of conversational AI, and Perspective was mentioned in the text - maybe I misheard, but I didn't hear anything about that. Could you explain a little bit what these tools are?

This is another Jigsaw project. It is essentially about trying to help publishers decide what kind of tone of voice they want in the commentary space on their websites - it's something provided for publishers to decide how they moderate their comments. It uses a basic AI to detect certain kinds of speech patterns, which can then be judged - and the model trained, as judged - to be appropriate or not appropriate by the publisher; the decision is taken by the publisher. But it is an AI applied to this comment space, and, as we said before, the confidence rates here are very difficult, and the reports about how effective this particular piece of technology is have varied. Yet publishers are really interested in it, because the ideal for them would be to have a way of setting the tone of voice in the discussions that follow on from the quality content they produce, and Jigsaw tried to figure out whether there was a way to do that. It's really more of a Jigsaw question as well - was this in the announcement too? And I was hoping I wouldn't get any machine learning questions.
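For readers who want to see what this looks like in practice, here is a sketch of how a publisher might call the Perspective comment-scoring service and apply its toxicity score to their own moderation rules. The endpoint and field names follow Perspective's public API as I recall it and may have changed; the API key, comment text and threshold are placeholders, and - as described above - the moderation decision stays with the publisher.

```python
import requests  # third-party HTTP library

API_KEY = "YOUR_API_KEY"  # placeholder
URL = ("https://commentanalyzer.googleapis.com/v1alpha1/"
       "comments:analyze?key=" + API_KEY)

def toxicity_score(comment_text: str) -> float:
    """Ask the Perspective service how 'toxic' a comment is (0.0 to 1.0)."""
    body = {
        "comment": {"text": comment_text},
        "languages": ["en"],
        "requestedAttributes": {"TOXICITY": {}},
    }
    response = requests.post(URL, json=body, timeout=10)
    response.raise_for_status()
    return response.json()["attributeScores"]["TOXICITY"]["summaryScore"]["value"]

# The publisher, not the model, decides what the score means for their comment space.
HOLD_FOR_REVIEW_ABOVE = 0.8  # publisher-chosen threshold, purely illustrative
if toxicity_score("You are an idiot and nobody wants you here.") > HOLD_FOR_REVIEW_ABOVE:
    print("hold comment for human moderation")
```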
Thank you very much for this session. Usually we have a definition of what is legal and what is not legal to put out as expression - hate speech, blasphemy and other things - decided by parliaments and by courts. How would you look at integrating the decisions made by courts into automated systems?
If I understand your question correctly, you're asking whether or not you could take a decision by a court that in some way interprets the law on what is legal or not legal in terms of content, and then integrate that into an automated removal system. I think the challenge would be to code it in such a way that it actually and accurately represents the gist of the conceptual analysis in the decision. That would require the analysis in court decisions to be somewhat standardized - but you can imagine a world in which the conceptual analysis of a piece of content is actually standardized and machine readable, so that you could immediately import it into a system. There have been discussions of that kind when it comes to courts, and there are a lot of really interesting research tracks on what are called cyber courts, where decisions are essentially coded in such a way that they are transferable between different kinds of systems - so if you have a court decision over here, you can import it into a moderation system over there. But that asks a profound question about the nature of law and legal concepts: are law and legal concepts amenable to that kind of conceptual modeling at all? I'm not sure they are. When it comes to legal philosophy I tend to side with the pragmatists - with Oliver Wendell Holmes and others - in believing that the law changes as societies change, which means that any codification of it at a particular moment will be a snapshot of what the law looks like at that moment, and conceptually it will not be stable over time. Language changes, law changes, which means it won't be possible simply to import a coded decision from one instance to another. You could do some things, I guess - another way of asking the question would be: should these decisions be provided in a format that can be used for training machine learning systems, to train classifiers on them, in order for those classifiers to moderate content? That might be quite interesting to have, but again it raises a profound question about the nature of legal concepts. I tend to be on the skeptical side as to whether it would work, but it's a very valuable research question.
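Purely as a sketch of the research direction being floated here - training a text classifier on court decisions labelled by outcome - this is roughly what such an experiment might look like with scikit-learn. The "decisions" are invented one-line stand-ins, and the speaker's caveat applies: the model freezes a snapshot of the law as it stood at training time.

```python
# Sketch only: a tiny, invented corpus of "court decisions" labelled by outcome.
# Requires scikit-learn (pip install scikit-learn).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

decisions = [
    ("post calling for violence against a named group", "illegal"),
    ("harsh but general criticism of government policy", "legal"),
    ("repeated threats directed at an individual", "illegal"),
    ("satirical cartoon mocking a public figure", "legal"),
]
texts, labels = zip(*decisions)

# TF-IDF features + logistic regression: a deliberately simple baseline.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

# The model can only reproduce the snapshot of case law it was trained on;
# as the law and language change, its predictions drift out of date.
print(model.predict(["threatening message aimed at one person"]))
```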
Thank you - a very interesting panel. I'm changing the topic: Google has been very proactive in promoting sustainability, such as solar panels to power your offices and data centres. Could you share a little bit more about that, and about other initiatives that Google is actively introducing or practising to enable sustainable development? Thank you.
Thank you - yes, of course. We do believe this is a really important question. We have been carbon neutral for several years now - we can discuss the efficiency of that system; we buy carbon offsets when people like me travel across the world, and so on - so we are a carbon-neutral company. We're looking at different kinds of sustainable energy investments across the field, and especially when it comes to our data centres we've been very careful about how we design them, in order to find new ways of utilizing energy; for example, there is a data centre in Hamina that uses offshore wind power in different ways to power the facility. It is a key question, and for that reason the sustainability work sits within Google proper - it's not something off to the side; it's a core part of Google - and there are several other projects ongoing. We're happy to provide you with an overview; I don't have more in my head right now, but we can follow up if that would be helpful.
I didn't know that, that Google is carbon neutral; good to know. I think we've got time for probably one or two more questions. I will come back to the
algorithms and machine learning; sorry, but it really seems very relevant in this discussion. My feeling is that many companies propose an algorithm as the solution to a problem that was possibly created by the use of algorithms in the first place. We see, for example, how fake news gets a certain visibility because of the algorithm behind the feed that promotes the content that will trigger more interactions; that's less a Google example and more a Facebook example, I think. And then a company like Facebook responds with the same idea: now we will teach our algorithms to ignore this kind of content. So my question to you would be: how much do you believe in transparency and opening up those black boxes? Do you see any risks inherent in doing that? For example, we open up the algorithm and then people start gaming it even more; on the other hand, if the
algorithms are being gamed by botnets anyway, maybe we should be opening them up. What's your take on algorithmic transparency in the context of fake news and similar problems? So there are two parts to your question that I would like to address. The first one is: are algorithms really generating a filter bubble? That is a real research question that people are looking into, and just a couple of days ago the University of Michigan, Oxford University, and the University of Ottawa released a 203-page report in which they looked at the use of search in political opinion formation. It's a really interesting report, and there are a couple of things in there that I think are salient to what you're asking. One of them is that people do not only look at content that confirms strong prejudices: they consult 4.5 sources on average, and they seek out sources that are different from the ones they typically see. Maybe this behavior is different in social networks, I don't know, the study doesn't say, but the notion of a filter bubble being generated by search patterns seems to be one that is not proven by the research; rather, it is challenged by it. So that's an important first part of the question to answer. The second part is: how do you think about algorithmic transparency? I think there are two answers to that. One is the answer you gave yourself: if you put
the algorithm out there in an adversarial environment, what will happen is that it will be gamed, and then you will be lost in the noise, and that's not a great idea. But that answer is only partially true. The more important point, I think, is that if you look at what is happening today, it's actually better to try to explain outcomes than to explain the algorithms, because the algorithms are constantly becoming more complex, and it's not certain that you would be able to make any decisions or informed analysis just by looking at, say, a series of neural nets and how they were trained on a certain data set. You would be able to see the algorithm, but you would not be able to predict the outcomes, because of the level of complexity inherent in the system that you're trying to audit. So at that point what becomes important is that you're able to do statistical analysis of the outcomes and that you're able to talk about the outcomes. For search, for example, what we want to be very transparent about is how you can rank higher and how you can provide better-quality content. We have something called How Search Works, which is a broad campaign that goes in depth into all the different things we look at; at this point it's more than 250 different signals that we look at when we try to determine the quality of content, and that number is going to grow, which means that every single piece of content we look at is assessed by a lot of different signals. Being able to talk about how that happens is actually more useful than just talking about the algorithm behind it.
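As a sketch of what "explaining outcomes rather than algorithms" could mean in practice, the following toy audit treats a ranker as a black box and only analyzes the statistics of what it surfaces. The corpus, the quality tiers, and the stand-in ranker are invented for illustration; the point is the shape of the outcome analysis, not the model.

```python
# Sketch of outcome-based auditing: instead of inspecting a ranker's internals,
# run many queries and look at the statistics of what it returns. Everything
# here (corpus, quality tiers, the stand-in ranker) is an invented placeholder.
import random
from collections import Counter

random.seed(0)

# Pretend corpus: each document carries a source-quality tier we care about.
corpus = [{"id": i, "quality": random.choice(["high", "medium", "low"])}
          for i in range(1000)]


def black_box_rank(docs, k=10):
    """Stand-in for a complex ranker combining many signals.

    We treat it as opaque: we only observe which documents it returns.
    """
    scored = [(random.random() + (doc["quality"] == "high") * 0.5, doc)
              for doc in docs]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [doc for _, doc in scored[:k]]


# Outcome audit: simulate many queries and tally what ends up on top.
tallies = Counter()
n_queries = 500
for _ in range(n_queries):
    for doc in black_box_rank(corpus):
        tallies[doc["quality"]] += 1

total = sum(tallies.values())
for tier, count in tallies.most_common():
    print(f"{tier:6s} results: {count / total:.1%} of top-10 slots")
```

A real audit would slice the outcomes by query type, region, and time, but the principle is the same: you characterize the distribution of what the system produces rather than trying to read its weights.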
This has to do with levels of explanation. If you want to understand how a car works and I start teaching you quantum physics, you're not necessarily going to be helped, because although quantum physics does describe how the car works, the explanations would be really long and complex; but if I talk about mechanics, suddenly you have another understanding of the car, and then you can do things with the car. It's a bit the same with computer systems: talking about the algorithms is like talking about quantum physics. What you actually want to do is talk about the algorithm's outputs and the outcome analysis of the algorithm. So it's levels of explanation, and it's levels of explanation that help you be as transparent about what you're doing as possible. Thank you so much; I think we're out of time, unfortunately. Thank you, everyone; I hope you enjoyed this discussion as much as I did. Thank you very much. Thank you. But I will also address the question, if only in

Metadata

Formal Metadata

Title Fake, hate, and propaganda - What can technology do?
Series title re:publica 2017
Authors Lundblad, Nicklas
York, Jillian
License CC Attribution - ShareAlike 3.0 Germany:
You may use, change, and reproduce the work or its contents for any legal purpose, distribute it and make it publicly accessible in unchanged or changed form, provided that you credit the author/rights holder in the manner they specify and that you pass on the work or its contents, including in changed form, only under the terms of this license.
DOI 10.5446/32979
Publisher re:publica
Publication year 2017
Language English

Content Metadata

Subject area Computer Science
Abstract Join us for a session with Nicklas Lundblad on how Google is approaching fake news, hate speech and other policy challenges through partnerships and technology.
