
Rage Against The Machine Learning

Formal Metadata

Title
Rage Against The Machine Learning
Series Title
Number of Parts
275
Author
Contributors
License
CC Attribution 4.0 International:
You may use and modify the work or its content for any legal purpose and reproduce, distribute, and make it publicly available in unchanged or modified form, provided that you credit the author/rights holder in the manner specified by them.
Identifiers
Publisher
Publication Year
Language

Content Metadata

Subject Area
Genre
Abstract
This talk explains why audits are a useful method to ensure that machine learning systems operate in the interest of the public. Scripts to perform such audits are released and explained to empower civic hackers.

The large majority of videos watched by YouTube's two billion monthly users are selected by a machine learning (ML) system. So far, little is known about why a particular video is recommended by the system. This is problematic, since research suggests that YouTube's recommendation system exhibits important biases, e.g. preferring popular content or spreading fake news and disinformation. At the same time, more and more platforms like Spotify, Netflix, or TikTok are employing such systems.

This talk shows how audits can be used to take the power back and to ensure that ML-based systems act in the interest of the public. An audit is a 'systematic review or assessment of something' (Oxford Dictionaries). The talk demonstrates how a bot can be used to collect recommendations and how these recommendations can be analyzed to identify systematic biases. For this, a sock puppet audit of political topics in Germany, conducted in the aftermath of the 2018 Chemnitz protests, is used as an example.

The talk argues that YouTube's recommendation system has become an important broadcaster in its own right. By German law, this would require the system to give important political, ideological, and social groups adequate opportunity to express themselves in the broadcasted program of the service. The preliminary results presented in the talk indicate that this may not be the case: YouTube's ML-based system recommends increasingly popular but topically unrelated videos.

The talk releases a set of scripts that can be used to audit YouTube and other platforms. It also outlines a research agenda for civic hackers to monitor recommendations, encouraging them to use audits as a method to examine media bias, and motivates the audience to organize crowdsourced and collaborative audits.
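The collection step described in the abstract can be sketched in a few lines of Python. The following is not one of the scripts released with the talk but a minimal, hypothetical illustration: it assumes that YouTube's watch pages embed recommended video IDs as "videoId" fields in the initial HTML payload, and the seed video IDs are placeholders. A logged-in sock puppet or a headless browser would be needed to reproduce the personalization studied in the actual audit.

import re
import json
import time
import requests

# Hypothetical placeholder seeds; not the videos used in the talk's audit.
SEED_VIDEO_IDS = ["dQw4w9WgXcQ"]

WATCH_URL = "https://www.youtube.com/watch?v={video_id}"
VIDEO_ID_PATTERN = re.compile(r'"videoId":"([A-Za-z0-9_-]{11})"')

def fetch_recommendations(video_id, limit=20):
    """Fetch the watch page and extract candidate recommended video IDs.
    Heuristic: assumes recommendations appear as "videoId" fields in the
    initial HTML payload served by YouTube."""
    response = requests.get(
        WATCH_URL.format(video_id=video_id),
        headers={"Accept-Language": "de-DE"},  # audit German-language recommendations
        timeout=30,
    )
    response.raise_for_status()
    seen, recommended = set(), []
    for vid in VIDEO_ID_PATTERN.findall(response.text):
        if vid != video_id and vid not in seen:
            seen.add(vid)
            recommended.append(vid)
    return recommended[:limit]

def crawl(seeds, depth=2, per_video=5, pause=2.0):
    """Follow recommendations for a few hops, recording every edge so that
    the resulting graph can later be analyzed for systematic biases."""
    edges, frontier = [], list(seeds)
    for _ in range(depth):
        next_frontier = []
        for vid in frontier:
            recs = fetch_recommendations(vid, limit=per_video)
            edges.extend({"from": vid, "to": rec} for rec in recs)
            next_frontier.extend(recs)
            time.sleep(pause)  # be polite to the server
        frontier = next_frontier
    return edges

if __name__ == "__main__":
    print(json.dumps(crawl(SEED_VIDEO_IDS), indent=2))

The output is a list of recommendation edges that can be stored and later analyzed for systematic biases, for example with the popularity check sketched below.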
Keywords
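To probe the popularity bias mentioned in the abstract (recommendations drifting toward increasingly popular but topically unrelated videos), the crawled edge list can be joined with view counts. The sketch below assumes the edges and seeds produced by the crawler sketch above and uses the public YouTube Data API v3 videos endpoint with part=statistics; API_KEY is a placeholder, and this is an illustrative analysis rather than the talk's released methodology.

import statistics
import requests

API_URL = "https://www.googleapis.com/youtube/v3/videos"
API_KEY = "YOUR_API_KEY"  # placeholder; requires a YouTube Data API v3 key

def view_counts(video_ids):
    """Look up view counts for up to 50 video IDs (the API's per-call limit)."""
    response = requests.get(
        API_URL,
        params={"part": "statistics", "id": ",".join(video_ids[:50]), "key": API_KEY},
        timeout=30,
    )
    response.raise_for_status()
    return {
        item["id"]: int(item["statistics"].get("viewCount", 0))
        for item in response.json().get("items", [])
    }

def popularity_by_hop(edges, seeds):
    """Group crawled videos by their distance from the seeds and report the
    median view count per hop, to check whether recommendations drift toward
    ever more popular videos."""
    hop = {vid: 0 for vid in seeds}
    for edge in edges:  # edges are appended in crawl order, so parents precede children
        if edge["to"] not in hop and edge["from"] in hop:
            hop[edge["to"]] = hop[edge["from"]] + 1
    counts = view_counts(list(hop))
    per_hop = {}
    for vid, h in hop.items():
        if vid in counts:
            per_hop.setdefault(h, []).append(counts[vid])
    return {h: statistics.median(views) for h, views in sorted(per_hop.items())}

A rising median view count per hop would be consistent with the popularity bias described in the talk; a full audit would additionally compare topics to measure topical drift.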