Regulating Autonomous Weapons

Speech Transcript
Moderator: ... Anja Dahlmann, a political scientist and researcher at Stiftung Wissenschaft und Politik, a Berlin-based think tank. Here we go.

Dahlmann: Thank you, and thanks for being here. I am going to talk about preventive arms control and international humanitarian law, IHL, in the international debate around autonomous weapons. This type of weapon is also referred to as lethal autonomous weapons systems, LAWS for short, or killer robots. When I say LAWS I mean these weapons and not legal laws, just to avoid confusion. I will discuss this topic along three questions: first, what are we actually talking about, what are autonomous weapons; second, why should we even care, why is this important; and third, how could this issue be addressed at the international level.

So, what are we talking about? In the international negotiations so far, no common definition has been found; states parties are still trying to agree on one. For this presentation I will simply use a very broad definition of autonomous weapons: weapons that can, once activated, execute a broad range of tasks or select and engage targets without further human intervention. A very broad spectrum of weapons might fall under this definition, including some existing ones. The Phalanx system, for example, has been around since the 1970s; it is a ship-based air-defence system meant to defend the ship against incoming objects from the air. It has existed for quite a long time and might or might not be covered by this LAWS definition. Just to give you an impression of how broad the range is today, there are demonstrators like the Taranis drone from the UK or the X-47B, which can, for example, autonomously land on aircraft carriers and be refuelled in the air, which is apparently quite impressive because you do not need a human to do that. In the future there will probably be even more autonomous functions, in navigation, landing, refuelling and so on, and at some point weapons might be able to choose their own ammunition according to the situation, or choose the target and decide when to engage it, without any human intervention. That is quite problematic, and I will tell you why in a minute. Overall, you can see a gradual decline of human control over weapon systems and over the use of force. So that is a very short and broad impression of what we are talking about. Beyond the definition, it is always interesting what we are not talking about, which is why I want to address some misconceptions in the public debate.
First of all, when we talk about machine autonomy or artificial intelligence, which is the technology behind this, people in the broader public often get the idea that these machines might have some kind of real intelligence or intention, that they are entities in their own right. They are not. It is just statistical methods, and you probably know far more about this than I do, so I will leave it at that and simply highlight that these machines, these weapons, have certain competences for specific tasks. They are not entities in their own right and they are not intentional, and that matters when we talk about the ethical and legal challenges afterwards.

Connected to this is the second misconception: the Terminator references in the media as soon as we talk about autonomous weapons or killer robots. Just in case you intend to write an article about this, please do not use a Terminator picture, because it is really unhelpful for understanding where the problems are. With that kind of imagery people assume the problems only start once we have machines with human-like intelligence that look like the Terminator. The problems start well before that: when you use assistance systems, when you have human-machine teaming, when you accumulate a couple of autonomous functions across the targeting cycle, the military steps that lead to the use of force, to the killing of people. Please keep this in mind, because it is not just semantics to differentiate these two things; it manages the expectations of political and military decision makers.
OK, so now we have an impression of what I am talking about. Why should we actually talk about this, what is all the fuss about?

Autonomous weapons would have quite a few military advantages. They might in some cases be faster or even more precise than humans, and you do not need a constant communication link, so you do not have to worry about jamming, latency, detection or the vulnerability of that link. A lot of interesting military options follow from that: stealthy operations in shallow waters, for example, or remote missions in secluded areas, and you can get very creative with robot swarms. Shiny new options, but of course they come at a price, because there are at least three dimensions of challenges around these weapons.

First, the legal challenges. These weapons would be applied in conflicts where international humanitarian law, IHL, applies, and IHL consists of quite a few very abstract principles, for example the principle of distinction between combatants and civilians, the principle of proportionality, and military necessity. They are very abstract, and I am pretty sure human judgment will always be needed to interpret these principles and apply them to a dynamic situation; feel free to correct me later if I am wrong. If you remove the human from the targeting cycle, this human judgment might be missing, and therefore military decision makers have to evaluate very carefully the quality of human control and human judgment within the targeting cycle. So that is law.

The second dimension of challenges are security issues. These new systems are cool and shiny, whatever type of weapon they are, and they have the potential to start an arms race between states, so they might actually make conflict more likely simply because states want to have them and feel threatened by them. A second aspect is proliferation: autonomy is based on software, which can be transferred easily and is really hard to control, and most of the other components are available on the civilian market, so you can build this stuff on your own if you are smart enough. So we might see more conflicts through these types of weapons, and it might become more difficult to control the proliferation of the technology. The third aspect, and the one that especially worries me, is the potential for escalation within a conflict. When both sides use lethal autonomous weapons and these very complex systems interact, it becomes very hard to predict how they are going to behave, and they will increase the speed of the conflict, so humans might not even have a chance to process what is going on. You can see something similar in high-frequency trading on the stock market, where problems arise and it is very difficult for humans to understand what is happening. So those are some of the security issues.

And the last, maybe the most important, dimension is ethics. When you use autonomy in weapons, when you use artificial intelligence, there is no real intention, no real entity behind it, so the killing decision might at some point be based on statistical methods, with no human involved in that moment. That is worrying for a lot of reasons, but it could also constitute a violation of human dignity. You can argue that humans can be killed in war, but they at least have the right to be killed by another human, or at least by the decision of another human; we can discuss this later, but in this regard it would be highly unethical. And that really just scratches the surface of the problems and challenges that would arise from the use of these autonomous weapons. I have not even touched on the problems with training data, accountability, verification and all that fun stuff, because there is not enough time for that here.
So how can this issue be addressed? States have, thanks to a huge campaign of NGOs, noticed that there might be some problems and a necessity to address this issue. They currently do this within the UN Convention on Certain Conventional Weapons, CCW, where they discuss a potential ban on the development and use of these autonomous weapons, or of weapons that lack meaningful human control over the use of force; those ideas are around. Such a ban would be the maximum goal of the NGOs, but it becomes increasingly unlikely that it happens. Most states do not agree with a complete ban; they want to regulate a bit here and there, and they really cannot find a common definition, as I mentioned before. If you use a broad definition like the one I used, you will notice that existing systems fall under it which might not be that problematic and which you do not want to ban, and you might also stop civilian or commercial developments, which you do not want to do either. So states are stuck in this regard, and they also challenge the notion that we need preventive arms control at all, that we need to act before these systems are deployed on the battlefield. This is roughly the fourth year of these negotiations, and we will see how it goes this year. If they cannot find common ground, it becomes more likely that the issue moves to another forum, just as with anti-personnel mines, for example, where the treaty was concluded outside of the United Nations. But the window of opportunity is really closing, and states and NGOs have to act and keep track of this. Just as a side note, since probably quite a few people here are members of NGOs: if you look at the Campaign to Stop Killer Robots, the big campaign behind this process, there is only one German NGO involved, Facing Finance. So especially if you are German and interested in AI, it might be worthwhile to look into the military dimension as well; the campaign really needs expertise in that regard, especially on AI and the related technologies.
OK, just in case you fell asleep during the last fifteen minutes, I want you to take away three key messages. First, please be aware of the trends and the internal logic that lead to autonomy in weapons. Second, do not overestimate the abilities of autonomous machines; they do not have intent or anything like it, and since you probably knew this already, please tell other people about it and educate them about this type of technology. And third, do not underestimate the potential dangers for security and human dignity that come from this type of weapon. I hope that I could raise some more interest in
this particular issue. If you want to learn more, you can find really interesting sources on the website of the CCW, at the Campaign to Stop Killer Robots, and from the research project I happen to work in, the International Panel on the Regulation of Autonomous Weapons; we have published a few studies on this and are going to publish a few more. So check that out, and thank you for your attention.

Moderator: Thank you, and we have some time for questions and answers. First of all I have to apologise that we had a hiccup with the sign-language interpretation so that it could not happen on stage; I am terribly sorry about that, and we will fix it. People are queuing at the microphones already, so we start with microphone number one.

Question: Thanks for the talk. Don't you think there is also a possibility to reduce war crimes by taking the decision away from humans and by having algorithms decide, which are actually auditable?

Dahlmann: That is an excellent question, and it is discussed in the international debate as well: machines might be more ethical than humans could be. Well, of course they will not start committing atrocities just because they want to, but you can program them to do so, so you really just shift the problem. And maybe these machines do not get angry, but they do not show compassion either; if you are the potential target, they will not stop, they will just kill you and not think twice about it. So you really have to look at both sides there.

Moderator: Thanks. We switch over to microphone three, please.

Question: Hi, thanks for the talk. Regarding autonomous cars, self-driving cars, there is a similar discussion going on regarding ethics: how should a car react in the case of an accident, should it protect the people outside or the people inside, and so on. Do you work with some of the people in this area, is there any collaboration?

Dahlmann: There is less collaboration than you might think, but of course we monitor this debate as well, and we think about possible applications of its outcomes, for example from the German ethics commission on automated driving. I am a bit torn on that, because when we talk about weapons, they are designed to kill people, and cars mostly are not; with this ethics committee you want to avoid killing people, or decide what happens when an accident occurs. So it is different, but of course you can learn a lot from those discussions, and in a way we have similar goals.

Moderator: Microphone number two, please.

Question: Also from me, thanks for bringing some professionalism into this debate, because the protest scene around here likes to protest against very specific things, for example the Ramstein air base, and in my view that is a bit misguided if you just go out and protest in a populist way without involving the kind of expertise that you offer. So thanks again for that, and my question: how would you propose that protesters develop themselves to a higher level, to be more effective on the one hand and, on the other, more considerate of what is at stake on all the levels and on all the sides involved?

Dahlmann: Well, first of all, the Ramstein issue is a
completely different topic; that is drone warfare with remotely piloted drones. There are a lot of problems with that, with targeted killing, but it is not about lethal autonomous weapons in particular. If you want to be part of the international debate, there is of course the Campaign to Stop Killer Robots; they have a lot of really good people and a lot of resources, literature and things like that, so you can really educate yourself about what is going on. That is a starting point, and then just keep talking to scientists about this; it is always helpful for scientists to talk to people in the field.

Moderator: Thank you. The signal angel has a question from the internet.

Signal Angel: A question from the internet, about killer robots that can, for example, detect a nuclear power plant...
Moderator: Sorry, can you get closer to the microphone, please? I am sorry, we cannot quite hear it, so we switch over to microphone two, please.

Question: I have one little question. In your talk you were focusing on the ethical questions related to lethal weapons. Are you aware of the ongoing discussions regarding the ethical aspects of the design and implementation of less-than-lethal autonomous weapons for crowd control and similar purposes?

Dahlmann: Within the CCW, every term of "lethal autonomous weapon systems" is disputed, including the lethal aspect. For the regulation it might be easier to focus on lethal systems for now, because less-than-lethal weapons come with their own problems, including the question whether they are ethical and whether IHL applies to them. But I am not really deep into this discussion, so I have to leave it at that.

Moderator: Microphone one, please.

Question: Hi, thank you for the talk. My question is about the decreasing cost of hardware and software over the next twenty to forty years and about actors outside of the nation-state context, private forces or non-state actors, gaining use of these weapons. Do things like the UN convention cover that, or could they just get the weapons from some supplier? Is anyone considering private individuals trying to leverage these against others?

Dahlmann: I am not sure what the convention says about this, I am not a member there, but the CCW mostly focuses on international humanitarian law, which is important but, I think, not broad enough. Questions like proliferation, which are connected to your question, will probably not be part of a regulation there; they are discussed on the edges of the debates and negotiations, but they do not really seem to be an issue there.

Moderator: Thanks. The next microphone, please.

Question: As a researcher, do you know how far the development has already gone? How transparent or intransparent is your view into what is being developed and researched on the military side, by people working with autonomous weapons and developing them?

Dahlmann: For me it is quite intransparent, because I only have access to publicly available sources, so I do not really know what is going on behind closed doors in the militaries or the ministries. But you can monitor civilian applications and developments, which tell you a lot about the state of the art, and, for example, DARPA, the American defence research agency, sometimes publishes calls for papers, so you can see which areas they are interested in; they really like the idea of autonomous, connected swarms, for example. So we try to piece that together, but we do not know for certain.

Moderator: OK, time for one or two more questions, so we switch over to microphone three, please.

Question: Thank you. Looking at all of these lethal weapon systems, one thinks of the millions of landmines that are still lying around. So the question is: should it not be possible to ban these weapon systems the same way landmines have already been banned by several countries, simply by including them in that definition, because the argument should be very similar?
Dahlmann: Yes, that does come to mind, of course, because these mines are just lying around, no one is directing them, you step on them and boom. Well, it depends, first of all, a bit on your definition of autonomy. I would call something autonomous when it acts in dynamic situations, while the other systems would be automated, and so on. I do not want to define autonomy through this alone, but this acting in more dynamic environments and the aspect of machine learning make these systems far more complex, and they bring different problems. Landmines, and anti-personnel mines in particular, are problematic and banned for good reasons, but they do not have the same problems as these systems, so it would not be sufficient to simply add them to that ban.

Moderator: Thank you very much. I cannot see anyone else queuing up, so thank you, and the applause is yours.

Metadata

Formal Metadata

Title Regulating Autonomous Weapons
Subtitle The time travelling android isn’t even our biggest problem
Series Title 34th Chaos Communication Congress
Author Dahlmann, Anja
License CC Attribution 4.0 International:
You may use and adapt the work or its content for any legal purpose, and reproduce, distribute and make it publicly available in unchanged or changed form, provided that you credit the author/rights holder in the manner specified by them.
DOI 10.5446/34923
Publisher Chaos Computer Club e.V.
Publication Year 2017
Language English

Content Metadata

Subject Area Computer Science
Abstract Depending on the definition, autonomous weapon systems do not and might never exist, so why should we care about killer robots? It is the decline of human control as an ongoing trend in military systems and the incapacity of computing systems to „understand“ human beings and the nature of war that is worrisome.
Keywords Ethics, Society & Politics
