The commander stays at home

Drones: perceptions and expectations surrounding technical systems capable of acting independently have been shaped, since 1984 at the latest, by the "Terminator" and the technical intelligence behind it, "Skynet" - a machine state with almost unlimited capabilities, surpassed only by its will to power, boundless amorality and hostility towards humanity. When we talk about "autonomous" systems, the images and story of the "Terminator" come to mind almost automatically. The technological reality is very different. Drones fly halfway around the world, crewless tanks and ships guard military installations or hunt submarines, and IT-supported command and weapon deployment systems have long been part of everyday military life - but all of this takes place under the control or close supervision of soldiers. This article combines the current state of the technology and the prospects for future developments with selected questions of international law and ethics. It is based on the findings of the treatise "Unmanned Systems and Cyber Operations: Armed Forces and Conflict in the 21st Century - An Introduction".

Examples of weapon systems

Below is a small selection of examples from the enormous global range, all of which can be found using web search engines:

  1. "ZAUNKÖNIG", Kriegsmarine torpedo, in service since September 1943: the world's first autonomous acoustic homing torpedo, used by submarines.
  2. AGM-158C LRASM (Long Range Anti Ship Missile), subsonic sea target missile: designed for use against surface targets at greater distances; is assigned a search area and one or more targets and can make a selection from the assigned targets during the approach.
  3. HARPY: anti-radiation loitering munition drone, designed to combat ground-based air defence radar systems. It is launched from a lorry and circles for several hours over an assigned area of operation; if it detects the radar emissions of a ground-based air defence system, it crashes into the source and destroys it.
  4. MQ-9 REAPER: Reconnaissance and combat drone to engage long-range ground targets, is either remote-controlled or can optionally carry out a programmed mission autonomously.
  5. BLACK HORNET PRS: the counterpart to large drones are nano-systems used as "personal reconnaissance systems" by infantrymen; with a length of 17 cm and a weight of 33 g, the BLACK HORNET stays in the air for around 25 minutes and can also reconnoitre indoor areas.
  6. AIRBUS VSR 700: as a medium-sized unmanned helicopter, the VSR 700 offers a wide range of possible applications; with a total weight of 700 kg, a payload of 150 kg and a flight time of 10 hours, it is the ideal unmanned "on-board helicopter" for corvettes and frigates with the options of reconnaissance and action.
  7. Future Combat Air System (FCAS): Germany, France and Spain want to stage an initial technology demonstration by the end of 2021 for a combination of a fighter aircraft, as the successor to the Eurofighter, and drones which, as "wingmen" of the fighter aircraft, are intended to create a capability network for reconnaissance, command and control and engagement. The system should be available to the armed forces from around 2040.
  8. PROTECTOR and SEAGULL: unmanned combat boats around 9 to 12 metres long for monitoring coastal waters, harbours, roadsteads and rivers, which can engage targets remotely or in automatic mode. Optionally equipped with machine guns, grenade launchers, mine-hunting equipment or ASW torpedoes. They can be deployed singly, but also as "wingmen" for manned boats, for example to protect special forces in amphibious operations. Such USVs could become part of the equipment of the Type 125 frigates, which can carry combat boats.
  9. SEA HUNTER, unmanned submarine hunting vessel of the US Navy, 10,000 NM range, 90 days endurance at sea, can be remote-controlled or operate in automatic mode. The PRC is planning to build unmanned submarines that can operate submerged for a comparable length of time.
  10. Large UUVs (Unmanned Underwater Vehicles): Boeing's ECHO VOYAGER remained submerged for 3 months in a 2017 test, and the US Navy has ordered five of its successor, the ORCA; the Chinese Navy is to receive a UUV in the 2020s that should likewise be able to operate submerged for months; Russia is developing "torpedoes" with nuclear propulsion and a nuclear warhead and a range of 10,000 NM, to be deployed by the 30,000-tonne submarine BELGOROD; after launch from the mother ship, the weapon could "lie in wait" for months.

All systems operate either remotely or on the basis of pre-programmed target assignments or procedures. Only take-offs and landings, transfers to an operational area or pre-programmed emergency procedures such as "lost link mode" take place without the intervention of soldiers.
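The "lost link mode" mentioned above can be pictured as a fully deterministic fallback rule that is written down before the mission. The following sketch is purely illustrative - the states, the heartbeat counter and the return-route action are assumptions for the example, not taken from any fielded system:

```python
from enum import Enum, auto

class Mode(Enum):
    REMOTE_CONTROL = auto()
    LOST_LINK = auto()

class LostLinkController:
    """Illustrative lost-link fallback: after a fixed number of missed
    heartbeats from the operator, the vehicle stops waiting for commands
    and flies a pre-programmed return route. Every transition is fixed
    in advance by the programmer."""

    def __init__(self, max_missed_heartbeats: int = 3):
        self.max_missed = max_missed_heartbeats
        self.missed = 0
        self.mode = Mode.REMOTE_CONTROL

    def on_tick(self, heartbeat_received: bool) -> str:
        if heartbeat_received:
            # Link is up: reset the counter and obey the operator.
            self.missed = 0
            self.mode = Mode.REMOTE_CONTROL
            return "execute operator command"
        self.missed += 1
        if self.missed >= self.max_missed:
            # Link considered lost: fall back to the stored procedure.
            self.mode = Mode.LOST_LINK
            return "fly pre-programmed return route"
        return "hold course, await link"
```

Because every transition is written down beforehand, the operator can predict exactly what the vehicle will do without the link - which is precisely why such a procedure is automation, not autonomy.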

Sea Hunter


 

There are no "autonomous" systems

The best basis for categorising systems as "autonomous" or merely "automated" is provided by the United Kingdom's "Joint Doctrine Publication 0-30.2 Unmanned Aircraft Systems", which describes the special features of unmanned flying systems and, above all, makes it clear what current systems cannot do. A distinction is made between two categories.

The doctrine defines an "automated system" as follows:

"... an automated or automatic system is one that, in response to inputs from one or more sensors, is programmed to logically follow a predefined set of rules in order to provide an outcome. Knowing the set of rules under which it is operating means that its output is predictable."

And an "autonomous system" as follows:

"An autonomous system is capable of understanding higher-level intent and direction. From this understanding and its perception of its environment, such a system is able to take appropriate action to bring about a desired state. It is capable of deciding a course of action, from a number of alternatives, without depending on human oversight and control, although these may still be present. Although the overall activity of an autonomous unmanned aircraft will be predictable, individual actions may not be."

This definition has a weakness, however: systems that can be monitored and influenced or stopped by humans at any time are not autonomous. The original term "autonomy" describes a human characteristic, a state of self-determination through freedom of decision and action. The definition of an autonomous system in this sense therefore needs to be supplemented:

Autonomy only begins when a system can start and end its actions independently of humans, i.e. when humans have no possibility of intervening in the system's actions.

The ZAUNKÖNIG is therefore an automated system, first because the operational decision lies with the human and second because it is bound to its mechanically defined action programme. LRASM and HARPY likewise execute pre-programmed options, albeit at a technologically higher level - they do not make their own decisions, even if they carry out their task in the operational area independently of the operator. In other words: independent operation is not sufficient for "autonomy" if the system merely follows its mechanical constraints or algorithms, because it has no freedom of choice whatsoever. Many of the systems mentioned above are remotely controlled by humans. Is it enough for the label "autonomous" that a system has no shutdown function, no option for the operator to abort the mission? As long as the system follows a programme and has no way to break out of the pre-programmed framework, the answer is clearly "no".

The military systems mentioned above fall entirely into the "automated" category. The following applies to all of them: in automatic mode, they merely realise programmed intentions in specific individual cases at the command of a human. None of the systems analysed above is an "autonomous" system in the sense defined here. It is therefore appropriate to refer to them as "unmanned systems", as this is the only characteristic common to all of them.
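The doctrinal distinction can be restated in software terms: an automated system is a fixed mapping from sensor inputs to actions, so anyone who knows the rule set can predict the output. A minimal, purely hypothetical sketch of a HARPY-style engagement rule (the function name, threshold and action strings are invented for illustration):

```python
def loiter_step(radar_signal_strength: float, threshold: float = 0.8) -> str:
    """Deterministic engagement rule of an automated loitering system:
    the same input always produces the same action, so the behaviour is
    fully predictable from the rules. Values are illustrative only."""
    if radar_signal_strength >= threshold:
        return "dive on emitter"     # pre-programmed option is executed
    return "continue loitering"      # no emission detected: keep circling
```

The system never chooses among self-generated goals; it only follows the branch its programmers wrote - automation in the sense of the UK doctrine, not autonomy.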

 

Sky Guardian


 

International humanitarian law and automated systems

International humanitarian law (IHL) binds the parties to a conflict and all persons acting on their behalf. The Viadrina International Law Project (VILP) is one source among many for the underlying legal texts. The legal aspects of the use of automated systems are explained here using the four most important rules of IHL, in the easy-to-understand wording published on the website of the Federal Foreign Office.

Rule 1

Neither the parties to the conflict nor the members of their armed forces have unrestricted freedom in their choice of methods and means of warfare. The use of any weapons and combat methods that cause superfluous injuries and unnecessary suffering is prohibited.

Prohibited means and methods include, for example: poison gas, lasers intended to cause permanent blindness, munitions designed to cause suffering beyond the militarily required effect of injuring and killing, and terrorist attacks against civilian populations. The legal basis is Article 35 of Additional Protocol No. 1 to the Geneva Conventions. The use of machines as such does not violate these rules; what matters is the specific deployment and the ammunition used. If an operator uses a drone for a terrorist attack directed solely against uninvolved civilians, or fires prohibited ammunition from the drone, this is a violation of IHL. It makes no difference whether the drone is remote-controlled or carries out the attack without operator intervention on the basis of its programming.

Rule 2

In order to protect the civilian population and civilian objects, a distinction must be made at all times between the civilian population and combatants. Neither the civilian population as a whole nor individual civilians may be attacked. Attacks may only be directed at military targets.

How is this to be judged when HARPY attacks an enemy radar stationed in the centre of a densely populated village? The radar system is a legitimate military target. The attack is not directed exclusively or primarily against the civilian population; the attack on the radar system is therefore permissible as long as no excessive collateral damage is caused. This follows from Articles 35, 51 and 57 of Additional Protocol No. 1 to the Geneva Conventions. There is no difference in legal judgement compared to the pilot of a fighter aircraft, who would likewise have to decide in favour of a missile strike against the enemy radar system deployed in the residential area in order to protect himself and the other units deployed, even if civilians could be killed in the process. Precision has been an effective means of reducing the number of civilian casualties for decades; before the introduction of homing missiles, a larger radius would have had to be covered with bombs or shells to disable a radar system reliably. The situation also requires a look at the other side: the party that positions its radar in a residential area, and thus instrumentalises the civilian population as human shields, violates Rule 1. Articles 48 and 58 (b) of Additional Protocol No. 1 to the Geneva Conventions are violated. Positioning the radar in the midst of the civilian population makes it difficult for the enemy to distinguish and separate this military target from non-legitimate targets - this too can be qualified as a combat method that can lead to unnecessary injuries, killings and unnecessary suffering.

Rule 3

Combatants and civilians under the control of an opposing party are entitled to respect for their lives and dignity. They must be protected from any acts of violence or reprisals.

Custody is exercised by soldiers; at present, unmanned systems are not suitable for this task.

Rule 4

It is forbidden to kill or injure an opponent who surrenders or is unable to continue the fight.

One of the legal bases is Article 41 of Additional Protocol No. 1 to the Geneva Conventions. Machines, like soldiers, must be able to fulfil the requirement to protect an enemy who is out of action, at least when they act without the involvement of an operator - which, however, will probably not be possible in the near future given the current limits of the perception capabilities of technical systems. In combat at sea, the large distances involved make it difficult even for humans to recognise that an enemy ship has stopped simply to allow the crew to disembark. No higher demands can be placed on remote-control operators or the programmers of technical systems than on the crew of a combat ship observing the stopping. In land combat, machines are supported by soldiers, similar to the combined deployment of tanks and infantry. Soldiers can recognise surrendering or incapacitated opponents and take them prisoner, thus ensuring that IHL can be observed.

From the point of view of the four main rules of IHL, the following applies: machines must be capable of being brought under control at all times and must not kill indiscriminately. The systems listed above do not in themselves constitute a violation of the rules of IHL; what matters is how they are used.

 

Remus


 

Is SEA HUNTER a warship within the meaning of Art. 29 of the Convention on the Law of the Sea?

The examination of this question shows that problems under international law arise in rather unexpected places. The United Nations Convention on the Law of the Sea (UNCLOS) of 1982 defines warships in Article 29:

For the purposes of this Convention, "warship" means a ship belonging to the armed forces of a State bearing the external marks distinguishing such ships of its nationality, under the command of an officer duly commissioned by the government of the State and whose name appears in the appropriate service list or its equivalent, and manned by a crew which is under regular armed forces discipline.

Maritime drones have been in use worldwide for decades as subsystems of warships, from which they are deployed for limited periods of time and within limited areas - for example, the mine-hunting diving drones of the German Navy of the PINGUIN or SEEHUND type.

A system like SEA HUNTER could not have been imagined at the time UNCLOS was drafted (in the 1970s) - it is not a subsystem of a warship. Of the three conditions of Article 29, external marking is the easiest to fulfil. Subordination to the command of an officer can also be fulfilled - this indisputably means "command authority", and in the age of remote data transmission the presence of the officer on board is not mandatory. The wording of UNCLOS - "manned by a crew" - is certainly based on the assumption that a crew is in fact always on board "its" ship. This is a clear requirement, but also an unintentional loophole created by technical progress, which can be closed by analogy. The SEA HUNTER can therefore be described as a warship. This will be controversial among states, as this status is associated with various duties of conduct and powers, for example when passing through foreign territorial waters.

Unmanned systems and ethics

Soldiers have been dealing with increasingly automated systems for a long time, but it is foreseeable that the range, speed and options for action of these systems will continue to increase. Missile defence systems and other machines are increasingly turning the soldier into an observer and controller of the system, who only becomes active in the event of deviations from the norm. The use of driving, flying, floating or diving drones withdraws soldiers from the front line of fire, although long-range weapons offset this to some extent. Nevertheless, the typical risks of soldiering are changing both qualitatively and quantitatively with the increasing use of automated systems.

Soldiers under fire from enemy automated systems will inevitably develop a new perception. If there is no longer a soldier fighting in the attacking enemy machine, and it becomes clear that one is risking life and limb while only being able to cause damage to property on the other side, this will have an impact on motivation and mental attitude. It is possible that the realisation that only easily replaceable technical equipment is being destroyed, and that neither toughness nor the willingness to make sacrifices offers any advantage, will reduce the motivation to fight. On the other hand, the inhibition against killing no longer applies, and the soldier does not have to worry about legal or ethical questions regarding the use of means, unless it comes to the choice of means whose effect goes beyond fighting the enemy machines. Does it make any ethical difference if a soldier is shot at by an automated machine? In terms of the unpredictable contingencies of combat, it makes no difference: it makes injuries, agony, suffering or death neither better nor worse from the soldier's point of view.

It cannot be completely ruled out that technical systems will one day be able to make decisions on the "whether" and "what" of a weapon deployment. Even if there are currently still clear technological limits, this is within the realms of possibility - regardless of whether this responsibility is actually delegated to such systems. Does it make an ethical difference whether a soldier decides on the start and end of the use of force, merely observes the machine or even stays out of the process altogether? Certainly yes. The core of a soldier's job is to make precisely this difficult decision and not to delegate it to machines. When people make existential decisions involving ethics and emotion, their own vulnerability and mortality always resonate - no machine will ever be able to simulate this, because human decisions are always felt and not just intellectualised as an abstract train of thought. The responsibility for lethal decisions must always remain with the human being, who is aware of his own vulnerability and mortality, precisely because this decision is or at least can be existential not only for the opponent.

Another urgent ethical question arises: can the leadership of a party to a conflict justify pitting its soldiers against unmanned systems if it equips its armed forces with no unmanned systems, or with significantly fewer than its opponent? In the knowledge that the machines have an advantage simply due to their faster OODA loops. In the knowledge that the enemy can replace destroyed machines, but that one's own fallen or wounded soldiers are lost for good or must be taken out of the battle. In the knowledge that, in defending one's own country, entire generations may be "burned out" against the constantly replenished systems of the enemy. In the knowledge that one's own soldiers will not be able to adequately protect one's own civilian population.

 

Dr Michael Stehr, born in 1966, completed his doctorate in constitutional organisation law under Hans-Peter Schneider in Hanover and has been working for MarineForum since 2000. He has been publishing on political, legal, military and technical aspects of maritime security and European security and defence policy since 1992.

This article is an extract from the author's new book
Unmanned systems and cyber operations: Armed Forces and Conflict in the 21st Century - An Introduction
Hamburg 2020, ISBN 978-3-8132-1103-0


Photos: US Navy
