- Within 10 years, the world's top 10 militaries – with the possible exception of India – will be nearly or fully automated.
- Lethal Autonomous Weapons Systems will be guided by their own algorithms and be impossible to stop once launched – by deliberate design.
- Robotic Autonomous Systems will become cheaper and more widely used by small and mid-size militaries around the world over the next decade.
- Voluntary Measures to try to control these systems are more likely than Hard Laws such as negotiated treaties or Soft Laws such as codes of conduct. However, public backlash against autonomous weapons systems is likely.
- Any remaining Human Military will play subordinate roles supporting and maintaining these autonomous systems.
On September 26, 1983, the Soviet Union nearly launched a retaliatory nuclear strike against the U.S. after its early-warning system indicated an incoming U.S. ballistic missile. Soviet officer Stanislav Petrov, suspecting the system was sending a false signal, decided to wait for further confirmation rather than report the incident up the chain of command – a decision that may very well have prevented a nuclear conflict.
What had triggered the incoming missile on the Soviet early-warning systems? Sunlight reflecting off high-altitude clouds.
On January 13, 2018, the actions of a supervisor at the Hawaii Emergency Management Agency (HEMA) caused a warning of incoming missiles to be broadcast to cellphones across Hawaii and over other media – interrupting a basketball broadcast on the U.S. East Coast and a soccer match in London. The supposed missile attack was assumed to originate from North Korea.
What had triggered the warning? Erratic behavior by the HEMA supervisor, who confused a drill with the real event at a time when worries about expanding North Korean missile capability had Hawaii – and even the West Coast – on heightened alert.
Will accelerating automation across militaries around the world, including the use of autonomous weapon systems, make these types of incidents:
- less likely?
- even more likely?
Or, even more lethal?
Like it or not, we are entering the realm of science fiction – except it is no longer fiction, and in fact has not been for decades. As Isaac Asimov postulated in his Three Laws of Robotics:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
– Isaac Asimov, “Runaround” (short story, 1942)
Replace the word “robot” with “Autonomous Weapons System” (AWS), imagine an AWS obeying the Third Law while ignoring the First and Second, and you get a sense of what the current AWS arms race will be like.
Put another way:
I know not with what weapons World War III will be fought, but World War IV will be fought with sticks and stones. – attributed to Albert Einstein
Let’s first establish a few definitions:
Automated system: An automated system is one that has been instructed to automatically perform a set of specific tasks or series of tasks within human-set parameters. This may include basic or repetitive tasks.
Autonomous system: The Defence Science and Technology Laboratory (Dstl UK) defines an autonomous system as one that can exhibit autonomy. There is no agreed definition of autonomy, but Dstl defines it as ‘the characteristic of a system using AI to determine its own course of action by making its own decisions’. Autonomous systems can respond to situations that were not pre-programmed.
Artificial intelligence: There is no universally agreed definition of AI, but it usually refers to a broad set of computational techniques that can perform tasks normally requiring human intelligence. AI is an enabling technology for higher levels of autonomy.
Machine learning: a branch of AI that has underpinned the most recent advances in technologies with autonomous capabilities.
Robot: A powered machine capable of executing a set of actions by means of direct human control, computer control, or both. It consists of a platform, software, and a power source.
Robotic and Autonomous Systems (RAS) is a framework to describe systems with both a robotic element and an autonomous element. It is important to note that each of these parts of RAS covers a broad range of possibilities.
- Systems refers to a wide variety of physical systems over a wide range of application areas, including military of course.
- Automated software systems running on computers or networks, including ‘bots’, pieces of software that can execute commands with no human intervention, do not qualify as RAS because they lack a physical component.
- Robotic means the physical layout of the system and indicates that the system is unmanned or uninhabited. Other physical aspects (form, shape, whether it flies or rolls on the ground etc.) are not necessary to define.
Lethal Autonomous Weapon System (LAWS): a weapon that, without human intervention, selects and engages targets according to predefined criteria, following a human decision to deploy it on the understanding that an attack, once launched, cannot be stopped by human intervention.
Meaningful human control (MHC): a broad term that means several things:
- Humans make informed, conscious decisions regarding the use of weapons
- Humans are adequately informed to ensure that the use of force complies with international law, within the knowledge that they have about their target, the weapon, and the context in which the weapon is used
- The weapon has been designed and tested in a realistic operational setting and the people involved have received adequate training, enabling them to use the weapon in a responsible manner.
Military-Industrial Complex (MIC): a network of individuals, institutions, and large corporations that are involved in the technologies, weapons, vehicles, software, platforms, and processes in the military.
Automation: the use of systems to perform tasks that would ordinarily involve human input, in this case military personnel.
DAIC: (Defence AI Centre) a UK government-funded centre that studies artificial intelligence and autonomous systems in the military.
CDE: (Centre for Defence Enterprise) a UK organization that works with the private sector to produce innovative technologies for military use.
Drone swarming: the deployment of multiple drones capable of communicating and coordinating with each other, as well as with human personnel, to achieve an objective. In a military setting, a swarm might be used to:
- monitor an area,
- relay information, or
- attack a target.
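The coordination behavior behind swarming can be sketched with a toy model – purely illustrative, not any real military system: each drone steers toward a shared objective while keeping a minimum separation from its neighbors.

```python
import math

# Toy sketch (illustrative only): a swarm of drones that share positions
# and converge on a target while keeping separation from neighbors.
def step_swarm(positions, target, speed=1.0, min_sep=2.0):
    """Advance every drone one tick toward the target, nudging apart
    any pair that gets closer than min_sep."""
    new_positions = []
    for i, (x, y) in enumerate(positions):
        # Vector toward the shared target.
        dx, dy = target[0] - x, target[1] - y
        dist = math.hypot(dx, dy) or 1e-9
        step = min(speed, dist)
        nx, ny = x + dx / dist * step, y + dy / dist * step
        # Simple separation: push away from any too-close neighbor.
        for j, (ox, oy) in enumerate(positions):
            if i != j:
                sep = math.hypot(nx - ox, ny - oy)
                if 0 < sep < min_sep:
                    nx += (nx - ox) / sep * (min_sep - sep)
                    ny += (ny - oy) / sep * (min_sep - sep)
        new_positions.append((nx, ny))
    return new_positions

swarm = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
target = (50.0, 50.0)
for _ in range(200):
    swarm = step_swarm(swarm, target)
# All drones end up loitering near the target without colliding.
assert all(math.hypot(x - 50, y - 50) < 5 for x, y in swarm)
```

The point of the sketch is that no central controller is needed: each drone runs the same simple local rule, which is what makes swarms cheap to scale and hard to decapitate.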
Who is this industry run by right now?
To answer this question, it is necessary to talk about the Military-Industrial Complex, which is a concept made famous by President Eisenhower in his farewell speech on television in January of 1961. As explained above in our definitions, it is a network of individuals, companies, government agencies, departments and other institutions like think tanks that provide the weapons, technology and policy proposals that the U.S. Military (as well as other militaries with their own industrial complexes) needs to function.
Some of the major corporate players in the MIC in the U.S. and globally are:
- Lockheed Martin
- Northrop Grumman
- General Dynamics
- L3Harris Technologies
- Huntington Ingalls Industries
- Booz Allen Hamilton
- General Electric
Obviously, the Department of Defense, Homeland Security, the various intelligence agencies, and the White House are all part of, or closely tied to, the MIC. Congress theoretically has oversight of defense spending, but lobbying and executive privilege mean it tends to follow the trends set by the MIC.
Let’s now see how much is spent on defense by MICs around the globe.
First, total defense spending in the U.S. in 2019 was $731.8 billion – roughly three times what China reportedly spends.
However, off-the-books spending on top-secret projects means the real figure for U.S. defense spending is likely higher – as are, likely, the figures below for countries like China and Russia, and perhaps for other countries as well.
Additionally, MICs exist in other countries with a substantial industrial and military presence – or a smaller but innovative one, in the case of Israel:
- China – $261.1 billion
- India – $71.1 billion
- Russia – $65.1 billion
- Saudi Arabia – $61.9 billion
- France – $50.1 billion
- Germany – $49.3 billion
- UK – $48.7 billion
- Japan – $47.6 billion
- South Korea – $43.9 billion
- Brazil – $26.9 billion
- Italy – $26.8 billion
- Australia – $25.9 billion
- Canada – $22.2 billion
- Israel – $20.5 billion
- Turkey – $20.4 billion
These numbers show that India and Saudi Arabia are major players in the Military-Industrial Complex space. However, India does more research in-house, while the Saudis tend to contract the work out to Western companies. Furthermore, Saudi defense spending seems to be diversifying its subcontractor base toward more non-Western companies.
Let’s now look at the military’s impact on employment – specifically, how automation may affect employment within the military itself; in other words, how the balance between the human military and the autonomous machine military will shift over the coming years. Employment numbers among defense contractors are therefore not part of the table below; we will look only at active-duty militaries.
Approximate military employment numbers
|Country||Active-Duty Military||As a Percentage of Country’s Labor Force|
|South Korea||600 thousand||2.0%|
Several things can be observed. Israel’s military, as a percentage of its labor force, is double South Korea’s, which is in turn double that of the U.S. and Turkey. This reflects the unique geopolitical situations Israel and South Korea face, as well as Israel’s relatively small population. Note that if China had the same percentage of its labor force in the military as Israel, its armed forces would be as large as the world’s combined armed forces.
While France’s active-duty military is close to the global average, the remaining countries in our table are all well below the global average of 0.6%. This clearly indicates how automation has already made militaries in developed nations increasingly dependent on technology rather than on humans, compared to militaries in smaller or less-developed nations.
What is the main objective of the Military Industry – including the growth model, right now?
- Maintain fighting power, both currently and into the foreseeable future.
- Innovate in conjunction with the private sector in order to maintain fighting power.
- Develop, integrate into your armed forces, and utilize what are called Robotic and Autonomous Systems (RAS – see Key Definitions above).
- Train your human military to be able to function with and alongside your autonomous machine military – this will be one of the main challenges of integrating RAS with the human military.
- Solve the problem of Big Data – defined as data sets too large and complex to manage with standard methods – and of machine learning in conflict environments.
- Utilize RAS systems to provide:
- Protection, and
What is the current investment in automation in this industry?
|No investment||Low investment||Moderate investment||High investment||Full automation|
|Most tasks performed manually or with little automation||Automation limited to only a few areas or processes||Automation in wider range of areas & processes – more integrated into daily operations||Automation widely adopted & integrated into most areas & processes yielding significant improvements in productivity||Almost all tasks & processes automated with humans focused on maintenance and supervision of automated systems|
|Most of the less-developed countries although this will change as the price of RAS systems falls.||Mid-sized militaries in the more advanced developing countries like Brazil, although this should rapidly change over the following decade. (Additionally, there is the danger of terrorist organizations including narcotics cartels being able to reach this level of investment.)||The remaining countries listed in the Who is the military industry run by? section above||Hard to predict given that the top 4 countries by military spending, including spending on RAS systems and LAWS, also have large armed forces numbering from 1 million to over 2 million.|
Expedited Timeline of Transformation
|Automation limited to only a few areas or processes||Automation in wider range of areas & processes – more integrated into daily operations||Automation widely adopted & integrated into most areas & processes yielding significant improvements in productivity||Almost all tasks & processes automated with humans focused on maintenance and supervision of automated systems|
|Other mid-sized militaries||2023-2026||2030||2040-2043|
As we move through the coming years, imagine a country’s military moving to the right on the above table. All militaries (including countries not listed here) will inevitably do so – it’s only a question of the speed at which they transform into fully-automated armed forces.
How will the industry start changing? What to look out for?
The MQ-9 “Reaper” drone is classified as a hunter-killer UAV (unmanned aerial vehicle), designed and built by General Atomics and piloted through a sensor system by operators on the ground. It is capable of both high-altitude surveillance and missile strikes. First introduced in 2007, more than 300 are now in operation with the USAF, and more with other militaries around the world.
It can be disassembled in under 8 hours, packed onto a military transport plane, and flown to any center of operations in the world, where it can be reassembled within hours and launched into action. This is impossible with any manned fighter aircraft.
As an example of what automated military confrontations could look like: in March 2023, a Russian Su-27 collided with a U.S. MQ-9 surveillance drone over the Black Sea, after which the damaged drone was deliberately brought down into the sea by the U.S. military.
Was this an act of war?
Think about this: NATO and Soviet/Russian pilots have tracked each other at close quarters for close to 70 years now. This is just one more incident of that type.
However, now imagine the Russian aircraft had also been unmanned and autonomous – some sort of drone with attack rather than surveillance capabilities. And assume the U.S. drone was traveling with its transponder off, as has been reported, meaning it was deliberately trying to evade detection by Russian radar systems.
One can imagine what are called Lethal Autonomous Weapons Systems (LAWS) on American naval vessels in the Black Sea or further away in the Mediterranean responding to a deliberate downing of a U.S. drone with lethal action against a Russian target.
Clearly what starts as a collision between drones could rapidly escalate into a major conflict if autonomous weapons systems are well established on both sides. So why would militaries risk this by, for example, widespread use of an MQ-9 Reaper drone or some other similar UAV?
They have no choice because their adversaries – and their allies – are already doing so.
While drones have existed since at least the 1940s and were used in the Vietnam War, they have only become widespread in the last 30 years or so, with the Predator drone of the 1990s arguably marking the real beginning of the current era of drone warfare.
|First Gulf War 1991||U.S. forces use the RQ-2 Pioneer, co-developed with Israel, for battlefield reconnaissance and targeting|
|Balkans Wars 1990s||Drones used by the U.S. to surveil Serbian movements; Predator drones become operational|
|Early 2000s||Satellite-operated hunter-killer drones attempt to kill Osama bin Laden|
|2005||Israel develops the Harop drone which it first sells to Turkey (see next section below)|
|2008-2016||Obama substantially increases counter-terror drone strikes around the Middle East and elsewhere|
|2018-2020||Hobby drones – small and affordable – begin to be used by drug cartels and terrorist organizations.|
|2022 – Current||Drones – some manufactured in Iran – are used extensively in the Ukraine War|
Killer Kamikaze Drones
In 2020, Azerbaijan defeated Armenia in a largely unnoticed conflict over the long-disputed Nagorno-Karabakh region. A key factor in Azerbaijan’s victory was a weapon developed by Israel: the Harop.
The Harop is essentially a drone with a very specific function – it is classified as a “loitering munition”. Launched in batches from a flatbed truck by a system similar to a rocket launcher, with wings that unfold after launch, Harops can fly for several hours, using advanced sensors to track down and identify their targets. They then fly straight into the target, producing a relatively small but highly effective explosion on impact that disables it.
This is an affordable and effective military technology worth hundreds – if not thousands – of troops and dozens of tanks, and one capable of winning regional conflicts. That was precisely the case in 2020, with Azerbaijan emerging the winner thanks to a larger and more sophisticated fleet of so-called kamikaze drones than Armenia’s.
What mid-sized military would refuse to purchase and enable such a system? They have no choice.
OVERWHELMED BY DATA
Finally, an important part of military automation will be dealing with Big Data – as mentioned in the Objectives section above. As the UK’s Centre for Defence Enterprise (CDE) stated in 2015:
In a time when military manpower is limited, manual processing of data is too time consuming. The use of autonomous systems and processes to make sense of data to support decision making could increase efficiency and reduce the risk and cost of operations.
Essentially, this means automated and autonomous systems analyzing the data and providing recommendations to the human military, or even issuing direct commands to the machine military. The operational structure of the military must cede control to autonomous systems, or the data needed to operate RAS systems will overwhelm the human military. The only question is whether human operators will retain any role in analyzing data and making rapid-fire decisions in future wars, or whether they will merely launch the autonomous systems and let them function independently of any command-and-control structure.
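The kind of automated triage described here can be sketched in a few lines – a purely hypothetical example in which every field name and weight is made up for illustration, not drawn from any real system: the machine scores incoming sensor contacts and surfaces only the highest-risk ones for human review.

```python
def threat_score(contact):
    # Illustrative, hand-picked weights; a real system would learn
    # these from data (which is exactly the machine-learning problem
    # the CDE quote alludes to).
    return (0.5 * contact["closing_speed"]
            + 0.3 * (1.0 / max(contact["distance_km"], 1.0))
            + 0.2 * contact["signature_match"])

def triage(contacts, top_k):
    """Score every contact and return only the top_k for human review."""
    return sorted(contacts, key=threat_score, reverse=True)[:top_k]

contacts = [
    {"id": "c1", "closing_speed": 0.1, "distance_km": 400, "signature_match": 0.0},
    {"id": "c2", "closing_speed": 0.9, "distance_km": 20,  "signature_match": 0.8},
    {"id": "c3", "closing_speed": 0.4, "distance_km": 80,  "signature_match": 0.1},
]
for c in triage(contacts, top_k=1):
    print(c["id"])  # prints "c2" - the fast, close, signature-matched contact
```

Even this toy makes the trade-off visible: the human sees only what the scoring function surfaces, so whoever writes (or trains) the scoring function effectively makes the decision.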
What will this industry look like when automation is fully complete?
LETHAL AUTOMATED WEAPONS SYSTEMS & FLASH WARS
The Flash Crash
On May 6, 2010, financial markets plunged nearly 10% – a huge drop for a single day – before recovering most of their losses by the end of the trading day. Over a decade later, experts are still arguing about what exactly caused it.
Here’s one of the many explanations for what became known as the Flash Crash, from the Wall Street Journal:
HFTs [then] began to quickly buy and then resell contracts to each other—generating a ‘hot-potato‘ volume effect as the same positions were passed rapidly back and forth.
HFTs are High-Frequency Traders, who use complex algorithms and enormous computing capacity to achieve extremely fast execution times. The SEC report stated that these traders passed 27,000 contracts back and forth between each other in a matter of minutes – accounting for almost 50% of trading volume during that stretch – while only about 200 contracts actually changed hands on a net basis, causing a major market disruption we still don’t fully understand.
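The hot-potato dynamic is easy to see in a toy model – illustrative only, with round counts and lot sizes chosen to echo the SEC’s figures rather than taken from its report: two automated traders pass the same contracts back and forth, inflating gross volume while net positions barely move.

```python
# Toy model of "hot potato" trading between two automated traders.
def hot_potato(rounds, lot, net_drift):
    """Each round, trader A sells `lot` contracts to trader B, then buys
    back all but `net_drift` of them. Returns (gross volume, A's net)."""
    gross = 0
    net_a = 0
    for _ in range(rounds):
        gross += lot              # A sells `lot` contracts to B
        gross += lot - net_drift  # B sells most of them straight back
        net_a -= net_drift        # A's position drifts only slightly
    return gross, net_a

gross, net = hot_potato(rounds=100, lot=136, net_drift=2)
print(gross, net)  # 27000 gross contracts traded; A's net moved by only -200
```

The asymmetry is the whole story: automated counterparties can generate enormous apparent activity – and apparent market signals – out of an almost-zero real transfer of risk.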
A Flash War
To imagine how this flash scenario might play out in a military context, consider a point raised by The Hague Centre for Strategic Studies in a report on autonomous weapons systems and the algorithms these systems use.
To achieve MHC (Meaningful Human Control), an operator must:
- Understand the algorithms used in these systems,
- Understand the outcomes they produce, and
- Be able to explain how these machines arrive at their conclusions.
Troublingly, when the algorithms use neural networks – as is typical of artificial intelligence – these autonomous systems operate as “black boxes” that are difficult to understand because they lack transparency. Operators cannot explain why the machine made a given decision. They can therefore no longer predict or control the outcomes, and are no longer directly responsible for them. The machine is now in control – especially if the algorithms can learn and evolve.
In other words, Lethal Autonomous Weapons Systems (LAWS) act according to their poorly understood algorithms, modifying their behavior over time as machine learning comes into play.
If two adversarial Lethal Autonomous Weapons Systems are put into play by human military operators – perhaps mistakenly – and the systems escalate their actions at a speed that is uncontrollable for any military operators, we end up with the scenario we proposed in the previous section where drones colliding over the Black Sea end up causing a major conflict.
So we are now in a Flash War that could escalate into a major conflict over a mistaken launch of a LAWS by either side, or by both. The initial motive becomes almost irrelevant as the machines take over and launch a world war. Remember: LAWS, by definition and therefore by design, cannot be stopped once set in motion.
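The escalation logic can be sketched as a toy feedback loop – an assumption for illustration, not any real doctrine or system: two autonomous systems each answer the adversary’s last action with a slightly amplified response, so even a tiny trigger grows exponentially, far faster than any human chain of command could intervene.

```python
# Toy escalation model: each machine-speed exchange amplifies the
# previous response by a fixed factor (`gain`).
def flash_war(trigger, gain, threshold, max_steps=1000):
    """Return the number of exchanges before the response level crosses
    `threshold` (i.e., before a 'flash war' ignites), or None if the
    loop de-escalates or stalls within max_steps."""
    level = trigger
    for step in range(1, max_steps + 1):
        level *= gain  # each side over-responds by the same factor
        if level >= threshold:
            return step
    return None

# A 1-unit incident, a 10% over-response per exchange, open conflict
# at 1000 units: escalation takes only a few dozen machine-speed steps.
steps = flash_war(trigger=1.0, gain=1.1, threshold=1000.0)
```

With a gain above 1.0 the outcome is decided in tens of exchanges; with a gain below 1.0 (systems designed to under-respond) the loop dies out. The design choice of that single parameter – made long before any incident – is what Meaningful Human Control would actually have to govern.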
Timeline – Approximately when will this industry be fully automated?
>> 2023: Automation is proceeding rapidly, if unevenly, in militaries around the world.
>> 2028: Countries like Japan and Israel have fully automated militaries with human operators in support and maintenance roles. Larger militaries in China, the U.S., Russia, and the EU are rapidly approaching full automation.
>> 2033: China and the USA now have fully automated militaries, as do France and the UK, as well as Australia and Turkey.
>> 2038: Russia, India, Germany, and Brazil as well as Canada have all joined the fully automated club. Most mid-sized countries have High Automation levels with some remaining combat troops which nonetheless rely heavily on AI systems in battle.
>> 2043: Militaries around the world are now fully automated.
Possible political and legal restraints on automation of this industry
- Regulation of RAS systems through Voluntary Measures rather than:
- Hard Law – Binding Treaties negotiated between states, or
- Soft Law – Quasi-legal instruments such as Codes of Conduct
This seems the most acceptable outcome to militaries, according to the Hague Centre for Strategic Studies.
- Political and cultural backlash to such voluntary measures may result in some attempts at using Hard Law or Soft Law to regulate RAS systems.
- Decreased human casualties (due to replacement of human soldiers by RAS systems)
- Less need to recruit new soldiers by means of media propaganda, as the remaining human military will need a very different skill set – especially tech skills.
- Reduced casualties of civilians given the precision of RAS systems.
- Advanced military technologies will – as generally happens – lead to civilian applications of these disruptive new innovations.
- High probability of escalation in most conflicts as lethal autonomous weapons systems engage with each other.
- Warfare itself will need to be redefined as we enter a future that is uncertain at best, and likely unknown.
- Increase in totalitarianism as technology is turned against civilians to monitor and control them.