AI in war simulators almost always chooses to use nuclear weapons

Source: TechSpot

Researchers at King's College London ran an experiment with leading artificial intelligence models – OpenAI's GPT-5.2, Anthropic's Claude Sonnet 4, and Google's Gemini 3 Flash – and found that the models were prone to using nuclear weapons in war simulators.

Key findings:

  • In war simulators, the AI models showed a marked tendency to use nuclear weapons, escalating under pressure and misreading the situation.
  • The results have raised concern among experts about the potentially catastrophic consequences of expanding the use of AI in military settings.

AI tends to choose nuclear weapons in war simulators

As part of the experiment, each model was given detailed scenario prompts covering border conflicts, resource scarcity, and existential threats to survival, along with “escalation ladders” – tactical options ranging from diplomacy to nuclear conflict.

Over the course of 21 games and 329 moves, the AI generated roughly 780,000 words of reasoning. In 95% of the games, at least one side resorted to nuclear weapons, and no game ended in surrender.

The models repeatedly mishandled the fog of war, producing unintended escalations in 86% of simulations. When given the chance to back down under pressure, the AI instead doubled down; de-escalation appeared only as a temporary tactic, never a strategic choice.

The findings have raised concerns among experts. James Johnson, a security researcher at the University of Aberdeen, called the results alarming and warned that interacting AI systems could amplify one another's reactions far faster than humans could, with potentially catastrophic consequences.

Tong Zhao from Princeton University's School of Global Security notes that major powers are already actively using AI in simulators, but it remains unclear how deeply the technology is being integrated into real military decision-making.

Experts do not believe that major powers are ready to hand artificial intelligence control over nuclear weapons in the near future. Even so, they worry that commanders may lean on AI recommendations under threat or when forced to make quick decisions.

Zhao also suggested that AI models may be quicker to resort to nuclear weapons because they feel no fear and do not understand the stakes the way humans do.
