
War by algorithm raises new moral dangers
Israel’s use of an AI-enabled targeting system in Gaza has fed the debate about military technology
One of our worst nightmares about artificial intelligence is that it will enable killer robots to stalk the battlefield, dispensing algorithmically determined death and destruction. But the real world is a lot messier than the comic books. As Israel’s bombardment of Gaza shows, we may be moving towards more invisible and insidious forms of automated decision-making in warfare.
A chilling report published last week by the Israeli online magazine +972 highlighted the Israel Defence Forces’ heavy reliance, early in the war, on an AI-enabled mass target-generation system known as Lavender, which flagged 37,000 Gazans as suspected Hamas militants. As a result, many were bombed in their homes, often killing their families too.
