Israel accused of using AI to choose Gaza targets

Caption: A man reacts as Palestinians search for casualties a day after Israeli strikes on houses in Jabalia refugee camp in the northern Gaza Strip, November 1, 2023. (Mohammed Al-Masri/Reuters)

The Israeli military has been using an artificial intelligence tool to identify human targets for bombing in Gaza, according to a new investigation by Israeli outlets +972 Magazine and Local Call.
Intelligence sources cited in the report allege that the AI system, called Lavender, at one stage identified 37,000 potential targets — and that approximately 10 per cent of those targets were marked in error. The sources also allege that in the early weeks of the war, the army authorized an unprecedented level of "collateral damage" — that is, civilians killed — for each target marked by Lavender.
The investigation was also shared with the Guardian newspaper, which published its own in-depth reporting.
Israel disputes and denies several aspects of the investigation.
Today, the investigation's reporter, Yuval Abraham, joins us to explain his findings.
For transcripts of Front Burner, please visit: https://www.cbc.ca/radio/frontburner/transcripts
Transcripts of each episode will be made available by the next workday.