Israeli forces have relied heavily on two AI systems, "Lavender" and "Where's Daddy," to identify and target suspected Hamas militants, says report.
As civilian casualties continue to mount in the war-torn Gaza Strip, reports of Israel's use of artificial intelligence (AI) in its targeting of Hamas militants are facing increasing scrutiny. A report by the Israeli outlets +972 Magazine and Local Call earlier this month said that Israeli forces had relied heavily on two AI tools so far in the conflict — "Lavender" and "Where's Daddy."
While "Lavender" identifies suspected Hamas and Palestinian Islamic Jihad (PIJ) militants and their homes, "Where's Daddy" tracks these targets and alerts Israeli forces when they return home, per the report, which cites six Israeli intelligence officers who had used AI systems, including "Where's Daddy," for operations in Gaza.
"We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity," one of the officers told +972 and Local Call. "On the contrary, the IDF bombed them in homes without hesitation, as a first option. It's much easier to bomb a family's home. The system is built to look for them in these situations," they added.
Really makes me wonder what the meaning behind "Lavender" is in this context. There could quite easily be some horrific intention behind using that specific word in connection with this murder system.
And another day of "let's lump all people of group X together and judge them as one".
We have all this information available through the internet. We can research even the most difficult topics with a few keystrokes and a click. And yet, there are still so many idiots. This is really mind-boggling.
They can’t fight Hamas on the field, they have to assassinate them in other countries (funny how the best intelligence in the world knows exactly who the “Qatar billionaires” are but hasn’t done anything about it), murder them alongside their families while they sleep, or disguise as civilians to murder them in hospitals. Or the usual, take their frustration out on civilians.
Really appreciating Israel's SkyNet 100% speedrun here, some really impressive moves. However, I'm starting to wonder if this run passes the suspect test; I feel like you can't get past regulators so well in most cases without an exploit.
"AI" in this context only serves to obfuscate responsibility. "You've just struck a bunch of children." "Well, sure, but this AI-generated model predicted an 80% chance that they were all little Hamaslings." Of course, the data that AI was trained on was bollocks, to ensure everyone counts as Hamas, including foreign NGO kitchen workers.
So when anyone else kills civilians with air or drone strikes, it's a bug, but when the IDF does it it's a feature?
And Hamas are content to sit back in their Saudi and Egyptian hotels and reject ceasefire deals while their people in Gaza are shot and bombed and burned and tortured and generally murdered to death. Fucking disgusting, all around. Nobody comes out of this clean.
So it's THEIR fault that Hamas came in and murdered, tortured, mutilated, and raped innocent families on October 7th? Seriously? You realize how insane that sounds?