Attacks stripped of all doubt, empathy or trace of humanity, launched over known residential areas regardless of the collateral victims. To that end, Israeli intelligence Unit 8200 developed a new tool: artificial intelligence. Nicknamed 'Lavender' (lavender, the plant known as a calming agent, though in this case mercilessly lethal), it marked up to 37,000 targets to be killed. And the toll is there: some 33,000 Palestinians are already dead.
The algorithmically generated list contained only men, supposedly very low-ranking militants, whose home addresses the algorithm flagged for the "peace of mind" of the Israeli soldiers: "We bombed their houses without hesitation; the machine coldly pointed them out and made it easier for us," says an Israeli intelligence officer in an investigation released today by the Israeli-Palestinian outlets +972 Magazine and Local Call (Sicha Mekomit in Hebrew).
"15 or 20 civilians" for every militant killed
Soldiers spent barely 20 seconds checking that those flagged were just that: men. Others carried out the bombings without further question: "We systematically attacked them in their homes, usually at night and with their entire family there, because then they were an easier target," says another of them. Moreover, military sources told this Palestinian-Israeli investigation that the strikes were carried out with unguided 'dumb' bombs. It was therefore authorized that "15 or 20 civilians" would die along with each alleged militant. Something unprecedented, they acknowledge. And, as if that were not enough, the algorithm has a 10% margin of error, so it can incriminate "people who have merely a loose connection to militant groups, or no connection at all."
The commanders' argument is that "they trusted that statistical mechanism more than a grieving soldier, given that everyone there had lost someone in the Hamas attack on October 7." In previous wars, the military did not authorize any "collateral damage" unless the target was a senior Hamas official or commander. In the current offensive, according to what was published today, the Army in several cases contemplated the killing of more than 100 civilians.
[[H2:'Lavender' and 'Gospel', Israel's killer AIs]]
Lavender is new, but it is not the only AI in this war: the so-called Gospel ("the word of God") has also been marking infrastructure for destruction. Thus, say the military, the targets, and the devastation, never end.
Source: laSexta

Ricardo is a renowned author and journalist, known for his exceptional writing on top-news stories. He currently works as a writer at the 247 News Agency, where he is known for his ability to deliver breaking news and insightful analysis on the most pressing issues of the day.