It was artificial intelligence

The commander's instructions were clear: don't stop, under any circumstances. But there are children in the middle of the road. The military convoy approaches. Nothing; the children are still there. If the vehicles stop, they will be exposed to the enemy. Obeying orders means running over children. Children like the ones the soldier left at home: sons, brothers, nephews. How many feelings must be set aside in order to keep moving? This is war: confronting extremely complex moral dilemmas. When a soldier decides to shoot, he is making a moral decision. In that moment, the orders he has received can quiet his conscience. But not forever.

Censored Voices (2015) is a documentary that collects the testimonies of men who had just fought in the Six-Day War (1967). The interviews were recorded by two young men, one of whom would become the famous author Amos Oz. The country was rejoicing. The media celebrated the lightning conquest of Jerusalem, Gaza, Sinai and the West Bank. Talk of victory was everywhere. But these young men, soldiers for a few days, were filled with other feelings. For them, victory had a distinctly bitter aftertaste. Their accounts were censored by the government. Only decades later could they be brought to light, in the aforementioned documentary. What they tell is devastating: “In the war, they turned us into killers.” “We were soldiers against civilians.” “I saw the Arab refugees leaving Jericho and I identified with them. I saw myself doing something not very different from what the Nazis did to us.” “The children standing there were trembling. A third child had wet his pants.”

Israel is using artificial intelligence to decide whom to bomb in Gaza, a practice never seen before. A system called Lavender has created a database of about 35,000 people identified as members of Hamas's military wing, most of them low-ranking. The automated system then selected infrastructure and buildings supposedly linked to these targets, regardless of the relatives or neighbors who might be inside. The massacre committed during the first weeks of the invasion is attributed to the use of Lavender: about 15,000 people were killed between October 7 and November 24. Their deaths are just statistics.

Lavender designates its targets with absolute coldness. Moral dilemmas disappear. Human feelings play no part in the decision. Shielded by the thought that “it wasn't me who did it,” there is no guilt, let alone any possibility of remorse. “What shocked us was seeing the enemy,” admitted one of the participants in the documentary. By erasing that proximity, it becomes possible to mitigate the trauma of taking part in the massacre. But civil resistance is blocked as well: the ability to respond to the pain of others is shut down, and the hatred remains intact.
