Kinetic Harm

I write about the information technology industry.

I’ve written about other topics, such as the copaganda of Young Turks host Ana Kasparian, and Žižek, whose work, to quote John Bellamy Foster, has become “a carnival of irrationalism.” In the main, however, my topics are the technology industry generally and its so-called ‘AI’ sub-category specifically. This isn’t random; I’ve worked in this industry for decades and know its dark heart. Honest tech journalism (rather than the boosterism we mostly get) and scholarly examinations are important, but who better to tell a war story than someone in the trenches?

Because I focus on harm rather than the fantasy of progress, this isn’t a pursuit that brings wealth or notoriety. There have been a few podcast appearances (a type of sub-micro celebrity, as fleeting as a lightning flash) and opportunities to be published in respected magazines. That’s nice, as far as it goes. It’s important, however, to see clearly and be honest with yourself: this is a Sisyphean task with few rewards, and motivation must be found within, and from a community of like-minded people.

Originally, my motivation was to pierce the curtain. If you’ve seen the 1939 MGM film ‘The Wizard of Oz’, you know my meaning: there’s a moment when the supposed wizard, granter of dreams, is revealed to be a sweaty, nervous man, hidden behind a curtain, frantically pulling levers and spinning dials to keep the machinery of delusion functioning. This was my guiding metaphor for the tech industry, which claims its products defy the limits of material reality and surpass human thought.

As you learn more, your understanding should change. Parting the curtain – debunking – was an acceptable way to start, but it’s insufficient; the promotion of so-called ‘AI’ is producing real-world harms, from automated recidivism decision systems to arrests based on facial recognition and innumerable other intrusions. A technology sold as bringing about a bright future is being deployed to limit possibilities. Digital computation began as a means of enacting a command and control methodology on the world for various purposes (military applications being among the first) and is, in our age, reaching its apotheosis.

Kinetic Harm

Reporting on these harms, as deadly as they often are, fails to tell the entire story of computation in this era of growing instability. The same technologies and methods used to, for example, automate actuarial decision making in the insurance industry can also be used for other, more directly violent aims. The US military, which is known for applying euphemisms to terrible things like a thin coat of paint over rust, calls warfare – that is, killing – kinetic military action. We can call forms of applied computation deliberately intended to produce death and destruction kinetic harm.

Consider the IDF’s Habsora system, described in the +972 Magazine article ‘“A mass assassination factory”: Inside Israel’s calculated bombing of Gaza’ –

In one case discussed by the sources, the Israeli military command knowingly approved the killing of hundreds of Palestinian civilians in an attempt to assassinate a single top Hamas military commander. “The numbers increased from dozens of civilian deaths [permitted] as collateral damage as part of an attack on a senior official in previous operations, to hundreds of civilian deaths as collateral damage,” said one source.

“Nothing happens by accident,” said another source. “When a 3-year-old girl is killed in a home in Gaza, it’s because someone in the army decided it wasn’t a big deal for her to be killed — that it was a price worth paying in order to hit [another] target. We are not Hamas. These are not random rockets. Everything is intentional. We know exactly how much collateral damage there is in every home.”

According to the investigation, another reason for the large number of targets, and the extensive harm to civilian life in Gaza, is the widespread use of a system called “Habsora” (“The Gospel”), which is largely built on artificial intelligence and can “generate” targets almost automatically at a rate that far exceeds what was previously possible. This AI system, as described by a former intelligence officer, essentially facilitates a “mass assassination factory.”

+972 Magazine – https://www.972mag.com/mass-assassination-factory-israel-calculated-bombing-gaza/

The popular phrase ‘artificial intelligence’ – a marketing term, really, since no such thing exists – is used to describe the Habsora system. This creates an exotic distance, as if a glowing black cube floated in space, deciding who dies and how many deaths will occur.

The reality is more mundane, more familiar, even banal; the components of this machine are constantly in use around us.

[Graphic: some of the likely elements of such a system]

As we use our phones, register our locations, fill in online forms for business and government services, interact on social media and do so many other things, we unknowingly create threads and weave patterns, stored in databases. The same type of system that enables a credit card fraud detection algorithm to block your card when in-person store transactions are registered in two geographically distant locations on the same day can be used to build a map of your activities and relations – to find and kill you and those you know and love. This is what the IDF has done with Habsora. The distance separating the intrusive methods of Meta, Google and fellow travelers from this killing machine is not as great as it seems.
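To make the banality concrete, here is a minimal sketch of the “impossible travel” check described above – the kind of geometry that lets a bank flag a stolen card, and that, pointed elsewhere, helps map a person’s movements. All names, thresholds and coordinates are illustrative, not drawn from any real system.

```python
# Hypothetical "impossible travel" check: flag two location-stamped events
# whose implied travel speed exceeds what ordinary movement allows.
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

def impossible_travel(event_a, event_b, max_speed_kmh=900.0):
    """Each event is (timestamp_hours, latitude, longitude).
    Returns True if the implied speed between the two events exceeds
    max_speed_kmh (roughly an airliner's cruising speed)."""
    t1, lat1, lon1 = event_a
    t2, lat2, lon2 = event_b
    hours = abs(t2 - t1)
    if hours == 0:
        return True  # same instant, different places
    return haversine_km(lat1, lon1, lat2, lon2) / hours > max_speed_kmh

# Two in-person purchases: New York at hour 0, Los Angeles one hour later
# (roughly 3,900 km apart) - flagged as impossible travel.
ny = (0.0, 40.71, -74.01)
la = (1.0, 34.05, -118.24)
print(impossible_travel(ny, la))  # → True
```

The point is not the arithmetic but how little of it there is: a distance formula, a clock and a threshold already yield a crude model of where a body can and cannot be. Pattern-of-life analysis is this, layered and repeated over millions of records.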

Before being driven from their homes by the IDF – homes destroyed in the most intensive bombing campaign of this century, and perhaps even of the previous, hyper-violent one – Palestinians in Gaza were subject to a program of surveillance and control which put them completely at the mercy of the Israeli government. All data about their movements and activities passed through electronic infrastructure owned and controlled by Israeli entities. This infrastructure, and the data processing and analysis built upon it, have been assembled into a factory whose product is death – whether targeted or en masse.

The Thin Curtain

Surveillance. Control. Punishment. This is what the age of digital computation has brought on an unprecedented scale. For those of us who live in places where the bombs don’t yet fall, there are things like the following, excerpted from the Forbes article (Feb 23, 2024) ‘Dozens Of KFC, Taco Bell And Dairy Queen Franchises Are Using AI To Track Workers’ –

Like many restaurant owners, Andrew Valkanoff hands out bonuses to employees who’ve done a good job. But at five of his Dairy Queen franchises across North Carolina, those bonuses are determined by AI.

The AI system, called Riley, collects streams of video and audio data to assess workers’ performance, and then assigns bonuses to those who are able to sell more. Valkanoff installed the system, which is developed by Rochester-based surveillance company Hoptix, less than a year ago with the hopes that it would help increase sales at a time when margins were shrinking and food and labor costs were skyrocketing.

Forbes – https://www.forbes.com/sites/rashishrivastava/2024/02/23/dozens-of-kfc-taco-bell-and-dairy-queen-franchises-are-using-ai-to-track-workers/

Inside the zone of comparative safety – a zone of deprivation for many and control imposed on all – there are systems in service like the IDF’s Habsora, employing the same computational techniques. Instead of directing sniper-rifle-armed quadcopters and F-16s on deadly errands, they deprive people of jobs, medical care and freedom. Just as a rocket’s payload can be changed from peaceful to fatal ends, the intended outcomes of such systems can be altered to fit the goals of the states that employ them.

The Shadow

As I write this, approximately 1.4 million Palestinians have been violently pushed to Rafah, a city in the southern Gaza Strip. There, they face starvation and incomprehensible cruelty. Meanwhile, southwest of the ruins of Gaza City, in what has come to be known as the Al Nabulsi massacre, over one hundred Palestinians were killed by IDF fire while desperately trying to get flour. These horrors were accelerated by the use of computationally driven killing systems. In the wake of Habsora’s use in what journalist Antony Loewenstein calls the Palestine Laboratory, we should expect similar techniques to be used elsewhere and to become a standard part of the arsenal of states (yes, even those we call democratic) in their efforts to impose their will on an ever more restless world that struggles for freedom.


References

Artificial intelligence and insurance, part 1: AI’s impact on the insurance value chain

https://www.milliman.com/en/insight/critical-point-50-artificial-intelligence-insurance-value-chain

Kinetic Military Action

https://en.wikipedia.org/wiki/Kinetic_military_action

‘A mass assassination factory’: Inside Israel’s calculated bombing of Gaza

https://www.972mag.com/mass-assassination-factory-israel-calculated-bombing-gaza

Report: Israel’s Gaza Bombing Campaign is the Most Destructive of this Century

https://english.aawsat.com/features/4760791-report-israels-gaza-bombing-campaign-most-destructive-century

‘Massacre’: Dozens killed by Israeli fire in Gaza while collecting food aid

https://www.aljazeera.com/news/2024/2/29/dozens-killed-injured-by-israeli-fire-in-gaza-while-collecting-food-aid

Dozens Of KFC, Taco Bell And Dairy Queen Franchises Are Using AI To Track Workers

https://www.forbes.com/sites/rashishrivastava/2024/02/23/dozens-of-kfc-taco-bell-and-dairy-queen-franchises-are-using-ai-to-track-workers

The Palestine Laboratory: How Israel Exports the Technology of Occupation Around the World, by Antony Loewenstein

Examples of Other Algorithm Directed Targeting Systems

Project Maven

https://www.engadget.com/the-pentagon-used-project-maven-developed-ai-to-identify-air-strike-targets-103940709.html

Generative AI for Defence (marketing material from C3)

https://c3.ai/generative-ai-for-defense