Leaving the Lyceum

Can large language models – known by the acronym LLM – reason? 

This is a hotly debated topic in so-called ‘tech’ circles and in the academic and media groups that orbit that world like one of Jupiter’s radiation-blasted moons. I dropped the phrase ‘can large language models reason’ into Google (that rusting machine) and got this result:

This is only a small sample. According to Google, there are “About 352.000.000 results.” We can safely conclude from this, and from the back and forth that endlessly repeats on Twitter in groups that discuss ‘AI’, that there is a lot of interest in arguing the matter, pro and con. Is this debate, if indeed it can be called that, the least bit important? What is at stake?

***

According to ‘AI’ industry enthusiasts, nearly everything is at stake; a bold new world of thinking machines is upon us. What could be more important? To answer this question, let’s do another Google search, this time for the phrase ‘Project Nimbus’:

The first result returned was a Wikipedia article, which starts with this:

Project Nimbus (Hebrew: פרויקט נימבוס) is a cloud computing project of the Israeli government and its military. The Israeli Finance Ministry announced in April 2021, that the contract is to provide “the government, the defense establishment, and others with an all-encompassing cloud solution.” Under the contract, the companies will establish local cloud sites that will “keep information within Israel’s borders under strict security guidelines.”

Wikipedia: https://en.wikipedia.org/wiki/Project_Nimbus

What sorts of things does Israel do with the system described above? We don’t have precise details, but there are clues, such as this excerpt from the +972 Magazine article, ‘“A mass assassination factory”: Inside Israel’s calculated bombing of Gaza’ –

According to the [+972 Magazine] investigation, another reason for the large number of targets, and the extensive harm to civilian life in Gaza, is the widespread use of a system called “Habsora” (“The Gospel”), which is largely built on artificial intelligence and can “generate” targets almost automatically at a rate that far exceeds what was previously possible. This AI system, as described by a former intelligence officer, essentially facilitates a “mass assassination factory.”

+972: https://www.972mag.com/mass-assassination-factory-israel-calculated-bombing-gaza/

***

History and legend tell us that in ancient Athens there was a place called the Lyceum, founded by Aristotle, where the techniques of the Peripatetic school were practiced. Peripatetic means, more or less, ‘walking about’, which reflects the method: philosophers and students, mingling freely, discussing ideas. There are centuries of accumulated hagiography about this school. No doubt it was nice for those not subject to the slave system of ancient Greece.

Similarly, debates about whether or not LLMs can reason are nice for those of us not subject to Hellfire missiles, fired by Apache helicopters sent on their errands based on targeting algorithms. But I am aware of the pain of people who are subject to those missiles. I can’t unsee the death facilitated by computation.

This is why I have to leave the debating square, the social media crafted lyceum. Do large language models reason? No. But even spending time debating the question offends me now. A more pressing question is what the people building the systems killing our fellow human beings are thinking. What is their reasoning?

Command, Control, Kill

The IDF assault on Nasser Hospital in southern Gaza joined a long and growing list of bloody infamies committed by Israel since October 7, 2023. During a Democracy Now interview, broadcast on February 15, 2024, Dr. Khaled Al Serr, who was later kidnapped by the IDF, described what he saw:

Actually, the situation here in the hospital at this moment is in chaos. All of the patients, all the relatives, refugees and also the medical staff are afraid because of what happened. We could not imagine that at any time the Israeli army will bomb the hospital directly, and they will kill patients and medical personnel directly by bombing the hospital building. Yesterday also, Israeli snipers and Israeli quadcopters, which is a drone, carry on it an AR, and with a sniper, they shot all over the building. And they shot my colleague, Dr. Karam. He has a shrapnel inside his head. I can upload for you a CT for him. You can see, alhamdulillah, it was superficial, nothing serious. But a lot of bullets inside their bedroom and the restroom.

The Israeli military is using quadcopters, armed with sniper rifles, as part of its assassination arsenal. These remote-operated drones, which possess limited but still important automatic capabilities (flight stability, targeting persistence), are being used in the genocidal war in Gaza and in the war between Russia and Ukraine, to name two prominent examples. They are likely to make an appearance near you in some form, soon enough.


I haven’t seen reporting on the type of quadcopter used, but it’s probably the Smash Dragon, a model produced by the Israeli firm Smart Shooter, which describes its mission on its website:

SMARTSHOOTER develops state-of-the-art Fire Control Systems for small arms that significantly increase weapon accuracy and lethality when engaging static and moving targets, on the ground and in the air, day and night.

Here is a promotional video for the Smash Dragon:

Smart Shooter’s product, and profit source, is the application of computation to the tasks of increasing accuracy and automating weapon firing. One of its ‘solutions’ (solving, apparently, the ‘problem’ of people being alive) is a fixed-position ‘weapon station’ called the Smash Hopper, which enables a distant operator to lock the weapon onto a person, initiating the firing of a constant stream of bullets. For some reason, the cartoonish word ‘smash’ is popular with the Smart Shooter marketing team.


‘AI’, as used under the current global order, serves three primary purposes: control via sorting, anti-labor propaganda, and obscuring culpability. Whenever a hospital deploys an algorithmic system, rather than healthcare workers’ judgment, to decide how long patients stay, sorting is being used as a means of control, for profit. Whenever a tech CEO tells you that ‘AI’ can replace artists, drivers, filmmakers, etc., the idea of artificial intelligence is employed as an anti-labor propaganda tool. And whenever someone tells you that the ‘AI’ has decided, well, anything, they are trying to hide the responsibility of the people behind the scenes, pushing algorithmic systems on the world.

The armed quadcopter brings all of these purposes together, wrapped in a blood-stained ribbon. Who lives and who dies is decided via remote control, while the fingers pulling the trigger, and the people directing them, are hidden from view. These systems are marketed as using ‘AI’, implying that machines, rather than people, are making life-and-death decisions.


In the introduction to his 2023 book, The Palestine Laboratory, which details Israel’s role in the global arms trade and its use of the Palestinians as lethal examples, journalist Antony Loewenstein describes a weapons demonstration video attended by Andrew Feinstein in 2009:

“Israel is admired as a nation that stands on its own and is unashamed in using extreme force to maintain it. [Andrew Feinstein is] a former South African politician, journalist, and author. He told me about attending the Paris Air Show in 2009, the world’s largest aerospace industry and air show exhibition. [The Israel-based defense firm Elbit Systems] was showing a promotional video about killer drones, which have been used in Israel’s war against Gaza and over the West Bank.

The footage had been filmed a few months before and showed the reconnaissance of Palestinians in the occupied territories. A target was assassinated. […] Months later, Feinstein investigated the drone strike and discovered that the incident featured in the video had killed a number of innocent Palestinians, including children. This salient fact wasn’t featured at the Paris Air Show. “This was my introduction to the Israeli arms industry and the way it markets itself.”

The armed quadcopter drone, one of the fruits of an industry built on occupation and death, can be added to the long list of the harms of computation. ‘Keep watching the skies!’ someone said at the end of a 1950s science fiction film whose name escapes me. Never mind, though; the advice stands.

References

Democracy Now Interview with Dr. Khaled Al Serr

https://www.democracynow.org/2024/2/15/nasser_hospital_stormed_gaza

Dr. Al Serr kidnapped

The Palestine Laboratory