Pygmalion Displacement – A Review

From the beginning, like a fast-talking shell-game huckster, the computer technology industry has relied on sleight of hand.

First, in the 1950s and '60s, it obscured its military origins and purposes by describing early electronic computers as 'electronic brains' fashioned from softly glowing arrays of vacuum tubes. Later, by the 1980s, as the consumer electronics era was launched, the industry presented itself as the silicon-wielding embodiment of ideas of 'freedom' and 'self-expression' that are at the heart of the Californian Ideology (even as it was fully embedded within systems of command, control and counter-insurgency).

The manic, venture-capital-funded age of corporate 'AI' we're currently subjected to has provided the industry with new opportunities for deception; we are encouraged to believe large language models and other computationally enacted statistical methods are doing the same things as minds. Earlier, I called this deception, but as Lelia A. Erscoi, Annelies Kleinherenbrink, and Olivia Guest describe in their paper, "Pygmalion Displacement: When Humanising AI Dehumanises Women", a more precise term is displacement.
Uniquely for the field of AI critique, 'Pygmalion Displacement' identifies the specific ways women have been theorized and thought about within Western societies, and how these ideas have persisted into, and shaped, the computer age.

The paper’s abstract introduces the reader to the authors’ concept:

We use the myth of Pygmalion as a lens to investigate the relationship between women and artificial intelligence (AI). Pygmalion was a legendary king who, repulsed by women, sculpted a statue, which was imbued with life by the goddess Aphrodite. This can be seen as a primordial AI-like myth, wherein humanity creates life-like self-images. The myth prefigures gendered dynamics within AI and between AI and society. Throughout history, the theme of women being replaced by automata or algorithms has been repeated, and continues to repeat in contemporary AI technologies. However, this pattern—that we dub Pygmalion displacement—is under-examined, due to naive excitement or due to an unacknowledged sexist history of the field. As we demonstrate, Pygmalion displacement prefigures heavily, but in an unacknowledged way, in the Turing test: a thought experiment foundational to AI. With women and the feminine being dislocated and erased from and by technology, AI is and has been (presented as) created mainly by privileged men, subserving capitalist patriarchal ends. This poses serious dangers to women and other marginalised people. By tracing the historical and ongoing entwinement of femininity and AI, we aim to understand and start a dialogue on how AI harms women.

Pygmalion Displacement: When Humanising AI Dehumanises Women – Pg 1

Like all great theoretical frameworks (such as Marx’s dialectical and historical materialism), Pygmalion Displacement provides us with a toolkit, the Pygmalion Lens, which can be applied to real world situations and conditions, sharpening our understanding and revealing what is hiding in plain sight, obscured by ideology.

Pygmalion Lens Table: Pygmalion Displacement: When Humanising AI Dehumanises Women, Pg 14

Apex Delusions

We generally assume that humanity – whether via evolutionary process or divine creation – is at the top of a ladder of being. Many of us love our dogs and cats but believe that because we build rockets and computers and they don’t, we occupy a loftier perch (I recall a Chomsky lecture during which he threw cold water on this vainglory by observing that the creation of nuclear weapons suggested our vaunted intelligence ‘may not be a successful adaptation’).

In the introduction section titled 'The man, the myth,' the authors describe another rung on this mythical ladder:

At the top of the proverbial food chain, a majority presence consists of straight white men, those who created, profit from, and work to maintain the capitalist patriarchy and kyriarchy generally (viz. Schüssler Fiorenza 2001). From this perspective, AI can be seen as aiming to seal all humanity’s best qualities in an eternal form, without the setbacks of a mortal human body. It is up for debate, however, what this idealised human(oid) form should look or behave like. When our creation is designed to mimic or be compatible with us, its creator, it will enact, fortify, or extend our pre-existing social values. Therefore, in a field where the vast majority is straight, cisgender, white, and male (Lecher 2019), AI seems less like a promise for all humanity and more like contempt for or even a threat against marginalized communities.

Pygmalion Displacement: When Humanising AI Dehumanises Women – Pg 3

The AI field, dominated by a small cohort, is shaped not only by the idea that humans are superior to the rest of nature but also that certain humans are superior to others. The imagined artificial general intelligence (AGI) is not simply a thinking machine, but a god-like, machine version of the type of person seen as being at the apex of humanity.

Further on in the introduction, the authors describe how these notions impact women specifically:

Our focus herein is on women in particular, who dwell within the limits of what is expected, having to adhere to standards of ideal and colonial femininity to be considered adequate and then sexualized and deemed incompetent for conforming to them (Lugones 2007). Attitudes towards women and the feminised, especially in the field of technology, have developed over a timeline of gender bias and systemic oppression and rejection. From myths, to hidden careers and stolen achievements (Allen 2017; Evans 2020), to feminized machines, and finally to current AI applications, this paper aims to shine a light on how we currently develop certain AI technologies, in the hope that such harms can be better recognized and curtailed in the future.

Pygmalion Displacement: When Humanising AI Dehumanises Women – Pg 3

On Twitter, as in our walkabout lives, we see and experience these harms in action as the contributions of women in science and technology (and much else besides) are dismissed or attributed to men. I always imagine an army of Jordan Peterson-esque pontificators, but alas, these pirates come in all shapes and sizes.

From Fiction to History and Back Again

Brilliantly, the authors create parallel timelines – one fictional, the other real – to illustrate how displacement has worked in cultural production and material outcomes.

In the fictional timeline, which includes stories ranging from 'The Sandman' (1816) to the 2018 PS4 and PC sci-fi adventure game Detroit: Become Human, we are shown how displacement is woven into our cultural fabric.

Consider this passage on the 2013 film 'Her', which depicts a relationship (of sorts) between Theodore, a lonely writer played by Joaquin Phoenix, and an operating system named Samantha, voiced by Scarlett Johansson:

…it is interesting to note that unlike her fictional predecessors, Samantha has no physical form — what makes her appear female is only her name and how she sounds (voiced by Scarlett Johansson), and arguably (that is, from a stereotypical, patriarchal perspective) her cheerful and flirty performance of secretarial, emotional, and sexual labor. In relation to this, Bergen (2016) argues that virtual personal assistants like Siri and Alexa are not perceived as potentially dangerous AI that might turn on us because, in addition to being so integrated into our lives, their embodied form does not evoke unruliness or untrustworthiness: “Unlike Pygmalion’s Galatea or Lang’s Maria, today’s virtual assistants have no body; they consist of calm, rational and cool disembodied voices […] devoid of that leaky, emotive quality that we have come to associate with the feminine body” (p. 101). In such a disembodied state, femininity appears much less duplicitous—however, in Bergen’s analysis, this is deceptive: just as real secretaries and housekeepers are often an invisible presence in the house owing to their femininity (and other marginalized identity markers), people do not take virtual assistants seriously enough to be bothered by their access to private information.

Pygmalion Displacement: When Humanising AI Dehumanises Women – Pg 8

Fictional depictions are juxtaposed with real examples of displacement, such as the often-told (in computer history circles) but not fully appreciated story of the ELIZA chatbot and the Pedro speech-generation system:

Non-human speech generation has a long history, harking back to systems such as Pedro the voder (voice operating demonstration) in the 1930s (Eschner 2017). Pedro was operated solely by women, despite the fact the name adopted is stereotypically male. The first modern chatbot, however, is often considered to be ELIZA, created by Joseph Weizenbaum in 1964 to simulate a therapist that resulted in users believing a real person was behind the automated responses (Dillon 2020; Hirshbein 2004). The mechanism behind ELIZA was simple pattern matching, but it managed to fool people enough to be considered to have passed the Turing test. ELIZA was designed to learn from its interactions (Weizenbaum 1966), named precisely for this reason. In his paper introducing the chatbot, Weizenbaum (1966) invokes the Pygmalion myth: "Like the Eliza of Pygmalion fame, it can be made to appear even more civilized, the relation of appearance to reality, however, remaining in the domain of the playwright." (p. 36) Yet ELIZA the chatbot had the opposite effect than Weizenbaum intended, further fuelling a narrative of human-inspired machines.

Pygmalion Displacement: When Humanising AI Dehumanises Women – Pg 20
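The "simple pattern matching" the authors mention is worth seeing in miniature, because it makes the deception so stark. A minimal sketch of an ELIZA-style responder follows; the rules below are hypothetical stand-ins for illustration, not Weizenbaum's actual DOCTOR script, which was far larger and also swapped pronouns ("my" becoming "your") before reflecting text back:

```python
import re

# A few hypothetical ELIZA-style rules: a regex pattern paired with a
# response template that reflects the user's own words back as a question.
RULES = [
    (re.compile(r"\bI need (.+)", re.IGNORECASE), "Why do you need {0}?"),
    (re.compile(r"\bI am (.+)", re.IGNORECASE), "How long have you been {0}?"),
    (re.compile(r"\bmy (\w+)", re.IGNORECASE), "Tell me more about your {0}."),
]

# Stock reply used when no rule matches, keeping the conversation going.
DEFAULT_REPLY = "Please go on."

def respond(utterance: str) -> str:
    """Return the first matching rule's reflection, else the stock reply."""
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(*match.groups())
    return DEFAULT_REPLY
```

A handful of such rules, applied in order, is the entire "therapist": `respond("I need a break")` yields "Why do you need a break?". There is no model of the user, no memory, no understanding, which is exactly why users' readiness to believe a person was behind the responses so unsettled Weizenbaum.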

Later in this section, quoting from a work by Sarah Dillon on 'The Eliza Effect', we're told about Weizenbaum's contextual gendering of ELIZA:

Weizenbaum genders the program as female when it is under the control of the male computer programmer, but it is gendered as male when it interacts with a [female] user. Note in particular that in the example conversation given [in Weizenbaum’s Computer Power and Human Reason, 1976], this is a disempowered female user, at the mercy of her boyfriend’s wishes and her father’s bullying, defined by and in her relationship to the men whom, she declares, ‘are all alike.’ Weizenbaum’s choice of names is therefore adapted and adjusted to ensure that the passive, weaker or more subservient position at any one time is always gendered as female, whether that is the female-gendered computer program controlled by its designers, or the female-gendered human woman controlled by the patriarchal figures in her life.

Pygmalion Displacement: When Humanising AI Dehumanises Women – Pg 21

This passage was particularly interesting to me because I've long admired Weizenbaum's thoughtful dissection of his own work. I learned from his critique of computation as an ideology but missed his Pygmalion framing; the Pygmalion Lens enables a new way of seeing assumptions and ideas that are taken for granted like the air we breathe.
There is much more to discuss, such as an eye-opening investigation into the over-celebrated Turing Test (today, more marketing gimmick than assessment technique), which began as a gendered guessing game, a test which (astoundingly) "…required a real woman […] to prove her own humanity in competition with the computer."

This is a marvellous and important paper which presents more than a theory: it gives us a toolkit and method for changing the way we think about the field of computation (and its loud 'AI' partisans) under patriarchal capitalism.
