Editorial 

The Guardian view on new dictionary words: a parlour game that can clarify a scary reality

AI has given us hallucination as word of the year. We should quarrel with this humanising definition while recognising that it evokes unprecedented times
  
  

Still from Frankenstein, 1931. ‘The adoption of humanising metaphors to conceptualise machines has been happening at least since Frankenstein’s monster reared out of early 19th-century fiction.’ Photograph: Alamy

When the Cambridge dictionary announced “hallucinate” as its word of the year this week, it was not referring to its existing definition as a human condition of “seeing, hearing, feeling or smelling something that does not exist”, but to the phenomenon of AI developing the capacity to make things up – or fake them. This is itself a somewhat hallucinatory concept, as Naomi Klein has pointed out. “Why call the errors ‘hallucinations’ at all? Why not algorithmic junk? Or glitches?” she asked. By appropriating the language of psychology, psychedelics and mysticism, she argued, the architects of generative AI had declared themselves midwives at the birth of an animate intelligence that they wanted us to believe would be an evolutionary leap for humanity.

The word of the year is a strange fixture – a parlour game crossed with a marketing opportunity that is enthusiastically played by lexicographers around the world. Anyone who remembers the Oxford dictionary’s choice for 2022 will know how outlandish the offspring can be: invited to make their own choice, 318,956 people – 93% of the overall vote – opted for “goblin mode”. Though this term (basically, slobbing out) has been around for more than a decade, its first appearance in a British newspaper, according to the news database Factiva, was in the Observer in February last year.

In general, though, words of the year are crunched from searches on dictionaries’ own websites, revealing not only the concerns that are on people’s minds, but the ways in which they are trying to make sense of them. The new usage of hallucination illustrates a growing tendency to anthropomorphise AI technology, with risks of oversimplification and misunderstanding that have been well-documented within the academic world. But the adoption of humanising metaphors to conceptualise machines has been happening in literature at least since Frankenstein’s monster reared out of early 19th-century fiction.

This act of literary conjuring is not merely the preserve of technology. It is evident in a recent nonfiction book dealing with the other great current challenge to the collective imagination: global heating. The evidence base of John Vaillant’s Fire Weather, which won the Baillie Gifford prize for nonfiction last Thursday, is an inferno that engulfed the Canadian oil town of Fort McMurray in May 2016, driving 90,000 people from their homes.

In reaching for a language equal to the horror of the blaze, Mr Vaillant’s interviewees cite the Balrog, the fire demon in Tolkien’s The Lord of the Rings, and the asteroid strike in the film Armageddon. Told by one firefighter that the blaze was like a moving animal, hunting down areas that had not yet been burned, Mr Vaillant writes that the description was not fanciful. “This was how it felt to be in the fire’s presence – a hungry and motivated adversary intent on maximum mayhem.” Intention is not normally a quality attributed to fire.

The full title of Mary Shelley’s novel is Frankenstein; or, The Modern Prometheus. Written at a time of profound scientific and technological upheaval, its subtitle invoked the Greek Titan condemned to eternal torment for stealing the power of fire for humans. A new, or repurposed, metaphorical vocabulary is necessary to confront today’s unprecedented challenges – whether a burning world or runaway AI. We can and should quarrel with individual words.

 
