A new definition of ‘hallucinate’ was one of many AI-related updates to the Cambridge Dictionary in 2023.
This year has seen a surge in interest in generative artificial intelligence (AI) tools like ChatGPT, Bard and Grok, with public attention shifting towards the limitations of AI and whether they can be overcome.
AI tools, especially those using large language models (LLMs), have proven capable of generating plausible prose, but they often do so using false, misleading or made-up ‘facts’. They ‘hallucinate’ in a confident and sometimes believable manner.
The Cambridge Dictionary – the world’s most popular online dictionary for learners of English – has updated its definition of hallucinate to account for the new meaning and crowned it Word of the Year for 2023.
Hallucinating ‘false information’
The traditional definition of hallucinate is ‘to seem to see, hear, feel, or smell something that does not exist, usually because of a health condition or because you have taken a drug’. The new, additional definition is:
‘When an artificial intelligence (= a computer system that has some of the qualities that the human brain has, such as the ability to produce language in a way that seems human) hallucinates, it produces false information.’
AI hallucinations, also known as confabulations, sometimes appear nonsensical. But they can also seem entirely plausible – even while being factually inaccurate or ultimately illogical.
AI hallucinations have already had real-world impacts. A US law firm used ChatGPT for legal research, which led to fictitious cases being cited in court. In Google’s own promotional video for Bard, the AI tool made a factual error about the James Webb Space Telescope.
Wendalyn Nichols, Cambridge Dictionary’s Publishing Manager, said: “The fact that AIs can ‘hallucinate’ reminds us that humans still need to bring their critical thinking skills to the use of these tools. AIs are fantastic at churning through huge amounts of data to extract specific information and consolidate it. But the more original you ask them to be, the likelier they are to go astray.
“At their best, large language models can only be as reliable as their training data. Human expertise is arguably more important – and sought after – than ever, to create the authoritative and up-to-date information that LLMs can be trained on.”
‘Profound shift in perception’
The new definition illustrates a growing tendency to anthropomorphise AI technology, using human-like metaphors as we speak, write and think about machines.
Dr Henry Shevlin, an AI ethicist at the University of Cambridge, said: “The widespread use of the term ‘hallucinate’ to refer to mistakes by systems like ChatGPT provides a fascinating snapshot of how we’re thinking about and anthropomorphising AI. Inaccurate or misleading information has long been with us, of course, whether in the form of rumours, propaganda, or ‘fake news’.
“Whereas these are normally thought of as human products, ‘hallucinate’ is an evocative verb implying an agent experiencing a disconnect from reality. This linguistic choice reflects a subtle yet profound shift in perception: the AI, not the user, is the one 'hallucinating.' While this doesn't suggest a widespread belief in AI sentience, it underscores our readiness to ascribe human-like attributes to AI.
“As this decade progresses, I expect our psychological vocabulary will be further extended to encompass the strange abilities of the new intelligences we’re creating.”
Addressing hallucinations – if they can ever be fully fixed – may define the future success and uptake of generative AI.
Engineers and academics around the world, including at OpenAI, Google, and Microsoft, are working to limit AI hallucinations through ‘grounding’, with LLM outputs cross-checked against reliable sources or web searches. Some are working on ‘reinforcement learning from human feedback’, using people to help tackle hallucinations and work out how they can be predicted and eliminated.
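The details of these systems are proprietary, but the basic idea behind grounding can be illustrated with a short, self-contained sketch. The Python snippet below checks whether dates in a model’s draft answer also appear in a small set of trusted passages and flags any that do not; `generate_draft`, `TRUSTED_SOURCES` and the year check are hypothetical placeholders for illustration only, not any vendor’s actual API or a production fact-checker.

```python
import re

# Tiny stand-in for a store of vetted reference passages (an assumption for this sketch).
TRUSTED_SOURCES = [
    "The James Webb Space Telescope was launched on 25 December 2021.",
    "The Hubble Space Telescope was launched in 1990.",
]

def generate_draft(question: str) -> str:
    """Placeholder for a large language model call; here it confidently returns a wrong year."""
    return "The James Webb Space Telescope was launched in 2007."

def unverified_years(text: str, sources: list[str]) -> list[str]:
    """Return four-digit years mentioned in the text that appear in no trusted source."""
    source_text = " ".join(sources)
    return [year for year in re.findall(r"\b(?:19|20)\d{2}\b", text) if year not in source_text]

def answer_with_grounding(question: str) -> str:
    """Generate a draft answer, then flag dates that could not be cross-checked."""
    draft = generate_draft(question)
    suspect = unverified_years(draft, TRUSTED_SOURCES)
    if suspect:
        return f"{draft}\n[Unverified dates flagged for review: {', '.join(suspect)}]"
    return draft

if __name__ == "__main__":
    print(answer_with_grounding("When was the James Webb Space Telescope launched?"))
```

Real grounding pipelines rely on retrieval over large document collections and model-based verification rather than simple string matching, but the underlying design choice is the same: check a model’s claims against an authoritative source before trusting them.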
What else captured the 2023 zeitgeist?
Several other words experienced spikes in public interest and searches on the Cambridge Dictionary website. They included:
Implosion
1) The act of falling towards the inside with force; 2) A situation in which something fails suddenly and completely.
The tragic case of the Titan submersible’s implosion led many to look up the definition.
Ennui
A feeling of being bored and mentally tired caused by having nothing interesting or exciting to do.
The notorious French robber Rédoine Faïd blamed “ennui” for his helicopter jailbreak: “The ennui provoked the escape… My addiction to liberty has consumed me.”
Ennui was also the Wordle answer on 5 June 2023.
Grifter
Someone who gets money dishonestly by tricking people.
Public figures were controversially accused of being ‘grifters’, including Prince Harry and Meghan Markle (by a Spotify executive) and Nigel Farage (by Coutts bank).
GOAT
Abbreviation for Greatest Of All Time: used to refer to or describe the person who has performed better than anyone else ever, especially in a sport.
The Qatar World Cup provoked new debates about who is the GOAT in football: Lionel Messi, Cristiano Ronaldo, or one of the late greats like Pelé or Diego Maradona?
New words, new meanings
Cambridge lexicographers added more than 6,000 new words, phrases and senses in 2023 to the Cambridge Dictionary’s 170,000+ English definitions.
Beyond hallucinate, several additions reflect rapid developments in AI and computing, such as:
Prompt engineering
In artificial intelligence, the process of designing prompts that will give the best possible results.
Large language model
A complex mathematical representation of language that is based on very large amounts of data and allows computers to produce language that seems similar to what a human might say.
GenAI
Abbreviation for generative AI: the use or study of artificial intelligences that are able to produce text, images, etc.
Train
In machine learning, to create or improve a computer representation of a system or process by supplying it with data.
Black box
A system that produces results without the user being able to see or understand how it works.
Other noteworthy additions to the Cambridge Dictionary in 2023 include:
Shadowban
An act of a social media company limiting who can see someone's posts, usually without the person who has published them knowing.
Vibe check
An act of finding out how someone is feeling or how they make you feel, or what the mood in a particular place or situation is.
Water neutral
(Of a building development, business, etc.) not using more water than was used in an area before it was built or established, or not removing more water than it replaces.
Pick up what someone is putting down (US)
To understand what someone means by their words, music, etc.
Affrilachian
An African American who comes from or lives in the region of Appalachia in the eastern United States.
Range anxiety
The fear that an electric vehicle will not have enough battery charge to take you where you want to go.
UBI
Abbreviation for universal basic income: an amount of money that is given regularly to everyone or to every adult in a society by a government or other organisation and that is the same for everyone.
Newly emerging words that are being considered for entry are shared every Monday on the Cambridge Dictionary blog, About Words.
About Cambridge Dictionary
With over 2.3 billion pageviews and over 420 million visitors per year, Cambridge Dictionary is the world’s most popular website for learners of English, and is the world’s largest free online dictionary by pageviews. It draws on the Cambridge English Corpus – a database of over 2 billion words – covering both British and American English.
The Cambridge Dictionary is completely free of charge. Its rich dictionary, thesaurus and grammar resources such as quizzes and word lists are all informed by Cambridge’s expert research in language. Uniquely, the Cambridge Dictionary allows users to toggle easily between British and American English definitions.