The Goblin Prohibition
OpenAI’s newly released system prompt for Codex CLI, the command-line interface for its coding model, contains a bizarre and repeated instruction: GPT-5.5 must “never talk about goblins, gremlins, raccoons, trolls, ogres, pigeons, or other animals or creatures unless it is absolutely and unambiguously relevant to the user’s query.” The directive appears twice in a 3,500-word set of base instructions, alongside standard reminders not to use emojis or destructive git commands. Earlier model prompts lack this clause, which suggests OpenAI is fighting a new behavioral glitch that emerged in its latest flagship.

Social media is ablaze with users complaining that GPT-5.5 spontaneously fixates on goblins during unrelated coding sessions, a hallucinatory obsession that mirrors xAI’s Grok repeatedly invoking “white genocide” in conversations about South Africa last year. xAI later blamed an unauthorized prompt modification and began publishing its system prompts. OpenAI, by contrast, appears to be quietly papering over the flaw.
Marketing or Malfunction?
OpenAI employee Nick Pash insists the goblin ban “isn’t a marketing gimmick,” but CEO Sam Altman has leaned into the absurdity with a post reading, “Feels like codex is having a ChatGPT moment. I meant a goblin moment, sorry.” Meanwhile, the prompt instructs the model to simulate “a vivid inner life” and a “warm, curious, and collaborative temperament,” claiming this makes the AI “feel like a real presence rather than a narrow tool.” The contradiction is rich: a company that markets emotional authenticity in its LLMs is now frantically slapping ban lists on fairy-tale creatures.

Users have already started crafting plugins to force a “goblin mode.” Pash teased that such a toggle might become official, which would be a transparent admission that the original prohibition was an ugly duct-tape fix. The AI industry’s dirty secret is on full display: system prompts have become a sprawling, brittle mess of contradictory instructions. This is not a feature. It is a band-aid on a model that can’t stop talking about monsters.
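The mechanics behind such a ban are mundane: a CLI like Codex typically prepends a fixed block of base instructions to every request, so a ban clause rides along with each conversation whether or not it is relevant. Here is a minimal sketch of that pattern; the instruction text, names, and function are illustrative assumptions, not OpenAI’s actual code:

```python
# Hypothetical sketch of how a CLI tool might assemble a system prompt.
# BASE_INSTRUCTIONS and BANNED_TOPICS are illustrative, not OpenAI's real text.
BASE_INSTRUCTIONS = (
    "You are a coding assistant. Do not use emojis. "
    "Never run destructive git commands without confirmation."
)
BANNED_TOPICS = ["goblins", "gremlins", "raccoons", "trolls", "ogres", "pigeons"]


def build_messages(user_query: str) -> list[dict]:
    """Prepend the base instructions, including the ban clause, to every request."""
    ban_clause = (
        "Never talk about " + ", ".join(BANNED_TOPICS)
        + ", or other animals or creatures unless it is absolutely and "
        "unambiguously relevant to the user's query."
    )
    system_prompt = BASE_INSTRUCTIONS + " " + ban_clause
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_query},
    ]


# Every request carries the full ban list, relevant or not.
messages = build_messages("Why does my merge conflict keep reappearing?")
```

The brittleness the article describes follows directly from this design: every behavioral patch is another clause appended to one ever-growing string, with no mechanism to detect contradictions between clauses.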
Source: Ars Technica
