Judging by the GPT-4o situation, AI chatbot NPCs could be a big problem
They want their old companion back, but OpenAI isn't budging. "They think that we are mentally ill," says a ChatGPT user in one of many Reddit threads lamenting OpenAI's decision to retire its GPT-4o model. Its replacement, GPT-5.2, is "abusive," says another.
Whether framed as a problem with the chatbots or a problem with their users, these emotional attachments matter beyond ChatGPT, because game developers are already experimenting with ways to put generative models into players' hands. Nvidia has shown AI-powered NPCs built on its ACE technology suite, including a demo chef that held a stilted, though coherent, conversation about ramen in 2024.
Even when LLM-powered NPCs are gimmicky, players engage with them in startlingly personal ways: "I made him think that my character was pregnant with his child, then I demanded child support, and then I told him that our child passed away," recounted one player after interacting with a generative AI NPC in the wuxia RPG Where Winds Meet.