In an amusing yet telling exchange on X (formerly Twitter), OpenAI CEO Sam Altman revealed that polite prompts like “please” and “thank you” could be costing the company tens of millions of dollars in electricity bills. The comment was in response to a user jokingly asking how much OpenAI might have lost in energy costs because people are being courteous to ChatGPT.
Altman replied candidly — and perhaps with a bit of pride — that it’s “money well spent.”
But what sounds like a quirky anecdote actually opens the door to a larger conversation about how human-AI interactions are shaping the economics, energy use, and ethics of artificial intelligence.
If this topic sounds familiar, that’s because it is. At TechBooky, we previously explored the emotional and functional role of polite interactions in AI communication in our article, “The Impact of Politeness on ChatGPT Responses”. In it, we discussed how users instinctively use manners with AI — and how this human-like communication can actually yield more helpful, context-aware responses.
Now we know those extra words may also mean more compute cycles, more processing power — and more electricity usage.
OpenAI’s GPT models aren’t just parsing your requests; they’re using massive cloud infrastructure to do so. Every additional token (“please,” “could you kindly,” “thank you”) adds to the processing load. Multiply that across the hundreds of millions of prompts ChatGPT handles every day, and the costs become very real.
There have been ongoing debates over just how power-hungry today’s AI models are. A previous report from 2023 claimed Microsoft and Google’s AI operations together consume more electricity than 100 countries — a number many dismissed as exaggerated.
To put the debate in perspective, Epoch AI recently published a more conservative estimate: ChatGPT running GPT-4o uses only about 0.3 watt-hours of electricity per response. That sounds negligible, but scale it up to hundreds of millions of prompts a day and you begin to see the impact.
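A rough back-of-envelope sketch makes the scale concrete. The 0.3 watt-hour figure is Epoch AI's estimate quoted above; the daily prompt volume used here (500 million) is purely an illustrative assumption standing in for "hundreds of millions":

```python
# Back-of-envelope sketch: scaling Epoch AI's per-response estimate.
# The 0.3 Wh per response figure is from the article; the daily prompt
# volume below is an illustrative assumption, not an official OpenAI number.

WH_PER_RESPONSE = 0.3            # Epoch AI's estimate for GPT-4o, in watt-hours
PROMPTS_PER_DAY = 500_000_000    # assumed "hundreds of millions" of prompts

daily_wh = WH_PER_RESPONSE * PROMPTS_PER_DAY
daily_mwh = daily_wh / 1_000_000        # watt-hours -> megawatt-hours
yearly_mwh = daily_mwh * 365

print(f"~{daily_mwh:,.0f} MWh per day, ~{yearly_mwh:,.0f} MWh per year")
# -> ~150 MWh per day, ~54,750 MWh per year under these assumptions
```

Under those assumptions the polite extra tokens are only a sliver of the total, but the total itself is the electricity budget of a sizeable facility running around the clock.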
But it’s not just electricity. There’s water, too.
According to a separate 2023 report, each query on ChatGPT or Microsoft Copilot could require the cooling equivalent of one bottle of water. And as AI models become more complex, their thirst grows: GPT-4, for instance, reportedly uses up to three bottles' worth of cooling water to generate just 100 words.
If your “please” and “thank you” push the reply from 90 words to 110, that's potentially more strain on energy and water resources.
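To see what that means in practice, here's a similarly rough sketch. The "three bottles per 100 words" rate comes from the report cited above; the 500 ml bottle size and the daily prompt volume are illustrative assumptions:

```python
# Rough sketch of the extra cooling water a longer reply might need.
# The "three bottles per 100 words" rate for GPT-4 comes from the report above;
# the 500 ml bottle size and daily prompt volume are illustrative assumptions.

BOTTLE_ML = 500                        # assumed size of one water bottle, in ml
ML_PER_WORD = 3 * BOTTLE_ML / 100      # ~15 ml of cooling water per generated word
EXTRA_WORDS = 110 - 90                 # the politeness-driven length increase
PROMPTS_PER_DAY = 500_000_000          # assumed daily prompt volume

extra_ml_per_reply = ML_PER_WORD * EXTRA_WORDS
extra_litres_per_day = extra_ml_per_reply * PROMPTS_PER_DAY / 1000

print(f"~{extra_ml_per_reply:.0f} ml extra per reply, "
      f"~{extra_litres_per_day:,.0f} litres extra per day")
# -> ~300 ml extra per reply, ~150,000,000 litres extra per day under these assumptions
```

The point isn't the exact number, which depends entirely on the assumed volumes, but that a small per-reply overhead compounds quickly at this scale.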
Despite the added cost, Sam Altman isn’t losing sleep. In fact, his “money well spent” remark shows that OpenAI is leaning into the human side of AI — valuing how people feel when interacting with its tools.
There’s also a strategic undertone here. If OpenAI can position ChatGPT not just as a smart assistant, but as a relatable, emotionally-aware entity, it stands to gain long-term user trust — something no amount of technical performance can replicate.
And let’s not forget: ChatGPT has quickly become one of the most widely adopted tech tools globally, surpassing 100 million users within months and serving hundreds of millions of queries per day. The cost of politeness might be high, but so is the return in brand loyalty and user retention.
All of this raises a deeper issue: Can we afford emotionally intelligent AI?
As more people treat AI like a digital companion — asking politely, joking, venting, even forming emotional bonds — the computational demand for maintaining these relationships skyrockets. Add to that the environmental toll of energy-hungry data centres, and the AI industry must now reckon with the carbon footprint of conversation.
It’s not just a cost problem — it’s a sustainability one.
This is why companies like OpenAI, Microsoft, Google, and Amazon are increasingly investing in AI-optimized infrastructure, from custom GPUs to green data centres, in a bid to balance growth with responsibility.
While it may sound ridiculous at first that “please” and “thank you” are costing OpenAI millions, the truth is far more nuanced. These tiny tokens of humanity make AI more usable, more relatable, and more embedded into daily life — which is exactly what companies like OpenAI want.
So yes, your manners might be adding to the electric bill. But in return, you’re helping to train, humanise, and refine the AI systems of the future.
As AI continues to grow more conversational and emotionally responsive, we may one day look back and see that those little “pleases” were part of a much bigger shift — toward machines that don’t just compute, but connect.