TLDR:
- Sam Altman revealed that users saying “please” and “thank you” to ChatGPT costs OpenAI tens of millions of dollars
- 67% of American users are polite to AI assistants according to a December 2024 survey
- Debate exists over ChatGPT’s electricity consumption per query (between 0.3 and 3 watt-hours)
- OpenAI expects to triple revenue to $12.7 billion this year despite increasing competition
- The company doesn’t anticipate being cash-flow positive until 2029
OpenAI CEO Sam Altman has revealed that users being polite to ChatGPT is costing the company tens of millions of dollars. In a response on X (formerly Twitter) on April 16, Altman commented that the money spent processing “please” and “thank you” messages was “tens of millions of dollars well spent — you never know.”
“tens of millions of dollars well spent – you never know”
— Sam Altman (@sama), April 16, 2025
This revelation has sparked discussion across the tech community about the hidden costs of AI interactions. Every extra token in a prompt must be processed by the model, so even short courtesy phrases add to OpenAI’s compute bill.
The statement highlights the scale at which ChatGPT is being used globally. With millions of daily users, even small courtesies add up to large expenditures for the company.
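To see how small courtesies could plausibly reach “tens of millions of dollars,” here is a rough back-of-envelope sketch. Every number in it is an illustrative assumption, not an OpenAI figure; the key modeling choice is that a standalone “thank you” message forces the model to re-read the conversation context and generate a reply, which costs far more than a few extra prompt tokens.

```python
# Back-of-envelope sketch of the "tens of millions" claim.
# ALL numbers below are illustrative assumptions, not OpenAI data.
thank_you_messages_per_day = 100_000_000   # assumed daily standalone courtesy messages
tokens_per_exchange = 500                  # assumed context re-read + generated reply
cost_per_million_tokens_usd = 2.00         # assumed blended compute cost

tokens_per_year = thank_you_messages_per_day * tokens_per_exchange * 365
annual_cost = tokens_per_year / 1_000_000 * cost_per_million_tokens_usd
print(f"~${annual_cost:,.0f} per year")
```

Under these assumed inputs the sketch lands at roughly $36.5 million a year, in the range Altman described; change any assumption and the total moves proportionally.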
Why Users Remain Polite to AI
The disclosure prompted many to question why people feel compelled to be polite to AI systems in the first place. According to a December 2024 survey by Future, 67% of American users are polite to AI assistants.
Of those polite users, 55% said they do so because they believe it is the right thing to do, while 12% admitted they were polite out of concern that mistreating AI could have future consequences.
Some users said they interact politely with AI systems out of fear that a future sentient AI might treat humans according to how it was treated in the past. This perspective reflects growing public awareness of AI development and its potential future capabilities.
Engineer Carl Youngblood offered a different motivation, suggesting that courtesy toward AI serves human development: “Treating AIs with courtesy is a moral imperative for me. I do it out of self-interest. Callousness in our daily interactions causes our interpersonal skills to atrophy.”
Technical Costs and Efficiency Improvements
The conversation around AI courtesy costs has led to debates about the energy consumption of ChatGPT queries. A September 2023 research paper by Digiconomist founder Alex de Vries claimed that a single ChatGPT query requires approximately three watt-hours of electricity.
This figure has been challenged by Josh You, a data analyst from AI research institute Epoch AI. You argues that the actual consumption is closer to 0.3 watt-hours, citing improvements in model efficiency and hardware since 2023.
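The gap between the two estimates is a simple factor of ten, but it becomes vivid at scale. The sketch below multiplies each disputed per-query figure by a hypothetical daily query volume; the volume is an illustrative assumption, not an official figure.

```python
# Scale the two disputed per-query energy figures to a hypothetical volume.
# The query count is an illustrative assumption, not an official figure.
queries_per_day = 1_000_000_000  # assumed daily ChatGPT queries

estimates_wh = {"de Vries (2023)": 3.0, "Epoch AI": 0.3}  # watt-hours per query
daily_mwh = {name: queries_per_day * wh / 1_000_000  # Wh -> MWh
             for name, wh in estimates_wh.items()}

for name, mwh in daily_mwh.items():
    print(f"{name}: {mwh:,.0f} MWh/day")
```

At the assumed volume the estimates work out to 3,000 MWh/day versus 300 MWh/day, which is why the dispute over a fraction of a watt-hour matters.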
In response to Altman’s post, some users questioned why OpenAI doesn’t adopt technical measures to reduce the cost of processing courtesy words. Such adjustments could save millions while preserving the user experience.
Altman has previously stated that the cost of AI output has been falling tenfold every year as models become more efficient. This rapid decline in operational costs suggests that the expense of processing courtesy words may become less burdensome over time.
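A tenfold annual decline is geometric decay: after n years, a fixed workload costs cost₀ / 10ⁿ. The sketch below projects that decay forward; the starting figure is an illustrative assumption chosen to match the “tens of millions” scale, not a reported number.

```python
# A cost falling tenfold per year is geometric decay: cost_n = cost_0 / 10**n.
# The starting figure is an illustrative assumption, not a reported number.
initial_annual_cost_usd = 40_000_000.0
projected = {year: initial_annual_cost_usd / 10**year for year in range(4)}

for year, cost in projected.items():
    print(f"year {year}: ${cost:,.0f}")
```

Under this assumed trend, a $40 million bill today would shrink to $40,000 within three years, which is why the courtesy expense may become a rounding error.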
Despite the current costs, OpenAI expects to more than triple its revenue this year to $12.7 billion. This growth comes amid increasing competition from companies like China’s DeepSeek.
However, the company does not anticipate becoming cash-flow positive until 2029, when it projects revenue will exceed $125 billion. This long-term outlook suggests that while courtesy costs are worth noting, they represent just one aspect of OpenAI’s complex financial picture.