This is how much a request to ChatGPT costs

by Johannes

How much a request to ChatGPT costs depends on various factors. There are free options and paid plans, but every request also generates operating costs.

This is how much a request to ChatGPT costs

ChatGPT is one of the most powerful AI systems and is aimed at both private individuals and companies. Various access models are available, which differ in price and range of functions.

  • Free access: Users can use ChatGPT free of charge, but with restrictions on availability and speed, especially during peak times, as paying customers are given priority.

  • ChatGPT Plus: With the Plus subscription for $20 per month, you get faster response times, priority access to servers, and early access to new features—ideal if you use ChatGPT regularly and professionally.

  • Enterprise licenses: Tailored plans are available for businesses, ranging from a few hundred to several thousand US dollars per month, depending on usage.

  • Cost per request: While individual requests are relatively inexpensive, operating costs add up due to high computing power and power consumption. A single request consumes around 2.9 watt-hours of electricity – around ten times more than a Google search – and leads to significant overall costs when utilization is high. Every additional word, even polite phrases such as “please” or “thank you,” increases the required computing power and thus the costs, which, according to OpenAI CEO Sam Altman, already amount to tens of millions of US dollars annually.

  • Total annual costs: Estimates vary significantly depending on the number of users and the volume of requests. With around 100 million weekly users and an average of 15 requests per person, the annual electricity costs add up to around US$30 million. Other calculations, based on around one billion requests per day and 300 million users per week, estimate up to US$140 million per year (a rough calculation is sketched after this list).
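
To make these figures easier to follow, here is a minimal back-of-the-envelope sketch in Python. The per-request energy of 2.9 watt-hours and the two usage scenarios come from the estimates above; the electricity price of about US$0.13 per kilowatt-hour is an assumption chosen for illustration, not a figure published by OpenAI.

```python
# Rough back-of-the-envelope estimate of ChatGPT's annual electricity costs.
# The per-request energy (2.9 Wh) is the figure cited in the article;
# the electricity price of 0.13 US$/kWh is an assumption for illustration.

WH_PER_REQUEST = 2.9   # watt-hours per request (cited above)
USD_PER_KWH = 0.13     # assumed average electricity price

def annual_electricity_cost(requests_per_year: float) -> float:
    """Return the estimated yearly electricity cost in US dollars."""
    kwh_per_year = requests_per_year * WH_PER_REQUEST / 1000  # Wh -> kWh
    return kwh_per_year * USD_PER_KWH

# Scenario 1: 100 million weekly users, 15 requests each per week
scenario_1 = annual_electricity_cost(100e6 * 15 * 52)

# Scenario 2: roughly one billion requests per day
scenario_2 = annual_electricity_cost(1e9 * 365)

print(f"Scenario 1: ~${scenario_1 / 1e6:.0f} million per year")  # roughly $30 million
print(f"Scenario 2: ~${scenario_2 / 1e6:.0f} million per year")  # roughly $140 million
```

Both scenarios land close to the published estimates, which suggests the headline figures are driven almost entirely by the assumed request volume and electricity price.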

Factors influencing costs and future developments

The cost of ChatGPT requests is influenced by several factors and may change in the future.

  • Energy and infrastructure costs: High electricity prices and the immense energy consumption of modern AI infrastructure are leading to rising operating expenses.

  • User growth: More and more people are using AI. With over 200 million weekly active users in 2024, the number of requests is rising massively, which drives up overall costs but can also reduce the cost per user through economies of scale.

  • Advanced features: New features such as multimodal inputs (text, image, video) may be offered as premium services, which could raise prices for certain user groups.

  • Market and competitive pressure: Competition from providers such as Google, Meta, or Microsoft could prompt OpenAI to adjust its pricing strategy in order to remain competitive.

  • Regulatory influences: Future legal requirements could result in additional costs, for example through data protection or environmental regulations, which could affect the pricing structure.
