
The Unseen Impact: GPT-5’s Energy Use Sparks Calls for Transparency


OpenAI’s GPT-5 is being hailed as a major breakthrough, but its release is also highlighting a critical, and often hidden, issue: its immense energy consumption. With no official data from the company, independent researchers are providing the first real glimpse into the model’s power demands. Their findings suggest that the enhanced capabilities of GPT-5 come at a steep and unprecedented environmental cost, leading to urgent calls for greater transparency from the AI industry.
The numbers are alarming. Researchers at the University of Rhode Island’s AI lab have found that a medium-length response from GPT-5 consumes an average of 18 watt-hours. This is a substantial increase over previous models and is “significantly more energy than GPT-4o,” according to a researcher in the group. To put this in perspective, 18 watt-hours is enough to keep a 60-watt incandescent light bulb lit for 18 minutes. Given that ChatGPT handles billions of requests daily, GPT-5’s total energy consumption could rival the daily electricity demand of 1.5 million US homes, a staggering figure that underscores the scale of the problem.
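The household comparison follows from simple arithmetic. The sketch below, assuming roughly 2.5 billion queries per day and an average US household draw of about 30 kWh per day (both illustrative assumptions rather than figures published by OpenAI), shows how the researchers’ 18 watt-hour estimate scales up to that figure:

```python
# Back-of-the-envelope check of the comparisons above.
# The per-response figure (18 Wh) comes from the URI researchers;
# the request volume and household usage are illustrative assumptions.

wh_per_response = 18          # watt-hours per medium-length GPT-5 reply (reported)
requests_per_day = 2.5e9      # assumed daily ChatGPT queries ("billions")
home_kwh_per_day = 30         # assumed average US household use, ~30 kWh/day

bulb_watts = 60
bulb_minutes = wh_per_response / bulb_watts * 60        # 18 Wh runs a 60 W bulb for 18 min

total_kwh_per_day = wh_per_response * requests_per_day / 1000
homes_equivalent = total_kwh_per_day / home_kwh_per_day

print(f"One reply ≈ a 60 W bulb for {bulb_minutes:.0f} minutes")
print(f"Total ≈ {total_kwh_per_day / 1e6:.0f} GWh per day")
print(f"Equivalent to ~{homes_equivalent / 1e6:.1f} million US homes")   # ≈ 1.5 million
```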
A key factor driving this dramatic increase is the model’s size. Although OpenAI has not released the parameter count for GPT-5, experts believe it is “several times larger than GPT-4.” That matters because a study from the French AI company Mistral found a “strong correlation” between a model’s size and its resource consumption, concluding that a model ten times bigger would have an environmental impact one order of magnitude larger. This suggests that the trend of building ever-larger AI models will continue to drive up resource usage at an alarming rate.
The new capabilities of GPT-5 also play a significant role in its high energy demands. Its advanced “reasoning mode” and its ability to process video and images require more intensive computation. A professor studying the resource footprint of AI models noted that engaging the reasoning mode could increase resource usage by a factor of “five to 10.” So even though GPT-5 reportedly uses a “mixture-of-experts” architecture, which saves energy by activating only part of the model for each query, these newer, more complex tasks are pushing the overall energy footprint to new heights. The urgent calls for transparency from AI developers are a direct response to this growing environmental concern.
