
OpenAI’s launch this week of o3-pro, its mightiest artificial-intelligence reasoning model yet, is such a big step forward that Sam Altman hailed the arrival of what he called “the gentle singularity.” “Scientific progress is the biggest driver of overall progress,” OpenAI’s chief executive wrote in a blog post on Tuesday. “It’s hugely exciting to think about how much more we could have.”
But do we have enough energy to fuel it?
The AI boom is also a growing challenge to America’s strained power grid. As of six months ago, ChatGPT was receiving more than a billion queries a day, and the number is surely much higher now. Now imagine that figure multiplied by the rise of AI reasoning models, whose accelerating capabilities will only increase their popularity and the power needed to run AI data centers.
Altman seems sanguine about that, writing in his blog post that the average ChatGPT query uses only as much electricity as a high-efficiency light bulb does “in a couple of minutes.” Many other people are worried.
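To get a rough sense of what Altman’s comparison implies at scale, here is a back-of-envelope sketch. The figures are illustrative assumptions, not measured data: a 10-watt high-efficiency LED bulb, two minutes of runtime per query, and one billion queries a day (the six-month-old figure cited above).

```python
# Back-of-envelope sketch of daily electricity implied by Altman's comparison.
# Assumptions (illustrative, not measured): a 10 W high-efficiency LED bulb,
# two minutes of bulb runtime per query, one billion queries per day.
bulb_watts = 10
minutes_per_query = 2
queries_per_day = 1_000_000_000

# Energy per query in watt-hours: power (W) times time (hours).
wh_per_query = bulb_watts * minutes_per_query / 60

# Total daily energy, converted from watt-hours to megawatt-hours.
daily_mwh = wh_per_query * queries_per_day / 1_000_000

print(f"{wh_per_query:.2f} Wh per query")
print(f"{daily_mwh:,.0f} MWh per day")
```

Under these assumptions, each query uses about a third of a watt-hour, and a billion of them add up to several hundred megawatt-hours a day for inference alone, before counting the data centers’ cooling and overhead. Small per-query numbers compound quickly at this volume, which is why the grid question matters.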