Y’know what’s kind of wild? The big AI story right now isn’t robots overtaking us or whatever — it’s that there might not be enough juice to keep the thing running. Someone basically needs a bigger power strip. Seriously.
So Sam Altman (the guy running OpenAI, who’s probably losing sleep over this) popped up at an AMD event and basically admitted they’re going to need way more electricity than your late-night gaming binge to make AI any better (sourcing this from LaptopMag, so blame them if it’s off).
AMD’s CEO, Lisa Su, grabbed the mic during her Advancing AI 2025 keynote and poked Altman about why ChatGPT keeps taking naps during the day (read: outages caused by GPU shortages). Altman’s answer, roughly: in theory, a significant chunk of the planet’s energy could eventually go toward keeping AI running. I mean, can you imagine? Lights dimming as your toaster fights a chatbot for electricity.
OpenAI’s also dealing with a GPU hiccup of its own. After shipping its shiny new GPT-4o image generator, they had to tap the brakes, because the rollout got chaotic — like handing out free ice cream at a marathon, people swarmed. Suddenly, a million new users, all thanks to those adorable Studio Ghibli-style memes. Gotta love the internet, right?
And get this: Altman joked that the GPUs were practically melting from all the meme-making frenzy — basically telling everyone, “Guys, maybe chill a bit with the pixel art?” The team had to get creative, borrowing compute wherever they could and slapping rate limits on image generation, like a parent rationing a candy jar.
Random tidbit: in an earlier interview, Altman basically said they don’t have GPUs lounging around like they’re on vacation. Now, though, OpenAI seems to have sorted out its GPU crunch well enough to ride out these viral moments… meme tsunamis included.
And if you think ChatGPT’s only good for chit-chat, Altman’s flexing, claiming it’s more powerful than any human who’s ever lived. By his numbers, an average query uses about 0.34 watt-hours — roughly what an oven burns in a second or so — plus a microscopic sip of water: around 0.000085 gallons per query. Who knew AI was so eco-aware — or, well, sort of.
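If you want to sanity-check how those tiny per-query numbers add up, here’s a back-of-envelope sketch. The 0.34 Wh and 0.000085 gal figures are the ones Altman cited; the daily query volume below is a made-up round number purely for illustration, since OpenAI hasn’t published one here.

```python
# Per-query figures as cited by Altman.
WH_PER_QUERY = 0.34           # watt-hours of electricity per average query
GALLONS_PER_QUERY = 0.000085  # gallons of water per average query

# Hypothetical volume: one billion queries per day (illustrative, not sourced).
queries_per_day = 1_000_000_000

# Scale the per-query figures up to a daily total.
daily_kwh = WH_PER_QUERY * queries_per_day / 1000  # Wh -> kWh
daily_gallons = GALLONS_PER_QUERY * queries_per_day

print(f"Energy: {daily_kwh:,.0f} kWh/day")   # -> Energy: 340,000 kWh/day
print(f"Water:  {daily_gallons:,.0f} gal/day")  # -> Water:  85,000 gal/day
```

So a billion “microscopic sips” still works out to hundreds of thousands of kilowatt-hours a day — which is exactly why the power-strip problem is real.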
Anyway, that’s your AI energy crisis saga. Cool, huh?