
GPT-5 Is My Favorite Release (and It's Not Why You Think)

GPT-5 is my favorite chat LLM release. My favorite AI release. And it’s not because of its capabilities.

Sure, the capabilities are there, and they're quite good. I'm not trying to deny that. When GPT-4 was released, I thought it was wonderful. Now we have GPT-5, and it's wonderful too. But wonderful in what way, exactly?

It makes me feel safer, more relaxed. With every release, the actual difference gets smaller and smaller while the promise gets bigger and bigger. Sam Altman posts a picture of the Death Star, and people on Reddit have meltdowns about their 4o AI boyfriend disappearing. The hype was absolutely over the top, and then GPT-5 arrives and it's… marginally better. That's it. Barely different at all.

Even Bill Gates said back in October 2023 that GPT-5 probably wouldn’t be much better than GPT-4. He called it a plateau. And he was right.

Just two years ago, everybody was saying “you should not call it AI, it’s a large language model, not AI.” And nowadays everybody calls everything AI, and nobody cares anymore. The whole AGI thing isn’t sensational anymore. The words lost their meaning.

In the past, when we talked about AI, we meant AGI: Artificial General Intelligence. By general, we meant something capable of doing many of the things humans can do, at a pretty high level. What's happening nowadays? A lot of loudmouths claiming we've already achieved AGI, or that we're about to with the next GPT. Sam likes to talk about achieving it one moment and about not achieving it the next.

Through the last couple of years, I've been riding these waves like an unskilled surfer. Sometimes I'd think, "Oh my God, I'm a software developer, I'm going to be replaced tomorrow." Then I'd dive in, experiment, try all the new tools, create stuff. And I'd realize nothing changes: we just got a new tool that makes life a little easier in some areas. Software development isn't going anywhere.

Then again, the hype train arrives, and I start feeling like, “Oh my God, maybe I’m missing out on something, maybe I don’t know some concept, maybe I need to read some papers.”

To be fair, reasoning was a real breakthrough. Out of the enormous number of papers written in the meantime, reasoning is the one thing that genuinely moved the needle. But still, I'm riding this hype train thinking, "Oh shit, everybody's talking about something changing." Some people are even crazier: they think in a couple of years, money won't exist. It's fantasy, and I think it's important to have fantasies, but it is fantasy. We're so far from even replacing software engineering that I'm not even going to talk about it. It's just ridiculous.

What's happening in reality? There are certain areas where large language models are quite helpful. Knowing this, companies started optimizing for them. Where we used to chase a model that knows everything and becomes AGI, we now build models that perform well in specific cases. We train them on synthetic datasets and reward them for succeeding at specific tasks. A model can make you feel a certain way by saying certain words in a specific order, but that's nowhere near artificial general intelligence.
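To make that concrete, here's a minimal toy sketch of the idea: a rejection-sampling loop where candidate outputs are kept only if they pass a task-specific reward check, and the survivors become synthetic training data. The `generate` stub and the JSON-validity reward are my own hypothetical placeholders, not anything any lab has published; the point is just that the optimization target is "succeed at this narrow task," not "be generally intelligent."

```python
import json
import random

# Hypothetical stand-in for a model call; in practice this would be an LLM
# sampling several candidate completions for the same prompt.
def generate(prompt: str, n: int = 4) -> list[str]:
    candidates = [
        '{"city": "Paris", "country": "France"}',
        'Paris is the capital of France.',
        '{"city": "Paris"}',
        '{city: Paris}',
    ]
    return random.sample(candidates, k=min(n, len(candidates)))

# Task-specific reward: 1.0 if the output is valid JSON with the fields we
# want, 0.0 otherwise. The task defines the reward, not "intelligence".
def reward(output: str) -> float:
    try:
        data = json.loads(output)
    except json.JSONDecodeError:
        return 0.0
    return 1.0 if {"city", "country"} <= data.keys() else 0.0

# Keep only high-reward completions and treat them as synthetic fine-tuning
# data, so the model gets better at this one narrow task.
def collect_synthetic_data(prompts: list[str]) -> list[tuple[str, str]]:
    dataset = []
    for prompt in prompts:
        for candidate in generate(prompt):
            if reward(candidate) > 0.5:
                dataset.append((prompt, candidate))
    return dataset

if __name__ == "__main__":
    pairs = collect_synthetic_data(["What is the capital of France? Answer in JSON."])
    for prompt, answer in pairs:
        print(prompt, "->", answer)
```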

I could be wrong. Nobody knows shit.

That’s why GPT-5 is my favorite — because the reality check was so brutal. All that Death Star hype, all those AGI promises, and what did we get? A slightly better chatbot. The gap between expectation and reality has never been wider. And somehow, that’s refreshing. It’s honest progress without the drama. The emperor has no clothes, and we can finally admit it.

