
Sam Altman Concedes AGI Needs More Than Just Scaling — And That’s a Big Deal
TLDR:
- OpenAI CEO Sam Altman now admits AGI needs a “mega breakthrough” beyond current scaling approaches
- Just 14 months ago, Altman claimed “we know how to build AGI” through scaling
- Shift signals industry-wide doubt about pure scaling as path to AGI
- Could impact trillion-dollar data center investments worldwide

A Major Shift from OpenAI’s CEO
In a surprising turn of events, Sam Altman — the CEO of OpenAI — has conceded that building AGI (Artificial General Intelligence) will require major breakthroughs beyond simply scaling up existing AI models.
During a recent talk, Altman said he believes there needs to be “another new architecture to find that is going to be as big of a gain as transformers were to LSTMs.” This is a striking reversal of his claim just fourteen months ago that “we now know how to build AGI as it’s usually understood.”
In early 2025, Altman was confident that continued scaling, making models larger and training them on more data, would be enough to reach AGI. Now he is actively looking for the next “mega breakthrough.”
What Changed?
The admission comes amid mounting evidence that simply making AI models bigger isn’t producing the leaps in capability that the industry had hoped for. Several prominent tech leaders have recently expressed similar doubts:
- **Elon Musk** admitted that xAI was “not built right”
- **Mark Zuckerberg** delayed Meta’s latest model release
- **Demis Hassabis** (Google DeepMind) has also signaled skepticism about pure scaling
- AI pioneers **Yann LeCun** and **Ilya Sutskever** have long advocated different approaches
Altman isn’t alone in recognizing that the scaling path may have hit diminishing returns.
What This Means for the Industry
The implications are significant. Companies around the world have been investing trillions of dollars into AI data centers, believing that bigger models would eventually achieve human-level intelligence. This new perspective from OpenAI’s CEO suggests that strategy may need reconsideration.
The transformer architecture, introduced in 2017, revolutionized AI by letting models attend to every part of an input sequence in parallel, rather than processing it one step at a time as LSTMs do. Altman now suggests we need something equally transformative.
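For readers curious what made transformers such a leap over LSTMs, their core operation is scaled dot-product attention: every token updates itself using a weighted mix of all other tokens, computed in one parallel step. Here is a minimal NumPy sketch of that idea (a toy illustration, not any production implementation):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Each row of Q (a query) is compared against every row of K (the keys),
    and the resulting weights blend the rows of V (the values)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of each query to each key
    # Softmax over the key axis, with the usual max-subtraction for stability
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # weighted mix of values, one output vector per token

# Toy example: a 4-token sequence with 8-dimensional embeddings
rng = np.random.default_rng(0)
Q = rng.standard_normal((4, 8))
K = rng.standard_normal((4, 8))
V = rng.standard_normal((4, 8))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # one updated 8-dimensional vector per token
```

Because the whole sequence is processed at once rather than step by step, this computation parallelizes well on GPUs, which is a large part of why scaling transformers worked as long as it did.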
What Comes Next?
Altman indicated that AI itself might help discover the next breakthrough architecture, with models assisting researchers in the search for where a “mega breakthrough” could emerge.
For the AI industry, this means a potential pivot. Instead of simply building larger data centers, companies may need to invest more heavily in fundamental research into new AI architectures.
Our Take
This is a watershed moment for the AI industry. When the leader of the most influential AI company admits that scaling isn’t enough, it validates what many researchers have been saying for years: we need fundamentally different approaches to achieve true AGI.
For Malaysia’s growing AI ecosystem, this shift creates opportunities. As the global industry pivots from pure scaling to architectural innovation, researchers and startups working on novel approaches may find themselves in higher demand.
The trillion-dollar question now isn’t how big we can build — but what new architecture will actually get us to AGI.
