AGI is a pipe dream until we solve one big problem, AI experts say, even as Google celebrates Gemini’s success

  • AI researchers at NeurIPS 2025 say today’s scaling approach has hit its limit
  • Despite Gemini 3’s strong performance, experts argue that LLMs still can’t reason or understand cause and effect
  • AGI remains far off without a fundamental overhaul in how AI is built and trained

Recent successes by AI models like Gemini 3 can’t disguise the sobering message that emerged this week at the NeurIPS 2025 AI conference: we may be building AI skyscrapers on intellectual sand.

While Google celebrated its latest model’s performance leap, researchers at the world’s biggest AI conference issued a warning: no matter how impressive the current crop of large language models may look, the dream of artificial general intelligence is slipping further away unless the field rethinks its entire foundation.

Researchers there broadly agreed that simply scaling today’s transformer models, giving them more data, more GPUs, and more training time, is no longer delivering meaningful returns. The big leap from GPT‑3 to GPT‑4 is increasingly seen as a one-off; everything since has felt less like breaking glass ceilings than polishing the glass.

That’s a problem not just for researchers, but for everyone being sold the idea that AGI is around the corner. The truth, according to the researchers in attendance, is far less cinematic. What we’ve built are highly articulate pattern-matchers, good at producing answers that sound right. But sounding smart and being smart are two very different things, and NeurIPS made clear that the gap isn’t closing.

The technical term being passed around is the “scaling wall”: the idea that the current approach of training ever-larger models on ever-larger datasets is running up against both physical and cognitive limits. We’re running out of high-quality human data. We’re burning enormous amounts of electricity to extract tiny marginal gains. And perhaps most troubling, the models still make the kinds of mistakes no one wants their doctor, pilot, or science lab to make.

Gemini 3 has certainly wowed people, and Google earned that by pouring resources into optimizing model architecture and training techniques rather than simply throwing more hardware at the problem. But its dominance only underscores the issue: it’s still built on the same architecture that everyone is now quietly admitting isn’t designed to scale to general intelligence. It’s the best version of a fundamentally limited system.

Managing expectations

Among the most discussed alternatives were neurosymbolic architectures. These are hybrid systems that combine the statistical pattern recognition of deep learning with the structured logic of older symbolic AI.

Others advocated for “world models” that mimic how humans internally simulate cause and effect. If you ask one of today’s chatbots what happens if you drop a plate, it might write something poetic. But it has no internal sense of physics and no actual grasp of what happens next.
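To make the neurosymbolic idea concrete, here is a deliberately toy sketch in Python, not drawn from any real system described at the conference: a stand-in “neural” component proposes scored answers (including a fluent but physically impossible one), and a stand-in symbolic layer with an explicit rule about the world rejects proposals that violate it. Every function and rule here is hypothetical, purely to illustrate the propose-then-verify pattern.

```python
# Toy neurosymbolic pipeline: a statistical model proposes answers,
# and a symbolic rule layer filters out ones that violate known logic.
# All names and rules are illustrative, not from any real system.

def neural_propose(question):
    """Stand-in for a pattern-matching model: returns scored guesses."""
    guesses = {
        "What happens if you drop a plate?": [
            ("it shatters on the floor", 0.6),
            ("it floats gently upward", 0.4),  # fluent but physically wrong
        ],
    }
    return guesses.get(question, [])

def symbolic_check(answer):
    """Stand-in for a rule engine encoding cause and effect."""
    forbidden = ["floats gently upward"]  # violates gravity
    return not any(bad in answer for bad in forbidden)

def answer(question):
    # Keep only proposals that survive the logic layer, then take the best.
    valid = [(a, s) for a, s in neural_propose(question) if symbolic_check(a)]
    return max(valid, key=lambda x: x[1])[0] if valid else None

print(answer("What happens if you drop a plate?"))  # it shatters on the floor
```

The point of the hybrid design is exactly this division of labor: the statistical side supplies fluent candidates, while the symbolic side supplies the grasp of cause and effect that today’s chatbots lack on their own.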

The proposals aren’t about making chatbots more charming; they’re about making AI systems trustworthy in environments where it matters. The idea of AGI has become a marketing term and a fundraising pitch. But if the smartest people in the room are saying we’re still missing the fundamental ingredients, it may be time to recalibrate expectations.

NeurIPS 2025 might be remembered not for what it showcased, but for admitting that the industry’s current trajectory is impressively profitable but intellectually stuck. To go further, we’ll need to abandon the idea that more is always better.


Read more @ TechRadar
