Scaling compute still fruitful in advancing AI, Google DeepMind scientist from China says
Increasing computational power and expanding training data continue to offer meaningful gains in artificial intelligence development, according to Yao Shunyu, a senior research scientist at Google DeepMind who previously worked at US-based AI company Anthropic. His comments come as researchers across the industry debate whether the era of scaling is nearing its end or whether further improvements remain achievable.
In a conversation with the Post, Yao said that the strategy of dedicating more resources to computing infrastructure and large-scale datasets is still effective and is likely to remain so for at least another year. He explained that the industry has not yet reached the point where adding more compute and data stops producing measurable advances in model performance. According to him, the real limit will emerge only when the global supply of usable training data is exhausted. Until then, he believes developers can continue benefiting from this approach.
Yao noted that despite recent shifts in sentiment from some industry leaders who argue that scaling may be losing momentum, the field has not yet reached the stage where gains flatten out completely. His remarks followed comments by OpenAI co-founder Ilya Sutskever, who said in a podcast that artificial intelligence research appears to be moving from an age defined by scaling to a new phase driven more by scientific exploration. The discussion reflects a broader question within the community about the balance between brute-force scaling and deeper algorithmic breakthroughs.
According to Yao, scaling can still unlock progress in areas that have not yet been fully explored. He described these opportunities as low-hanging fruit, pointing to model architecture adjustments, improved training efficiency, and more diverse datasets as examples of domains where scaling still delivers value. He also noted that models continue to respond positively to increases in compute, suggesting that performance gains have not yet plateaued.
At the same time, Yao acknowledged that the sector will eventually need to combine scaling efforts with more refined research to push the boundaries of what is possible. As datasets approach their natural limits and computing resources grow more expensive, the need for innovation in algorithms and data curation will become increasingly important. For now, however, he believes that scaling remains a reliable pathway for improving the capabilities of advanced AI systems.
Yao’s perspective adds to an ongoing conversation about the direction of the global AI industry. While some experts expect scaling to slow as costs climb and datasets run thin, others argue that the method continues to provide a straightforward route to stronger performance. For Yao, the industry is still in a period where increasing computing power and data can meaningfully enhance model abilities, even as researchers prepare for a future in which new scientific approaches will need to take the lead.