March 30, 2025 - On the afternoon of March 30, at the 2025 Zhongguancun Forum Annual Meeting, Kai-Fu Lee, CEO of Zero One Everything (01.AI) and Chairman of Innovation Works, said: "The cost of inference for large models is declining rapidly, by roughly a factor of ten per year, which creates a crucial condition for AI-First applications to take off."
"Models that weren't good enough two years ago are now good enough; models whose inference was too expensive two years ago are now at 'cabbage prices.'" According to Kai-Fu Lee, AI-First applications will soon take off: 2025 will be the year they explode, the year when, for large models, "landing is king."
He noted that the Scaling Law is slowing down in the pre-training phase: the amount of data available for model training has hit a bottleneck, and there are hard constraints on compute. The good news, he said, is that the industry has found a new path. The Scaling Law is shifting from the pre-training phase to the inference phase, the "slow thinking" mode. In the past, the pre-training Scaling Law meant that with more GPUs and more data, models got smarter, but that growth is now slowing. The new slow-thinking Scaling Law says that the longer a model thinks, the higher the quality of the results it produces.
"Model performance under the slow-thinking Scaling Law is growing very fast at the moment, and there is still plenty of room for improvement," Kai-Fu Lee said. "With these new technical innovations, training a model now looks more like first training a 'liberal arts student,' having the model read all the books, and then training it in the direction of a 'science student,' so that it can prove math problems and write code, ultimately producing an excellent model strong in both arts and sciences."
In Kai-Fu Lee's view, now that enterprises and users have been through the market education of the "DeepSeek moment," the Chinese market has truly awakened, clearing a major obstacle to the explosion of AI-First applications in China. This year, one focus of artificial intelligence should be to "Make AI Work," so that large models can genuinely empower every industry.
According to him, many difficult obstacles remain in deploying DeepSeek in enterprise productivity scenarios. Zero One Everything has made a strategic shift over the past few months, fully embracing DeepSeek and devoting much of its effort to turning DeepSeek's high-quality base models into customized enterprise deployment solutions. By analogy, Zero One Everything is building the Windows of the AI 2.0 era, with DeepSeek as the kernel that drives it.