Recently, Moore Threads and "Shizhe AI", a company building an AI large model for all-subject education, jointly announced the completion of a large model training test. Relying on the Moore Threads KUAE kilo-card intelligent computing cluster, Shizhe AI completed high-intensity training of a 7-billion-parameter large model in one week, with training efficiency reaching the expected level, demonstrating the capability of this domestic full-featured GPU kilo-card training platform.
Source note: The image was generated by AI and is licensed from Midjourney.
Shizhe AI was founded in 2020. Its core team comes from Tsinghua University and focuses on large education models covering all subjects. Since opening its beta, the product has gained more than 25,000 users, supports more than 30 subjects, and covers more than 2,000 textbooks.
This training test verified the performance of the Moore Threads KUAE kilo-card intelligent computing cluster in large model training, laying a foundation for future innovation in educational AI large models. The two parties will continue adaptation work on large model inference, optimizing the technology to meet high-frequency inference demands.
Liu Chunjiang, CEO of Shizhe AI, said: "This training test demonstrated the strong performance of the KUAE kilo-card intelligent computing cluster, and we are fully confident in domestic computing power. Going forward, Shizhe AI will run more of its core business on the KUAE cluster to provide users with efficient and stable computing services."