Kunlun Wanwei announces the April 17 release and open-sourcing of "Tiangong Model 3.0": 400 billion parameters, claimed to outperform Grok 1.0

The Kunlun Wanwei group recently announced via its official WeChat account that, on the first anniversary of its large model's release, "Tiangong Model 3.0" will officially enter public beta on April 17 and will be selectively open-sourced. "Tiangong 3.0" uses a 400-billion-parameter-scale MoE (mixture-of-experts) architecture; the company claims it is one of the largest and best-performing MoE models in the world, with performance exceeding Grok 1.0.

It is reported that, compared with the previous-generation "Tiangong 2.0" MoE large model, "Tiangong 3.0" shows "amazing" performance improvements in semantic understanding, logical reasoning, versatility, generalization, handling of uncertain knowledge, and learning ability. Its technical knowledge and capabilities have reportedly improved by more than 20%, and its mathematics, reasoning, coding, and creative-writing abilities by more than 30%.

"Tiangong 3.0" also adds enhanced search, a research mode, code invocation, chart drawing, and multi-round online search, and its Agent capabilities have been specifically trained: the model can independently plan, call, and combine external tools and information to complete complex tasks such as industry analysis and product comparison.

"Tiangong 3.0" is claimed to be the world's first multimodal "super model," integrating AI search, AI writing, AI long-text reading, AI dialogue, AI speech synthesis, AI image generation, AI comic creation, AI image recognition, AI music generation, AI code writing, and AI table generation. The company calls it a "super application for the large-model era."


In October last year, Kunlun Wanwei open-sourced the Skywork-13B series of tens-of-billions-parameter-scale large language models, along with a 600 GB, 150B-token open Chinese dataset.

Kunlun Wanwei's Skywork-13B series currently includes two 13-billion-parameter models: the Skywork-13B-Base model and the Skywork-13B-Math model.
