Kunlun Wanwei open-sources Skywork-MoE, a 200-billion-parameter sparse model with strong performance and lower cost
Against the backdrop of rapid progress in large-model technology, Kunlun Wanwei has open-sourced Skywork-MoE, a landmark sparse large language model. The model not only delivers strong performance but also significantly reduces inference cost, offering an effective answer to the challenges posed by large-scale dense LLMs.

Skywork-MoE model features:
- Open source and free for commercial use: Skywork-MoE's model weights and technical report are fully open source and free for commercial use, with no application required.
- Reduced inference cost: While maintaining strong performance, the model significantly reduces inference cost.
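The inference savings come from the mixture-of-experts design: each token activates only a few expert sub-networks out of the full set, so per-token compute is a fraction of an equally sized dense model. Below is a minimal sketch of generic top-2 expert routing to illustrate the idea; all names and sizes are invented for illustration and do not reflect Skywork-MoE's actual configuration or implementation.

```python
import numpy as np

# Illustrative sizes only -- not Skywork-MoE's real configuration.
HIDDEN = 64       # token hidden size
N_EXPERTS = 16    # total experts in the layer
TOP_K = 2         # experts activated per token

rng = np.random.default_rng(0)

# Each expert is a small feed-forward weight matrix; the router is a
# linear map from the hidden state to one logit per expert.
experts = [rng.standard_normal((HIDDEN, HIDDEN)) * 0.02 for _ in range(N_EXPERTS)]
router_w = rng.standard_normal((HIDDEN, N_EXPERTS)) * 0.02

def moe_layer(x: np.ndarray) -> np.ndarray:
    """Route one token through its top-k experts and mix their outputs."""
    logits = x @ router_w                 # (N_EXPERTS,) routing scores
    top = np.argsort(logits)[-TOP_K:]     # indices of the k highest-scoring experts
    gate = np.exp(logits[top] - logits[top].max())
    gate /= gate.sum()                    # softmax over the selected experts only
    # Only TOP_K of N_EXPERTS expert matmuls actually run: this sparsity
    # is where the inference-cost saving comes from.
    return sum(g * (x @ experts[i]) for g, i in zip(gate, top))

token = rng.standard_normal(HIDDEN)
out = moe_layer(token)
print(out.shape)  # (64,) -- same shape as a dense layer's output, at ~2/16 the expert compute
```

In this toy setup, 2 of 16 experts fire per token, so the expert layers cost roughly an eighth of the FLOPs of a dense layer holding the same total parameters, which is the mechanism behind the lower inference cost the announcement highlights.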