Kunlun Wanwei announces the open-sourcing of Skywork-MoE, a 200-billion-parameter sparse model with strong performance and lower cost

Against the backdrop of rapid progress in large-model technology, Kunlun Wanwei has open-sourced Skywork-MoE, a landmark sparse large language model. The model not only delivers strong performance but also significantly reduces inference cost, offering an effective answer to the challenges posed by large dense LLMs.

Skywork-MoE model features:
- Open source and free for commercial use: Skywork-MoE's model weights and technical report are fully open source and free for commercial use, with no application required.
- Reduced inference cost: while maintaining strong performance, the model significantly reduces inference cost.
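The cost advantage of a sparse mixture-of-experts model comes from routing each token to only a few of its experts, so only a fraction of the total parameters participate in any one forward pass. The toy sketch below illustrates top-K expert routing in general; it is not Skywork-MoE's actual implementation, and all names (`moe_forward`, `gate_w`, the expert count) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

D, E, K = 8, 4, 2  # hidden size, number of experts, experts activated per token

# Each "expert" here is a single weight matrix; in a real MoE model
# (e.g. Skywork-MoE) each expert is a full feed-forward block.
experts = [rng.standard_normal((D, D)) for _ in range(E)]
gate_w = rng.standard_normal((D, E))  # router (gating) weights

def moe_forward(x):
    """Route token x to its top-K experts and mix their outputs.

    Only K of the E expert matrices are multiplied, which is where the
    inference savings over a dense layer of equal total size come from.
    """
    logits = x @ gate_w
    scores = np.exp(logits - logits.max())
    scores /= scores.sum()                     # softmax over experts
    top = np.argsort(scores)[-K:]              # indices of the K highest-scoring experts
    weights = scores[top] / scores[top].sum()  # renormalize gate weights over the top-K
    y = sum(w * (x @ experts[i]) for w, i in zip(weights, top))
    return y, top

x = rng.standard_normal(D)
y, used = moe_forward(x)
print(f"ran {len(used)} of {E} experts")  # only K experts executed
```

With K=2 of E=4 experts active, roughly half the expert parameters are touched per token; production MoE models push this ratio much further (many experts, small K), which is the mechanism behind the lower inference cost claimed above.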