Fu Sheng releases Orion-14B, Orion Star's large language model with 14 billion parameters

On January 21, Orion Star released the Orion-14B model at Fu Sheng's 2024 opening AI lecture and the Orion Star large model launch event. Orion-14B is a pre-trained multilingual large language model with 14 billion parameters. It covers common languages and professional terminology, and achieved the best results among models of the same parameter scale on multiple third-party benchmark sets.

The key features of the Orion Star model include: support for ultra-long context of up to 320K tokens; an inference speed of 31 tokens/s on an entry-level (roughly 1,000-yuan) consumer graphics card; strong multilingual capability, particularly in Japanese and Korean; and a roughly 70% reduction in model size after quantization, with almost no loss in performance.
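Since the model weights are open-sourced (see the addresses at the end of this article), they can be loaded with standard tooling. The following is a minimal sketch using the Hugging Face transformers library; the checkpoint name "OrionStarAI/Orion-14B-Chat" is an assumption based on the OrionStarAI organization linked below, and the dtype and device settings should be adjusted to your hardware.

```python
# Minimal sketch: loading an open-source Orion-14B checkpoint with transformers.
# The checkpoint name below is an assumption; substitute the released model you want.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "OrionStarAI/Orion-14B-Chat"  # assumed checkpoint name

tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,   # half precision to reduce memory footprint
    device_map="auto",            # spread layers across available GPUs/CPU
    trust_remote_code=True,
)

prompt = "Introduce the Orion-14B model in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```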


To meet enterprises' application needs, Orion Star has also launched a full suite of fine-tuned models, including RAG (retrieval-augmented generation) and Agent fine-tuned models. The RAG suite can quickly integrate an enterprise's own knowledge base to build customized applications; the Agent suite can call the most suitable tool for a user's question to solve more complex problems.
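For illustration, the sketch below shows the generic RAG pattern that such a suite automates: retrieve the most relevant passages from an enterprise knowledge base and prepend them to the prompt before the model generates an answer. The toy knowledge base and keyword-overlap retriever are placeholders for whatever retrieval backend is actually used; this is not Orion Star's implementation.

```python
# Generic retrieval-augmented generation (RAG) sketch: ground the prompt in
# passages retrieved from an enterprise knowledge base before calling the model.
from typing import List

knowledge_base: List[str] = [
    "Employees accrue 15 days of paid annual leave per year.",
    "Purchase requests above 50,000 yuan require CFO approval.",
    "The VPN is mandatory when accessing internal systems remotely.",
]

def retrieve(question: str, docs: List[str], top_k: int = 2) -> List[str]:
    """Rank documents by naive keyword overlap with the question (toy retriever)."""
    q_words = set(question.lower().split())
    scored = sorted(docs, key=lambda d: len(q_words & set(d.lower().split())), reverse=True)
    return scored[:top_k]

def build_prompt(question: str) -> str:
    """Insert the retrieved passages into the prompt as grounding context."""
    context = "\n".join(f"- {doc}" for doc in retrieve(question, knowledge_base))
    return (
        "Answer the question using only the context below.\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )

# The resulting prompt would then be passed to a chat model such as Orion-14B.
print(build_prompt("How many days of annual leave do employees get?"))
```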

In addition to the large model and the fine-tuned models, Orion Star has also launched applications such as the Juyan Human Resources Assistant, the Juyan Cloud Asset Assistant, and the Juyan Creative Assistant to help companies improve operational efficiency and decision-making.

At the launch event, Fu Sheng also emphasized that enterprises need not only large models, but also large-model applications that address real pain points in combination with their business processes. Orion Star helps enterprises achieve AI-assisted decision-making by providing one-stop AI large model consulting and services.

The release of Orion Star's large model is one result of its years of continuously tracking the evolution of AI technology and making heavy investments in research and development. Its team of top algorithm scientists, together with experience from applications serving 2 billion users worldwide, has accumulated a large amount of user data and token data, providing a solid foundation for the research, development, and optimization of its models.

Orion Star is currently training a mixture-of-experts (MoE) model, and its next milestone is an intelligent model with tens of billions of parameters.
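As background on the term, the sketch below shows the basic mixture-of-experts idea: a gating network routes each token to a small subset of expert feed-forward networks, so total parameter count can grow without a proportional increase in per-token compute. The dimensions, expert count, and routing here are arbitrary illustrative choices, not details of Orion Star's model.

```python
# Minimal mixture-of-experts (MoE) layer: a router picks top-k experts per token
# and mixes their outputs. Purely illustrative; sizes and routing are arbitrary.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyMoE(nn.Module):
    def __init__(self, d_model: int = 64, n_experts: int = 4, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(d_model, n_experts)  # router producing expert scores
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.GELU(),
                          nn.Linear(4 * d_model, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, d_model). Route each token to its top-k experts.
        scores = self.gate(x)                           # (batch, n_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)  # routing decision
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e                   # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, k, None] * expert(x[mask])
        return out

moe = TinyMoE()
print(moe(torch.randn(8, 64)).shape)  # torch.Size([8, 64])
```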

Open-source addresses:

https://github.com/OrionStarAI/Orion

https://huggingface.co/OrionStarAI
