Recently, French AI startup Mistral released a new coding model, Codestral Mamba. The model is not only fast but can also process longer code, helping programmers and developers work more efficiently. Mistral has built a strong reputation in open-source AI, and the newly launched Codestral Mamba is especially eye-catching.
Codestral Mamba is built on a new architecture called "Mamba," which is more efficient than the traditional transformer architecture. Its design lets the model return results faster on complex tasks and handle input texts of up to 256,000 tokens.
Mistral tested the model on text twice as long as OpenAI's GPT-4o can handle (GPT-4o, by comparison, tops out at 128,000 tokens). Codestral Mamba will be available for free through Mistral's la Plateforme API.
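As a rough illustration of how a developer might call the model through la Plateforme, here is a minimal sketch using only the Python standard library. The endpoint path and the model identifier "open-codestral-mamba" are assumptions for illustration, not details confirmed by this article; consult Mistral's API documentation for the current values.

```python
import json
import os
import urllib.request

# Assumed chat-completions endpoint for Mistral's la Plateforme API.
API_URL = "https://api.mistral.ai/v1/chat/completions"

def build_request(prompt: str) -> dict:
    """Build a chat-completion payload asking the model for code."""
    return {
        "model": "open-codestral-mamba",  # assumed model identifier
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_request("Write a Python function that reverses a string.")

# Send the request only if an API key is configured in the environment.
api_key = os.environ.get("MISTRAL_API_KEY")
if api_key:
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])
```

Guarding the network call behind the API-key check keeps the sketch safe to run without credentials while still showing the shape of the request.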
In testing, Codestral Mamba performed well on programming tasks, surpassing many competitors, including open-source models such as CodeLlama and DeepSeek. The model is particularly well suited to local coding projects, making it a comfortable choice for developers.
Alongside Codestral Mamba, Mistral has also launched Mathstral, an AI model focused on mathematical reasoning and scientific exploration. It is designed to help users solve complex mathematical problems and is especially suited to STEM fields. Mathstral is released under the open-source Apache 2.0 license, so users can use and modify it freely.
Mistral's progress owes not only to technical breakthroughs but also to financial backing. The company recently raised $640 million at a valuation of nearly $6 billion, with investment from major companies such as Microsoft and IBM. Mistral looks set to continue playing an important role in the AI field.