Raised $320 million! AI programming company Magic's new model supports 100 million token context windows

Magic, a startup focused on generative AI programming, recently announced the completion of a funding round worth as much as $320 million.

The round was led by former Google CEO Eric Schmidt and attracted many well-known investors, including Alphabet's CapitalG, Atlassian, Elad Gil, Jane Street, Nat Friedman and Daniel Gross, and Sequoia Capital. It brings Magic's total funding to nearly $465 million, placing the company among the best-funded AI programming startups.

Very long context window

One of Magic's distinctive technologies is its extremely long context window, dubbed "Long-Term Memory" (LTM). This lets its models take in far larger amounts of input data, helping to avoid deviations in the generated code.

Magic’s latest model, LTM-2-mini, can handle 100 million tokens, equivalent to about 10 million lines of code or 750 novels. Magic says its computational efficiency when decoding tokens is 1,000 times that of traditional attention-based models. Compared with mainstream models on the market, such as Llama 3.1, LTM-2-mini has much smaller memory requirements while demonstrating a strong performance advantage.
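As a back-of-envelope check on the equivalences above (the tokens-per-line rate is an illustrative assumption, not a figure from Magic; the tokens-per-novel rate is simply implied by the article's own numbers):

```python
# Sanity-check the context-window equivalences cited in the article.
context_tokens = 100_000_000  # LTM-2-mini's stated context window

# Assumed rate (not from Magic): roughly 10 tokens per line of code.
tokens_per_line = 10
lines_of_code = context_tokens // tokens_per_line
print(f"~{lines_of_code:,} lines of code")  # ~10,000,000

# Rate implied by the article's own figures: 100M tokens ≈ 750 novels.
tokens_per_novel = context_tokens // 750
print(f"~{tokens_per_novel:,} tokens per novel")  # ~133,333
```

At ~133k tokens per novel (on the order of a 100k-word book), both stated equivalences are internally consistent.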


Official blog: https://magic.dev/blog/100m-token-context-windows

Partnering with Google Cloud

Alongside the fundraise, Magic announced a partnership with Google Cloud to build two "supercomputers" on Google Cloud Platform: Magic-G4, which will be equipped with Nvidia H100 GPUs, and Magic-G5, which will use Nvidia's next-generation Blackwell chips, expected to launch next year.


Magic said it will eventually expand its computing power to "tens of thousands" of GPUs, a cluster capable of roughly 160 exaflops, or 1.6×10^20 floating-point operations per second.

“We are very excited to work with Google and Nvidia to build the next generation of AI supercomputers on Google Cloud,” Magic founder and CEO Eric Steinberger said in a statement. He noted that Nvidia’s Blackwell systems will greatly improve the efficiency of inference and training for its models, while Google Cloud will let the company scale in the shortest time and draw on a rich ecosystem of cloud services.

Magic was founded in 2022 by Steinberger, a former AI researcher at Meta, and Sebastian de Ro, the former CTO of German business process management company FireStart.

The team currently numbers about twenty people. Although not yet profitable, Magic aims to help software engineers automate the writing, reviewing, debugging, and planning of code changes, in pursuit of a more efficient way of coding.
