Zhipu AI announced that GLM-4-Long, an LLM supporting ultra-long context lengths, has been launched on the open platform bigmodel.cn. The model is designed for processing ultra-long texts and can read the equivalent of two copies of "A Dream of Red Mansions" or 125 academic papers in a single pass. It targets scenarios such as translating long documents, analyzing financial reports in their entirety, extracting key information, and building chatbots with ultra-long memory.
GLM-4-Long also has a significant price advantage, with input and output prices as low as 0.001 yuan per thousand tokens; at that rate, filling the full 1M-token context costs roughly 1 yuan, offering an economical and efficient option for enterprises and developers. Across successive iterations, the model has pursued leading context capabilities, growing from an initial 2K context to the current 1M context length and incorporating a substantial body of research on long-text processing.
In the "needle in a haystack" test, GLM-4-Long retrieved information without loss across the full 1M context length, demonstrating strong long-context performance. It also performed well in practical application tests such as financial report reading, paper summarization, and novel reading, accurately extracting and analyzing key information.
For enterprises, GLM-4-Long brings notable advantages, including deeper conversation understanding, complex document processing, more coherent content generation, and stronger data analysis. These capabilities are particularly valuable in customer service, law, finance, scientific research, marketing, advertising, and big data analysis.
Interface documentation:
https://bigmodel.cn/dev/api#glm-4
Experience Center:
https://bigmodel.cn/console/trialcenter
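
Below is a minimal sketch of calling GLM-4-Long through the Zhipu open platform's Python SDK (zhipuai) for the long-document analysis use case described above. The model identifier "glm-4-long", the example file name, and the prompt wording are assumptions for illustration; confirm the exact model name and parameters against the interface documentation linked above.

    # Minimal sketch: ask GLM-4-Long to extract key points from a long document.
    # Assumes the zhipuai Python SDK is installed and an API key from bigmodel.cn.
    from zhipuai import ZhipuAI

    client = ZhipuAI(api_key="YOUR_API_KEY")  # key issued on bigmodel.cn

    # Load a long document (e.g. a full financial report) to send in one request.
    with open("annual_report.txt", "r", encoding="utf-8") as f:
        long_document = f.read()

    response = client.chat.completions.create(
        model="glm-4-long",  # assumed model id; see the API docs for the exact name
        messages=[
            {"role": "system", "content": "You are an assistant that summarizes long documents."},
            {"role": "user", "content": long_document
                + "\n\nPlease list the key financial figures and main conclusions of this report."},
        ],
    )

    print(response.choices[0].message.content)

The same chat-completions call pattern applies to the other scenarios mentioned above (long-document translation, paper summarization, long-memory chatbots); only the prompt and the supplied text change.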