Musk has announced that his artificial intelligence startup xAI's large language model, Grok-2, will launch in August and will bring even more advanced AI capabilities. While Grok-2 has not yet been unveiled, Musk has already begun building momentum for its successor, Grok-3.
Musk said that training AI chatbots requires large datasets and that a great deal of work goes into cleaning existing data for large language models (LLMs). He also addressed several issues arising from training on the outputs of OpenAI models.
He revealed that xAI's Grok-3 is being trained on 100,000 NVIDIA H100 chips; it is expected to be released by the end of the year and is said to be "very special".
The H100 is an AI chip developed by NVIDIA specifically for handling large language model (LLM) workloads. Each NVIDIA H100 is estimated to cost around $30,000 to $40,000 (currently around RMB 219,000 to RMB 292,000), with possible discounts for bulk purchases.
Simple calculations show that the 100,000 NVIDIA H100s used by xAI are worth $3 billion to $4 billion (IT Home note: currently about RMB 21.868 billion to RMB 29.157 billion). Musk has previously mentioned that Tesla's purchases from NVIDIA this year are estimated at $3 billion to $4 billion, so it is reasonable to assume that xAI is using the NVIDIA chips purchased by Tesla for training.
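For reference, the back-of-the-envelope arithmetic behind that valuation is sketched below in Python, assuming the per-chip prices quoted above and an illustrative exchange rate of roughly 7.29 yuan per dollar (the rate is an assumption chosen to match the quoted RMB figures, not something stated in the article).

```python
# Rough estimate of the H100 hardware cost cited above.
# Assumptions: 100,000 GPUs at $30,000-$40,000 each; USD/CNY ~7.29 (illustrative).

NUM_GPUS = 100_000
PRICE_LOW_USD = 30_000
PRICE_HIGH_USD = 40_000
USD_TO_CNY = 7.29  # assumed exchange rate, not from the article

low_usd = NUM_GPUS * PRICE_LOW_USD    # $3.0 billion
high_usd = NUM_GPUS * PRICE_HIGH_USD  # $4.0 billion

print(f"Estimated H100 spend: ${low_usd / 1e9:.1f}B - ${high_usd / 1e9:.1f}B")
print(f"In RMB: ~{low_usd * USD_TO_CNY / 1e9:.2f}B - {high_usd * USD_TO_CNY / 1e9:.2f}B yuan")
```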