According to Korean technology media KED Global, Samsung has assembled an "elite team" of about 100 top engineers to win high-bandwidth memory (HBM) orders for Nvidia's next-generation artificial intelligence graphics processors (AI GPUs). The team has been working to improve manufacturing yield and quality, with the primary goal of passing Nvidia's testing.
According to industry insiders, Nvidia CEO Jensen Huang is not satisfied with the yield and quality of the 8-layer and 12-layer HBM3E memory currently provided by Samsung and has asked the company to make improvements. HBM3E memory is a key component of Nvidia's next-generation Hopper H200 and Blackwell B200 AI GPUs, but it is currently supplied mainly by Samsung's Korean rival SK Hynix.
Samsung developed the world's first 36GB 12-layer HBM3E memory in February and hopes to pass Nvidia's quality tests this month. The company has also reportedly reserved production lines in advance to ramp up output and meet Nvidia's growing demand.
"Samsung aims to capture a higher market share by rapidly increasing its supply to Nvidia," the source said. "Samsung is expected to speed up its supply in the third quarter."
"We fell behind in the first battle, but we have to win the second one," said Kyung Kye Hyun, president and CEO of Samsung's chip business.
SK Hynix currently supplies 8-layer HBM3E memory for Nvidia's new H200 and B200 AI GPUs and has also provided 12-layer HBM3E samples for performance evaluation, with supply of the 12-layer version expected in the third quarter of 2024.