December 11th. IBM announced the development of a new optical technology that can train AI models at the speed of light while saving significant amounts of energy. The company says that applying the breakthrough in data centers would save, for each AI model trained, as much energy as 5,000 U.S. homes consume in a year.
The company explains that while data centers are connected to the outside world via fiber optic cables, copper wires are still used internally. These copper wires connect GPU accelerators, which spend much of their time idle waiting for data from other devices, consuming energy and driving up costs.
According to Dario Gil, senior vice president and director of research at IBM, "As generative AI demands more energy and processing power, data centers must evolve, and Co-Packaged Optics (CPO) technology can future-proof these data centers. With this breakthrough, the chips of the future will communicate much as fiber optic cables carry data in and out of data centers, ushering in a new era of faster, more sustainable communications capable of handling the AI workloads of the future."
1AI notes that IBM outlined its new CPO prototype in a technical paper. By significantly increasing bandwidth within the data center, the technology minimizes GPU idle time and accelerates AI processing. IBM claims that training time for large language models (LLMs) could be cut from three months to three weeks, while the improved energy efficiency would lower power consumption and reduce the costs associated with training LLMs.
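To put those figures in perspective, here is a rough back-of-envelope sketch. The three-month and three-week training times and the 5,000-home comparison come from IBM's announcement; the per-home annual consumption of roughly 10,500 kWh is an assumed U.S. average, not a figure IBM provided.

```python
# Back-of-envelope math based on the figures cited in IBM's announcement.
# Assumption: an average U.S. home uses roughly 10,500 kWh per year
# (a commonly cited average, not part of IBM's announcement).

baseline_training_days = 90    # "three months" for one LLM training run
cpo_training_days = 21         # "three weeks" claimed with co-packaged optics
speedup = baseline_training_days / cpo_training_days
print(f"Implied training speedup: ~{speedup:.1f}x")

homes_equivalent = 5_000
kwh_per_home_per_year = 10_500  # assumed annual U.S. household usage
energy_saved_gwh = homes_equivalent * kwh_per_home_per_year / 1e6
print(f"Claimed energy savings per model trained: ~{energy_saved_gwh:.1f} GWh")
```

Under these assumptions, the claim works out to roughly a 4x reduction in training time and on the order of 50 GWh saved per model trained.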