According to a Dec. 22 report in the Wall Street Journal, OpenAI's next-generation large language model, GPT-5, is behind schedule in development, and the results achieved so far have not reached a level commensurate with its enormous cost.
The news echoes an earlier report in The Information, which suggested that OpenAI was seeking a new strategy because GPT-5 may not achieve the significant performance leaps that previous models did. The Wall Street Journal report reveals further details of the 18-month development of GPT-5, code-named Orion.
OpenAI has reportedly completed at least two large training runs aimed at improving the model's performance through massive-scale data training. The first run was slower than expected, signaling that larger-scale training would be both time-consuming and costly. While GPT-5's performance is said to be superior to that of its predecessors, it has not yet progressed far enough to justify the huge cost of keeping the model running.
The report also says that, in addition to relying on publicly available data and licensing agreements, OpenAI employs people to create entirely new data by writing code or solving math problems. The company is also using synthetic data generated by another of its models, o1.
As of 1AI's press time, OpenAI had not yet responded; the company has previously said that it will not release a model code-named "Orion" this year.