Early this morning, OpenAI launched the Batch API for developers: results are returned within 24 hours, and batched requests receive a 50% discount on API costs.
The new Batch API is designed for asynchronous task processing. When developers need to process large volumes of text, images, or summaries, they can submit the work through this API and OpenAI will return the results within 24 hours. This lets OpenAI run the jobs during off-peak hours, saving server resources, while offering developers a 50% discount and higher rate limits.
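Each request in a batch is submitted as one line of a JSONL input file. The snippet below is a minimal sketch of that format; the custom_id values, model, and prompt are placeholders, not part of the announcement.

```jsonl
{"custom_id": "request-1", "method": "POST", "url": "/v1/chat/completions", "body": {"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "Summarize the following article..."}]}}
{"custom_id": "request-2", "method": "POST", "url": "/v1/chat/completions", "body": {"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "Summarize the next article..."}]}}
```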
The new Batch API supports the use of the following models:
- gpt-3.5-turbo
- gpt-3.5-turbo-16k
- gpt-4
- gpt-4-32k
- gpt-4-turbo-preview
- gpt-4-turbo
- gpt-3.5-turbo-0301
- gpt-3.5-turbo-16k-0613
- gpt-3.5-turbo-1106
- gpt-3.5-turbo-0613
- gpt-4-0314
- gpt-4-turbo-2024-04-09
- gpt-4-32k-0314
- gpt-4-32k-0613
OpenAI has already described how to use it in the API documentation, including how to create, retrieve, and cancel batches. The official resources for developers who need them are listed below, followed by a brief usage sketch:
- Batch API Examples - OpenAI API Documentation
- Batch API FAQs | OpenAI Help Center
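As a rough illustration of that create/retrieve/cancel workflow, the sketch below uses the official OpenAI Python SDK to upload a JSONL file of requests, create a batch with a 24-hour completion window, check its status, and optionally cancel it. The file name and the status-checking logic are assumptions for illustration; consult the documentation linked above for details.

```python
# Minimal sketch of the Batch API workflow with the official OpenAI Python SDK.
# Assumes OPENAI_API_KEY is set and "batch_input.jsonl" (format shown earlier) exists.
from openai import OpenAI

client = OpenAI()

# 1. Upload the JSONL file containing the batched requests.
batch_file = client.files.create(
    file=open("batch_input.jsonl", "rb"),
    purpose="batch",
)

# 2. Create the batch; results are promised within the 24-hour completion window.
batch = client.batches.create(
    input_file_id=batch_file.id,
    endpoint="/v1/chat/completions",
    completion_window="24h",
)
print("Batch created:", batch.id, batch.status)

# 3. Retrieve the batch later to check progress and fetch the output file.
batch = client.batches.retrieve(batch.id)
if batch.status == "completed":
    output = client.files.content(batch.output_file_id)
    print(output.text)

# 4. Cancel the batch if the results are no longer needed.
# client.batches.cancel(batch.id)
```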