October 30, 2024 - Early this morning Beijing time, Google CEO Sundar Pichai gave an update on Project Astra during the company's third-quarter earnings call. He said Google is building AI experiences that can recognize and reason about the user's surroundings, adding: "Project Astra represents an initial exploration of this future. We are striving to launch similar experiences as early as 2025."
Note: Project Astra is a suite of new technologies that Google showcased at its I/O developer conference in May 2024, including an AI assistant that recognizes its surroundings and answers questions, as well as smartphone apps that perform actions on the user's behalf. The project is based on Gemini and can run natively on Pixel phones; it can be seen as Google's latest effort to benchmark against OpenAI's GPT-4o.
According to an earlier official Google demo, Project Astra can answer questions about objects in the smartphone camera's field of view, such as what neighborhood the user is in or the name of a broken bicycle part.
This means Google will have to wait until at least next year to launch Project Astra's planned core technology. The project is a broad attempt to develop AI applications and "intelligent agents" with real-time, multimodal understanding.
According to a report by The Information this month, Google plans to launch a consumer-facing "intelligent agent" service as early as December of this year, with features such as product purchases and flight bookings. However, unless that experience has nothing to do with Project Astra, it now seems unlikely this goal will be met on time.