Demis Hassabis, the head of Google DeepMind, said that the development of AI could pose an existential threat to humanity on a par with climate change.
He told The Guardian that he feared humans might develop out-of-control superintelligent systems, among other dangerous possibilities, and that we must take the risks of artificial intelligence as seriously as we take climate change. He also warned that AI technology could make it easier to create biological weapons. As with climate change, the international community may be slow to coordinate an effective global response, and we will pay the price for that delay. He therefore stressed that we cannot afford to respond as slowly to AI.
Source note: The image was generated by AI and is used with authorization from Midjourney.
Although AI has great potential in many areas, such as medicine, Hassabis called for the establishment of an independent body to regulate AI, similar to the United Nations' Intergovernmental Panel on Climate Change (IPCC), a view backed by former Google CEO Eric Schmidt.
In fact, one day after Hassabis’ interview was published, Google, Microsoft, OpenAI, and Anthropic announced a $10 million AI Safety Fund to promote research on effective testing and evaluation of the most capable AI models. Hassabis praised the initiative in a post on X (formerly known as Twitter), saying that we are at a critical moment in the history of artificial intelligence.
While experts in the field have publicly expressed concerns about AI safety and ethics, there are doubts about how seriously figures like Hassabis and companies like Google take those concerns. In late 2020, Google fired AI ethicist Timnit Gebru, and in early 2021 AI researcher Margaret Mitchell, after they co-authored a controversial paper that Google said did not meet its publication standards. The paper raised several risks of large AI models that now seem prescient: their environmental impact, their effects on marginalized communities, bias in training data, datasets so large they are difficult to audit, and their potential to be used to deceive people.