Experts warn: AI risks like pandemics and nuclear war
Leading experts warn that the threat from AI demands global priority alongside other societal-scale risks

Experts are sounding the alarm: according to leading researchers, artificial intelligence (AI) poses risks comparable to those of pandemics or nuclear war. In their view, these risks must not be underestimated but treated as a serious threat.
The experts' statement on the risks of AI is brief but dramatic: "Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war."
Among the signatories of this statement are prominent names such as Sam Altman, the head of OpenAI, the company behind the chatbot ChatGPT, which has sparked a flurry of AI enthusiasm in recent months. Other signatories include Demis Hassabis, the CEO of DeepMind, a Google subsidiary specializing in AI, and Geoffrey Hinton, one of the field's leading researchers.
The AI experts' appeal was posted on the website of a nonprofit organization, the San Francisco-based Center for AI Safety. There, the signatories highlight various potential dangers of AI, such as its use in warfare or in the development of new chemical weapons. The organization also warns of the spread of misinformation through the technology, as well as a future in which humanity could become completely dependent on machines.
Warren Buffett draws parallels between artificial intelligence and the development of the atomic bomb. The billionaire warns of the incalculable risks associated with AI.
A few weeks earlier, another organization had already published an open letter, signed by tech billionaire Elon Musk among others, proposing a six-month pause in AI development to work out regulatory approaches to the technology. It later emerged that Musk was at that time on the verge of founding his own AI company.
