
ChatGPT creator OpenAI delivers warning on the risk of artificial intelligence

The start-up’s top bosses have called for the regulation of AI and warned that 'superintelligence' is comparable to nuclear energy in terms of the risks it poses to mankind

The Daily Telegraph, New York | Published 24.05.23, 04:05 AM

ChatGPT creator OpenAI has warned that superhuman artificial intelligence (AI) posing an “existential risk” to humanity could become reality within just ten years.

The start-up’s top bosses have called for the regulation of AI and warned that “superintelligence” is comparable to nuclear energy in terms of the risks it poses to mankind.

AI scientists from companies including OpenAI, which is thought to have secured $10 billion in backing from Microsoft for its ChatGPT technology, are racing to create software capable of making decisions as a human would.

OpenAI’s chief executive Sam Altman, president Greg Brockman, and chief scientist Ilya Sutskever said that it was “conceivable” that AI could surpass expert-level abilities “in most domains” within the next 10 years.

“We must mitigate the risks of today’s AI technology too, but superintelligence will require special treatment and coordination,” the company’s bosses said in a blog post.

“We are likely to eventually need something like an [International Atomic Energy Agency] for superintelligence efforts; any effort above a certain capability (or resources like compute) threshold will need to be subject to an international authority,” they added.
