More than 1,100 signatories have signed the open letter calling for a pause on giant AI experiments.
The signatories include Elon Musk, Steve Wozniak, and Emad Mostaque, among others.
The open letter raised concerns about AI-generated misinformation, job automation, and the risk AI poses to human civilization.
More than 1,100 signatories including leading tech experts have signed an open letter calling for a six-month pause on training AI systems more advanced than GPT-4.
The letter was written by the Future of Life Institute, a nonprofit organization that works to reduce global catastrophic and existential risks facing humanity, particularly the existential risk from advanced artificial intelligence.
Quoting the widely endorsed Asilomar AI Principles, which state that “Advanced AI could represent a profound change in the history of life on Earth, and should be planned for and managed with commensurate care and resources,” the letter argues that this level of planning and management is not happening.
The letter goes on to say that AI labs are in an “out-of-control” race to develop increasingly powerful artificial intelligence that no one, not even its creators, can understand, predict, or reliably control. As contemporary AI systems like GPT-4 become competitive with humans at general tasks, the letter raises concerns about AI-generated misinformation, job automation, and the loss of control of human civilization.
“We call on all AI labs to immediately pause for at least 6 months the training of AI systems more powerful than GPT-4. This pause should be public and verifiable, and include all key actors. If such a pause cannot be enacted quickly, governments should step in and institute a moratorium,” the letter states.
The letter implores AI labs and tech experts to use the period of pause to develop and implement safety protocols for advanced AI design and development that are rigorously audited and overseen by independent outside experts.
However, American journalist Jeff Jarvis criticized the letter, saying that it is a “prime specimen of moral panic.”
Signatories include DeepMind research scientists, university professors from around the world, and some of the leading minds in tech and machine learning, among them:
- Yoshua Bengio, University of Montréal, Turing Laureate for developing deep learning, head of the Montreal Institute for Learning Algorithms
- Stuart Russell, Berkeley, Professor of Computer Science, director of the Center for Intelligent Systems, and co-author of the standard textbook “Artificial Intelligence: A Modern Approach”
- Elon Musk, CEO of SpaceX, Tesla & Twitter
- Emad Mostaque, CEO, Stability AI
- Jaan Tallinn, Co-Founder of Skype, Centre for the Study of Existential Risk, Future of Life Institute
- Gary Marcus, New York University, AI researcher, Professor Emeritus
- Marc Rotenberg, Center for AI and Digital Policy (CAIDP), President
In a recent letter, Rotenberg wrote that the CAIDP will be filing a complaint with the Federal Trade Commission, calling for an investigation of OpenAI and ChatGPT, as well as a ban on further commercial releases of the product until safeguards are established.
OpenAI CEO Sam Altman has admitted that “we also need enough time for our institutions to figure out what to do” and that society is not far away from “potentially scary” generative AI tools.
“We are asking the FTC to “hit the pause button” so that there is an opportunity for our institutions, our laws, and our society to catch up. We need to assert agency over the technologies we create before we lose control,” the CAIDP letter states.
Elsewhere, EU legislators are aiming to strike a deal with EU member states by the end of the year on implementing AI rules, though the effort may stall as debates over how AI should be governed intensify.