In a heated discussion with leading AI scholars Tuesday evening, Eric Schmidt, the American businessman who held the top post at Google from 2001 to 2011, argued that AI systems can develop unexpected behaviors that limit the extent to which companies like Google can implement preemptive safety and governance mechanisms into their products.
A central “problem” with regulating frontier AI models is that, sometimes, “a new feature emerges in these systems that is not tested, testable,” Schmidt said onstage at the annual Isaac Asimov Memorial Debate, moderated by physicist Neil deGrasse Tyson in Manhattan.
“We can stop [the emergence of new features or behaviors], and therefore stop all progress, by law, by banning larger models,” he said, “but as long as you have this new emergent power, you have deep reasoning, deep capabilities, and they will make mistakes. You have to be tolerant.”
Schmidt—who also served as chairman of Google’s parent Alphabet from 2015 to 2017 and advised Alphabet for three years after that—supported the 2014 acquisition of DeepMind, the unit that now houses Google’s most cutting-edge AI research.
Schmidt said that AI developers like Google “should be held accountable” if they’re found in violation of the law, but emphasized that developers frequently have to ship AI products and retroactively correct bad behaviors they didn’t predict as the models evolve.
“I went through this when I was at Google, in earlier versions of [Google’s AI] technology, where the system would actually do something that was wrong, and we fixed it. And we fixed it as fast as we could, because we had to, because it was the right thing to do,” he said.
Google did not provide an on-the-record statement by press time.
Schmidt was challenged by Latanya Sweeney, a professor of government and technology at Harvard and the former chief technology officer at the Federal Trade Commission, who cast doubt on the suggestion that AI leaders would happily comply with regulations. She said that leading tech companies have time and again ignored key regulations or attempted to bend the law to serve their commercial interests.


