Google DeepMind CEO Demis Hassabis has called for an international AI governance body modeled on the International Atomic Energy Agency, complete with safety benchmarks, certification standards, and independent audits of frontier AI systems. The proposal comes as Hassabis warns that the timing could hardly be worse: the most powerful technology humanity has ever built is emerging during a period of deep international division.

"It's sort of crazy the timing that we're in with this most consequential maybe technology the world's ever seen, at the same time as a very fragmented international system," Hassabis told Harry Stebbings on the 20VC podcast, "Hvylya" reports.

Hassabis identified two core risks. The first is misuse by bad actors: AI systems are dual-use technologies that can advance science and medicine but can also be repurposed for harm. The second is technical: ensuring that increasingly autonomous, agentic systems, which he expects within a year or two, remain within the guardrails their developers set.

His proposed solution: national AI safety institutes, like the one established in the UK under former Prime Minister Rishi Sunak, that feed into a global body. That body would define benchmarks for undesirable properties. Hassabis singled out deception as one trait no lab should be building into its models: "Nobody should be building systems that are capable of deception because then they could be getting around other safeguards." He also flagged AI systems communicating in non-human-readable tokens as a vulnerability that should be prohibited.

The end goal, Hassabis said, is a certification process - "almost like a kite mark of quality" - that guarantees a model meets certain safety standards, giving consumers and companies confidence to build on top of it. He emphasized that any such framework must be international, since AI systems operate across borders and no single country's regulations can contain them.
