The AGI Definition That Changes Everything
A newly unsealed 2019 contract between Microsoft and OpenAI has laid bare the single most consequential phrase in artificial intelligence: AGI is defined as “a highly autonomous system that outperforms humans at most economically valuable work.” This definition, revealed in the ongoing Musk v. Altman trial, is not just academic. It is the contractual tripwire that could sever Microsoft’s access to OpenAI’s most advanced models the moment such a system emerges. The agreement, made public as a court exhibit, shows that once AGI is achieved, the revenue sharing and technology licensing that have defined this partnership evaporate. Microsoft gets cut out. OpenAI walks away free.
The Conflict of Interest No One Wants to Discuss
This clause exposes a dangerous incentive structure. OpenAI has every financial and strategic reason to delay declaring AGI, because doing so would instantly end the billions in Microsoft funding and cloud credits that keep the lights on. Meanwhile, Microsoft is pouring money into development while bound by a definition that locks it out of the very future it is funding. This is not a partnership of equals. It is a hostage situation dressed up as innovation. The trial has ripped the veil off this arrangement, and the implications for AI governance are stark. If AGI is defined by economic outperformance, then the companies building it get to decide when the threshold is crossed, with no independent oversight. That is a recipe for self-interested gatekeeping, not progress. The public deserves to see the full contract and understand precisely when this trigger gets pulled.
Source: The Verge
