The Spectre haunting the "AI Safety" Community
Published on February 21, 2026 11:14 AM GMT

I’m the originator of ControlAI’s Direct Institutional Plan (the DIP), built to address extinction risks from superintelligence.

My diagnosis is simple: most laypeople and policymakers have not heard of AGI, ASI, extinction risks, or what it takes to prevent the development of ASI.

Instead, most AI policy organisations and think tanks act as if “persuasion” were the bottleneck. This is why they care so much about respectability, the Overton Window, an