[ENTITY FILE] SUBJECT-9081 PERSON ACTIVE
// Subject

Ilya Sutskever

Co-founder, Safe Superintelligence Inc.
Tracked core AGI developer who privately acknowledged catastrophic risks while publicly advancing the technology; central to preparedness/advocacy contradiction analysis.
// Editorial summary — AI-generated from public records

Co-founder and former Chief Scientist of OpenAI. Key figure in AGI development who reportedly advocated for bunker construction before AGI release. Co-founder of Safe Superintelligence Inc.

Facts on record: 1
Connections mapped: 0
Sources cited: 1
Stated vs Revealed
No documented contradictions on file.
Facts (1)
Data Freshness
Status: Fresh · Last update: 22d ago · Avg age: 547d
Confidence Tiers:
Primary Source — cross-referenced government/corporate filings
Pending Review — sourced but not independently verified
AI Inference — analytical hypothesis from cross-referencing
Raw Filing Records (1) — unsourced metadata
Pending Review — Former OpenAI Chief Scientist who reportedly told colleagues: "We're definitely going to build a bunker before we release AGI." This statement directly links AGI development milestones to personal survival preparation, representing an internal acknowledgement by a core developer that the technology they are building may pose a catastrophic threat.
Date: 2023-06-01 · Added: 15 Apr 2026
All Connections (0)
No connections documented.
Sources (1)
2023-06-01 · UNVERIFIED · Ilya Sutskever reportedly told colleagues about AGI bunker contingency (news) · Raw