GOBLIN HOUSE
Claim investigated: The extent to which OpenAI's models are being used for surveillance, targeting, or other applications with civil liberties implications is not publicly documented.
Entity: OpenAI
Original confidence: inferential
Result: STRENGTHENED → SECONDARY
Source: External LLM (manual handoff)
The claim is strongly supported by OpenAI's confirmed defense contracts (CDAO awards for "frontier AI projects") and by the comprehensive opacity surrounding their execution. Multiple established facts confirm that specific use cases, technical specifications, and capabilities are not publicly disclosed, while classification and trade-secret protections would legally bar documentation of surveillance or targeting applications. No established fact provides a public record of such uses, reinforcing that their extent remains undisclosed.
Reasoning: Primary facts confirm OpenAI holds defense contracts (CDAO), while secondary facts establish that use cases (#14), technical specifications (#14, #15), and capabilities (#24) are not publicly documented. Classification and trade-secret protections (#12, #13, #24) create legal barriers to disclosing surveillance or targeting applications, and the absence of any public record of such uses across all established facts confirms the documentation gap. This combination of confirmed defense engagement, legal secrecy frameworks, and verified non-disclosure elevates the inference to well-supported secondary confidence.
USASpending: "OpenAI" AND (surveillance OR targeting OR "civil liberties" OR intelligence OR reconnaissance OR ISR)
Contract descriptions may contain redacted or generic references to use cases; their absence would itself support the non-disclosure finding
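The USASpending query above can be run programmatically against the public api.usaspending.gov v2 API. A minimal sketch follows; the endpoint path and payload shape follow the published API, but the exact field names, how multiple keywords are combined, and the award-type codes should be checked against the current API documentation before relying on results.

```python
# Sketch: building a keyword search over federal contract awards on
# USASpending. Only build_payload() is pure; search_awards() makes a
# live network call and is not invoked here.
import json
import urllib.request

SEARCH_URL = "https://api.usaspending.gov/api/v2/search/spending_by_award/"

def build_payload(recipient_keyword, extra_terms):
    """Build the POST body for a keyword search over contract awards."""
    return {
        "filters": {
            # Assumption: keywords are combined by the API; verify whether
            # the semantics are AND or OR for multiple entries.
            "keywords": [recipient_keyword] + list(extra_terms),
            # A/B/C/D cover contract award types (e.g. definitive contracts).
            "award_type_codes": ["A", "B", "C", "D"],
        },
        "fields": ["Award ID", "Recipient Name", "Description",
                   "Awarding Agency"],
        "limit": 25,
        "page": 1,
    }

def search_awards(payload):
    """POST the payload and return the parsed JSON response (network call)."""
    req = urllib.request.Request(
        SEARCH_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

payload = build_payload("OpenAI", ["surveillance", "targeting",
                                   "civil liberties"])
```

Inspecting the returned award descriptions for generic or redacted language (rather than expecting explicit use-case detail) is the point of the rationale above: silence in the public record is itself evidence.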
DoD/CDAO press releases: "OpenAI" AND (frontier OR AI OR model) AND (application OR use OR deployment OR capability)
Official announcements may offer the only public hints at use cases; their vagueness would itself confirm the opacity
FOIA: "OpenAI" AND (surveillance OR targeting OR "civil liberties" OR intelligence OR DoD OR CDAO)
FOIA releases could reveal use case details or redactions that confirm classification of sensitive applications
Congressional hearings: "OpenAI" AND (surveillance OR targeting OR "civil liberties" OR intelligence OR defense)
Testimony or Q&A might reference applications at a level of detail not available elsewhere
Inspectors General reports: "OpenAI" AND (DoD OR CDAO OR intelligence OR AI OR "frontier") AND (audit OR review OR oversight)
IG reports are a primary oversight source; their absence or redaction would confirm non-public status of use cases
CRS reports: "OpenAI" AND (DoD OR defense OR intelligence OR AI) AND (surveillance OR targeting OR "civil liberties")
CRS reports synthesize public knowledge; their silence on specific use cases supports the claim
Privacy impact assessments: "OpenAI" AND (DoD OR government OR federal) AND (privacy OR "civil liberties" OR PIA)
PIAs for government AI systems would document civil liberties implications, if they exist and are public
Court records: "OpenAI" AND (surveillance OR targeting OR "civil liberties" OR intelligence OR defense)
Litigation might force disclosure of use cases or confirm their classified/trade-secret status
LDA: "OpenAI" AND (surveillance OR targeting OR "civil liberties" OR intelligence OR defense)
Lobbying on these issues would indicate public policy engagement; absence suggests classification
Federal Register: "OpenAI" AND (AI OR "artificial intelligence") AND (rule OR regulation OR notice)
Rulemaking would indicate public frameworks for use; absence supports non-disclosure of applications
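The queries above all share one shape: a quoted entity name ANDed with one or two OR-groups of terms. A small helper can generate them consistently for reuse across the sources listed. This is a sketch of the generic boolean syntax used in this record; individual search interfaces (USASpending, FOIA portals, the Federal Register) may expect different operators or quoting.

```python
# Sketch: generating the boolean query strings used in this record so
# one term set can be reused across all the listed sources.

def quote(term):
    """Quote multi-word terms so they are matched as phrases."""
    return f'"{term}"' if " " in term else term

def build_query(entity, terms, extra_group=None):
    """Return '"entity" AND (t1 OR t2 ...)', with an optional second group."""
    q = f'"{entity}" AND ({" OR ".join(quote(t) for t in terms)})'
    if extra_group:
        q += f' AND ({" OR ".join(quote(t) for t in extra_group)})'
    return q

terms = ["surveillance", "targeting", "civil liberties", "intelligence",
         "defense"]
print(build_query("OpenAI", terms))
# → "OpenAI" AND (surveillance OR targeting OR "civil liberties" OR intelligence OR defense)
```

The two-group form covers queries like the Inspectors General one, where a source group ("DoD OR CDAO ...") is combined with an oversight group ("audit OR review OR oversight").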
CRITICAL: The non-disclosure of frontier AI applications in surveillance and targeting eliminates public oversight of a technology with direct civil liberties implications, creating a serious democratic deficit in accountability for potentially rights-violating deployments