GOBLIN HOUSE
The feed chooses what the user thinks about next.
Recommendation systems decide which post, video, or story a user sees next, based on engagement-prediction models trained on the user's prior behaviour and the behaviour of similar users. A 2023 YouTube audit using 100,000 sock-puppet accounts found that while these systems do not always push users toward greater ideological extremity, they reliably increase the proportion of recommendations drawn from problematic channels — Intellectual Dark Web, Alt-right, Conspiracy, and QAnon — particularly for right-leaning users. The system optimises for what holds attention, not for what is true or beneficial.
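The selection logic described above can be sketched in a few lines. This is a minimal illustration, not YouTube's actual model: the item fields, the engagement score, and the `recommend` function are all hypothetical. The point it makes is structural — accuracy is present in the data but never enters the ranking key.

```python
# Minimal sketch of an engagement-maximising recommender.
# All names and scores here are hypothetical illustrations.
from dataclasses import dataclass

@dataclass
class Item:
    title: str
    predicted_watch_minutes: float  # output of an engagement-prediction model
    accuracy_score: float           # truthfulness rating -- never consulted below

def recommend(items: list[Item], k: int = 1) -> list[Item]:
    """Rank purely by predicted engagement; accuracy_score is ignored."""
    return sorted(items, key=lambda it: it.predicted_watch_minutes, reverse=True)[:k]

feed = [
    Item("measured explainer", predicted_watch_minutes=3.0, accuracy_score=0.9),
    Item("outrage bait", predicted_watch_minutes=11.0, accuracy_score=0.2),
]
print(recommend(feed)[0].title)  # the higher-engagement item wins regardless of accuracy
```

Whatever correlates with watch time — outrage, novelty, extremity — is what the sort key rewards; the objective function, not any editorial judgement, does the selecting.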
When the feed chooses what comes next, the user no longer chooses what they think about. The platform's engagement objective replaces the user's information goals as the principal selector of content.
Engagement-prediction models reliably select content that holds attention rather than content that is accurate or beneficial.
The algorithm produces a growing proportion of recommendations from IDW, Alt-right, Conspiracy, and QAnon channels, particularly for right-leaning users.