On October 15, 2020, YouTube announced it would ban content promoting QAnon and related conspiracy theories that “target individuals”—but the policy came approximately three years after YouTube’s recommendation algorithm began systematically amplifying QAnon from an obscure 4chan …
By 2018, comprehensive academic research from UC Berkeley, Harvard, and other institutions documented that YouTube’s recommendation algorithm systematically amplified conspiracy theories and misinformation over factual content—with some studies showing conspiracy videos received dramatically …