Tenet Media abruptly shut down on September 5, 2024, one day after the Department of Justice unsealed an indictment revealing the company had received nearly $10 million in covert Russian funding. The shutdown triggered cascading actions: YouTube terminated Tenet Media and all Lauren Chen channels, …
Entities: Tenet Media, Tayler Hansen, Lauren Chen, Tim Pool, Dave Rubin, +8 more · Tags: tenet-media, russian-funding, company-shutdowns, doj-indictment, influence-operations, +3 more
YouTube announced it would no longer remove content alleging fraud in past U.S. elections, shifting to context labels and recommendation interventions.
On October 15, 2020, YouTube announced it would ban content promoting QAnon and related conspiracy theories that “target individuals”—but the policy came approximately three years after YouTube’s recommendation algorithm began systematically amplifying QAnon from an obscure 4chan …
Entities: YouTube, Google, QAnon movement, FBI, QAnon believers, +1 more · Tags: youtube, qanon, conspiracy-theories, content-moderation, radicalization, +2 more
On February 17, 2019, YouTuber Matt Watson published a viral investigation exposing how YouTube’s recommendation algorithm facilitated what he called a “wormhole into a soft-core pedophile ring”—systematically clustering videos of minors and serving them to predators while …
Entities: YouTube, Google, Matt Watson (investigative YouTuber), Pedophile networks, Disney, +3 more · Tags: youtube, child-exploitation, algorithm-harm, content-moderation-failure, pedophile-networks, +2 more
By 2018, comprehensive academic research from UC Berkeley, Harvard, and other institutions documented that YouTube’s recommendation algorithm systematically amplified conspiracy theories and misinformation over factual content—with some studies showing conspiracy videos received dramatically …
Entities: YouTube, Google, University of California Berkeley researchers, Harvard researchers, Counter Extremism Project, +1 more · Tags: youtube, algorithm-harm, conspiracy-theories, misinformation, academic-research, +3 more
On March 17-20, 2017, a Times of London investigation exposed that major brands’ advertisements were appearing on YouTube videos supporting terrorism, promoting hate speech, and featuring extremist content—triggering the largest advertiser boycott in digital platform history and exposing …
Entities: YouTube, Google, Times of London, Major advertisers (AT&T, PepsiCo, Johnson & Johnson, Walmart, etc.), British government, +1 more · Tags: youtube, advertising, terrorism, hate-speech, brand-safety, +3 more
By 2016, YouTube’s recommendation algorithm had become what researchers characterized as a “radicalization engine”—systematically amplifying extremist content and pushing users down rabbit holes of increasingly radical videos because extreme content generated more watch time, which …
Entities: YouTube, Google, Guillaume Chaslot (former YouTube engineer/whistleblower), Zeynep Tufekci (researcher) · Tags: youtube, algorithm-harm, radicalization, extremism, misinformation, +3 more