YouTube agreed to pay Trump $24.5 million to settle his lawsuit over the suspension of his account following the January 6, 2021 Capitol attack, with Google directing $22 million of the payment to construction of Trump’s White House ballroom. Free speech experts stated the lawsuit raised no credible legal claims since …
Donald Trump, YouTube, Google, Meta, X · corruption, pay-to-play, conflicts-of-interest, shakedown, tech-regulation
For the first time in White House history, the Easter Egg Roll was sold to corporate sponsors, with tech giants Meta, Amazon, and YouTube purchasing sponsorships ranging from $75,000 to $200,000, raising significant ethical concerns about the monetization of public events.
Donald Trump, Melania Trump, Amazon, Meta, YouTube (+1 more) · kleptocracy, trump-administration, executive-power, corporate-influence, monetization
The 2025 White House Easter Egg Roll marked an unprecedented moment of corporate capture, with tech giants Meta, YouTube, and Amazon purchasing sponsorship packages ranging from $75,000 to $200,000. These packages included branded activation spaces, event brunch tickets, and potential meet-and-greet …
Donald Trump, Trump Administration, Meta, YouTube, Amazon (+2 more) · government-contracts, corporate-access, trump-administration, institutional-capture, corporate-sponsorship
In 2025, Donald Trump negotiated a series of lucrative settlements with major media and tech platforms, including a $24.5 million settlement with YouTube over his account suspension, a $25 million settlement with Meta, a $10 million settlement with X (Twitter), a $15 million settlement …
Donald Trump, Alphabet Inc., Google, YouTube, Elon Musk (+3 more) · media-control, regulatory-capture, lawsuit-settlements, social-media, legal-strategy (+1 more)
Tenet Media abruptly shut down on September 5, 2024, one day after the Department of Justice unsealed an indictment revealing the company had received nearly $10 million in covert Russian funding. The shutdown triggered cascading actions: YouTube terminated Tenet Media and all Lauren Chen channels, …
Tenet Media, Tayler Hansen, Lauren Chen, Tim Pool, Dave Rubin (+8 more) · tenet-media, russian-funding, company-shutdowns, doj-indictment, influence-operations (+3 more)
In June 2023, YouTube announced it would no longer remove content alleging fraud in past U.S. elections, shifting instead to context labels and recommendation interventions.
On October 15, 2020, YouTube announced it would ban content promoting QAnon and related conspiracy theories that “target individuals”—but the policy came approximately three years after YouTube’s recommendation algorithm began systematically amplifying QAnon from an obscure 4chan …
YouTube, Google, QAnon movement, FBI, QAnon believers (+1 more) · youtube, qanon, conspiracy-theories, content-moderation, radicalization (+2 more)
On February 17, 2019, YouTuber Matt Watson published a viral investigation exposing how YouTube’s recommendation algorithm facilitated what he called a “wormhole into a soft-core pedophile ring”—systematically clustering videos of minors and serving them to predators while …
YouTube, Google, Matt Watson (investigative YouTuber), Pedophile networks, Disney (+3 more) · youtube, child-exploitation, algorithm-harm, content-moderation-failure, pedophile-networks (+2 more)
By 2018, comprehensive academic research from UC Berkeley, Harvard, and other institutions documented that YouTube’s recommendation algorithm systematically amplified conspiracy theories and misinformation over factual content—with some studies showing conspiracy videos received dramatically …
YouTube, Google, University of California Berkeley researchers, Harvard researchers, Counter Extremism Project (+1 more) · youtube, algorithm-harm, conspiracy-theories, misinformation, academic-research (+3 more)
Over March 17-20, 2017, a Times of London investigation exposed that major brands’ advertisements were appearing on YouTube videos supporting terrorism, promoting hate speech, and featuring extremist content—triggering the largest advertiser boycott in digital platform history and exposing …
YouTube, Google, Times of London, Major advertisers (AT&T, PepsiCo, Johnson & Johnson, Walmart, etc.), British government (+1 more) · youtube, advertising, terrorism, hate-speech, brand-safety (+3 more)
By 2016, YouTube’s recommendation algorithm had become what researchers characterized as a “radicalization engine”—systematically amplifying extremist content and pushing users down rabbit holes of increasingly radical videos because extreme content generated more watch time, which …
YouTube, Google, Guillaume Chaslot (former YouTube engineer/whistleblower), Zeynep Tufekci (researcher) · youtube, algorithm-harm, radicalization, extremism, misinformation (+3 more)