"If you randomly follow the algorithm, you probably would consume less radical content using YouTube as you typically do!" So says Manoel Ribeiro, co-author of a new paper on YouTube's recommendation ...
For years, researchers have suggested that the algorithms feeding users content aren't the cause of online echo chambers; instead, echo chambers more likely arise from users actively seeking out content that aligns with ...
Tailoring algorithms to user interests is a common practice among social media companies. But new research underscores the harms this practice can yield, especially when it comes to public perception ...
YouTube's algorithm recommends right-wing, extremist videos to users, even if they haven't interacted with that content before, a recent study found.
New research from Mozilla shows that user controls have little effect on which videos YouTube’s influential AI recommends. YouTube’s recommendation algorithm drives 70% of what people watch on the ...
Researchers found that clicking on YouTube’s filters didn’t stop it from recommending disturbing videos of war footage, scary movies, or Tucker Carlson’s face. My YouTube ...
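A minimal sketch of the before/after comparison such a finding implies: measure how often unwanted videos keep appearing once a control like "Not interested" has been used. The records below are illustrative assumptions, not Mozilla's dataset or methodology.

```python
def unwanted_rate(recs):
    """Fraction of recommended videos a participant flagged as unwanted."""
    return sum(r["unwanted"] for r in recs) / len(recs) if recs else 0.0

# Hypothetical flag logs from one participant, before and after using a control.
before_control = [{"unwanted": True}, {"unwanted": True},
                  {"unwanted": False}, {"unwanted": True}]
after_control = [{"unwanted": True}, {"unwanted": False},
                 {"unwanted": True}, {"unwanted": False}]

print(f"before: {unwanted_rate(before_control):.0%}, "
      f"after: {unwanted_rate(after_control):.0%}")
```

If the two rates are close, as in this toy example, the control had little measurable effect on what the algorithm kept serving, which is the pattern the study reports.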