The earth is actually flat! 9/11 was an inside job! YouTube was once rife with conspiracy theories like these — and its own recommendation algorithm was the culprit.
While the company has made serious headway recently in fixing the issue, conspiracy theories continue to gain traction on the platform. But why is this happening?
Simply put, the Google-owned video streaming platform isn’t tackling the main issue: the rabbit hole effect caused by its recommendation engine. It's a problem the company may not be able to solve.
But first: the good news. There's no doubt about it, YouTube has really stepped up to the plate when it comes to limiting the reach of conspiracies on its platform. A recent study out of the University of California, Berkeley found that YouTube has made significant progress in reducing the amount of conspiratorial content its algorithm recommends.
According to the study, which analyzed eight million recommendations over 15 months, YouTube cut the amount of time users spent viewing recommended conspiracy theory content by 70 percent.
At the start of 2019, YouTube announced it was going to update its recommendation algorithm in order to remove content that spread 9/11, flat earth, and miracle cure conspiracies. In June, the video platform boasted that the amount of time users spent watching these videos from YouTube recommendations had dropped by 50 percent. Just a few months later in December, the amount of time spent viewing these videos had been cut further, by 70 percent.
However, that drastic reduction was short-lived. Researchers discovered that the number of conspiracy videos served up by the recommendation engine — in particular, the source videos for these conspiracies — slowly rebounded and is now only 40 percent lower than before.
Furthermore, the study found that conspiratorial topics the company wasn't focusing on, such as videos about climate change and the moon landing, flourished. YouTube's apparent decision to target only "highly publicized" conspiracy theories allowed these others to thrive.
These latest findings back up a separate study from the activist group Avaaz earlier this year. That study found that climate change denial content on YouTube was growing even as YouTube worked to stifle the reach of other conspiracy content. It also discovered that videos pushing climate misinformation were actively being promoted by YouTube's recommendation algorithm.
YouTube has been working in recent years to remove toxic content from its massive platform. The company says it has 2 billion monthly active users who watch more than a billion hours of video each day. The site recently rolled out new rules concerning harassment. It's highlighted its policies dealing with political misinformation in the lead-up to the 2020 election. And, following a fine from the FTC, the company recently made moves to protect children viewing content on its service.
Critics have long chastised YouTube for allowing these harmful conspiracy theory videos to proliferate on its platform. And though these videos don't necessarily run afoul of YouTube's policies, the company decided it would stop actively promoting them through its recommendation engine.
Conversations surrounding YouTube and its recommendation algorithm problem often leave out a major piece of the puzzle: the rabbit hole effect. A single recommended video can lead a YouTube viewer to fall down a "rabbit hole," consuming more and more content from the recommended creator. Once they've watched enough, subscribed, and become a fan of a channel, the conspiracy theorist no longer needs to rely on the recommendation engine to drive views. It isn't needed. The viewer has been radicalized.
It's easy to see how a creator could produce conspiracy theory content that YouTube wasn't actively policing in order to take advantage of the recommendation algorithm and promote their channel.
Alex Jones, formerly the leading purveyor of conspiracy theory content on YouTube, has been banned from the platform since the summer of 2018. Numerous other social platforms followed suit, but YouTube's decision to ban Jones and his InfoWars show dealt the biggest blow, as the platform hosted his most popular channel.
Was that the end of Alex Jones? Of course not. Jones was able to build his audience before action was taken against his channel. To this day, he still runs his daily show from his website.
For YouTube, filtering recommendations is a difficult problem to solve. It's understandable why the company wouldn't ban "borderline" conspiracy theory videos, like those claiming the U.S. government is communicating with aliens, since that content isn't threatening in nature. But if YouTube really wants to solve its misinformation problem, it's going to have to plug up its rabbit hole.