Don't worry, militia members will have to wait until after Election Day to be algorithmically pointed to Facebook groups of like-minded individuals.
At Wednesday's Senate hearing on (at least in theory) Section 230, Facebook CEO Mark Zuckerberg let slip a slight behind-the-scenes change his company has taken in the lead up to Nov. 3. Specifically, Zuckerberg offhandedly mentioned that Facebook has temporarily stopped recommending political issue Facebook groups to its users.
Of course, Facebook intends to spin this presumably dangerous — or, at the very least, worrisome — recommendation feature right back up again after the election. So reports BuzzFeed News, which was able to confirm that the new policy is only temporary.
"This is a measure we put in place in the lead-up to Election Day," Facebook spokesperson Liz Bourgeois told the publication. "We will assess when to lift them afterwards, but they are temporary."
Because obviously we won't have any social media-juiced instances of violence after the election. Heavens no.
Notably, this move comes at a time when Zuckerberg — as expressed in his Thursday earnings call — is "worried that with our nation so divided and election results potentially taking days or weeks to be finalized, there is a risk of civil unrest across the country."
Facebook, which recently attempted to ban QAnon conspiracy groups, has particular reason to be concerned about the upcoming election and possible associated violence. Well, concern for its reputation, anyway. The platform has served as a breeding ground for violent conspiracy theories for years, and a simple QAnon ban isn't going to change that.
There is a real possibility that the next Kenosha-style tragedy is already being planned, coordinated, or hyped with Facebook tools — only now with an Election Day twist. Facebook's attempt to cool things down by pausing an element of its own recommendation system calls attention to the simple fact that Facebook itself is fundamentally problematic.
Facebook knows this. In May of this year, the Wall Street Journal reported that Facebook had ignored its own internal research showing that its algorithms were making the site more divisive.
"Our algorithms exploit the human brain's attraction to divisiveness," read a slide from a 2018 presentation. "If left unchecked," it warned, Facebook would feed users "more and more divisive content in an effort to gain user attention & increase time on the platform."
SEE ALSO: People are fighting algorithms for a more just and equitable future. You can, too.
No temporary pause of a single recommendation feature, no matter how well intentioned, is going to change that.
Topics: Facebook, Social Media