Facebook is giving users reputation scores.
Facebook Is Rating Your Ability to Flag Fake News
Facebook reportedly assigns a 'reputation score' to users to help it identify bad actors who might be abusing the company's content flagging systems to report real news as fake.
Facebook is assigning its users a "reputation score" to determine whether it should take their complaints about fake news seriously, The Washington Post reports.
Reputation scores are designed to help the company identify users who might be abusing Facebook's content flagging systems to report real news as fake, the Post says. "(It's) not uncommon for people to tell us something is false simply because they disagree with the premise of a story or they're intentionally trying to target a particular publisher," Facebook product manager Tessa Lyons told the Post in an interview.
Facebook, however, pushed back today on the Post's characterization of the effort.
"The idea that we have a centralized 'reputation' score for people that use Facebook is just plain wrong and the headline in the Washington Post is misleading," a Facebook spokesperson told PCMag. "What we're actually doing: We developed a process to protect against people indiscriminately flagging news as fake and attempting to game the system. The reason we do this is to make sure that our fight against misinformation is as effective as possible."
In 2015, Facebook began rolling out an option to let people flag false news stories on the platform: click the "…" icon on a Facebook post and select "Give feedback on this post."
Unfortunately, content flagging systems can also be gamed. All it takes is a mob of online users reporting a post as fake news or hate speech to trigger an investigation, and the company may misinterpret the coordinated complaints as legitimate.
To address the abuse, Facebook developed the reputation score over the past year to help it weed out false reports of fake news. Facebook didn't provide more details, but according to the Post, the score measures your trustworthiness on a scale of zero to one. If you consistently make false reports, presumably your score will go down. On the flip side, if you flag something as fake news that a third-party fact checker confirms to be misinformation, Facebook might give you a higher credibility score.
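Facebook has not published how the score is computed, so any concrete formula is speculation. As a purely hypothetical illustration of the behavior the Post describes (a 0-to-1 trustworthiness value that falls with debunked reports and rises with confirmed ones), here is one simple way such an update could work, using a running ratio of confirmed reports to total reports with a neutral prior:

```python
# Hypothetical sketch only: Facebook's actual scoring method is not public.
# Tracks a flagger's trustworthiness as confirmed reports / total reports,
# seeded with a neutral prior so new users start at 0.5.

class FlaggerReputation:
    def __init__(self):
        self.confirmed = 1  # prior counts: one confirmed out of two total
        self.total = 2      # keeps the starting score at a neutral 0.5

    def record_report(self, confirmed_by_fact_checker: bool) -> float:
        """Update the score after a fact-checker verdict on a flagged post."""
        self.total += 1
        if confirmed_by_fact_checker:
            self.confirmed += 1
        return self.score

    @property
    def score(self) -> float:
        # Always stays between 0 and 1, matching the scale the Post describes.
        return self.confirmed / self.total


user = FlaggerReputation()
print(user.score)            # 0.5 neutral starting point
user.record_report(False)    # report debunked: score drops
user.record_report(False)
print(round(user.score, 2))  # 0.25
user.record_report(True)     # flag confirmed as misinformation: score rises
print(round(user.score, 2))  # 0.4
```

The names `FlaggerReputation` and `record_report` are invented for this sketch; the real system, per the Post, also weighs this signal against thousands of other behavioral clues rather than using the score alone.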
However, the reputation score is only one measurement "among thousands of new behavioral clues" that the company uses to assess whether you're a risk, the Post said. How Facebook ultimately determines whether someone is a malicious party remains opaque, largely to prevent abusers from learning how to game the content flagging system. But that opacity also comes at the cost of public transparency.
Earlier this month, Facebook decided to ban conspiracy theorist Alex Jones and his controversial show Infowars from the platform for repeatedly violating Facebook's content policies on hate speech. But not everyone is buying that explanation, with critics claiming that the company simply bowed to public pressure.