Facebook is now assigning its users a reputation score that predicts their trustworthiness on a scale from zero to one, The Washington Post reports.
The system is part of the company's effort against fake news. However, it is unclear what other criteria Facebook measures to determine a user's score, whether all users have a score, and in what ways the scores are used.
Facebook management said a user's trustworthiness score is not meant to be an absolute indicator of a person's credibility, nor is there a single unified reputation score that users are assigned. The score is one measurement among thousands of new behavioral clues that Facebook now takes into account as it seeks to assess risk.
Facebook is also monitoring which users have a propensity to report other people's content as problematic, and which publishers are considered trustworthy by users.
The site has long relied on its users to report problematic content. However, after the reporting system was introduced, some users began falsely reporting items as untrue. A tab in the upper right-hand corner of every Facebook post lets people report problematic content for a variety of reasons, including pornography, violence, unauthorized sales, hate speech and false news.