Facebook said in a new blog post that it sought to be clear about the content it downranks in the news feed, which it said is based in part on user feedback.
As the social media platform faces increased scrutiny over how it displays content to users, Facebook said on Thursday that it reduces the distribution of certain content in its news feed, including clickbait, posts making "sensationalist" or exaggerated health claims, and low-quality videos, and it outlined the guidelines behind those decisions.
Facebook also demotes content from news publishers that consumers perceive as untrustworthy in polls, as well as content shared by pages or accounts that persistently break its rules, it said.
Meanwhile, Facebook's semi-independent oversight board has announced that it will investigate the company's "XCheck" system, an internal tool that exempts high-profile users from some or all of the company's rules.
The decision comes after a Wall Street Journal investigation found that reviews of posts by well-known users, including celebrities, politicians, and journalists, were routed through a separate mechanism.
Some users are "whitelisted," meaning they are exempt from enforcement actions, while others are allowed to post content that violates Facebook's policies pending content reviews that frequently never happen.
The Wall Street Journal discovered that users were flagged for increased inspection based on factors such as being "newsworthy," "influential or popular," or "PR risky." According to the newspaper, the XCheck list had 5.8 million users by 2020.
The oversight board said Tuesday that it expects to have a briefing on the system from Facebook and that it "will be reporting what we hear from this" as part of a report it will publish in October.
The board may also make other recommendations, but Facebook is not obliged to implement them.
The board said the Journal's investigation has drawn increased attention to the company's sometimes inconsistent decision-making, and to why greater transparency and independent oversight of Facebook are so important for users.
In response to the Journal's probe, Facebook informed the publication that the system "was designed for an important reason: to create an additional step so we can accurately enforce policies on content that could require more understanding".
The company went on to say that the criticism was "fair" and that it was working to rectify it.