Facing unrelenting criticism for its role in fomenting discord, Facebook, Inc. said it will establish an independent body to make precedent-setting calls on what content should be banned or removed from its popular social network.

CEO Mark Zuckerberg said yesterday the independent body will review user appeals of content takedowns. He said that to maintain objectivity, Facebook shouldn't itself make "so many important decisions about free expression and safety."

Details of the body are still being worked out. Zuckerberg said its goal will be to increase accountability for removal decisions and to ensure those decisions are not driven by commercial interests.

Zuckerberg said Facebook has made progress in removing hate speech, bullying, and terrorist content from its network. The effort, he said, is about finding the right balance between giving people a voice and keeping people safe.

Zuckerberg's announcement was part of an update on Facebook's Transparency Report, an initiative that discloses how Facebook deals with inappropriate content, government requests for user data, and claims that users have violated intellectual property rights. The announcement also followed Facebook's report that it is ramping up its ability to quickly detect hate speech and other posts that violate community rules.

The composition of the appeals body, and how it will stay independent while hewing to Facebook's principles and policies, will be determined in 2019. Facebook also plans to begin releasing content-removal summaries quarterly, on the same schedule as its earnings reports, according to company executives.

Zuckerberg said Facebook and the independent appeals body face numerous difficult challenges. Among these thorny issues is that people naturally tend to engage with more sensational content. This type of content, while on the borderline of violating Facebook's policies, is unsuitable for civilized discourse, Zuckerberg said. He noted that the same phenomenon can be seen in cable news and tabloids.

Much of Facebook's work involves ensuring that borderline content, which comes close to violating its policies, gets less attention, not more. Zuckerberg noted that online bullying is a tougher challenge for artificial intelligence (AI) systems because it tends to be personal and open to interpretation.

Over the past three years, Facebook, which is no longer the leading social network among young users, has come under intense pressure from government regulators and activists in various countries to eliminate abusive and inappropriate content such as racist posts and hate speech.