Facebook has begun to notify some users that they may have come across "extremist content" on the social media platform.

Screenshots shared on Twitter this week revealed a notification asking, "Are you worried that someone you know will become an extremist?" Another message warned users, "You may have recently been exposed to harmful extremist content." Both included links to "Getting Help."

According to Facebook spokesperson Andy Stone, the prompts are part of the company's Redirect Initiative, which aims to combat violent extremism.

"This test is part of our larger work to assess ways to provide resources and support to people on Facebook who may have engaged with or were exposed to extremist content, or may know someone who is at risk," Stone said.

"We are partnering with NGOs and academic experts in this space and hope to have more to share in the future."

For years, Facebook, Google, and Twitter have been under pressure to remove extremist content from their platforms before violence spills into the real world. That pressure has intensified this year amid heightened scrutiny of the role the platforms played in the buildup to the riot at the U.S. Capitol in January.

Facebook said the effort is part of its commitment to the Christchurch Call to Action, a global alliance of governments and technology companies formed to curb violent extremist content online in the aftermath of the livestreamed massacre of 51 people at two mosques in Christchurch, New Zealand.

In February, Facebook announced that it had removed a larger volume of content in the fourth quarter for violating its policies against hate speech, harassment, nudity, and other types of harmful content. It said it took action against 26.9 million pieces of hate speech, up from 22.1 million in the third quarter.