Meta is reportedly working on a set of optional Instagram settings to help protect users from unwanted and indecent content.

"This technology doesn't allow Meta to see anyone's private messages, nor are they shared with us or anyone else," Meta said. "We're working closely with experts to ensure these new features preserve people's privacy while giving them control over the messages they receive."

The user controls are meant to "help people protect themselves from unwanted DMs, like photos containing nudity," a Meta spokesperson said (via CNET).

According to Meta, the new tools are still in the early stages of development, and testing has not yet begun.

Like Instagram's existing Hidden Words controls, some of the new optional settings the platform plans to offer will also include end-to-end encryption for on-device security.

The optional restrictions are intended to prevent harm before it happens and to give users greater control over their app experience. No signals will be shared with outside parties, and Meta will not be able to see or collect any of the images.

The new Instagram tools come after the company was fined roughly $400 million by Ireland's Data Protection Commission for failing to protect children's information on Instagram.

In addition, the company is facing several lawsuits in the United States alleging that it knowingly exploits its younger users for profit on both Facebook and Instagram.

The suits, filed in June, allege that the social networks intentionally design and deploy addictive psychological tactics to draw in young and vulnerable users, despite "extensive insider knowledge" that their products are causing serious harm to young people's mental health.

The cases, each roughly 100 pages long, were filed in Colorado, Delaware, Florida, Georgia, Illinois, Missouri, Tennessee, and Texas. According to the complaints, Meta failed to inform children's parents about the negative consequences of using its social media platforms. Instead, those effects came to light last year through internal documents released by whistleblower Frances Haugen, a former Facebook product manager.

The lawsuits claim that, in order to boost their profit margins, the platforms are failing to safeguard young users, and that prolonged exposure is leading to actual or attempted suicides, self-harm, eating disorders, severe anxiety and sadness, and sleep problems.

Instagram head Adam Mosseri testified before Congress in December about the harm Instagram is doing to young people, stating that the company would "rethink what Instagram is."