Apple's new child protection measures confirmed previous reports that the company would start scanning iPhones for images of child abuse. Experts on cybersecurity and child safety have expressed mixed feelings about the news.

The first change affects Siri and Apple's Search feature. If users search for topics related to child sexual abuse, Apple will direct them to resources for reporting it or for seeking help with an attraction to such material.

The update, which will be available later this year for iOS 15, watchOS 8, iPadOS 15, and macOS Monterey, is largely uncontroversial.

The other changes have proved far more controversial. One adds a parental-control option to Messages that hides sexually explicit images for users under the age of 18 and alerts parents if a child under the age of 12 views or sends such an image.

The final new feature scans iCloud Photos for child sexual abuse material, or CSAM, and reports it to Apple reviewers, who can then forward it to the National Center for Missing and Exploited Children (NCMEC).

Apple claims the feature was designed specifically to protect users' privacy while searching for unlawful content. Critics counter that the same design amounts to a security backdoor.
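To make the privacy argument concrete: Apple's system matches image fingerprints against a database of hashes of known abuse imagery rather than inspecting photo content directly, and only flags an account after a threshold number of matches. The sketch below is a deliberately simplified illustration of that idea, not Apple's implementation; the real system uses a proprietary perceptual hash ("NeuralHash") and cryptographic threshold techniques, and the names, threshold value, and plain SHA-256 stand-in here are all assumptions for demonstration.

```python
import hashlib

# Toy illustration of hash-blocklist matching. NOT Apple's actual
# system: the real design uses a proprietary perceptual hash and
# cryptographic threshold matching that are not reproduced here.

KNOWN_BAD_HASHES: set[str] = set()  # supplied by NCMEC in the real system
REPORT_THRESHOLD = 3  # hypothetical: flag only after several matches

def fingerprint(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash; a real one is robust to resizing
    # and recompression, which a plain cryptographic hash is not.
    return hashlib.sha256(image_bytes).hexdigest()[:16]

def count_matches(images: list[bytes]) -> int:
    # Compare each image's fingerprint against the blocklist; the
    # image content itself is never examined.
    return sum(1 for img in images if fingerprint(img) in KNOWN_BAD_HASHES)

def should_flag(images: list[bytes]) -> bool:
    # Only accounts exceeding the threshold are surfaced for human review.
    return count_matches(images) >= REPORT_THRESHOLD
```

The threshold is the crux of Apple's privacy claim: a single accidental hash collision does not expose anyone, because human review is triggered only after multiple independent matches.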

The company has since defended its new system, promising that it will not be "expanded" for any reason.

Last week, digital privacy advocates warned that authoritarian governments could use the technology to strengthen anti-LGBT regimes or crack down on political dissidents in places where protests are prohibited.

However, Apple maintained it "will not accede to any government's request to expand" the system.

It published a question-and-answer page claiming to have put numerous safeguards in place to prevent its systems from being used for anything other than detecting child abuse imagery.

"We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands," Apple said. "We will continue to refuse them in the future."

The tech giant has acknowledged the backlash against its updates. However, it has shown no indication that it intends to change or abandon them. An internal memo issued last Friday acknowledged "misunderstandings," but applauded the changes.

Apple also offered assurances about the new feature that alerts minors and their parents when sexually explicit photos are sent or received on linked family accounts.