Apple has delayed the rollout of child safety features
After announcing the features last month, Apple has now delayed the rollout of its child safety features following negative feedback.
The planned features include scanning users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), a Communication Safety feature that warns children and their parents when sexually explicit photos are received or sent, and expanded CSAM guidance in Siri and Search, reports MacRumors.
Apple confirmed that feedback from customers, non-profit and advocacy groups, researchers and others about the plans has prompted the delay, giving the company time to make improvements.
Apple issued the following statement about its decision.
"Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material," the company said.
"Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features," it added.
Following the announcement, the features were criticised by a wide range of individuals and organisations, including security researchers, the privacy whistleblower Edward Snowden, Facebook's former security chief, and politicians.
Apple has since endeavoured to dispel misunderstandings and reassure users by releasing detailed information, sharing FAQs and other new documents, and conducting interviews with company executives.
The suite of child safety features was originally set to debut in the US with an update to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.
It is now unclear when Apple plans to roll out the "critically important" features, but it appears the company still intends to release them.
*Edited from an IANS report