Have you heard about Apple’s child safety plan?
Recently, Erik Neuenschwander, Apple’s head of privacy, announced that the company will introduce a series of new child safety features to protect children from abuse. Apple reportedly plans to scan iPhone users’ photo libraries to identify instances of child sexual abuse material, using on-device technology to put these safety measures in place.
Wondering what these measures are, or whether they will do any good? Here is everything you need to know!
What Is Apple’s Child Safety Plan?
Protecting children in the online world is just as necessary as protecting them in the real world, if not more so. Online predators are waiting for a single opportunity to take advantage of your child. Moreover, your kid might encounter content that leaves a lasting scar. To address this, Apple’s child safety plan aims to safeguard kids from online predators and curb the spread of CSAM (Child Sexual Abuse Material) in the digital world.
In a recent statement, Apple revealed that it will launch safety features designed to identify and report instances of child abuse.
Let me tell you all about these features.
Apple’s Child Safety Features: Expanded Protection For Children
Apple has announced a set of three new features aimed at child safety.
These child safety features will only be available starting with iOS 15, iPadOS 15, macOS Monterey, and watchOS 8. Let’s zoom in and see what Apple is up to with each of them.
The first feature Apple has announced is communication safety in Messages. The motive is simple: Apple aims to warn children whenever they send or receive sexually explicit or otherwise inappropriate images.
Wondering how Apple will implement this communication safety check?
In its recent blog post, Apple said it will use on-device machine learning to identify sexually explicit photos. If a photo fails this check, it is flagged as inappropriate; Apple’s scanning system then blurs the picture and alerts the child.
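The flag–blur–alert flow described above can be sketched in a few lines. To be clear, this is a hypothetical illustration, not Apple’s implementation: `is_explicit` stands in for the unpublished on-device classifier, and every name here is invented for the example.

```python
from dataclasses import dataclass
from typing import Optional

def is_explicit(image_bytes: bytes) -> bool:
    """Hypothetical stand-in for Apple's on-device classifier.

    The real model and its API are not public; this toy heuristic exists
    only so the pipeline below is runnable.
    """
    return image_bytes.startswith(b"EXPLICIT")

@dataclass
class ScreeningResult:
    blurred: bool
    warning: Optional[str]

def screen_attachment(image_bytes: bytes) -> ScreeningResult:
    """Decide entirely on-device whether to blur a photo and warn the child."""
    if is_explicit(image_bytes):
        return ScreeningResult(
            blurred=True,
            warning="This photo may be sensitive. Do you want to view it anyway?",
        )
    # Ordinary photos pass through untouched -- nothing leaves the device.
    return ScreeningResult(blurred=False, warning=None)
```

The key design point is that the decision happens on the device: the message contents never need to be sent to a server for this check.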
How Will Children Be Warned?
After identifying a picture as sensitive, Apple will warn the child that it may be hurtful or sexually explicit. Kids will also be presented with helpful resources in case they need them. In some cases, Apple’s child safety technology will additionally warn the child that their parents will be notified if they go ahead with sending or viewing the picture.
Apple uses on-device machine learning for this safety check because, it says, it does not intend to read users’ messages; it only analyzes the image attachments.
This feature will be at your disposal only on devices running iOS 15, iPadOS 15, or macOS Monterey.
CSAM detection: Part of Apple’s Scanning System
Looking for the meaning of CSAM?
It stands for “Child Sexual Abuse Material”. With CSAM spreading like wildfire, Apple has decided to scan iCloud Photos and detect images that depict children engaged in sexually explicit activity.
How will CSAM detection work?
The matching process relies on a cryptographic technique known as private set intersection. The device checks photos against a database of known CSAM hashes, recording that a match exists in the user’s iCloud without revealing the result. After that, Apple manually reviews flagged images to determine whether they are CSAM. If they are, Apple disables the account and reports it to NCMEC (the National Center for Missing and Exploited Children).
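The core of the matching step can be approximated with a plain hash lookup. This is a deliberate simplification: Apple’s actual system uses a perceptual hash (NeuralHash) plus private set intersection, so that non-matching photos reveal nothing to either side, whereas the SHA-256 set membership below has neither property and is for intuition only. The hash set and image bytes are invented.

```python
import hashlib

# Illustrative database of known-image hashes. The real database holds
# perceptual hashes of confirmed CSAM supplied by child-safety
# organizations, not cryptographic hashes of raw bytes like these.
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def matches_known_database(image_bytes: bytes) -> bool:
    """Return True when the image's hash appears in the known set.

    Exact hashes only match byte-identical files; a perceptual hash is
    needed to catch resized or re-encoded copies, which is one reason
    Apple uses NeuralHash rather than SHA-256.
    """
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES
```

Note the limitation spelled out in the docstring: an exact hash misses any edited copy, which is why a perceptual hash is central to the real design.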
Additionally, Apple says it uses on-device matching together with a technique called threshold secret sharing, meaning its child safety technology is designed with user privacy in mind. Apple cannot view your private data unless your account crosses a threshold of photos flagged by its scanning system.
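The threshold idea can be illustrated with a simple counter. In Apple’s actual design the threshold is enforced cryptographically via threshold secret sharing (the per-photo safety vouchers cannot be decrypted until enough matches exist), not by a trusted counter like this one; the value of 30 is an assumption here, reflecting the figure Apple later cited publicly.

```python
# Toy model of the review threshold. Real enforcement is cryptographic:
# below the threshold, Apple mathematically cannot open any voucher, so
# no server-side counter needs to be trusted.
MATCH_THRESHOLD = 30  # assumed value; Apple reportedly cited roughly 30

def review_unlocked(match_count: int, threshold: int = MATCH_THRESHOLD) -> bool:
    """Human review becomes possible only at or above the threshold."""
    return match_count >= threshold
```

The point of the threshold is to make a single false-positive match meaningless: an account is only ever surfaced for review after many independent matches accumulate.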
Involvement of Siri and Search in Apple’s Child Safety
Apple is also expanding the role of Siri and Search on iOS devices. With this update, users can find all the necessary information about reporting CSAM or child exploitation through these features. In unsafe situations, users can ask Siri how and where to report their child’s situation.
Moreover, users whose queries show an interest in child sexual abuse material will be warned. This can range from simply telling users that their interest is harmful and problematic to involving the authorities to resolve the concern.
That’s all for Apple’s child safety plan. However, these features have not gone over well with many users.
The Backlash on Apple’s Child Safety Plan
Since Apple released its statement, it has been fighting accusations that the upcoming child safety plan will weaken user privacy. Although some privacy and security experts have applauded Apple’s initiative, others argue that it breaks the promise of end-to-end encryption.
“Privacy is King” has long been Apple’s professed motto, but its child safety features, particularly CSAM scanning, appear to contradict that statement. Until the tech giant clears things up itself, nothing about the privacy implications can be said with confidence. Meanwhile, it would be wise for parents to take matters into their own hands. To protect your children from CSAM and other online hazards, the best option is to use a secure and reliable parental control app like Bit Guardian Parental Control. It will keep your little one safe online as well as in the real world, without compromising your privacy or theirs.
You do not need to sacrifice your family’s privacy to protect your kids. Period.