Controversial iMessage child safety tools are coming to the UK


Apple is introducing a new child safety feature for UK users of its iMessage app. The “communications safety in iMessages” feature, which is already available to users in the United States, is now coming to the UK and Canada.

The idea behind the feature is to prevent younger iPhone users from being exposed to age-inappropriate content shared via the platform. It’s an opt-in feature for parents that scans all photos sent and received via iMessage for nudity, with offending content blurred from young eyes.

The sender will be cautioned not to share the image and prompted to message an adult instead. Those receiving such images will get a sensitive content warning and be provided with means of contacting relevant child safety groups. Apple insists that everything is done on the device, without images being uploaded to the cloud for scanning. The company is also adding expanded guidance in Spotlight search, Safari search and Siri.

When announcing the feature last year, Apple said it provided “additional resources to help children and parents stay safe online and get help with unsafe situations. For example, users who ask Siri how they can report child exploitation will be pointed to resources for where and how to file a report.”

However, one of the more controversial elements of the overall proposition, initially announced by Apple last year, will not be part of this stage of the rollout. Apple had planned to scan all photos on the device for known images of child sexual abuse prior to their upload to iCloud, but received considerable pushback. Apple tells Trusted Reviews there is no update on whether or when this element will roll out, or in what form.

The earlier announcement also said parents would be alerted to the offending content, but that is no longer the case.

“Messages analyses image attachments and determines if a photo contains nudity, while maintaining the end-to-end encryption of the messages,” Apple explains on its child safety website.

“The feature is designed so that no indication of the detection of nudity ever leaves the device. Apple does not get access to the messages, and no notifications are sent to the parent or anyone else.”

The features will roll out in a future software update in the next few weeks, Apple tells Trusted Reviews.
