Apple’s Communication Safety feature for iPhone is expanding to cover adult users and more forms of communication, including video. Announced at WWDC, the enhanced feature will arrive with iOS 17 later this year. Notably, all image and video processing happens directly on the user’s device, so the content stays private even from Apple.
The Communication Safety in Messages feature uses on-device machine learning to automatically blur nude images in iMessages before a child can view them. With iOS 17, the protection will extend to photos containing nudity that children view or share via AirDrop, the new Contact Posters, FaceTime messages, and the Photo Picker when browsing their image library. Beyond still images, the feature will also scan video content for nudity. It remains unclear, however, whether the protection applies to live video, such as FaceTime video calls. We have reached out to Apple for clarification and will update this article accordingly.
At present, Communication Safety is an opt-in tool within Apple’s existing Family Sharing system, and it applies only to iMessages until iOS 17 arrives in the fall. When enabled, it detects whether a child is sending or receiving images that may contain nudity, warns the child, and blurs the photo before it is displayed on the minor’s device. The child is also offered helpful resources and the option to message a trusted adult for support.
Adults will soon get a similar safeguard against unsolicited nude content. iOS 17 introduces a new feature called “Sensitive Content Warning,” which notifies users of any age when they are about to receive an image or video containing nudity. A pop-up flags the content and asks whether they still wish to view it, alongside guidance on staying safe, including the reminder that “It’s not your fault, but naked photos and videos can be used to hurt you.”