Apple has introduced a new feature in Australia that allows children who receive nude images or videos to report them directly to the company, which can then pass the reports on to police.
The change arrived on Thursday as part of Apple’s latest beta release for Australian users. It extends the Communication Safety measures that have been enabled by default for users under 13 since iOS 17 but are available to all users. These features detect, on the iPhone itself, images and videos containing nudity that children may receive via iMessage or AirDrop; running the detection on the device is intended to protect privacy.
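Apple exposes a public version of this on-device detection to developers through its SensitiveContentAnalysis framework (iOS 17 and later). The sketch below is a minimal illustration of how a third-party app can run the same kind of local nudity check; it is not Apple’s internal Messages implementation, and the surrounding setup (the client entitlement, the user or parent enabling the analysis) is assumed.

```swift
import Foundation
import SensitiveContentAnalysis

// Minimal sketch: run Apple's on-device sensitive-content check on an image file.
// Requires the com.apple.developer.sensitivecontentanalysis.client entitlement and
// the user (or a parent, via Screen Time) having turned the analysis on.
func shouldWarnBeforeShowing(imageAt url: URL) async throws -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // If analysis is disabled in the user's settings, no detection runs.
    guard analyzer.analysisPolicy != .disabled else { return false }

    // The check runs entirely on the device; the image is never uploaded.
    let analysis = try await analyzer.analyzeImage(at: url)
    return analysis.isSensitive
}
```

An app would call this before rendering incoming media and, if it returns true, show an intervention screen rather than the content itself.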
When nudity is detected, the young user is shown two intervention screens before they can continue, along with resources and a way to contact a parent or guardian. With the new feature, users will also be able to report the images and videos to Apple when the warning appears.
The device then prepares a report containing the images or videos, the messages sent immediately before and after them, and the contact details of both accounts, and lets the user fill out a form describing what happened.
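Apple has not published the format of these reports, but based on the fields described above, the payload could be modelled roughly as in the sketch below. Every type and field name here is hypothetical and included only to make the described contents concrete.

```swift
import Foundation

// Hypothetical model of the report described above: the flagged media, the messages
// sent immediately before and after it, both accounts' contact details, and an
// optional free-text description from the user. None of these names are Apple's.
struct NudityReport: Codable {
    struct Account: Codable {
        let handle: String              // phone number or email tied to the account
    }
    struct Message: Codable {
        let sentAt: Date
        let text: String?
        let attachmentFilenames: [String]
    }

    let flaggedMedia: [Data]            // the reported images or videos
    let surroundingMessages: [Message]  // messages just before and after the media
    let sender: Account
    let recipient: Account
    let userDescription: String?        // optional form filled out by the child
}
```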
Apple will review the report and can take action, such as blocking an account from sending messages via iMessage, and can also refer the matter to law enforcement.
Apple said the feature would initially roll out in Australia in the latest beta release and would eventually be made available worldwide.
The timing of the announcement and the choice of Australia as the first region to receive the feature is likely no coincidence, with Australia’s new online safety codes coming into force in 2024. Under the codes, tech companies operating in Australia will be required to police content containing child abuse and terror material.
Apple had warned that the draft code failed to protect end-to-end encryption, leaving users’ communications vulnerable to mass surveillance. Australia’s eSafety Commissioner eventually watered down the requirement, allowing companies that believed complying would break end-to-end encryption to demonstrate alternative measures against child abuse and terror content.
Apple has been criticized by regulators and police agencies around the world for its reluctance to weaken the end-to-end encryption of iMessage to meet law enforcement demands. Its decision at the end of 2022 to abandon plans to scan photos and videos stored in iCloud for child sexual abuse material (CSAM) prompted further criticism. Apple, WhatsApp and other proponents of encryption argue that backdoors in encryption threaten users’ privacy worldwide.
The Guardian reported in July that the UK’s National Society for the Prevention of Cruelty to Children had accused Apple of underestimating how frequently CSAM is found in its products.
Apple made just 267 reports of suspected CSAM on its platforms to the National Center for Missing & Exploited Children (NCMEC) in 2023, a vastly lower figure than other tech giants: Google reported more than 1.47m and Meta more than 30.6m, according to NCMEC’s annual report.