Fury at Apple’s plan to scan iPhones for child abuse images and report ‘flagged’ owners to the police after a company employee has looked at their photos

  • New safety tools unveiled to protect young people and limit spread of material
  • The measures are initially only being rolled out in the US, the tech giant said
  • Apple plans for technology to soon be available in the UK and across the globe
  • But security experts branded the plan as ‘absolutely appalling’ and ‘regressive’




Privacy campaigners have expressed fears that Apple’s plan to scan iPhones for child abuse images will be a back door to accessing users’ personal data – and could easily be adapted to spot other material.

A trio of new safety tools has been unveiled in a bid to protect young people and limit the spread of child sexual abuse material (CSAM), the tech giant said.

While the measures are initially only being rolled out in the US, Apple plans for the technology to soon be available in the UK and other countries worldwide.

But Ross Anderson, professor of security engineering at Cambridge University, has branded the plan ‘absolutely appalling’.

Meanwhile Alec Muffett, a security researcher and privacy campaigner who previously worked at Facebook and Deliveroo, described the proposal as a ‘huge and regressive step for individual privacy’.

iPhones will send sexting warnings to parents if their children send or receive explicit images – and will automatically report child abuse images on devices to the authorities, Apple has announced

Mr Anderson said: ‘It is an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of our phones and laptops.’ 

The new Messages system will show a warning when a child is sent a sexually explicit photo, blurring the image, reassuring them that it is OK not to view it, and presenting them with helpful resources.

Parents using linked family accounts will also be warned under the new plans. 

As an extra precaution, children will also be told that if they do choose to view the image, their parents will be sent a notification.

Similar protections will be in place if a child attempts to send a sexually explicit image, Apple said. 

Among the other features is new technology that will allow the company to detect known CSAM images stored in iCloud Photos and report them to law enforcement agencies.

It will be joined by new guidance in Siri and Search, which will point users to helpful resources when they perform searches related to CSAM.

The iPhone maker said the new detection tools have been designed to protect user privacy and do not allow the tech giant to see or scan a user’s photo album. 

Instead, the system will look for matches, securely on the device, based on a database of ‘hashes’ – a type of digital fingerprint – of known CSAM images provided by child safety organisations. 

This matching will only take place when a user attempts to upload an image to their iCloud Photo Library. 
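In outline, the on-device check is a fingerprint lookup: hash the photo, then test whether the fingerprint appears in the supplied database. The Swift sketch below is a loose illustration of that idea only; the hash database is a hypothetical placeholder, and a plain cryptographic hash stands in for Apple’s perceptual ‘NeuralHash’, which is designed so that resized or recompressed copies of an image still match.

```swift
import Foundation
import CryptoKit

// Hypothetical placeholder for the database of fingerprints of known
// images that child safety organisations would supply (empty here).
let knownHashes: Set<String> = []

// Compute a hex fingerprint of raw image bytes. A cryptographic SHA-256
// only matches byte-identical files; it stands in here for Apple's
// perceptual hash purely to keep the sketch simple.
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

// The check runs on the device itself, and only for photos that are
// about to be uploaded to iCloud Photos.
func matchesKnownImage(_ imageData: Data) -> Bool {
    knownHashes.contains(fingerprint(of: imageData))
}
```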


Apple said that only if a threshold of matches for harmful content is exceeded would it be able to manually review the content to confirm the match and send a report to safety organisations.
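As a loose illustration of that threshold rule, the sketch below simply counts matches and only reports a device as reviewable once the count exceeds a preset threshold. The counter and the threshold value are assumptions for illustration; Apple’s published design instead uses cryptographic threshold secret sharing, so that individual match results remain unreadable to the company until the threshold is crossed.

```swift
// Minimal sketch of a threshold gate: nothing is surfaced for manual
// review until the number of matches exceeds the threshold.
struct MatchThresholdGate {
    let threshold: Int
    private(set) var matchCount = 0

    init(threshold: Int) { self.threshold = threshold }

    // Record one match; returns true only once the threshold is exceeded.
    mutating func recordMatch() -> Bool {
        matchCount += 1
        return matchCount > threshold
    }
}

var gate = MatchThresholdGate(threshold: 30) // illustrative value, not Apple's figure
if gate.recordMatch() {
    // Only past this point could flagged content be manually reviewed
    // and, if confirmed, reported to safety organisations.
}
```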

The new tools are set to be introduced later this year as part of the iOS and iPadOS 15 software update due in the autumn, and will initially be introduced in the US only, but with plans to expand further over time.

The company reiterated that the new CSAM detection tools would only apply to those using iCloud Photos and would not allow the firm or anyone else to scan the images on a user’s camera roll.

The announcement is the latest in a series of major updates from the iPhone maker geared towards improving user safety, following a number of security updates earlier this year designed to cut down on third-party data collection and improve user privacy on the iPhone.
