Broadly Good....Deeply Concerning

Updated: Aug 19




So, this issue has become something of a hot potato, since Apple made a big announcement last week that will possibly have implications for many of us in the years ahead. I could have picked an easier subject for only my second blog, but I felt this needed to be addressed. We are going to talk about the future of privacy for anyone running iOS 15 on an iPhone, iPad or Apple Watch, or the latest macOS Monterey on a Mac (all due to ship this autumn), and the updated Expanded Protections for Children.


So What's Going On Then?


There are three main elements to these new privacy policies, all squarely aimed at protecting children. From the get-go, and marking my card firmly, anything that prevents sexual predators from attacking and stalking children has to be the right and good thing. The way Apple are going about it, though, has created cause for concern.


The first part of the update means that if a child is about to receive and open an explicit image within the Messages app, the picture will first be blurred out. Depending on how you have set up your child's phone, they will then be warned, in a very child-friendly manner, about what they could be opening before the message is shown. In conjunction with this, a parent or guardian can receive a message letting them know that their child is about to send or view a potentially sensitive image.

The warning screens as they will appear on an iPhone.


Expanding Siri & Search


If someone is believed to be searching their device for explicit content involving minors, known as CSAM (Child Sexual Abuse Material), a warning like the one below will appear, offering routes to help.



Siri will provide resources and help around searches related to CSAM.


Scanning on Device


Now, I am sure we are all in agreement that the first two steps can be nothing but good. Anything that makes a child's online experience safer and happier has to be welcomed.


The next part, though, is where concerns have been voiced. Integral to the new iOS will be built-in software that scans your photos for explicit material. It does this not by simply looking at the images themselves, but by creating what are known as hashes. Imagine it scanning each photo and reducing it to something close to a binary fingerprint. This helps in two ways. Paedophiles will often manipulate images (cropping, changing colours, removing certain details) to try to avoid online detection; by using hashes, it becomes mathematically almost impossible to make an image undetectable. Those hashes are then compared against a known database held by the National Center for Missing and Exploited Children. Just to be clear, for now at least this will only be happening in the US, but the software will be in every new iOS, so presumably it is ready to roll out around the world later with other agencies.
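To give a rough flavour of what that kind of fingerprinting looks like: Apple's actual system, NeuralHash, is a proprietary, neural-network-based perceptual hash, but the much simpler "average hash" sketched below in Python shows the general principle, namely that a fingerprint built from an image's structure survives cropping or colour changes far better than an ordinary file checksum would. The file name and the known hash in the usage comment are purely hypothetical.

```python
# A minimal sketch of a perceptual "average hash" (aHash), for illustration only.
# Apple's real system (NeuralHash) is far more sophisticated; this toy version
# just shows why a hash derived from image structure, rather than raw bytes,
# tolerates small edits like cropping or colour tweaks.

from PIL import Image  # pip install Pillow


def average_hash(path: str, hash_size: int = 8) -> int:
    """Shrink the image, grey-scale it, and set one bit per pixel:
    1 if the pixel is brighter than the average, 0 otherwise."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    average = sum(pixels) / len(pixels)
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (1 if pixel > average else 0)
    return bits


def hamming_distance(hash_a: int, hash_b: int) -> int:
    """Number of differing bits; a small distance means visually similar images."""
    return bin(hash_a ^ hash_b).count("1")


# Hypothetical usage: compare a photo against a made-up known hash.
# photo_hash = average_hash("holiday.jpg")
# if hamming_distance(photo_hash, KNOWN_HASH) <= 5:
#     print("Likely a match, even if the image was cropped or recoloured")
```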


Another reason the images are held as hashes is that it would be illegal for Apple to hold the images themselves on their own servers; CSAM is illegal, full stop. When a prescribed number of hashes are matched on a device, the authorities will be alerted and the matter escalated. No one knows what that 'safe' threshold is, for obvious reasons. The scanning is done on the device, not on your iCloud back-up, yet all of this only works if you are backing up to iCloud... more on that a little later.
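Purely to illustrate the threshold idea, here is a tiny sketch. The threshold value is made up (the real one has never been disclosed), and it ignores the cryptography Apple actually uses to keep individual matches hidden until that threshold is reached.

```python
# A minimal sketch of threshold-based matching, for illustration only.
# The real threshold is not public, and Apple's actual design uses cryptographic
# techniques so no single match is visible to anyone until the threshold is
# crossed; this toy version only shows the counting idea.

MATCH_THRESHOLD = 30  # hypothetical number; the real value has not been disclosed


def count_matches(device_hashes: set[int], known_database: set[int]) -> int:
    """Count how many on-device hashes also appear in the known database."""
    return len(device_hashes & known_database)


def should_escalate(device_hashes: set[int], known_database: set[int]) -> bool:
    """Escalate for review only once the match count passes the threshold."""
    return count_matches(device_hashes, known_database) >= MATCH_THRESHOLD
```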




But what next?


This system of hashes is not new. Facebook has been using it since 2013 and Google since 2008, with Twitter, Dropbox and Reddit also scanning images against known databases. But Apple has historically always been about privacy, so the 'why now?' question has cropped up many times this week. For starters, would you want to be the one to say you are not interested in protecting the safety of children online? And yes, Facebook and Google did take these steps many years ago, but those platforms are different in two crucial ways. If you make an image public, you waive any right to privacy. And those other companies very publicly make money from trading our data. Apple is unique amongst the tech giants in that they have not been interested in our data; they have always made money from being, predominantly, a hardware company, only recently becoming more software-based too. They evidently felt they now had the right security in place and felt safe in making these privacy changes.


If, and only if, we could guarantee that this would only ever be used against CSAM imagery, I think I'd be a little more relaxed. The problem as I see it, though, is that these hash algorithms could then be used for anything. Certainly, if you back up a pirated movie to Google Drive it will quite simply vanish... today... right now! That is done via the same hash system. So what if Apple, as they are doing now, roll out another application of this technology without an opt-out policy? What if they decide to start looking for evidence of drug use, some other kind of substance abuse, or anti-government propaganda? I mentioned movies, but the same could also be true for, say, pirated music. And as I said, when you update to the next iOS, there is no opt-out.


Also, how long until this could become the most powerful and damaging malware? Imagine images being uploaded to your device without your knowledge; law enforcement could then get involved, or the owner blackmailed. What if governments were to force a 'back door' on Apple?




It's going to happen!


In the week since the press release, there had been some thought that Apple might delay the roll-out until it had a chance to quell generally held fears, but no: Apple have this week confirmed it will go ahead as planned this autumn. Apple would, without doubt, have known the heat they were going to feel over this feature, but they clearly felt this was the right time.




The few dictate to the many


As ever, it is because of the, thankfully, very few sick and despicable characters out there that all of our future devices will now be adjusted. On balance, I guess this has to be a good thing, but our digital future suddenly looks like it could become a far murkier place. And yet, for all this, if you were so sick and depraved that you had these disgusting images on your phone, isn't the obvious loophole simply not to activate iCloud back-up?


If you have found what I have written of interest, here is a BBC article on the subject for further reading.
