The news that Apple will scan iPhones for child abuse imagery has naturally caused a stir.
Before an image is uploaded to iCloud Photos, the technology will search for matches against already known child sexual abuse material (CSAM). Apple said that if a match is found, a human reviewer will assess it and report the user to law enforcement.
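To make the mechanism concrete, here is a toy sketch of database matching. This is only an illustration: Apple's actual system (NeuralHash) uses perceptual hashes designed to survive resizing and re-encoding, whereas the plain cryptographic hash below is just a stand-in, and the image bytes are invented for the example.

```python
import hashlib

# Hypothetical database of hashes of already-known flagged images.
# A real deployment would hold perceptual hashes, not SHA-256 digests.
known_hashes = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

def matches_known_database(image_bytes: bytes) -> bool:
    """Return True if this image's hash appears in the known database."""
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes

# An exact copy of a known image matches...
assert matches_known_database(b"known-image-bytes")
# ...but a brand-new photo does not: only already-known images can match.
assert not matches_known_database(b"my-holiday-photo")
```

The key point this illustrates is that the system compares images against a fixed list of known material rather than "understanding" photo content, which is also why critics worry about what happens if that list is expanded.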
However, there are privacy concerns that the technology could be expanded to scan phones for other prohibited content or even political speech. Experts worry that it could be used by authoritarian governments to spy on their citizens.
That last point is the most important, because what matters here is the technology itself and its potential uses. Apple prides itself on privacy and markets it heavily, but this feels like a misstep on many levels.
The thing is, if you have stored your photos online, or used YouTube, Gmail or practically any mainstream browser, you have limited privacy anyway; Apple's move is merely an eye-opener rather than a draconian shift that changes everything. The vast majority of people have no idea how much tech brands know about them, and far too many simply do not care.
The fact that it can check photos is not a huge change in itself, for the reasons above, but it is in terms of what it could lead to. For many people Apple is the brand of last resort for privacy, and at a time when vaccination passports are becoming essential to normal life and certain Western governments are displaying a worrying disregard for privacy, this move feels ill-timed at best.
At worst, it offers a worrying glimpse of what is to come, and that just does not feel right to me. No matter how well-intentioned, or how important the issue of child abuse is, the answer is not to erode the privacy and rights of everyone else.