Tomorrow it will be decided if I agree with Victoria Nuland

… and her infamous line “Fuck the EU”. Why? Because tomorrow the decision will be made whether the EU introduces surveillance for what’s meant to be private communication.

Originally the vote was supposed to take place today (2024-06-19).

More information is available on the website of Patrick Breyer (German version).

We should not be surprised that this initiative happens during the tenure of Ursula von der Leyen, who in Germany is also known by the moniker Zensursula — a portmanteau of Zensur (German for censorship) and her first name — thanks to her past initiatives to push for surveillance. What’s similar is the pretext, which is to fight CSAM.

Who could possibly be against that? And why are you against that, Oliver?

I am not against fighting CSAM. Heck, there’s little that would be worse for loving parents to imagine than their children falling prey to those criminals. However, in this case the proposed measures are highly intrusive, undermine the purpose of end-to-end encryption and — worst of all — will do exactly nothing against the creation and spread of CSAM.

See, that’s the issue with these sorts of measures. They may be well-meaning, but none of them will prevent the perpetrators of child sexual abuse from “documenting” their abuse and spreading the resulting CSAM. Criminals generally don’t give a flying fuck about laws and are masterful at avoiding and evading measures meant to prevent their actions. So what this will achieve is this: it will criminalize people who have nothing to do with CSAM, or make it impossible, or impossibly inconvenient, for them to communicate by modern means.

And at this point we haven’t even talked about who gets to see the video clips or photos uploaded before the end-to-end encryption takes place. Why is this important? Well, for starters we live in a day and age where such data is used to train machine learning models (some call it AI; I prefer to call them LLMs, or sometimes multi-modal LLMs). This means that data such as photos which you entrust to software like Signal or WhatsApp — and which you hope is end-to-end encrypted — may well end up somewhere other than the recipients’ devices, and may be used to hone biometric recognition technology that will be used to push for even more surveillance down the road.
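To make the problem concrete, here is a minimal sketch of why client-side scanning defeats end-to-end encryption regardless of how strong the cryptography is. All names in it (scan_for_csam, e2e_encrypt, send_with_client_side_scanning) are hypothetical placeholders, not the API of Signal, WhatsApp, or any real scanner; the “encryption” is a toy XOR that only marks where real encryption would run:

```python
from dataclasses import dataclass


@dataclass
class Verdict:
    flagged: bool


def scan_for_csam(media: bytes) -> Verdict:
    # Placeholder classifier; a real scanner would run a hash match or an
    # ML model -- and could misclassify, retain, or forward the media.
    return Verdict(flagged=False)


def e2e_encrypt(plaintext: bytes, key: bytes) -> bytes:
    # Toy XOR "encryption", purely to mark where real E2E crypto would run.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(plaintext))


def send_with_client_side_scanning(photo: bytes, key: bytes) -> bytes:
    # The mandated scan inspects the *plaintext* before encryption:
    # the content crosses the E2E boundary no matter what crypto follows.
    if scan_for_csam(photo).flagged:
        pass  # here the original, unencrypted media could leave the device
    return e2e_encrypt(photo, key)  # encryption happens only afterwards
```

The point of the sketch: whoever controls scan_for_csam sees every photo in the clear, so “end-to-end encrypted” becomes a statement about the wire, not about who can read your content.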

Ironically the push for these measures is done by the same entity that has been praised for the GDPR in the past. And no less ironically the data will in all likelihood end up in the data centers of huge US corporations. And before anyone claims that this means it’s harmless as long as the data centers are situated within the jurisdiction of the EU, think again!
