https://www.statewatch.org/news/2024/june/policing-by-design...
"The paper calls for “a harmonised EU regime on data retention” that is “technology neutral and future-proof,” covers all types of telecommunications service providers, includes measures ensuring both retention of and access to data, and is “in full compliance with privacy and data protection rules.”
"The EU’s previous data retention legislation was struck down by the Court of Justice in 2014, which found that the law allowed for “a wide-ranging and particularly serious interference” with the fundamental rights to privacy and data protection. The court has confirmed this interpretation in several cases about national data retention measures."
"the paper calls for retention of data from “service providers of any kind that could provide access to electronic evidence."
"agreed upon the need for law enforcement to have access to data en clair"
High-Level Group (HLG) recommendations:
https://home-affairs.ec.europa.eu/document/download/1105a0ef...
11. "The creation of a platform (equivalent to SIRIUS51) to share tools, best practices, and knowledge on how to be granted access to data from product owners and producers. Building further on SIRIUS, this should be expanded to include hardware manufacturers in its mandate and to create and map law enforcement points of contact with digital hardware and software manufacturers."
22. "Developing a technology roadmap that brings together technology, cybersecurity, privacy, standardisation and security experts and ensures adequate coordination e.g. potentially through a permanent structure, in order to implement lawful access by design in all relevant technologies in line with the needs expressed by law enforcement, ensuring at the same time strong security and cybersecurity and providing for the full respect of legal obligations on lawful access. According to the HLG, law enforcement authorities should contribute to the definition of requirements, but it should not be their role to impose specific solutions on companies so that they can provide lawful access to data for criminal investigative purposes without compromising security."
26. "Establishing a research group to assess the technical feasibility of built-in lawful access obligations (including for accessing encrypted data) for digital devices, while maintaining and without compromising the security of devices and the privacy of information for all users as well as without weakening or undermining the security of communications."
I could quote the entire PDF but it's too long. In short, they want to expand surveillance on all fronts and mandate backdoors both in software and hardware. Read the PDF.
They're still saying the old thing about accessing encrypted data while protecting privacy, when it's obvious that it isn't possible to access encrypted data and for that data to still be "secure" at the same time.
They can use homomorphic encryption to learn about the data without actually seeing it.
No, they couldn't. Homomorphic encryption makes it possible for whoever holds the keys to the data to get certain kinds of processing done on it by someone who doesn't know what the data represents, and who won't know what the results represent.
It is very carefully constructed exactly to prevent what you're talking about: leaking any kind of information about the data to someone who doesn't already know what the data is.
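To make that concrete, here's a toy additively homomorphic sketch in Python (Paillier-style, with toy-small fixed primes; illustration only, not secure, all names are mine, and it needs Python 3.8+ for pow(x, -1, n)):

    import random
    from math import gcd

    def lcm(a, b):
        return a * b // gcd(a, b)

    def keygen():
        p, q = 1789, 1861              # toy primes; real Paillier uses ~1024-bit ones
        n = p * q
        lam = lcm(p - 1, q - 1)
        g = n + 1                      # standard simplification
        mu = pow(lam, -1, n)           # modular inverse; valid because g = n + 1
        return (n, g), (lam, mu, n)

    def encrypt(pub, m):
        n, g = pub
        r = random.randrange(1, n)     # fresh randomness on every call
        while gcd(r, n) != 1:
            r = random.randrange(1, n)
        return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

    def decrypt(priv, c):
        lam, mu, n = priv
        return (((pow(c, lam, n * n) - 1) // n) * mu) % n

    def add_encrypted(pub, c1, c2):
        n, _ = pub
        return (c1 * c2) % (n * n)     # multiplying ciphertexts adds the plaintexts

    pub, priv = keygen()
    a, b = encrypt(pub, 20), encrypt(pub, 22)
    c = add_encrypted(pub, a, b)       # a third party can do this step
    print(decrypt(priv, c))            # 42 -- only the keyholder ever learns it

The party running add_encrypted sees only pub and random-looking ciphertexts; without priv it learns nothing about the inputs or the result.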
The problem is that nobody outside of the people enforcing this would know what that "processing" is looking for either. Is it going to look for illegal content, political activists, or women seeking an abortion?
You can design a system where FHE does the analysis, and then the result is available to the 3rd party as well. Nothing in FHE prevents you from doing that.
Do you mean because you can make the result a yes/no, and then brute-force it with a plaintext attack (encrypting "yes", encrypting "no", and seeing which it is)? Or is there some technique that'd scale to larger output sizes?
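(For what it's worth, the naive encrypt-and-compare version of that fails against any semantically secure scheme, because encryption is randomized. With the toy sketch upthread:

    pub, priv = keygen()
    print(encrypt(pub, 1) == encrypt(pub, 1))   # False: fresh randomness each time

so you'd need a deterministic scheme, or some decryption oracle, for that brute-force to work.)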
Sure, if you have the private keys you can publish the result to whomever you want. But you don't need and wouldn't benefit from FHE in any way in this case.
You would benefit from FHE: the users would know that data never leaves the device, the inference is done locally, and only the result is shared.
I mean, I do not have a link to a paper with a system like that, but I think a combination of FHE and an enclave of sorts could work for such a purpose (leaving aside potential performance issues with FHE).
If the data is encrypted with my key, no one else can access it or do anything else with it. Period - there is nothing more to talk about (assuming that the encryption scheme is secure, of course). No one can extract anything from this data unless they have my private key.
FHE, formally, is simply a scheme that has the following formal property:

Program(Encrypted(data, key)) = Encrypted(Program(data), key)

FHE allows me to securely use someone else's hardware to run my inference on my data and be confident that I am the only one who knows the result. If the data is on my hardware, and I don't want it to leave my hardware, then FHE is completely useless for me.
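The toy additive scheme upthread satisfies exactly that property for Program = addition, which is easy to check (full FHE extends this from addition to arbitrary programs):

    pub, priv = keygen()
    x, y = 6, 7
    out = decrypt(priv, add_encrypted(pub, encrypt(pub, x), encrypt(pub, y)))
    print(out == x + y)    # True: decrypting Program(Enc(data)) gives Program(data)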
What you actually want is something like trusted computing. The government decides what analysis to run, it sends it to my hardware, my hardware runs that analysis on my decrypted data, and sends the result to the government, in such a way that the government can be certain that the algorithm was followed exactly. Of course, you need some assurances even here, such that the government doesn't just ask for the plaintext data itself - there have to be some limits to what they can run.
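A rough sketch of that flow, with every name made up and a hash standing in for real remote attestation (a TPM or TEE quote):

    import hashlib

    ALLOWED = {"count_messages"}       # the hard part: limiting what may run at all

    def run_analysis(name, plaintext):
        # the government-chosen analysis runs on *decrypted* data, on-device
        if name == "count_messages":
            return plaintext.count(b"\n") + 1

    def attest(name):
        # stand-in for an attestation quote: proof to the remote party
        # that exactly this analysis ran, unmodified
        return hashlib.sha256(name.encode()).hexdigest()

    def handle_government_query(name, plaintext):
        if name not in ALLOWED:
            raise PermissionError("query exceeds what the rules allow")
        return {"result": run_analysis(name, plaintext), "quote": attest(name)}

    print(handle_government_query("count_messages", b"hi\nthere"))

The attestation stops the device owner from tampering with the analysis; the ALLOWED set is what's supposed to stop the government from just requesting the plaintext, and that part is policy, not cryptography.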
I'm not an expert at all on cryptography so I can't comment on that; however, when looking for info about Thorn I found an FTM page where a university researcher acknowledges it's not possible to do it yet. It should be either this https://www.ftm.eu/articles/ashton-kutchers-non-profit-start... or this one, I can't remember at the moment: https://www.ftm.eu/articles/ashton-kutcher-s-anti-childabuse...
Edit "possible" as in very computationally expensive to do it on a mass scale
Not possible to do what? Homomorphic encryption?
The links you provided are paywalled.
Yep. Sorry for the paywall.
http://web.archive.org/web/20241210080253/https://www.ftm.eu...
Yeah, I think their design won’t work, of course. It doesn’t mean that the technology cannot be applied.
Learn what exactly? Homomorphic encryption allows for mathematical operations on the data: x+1 can be applied to the ciphertext, but it still won't let you know whether x was 1, 2, 3 or any other value.
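Concretely, with the toy Paillier sketch upthread you can even apply x+1 to a ciphertext without ever holding the key:

    def add_constant(pub, c, k):
        n, g = pub
        return (c * pow(g, k, n * n)) % (n * n)    # Enc(m) * g^k = Enc(m + k)

    pub, priv = keygen()
    c = encrypt(pub, 2)
    print(decrypt(priv, add_constant(pub, c, 1)))  # 3 -- but the computing party never learns 2 or 3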
Despite all this, fuck the EU for consistently trying to undermine data privacy and introducing Kim Jong Un-style mass surveillance. None of that shit protects privacy, as they claim.
That's interesting - is there anything relevant they could do under homomorphic encryption? For example, let's say that the government wants to only flag content with the substring "I am planning an attack" - is there any way to do that while keeping encryption intact?
The government alone couldn’t do it. The system has to be on device, otherwise the key is exposed rendering the whole thing moot.
Alternatively, the service providers like Meta can do it. We trust them with the end to end encryption anyway.
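To be concrete about what "on device" means here: in a client-side-scanning design the match runs on the plaintext before encryption, so the E2E crypto itself is untouched. A minimal hypothetical sketch (watchlist and names invented):

    FLAGGED = ["I am planning an attack"]    # hypothetical watchlist

    def send_message(plaintext, encrypt, report):
        # runs on-device, on the plaintext, *before* E2E encryption
        if any(s in plaintext for s in FLAGGED):
            report(plaintext)                # the contested step
        return encrypt(plaintext)

The encryption is never weakened; what changes is that your own device reports on you before encrypting.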
> We trust them with the end to end encryption anyway.
No we don't.
Well, maybe you do not, but the general public does.