Tough questions raised
The EU Commission's draft regulation on preventing and combating child abuse is a frontal attack on civil rights. In its draft law, the EU Commission describes one of the most sophisticated mass surveillance apparatuses ever deployed outside China: CSAM (child sexual abuse material) scanning on everybody's devices.
The German government, however, agreed in its coalition paper that every citizen has a 'right to encryption' - which would obviously be undermined should the EU Commission's draft become law.
In a leaked internal statement to the EU Commission, the German government demands that the draft law must be brought in line with "constitutional standards for the protection of private and confidential communications". The statement positions itself against "general surveillance and private communications scanning measures".
According to the leak, the German government asks 61 questions, some of which are very tough and will be difficult for the EU Commission to answer.
For instance, the German government explicitly asks whether the reference to the importance of end-to-end encryption in the text also means that this encryption must not be undermined when detecting images of sexual violence. In addition, the German government asks how the draft law is compatible with certain provisions of the General Data Protection Regulation (GDPR) or whether the software can be built in such a way that it can distinguish between children in non-abusive situations (e.g. at the beach) and those in abusive situations.
Another important question touches on the security of all European citizens: The German government asks how it can be ensured that the technology is not misused, and how cases of misuse would be detected.
How will the Commission answer?
All in all, the German government is asking critical questions that bring all the weaknesses of the European Commission's draft law to light: General mass surveillance as proposed in the draft law would weaken the security of European citizens and businesses, and undermine the right to privacy.
It is to be expected that the EU Commission will only be able to answer these questions unsatisfactorily. The sheer number and depth of the German government's questions, as well as the massive violations of fundamental rights inherent in the proposed law, make it close to impossible to answer them in a way that will satisfy data protection experts.
Fight for privacy
The question that needs to be asked now is: How will the German government react when it receives these unsatisfactory answers? Will it push for the 'right to encryption' - as stated in the coalition paper? Will it defend the right to privacy of every European citizen?
While the leaked questions are a very good sign, we should not sit back and wait to see how things develop.
Instead, we need to take action now and tell politicians that our right to privacy matters.
If you live in Germany, sign the petition by Campact to fight the EU Commission’s draft that would lead to unprecedented mass surveillance in Europe.
If you live in Austria, sign the petition by #aufstehn to fight the EU Commission’s draft that would lead to unprecedented mass surveillance in Europe.
Voice your feedback to the European Commission and share your concerns about this draft law.
Check here how you can join the protest as well as call and email your representatives.
Together we must fight mass surveillance!
From: German government
German questions on COM Proposal for a regulation of the European Parliament and of the council laying down rules to prevent and combat child sexual abuse
GER thanks COM for the initiative and welcomes COM’s effort to prevent and combat child sexual abuse. This is also an objective of the coalition treaty. The CSA draft regulation is an important step towards fighting child sexual abuse in the digital space on a European level and reaching better protection for children.
Common legislation including risk assessment, risk mitigation, risk reporting, a clear legal basis and a new European Centre may help strengthen prevention and prosecution of child sexual abuse throughout the EU – while recognizing existing structures of content reporting services.
The confidentiality of communications is an important asset in our liberal societies that must be protected. Based on the Charter of Fundamental Rights, everyone has the right to respect for his or her private and family life, home and communications. All regulatory measures must be proportionate, should not go beyond what is necessary to prevent child sexual abuse in the digital space, and must effectively balance the conflicting interests of protecting children from abuse on the one hand and protecting privacy on the other.
GER will contribute to finding clear, appropriate and permanent measures to help strengthen prevention and prosecution of child sexual abuse throughout the EU. According to GER's coalition treaty, secrecy of communication, a high level of data protection, a high level of cybersecurity as well as universal end-to-end encryption are essential for GER. The GER coalition treaty opposes general monitoring measures and measures for the scanning of private communications. GER is reviewing the draft proposal in the light of the coalition treaty. For GER it is important that regulation fighting and preventing the dissemination of child sexual abuse material is in line with our constitutional standards of protection for private and confidential communication.
Regarding the establishment of an EU Centre, the EU strategy had a rather comprehensive approach in mind, addressing both online and offline prevention. The current proposal appears to primarily support law enforcement activities, while having no explicit mandate for offline prevention measures. From our view, the EU Centre should additionally be a hub for awareness-raising measures and the support of networks (incl. networks of survivors of child sexual abuse). We are convinced that the EU Centre should focus in particular on the prevention of online CSA. However, within the scope of its competence, it should also focus on offline CSA when online offenses are associated with offline violence. Additionally, GER advises implementing a structure of active participation of those affected by CSA in the design of the EU Centre from the beginning. The EU Centre aims to provide support for those affected by CSA. However, the current proposal does not provide information concerning the participation of those affected by CSA in the EU Centre.
Notwithstanding these substantive comments, we are still examining the current proposal to establish the EU Centre as an independent agency.
Our scrutiny reservation includes also but not only the organizational design of a new European Centre, Article 4, and – very generally speaking – the balancing between fundamental rights especially regarding the confidentiality of communication and end-to-end encryption.
GER would very much welcome the possibility of holding technical expert workshops alongside LEWP. Technical workshops would give MS the opportunity to learn more about the technologies at stake regarding detection orders and help improve a common understanding among MS.
We are intensively reviewing the draft regulation and will further comment on it. At this point GER has numerous questions. We would like to thank the Presidency and COM for the opportunity to transmit our questions and initial observations.
GER kindly asks for clarification regarding the following questions. At this point, GER's priority lies in the following questions:
How does EU CSA support the prevention of offline child sexual abuse? Besides the right to information and deletion of CSAM – what supporting measures are planned for victims and survivors of child sexual abuse?
Could the COM please give examples of possible mitigation measures regarding the dissemination of CSAM as well as grooming that are suitable for preventing a detection order?
Could the COM please explain how age verification by providers and app stores, respectively, shall be designed? What kind of information should be provided by a user? With regard to grooming, your proposal specifically aims at communication with a child user. Shall the identification of a child user be conducted only via age verification? If a risk has been detected, will providers be obliged to implement user registration and age verification? Will there also be a verification to identify adult users misusing apps designed for children?
Does the COM share the view that recital 26 indicating that the use of end-to-end-encryption technology is an important tool to guarantee the security and confidentiality of the communications of users means that technologies used to detect child abuse shall not undermine end-to-end-encryption?
Could the COM please describe in detail a technology that does not break end-to-end encryption, protects the terminal equipment and can still detect CSAM? Are there any technical or legal boundaries (existing or future) for using technologies to detect online child sexual abuse?
What kind of (technological) measures does COM consider necessary for providers of hosting services and providers of interpersonal communication in the course of risk assessment? In particular, how can a provider conduct a risk assessment without applying technology referred to in Articles 7 and 10? How can these providers fulfill the obligation if their service is end-to-end encrypted?
How mature are state-of-the-art technologies with regard to avoiding false positive hits? What proportion of false positive hits can be expected when technologies are used to detect grooming? In order to reduce false positive hits, does COM deem it necessary to stipulate that hits are only disclosed if the method meets certain parameters (e.g., a 99.9% probability that the hit in question is accurate)?
Does the proposal establish a legal basis for the processing of personal data for providers in the context of a detection order within the meaning of Article 6 GDPR? Does the proposal establish a legal basis for the processing of personal data for the EU-Centre in the context of a detection order within the meaning of regulation 2018/1725?
Additionally we would already like to raise the following questions:
Risk-assessment and risk mitigation:
Can COM give details on the relevant "data samples" and the practical scope of risk assessment obligations, especially differentiating between providers of hosting services and providers of interpersonal communications services?
Can COM confirm that providers' voluntary searches for CSAM remain (legally) possible? Are there plans to extend the interim regulation, which allows providers to search for CSAM?
In Art. 3 par. 2 (e) ii, the proposal describes features which are typical of social media platforms. Can COM please describe scenarios in which a risk analysis for those platforms does not come to a positive result?
Regarding detection orders:
Recital 23 states that detection orders should – if possible – be limited to an identifiable part of the service e.g. to specific users or user groups. Could COM please clarify how specific users/user groups shall be identified and in which scenarios a detection order should only be issued addressing a specific user/user groups?
Are the requirements set out in article 7 para 5 / para 6 / para 7 to be understood cumulatively?
Can COM please clarify "evidence of a significant risk"? Is it sufficient that there are many child users on the platform and that they communicate to the extent described in Article 3?
How detailed does the detection order specify the technical measure required of the provider?
Can COM please clarify the requirements of para 5b, 6a, 7b – which standard of review is applied? How can the likelihood in Art. 7 par 7 (b) be measured? Does the principle in dubio pro reo apply in favor of the hosting service?
How are the reasons for issuing the identification order weighed against the rights and legitimate interests of all parties concerned under Article 7(4)(b)? Is this based on a concrete measure or an abstract one?
Has COM yet received feedback by the providers, especially regarding article 7? If so, can you please elaborate the general feedback?
How concretely does the identification order specify the measure required of the provider? What follows in this respect from Article 7(8) ("shall target and specify [the detection order]"), and what from Article 10(2) ("The provider shall not be required to use any specific technology")?
On page 10 of the proposal it says: "Obligations to detect online child sexual abuse are preferable to dependence on voluntary actions by providers, not only because those actions to date have proven insufficient to effectively fight against online child sexual abuse (…)" What is COM's evidence proving that these voluntary options are insufficient?
How does the draft regulation relate to the rights of data subjects under Art. 12 et seq. of the GDPR, in particular Article 22 GDPR?
Regarding data protection supervisory authorities existing tasks under GDPR and other existing or currently negotiated European Acts (such as the DSA) how can effective control of identification orders be reached?
Does "all parties affected" in Art. 9 include users who have not disseminated CSAM or solicited children but who were nevertheless checked?
Which technologies can be used in principle? Does Microsoft PhotoDNA meet the requirements?
Should technologies used in relation to cloud services also enable access to encrypted content?
How is the quality of the technologies assured or validated? How does the CSA proposal relate to the draft AI-Act?
How is the equivalence of providers' own technologies to be assessed under Article 10(2), and how does this relate to providers' ability to invoke trade secrets?
Can the technology be designed to differentiate between pictures of children in a normal/ not abusive setting (e.g. at the beach) and CSAM?
Can text analysis software differentiate a legitimate conversation between adults (parents, relatives, teachers, sport coaches, friends etc) and children from a grooming situation?
How do you want to ensure that providers solely use the technology – especially the one offered by the EU Centre – for executing the detection order?
How would we handle an error? How should possible cases of misuse be detected?
Could you please elaborate on the human oversight and how it can prevent errors by the technologies used?
How do you expect providers to inform users on "the impact on the confidentiality of users' communication"? Is this a duty arising from the issuance of a detection order? Or may it be part of the terms and conditions?
Do providers of file/image hosting services which do not have access to the content they store fall under the scope of the Regulation?
Further provider obligations
How do reporting obligations under this proposal relate to current NCMEC reporting? How can the two processes best be streamlined? How can it be ensured that reports are neither duplicated nor lost?
Which role should the Coordinating Authority play regarding reporting obligations?
Regarding an EU-wide removal of CSAM, how does COM deal with national differences in criminal law?
What number of cases does COM expect for the reports to EU CSA? How many cases will be forwarded to the competent national law enforcement authorities and/or Europol?
Will the right to an effective redress be affected by the obligation under art. 14 to execute a removal order within 24 hours?
At what point can knowledge of the content be assumed to have been obtained by the provider? Is human knowledge required?
What standard of review does COM assume with regard to the various "actors" in the information chain in the process of issuing an order? Does this include the requirement for a human assessment/audit in each case?
Why should Europol be involved in all cases, i.e. not only in cases of unclear MS responsibility?
How can blocking orders be limited in practice to specific content or areas of a service, or can only access to the service as a whole be blocked?
Do cloud services have to block access to encrypted content if they receive a suspicious activity report about specific users?
Why did you choose a latitude of judgment regarding penalties?
Does Art. 35 apply to cases of misuse of technology or the omission to establish effective measures to prevent such misuse (Art. 10 para 4)?
Why doesn't the proposal follow the sanctions set out in the TCO Regulation?
Could Article 35(2) be limited to breaches of a central obligation or a small number of central obligations?
Article 39 (2) does not provide for the national law enforcement authorities to be directly connected to the information exchange systems. In which way will reports be passed on to national LEAs?
What shall the information-sharing system embrace? How can effectiveness and data protection best be balanced?
Only EU CSA and Europol will have direct access to the database of indicators (Art 46(5)); how can national LEAs/national coordinating authorities best access this information? Does COM consider a new interface necessary in order to let national authorities know that further information might be available?
EU CSA & Europol
With regards to the proposed EU Centre’s cooperation with Europol, how does the Commission envision the distribution of tasks between the two entities in concrete terms in order to assure that any duplication of effort is avoided?
We took notice that the Commission's impact assessment does not examine further the possibility of integrating the tasks of prevention and victim support into FRA and the tasks with relevance for law enforcement into Europol, instead of creating a new entity. Rather, it seems that this possibility was discarded after preliminary examination. We would therefore like to know why this option was not examined further in the first place. Moreover, we kindly ask COM to explain the advantages it expects from creating a new entity instead of allocating the tasks to FRA and Europol in combination.
The legislative proposal foresees that Europol should provide certain "support services" to EU CSA. What are the concrete means and services EU CSA should draw on at Europol? How can those support tasks be demarcated from the tasks of EU CSA? In that context, we would like to ask whether, and if so what, additional resources COM estimates for Europol.
How should Europol handle this support in terms of resources and how does COM ensure that such support would not come at the expense of Europol’s other tasks?
How can the proposed governance structure of EU CSA best be aligned with Europol's governance structure, making sure that no imbalance between the Commission and Member States is created?
Article 53(2) of the draft deals with mutual access to relevant information and information systems in relation to Europol. Are we right in assuming that the provision does not regulate access to information as such, because reference is made to the relevant provisions ("in accordance with the acts of Union law regulating such access")? What then is the specific regulatory content of the provision? Please explain.
When does COM estimate that EU CSA can start its work (while perhaps not yet being fully operational)?
At what stage of the process are images deleted according to the proposal?
According to Article 64(4)(h), the Executive Director of EU CSA to be established may impose financial penalties if there are criminal acts detrimental to the financial resources of the Union. How does this relate to EPPO proceedings?
How can the proposal ensure that the competences of EU CSA do not collide with the competences of Eurojust?