The EU Commission's draft regulation on preventing and combating child sexual abuse is a frontal attack on civil rights. In its draft law, the EU Commission describes one of the most sophisticated mass surveillance apparatuses ever deployed outside of China: the scanning for CSAM (child sexual abuse material) on all devices.
In its coalition agreement, however, the German government agreed that every citizen has a "right to encryption" - a right that would obviously be undermined if the EU Commission's draft became law.
In a leaked internal statement to the EU Commission, the German government demands that the draft law be brought into line with "constitutional standards for the protection of private and confidential communication." The statement takes a stand against "general monitoring measures and the interception of private communication."
According to the leak, the German government poses 61 questions, some of which are very hard to answer and which confront the EU Commission with serious problems.
For instance, the German government explicitly asks whether the reference in the text to the importance of end-to-end encryption also means that this encryption must not be undermined when detecting images of sexual violence. It further asks how the draft law is compatible with the provisions of the General Data Protection Regulation (GDPR), and whether the software can be built to distinguish between children in non-abusive situations (e.g. at the beach) and children in abusive situations.
Another important question concerns the security of all European citizens: the German government asks how the law could establish whether the technology is being misused, and how such misuse would be detected.
All in all, the German government poses critical questions that bring every weakness of the European Commission's draft law to light: general mass surveillance of the kind the draft envisages will weaken the security of European citizens and businesses and hollow out the right to privacy.
It is to be expected that the EU Commission will only be able to answer these questions inadequately. The sheer number and depth of the German government's questions, together with the massive infringements of fundamental rights the draft entails, make an answer that satisfies privacy advocates nearly impossible.
The question now is: how will the German government react when it receives these unsatisfactory answers? Will it stand up for the "right to encryption" - as stated in the coalition agreement? Will it defend the right to privacy of every European citizen?
While the leaked questions are a very good sign, we should not sit back and wait to see how things develop.
Instead, we must act now and tell politicians that our right to privacy matters.
If you live in Germany, sign the petition by Campact.
If you live in Austria, sign the petition by #aufstehn.
Give the European Commission your feedback and share your concerns about this draft law.
Here you can find out how to join the protest and how to call and email your political representatives.
From: German government
GER thanks COM for the initiative and welcomes COM’s effort to prevent and combat child sexual abuse. This is also an objective of the coalition treaty. The CSA draft regulation is an important step towards fighting child sexual abuse in the digital space on a European level and achieving better protection for children.
Common legislation including risk assessment, risk mitigation, risk reporting, a clear legal basis and a new European Centre may help strengthen prevention and prosecution of child sexual abuse throughout the EU – while recognizing existing structures of content reporting services.
The confidentiality of communications is an important asset in our liberal societies that must be protected. Based on the Charter of Fundamental Rights, everyone has the right to respect for his or her private and family life, home and communications. All regulatory measures must be proportionate, should not go beyond what is necessary to prevent child sexual abuse in the digital space, and must effectively balance the conflicting interests of protecting children from abuse on the one hand and protecting privacy on the other.
GER will contribute to finding clear, appropriate and permanent measures to help strengthen prevention and prosecution of child sexual abuse throughout the EU. According to GER’s coalition treaty, secrecy of communication, a high level of data protection, a high level of cybersecurity as well as universal end-to-end encryption are essential for GER. The GER coalition treaty opposes general monitoring measures and measures for the scanning of private communications. GER is reviewing the draft proposal in light of the coalition treaty. For GER it is important that regulation fighting and preventing the dissemination of child sexual abuse material is in line with our constitutional standards of protection for private and confidential communication.
Regarding the establishment of an EU Centre, the EU strategy had a rather comprehensive approach in mind, addressing both online and offline prevention. The current proposal appears to primarily support law enforcement activities, while having no explicit mandate for offline prevention measures. From our view, the EU Centre should additionally be a hub for awareness-raising measures and the support of networks (incl. networks of survivors of child sexual abuse). We are convinced that the EU Centre should focus in particular on the prevention of online CSA. However, within the scope of its competence, it should also address offline CSA where online offenses are associated with offline violence. Additionally, GER advises implementing a structure of active participation of those affected by CSA from the beginning in the design of the EU Centre. The EU Centre aims to provide support for those affected by CSA; however, the current proposal provides no information concerning the participation of those affected by CSA in the EU Centre.
Notwithstanding these substantive comments, we are still examining the current proposal to establish the EU Centre as an independent agency.
Our scrutiny reservation includes, but is not limited to, the organizational design of a new European Centre, Article 4, and – very generally speaking – the balancing of fundamental rights, especially regarding the confidentiality of communication and end-to-end encryption.
GER would very much welcome the possibility of holding technical expert workshops alongside LEWP. Technical workshops would give MS the opportunity to learn more about the technologies at stake regarding detection orders and help improve a common understanding among MS.
We are intensively reviewing the draft regulation and will further comment on it. At this point GER has numerous questions. We would like to thank the Presidency and COM for the opportunity to transmit our questions and initial observations.
GER kindly asks for clarification regarding the following questions. At this point, GER’s priority lies with the following questions:
How does EU CSA support the prevention of offline child sexual abuse? Besides the right for information and deletion of CSAM – what supporting measures are planned for victims and survivors of child sexual abuse?
Could the COM please give examples of possible mitigation measures regarding the dissemination of CSAM as well as grooming that are suitable for preventing a detection order?
Could the COM please explain how age verification by providers and App Stores, respectively, shall be designed? What kind of information should be provided by a user? With regard to grooming, your proposal specifically aims at communication with a child user. Shall the identification of a child user be conducted only via age verification? If a risk has been detected, will providers be obliged to implement user registration and age verification? Will there also be a verification to identify adult users misusing apps designed for children?
Does the COM share the view that recital 26, which indicates that the use of end-to-end encryption technology is an important tool to guarantee the security and confidentiality of users’ communications, means that technologies used to detect child abuse shall not undermine end-to-end encryption?
Could the COM please describe in detail a technology that does not break end-to-end encryption, protects the terminal equipment and can still detect CSAM? Are there any technical or legal boundaries (existing or future) for using technologies to detect online child sexual abuse?
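The tension behind these two questions can be made concrete with a toy model of client-side scanning, the main approach discussed for detecting content without breaking the cipher itself. Everything below is an illustrative sketch with invented values (the hash database, the placeholder cipher), not a description of any real system: the point is only that the inspection happens before encryption, so the end-to-end encryption layer remains cryptographically intact while the confidentiality of the message is nevertheless compromised.

```python
import hashlib

# Hypothetical database of known-content hashes (value invented for illustration).
INDICATOR_HASHES = {hashlib.sha256(b"known-bad-sample").hexdigest()}

def client_side_scan(plaintext: bytes) -> bool:
    """Runs on the sender's device, on the unencrypted message."""
    return hashlib.sha256(plaintext).hexdigest() in INDICATOR_HASHES

def encrypt(plaintext: bytes, key: int = 42) -> bytes:
    """Placeholder for the E2EE layer; NOT real cryptography."""
    return bytes(b ^ key for b in plaintext)

def send_message(plaintext: bytes) -> bytes:
    # The scan happens before encryption: the cipher is never broken,
    # but the content is inspected while still in the clear.
    if client_side_scan(plaintext):
        print("match reported before encryption")
    return encrypt(plaintext)
```

In this sketch, "not undermining end-to-end encryption" is true only in the narrow cryptographic sense; whether it holds in the sense of recital 26 (security and confidentiality of communications) is exactly what GER is asking.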
What kind of (technological) measures does COM consider necessary for providers of hosting services and providers of interpersonal communication in the course of risk assessment? In particular, how can a provider conduct a risk assessment without applying the technology referred to in Articles 7 and 10? How can these providers fulfill the obligation if their service is end-to-end encrypted?
How mature are state-of-the-art technologies when it comes to avoiding false positive hits? What proportion of false positive hits can be expected when technologies are used to detect grooming? In order to reduce false positive hits, does COM deem it necessary to stipulate that hits are only disclosed if the method meets certain parameters (e.g., a hit probability of 99.9% that the content in question actually is CSAM)?
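Why false-positive rates dominate this debate can be shown with a quick base-rate calculation. All numbers below are invented assumptions for illustration, not figures from the proposal or the leak; the structural point is that when almost everything scanned is innocent, even a detector that is "99.9% accurate" produces mostly false alarms:

```python
# Illustrative base-rate arithmetic with assumed, invented numbers.
messages_scanned = 100_000_000   # assumed daily message volume
prevalence = 1e-6                # assumed fraction of messages that are CSAM
false_positive_rate = 0.001      # a detector that is "99.9% accurate"
true_positive_rate = 0.99        # assumed detection rate on actual CSAM

actual_positives = messages_scanned * prevalence
true_hits = actual_positives * true_positive_rate
false_hits = (messages_scanned - actual_positives) * false_positive_rate

# Precision: the share of reported hits that are genuine.
precision = true_hits / (true_hits + false_hits)
print(f"true hits:  {true_hits:,.0f}")
print(f"false hits: {false_hits:,.0f}")
print(f"share of reported hits that are genuine: {precision:.2%}")
```

Under these assumed numbers, roughly 99 genuine hits would be buried in about 100,000 false alarms per day, which is why GER asks whether a disclosure threshold on hit probability is necessary.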
Does the proposal establish a legal basis for the processing of personal data for providers in the context of a detection order within the meaning of Article 6 GDPR? Does the proposal establish a legal basis for the processing of personal data for the EU-Centre in the context of a detection order within the meaning of regulation 2018/1725?
Additionally we would already like to raise the following questions:
Risk-assessment and risk mitigation:
Can COM give details on relevant „data samples“ and the practical scope of risk-assessment obligations, especially differentiating between providers of hosting services and providers of interpersonal communications services?
Can COM confirm that providers’ voluntary search for CSAM remains (legally) possible? Are there plans to extend the interim regulation, which allows providers to search for CSAM?
In Art. 3 par. 2 (e) ii the proposal describes features which are typical of social media platforms. Can COM please describe scenarios in which a risk analysis for those platforms does not come to a positive result?
Regarding detection orders:
Recital 23 states that detection orders should – if possible – be limited to an identifiable part of the service e.g. to specific users or user groups. Could COM please clarify how specific users/user groups shall be identified and in which scenarios a detection order should only be issued addressing a specific user/user groups?
Are the requirements set out in article 7 para 5 / para 6 / para 7 to be understood cumulatively?
Can COM please clarify „evidence of a significant risk“? Is it sufficient that there are more child users on the platforms and that they communicate to the extent described in Article 3?
How detailed does the detection order specify the technical measure required of the provider?
Can COM please clarify the requirements of para 5b, 6a, 7b – which standard of review is applied? How can the likelihood in Art. 7 par 7 (b) be measured? Does the principle of in dubio pro reo apply in favor of the hosting service?
How are the reasons for issuing the identification order weighed against the rights and legitimate interests of all parties concerned under Article 7(4)(b)? Is this based on a concrete or an abstract assessment?
Has COM yet received feedback from the providers, especially regarding Article 7? If so, can you please elaborate on the general feedback?
How concretely does the identification order specify the measure required of the provider? What follows in this respect from Article 7(8) („shall target and specify [the detection order]“), what from Article 10(2) („The provider shall not be required to use any specific technology“)?
On page 10 of the proposal it says: „Obligations to detect online child sexual abuse are preferable to dependence on voluntary actions by providers, not only because those actions to date have proven insufficient to effectively fight against online child sexual abuse (…)“ What is COM’s evidence proving that these voluntary actions are insufficient?
How does the draft regulation relate to the rights of data subjects under Art. 12 et seq. of the GDPR, in particular Article 22 GDPR?
Regarding data protection supervisory authorities existing tasks under GDPR and other existing or currently negotiated European Acts (such as the DSA) how can effective control of identification orders be reached?
Does „all parties affected“ in Art. 9 include users who have not disseminated CSAM or solicited children but who were nevertheless checked?
Which technologies can be used in principle? Does Microsoft PhotoDNA meet the requirements?
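For context on what such matching technologies can and cannot do, here is a minimal, hypothetical sketch of hash-based matching (toy 64-bit hashes and a Hamming-distance threshold; real perceptual hashes such as PhotoDNA's are far more robust to resizing and re-encoding, and all values below are invented). The key property is that it only recognizes near-duplicates of material already in an indicator database; judging a previously unseen image requires a different class of technology (machine-learning classifiers) with very different error characteristics:

```python
# Toy sketch of perceptual-hash matching against a database of indicators.
def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")

def matches_known_material(image_hash: int, indicator_db: set[int],
                           threshold: int = 4) -> bool:
    """Flag an image only if its hash is near a known indicator hash.

    Note: this can only recognize *already known* material; it cannot
    judge whether a previously unseen picture is abusive or harmless.
    """
    return any(hamming_distance(image_hash, h) <= threshold
               for h in indicator_db)

# Hypothetical indicator hashes (invented values).
indicators = {0xDEADBEEFCAFEBABE, 0x0123456789ABCDEF}

print(matches_known_material(0xDEADBEEFCAFEBABF, indicators))  # near-duplicate
print(matches_known_material(0x7777777777777777, indicators))  # unknown image
```

This limitation is what makes the later questions about beach photos and grooming conversations so pointed: hash matching cannot answer them, and classifiers that try to bring their own false-positive problem.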
Should technologies used in relation to cloud services also enable access to encrypted content?
How is the quality of the technologies assured or validated? How does the CSA proposal relate to the draft AI-Act?
How is the equivalence of providers‘ own technologies to be assessed under Article 10(2) and how does this relate to providers‘ ability to invoke trade secrets?
Can the technology be designed to differentiate between pictures of children in a normal/ not abusive setting (e.g. at the beach) and CSAM?
Can text analysis software differentiate a legitimate conversation between adults (parents, relatives, teachers, sport coaches, friends etc) and children from a grooming situation?
How do you want to ensure that providers solely use the technology – especially the one offered by the EU Centre – for executing the detection order?
How would we handle an error? How should eventual cases of misuse be detected?
Could you please elaborate on the human oversight and how it can prevent errors by the technologies used?
How do you expect providers to inform users on „the impact on the confidentiality of users‘ communication“? Is it a duty due to the issuance of a detection order? Or may it be a part of the terms and conditions?
Do providers of file/image hosting which do not have access to the content they store fall under the scope of the Regulation?
Further provider obligations
How do reporting obligations under this proposal relate to current NCMEC reporting? How can the two processes best be streamlined? How can it be ensured that neither a duplication of reports nor a loss of reports takes place?
Which role should the Coordinating Authority play regarding reporting obligations?
Regarding an EU-wide removal of CSAM, how does COM deal with national differences in criminal law?
What number of cases does COM expect for the reports to EU CSA? How many cases will be forwarded to the competent national law enforcement authorities and/or Europol?
Will the right to an effective redress be affected by the obligation under art. 14 to execute a removal order within 24 hours?
At what point can knowledge of the content be assumed to have been obtained by the provider? Is human knowledge required?
What standard of review does COM assume with regard to the various „actors“ in the information chain in the process of issuing an order? Does this include the requirement for a human assessment/audit in each case?
Why should Europol be involved in all cases, i.e. not only in cases of unclear MS responsibility?
How can blocking orders be limited in practice to specific content or areas of a service, or can only access to the service as a whole be blocked?
Do cloud services have to block access to encrypted content if they receive a suspicious activity report about specific users?
Why did you choose a latitude of judgment regarding penalties?
Does Art. 35 apply to cases of misuse of technology or the omission to establish effective measures to prevent such misuse (Art. 10 para 4)?
Why doesn’t the proposal follow the sanctions set out in TCO Regulation?
Could Article 35(2) be limited to breaches of a central obligation or a small number of central obligations?
Article 39 (2) does not provide for the national law enforcement authorities to be directly connected to the information exchange systems. In which way will reports be passed on to national LEAs?
What shall the information-sharing system embrace? How can effectiveness and data protection best be balanced?
Only EU CSA and Europol will have direct access to the database of indicators (Art 46(5)); how can national LEAs/national coordinating authorities best partake of this information? Does COM consider a new interface necessary in order to let national authorities know that further information might be available?
EU CSA & Europol
With regards to the proposed EU Centre’s cooperation with Europol, how does the Commission envision the distribution of tasks between the two entities in concrete terms in order to assure that any duplication of effort is avoided?
We took notice that the Commission’s impact assessment does not examine further the possibility of integrating the tasks of prevention and victim support into FRA and the tasks with relevance for law enforcement into Europol, instead of creating a new entity. Rather, it seems that this possibility was discarded after preliminary examination. We would therefore like to know why this option was not examined further in the first place. Moreover, we kindly ask COM to explain the advantages it expects from creating a new entity instead of allocating the tasks to FRA and Europol in combination.
The legislative proposal foresees that Europol should provide certain „support services“ to EU CSA. What are the concrete means and services EU CSA should draw on at Europol? How can those support tasks be demarcated from the tasks of EU CSA? In that context we would like to ask whether, and if so how many, additional resources COM estimates for Europol.
How should Europol handle this support in terms of resources and how does COM ensure that such support would not come at the expense of Europol’s other tasks?
How can the proposed governance structure of EU CSA best be streamlined with Europol’s governance structure, making sure that no imbalance between the Commission and Member States is created?
Article 53(2) of the draft deals with mutual access to relevant information and information systems in relation to Europol. Are we right in assuming that the provision does not regulate access to information as such, because reference is made to the relevant provisions („in accordance with the acts of Union law regulating such access“)? What then is the specific regulatory content of the provision? Please explain.
By when does COM estimate that EU CSA can start its work (while perhaps not yet being fully operational)?
At what stage of the process are images deleted according to the proposal?
According to Article 64(4)(h), the Executive Director of EU CSA to be established may impose financial penalties if there are criminal acts detrimental to the financial resources of the Union. How does this relate to EPPO proceedings?
How can the proposal ensure that the competences of EU CSA do not collide with the competences of Eurojust?