After the trilogues, the EU AI Act falls short of the guarantees people need

14 December 2023 | Artificial Intelligence, Statement

On 8 December, the EU institutions reached a political agreement on the EU Artificial Intelligence Act in a trilogue negotiation between the European Parliament, the Council and the Commission. The act follows a risk-based approach, classifying artificial intelligence systems according to their potential risks, ranging from no risk to unacceptable risk. Most of the discussions prior to the compromise concerned exceptions to the protection rules in high-risk areas of AI use, such as migration, law enforcement and national security.

The lack of transparency in the trilogues makes it difficult to thoroughly assess the guarantees provided by the legislation. The technical discussions in the coming weeks will be crucial in clarifying the details of the agreement, so it will be important to continue to ensure public scrutiny of the process.

As civil society and civic actors, we are analysing the value of the legislation not only according to its general features but also according to the extent to which it protects all people, in any situation. Civil society’s engagement, mobilisation and advocacy towards a human-centred AI Act have been crucial for the discussions.

Even with positive provisions in the compromise, however, we are concerned by the elements that have been made public so far, as it appears that specific groups would be less protected than the rest of society, while the use of AI systems in certain contexts would be less well covered than in others (see significant examples below).

The text includes limitations and prohibitions on harmful systems, such as live public facial recognition, emotion recognition and biometric categorisation systems. However, these restrictions have major loopholes, carve-outs and significant gaps that weaken their ability to fully protect human rights, the rule of law and democracy. For example, emotion recognition systems are to be banned, but only in workplaces and educational settings, overlooking the most harmful applications, such as policing, border control, and migration contexts. The act seems to rely on developers’ and deployers’ self-assessments of the risks of AI systems instead of independent assessments, which is a major loophole for proper protection against misuse and harm. The act also includes major exemptions for law enforcement and security authorities to use high-risk AI systems.

Commenting on the agreement, ECF Policy and Capacity Building Officer Kerttu Willamo said:

We are disappointed with the deal reached by the European institutions, which will undermine the safeguards that civil society has long advocated for and leave them open to abuse. The AI Act as proposed by the European Commission was flawed from the beginning, driven by market concerns rather than people’s rights.


We need to recognise the tireless mobilisation of civil society, which, until the last moment, fought to include fundamental rights protections for all. Once again it is clear why we need not only a thriving civic space but also the structural and meaningful involvement of civil society in legislative processes, in line with Article 11 of the Treaty on European Union.

While we acknowledge the ambitious objectives of the legislation, we fear that any weakness or vagueness in its formulations will fail to protect people from the abuses and harms of AI systems. We need to examine the provisions adopted in the upcoming technical discussions in order to draw a final conclusion on the regulation and its ability to meet the EU’s claimed original objective of promoting trustworthy and human-centred AI that fully respects the fundamental rights of all.

To date, we mostly see a self-congratulatory narrative from the European institutions that is reproduced by the media. We hope that, even at this late stage, society can benefit from a critical, public, and democratic debate about this regulation, which covers such an essential issue for the future of our societies.