
Factbox-What are Europe's landmark AI regulations?

Published 09/12/2023, 14:48
Updated 09/12/2023, 23:36
© Reuters. FILE PHOTO: European Union flags fly outside the European Commission in Brussels, Belgium November 8, 2023. REUTERS/Yves Herman/File Photo

By Foo Yun Chee

BRUSSELS (Reuters) - European Union policymakers and lawmakers clinched a deal on Friday on the world's first comprehensive set of rules regulating the use of artificial intelligence (AI) in tools such as ChatGPT and in biometric surveillance.

They will thrash out details in the coming weeks that could alter the final legislation, which is expected to enter into force early next year and apply in 2026.

Until then, companies are encouraged to sign up to a voluntary AI Pact to implement key obligations of the rules.

Here are the key points that have been agreed:

HIGH-RISK SYSTEMS

So-called high-risk AI systems - those deemed to have significant potential to harm health, safety, fundamental rights, the environment, democracy, elections and the rule of law - will have to comply with a set of requirements and obligations, such as undergoing a fundamental rights impact assessment, in order to gain access to the EU market.

AI systems considered to pose limited risks would be subject to very light transparency obligations, such as disclosure labels declaring that content is AI-generated, so that users can decide how to use it.

USE OF AI IN LAW ENFORCEMENT

The use of real-time remote biometric identification systems in public spaces by law enforcement will only be allowed to help identify victims of kidnapping, human trafficking, sexual exploitation, and to prevent a specific and present terrorist threat.

They will also be permitted in efforts to track down people suspected of terrorism offences, trafficking, sexual exploitation, murder, kidnapping, rape, armed robbery, participation in a criminal organisation and environmental crime.

GENERAL PURPOSE AI SYSTEMS (GPAI) AND FOUNDATION MODELS

GPAI and foundation models will be subject to transparency requirements such as drawing up technical documentation, complying with EU copyright law and disseminating detailed summaries about the content used for algorithm training.

Foundation models classed as posing a systemic risk and high-impact GPAI will have to conduct model evaluations, assess and mitigate risks, conduct adversarial testing, report to the European Commission on serious incidents, ensure cybersecurity and report on their energy efficiency.

Until harmonised EU standards are published, GPAIs with systemic risk may rely on codes of practice to comply with the regulation.

PROHIBITED AI

The regulations bar the following:

- Biometric categorisation systems that use sensitive characteristics such as political, religious or philosophical beliefs, sexual orientation and race.

- Untargeted scraping of facial images from the internet or CCTV footage to create facial recognition databases.

- Emotion recognition in the workplace and educational institutions.

- Social scoring based on social behaviour or personal characteristics.

- AI systems that manipulate human behaviour to circumvent people's free will.

- AI used to exploit the vulnerabilities of people due to their age, disability, or social or economic situation.

SANCTIONS FOR VIOLATIONS


Depending on the infringement and the size of the company involved, fines will start at 7.5 million euros ($8 million) or 1.5% of global annual turnover and rise to as much as 35 million euros or 7% of global turnover.

($1 = 0.9293 euros)
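As a rough, back-of-the-envelope illustration only (not part of the regulation's text), the sketch below converts the fine figures quoted above into euro and dollar amounts for a hypothetical company; the 1 billion euro turnover figure and the Python helper are assumptions for illustration, and the actual fine tiers depend on the infringement and the company's size.

```python
# Illustrative sketch only: applies the article's quoted fine bounds to a
# hypothetical company's global annual turnover. The regulation's actual
# fine-setting mechanism is not detailed in the article.

LOWER_FIXED_EUR = 7_500_000    # 7.5 million euros
LOWER_PCT = 0.015              # 1.5% of global annual turnover
UPPER_FIXED_EUR = 35_000_000   # 35 million euros
UPPER_PCT = 0.07               # 7% of global annual turnover
USD_PER_EUR = 1 / 0.9293       # article's quoted rate: $1 = 0.9293 euros


def fine_bounds(global_turnover_eur: float) -> dict:
    """Return the quoted lower and upper fine figures for a given turnover."""
    return {
        "lower fixed amount": LOWER_FIXED_EUR,
        "lower turnover-based amount (1.5%)": LOWER_PCT * global_turnover_eur,
        "upper fixed amount": UPPER_FIXED_EUR,
        "upper turnover-based amount (7%)": UPPER_PCT * global_turnover_eur,
    }


if __name__ == "__main__":
    # Hypothetical company with 1 billion euros in global annual turnover.
    for label, eur in fine_bounds(1_000_000_000).items():
        usd = eur * USD_PER_EUR
        print(f"{label}: {eur / 1e6:,.1f}M euros (~${usd / 1e6:,.1f}M)")
```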
