The EU AI Act (AI Act) aims to establish a common regulatory and legal framework for the development and use of AI in the EU.
UK companies and other companies based outside the EU could be affected in several ways. For example, if their website offers a chatbot function to users in the EU, they could be caught by the Act's transparency obligations.
According to Article 3 of the EU AI Act, 'AI system' means a machine-based system that is designed to operate with varying levels of autonomy and that may exhibit adaptiveness after deployment, and that, for explicit or implicit objectives, infers, from the input it receives, how to generate outputs such as predictions, content, recommendations, or decisions that can influence physical or virtual environments.
Your obligations depend on the risk category of the AI functionality provided. The four risk categories are:
Unacceptable risk systems:
Prohibited since February 2025, for example emotion recognition systems in the workplace and in education, inappropriate use of social scoring, and crime prediction systems.
High Risk systems:
AI systems that create a high risk to the health and safety or fundamental rights of individuals and so are subject to onerous requirements, for example AI used in recruitment or in assessing creditworthiness.
Limited Risk systems:
AI systems that pose a transparency risk, such as chatbots, where users must be made aware that they are interacting with AI.
Minimal Risk systems:
AI systems with no significant restrictions, such as spam filters.
Article 4 of the EU AI Act mandates AI literacy as a fundamental compliance requirement. Organisations deploying AI must ensure that their employees, contractors, and relevant third parties have the necessary skills and knowledge to deploy AI responsibly and manage associated risks.
This places a greater onus on businesses to put in place education programmes that demonstrate comprehension and application, going well beyond one-off training sessions.
Penalties for non-compliance range from €7.5 million to €35 million, or from 1% to 7% of the company's global annual turnover, whichever is higher, depending on the severity of the infringement (a worked illustration follows the tiers below).
Prohibited practices
Up to the higher of €35,000,000 or 7% of the undertaking’s global annual turnover.
High-risk AI and specific transparency requirements
Up to the higher of €15,000,000 or 3% of the undertaking’s global annual turnover.
General-purpose AI (GPAI) models
Up to the higher of €15,000,000 or 3% of the provider’s global annual turnover.
Supplying incorrect, incomplete or misleading information to notified bodies or national competent authorities
Up to the higher of €7,500,000 or 1% of the undertaking’s global annual turnover.
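To show how the "higher of" rule plays out in practice, here is a minimal sketch; the max_fine helper and the €600 million turnover figure are illustrative assumptions for this example, not figures from the Act.

```python
# Minimal sketch of the "higher of" rule for AI Act maximum fines.
# The max_fine helper and the 600m turnover figure are hypothetical illustrations.

def max_fine(fixed_cap_eur: float, turnover_share: float, global_turnover_eur: float) -> float:
    """Return the maximum fine for a penalty tier: the higher of the fixed
    cap and the given share of global annual turnover."""
    return max(fixed_cap_eur, turnover_share * global_turnover_eur)

turnover = 600_000_000  # hypothetical global annual turnover, in euros

# Prohibited practices tier: up to the higher of EUR 35m or 7% of turnover.
print(max_fine(35_000_000, 0.07, turnover))  # prints 42000000.0 (7% of turnover exceeds the EUR 35m cap)

# Misleading-information tier: up to the higher of EUR 7.5m or 1% of turnover.
print(max_fine(7_500_000, 0.01, turnover))   # prints 7500000 (the EUR 7.5m cap exceeds 1% of turnover, EUR 6m)
```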
Reach out to Andrew Gordon if you’d like to discuss any of the above.
On 19 November 2025, the European Commission announced proposed changes to the AI Act, following its commitment to a “clear, simple, and innovation-friendly implementation of the AI Act.”