
EU AI Act: Overview of key content and requirements

The EU AI Act is on everyone's lips, yet many do not know what it is, whom it concerns and what its consequences are.

The following article explains what you need to know about it – simply and compactly.

 

What is the AI Act?

The AI Act is the first comprehensive EU regulation on AI and was published in July 2024. It entered into force in August 2024 with phased transitional periods, so its rules take effect gradually.

The AI Act classifies AI systems into four risk levels, based on the underlying technology and the application context:

  • Minimal risk (e.g. AI spam filters) 
  • Limited risk (e.g. AI chatbots in customer service) 
  • High risk (e.g. AI systems for selecting job candidates) 
  • Unacceptable risk (e.g. AI systems for social scoring)  

AI systems with unacceptable risk have been banned in the EU since February 2025. The basic principle: the higher the risk, the stricter the requirements and obligations.

 

Why is the AI Act needed?

AI has been used increasingly for some time and has meanwhile found its way into our daily work. The lack of a legal framework created uncertainty, as many questions remained open – for example, what is allowed and who takes responsibility for damage resulting from incorrect use.

The AI Act aims to clarify such questions and to give AI providers, deployers and users more legal certainty.

It is also intended to foster innovation and to protect health, safety and fundamental rights.

 

Who is affected by the AI Act?

The AI Act primarily addresses two groups:

  • AI providers, who develop AI systems or place them on the EU market 
  • AI deployers, who use AI systems in their own operations 

Although the EU AI Act is primarily addressed to businesses, it also affects employees indirectly: by acting responsibly, they can protect their company from harm and sanctions.

 

Need for AI literacy 

Under Article 4 of the EU AI Act, employers are already required to ensure sufficient AI literacy among their staff. The measures must take into account both the user profile (knowledge and experience) and the specific application context, including the AI system used.

In June 2025, the Federal Network Agency (Bundesnetzagentur) published an explanatory note describing three steps for structuring such measures. The steps are intended as orientation, not as an obligation:

  1. Creating a basic understanding of data and AI in the organisation 
  2. Building advanced AI skills 
  3. Role-specific training with individual focus 

In addition to possible liability risks, further sanctions under the AI Act may also apply. Companies should therefore act early to develop AI literacy.

 


Author

Michelle Jörgens

AWSi