Newsletter

The EU AI Act Is Coming with Numerous Legal Consequences – But Don’t Forget the GDPR

Legal Insights Germany

June 27, 2024

The EU AI Act was adopted by the Council of the European Union on May 21, 2024. The final text was published in the Official Journal of the European Union on July 12, 2024, and the Act entered into force on August 1, 2024, instead of in July as previously assumed. However, numerous legal rules for artificial intelligence (AI) already apply.

The European Commission's internal timetable for how the EU executive is preparing for the implementation of the AI regulation will therefore also be pushed back by a few weeks. The timetable assumed that the regulation would come into force in June or July. Most provisions in the AI Act will be applicable 24 months after its entry into force. However, the ban on AI systems that pose unacceptable risks, for example, will apply six months after entry into force. Unless certain exemptions apply, this means that companies will likely no longer be allowed to use prohibited AI technologies in the EU starting in February 2025. Violations may potentially be subject to high fines, similar to the General Data Protection Regulation (GDPR).

Because the GDPR already applies, there is a need for timely action around data privacy when using AI in the EU. For example, the recommendations of the German Data Protection Conference (DSK) are relevant to AI: the guidance issued by the DSK on May 6, 2024, “Artificial Intelligence and Data Protection – Version 1.0,” contains a whole series of legal requirements. It applies to any AI application (that is subject to the GDPR) in which personal data is processed with or through AI.

For example, according to the DSK, the data controllers themselves must verify whether and to what extent the AI application that they use has been trained in a lawful manner. Specifically, the DSK suggests that when selecting and using generative AI systems in particular, covered businesses must check and document whether the AI system to be deployed or developed has been trained in accordance with applicable (data privacy) law. If the business has not trained the AI system itself, it must, according to the DSK, verify whether the AI system produces incorrect results, which is difficult to do in practice.

When using an AI application, the DSK requires businesses to evaluate whether the area in which the AI is going to be used is generally appropriate for AI. The DSK is of the opinion that closed AI systems should be preferred over open AI systems (such as cloud-based systems), which will pose at least some challenges in practice. This rule applies in particular to AI systems that are used in connection with legally relevant decision-making (e.g., for application procedures). The DSK suggests that businesses establish internal regulations on the use of AI and provide training for their employees. When using AI systems from third-party providers, the DSK guidance suggests that separate data processing agreements or joint controller agreements under the GDPR be concluded.

Overall, it is clear that data privacy and AI can hardly be separated from each other. Businesses should be mindful, for example, that while anonymization of personal data takes the processing of that data outside the scope of the GDPR, achieving true anonymization in accordance with GDPR standards is not always easy in practice.

The rules of the new AI Office in Brussels, which is intended to safeguard a uniform European AI governance system, will therefore play a key role in the implementation of the AI Act. Businesses should closely monitor the development of new rules issued by the AI Office and actively participate in the debate on AI.

______________

Other Articles in this Issue: