To counter the risks and address issues such as data protection, transparency and liability in connection with AI, legislative bodies such as the European Union are increasingly addressing the topic. The EU Artificial Intelligence Act (also known as the AI Act) came into force in August 2024. It imposes obligations primarily on providers, but also on (professional) users of AI systems. It follows a risk-based approach: the greater the risk associated with an AI application, the more far-reaching the regulation. Certain practices whose risks are considered unacceptable have been banned under the AI Act since the beginning of February 2025.
The remaining provisions of the AI Act will be applied gradually over several years. The AI Act obliges providers of high-risk AI systems to set up a quality management system (described in more detail in the text of the regulation). This "shall be documented in a systematic and orderly manner in the form of written policies, procedures and instructions" and, among a number of other aspects, must also include a risk management system, the requirements of which are likewise described in the text of the AI Act.
To address liability issues in connection with artificial intelligence, likewise at EU level, the revised Product Liability Directive came into force at the beginning of December; unlike the previous version, it explicitly extends the definition of a product to include software and AI systems. Unlike the AI Act, whose provisions are directly binding in all EU member states, it is a directive, so the individual states have two years to transpose it into national law. The same transposition requirement previously applied to the EU Machinery Directive, which will be completely replaced by the new Machinery Regulation by 2027; individual provisions of the regulation have been coming into force gradually since July 2023. The Machinery Regulation is another example of a legal regulation that explicitly addresses artificial intelligence - and more are likely to follow.