EC AIA - Compliance Requirements



Compliance Requirements

High-risk AI systems shall comply with the requirements established in this section. For each compliance requirement, the labelling maps to the corresponding Seclea Platform compliance category and compliance item labels.

The main articles in the EC AIA draft are listed at the end of this page, and their details are covered in the following sections.

The Seclea Platform tracks the compliance of AI applications with these EC AIA articles.

Obligation of AI Providers and Users

Article 16 of the regulation states the obligations of the providers and users of high-risk AI systems, which are detailed in this section:

Obligation of the High-Risk AI Providers

Providers of High-risk AI systems shall:

  1. ensure that their high-risk AI systems are compliant with the requirements set out in Articles 9-15;

  2. have a quality management system in place which complies with Article 17 (discussed subsequently in this section);

  3. draw up the technical documentation of the high-risk AI system (Article 11);

  4. when under their control, keep the logs automatically generated by their high-risk AI systems (Article 12);

  5. ensure that the high-risk AI system undergoes the relevant conformity assessment procedure, prior to its placing on the market or putting into service;

  6. comply with the registration obligations referred to in Article 51 (the requirement to register the high-risk AI application in the EU database for such applications, which will be publicly available);

  7. take the necessary corrective actions if the high-risk AI system is not in conformity with the requirements set out in Articles 9-15;

  8. inform the national competent authorities of the Member States in which they made the AI system available or put it into service and, where applicable, the notified body of the non-compliance and of any corrective actions taken;

  9. affix the CE marking to their high-risk AI systems to indicate conformity with this Regulation, in accordance with Article 49;

  10. upon request of a national competent authority, demonstrate the conformity of the high-risk AI system with the requirements set out in Articles 9-15.
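
Obligation 4 above (keeping the logs automatically generated by the system, per Article 12) can be supported in practice by emitting one structured, timestamped record per inference. The sketch below is illustrative only; all names (`log_inference_event`, the record fields) are hypothetical, not prescribed by the regulation:

```python
import json
import logging
from datetime import datetime, timezone

# Hypothetical audit logger for an AI system's automatically generated
# records; Article 12 asks for logging capabilities supporting traceability.
logger = logging.getLogger("ai_system.audit")
logger.setLevel(logging.INFO)
logger.addHandler(logging.StreamHandler())

def log_inference_event(model_version: str, input_ref: str, output_ref: str) -> str:
    """Emit one timestamped, machine-readable record per inference."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "model_version": model_version,
        "input_ref": input_ref,    # reference to the input data, not the data itself
        "output_ref": output_ref,  # reference to the produced output
    }
    line = json.dumps(record)
    logger.info(line)
    return line

log_inference_event("v1.2.0", "input-0001", "output-0001")
```

Logging references rather than raw inputs keeps the audit trail useful for traceability without duplicating potentially personal data in the logs.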

Conformity Assessment

Providers of high-risk AI systems shall ensure that their systems undergo the relevant conformity assessment procedure in accordance with Article 43, prior to their placing on the market or putting into service.

The provider shall follow one of the following procedures:

  1. the conformity assessment procedure based on internal control referred to in Annex VI;

  2. the conformity assessment procedure based on assessment of the quality management system and assessment of the technical documentation, with the involvement of a notified body, referred to in Annex VII.

Annex VI - Conformity Assessment Procedure based on Internal Control

Based on Annex VI, the provider performs a self-assessment to verify the following:

  1. The provider verifies that the established quality management system complies with the requirements of Article 17.

  2. The provider examines the information in the technical documentation to assess the compliance of the AI system with the relevant essential requirements set out in Articles 9-15.

  3. The provider also verifies that the design and development process of the AI system and its post-market monitoring, as referred to in Article 61, are consistent with the technical documentation.
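
The three Annex VI checks above can be modelled as a simple pass/fail checklist over the provider's own findings. A minimal sketch, with all function and field names hypothetical:

```python
from dataclasses import dataclass

@dataclass
class CheckResult:
    name: str
    passed: bool

def run_self_assessment(qms_compliant: bool,
                        tech_doc_covers_art_9_15: bool,
                        process_matches_doc: bool) -> list[CheckResult]:
    """Collect the three Annex VI internal-control checks into one summary.

    The boolean inputs are the provider's own verification findings.
    """
    return [
        CheckResult("QMS complies with Article 17", qms_compliant),
        CheckResult("Technical documentation shows compliance with Articles 9-15",
                    tech_doc_covers_art_9_15),
        CheckResult("Design/development and post-market monitoring (Article 61) "
                    "consistent with documentation", process_matches_doc),
    ]

results = run_self_assessment(True, True, False)
all_passed = all(r.passed for r in results)
print(all_passed)  # False: the third check failed
```

The self-assessment only passes when all three checks pass; a single failing check means the internal-control procedure cannot conclude conformity.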

Annex VII - Conformity based on Assessment of Quality Management System and Assessment of Technical Documentation

The approved quality management system for the design, development and testing of AI systems pursuant to Article 17 shall be examined in accordance with point 1 and shall be subject to surveillance as specified in point 3. The technical documentation of the AI system shall be examined in accordance with point 2. Points 1 to 3 are listed below:

  1. Quality Management System

    1. The application of the provider shall include:

      1. the name and address of the provider and, if the application is lodged by the authorised representative, their name and address as well;

      2. the list of AI systems covered under the same quality management system;

      3. the technical documentation for each AI system covered under the same quality management system;

      4. the documentation concerning the quality management system which shall cover all the aspects listed under Article 17;

      5. a description of the procedures in place to ensure that the quality management system remains adequate and effective;

      6. a written declaration that the same application has not been lodged with any other notified body.

    2. The quality management system shall be assessed by the notified body, which shall determine whether it satisfies the requirements referred to in Article 17.

    3. The quality management system as approved shall continue to be implemented and maintained by the provider so that it remains adequate and efficient.

    4. Any intended change to the approved quality management system or the list of AI systems covered by the latter shall be brought to the attention of the notified body by the provider.

  2. Control of the technical documentation

    1. In addition to the application referred to in point 1, the provider shall lodge an application with a notified body of their choice for the assessment of the technical documentation relating to the AI system which the provider intends to place on the market or put into service and which is covered by the quality management system referred to under point 1.

    2. The application shall include:

      1. the name and address of the provider;

      2. a written declaration that the same application has not been lodged with any other notified body;

      3. the technical documentation referred to in Annex IV (details of which are included in Article 11 of this document).

  3. Surveillance of the approved quality management system

    1. The purpose of the surveillance carried out by the notified body referred to in point 1 is to make sure that the provider duly fulfils the terms and conditions of the approved quality management system.

    2. For assessment purposes, the provider shall allow the notified body access to the premises where the design, development and testing of the AI systems take place. The provider shall further share with the notified body all necessary information.

    3. The notified body shall carry out periodic audits to make sure that the provider maintains and applies the quality management system and shall provide the provider with an audit report. In the context of those audits, the notified body may carry out additional tests of the AI systems for which an EU technical documentation assessment certificate was issued.
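
Before lodging the quality-management-system application described in point 1, a provider may want to verify its completeness against the six required items (1.1-1.6 above). The sketch below is a hypothetical completeness check; the field names are illustrative labels for those items, not terms from the regulation:

```python
# Hypothetical required fields mirroring points 1.1-1.6 of the Annex VII
# quality-management-system application.
REQUIRED_QMS_APPLICATION_FIELDS = {
    "provider_name_and_address",
    "covered_ai_systems",
    "technical_documentation_per_system",
    "qms_documentation",              # must cover all aspects under Article 17
    "qms_maintenance_procedures",
    "single_notified_body_declaration",
}

def missing_fields(application: dict) -> set[str]:
    """Return required fields that are absent from (or empty in) the application."""
    return {f for f in REQUIRED_QMS_APPLICATION_FIELDS
            if not application.get(f)}

draft = {
    "provider_name_and_address": "ACME AI Ltd, Example Street 1",
    "covered_ai_systems": ["credit-scoring-v2"],
    "qms_documentation": "doc-ref-17",
}
print(sorted(missing_fields(draft)))
```

An empty result from `missing_fields` only means the application is structurally complete; the substantive assessment of the quality management system remains with the notified body under point 1.2.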

  • Article 9 - Risk Management System
  • Article 10 - Data and Data Governance
  • Article 11 - Technical Documentation
  • Article 12 - Record-Keeping
  • Article 13 - Transparency and provision of information to user
  • Article 14 - Human Oversight
  • Article 15 - Accuracy, Robustness and Cybersecurity
  • Article 17 - Quality Management System
  • Article 61 - Post Market Monitoring System