EC Artificial Intelligence Act

Article 11 - Technical Documentation (ART11)


Last updated 2 years ago

Article 11 deals with the technical documentation requirements and practices needed for EC AIA compliance. All activities related to technical documentation are detailed as part of this article. For this article, the Seclea Platform defines twenty categories with relevant checks.

The following is the article text, annotated with the relevant category numbers (11.##) from the Seclea Platform.

The technical documentation of a high-risk AI system shall be drawn up before that system is placed on the market or put into service and shall be kept up-to-date (11.01). The documentation shall be drawn up in a way to:

  • demonstrate that the high-risk AI system complies with the requirements of the EC regulation and provide the necessary information to assess the compliance of the AI system (11.02);

  • for a high-risk AI system related to a product that has other regulatory requirements, allow a single technical documentation to be drawn up (11.02).

The technical documentation shall contain at least the following information, as applicable to the relevant AI system:

  1. A general description of the AI system including (11.03):

    • its intended purpose, the person/s developing the system, the date and the version of the system;

    • how the AI system interacts or can be used to interact with hardware or software that is not part of the AI system itself, where applicable;

    • the versions of relevant software or firmware and any requirement related to version update;

    • the description of all forms in which the AI system is placed on the market or put into service;

    • the description of hardware on which the AI system is intended to run;

    • where the AI system is a component of products, photographs or illustrations showing external features, marking and internal layout of those products;

    • instructions of use for the user and, where applicable, installation instructions;

  2. A detailed description of the elements of the AI system and of the process for its development, including:

    1. the methods and steps performed for the development of the AI system (11.04), including, where relevant, recourse to pre-trained systems or tools provided by third parties and how these have been used, integrated or modified by the provider (11.05);

    2. the design specifications of the system (11.06), namely the general logic of the AI system and of the algorithms; the key design choices, including the rationale and assumptions made, also with regard to persons or groups of persons on which the system is intended to be used; the main classification choices; what the system is designed to optimise for and the relevance of the different parameters; the decisions about any possible trade-offs made regarding the technical solutions adopted to comply with the requirements set out in Articles 9 to 15;

    3. the description of the system architecture explaining how software components build on or feed into each other and integrate into the overall processing (11.07); the computational resources used to develop, train, test and validate the AI system (11.08);

    4. where relevant, the data requirements in terms of datasheets describing the training methodologies and techniques and the training data sets used, including information about the provenance of those data sets, their scope and main characteristics; how the data was obtained and selected; labelling procedures (e.g. for supervised learning), data cleaning methodologies (e.g. outlier detection) (11.09);

    5. assessment of the human oversight measures needed in accordance with Article 14, including an assessment of the technical measures needed to facilitate the interpretation of the outputs of AI systems by the users, in accordance with Article 13(3)(d) (11.10);

    6. where applicable, a detailed description of pre-determined changes to the AI system and its performance (11.11), together with all the relevant information related to the technical solutions adopted to ensure continuous compliance of the AI system with the relevant requirements set (11.12);

    7. the validation and testing procedures used, including information about the validation and testing data used and their main characteristics; metrics used to measure accuracy, robustness, cybersecurity and compliance with other relevant requirements, as well as potentially discriminatory impacts; test logs (11.13) and all test reports dated and signed by the responsible persons (11.19), including with regard to pre-determined changes as referred to under point 2.6.

  3. Detailed information about the monitoring, functioning and control of the AI system, in particular with regard to: its capabilities and limitations in performance, including the degrees of accuracy for specific persons or groups of persons on which the system is intended to be used and the overall expected level of accuracy in relation to its intended purpose; the foreseeable unintended outcomes and sources of risks to health and safety, fundamental rights and discrimination in view of the intended purpose of the AI system; the human oversight measures needed in accordance with Article 14, including the technical measures put in place to facilitate the interpretation of the outputs of AI systems by the users; specifications on input data, as appropriate (11.14);

  4. A detailed description of the risk management system in accordance with Article 9 (11.15);

  5. A description of any change made to the system through its lifecycle (11.16);

  6. A list of the harmonised standards applied in full or in part, the references of which have been published in the Official Journal of the European Union; where no such harmonised standards have been applied, a detailed description of the solutions adopted to meet the requirements set out in Title III, Chapter 2, including a list of other relevant standards and technical specifications applied (11.17);

  7. A copy of the EU declaration of conformity (11.20);

  8. A detailed description of the system in place to evaluate the AI system performance in the post-market phase in accordance with Article 61 (11.18), including the post-market monitoring plan referred to in Article 61(3).
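As an illustration only (not part of the regulation or the Seclea Platform), the documentation items above could be tracked as a simple machine-readable checklist keyed by the 11.## category numbers. The category IDs and titles come from the mapping in this article; the dictionary layout, field names and helper function are hypothetical assumptions.

```python
# Hypothetical sketch: the Article 11 documentation items as a checklist.
# Category IDs/titles are from the Seclea Platform mapping in this article;
# everything else (evidence format, helper) is illustrative only.

ARTICLE_11_CATEGORIES = {
    "11.01": "Technical Documentation Generated",
    "11.02": "Additional Technical Documentation",
    "11.03": "Technical Details",
    "11.04": "Development Steps and Methods",
    "11.05": "Pre-trained or Third Party Tools/Systems",
    "11.06": "Design Specification",
    "11.07": "System Architecture",
    "11.08": "Computational Resources",
    "11.09": "Data Requirements",
    "11.10": "Human Oversight Assessment",
    "11.11": "Pre Determined Changes",
    "11.12": "Continuous Compliance",
    "11.13": "Validation and Testing",
    "11.14": "Monitoring, Function and Control",
    "11.15": "Risk Management System",
    "11.16": "Changes",
    "11.17": "Other Technical Standards",
    "11.18": "Ongoing Monitoring System",
    "11.19": "Reports Signed",
    "11.20": "Declaration of Conformity",
}

def missing_items(evidence: dict) -> list:
    """Return the category IDs that have no evidence document attached yet."""
    return [cid for cid in ARTICLE_11_CATEGORIES if not evidence.get(cid)]

# Usage: a partially completed documentation set (file names are made up).
evidence = {"11.01": "tech_doc_v1.pdf", "11.20": "eu_declaration.pdf"}
print(f"{len(missing_items(evidence))} of {len(ARTICLE_11_CATEGORIES)} items still missing")
# prints "18 of 20 items still missing"
```

Such a checklist only tracks whether evidence exists for each category; assessing whether the evidence actually satisfies the requirement remains a manual compliance task.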

Below is the list of controls/checks that form part of Article 11.

  • 11.01 - Technical Documentation Generated
  • 11.02 - Additional Technical Documentation
  • 11.03 - Technical Details
  • 11.04 - Development Steps and Methods
  • 11.05 - Pre-trained or Third Party Tools/Systems
  • 11.06 - Design Specification
  • 11.07 - System Architecture
  • 11.08 - Computational Resources
  • 11.09 - Data Requirements
  • 11.10 - Human Oversight Assessment
  • 11.11 - Pre Determined Changes
  • 11.12 - Continuous Compliance
  • 11.13 - Validation and Testing
  • 11.14 - Monitoring, Function and Control
  • 11.15 - Risk Management System
  • 11.16 - Changes
  • 11.17 - Other Technical Standards
  • 11.18 - Ongoing Monitoring System
  • 11.19 - Reports Signed
  • 11.20 - Declaration of Conformity