FDA - AI based SaMD



The U.S. Food and Drug Administration (FDA) has published guidance documents for Artificial Intelligence (AI) based Software as a Medical Device (SaMD). As AI and Machine Learning (ML) are increasingly incorporated into medical devices, the FDA Center for Devices and Radiological Health (CDRH) is considering “a total product lifecycle-based regulatory framework for these technologies that would allow for modifications to be made from real-world learning and adaptation, while ensuring that the safety and effectiveness of the software as a medical device are maintained.”

The FDA traditionally reviews medical devices through an “appropriate premarket pathway,” but the agency recognises that this regulatory framework was not designed for adaptive technologies such as AI and ML. Regulators have noted that, under the existing agency approach to software modifications, these technologies may require additional premarket review when the software is modified.

In 2019, the agency published a discussion paper outlining a potential approach to premarket review for AI and ML software. As part of its Artificial Intelligence and Machine Learning (AI/ML) Software as a Medical Device Action Plan, the CDRH’s Digital Health Center of Excellence took feedback on the 2019 paper and formulated a response outlining actions the FDA intends to take in this area.

The Seclea Platform provides a baseline template based on the FDA's published guidelines. The template is based on the documents listed below, produced by the FDA and the IMDRF (International Medical Device Regulators Forum). There is as yet no definitive regulatory guideline for AI/ML as a medical device: the FDA has published a framework change request for SaMD (Software as a Medical Device) to include AI/ML algorithms, and the IMDRF has an AI Medical Devices working group working on a guideline document. The EU and other countries that have introduced AI regulations are part of this working group. The last public activity by the IMDRF was the release of the “Key Terms and Definitions” document.

The list of documents used to formulate the FDA AI/ML-based SaMD template includes:
  • IMDRF/SaMD - SaMD: Application of Quality Management System
  • AI/ML-Based Software as a Medical Device (SaMD) Action Plan
  • Proposed Regulatory Framework for Modification to AI/ML-Based Software as a Medical Device (SaMD)
  • Good Machine Learning Practice for Medical Device Development Guiding Principles
  • European Commission - Artificial Intelligence Act (AIA)

As with any medical device, AI-enabled software is subject to FDA review based on its risk classification. Class I devices, such as software that solely displays readings from a continuous glucose monitor, pose the lowest risk. Class II devices are considered moderate to high risk and may include AI software tools that analyse medical images such as mammograms and flag suspicious findings for a radiologist to review. Most Class II devices undergo what is known as a 510(k) review (named for the relevant section of the Federal Food, Drug, and Cosmetic Act), in which a manufacturer demonstrates that its device is “substantially equivalent” to an existing device on the market with the same intended use and technological characteristics.

Alternatively, certain Class I and Class II device manufacturers may submit a De Novo request to the FDA, which can be used for devices that are novel but whose safety and underlying technology are well understood and are therefore considered lower risk. Several AI-driven devices are currently on the market, such as IDx-DR, OsteoDetect, and ContaCT.

Class III devices pose the highest risk. They include products that are life-supporting or life-sustaining, or that are substantially important in preventing the impairment of human health. These devices must undergo the full premarket approval (PMA) process, and developers must submit clinical evidence that the product's benefits outweigh the risks. The Guardian Connect continuous glucose monitoring system, for example, was approved through this process.

Once a device is on the market, FDA takes a risk-based approach to determine whether it will require a premarket review of any changes the developer makes. Generally, when a manufacturer significantly updates the software or makes other changes that would substantially affect the device’s performance, the device may be subject to additional review by the FDA. However, the process for this evaluation differs depending on the device’s risk classification and the nature of the change.
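To make the decision logic in the preceding paragraphs concrete, here is a minimal, purely illustrative Python sketch of the classification-to-pathway mapping described above. The enum, function names, and return strings are hypothetical summaries of this section, not an FDA decision tool or a Seclea Platform API.

```python
from enum import Enum

class DeviceClass(Enum):
    CLASS_I = 1    # lowest risk, e.g. software that only displays CGM readings
    CLASS_II = 2   # moderate-to-high risk, e.g. AI that flags suspicious mammogram findings
    CLASS_III = 3  # highest risk, e.g. life-supporting or life-sustaining products

def premarket_pathway(device_class: DeviceClass, novel_but_well_understood: bool = False) -> str:
    """Return the premarket review pathway described in this section (illustrative only)."""
    if device_class is DeviceClass.CLASS_III:
        # Full premarket approval with clinical evidence that benefits outweigh risks
        return "Premarket Approval (PMA)"
    if novel_but_well_understood:
        # De Novo request: available to certain Class I and Class II devices that are
        # novel but whose safety and underlying technology are well understood
        return "De Novo classification request"
    if device_class is DeviceClass.CLASS_II:
        # Most Class II devices demonstrate substantial equivalence to a marketed predicate
        return "510(k) review"
    return "Class I: lowest-risk category"

def may_need_additional_review(significant_software_update: bool,
                               substantially_affects_performance: bool) -> bool:
    """Post-market changes that are significant or that substantially affect performance
    may trigger additional FDA review, depending on risk class and nature of the change."""
    return significant_software_update or substantially_affects_performance
```

For example, premarket_pathway(DeviceClass.CLASS_II) yields the 510(k) route, while premarket_pathway(DeviceClass.CLASS_II, novel_but_well_understood=True) yields the De Novo pathway.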

The compliance requirements are taken from Appendix A of IMDRF/SaMD - SaMD: Application of Quality Management System, with modifications based on the changes proposed in the “Proposed Regulatory Framework for Modification to AI/ML-Based Software as a Medical Device (SaMD)” and the “European Commission - Artificial Intelligence Act (AIA)”. Any future proposals from the IMDRF related to AI/ML will require revisiting the template.

The Seclea Platform template for FDA - AI based SaMD consists of the following compliance categories:

  • Data Governance (DG)
  • Technical Documentation (TD)
  • Transparency and Provision of Information to Users (TPI)
  • Human Oversight (HO)
  • Accuracy, Robustness and Cybersecurity (ARC)
  • Managing SaMD Lifecycle Support Process - Record Keeping (RK)
  • Risk Management System (RMS)
  • Quality Management Principles (QMP)
  • Post Market Monitoring System (PMS)
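As a purely hypothetical illustration (not part of the Seclea Platform API), the categories above can be held in a simple lookup table when tracking evidence against the template:

```python
# Hypothetical mapping of the template's category codes to names (illustration only).
FDA_AI_SAMD_CATEGORIES = {
    "DG": "Data Governance",
    "TD": "Technical Documentation",
    "TPI": "Transparency and Provision of Information to Users",
    "HO": "Human Oversight",
    "ARC": "Accuracy, Robustness and Cybersecurity",
    "RK": "Managing SaMD Lifecycle Support Process - Record Keeping",
    "RMS": "Risk Management System",
    "QMP": "Quality Management Principles",
    "PMS": "Post Market Monitoring System",
}

def category_name(code: str) -> str:
    """Look up a compliance category by its code, e.g. category_name('DG')."""
    return FDA_AI_SAMD_CATEGORIES[code]
```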