Seclea User Documentation

FDA SaMD with Artificial Intelligence

The U.S. Food and Drug Administration (FDA) has published guidance documents for Artificial Intelligence (AI)-based Software as a Medical Device (SaMD). As AI and Machine Learning (ML) are increasingly incorporated into medical devices, the FDA Center for Devices and Radiological Health (CDRH) is considering “a total product lifecycle-based regulatory framework for these technologies that would allow for modifications to be made from real-world learning and adaptation, while ensuring that the safety and effectiveness of the software as a medical device are maintained.”

The FDA traditionally reviews medical devices through an “appropriate premarket pathway,” but the agency recognises that this regulatory framework was not designed for adaptive technologies such as AI and ML. Regulators have noted that, under the existing agency approach to software modifications, changes to these technologies may require additional premarket review.

In 2019, the agency published a discussion paper outlining a potential approach to premarket review for AI and ML software. As part of its Artificial Intelligence and Machine Learning (AI/ML) Software as a Medical Device Action Plan, the CDRH’s Digital Health Center of Excellence took feedback on the 2019 paper and formulated a response outlining actions the FDA intends to take in this area.

The Seclea Platform provides a baseline template based on the FDA's published guidelines. The template is built on the following documents produced by the FDA and the IMDRF (International Medical Device Regulators Forum). There is as yet no definitive regulatory guideline for AI/ML as a medical device: the FDA has published a framework change request for SaMD (Software as a Medical Device) to include AI/ML algorithms, and the IMDRF has an AI Medical Devices working group working on a guideline document. The EU and other countries that have introduced AI regulations are part of this working group. The most recent public activity by the IMDRF was the release of the “Key Terms and Definitions” document.

The list of documents used to formulate the FDA AI/ML-based SaMD template includes:
  • IMDRF/SaMD - SaMD: Application of Quality Management System
  • AI/ML-Based Software as a Medical Device (SaMD) Action Plan
  • Proposed Regulatory Framework for Modification to AI/ML-Based Software as a Medical Device (SaMD)
  • Good Machine Learning Practice for Medical Device Development Guiding Principles
  • European Commission - Artificial Intelligence Act (AIA)

As with any medical device, AI-enabled software is subject to FDA review based on its risk classification. Class I devices—such as software that solely displays readings from a continuous glucose monitor—pose the lowest risk. Class II devices are considered moderate to high risk and may include AI software tools that analyse medical images such as mammograms and flag suspicious findings for a radiologist to review. Most Class II devices undergo what is known as a 510(k) review (named for the relevant section of the Federal Food, Drug, and Cosmetic Act), in which a manufacturer demonstrates that its device is “substantially equivalent” to an existing device on the market with the same intended use and technological characteristics.

Alternatively, certain Class I and Class II device manufacturers may submit a De Novo request to the FDA, which can be used for devices that are novel but whose safety and underlying technology are well understood and are therefore considered lower risk. Several AI-driven devices, such as IDx-DR, OsteoDetect, and ContaCT, are already on the market.

Class III devices pose the highest risk. They include life-supporting or life-sustaining products, or devices of substantial importance in preventing the impairment of human health. These devices must undergo the full premarket approval process, and developers must submit clinical evidence that the product's benefits outweigh the risks. The Guardian Connect continuous glucose monitoring system, for example, was approved through premarket approval.

Once a device is on the market, FDA takes a risk-based approach to determine whether it will require a premarket review of any changes the developer makes. Generally, when a manufacturer significantly updates the software or makes other changes that would substantially affect the device’s performance, the device may be subject to additional review by the FDA. However, the process for this evaluation differs depending on the device’s risk classification and the nature of the change.
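
The pathways described above can be condensed into a small decision sketch. The Python snippet below is purely illustrative and not an FDA tool or part of the seclea-ai API: the pathway mapping, the SoftwareChange fields and the may_need_new_review helper are hypothetical simplifications of the risk-based logic outlined in the preceding paragraphs.

```python
# Purely illustrative: a simplified sketch of the review pathways described above.
# The mapping, dataclass fields and helper are hypothetical; actual FDA
# determinations depend on intended use, claims and the nature of each change.

from dataclasses import dataclass

PREMARKET_PATHWAYS = {
    "I": "General controls; many Class I devices are exempt from 510(k)",
    "II": "510(k) substantial equivalence, or De Novo where no predicate exists",
    "III": "Premarket Approval (PMA) supported by clinical evidence",
}

@dataclass
class SoftwareChange:
    affects_performance: bool   # e.g. a retrained model that changes outputs
    affects_intended_use: bool  # e.g. a new clinical claim

def may_need_new_review(device_class: str, change: SoftwareChange) -> bool:
    """Simplified version of the post-market, risk-based logic described above:
    changes that substantially affect performance or intended use can trigger
    another premarket review, with scrutiny increasing with device class."""
    if change.affects_intended_use:
        return True
    if change.affects_performance:
        return device_class in ("II", "III")
    return False

print(PREMARKET_PATHWAYS["II"])
print(may_need_new_review("II", SoftwareChange(affects_performance=True,
                                               affects_intended_use=False)))  # True
```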

The compliance requirements are taken from Appendix A of IMDRF/SaMD - SaMD: Application of Quality Management System, with modifications based on the proposals in the “Proposed Regulatory Framework for Modification to AI/ML-Based Software as a Medical Device (SaMD)” and the “European Commission - Artificial Intelligence Act (AIA)”. Any future proposals from the IMDRF related to AI/ML will require revisiting the template.

The Seclea Platform template for FDA AI-based SaMD consists of the following compliance categories:

  • Data Governance (DG)
  • Technical Documentation (TD)
  • Transparency and Provision of Information to Users (TPI)
  • Human Oversight (HO)
  • Accuracy, Robustness and Cybersecurity (ARC)
  • Managing SaMD Lifecycle Support Process - Record Keeping (RK)
  • Risk Management System (RMS)
  • Quality Management Principles (QMP)
  • Post Market Monitoring System (PMS)

More details on the FDA AI-based SaMD can be found here.
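
As a minimal sketch of how these categories might be tracked in code (this is not the seclea-ai API; the enum and checklist below are hypothetical, with names taken only from the category list above):

```python
# A minimal sketch, not the seclea-ai API: the template's compliance categories
# as an enum, with a simple per-category evidence checklist.

from enum import Enum

class FDASaMDCategory(Enum):
    DG = "Data Governance"
    TD = "Technical Documentation"
    TPI = "Transparency and Provision of Information to Users"
    HO = "Human Oversight"
    ARC = "Accuracy, Robustness and Cybersecurity"
    RK = "Managing SaMD Lifecycle Support Process - Record Keeping"
    RMS = "Risk Management System"
    QMP = "Quality Management Principles"
    PMS = "Post Market Monitoring System"

# Hypothetical tracker: mark each category once evidence has been attached.
checklist = {category: False for category in FDASaMDCategory}

def mark_complete(category: FDASaMDCategory) -> None:
    checklist[category] = True

mark_complete(FDASaMDCategory.DG)
outstanding = [c.value for c, done in checklist.items() if not done]
print(f"{len(outstanding)} categories still need evidence: {', '.join(outstanding)}")
```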