FDA issues draft guidance on developing and managing AI-enabled devices

The 67-page draft guidance explains how to apply a total product lifecycle (TPLC) approach to developing and managing AI-enabled devices, how to consider issues such as the user interface, and how to address risks and data systems throughout the product lifecycle; it also offers examples of documentation to submit to the Agency.

The “draft guidance brings together relevant information for developers, shares learnings from authorized AI-enabled devices and provides a first point-of-reference for specific recommendations that apply to these devices, from the earliest stages of development through the device’s entire life cycle,” said Troy Tazbaz, Director of the Digital Health Center of Excellence (DHCoE) at FDA’s Center for Devices and Radiological Health (CDRH), in an Agency statement. He also noted that regulators have authorized more than 1,000 AI-enabled medical devices for the US market thus far.

In recent years, the FDA has published several documents covering AI-enabled products, including guiding principles for good machine learning practice (GMLP) and transparency for machine learning-enabled devices. It has also held a public workshop promoting a patient-centered approach to AI-enabled devices.

The draft guidance is intended to complement the recent final guidance on predetermined change control plans (PCCP) for AI-enabled devices.  The PCCP guidance outlines recommendations on proactively planning for product updates, which can help sponsors avoid filing new premarket applications for certain product changes.

“The guidance, if finalized, would be the first guidance to provide comprehensive recommendations for AI-enabled devices throughout the total product lifecycle, providing developers an accessible set of considerations that tie together design, development, maintenance and documentation recommendations to help ensure safety and effectiveness of AI-enabled devices,” said FDA.  “The draft guidance includes recommendations for how and when, in marketing submissions, sponsors should describe the post market performance monitoring and management of their AI-enabled devices.”

“Importantly, this draft guidance also includes the FDA’s current thinking on strategies to address transparency and bias throughout the life cycle of AI-enabled devices,” the Agency added.  “The draft guidance describes specific recommendations intended to help a sponsor demonstrate they have addressed risks associated with bias and provides suggestions for the thoughtful design and evaluation of AI-enabled devices.”

The draft guidance points sponsors to specific recommendations in other guidances that may apply to their AI-enabled devices. However, FDA noted that sponsors may also need to consider recommendations and guidances not mentioned in the draft.

FDA also encouraged sponsors to talk to regulators early and often when bringing AI-enabled products to market, and emphasized using the guidance in the postmarket phase to ensure products continue to be monitored and meet regulatory expectations. “Early engagement with FDA can help guide product development and submission preparation,” said FDA. “In particular, early engagement could be helpful when new and emerging technology is used in the development or design of the device, or when novel methods are used during the validation of the device.”

FDA has also tried to harmonize terminology and recommendations regarding software-related consensus standards, which have significantly improved medical device software consistency, quality, and documentation.  The Agency said sponsors should consider using the consensus standards listed in the guidance when developing AI-enabled medical devices and preparing premarket submissions.

FDA also aimed to clarify differences between the terminology used by the Agency and by the AI community, including how each views the TPLC, and addressed those differences in the guidance to ensure a shared understanding. For example, regulators noted that developers often use “validation” to refer to data curation or model tuning of an AI-enabled device, whereas the Agency defines it as confirming that a product can consistently meet the requirements for its intended use. As a result, FDA said sponsors should avoid using “validation” to describe the training and tuning process in their premarket submissions, to avoid confusion.

REFERENCE: RAPS Regulatory Focus (RF); 06 JAN 2025; Ferdous Al-Faruque