FDA Proposes Regulations for AI-Powered Medical Devices
Because AI systems can retrain themselves over time, the administration wants to keep tabs on what’s happening under the hood.
The Food and Drug Administration is trying to figure out how to regulate artificial intelligence as the medical community starts experimenting with the tech.
The FDA on Tuesday proposed a first-of-its-kind framework for assessing the safety and effectiveness of medical devices that rely on AI and machine learning. While doctors have long evangelized AI’s potential to improve healthcare and advance research, the tech presents a handful of issues the administration has yet to fully grapple with.
The framework is currently open for public comment.
AI and ML “have the potential to transform healthcare by deriving new and important insights from the vast amount of data generated during the delivery of healthcare every day,” leading to earlier disease detection, more accurate diagnoses and personalized patient treatment, FDA officials wrote in the solicitation.
One of the tech’s greatest assets is its ability to learn from real-world use and experience—but that also makes it more difficult to regulate. A system’s performance could change as it retrains itself with new data, so it’s important for both regulators and manufacturers to monitor what’s happening under the hood, according to the FDA.
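To make that monitoring concrete, here is a minimal sketch, in Python, of how a manufacturer might track a deployed tool’s real-world accuracy and flag when it drifts from the level demonstrated at clearance. The window size, tolerance and class name are illustrative assumptions, not anything specified in the FDA proposal.

from collections import deque

class PerformanceMonitor:
    """Illustrative sketch: tracks a deployed model's rolling accuracy against the
    accuracy it showed at clearance time and flags drift once performance drops past
    a tolerance. The tolerance and window size are hypothetical, not FDA requirements."""

    def __init__(self, baseline_accuracy: float, window: int = 500, tolerance: float = 0.05):
        self.baseline = baseline_accuracy
        self.tolerance = tolerance
        self.outcomes = deque(maxlen=window)  # 1 = correct call, 0 = incorrect call

    def record(self, prediction, ground_truth) -> None:
        """Log one real-world case once the confirmed outcome is known."""
        self.outcomes.append(1 if prediction == ground_truth else 0)

    def rolling_accuracy(self) -> float:
        """Accuracy over the most recent window of field cases."""
        return sum(self.outcomes) / len(self.outcomes) if self.outcomes else 1.0

    def drifted(self) -> bool:
        """True when field performance falls below the baseline minus the tolerance."""
        return self.rolling_accuracy() < self.baseline - self.tolerance

# Example: a device cleared at 92 percent accuracy that starts missing cases in the field.
monitor = PerformanceMonitor(baseline_accuracy=0.92)
for prediction, confirmed in [("disease", "disease"), ("healthy", "disease"), ("healthy", "healthy")]:
    monitor.record(prediction, confirmed)
print(monitor.rolling_accuracy(), monitor.drifted())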
And because the field is developing so quickly, manufacturers should also be able to upgrade their devices safely without going back through the full premarket review process every time.
“The highly iterative, autonomous and adaptive nature of these tools requires a new, total product lifecycle regulatory approach that facilitates a rapid cycle of product improvement … while providing effective safeguards,” officials wrote.
The framework would allow manufacturers to get potential future software changes preapproved by the FDA before their product ever goes to market, meaning they could upgrade the tool within those parameters without additional premarket review. Changes to algorithms or data inputs that go beyond the preapproved parameters would require a new FDA review.
Manufacturers would also need to go back through the administration’s review process if they wanted to change how their product is used, say, from detecting diseases to diagnosing patients.
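To make the preapproval idea concrete, here is a minimal sketch, in Python, of how a manufacturer might check a proposed update against the change parameters the FDA signed off on before launch. The field names, inputs and accuracy floor are hypothetical, not drawn from the framework itself.

# Minimal sketch of the "preapproved change" idea: record the modification envelope
# the FDA signed off on before launch, then check each proposed update against it.
# All field names and bounds below are hypothetical.
PREAPPROVED_ENVELOPE = {
    "allowed_inputs": {"chest_xray", "patient_age"},   # data sources cleared for use
    "intended_use": "detect_pneumonia",                # changing this triggers a new review
    "min_validation_accuracy": 0.90,                   # a retrained model must meet this bar
}

def needs_new_review(update: dict) -> bool:
    """Return True if the proposed update falls outside the preapproved envelope."""
    if update["intended_use"] != PREAPPROVED_ENVELOPE["intended_use"]:
        return True  # e.g., moving from detecting a disease to diagnosing patients
    if not set(update["inputs"]) <= PREAPPROVED_ENVELOPE["allowed_inputs"]:
        return True  # new data inputs were not part of the preapproval
    if update["validation_accuracy"] < PREAPPROVED_ENVELOPE["min_validation_accuracy"]:
        return True  # retrained model falls below the preapproved performance floor
    return False

# A retrain on the same inputs that still meets the accuracy bar could ship without
# another submission; adding a new input type could not.
print(needs_new_review({"intended_use": "detect_pneumonia",
                        "inputs": ["chest_xray"],
                        "validation_accuracy": 0.93}))          # False
print(needs_new_review({"intended_use": "detect_pneumonia",
                        "inputs": ["chest_xray", "ct_scan"],
                        "validation_accuracy": 0.95}))          # True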
Companies would be required to continuously monitor the accuracy and performance of the tech as it’s used in the field and regularly update FDA officials on software changes, even if they were preapproved.
“Transparency about the function and modifications of medical devices is a key aspect of their safety,” the FDA said. “Gathering performance data on the real-world use of the [device] may allow manufacturers to understand how their products are being used, identify opportunities for improvements and respond proactively to safety or usability concerns.”