Medical devices are used in the real world every day, so shouldn’t they be tested in the real world? You would think so. But the FDA hasn’t necessarily been of that mindset. Until now.
Per FDA regulations, device testing must be conducted under supervised clinical trials with pre-designed protocols. When most people think of clinical studies, they picture the structured, randomized trials conducted for drug products; however, clinical studies are also required for some high-risk medical device applications. When weighing the relevance of real-world evidence, it helps to remember that medical devices work physically, while drugs work chemically or biologically. For that reason, medical devices are tested during the validation process by the end users or clinical professionals who will actually use them. Some devices are low-risk and easy to use, while others are complex, high-risk devices that demand professional knowledge and/or detailed instructions for use. It is these high-risk devices that require clinical-style testing for FDA approval.
Another factor calling for real-world data is that medical devices are engineered products, so human factors and user-error testing should be conducted during prototype design to confirm the device needs no further improvement before full-scale manufacturing and production begin. Medical device manufacturers know that a device may go through many iterations during development, and if proper procedures are followed, continuous testing is conducted and documented in the Design History File (DHF) to demonstrate that the product is safe and effective for its intended use. It stands to reason, then, that there is even more to learn once a device enters the post-market/post-production stage, or, in layman’s terms, real-world use. Read the full article here…