All year, we’ve been eagerly awaiting the release of the U.S. Food and Drug Administration (FDA) Center for Devices and Radiological Health (CDRH) draft guidance on “Computer Software Assurance for Manufacturing, Operations, and Quality System Software.” Unfortunately, it appears we’ll have to wait a bit longer, since the document is now listed as proposed guidance for 2020. But just because the draft guidance isn’t out yet doesn’t mean validation experts are staying mum. In fact, between a recent FDA webinar on the subject and an informative session at the 2019 Masters Summit, a few trends are already clear for the new year.
#1: Computer Software Assurance (CSA)
Computer system validation (CSV) is a far more familiar term than CSA, but CSA is the future of software validation. The aforementioned FDA webinar outlined a new approach to validation that focuses on confidence levels, decreasing the burden on regulated companies while maintaining high standards. Traditional CSV follows a waterfall development methodology, which most companies have since replaced with agile development. CSA represents a similar shift from a highly regimented approach to a more adaptable one.
With CSA, the FDA is placing the emphasis on what “directly impacts patients or product quality.” Instead of starting with documentation, the new trend is to start with critical thinking. Ask yourself how you use your system and why you’re doing everything in the validation process. This can be broken down into a four-step approach:
- Identify your intended use.
- Determine your risk-based approach.
- Perform risk and testing activities.
- Create documentation.
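As an illustrative sketch only (not part of any FDA guidance), the risk triage in the four steps above can be expressed in a few lines of Python. The feature names, risk tiers, and assurance activities here are all hypothetical:

```python
# Illustrative sketch of the four-step CSA triage described above.
# Feature names, risk tiers, and assurance activities are hypothetical.

HIGH_RISK_ACTIVITIES = ["scripted testing", "full documentation"]
LOW_RISK_ACTIVITIES = ["unscripted/exploratory testing", "summary record"]

def plan_assurance(feature: str, impacts_patient_or_product_quality: bool) -> dict:
    """Steps 1-2: identify the intended use and determine the risk approach."""
    tier = "high" if impacts_patient_or_product_quality else "low"
    activities = HIGH_RISK_ACTIVITIES if tier == "high" else LOW_RISK_ACTIVITIES
    # Steps 3-4: the chosen tier drives testing first and documentation last.
    return {"feature": feature, "risk_tier": tier, "activities": activities}

print(plan_assurance("active-ingredient dosing control", True)["risk_tier"])   # high
print(plan_assurance("out-of-office notifications", False)["risk_tier"])       # low
```

The point of the sketch is the ordering: risk classification comes first, and documentation is the output of the process rather than its starting point.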
Not every feature in a system needs to be validated at the same level. A feature that lets you set up out-of-office notifications doesn’t require the rigorous testing — or the documentation — of a feature that controls the amount of active ingredient in a pharmaceutical. This risk-based thinking was used to develop MasterControl Validation Excellence (Vx) methodology, which significantly reduces the documentation burden for users by helping them identify which features require more testing.
#2: Continuous Validation
Before you start panicking, this trend is not as scary as it might sound. The concept isn’t that the software itself is being continuously validated; rather, its features and functionality are. A traditional waterfall validation approach is designed to test defects out of a finished product, while continuous validation builds quality directly into the development stage. In a traditional waterfall software development life cycle, each stage of the process sits in a silo: the entire product is developed as a whole, and any change discovered in one stage sends the whole thing back through previous stages. In the meantime, no progress is made on the parts of the software that aren’t being changed. Continuous validation brings the agile development process into validation as well. It looks at the software feature by feature rather than taking an all-at-once approach, and it breaks down the walls between departments so communication is continual. As a result, problems are found and fixed as they occur, and while they’re being fixed, the rest of the product can continue through the development process.
Continually validating individual features lets your product move through development faster with fewer revisions and fewer bottlenecks. This agile approach makes quality part of the process from the beginning and uses automated testing for regression tests to provide constant feedback. Getting this information during the development process lets you adapt as you go along, rather than trying to fix problems retroactively.
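The automated regression testing mentioned above is the mechanical heart of continuous validation. As a minimal sketch using Python’s built-in unittest framework, here is what a per-feature regression suite might look like; the feature function and its dose limit are hypothetical:

```python
import unittest

def dose_within_limit(dose_mg: float, limit_mg: float = 500.0) -> bool:
    """Hypothetical feature under continuous validation: checks that a
    requested dose is positive and does not exceed the configured limit."""
    return 0 < dose_mg <= limit_mg

class DoseLimitRegressionTest(unittest.TestCase):
    # Re-run automatically on every change to this feature, so a defect
    # is caught when it is introduced, not in a final test phase.
    def test_dose_within_limit_passes(self):
        self.assertTrue(dose_within_limit(250.0))

    def test_dose_over_limit_fails(self):
        self.assertFalse(dose_within_limit(600.0))

    def test_zero_dose_rejected(self):
        self.assertFalse(dose_within_limit(0))
```

A suite like this would typically run on every commit (for example, via `python -m unittest` in a CI pipeline), which is how the constant feedback described above gets generated.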
Bonus: Certificate of Validation
It’s too soon to call this a trend, but it is worth keeping an eye on. A certificate of validation would offer a simplified approach during an inspection: instead of showing the inspector the full validation package, the regulated company would provide a certificate of validation from the software vendor as evidence that the system is validated. This is a sharp departure from the approach most companies take today. Again, the concept is made possible by the shift to CSA and its focus on the most critical software systems. Lower-risk systems, such as a quality management system (QMS), might benefit from this approach. The FDA doesn’t want to review any more documentation than it has to and would rather focus on critical software systems. So, in certain circumstances, a certificate of validation may become common practice.
The FDA is ultimately concerned about patient safety, not how many documents a company can produce. Documentation should be the end of the validation process, not the beginning. Even though the FDA has hosted webinars on this topic and industry experts are beginning to get excited about CSA and continuous validation, it will take some time before these concepts become common practice.
If you enjoyed reading about a Masters Summit session, you’ll enjoy attending even more! Registration for 2020 is already open, and by securing your seat now, you’ll save hundreds.
Sarah Beale is a content marketing specialist at MasterControl in Salt Lake City, where she writes white papers, website landing pages, and is a frequent contributor to the company’s blog, GxP Lifeline. Her areas of expertise include the nutraceuticals, cannabis, and food industries. Beale has been writing about the life sciences and health care for over five years. Prior to joining MasterControl she worked for a nutraceutical company in Salt Lake City and before that she worked for a third-party health care administrator in Chicago. She has a bachelor’s degree in English from Brigham Young University and a master’s degree in business administration from DeVry University.
Originally published at https://www.mastercontrol.com.