Volume 22, Issue 7
Columnist Bob McDowall discusses operating system security patches.
As soon as a spectrometer system has been validated, it is placed under a formal change control procedure to review, approve, implement, and validate changes. However, for networked spectrometers, there is one type of change in which there is always a culture clash: rapid installation of operating system security patches to mitigate vulnerabilities. So in the blue corner we have Quality Assurance (QA): the usual approach taken is that you cannot patch a system without completing a change control form, evaluating the change according to the change control procedure. In the red corner we have Information Technology (IT), represented by the security officer, who wants to patch immediately to ensure the security of the network and maintain the integrity of the data generated by the spectrometer.
One way to avoid this conflict is to isolate the laboratory network from the rest of the network within an organization. So when patches are installed via the network for the rest of the organization, the laboratory systems are exempt. This has the advantage of an easy ride for the laboratory and QA because there are no pesky change control forms to complete and no internecine wars with IT. A small downside (or two): you should not access the Internet, because the operating system will not be protected against malware (viruses, Trojans, and so forth), and transferring data to outside systems on floppy disks or USB sticks carries a high potential for data loss through an unpatched vulnerability.
So where can we look for help? The FDA has written a guidance for industry entitled Cybersecurity for Networked Medical Devices Containing Off-the-Shelf Software (2) that covers this topic. The guidance is written by the Center for Devices and Radiological Health (CDRH) and is intended for medical devices and not spectrometers; however, in the response to question 1, it states: "This guidance is addressed to device manufacturers who incorporate off-the-shelf software in their medical devices. However, this information also may be useful to network administrators in health care organizations and information technology vendors."
So the purpose of this "Focus on Quality" column is to give my interpretation of the cybersecurity guidance for operating system patches for networked spectrometers. I must apologize for the misquotation of that well-known computer validation expert, William Shakespeare, in the title but that has never stopped me from having a laugh at another's expense.
The FDA states that a cybersecurity vulnerability exists whenever the OTS software provides the opportunity for unauthorized access to the network or the medical device. The consequence is that this can open the door to unwanted software changes that might have an effect on the safety and effectiveness of the medical device (2). Throughout the whole of this guidance, there is not a single reference to Windows or Microsoft.
This document is in an unusual format for an FDA guidance. Typically there is an introduction followed by the agency's discussion of the guidance; the cybersecurity guidance instead takes the form of 10 questions with 10 corresponding answers. The questions posed are listed in Table I. While all of the questions and their answers are directly relevant to networked medical devices, the big problem is how to interpret them for the pharmaceutical industry and for spectrometer software in particular. As you can see, questions such as "Do I have to report a cybersecurity patch to the FDA?" would not be appropriate for a pharmaceutical network. I will interpret some of the key questions here, but if you want to check, please read the cybersecurity guidance itself because it is the source document from which I have derived this column.
Table I. Questions from the FDA Cybersecurity Guidance
To help with the interpretation, let us first look at the roles and responsibilities that are discussed in the guidance. For medical devices, we have three roles with their outline responsibilities:
At first sight, there does not appear to be much that we can do to apply this to a laboratory or even to a pharmaceutical environment. Who are you going to call when you have a problem? Unfortunately, it's not Ghostbusters or all our problems could be solved. Not even a medical device manufacturer could help, especially when you haven't purchased their system. However, look at the problem and think about the situation in your organization — you don't have a medical device vendor but you do have an IT department.
Therefore, one quick copy and paste later and we have the following interpretation for a laboratory environment:
So instead of three different organizations to coordinate, we have just two: the software vendor and the pharmaceutical company. Even if the IT function is outsourced to a third party, the bright sparks that wrote the service level agreement will have put into the document the process to be followed to ensure that network vulnerabilities will be addressed in a controlled manner — won't they?
Based upon the interpreted roles and responsibilities listed earlier, a patching process could be as follows:
A patch is released by the software vendor to fix a vulnerability in the operating system. At the same time, there will be release notes to understand which versions of the operating system are impacted and how critical the vulnerability is. This might range from no impact to critical, depending upon the version of the operating system.
The patch should be downloaded from the vendor's web site but not installed by the IT department.
The patch should be evaluated by the IT staff (possibly a security group in a larger organization, or one of the designated people in a smaller IT group). One advantage of outsourcing is that the external company needs just one security group to cover all of the companies it serves, which might be far more efficient than the arrangements within some pharmaceutical companies.
Read the release notes and understand the problem that the patch is trying to fix. Note the use of the word trying: not all patches are successful, and some need a second version to correct problems introduced by the first. Reading the notes is important because the vulnerability might not apply to your version and service pack. In that case, why install software you don't really need?
Assuming that you do need to install the patch, how critical is it, based upon the release notes? Some patches must be installed immediately because the risk is too great: malicious software exploiting the problem can start to appear within a day of the patch being released. For most patches, however, there is a little time to install the patch in a test environment and confirm that the operating system still works and that the patch does not impact one or two key applications. Note that you will not be able to evaluate the patch extensively.
The prudent IT department also will have a rollback position just in case anything goes wrong.
Then, if no problems have been seen, the patch can be distributed either by hand or, more likely, automatically via the network to specific servers and workstations to close the vulnerability.
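The evaluation steps above can be sketched as a simple triage routine. This is a minimal illustration of the logic only, not a real patch-management tool; the `Patch` record, its field names, and the severity levels are my assumptions, not anything prescribed by the guidance:

```python
from dataclasses import dataclass


@dataclass
class Patch:
    """Hypothetical record built from a vendor's release notes."""
    name: str
    affected_versions: set          # OS versions the vulnerability applies to
    severity: str                   # assumed scale, e.g. "low" to "critical"


def triage(patch: Patch, installed_version: str) -> str:
    """Decide how to handle a security patch, following the process above."""
    # Step 1: read the release notes -- does the vulnerability apply to us?
    if installed_version not in patch.affected_versions:
        return "skip"               # why install software you don't need?
    # Step 2: a critical vulnerability may need immediate roll-out,
    # because exploits can appear within a day of the patch's release
    if patch.severity == "critical":
        return "install immediately (with rollback plan)"
    # Otherwise: test first, check key applications, then deploy via the network
    return "install in test environment, verify key applications, then deploy"


print(triage(Patch("MS-001", {"SP2"}, "critical"), "SP1"))  # prints "skip"
```

Even in this toy form, the point of step 1 stands out: the release notes can tell you that a patch simply does not apply to your installation, saving the whole evaluation effort.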
End of story? Not quite. If you read through the process above, you'll see that there is no mention of change control. So if we followed this process, how do we keep IT and QA in their respective red and blue corners happy and avoid them fighting?
The snappy answer is by working smarter and not harder. European Union GMP Annex 11 (covering computerized systems) states in clause 11, covering change control: "Such an alteration should only be implemented with the agreement of the person responsible for the part of the system concerned, and the alteration should be recorded. Every significant modification should be validated" (3).
From the first sentence, we can interpret that the system owner is responsible for the spectrometer and the application plus the electronic records generated, and the head of IT is responsible for the network and infrastructure. Therefore, we need two separate but closely linked change control procedures: one for the IT infrastructure and the other for the regulated applications. The application change control SOP will ensure that changes made at the application level are appropriately controlled as well as approving significant changes in the infrastructure that impact the validated status of the application. In contrast, the infrastructure SOP covers the changes that are made by IT.
Part of the IT change control procedure will be to assess whether the change is major (impact on regulated applications) or minor (no impact on regulated applications). If the change is assessed as minor, it can be processed and implemented without reference to the spectrometer system owner or to QA.
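As a minimal sketch, the major/minor assessment within the IT change control SOP might be expressed like this; the criteria and category names are illustrative assumptions drawn from the discussion, not a prescribed rule set:

```python
def classify_change(change_type: str, impacts_regulated_app: bool) -> str:
    """Classify an IT infrastructure change under the two-SOP model.

    Minor changes proceed under IT change control alone; major changes
    also need the system owner and QA. Criteria here are illustrative.
    """
    if impacts_regulated_app:
        return "major: route to system owner and QA for approval"
    if change_type == "service pack":
        # Service packs are broad enough to impact any business
        # or regulated application, so they are always major
        return "major: route to system owner and QA for approval"
    return "minor: process under IT change control only"


# A typical OS security patch with no application impact stays with IT
print(classify_change("security patch", False))
```

The design point is that the routing decision is made once, inside the IT SOP, so QA involvement is triggered by impact rather than by every patch.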
Virtually all operating system security patches will fall into the minor category because they fix portions of the operating system remote from the regulated application and are unlikely to affect its operation. This would not be true of an operating system service pack, which will have a major impact on any business or regulated application. Therefore, as long as IT follows a documented process, patching can proceed separately from the application.
Usually we do not need to validate the application after patching the operating system. Why, you might ask? Let's go back to the regulations. Look at the last sentence quoted earlier from Annex 11 (3): "Every significant modification should be validated." Are security patches "significant"? In my opinion they are not, and therefore you do not need to revalidate the application.
What does the FDA think about this? Table I contains a specific question: "Should I validate the software changes made to address cybersecurity vulnerabilities?" The last sentence of the answer states: "For most software changes intended to address cybersecurity vulnerabilities, analysis, inspection, and testing should be adequate and clinical validation should not be necessary." Therefore, evaluation and testing of the patch within IT should be sufficient, and validation of the application should not be necessary.
I have presented my interpretation of the FDA's cybersecurity guidance for networked medical devices containing off-the-shelf software for spectrometers operating in a networked environment. A common sense approach to the interpretation of the guidance and applicable regulations provides a process for evaluating, testing, and installing operating system patches while maintaining the spectrometer's validated status.
R.D. McDowall is principal of McDowall Consulting and director of R.D. McDowall Limited, and "Questions of Quality" column editor for LCGC Europe, Spectroscopy's sister magazine. Address correspondence to him at 73 Murray Avenue, Bromley, Kent, BR1 3DJ, UK.
(1) R.D. McDowall, Spectroscopy 19(5), 50–54 (2004).
(2) FDA Guidance for Industry, Cybersecurity for Networked Medical Devices Containing Off-the-Shelf Software, January 2005.
(3) European Union Good Manufacturing Practice, Annex 11 (2006).