R.D. McDowall is principal of McDowall Consulting and director of R.D. McDowall Limited, and "Questions of Quality" column editor for LCGC Europe, Spectroscopy's sister magazine.
Using information provided by guidance documents from outside the spectroscopy laboratory can be very useful when trying to meet the regulations that we must follow.
In May 2007, the FDA released the final version of a guidance document that many of you may be unaware of: Computerized Systems Used in Clinical Investigations (1). So, what is the relevance to users of spectroscopy software? The answer is simple. In the absence of a re-release of 21 CFR 11, this document provides the current formal thinking of the FDA on security and access control applied to computerized systems. It is also important because of the Able Laboratories fraud (2,3) and the impact that this case will have on the FDA's new approach to Pre-Approval Inspections (4), which now focuses on the determination of data integrity and the detection of fraud. Therefore, security and access control become prime areas that must be controlled to prevent fraud and to establish and maintain data integrity.
This guidance document has a long history: it was first released as a draft in April 1997 and finalized in April 1999. The Good Clinical Practice (GCP) guidelines are very vague when it comes to discussing computerized systems. In light of the FDA's reassessment of Part 11 in 2003, the clinical guidance document was reissued as a second edition in September 2004, and in May 2007 the final version was released under a slightly modified title that extends the scope of the document, which need not concern us in this discussion.
Also of interest is the minimum list of standard operating procedures (SOPs) required for computerized systems (listed in Appendix A of the document). This is shown in the left-hand column of Table I; of course, you will need to interpret this list with regard to its relevance to an analytical laboratory. My interpretation is shown in the right-hand column; it is my attempt to demonstrate that, far from being only of use in a clinical environment, these SOPs can also be useful in a spectroscopy laboratory with a modicum of thought. As we shall see, this list of minimum SOPs is a common-sense approach that is consistent with current good practice in a nonregulated environment (for example, ISO 17799:2005) (5) and should not be thought of as the onerous requirements of a regulatory agency.
However, the main aim of this column is to look in more depth at security and access controls because these will be under more regulatory scrutiny in the future. The parts of the FDA clinical guidance that I would like to discuss are the following sections:
D. Internal Security Safeguards – item 1. Limited Access
E. External Security Safeguards
I'll present the points made by the FDA in my sequence rather than as presented in the guidance because, in my view, my way is more logical. This is because the FDA writers have distributed facets of the same item between either internal or external security safeguards. I just prefer to take a different perspective of access control and logical security, which is a combination of these two. I'll make reference in the following text back to either Section D or E of the original guidance document so that you can trace my thought process to the source document. Each item we discuss can be a single SOP, or the topics can be distributed among several procedures — the choice is yours.
It is also important to realize that this is a guidance document that contains in bold type at the top of each page the words "Contains Non-Binding Recommendations," and alternative approaches to those outlined are acceptable.
Table I: List of standard operating procedures and interpretation for an analytical laboratory
The first part of system security is to limit access to any of your systems to only those people who need to use them. Section D1 of the guidance states: "Access must be limited to authorized individuals" (21 CFR 11.10(d)). This is a basic requirement not only of 21 CFR 11 and of both the FDA and EU GMP and GLP regulations, but also of ISO 17799; the latter standard provides much practical advice about how to implement many of the security requirements for our computerized systems and comply with the regulations.
Section E also notes: "Staff should be kept thoroughly aware of system security measures and the importance of limiting access to authorized personnel." This highlights the importance of initial and ongoing training of users, in this as well as the other topics of this column.
The guidance simply states: "We recommend that each user of the system have an individual account" (D1). This is simply common sense because it is the only way that any computer system can uniquely identify an individual and his or her actions. It is the heart of the requirements for data integrity and authenticity. In cases in which user identities have been shared, the system cannot differentiate between users, and data integrity is severely compromised. Such was the case with the Concord Laboratories warning letter issued by the FDA in July 2006 (6). Here, the laboratory managers logged on and set up chromatography systems that the staff used for analysis. The problem was that user accounts were shared and staff used the managers' accounts; therefore, the individual responsible for any changes could not be identified; hence, the unrequited love letter from the FDA.
Individual user accounts are also consistent with the requirements outlined in ISO 17799. The standard goes further and recommends that, before a user is allowed access to a system, there is a formal authorization process initiated by the head of function, and that the access privileges granted are appropriate to the user's actual need. On this last point, a user can have restricted access while training on the system, with increased privileges coming once the skills to use the system correctly have been acquired.
Here the FDA requires a means, either inside the system or external to it, to maintain "a cumulative record that indicates, for any point in time, the names of authorized personnel, their titles, and a description of their access privileges" (E). Why, you may ask, would the FDA request this? Again it comes down to the principles of data integrity. If an audit trail entry uses the user identity to link actions to users, that identity alone might not contain sufficient detail to identify the individual; hence, the recommendation for a cumulative list. However, consider this further: when was an individual first granted access to the system, and when was that access terminated? This information will aid the integrity of data enormously.
Ideally, the data system should be able to provide this information; however, most do not because users do not request this function. So, to meet this recommendation, most system administrators will be left with a paper alternative: a list compiled in either Excel or Word that is updated regularly. The information for each user should include:
- the user's name and job title;
- a description of the access privileges granted;
- the date access was granted; and
- the date access was terminated, where applicable.
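Even when kept outside the data system, such a cumulative register can be given enough structure to answer the FDA's question of who had access at any point in time. The following is a minimal sketch in Python; the record fields follow the guidance, but the names, dates, and helper function are purely illustrative.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class UserRecord:
    """One row of the cumulative access register; entries are never deleted."""
    name: str
    title: str
    privileges: str                    # description of access privileges granted
    granted: date                      # date access was granted
    terminated: Optional[date] = None  # None while the account is still active

def authorized_on(register: list[UserRecord], when: date) -> list[str]:
    """Names of everyone who held access on a given date."""
    return [r.name for r in register
            if r.granted <= when and (r.terminated is None or when < r.terminated)]

# Build the register cumulatively: access is ended by stamping a
# termination date, never by deleting the record.
register = [
    UserRecord("A. Analyst", "Analyst", "run and process methods", date(2006, 1, 9)),
    UserRecord("S. Admin", "Laboratory Manager", "full administration", date(2005, 6, 1)),
]
register[0].terminated = date(2007, 3, 1)
```

Because records are only ever added or date-stamped, the register remains a cumulative history rather than a snapshot, which is the point of the recommendation.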
In addition to having unique user accounts, passwords must not be shared. As the FDA notes: "Individuals should work only under their own password or other access key and not share these with others" (I). This is also a requirement of most organizations' corporate computing or security policies, so it is not merely a regulatory requirement. In some organizations, staff can be dismissed if they are caught sharing passwords.
Passwords should also be changed regularly: "We also recommend that passwords or other access keys be changed at established intervals commensurate with a documented risk assessment" (I). This is good practice. However, do we really need a documented risk assessment when our corporate security standards usually define password length, structure (such as number of characters and composition), and expiry time? My suggestion would be to follow corporate standards, because these should contain sufficient requirements to meet the intent of this guidance.
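To illustrate the point about relying on a corporate standard, such composition rules can be expressed as a simple automated check. The specific rules below (minimum eight characters, mixed case, at least one digit) are hypothetical, not taken from the guidance or any particular standard.

```python
import re

def meets_policy(password: str) -> bool:
    """Check a password against a hypothetical corporate composition standard."""
    return (len(password) >= 8                           # minimum length
            and re.search(r"[A-Z]", password) is not None  # at least one upper case
            and re.search(r"[a-z]", password) is not None  # at least one lower case
            and re.search(r"\d", password) is not None)    # at least one digit
```

A system that enforces such rules at password change, together with a corporate expiry interval, already meets the intent of the recommendation without a per-system risk assessment.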
The guidance continues: "The user should log into that account at the beginning of a data entry session, input information (including changes) on the e-record, and log out at the completion of the data-entry session. The system should not allow an individual to log onto the system to provide another person access to the system" (I).
The recommendation here is to have a technical control to prevent sharing accounts. The main way this would be met is to have the user input his or her personal password at key stages of the workflow. If this seems a little prescriptive, remember that the guidance is aimed at a clinical environment in which many of the clinical investigators might not have basic compliance training.
However, here's a good one for translation to the laboratory: "When someone leaves a workstation, the person should log off the system. Alternatively, an automatic logoff might be appropriate for long idle periods. For short periods of inactivity, an automatic screen saver can prevent data entry until a password is entered" (I).
The password-protected screen saver is fine in principle, but it can be poor in practice unless most of your systems are networked. On a standalone system, a screen saver that activates during an analysis has the potential to stop the system from working, and an automatic logoff might stop your application from working altogether! Secure, but you don't get much work done.
In contrast, many networked data systems can be used like this because the instrument can be set up to run from a data or instrument server, and the user's session can be stopped or terminated without impacting the analysis. Also, users of networked data systems should be trained to lock their screens if they will be away from their terminals for any length of time.
A recommendation of the guidance is that "the system should be designed to limit the number of login attempts and to record unauthorized access login attempts" (I). This sounds fine in principle, but what is an unauthorized access? In the context of a spectroscopy laboratory, there usually is physical security to restrict unauthorized access to the site and sometimes to the laboratory itself. Unauthorized access attempts will therefore come either from untrained users of the software within the laboratory or from people external to it. Therefore, a system must log or audit-trail all access attempts, both successful and unsuccessful. Within a single application, the maximum number of unsuccessful attempts allowed before an account locks is normally three or four. On a regular basis, a system administrator should review the unsuccessful attempts to determine whether any action needs to be taken.
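The lockout and logging behavior described above can be sketched in a few lines of Python. This is a minimal illustration only, assuming a three-attempt limit and an in-memory audit log; a real data system would persist the log and require an administrator to unlock the account.

```python
MAX_ATTEMPTS = 3  # illustrative limit; three or four is typical

class Account:
    def __init__(self, name: str, password: str):
        self.name = name
        self._password = password
        self.failed = 0
        self.locked = False
        self.audit: list[tuple[str, bool]] = []  # (account, success) per attempt

    def login(self, password: str) -> bool:
        success = (not self.locked) and password == self._password
        self.audit.append((self.name, success))  # log successes AND failures
        if success:
            self.failed = 0
        else:
            self.failed += 1
            if self.failed >= MAX_ATTEMPTS:
                self.locked = True               # administrator must unlock
        return success
```

Note that every attempt, successful or not, lands in the audit list, which is what gives the administrator something to review on a regular basis.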
Rather than accessing a system via the application, a more surreptitious way of accessing data is to use the file manager of the operating system or a web browser. Therefore, the guidance has the following to say: "Procedures and controls should be put in place to prevent the altering, browsing, querying, or reporting of data via external software applications that do not enter through the protective system software" (E).
This was already cited in a 483 observation issued to the Gaines Chemical Company, when an inspector reviewed a data system that used operating system files rather than a database to manage the data files generated (7). The observation noted that there were no validation data to demonstrate that an authorized user of the corporate wide area network (WAN) did not have access to analytical data on the laboratory's local area network (LAN).
Therefore, some controls are required to prevent unauthorized users from accessing the data. These can include write-protecting drives so that modified files can be saved only under a different name, or hiding laboratory network drives from the general user community. Training is also a major element here in ensuring that users do not access data files via the operating system rather than through the application.
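On systems that support POSIX permissions, the write-protection described above can be applied at the operating-system level. The snippet below is a minimal sketch; the file name and content are illustrative, and a Windows network would typically use NTFS access control lists and hidden shares instead.

```python
import os
import stat
import tempfile

def write_protect(path: str) -> None:
    """Remove write permission for everyone, leaving the file readable.

    A user who wants to modify the data is then forced to save the
    result under a different name, preserving the original file.
    """
    os.chmod(path, stat.S_IRUSR | stat.S_IRGRP | stat.S_IROTH)  # mode 0o444

# Illustrative example: create a data file, then lock it down.
with tempfile.NamedTemporaryFile("w", suffix=".dat", delete=False) as f:
    f.write("raw spectrum data")
    path = f.name
write_protect(path)
```

In practice such permissions would be set by the data system or by a scheduled administrative job at the point a run is completed, not by the analyst.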
Finally, there is also a requirement for antivirus software: "Controls must be implemented to prevent, detect, and mitigate the effects of computer viruses, worms, or other potentially harmful software code on study data and software" (E).
These are basic controls that are essential in today's computing environment, unless you have a standalone system in the laboratory that is not connected to a network and the movement of files to and from it is checked for malicious software. However, when software of this nature is used, it is important that it is adequately documented. Also, antivirus definition updates should be excluded from the scope of your change control SOP because regular updating is the normal mode of operation of this software.
This column is intended as a summary of the security and access control requirements of a new FDA guidance on computerized systems in clinical investigations that has been interpreted for the analytical laboratory.
(1) FDA Guidance for Industry, Computerized Systems Used in Clinical Investigations, May 2007.
(2) Able Laboratories 483 Observation, July 2005.
(3) R.D. McDowall, Quality Assurance J. 11(1), 26–35 (2006).
(4) Gold Sheet, August 2007.
(5) ISO 17799:2005, Information Security Management, ISO, Geneva.
(6) Concord Laboratories, FDA Warning Letter, July 2006.
(7) Gaines Chemical Company, 483 Observations, December 1999.
Address correspondence to R.D. McDowall at 73 Murray Avenue, Bromley, Kent, BR1 3DJ, UK.