Validation of Spectrometry Software: A Suggested Integrated Approach to Equipment Qualification and Computerized System Validation

November 1, 2006

Columnist Bob McDowall proposes an integrated approach to equipment qualification and computerized system validation for spectrometers, combining the qualification, day-to-day calibration, and maintenance of the instrument with the validation of the software for a system's intended purpose.


In an earlier installment of this column (1), I bemoaned the fact that we lack an integrated approach to qualifying our laboratory equipment while validating the computerized system that controls it at the same time. Most spectrometers cannot be separated from the computer system that controls them, or they will not work. In my column (1), I was critical of the good automated manufacturing practice (GAMP) Good Practice Guide for the Validation of Laboratory Computerized Systems (GPG) (2) for its overly complex categorization of laboratory systems and the use of an inappropriate risk assessment methodology for what are essentially commercial off-the-shelf (COTS) software or configurable COTS applications. However, my main concern was the fact that the GAMP GPG virtually ignored the qualification of the equipment when undertaking a validation.

R.D. McDowall

In contrast, the American Association of Pharmaceutical Scientists (AAPS) has taken a different approach with its white paper on analytical instrument qualification (3), which has been incorporated into the United States Pharmacopoeia (USP) as general chapter <1058> (4). This white paper looks at the qualification of the instrument and virtually ignores the software.

Therefore, we have two major publications that fail to take an integrated view of equipment qualification (EQ) and computerized system validation (CSV) for laboratory systems. In the April "Focus on Quality" column (1), I outlined some of the steps that I thought should be in place to achieve this integrated approach to the problem:

  • Simpler classification of software based upon the existing GAMP 4 software and hardware categories (5).

  • Integrated qualification and validation terminology.

  • Realistic system implementation life cycle based upon the GAMP GPG (2) model with additions from my April column (1).

  • Writing a combined specification for the whole system: instrument and the controlling software.

  • Using a simplified risk assessment methodology.

In this column, I will give some further thought to this integrated approach and look at the first two items. The life cycle was covered in overview in the April column (1), and the simplified risk assessment methodology was discussed in July's column (6). I will refine the qualification stages of the life cycle when we look at integrated terminology. My aim is for a practical approach and cost-effective solution, and I welcome your comments on this approach and would be happy to discuss them in a future column.

Simpler Classification of Laboratory Instrumentation

Rather than use the impractical and overly complex seven-category classification proposed by the GAMP GPG (2), I propose that we classify the instrument element using the three-group AAPS/<1058> approach (3,4) and the software elements using the GAMP 4 categories (5), giving an overall approach to combined EQ and CSV for the whole system.

Instrument Classification

The majority of spectrometers would fall into <1058> Group C instruments: conformance to user requirements is highly method specific (4). Installation can be complex and require specialist skills (for example, from the vendor). A full qualification is required.

Software Classification

Although software is more complex, it should be classified using the current GAMP 4 software categories to ensure consistency with the rest of the organization rather than use the laboratory classification suggested earlier (2). The software categories are as follows:

GAMP Category 1: Operating Systems

This software will be present when a workstation is used to control a spectrometer. The name and version of the operating system should be recorded along with the service packs as part of the overall configuration of the system. If security patches are installed, these too should be documented. Typically, the operating system is tested implicitly when the overall application is tested.
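As an illustration of the kind of configuration record this implies, the operating-system details can be captured programmatically. This is a minimal sketch; the field names are illustrative, not a prescribed format:

```python
import json
import platform
from datetime import date

def os_configuration_record():
    """Capture operating-system details for the system's configuration
    record: name, release, version (which includes service-pack/build
    information), and a list of installed security patches."""
    return {
        "recorded_on": date.today().isoformat(),
        "os_name": platform.system(),        # e.g. "Windows" or "Linux"
        "os_release": platform.release(),
        "os_version": platform.version(),
        "security_patches": [],              # append patch IDs as installed
    }

record = os_configuration_record()
print(json.dumps(record, indent=2))
```

A record like this would be kept under configuration management and updated whenever a service pack or security patch is applied.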

GAMP Category 2: Firmware

When equipment is controlled by firmware (software incorporated in read-only memory chips), the software is an integral part of the instrument. Therefore, the software elements are implicitly tested as part of the overall equipment qualification and should be assessed under the AAPS/<1058> classification scheme for equipment.

GAMP Category 3: Commercial COTS Software

Software that is installed and needs only configuration to operate in the laboratory environment is Category 3. Typically, all that is required are settings such as the default printer, the location of data file storage if the system is on a network, and the users and their associated user types (security and access control).
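To make this concrete, a minimal sketch of such a configuration record follows; the printer share, storage path, user names, and roles are invented for illustration:

```python
# Illustrative Category 3 configuration record: the handful of settings
# that adapt an installed COTS package to one laboratory's environment.
category3_configuration = {
    "default_printer": r"\\labserver\qc-printer-01",   # hypothetical share
    "data_file_location": r"\\labserver\spectra\raw",  # hypothetical path
    "users": {
        "jsmith": "analyst",         # acquires and processes data
        "mjones": "administrator",   # manages users and settings
    },
}

# Security and access control: list each user with their user type.
for user, role in category3_configuration["users"].items():
    print(f"{user}: {role}")
```

Documenting even this small set of settings gives the baseline against which later configuration changes are controlled.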

GAMP Category 4: Configurable COTS Software

Here, the installed software functions can be modified within parameters set by the vendor, for example via radio buttons that change the way the application functions.

GAMP Category 5: Bespoke or Custom Software

This is software written for a specific function and is unique to a single laboratory or computer system. Therefore, all support and maintenance is the responsibility of the laboratory that writes the code or commissions it. Category 5 software has the highest risk associated with it because there is usually only a single laboratory using the application.

The majority of software used in laboratories is either Category 3 or 4; therefore, support is the responsibility of the vendor, and the users license the application. However, it is also important to understand that for many spectrometers, more than one category of software can exist in a system. For a spectrometer controlled by a workstation, there will be a minimum of two categories and possibly three present:

Category 1: Operating system software such as Windows or Unix for the workstation.

Category 3 or 4: Software used to control the instrument, acquire and process data, and report the final results.

Category 5: Software might exist on the same workstation because some software allows users to write custom macros to perform specific functions or manipulate data, and these macros need to be specified, tested, and controlled as part of an overall validation of the software.
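The mixed-category situation above can be sketched as a simple system inventory; the component names are hypothetical examples for a workstation-controlled spectrometer:

```python
# GAMP 4 software categories, as listed in this column.
GAMP_CATEGORIES = {
    1: "Operating system",
    2: "Firmware",
    3: "Commercial COTS software",
    4: "Configurable COTS software",
    5: "Bespoke or custom software",
}

# Hypothetical inventory for one spectrometer system: each software
# element is recorded with its GAMP category.
system_inventory = [
    ("Workstation operating system", 1),
    ("Spectrometer control and data-processing application", 4),
    ("User-written reporting macro", 5),
]

for component, category in system_inventory:
    print(f"{component}: GAMP Category {category} "
          f"({GAMP_CATEGORIES[category]})")
```

Keeping such an inventory makes it explicit which validation approach applies to each element of the combined system.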

For the purposes of the rest of this column, we will concentrate on GAMP software Categories 3 and 4 controlling a USP <1058> Group C spectrometer (4). The majority of spectrometer systems will fall into this combination.

Combined System Specification for Equipment and Software

Defining the intended use of both the spectrometer and the software is vital in any EQ or CSV, because the two elements are combined for computerized laboratory systems. We need to ensure that an appropriate system specification is available. If purchasing a new system, you might need one specification to select the system and then update it for the specific system purchased, once you are familiar with the instrument and the software. It is against this system specification that the user acceptance testing will be carried out. Therefore, it must be as specific as possible.

Putting It All Together

The overall process is shown in Figure 1. The work undertaken in each phase for both the computer system and the instrument is outlined.

Figure 1

Integrated qualification terminology: The terminology used when we "qualify equipment" or "validate systems" should be developed further, because we use installation qualification (IQ), operational qualification (OQ), and performance qualification (PQ) for both but mean different things depending upon the context in which we use them (6). This is a major cause for confusion between vendors and users. In addition, some vendors use different terminology, such as performance verification (PV) — does this equate to an instrument IQ or OQ? Why can't we standardize and integrate our terminology so that there is no difference and we know exactly what we mean?

Therefore, for the overall process of qualifying either an instrument or a system, we need a definition of qualification. The following is the one I have taken from ICH Q7A (GMP for active pharmaceutical ingredients) (7), modified accordingly.

Qualification: The action of installing and demonstrating that a laboratory instrument or system is properly installed, works correctly, and actually leads to the expected results.

Implicit in this definition is that there will be the appropriate documentation for the initial installation, component integration, and system acceptance during the work. Additionally, if you are to check that the system produces the expected results, then you need to document them in an equipment or system specification.

If qualification is the overall process, it can be broken down further into four phases. Depending upon the nature of the system and the equipment, not all phases need be performed, and some work can be undertaken by a vendor. To avoid confusion, we need new terms that are easily understandable. The four phases of qualification of either equipment or a computerized analytical system are as follows:

Component installation and component integration: The action of installing the items and components of instrumentation and computer hardware and software followed by their integration into a system.

This phase of work would be undertaken by the vendor, aided by the company's IT staff if the data are being stored on the network. Work carried out here would range from unpacking the individual items of equipment and computer hardware that make up the system to installing the individual components (including the operating system) and, where necessary, confirming that they work.

The next stage of the work is the integration of the components to make the overall system by connecting them and checking that they work as a whole. There would be documentation of all aspects of this work, but the equipment and computer system elements would be treated as equals rather than the computer elements being a sideshow to the main emphasis on the instrument. Configuration management of the whole system would start from this point onward.

Vendor commissioning: The action of demonstrating that the equipment and computer system work according to vendor specifications.

As suggested, this activity is undertaken by the vendor to demonstrate the system as part of their overall installation of the components and their integration into a working system. This can include a check of the spectrometer operation using certified reference materials as well as a check that the software operates correctly from their perspective. There must be a holistic or system test of the overall system (both the instrument and the software) to show that it works correctly; this is similar to the suggestion made by Furman and colleagues (8) for computerized high performance liquid chromatography systems. If this does not occur here, then the users need to undertake this in later stages of qualification.

Vendor commissioning could be repeated either in part or in whole as a regular process to show that the system operates correctly over time, or after a major service or maintenance event.

Defining User Ways of Working

This consists of four activities:

  • Instrument qualification

  • Instrument calibration

  • Software and firmware configuration

  • Software customization

In practice, one or more of these activities might be required for a specific system. It is important to realize that if the instrument has been tested adequately earlier in the qualification process by the vendor, it does not need to be retested here. The corollary is that if vendor testing does not match the user requirements, it must be undertaken here. From the software perspective, if customization or configuration is needed, it must be performed here because the vendor usually will not perform this work. Whatever work is undertaken, it goes without saying that it must be documented.

Instrument qualification: Testing the operational parameters of the instrument in the way that it will be used by the laboratory. This will include operating parameters and samples that are specified in the laboratory user requirements.

Instrument calibration: Calibrating the instrument, using traceable standards, within the parameters that the laboratory will employ in routine use. Note that what the vendor has calibrated will not always be appropriate for a specific laboratory; typically, a vendor calibration is a one-size-fits-all approach. Here, the users need to define and calibrate what they really require.

Software and firmware configuration: The action of configuring the software application to laboratory requirements. This activity tends to be overlooked in many validation texts. It occurs to some extent with Category 3 software for application access control, but the main emphasis is the configuration of Category 4 software. Changes made to the application by the users must be documented and, where appropriate, tested in the user acceptance phase. For a spectrometer, it will also include the process of defining and building a spectral library.

Software customization: Customizing the operation of the software, typically by writing custom macros. If macros must be written to help the users automate or speed up work, this occurs here. Typically, these macros must be documented, built, tested, and controlled.
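As a sketch of that specify-build-test cycle for a Category 5 macro, consider a hypothetical data-manipulation macro with its test case recorded alongside it; the function name and behavior are invented for illustration:

```python
def baseline_correct(intensities, baseline):
    """Hypothetical user-written macro: subtract a constant baseline
    from each intensity value, clamping negative results to zero."""
    return [max(value - baseline, 0.0) for value in intensities]

# Documented test evidence for the macro: expected behavior is specified
# up front and checked every time the macro is changed.
assert baseline_correct([10.0, 5.0, 1.0], 2.0) == [8.0, 3.0, 0.0]
print("baseline_correct macro test passed")
```

The point is not the arithmetic but the discipline: the macro's specification, code, and test evidence travel together under change control.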

User acceptance: The action of demonstrating that the overall system works according to the user specification documents (that is, intended purpose) and meets current regulatory requirements. As suggested by the name, this is the responsibility of the users of the system. Users can leverage work performed in earlier phases to avoid repeating work undertaken by the vendor when commissioning the system, provided that the work meets the user specifications. For example, an instrument with Category 3 software could omit this stage if the vendor commissioning tests matched those required by the users and the associated regulatory compliance. However, a Category 4 system that has been configured would be unlikely to have this advantage because the vendor commissioning usually is undertaken on the unconfigured application. Testing in this phase might also use certified reference materials to demonstrate the correct operation of the instrument or system.


In this column, I have outlined what I think should be required for an integrated approach for combined equipment qualification and computerized system validation. In the next column, I will outline a practical example of this approach. I welcome your comments.


The author would like to thank Chris Burgess for critique and comment in the development of Figure 1.

R.D. McDowall is principal of McDowall Consulting and director of R.D. McDowall Limited, and "Questions of Quality" column editor for LCGC Europe, Spectroscopy's sister magazine. Address correspondence to him at 73 Murray Avenue, Bromley, Kent, BR1 3DJ, UK.


(1) R.D. McDowall, Spectroscopy 21(4), 14–30 (2006).

(2) GAMP Forum Good Practice Guide — Validation of Laboratory Computerized Systems (International Society for Pharmaceutical Engineering, Tampa, Florida, 2005).

(3) S.K. Bansal, T. Layloff, E.D. Bush, M. Hamilton, E.A. Hankinson, J.S. Landy, S. Lowes, M.M. Nasr, P.A. St. Jean, and V.P. Shah, Qualification of Analytical Instruments for Use in the Pharmaceutical Industry: A Scientific Approach (American Association of Pharmaceutical Scientists, 2004).

(4) Pharmacopeial Forum, <1058> Analytical Instrument Qualification (January 2005).

(5) Good Automated Manufacturing Practice (GAMP) Guidelines, version 4 (International Society for Pharmaceutical Engineering, Tampa, Florida, 2001).

(6) R.D. McDowall, Spectroscopy 21(7), 20–26 (2006).

(7) ICH Q7A Good Manufacturing Practice for Active Pharmaceutical Ingredients (2000).

(8) W. Furman, R. Tetzlaff, and T. Layloff, J. AOAC Int. 77, 1314–1317 (1994).