Software Features to Improve Quality Control and Data Validation in the Inorganic Laboratory

Article

Spectroscopy
July 1, 2009
Volume 24, Issue 7

One of the most difficult tasks in any laboratory is the validation and assurance of all data being reported. Whether or not a regulatory agency mandates it, it is imperative that the quality of data from any analysis be controlled. How do laboratory workers ensure the quality of their reported analyses, and how do they demonstrate that quality?

In the case of regulated laboratories, rigorous guidelines exist to demonstrate initial capability as well as to assess ongoing performance. Regardless of the technique being employed, whether atomic absorption (AA), inductively coupled plasma-optical emission spectroscopy (ICP-OES), or ICP-mass spectrometry (ICP-MS), the tools used to demonstrate capability and performance might include the following (two of the underlying calculations are sketched after the list):

  • Calibration range determination

  • Quality control sample analysis

  • Determination of method detection limits

  • Periodic analysis of reagent blanks, fortified blanks, and calibration standards

  • Sample replicate analysis

  • Recovery of fortified samples.
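
Two of the checks listed above, spike recovery and the method detection limit, reduce to short calculations. The Python sketch below is a minimal illustration, not any vendor's implementation; the analyte, all concentrations, and the Student's t value are hypothetical.

```python
# Hypothetical values for illustration only.

def percent_recovery(spiked_result, unspiked_result, spike_added):
    """Recovery (%) of a fortified (spiked) sample: the fraction of the
    added spike that was actually measured."""
    return 100.0 * (spiked_result - unspiked_result) / spike_added

def method_detection_limit(replicates, t_99):
    """EPA-style MDL: standard deviation of n replicate low-level spikes
    times the one-sided 99% Student's t for n - 1 degrees of freedom."""
    n = len(replicates)
    mean = sum(replicates) / n
    variance = sum((x - mean) ** 2 for x in replicates) / (n - 1)
    return t_99 * variance ** 0.5

# Seven replicate 0.010 mg/L Pb spikes (made-up numbers); t = 3.143 for n = 7
pb_replicates = [0.0094, 0.0102, 0.0089, 0.0107, 0.0096, 0.0101, 0.0093]
print(f"MDL: {method_detection_limit(pb_replicates, t_99=3.143):.4f} mg/L")
print(f"Recovery: {percent_recovery(0.052, 0.003, 0.050):.1f}%")
```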

Each laboratory is required to maintain performance records detailing this information as well as any trends that might occur. For this record keeping, most laboratories use some type of control charting mechanism. These can range from simple spreadsheet procedures to advanced standalone software packages. In some cases, users can transfer data directly from the instrument workspace into the charting package, but in others, it might require manual entry. In the case of small workloads and limited analytes, manual data entry is manageable. However, in most laboratories where the number of samples can be high and the number of analytes for each sample varies, this can be a daunting task.
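
Where a laboratory builds its own charting, the logic behind a basic Shewhart-style control chart is short. The sketch below, with made-up recovery values and not tied to any particular package, computes a center line with warning and control limits and flags points that fall outside them.

```python
# Hypothetical QC recovery values (%). Limits follow the usual Shewhart
# convention: warning at +/- 2 and control at +/- 3 standard deviations
# around the mean of the historical data.
recoveries = [98.2, 101.5, 99.7, 102.3, 97.8, 100.4, 96.9, 103.1, 99.2, 100.8]

n = len(recoveries)
mean = sum(recoveries) / n
sd = (sum((r - mean) ** 2 for r in recoveries) / (n - 1)) ** 0.5

warn_low, warn_high = mean - 2 * sd, mean + 2 * sd
ctrl_low, ctrl_high = mean - 3 * sd, mean + 3 * sd

print(f"center line {mean:.1f}%, control limits {ctrl_low:.1f}-{ctrl_high:.1f}%")
for run, r in enumerate(recoveries, start=1):
    if not ctrl_low <= r <= ctrl_high:
        print(f"run {run}: {r}% is outside the control limits")
    elif not warn_low <= r <= warn_high:
        print(f"run {run}: {r}% is outside the warning limits")
```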

Recent advances in software for AA and ICP-OES ease this burden of quality control and validation. Several checks can be sequenced automatically through the software, including QC samples, blanks, duplicates, and fortified or spiked samples. During the analysis, these checks will run as scheduled and the recoveries or deviations will be calculated and stored automatically. The sequence of these check samples within the analysis can be viewed at any time through an Analysis Control Screen (Figure 1). Data are stored along with the method used during the analysis to allow users to archive, export, report, or chart as necessary.

Figure 1: Analysis run list.

A powerful QC Charting option is accessible from within the software to generate charts easily from stored data. This charting wizard automatically accesses stored data to help the user chart many different time-based parameters, such as QC recoveries, blank values, and standard responses (Figure 2). The collected QC data also can be exported for use in other data packages.

Figure 2: Quality control chart.

Many laboratories use this charting function to generate charts on a routine basis, whether monthly or quarterly. Because the data are imported directly into the program, the inherent difficulties of manual entry are avoided. The function also provides a high degree of flexibility, allowing users to chart exactly the parameters required by their particular regulation or internal procedure.
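
As a sketch of what "use in other data packages" can look like, the snippet below reads an exported QC file and redraws a control chart with matplotlib. The file name and column layout ("date", "analyte", "recovery") are assumptions made for the example, not a documented export format.

```python
# Assumes QC results were exported to a CSV with hypothetical columns
# "date", "analyte", and "recovery"; the layout is illustrative only.
import pandas as pd
import matplotlib.pyplot as plt

qc = pd.read_csv("qc_export.csv", parse_dates=["date"])
pb = qc[qc["analyte"] == "Pb"].sort_values("date")

mean = pb["recovery"].mean()
sd = pb["recovery"].std()

plt.plot(pb["date"], pb["recovery"], marker="o")
plt.axhline(mean)                           # center line
plt.axhline(mean + 3 * sd, linestyle="--")  # upper control limit
plt.axhline(mean - 3 * sd, linestyle="--")  # lower control limit
plt.ylabel("QC recovery (%)")
plt.title("Pb QC recoveries with 3-sigma limits")
plt.savefig("pb_qc_chart.png")
```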

Universal Data Acquisition

In the case of ICP-OES specifically, the validation of data can be even more critical. As a multielement technique, ICP lends itself well to laboratories with large sample loads and a variety of samples spanning a wide range of analyte concentrations. This can compound the issues of validating data and meeting the needs of sample submitters. With the advent of solid-state detectors, acquisition of all wavelengths and background information can be realized easily. However, most laboratories still use only one wavelength per analyte, whether from historical preference or time constraints. Restricting an analysis to one wavelength per analyte and acquiring data for only a subset of analytes can make data review and presentation easier, but it also raises several questions:

  • Was the most appropriate wavelength used for these samples?

  • What if additional analyte information is needed in the future for this sample?

  • What is the best wavelength to use for the required detection or reporting limits?

  • How can the correct results be assured with the many possible interferences in emission spectroscopy?

The incorporation of active universal data acquisition (UDA) into the software addresses these problems by allowing for the simultaneous acquisition of all available wavelengths all of the time. UDA does not require a separate mode of analysis and can be turned on or off by the user as desired.

Advantages of Universal Data Acquisition

Data can be validated easily if UDA has been used to collect a set of results. If any uncertainty exists in an analysis result, it can be verified without ever rerunning the sample. In fact, multiple wavelengths can be reprocessed for each element to determine or confirm the correct result. This is done after data collection and can even be performed in an "off-line" mode, so the instrument can continue running other batches of samples with no loss of productivity.
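
A minimal sketch of the idea follows, assuming the full emission spectrum for each sample has been stored along with a linear calibration for every candidate line. The wavelengths are real Pb lines, but all intensities, slopes, and intercepts are invented for illustration.

```python
# Off-line reprocessing sketch: recalculate a result at an alternate
# wavelength from stored data, without rerunning the sample.
# All intensities and calibration values are hypothetical.

stored_spectrum = {220.353: 15400.0, 217.000: 9800.0}   # nm -> corrected counts
calibrations = {                                         # nm -> (slope, intercept)
    220.353: (1540.0, 120.0),                            # counts per (mg/L), counts
    217.000: (985.0, 60.0),
}

def reprocess(wavelength_nm):
    """Concentration from the stored intensity at the chosen line."""
    slope, intercept = calibrations[wavelength_nm]
    return (stored_spectrum[wavelength_nm] - intercept) / slope

for wl in (220.353, 217.000):
    print(f"Pb at {wl} nm: {reprocess(wl):.2f} mg/L")
```

Agreement between the two lines confirms the reported value; a disagreement would point to an interference on one of them.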

Whether a laboratory is analyzing internal samples or those for an outside client, the potential for additional analyte requests exists. In some cases, the results reported initially might indicate that further analytical testing is needed. At other times, changes to regulations might prompt a reinvestigation of previous samples for new analytes. These requests might occur hours, days, or even years later, when the original samples might be long gone. UDA provides the capability to reprocess the original data to include these new analytes without the need to rerun the samples. Using a suite of calibration standards (containing all potential elements) for every analysis ensures that calibration information will be available for these future requests. Even if the analyte requested was not measured in the initial analysis, it could be added to the method later and the collected UDA spectra could be reprocessed to report a quantitative result.

Another common issue is determining whether the wavelength being used has an appropriate reporting limit for the analysis. Once again, most users run one wavelength and establish a reporting limit for that wavelength only. With UDA, users can determine thousands of reporting limits at once and simply choose the best one for the analysis. If the work of establishing instrument detection limits, method detection limits, or reporting limits has already been performed for a particular analyte with UDA enabled, that data set can be reused, without rerunning, to investigate alternate wavelengths. Additional wavelengths for the elements of interest can be added to the method and the data reprocessed to determine an appropriate wavelength for use.
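
The selection itself is a small calculation once replicate blank readings are available for each candidate line. The sketch below estimates a detection limit for several Pb lines as three times the standard deviation of the blank readings divided by each line's sensitivity, then picks the lowest; every number is invented for the example.

```python
# Estimate a detection limit per candidate wavelength and pick the
# best one. Readings and sensitivities are hypothetical.
from statistics import stdev

blank_intensities = {                    # nm -> replicate blank readings (counts)
    220.353: [118, 123, 120, 125, 117, 121, 119],
    217.000: [60, 66, 58, 63, 61, 65, 59],
    283.306: [40, 47, 42, 49, 44, 46, 41],
}
sensitivities = {220.353: 1540.0, 217.000: 985.0, 283.306: 410.0}  # counts per mg/L

detection_limits = {
    wl: 3 * stdev(readings) / sensitivities[wl]
    for wl, readings in blank_intensities.items()
}

for wl, dl in sorted(detection_limits.items(), key=lambda item: item[1]):
    print(f"{wl} nm: {dl * 1000:.1f} ug/L")
best = min(detection_limits, key=detection_limits.get)
print(f"lowest estimated detection limit: Pb {best} nm")
```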

In a typical analysis, users must contend with thousands of emission lines that might interfere with the chosen analyte wavelength. Many regulated analyses require some type of interference correction, and it is important to confirm that the correct line is being used for the sample matrix being analyzed. Building these interference correction models can be very time consuming, and wavelength selection becomes even more critical. If the model being used is not correct, the entire method development process might need to be repeated with a new interference wavelength. If UDA is utilized in the process of building the interference correction model, new correction factors can be calculated simply by reprocessing a different wavelength without rerunning the solutions.
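
A classic way to express such a model is an interelement correction (IEC) factor: the apparent analyte concentration produced by a pure single-element interferent solution, divided by that interferent's concentration. The sketch below uses this general scheme with invented numbers; with UDA, the same single-element solutions can simply be reprocessed at a new analyte line to regenerate the factor without rerunning them.

```python
# Interelement correction (IEC) sketch; all values are hypothetical.

def iec_factor(apparent_analyte, interferent_conc):
    """Factor from a single-element interferent standard: apparent
    analyte concentration per unit of interferent."""
    return apparent_analyte / interferent_conc

def corrected_result(measured_analyte, factor, interferent_in_sample):
    """Subtract the interferent's contribution from the raw result."""
    return measured_analyte - factor * interferent_in_sample

# Suppose 100 mg/L Al reads as 0.25 mg/L apparent As at the As line in use
factor = iec_factor(apparent_analyte=0.25, interferent_conc=100.0)

# A sample reads 0.40 mg/L As but also contains 50 mg/L Al
print(corrected_result(measured_analyte=0.40, factor=factor,
                       interferent_in_sample=50.0))   # -> 0.275 mg/L As
```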

Conclusion

Laboratories performing routine as well as nonroutine analyses vary with regard to regulatory requirements, sample workload, contract limits, customer requests, and user expertise. This variation is also visible in how heavily they depend on software. In an effort to meet the complex needs of these labs, several new advancements in software for AA and ICP-OES have been introduced. Using any or all of these features will help the inorganic laboratory meet the needs of its customers by validating data and demonstrating quality control.

Laura Thompson is ICP-OES and Sample Handling Business Manager within the Inorganic Business Unit of PerkinElmer, Inc., Shelton, Connecticut. She has been with PerkinElmer for 10 years and holds a master's degree in analytical chemistry from North Carolina State University.
