Calibration Transfer, Part VI: The Mathematics of Photometric Standards Used for Spectroscopy

Oct 30, 2014
Volume 29, Issue 11

Photometric accuracy and precision, termed reproducibility and repeatability, respectively, are essential for building consistent large databases over time for use in qualitative searches or quantitative multivariate analysis. If the spectrophotometer in use is inconsistent in linearity and photometric accuracy, the analytical precision and accuracy will be jeopardized over time. Photometric accuracy and linearity drift over time within a single instrument and between instruments, creating errors and variation in the accuracy of measurements that use databases collected with different photometric registrations. How do current commercial instruments vary with respect to photometric accuracy and precision over time, and what are potential solutions to this challenge?

Photometric precision and accuracy, when measured with a defined protocol, are termed photometric repeatability and reproducibility, respectively. The photometric stability within a single instrument, and between multiple instruments over time, is important for seamless transfer of multivariate calibrations and for the unbiased application of qualitative search libraries. In a previous installment (1), we discussed the importance of the stability of the wavelength axis of spectrophotometers and compared several leading instruments for differences. Data integrity depends on the accuracy and precision of both the X (wavelength) and Y (photometric) axes. Multiple factors affect the photometric stability of spectrophotometers: the alignment and mechanical tolerances of optical elements; detector noise characteristics and amplitude; detector linearity; detector electronics, including gain and offset settings; signal amplifier noise; and sample presentation repeatability and reproducibility (2,3).

Accurate and precise photometric axis alignment is a complex issue for commercial instrument manufacturers and design engineers. As always, quality of manufacturing, design tolerances, quality of components, and maintenance of stringent system calibration protocols are required for photometric stability. As is the theme of this series of columns, the appropriate use of primary and secondary reference standards and materials is also essential to superior metrological performance in spectroscopy.

Different Approaches to Alignment of the Photometric Axis

Approaches to aligning the photometric accuracy of any spectrophotometer may involve measuring a standard reference sample and correcting the instrument's photometric values to those obtained by measuring the same sample on a highly accurate and traceable spectrophotometer. This is basically a sound approach, but a number of questions arise: How often should the recalibration be performed? How much difference between measured and reference standard values is acceptable before one recalibrates the photometric axis? What values are to be used for the photometric reference standard? Who is to decide what the correct reference (measurand) values should be? Are the values measured from a master instrument to be used? Or would the values measured from a typical production instrument be more appropriate, so that all production instruments are aligned alike? That would be a viable strategy if all instruments remained alike for all time. Since they do not, one must look to calibrate against a permanent set of references, such as pure atomic materials, metals, pure molecules, optical materials, or emission sources. If one uses materials with known photometric values and spectral shapes, then instruments can always be corrected to a known reference standard, now and in the future. This physical reference approach depends only on natural laws and physics, not on unstable materials and instrumentation.
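The reference-correction idea described above can be sketched in a few lines of code. The example below is a minimal illustration, not any manufacturer's actual procedure; the standard values are hypothetical, and the correction is a simple slope-and-bias fit of measured values to the known (measurand) values:

```python
# Sketch: slope-and-bias photometric correction against known reference values.
# All numeric values below are illustrative, not certified standard values.

def fit_slope_bias(measured, reference):
    """Least-squares fit of reference = slope * measured + bias."""
    n = len(measured)
    mx = sum(measured) / n
    my = sum(reference) / n
    sxx = sum((x - mx) ** 2 for x in measured)
    sxy = sum((x - mx) * (y - my) for x, y in zip(measured, reference))
    slope = sxy / sxx
    bias = my - slope * mx
    return slope, bias

def correct(spectrum, slope, bias):
    """Apply the fitted correction to every photometric value (AU)."""
    return [slope * a + bias for a in spectrum]

# Hypothetical photometric values (AU) of standards on the test instrument
measured = [0.102, 0.510, 1.020, 2.040]
# Known (measurand) values for the same standards on a reference instrument
reference = [0.100, 0.500, 1.000, 2.000]

slope, bias = fit_slope_bias(measured, reference)
corrected = correct(measured, slope, bias)
```

After correction, the test instrument's values land on the reference scale; the open questions in the text (how often to refit, and how much drift to tolerate before refitting) remain policy decisions outside the arithmetic.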

The master instrument concept was acceptable for its time (4–6), but metrological requirements are best served by first-principles calibration. The measurands (or actual nominal values) for calibrating photometric scales would optimally be obtained from first principles whenever possible, because this is a universal approach based on pure physical science and metrology, which by definition is fixed to a specific known uncertainty. The national laboratories around the globe use this approach as much as possible (7,8).

A protocol to correct photometric values measured using a spectrophotometer would ideally provide high precision with universal accuracy, so that expensive and time-consuming databases created over time are aligned across instrument technologies and can be carried reproducibly into the future. In addition, a first-principles approach allows one to standardize the data against known photometric values at any time. First-principles photometric calibration may be defined electronically; thus, no physical instrument or object is required to transfer the technology needed for photometric adjustments or calibration.

To create a first-principles photometric calibration protocol, one requires repeatable and well-characterized photometric material standards with known stability and well-defined uncertainty. To have minimal effects on calibration transfer, absorbance (or photometric response) accuracy should deviate by less than ±0.02% T (or R) from a National Institute of Standards and Technology (NIST)-traceable (or calibrated) spectrophotometer measurement at 0.09% T (or R) (3.046 AU) to 0.10% T (or R) (3.000 AU) for specific reference standards at ~1333 nm (7500 cm-1) and ~2222 nm (4500 cm-1), respectively. Photometric accuracy must agree with the calibration (reference) instrument to within ±0.01% T (or R) (±0.02 AU), as an absolute maximum deviation, across all wavelengths. An absorbance or response repeatability of better than ±0.001 AU (1 sigma) would be optimal.
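The transmittance-to-absorbance arithmetic behind these targets is worth making explicit. A short sketch using the standard relation A = −log10(T) reproduces the two quoted anchor points:

```python
import math

def percent_t_to_au(percent_t):
    """Convert percent transmittance (or reflectance) to absorbance units."""
    return -math.log10(percent_t / 100.0)

# The two anchor points quoted in the text
au_at_0_09 = percent_t_to_au(0.09)  # ~3.046 AU, standard at ~1333 nm (7500 cm-1)
au_at_0_10 = percent_t_to_au(0.10)  # 3.000 AU, standard at ~2222 nm (4500 cm-1)
```

Because the relation is logarithmic, a fixed tolerance in %T corresponds to an absorbance tolerance that grows as the signal gets darker, which is why the specification quotes both scales.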

The NIST Uncertainty Number for SRMs

NIST uncertainty is a measure of uncertainty unlike the typical manufacturer report of either precision or accuracy alone. For example, NIST uncertainty has been calculated from equation 1 (9).

where A = twice the largest standard deviation of the measurement of multiple wavelength photometric values (measurands) versus the spectrometer-measured values over a period of one month (or another designated period of several weeks' duration); B = twice the standard deviation of the uncertainty in the photometric data measurement method used (note: the noise on any photometric measurement adds uncertainty to the absolute photometric values); and C = the maximum variation in the reference standard because of humidity, material instability, photometric bleaching, or temperature changes over a specified range of conditions. (Note: The factor of 2 [twice] is referred to as the k value, also known as the coverage factor; see reference 7.) As mentioned in a past column, one may also obtain more detailed information on the history of the international metrology committees responsible for defining the technical details of uncertainty associated with measurement results; the reader is referred to appendices C and D in reference 7 for historical details.
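Equation 1 itself is not reproduced here. As a sketch only, and assuming the three components combine in quadrature (root-sum-of-squares), as is common practice for independent uncertainty components, the computation might look like this; the component values are hypothetical:

```python
import math

def expanded_uncertainty(a, b, c):
    """Combine the three uncertainty components described in the text.

    Assumption: components add in quadrature. Components a and b already
    carry the k = 2 coverage factor; c is the maximum variation from
    environmental and material effects.
    """
    return math.sqrt(a**2 + b**2 + c**2)

# Hypothetical component values, in %T
A = 0.010  # 2x largest std. dev., measurand vs. measured, over one month
B = 0.006  # 2x std. dev. of the measurement method's own uncertainty
C = 0.008  # max variation from humidity, temperature, bleaching, etc.

U = expanded_uncertainty(A, B, C)
```

If the actual equation 1 sums the components linearly instead, the resulting uncertainty would be larger (a worst-case bound rather than a statistical combination); the certificate for the specific SRM governs which form applies.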

Commercial Instrument Photometric Data

Measurement of Photometric Accuracy

Figure 1: An identical Fluorilon R99 sample measured on four different (that is, A through D) commercial NIR spectrophotometers as compared to a reference (R) spectrometer measurement (dotted line).
As we demonstrated in a previous installment (1), a comparison of the performance characteristics of leading near-infrared (NIR) spectrophotometers is very revealing as to the state of the art for wavelength and photometric accuracy and precision in modern instrumentation. For this installment, an identical sample of Fluorilon R99 (Avian Technologies LLC) was measured across four different instrument models and manufacturers; the same sample in the same sample holder was used for all the measurements. The identical sample was also measured on a double monochromator system carefully calibrated using NIST standard reference materials and other calibration protocols to obtain the reference (or measurand) values. Figure 1 shows the results for Fluorilon R99, a 99% reflectance standard; the dotted-line spectrum indicates the reference instrument measurement. It is immediately obvious that only one manufacturer is carefully matching its instrument's photometric values to the reference standard values; indeed, only one of the instruments matches the standard with the minimum accuracy specified above. This indicates that data collected across the instrument models tested will contain significant photometric differences that must be compensated for if databases collected on different instrument models and manufacturers are to be used together.

The mean difference between the measured and measurand values for photometric "accuracy" is determined by equation 2.

where Āi is the average measured photometric value of the reference material for the test instrument, in absorbance units (AU), over the entire 1100–2500 nm wavelength range; and Āref is the average measured photometric value of the reference material for the calibrated reference instrument, in absorbance units, over the same range. The result is reported as the mean difference, or "accuracy."
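Read directly from these definitions, equation 2 is simply the difference of the two wavelength-averaged values, Āi − Āref. A minimal sketch (the four-point spectra below are hypothetical stand-ins for full 1100–2500 nm scans):

```python
def mean_difference(test_spectrum_au, reference_spectrum_au):
    """Mean photometric 'accuracy': the average of the test instrument's
    values minus the average of the reference instrument's values, taken
    over the same wavelength grid (values in AU)."""
    a_bar_i = sum(test_spectrum_au) / len(test_spectrum_au)
    a_bar_ref = sum(reference_spectrum_au) / len(reference_spectrum_au)
    return a_bar_i - a_bar_ref

# Hypothetical R99 reflectance spectra (AU) on a common wavelength grid
test = [0.012, 0.015, 0.014, 0.013]
ref = [0.004, 0.005, 0.005, 0.006]
diff = mean_difference(test, ref)
```

Note that averaging first and then differencing gives the same result as averaging the per-wavelength differences, so either ordering reproduces the table's values.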

Table I: Average photometric values for instrument models A through D as compared to a reference spectrophotometer across a range of 1100–2500 nm
Table I demonstrates the average accuracy for each of the instrument models A through D over the wavelength range of 1100–2500 nm as compared to the reference calibrated spectrophotometer (a PerkinElmer Lambda 9/19 UV-vis-NIR spectrophotometer). Note that instrument B is within our minimum specified goal of ±0.02 AU across the spectral range. This speaks well of calibrating each instrument against a known photometric standard.
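The acceptance test applied in Table I amounts to a simple threshold check against the ±0.02 AU goal. The mean-difference values below are hypothetical illustrations, not the table's actual results:

```python
GOAL_AU = 0.02  # minimum specified accuracy goal from the text

def within_goal(mean_diff_au, goal=GOAL_AU):
    """True if an instrument's mean photometric difference meets the goal."""
    return abs(mean_diff_au) <= goal

# Hypothetical mean differences (AU) for instruments A-D vs. the reference
mean_diffs = {"A": 0.034, "B": 0.008, "C": -0.027, "D": 0.051}
passing = [name for name, d in mean_diffs.items() if within_goal(d)]
```

With these illustrative numbers only instrument B passes, mirroring the pattern reported in the table.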
