Mass Spectrometry

Mass spectrometry has become one of the most important tools in the analytical laboratory, with a wide range of applications. Participants in this Technology Forum are Lester Taylor of Agilent Technologies, Alessandro Baldi and Sal Iacono of PerkinElmer, David Chiang of Sage-N Research, and Yingying Huang and Jean Jacques Dunyach of Thermo Fisher Scientific.

Data handling, interpretation, and cross-platform communication have become some of the most important issues in modern mass spectrometry (MS) because of the large amounts of refined data produced by modern MS instrumentation. Have data acquisition and analysis kept up with the ability of instrumentation to produce data? Can you identify any areas of needed improvement?

Taylor: With regard to data handling and processing, we are updating software to run in 64-bit mode, which together with increased memory improves processing speeds for large data sets. In addition, we are fine-tuning data-access components with, for example, better indexing for both nominal-mass and high-resolution accurate-mass data. This reduces processing times by roughly a factor of three. Cross-platform communication has improved as well; for example, we provide third-party software developers with tools to access data in formats such as Mascot and mzXML for proteomics applications. An API (application programming interface) is also available to third parties who want to control the software directly as part of their instrumentation.
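To make the indexing idea concrete, here is a minimal sketch of how a sorted m/z index turns an accurate-mass lookup into a logarithmic-time window query. The function names and data are illustrative assumptions, not Agilent's actual data-access components:

```python
# A sorted m/z index supports a ppm-window lookup in O(log n) time via
# binary search, instead of scanning every peak. The names and data here
# are illustrative, not Agilent's actual data-access components.
from bisect import bisect_left, bisect_right

def find_peaks(sorted_mz: list[float], target_mz: float, tol_ppm: float = 5.0) -> list[float]:
    """Return all indexed m/z values within +/- tol_ppm of target_mz."""
    delta = target_mz * tol_ppm / 1e6  # convert the ppm tolerance to Da
    lo = bisect_left(sorted_mz, target_mz - delta)
    hi = bisect_right(sorted_mz, target_mz + delta)
    return sorted_mz[lo:hi]

# Example: look up protonated caffeine (m/z 195.0877) in an indexed peak list.
peaks = sorted([110.0713, 138.0662, 195.0861, 195.0877, 196.0905])
print(find_peaks(peaks, 195.0877, tol_ppm=10))  # -> [195.0861, 195.0877]
```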

Baldi and Iacono: There are several areas that could always use improvement with respect to data generation in MS. There is a general lack of automated intelligence on the analysis front end for determining the optimum method for a specific molecule; most laboratories either focus on developing methods or rely on a small set of generic ones. A second area of needed improvement lies within the realm of data processing: scientists attempt either to find a molecule in a very complex matrix or to confirm a mass of interest and identify the molecule. In addition, there is a growing demand to relate the mass of interest to other areas of experimentation, generating the need for compound-centric software workflows. A final area of needed improvement is the ease of sharing data. The ability for others to reuse the knowledge gained during an analysis, at a later date, for a different project, is very powerful. Even with the growth of electronic laboratory notebooks as data collectors, mass spectrometers should offer some method intelligence for sample introduction, an easy way to relate data to other MS studies or to experiments from other key analytical techniques, and seamless integration into the organizational workflow.
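As a rough illustration of what a compound-centric workflow could store, the sketch below links one compound record to measurements from multiple techniques and projects so they can be found and reused later. The field names and identifiers are assumptions for illustration, not PerkinElmer's actual schema:

```python
# A compound-centric record: one compound linked to measurements from
# several techniques and projects, so earlier results can be rediscovered
# and reused. Field names are illustrative assumptions, not a real schema.
from dataclasses import dataclass, field

@dataclass
class Measurement:
    technique: str   # e.g. "LC-MS", "NMR", "UV-Vis"
    project: str     # the project that generated the data
    result_uri: str  # pointer back to the stored raw data

@dataclass
class CompoundRecord:
    name: str
    monoisotopic_mass: float
    measurements: list[Measurement] = field(default_factory=list)

    def related(self, technique: str) -> list[Measurement]:
        """All experiments of a given technique recorded for this compound."""
        return [m for m in self.measurements if m.technique == technique]

ibuprofen = CompoundRecord("ibuprofen", 206.1307)
ibuprofen.measurements.append(
    Measurement("LC-MS", "impurity-screen", "lims://runs/0421"))
print(ibuprofen.related("LC-MS"))
```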

Chiang: Data acquisition and analysis are probably the biggest bottleneck in the productive use of mass spectrometers today, as many labs struggle with the storage, analysis throughput, and workflow integration needed to handle the high data volume. For many labs, the key to productivity is to migrate their analysis workflows from simple PC tools to robust client-server systems that can store and analyze the data and generate the necessary reports with a high degree of automation.
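As a toy sketch of that migration, the snippet below shows the shape of a client-server hand-off: an acquisition PC submits a raw file to a central server that queues, analyzes, and reports on it automatically. The endpoint, payload, and job model are hypothetical, not any vendor's real API:

```python
# Hypothetical client side of a client-server analysis pipeline: submit a
# raw file to a central server's job queue and get back a job ID. The URL
# and JSON payload are invented for illustration.
import json
from urllib import request

def submit_run(server_url: str, raw_file: str, workflow: str) -> str:
    """Queue a raw file for automated analysis; returns the server's job ID."""
    payload = json.dumps({"file": raw_file, "workflow": workflow}).encode()
    req = request.Request(f"{server_url}/jobs", data=payload,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        return json.load(resp)["job_id"]

# Usage (requires a live server exposing this hypothetical endpoint):
# job_id = submit_run("http://lims.example.org", "run_0421.raw", "peptide-id")
```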

Huang: The world of MS is highly technology driven; that is, advances in hardware often enable new possibilities in data processing and handling. New workflows are often the direct result of improvements in data quality. For example, recent advances in high-resolution accurate-mass instruments have made it possible to use full-scan MS to quantify tens to thousands of compounds without the selected reaction monitoring (SRM) method development required on a triple-quadrupole platform. Such a workflow not only saves users significant time and resources but also provides information about the samples that was not available before. To support such revolutionary new workflows, intelligent, workflow-driven software that can streamline acquisition and interpretation is a significant area for development. The ability to fully exploit the information available in the data and support for cross-platform data comparison are the two key functionalities that will benefit users the most, and they should therefore be the focus of improvement.
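A minimal sketch of that full-scan quantitation workflow, assuming a simple in-memory data layout: an extracted ion chromatogram (XIC) is pulled from high-resolution full-scan spectra for each target m/z within a narrow ppm window, so new targets can be quantified after acquisition without any SRM method development:

```python
# Quantitation from full-scan HRAM data: build an extracted ion
# chromatogram (XIC) per target m/z with a narrow ppm window. Because the
# full scan records everything, targets can be added after acquisition.
# The in-memory data layout below is an assumption for illustration.

def extract_xic(scans, target_mz, tol_ppm=5.0):
    """scans: list of (retention_time, [(mz, intensity), ...]) spectra.
    Returns the XIC as (retention_time, summed intensity) points."""
    delta = target_mz * tol_ppm / 1e6
    return [(rt, sum(i for mz, i in peaks if abs(mz - target_mz) <= delta))
            for rt, peaks in scans]

# One pass over the same data quantifies any number of targets:
targets = {"caffeine": 195.0877, "acetaminophen": 152.0706}
scans = [(0.50, [(195.0875, 1.2e6), (152.0708, 8.0e5)]),
         (0.51, [(195.0878, 1.9e6)])]
for name, mz in targets.items():
    print(name, extract_xic(scans, mz))
```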

Ionization techniques are important in determining what types of samples can be analyzed by mass spectrometry. What problems have you encountered related to solution chemistry, interferences, or the chemistry of the sample itself, where will new developments in ionization be most important in the near future, and why?

Taylor: Numerous ionization modes have been developed since the early days of MS, addressing the specific ionization requirements of widely varying classes of compounds, from nonpolar and volatile (for example, electron ionization [EI]) through polar and thermally labile (electrospray ionization, atmospheric pressure chemical ionization [APCI]). A range of ion sources are available (including from third parties) for mass spectrometers that are suited to particular applications. Liquid chromatography (LC)–MS instruments generally use electrospray or APCI, both of which are suitable for ionizing a wide range of polar and thermally labile compounds. Other ionization techniques have been developed for the direct ionization of compounds from surfaces or solids. For example, we have used the ambient desorption ionization techniques atmospheric solids analysis probe (ASAP) and direct analysis in real time (DART), coupled to a time-of-flight (TOF) or quadrupole TOF (Q-TOF) instrument, to directly and rapidly analyze contaminants from orange peel. In both cases these techniques allow direct analysis of compounds from the solid phase without prior high performance liquid chromatography (HPLC) separation.

Baldi and Iacono: The geometry and electric fields of an ion source are critical elements of the ionization process and heavily influence ionization efficiency. When designing a new ion source, our approach often points to "decoupling the processes"; in other words, minimizing or eliminating all the secondary interactions among different factors during ionization. In this way, it is easier for engineers to control variables and for users to set the right analytical parameters. The ion source is also one of the major sources of contamination, which is particularly critical when multiple users share an instrument (as in open-access labs). For this reason, one of the most frequent requests from customers has been the ability to minimize cross-contamination.

In the near future, we certainly see growing adoption of direct-sampling sources. More and more, in fields like food and forensic analysis, clinical research, and basic process control, we see requests for "no chromatography" MS analysis. This is driven by the need to reduce sample preparation time, ideally to none, with the added advantage of minimizing the analytical artifacts that can arise during sample preparation. Existing ionization technologies designed for liquid chromatography applications can, with proper modifications, be applied to direct ionization of the sample. This allows one to analyze a sample "as is" or to use simple substrates like paper, glass, and silica.

Dunyach: Ion suppression is probably one of the most challenging problems in LC–MS because it can impair both the sensitivity and the data quality of any given assay. New developments in sample preparation and HPLC separation techniques have helped limit this effect by providing cleaner samples and better chromatographic separation, but there is still a need for improved ionization techniques to further address this issue.

In the past few years a flurry of new ionization techniques has emerged, all aimed at minimizing sample preparation steps and claiming lower susceptibility to ion suppression than standard ionization techniques. However, these techniques still suffer from poor sensitivity and reproducibility, and improvements in those two areas will be important for high-throughput applications where minimal sample cleanup is key.
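For context, ion suppression is commonly quantified as a matrix effect by comparing analyte response in a post-extraction spiked matrix with the response in neat solvent. The short sketch below illustrates the arithmetic; the peak areas are invented for the example:

```python
# Matrix effect (ME%) from post-extraction spike data: the same amount of
# analyte is measured in neat solvent and in a blank matrix extract.
# ME% < 100 indicates ion suppression; ME% > 100 indicates enhancement.
# The peak areas below are invented for illustration.

def matrix_effect(area_in_matrix: float, area_in_solvent: float) -> float:
    """ME% = (peak area in spiked matrix / peak area in neat solvent) * 100."""
    return 100.0 * area_in_matrix / area_in_solvent

area_solvent = 4.2e6  # analyte spiked into neat solvent
area_matrix = 2.9e6   # same amount spiked into a blank plasma extract
me = matrix_effect(area_matrix, area_solvent)
print(f"ME = {me:.0f}%, i.e. about {100 - me:.0f}% ion suppression")
```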
