An interview with Rohit Bhargava, winner of the 2013 Craver Award from the Coblentz Society. Bhargava discusses current trends in IR spectroscopic imaging, including application-specific instrumentation, improvements in data interpretation, and identifying relationships between structure and spectra. This interview is part of the 2013 series presented in collaboration with the Federation of Analytical Chemistry and Spectroscopy Societies (FACSS), in connection with SciX 2013, the federation's North American conference.
In a recent article (1), you mentioned that extracting information from spectroscopic imaging datasets is not a straightforward process of treating a spectrum from each pixel as an independent measurement comparable to a spectrum from a conventional spectrometer. Why is that the case?
To understand this concept, we have to think about how spectroscopy has been practiced. Traditionally, samples were homogeneous. They were liquids, perhaps in a cell, or maybe a piece of polymer, and Beer's law was invoked to relate the concentration to the spectrum. Absorbance measurement in those traditional instruments involved light that was largely collimated; it passed through the sample at a right angle and nearly all of it continued along that same path. The only differences arose if you used a slightly different substrate. If you were to use barium fluoride instead of calcium fluoride, you might have different reflectivity and different cutoffs in the spectral region, but introducing a sample into the light beam did not really change things much beyond the absorbance. People did realize that there were other effects. There had been some work on the effect of the substrate, mainly reflectivity, multiple reflections between the substrate and sample, and well-known phenomena like fringing, but these were relatively small corrections. For the most part, one could put a sample into the instrument and get an absorption spectrum that was directly related to the chemistry because the sample was homogeneous.
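The Beer's law relationship for a homogeneous sample described above can be sketched numerically; all values below (molar absorptivity, path length, concentrations) are illustrative, not taken from the interview.

```python
import numpy as np

# Beer-Lambert law for a homogeneous sample: A = epsilon * c * l.
# Absorbance scales linearly with concentration; no scattering terms appear.
epsilon = 250.0                          # molar absorptivity, L/(mol*cm), illustrative
path_length = 0.1                        # cell path length, cm
conc = np.array([0.001, 0.002, 0.004])   # concentrations, mol/L

absorbance = epsilon * conc * path_length
transmittance = 10.0 ** (-absorbance)

print(absorbance)      # doubles as concentration doubles
print(transmittance)
```

This linearity is exactly what breaks down for the heterogeneous, scattering samples discussed next.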
Today, we are talking about imaging, and imaging only makes sense if the sample is heterogeneous. Naturally, there are boundaries in the sample at which light scatters. It's not just the substrate from which light reflects back and gets transmitted; one must start thinking about refraction within the sample as well as scattering from it. Scattering is really the key to this process. Scattering from a certain point in the sample can send light to pixels it was not supposed to reach. Scattering can also send light out of the cone of collection or the beam.
So the beam is not simply recorded as with or without the sample, with the only change being absorbance; now one has to start thinking of scattering-induced changes in beam direction as well. This has an effect on the spectrum, because the scattering may not be spectrally uniform (in fact, it is not), and one must start thinking about differences in spatial fidelity that might arise from the scattering. This is why things are so different now and why one has to treat the entire recorded image as a dataset rather than treating the spectrum from every pixel as a separate and independent entity.
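The point that scattering is not spectrally uniform can be illustrated with a toy simulation: a single absorption band plus a smoothly varying scattering loss. The band position, widths, and the power-law loss are arbitrary choices for illustration, not a physical model of any real sample.

```python
import numpy as np

# Toy spectrum: one Gaussian absorption band on a scattering background.
wavenumber = np.linspace(1000.0, 2000.0, 500)                    # cm^-1
true_band = 0.5 * np.exp(-((wavenumber - 1650.0) / 30.0) ** 2)   # "pure" absorbance

# Scattering loss that grows with wavenumber (shorter wavelengths scatter
# more); the exponent is a toy choice, not a fitted model.
scatter_loss = 0.2 * (wavenumber / 2000.0) ** 4

# The recorded "absorbance" mixes chemistry and scattering: both the
# baseline and the apparent band height are distorted.
recorded = true_band + scatter_loss
print(recorded.max() > true_band.max())   # True: apparent peak is inflated
```

Separating `true_band` from `scatter_loss` given only `recorded` is, in miniature, the correction problem the interview describes.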
You mentioned in the article that a trend may emerge toward the development of application-specific instrumentation as opposed to general-purpose instruments. Have you seen evidence that this possible trend is becoming a reality? And if so, what are the implications?
Let me address the implications first. One really has to understand how the sample and spectrometer act together to really understand the recorded data. If one were to start recovering chemistry-specific data from a heterogeneous sample, one must understand how the spectrometer interfaces with that type of sample. If you have a large enough market where a certain type of sample is analyzed repeatedly (such as breast tissue or prostate tissue, which are both of interest in our laboratory), then perhaps you could design an instrument in which the scattering is known or minimized or somehow handled, so that chemistry information is obtained really quickly. That's the driving force for this trend.
The evidence says that until recently, most imaging instruments or even most point-mapping instruments in IR spectroscopy were pretty much the same and performed rather similar functions. You could put your sample in and load it up, and either the beam would raster through or you would collect a two-dimensional image, and pretty much the same performance and same sort of data were acquired, with only slight differences between manufacturers. A few years ago we started seeing instruments designed for the first responders or those in the homeland security area, most notably by Smiths Detection, and then people realized that perhaps even within the spectroscopy market there might be some niche areas that could be addressed. For example, last year Bruker came out with a dedicated microscopy system.
There are specific segments in this market in which, if we focus, we can make an instrument that outperforms any other instrument, but it might only do a limited number of things. It's not a general-purpose instrument, but what it does, it does really well. This trend is also perhaps encouraged or promoted by other technologies coming on board. Our laboratory and others have used what we term discrete frequency imaging, as opposed to Fourier transform or continuous spectral imaging. Our laboratory developed narrow-band filters that are easy to manufacture and fairly low in cost.
We've also used quantum-cascade lasers. With quantum-cascade lasers or filters, you are not really measuring the entire spectrum. You could if you wanted to, but usually you are interested in just a few frequencies. This is quite the opposite of the Fourier transform model in which you measure the whole spectrum; whether you want it or not, that's what you always get. When you start using the discrete frequency approach, then the question arises of which frequencies should be examined, and that's where the use of the instrument really comes into play. From a general-purpose spectral recording device, what you might have is a very specific function spectrometer that might be based on a set of filters or a quantum-cascade laser within a certain range, for example, and that might drive future applications. It's both the need as well as the technology evolving now that will make this trend possible.
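The discrete-frequency idea above can be sketched as selecting a handful of band images from what would otherwise be a full spectral cube. The array sizes and the "diagnostic" wavenumbers below are hypothetical placeholders, not values from the interview.

```python
import numpy as np

# Stand-in for a full hyperspectral cube: y, x, wavenumber.
rng = np.random.default_rng(0)
n_bands = 400
cube = rng.random((64, 64, n_bands))
wavenumbers = np.linspace(900.0, 1800.0, n_bands)   # cm^-1

# Hypothetical diagnostic frequencies (e.g., a band near 1650 cm^-1).
# In discrete-frequency imaging only these would be measured at all.
targets = [1080.0, 1550.0, 1650.0]
idx = [int(np.argmin(np.abs(wavenumbers - t))) for t in targets]

discrete = cube[:, :, idx]     # 3 band images instead of 400
print(discrete.shape)          # (64, 64, 3)
```

Choosing which entries go into `targets` is exactly the application-specific question the interview raises: the intended use of the instrument determines the frequencies worth measuring.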
How has data interpretation in IR microscopy and imaging improved recently in terms of identifying relationships between structure and spectra?
That's a great question because this is one of the most exciting areas in IR microscopy today. We have always known — in fact the first paper that I published in this field demonstrated — that the real part of the refractive index has a tremendous influence on the spectral data that are recorded. This was in 1998, and we thought that yes, it does have an influence, we understand that, but there are more important things to be done: making better instruments and showing that IR imaging is useful in a variety of applications. About seven years ago, people started realizing again that scattering and the actual transport of light through a sample influence the data that we acquire and also limit us in applications. Since then, a lot of work has been done. The first few attempts were model based, so we would treat every pixel independently; we would treat things as if there were a scattering center there, and as if that were the only effect on the system. Then, almost four years ago, we came out with a comprehensive theory for IR microscopy. This theory has since been published in a series of papers in Analytical Chemistry (2,3).
So what we have been doing over the years is taking this theory and applying it to very specific cases. Now, for some objects that are well defined and whose properties we know, we can predict what kind of spectra would result if a particular instrument were used. The same framework was recently used to explain the limit of information in high-definition IR imaging. Early in 2013, an Applied Spectroscopy paper from my group laid out a theoretical background and simulations to show the quality of image that we can potentially get with IR imaging (4). So it's not just an improvement in identifying the relationship between structure and spectra, it's also leading to better instrument design, and that in turn then allows us to better understand structure and spectra. This resonance between basic understanding, application, and instrument design is really one of the more exciting trends in the field now and I hope this will continue to be an exciting trend for some time to come.
What other trends in IR imaging do you think will become important in the near future?
One of the important trends we are seeing is the emergence of many new components. Though we have already seen advances on the source side, with lasers, filters, even better sources, and better interferometers, we perhaps have not yet seen the full potential of what can happen on the detection side. Certainly, hardware advances, as always, will remain important. One aspect that has not been quite as important in IR microscopy has been theory and algorithms. I think this trend will really become apparent now that we have realized that the information we can extract from IR imaging data is limited unless we use some sort of correction. Nearly all forms of data recording will require some sort of data processing that is based on the physics of the instrumentation.
What we are really doing here is moving from a basic passive data recording approach to more of a computed data recording paradigm, in which computation and hardware are merged with each other in a way that is quite synergistic. Most certainly, applications are going to be a big trend. There have been many new ideas. In my laboratory, we have used IR imaging on food grains, forensic materials, and other things. Biomedical applications as always will continue to remain strong.
The one trend though that I have not yet seen much evidence for but is likely to emerge is the idea of total information; not only spectral information and not only thinking of pixels independently of others, but asking how do multiple pixels work with each other, and how do different structures in the image relate to the same chemistry or relate to the same problem? We have not really seen analytical algorithms go after the entire imaging data. We have often seen chemometric algorithms focused on spectral data on a per-pixel basis, and we have seen imaging extracted at a specific wavenumber, for example, but certainly looking at all dimensions of the information is likely to be an area of interest in the future.
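The contrast drawn above between per-pixel chemometrics and "total information" analysis can be sketched as follows; the synthetic cube, band indices, and the simple 4-neighbor averaging step are all illustrative assumptions, not the analytical algorithms the interview anticipates.

```python
import numpy as np

# Synthetic image cube: y, x, wavenumber.
rng = np.random.default_rng(1)
cube = rng.random((32, 32, 100))

# Per-pixel approach: each spectrum is analyzed independently
# (here, a simple two-band ratio at arbitrary band indices).
ratio = cube[:, :, 20] / cube[:, :, 60]

# Spatial-spectral approach: average each spectrum with its 4 neighbors
# first, so the value at a pixel also reflects surrounding structure.
padded = np.pad(cube, ((1, 1), (1, 1), (0, 0)), mode="edge")
smoothed = (padded[1:-1, 1:-1] + padded[:-2, 1:-1] + padded[2:, 1:-1]
            + padded[1:-1, :-2] + padded[1:-1, 2:]) / 5.0
ratio_spatial = smoothed[:, :, 20] / smoothed[:, :, 60]

print(ratio.shape, ratio_spatial.shape)   # (32, 32) (32, 32)
```

Neighborhood averaging is only the crudest way to couple pixels; the interview's point is that algorithms exploiting all dimensions of the data jointly remain largely unexplored.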
(1) R. Bhargava, Appl. Spectrosc. 66(10), 1091–1120 (2012).
(2) B.J. Davis, P.S. Carney, and R. Bhargava, Anal. Chem. 82, 3487–3499 (2010).
(3) B.J. Davis, P.S. Carney, and R. Bhargava, Anal. Chem. 82, 3474–3486 (2010).
(4) R.K. Reddy, M.J. Walsh, M.V. Schulmerich, P.S. Carney, and R. Bhargava, Appl. Spectrosc. 67, 93–105 (2013).
This interview has been edited for length and clarity. For more from the SciX series visit: spectroscopyonline.com/podcasts