Trends in Infrared Spectroscopic Imaging

Nov 01, 2013
By Spectroscopy Editors
Volume 28, Issue 11

An interview with Rohit Bhargava, winner of the 2013 Craver Award from the Coblentz Society. Bhargava discusses current trends in IR spectroscopic imaging, including application-specific instrumentation, improvements in data interpretation, and identifying relationships between structure and spectra. This interview is part of the 2013 series presented in collaboration with the Federation of Analytical Chemistry and Spectroscopy Societies (FACSS), in connection with SciX 2013, the federation's North American conference.

In a recent article (1), you mentioned that extracting information from spectroscopic imaging datasets is not a straightforward process of treating a spectrum from each pixel as an independent measurement comparable to a spectrum from a conventional spectrometer. Why is that the case?

To understand this concept, we have to think about how spectroscopy has been practiced. Traditionally, samples were homogeneous. They were liquid, perhaps in a cell, or maybe a piece of polymer, and Beer's law was invoked to relate the concentration to the spectrum. Absorbance measurement in those traditional instruments involved light that was pretty much collimated; it went through the sample at a right angle and nearly all of the light emerged at that same angle. The only differences arose if you used a slightly different substrate. If you were to use barium fluoride instead of calcium fluoride, you might have different reflectivity and different cutoffs in the spectral region, but the introduction of a sample into the light beam did not really change things much except for the absorbance. People did realize that there were other things going on: there has been some work looking at the effect of the substrate, mainly reflectivity, multiple reflections within the substrate, and well-known effects like fringing, but these were relatively small corrections. For the most part, one could put a sample into the instrument and get an absorption spectrum that was directly related to the chemistry because the sample was homogeneous.
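The Beer's law relationship invoked above can be sketched numerically. This is a minimal illustration, not anything from the interview; the molar absorptivity, path length, and concentrations are made-up values chosen only to show that absorbance is linear in concentration for a homogeneous sample.

```python
import numpy as np

# Beer's law: A = epsilon * l * c
# epsilon: molar absorptivity (L mol^-1 cm^-1), l: path length (cm),
# c: concentration (mol/L). All values below are illustrative assumptions.
epsilon = 500.0
path_length = 0.1
concentrations = np.array([0.001, 0.002, 0.004])

# Absorbance scales linearly with concentration...
absorbance = epsilon * path_length * concentrations   # -> 0.05, 0.10, 0.20

# ...while transmitted intensity falls off exponentially.
transmittance = 10.0 ** (-absorbance)
```

Doubling the concentration doubles the absorbance, which is exactly the simple picture that breaks down once a heterogeneous sample introduces scattering.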

Today, we are talking about imaging, and imaging only makes sense if the sample is heterogeneous. Naturally, there are boundaries in the sample at which light scatters. It's not just the substrate from which light reflects back and gets transmitted; one must start thinking about refraction within the sample — as well as scattering from the sample. Scattering is really the key to this process. Scattering from a certain point in the sample can transmit light to pixels that it was not supposed to go to. Scattering can also send light out of the cone of collection or the beam.

So the beam is not simply recorded with or without the sample, with the only change being absorbance; now one has to start thinking of scattering-induced changes in beam direction as well. This has an effect on the spectrum, because spectrally the scattering may not be uniform — in fact it is not — and one must start thinking about differences in spatial fidelity that might arise from the scattering. This is why things are so different now and why one has to treat the entire recorded image as a dataset rather than to think of the spectrum from every pixel as a separate and independent entity.

You mentioned in the article that a trend may emerge toward the development of application-specific instrumentation as opposed to general-purpose instruments. Have you seen evidence that this trend is becoming a reality? And if so, what are the implications?

Let me address the implications first. One really has to understand how the sample and spectrometer act together to interpret the recorded data. If one were to start recovering chemistry-specific data from a heterogeneous sample, one must understand how the spectrometer interfaces with that type of sample. If you have a large enough market where a certain type of sample is analyzed repeatedly (such as breast tissue or prostate tissue, which are both of interest in our laboratory), then perhaps you could design an instrument in which the scattering is known or minimized or somehow handled, so that chemistry information is obtained really quickly. That's the driving force for this trend.

The evidence is that until recently, most imaging instruments, and even most point-mapping instruments in IR spectroscopy, were pretty much the same and performed rather similar functions. You could put your sample in and load it up, and either the beam would raster through or you would collect a two-dimensional image, and pretty much the same performance and same sort of data were acquired, with only slight differences between manufacturers. A few years ago we started seeing instruments designed for first responders or those in the homeland security area, most notably by Smiths Detection, and then people realized that perhaps even within the spectroscopy market there might be some niche areas that could be addressed. For example, last year Bruker came out with a dedicated microscopy system.

There are specific segments in this market in which, if we focus, we can make an instrument that can outperform any other instrument, but it might only do a limited number of things. It's not a general-purpose instrument, but what it does, it does really well. This trend is also perhaps encouraged or promoted by other technologies coming on board. Our laboratory and others have used what we term discrete frequency imaging, as opposed to Fourier transform or continuous spectral imaging. Our laboratory developed narrow-band filters that can be manufactured easily and at fairly low cost.

We've also used quantum-cascade lasers. With quantum-cascade lasers or filters, you are not really measuring the entire spectrum. You could if you wanted to, but usually you are interested in just a few frequencies. This is quite the opposite of the Fourier transform model, in which you measure the whole spectrum; whether you want it or not, that's what you always get. When you start using the discrete frequency approach, then the question arises of which frequencies should be examined, and that's where the intended use of the instrument really comes into play. Instead of a general-purpose spectral recording device, what you might have is a very specific-function spectrometer that might be based on a set of filters or a quantum-cascade laser within a certain range, for example, and that might drive future applications. It's both the need as well as the technology evolving now that will make this trend possible.
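The measurement trade-off described here can be sketched with a quick count. This is a hypothetical illustration, not the interviewee's numbers: the spectral range, point spacing, and band positions below are illustrative assumptions chosen to show how few values a discrete-frequency measurement records per pixel compared with a full Fourier transform spectrum.

```python
import numpy as np

# Full FT-IR measurement: every spectral point in the range is recorded
# at every pixel. Range and 4 cm^-1 spacing are illustrative assumptions.
wavenumbers = np.arange(900, 3801, 4)   # mid-IR range in cm^-1

# Discrete-frequency measurement: only a handful of chemically informative
# bands are recorded. These positions are made-up placeholders.
discrete_bands = np.array([1080, 1236, 1545, 1652, 2850, 2920])

full_points = wavenumbers.size          # values per pixel, FT approach
discrete_points = discrete_bands.size   # values per pixel, discrete approach
```

Per pixel, the discrete approach here records six values instead of several hundred, which is why the choice of which frequencies to measure becomes central to the instrument's design.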
