How Will Distributed Sensing Inspire Changes in Optical-Sensing R&D?

Spectroscopy, January 1, 2009, Volume 24, Issue 1

The author discusses what the future holds for the field of optics.

Improvements in engineering and manufacturing processes and tools have rapidly lowered the cost of making and distributing products. The next generation of optical sensing technologies will offer a framework for the creation and testing of new business models based upon the distribution of knowledge and service. In fact, this concept of "distributed sensing" already has emerged in networked systems monitoring various aspects of the environment.

The key to all this will be to lower the cost of trying new marketing ideas. This is accomplished primarily by getting to market quickly; by using technologies that are already proven, tested, and functional (spectroscopy certainly fits the bill); and by developing products that are configurable in many different formats. Also, once a new business model is developed, it might be desirable to replace certain technologies with lower-cost versions. That's where distributed sensing networks can help: by making market acceptance simpler to demonstrate and to qualify, thus mitigating engineering costs.

If this all sounds suspiciously familiar, that's because it is — thanks to Apple's reinvention of the marketing model for music: the iTunes Store has distributed more than 5 billion songs since 2003 (and a bunch of videos as well). What technologies will emerge as the iTunes Store of the optical sensing world? And what should designers, manufacturers, and specifiers of optical sensing technologies consider as they build and deploy systems and components for a world that is more "connected" than at any other time in history? Let's consider some scenarios.

All the Data That Are Fit to Distribute

Until 1995, I spent most of my schooling and all of my career as far from math and science as possible, not because of any particular dislike for those subjects, but simply because my interests lay elsewhere.

Figure 1: Data collection. Networks of distributed sensors could be used to correlate data on coastal environments around the world.

But something ironic happened as my career progressed — I ended up at a company selling spectrometers and optical components. Talk about a baptism of fire! After about three months of dancing around the most head-scratching questions and finding that my wit and charm could take me only so far, I discovered an important truth: if I could understand just a handful of key concepts, I could sell optical sensing systems to folks with many more years of scientific training and experience.

Selling is one thing; understanding is quite another. I suspect it would be much harder for a layman like me to break into the business today, in part because optical sensing systems and components have become more powerful and complex, as well as more integrated with other systems. For example, your spectrometer has to work with various other devices, like light sources and sampling optics, each with its own operating profile. What's more, there are simply more types of technology today than ever before; knowing just a few key concepts is no longer enough.

But what if the devices themselves were smarter? What if devices had built-in intelligence and could recognize how each device works with the others — understood their roles, if you will — and could share that understanding with users? That would be quite powerful, for layman and expert alike.

In fact, smart technologies already exist all around us. They are part of what Chris Anderson of Wired magazine ("The End of Theory," July 2008) calls "The Petabyte Age" — an era defined by the ubiquity of sensors and the capacity to extract, store, and interpret the data they produce.

Consider adaptive optics, which matured in the 1990s and has been used to correct optical aberrations in devices such as telescopes and laser communication systems. Instruments constantly experience changes in temperature and pressure and undergo mechanical stress and optical decay, resulting in gradual degradation of their designed performance parameters.

In a spectrometer optical bench, for example, the optical components could become fully adaptive with the addition of micro-motors to control focus, slit width, and grating rotation. The bench could configure itself, adapting to changing conditions such as temperature shifts, or accommodating different needs, such as widening the slit for fluorescence work and then narrowing it for high-resolution measurements. It could refocus after thermal stress, change wavelength range to optimize readings, or even insert or remove filters to recalibrate itself. Indeed, because spectroscopic analytical instruments in particular require careful mechanical, electronic, and optical design to assure long-term service reliability and performance, making them "smart" seems like, well, a smart idea.
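
To make the idea concrete, here is a minimal sketch, in Python, of what such a self-configuring bench's control logic might look like. The MotorizedBench class, the temperature sensor, and the correction factors are all hypothetical stand-ins; no real instrument's interfaces are assumed.

    # Illustrative sketch of a self-configuring optical bench.
    # MotorizedBench and read_temperature are hypothetical stand-ins.

    NOMINAL_TEMP_C = 25.0      # temperature the bench was aligned at
    FOCUS_STEPS_PER_DEG_C = 2  # assumed thermal defocus correction factor

    class MotorizedBench:
        """Stand-in for micro-motor control of focus, slit, and grating."""
        def __init__(self):
            self.focus_steps = 0
            self.slit_um = 25

        def set_slit(self, width_um):
            self.slit_um = width_um
            print(f"slit set to {width_um} um")

        def adjust_focus(self, steps):
            self.focus_steps += steps
            print(f"focus moved {steps} steps (now {self.focus_steps})")

    def read_temperature():
        """Stand-in for an on-board temperature sensor (degrees C)."""
        return 28.5  # simulated reading

    def configure_for(application, bench):
        # Wide slit for light-starved fluorescence, narrow for resolution.
        bench.set_slit(200 if application == "fluorescence" else 10)

    def thermal_compensation(bench):
        drift = read_temperature() - NOMINAL_TEMP_C
        if abs(drift) > 1.0:  # correct only meaningful drift
            bench.adjust_focus(int(round(drift * FOCUS_STEPS_PER_DEG_C)))

    bench = MotorizedBench()
    configure_for("fluorescence", bench)
    thermal_compensation(bench)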

Using adaptive optics and other smart technologies could make preventive maintenance nearly transparent to the user (similar, in a way, to how antivirus software does its job without us giving it much thought) and provide a valuable edge in the marketplace to designers and manufacturers of optical sensing products. Competition has never been more robust, and industries — and the researchers who collaborate with them — are constantly looking for ways to better manage their supply chains, to reduce waste and off-spec product and increase yields in manufacturing, and to build value in their brands and products. Small savings or improvements can mean the difference between succeeding and merely surviving; instrumentation that is rugged and reliable, and that delivers better data for operating decisions, can help drive success.

In one "smart instrument" scenario, the intelligence in the system directs the user to perform maintenance checks. For example, in a spectrometer setup, the spectrum of a mercury–argon calibration source could be used to determine whether wavelength calibration has drifted, whether the spectrometer is out of focus, and so on. Linearity testing, dark noise analysis, and other tests could be performed. For light sources, bulb intensity and lifetime could be monitored.
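
A wavelength-drift check of this kind is easy to picture in code. The following Python sketch compares measured peak positions against three well-known mercury emission lines; the "measured" peaks and the drift tolerance are invented for illustration.

    # Sketch of a wavelength-calibration self-check against an Hg-Ar source.
    # Reference values are well-known mercury emission lines (nm); the
    # "measured" peaks and the tolerance are invented for illustration.

    HG_LINES_NM = [253.652, 435.833, 546.074]
    TOLERANCE_NM = 0.5  # assumed acceptable drift before recalibration

    def nearest(peaks, target):
        return min(peaks, key=lambda p: abs(p - target))

    def calibration_ok(measured_peaks, references=HG_LINES_NM, tol=TOLERANCE_NM):
        """Flag drift if any reference line's nearest measured peak
        falls outside tolerance."""
        errors = [round(abs(nearest(measured_peaks, ref) - ref), 3)
                  for ref in references]
        return all(e <= tol for e in errors), errors

    # Simulated output of a peak-finding routine (not shown).
    measured = [253.9, 436.1, 546.8]
    ok, errors = calibration_ok(measured)
    print("calibration OK" if ok else f"recalibrate: errors {errors} nm")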

If that sounds like a bit of a stretch, consider that smart technology exists today in optical sensing devices such as detectors, imaging systems, and materials stress monitors.

Expertise in Components, Instruments, and Applications

Making systems and components intelligent, and collecting that intelligence, are important parts of the distributed sensing model; they are what make the sensing network worthwhile. Once the data are collected, they can be used to decide how best to obtain answers about an application. In the case of a smart spectrometer system, for example, simple tasks like changing integration time or slit width to optimize signal-to-noise performance are straightforward.
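
Even those straightforward tasks reward a little automation. Here is a minimal Python sketch of auto-exposure logic that nudges integration time until the strongest signal lands in a target window; the acquisition function, count targets, and limits are all hypothetical.

    # Sketch of automatic integration-time adjustment for signal level.
    # acquire_spectrum() is a hypothetical stand-in for a driver call;
    # the target window and limits are illustrative, not from any instrument.

    TARGET_LOW, TARGET_HIGH = 40000, 55000  # desired peak counts
    MIN_MS, MAX_MS = 1, 10000               # integration-time limits

    def acquire_spectrum(integration_ms):
        """Pretend detector: counts scale linearly with integration time."""
        return [integration_ms * 180]  # one-pixel 'spectrum' for the demo

    def auto_integrate(integration_ms=100):
        for _ in range(20):  # bounded number of attempts
            peak = max(acquire_spectrum(integration_ms))
            if TARGET_LOW <= peak <= TARGET_HIGH:
                return integration_ms
            # Scale exposure toward the middle of the target window.
            scale = ((TARGET_LOW + TARGET_HIGH) / 2) / max(peak, 1)
            integration_ms = min(MAX_MS, max(MIN_MS, integration_ms * scale))
        return integration_ms

    print(f"settled at {auto_integrate():.1f} ms")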

Smart devices sense other devices in the community, predict their performance as a system, optimize performance, make recommendations for new devices that will enhance performance, and compare performance among similar systems in the broader network. More complex issues can be addressed through artificial intelligence, neural nets, pattern recognition, and data mining — in short, any analytical technique that is appropriate and useful. The goal is to capture all the expertise accumulated in the design, manufacture, and use of optical sensing systems and components, and put that knowledge to use for the researcher, developer, or technician.

Also, this model could serve as the structure for a sort of expert system database. Using a spectrometer system as an example, we can use keywords in our database that describe certain conditions and report recommendations for dealing with them (that the recommendations are relevant presumes a certain level of expertise on the user's part). The user must supply some idea about the "why" of the application, so the software can make narrower choices. For example, if the user picks the Beer's Law applications feature, the software makes certain inferences: that samples and standards are being used, that maintaining absorbance linearity might require sample dilution, that stray light correction is needed, and so on.
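
In code, the Beer's law branch of such an expert system might reduce to a small rule set like the Python sketch below; the linearity threshold and the recommendation text are invented for illustration.

    # Toy expert-system rules for a Beer's law (absorbance) application.
    # The linearity threshold and recommendations are illustrative only.

    MAX_LINEAR_ABS = 1.5  # assumed upper bound of reliable linearity

    def beers_law_advice(absorbance, stray_light_corrected):
        """Return recommendations inferred from the application context."""
        advice = ["Measure standards that bracket the expected concentration."]
        if absorbance > MAX_LINEAR_ABS:
            advice.append(f"A = {absorbance:.2f} exceeds {MAX_LINEAR_ABS}; "
                          "dilute the sample to stay in the linear range.")
        if not stray_light_corrected:
            advice.append("Enable stray light correction before quantifying.")
        return advice

    for tip in beers_law_advice(absorbance=2.1, stray_light_corrected=False):
        print("-", tip)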

Ultimately, the hardware becomes appliance-like and self-sufficient; the value resides in the network itself, not the components. The distribution of knowledge becomes the new currency.

Distributed Sensing

Back in 1995, when I was answering my first questions about optical resolution, grating efficiency, and the like, there were really only three ways to share information with customers: by phone, by fax, or by mail.

The Internet soon changed all that. With the World Wide Web, information sharing is no longer limited by time or distance; the only barrier is access. So it's no surprise that researchers have already begun to harness the power of the Internet to share data via cabled and wireless networks of sensors. The Sensor Web, a concept developed at NASA/Jet Propulsion Laboratory in the late 1990s, is one such example. There's a great description at http://sensorwebs.jpl.nasa.gov/ of a Sensor Web that uses remote sensing, satellite imagery, and surface measurements to monitor volcano activity near Antarctica.

The implications for instrument design seem readily apparent: we need to think of new products beyond the confines of the laboratory. The ability to connect to the Internet wirelessly or via Ethernet should be as much a part of instrument design as processing speed and power requirements. What's more, there are implications for how instruments communicate with each other, how they share information, and how the user interface should be structured.

Communities of Instruments

The Sensor Web model suggests a good analogy for what comes next. Because each node on the Sensor Web represents the location of an imaging station, those nodes can be likened to the pixels of a camera. Only by collecting data from all the "pixels" does an image begin to emerge.

The same idea applies to other collections of instruments. Instruments become a community of independent modules that communicate. The communication can occur within an instrument and its subordinates — the secondary channels in a multichannel spectrometer setup, for example — or among instrument modules plugged in anywhere on the World Wide Web. The communities can be a few modules, as in a typical spectrophotometric experiment like measuring the absorbance of a sample in a cuvette, or they can be large and widely distributed. Imagine mapping atmospheric ozone using a network of spectrometers equipped with sampling devices looking directly up through the atmosphere; a thousand measurement locations around the world could be set up in a matter of hours. Or imagine measuring the UV absorbance of air pollutants across a highway: just find two office buildings with Internet access and put a deuterium source on one roof and a spectrometer on the other.
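
Surprisingly little plumbing is needed. The Python sketch below shows how a sensor node might report a reading to a central collector over HTTP using only the standard library; the collector URL, station ID, and payload fields are placeholders, not any real network's schema.

    # Sketch of a sensor node reporting to a central collector over HTTP.
    # The URL, station ID, and payload fields are placeholders; a real
    # network would define its own schema and authentication.

    import json
    import urllib.request

    COLLECTOR_URL = "http://collector.example.org/readings"  # placeholder

    def report_reading(station_id, wavelength_nm, absorbance):
        payload = json.dumps({
            "station": station_id,
            "wavelength_nm": wavelength_nm,
            "absorbance": absorbance,
        }).encode("utf-8")
        request = urllib.request.Request(
            COLLECTOR_URL, data=payload,
            headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(request, timeout=10) as response:
            return response.status  # 200 means the collector accepted it

    # One reading from a hypothetical rooftop station in the ozone band:
    # report_reading("rooftop-042", 305.0, 0.87)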

There's a slightly more mercenary side to what this model offers, which is the ability for both the instrument supplier and the knowledge supplier to communicate directly with users through online chat, video, voice, text, or more. The network of users becomes at once a delivery mechanism for marketing information and a channel for distributing other intellectual products such as operating manuals and repair and service reports.

The smart instrument modules of the network form an infrastructure upon which information can be shared and businesses can be built. The value resides in the network itself, not the pieces (think of all the folks developing applications for the iPhone and iPod touch). Forward-thinking companies in all sorts of technology industries won't be focused solely on designing and distributing better MP3 players or cell phones, for instance, but on creating methods for distributing the information those devices capture, display, and store.

None of this will happen overnight. But driven by the pressing technology concerns of the day — think of areas such as energy self-sufficiency and environmental sustainability — researchers and developers are likely to consider the impact of distributed sensing models on applications such as ozone monitoring, agricultural yields, and environmental analysis.

The good news is, the technologies already exist to make all this happen, and the Apple model certainly proves that distributed information has value to many. Consider that 20 years ago, it was hard to imagine a spectrometer portable enough to take to the sample, but today many thousands of spectrometers are in use in situ in even the most inhospitable places. Advances in lasers and optics have been no less dramatic.

So, perhaps the question for distributed sensing is not if, but when?

Rob Morris is Senior Director, Brand for miniature fiber optic spectrometer pioneer Ocean Optics, Inc. He joined Ocean Optics in 1995 and has held various positions within the company in sales, marketing, and customer relations. Morris has a bachelor's degree in journalism from Pennsylvania State University and nearly 25 years of experience in marketing, public relations, and publishing.
