The Role of Explainable AI in Agriculture

Key Takeaways

  • Explainable AI (XAI) enhances transparency and reliability in agricultural spectroscopy, addressing the "black box" issue of machine learning models.
  • XAI clarifies which variables influence decisions, fostering trust and innovation in agricultural quality assessment.

A new review article highlights how Explainable Artificial Intelligence (XAI) can enhance transparency, trust, and innovation in agricultural spectroscopy, paving the way for smarter and more sustainable food quality assessment.

Artificial intelligence (AI) has recently become part of standard workflows across many areas of applied computing, and workers are increasingly expected to use AI tools to work more efficiently. A recent review article examined how a set of methods known as explainable AI (XAI) is helping integrate machine learning (ML) models into practical applications in agricultural spectroscopy (1). The review, led by Mohammed Kamruzzaman of the University of Illinois Urbana-Champaign, highlighted the role of XAI in enhancing transparency, reliability, and innovation in the evaluation of agricultural and food product quality (1).

Rear view of senior farmer standing in soybean field examining crop at sunset. | Image Credit: © Zoran Zeremski - stock.adobe.com

Spectroscopic techniques, including visible, near-infrared, and hyperspectral imaging (HSI), have become indispensable tools in agriculture for tasks such as assessing crop health, monitoring food quality, and optimizing postharvest processes (2). The integration of advanced ML and deep learning algorithms has significantly boosted the accuracy of these techniques (1,2). However, the “black box” nature of such models often prevents users from understanding how predictions are made, or which spectral wavelengths contribute most to results (1,2). The researchers discuss how this lack of interpretability has negative downstream effects. These include hampering innovation, limiting the selection of efficient spectral bands for specific applications, and slowing down the design of portable, cost-effective devices (1).

Kamruzzaman and his team highlighted in their review article that XAI can potentially help address these challenges. Unlike traditional algorithms, which provide predictions without context, XAI methods can reveal which variables, such as specific wavelengths or spectral features, drive decision-making (1). This clarity, the authors argue, could help improve agricultural quality assessment by fostering greater trust in AI systems (1).
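To make this concrete, the idea of revealing which wavelengths drive a model's predictions can be illustrated with permutation importance, one common XAI-style technique (the review surveys many; this is not necessarily the authors' method). The sketch below uses entirely synthetic data, and the wavelength range and informative band positions are illustrative assumptions, not values from the study.

```python
# Sketch: ranking which "wavelengths" drive a spectral regression model
# using permutation importance. Data is synthetic; the NIR range and the
# choice of informative bands (10 and 30) are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(0)
n_samples, n_bands = 200, 50
wavelengths = np.linspace(900, 1700, n_bands)  # hypothetical NIR range (nm)

X = rng.normal(size=(n_samples, n_bands))
# Pretend only bands 10 and 30 carry the quality signal.
y = 3.0 * X[:, 10] - 2.0 * X[:, 30] + rng.normal(scale=0.1, size=n_samples)

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
result = permutation_importance(model, X, y, n_repeats=5, random_state=0)

# The two highest-importance bands should be the ones that carry the signal.
top = np.argsort(result.importances_mean)[::-1][:2]
for i in top:
    print(f"band {i} (~{wavelengths[i]:.0f} nm): "
          f"importance {result.importances_mean[i]:.3f}")
```

An output like this, mapped back to physical wavelengths, is exactly the kind of explanation that lets a practitioner check whether the model is keying on chemically plausible spectral features rather than noise.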

Although AI is being integrated with spectroscopy more regularly, the research on using XAI in agricultural contexts remains limited. Individual studies have explored aspects of explainability in spectral analysis, including HSI applications, but a comprehensive framework has been missing (1). The review article addresses that gap by systematically categorizing recent literature, identifying progress, outlining current limitations, and projecting future directions.

One of the main insights from the review article is that XAI could empower agricultural practitioners to make more informed decisions when deploying spectroscopic models. For example, by pinpointing which spectral bands most influence quality assessments, researchers can develop leaner models that reduce computational requirements and lower costs, which are critical factors in scaling portable devices for field use (1). Additionally, transparent AI systems are more likely to be adopted by farmers, food processors, and regulators who require confidence in automated decision-making processes (1).
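The leaner-model argument can be sketched in a few lines: once importance scores have flagged the informative bands, a model trained on just those bands can match a full-spectrum model while using far fewer inputs. Everything here is synthetic, and the flagged band indices are assumed known for illustration rather than derived from any real instrument.

```python
# Sketch: a lean model trained only on bands flagged as informative can
# rival a full-spectrum model with a fraction of the inputs. Synthetic
# data; the "informative" band indices are an illustrative assumption.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_samples, n_bands = 300, 200
X = rng.normal(size=(n_samples, n_bands))
# Only two of the 200 bands actually carry the quality signal.
y = 2.0 * X[:, 5] + 1.5 * X[:, 120] + rng.normal(scale=0.1, size=n_samples)

informative = [5, 120]  # bands an XAI analysis would flag (assumed here)
full = cross_val_score(Ridge(), X, y, cv=5).mean()
lean = cross_val_score(Ridge(), X[:, informative], y, cv=5).mean()
print(f"full model: {n_bands} bands, mean CV R^2 = {full:.3f}")
print(f"lean model: {len(informative)} bands, mean CV R^2 = {lean:.3f}")
```

For a portable field device, the practical payoff of the lean variant is that only the flagged wavelengths need to be measured at all, which is what makes cheaper, narrower-band hardware feasible.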

Despite these positive developments, XAI still has limitations that will need to be ironed out. For instance, the computational complexity of some XAI techniques remains an ongoing challenge, as do the difficulties of standardizing explainability across different agricultural domains and of balancing accuracy with interpretability (1). The authors also caution that although XAI can shed light on model mechanisms, its explanations must be carefully validated to avoid misleading interpretations (1).

As XAI methods continue to improve, they are expected to support the development of new non-destructive quality assessment tools, enabling greater precision in food engineering, crop monitoring, and postharvest evaluation (1).

By systematically consolidating the growing body of work on XAI in agricultural spectroscopy, this study provides a roadmap for researchers and practitioners alike. As Kamruzzaman’s team at the University of Illinois Urbana-Champaign concludes, the marriage of explainability and AI has the potential to transform how agricultural quality is measured, making advanced technologies both more accessible and more trustworthy (1).

References

  1. Ahmed, M. T.; Ahmed, M. W.; Kamruzzaman, M. A Systematic Review of Explainable Artificial Intelligence for Spectroscopic Agricultural Quality Assessment. Comp. Electron. Agric. 2025, 235, 110354. DOI: 10.1016/j.compag.2025.110354
  2. Workman, Jr., J. A Review of the Latest Spectroscopic Research in Agriculture Analysis. Spectroscopy. Available at: https://www.spectroscopyonline.com/view/a-review-of-the-latest-spectroscopic-research-in-agriculture-analysis (accessed 2025-08-19).
