Enhancing Decision Making Across Industries With SHAP

  • Unique Paper ID: 194159
  • Volume: 12
  • Issue: 10
  • PageNo: 3670-3678
  • Abstract:
  • Explainable Artificial Intelligence is crucial for building trust, transparency, and accountability in decision-making. Explainable AI (XAI), and SHAP (SHapley Additive exPlanations) in particular, is vital in making sophisticated AI models more interpretable and trustworthy. In agriculture, for example, XAI lets farmers see the effects of soil quality, rainfall, and nutrient levels on crops more clearly, so they can make data-driven decisions around those factors and adopt sustainable farming practices. In residential energy management, SHAP makes predictions more interpretable by showing how much temperature, humidity, occupancy, and time of day each contribute to energy-consumption forecasts, giving households and policymakers the information they need to improve energy efficiency. In industrial settings, XAI supports safety by identifying the sensor signals and environmental readings most indicative of toxic gas leaks, enabling individuals and organizations to respond faster to identified hazards. As these examples demonstrate, SHAP helps improve predictions and identify the factors that strengthen trust, accountability, and fairness in AI systems, supporting decision-making in the real world.
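The residential energy example above rests on SHAP's core idea: each feature's contribution to a prediction is its Shapley value, averaged over all coalitions of the other features. As a minimal, self-contained sketch (not code from the paper), the exact Shapley computation can be written in pure Python for a small model; the `predict` function, feature names, and numbers below are hypothetical illustrations of an energy-use predictor.

```python
from itertools import combinations
from math import factorial

def shapley_values(f, x, baseline):
    """Exact Shapley values for model f at input x.

    Features absent from a coalition are replaced by their baseline
    (e.g. average) values, a common approximation in SHAP-style tools.
    """
    n = len(x)
    phis = []
    for i in range(n):
        others = [j for j in range(n) if j != i]
        phi = 0.0
        for size in range(n):
            # Shapley weight for coalitions of this size
            weight = factorial(size) * factorial(n - size - 1) / factorial(n)
            for S in combinations(others, size):
                with_i = [x[j] if (j in S or j == i) else baseline[j] for j in range(n)]
                without_i = [x[j] if j in S else baseline[j] for j in range(n)]
                phi += weight * (f(with_i) - f(without_i))
        phis.append(phi)
    return phis

# Hypothetical linear energy-use model: temperature, humidity, occupancy.
def predict(v):
    temp, humidity, occupancy = v
    return 2.0 * temp + 0.5 * humidity + 3.0 * occupancy + 10.0

x = [25.0, 60.0, 4.0]          # current household reading (illustrative)
baseline = [20.0, 50.0, 2.0]   # average conditions (illustrative)
phi = shapley_values(predict, x, baseline)
# Additivity property: contributions sum to f(x) - f(baseline)
```

For a linear model the Shapley value of each feature reduces to its coefficient times its deviation from baseline, which makes the attributions easy to check by hand; real SHAP implementations (e.g. the `shap` library) use efficient approximations of this same enumeration for nonlinear models.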

Copyright & License

Copyright © 2026. Authors retain the copyright of this article. This article is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

BibTeX

@article{194159,
        author = {Sara Sowmya and Soumita Sen and Geethesh Chowdary},
        title = {Enhancing Decision Making Across Industries With SHAP},
        journal = {International Journal of Innovative Research in Technology},
        year = {2026},
        volume = {12},
        number = {10},
        pages = {3670-3678},
        issn = {2349-6002},
        url = {https://ijirt.org/article?manuscript=194159},
        abstract = {Explainable Artificial Intelligence is crucial for building trust, transparency, and accountability in decision-making. Explainable AI (XAI), and SHAP (SHapley Additive exPlanations) in particular, is vital in making sophisticated AI models more interpretable and trustworthy. In agriculture, for example, XAI lets farmers see the effects of soil quality, rainfall, and nutrient levels on crops more clearly, so they can make data-driven decisions around those factors and adopt sustainable farming practices. In residential energy management, SHAP makes predictions more interpretable by showing how much temperature, humidity, occupancy, and time of day each contribute to energy-consumption forecasts, giving households and policymakers the information they need to improve energy efficiency. In industrial settings, XAI supports safety by identifying the sensor signals and environmental readings most indicative of toxic gas leaks, enabling individuals and organizations to respond faster to identified hazards. As these examples demonstrate, SHAP helps improve predictions and identify the factors that strengthen trust, accountability, and fairness in AI systems, supporting decision-making in the real world.},
        keywords = {Explainable Artificial Intelligence (XAI), SHAP (Shapley Additive Explanations), Machine Learning Interpretability, Predictive Analytics, Decision Support Systems.},
        month = {March},
        }

Cite This Article

Sowmya, S., Sen, S., & Chowdary, G. (2026). Enhancing Decision Making Across Industries With SHAP. International Journal of Innovative Research in Technology (IJIRT), 12(10), 3670–3678.
