FailNetX: A Temporal Stress-Testing Framework for Machine Learning Systems

  • Unique Paper ID: 192182
  • Volume: 12
  • Issue: 9
  • PageNo: 743-747
  • Abstract: Machine learning models often perform well during training but lose reliability after deployment due to changes in data distributions. This paper presents AI Resilience Lab, a web-based platform designed to proactively evaluate model behaviour under realistic data drift conditions before production deployment. The system integrates a drift simulation engine, a SHAP-based explainability module for identifying feature-level causes of degradation, and a unified risk assessment framework that quantifies model reliability. Implemented using React and FastAPI, the platform supports time-based simulations across varying drift intensities. Experimental results on benchmark datasets demonstrate effective performance degradation analysis, accurate root cause identification, and early warning of potential model failure. The proposed approach addresses key limitations of existing MLOps practices by enabling proactive model robustness evaluation instead of reactive monitoring.
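The abstract gives no implementation detail, but the workflow it describes (time-stepped drift simulation, per-step performance tracking, and SHAP-based identification of the features driving degradation) can be illustrated with a minimal sketch. The code below is an illustration under assumptions, not the authors' system: the synthetic regression task, the inject_drift helper, and the chosen drift intensities are hypothetical stand-ins for the platform's drift simulation engine.

# Minimal sketch (not the paper's implementation): simulate covariate drift on a
# synthetic regression task and use SHAP attributions to flag the drifted feature.
# The inject_drift() helper and the drift intensities below are illustrative only.
import numpy as np
import shap
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

# Synthetic "pre-deployment" data and a reference model.
X, y = make_regression(n_samples=2000, n_features=5, noise=0.1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_train, y_train)

def inject_drift(X, feature_idx, intensity):
    """Shift one feature by `intensity` standard deviations (simple covariate drift)."""
    X_drifted = X.copy()
    X_drifted[:, feature_idx] += intensity * X[:, feature_idx].std()
    return X_drifted

explainer = shap.TreeExplainer(model)
baseline_shap = np.abs(explainer.shap_values(X_test)).mean(axis=0)

# Time-based stress test: increasing drift intensity at each simulated step.
for step, intensity in enumerate([0.0, 0.5, 1.0, 2.0]):
    X_step = inject_drift(X_test, feature_idx=2, intensity=intensity)
    score = r2_score(y_test, model.predict(X_step))

    # Feature-level root cause: largest change in mean |SHAP| versus the baseline.
    step_shap = np.abs(explainer.shap_values(X_step)).mean(axis=0)
    suspect = int(np.argmax(np.abs(step_shap - baseline_shap)))
    print(f"step={step} intensity={intensity:.1f} R2={score:.3f} suspect_feature={suspect}")

Comparing mean absolute SHAP values before and after drift, as in the loop above, is one simple way to carry out the feature-level root-cause identification the abstract refers to; the abstract does not specify how the platform combines such signals into its unified risk score, so that part is not reproduced here.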

Copyright & License

Copyright © 2026. The authors retain the copyright of this article. This article is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

BibTeX

@article{192182,
  author   = {Vedavi V and M C S Geetha},
  title    = {FailNetX: A Temporal Stress-Testing Framework for Machine Learning Systems},
  journal  = {International Journal of Innovative Research in Technology},
  year     = {2026},
  volume   = {12},
  number   = {9},
  pages    = {743-747},
  issn     = {2349-6002},
  url      = {https://ijirt.org/article?manuscript=192182},
  abstract = {Machine learning models often perform well during training but lose reliability after deployment due to changes in data distributions. This paper presents AI Resilience Lab, a web-based platform designed to proactively evaluate model behaviour under realistic data drift conditions before production deployment. The system integrates a drift simulation engine, a SHAP-based explainability module for identifying feature-level causes of degradation, and a unified risk assessment framework that quantifies model reliability. Implemented using React and FastAPI, the platform supports time-based simulations across varying drift intensities. Experimental results on benchmark datasets demonstrate effective performance degradation analysis, accurate root cause identification, and early warning of potential model failure. The proposed approach addresses key limitations of existing MLOps practices by enabling proactive model robustness evaluation instead of reactive monitoring.},
  keywords = {Machine Learning, Data Drift, Model Robustness, Explainable AI, SHAP, Risk Assessment, MLOps, Model Monitoring},
  month    = {February}
}

Cite This Article

V, V., & Geetha, M. C. S. (2026). FailNetX: A Temporal Stress-Testing Framework for Machine Learning Systems. International Journal of Innovative Research in Technology (IJIRT), 12(9), 743–747.
