AI-Driven Timing Violation Predictor for RTL Circuits

  • Unique Paper ID: 181788
  • ISSN: 2349-6002
  • Volume: 12
  • Issue: 1
  • PageNo: 5481-5484
  • Abstract:
  • Timing analysis is a crucial step in the design of Integrated Circuits (ICs) and System-on-Chip (SoC) architectures. Traditional methods rely on synthesis-based timing reports, which are computationally expensive. This paper presents an AI-driven model that predicts the combinational logic depth of signals in RTL circuits, enabling detection of timing violations before synthesis. Unlike prior ML-based timing analysis approaches, which rely primarily on post-synthesis data, our model operates at the RTL stage, reducing synthesis runtime by approximately 40-60% depending on circuit complexity. The novelty of the approach lies in feature-driven learning, which leverages key RTL properties to build an efficient predictive model that achieves 85-90% accuracy in predicting timing violations. Validation with industry-standard tools, including Synopsys Design Compiler and Cadence Genus, confirms its effectiveness for real-world integration.
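
The following is a minimal, illustrative sketch of the feature-driven prediction idea described in the abstract. It is not the authors' implementation: the RTL-level features (fan-in, operator count, nesting depth, multiplexer count), the synthetic training data, the choice of a random-forest regressor, and the depth budget are all assumptions made for the example.

```python
# Hedged sketch: predicting combinational logic depth of RTL signals from
# RTL-level features, then flagging likely timing violations before synthesis.
# Feature names and the depth budget are illustrative assumptions only.

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for a training set: one row per signal, with features
# mined from the RTL and post-synthesis logic depths used as labels.
n_signals = 1000
X = np.column_stack([
    rng.integers(1, 32, n_signals),   # fan_in: number of driving signals
    rng.integers(1, 64, n_signals),   # operator_count: arithmetic/logic ops on the path
    rng.integers(0, 8, n_signals),    # nesting_depth: if/case nesting in the RTL
    rng.integers(0, 16, n_signals),   # mux_count: inferred multiplexers
])
# Toy label: logic depth loosely correlated with the features, plus noise.
y = (0.3 * X[:, 0] + 0.2 * X[:, 1] + 1.5 * X[:, 2] + 0.4 * X[:, 3]
     + rng.normal(0, 2, n_signals))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Flag a potential timing violation when the predicted depth exceeds the
# depth budget implied by the target clock period (assumed constant here).
DEPTH_BUDGET = 20
predicted_depth = model.predict(X_test)
violations = predicted_depth > DEPTH_BUDGET
print(f"Predicted violations on test set: {violations.sum()} / {len(violations)}")
```

In a real flow, the labels would come from post-synthesis timing or depth reports (e.g., from Synopsys Design Compiler or Cadence Genus), and the trained model would then be queried on new RTL without invoking synthesis.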
