Title
Timing Optimization of Digital Circuits using Graph Neural Networks
Authors
Jun Kim; Jaeyong Chung
DOI
https://doi.org/10.5573/ieie.2024.61.2.47
Keywords
Deep learning; Static timing analysis; Attention; Hyperparameter; Optimizer
Abstract
The Analytical Gradient Descent (AGD) algorithm computes analytical gradients of a target objective through a neural network trained from a simulation model and uses them for optimization. The AGD algorithm can be useful in many optimization problems; in particular, when combined with deep learning models used in electronic design automation (EDA), circuit performance can be predicted and improved. Deep learning models with the AGD algorithm show performance close to Synopsys Design Compiler (DC), which is widely used in digital circuit design. However, the existing AGD algorithm was run with untuned hyperparameters, and several applicable optimization techniques had not been applied. In this paper, various optimization techniques that can raise the performance of the AGD algorithm were applied, and the delay of the logic gates was reduced by tuning hyperparameter values. When the gate sizes of a digital circuit were adjusted with the performance-enhanced AGD algorithm, the quality of result (QoR) was -0.1572 relative to Synopsys DC. This result indicates how hyperparameter tuning and the application of appropriate optimization techniques affect the model. This paper shows that hyperparameter tuning and optimization techniques suited to a deep learning model are factors that can improve performance, and the performance-enhanced AGD algorithm could also be used in other areas that require optimization.
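To illustrate the general idea described in the abstract, the sketch below shows AGD-style optimization with a differentiable surrogate: a neural network stands in for the simulation model, analytical gradients of the predicted delay with respect to gate sizes are obtained by backpropagation, and an optimizer updates the sizes. This is a minimal, hypothetical example, not the paper's implementation; the model architecture, variable names, and all hyperparameter values are assumptions.

```python
# Illustrative sketch (assumed, not from the paper): gradient-based gate sizing
# through a differentiable surrogate model, in the spirit of the AGD algorithm.
import torch
import torch.nn as nn

NUM_GATES = 8  # hypothetical circuit with 8 sizable gates

# Stand-in for a neural network trained on simulation/STA data to predict the
# critical-path delay from a vector of gate sizes (randomly initialized here,
# purely for illustration).
delay_surrogate = nn.Sequential(
    nn.Linear(NUM_GATES, 64),
    nn.ReLU(),
    nn.Linear(64, 1),
)
delay_surrogate.eval()
for p in delay_surrogate.parameters():
    p.requires_grad_(False)  # the surrogate is fixed; only gate sizes are optimized

# Gate sizes are the optimization variables; analytical gradients of the
# predicted delay with respect to them come from backpropagation.
gate_sizes = torch.ones(NUM_GATES, requires_grad=True)

# The optimizer choice and learning rate are examples of the hyperparameters
# the paper reports tuning.
optimizer = torch.optim.Adam([gate_sizes], lr=0.05)

for step in range(200):
    optimizer.zero_grad()
    predicted_delay = delay_surrogate(gate_sizes.unsqueeze(0)).squeeze()
    predicted_delay.backward()  # analytical gradient through the surrogate
    optimizer.step()
    with torch.no_grad():
        gate_sizes.clamp_(min=0.5, max=8.0)  # keep sizes in a legal library range

print("optimized gate sizes:", gate_sizes.detach())
```

In such a setup, the quality of the optimized sizes depends on how faithfully the surrogate reproduces the simulation model and on the optimizer settings, which is why hyperparameter tuning matters in the paper's results.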