
A Deep Learning Based Cost Model for Automatic Code Optimization

Abstract : Enabling compilers to optimize code automatically has been a longstanding goal of the compiler community. Solving this problem efficiently requires precise cost models, which predict whether applying a sequence of code transformations reduces a program's execution time. Building such an analytical cost model is hard on modern x86 architectures due to the complexity of the microarchitecture. In this paper, we present a novel deep-learning-based cost model for automatic code optimization. The model was integrated into a search method and implemented in the Tiramisu compiler to select the best code transformations. Its input is a set of simple features representing the unoptimized code together with a sequence of code transformations; it predicts the speedup expected when those transformations are applied. Unlike previous models, the proposed one works on full programs and does not rely on any heavy feature engineering. It achieves a mean absolute percentage error of only 16% when predicting speedups on full programs, and it enables Tiramisu to automatically find code transformations that match or outperform those of state-of-the-art compilers, without the heavy feature engineering those compilers require.
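The abstract describes a model that maps (features of the unoptimized code, encoding of a transformation sequence) to a predicted speedup, evaluated with mean absolute percentage error (MAPE). The sketch below illustrates that interface only; the feature dimensions, the tiny MLP, and the exponential output head are hypothetical stand-ins, not the architecture from the paper (which is trained, not randomly initialized as here).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions -- illustrative, not taken from the paper.
N_PROGRAM_FEATURES = 8    # simple features of the unoptimized code
N_TRANSFORM_FEATURES = 4  # encoding of the candidate transformation sequence
HIDDEN = 16

# A tiny randomly initialized MLP standing in for the learned cost model.
W1 = rng.normal(size=(N_PROGRAM_FEATURES + N_TRANSFORM_FEATURES, HIDDEN))
b1 = np.zeros(HIDDEN)
W2 = rng.normal(size=(HIDDEN, 1))
b2 = np.zeros(1)

def predict_speedup(program_feats, transform_feats):
    """Predict the speedup of applying a transformation sequence to a program."""
    x = np.concatenate([program_feats, transform_feats])
    h = np.maximum(x @ W1 + b1, 0.0)        # ReLU hidden layer
    return float(np.exp(h @ W2 + b2)[0])    # exp keeps the speedup positive

def mape(y_true, y_pred):
    """Mean absolute percentage error, the metric quoted in the abstract."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.mean(np.abs((y_true - y_pred) / y_true)) * 100.0)

def best_transformation(program_feats, candidate_sequences):
    """Search step: pick the candidate with the highest predicted speedup."""
    return max(candidate_sequences,
               key=lambda seq: predict_speedup(program_feats, seq))
```

In a search method like the one described, `best_transformation` would be called repeatedly over candidate transformation sequences, with the cost model replacing actual (slow) compile-and-run measurements.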
Contributor : Daniel Gavrysiak
Submitted on : Friday, September 17, 2021 - 3:52:04 PM
Last modification on : Sunday, June 26, 2022 - 9:27:03 AM



  • HAL Id : hal-03347923, version 1
  • ARXIV : 2104.04955




Mohamed Riyadh Baghdadi, Massinissa Merouani, Mohamed-Hicham Leghettas, Kamel Abdous, Taha Arbaoui, et al. A Deep Learning Based Cost Model for Automatic Code Optimization. 4th MLSys Conference, 2021, San Jose, United States. ⟨hal-03347923⟩


