PINAT: A Permutation INvariance Augmented Transformer for NAS Predictor

Authors

  • Shun Lu Research Center for Intelligent Computing Systems, Institute of Computing Technology, Chinese Academy of Sciences; School of Computer Science and Technology, University of Chinese Academy of Sciences
  • Yu Hu Research Center for Intelligent Computing Systems, Institute of Computing Technology, Chinese Academy of Sciences; School of Computer Science and Technology, University of Chinese Academy of Sciences
  • Peihao Wang University of Texas at Austin
  • Yan Han University of Texas at Austin
  • Jianchao Tan Kuaishou Technology
  • Jixiang Li Kuaishou Technology
  • Sen Yang Snap Inc.
  • Ji Liu Meta Platforms, Inc.

DOI:

https://doi.org/10.1609/aaai.v37i7.26076

Keywords:

ML: Auto ML and Hyperparameter Tuning, ML: Deep Neural Architectures

Abstract

Time-consuming performance evaluation is the bottleneck of traditional Neural Architecture Search (NAS) methods. Predictor-based NAS can speed up performance evaluation by directly predicting performance, rather than training a large number of sub-models and then validating their performance. Most predictor-based NAS approaches use a proxy dataset to train model-based predictors efficiently, but suffer from performance degradation and poor generalization. We attribute these problems to the limited ability of existing predictors to characterize the structure of sub-models, specifically the extraction of topology information and the representation of node features in the input graph data. To address these problems, we propose PINAT, a Transformer-like NAS predictor that consists of a Permutation INvariance Augmentation module serving as both the token embedding layer and the self-attention head, together with a Laplacian matrix as the positional encoding. Our design produces more representative features of the encoded architecture and outperforms state-of-the-art NAS predictors on six search spaces: NAS-Bench-101, NAS-Bench-201, DARTS, ProxylessNAS, PPI, and ModelNet. The code is available at https://github.com/ShunLu91/PINAT.
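As a rough illustration of the two ingredients named in the abstract, the sketch below derives positional encodings from the eigenvectors of an architecture graph's Laplacian and feeds them, together with operation embeddings, to a small Transformer encoder with a permutation-invariant mean-pooled read-out. This is a minimal sketch assuming PyTorch and NumPy; `laplacian_pe`, `TinyPredictor`, and all hyperparameters are hypothetical and do not reproduce the paper's actual design (consult the linked repository for that).

    import numpy as np
    import torch
    import torch.nn as nn

    def laplacian_pe(adj: np.ndarray, k: int) -> torch.Tensor:
        """Positional encoding from the k smallest non-trivial eigenvectors
        of the symmetric normalized graph Laplacian (illustrative only)."""
        a = np.maximum(adj, adj.T).astype(float)   # symmetrize the DAG adjacency
        deg = a.sum(axis=1)
        d_inv_sqrt = np.zeros_like(deg)
        d_inv_sqrt[deg > 0] = deg[deg > 0] ** -0.5
        lap = np.eye(len(a)) - d_inv_sqrt[:, None] * a * d_inv_sqrt[None, :]
        _, vecs = np.linalg.eigh(lap)              # eigenvalues in ascending order
        return torch.tensor(vecs[:, 1:k + 1], dtype=torch.float32)

    class TinyPredictor(nn.Module):
        """Hypothetical Transformer-style accuracy predictor: operation
        embeddings plus Laplacian positional encodings, a self-attention
        encoder, and a permutation-invariant mean-pooled regression head."""
        def __init__(self, num_ops: int, dim: int = 64, pe_dim: int = 4):
            super().__init__()
            self.op_emb = nn.Embedding(num_ops, dim)
            self.pe_proj = nn.Linear(pe_dim, dim)
            layer = nn.TransformerEncoderLayer(d_model=dim, nhead=4,
                                               batch_first=True)
            self.encoder = nn.TransformerEncoder(layer, num_layers=2)
            self.head = nn.Linear(dim, 1)

        def forward(self, ops: torch.Tensor, pe: torch.Tensor) -> torch.Tensor:
            x = self.op_emb(ops) + self.pe_proj(pe)  # (batch, nodes, dim)
            x = self.encoder(x).mean(dim=1)          # order-invariant read-out
            return self.head(x).squeeze(-1)          # one score per graph

    # Toy usage: a 5-node cell, input -> two parallel ops -> merge -> output.
    adj = np.array([[0, 1, 1, 0, 0],
                    [0, 0, 0, 1, 0],
                    [0, 0, 0, 1, 0],
                    [0, 0, 0, 0, 1],
                    [0, 0, 0, 0, 0]])
    pe = laplacian_pe(adj, k=4).unsqueeze(0)         # (1, 5, 4)
    ops = torch.tensor([[0, 2, 3, 2, 1]])            # (1, 5) operation indices
    print(TinyPredictor(num_ops=5)(ops, pe))         # untrained, random score

Laplacian eigenvectors are a standard graph positional encoding: unlike sequential position indices, they are determined by the graph's topology itself, which is why they suit architecture graphs better than the sinusoidal encodings of a vanilla Transformer. Their signs are ambiguous, so implementations commonly randomize eigenvector signs during training.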

Published

2023-06-26

How to Cite

Lu, S., Hu, Y., Wang, P., Han, Y., Tan, J., Li, J., Yang, S., & Liu, J. (2023). PINAT: A Permutation INvariance Augmented Transformer for NAS Predictor. Proceedings of the AAAI Conference on Artificial Intelligence, 37(7), 8957-8965. https://doi.org/10.1609/aaai.v37i7.26076

Section

AAAI Technical Track on Machine Learning II