*Result*: From hand-crafted metrics to evolved training-free performance predictors for neural architecture search via symbolic regression.
*Further Information*
Using training-free (TF) metrics as proxies for network performance has demonstrated both its efficiency and efficacy in Neural Architecture Search (NAS). However, a notable limitation of most TF proxies is their inconsistency, as reflected by the substantial variation in their performance across different NAS problems. Furthermore, the design of existing TF metrics is manual, involving a time-consuming trial-and-error process that requires considerable domain expertise. These challenges raise two interesting questions: (1) Can the design of TF metrics be automated? and (2) Can existing hand-crafted TF metrics be leveraged to synthesize a better-performing proxy? In this study, we present a Symbolic Regression framework based on Genetic Programming to automatically synthesize high-quality TF metrics from existing hand-crafted TF metrics. Extensive experiments on 13 problems from NAS-Bench-Suite-Zero demonstrate that our automatically synthesized metric exhibits a strong positive rank correlation with true network performance across diverse NAS problems and consistently outperforms hand-crafted metrics. When used as the search objective in an evolutionary algorithm, our evolved TF proxy metric guides the search to efficiently identify competitive architectures in different search spaces, highlighting its transferability and practicality. The source code can be found at https://github.com/ELO-Lab/SR-TF-NAS.

• Automated synthesis of training-free (TF) NAS proxy metrics via symbolic regression.
• Synthesized metric better correlates with network performance than hand-crafted ones.
• Evolved TF proxy metric efficiently discovers competitive architectures in NAS.

[ABSTRACT FROM AUTHOR]
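To make the abstract's core idea concrete, the sketch below shows what "symbolic regression via genetic programming over hand-crafted TF metrics" could look like in miniature: expression trees combine base metric values with arithmetic operators, and fitness is the Spearman rank correlation between the evolved metric's scores and true accuracies. This is an illustrative toy, not the authors' implementation; all function names, operator sets, and hyperparameters here are assumptions, and real base metrics (e.g. per-architecture scores from NAS-Bench-Suite-Zero) would replace the random toy features.

```python
import random

# Protected arithmetic primitives for the expression trees (assumed operator set).
OPS = {
    "add": lambda a, b: a + b,
    "sub": lambda a, b: a - b,
    "mul": lambda a, b: a * b,
    "div": lambda a, b: a / b if abs(b) > 1e-9 else 1.0,  # protected division
}

def spearman(xs, ys):
    """Spearman rank correlation (ties broken arbitrarily; fine for a sketch)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        for rank, i in enumerate(order):
            r[i] = float(rank)
        return r
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx) ** 0.5
    vy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (vx * vy) if vx and vy else 0.0

def random_tree(n_feats, depth=3):
    """Random expression tree; a leaf is the index of one hand-crafted metric."""
    if depth == 0 or random.random() < 0.3:
        return random.randrange(n_feats)
    op = random.choice(list(OPS))
    return (op, random_tree(n_feats, depth - 1), random_tree(n_feats, depth - 1))

def evaluate(tree, feats):
    """Evaluate an expression tree on one architecture's base-metric vector."""
    if isinstance(tree, int):
        return feats[tree]
    op, left, right = tree
    return OPS[op](evaluate(left, feats), evaluate(right, feats))

def mutate(tree, n_feats):
    """Replace a randomly chosen subtree with a fresh random subtree."""
    if isinstance(tree, int) or random.random() < 0.3:
        return random_tree(n_feats, 2)
    op, left, right = tree
    if random.random() < 0.5:
        return (op, mutate(left, n_feats), right)
    return (op, left, mutate(right, n_feats))

def crossover(a, b):
    """Graft a random subtree of b into a random position of a."""
    if isinstance(a, tuple) and random.random() < 0.5:
        op, left, right = a
        if random.random() < 0.5:
            return (op, crossover(left, b), right)
        return (op, left, crossover(right, b))
    while isinstance(b, tuple) and random.random() < 0.5:
        b = b[1] if random.random() < 0.5 else b[2]
    return b

def fitness(tree, X, y):
    """Rank correlation between evolved-metric scores and true performance y."""
    return spearman([evaluate(tree, f) for f in X], y)

def evolve(X, y, pop_size=40, gens=30, seed=0):
    """Elitist GP loop: keep the top quarter, refill via crossover + mutation."""
    random.seed(seed)
    n = len(X[0])
    pop = [random_tree(n) for _ in range(pop_size)]
    for _ in range(gens):
        elite = sorted(pop, key=lambda t: fitness(t, X, y), reverse=True)[: pop_size // 4]
        pop = list(elite)
        while len(pop) < pop_size:
            a, b = random.sample(elite, 2)
            child = crossover(a, b)
            if random.random() < 0.3:
                child = mutate(child, n)
            pop.append(child)
    return max(pop, key=lambda t: fitness(t, X, y))
```

In practice, `X` would hold each candidate architecture's vector of hand-crafted TF scores and `y` the benchmark accuracies on the training problems; the returned tree is then applied unchanged as a search objective on unseen search spaces, which is the transferability claim the abstract makes.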