PED-ANOVA: Efficiently Quantifying Hyperparameter Importance in Arbitrary Subspaces
Shuhei Watanabe, Archit Bansal, Frank Hutter
Proceedings of the Thirty-Second International Joint Conference on Artificial Intelligence
Main Track. Pages 4389-4396.
https://doi.org/10.24963/ijcai.2023/488
The recent rise in popularity of Hyperparameter Optimization (HPO) for deep learning has highlighted the role that good hyperparameter (HP) space design can play in training strong models. In turn, designing a good HP space is critically dependent on understanding the role of different HPs. This motivates research on HP Importance (HPI), e.g., with the popular method of functional ANOVA (f-ANOVA). However, the original f-ANOVA formulation is inapplicable to the subspaces most relevant to algorithm designers, such as those defined by top performance. To overcome this issue, we derive a novel formulation of f-ANOVA for arbitrary subspaces and propose an algorithm that uses Pearson divergence (PED) to enable a closed-form calculation of HPI. We demonstrate that this new algorithm, dubbed PED-ANOVA, successfully identifies important HPs in different subspaces while also being extremely computationally efficient. See https://arxiv.org/abs/2304.10255 for the latest version, including the appendix.
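To give a concrete sense of the underlying idea (not the paper's exact closed-form estimator), the sketch below scores a single hyperparameter by the Pearson divergence between its marginal density within the top-gamma fraction of observed configurations and its marginal density over all observations. The function name, the Gaussian-KDE marginals, and the grid approximation of the integral are illustrative assumptions; the paper derives a closed-form computation instead.

```python
import numpy as np
from scipy.stats import gaussian_kde


def hpi_pearson_divergence(values, losses, gamma=0.1, grid_size=256):
    """Illustrative sketch: score one continuous hyperparameter by the
    Pearson divergence between its marginal density in the top-gamma
    configurations and its marginal density over all observations.

    values: observed values of a single hyperparameter.
    losses: corresponding objective values (lower is better).
    Larger scores suggest the hyperparameter matters more for landing
    in the top-gamma subspace. This is a numerical approximation, not
    the closed-form estimator derived in the paper.
    """
    values = np.asarray(values, dtype=float)
    losses = np.asarray(losses, dtype=float)
    cutoff = np.quantile(losses, gamma)
    top = values[losses <= cutoff]

    # Kernel density estimates of the marginals p(x | top) and p(x).
    p_top = gaussian_kde(top)
    p_all = gaussian_kde(values)

    # Pearson divergence: integral of (p_top/p_all - 1)^2 * p_all dx,
    # approximated on a uniform grid over the observed range.
    grid = np.linspace(values.min(), values.max(), grid_size)
    ratio = p_top(grid) / np.clip(p_all(grid), 1e-12, None)
    weights = p_all(grid) / p_all(grid).sum()
    return float(np.sum((ratio - 1.0) ** 2 * weights))


# Hypothetical usage: rank two HPs by importance in the top-10% subspace.
# Here the loss depends on the learning rate but not on weight decay, so
# "lr" should receive a clearly higher score than "wd".
rng = np.random.default_rng(0)
lr = rng.uniform(1e-4, 1e-1, 500)
wd = rng.uniform(0.0, 0.1, 500)
loss = (np.log10(lr) + 2.5) ** 2 + 0.1 * rng.normal(size=500)
scores = {
    "lr": hpi_pearson_divergence(np.log10(lr), loss),
    "wd": hpi_pearson_divergence(wd, loss),
}
print(scores)
```

The intuition mirrors the abstract: a hyperparameter is important for a subspace (here, the top-gamma configurations) if its distribution there differs sharply from its distribution overall, and the Pearson divergence quantifies that difference.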
Keywords:
Machine Learning: ML: Hyperparameter optimization
Machine Learning: ML: Automated machine learning