Tutorial 7: Sustainable Hyperparameter Optimization

Speakers

  • Laurens Bliek (Department of Industrial Engineering & Innovation Sciences, Eindhoven University of Technology (TU/e))
  • Sasan Amini (Data Science Institute, Hasselt University)


Abstract

Hyperparameter Optimization (HPO) plays a pivotal role in modern machine learning pipelines, determining model performance, generalization, and stability. However, HPO is inherently computationally intensive, resulting in significant energy consumption and environmental cost. As models and datasets grow, this computational burden scales dramatically, making traditional HPO practices unsustainable.

This tutorial provides an overview of state-of-the-art HPO methods, with a particular emphasis on Bayesian Optimization (BO) as a data-efficient and principled framework for black-box optimization. We will begin by revisiting the theoretical underpinnings of BO, including surrogate modeling, acquisition functions, and extensions to constrained and multi-objective settings. Building upon this foundation, we introduce the emerging research area of sustainable HPO, focusing on approaches that explicitly target energy and resource efficiency, such as multi-fidelity modeling. In this way, the tutorial contributes to the vertical area of Sustainable and Trustworthy AI.
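To make the BO ingredients above concrete, the following is a minimal NumPy sketch of the loop (Gaussian-process surrogate plus expected-improvement acquisition). The 1-D objective, kernel length scale, and budgets are illustrative stand-ins, not part of the tutorial material.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical 1-D objective standing in for a validation loss
# over a single hyperparameter (e.g. a log learning rate).
def objective(x):
    return np.sin(3 * x) + 0.5 * x**2

def rbf_kernel(a, b, length=0.5):
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(x_train, y_train, x_query, noise=1e-6):
    """Surrogate model: GP posterior mean and std at the query points."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_train, x_query)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = 1.0 - np.sum(v**2, axis=0)  # RBF prior variance is 1 on the diagonal
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, best):
    """Acquisition function: expected improvement for minimisation."""
    z = (best - mu) / sigma
    return (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

rng = np.random.default_rng(0)
x_train = rng.uniform(-2, 2, 3)          # small initial design
y_train = objective(x_train)
grid = np.linspace(-2, 2, 200)           # candidate points

for _ in range(10):                      # BO loop: fit, acquire, evaluate
    mu, sigma = gp_posterior(x_train, y_train, grid)
    ei = expected_improvement(mu, sigma, y_train.min())
    x_next = grid[np.argmax(ei)]
    x_train = np.append(x_train, x_next)
    y_train = np.append(y_train, objective(x_next))

print(f"best x: {x_train[np.argmin(y_train)]:.3f}, best value: {y_train.min():.3f}")
```

The key point for data efficiency is visible in the loop: each expensive evaluation is chosen by maximising the acquisition over the cheap surrogate, rather than by exhaustive search.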


Target Audience

The tutorial targets researchers, graduate students, and practitioners in machine learning, optimization, and AI systems design who are interested in the methodological and practical aspects of hyperparameter optimization and energy-efficient AI. Participants are expected to have:

  • A general understanding of supervised learning (e.g., regression, classification) and model training workflows.
  • Basic familiarity with optimization concepts such as search spaces, objective functions, and performance evaluation.


Outline and Description of the Tutorial

Part I introduces the principles of Hyperparameter Optimization (HPO), focusing on Bayesian Optimization (BO) as a data-efficient framework for black-box optimization. (1.5 hrs)

Part II presents the emerging area of sustainable HPO, discussing methods that reduce computational and energy costs. (1.5 hrs)

Part III offers a short hands-on exercise with an example application of sustainable HPO. (1 hr)
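As a flavour of the resource-aware methods covered in Part II, below is a minimal sketch of successive halving, a multi-fidelity scheme (also used inside Hyperband) that spends small budgets on many configurations and promotes only the best. The toy loss function and learning-rate search space are hypothetical examples, not the tutorial's actual exercise.

```python
import math
import random

def successive_halving(configs, evaluate, min_budget=1, eta=2, rounds=4):
    """Evaluate many configs cheaply, keep the best 1/eta each round,
    and give the survivors an eta-times larger budget."""
    budget = min_budget
    survivors = list(configs)
    for _ in range(rounds):
        scores = [(evaluate(c, budget), c) for c in survivors]
        scores.sort(key=lambda s: s[0])          # lower loss is better
        keep = max(1, len(survivors) // eta)
        survivors = [c for _, c in scores[:keep]]
        budget *= eta
        if len(survivors) == 1:
            break
    return survivors[0]

# Hypothetical cheap proxy for a training run: the loss improves with
# budget (epochs) and is best near a learning rate of 1e-2.
def toy_loss(lr, epochs):
    return abs(math.log10(lr) + 2) + 1.0 / epochs

rng = random.Random(0)
candidates = [10 ** rng.uniform(-5, 0) for _ in range(16)]
best = successive_halving(candidates, toy_loss)
print(f"selected learning rate: {best:.2e}")
```

The sustainability argument is in the budget schedule: most configurations are discarded after only a cheap, low-fidelity evaluation, so the full training budget is spent on a handful of promising candidates rather than on all sixteen.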


Reading List

  • Bliek, L. (2022). A survey on sustainable surrogate-based optimisation. Sustainability.
  • Garnett, R. (2023). Bayesian optimization. Cambridge University Press.
  • Bischl, B., Binder, M., Lang, M., Pielok, T., Richter, J., Coors, S., … & Lindauer, M. (2023). Hyperparameter optimization: Foundations, algorithms, best practices, and open challenges. Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery, 13(2), e1484.


Vertical

Sustainable and Trustworthy AI


Timeline

4 hours