Sunny-as2: a step forward in algorithm selection

Maurizio Gabbrielli, Roberto Amadini · November 27, 2023 · 6 min read

Algorithm Selection (AS) is a central problem in computer science, particularly relevant in contexts such as combinatorial optimization and NP-hard problem solving. The case study “sunny-as2: Enhancing SUNNY for Algorithm Selection”, by Tong Liu (Meituan, Beijing, China), Roberto Amadini and Maurizio Gabbrielli (Department of Computer Science and Engineering, University of Bologna, Italy), and Jacopo Mauro (Department of Mathematics and Computer Science, University of Southern Denmark, Denmark), presents a significant improvement of SUNNY, a k-nearest neighbors (k-NN) based approach to algorithm selection.

Algorithm selection arises whenever a portfolio of algorithms is available and one wants to pick the most suitable algorithm for a given problem instance. An algorithm selection problem is defined by a triple (I, A, m), where I is a set of instances, A is a set of algorithms, and m : I × A → R is a performance metric. To select the best algorithm for an unseen instance x, SUNNY uses k-NN to extract from the training instances of I the subset Ik (called the neighborhood) containing the k instances whose feature vectors are closest to F(x), the feature vector of x, according to the Euclidean distance. Based on the performance of the algorithms of A on the instances of Ik, SUNNY then selects a subset of A to execute. The original version of SUNNY, however, showed some limitations, especially when applied to scenarios other than Constraint Programming (CP), the field for which it was originally developed.
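
To make the selection step concrete, here is a minimal Python sketch of the neighborhood-based idea described above. It is not the actual sunny-as2 code: names such as `train_features` (an n×d matrix of instance feature vectors) and `train_runtimes` (an n×|A| matrix of runtimes) are illustrative placeholders, and for brevity the sketch returns a single algorithm, whereas the real SUNNY builds a schedule of several algorithms from the neighborhood.

```python
import numpy as np

def select_algorithm(x_features, train_features, train_runtimes, algorithms, k, timeout):
    """Toy SUNNY-style selection: find the k training instances closest to x
    (Euclidean distance on feature vectors) and rank the portfolio algorithms
    by how well they perform on that neighborhood."""
    # Euclidean distance between x and every training instance
    dists = np.linalg.norm(train_features - x_features, axis=1)
    neighborhood = np.argsort(dists)[:k]  # indices of the k nearest instances

    def score(algo):
        times = train_runtimes[neighborhood, algorithms.index(algo)]
        solved = np.sum(times < timeout)         # neighbors solved within the timeout
        total_time = times[times < timeout].sum()  # time spent on the solved neighbors
        return (-solved, total_time)             # maximize solved, then minimize time

    return min(algorithms, key=score)
```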

Sunny-as2 introduces three key improvements over the original version:

  1. Wrapper-based feature selection: unlike its predecessor, sunny-as2 uses wrapper-based feature selection, which evaluates candidate feature subsets directly on the selector's performance and thus identifies the features that actually matter for the problem (a minimal sketch follows this list). 
  2. Configurable neighborhood size: sunny-as2 allows the size k of the neighborhood Ik to be configured, improving generalization to different scenarios. 
  3. Greedy approach to speed up the training phase: introducing a greedy variant of the selection procedure reduces the time needed for training, making the system more efficient.
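
As an illustration of point 1 (and of why point 3 matters), the following sketch shows a greedy forward wrapper-based feature selection loop. The `evaluate` callback is a placeholder: in a wrapper approach it would, for example, run the k-NN selector with the candidate feature subset on validation folds and return a score to maximize. This is a hedged sketch of the general technique, not the exact procedure implemented in sunny-as2.

```python
def wrapper_feature_selection(all_features, evaluate, max_features=None):
    """Greedy forward selection: repeatedly add the single feature whose
    inclusion most improves the measured performance of the selector.
    evaluate(subset) is a wrapper-style score to maximize (placeholder)."""
    selected, best_score = [], float("-inf")
    candidates = list(all_features)
    while candidates and (max_features is None or len(selected) < max_features):
        # Score every remaining feature when added to the current subset
        score, feature = max((evaluate(selected + [f]), f) for f in candidates)
        if score <= best_score:   # no candidate improves the selector: stop
            break
        selected.append(feature)
        candidates.remove(feature)
        best_score = score
    return selected
```

Because every candidate subset is scored by actually running the selector, wrapper methods are costlier than filter methods but tend to pick features that matter for the selection task itself; a faster, greedy training procedure such as the one in point 3 helps keep these repeated evaluations affordable.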

The user can choose from three different variants, depending on what is tuned during training (a sketch mapping each variant to its tuned parameters follows the list):

  1. sunny-as2-k: all features are used and only k is configured. 
  2. sunny-as2-f: the value of k is fixed and only feature selection is performed. 
  3. sunny-as2-fk: the value of k and the feature set are configured together.
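
The difference between the variants is simply which parameters the training phase searches over. A hedged sketch, reusing the hypothetical `wrapper_feature_selection` above together with a placeholder `evaluate(features, k)` validation score (neither is the real sunny-as2 interface), could look like this:

```python
def configure(variant, all_features, default_k, evaluate, k_values=range(1, 31)):
    """Map each sunny-as2 variant to what gets tuned; evaluate(features, k)
    is a placeholder for a wrapper-style validation score to maximize."""
    if variant == "sunny-as2-k":
        # all features are kept, only the neighborhood size k is configured
        best_k = max(k_values, key=lambda k: evaluate(all_features, k))
        return all_features, best_k
    if variant == "sunny-as2-f":
        # k stays fixed, only the feature subset is configured
        feats = wrapper_feature_selection(all_features,
                                          lambda fs: evaluate(fs, default_k))
        return feats, default_k
    # sunny-as2-fk: feature subset and k are configured together
    best = None
    for k in k_values:
        feats = wrapper_feature_selection(all_features,
                                          lambda fs: evaluate(fs, k))
        score = evaluate(feats, k)
        if best is None or score > best[0]:
            best = (score, feats, k)
    return best[1], best[2]
```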

The authors of the case study then conducted a series of experiments to evaluate the performance of sunny-as2; the results show greater efficiency than other state-of-the-art algorithm selection methods. In particular, they observed that configuring the value of k and selecting the features can have a significant impact on the system's performance.

In conclusion, sunny-as2 represents a significant step forward in the field of algorithm selection, offering a more versatile and efficient solution than traditional methods. With its ability to adapt to different scenarios and its flexible configuration, the system demonstrates how the integration of machine learning and combinatorial optimization can lead to surprisingly effective results. The experiments conducted by the case study authors confirm not only the effectiveness of sunny-as2 but also its potential to shape the future of algorithm selection. In a world increasingly dependent on optimized computing solutions, the importance of tools such as sunny-as2 cannot be overstated.

This article is based on
sunny-as2: Enhancing SUNNY for Algorithm Selection
Publisher: International Joint Conference on Artificial Intelligence
Authors: Tong Liu, Roberto Amadini, Maurizio Gabbrielli, Jacopo Mauro
Year: 2021
Language: English