Turkish Journal of Electrical Engineering and Computer Sciences

DOI

10.55730/1300-0632.4056

Abstract

This study presents a fast hyperparameter optimization algorithm for support vector regression (SVR) that builds on the benefits and shortcomings of the standard grid search (GS) algorithm. The proposed GS-inspired algorithm, called fast grid search (FGS), was tested on benchmark datasets, and its impact on prediction accuracy was primarily compared with that of the GS algorithm on which it is based. To validate the efficacy of the proposed algorithm and enable a comprehensive comparison, two additional hyperparameter optimization techniques, namely particle swarm optimization and Bayesian optimization, were also employed to develop models on the same datasets. The predictive performance of the models was evaluated using root mean square error, mean absolute error, and mean absolute percentage error. In addition to these metrics, the number of evaluated submodels and the time required for optimization were used as decisive performance measures of the presented models. Experimental results showed that the FGS-optimized SVR models yield accurate predictions, supporting the reliability, validity, and applicability of the FGS algorithm. Consequently, the FGS algorithm can be offered as an alternative to GS that reduces the execution time of SVR hyperparameter optimization.
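For context, the baseline GS procedure referenced in the abstract amounts to exhaustively fitting one SVR per point of a hyperparameter grid and per cross-validation fold, then scoring the selected model with the error metrics listed above. The sketch below illustrates only this baseline with scikit-learn's GridSearchCV; the dataset, kernel, and parameter ranges are illustrative assumptions, and the paper's FGS algorithm itself is not reproduced here.

```python
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.metrics import (mean_absolute_error,
                             mean_absolute_percentage_error,
                             mean_squared_error)
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Bundled regression dataset used purely as an illustrative stand-in
# for the benchmark datasets mentioned in the abstract.
X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# Exhaustive grid over the usual SVR hyperparameters (assumed ranges).
param_grid = {
    "svr__C": [0.1, 1, 10, 100],
    "svr__epsilon": [0.01, 0.1, 1],
    "svr__gamma": [0.001, 0.01, 0.1, 1],
}

pipe = make_pipeline(StandardScaler(), SVR(kernel="rbf"))
search = GridSearchCV(
    pipe,
    param_grid,
    cv=5,
    scoring="neg_root_mean_squared_error",
    n_jobs=-1,
)
search.fit(X_train, y_train)

# Standard GS fits one submodel per grid point and CV fold:
# here 4 * 3 * 4 grid points * 5 folds = 240 fits, plus one refit
# on the full training set. Reducing this count is the cost FGS targets.
y_pred = search.best_estimator_.predict(X_test)
rmse = np.sqrt(mean_squared_error(y_test, y_pred))
mae = mean_absolute_error(y_test, y_pred)
mape = mean_absolute_percentage_error(y_test, y_pred)
print(search.best_params_)
print(f"RMSE={rmse:.3f}  MAE={mae:.3f}  MAPE={mape:.3%}")
```

In this baseline, both the optimization time and the submodel count grow with the product of the grid sizes, which is the overhead the abstract's comparison (submodels evaluated and optimization time) is designed to measure.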

Keywords

Fast grid search, support vector regression, hyperparameter optimization, grid search

First Page

68

Last Page

92
