On the use of metaheuristics in hyperparameters optimization of Gaussian processes
Palar P.S. (a), Zuhal L.R. (a), Shimoyama K. (b)
(a) Faculty of Mechanical and Aerospace Engineering, Institut Teknologi Bandung, Bandung, Indonesia
(b) Institute of Fluid Science, Tohoku University, Sendai, Japan
Abstract

© 2019 Association for Computing Machinery.

Owing to difficulties such as multiple local optima and a flat likelihood landscape, global optimization techniques are recommended for solving the auxiliary optimization problem of finding good Gaussian process (GP) hyperparameters. We investigated the performance of genetic algorithms (GA), particle swarm optimization (PSO), differential evolution (DE), and the covariance matrix adaptation evolution strategy (CMA-ES) for optimizing GP hyperparameters. The study was performed on two artificial problems and one real-world problem. The results show that PSO, CMA-ES, and DE/local-to-best/1 consistently outperformed two GA variants and DE/rand/1 with per-generation dither on all problems. In particular, CMA-ES is an attractive method: it is quasi-parameter-free, and it demonstrates good exploitative and explorative power when optimizing the hyperparameters.

Author keywords

Covariance matrix adaptation evolution strategy, Differential evolution, Gaussian process regression, Global optimization techniques, Hyperparameters, Likelihood functions, Metaheuristics, Optimization problems

Indexed keywords

Gaussian process regression, Hyperparameter optimization, Likelihood function, Metaheuristics

DOI

https://doi.org/10.1145/3319619.3322012
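The auxiliary optimization problem the abstract refers to is maximizing the GP's log marginal likelihood over its kernel hyperparameters. As a minimal sketch of that setup (not the authors' implementation), the snippet below defines the negative log marginal likelihood of a GP with an RBF kernel on a hypothetical toy dataset and minimizes it globally with SciPy's built-in differential evolution, one of the metaheuristic families compared in the paper. The dataset, bounds, and log-parameterization are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Toy 1-D dataset (hypothetical, for illustration only)
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 10.0, size=(20, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(20)


def neg_log_marginal_likelihood(theta):
    """Negative log marginal likelihood of a GP with an RBF kernel.

    theta = [log amplitude, log length scale, log noise std]; the log
    parameterization keeps the search space unconstrained and positive.
    """
    amp, ls, noise = np.exp(theta)
    d2 = (X - X.T) ** 2                       # pairwise squared distances (n x n)
    K = amp**2 * np.exp(-0.5 * d2 / ls**2)    # RBF covariance matrix
    K += (noise**2 + 1e-8) * np.eye(len(X))   # noise term plus jitter for stability
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    # Standard GP expression: 0.5 y^T K^{-1} y + 0.5 log|K| + (n/2) log(2*pi)
    return (0.5 * y @ alpha
            + np.log(np.diag(L)).sum()
            + 0.5 * len(X) * np.log(2 * np.pi))


# Global search over the log-hyperparameters; bounds chosen for illustration
bounds = [(-3.0, 3.0)] * 3
result = differential_evolution(neg_log_marginal_likelihood, bounds, seed=1)
print("log-hyperparameters:", result.x, "NLL:", result.fun)
```

The same objective could be handed to a CMA-ES library (e.g. pycma) or a PSO implementation; only the outer optimizer changes, which is precisely the comparison the paper carries out.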