Publication details

Bibliographic description

Automatic optimization of hyperparameters using associative self-adapting structures / Szymon Czaplak, Adrian HORZYK // In: IJCNN 2022 [electronic document] : International Joint Conference on Neural Networks : Padua, Italy, 18–23 July 2022 : proceedings / IEEE. — Windows version. — Text data. — Piscataway : IEEE, cop. 2022. — (Proceedings of ... International Joint Conference on Neural Networks ; ISSN 2161-4393). — Conference held as part of the IEEE World Congress on Computational Intelligence (IEEE WCCI 2022). — e-ISBN: 978-1-7281-8671-9. — Pp. [1–8]. — System requirements: Adobe Reader. — Bibliography p. [8], Abstract. — Available online from: 2022-09-30

Authors (2)

Keywords

machine learning, associative graph data structures, associative self-adapting structures, hyperparameter optimization

Bibliometric data

BaDAP ID: 142996
Date added to BaDAP: 2022-10-29
Source text: URL
DOI: 10.1109/IJCNN55064.2022.9892758
Publication year: 2022
Publication type: conference proceedings (author)
Open access: yes
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Conference: IEEE International Joint Conference on Neural Networks 2022
Journal/series: Proceedings of ... International Joint Conference on Neural Networks

Abstract

In this work, we focus on hyperparameter optimization, one of the essential steps in the development of machine learning solutions. This work proposes a novel approach to hyperparameter optimization using associative self-adapting structures, which allows us to explore the hyperparameter space efficiently. The core algorithm introduced in this paper is based on the associative graph data structure (AGDS) merged with an evolutionary approach, inspired by processes used in drug discovery and human behavior. The proposed algorithm was compared with two state-of-the-art algorithms, the Tree-Structured Parzen Estimator and differential evolution. All experiments were carried out on the Penn Machine Learning Benchmarks, repeated ten times for each problem, to provide objective comparisons. We optimized Random Forest and deep neural network models for every dataset of these benchmarks. The results reveal a noticeable improvement over both optimization methods for both optimized models, which shows that the presented approach can serve as a good alternative to current state-of-the-art solutions.
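The abstract does not give the AGDS-based algorithm itself, but one of the two baselines it compares against, differential evolution, can be sketched compactly. The following is a minimal, self-contained illustration of the DE/rand/1/bin scheme on a toy two-dimensional "hyperparameter surface"; the objective function, bounds, and control parameters are illustrative assumptions, not taken from the paper.

```python
import random

def differential_evolution(objective, bounds, pop_size=15, f=0.8, cr=0.9,
                           iters=60, seed=0):
    """Minimal DE/rand/1/bin loop minimizing `objective` over box bounds."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    scores = [objective(x) for x in pop]
    for _ in range(iters):
        for i in range(pop_size):
            # mutation: donor built from three distinct other members
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            donor = [pop[a][d] + f * (pop[b][d] - pop[c][d]) for d in range(dim)]
            # binomial crossover with one guaranteed donor coordinate,
            # clipped back into the search bounds
            jrand = rng.randrange(dim)
            trial = [
                min(max(donor[d] if (rng.random() < cr or d == jrand)
                        else pop[i][d], bounds[d][0]), bounds[d][1])
                for d in range(dim)
            ]
            s = objective(trial)
            if s < scores[i]:  # greedy one-to-one selection
                pop[i], scores[i] = trial, s
    best = min(range(pop_size), key=lambda i: scores[i])
    return pop[best], scores[best]

# toy stand-in for a validation-loss surface, minimum at (0.1, 10)
obj = lambda x: (x[0] - 0.1) ** 2 + ((x[1] - 10) / 10) ** 2
best_x, best_score = differential_evolution(obj, [(0.0, 1.0), (1.0, 100.0)])
```

In a real hyperparameter search the objective would instead train a model (e.g. a Random Forest) with the candidate settings and return its validation error, which makes each evaluation the dominant cost.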

Publications that may interest you

book chapter
#142998 · Date added: 29.10.2022
Identifying substitute and complementary products for assortment optimization with Cleora embeddings / Sergiy Tkachuk, Anna Wróblewska, Jacek Dabrowski, Szymon ŁUKASIK // In: IJCNN 2022 [electronic document] : International Joint Conference on Neural Networks : Padua, Italy, 18–23 July 2022 : proceedings / IEEE. — Windows version. — Text data. — Piscataway : IEEE, cop. 2022. — (Proceedings of ... International Joint Conference on Neural Networks ; ISSN 2161-4393). — Conference held as part of the IEEE World Congress on Computational Intelligence (IEEE WCCI 2022). — e-ISBN: 978-1-7281-8671-9. — Pp. [1–7]. — System requirements: Adobe Reader. — Bibliography pp. [6–7], Abstract. — Available online from: 2022-09-30. — S. Łukasik - additional affiliation: Systems Research Institute, Polish Academy of Sciences, Warsaw
book chapter
#142986 · Date added: 2.1.2023
LIFEWATCH: lifelong Wasserstein change point detection / Kamil FABER, Roberto Corizzo, Bartłomiej ŚNIEŻYŃSKI, Michael Baron, Nathalie Japkowicz // In: IJCNN 2022 [electronic document] : International Joint Conference on Neural Networks : Padua, Italy, 18–23 July 2022 : proceedings / IEEE. — Windows version. — Text data. — Piscataway : IEEE, cop. 2022. — (Proceedings of ... International Joint Conference on Neural Networks ; ISSN 2161-4393). — Conference held as part of the IEEE World Congress on Computational Intelligence (IEEE WCCI 2022). — e-ISBN: 978-1-7281-8671-9. — Pp. 1–8. — Bibliography, Abstract.