Publication details

Bibliographic description

Explainable sparse associative self-optimizing neural networks for classification / Adrian HORZYK, Jakub Kosno, Daniel BULANDA, Janusz A. Starzyk // In: Neural Information Processing : 30th International Conference, ICONIP 2023 : Changsha, China, November 20–23, 2023 : proceedings, Pt. 9 / eds. Biao Luo, [et al.]. — Singapore : Springer Nature Singapore, cop. 2024. — (Communications in Computer and Information Science ; ISSN 1865-0929 ; vol. 1963). — ISBN: 978-981-99-8137-3; e-ISBN: 978-981-99-8138-0. — pp. 229–244. — Bibliogr., Abstr. — Available online from: 2023-11-26


Authors (4)


Keywords

sparse network structure self-development, explainable self-adaptive models, associative fuzzy processing, automatic reduction of dimensionality, overfitting, underfitting, associative fuzzy representation

Bibliometric data

BaDAP ID: 150916
Added to BaDAP: 2023-12-21
DOI: 10.1007/978-981-99-8138-0_19
Publication year: 2024
Publication type: conference proceedings (author)
Open access: yes
Publisher: Springer
Conference: International Conference on Neural Information Processing
Journal/series: Communications in Computer and Information Science

Abstract

Supervised models often suffer from a multitude of possible hyperparameter combinations, rigid non-adaptive architectures, underfitting, overfitting, the curse of dimensionality, etc. They consume many resources and slow down the optimization process. Since real-world objects are related and similar, we should adapt not only network parameters but also the network structure to better represent patterns and relationships. When a network reproduces the most essential and frequent data patterns and relationships and aggregates similarities, it becomes not only efficient but also explainable and trustworthy. This paper presents a new approach for detecting and representing similarities among numerical training examples to self-adapt a network structure and its parameters. Such a network facilitates classification by identifying hyperspace regions associated with the classes defined in a training dataset. Our approach demonstrates the ability to automatically reduce the input data dimension by removing features that produce distortions and do not support the classification process. The presented adaptation algorithm uses only a few optional hyperparameters and produces a sparse associative neural network structure that contextually fits any given dataset by detecting data similarities and constructing hypercuboids in the data space. The explanation of these associative adaptive techniques is followed by comparisons of the classification results against other state-of-the-art models and methods.
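The hypercuboid idea from the abstract can be illustrated with a deliberately simplified sketch. This is NOT the paper's adaptation algorithm (which builds a sparse associative network with fuzzy processing); it is only a toy model of the underlying geometric intuition: summarize each class by an axis-aligned bounding box (hypercuboid) over its training examples, then assign a query point to the class whose box contains it, falling back to the nearest box center. All function names below are hypothetical.

```python
# Toy illustration (assumed, not from the paper): per-class axis-aligned
# hypercuboids in data space used as classification regions.

def fit_hypercuboids(X, y):
    """Return {class: (mins, maxs)} - one bounding box per class."""
    boxes = {}
    for xi, yi in zip(X, y):
        if yi not in boxes:
            boxes[yi] = (list(xi), list(xi))
        else:
            mins, maxs = boxes[yi]
            for j, v in enumerate(xi):
                mins[j] = min(mins[j], v)
                maxs[j] = max(maxs[j], v)
    return boxes

def classify(boxes, x):
    """Class whose hypercuboid contains x; ties/misses go to nearest center."""
    containing = [c for c, (lo, hi) in boxes.items()
                  if all(l <= v <= h for v, l, h in zip(x, lo, hi))]
    if len(containing) == 1:
        return containing[0]

    def dist_to_center(c):
        lo, hi = boxes[c]
        return sum((v - (l + h) / 2) ** 2 for v, l, h in zip(x, lo, hi))

    candidates = containing or list(boxes)
    return min(candidates, key=dist_to_center)

# Two well-separated 2-D classes:
X = [(1.0, 1.0), (2.0, 1.5), (8.0, 8.0), (9.0, 7.5)]
y = ["A", "A", "B", "B"]
boxes = fit_hypercuboids(X, y)
print(classify(boxes, (1.5, 1.2)))  # falls inside class A's cuboid -> "A"
```

The paper's method goes well beyond this sketch: it self-develops a sparse associative structure, handles overlapping regions with fuzzy representations, and drops features that distort classification, whereas this toy keeps every feature and resolves overlaps crudely by center distance.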

Publications that may interest you

book chapter
Structural properties of associative knowledge graphs / Janusz A. Starzyk, Przemysław Stokłosa, Adrian HORZYK, Paweł Raif // In: Neural Information Processing : 30th International Conference, ICONIP 2023 : Changsha, China, November 20–23, 2023 : proceedings, Pt. 4 / eds. Biao Luo, [et al.]. — Singapore : Springer Nature Singapore, cop. 2024. — (Lecture Notes in Computer Science ; ISSN 0302-9743 ; vol. 14450). — ISBN: 978-981-99-8069-7; e-ISBN: 978-981-99-8070-3. — pp. 326–339. — Bibliogr., Abstr. — Available online from: 2023-11-15
book chapter
Theory-guided convolutional neural network with an enhanced water flow optimizer / Xiaofeng Xue, Xiaoling Gong, Jacek MAŃDZIUK, Jun Yao, El-Sayed M. El-Alfy, Jian Wang // In: Neural Information Processing : 30th International Conference, ICONIP 2023 : Changsha, China, November 20–23, 2023 : proceedings, Pt. 1 / eds. Biao Luo, [et al.]. — Singapore : Springer Nature Singapore, cop. 2024. — (Lecture Notes in Computer Science ; ISSN 0302-9743 ; vol. 14447). — ISBN: 978-981-99-8078-9; e-ISBN: 978-981-99-8079-6. — pp. 448–461. — Bibliogr., Abstr. — Available online from: 2023-11-14. — J. Mańdziuk, additional affiliation: Warsaw University of Technology, Warsaw