Publication details

Bibliographic description

Associative Graph Data Structures used for acceleration of k Nearest Neighbor classifiers / Adrian HORZYK, Krzysztof GOŁDON // In: Artificial Neural Networks and Machine Learning – ICANN 2018 : 27th International Conference on Artificial Neural Networks : Rhodes, Greece, October 4–7, 2018 : proceedings, Pt. 1 / eds. Věra Kůrková, [et al.]. — Cham : Springer Nature Switzerland AG, cop. 2018. — (Lecture Notes in Computer Science ; ISSN 0302-9743 ; LNCS 11139. Theoretical Computer Science and General Issues). — ISBN: 978-3-030-01417-9; e-ISBN: 978-3-030-01418-6. — Pp. 648–658. — Bibliography pp. 657–658, abstract.

Authors (2)

Keywords

associative acceleration, classification, associative graph data structures, k-nearest neighbours, brain-inspired associative approach

Bibliometric data

BaDAP ID: 118910
Date added to BaDAP: 2019-01-08
Source text: URL
DOI: 10.1007/978-3-030-01418-6_64
Year of publication: 2018
Publication type: conference proceedings (author)
Open access: yes
Publisher: Springer
Conference: Artificial Neural Networks and Machine Learning
Journals/series: Lecture Notes in Computer Science, Theoretical Computer Science and General Issues

Abstract

This paper introduces a new associative approach for significant acceleration of k Nearest Neighbor (kNN) classifiers. The kNN classifier is a lazy method, i.e. it does not build a computational model, so it is inefficient on big training datasets because classifying each sample requires going through all training patterns. In this paper, we propose to use Associative Graph Data Structures (AGDS) as an efficient model for storing training patterns and their relations, allowing fast access to nearest neighbors during classification by kNNs. Hence, the AGDS significantly accelerates kNN classification, especially for large training datasets. We introduce an Associative Acceleration Algorithm and demonstrate how it operates on this associative structure, substantially reducing the number of checked patterns and quickly selecting the k nearest neighbors. The presented approach compares favorably with classic kNN approaches.
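The abstract contrasts brute-force kNN, which scans every training pattern per query, with AGDS-based selection that examines only a small neighborhood of candidates. The sketch below is a minimal illustration of that contrast, not the paper's algorithm: it approximates the AGDS idea with a single sorted attribute index expanded outward from the query value (the function names, the `expand` factor, and the one-attribute simplification are all assumptions for illustration).

```python
import bisect
import math
from collections import Counter

def knn_brute_force(train, labels, query, k):
    # Lazy kNN: O(n) distance computations per query over all patterns.
    order = sorted(range(len(train)), key=lambda i: math.dist(train[i], query))
    return Counter(labels[i] for i in order[:k]).most_common(1)[0][0]

def build_sorted_index(train, attr=0):
    # Sorted (value, pattern index) pairs for one attribute. A real AGDS
    # additionally aggregates duplicate values and links them to classes;
    # this flat list is a deliberate simplification.
    return sorted((row[attr], i) for i, row in enumerate(train))

def knn_accelerated(index, train, labels, query, k, attr=0, expand=3):
    # Locate the query's attribute value in the sorted index, then expand
    # outward to gather expand*k candidates instead of scanning everything.
    values = [v for v, _ in index]
    lo = hi = bisect.bisect_left(values, query[attr])
    cand = []
    while len(cand) < expand * k and (lo > 0 or hi < len(index)):
        if lo > 0:
            lo -= 1
            cand.append(index[lo][1])
        if hi < len(index):
            cand.append(index[hi][1])
            hi += 1
    # Exact kNN only among the gathered candidates.
    cand.sort(key=lambda i: math.dist(train[i], query))
    return Counter(labels[i] for i in cand[:k]).most_common(1)[0][0]
```

On a tiny 1-D dataset both variants agree, but the accelerated version inspects only the candidates near the query's position in the sorted structure, which mirrors how the paper's approach avoids touching all training patterns.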
