Publication details
Bibliographic description
Associative Graph Data Structures used for acceleration of k Nearest Neighbor classifiers / Adrian HORZYK, Krzysztof GOŁDON // In: Artificial Neural Networks and Machine Learning – ICANN 2018 : 27th International Conference on Artificial Neural Networks: Rhodes, Greece, October 4–7, 2018 : proceedings, Pt. 1 / eds. Věra Kůrková, [et al.]. — Cham : Springer Nature Switzerland AG, cop. 2018. — (Lecture Notes in Computer Science ; ISSN 0302-9743 ; LNCS 11139. Theoretical Computer Science and General Issues ; ISSN 0302-9743). — ISBN: 978-3-030-01417-9; e-ISBN: 978-3-030-01418-6. — pp. 648–658. — Bibliography pp. 657–658, Abstract.
Authors (2)
Keywords
Bibliometric data
| BaDAP ID | 118910 |
|---|---|
| Date added to BaDAP | 2019-01-08 |
| Source text | URL |
| DOI | 10.1007/978-3-030-01418-6_64 |
| Publication year | 2018 |
| Publication type | conference proceedings (author) |
| Open access | |
| Publisher | Springer |
| Conference | Artificial Neural Networks and Machine Learning |
| Journals/series | Lecture Notes in Computer Science, Theoretical Computer Science and General Issues |
Abstract
This paper introduces a new associative approach for significant acceleration of k Nearest Neighbor (kNN) classifiers. The kNN classifier is a lazy method, i.e. it does not build a computational model, so it becomes inefficient on big training datasets because every classification requires scanning all training patterns. In this paper, we propose using Associative Graph Data Structures (AGDS) as an efficient model for storing training patterns and their relations, allowing fast access to the nearest neighbors during kNN classification. The AGDS therefore significantly accelerates kNN classification, especially on large and very large training datasets. We introduce an Associative Acceleration Algorithm and demonstrate how it operates on this associative structure, substantially reducing the number of checked patterns and quickly selecting the k nearest neighbors. The presented approach was successfully compared with classic kNN approaches.
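To illustrate the bottleneck the abstract describes and the general flavor of the remedy, below is a minimal Python sketch. The brute-force classifier scans every training pattern per query, as the abstract notes. The `SortedAttributeIndex` class is *not* the paper's AGDS or its Associative Acceleration Algorithm; it is a hypothetical, simplified analogue that keeps each attribute's values sorted (mimicking how AGDS links ordered attribute values to patterns) so that only patterns near the query on some axis are distance-checked. All names and the `width` parameter are illustrative assumptions.

```python
import bisect
from collections import Counter

def knn_classify(query, patterns, labels, k=3):
    """Brute-force kNN: scans every training pattern (O(n) per query)."""
    order = sorted(range(len(patterns)),
                   key=lambda i: sum((q - p) ** 2
                                     for q, p in zip(query, patterns[i])))
    votes = Counter(labels[i] for i in order[:k])
    return votes.most_common(1)[0][0]

class SortedAttributeIndex:
    """Illustrative sketch (NOT the paper's AGDS): per-attribute sorted
    value lists let us gather nearby candidates without a full scan."""

    def __init__(self, patterns, labels):
        self.patterns = patterns
        self.labels = labels
        # One sorted list of (value, pattern_index) per attribute.
        self.columns = [
            sorted((p[a], i) for i, p in enumerate(patterns))
            for a in range(len(patterns[0]))
        ]

    def candidates(self, query, width=5):
        """Collect indices of patterns whose value on any attribute lies
        within `width` sorted positions of the query value."""
        cand = set()
        for a, col in enumerate(self.columns):
            pos = bisect.bisect_left(col, (query[a], -1))
            lo, hi = max(0, pos - width), min(len(col), pos + width)
            cand.update(i for _, i in col[lo:hi])
        return cand

    def classify(self, query, k=3):
        # Distance-check only the gathered candidates, then vote.
        cand = sorted(self.candidates(query),
                      key=lambda i: sum((q - p) ** 2
                                        for q, p in zip(query, self.patterns[i])))
        votes = Counter(self.labels[i] for i in cand[:k])
        return votes.most_common(1)[0][0]
```

On a toy dataset such as two well-separated clusters, both classifiers return the same label, but the indexed version inspects only the candidate subset rather than the whole training set, which is the kind of reduction in checked patterns the paper's associative approach targets.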