Publication details

Bibliographic description

Efficient pruning and compression techniques for convolutional neural networks to preserve knowledge and optimize performance / Jakub SKRZYŃSKI, Adrian HORZYK // In: Neural Information Processing : 31st International Conference, ICONIP 2024 : Auckland, New Zealand, December 2–6, 2024 : proceedings, Pt. 1 / eds. Mufti Mahmud, [et al.]. — Singapore : Springer Nature Singapore, cop. 2025. — (Lecture Notes in Computer Science ; ISSN 0302-9743 ; LNCS 15286). — ISBN: 978-981-96-6575-4; e-ISBN: 978-981-96-6576-1. — pp. 381–395. — Bibliography, Abstract. — Error in the surname of J. Skrzyński

Authors (2)

Bibliometric data

BaDAP ID: 160772
Date added to BaDAP: 2025-08-01
DOI: 10.1007/978-981-96-6576-1_26
Publication year: 2025
Publication type: conference proceedings (author)
Open access: yes
Publisher: Springer
Conference: International Conference on Neural Information Processing 2024
Journal/series: Lecture Notes in Computer Science

Abstract

Rapid growth in the complexity and size of Convolutional Neural Networks (CNNs) poses significant challenges in computational resources and energy consumption. This paper presents a novel approach to CNN optimization through a combined pruning and compression technique designed to preserve knowledge and enhance efficiency. Unlike traditional pruning methods that introduce sparsity and require specialized hardware, our method focuses on pruning entire filters based on their importance, followed by an innovative compression strategy that merges the least informative filters. This preserves the knowledge contained within these filters while maintaining the network’s structural integrity. We propose a three-step algorithm: selecting low-information filters using entropy metrics, grouping similar filters, and merging these groups to retain critical information. Our approach significantly reduces the network size without compromising accuracy, as demonstrated using the CIFAR-10, MNIST, Fashion MNIST, and USPS datasets, as well as a modified and original VGG16 architecture. The experiments carried out have shown that our method achieves a substantial reduction in the number of parameters and Floating Point Operations (FLOPs), lowering computational costs by up to 86.82% while preserving up to 99% of the original accuracy of the model. This paper contributes to the field of deep learning by offering a scalable, hardware-agnostic solution for CNN optimization, making it highly suitable for deployment in resource-constrained environments. Future work will explore the application of this method to various architectures and datasets to further validate its efficacy and versatility.
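The three-step algorithm from the abstract (score filters by entropy, group similar filters, merge groups to retain their information) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the entropy metric is assumed to be the Shannon entropy of a filter's weight histogram, similarity is assumed to be cosine similarity on flattened weights, merging is assumed to be averaging, and the names (`filter_entropy`, `merge_low_info_filters`) and thresholds are hypothetical.

```python
import numpy as np

def filter_entropy(f, bins=16):
    # Shannon entropy of the filter's weight distribution (assumed metric)
    hist, _ = np.histogram(f, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def merge_low_info_filters(filters, keep_ratio=0.75, sim_threshold=0.9):
    """Prune-and-merge sketch: keep the most informative filters and fold
    each low-entropy filter into every kept filter it closely resembles."""
    n = len(filters)
    scores = np.array([filter_entropy(f) for f in filters])
    order = np.argsort(scores)                 # ascending: low entropy first
    n_keep = max(1, int(n * keep_ratio))
    low_info = set(order[: n - n_keep].tolist())

    # Unit-normalised flattened filters for cosine similarity
    flat = np.array([f.ravel() / (np.linalg.norm(f) + 1e-12) for f in filters])

    merged = []
    for i in range(n):
        if i in low_info:
            continue                           # pruned filter, not kept alone
        group = [filters[i]]
        for j in low_info:
            if flat[i] @ flat[j] > sim_threshold:
                group.append(filters[j])       # similar low-info filter joins
        merged.append(np.mean(group, axis=0))  # merge group by averaging
    return merged
```

The result keeps the layer dense (no sparsity masks), which is what makes the approach hardware-agnostic: the merged layer is just a smaller ordinary convolution. Entire-filter removal also shrinks the next layer's input channels, which is where the reported FLOP reduction would come from.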

Publications that may interest you

book chapter
#163072 (date added: 30.09.2025)
Enhancing convnets with pruning and symmetry-based filter augmentation / Igor Ratajczyk, Adrian HORZYK // In: Neural Information Processing : 31st International Conference, ICONIP 2024 : Auckland, New Zealand, December 2–6, 2024 : proceedings, Pt. 1 / eds. Mufti Mahmud, [et al.]. — Singapore : Springer Nature Singapore, cop. 2025. — (Lecture Notes in Computer Science ; ISSN 0302-9743 ; LNCS 15286). — ISBN: 978-981-96-6575-4; e-ISBN: 978-981-96-6576-1. — pp. 396–409. — Bibliography, Abstract. — Available online since: 2025-06-08
book chapter
#163106 (date added: 30.09.2025)
Advanced stock market forecasting using synergy of sentiment analysis and association rule mining / Michał ZWIERZYŃSKI, Adrian HORZYK // In: Neural Information Processing : 31st International Conference, ICONIP 2024 : Auckland, New Zealand, December 2–6, 2024 : proceedings, Pt. 14 / eds. Mufti Mahmud, [et al.]. — Singapore : Springer Nature, cop. 2025. — (Communications in Computer and Information Science ; ISSN 1865-0929 ; vol. 2295). — ISBN: 978-981-96-7029-1; e-ISBN: 978-981-96-7030-7. — pp. 58–72. — Bibliography, Abstract. — Available online since: 2025-06-24