Publication details
Bibliographic description
Breast cancer classification on histopathological images affected by data imbalance using active learning and deep convolutional neural network / Bogdan KWOLEK, Michał KOZIARSKI, Andrzej Bukała, Zbigniew Antosz, Bogusław Olborski, Paweł Wąsowicz, Jakub Swadźba, Bogusław CYGANEK // In: Artificial neural networks and machine learning - ICANN 2019 : workshop and special sessions : 28th International Conference on Artificial Neural Networks : Munich, Germany, September 17–19, 2019 : proceedings / eds. Igor V. Tetko, [et al.]. — Cham : Springer Nature Switzerland, cop. 2019. — (Lecture Notes in Computer Science ; ISSN 0302-9743 ; 11731). — ISBN: 978-3-030-30492-8; e-ISBN: 978-3-030-30493-5. — Pp. 299–312. — Bibliography pp. 311–312, Abstract. — Available online from: 2019-09-09. — B. Kwolek, M. Koziarski, B. Cyganek - additional affiliation: Diagnostyka Consilio, Łódź
Authors (8)
- Kwolek Bogdan (AGH)
- Koziarski Michał (AGH)
- Bukała Andrzej
- Antosz Zbigniew
- Olborski Bogusław
- Wąsowicz Paweł
- Swadźba Jakub
- Cyganek Bogusław (AGH)
Bibliometric data
| BaDAP ID | 124584 |
|---|---|
| Date added to BaDAP | 2019-09-25 |
| Source text | URL |
| DOI | 10.1007/978-3-030-30493-5_31 |
| Publication year | 2019 |
| Publication type | conference proceedings (author's contribution) |
| Open access | |
| Publisher | Springer |
| Conference | International Conference on Artificial Neural Networks 2019 |
| Journal/series | Lecture Notes in Computer Science |
Abstract
In this work, we propose an algorithm for training deep neural networks for the classification of breast cancer in histopathological images affected by data imbalance, supported by active learning. The output of the neural network on unlabeled samples is used to calculate a weighted information entropy, which serves as an uncertainty score for automatically selecting both high- and low-confidence samples. A number of low-confidence samples selected in each iteration are manually labeled by a pathologist. A threshold that decays with the iteration number is used to decide which high-confidence samples should be combined with the manually labeled samples and then used to fine-tune the convolutional neural network. The neural network can optionally be trained with a weighted cross-entropy loss to better cope with bias towards the majority class.
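The selection step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' exact formulation: the entropy weighting, the threshold schedule (`tau0`, `decay`), and all function names are assumptions introduced here for clarity.

```python
import math

def weighted_entropy(probs, class_weights):
    # Weighted information entropy of a predicted class distribution;
    # higher values indicate lower model confidence.
    return -sum(w * p * math.log(p)
                for p, w in zip(probs, class_weights) if p > 0)

def split_by_confidence(predictions, class_weights, iteration,
                        tau0=0.5, decay=0.9):
    # Hypothetical decaying threshold: as iterations progress, fewer
    # samples qualify as high-confidence pseudo-labels.
    tau = tau0 * (decay ** iteration)
    low_conf, high_conf = [], []
    for sample_id, probs in predictions:
        score = weighted_entropy(probs, class_weights)
        (high_conf if score < tau else low_conf).append(sample_id)
    # low_conf samples go to the pathologist for manual labeling;
    # high_conf samples are concatenated with labeled data for fine-tuning.
    return low_conf, high_conf
```

A near-uniform prediction such as `[0.5, 0.5]` yields entropy `log 2 ≈ 0.69` and is routed to manual labeling, while a confident prediction such as `[0.99, 0.01]` falls below the threshold and is kept as a pseudo-labeled sample.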