Publication details
Bibliographic description
A new super resolution Faster R-CNN model based detection and classification of urine sediments / Derya Avci, Eser Sert, Esin Dogantekin, Ozal Yildirim, Ryszard TADEUSIEWICZ, Paweł Pławiak // Biocybernetics and Biomedical Engineering ; ISSN 0208-5216. — 2023 — vol. 43 iss. 1, pp. 58–68. — Bibliogr. pp. 66–67, Abstr. — Available online from: 2022-12-15
Authors (6)
Avci Derya, Sert Eser, Dogantekin Esin, Yildirim Özal, Tadeusiewicz Ryszard (AGH), Pławiak Paweł
Bibliometric data
| ID BaDAP | 144339 |
|---|---|
| Source text | URL |
| DOI | 10.1016/j.bbe.2022.12.001 |
| Publication year | 2023 |
| Publication type | journal article |
| Open access | |
| Journal/series | Biocybernetics and Biomedical Engineering |
Abstract
The diagnosis of urinary tract infections and kidney diseases from urine microscopy images has gained significant attention from the medical community in recent years. These images are usually evaluated manually, based on physicians' own rules of thumb. Such manual urine sediment analysis is labor-intensive and time-consuming, and even when physicians examine an image carefully, cells may be recognized erroneously due to optical illusions. In order to recognize cells in low-resolution urine microscopy images with a higher level of accuracy, a new super resolution Faster Region-based Convolutional Neural Network (Faster R-CNN) method is proposed. It increases the resolution of low-resolution urine microscopy images using self-similarity based single image super resolution, applied as a pre-processing step. A de-noising Wiener filter and the Discrete Wavelet Transform (DWT) are then used to de-noise the resulting high-resolution images, further increasing recognition accuracy. Finally, in the feature extraction and classification stages, AlexNet, VGG16 and VGG19 based Faster R-CNN models are used for the recognition and detection of multi-class cells. These models yielded accuracy rates of 98.6%, 96.4% and 96.2%, respectively.
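The de-noising step the abstract describes, a Wiener filter combined with the DWT, can be sketched briefly. The snippet below is a minimal illustration in Python using SciPy and PyWavelets; the filter window, the wavelet family (`db4`), and the soft-threshold rule are illustrative assumptions, not the authors' exact settings.

```python
import numpy as np
import pywt
from scipy.signal import wiener

def denoise(image: np.ndarray, wavelet: str = "db4", level: int = 2) -> np.ndarray:
    """Wiener-filter a grayscale image, then soft-threshold its DWT detail bands."""
    # Local adaptive Wiener filter; assumes a single-channel (grayscale) image.
    filtered = wiener(image.astype(float), mysize=(5, 5))

    # 2-D multilevel DWT: approximation band followed by (cH, cV, cD) tuples.
    coeffs = pywt.wavedec2(filtered, wavelet, level=level)
    # Universal threshold, with noise sigma estimated from the finest diagonal band.
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    thresh = sigma * np.sqrt(2 * np.log(filtered.size))

    new_coeffs = [coeffs[0]] + [
        tuple(pywt.threshold(band, thresh, mode="soft") for band in detail)
        for detail in coeffs[1:]
    ]
    return pywt.waverec2(new_coeffs, wavelet)
```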
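For the detection stage, a Faster R-CNN with a VGG16 backbone can be assembled from torchvision components. The sketch below follows torchvision's standard custom-backbone recipe; the anchor sizes and the seven-class head (assumed: six sediment cell classes plus background) are placeholders, not the configuration reported in the paper.

```python
import torch
import torchvision
from torchvision.models.detection import FasterRCNN
from torchvision.models.detection.rpn import AnchorGenerator
from torchvision.ops import MultiScaleRoIAlign

# VGG16 convolutional stack as the feature extractor (512 output channels).
backbone = torchvision.models.vgg16(weights="IMAGENET1K_V1").features
backbone.out_channels = 512

# One feature map, so one tuple of anchor sizes and aspect ratios.
anchor_generator = AnchorGenerator(
    sizes=((32, 64, 128, 256),),
    aspect_ratios=((0.5, 1.0, 2.0),),
)
roi_pooler = MultiScaleRoIAlign(featmap_names=["0"], output_size=7, sampling_ratio=2)

model = FasterRCNN(
    backbone,
    num_classes=7,  # assumed: six sediment cell classes + background
    rpn_anchor_generator=anchor_generator,
    box_roi_pool=roi_pooler,
)

model.eval()
with torch.no_grad():
    # One dummy 3-channel image standing in for a super-resolved, de-noised frame.
    prediction = model([torch.rand(3, 512, 512)])[0]
print(prediction["boxes"].shape, prediction["labels"])
```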