Publication details
Bibliographic description
Blind image quality assessment of magnetic resonance images with statistics of local intensity extrema / Mariusz Oszust, Marzena Bielecka, Andrzej Bielecki, Igor Stępień, Rafał Obuchowicz, Adam Piórkowski // Information Sciences ; ISSN 0020-0255. — 2022 — vol. 606, pp. 112–125. — Bibliography pp. 124–125, Abstract. — Available online from: 2022-05-20
Authors (6)
- Oszust Mariusz
- Bielecka Marzena (AGH)
- Bielecki Andrzej (AGH)
- Stępień Igor
- Obuchowicz Rafał
- Piórkowski Adam (AGH)
Keywords
Bibliometric data
| BaDAP ID | 140260 |
|---|---|
| Added to BaDAP | 2022-06-24 |
| Source text | URL |
| DOI | 10.1016/j.ins.2022.05.061 |
| Publication year | 2022 |
| Publication type | journal article |
| Open access | |
| Journal/series | Information Sciences |
Abstract
Magnetic resonance (MR) imaging provides a large amount of data that requires visual inspection before a diagnosis can be made. Since the exclusion of low-quality image sequences is performed manually, and image processing methods are evaluated with techniques developed for natural images, automatic and reliable MR image quality assessment (IQA) approaches are desirable. Therefore, in this work, a new no-reference (NR) MR-IQA technique is proposed. The method uses newly introduced quality-aware features that address the characteristics of MR images. Specifically, an MR image is scaled, filtered with two gradient operators, and subjected to the identification of local intensity extrema. Then, entropy and curvature are calculated to characterize the extrema sequences and used as perceptual features to train a quality model with the Support Vector Regression (SVR) technique. In this paper, an extensive comparative evaluation of the method against recent NR approaches, including deep learning-based models, is conducted on two representative MR-IQA benchmarks. The results reveal the superiority of the introduced approach over competing methods, improving the overall Spearman and Pearson correlation coefficients by 5% and 3%, respectively.
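The pipeline outlined in the abstract (gradient filtering, local-extrema detection, entropy and curvature features, SVR regression) can be illustrated with a minimal sketch. This is not the authors' implementation: the choice of Sobel and Prewitt as the two gradient operators, the 4-neighbour extrema test, the 32-bin entropy estimate, and the second-difference curvature proxy are all assumptions made here for illustration only.

```python
import numpy as np
from scipy.ndimage import sobel, prewitt
from sklearn.svm import SVR

def extrema_features(img):
    """Hypothetical feature extractor loosely following the described pipeline."""
    feats = []
    for grad in (sobel, prewitt):  # assumption: two gradient operators
        g = np.hypot(grad(img.astype(float), axis=0),
                     grad(img.astype(float), axis=1))
        # local intensity extrema of the gradient map (interior pixels,
        # compared against the 4-neighbourhood)
        c = g[1:-1, 1:-1]
        neigh = [g[:-2, 1:-1], g[2:, 1:-1], g[1:-1, :-2], g[1:-1, 2:]]
        is_max = np.all([c > n for n in neigh], axis=0)
        is_min = np.all([c < n for n in neigh], axis=0)
        vals = np.sort(c[is_max | is_min])
        # entropy of the extrema intensity distribution
        hist, _ = np.histogram(vals, bins=32)
        p = hist / max(hist.sum(), 1)
        entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))
        # curvature proxy: mean absolute second difference of the sorted sequence
        curvature = np.abs(np.diff(vals, n=2)).mean() if vals.size > 2 else 0.0
        feats += [entropy, curvature]
    return np.array(feats)

# toy usage: fit an SVR quality model on synthetic images with placeholder scores
rng = np.random.default_rng(0)
X = np.stack([extrema_features(rng.random((64, 64))) for _ in range(20)])
y = rng.random(20)  # placeholder quality scores, not real MOS values
model = SVR().fit(X, y)
pred = model.predict(X)
```

In practice the trained model would be evaluated against subjective quality scores on MR-IQA benchmarks via Spearman and Pearson correlation, as done in the paper.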