Publication details
Bibliographic description
Impact of environmental control on subjective video quality assessment in crowdsourced QoE experiments / Avrajyoti DUTTA, Mohamedalfateh T. M. Saeed, Swapnil ARAWADE, Andreja Samčović, Syed UDDIN, Dawid JUSZKA, Michał GREGA, Mikołaj LESZCZUK // Electronics [Electronic document]. — Electronic journal ; ISSN 2079-9292. — 2026 — vol. 15 iss. 8 art. no. 1666, pp. 1–28. — System requirements: Adobe Reader. — Bibliogr. pp. 24–28, Abstr. — Publication available online from: 2025-04-16
Authors (8)
- Dutta Avrajyoti (AGH)
- Saeed Mohamedalfateh T. M.
- Arawade Swapnil (AGH)
- Samčović Andreja
- Uddin Syed (AGH)
- Juszka Dawid (AGH)
- Grega Michał (AGH)
- Leszczuk Mikołaj (AGH)
Keywords
Bibliometric data
| BaDAP ID | 167103 |
|---|---|
| Date added to BaDAP | 2026-04-24 |
| Source text | URL |
| DOI | 10.3390/electronics15081666 |
| Publication year | 2026 |
| Publication type | journal article |
| Open access | |
| Creative Commons | |
| Journal/series | Electronics |
Abstract
This research investigates the influence of environmental control on subjective video quality assessments within the Quality of Experience (QoE) paradigm. The work presents a supplementary experiment conducted in a controlled laboratory setting, building on our previous crowdsourcing studies carried out under uncontrolled, web-based conditions using the Prolific platform. Both tests used the same crowdsourcing platform and complied with Recommendation P.910 of the International Telecommunication Union Telecommunication Standardization Sector (ITU-T), ensuring external validity and methodological consistency. Participants assessed a set of processed video sequences (PVS) comprising 46 distinct video clips using the 5-point Absolute Category Rating (ACR) scale, while their response times were recorded in milliseconds as measures of cognitive effort and decision delay. The comparative analysis employs nonparametric tests (Mann–Whitney U and Kolmogorov–Smirnov) and a hierarchical Linear Mixed-Effects Model (LMM) to examine differences in response time distributions, rating consistency, and the incidence of outliers across the two environments. The results indicate that the controlled setting yields significantly lower response variability and greater data reliability, whereas the uncontrolled setting captures greater external diversity and real-world unpredictability. These findings offer valuable insight into the trade-off between experimental control and external validity in crowdsourced video quality assessment, advancing the development of scalable approaches for QoE research.
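The comparison of response-time distributions described above can be sketched with standard nonparametric tests. This is an illustrative example only, not the authors' code: the data below are synthetic, and the sample sizes, distribution parameters, and variable names are assumptions chosen to mimic a lower-variance lab group versus a higher-variance crowdsourced group.

```python
# Hypothetical sketch: Mann-Whitney U and Kolmogorov-Smirnov tests on
# two synthetic response-time samples (milliseconds), standing in for a
# controlled lab group and an uncontrolled crowdsourced group.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Log-normal response times: lab group with lower spread, crowd with higher.
lab = rng.lognormal(mean=7.0, sigma=0.3, size=300)
crowd = rng.lognormal(mean=7.1, sigma=0.6, size=300)

# Mann-Whitney U: tests whether one distribution is stochastically larger.
u_stat, u_p = stats.mannwhitneyu(lab, crowd, alternative="two-sided")
# Two-sample KS: tests for any difference in distribution shape/location.
ks_stat, ks_p = stats.ks_2samp(lab, crowd)

print(f"Mann-Whitney U: U={u_stat:.1f}, p={u_p:.4f}")
print(f"Kolmogorov-Smirnov: D={ks_stat:.3f}, p={ks_p:.4f}")
```

Both tests make no normality assumption, which suits response-time data that are typically right-skewed; the KS statistic is additionally sensitive to differences in spread, matching the abstract's focus on response variability.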