Publication details
Bibliographic description
Mixed reality 6D object pose estimation using deep learning for visual markerless surgical navigation / Michał TROJAK, Artur JURGAS, Maciej Stanuch, Łukasz Kownacki, Andrzej SKALSKI // In: VRW 2025 [electronic document] : 2025 IEEE conference on Virtual Reality and 3D user interfaces workshops : 8–12 March 2025, Saint-Malo, France : proceedings. — Windows version. — Adobe Reader. — Piscataway : The Institute of Electrical and Electronics Engineers, cop. 2025. — Additional ISBN: 979-8-3315-2563-7. — e-ISBN: 979-8-3315-1484-6. — Pp. 947–952. — System requirements: Adobe Reader. — Bibliography pp. 951–952, Abstract. — Available online since: 2025-04-24. — M. Trojak, A. Jurgas, A. Skalski, additional affiliation: MedApp S.A.
Authors (5)
- Trojak Michał (AGH)
- Jurgas Artur (AGH)
- Stanuch Maciej
- Kownacki Łukasz
- Skalski Andrzej (AGH)
Keywords
Bibliometric data
| BaDAP ID | 159683 |
|---|---|
| Added to BaDAP | 2025-05-20 |
| Source text | URL |
| DOI | 10.1109/VRW66409.2025.00193 |
| Publication year | 2025 |
| Publication type | conference proceedings (author) |
| Open access | |
| Publisher | Institute of Electrical and Electronics Engineers (IEEE) |
| Conference | IEEE Conference on Virtual Reality and 3D User Interfaces 2025 |
Abstract
Modern surgical navigation systems usually rely on external trackers to estimate tool pose, recognizing and tracking markers such as ArUco/QR codes or reflective spheres. As a consequence, additional equipment must be present in the operating room and calibrated before each procedure. This paper introduces a novel approach that uses only visual information from a Microsoft HoloLens 2 mixed reality headset to estimate object pose. The proposed method is based on PVNet, which determines the position and orientation of objects in RGB images; the estimated parameters are then transformed into the headset coordinate system by incorporating depth data from the built-in ToF camera. The experiments demonstrated an orientation estimation error of 1.91° ± 1.16° for angles between 0° and 90°, and a translation estimation error of 0.64 ± 0.19 cm for translations between 0 cm and 45 cm. These errors meet clinical accuracy requirements and allow the system to be used in practical conditions.
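The abstract's core geometric step, re-expressing a pose estimated in the RGB camera frame in the headset coordinate system, amounts to chaining rigid transforms. A minimal sketch of that idea, assuming a known camera-to-headset extrinsic (the function names and frame layout below are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def pose_to_matrix(R, t):
    """Pack a 3x3 rotation matrix R and translation vector t into a 4x4 transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def camera_to_headset(T_headset_cam, R_obj, t_obj):
    """Re-express an object pose (R_obj, t_obj) estimated in the RGB camera frame
    in the headset (world) frame by chaining rigid transforms."""
    return T_headset_cam @ pose_to_matrix(R_obj, t_obj)

# Illustrative example: camera sits 10 cm in front of the headset origin along z,
# and the object is detected 45 cm in front of the camera.
T_headset_cam = pose_to_matrix(np.eye(3), np.array([0.0, 0.0, 0.10]))
T_headset_obj = camera_to_headset(T_headset_cam, np.eye(3), np.array([0.0, 0.0, 0.45]))
print(T_headset_obj[:3, 3])  # object lies 0.55 m along z in the headset frame
```

In the paper's pipeline the translation component would additionally be refined with depth data from the built-in ToF camera; that refinement is omitted here.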