Please use this identifier to cite or link to this item: https://doi.org/10.21256/zhaw-26948
Full metadata record
DC Field | Value | Language
dc.contributor.author | Wegmann, Marcel | -
dc.contributor.author | Rosenthal, Matthias | -
dc.date.accessioned | 2023-02-15T15:28:23Z | -
dc.date.available | 2023-02-15T15:28:23Z | -
dc.date.issued | 2022-06-23 | -
dc.identifier.isbn | 978-3-645-50194-1 | de_CH
dc.identifier.uri | https://digitalcollection.zhaw.ch/handle/11475/26948 | -
dc.description.abstract | Augmented Reality is the concept of enhancing the real world with virtual objects or information, projected into a viewfinder or displayed through specialized goggles. Simpler forms of Augmented Reality, like a heads-up display in a car, do not need to estimate the motion of the camera, of an object, or of the user. More elaborate implementations, however, need to track objects and, more importantly, the camera's own movement. The applications in which Augmented Reality could be leveraged range from social interaction and pedestrian navigation to various professional use cases. Multiple companies have already shown closed-source or custom-tailored programming interfaces, either running on smartphones or shipped with industry-targeted goggles. Tracking real-world objects or surfaces is possible with the provided interfaces, but the algorithms behind the different functions are corporate secrets. This paper describes an end-to-end pipeline for a prototype of an Augmented Reality platform that does not rely on commercial interfaces. A time-of-flight camera provides a depth image from which the recorded scene is reconstructed as a cloud of SIFT features. Frame-by-frame analysis of the point cloud estimates the camera's motion using highly parallel processing and a three-dimensional extension of the RANSAC algorithm. An accelerometer and a gyroscope provide additional data, which is fused with a Kalman filter to improve the motion estimate. A regular color camera acts as a viewfinder, and Vulkan renders the result to a monitor. Enhancing the matching quality of SIFT features between consecutive frames of a time-of-flight camera with the three-dimensional RANSAC algorithm led to more than twice as many correct matches. (Illustrative sketches of these pipeline steps follow the metadata record below.) | de_CH
dc.language.iso | en | de_CH
dc.publisher | WEKA | de_CH
dc.rights | Licence according to publishing contract | de_CH
dc.subject | Augmented reality | de_CH
dc.subject | Time of flight camera | de_CH
dc.subject | Real-time processing | de_CH
dc.subject | CUDA | de_CH
dc.subject | Vulkan | de_CH
dc.subject | RANSAC | de_CH
dc.subject | Kalman filter | de_CH
dc.subject | Motion tracking | de_CH
dc.subject.ddc | 006: Special computer methods | de_CH
dc.title | Real time motion tracking for augmented reality with TOF camera and vulkan rendering | de_CH
dc.type | Conference: Paper | de_CH
dcterms.type | Text | de_CH
zhaw.departement | School of Engineering | de_CH
zhaw.organisationalunit | Institute of Embedded Systems (InES) | de_CH
dc.identifier.doi | 10.21256/zhaw-26948 | -
zhaw.conference.details | Embedded World Conference, Nuremberg, Germany, 21-23 June 2022 | de_CH
zhaw.funding.eu | No | de_CH
zhaw.originated.zhaw | Yes | de_CH
zhaw.pages.end | 665 | de_CH
zhaw.pages.start | 661 | de_CH
zhaw.parentwork.editor | Sikora, Axel | -
zhaw.publication.status | publishedVersion | de_CH
zhaw.publication.review | Peer review (Abstract) | de_CH
zhaw.title.proceedings | Proceedings of Embedded World Conference 2022 | de_CH
zhaw.author.additional | No | de_CH
zhaw.display.portrait | Yes | de_CH
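
The abstract describes a pipeline that begins by turning the time-of-flight depth image into a point cloud in which SIFT features live as 3D points. As a rough illustration of that back-projection step, here is a minimal Python sketch; the pinhole intrinsics fx, fy, cx, cy and the metre-scaled depth image are assumptions, since the record does not specify the sensor model:

    # Hypothetical back-projection of a ToF depth image to camera-space 3D
    # points; fx, fy, cx, cy are assumed pinhole intrinsics, not values
    # taken from the paper.
    import numpy as np

    def depth_to_point_cloud(depth, fx, fy, cx, cy):
        """Convert a depth image in metres to an (N, 3) array of 3D points."""
        h, w = depth.shape
        u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
        z = depth.ravel()
        valid = z > 0                       # drop pixels with no ToF return
        x = (u.ravel() - cx) * z / fx       # pinhole back-projection
        y = (v.ravel() - cy) * z / fy
        return np.stack([x, y, z], axis=1)[valid]

In the paper's pipeline, the SIFT keypoints detected in the image would then be looked up at their pixel locations to obtain 3D feature positions; that association step is omitted here.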
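
The motion-estimation step fits a rigid transform between the 3D positions of matched SIFT features in consecutive frames, using a three-dimensional extension of RANSAC; this is also the step the abstract credits with more than doubling the number of correct matches, since the consensus test rejects bad matches. The paper runs this highly parallel (CUDA appears among the subject terms); the serial Python sketch below only illustrates the hypothesise-and-verify idea, and the minimal sample size, inlier threshold, and iteration count are all invented for illustration:

    # Serial stand-in for the paper's parallel 3D RANSAC: sample minimal
    # point-pair sets, fit a rigid transform, and keep the hypothesis with
    # the largest inlier set. Thresholds and counts are assumptions.
    import numpy as np

    def fit_rigid(src, dst):
        """Least-squares rigid transform (Kabsch) so that dst ~ R @ src + t."""
        mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
        H = (src - mu_s).T @ (dst - mu_d)
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))           # guard against reflection
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        return R, mu_d - R @ mu_s

    def ransac_rigid(src, dst, iters=500, thresh=0.02, seed=0):
        """src, dst: (N, 3) matched 3D feature positions in two frames."""
        rng = np.random.default_rng(seed)
        best = np.zeros(len(src), dtype=bool)
        for _ in range(iters):
            idx = rng.choice(len(src), size=3, replace=False)  # minimal sample
            R, t = fit_rigid(src[idx], dst[idx])
            err = np.linalg.norm(src @ R.T + t - dst, axis=1)
            inliers = err < thresh                       # e.g. 2 cm agreement
            if inliers.sum() > best.sum():
                best = inliers
        R, t = fit_rigid(src[best], dst[best])           # refit on all inliers
        return R, t, best

The inlier mask doubles as the filtered match set: SIFT matches whose 3D displacement disagrees with the dominant rigid motion are discarded as outliers.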
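
Finally, the abstract mentions fusing accelerometer and gyroscope data with a Kalman filter to improve the motion estimate. The record does not disclose the filter design, so the following single-axis sketch is only one plausible shape of such a fusion: it propagates position and velocity from accelerometer readings and corrects them with the vision-based position from the RANSAC step; every matrix and noise value here is an assumption.

    # Assumed linear Kalman filter per axis: the accelerometer drives the
    # prediction, the vision-based pose acts as the measurement. All noise
    # parameters are placeholders, not values from the paper.
    import numpy as np

    class FusionKF:
        def __init__(self, dt, accel_var=0.5, vision_var=0.01):
            self.x = np.zeros(2)                        # state: [position, velocity]
            self.P = np.eye(2)                          # state covariance
            self.F = np.array([[1.0, dt], [0.0, 1.0]])  # constant-velocity model
            self.B = np.array([0.5 * dt**2, dt])        # accelerometer as control input
            self.Q = accel_var * np.outer(self.B, self.B)  # process noise
            self.H = np.array([[1.0, 0.0]])             # vision measures position only
            self.R = np.array([[vision_var]])           # measurement noise

        def predict(self, accel):
            self.x = self.F @ self.x + self.B * accel
            self.P = self.F @ self.P @ self.F.T + self.Q

        def update(self, vision_pos):
            y = vision_pos - self.H @ self.x             # innovation
            S = self.H @ self.P @ self.H.T + self.R
            K = self.P @ self.H.T @ np.linalg.inv(S)     # Kalman gain
            self.x = self.x + K @ y
            self.P = (np.eye(2) - K @ self.H) @ self.P

A full implementation would also track orientation from the gyroscope (typically with an extended or error-state filter), which this linear sketch deliberately leaves out.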
Appears in collections: Publikationen School of Engineering

Files in This Item:
File | Description | Size | Format
2022_Wegmann-Rosenthal_Real-time-motion-tracking-for-augmented-reality_EmbeddedWorld.pdf | | 654.45 kB | Adobe PDF
Cite this item:

APA: Wegmann, M., & Rosenthal, M. (2022). Real time motion tracking for augmented reality with TOF camera and vulkan rendering [Conference paper]. In A. Sikora (Ed.), Proceedings of Embedded World Conference 2022 (pp. 661–665). WEKA. https://doi.org/10.21256/zhaw-26948

Harvard: Wegmann, M. and Rosenthal, M. (2022) ‘Real time motion tracking for augmented reality with TOF camera and vulkan rendering’, in A. Sikora (ed.) Proceedings of Embedded World Conference 2022. WEKA, pp. 661–665. Available at: https://doi.org/10.21256/zhaw-26948.

IEEE: M. Wegmann and M. Rosenthal, “Real time motion tracking for augmented reality with TOF camera and vulkan rendering,” in Proceedings of Embedded World Conference 2022, Jun. 2022, pp. 661–665. doi: 10.21256/zhaw-26948.

ISO 690: WEGMANN, Marcel and Matthias ROSENTHAL, 2022. Real time motion tracking for augmented reality with TOF camera and vulkan rendering. In: Axel SIKORA (ed.), Proceedings of Embedded World Conference 2022. Conference paper. WEKA. 23 June 2022. pp. 661–665. ISBN 978-3-645-50194-1

Chicago: Wegmann, Marcel, and Matthias Rosenthal. 2022. “Real Time Motion Tracking for Augmented Reality with TOF Camera and Vulkan Rendering.” Conference paper. In Proceedings of Embedded World Conference 2022, edited by Axel Sikora, 661–65. WEKA. https://doi.org/10.21256/zhaw-26948.

MLA: Wegmann, Marcel, and Matthias Rosenthal. “Real Time Motion Tracking for Augmented Reality with TOF Camera and Vulkan Rendering.” Proceedings of Embedded World Conference 2022, edited by Axel Sikora, WEKA, 2022, pp. 661–65, https://doi.org/10.21256/zhaw-26948.

