by Van Bogaert, Laurie; Bonatto, Daniele; Fachada, Sarah; Lafruit, Gauthier
Reference: Engineering Reality of Virtual Reality 2022 (ERVR2022) (17-20 January 2023: Online), Electronic Imaging, Vol. 34, page 269-1
Publication: Published, 2022-08-01
Publication in conference proceedings
Abstract: Virtual Reality and Free Viewpoint navigation require high-quality rendered images to be realistic. Current hardware-assisted raytracing methods cannot reach the expected quality in real time and are also limited by the quality of the 3D mesh. An alternative is Depth Image Based Rendering (DIBR), where the input consists only of images and their associated depth maps, from which virtual views are synthesized for the Head Mounted Display (HMD). The MPEG Immersive Video (MIV) standard uses such a DIBR algorithm, called the Reference View Synthesizer (RVS). We first implemented a GPU version, called the Realtime accelerated View Synthesizer (RaViS), that synthesizes two virtual views in real time for the HMD. In the present paper, we explore the differences between desktop and embedded GPU platforms, porting RaViS to an embedded HMD without the need for a separate, discrete desktop GPU. The proposed solution gives a first insight into DIBR view-synthesis techniques in embedded HMDs using OpenGL and Vulkan, cross-platform 3D rendering APIs with support for embedded devices.
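To make the DIBR idea mentioned in the abstract concrete, the sketch below shows the core warping step in its most naive form: each source pixel is unprojected to a 3D point using its depth value, then reprojected through a virtual camera's pose. This is only an illustrative minimal example in NumPy; the function name, the pinhole-camera parameterization (intrinsics `K`, rotation `R`, translation `t`), and the nearest-pixel splatting are assumptions for illustration, not the actual RVS/RaViS implementation, which runs on the GPU and handles occlusion, blending, and hole filling.

```python
import numpy as np

def dibr_forward_warp(src_img, src_depth, K, R, t):
    """Naive DIBR forward warp (illustrative sketch, not RVS/RaViS).

    Unprojects every source pixel to 3D using its depth, transforms it
    into a virtual camera frame (R, t relative to the source camera),
    and reprojects it. Occlusion handling (z-buffering), blending and
    hole filling are deliberately omitted for brevity.
    """
    h, w = src_depth.shape
    K_inv = np.linalg.inv(K)

    # Homogeneous pixel coordinates of the source view, shape (3, h*w).
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    pix = np.stack([u.ravel(), v.ravel(), np.ones(h * w)])

    # Back-project to 3D points in the source camera frame.
    pts = (K_inv @ pix) * src_depth.ravel()

    # Transform into the virtual camera frame and project.
    proj = K @ (R @ pts + t.reshape(3, 1))
    z = proj[2]
    valid = z > 1e-6  # keep points in front of the virtual camera

    # Nearest-pixel splat into the virtual view (no z-buffer).
    u2 = np.round(proj[0, valid] / z[valid]).astype(int)
    v2 = np.round(proj[1, valid] / z[valid]).astype(int)
    inb = (u2 >= 0) & (u2 < w) & (v2 >= 0) & (v2 < h)

    out = np.zeros_like(src_img)
    out[v2[inb], u2[inb]] = src_img.reshape(h * w)[valid][inb]
    return out
```

With an identity pose (`R = I`, `t = 0`) the warp reproduces the source image exactly, which is a convenient sanity check; a real renderer such as RaViS performs the equivalent reprojection per fragment on the GPU for two views (one per eye) at HMD frame rates.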