TY - JOUR
T1 - Visualisation of Fossilised Tree Trunks for XR, Using Geospatial Digitisation Techniques Derived from UAS and Terrestrial Data, Aided by Computational Photography
AU - Psarros, Charalampos
AU - Zouros, Nikolaos
AU - Soulakellis, Nikolaos
PY - 2025/3/14
Y1 - 2025/3/14
N2 - The aim of this research is to investigate and use a variety of immersive multisensory media techniques to create convincing digital models of fossilised tree trunks for use in XR (Extended Reality). This is made possible through the use of geospatial data derived from aerial imaging using UASs, terrestrial material captured with cameras, and the incorporation of both visual and audio elements for better immersion, accessed and explored in six degrees of freedom (6DoF). Immersiveness is a key factor in producing output that is especially engaging to the user. Both conventional and alternative methods are explored and compared, emphasising the advantages made possible with the help of Machine Learning Computational Photography. The material is collected using both UAS and terrestrial camera devices, including a multi-sensor 3D-360° camera, with stitched panoramas serving as sources for photogrammetry processing. Difficulties such as capturing large free-standing objects by terrestrial means are overcome using practical solutions involving mounts and remote streaming. The key research contributions are comparisons between different imaging techniques and photogrammetry processes, resulting in significantly higher-fidelity outputs. Conclusions indicate that superior fidelity can be achieved with the help of Machine Learning Computational Photography processes, and that higher resolutions and technical specifications of equipment do not necessarily translate into superior outputs.
DO - 10.3390/electronics14061146
M3 - Article
SN - 2079-9292
VL - 14
JO - Electronics
JF - Electronics
IS - 6
M1 - 1146
ER -