
# Live 3D Reconstruction Demos

## Description

This project showcases real-time point cloud fusion from multiple RGB-D cameras using 2D marker-based calibration.
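
To illustrate the marker-based calibration step, the sketch below estimates each camera's pose from a shared ArUco marker with OpenCV and chains the poses into a camera-to-camera extrinsic. This is a minimal sketch under stated assumptions, not the project's code: the marker size, dictionary, and function names are placeholders, and it targets the opencv-contrib-python >= 4.7 ArUco API.

```python
# Hypothetical sketch of 2D marker-based extrinsic calibration.
# Marker size, dictionary, and helper names are assumptions.
import cv2
import numpy as np

MARKER_SIZE = 0.15  # marker edge length in metres (assumed)
ARUCO_DICT = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
DETECTOR = cv2.aruco.ArucoDetector(ARUCO_DICT, cv2.aruco.DetectorParameters())

# Marker corners in the marker's own frame (z = 0 plane), in the order the
# detector reports them: top-left, top-right, bottom-right, bottom-left.
MARKER_CORNERS_3D = np.array([
    [-MARKER_SIZE / 2,  MARKER_SIZE / 2, 0.0],
    [ MARKER_SIZE / 2,  MARKER_SIZE / 2, 0.0],
    [ MARKER_SIZE / 2, -MARKER_SIZE / 2, 0.0],
    [-MARKER_SIZE / 2, -MARKER_SIZE / 2, 0.0],
])

def marker_to_camera(color_image, camera_matrix, dist_coeffs):
    """4x4 transform taking marker-frame points into this camera's frame."""
    gray = cv2.cvtColor(color_image, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = DETECTOR.detectMarkers(gray)
    if ids is None:
        raise RuntimeError("calibration marker not visible")
    ok, rvec, tvec = cv2.solvePnP(
        MARKER_CORNERS_3D,
        corners[0].reshape(4, 2).astype(np.float64),
        camera_matrix, dist_coeffs)
    rotation, _ = cv2.Rodrigues(rvec)
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = tvec.ravel()
    return T

def extrinsic_a_to_b(T_cam_a_from_marker, T_cam_b_from_marker):
    """Transform mapping points from camera A's frame into camera B's frame."""
    return T_cam_b_from_marker @ np.linalg.inv(T_cam_a_from_marker)
```

With the marker visible to both cameras during calibration, one camera's point cloud can be re-expressed in the other camera's frame before fusion.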


## Demos

### Video 1: Point Cloud Reconstruction

The fusion algorithm achieved 90% reconstruction accuracy compared to manual scans.

The following color images were used for reconstruction: Color Image 1 and Color Image 2.
PointCloudFusionDemo.mp4

### Video 2: Live Reconstruction Demo

This demo showcases the reconstruction accuracy of the algorithm when deployed on live video, using two Intel RealSense L515 cameras. The application was developed in Python using Open3D rendering.

MultiCameraFusionDemo.mp4
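
A minimal sketch of how a two-camera fusion step like this could be set up with Open3D follows. It is not the project's implementation: the intrinsics, depth scale, and extrinsics are placeholder assumptions, and synthetic frames stand in for the RealSense capture code so the sketch runs without hardware.

```python
# Minimal sketch of two-camera RGB-D fusion with Open3D.
# Intrinsics, depth scale, and extrinsics are assumed placeholders.
import numpy as np
import open3d as o3d

# Assumed pinhole intrinsics (width, height, fx, fy, cx, cy).
INTRINSIC = o3d.camera.PinholeCameraIntrinsic(640, 480, 600.0, 600.0, 320.0, 240.0)

# 4x4 world-to-camera extrinsics from the marker-based calibration.
# Placeholders here; camera 0 is treated as the reference frame.
EXTRINSICS = [np.eye(4), np.eye(4)]

def rgbd_to_cloud(color_np, depth_np, extrinsic):
    """Back-project one camera's RGB-D frame into the shared world frame."""
    rgbd = o3d.geometry.RGBDImage.create_from_color_and_depth(
        o3d.geometry.Image(color_np),
        o3d.geometry.Image(depth_np),
        depth_scale=4000.0,   # assumes L515 default 0.25 mm depth units
        depth_trunc=3.0,      # ignore returns beyond 3 m
        convert_rgb_to_intensity=False)
    return o3d.geometry.PointCloud.create_from_rgbd_image(rgbd, INTRINSIC, extrinsic)

def fuse(frames):
    """Merge per-camera clouds and downsample to keep rendering interactive."""
    fused = o3d.geometry.PointCloud()
    for (color_np, depth_np), extrinsic in zip(frames, EXTRINSICS):
        fused += rgbd_to_cloud(color_np, depth_np, extrinsic)
    return fused.voxel_down_sample(voxel_size=0.005)

if __name__ == "__main__":
    # Synthetic stand-in frames (random colour, flat ~1 m depth).
    color = np.random.randint(0, 255, (480, 640, 3), dtype=np.uint8)
    depth = np.full((480, 640), 4000, dtype=np.uint16)
    o3d.visualization.draw_geometries([fuse([(color, depth), (color, depth)])])
```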

### Video 3: Camera Extrinsic Location

This demo shows the camera extrinsics obtained from the calibration. It was created in Unreal Engine 5 and displayed on the Varjo XR3 headset.

XRLocationDemo.mp4
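
For reference, a calibrated world-to-camera extrinsic can be inverted to recover each camera's position and orientation in the shared world frame, which is what a renderer needs to place a camera model in the scene. The helper below is a hypothetical sketch of that conversion, not the Unreal Engine code.

```python
# Hypothetical helper: world-to-camera extrinsic -> camera pose in world space.
import numpy as np

def extrinsic_to_camera_pose(T_world_to_cam):
    """Return (position, rotation) of the camera expressed in world coordinates."""
    T_cam_to_world = np.linalg.inv(T_world_to_cam)
    position = T_cam_to_world[:3, 3]    # camera origin in world space
    rotation = T_cam_to_world[:3, :3]   # columns are the camera axes in world space
    return position, rotation
```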

## Contact

This project showcases my ongoing work with VeyondMetaverse. For further details or inquiries, feel free to reach out to me at [email protected].