Touch, hit, and drag objects with VR-GS.
A group of researchers from UCLA, HKU, Utah, ZJU, Style3D Research, CMU, and Amazon presented VR-GS, a new system that integrates 3D Gaussian Splatting and allows you to interact with objects in VR.
3D Gaussian Splatting is a rendering technique that represents a scene as a collection of 3D Gaussians, making it possible to synthesize 3D scenes from 2D footage. Simply put, it takes a set of images and turns them into a 3D scene without creating meshes, by converting a point cloud into Gaussians that are optimized using machine learning. You can learn more about it here.
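To give a rough idea of the rendering side, here is a minimal Python sketch (not the researchers' implementation, and heavily simplified): each splat contributes a Gaussian falloff, and depth-sorted splats are blended front to back with alpha compositing. The function names and the single-channel color are illustrative assumptions.

```python
import math

def gaussian_alpha(opacity, d2):
    # Contribution of a splat at squared (Mahalanobis) distance d2
    # from its center: opacity scaled by a Gaussian falloff.
    return opacity * math.exp(-0.5 * d2)

def composite(splats):
    """Front-to-back alpha compositing of depth-sorted splats.

    Each splat is a (depth, alpha, color) tuple; color is a single
    channel here for simplicity.
    """
    color, transmittance = 0.0, 1.0
    for depth, alpha, c in sorted(splats):
        color += transmittance * alpha * c   # add what this splat shows
        transmittance *= (1.0 - alpha)       # light left for splats behind
    return color
```

A half-transparent white splat in front of an opaque black one, for instance, yields a 50% gray pixel: `composite([(1.0, 0.5, 1.0), (2.0, 1.0, 0.0)])` returns `0.5`.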
Ying Jiang et al.
VR-GS aims to create realistic scenes and experiences by incorporating Extended Position-Based Dynamics (XPBD), real-time deformation embedding, and dynamic shadow casting. You can touch, poke, and drag the assets, and they will react to your touch.
"Beginning with multi-view images, the pipeline skillfully combines scene reconstruction, segmentation, and inpainting using Gaussian kernels. These kernels form the foundation for VR-GS's utilization of the sparse volumetric data structure VDB, facilitating bounding mesh reconstruction and subsequent tetrahedralization. VR-GS further harnesses a novel two-level rendering geometry embedding, XPBD, collision detection, and shadow casting techniques, all converging to deliver a captivating and immersive user experience."
Ying Jiang et al.
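For context on the physics component, XPBD resolves constraints between particles directly on positions, with a compliance term that makes stiffness independent of iteration count. Below is a minimal Python sketch of a single XPBD distance-constraint step; it is a generic illustration of the technique, not code from VR-GS, and the function name and tuple-based positions are assumptions.

```python
import math

def xpbd_distance_step(p0, p1, rest_len, w0, w1, compliance, dt, lam):
    """One XPBD iteration for a distance constraint between two particles.

    p0, p1: particle positions (x, y, z); w0, w1: inverse masses;
    lam: accumulated Lagrange multiplier for this constraint.
    """
    d = [b - a for a, b in zip(p0, p1)]
    length = math.sqrt(sum(c * c for c in d))
    n = [c / length for c in d]            # constraint gradient direction
    C = length - rest_len                  # constraint violation
    alpha = compliance / (dt * dt)         # XPBD compliance term
    dlam = (-C - alpha * lam) / (w0 + w1 + alpha)
    p0 = [p - w0 * dlam * c for p, c in zip(p0, n)]
    p1 = [p + w1 * dlam * c for p, c in zip(p1, n)]
    return p0, p1, lam + dlam
```

With zero compliance the constraint is rigid: two unit-mass particles stretched to twice the rest length are pulled back so their separation equals the rest length after one step.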
Based on the examples provided, the system produces impressive results, although all the objects look as if they were made of rubber. Still, it could be the start of more advanced VR interaction.
Learn more about the system here and join our 80 Level Talent platform and our Telegram channel, follow us on Instagram, Twitter, LinkedIn, TikTok, and Reddit, where we share breakdowns, the latest news, awesome artworks, and more.