Researchers are tackling the challenge of seamlessly integrating cutting-edge 3D Gaussian Splatting (3DGS), a technique promising real-time rendering of highly realistic scenes, with existing graphics pipelines. Yinghan Xu, alongside Théo Morales and John Dingliana from Trinity College Dublin and colleagues, presents SplatBus, a novel framework that bridges this gap by utilising GPU interprocess communication. This innovative approach allows 3DGS outputs to be readily visualised within popular software such as Unity, Blender, Unreal Engine, and OpenGL viewers, significantly broadening the accessibility and potential applications of this exciting technology for fields like autonomous driving, robotics, and immersive experiences. SplatBus therefore represents a crucial step towards democratising advanced rendering techniques and fostering greater creative exploration.
The core of this work lies in treating the 3DGS rasteriser as an independent rendering server, with various viewers acting as clients, a paradigm shift in how these systems are architected. This design not only accelerates the viewing process but also simplifies integration into diverse application environments. Experiments demonstrate that the system successfully composites Gaussian-based rendering with mesh-based rendering using depth-aware blending, opening doors to hybrid rendering scenarios previously difficult to achieve.
The code is publicly available at https://github.com/RockyXu66/splatbus, fostering further research and development within the computer vision and graphics communities. Previously, rasterisation relied on complex mesh-based representations requiring sophisticated lighting models to replicate real-world effects. Recent advances in Neural Radiance Fields (NeRF) popularised novel view synthesis, but often demanded extensive engineering and optimisation for real-time performance. 3D Gaussian Splatting overcomes this challenge by representing scenes as 3D Gaussian primitives, achieving a favourable balance between rendering quality and efficiency. However, the original 3DGS viewer was a standalone application, limiting its compatibility with other representations like 3D meshes and necessitating significant engineering effort for modifications and adaptations.
This research establishes a generalisable viewing process by decoupling the rasteriser from the viewer, offering a flexible and extensible solution. Existing methods for integrating 3DGS into game engines often suffer from limitations such as converting Gaussian representations into particle systems or lacking support for the latest rasteriser variants. SplatBus overcomes these hurdles, providing a simple hook into existing code and a client-server architecture that supports diverse application needs. The team’s innovative use of Nvidia IPC APIs not only enhances performance but also paves the way for future advancements in dynamic radiance fields and real-time rendering technologies, with potential applications spanning autonomous driving, robotics, virtual reality, and extended reality.
Decoupled 3D Gaussian Splatting via IPC
The team engineered a system leveraging Nvidia’s interprocess communication (IPC) APIs to facilitate seamless integration and viewing of 3DGS results within external client applications such as game engines and standalone viewers. Specifically, the research pioneers a method of decoupling the 3DGS rasterizer from the viewer, treating the rasterizer as an independent rendering server and external applications as clients. The study harnessed this direct access to GPU buffers to implement depth-aware blending, allowing for the composite rendering of Gaussian-based visuals with traditional 3D meshes. This technique achieves a harmonious integration of disparate rendering methods, enhancing visual fidelity and flexibility.
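To make the server side of this design concrete, the following is a minimal sketch of how a rendering process might export a device-resident colour buffer through CUDA's IPC API (cudaIpcGetMemHandle). The buffer name, format, and resolution are illustrative assumptions, not details taken from SplatBus.

```cpp
// Minimal server-side sketch: export a device-resident colour buffer so a
// separate viewer process can map it. Buffer name, format, and resolution
// are illustrative, not taken from SplatBus.
#include <cuda_runtime.h>
#include <cstdio>

int main() {
    const int width = 1280, height = 720;
    float4 *d_color = nullptr;  // RGBA colour buffer written by the rasteriser
    cudaMalloc((void **)&d_color, width * height * sizeof(float4));

    // Produce an IPC handle that another process can open, so the frame
    // never has to pass through host memory.
    cudaIpcMemHandle_t handle;
    if (cudaIpcGetMemHandle(&handle, d_color) != cudaSuccess) {
        std::fprintf(stderr, "cudaIpcGetMemHandle failed\n");
        return 1;
    }

    // The opaque handle (a small fixed-size struct) would now be sent to the
    // client, e.g. over a local socket, while the render loop keeps writing
    // frames into d_color.
    std::printf("exported %zu-byte IPC handle for the colour buffer\n",
                sizeof(handle));

    cudaFree(d_color);
    return 0;
}
```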
Researchers implemented the 3DGS rasterizer as a dedicated server, processing 3D Gaussian primitives and generating RGB and depth data. The system delivers this data to client applications via shared memory, bypassing conventional data transfer bottlenecks. This broad compatibility expands the accessibility of 3DGS technology to a wider audience of developers and artists. Furthermore, the work details a method for sorting 3D Gaussians on the CPU and rendering them on the GPU, enabling fast rendering of complex scenes. Building on existing open-source implementations like splat and GaussianSplats3D, the team focused on generalizability and scalability. The approach enables dynamic radiance fields and supports the latest rasterizer variants, overcoming limitations found in previous solutions. The code, freely available at https://github.com/RockyXu66/splatbus, facilitates further research and development in this rapidly evolving field.
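On the client side, the exported handle can be opened in the viewer process without copying the framebuffer through host memory. The sketch below assumes the IPC handle has already been received, for example over a local socket; the helper name and buffer layout are hypothetical.

```cpp
// Minimal client-side sketch: map the server's colour buffer into this
// process's address space. Assumes the IPC handle already arrived
// (e.g. over a local socket); names and layout are illustrative.
#include <cuda_runtime.h>
#include <vector>

bool mapSharedColorBuffer(const cudaIpcMemHandle_t &handle,
                          int width, int height,
                          std::vector<float> &hostCopy) {
    void *d_shared = nullptr;
    // Open the exported allocation; no copy of the buffer takes place.
    if (cudaIpcOpenMemHandle(&d_shared, handle,
                             cudaIpcMemLazyEnablePeerAccess) != cudaSuccess)
        return false;

    // For illustration only, pull one RGBA frame back to the host; a real
    // client would instead hand the device pointer to its own renderer.
    hostCopy.resize(static_cast<size_t>(width) * height * 4);
    cudaMemcpy(hostCopy.data(), d_shared,
               hostCopy.size() * sizeof(float), cudaMemcpyDeviceToHost);

    cudaIpcCloseMemHandle(d_shared);
    return true;
}
```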
SplatBus enables real-time 3D Gaussian Splatting integration with existing engines and viewers
The team demonstrated seamless transfer of GPU color and depth buffers, enabling depth-aware compositing with conventional rendering techniques. Experiments revealed that point clouds extracted from 3D Gaussian Splatting representations could be imported into Unity for interactive control, alongside standard 3D meshes. Results demonstrate the successful integration of SplatBus into existing pipelines, allowing researchers to view and manipulate 3D Gaussian Splatting scenes within familiar editor environments. Specifically, a trained 3DGS scene was rendered using the original 3DGS project, with SplatBus acting as the server and a Unity plugin as the client.
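As an illustration of what depth-aware compositing can look like at the pixel level, here is a minimal CUDA sketch that keeps whichever source, the Gaussian render or the mesh render, is closer to the camera. The buffer layouts and the convention that smaller depth means closer are assumptions; a production compositor would also account for alpha.

```cpp
// Minimal sketch of per-pixel depth-aware compositing between a Gaussian
// splat render and a mesh render. Compile with nvcc. Buffer layouts and the
// depth convention (smaller = closer) are assumptions for illustration.
#include <cuda_runtime.h>

__global__ void compositeDepthAware(const float4 *splatColor, const float *splatDepth,
                                    const float4 *meshColor,  const float *meshDepth,
                                    float4 *outColor, int numPixels) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= numPixels) return;

    // Keep the fragment from whichever source is nearer to the camera.
    outColor[i] = (splatDepth[i] < meshDepth[i]) ? splatColor[i] : meshColor[i];
}

void launchComposite(const float4 *splatColor, const float *splatDepth,
                     const float4 *meshColor,  const float *meshDepth,
                     float4 *outColor, int width, int height) {
    const int numPixels = width * height;
    const int threads = 256;
    const int blocks = (numPixels + threads - 1) / threads;
    compositeDepthAware<<<blocks, threads>>>(splatColor, splatDepth,
                                             meshColor, meshDepth,
                                             outColor, numPixels);
    cudaDeviceSynchronize();
}
```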
Data shows that this approach facilitates visual debugging and profiling, as arbitrary data can be transferred between the research code and the viewer. The OpenGL viewer, also developed as part of this work, renders frames as texture objects and provides default keyboard and mouse controls for easy navigation. The team also successfully visualized a Gaussian avatar reconstruction method, MMLPHuman, within Unity, overcoming challenges associated with neural network components preceding rasterization. The architecture allows the SplatBus package to be installed on top of existing research codebases, enabling direct viewing of results in external clients without extensive engineering effort.
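One plausible way for an OpenGL client to render the shared frame as a texture object is CUDA-OpenGL interop: register the GL texture with CUDA and copy the shared device buffer into its backing array each frame. The sketch below is not claimed to be SplatBus's exact code path; the RGBA8 layout and the function name are assumptions.

```cpp
// Sketch of one way an OpenGL client could turn the shared device buffer
// into a texture: register the GL texture with CUDA and copy into its
// backing array each frame. Not necessarily SplatBus's exact path.
#include <GL/gl.h>
#include <cuda_gl_interop.h>
#include <cuda_runtime.h>

void blitSharedBufferToTexture(GLuint tex, const void *d_shared,
                               int width, int height) {
    cudaGraphicsResource *res = nullptr;
    // Register once per texture in real code; shown inline here for brevity.
    cudaGraphicsGLRegisterImage(&res, tex, GL_TEXTURE_2D,
                                cudaGraphicsRegisterFlagsWriteDiscard);

    cudaGraphicsMapResources(1, &res, 0);
    cudaArray_t array = nullptr;
    cudaGraphicsSubResourceGetMappedArray(&array, res, 0, 0);

    // Copy the RGBA8 frame produced by the rasteriser into the texture.
    const size_t pitch = static_cast<size_t>(width) * 4;  // bytes per row
    cudaMemcpy2DToArray(array, 0, 0, d_shared, pitch,
                        pitch, height, cudaMemcpyDeviceToDevice);

    cudaGraphicsUnmapResources(1, &res, 0);
    cudaGraphicsUnregisterResource(res);
}
```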
Tests show that the system can handle complex scenes containing both Gaussian splats and traditional 3D meshes, demonstrating its versatility and robustness. The result is a lightweight solution that lowers the engineering cost of deploying Gaussian renderers inside interactive applications. Measurements confirm that the system facilitates seamless data transfer via sockets and Nvidia’s IPC, maintaining real-time performance and responsiveness. Future work will focus on enriching interaction possibilities, incorporating advanced rendering effects like shadows and lighting, and expanding cross-platform deployment to broaden accessibility and usability.
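For the handshake itself, the opaque IPC handle is small enough to be sent verbatim over a local socket before frame streaming begins. The sketch below shows one way to do this on Linux with a Unix-domain socket; the socket path and framing are assumptions, not SplatBus's actual protocol.

```cpp
// Sketch of the handle exchange: the server sends its opaque CUDA IPC handle
// over a local Unix-domain socket so the client can map the buffer. Socket
// path and framing are illustrative, not SplatBus's actual protocol.
#include <cuda_runtime.h>
#include <sys/socket.h>
#include <sys/un.h>
#include <unistd.h>
#include <cstring>

bool sendIpcHandle(const cudaIpcMemHandle_t &handle, const char *socketPath) {
    int fd = socket(AF_UNIX, SOCK_STREAM, 0);
    if (fd < 0) return false;

    sockaddr_un addr{};
    addr.sun_family = AF_UNIX;
    std::strncpy(addr.sun_path, socketPath, sizeof(addr.sun_path) - 1);

    if (connect(fd, reinterpret_cast<sockaddr *>(&addr), sizeof(addr)) != 0) {
        close(fd);
        return false;
    }

    // The handle is a small opaque struct, so it can be sent as-is; the
    // client reads the same number of bytes and passes it to
    // cudaIpcOpenMemHandle.
    const bool ok = write(fd, &handle, sizeof(handle)) ==
                    static_cast<ssize_t>(sizeof(handle));
    close(fd);
    return ok;
}
```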
SplatBus decouples rasterisation from visualisation seamlessly
This innovative approach facilitates the seamless transfer of GPU colour and depth buffers to external clients, enabling low-latency integration without altering the core rasteriser itself. The system supports depth-aware compositing with conventional rendering techniques, effectively bridging the gap between cutting-edge Gaussian Splatting and established mesh-based rendering pipelines. The authors acknowledge current limitations in interaction capabilities and advanced rendering effects such as shadows and lighting, and propose future work on richer interaction, these effects, and broader cross-platform deployment, alongside smoother integration across multiple graphics backends. The development of SplatBus represents a valuable contribution to the field by lowering the barriers to entry for utilising Gaussian Splatting in practical, interactive applications, and paving the way for more accessible and efficient 3D rendering workflows.
👉 More information
🗞 SplatBus: A Gaussian Splatting Viewer Framework via GPU Interprocess Communication
🧠 ArXiv: https://arxiv.org/abs/2601.15431
