Splat Labs
The Definitive Guide

What Is a Gaussian Splat?

A visual-first explanation of the 3D capture technology redefining how we digitize, share, and experience real spaces — from real estate walkthroughs to construction site documentation.

splatlabs.ai — Gaussian Splat Rendering
The Basics

From polygons to brushstrokes

Traditional 3D models are built from polygons — flat triangles stitched together to approximate surfaces. Walk through a photogrammetry mesh and you'll see jagged edges, missing details, and artifacts where the geometry couldn't keep up with reality.

Gaussian splatting takes a different approach entirely. Instead of polygons, it represents a scene as millions of tiny, soft 3D ellipsoids — think of them as translucent brushstrokes suspended in 3D space. Each one has a position, a size, a shape, a color, and an opacity. Together they reconstitute the world not as a hard surface, but as a volumetric cloud of light.
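In code, one splat is just a small bundle of numbers. A minimal sketch in Python (field names and values are illustrative, not a file-format specification):

```python
from dataclasses import dataclass

@dataclass
class GaussianSplat:
    """One soft 3D ellipsoid. Field names are illustrative, not a standard."""
    position: tuple[float, float, float]          # center in world space
    scale: tuple[float, float, float]             # radius along each local axis
    rotation: tuple[float, float, float, float]   # orientation quaternion (w, x, y, z)
    color: tuple[float, float, float]             # base RGB, 0..1
    opacity: float                                # 0 = invisible, 1 = fully opaque

# A thin, disc-like splat such as one lying on a tabletop surface:
splat = GaussianSplat(
    position=(1.0, 0.5, -2.0),
    scale=(0.02, 0.02, 0.005),
    rotation=(1.0, 0.0, 0.0, 0.0),
    color=(0.8, 0.6, 0.4),
    opacity=0.7,
)
```

Millions of these, each independently positioned, oriented, and shaded, make up one scene.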

The result is something that feels less like a scan and more like stepping back into the place itself.

polygons → gaussian splats
Why It Looks So Real

Four properties no polygon mesh can replicate

Gaussian splatting isn't just a different file format — it's a fundamentally different way of representing reality. Here's what sets it apart.


Fine Detail

Hair, texture grain, fabric threads — Gaussian splats preserve detail at a level polygon meshes simply cannot match. Millions of splats can cluster around high-frequency surfaces, producing photorealistic fidelity without any topology clean-up.

Fine Structures

Fences, rebar, scaffolding, vegetation, wire runs — geometry that frustrates photogrammetry because it has no closed surface. Gaussian splats just place ellipsoids wherever photons arrive, making thin structures appear in full fidelity.

Transparency

Glass, windows, water, mesh screens — surfaces that transmit light. Each Gaussian splat has its own opacity value, so semi-transparent materials render naturally. You can see through the glass and still see what's on the other side.
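Rendering-wise, this is ordinary front-to-back alpha compositing: each splat contributes some of its color and passes the rest of the light through. A toy sketch of that blend (not the production rasterizer; the input values are illustrative):

```python
def composite(splats_front_to_back):
    """Front-to-back alpha compositing.

    `splats_front_to_back` is a list of (color, alpha) pairs sorted nearest
    first, where color is (r, g, b) and alpha is the splat's opacity after
    its Gaussian falloff has been applied.
    """
    out = [0.0, 0.0, 0.0]
    transmittance = 1.0  # fraction of light still passing through
    for color, alpha in splats_front_to_back:
        weight = alpha * transmittance
        for i in range(3):
            out[i] += weight * color[i]
        transmittance *= (1.0 - alpha)
        if transmittance < 1e-4:  # early exit once effectively opaque
            break
    return tuple(out), transmittance

# A pane of glass (alpha 0.3) in front of a red wall (alpha 1.0):
color, t = composite([((0.9, 0.95, 1.0), 0.3), ((1.0, 0.2, 0.2), 1.0)])
```

The glass tints the result slightly while the wall behind it remains visible, which is exactly the "see through the glass and still see what's on the other side" behavior.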

View-Dependent Reflections

Mirrors, polished floors, wet surfaces — materials whose appearance changes depending on where you are standing. Each splat stores spherical harmonic coefficients that encode how its color shifts with viewing angle, producing accurate reflections without any special material definition.
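As a rough sketch, here is a degree-1 spherical-harmonic evaluation for one color channel, showing how the same splat returns different values from different viewing directions. Real 3DGS pipelines typically go up to degree 3 (16 coefficients per channel); the coefficient values below are made up:

```python
# Real spherical-harmonic basis constants for degrees 0 and 1
SH_C0 = 0.28209479177387814   # 1 / (2 * sqrt(pi))
SH_C1 = 0.4886025119029199    # sqrt(3 / (4 * pi))

def sh_color(coeffs, view_dir):
    """Evaluate a degree-1 SH color for one channel.

    `coeffs` = (c0, c1, c2, c3): one DC term plus three degree-1 terms.
    `view_dir` = (x, y, z): unit vector from the camera toward the splat.
    """
    x, y, z = view_dir
    c0, c1, c2, c3 = coeffs
    return (SH_C0 * c0
            - SH_C1 * y * c1
            + SH_C1 * z * c2
            - SH_C1 * x * c3)

# The same splat viewed from two sides yields two different channel values:
coeffs = (1.0, 0.0, 0.0, 0.8)  # illustrative coefficients
from_left  = sh_color(coeffs, (-1.0, 0.0, 0.0))
from_right = sh_color(coeffs, (1.0, 0.0, 0.0))
```

The DC term (`c0`) is the splat's average color; the higher-order terms bend that color toward what each camera actually saw, which is how reflections "move" as you do.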

transparency + reflections
The Combination

Transparency and reflections — together

The real test of any 3D capture system is a glass storefront at night — transparent surface, dramatic reflections from the street, neon signs bouncing off the floor. This scenario breaks polygon meshes completely.

Gaussian splats handle it naturally. The opacity channels let light pass through. The spherical harmonics encode the reflection as it actually appeared from each camera. The result is spatially accurate, visually faithful, and navigable in real-time.

  • Glass facades rendered with accurate transmission
  • View-dependent reflections shift as you move through the scene
  • No special material passes or manual retouching required
Under the Hood

How a Gaussian Splat is trained

A Gaussian splat isn't exported from a scanner. It's trained — like a neural network — through an iterative optimization loop that refines millions of tiny ellipsoids until the rendered output matches your photographs.

01

Capture

A camera circles the subject — a room, a site, an object. Dozens or hundreds of overlapping photos from different angles.

02

Structure from Motion

Software analyzes the photos to triangulate where each camera was positioned and build a sparse point cloud of the scene.

03

Splat Initialization

A Gaussian ellipsoid is seeded at each point in the sparse cloud — blurry, opaque, roughly the right color.

04

Gradient Descent

A differentiable renderer projects the splats onto virtual cameras and compares the result to the real photos. Gradient descent then adjusts position, scale, rotation, color, and opacity over thousands of iterations, nudging the render toward photorealism.

05

Densification & Pruning

Splats that are too large get split. Splats in the wrong place are removed. The cloud tightens until the rendered view is nearly indistinguishable from the original photograph.
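The loop in steps 04 and 05 can be sketched with a deliberately tiny 1D stand-in: three "splats" on a line of pixels, an analytic gradient step on their brightness, and a pruning pass that drops splats whose contribution decays to nothing (splitting oversized splats is omitted). This illustrates the optimization pattern only, not the real rasterizer:

```python
import math

def gaussian(p, center, width):
    """Soft falloff of one splat at pixel position p."""
    return math.exp(-((p - center) ** 2) / (2 * width ** 2))

def render(amps, centers, width, pixels):
    """Toy 1D 'rasterizer': each pixel sums every splat's contribution."""
    return [sum(a * gaussian(p, c, width) for a, c in zip(amps, centers))
            for p in pixels]

pixels  = [float(p) for p in range(10)]
target  = [gaussian(p, 3.0, 1.0) for p in pixels]  # stand-in for a photograph
centers = [2.0, 3.0, 7.0]  # seeded from a "sparse point cloud"
amps    = [0.5, 0.5, 0.5]  # initial splat brightness
width, lr = 1.0, 0.5

for _ in range(200):
    out = render(amps, centers, width, pixels)
    for j in range(len(amps)):
        # Analytic gradient of the mean-squared error w.r.t. amplitude j
        grad = sum(2 * (o - t) * gaussian(p, centers[j], width)
                   for o, t, p in zip(out, target, pixels)) / len(pixels)
        amps[j] -= lr * grad

# Pruning: drop splats whose contribution has decayed to (near) nothing.
keep = [(a, c) for a, c in zip(amps, centers) if a > 0.01]
```

Only the splat sitting at the true feature survives with brightness near 1; the misplaced ones fade out and are pruned. The real pipeline does the same thing with millions of splats, full 3D poses, and a GPU rasterizer in place of `render`.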

3DGS training in progress

Gaussian splats materializing during training — from sparse noise to photorealistic 3D.

Side-by-Side Comparison

Gaussian Splats vs. Everything Else

An honest look at how Gaussian splatting compares to photogrammetry and LiDAR point clouds — including where the other technologies still win.

Capability (compared across Gaussian Splat, Photogrammetry, and LiDAR Point Cloud)

Visual Fidelity
  • Photorealistic output (Partial)
  • Fine detail (hair, fabric, texture)
  • Thin structures (rebar, fencing, wire) (Partial)
  • Transparent surfaces (glass, water)
  • View-dependent reflections
  • Real-time browser rendering (Partial)
  • Shareable without desktop software

Measurement & Engineering
  • Survey-grade dimensional accuracy (Partial)
  • Direct depth measurement (no image matching)
  • Volumetric & area calculations
  • Works in low-light / featureless environments

Workflow & Interoperability
  • Editable mesh / CAD-BIM compatible output (Partial)
  • Scan-to-BIM workflow (Partial)
  • Industry-standard file formats (OBJ, FBX, LAS, E57) (Partial)
  • Established software ecosystem (Partial)
  • No specialist hardware required

“Partial” indicates the capability is possible but with significant limitations or requires additional tooling.

Splat Labs Cloud

Capturing the splat is just the beginning

Splat Labs turns raw Gaussian Splats into shareable, measurable, annotatable deliverables — in your browser, with no specialist software.

Host & Share

Publish splats to a secure, shareable link. Control who can view — public, private, or team-only.

Measure

Take precise point-to-point, area, and volume measurements directly inside the 3D model.

Annotate

Pin notes, documents, photos, and videos to any point in the 3D scene for team collaboration.

Build Tours

Create guided walkthroughs with auto-camera paths and a filmstrip navigator.

Overlay Data

Drape KML layers and orthomosaics over your splat for geospatial context.

AI Scene Redesign

Remove furniture, redesign spaces, or insert 3D objects with a text prompt.

Frequently Asked Questions

01. What is the difference between a Gaussian splat and a point cloud?

A point cloud is a collection of discrete XYZ coordinates — dimensionless dots in space. Each point has no size, no shape, no opacity. Gaussian splats are volumetric: each one is a soft 3D ellipsoid with orientation, scale on three axes, color, and transparency. The result renders as photorealistic imagery rather than a cloud of colored dots.
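One way to see the gap is to count parameters. A colored point is typically six numbers, while a splat in the reference 3DGS layout (with degree-3 spherical harmonics) carries 59 learnable values:

```python
# Parameters per element: colored point cloud vs. 3D Gaussian splat
# (degree-3 SH, as in the reference 3DGS implementation).
point_cloud_params = 3 + 3   # XYZ position + RGB color

splat_params = (
    3         # position
    + 3       # scale along each ellipsoid axis
    + 4       # rotation quaternion
    + 1       # opacity
    + 3 * 16  # spherical harmonics: 16 coefficients per RGB channel
)
```

Nearly ten times the information per element, most of it devoted to describing how the splat's appearance changes with viewing angle.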

02. How does Gaussian splatting compare to photogrammetry?

Photogrammetry reconstructs geometry as a closed polygon mesh and then drapes a texture photograph over it. It struggles with transparency, thin geometry, and view-dependent reflections because those require the renderer to understand volume and light direction — not just surface position. Gaussian splatting doesn't try to build a mesh. It represents the scene as a cloud of volumetric light sources, which naturally handles all of these edge cases.

03. What hardware do I need to capture a Gaussian Splat?

You can capture a Gaussian Splat with any camera that produces overlapping photographs — including a smartphone. Purpose-built hardware like the PortalCam or XGRIDS Lixel L2 Pro produces higher density and better geometry, but the technology is hardware-agnostic. Splat Labs accepts splats from Polycam, Luma AI, DJI, and any other tool that outputs the standard .ply or .splat format.

04. How long does training take?

A typical interior scene trains in 20–60 minutes on consumer GPU hardware. Purpose-built training servers or cloud training platforms can reduce this to minutes. Training time scales with scene complexity and the number of input photographs.

05. Can I take measurements inside a Gaussian Splat?

Yes, with Splat Labs. Measurements are taken against the underlying point geometry that anchors the splats, giving sub-centimeter accuracy for most scenes. You can measure distances, areas, and volumes directly in the browser without any desktop software.
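The geometry behind those tools is standard: straight-line distance between picked points, and planar polygon area from summed cross products. A sketch (how Splat Labs anchors picks to the underlying point geometry is its own implementation detail):

```python
import math

def distance(a, b):
    """Straight-line distance between two picked 3D points."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def polygon_area(vertices):
    """Area of a planar polygon in 3D: half the norm of the summed
    cross products of consecutive vertex position vectors."""
    sx = sy = sz = 0.0
    n = len(vertices)
    for i in range(n):
        x1, y1, z1 = vertices[i]
        x2, y2, z2 = vertices[(i + 1) % n]
        sx += y1 * z2 - z1 * y2
        sy += z1 * x2 - x1 * z2
        sz += x1 * y2 - y1 * x2
    return 0.5 * math.sqrt(sx * sx + sy * sy + sz * sz)

# A 4 m x 3 m wall picked as four corners (units in metres):
wall = [(0, 0, 0), (4, 0, 0), (4, 0, 3), (0, 0, 3)]
diagonal = distance(wall[0], wall[2])   # corner-to-corner measurement
area = polygon_area(wall)               # wall surface area
```

Volume measurements work the same way in principle, integrating over a picked region instead of a polygon boundary.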

06. Do Gaussian Splats work in VR or AR?

Real-time VR rendering is an active area of development. Splat Labs currently supports VR viewer mode in-browser. Native XR rendering (Quest, Vision Pro) is on the roadmap as GPU rasterization pipelines for splats mature.

07. How large are Gaussian Splat files?

A typical interior scene with 1–3 million Gaussians is 200–600 MB in raw .ply format. Splat Labs applies compression and streaming so viewers load quickly in the browser regardless of file size.
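The raw figure follows from per-splat storage. Assuming the reference 3DGS .ply layout of 62 float32 values per Gaussian (including three unused normal fields), a million splats is roughly a quarter gigabyte uncompressed:

```python
# Back-of-envelope size of an uncompressed .ply splat file,
# assuming the reference 3DGS per-Gaussian layout.
floats_per_splat = 62        # position, normals, SH, opacity, scale, rotation
bytes_per_float = 4          # float32
splats = 1_000_000

raw_bytes = splats * floats_per_splat * bytes_per_float
raw_mb = raw_bytes / 1_000_000   # decimal megabytes
```

That is why compression and streaming matter: multiplying out to 2-3 million Gaussians quickly reaches hundreds of megabytes.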

08. Can I embed a Gaussian Splat on my own website or in Zillow?

Yes. Splat Labs generates embeddable iframes that you can paste into any web page, Zillow listing, or content management system. Enterprise users can white-label the viewer with their own branding.

Ready to try Gaussian Splatting?

Capture the world. Share it in 3D.

Upload your first Gaussian Splat to Splat Labs for free — no credit card, no desktop software, no specialist knowledge required.
