# [![luma-logo](./assets/logo.svg)](https://lumalabs.ai) Luma WebGL Library

Based on the Luma Interactive Scenes (captures) Web Examples: use lumalabs.ai captures directly in your three.js or other WebGL projects!
`luma-web` is a [npm package](https://www.npmjs.com/package/@lumaai/luma-web) for rendering photoreal interactive scenes captured by the [Luma app](https://lumalabs.ai/). It includes `LumaSplatsWebGL`, which is a WebGL-only gaussian splatting implementation designed to be integrated with 3D frameworks, and `LumaSplatsThree`, which is a Three.js implementation that uses `LumaSplatsWebGL` under the hood. For these examples we'll use [Three.js](https://threejs.org/).
**Request features and report bugs on our [![github-logo](./assets/images/github-mark-16.svg) GitHub repo](https://github.com/lumalabs/luma-web-library)**
### Contents
- [Getting Started](#getting-started)
- [Background Removal](#background-removal)
- [Three Fog](#three-fog)
- [Scene Lighting](#scene-lighting)
- [Custom Shaders](#custom-shaders)
- [React Three Fiber](#react-three-fiber)
- [Transmission](#transmission)
- [VR](#vr)
## Getting Started
[![hello-world-demo](./assets/images/hello-world-preview.jpg)](#getting-started)
The simplest way to get started is to create a .html file and load the library from a CDN. Here we load three.js and the Luma library and set up a minimal scene with mouse controls. Copy and paste this into a file named `index.html` and open it in your browser (no server needed).
**[minimal.html](./src/minimal.html)**
```html
<canvas></canvas>
<script type="importmap">
{
"imports": {
"three": "https://unpkg.com/
[email protected]/build/three.module.js",
"three/addons/": "https://unpkg.com/
[email protected]/examples/jsm/",
"@lumaai/luma-web": "https://unpkg.com/@lumaai/
[email protected]/dist/library/luma-web.module.js"
}
}
</script>
<script type="module">
import { WebGLRenderer, PerspectiveCamera, Scene } from 'three';
import { OrbitControls } from 'three/addons/controls/OrbitControls.js';
import { LumaSplatsThree } from '@lumaai/luma-web';
let canvas = document.querySelector('canvas');
let renderer = new WebGLRenderer({
canvas: canvas,
antialias: false
});
renderer.setSize(window.innerWidth, window.innerHeight, false);
let scene = new Scene();
let camera = new PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1000);
camera.position.z = 2;
let controls = new OrbitControls(camera, canvas);
controls.enableDamping = true;
let splat = new LumaSplatsThree({
source: 'https://lumalabs.ai/capture/d80d4876-cf71-4b8a-8b5b-49ffac44cd4a'
});
scene.add(splat);
renderer.setAnimationLoop(() => {
controls.update();
renderer.render(scene, camera);
});
</script>
```
Alternatively you can install the library from npm:
```bash
npm install @lumaai/luma-web
```
**Usage**
Import the `LumaSplatsThree` class:
```ts
import { LumaSplatsThree } from "@lumaai/luma-web";
```
Then create an instance of `LumaSplatsThree` with a splat `source`, and add it to your scene.
`source` can be one of:
- a URL to a capture on [lumalabs.ai](https://lumalabs.ai)
- a path to a luma splats file, or to a folder containing luma splats artifacts
**[DemoHelloWorld.ts](./src/DemoHelloWorld.ts)**
```ts
let splats = new LumaSplatsThree({
    source: 'https://lumalabs.ai/capture/ca9ea966-ca24-4ec1-ab0f-af665cb546ff',
    // controls the particle entrance animation
    particleRevealEnabled: true,
});

scene.add(splats);
```
Splats will integrate with the three.js rendering pipeline and interact with other objects via depth testing. However, splats do not currently write to the depth buffer themselves.
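Because splats depth-test against the rest of the scene, regular three.js objects can be mixed in freely. A minimal sketch (the box is a hypothetical object, reusing the `scene` from above): it will be hidden by splats in front of it and will cover splats behind it.

```ts
import { BoxGeometry, Mesh, MeshBasicMaterial } from 'three';

// an ordinary three.js mesh, depth-tested against the splats
let box = new Mesh(
    new BoxGeometry(0.5, 0.5, 0.5),
    new MeshBasicMaterial({ color: 0xffa500 })
);
box.position.set(0, 0.25, 0);
scene.add(box);
```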
### Performance tips
- Use `antialias: false` when creating the renderer to disable MSAA on the canvas. Splats are already anti-aliased, and the high instance count in splats is expensive to render with MSAA.
- Set `enableThreeShaderIntegration: false` to disable integration with the three.js rendering pipeline. This will disable features like fog and tone mapping, but will improve performance. Both settings are combined in the sketch below.
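A minimal sketch applying both tips (assumes a `canvas` element and reuses the capture URL from the earlier example):

```ts
import { WebGLRenderer } from 'three';
import { LumaSplatsThree } from '@lumaai/luma-web';

// MSAA off: splats are already anti-aliased, and MSAA is expensive at high instance counts
let renderer = new WebGLRenderer({ canvas, antialias: false });

let splats = new LumaSplatsThree({
    source: 'https://lumalabs.ai/capture/ca9ea966-ca24-4ec1-ab0f-af665cb546ff',
    // trade fog/tone-mapping/color-space integration for performance
    enableThreeShaderIntegration: false,
});
```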
## Background Removal
[![background-removal-demo](./assets/images/background-removal-preview.jpg)](#background-removal)
Luma scenes can include multiple semantic layers. By default, all layers are rendered. To filter layers, use the `semanticsMask` property. This is a bit mask, so, for example, to show only the foreground layer, set `semanticsMask = LumaSplatsSemantics.FOREGROUND`. To show both the foreground and background, set `semanticsMask = LumaSplatsSemantics.FOREGROUND | LumaSplatsSemantics.BACKGROUND`.
**[DemoBackgroundRemoval.ts](./src/DemoBackgroundRemoval.ts)**
```ts
import { LumaSplatsSemantics, LumaSplatsThree } from "@lumaai/luma-web";
let splats = new LumaSplatsThree({
    source: 'https://lumalabs.ai/capture/1b5f3e33-3900-4398-8795-b585ae13fd2d',
});

scene.add(splats);

// filter splats to only show foreground layers
splats.semanticsMask = LumaSplatsSemantics.FOREGROUND;
```
## Three Fog
[![three.js-fog-demo](./assets/images/three.js-fog-preview.jpg)](#three-fog)
Luma splats integrate with the three.js rendering pipeline, including features like tone mapping, color spaces, and fog. Ensure `enableThreeShaderIntegration` is set to `true` (the default) and set the scene fog.
**[DemoFog.ts](./src/DemoFog.ts)**
```ts
scene.fog = new FogExp2(new Color(0xe0e1ff).convertLinearToSRGB(), 0.15);
scene.background = scene.fog.color;
```
## Scene Lighting
[![scene-lighting-demo](./assets/images/scene-lighting-preview.jpg)](#scene-lighting)
It's possible to illuminate three.js scenes with Luma splats. To do so, we can render a cubemap of the splats and use it as the scene environment. This is done by calling `captureCubemap()` on the splats object. We first wait for the splats to fully load before capturing the cubemap. To ensure the splats are fully rendered at the time of capture, we disable the loading animation.
**[DemoLighting.ts](./src/DemoLighting.ts)**
```ts
let splats = new LumaSplatsThree({
    source: 'https://lumalabs.ai/capture/4da7cf32-865a-4515-8cb9-9dfc574c90c2',
    // disable loading animation so model is fully rendered after onLoad
    loadingAnimationEnabled: false,
});

splats.onLoad = () => {
    splats.captureCubemap(renderer).then((capturedTexture) => {
        scene.environment = capturedTexture;
        scene.background = capturedTexture;
        scene.backgroundBlurriness = 0.5;
    });
}
```
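Once `scene.environment` is set, standard three.js materials pick up image-based lighting from the captured cubemap. A minimal sketch (the sphere is a hypothetical object, not part of the capture):

```ts
import { Mesh, MeshStandardMaterial, SphereGeometry } from 'three';

// a glossy sphere lit and reflected by the captured environment cubemap
let sphere = new Mesh(
    new SphereGeometry(0.5, 32, 32),
    new MeshStandardMaterial({ metalness: 1, roughness: 0.1 })
);
scene.add(sphere);
```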
## Custom Shaders
[![custom-shaders-demo](./assets/images/custom-shaders-preview.jpg)](#custom-shaders)
You can inject code into the splat shaders to customize them. To do this, call `setShaderHooks({ ... })` on your splat and provide GLSL functions, uniforms, and globals to override the default behavior. For example, in this demo we apply a transform matrix to each splat by setting the vertex shader hook `getSplatTransform`. It generates a transform matrix that applies a time-varying sinusoidal offset to the y coordinate.
The syntax for a shader hook is a GLSL function without a function name. The function's arguments and return type are documented on the corresponding shader hook fields (see below).
**[DemoCustomShaders.ts](./src/DemoCustomShaders.ts)**
```ts
import { Uniform } from 'three';

// time in seconds; update this value each frame (for example, in the render loop)
let uniformTime = new Uniform(0);

splats.setShaderHooks({
    vertexShaderHooks: {
        additionalUniforms: {
            time_s: ['float', uniformTime],
        },

        getSplatTransform: /*glsl*/`
            (vec3 position, uint layersBitmask) {
                // time-varying sinusoidal offset along the y axis, driven by the x position
                float x = 0.;
                float z = 0.;
                float y = sin(position.x * 1.0 + time_s) * 0.1;
                return mat4(
                    1., 0., 0., 0,
                    0., 1., 0., 0,
                    0., 0., 1., 0,
                    x,  y,  z,  1.
                );
            }
        `,
    }
});
```
### Shader Hook API
```typescript
type LumaShaderHooks = {

    /** Hooks added to the vertex shader */
    vertexShaderHooks?: {
        additionalUniforms?: { [name: string]: [UniformTypeGLSL, { value: any }] },

        /** Inject into global space (for example, to add a varying) */
        additionalGlobals?: string,

        /**
         * Example `(vec3 splatPosition, uint layersBitmask) { return mat4(1.); }`
         * @param {vec3} splatPosition, object-space
         * @param {uint} layersBitmask, bit mask of layers, where bit 0 is background and bit 1 is foreground
         * @returns {mat4} per-splat local transform
         */
        getSplatTransform?: string,

        /**
         * Executed at the end of the main function after gl_Position is set
         *
         * Example `() {
         *   vPosition = gl_Position;
         * }`
         * @returns {void}
         */
        onMainEnd?: string,

        /**
         * Example `(vec4 splatColor, vec3 splatPosition) { return vec4(pow(splatColor.rgb, vec3(2.2)), splatColor.a); }`
         * `gl_Position` is available for use
         * @param {vec4} splatColor, default splat color
         * @param {vec3} splatPosition, object-space
         * @param {uint} layersBitmask, bit mask of layers, where bit 0 is background and bit 1 is foreground
         * @returns {vec4} updated splat color
         */
        getSplatColor?: string,
    },

    /** Hooks added to the fragment shader */
    fragmentShaderHooks?: {
        additionalUniforms?: { [name: string]: [UniformTypeGLSL, { value: any }] },

        /** Inject into global space (for example, to add a varying) */
        additionalGlobals?: string,

        /**
         * Example `(vec4 fragColor) { return tonemap(fragColor); }`
         * @param {vec4} fragColor, default fragment color
         * @returns {vec4} updated fragment color
         */
        getFragmentColor?: string,
    }
}
```
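The demo above only uses vertex shader hooks; fragment shader hooks follow the same pattern. A minimal sketch of `getFragmentColor` (the tint values are arbitrary):

```ts
splats.setShaderHooks({
    fragmentShaderHooks: {
        getFragmentColor: /*glsl*/`
            (vec4 fragColor) {
                // apply a slight warm tint to every fragment
                return vec4(fragColor.rgb * vec3(1.1, 1.0, 0.9), fragColor.a);
            }
        `,
    }
});
```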
## React Three Fiber
[![react-three-fiber-demo](./assets/images/react-three-fiber-preview.jpg)](#react-three-fiber)
Luma splats can be used with [React Three Fiber](https://docs.pmnd.rs/), a React renderer for Three.js. To do so, we need to extend R3F to include the `LumaSplatsThree` class. This is done by calling `extend` with the class and a name (in this case `LumaSplats` which will be used as the component name). If using TypeScript, we also need to declare the component type.
**[DemoReactThreeFiber.tsx](./src/DemoReactThreeFiber.tsx)**
```typescript
import { Object3DNode, extend } from '@react-three/fiber';
import { LumaSplatsThree, LumaSplatsSemantics } from "@lumaai/luma-web";

// Make LumaSplatsThree available to R3F
extend( { LumaSplats: LumaSplatsThree } );

// For TypeScript support:
declare module '@react-three/fiber' {
    interface ThreeElements {
        lumaSplats: Object3DNode<LumaSplatsThree, typeof LumaSplatsThree>
    }
}

function Scene() {
    return <lumaSplats
        semanticsMask={LumaSplatsSemantics.FOREGROUND}
        source='https://lumalabs.ai/capture/822bac8d-70d6-404e-aaae-f89f46672c67'
        position={[-1, 0, 0]}
        scale={0.5}
    />
}
```
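The `Scene` component can then be mounted inside an R3F `Canvas` like any other component. A minimal sketch (the `App` component is hypothetical):

```typescript
import { Canvas } from '@react-three/fiber';

function App() {
    return <Canvas>
        <Scene />
    </Canvas>;
}
```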
## Transmission
[![transmission-demo](./assets/images/transmission-preview.jpg)](#transmission)
Splats can be used in combination with three.js transmission effects; however, some care is needed to make this work. Splats are treated as `transparent` materials in three.js, which means that by default they are not rendered in the transmissive pass, so initially you won't see your splats through transmissive materials. To fix this, we set `splats.material.transparent = false;`.
In this example, we draw two splat scenes: one inside a refractive globe and one outside it. To make this work, we want the inner splat scene to _only_ render to the transmission buffer and the outer one to render to the canvas. We do this by checking the render target before rendering and selectively disabling drawing.
**[DemoTransmission.ts](./src/DemoTransmission.ts)**
```typescript
import { FrontSide, Mesh, MeshPhysicalMaterial, SphereGeometry } from 'three';
import { LumaSplatsThree } from '@lumaai/luma-web';

// inner splat
let globeSplats = new LumaSplatsThree({
    // Chateau de Menthon - Annecy
    source: 'https://lumalabs.ai/capture/da82625c-9c8d-4d05-a9f7-3367ecab438c',
    enableThreeShaderIntegration: true,
    onBeforeRender: (renderer) => {
        // disable MSAA on render targets (in this case the transmission render target)
        // this improves splatting performance
        let target = renderer.getRenderTarget();
        if (target) {
            target.samples = 0;
        }

        // only render in targets and not the canvas
        globeSplats.preventDraw = target == null;
    }
});

// disable transparency so the renderer considers it an opaque object
// opaque objects are rendered in the transmission pass (whereas transparent objects are not)
globeSplats.material.transparent = false;

scene.add(globeSplats);

// outer splat
let environmentSplats = new LumaSplatsThree({
    // Arosa Hörnli - Switzerland
    source: 'https://lumalabs.ai/capture/4da7cf32-865a-4515-8cb9-9dfc574c90c2',
    // disable animation for lighting capture
    loadingAnimationEnabled: false,
    // disable three.js shader integration for performance
    enableThreeShaderIntegration: false,
});

scene.add(environmentSplats);

// add a refractive transmissive sphere
let glassSphere = new Mesh(
    new SphereGeometry(1, 32, 32),
    new MeshPhysicalMaterial({
        roughness: 0,
        metalness: 0,
        transmission: 1,
        ior: 1.341,
        thickness: 1.52,
        envMapIntensity: 1.2,
        clearcoat: 1,
        side: FrontSide,
        transparent: true,
    })
);

scene.add(glassSphere);
```
## VR
[![vr-demo](./assets/images/vr-preview.jpg)](#vr)
Viewing your splats in VR is as simple as enabling XR in three.js and adding a VR button. View this demo with a VR headset (or through a headset browser) and click "Enter VR"! It works best on PC VR; standalone headsets currently tend to struggle with splats.
**[DemoVR.ts](./src/DemoVR.ts)**
```typescript
import { VRButton } from "three/examples/jsm/webxr/VRButton.js";
import { LumaSplatsThree } from "@lumaai/luma-web";

renderer.xr.enabled = true;

let vrButton = VRButton.createButton(renderer);
document.body.appendChild(vrButton);

let splats = new LumaSplatsThree({
    // Kind Humanoid @RyanHickman
    source: 'https://lumalabs.ai/capture/83e9aae8-7023-448e-83a6-53ccb377ec86',
});
```
", Assign "at most 3 tags" to the expected json: {"id":"5869","tags":[]} "only from the tags list I provide: [{"id":77,"name":"3d"},{"id":89,"name":"agent"},{"id":17,"name":"ai"},{"id":54,"name":"algorithm"},{"id":24,"name":"api"},{"id":44,"name":"authentication"},{"id":3,"name":"aws"},{"id":27,"name":"backend"},{"id":60,"name":"benchmark"},{"id":72,"name":"best-practices"},{"id":39,"name":"bitcoin"},{"id":37,"name":"blockchain"},{"id":1,"name":"blog"},{"id":45,"name":"bundler"},{"id":58,"name":"cache"},{"id":21,"name":"chat"},{"id":49,"name":"cicd"},{"id":4,"name":"cli"},{"id":64,"name":"cloud-native"},{"id":48,"name":"cms"},{"id":61,"name":"compiler"},{"id":68,"name":"containerization"},{"id":92,"name":"crm"},{"id":34,"name":"data"},{"id":47,"name":"database"},{"id":8,"name":"declarative-gui "},{"id":9,"name":"deploy-tool"},{"id":53,"name":"desktop-app"},{"id":6,"name":"dev-exp-lib"},{"id":59,"name":"dev-tool"},{"id":13,"name":"ecommerce"},{"id":26,"name":"editor"},{"id":66,"name":"emulator"},{"id":62,"name":"filesystem"},{"id":80,"name":"finance"},{"id":15,"name":"firmware"},{"id":73,"name":"for-fun"},{"id":2,"name":"framework"},{"id":11,"name":"frontend"},{"id":22,"name":"game"},{"id":81,"name":"game-engine "},{"id":23,"name":"graphql"},{"id":84,"name":"gui"},{"id":91,"name":"http"},{"id":5,"name":"http-client"},{"id":51,"name":"iac"},{"id":30,"name":"ide"},{"id":78,"name":"iot"},{"id":40,"name":"json"},{"id":83,"name":"julian"},{"id":38,"name":"k8s"},{"id":31,"name":"language"},{"id":10,"name":"learning-resource"},{"id":33,"name":"lib"},{"id":41,"name":"linter"},{"id":28,"name":"lms"},{"id":16,"name":"logging"},{"id":76,"name":"low-code"},{"id":90,"name":"message-queue"},{"id":42,"name":"mobile-app"},{"id":18,"name":"monitoring"},{"id":36,"name":"networking"},{"id":7,"name":"node-version"},{"id":55,"name":"nosql"},{"id":57,"name":"observability"},{"id":46,"name":"orm"},{"id":52,"name":"os"},{"id":14,"name":"parser"},{"id":74,"name":"react"},{"id":82,"name":"real-time"},{"id":56,"name":"robot"},{"id":65,"name":"runtime"},{"id":32,"name":"sdk"},{"id":71,"name":"search"},{"id":63,"name":"secrets"},{"id":25,"name":"security"},{"id":85,"name":"server"},{"id":86,"name":"serverless"},{"id":70,"name":"storage"},{"id":75,"name":"system-design"},{"id":79,"name":"terminal"},{"id":29,"name":"testing"},{"id":12,"name":"ui"},{"id":50,"name":"ux"},{"id":88,"name":"video"},{"id":20,"name":"web-app"},{"id":35,"name":"web-server"},{"id":43,"name":"webassembly"},{"id":69,"name":"workflow"},{"id":87,"name":"yaml"}]" returns me the "expected json"