11 changes: 5 additions & 6 deletions README.md
@@ -61,14 +61,17 @@ Copy the following code into an `index.html` file.
</script>
<script type="module">
import * as THREE from "three";
-import { SplatMesh } from "@sparkjsdev/spark";
+import { SparkRenderer, SplatMesh } from "@sparkjsdev/spark";

const scene = new THREE.Scene();
-const camera = new THREE.PerspectiveCamera(60, window.innerWidth / window.innerHeight, 0.1, 1000);
+const camera = new THREE.PerspectiveCamera(60, window.innerWidth / window.innerHeight, 0.01, 1000);
const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement)

+const spark = new SparkRenderer({ renderer });
+scene.add(spark);

const splatURL = "https://sparkjs.dev/assets/splats/butterfly.spz";
const butterfly = new SplatMesh({ url: splatURL });
butterfly.quaternion.set(1, 0, 0, 0);
@@ -82,10 +85,6 @@ Copy the following code into an `index.html` file.
</script>
```

-### Web Editor
-
-Remix the [glitch starter template](https://glitch.com/edit/#!/sparkjs-dev)
-
### CDN

```html
9 changes: 8 additions & 1 deletion docs/docs/0.1-2.0-migration-guide.md
@@ -26,6 +26,13 @@ Spark 2.0 was designed with backward-compatibility in mind. We expect most 0.1 a
</script>
```

+## Creation of the SparkRenderer
+
+In Spark 0.1 you didn't have to create a `SparkRenderer` manually, as Spark would automatically inject one into your scene. Spark 2.0 requires you to create one and add it to your scene yourself; otherwise no splats will be rendered.
+```javascript
+const spark = new SparkRenderer({ renderer });
+scene.add(spark);
+```
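For context, here is a sketch of where that creation step fits in a full scene setup, assembled from the README diff in this PR. The render loop at the end is an assumption on my part (that part of the README isn't shown in the hunk); everything else mirrors the diffed lines.

```javascript
import * as THREE from "three";
import { SparkRenderer, SplatMesh } from "@sparkjsdev/spark";

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(60, window.innerWidth / window.innerHeight, 0.01, 1000);
const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

// Spark 2.0: the SparkRenderer must be created and added to the scene explicitly.
const spark = new SparkRenderer({ renderer });
scene.add(spark);

const splatURL = "https://sparkjs.dev/assets/splats/butterfly.spz";
const butterfly = new SplatMesh({ url: splatURL });
scene.add(butterfly);

// Assumed render loop; the three.js setAnimationLoop API drives per-frame rendering.
renderer.setAnimationLoop(() => renderer.render(scene, camera));
```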

## Multiple viewpoints and renderers

In Spark 0.1 you created multiple viewpoints from a single `SparkRenderer` instance by calling `.newViewpoint()` on it, which returned a `SparkViewpoint`. This representation stored one global set of RGBA splats and N per-viewpoint splat rendering orders for correct back-to-front rendering and blending. This model made it impossible to have differently colored splats per viewpoint (for example RGBA splats alongside depth-colored ones), to render different sets of splats with different shader effects applied, or to apply different directional lighting from spherical harmonics at different vantage points.
@@ -113,7 +120,7 @@ Spark 2.0 supports SOGS files via a `.zip` file with `manifest.json` and referen

## Temporary fallback: OldSparkRenderer

-If all else fails, the original `SparkRenderer` class from Spark 0.1 has been renamed to `OldSparkRenderer` (and other classes similarly like `OldSparkViewpoint`). Rename your `new SparkRenderer()` calls to `new OldSparkRenderer()`, and make sure to explicitly add your `OldSparkRenderer` to your scene because only the new `SparkRenderer` will be automatically injected.
+If all else fails, the original `SparkRenderer` class from Spark 0.1 has been renamed to `OldSparkRenderer` (and other classes similarly like `OldSparkViewpoint`). Rename your `new SparkRenderer()` calls to `new OldSparkRenderer()`, and make sure to explicitly add your `OldSparkRenderer` to your scene because it won't be automatically injected.
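Concretely, the fallback amounts to a rename plus an explicit `scene.add` — a minimal sketch, assuming `OldSparkRenderer` accepts the same `{ renderer }` options object as the class it was renamed from:

```javascript
import * as THREE from "three";
import { OldSparkRenderer } from "@sparkjsdev/spark";

const renderer = new THREE.WebGLRenderer();
const scene = new THREE.Scene();

// Unlike Spark 0.1, which injected its renderer into the scene automatically,
// the renamed OldSparkRenderer must be added to the scene by hand.
const spark = new OldSparkRenderer({ renderer });
scene.add(spark);
```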

We expect to unwind and remove this support over time and hope you will be able to migrate to the new renderer!
