
fix: recreate data textures to fix memory leak in case of scene should be disposed #320

Open

lekzd wants to merge 1 commit into sparkjsdev:main from lekzd:fix/v2.0.0-preview-memory-leak

Conversation

lekzd commented Apr 23, 2026

fixes #286

Fix for stale links to old DataTextures in dyno

The fix simply recreates the DataTexture links on every dispose, so they are no longer retained after SparkPager.dispose() is called.

Steps to check:

  1. Create a SplatRenderer and load the first scene via SplatMesh.
  2. Dispose the SplatRenderer and SplatMesh.
  3. Look at the heap memory and note the value.
  4. Repeat steps 1..3: the memory value should not grow.
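The retention pattern these steps check for can be simulated without Spark itself. Below is a minimal sketch (plain TypeScript, no spark/three dependency; all names illustrative) of how a module-level cache keeps a texture's large backing array reachable until the reference is nulled:

```ts
// A stand-in for a DataTexture: just a wrapper around a large typed array.
type Texture = { data: Float32Array | null };

const textureCache: Texture[] = []; // stands in for dyno's stale references

function createScene(): Texture {
  const tex: Texture = { data: new Float32Array(1024) };
  textureCache.push(tex); // the cache retains a hard link to the texture
  return tex;
}

function disposeScene(tex: Texture): void {
  // Nulling the source data lets the large array be garbage-collected
  // even while the small wrapper object stays cached.
  tex.data = null;
}

const tex = createScene();
disposeScene(tex);
console.log(tex.data === null);   // true
console.log(textureCache.length); // 1: the small wrapper is still retained
```

This is why repeating steps 1..3 is the right check: each cycle would otherwise pin another set of large arrays in the heap.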

@lekzd lekzd changed the base branch from v2.0.0-preview to main April 23, 2026 17:39
@lekzd lekzd changed the base branch from main to v2.0.0-preview April 23, 2026 17:40
@lekzd lekzd force-pushed the fix/v2.0.0-preview-memory-leak branch from 2edea37 to 068dd38 Compare April 23, 2026 17:43
@lekzd lekzd changed the base branch from v2.0.0-preview to main April 23, 2026 17:43
mrxz (Collaborator) commented Apr 24, 2026

Thanks for the PR. Ideally we would prevent the SplatPager from being retained in memory at all, but that requires more impactful changes. In the meantime, at least getting rid of the large typed arrays is already a big win.

That said, I did try the changes in this PR, but the memory was still being retained after disposing both the SplatMesh and SparkRenderer. Could you share the code you used for testing, or just the disposal logic if that works better?

lekzd (Author) commented Apr 27, 2026

> Could you share the code you used for testing, or just the disposal logic if that works better?

@mrxz sure! I had almost forgotten about all the hacks I made on the app side to work around the stale links in memory:

This one is a dirty patch for the DynoProgram prepareMaterial method. I copied it from the original code and added a materialsMap so it can be cleared later:

```ts
// Requires: import { GLSL3, RawShaderMaterial } from "three";
// plus the spark library import used elsewhere in the app.
const materialsMap = new Map<typeof spark.dyno.DynoProgram, RawShaderMaterial>();

spark.dyno.DynoProgram.prototype.prepareMaterial = function () {
  if (materialsMap.has(this)) {
    return materialsMap.get(this);
  }

  const material = new RawShaderMaterial({
    glslVersion: GLSL3,
    vertexShader: spark.utils.IDENT_VERTEX_SHADER,
    fragmentShader: this.shader,
    uniforms: this.uniforms,
  });

  materialsMap.set(this, material);

  return material;
};
```

And the cleanup code for when the scene should be destroyed:

```ts
materialsMap.forEach((material, program) => {
  material.dispose();

  // Clear the uniforms' hard links from the material cache
  Object.keys(program.uniforms).forEach((key) => {
    program.uniforms[key].value = undefined;
  });
});
materialsMap.clear();
```
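As an aside, a WeakMap-keyed cache (a sketch, not Spark's actual code) would avoid the manual clearance of the map itself, since entries can be garbage-collected together with their programs; GPU-side resources would still need an explicit material.dispose():

```ts
// Illustrative types; in the real patch the key would be a DynoProgram
// instance and the value a RawShaderMaterial.
type Material = { dispose(): void };
type Program = object;

// WeakMap entries do not pin their keys: once a program becomes
// unreachable, its cached material can be collected with it.
const materialsWeak = new WeakMap<Program, Material>();

function getMaterial(program: Program, create: () => Material): Material {
  let m = materialsWeak.get(program);
  if (!m) {
    m = create();
    materialsWeak.set(program, m);
  }
  return m;
}

const prog: Program = {};
const a = getMaterial(prog, () => ({ dispose() {} }));
const b = getMaterial(prog, () => ({ dispose() {} }));
console.log(a === b); // true: cached once per program
```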

I tried without these patches as well, and the memory leak still persists.

mrxz (Collaborator) commented Apr 28, 2026

@lekzd Thanks for the additional details. I've created an alternative PR #326 based on this one, which nulls the texture source data instead. The general idea is still the same, allowing the large arrays to be garbage-collected, but it doesn't require the additional cleanup code you're using.

If you could give it a try to confirm that it works, that would be appreciated.

Of course, some memory is still being leaked, though no longer in huge amounts. The way the materials for DynoPrograms are cached should probably change so that ownership is better tracked, allowing them to be disposed properly.



Development

Successfully merging this pull request may close these issues.

Huge memory leak in v2.0

2 participants