fix: recreate data textures to fix memory leak when the scene is disposed (#320)
lekzd wants to merge 1 commit into sparkjsdev:main from
Conversation
2edea37 to 068dd38
Thanks for the PR. Ideally we would avoid the [...]. That said, I did try the changes in the PR, but the memory was still being retained after disposing both the [...].
@mrxz Sure! I had already forgotten about all the hacks I made on the app side to fix stale links in memory. This one is a dirty patch for the `DynoProgram.prepareMaterial` method:

```typescript
const materialsMap = new Map<typeof spark.dyno.DynoProgram, RawShaderMaterial>();

spark.dyno.DynoProgram.prototype.prepareMaterial = function () {
  if (materialsMap.has(this)) {
    return materialsMap.get(this);
  }
  const material = new RawShaderMaterial({
    glslVersion: GLSL3,
    vertexShader: spark.utils.IDENT_VERTEX_SHADER,
    fragmentShader: this.shader,
    uniforms: this.uniforms,
  });
  materialsMap.set(this, material);
  return material;
};
```

And the cleanup code for when the scene should be destroyed:

```typescript
materialsMap.forEach((material, program) => {
  material.dispose();
  // Clear hard links to uniforms held by the material cache
  Object.keys(program.uniforms).forEach((key) => {
    program.uniforms[key].value = undefined;
  });
});
materialsMap.clear();
```

So I tried without it, and the memory leak still persists.
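For reference, the retention pattern that cleanup code works around can be reproduced with a small self-contained sketch. This is plain TypeScript with hypothetical names (`FakeProgram`, `FakeMaterial`, `getMaterial`, `disposeAll`), not Spark's actual API: a strong `Map` keyed by the program keeps the cached material, and every uniform value it references, reachable until explicitly cleared.

```typescript
// Minimal model of the retention problem: a module-level Map cache keyed by
// a program object keeps the cached material -- and every uniform value it
// references -- alive for the lifetime of the cache.
// All names here are hypothetical stand-ins, not Spark's real classes.
type Uniforms = Record<string, { value: unknown }>;

class FakeProgram {
  constructor(public uniforms: Uniforms) {}
}

class FakeMaterial {
  // The material shares the program's uniforms object, just like
  // RawShaderMaterial does when constructed with `uniforms: this.uniforms`.
  constructor(public uniforms: Uniforms) {}
}

// Strong cache: entries survive until explicitly cleared.
const strongCache = new Map<FakeProgram, FakeMaterial>();

function getMaterial(program: FakeProgram): FakeMaterial {
  let material = strongCache.get(program);
  if (!material) {
    material = new FakeMaterial(program.uniforms);
    strongCache.set(program, material);
  }
  return material;
}

// Explicit cleanup, mirroring the clearance code above: drop the uniform
// values so large arrays become unreachable, then empty the cache.
function disposeAll(): void {
  strongCache.forEach((material) => {
    Object.values(material.uniforms).forEach((u) => {
      u.value = undefined;
    });
  });
  strongCache.clear();
}
```

A `WeakMap<FakeProgram, FakeMaterial>` would make the explicit clearing of the cache itself unnecessary, since entries vanish once the program is unreachable, though it would not release uniform values that a still-reachable program continues to hold.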
@lekzd Thanks for the additional details. I've created an alternative PR #326 based on this one, which nulls the texture source data instead. The general idea is still the same, allowing the large arrays to be garbage collected, but it doesn't require the additional cleanup code you're using. If you could give it a try to confirm that it works, that would be appreciated. Of course, there is still some memory being leaked, though no longer in huge amounts. The way the materials for [...]
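For context, the idea behind nulling the texture source data can be illustrated with a small self-contained sketch. `MockDataTexture` here is a hypothetical stand-in for three.js's `DataTexture`, not the actual implementation in PR #326:

```typescript
// A DataTexture-like object: the GPU copy is managed by the renderer, but
// the CPU-side typed array stays reachable for as long as `image.data`
// holds a reference to it -- even after GPU resources are released.
// MockDataTexture is a hypothetical stand-in, not the real three.js class.
interface MockDataTexture {
  image: { data: Float32Array | null; width: number; height: number };
  dispose(): void;
}

function createMockTexture(width: number, height: number): MockDataTexture {
  const texture: MockDataTexture = {
    // RGBA: four floats per texel
    image: { data: new Float32Array(width * height * 4), width, height },
    dispose() {
      // Nulling the source data lets the GC reclaim the large array even if
      // something else (a material cache, a uniform) still references the
      // texture object itself.
      texture.image.data = null;
    },
  };
  return texture;
}
```

The key point is that calling `dispose()` on the texture object is not enough on its own if the texture remains referenced elsewhere; dropping the `data` reference is what makes the large allocation collectable.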
fixes #286
Fix for stale links to old DataTextures in Dyno

This simply recreates the DataTexture references on every dispose, to prevent them from being retained after SparkPager.dispose() is called.
Steps to check: