Constructing The Monolith: Composable Rendering Programs for a 13-Scene WebGL Epic

November 30, 2025





To build this monolithic project with 13 different scenes, several systems made up of reusable and composable components were developed within React Three Fiber:

  1. Deferred Rendering & Outlines
  2. Composable Materials
  3. Composable Particle System
  4. Scene Transition System

The article begins with an overview of the concept art and early collaboration behind the project, then moves into dedicated sections that explain each system in detail. These sections describe the decisions behind Deferred Rendering and Outlines, the structure of the Composable Materials system, the logic behind the Composable Particle System, and the approach used for transitions between scenes.



Brief Intro & Concept Art

Kehan came to me directly, knowing me through a friend of a friend. He had a vision for the project and had already engaged Lin to illustrate several scenes. I told him the team I wanted, and we expanded into a full group of freelancers. Fabian joined as a shader developer, Nando as a creative, Henry as a 3D artist, Daisy as a producer, and HappyShip joined once Henry went on vacation.

Lin’s illustrations had such a distinctive and inspiring art style that translating them into 3D became an incredibly fun and exciting process. The team spent countless days and nights discussing how to bring the project to life, with a constant stream of new ideas and shared references; my bookmarks folder for the project now holds more than 50 links. It was a pleasure and a privilege to work with such a passionate and talented team.

1. Deferred Rendering & Outlines

A key feature of the art style is the use of colored outlines. After extensive research, we found three main ways to achieve this:

  1. Edge detection based on depth and normals
  2. Inverse hull
  3. Edge detection based on material IDs

We decided to use the first method for two reasons. With inverse hull, moving the camera closer to or farther from the object would cause the outline width to appear thicker or thinner. Material IDs would also not work well with the particle-based clouds.

Normals

https://themonolithproject.net/?depth

To use deferred rendering in Three.js, we need to set the count in WebGLRenderTarget, where each count represents a G-Buffer. For each G-Buffer, we can define the texture type and format to reduce memory usage.

In our case, we used a G-Buffer for storing normals. We applied a memory optimization technique called octahedral normal vector encoding, which allows normals to be encoded into fewer bits at the cost of extra encoding and decoding time.
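As an illustration, here is the encode/decode pair in plain JavaScript rather than GLSL (a sketch of the standard technique; the function names are ours, not the project's):

```javascript
// Octahedral normal encoding: maps a unit vector to two values in [-1, 1],
// which can then be quantized into a compact two-channel G-Buffer.
const signNotZero = (v) => (v >= 0 ? 1 : -1);

function octEncode([x, y, z]) {
  const invL1 = 1 / (Math.abs(x) + Math.abs(y) + Math.abs(z));
  let px = x * invL1;
  let py = y * invL1;
  if (z < 0) {
    // Fold the lower hemisphere over the diagonals.
    const ox = (1 - Math.abs(py)) * signNotZero(px);
    const oy = (1 - Math.abs(px)) * signNotZero(py);
    px = ox;
    py = oy;
  }
  return [px, py];
}

function octDecode([px, py]) {
  let x = px;
  let y = py;
  let z = 1 - Math.abs(px) - Math.abs(py);
  if (z < 0) {
    // Unfold the lower hemisphere.
    const ox = (1 - Math.abs(y)) * signNotZero(x);
    const oy = (1 - Math.abs(x)) * signNotZero(y);
    x = ox;
    y = oy;
  }
  const len = Math.hypot(x, y, z);
  return [x / len, y / len, z / len];
}
```

The decode step renormalizes, which is where the extra decoding cost mentioned above comes from.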

Outline Colors

https://themonolithproject.net/?outlineColor

We also wanted different colored outlines for different objects, so we used an additional G-Buffer. Because we were only using a small number of colors, one optimization could have been to use a color lookup texture, reducing the G-Buffer to just a few bits. However, we kept things simple and easier to adjust by using the full RGB range.

Outlines

https://themonolithproject.net/?outline

Once the G-Buffers are ready, a convolution filter is applied to the depth and normal data to detect edges. We then apply the color from the outline color G-Buffer to these edges. Resources such as Moebius Style Post Processing by Maxime Heckel and Outline Styled Material by Visual Tech Art were immensely helpful.
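The idea can be sketched on the CPU (a simplified stand-in for the fragment shader; the real pass also compares neighboring normals and runs per pixel on the GPU):

```javascript
// Simplified depth-based edge detection: convolve each pixel with a
// Laplacian-style kernel and mark an outline wherever the response
// exceeds a threshold, i.e. wherever depth is discontinuous.
function detectEdges(depth, width, height, threshold = 0.1) {
  const edges = new Uint8Array(width * height);
  for (let y = 1; y < height - 1; y++) {
    for (let x = 1; x < width - 1; x++) {
      const i = y * width + x;
      // 4-neighbor Laplacian: center against its cross neighbors.
      const response =
        4 * depth[i] -
        depth[i - 1] - depth[i + 1] -
        depth[i - width] - depth[i + width];
      edges[i] = Math.abs(response) > threshold ? 1 : 0;
    }
  }
  return edges;
}
```

Running this over a depth buffer that jumps from 0 to 1 halfway across marks exactly the pixels on either side of the jump.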

Gotchas

One issue with using count in Three.js WebGLRenderTarget is that all core materials, such as MeshBasicMaterial, will no longer render by default. A value must be assigned to the G-Buffer location for it to appear again. To avoid polluting the buffer with unwanted data, we can simply set it to itself.

layout(location = 1) out vec4 gNormal;

void main() {
  gNormal = gNormal;
}

2. Composable Materials

Since this project includes many scenes with numerous objects using different materials, I wanted to create a system that encapsulates a piece of shader functionality, together with any data and logic it requires, into a component. These components could then be combined to form a material. React and JSX make this kind of composability easy, resulting in a fast and intuitive developer experience.

Note: this project was developed in early 2024, before TSL was released. Things could be done differently today.

GBufferMaterial

The core of the system is the GBufferMaterial component. It is essentially a ShaderMaterial with useful uniforms and pre-calculated values, along with insertion points that modules can use to add extra shader code on top.

uniform float uTime;
/// insert 

void main() {
  vec2 st = vUv;

  /// insert 
}
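Under the hood, this kind of composition can be implemented as plain string splicing before the shader is compiled. The sketch below is our own guess at the mechanism; the marker syntax and helper names are illustrative, not the project's actual API:

```javascript
// Sketch of insertion-point splicing: each material module contributes
// GLSL snippets keyed by insertion point, and the base shader template
// is expanded with all of them before compilation.
const fragmentTemplate = `
uniform float uTime;
/// insert setup

void main() {
  vec2 st = vUv;
  /// insert main
}`;

function composeShader(template, modules) {
  let shader = template;
  for (const point of ['setup', 'main']) {
    const code = modules
      .map((m) => m[point] ?? '')
      .filter(Boolean)
      .join('\n');
    shader = shader.replace(`/// insert ${point}`, code);
  }
  return shader;
}

// A color module like MaterialModuleColor would contribute:
const colorModule = {
  setup: 'uniform vec3 uColor;',
  main: 'pc_fragColor.rgb = uColor;',
};
```

Because modules are spliced in order, later modules can read and modify values written by earlier ones, which is what makes the ordering of modules in JSX meaningful.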

MaterialModule

A large array of reusable modules, along with several custom one-off modules, was created for this project. The most basic of these is the MaterialModuleColor.

export const MaterialModuleColor = forwardRef(({ color, blend = '' }, ref) => {
  // COLOR
  const _color = useColor(color);

  const { material } = useMaterialModule({
    name: 'MaterialModuleColor',
    uniforms: {
      uColor: { value: _color },
    },
    fragmentShader: {
      setup: /*glsl*/ `
        uniform vec3 uColor;
      `,
      main: /*glsl*/ `
        pc_fragColor.rgb ${blend}= uColor;
      `,
    },
  });

  useEffect(() => {
    material.uniforms.uColor.value = _color;
  }, [_color]);

  useImperativeHandle(ref, () => _color, [_color]);

  return <></>;
});

It simply adds a uColor uniform and writes it to the output color.

Use Case

For example, the monolith material was assembled by composing a stack of these modules in JSX. All of them are generic modules that were reused across many different meshes throughout the site:

  • MaterialModuleNormal: encodes and writes the world normal to the normal G-Buffer
  • MaterialModuleOutline: writes the outline color to the outlineColor G-Buffer
  • MaterialModuleUVMap: sets the current st value based on the provided texture (affecting later modules that use st)
  • MaterialModuleGradient: draws a gradient color
  • MaterialModuleAnimatedGradient: draws an animated gradient
  • MaterialModuleBrightness: brightens the output color
  • MaterialModuleUVOriginal: resets st to the original UVs
  • MaterialModuleMap: draws a texture
  • MaterialModuleFlowMap: adds the flow map texture to the uniforms
  • MaterialModuleFlowMapColor: adds a color based on where the flow map is activated

Modules that affected the vertex shaders were also created, such as:

  • MaterialModuleWind: moves the vertex for a wind effect, used for trees, shrubs, etc.
  • MaterialModuleDistort: distorts the vertex, used for the planets

With this approach, complex shader functionality, such as wind, is encapsulated into a reusable and manageable component. It can then be combined with other vertex and fragment shader modules to create a wide variety of materials with ease.

3. Composable Particle System

Similarly, the idea of making things composable and reusable is extended to the ParticleSystem.

ParticleSystem

This is the core ParticleSystem component. Since it was written in WebGL, it includes logic to calculate position, velocity, rotation, and life data using the ping-pong rendering method. Additional features include prewarming, the ability to start and stop the particle system naturally (allowing remaining particles to finish their lifetime), and a burst mode that ultimately wasn't used.
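The ping-pong method can be sketched on the CPU (illustrative only; in the project the two buffers are GPU render targets sampled as textures):

```javascript
// Ping-pong update: the simulation state lives in two buffers; each step
// reads the previous state and writes the next, then the roles swap.
// On the GPU these would be two render targets, since a texture cannot
// be read and written in the same pass.
function createPingPong(size) {
  let read = new Float32Array(size);
  let write = new Float32Array(size);
  return {
    get state() {
      return read;
    },
    step(update) {
      for (let i = 0; i < size; i++) write[i] = update(read[i], i);
      [read, write] = [write, read]; // swap buffers
    },
  };
}

// Example: positions integrating a constant velocity of 1 unit per step.
const positions = createPingPong(3);
positions.step((p) => p + 1);
positions.step((p) => p + 1);
```

Prewarming is then just running `step` many times before the first visible frame.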

Just like the GBufferMaterial, the position, rotation, and life shaders contain insertion points for modules to use. For example:

void main() {
  vec4 currPosition = texture2D(texturePosition, uv);
  vec4 nextPosition = currPosition;

  if (needsReset) {
    /// insert 
  }

  /// insert 

  nextPosition += currVelocity * uDelta;
    
  /// insert 

  gl_FragColor = vec4(nextPosition);
}

It supported two modes: points or instanced mesh.

ParticleSystemModule

The system is inspired by Unity, with modules that define the emission shape as well as modules that affect position, velocity, rotation, and scale.

Emission modules

For example, the EmissionPlane module allows us to set particle starting positions based on the size and position of a plane.

The EmissionSphere module allows us to set the particle starting positions on the surface of a sphere.

The most powerful module is the EmissionShape module. It allows us to pass in a geometry, and it calculates the starting positions using MeshSurfaceSampler.
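MeshSurfaceSampler picks triangles with probability proportional to their area, then samples a uniform point inside the chosen triangle via barycentric coordinates. A standalone sketch of that per-triangle step (our own helper, not the Three.js API):

```javascript
// Uniformly sample a point on a triangle (a, b, c) using barycentric
// coordinates. This is the core step a surface sampler performs after
// choosing a triangle with area-weighted probability.
function samplePointOnTriangle(a, b, c, rng = Math.random) {
  let u = rng();
  let v = rng();
  if (u + v > 1) {
    // Reflect back into the triangle so the distribution stays uniform.
    u = 1 - u;
    v = 1 - v;
  }
  const w = 1 - u - v;
  return [
    w * a[0] + u * b[0] + v * c[0],
    w * a[1] + u * b[1] + v * c[1],
    w * a[2] + u * b[2] + v * c[2],
  ];
}
```

Without the reflection step, samples would cluster toward one corner of the triangle's bounding parallelogram.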

Place, Velocity, Rotation, and Scale modules

Other commonly used modules include:

  • VelocityAddDirection
  • VelocityAddOverTime
  • VelocityAddNoise
  • PositionAddMouse: adds to the position based on the mouse position, and can push or pull particles away from or toward the mouse
  • PositionSetSpline: sets a spline path for the particles to follow and ignores velocity
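A PositionSetSpline-style module needs to evaluate a curve at each particle's normalized lifetime. Three.js provides CatmullRomCurve3 for this; as a standalone illustration, one Catmull-Rom segment can be evaluated like so:

```javascript
// Evaluate one Catmull-Rom segment at t in [0, 1], given four control
// points; the curve passes through p1 (t = 0) and p2 (t = 1). A spline
// module would map a particle's normalized life to t along the path.
function catmullRom(p0, p1, p2, p3, t) {
  const t2 = t * t;
  const t3 = t2 * t;
  return p1.map((_, i) =>
    0.5 *
    (2 * p1[i] +
      (p2[i] - p0[i]) * t +
      (2 * p0[i] - 5 * p1[i] + 4 * p2[i] - p3[i]) * t2 +
      (3 * p1[i] - p0[i] - 3 * p2[i] + p3[i]) * t3)
  );
}
```

Chaining segments over a list of control points gives a smooth path that particles can follow independently of their velocity.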

Asteroids Use Case

Take the asteroid belt as an example.
The particles are emitted from a small sphere, then follow a spline path with a random rotation.

It also works with the GBufferMaterial, allowing us to shade it using the same modules. This is how the mouseover flow map is applied to this particle system—the same material module used for the monolith is also used here.

Leaves Use Case

4. Scene Transition System

Because of the large number of scenes and the variety of transitions we wanted to create, we built another system specifically for scene transitions. Every transition in the project uses this system, including:

  • solar system > planet: wipe up
  • planet > bone: zoom blur
  • history > tablet: mask
  • tablet > fall: mask
  • fall > overview: zoom blur
  • desert > swamp: radial
  • winter > forest: sphere
  • world > ending: mask

First, we draw scene A with deferred rendering, including depth and normals. Then we do the same for scene B.

Next, we use a fullscreen triangle with a material responsible for mixing the two scenes. We created four materials to support all of our transitions.

  • MaterialTransitionMix
  • MaterialTransitionZoom
  • MaterialTransitionRadialPosition
  • MaterialTransitionRaymarched

The simplest of these is MaterialTransitionMix, but it is also quite powerful. It takes the scene A texture, scene B texture, and an additional grayscale mix texture, then blends them based on a progress value from 0 to 1.
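One plausible per-pixel formula for this kind of mix (an assumption on our part; the article does not give the exact shader) reveals scene B wherever the grayscale mix value falls below the current progress, with an optional soft edge:

```javascript
// Per-pixel transition mix: sceneA and sceneB are pixel colors, mixValue
// is the grayscale mix-texture sample in [0, 1], and progress goes 0 -> 1.
// A soft edge of `softness` feathers the wipe boundary.
function smoothstep(e0, e1, x) {
  const t = Math.min(Math.max((x - e0) / (e1 - e0), 0), 1);
  return t * t * (3 - 2 * t);
}

function transitionMix(sceneA, sceneB, mixValue, progress, softness = 0.05) {
  // f = 0 shows scene A, f = 1 shows scene B.
  const f = smoothstep(mixValue - softness, mixValue + softness, progress);
  return sceneA.map((a, i) => a + (sceneB[i] - a) * f);
}
```

With a mix texture that ramps from black at the bottom to white at the top, this formulation naturally produces a wipe-up, which matches how the solar system transition described below behaves.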

For the solar system to planet transition, the mix texture is generated at runtime using a rectangle that moves upward.

For the history to tablet transition, the mix texture is also generated at runtime by rendering the same tablet scene in a special mask mode that outputs the tablet in a black-to-white range.

The tablet to fall transition, as well as the world to ending transition, were handled the same way, using mix textures generated at runtime.

Deferred Rendering, made composable

Using the same insertion technique as the composable material and particle systems, the deferred rendering workflow was made composable as well.

By the end of the project, we had created the following modules for our Deferred Rendering system:

  • DeferredOutline
  • DeferredLighting
  • DeferredChromaticAberration
  • DeferredAtmosphere — most visible in the desert intro
  • DeferredColorCorrect
  • DeferredMenuFilter

Use Case

For example, the solar system scene composed several of these modules into its render pipeline.

Final thoughts

These systems help make development faster by being encapsulated, composable, and reusable.

This means features can be added and tested in isolation. No more giant material files with too many uniforms and hundreds of lines of GLSL. Fixing a specific feature no longer requires copying code across multiple materials. Any JS logic needed for a shader is tightly coupled with the snippet of vertex or fragment code that uses it.

And of course, because all of this is built with React, we get hot reloading. Being able to modify a specific shader for a specific scene and see the results instantly makes the workflow more fun, enjoyable, and productive.

© 2025 https://blog.aimactgrow.com/ - All Rights Reserved
