Emission and bloom

Emission

Emission models the contribution of light from a surface, in addition to any light already reflected from that surface. In a physically-based path tracer like Blender's Cycles, or a realtime renderer with support for global illumination (GI), emission works as you'd expect: the surface emits light, and illuminates nearby objects like any other light source. But these types of renderers tend to be very advanced, or slow, or both.

Most realtime renderers use local illumination, and emission brightens only the emissive surface itself.

Why doesn't local illumination allow emissive surfaces to light the entire scene?

With local illumination, lighting of a surface is evaluated in isolation from other surfaces, and any light sources must be defined as simple combinations of shader uniforms and textures. The surface of another mesh cannot act as a light source in a fragment shader. Instead, emission is evaluated in the fragment shader only when rendering that specific emissive surface to the drawing buffer, affecting only itself.
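
To make that concrete, here is a deliberately stripped-down three.js ShaderMaterial, a sketch rather than a real lighting model, showing where emission lives. The uniform names and values are invented for this illustration; the point is that the emissive term is added inside the surface's own fragment shader and never reaches any other surface.

    import * as THREE from 'three';

    const material = new THREE.ShaderMaterial({
      uniforms: {
        emissive: { value: new THREE.Color(0x59BCF3) }, // illustrative values
        emissiveIntensity: { value: 4.0 },
      },
      vertexShader: /* glsl */ `
        void main() {
          gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
        }
      `,
      fragmentShader: /* glsl */ `
        uniform vec3 emissive;
        uniform float emissiveIntensity;
        void main() {
          // Emission is added to this surface's outgoing light. No other
          // object in the scene ever sees this contribution.
          gl_FragColor = vec4(emissive * emissiveIntensity, 1.0);
        }
      `,
    });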

Figure 1: Emission only, rendered with local illumination. Underwhelming, isn't it?

The emissive boxes in the illustration above each have emissive color #59BCF3, with intensities increasing from 1 to 256 nits from left to right. Tone mapping causes the brighter boxes to desaturate toward white.[1] The emissive surfaces are visible in the dimly-lit scene, but they aren't illuminating other objects.

Even limited to local illumination, we can create a much better illusion of brightness than this. We'll first light the scene independently, then create a glow called a bloom effect around the emissive surfaces.

Bloom

Bloom is an imaging artifact present in physical cameras, in which fringes surround the brightest areas of the image, “contributing to the illusion of an extremely bright light overwhelming the camera or eye.” (Wikipedia)

Many 3D applications already have the bloom effect available as a post-processing option. But how should emissive surfaces, the surrounding scene, and the bloom effect be configured for best results?

How is a bloom effect implemented?

Bloom is usually implemented as a post-processing effect. Render the scene to an open domain [0, ∞] render target in a linear color space. Apply a render pass over the entire screen, not over individual objects, using a Gaussian blur to spread pixels brighter than some defined threshold. The result may be drawn back into a render target for further processing, or tone mapped and converted to the output color space, forming the final image.
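
In three.js, for example, the pieces can be wired together with the UnrealBloomPass addon. This is a minimal sketch, assuming renderer, scene, and camera already exist; the numeric values are placeholders:

    import * as THREE from 'three';
    import { EffectComposer } from 'three/addons/postprocessing/EffectComposer.js';
    import { RenderPass } from 'three/addons/postprocessing/RenderPass.js';
    import { UnrealBloomPass } from 'three/addons/postprocessing/UnrealBloomPass.js';
    import { OutputPass } from 'three/addons/postprocessing/OutputPass.js';

    const composer = new EffectComposer(renderer); // half-float targets by default

    // 1. Render the scene into an open domain, linear render target.
    composer.addPass(new RenderPass(scene, camera));

    // 2. Blur and spread pixels brighter than the threshold.
    const bloomPass = new UnrealBloomPass(
      new THREE.Vector2(window.innerWidth, window.innerHeight),
      0.5,  // strength
      0.4,  // radius
      10,   // luminance threshold
    );
    composer.addPass(bloomPass);

    // 3. Tone map and convert to the output color space, last.
    composer.addPass(new OutputPass());

    // In the render loop, in place of renderer.render(scene, camera):
    composer.render();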

Full implementation details are outside the scope of this post, but see Learn OpenGL: Bloom for a more detailed explanation.

Configuring a scene for emission and bloom

First, create a scene with the lighting, composition, and exposure you prefer. Enable tone mapping from the beginning, so that you're making initial lighting choices based on a fully-formed image. Point lights may be placed at the locations of the emissive objects, or lighting can be baked into surrounding materials.
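
In three.js, that initial setup might look something like the sketch below. The light placement and values are illustrative, and emissiveCube and scene are assumed to exist already:

    import * as THREE from 'three';

    const renderer = new THREE.WebGLRenderer({ antialias: true });
    // Enable tone mapping up front, so early lighting decisions are
    // made against a fully-formed image.
    renderer.toneMapping = THREE.AgXToneMapping;
    renderer.toneMappingExposure = 0.5;

    // A point light co-located with an emissive object stands in for the
    // illumination that local shading cannot propagate from the surface.
    const light = new THREE.PointLight(0x59BCF3, 64);
    light.position.copy(emissiveCube.position);
    scene.add(light);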

Figure 2: Emission and independent scene lighting, still without bloom. Point lights, image-based lighting, or baked lighting can illuminate the rest of the scene, giving a sense of light in the space, but the cubes themselves appear flat.

Emissive surfaces intended to bloom should have higher total luminance than the surrounding scene, often using open domain [0, ∞] linear intensities.[2][3] We're trying to give the illusion that the emission has overwhelmed a camera sensor with its brightness.

Non-emissive surfaces are under varying lighting conditions, and may have strong reflective highlights. Unless you want to see bloom from reflected highlights — a stylistic choice — make it easy on yourself by using a much higher intensity for the emissive surfaces. When emissive surfaces are sufficiently bright, there's room to adjust lighting and the luminance threshold toward an overall look. If you can't isolate the effect to the intended surfaces, the surfaces are probably not bright enough.

How can an RGB value be brighter than 1?

Some colors in a 3D scene do require 0–1 RGB values. Base/diffuse/albedo color and specular color, which represent reflectance percentages, can never exceed 1 because a surface cannot reflect >100% of incoming light. Final images rendered to the canvas must use 0–1 RGB values, relative to the display's capabilities.

Emission represents lighting data, not a reflectance percentage or a ratio, and its RGB values can use the full [0, ∞] domain. Whether expressed as an RGB triplet [1000, 0, 1000] or an RGB/intensity pair [#FF00FF, 1000], the meaning is the same.
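
In three.js terms, to pick one renderer, the two spellings might look like this, using the magenta example above:

    import * as THREE from 'three';

    // RGB/intensity pair: #FF00FF at an intensity of 1000.
    const material = new THREE.MeshStandardMaterial({
      emissive: new THREE.Color(0xFF00FF),
      emissiveIntensity: 1000,
    });

    // Equivalent open domain RGB triplet, folding the intensity into the
    // color itself. Color components in three.js are unclamped floats.
    material.emissive.setRGB(1000, 0, 1000);
    material.emissiveIntensity = 1;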

Pixels above the luminance threshold will bloom, creating a glow in screen space. In three.js, the bloom threshold is defined in luminance, measured with weights for the spectral sensitivity of human vision,[4] also called luma coefficients. With emissive color #FF00FF and an intensity of 100 nits, the measured luminance is only 0.2126 × 100 + 0.0722 × 100 ≈ 28.5 nits, so a luminance threshold of 99 nits will not show visible bloom.

Luminance = 0.2126 × Red + 0.7152 × Green + 0.0722 × Blue

Why the arbitrary [0.2126, 0.7152, 0.0722] coefficients?

These coefficients come from the Rec. 709 color primaries, used in both sRGB and Linear-sRGB color spaces. Wide gamut color spaces, such as Display P3 and Rec. 2020, use different primaries and require different luma coefficients.
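
As a quick check on the magenta example, here is a small helper applying these coefficients to Linear-sRGB values:

    // Relative luminance of a Linear-sRGB color, per the Rec. 709
    // luma coefficients.
    function luminance(r, g, b) {
      return 0.2126 * r + 0.7152 * g + 0.0722 * b;
    }

    // #FF00FF at 100 nits: green contributes nothing, so luminance is
    // only 0.2126 * 100 + 0.0722 * 100 ≈ 28.5 nits, well below a
    // 99-nit bloom threshold.
    console.log(luminance(100, 0, 100)); // ≈ 28.48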

Figure 3: Emission and bloom. Independent lighting illuminates the scene, and a bloom effect creates the illusion that the emissive cubes are the dominant light source.

Finally, remember that tone mapping and conversion to the color space of the HTML canvas context are necessary steps in forming an image from our [0, ∞] linear data, and should be applied after the bloom effect. Other post-processing effects using physically-based units may come before tone mapping.

The 3D scene used throughout this article comes from the glTF Emissive Strength Test model, with modifications. Settings for materials, lighting, and post-processing in the final render, with three.js units:

setting                        value
material.emissive              #59BCF3
material.emissiveIntensity     1–256 nits
light.color                    #59BCF3
light.intensity                1–256 cd
renderer.toneMapping           AgX
renderer.toneMappingExposure   0.5
bloom.threshold                10 nits
bloom.strength                 0.5
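
Mapped onto three.js objects, those settings correspond roughly to the assignments below, where bloomPass is assumed to be the UnrealBloomPass sketched earlier, and the emissive and light intensities vary per cube:

    material.emissive.set(0x59BCF3);
    material.emissiveIntensity = 64; // per cube, 1–256
    light.color.set(0x59BCF3);
    light.intensity = 64; // per light, 1–256
    renderer.toneMapping = THREE.AgXToneMapping;
    renderer.toneMappingExposure = 0.5;
    bloomPass.threshold = 10;
    bloomPass.strength = 0.5;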

Thanks for reading, and please reach out with any questions!


1 Brighter boxes would appear white without tone mapping, too, as their values are clipped to the limits of the [0, 1] drawing buffer. Before reaching white, they would first clip to one of the “Notorious Six” hues, in this case #00FFFF. Tone mapping is a necessary step in forming an image from lighting data, and helps to avoid clipping and the Notorious Six.

2 I avoid using “HDR” as a synonym for linear [0, ∞] RGB data. HDR has multiple meanings, often used imprecisely, and is not necessarily linear. In three.js, the working color space is Linear-sRGB: Rec. 709 primaries, D65 white point, and a linear transfer function.

3 The domain [0, ∞] is theoretical. A 32-bit floating point render target has a maximum value around 3.40 × 10³⁸, with lower precision as values approach that limit. Rendering pipelines may have further limitations, such as 16-bit render targets.

4 More precisely, for the CIE Standard Observer.