Emission and bloom
Emission
Emission models the contribution of light from a surface, in addition to any light already reflected from that surface. In a physically-based path tracer like Blender's Cycles, or a realtime renderer with support for global illumination (GI), emission works as you'd expect: the surface emits light, and illuminates nearby objects like any other light source. But these types of renderers tend to be very advanced, or slow, or both.
Most realtime renderers use local illumination, and emission brightens only the emissive surface itself.
Why doesn't local illumination allow emissive surfaces to light the entire scene?
With local illumination, lighting of a surface is evaluated in isolation from other surfaces, and any light sources must be defined as simple combinations of shader uniforms and textures. The surface of another mesh cannot act as a light source in a fragment shader. Instead, emission is evaluated in the fragment shader only when rendering that specific emissive surface to the drawing buffer, affecting only itself.
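To make this concrete, here is a hypothetical minimal fragment shader (not three.js's built-in PBR shader) showing where emission enters the lighting calculation. The uniform names are illustrative:

```javascript
// Emission is simply added to the reflected lighting for THIS surface's
// fragments. Other meshes never see this term, so nothing else is lit.
const fragmentShader = /* glsl */ `
  uniform vec3 diffuseColor;
  uniform vec3 emissive;        // emissive color x intensity, open domain
  uniform vec3 lightColor;      // scene lights arrive as simple uniforms
  uniform vec3 lightDirection;
  varying vec3 vNormal;

  void main() {
    float ndotl = max(dot(normalize(vNormal), lightDirection), 0.0);
    vec3 reflected = diffuseColor * lightColor * ndotl;
    // Emission only brightens the fragments of this mesh:
    gl_FragColor = vec4(reflected + emissive, 1.0);
  }
`;
```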
The emissive boxes in the illustration above each have emissive color #59BCF3, with intensities increasing from 1 to 256 nits from left to right. Tone mapping causes the brighter boxes to desaturate toward white.[1] The emissive surfaces are visible in the dimly-lit scene, but they aren't illuminating other objects.
Even limited to local illumination, we can create a much better illusion of brightness than this. We'll first light the scene independently, then create a glow called a bloom effect around the emissive surfaces.
Bloom
Bloom is an imaging artifact present in physical cameras, in which fringes surround the brightest areas of the image, “contributing to the illusion of an extremely bright light overwhelming the camera or eye.” (Wikipedia)
Many 3D applications already have the bloom effect available as a post-processing option. But how should emissive surfaces, the surrounding scene, and the bloom effect be configured for best results?
How is a bloom effect implemented?
Bloom is usually implemented as a post-processing effect. Render the scene to an open domain[2][3] framebuffer, isolate the pixels brighter than a chosen luminance threshold, blur them, and additively blend the blurred result back over the original image.
Full implementation details are outside the scope of this post, but see Learn OpenGL: Bloom for a more detailed explanation.
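In three.js, one common way to get this pipeline is the UnrealBloomPass from the examples/addons; a sketch, with pass order and parameter values chosen for illustration and `renderer`, `scene`, and `camera` assumed to exist:

```javascript
import * as THREE from 'three';
import { EffectComposer } from 'three/addons/postprocessing/EffectComposer.js';
import { RenderPass } from 'three/addons/postprocessing/RenderPass.js';
import { UnrealBloomPass } from 'three/addons/postprocessing/UnrealBloomPass.js';
import { OutputPass } from 'three/addons/postprocessing/OutputPass.js';

const composer = new EffectComposer(renderer);
composer.addPass(new RenderPass(scene, camera)); // render the open domain scene
composer.addPass(new UnrealBloomPass(
  new THREE.Vector2(window.innerWidth, window.innerHeight),
  0.5,  // strength of the bloom contribution
  0.4,  // radius of the blur
  10.0, // threshold: only pixels above this luminance bloom
));
composer.addPass(new OutputPass()); // tone mapping + output color space

function animate() {
  requestAnimationFrame(animate);
  composer.render();
}
animate();
```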
Configuring a scene for emission and bloom
First, create a scene with the lighting, composition, and exposure you prefer. Enable tone mapping from the beginning, so that you're making initial lighting choices based on a fully-formed image. Point lights may be placed at the locations of the emissive objects, or lighting can be baked into surrounding materials.
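A minimal sketch of that starting point, assuming an existing `renderer`, `scene`, and an `emissiveMesh` whose material uses the emissive color from this article:

```javascript
import * as THREE from 'three';

// Enable tone mapping from the beginning, before tuning lights.
renderer.toneMapping = THREE.AgXToneMapping;
renderer.toneMappingExposure = 0.5;

// Fake the missing global illumination with a point light
// co-located with the emissive object.
const light = new THREE.PointLight('#59BCF3', 64); // intensity in candela
light.position.copy(emissiveMesh.position);
scene.add(light);
```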
Emissive surfaces intended to bloom should have higher total luminance than the surrounding scene, often using open domain values far above 1.
Non-emissive surfaces are under varying lighting conditions, and may have strong reflective highlights. Unless you want to see bloom from reflected highlights, which is a stylistic choice, make it easy on yourself by using a much higher intensity for the emissive surfaces. When emissive surfaces are sufficiently bright, there's room to adjust lighting and the luminance threshold toward the overall look you want. If you can't isolate the effect to the intended surfaces, those surfaces are probably not bright enough.
How can an RGB value be brighter than 1?
Some colors in a 3D scene do require 0–1 RGB values. Base/diffuse/albedo color and specular color, which represent reflectance percentages, can never exceed 1 because a surface cannot reflect >100% of incoming light. Final images rendered to the canvas must use 0–1 RGB values, relative to the display's capabilities.
Emission represents lighting data, not a reflectance percentage or a ratio, and its RGB values can use the full open domain. Whether the emission is written as a scaled RGB triplet [1000, 0, 1000] or an RGB/intensity pair [#FF00FF, 1000], the meaning is the same.
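The equivalence of the two representations is easy to show in plain JavaScript (no three.js here; colors are treated as bare [r, g, b] arrays):

```javascript
// Scaled RGB and an RGB/intensity pair describe the same emission.
const scale = (rgb, k) => rgb.map((c) => c * k);

const scaledRGB = [1000, 0, 1000];                  // open domain RGB
const pair = { color: [1, 0, 1], intensity: 1000 }; // #FF00FF at 1000 nits

const fromPair = scale(pair.color, pair.intensity); // [1000, 0, 1000]
```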
Pixels above the luminance threshold will bloom, creating a glow in screen space. In three.js, the bloom threshold is defined in luminance, measured with weights for the spectral sensitivity of human vision,[4] also called luma coefficients. With emissive color #FF00FF and an intensity of 100 nits, a luminance threshold of 99 nits will not show visible bloom, due to the luma coefficients.
Luminance = 0.2126 × Red + 0.7152 × Green + 0.0722 × Blue
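Applying the formula to the #FF00FF example above, in plain JavaScript:

```javascript
// Luma-weighted luminance of an [r, g, b] triplet.
const luminance = ([r, g, b]) => 0.2126 * r + 0.7152 * g + 0.0722 * b;

const emissive = [100, 0, 100];  // #FF00FF at 100 nits
const nits = luminance(emissive); // ≈ 28.5 nits: below a 99-nit
                                  // threshold, so no visible bloom
```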
Why the seemingly arbitrary [0.2126, 0.7152, 0.0722] coefficients?
These coefficients come from the Rec. 709 primaries used by sRGB displays: each channel is weighted by its contribution to perceived brightness, with green weighted highest because human vision is most sensitive to green light.
Finally, remember that tone mapping and conversion to the color space of the HTML canvas context are necessary steps in forming an image from our open domain lighting data.
The 3D scene used throughout this article comes from the glTF Emissive Strength Test model, with modifications. Settings for materials, lighting, and post-processing in the final render, with three.js units:
setting | value |
---|---|
material.emissive | #59BCF3 |
material.emissiveIntensity | 1 – 256 nits |
light.color | #59BCF3 |
light.intensity | 1 – 256 cd |
renderer.toneMapping | AgX |
renderer.toneMappingExposure | 0.5 |
bloom.threshold | 10 nits |
bloom.strength | 0.5 |
Thanks for reading, and please reach out with any questions!
1 Brighter boxes would appear white without tone mapping, too, as their values clip the limits of the output range, skewing the hue toward cyan, #00FFFF, in this case. Tone mapping is a necessary step in forming an image from lighting data, and helps to avoid clipping and the Notorious Six.
2 I avoid using “HDR” as a synonym for linear, open domain lighting data; the term is overloaded, also referring to display hardware and image formats.
3 The domain [0, ∞), allowing values above 1, is called the open domain, as opposed to the closed domain [0, 1].
4 More precisely, for the CIE Standard Observer.