SUMMARY
All relevant data of our volumetric renderer can be stored within a G-Buffer, which allows us to utilize the output of our framebuffer in a deferred rendering pipeline.
Example of the G-Buffer.
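As a rough illustration, the sketch below stores the per-pixel output of the volume pass in a simple G-Buffer layout. The attachment set (albedo, world-space normal, linear depth) and all names are assumptions for illustration only; the actual layout depends on the renderer.

```cpp
#include <cstddef>
#include <vector>

// Hypothetical per-pixel G-Buffer layout; the chosen attachments are an
// assumption, not the exact format used by the renderer described here.
struct GBufferTexel {
    float albedo[3] = {0.0f, 0.0f, 0.0f};
    float normal[3] = {0.0f, 0.0f, 0.0f};
    float depth     = 1.0f;   // cleared to "far"
};

struct GBuffer {
    std::size_t width, height;
    std::vector<GBufferTexel> texels;

    GBuffer(std::size_t w, std::size_t h) : width(w), height(h), texels(w * h) {}

    // Called once per pixel after the volume has been ray marched: instead of
    // shading immediately, the hit data is written out so a deferred pass can
    // light it later.
    void store(std::size_t x, std::size_t y,
               const float albedo[3], const float normal[3], float depth) {
        GBufferTexel& t = texels[y * width + x];
        for (int i = 0; i < 3; ++i) {
            t.albedo[i] = albedo[i];
            t.normal[i] = normal[i];
        }
        t.depth = depth;
    }
};
```

In a GPU implementation the same idea maps to writing the volume pass into multiple render targets rather than into a CPU-side array.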
We can use this G-Buffer to compute the lighting for our scene, as well as any additional image effects that might be required, such as ambient occlusion.
Example of a lit scene, generated from a given G-Buffer.
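A minimal sketch of such a deferred lighting step is shown below: it shades a single texel from the stored albedo and normal with one directional Lambertian light. The light parameters are illustrative assumptions; effects such as ambient occlusion would read the same buffers in further passes.

```cpp
#include <algorithm>
#include <cmath>

// Shades one G-Buffer texel with a single directional light using Lambertian
// diffuse. Inputs mirror the attachments written by the volume pass.
void shadeTexel(const float albedo[3], const float normal[3], float outColor[3]) {
    const float lightDir[3]   = {0.3f, 0.8f, 0.5f};  // assumed light direction
    const float lightColor[3] = {1.0f, 1.0f, 1.0f};

    // Normalize the light direction.
    float len = std::sqrt(lightDir[0] * lightDir[0] +
                          lightDir[1] * lightDir[1] +
                          lightDir[2] * lightDir[2]);
    float l[3] = {lightDir[0] / len, lightDir[1] / len, lightDir[2] / len};

    // N dot L diffuse term, clamped to zero for back-facing texels.
    float nDotL = std::max(0.0f, normal[0] * l[0] + normal[1] * l[1] + normal[2] * l[2]);

    for (int i = 0; i < 3; ++i)
        outColor[i] = albedo[i] * lightColor[i] * nDotL;
}
```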
To achieve a hybrid renderer, we can export a depth buffer by converting the "true" distance to the scene back into camera-space depth. Additionally, we could create a lattified version of the given volume if desired.
Rendered frame of a volume.
Lattified version of the given volume.
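The sketch below outlines this depth export under the assumption of an OpenGL-style perspective projection with a [0, 1] depth range: the ray-march distance is projected onto the camera's forward axis to obtain camera-space depth, which is then mapped to the non-linear value a rasterizer would write. All parameter names are illustrative, not taken from the renderer itself.

```cpp
#include <cmath>

// Converts the "true" Euclidean distance travelled along a view ray into a
// depth-buffer value compatible with rasterized geometry, assuming an
// OpenGL-style projection and a [0, 1] depth range.
float exportDepth(float marchDistance,        // distance travelled along the ray
                  const float rayDir[3],      // normalized view-ray direction
                  const float camForward[3],  // normalized camera forward vector
                  float nearPlane, float farPlane) {
    // Project the ray distance onto the camera forward axis: this is the
    // camera-space depth the rasterizer would produce for the same point.
    float viewZ = marchDistance * (rayDir[0] * camForward[0] +
                                   rayDir[1] * camForward[1] +
                                   rayDir[2] * camForward[2]);

    // Standard perspective depth mapping (NDC z in [-1, 1]),
    // remapped to the [0, 1] range stored in the depth buffer.
    float ndcZ = (farPlane + nearPlane) / (farPlane - nearPlane)
               - (2.0f * farPlane * nearPlane) / ((farPlane - nearPlane) * viewZ);
    return ndcZ * 0.5f + 0.5f;
}
```

With this value written per pixel, rasterized geometry and the ray-marched volume can share one depth buffer and occlude each other correctly.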
Conclusion
Distance field rendering in its current state is neither robust nor fast enough for large-scale commercial video game productions. Nonetheless, in comparison to today's industry norms, the simplicity of these techniques makes them desirable for rendering and for other use cases such as modelling. Algorithms and rendering technology will advance over time, eventually allowing for efficient hybrid or even full volume rendering within game development.