Voxel Cone Tracing for architectural visualization

In our CAD plugin Enscape, the scene geometry can change instantaneously, and the scene boundaries aren’t fixed. Our first evaluation of candidate GI algorithms therefore looked like this:

Technique | Thoughts | Solution
Light propagation volumes | Bad behavior for small-scale phenomena and occlusions. | Discarded
Static lightmap calculation | Computationally too expensive. | Discarded
Dynamic, progressive lightmap calculation | Might be complicated in terms of unwrapping, texture atlasing and proper specular lighting to avoid a “clay” look. | Postponed
Voxel cone tracing | Might scale well for big scenes if we only consider the near space with a sparse octree or clipmap. | Let’s try it!

Conservative Rasterization

We chose to use an octree in Cyril Crassin’s original fashion. We store albedo, normal, irradiance, roughness, occupancy and emissive intensity for six directions per voxel, and we build the octree in multiple passes to achieve better thread occupancy. Below, you can see a debug view that renders the voxels’ normal directions using a geometry shader.

[Figure: debug view of the voxel normals (10-bit normal per channel in RGBA)]
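A minimal sketch of what such a directional per-voxel payload could look like is shown below. The struct layout and helper names are illustrative assumptions, not Enscape’s actual data structure; in practice the attributes would be packed into texture bricks or buffers rather than a plain C++ struct.

```cpp
#include <array>

// Hypothetical anisotropic voxel payload: one slot per major axis direction
// (+X, -X, +Y, -Y, +Z, -Z), holding the attributes listed in the text.
struct DirectionalSlot {
    float albedo[3];     // averaged surface albedo seen from this direction
    float normal[3];     // averaged normal
    float irradiance[3]; // injected irradiance (e.g. from the point lights)
    float roughness;     // averaged roughness
    float occupancy;     // how much of the voxel is solid from this direction
    float emissive;      // emissive intensity
};

struct Voxel {
    std::array<DirectionalSlot, 6> dir; // indexed by dominant axis direction
};

// Maps a direction to its dominant-axis slot (0..5); a common convention,
// assumed here purely for illustration.
inline int dominantAxisSlot(float x, float y, float z) {
    float ax = x < 0 ? -x : x, ay = y < 0 ? -y : y, az = z < 0 ? -z : z;
    if (ax >= ay && ax >= az) return x >= 0 ? 0 : 1;
    if (ay >= az)             return y >= 0 ? 2 : 3;
    return                           z >= 0 ? 4 : 5;
}
```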

Following the GPU Gems chapter on Conservative Rasterization, we use conservative rasterization to avoid holes in our surfaces.

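As a rough illustration of the underlying idea, here is a CPU-side sketch of the edge-dilation math described in that chapter: each edge line of the triangle is pushed outward by a half-pixel amount and the new vertices are recomputed as intersections of the shifted lines. In the real pipeline this runs in a geometry shader on clip-space positions; the names and sign conventions below are assumptions, not our shader code.

```cpp
#include <cmath>

// 2D homogeneous coordinates: x, y and w (w stored in .z).
struct Vec3h { float x, y, z; };

static Vec3h cross3(Vec3h a, Vec3h b) {
    return { a.y * b.z - a.z * b.y,
             a.z * b.x - a.x * b.z,
             a.x * b.y - a.y * b.x };
}

// Dilates triangle (p[0], p[1], p[2]) outward by 'halfPixel' in x and y and
// writes the expanded vertices to out[3]. Counter-clockwise winding is
// assumed; with the opposite winding the offset sign flips.
static void dilateTriangle(const Vec3h p[3], float halfPixel, Vec3h out[3]) {
    Vec3h plane[3];
    for (int i = 0; i < 3; ++i) {
        // Edge line through p[(i+2)%3] and p[i] as a homogeneous cross product.
        Vec3h a = p[i], b = p[(i + 2) % 3];
        plane[i] = cross3(Vec3h{ a.x - b.x, a.y - b.y, a.z - b.z }, b);
    }
    for (int i = 0; i < 3; ++i) {
        // Shift each edge line outward by the half-pixel amount.
        plane[i].z -= halfPixel * (std::fabs(plane[i].x) + std::fabs(plane[i].y));
    }
    for (int i = 0; i < 3; ++i) {
        // The dilated vertices are the intersections of adjacent shifted lines.
        Vec3h q = cross3(plane[i], plane[(i + 1) % 3]);
        out[i] = { q.x / q.z, q.y / q.z, 1.0f };
    }
}
```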

Directional voxels

The following image pairs show the influence of directional voxels (left) in the Sponza scene. Storing only a single normal and occupancy value per voxel leads to an overestimation of occlusion and therefore a darker scene.

[Image pairs: Sponza rendered with directional voxels (left) vs. a single normal and occupancy value per voxel (right)]
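The sampling side of directional voxels can be sketched as follows: instead of one isotropic value, the lookup along a cone direction blends the three slots facing the ray, which is what avoids the overestimation mentioned above. The layout and weighting are illustrative, not our exact shader code.

```cpp
// Directional occupancy as stored per voxel: +X, -X, +Y, -Y, +Z, -Z.
struct DirectionalOcclusion {
    float occupancy[6];
};

// Occupancy "seen" when looking along the normalized direction (dx, dy, dz).
// Which sign selects which slot depends on how the directional values were
// generated during voxelization; the squared-component weights sum to 1 for
// a unit vector (abs-component weighting is also common).
float sampleOccupancy(const DirectionalOcclusion& v, float dx, float dy, float dz) {
    float ox = v.occupancy[dx >= 0.0f ? 0 : 1];
    float oy = v.occupancy[dy >= 0.0f ? 2 : 3];
    float oz = v.occupancy[dz >= 0.0f ? 4 : 5];
    return ox * dx * dx + oy * dy * dy + oz * dz * dz;
}
```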

Injecting the point lights

Each point light’s shadow map is rendered as a cubemap, which is then used to inject irradiance into the voxel structure.

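A simplified sketch of such an injection step is shown below: each cube shadow-map texel yields a lit surface position and its received irradiance, which is splatted into the voxel structure. The dense-grid target, attenuation and helper names are assumptions made for illustration; the actual implementation writes into the octree on the GPU.

```cpp
#include <vector>

struct Vec3f { float x, y, z; };

// Placeholder voxel target: a dense cubic grid standing in for the octree.
struct VoxelGrid {
    int   resolution;              // voxels per axis
    float worldMin, worldSize;     // cubic region covered by the grid
    std::vector<Vec3f> irradiance; // resolution^3 entries

    void addIrradiance(const Vec3f& worldPos, const Vec3f& e) {
        auto toIndex = [&](float p) {
            int i = int((p - worldMin) / worldSize * float(resolution));
            return i < 0 ? 0 : (i >= resolution ? resolution - 1 : i);
        };
        Vec3f& dst = irradiance[(toIndex(worldPos.z) * resolution +
                                 toIndex(worldPos.y)) * resolution +
                                 toIndex(worldPos.x)];
        dst.x += e.x; dst.y += e.y; dst.z += e.z;
    }
};

// Injects one cube shadow-map texel: 'surfacePos' is the lit surface position
// reconstructed from the stored depth, 'texelSolidAngle' weights its share of
// the light's output. A full version would also apply the surface's N.L term
// and albedo.
void injectTexel(VoxelGrid& grid, const Vec3f& surfacePos, const Vec3f& lightPos,
                 const Vec3f& lightColor, float texelSolidAngle) {
    float dx = surfacePos.x - lightPos.x;
    float dy = surfacePos.y - lightPos.y;
    float dz = surfacePos.z - lightPos.z;
    float distSq = dx * dx + dy * dy + dz * dz;
    float scale = texelSolidAngle / (distSq + 1e-4f); // inverse-square falloff
    grid.addIrradiance(surfacePos, { lightColor.x * scale,
                                     lightColor.y * scale,
                                     lightColor.z * scale });
}
```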

Voxel Ambient Occlusion

We collect the marched distance per diffuse cone and calculate the ambient occlusion as proposed in Cyril Crassin’s original thesis.

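As a hedged sketch, the per-cone accumulation could look like this: occlusion is gathered front-to-back while marching the cone, attenuated with distance so far-away blockers darken less. The step size, falloff term and sampling callback are placeholders, not the exact parameters from the thesis.

```cpp
#include <functional>

// Ambient occlusion for a single diffuse cone. 'sampleOcclusion' would read
// the (directional) voxel occupancy at the given distance and footprint
// diameter from the voxel structure; here it is just a callback placeholder.
float coneAmbientOcclusion(std::function<float(float /*distance*/, float /*diameter*/)> sampleOcclusion,
                           float coneAperture,   // tan of the cone half-angle
                           float maxDistance,
                           float falloff)        // lambda in 1 / (1 + lambda * d)
{
    float occlusion = 0.0f;
    float distance  = 0.01f; // small start offset to avoid self-occlusion
    while (distance < maxDistance && occlusion < 1.0f) {
        float diameter = 2.0f * coneAperture * distance; // cone footprint
        float sample   = sampleOcclusion(distance, diameter);
        // Front-to-back accumulation with a distance-based attenuation.
        occlusion += (1.0f - occlusion) * sample / (1.0f + falloff * distance);
        distance  += diameter * 0.5f;   // step proportional to the footprint
    }
    return 1.0f - occlusion; // 1 = fully open, 0 = fully occluded
}
```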

Refinement Queue

As suggested in “The Technology Behind the Unreal Engine 4 Elemental Demo” by Martin Mittring, we used a refinement point queue to sample the specular as sparsely as possible.

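The idea can be sketched as follows, with thresholds and data layout chosen purely for illustration: specular cones are first traced at a sparse rate, then pixels whose neighborhood varies strongly are pushed into a queue and re-traced at full rate. This is not the exact scheme from the Elemental demo talk.

```cpp
#include <cmath>
#include <queue>
#include <vector>

struct RefinementItem { int x, y; };

// Builds a queue of pixels whose sparse specular result differs too much from
// its 4-neighborhood (a cheap alias detector); these get refined afterwards.
std::queue<RefinementItem> buildRefinementQueue(const std::vector<float>& sparseSpecular,
                                                int width, int height, float threshold) {
    std::queue<RefinementItem> queue;
    for (int y = 1; y < height - 1; ++y) {
        for (int x = 1; x < width - 1; ++x) {
            float c = sparseSpecular[y * width + x];
            float d = 0.0f;
            d = std::fmax(d, std::fabs(c - sparseSpecular[y * width + x - 1]));
            d = std::fmax(d, std::fabs(c - sparseSpecular[y * width + x + 1]));
            d = std::fmax(d, std::fabs(c - sparseSpecular[(y - 1) * width + x]));
            d = std::fmax(d, std::fabs(c - sparseSpecular[(y + 1) * width + x]));
            if (d > threshold)
                queue.push({ x, y });
        }
    }
    return queue;
}
```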

Results

After some experiments, we didn’t manage to get it to work reliably with varying architectural scenes. Thin walls in particular led to severe light bleeding that could not be suppressed properly. We also tried to implement an octree-based level-of-detail gradient around the viewer, which did not work robustly without noticeable artifacts. Therefore, we moved over to a combination of voxel layouts and local cubemaps.

