Wrong gradient in volumetric rendering gives incorrect shading
Hi all!
In the shader for volumetric rendering (used in vtkOpenGLGPUVolumeRayCastMapper), if I'm understanding the code correctly, gradients are mainly used for two things: applying a gradient opacity, and applying the shading model by approximating the normal of a surface. In both cases, the gradient is computed from the scalar texture(s). That makes total sense for the gradient opacity, but I think it can cause issues when the gradient is used to approximate a surface normal.

To apply the shading model, we want to treat the volume as a surface at a given sample. But for that, I think we should compute the gradient of the density (or opacity) of the volume, not the gradient of the raw scalars. To illustrate this, here is an example of an ImageData containing a sphere, where the inside of the sphere has scalar 0 and the outside has scalar 1. To display the sphere, I set a decreasing opacity transfer function (1.0 at scalar 0, 0.0 at scalar 1). Finally, I add a single positional light. The result is below (the red dot represents the position of the light).
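To make the issue concrete outside the shader, here is a minimal NumPy sketch (not VTK's actual shader code) of the sphere example above. It compares the central-difference gradient of the raw scalars with the gradient of the scalars after the decreasing opacity transfer function is applied; the two point in exactly opposite directions, so a normal derived from the raw scalars faces the wrong way and the sphere ends up lit from the wrong side:

```python
import numpy as np

# Synthetic volume matching the example: scalar 0 inside a sphere, 1 outside.
n = 32
ax = np.linspace(-1.0, 1.0, n)
x, y, z = np.meshgrid(ax, ax, ax, indexing="ij")
r = np.sqrt(x**2 + y**2 + z**2)
scalars = (r > 0.5).astype(float)  # 0 inside, 1 outside

# Decreasing opacity TF: opacity 1.0 at scalar 0, 0.0 at scalar 1.
opacity = 1.0 - scalars

# Central-difference gradients (what the shader approximates with texture fetches).
gs = np.stack(np.gradient(scalars), axis=-1)  # gradient of raw scalars
go = np.stack(np.gradient(opacity), axis=-1)  # gradient after applying the TF

# Pick a sample on the sphere's surface (largest scalar-gradient magnitude).
i = np.unravel_index(np.argmax(np.linalg.norm(gs, axis=-1)), scalars.shape)
print("scalar gradient :", gs[i])
print("opacity gradient:", go[i])
# Because the TF is decreasing, the opacity gradient is the exact opposite of
# the scalar gradient here, flipping the estimated surface normal.
```

Of course a real transfer function is not a simple linear ramp, so in general the opacity gradient is not just a sign flip of the scalar gradient, but this is the simplest case where the two estimates disagree.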
So in this case, the shading is incorrect: the bottom of the sphere is lit, even though the light is above the sphere. Would it be relevant to change the way the surface normal is estimated so that opacity is taken into account? If so, I can open an MR to support it.
Thanks!