Abhaya Uprety
8 posts
Q: Interpolations in per fragment lighting in OpenGL
Here I am assuming that our mesh is made of triangles and that every vertex has position, normal, and color attributes. There is a point light source in the scene.

Light direction interpolates poorly across a triangle's surface, and this becomes most visible with huge triangles and nearby point light sources. Therefore we interpolate position instead of light direction and calculate the light direction in the fragment shader.
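
To make the setup concrete, here is a rough sketch of the shader pair I have in mind. The names (u_LightPos, v_WorldPos, v_Normal, etc.) are mine, not from the book:

// Vertex shader: pass world-space position and normal to the fragment
// stage; do NOT compute a per-vertex light direction.
#version 330
in vec3 a_Position;
in vec3 a_Normal;
uniform mat4 u_Model;
uniform mat4 u_ViewProj;
out vec3 v_WorldPos;
out vec3 v_Normal;
void main()
{
    vec4 world  = u_Model * vec4(a_Position, 1.0);
    v_WorldPos  = world.xyz;
    v_Normal    = mat3(u_Model) * a_Normal;   // assumes no non-uniform scale
    gl_Position = u_ViewProj * world;
}

// Fragment shader: the light direction is computed per fragment from the
// interpolated position, while the normal is simply interpolated.
#version 330
uniform vec3 u_LightPos;
in vec3 v_WorldPos;
in vec3 v_Normal;
out vec4 o_Color;
void main()
{
    vec3 L = normalize(u_LightPos - v_WorldPos);  // per-fragment light direction
    vec3 N = normalize(v_Normal);                 // interpolated normal, renormalized
    o_Color = vec4(vec3(max(dot(N, L), 0.0)), 1.0);
}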

By the same argument, normals should also interpolate poorly if they vary greatly across a triangle. However, we still interpolate the normal across the fragments of the triangle. Why is that?

I have been referring to Chapter 10 of the book Learning Modern 3D Graphics Programming: https://paroj.github.io/gltut/Ill.../Tut10%20Fragment%20Lighting.html.
Mārtiņš Možeiko
2562 posts / 2 projects
I think you are misunderstanding what they are saying.

The article says that calculating the lighting value (the color) at the vertex locations and interpolating that color across the triangle does not look good.

Interpolating light direction from vertex locations should look fine.
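
For contrast, the per-vertex version the article calls bad looks roughly like this: the diffuse color is computed at the vertices and only that color gets interpolated. This is just a sketch with placeholder names, not code from the tutorial:

// Per-vertex ("Gouraud"-style) lighting sketch: the color computed here
// is what gets interpolated across the triangle.
#version 330
in vec3 a_Position;
in vec3 a_Normal;
uniform mat4 u_Model;
uniform mat4 u_ViewProj;
uniform vec3 u_LightPos;
out vec3 v_Color;
void main()
{
    vec4 world  = u_Model * vec4(a_Position, 1.0);
    vec3 L      = normalize(u_LightPos - world.xyz);
    vec3 N      = normalize(mat3(u_Model) * a_Normal);
    v_Color     = vec3(max(dot(N, L), 0.0));   // this color is all the fragment stage sees
    gl_Position = u_ViewProj * world;
}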
Miles
131 posts / 4 projects
You're correct that interpolating the normal has a similar issue. However, unlike with light direction where you can fall back to interpolating position and doing the direction calculation in the fragment shader at relatively little extra cost, when you're interpolating normals there's nothing to fall back to. The normal vector is all you have to work with in the first place.

If you want to get more "correct" interpolation of normals you can use slerp (spherical linear interpolation) instead of nlerp (normalized linear interpolation); there's a rough sketch of both after this list. However, to my knowledge basically no one actually does this, for 3 reasons:

1. Slerp is significantly more computationally expensive than nlerp (and significantly more expensive than the light direction calculation).
2. The difference between slerp and nlerp only starts to become significant as the angle surpasses around 90 degrees. Applying smooth normals across such extreme angles usually looks bad anyway, and produces other, even more problematic visual artifacts (such as normals pointing away from the camera near the outline of the mesh) regardless of what type of interpolation you use.
3. Smooth vertex normals are a loose and lossy approximation of the true surface anyway. Slerp is only more "correct" if you assume that the surface across a triangle has roughly spherical curvature. But the real geometry is missing, so we don't actually know what the surface should look like - we're guessing it's spherical, but it could just as well be some other shape. The only way to get normal interpolation which is actually more correct is to fill in that missing detail, either with more geometry or with a normal map.
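
To make the cost and behavior difference concrete, here's a rough GLSL sketch of the two interpolators for a pair of unit normals. The function names are just illustrative, and this isn't literally how the rasterizer interpolates across a triangle (that's a barycentric blend of three vertex values), but the math contrast is the same:

// nlerp: linear blend + renormalize; roughly what the hardware interpolators
// followed by normalize() in the fragment shader give you.
vec3 nlerp(vec3 n0, vec3 n1, float t)
{
    return normalize(mix(n0, n1, t));
}

// slerp: constant angular speed along the arc between n0 and n1;
// noticeably more ALU work (acos plus several sin calls).
vec3 slerp(vec3 n0, vec3 n1, float t)
{
    float cosTheta = clamp(dot(n0, n1), -1.0, 1.0);
    float theta    = acos(cosTheta);
    float s        = sin(theta);
    if (s < 1e-5)                       // nearly parallel: nlerp is fine
        return normalize(mix(n0, n1, t));
    return (sin((1.0 - t) * theta) * n0 + sin(t * theta) * n1) / s;
}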
Abhaya Uprety
8 posts
I didn't put the link to the exact section within the chapter of the book. I have corrected that now. That section goes,
"...
There is a problem that needs to be dealt with first. Normals do not interpolate well. Or rather, wildly different normals do not interpolate well. And light directions can be very different if the light source is close to the triangle relative to that triangle's size.
...".
I have corrected the original post with this link:
https://paroj.github.io/gltut/Ill.../Tut10%20Fragment%20Lighting.html
Abhaya Uprety
8 posts
"...when you're interpolating normals there's nothing to fall back to. The normal vector is all you have to work with in the first place...", this clicked with me.

"...But the real geometry is missing, so we don't actually know what the surface should look like - we're guessing it's spherical, but it could just as well be some other shape. The only way to get normal interpolation which is actually more correct is to fill in that missing detail, either with more geometry or with a normal map", thumbs up here as well.

Thanks.
Marc Costa
65 posts
To approximate higher-resolution microgeometry, you can use normal maps. A normal map encodes a normal per texel, which is then fetched in the pixel/fragment shader, giving you far more variation than a simple interpolation across a triangle's face.
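
A minimal fragment-shader sketch of that fetch, assuming a tangent-space normal map and an interpolated TBN basis (all names here are placeholders, not from any particular codebase):

// Fetch a tangent-space normal from the map and rotate it into the space
// the lighting is computed in.
uniform sampler2D u_NormalMap;
in vec2 v_TexCoord;
in mat3 v_TBN;    // interpolated tangent / bitangent / normal basis

vec3 FetchNormal()
{
    vec3 n = texture(u_NormalMap, v_TexCoord).xyz * 2.0 - 1.0;  // [0,1] -> [-1,1]
    return normalize(v_TBN * n);
}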