Hi Gabor,
a few weeks ago I "invented" a method (although I am sure it must have been done before) to quickly calculate perfectly smooth, interpolated normals for situations like this. It's relatively easy to implement, it's fast, and the results are great. There is one little catch, which I will explain further on in this post.
As you mentioned in your post, displacing vertices with a texture fetch is a simple technique, but what about your surface normals? They should change, too. One way to handle this is with a geometry shader. When using GL_TRIANGLES, the geometry shader has access to all 3 vertices of your polygon. You could calculate the surface normal by taking the cross product of 2 of the 3 edges, but this won't do you much good: the normal would be the same for all 3 vertices. The result is a single normal per polygon, giving your mesh a very flat-shaded look.
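Just to illustrate, here is a minimal GLSL sketch of that geometry shader approach (my own variable names, not code from an actual project):

// Geometry shader sketch: one normal from the cross product of two
// triangle edges, so all three vertices get the same value and the
// mesh ends up flat shaded.
#version 150
layout(triangles) in;
layout(triangle_strip, max_vertices = 3) out;

in vec3 worldPos[];   // world-space positions passed on by the vertex shader
out vec3 gNormal;     // identical for all three emitted vertices

void main()
{
    vec3 edge1 = worldPos[1] - worldPos[0];
    vec3 edge2 = worldPos[2] - worldPos[0];
    vec3 faceNormal = normalize(cross(edge1, edge2));

    for (int i = 0; i < 3; ++i)
    {
        gNormal = faceNormal;               // same normal for the whole polygon
        gl_Position = gl_in[i].gl_Position;
        EmitVertex();
    }
    EndPrimitive();
}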
What you really want is an average normal, calculated from all edges connected to your vertex. But even when using GL_TRIANGLES with adjacency information, you won't get all of those edges, just a few of them.
So then I thought: "if I use a displacement map for the vertices, why don't I use a normal map for the normals?" The only problem is that the displacement map is generated dynamically, so you need a way to create the normal map on the fly as well.
Luckily, there is. It can easily be done in a fragment shader. You can find a sample here:
The implementation uses three render passes:
- Render a displacement map. I use an R32_F (floating-point) texture for this. You can see it in the upper-left corner of the image.
- Using the displacement map as input, render a normal map. This is an RGB32_F (floating point) texture. You can see it to the right of the displacement map in the image.
- Finally, using both maps, render your mesh. Displace the vertices using the displacement map and simply look up the normals in your fragment shader (per pixel) or in your vertex shader (per vertex). Sketches of this pass and the previous one follow right after this list.
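To give you an idea, here are simplified sketches of the second and third passes. They are not copied from my shaders, just the general idea; the texel size, the height scale and the y-up terrain convention are assumptions you would adapt to your own setup.

// Pass 2 (sketch): fragment shader that turns the displacement map into a
// normal map using central differences. Assumes a y-up terrain whose u/v
// axes map to world x/z.
#version 150
uniform sampler2D displacementMap;  // the R32_F map from pass 1
uniform float texelSize;            // e.g. 1.0 / 256.0 for a 256x256 map
uniform float heightScale;          // how strongly the displacement tilts the normal

in vec2 vTexCoord;
out vec4 fragNormal;                // written into the RGB32_F normal map

void main()
{
    float hL = texture(displacementMap, vTexCoord - vec2(texelSize, 0.0)).r;
    float hR = texture(displacementMap, vTexCoord + vec2(texelSize, 0.0)).r;
    float hD = texture(displacementMap, vTexCoord - vec2(0.0, texelSize)).r;
    float hU = texture(displacementMap, vTexCoord + vec2(0.0, texelSize)).r;

    // Cross product of the two tangent vectors, simplified. Because the
    // target is a floating-point texture, the normal can be stored as-is,
    // without the usual 0.5 * n + 0.5 encoding.
    vec3 n = normalize(vec3((hL - hR) * heightScale,
                            2.0 * texelSize,
                            (hD - hU) * heightScale));
    fragNormal = vec4(n, 1.0);
}

// Pass 3 (sketch): vertex shader that displaces the flat mesh along y and
// passes the texture coordinate on, so the fragment shader can look up the
// normal per pixel.
#version 150
uniform sampler2D displacementMap;
uniform mat4 modelViewProjection;
uniform float displacementScale;

in vec3 position;
in vec2 texCoord;
out vec2 vTexCoord;

void main()
{
    float height = textureLod(displacementMap, texCoord, 0.0).r;  // vertex texture fetch
    vec3 displaced = position + vec3(0.0, height * displacementScale, 0.0);
    vTexCoord = texCoord;
    gl_Position = modelViewProjection * vec4(displaced, 1.0);
}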
Your maps don't have to be big; they will be interpolated automatically by the texture filtering. I use 256x256 maps for a mesh containing 100x400 vertices.
There are 2 problems with this approach:
- The normal map is incorrect at the edges and in the four corner pixels. To correct this, you'd need if-else statements in your shader, as well as additional texture look-ups, which would slow the shader down considerably. I chose not to do this and simply adjust my texture coordinates a little, so that the pixels at the edge of the texture aren't used (see the small sketch after this list).
- A bigger problem: my shader only works for meshes that are flat to begin with, like terrains. If your original model isn't flat, you will need to adjust the shader that renders the normal map, for example by taking a static normal map as input and modifying it based on the displacement map. I haven't tried that myself yet.
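For the first problem, the texture-coordinate adjustment can be as simple as squeezing the [0,1] range inward, something along these lines (just a sketch; the exact inset depends on your filtering):

// Remap a [0,1] texture coordinate so the border texels are never sampled.
// With bilinear filtering, an inset of 1.5 texels keeps the outermost texel
// ring out of the filter footprint (texelSize = 1.0 / 256.0 for a 256x256 map).
vec2 insetTexCoord(vec2 uv, float texelSize)
{
    float inset = 1.5 * texelSize;
    return mix(vec2(inset), vec2(1.0 - inset), uv);
}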
-Paul