Terrain Morphing

In this chapter, we will see how to create a terrain that morphs between two heightmaps. This effect could be used for interesting transitions between levels, an animation of travelling billions of years through time, or natural disasters such as earthquakes. I’m sure you can think of plenty of other uses for it. Also, one of the techniques covered in this chapter opens the way for the other effects seen later in this tutorial.

To morph between two heightmaps, we obviously need two heightmaps. We’ve seen before that each vertex corresponds to a pixel in the heightmap. To morph the terrain, we interpolate between the corresponding height from the first heightmap and the height from the second heightmap. This interpolation is controlled by a number between 0.0f and 1.0f, so if we smoothly modify this number, the terrain will appear to transform smoothly from one shape to the other. From now on, we will refer to this number as the morph factor.
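As a quick illustration, the blend we want per vertex looks like this (a minimal HLSL sketch, not tied to any particular shader yet; the function name is just a placeholder):

// lerp(a, b, t) is equivalent to a + t * (b - a): a morph factor of 0.0 gives the
// first heightmap, 1.0 gives the second, and values in between give a weighted blend.
float MorphHeight(float height1, float height2, float morphFactor)
{
    return lerp(height1, height2, morphFactor);
}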

Now we can begin to see the advantages of using the GPU for this operation. If this were done on the CPU, the vertex buffer would have to be modified each frame, which is a costly operation. The XNA team has done a great job with the framework, and using DrawUserIndexedPrimitives or DrawUserPrimitives spares us from managing dynamic vertex buffers ourselves, but we would still want to do this on the GPU, if possible. There are two ways to achieve the desired result.

        Morphing Inside the Vertex Shader

The first method reads the heights from the two heightmaps inside the vertex shader, and interpolates between them. We begin with the code from the previous chapter. Add height2.dds to the project, in the Textures folder. Then add a member for it in Game1.cs, and load the texture in the LoadGraphicsContent function.

Texture2D displacementTexture2;
[...]
protected override void LoadGraphicsContent(bool loadAllContent)
        {
            [...]
            displacementTexture2 = content.Load<Texture2D>("Textures\\height2");
        }

In the effect file (VTFDisplacement.fx), add a new parameter, of type float, named morphFactor, and the texture and sampler for the second heightmap.

float morphFactor = 0.0f;
   texture displacementMap2;
   sampler displacementSampler2 = sampler_state
   {
       Texture   = <displacementMap2>;
       MipFilter = Point;
       MinFilter = Point;
       MagFilter = Point;
       AddressU  = Clamp;
       AddressV  = Clamp;
   };

In the vertex shader, we read the heights from the two heightmaps and interpolate between them, as described earlier.

VS_OUTPUT Transform(VS_INPUT In)
 {
     [...]
     float height1 = tex2Dlod_bilinear( displacementSampler, float4(In.uv.xy,0,0));
     float height2 = tex2Dlod_bilinear( displacementSampler2, float4(In.uv.xy,0,0));
     float height = lerp(height1, height2, morphFactor);
     [...]
 }

The last thing we need to do is set the values of the morphFactor and displacementMap2 parameters from the application’s code. So, go to the Game1 class, and add the following lines inside the Draw function. We use a sine function so that the smooth transition between the two heightmaps keeps repeating.

gridEffect.Parameters["displacementMap2"].SetValue(displacementTexture2);
 gridEffect.Parameters["morphFactor"].SetValue( (float)Math.Sin( gameTime.TotalGameTime.TotalSeconds ) * 0.5f + 0.5f);

Running the program now will show our terrain morphing.

[Image: the terrain morphing between the two heightmaps]

This approach was easy to write, just a few lines of code. Unfortunately, as you might have noticed, the performance was pretty bad. The main reason is that we read the two heights inside the vertex shader. Since each bilinear read takes 4 samples from the texture, that makes for a total of 8 vertex texture reads per vertex, which slows things down (remember that reading from a vertex texture is nowhere near as fast as reading from a texture in a pixel shader, yet). To avoid this, we could eliminate the bilinear filtering (see the sketch below). This brings us down to 2 texture reads per vertex, but we lose the smoothness resulting from the filtering. Next, we will see how to preserve bilinear filtering and still have good performance.
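For reference, the unfiltered variant of the two reads would look something like this (just a sketch, assuming the samplers use Point filtering as shown above; tex2Dlod returns a float4, so we take the .x channel):

float height1 = tex2Dlod(displacementSampler,  float4(In.uv.xy, 0, 0)).x;   // 1 sample
float height2 = tex2Dlod(displacementSampler2, float4(In.uv.xy, 0, 0)).x;   // 1 sample
float height  = lerp(height1, height2, morphFactor);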

        Morphing using Render To Texture

The following method uses a technique called Render To Texture. Before drawing the terrain, we take the two heightmaps and combine their colors in a pixel shader. The result is an interpolation between the two of them, based on the morph factor. Instead of drawing this result to the screen, we draw it onto a texture. After this, we draw the terrain as we did before, but using the newly created texture as the displacement map. Creating the interpolated heightmap is very fast, since it is done in a pixel shader, so the terrain shader can still use bilinear filtering, and the performance stays high.

To begin, we need to revert the VTFDisplacement.fx effect to the way it was at the end of the previous chapter (remove the second displacement map and the interpolation). Create a new file inside the Shaders folder, name it TextureMorph.fx, and open it. We need to add two parameters for the textures we are combining, and samplers for them.

texture textureMap1;
  sampler textureSampler1 : register(s0) = sampler_state
  {
     Texture = (textureMap1);
     ADDRESSU = WRAP;
     ADDRESSV = WRAP;
     MAGFILTER = LINEAR;
     MINFILTER = LINEAR;
     MIPFILTER = LINEAR;
  };

  texture textureMap2;
  sampler textureSampler2 : register(s1) = sampler_state
  {
     Texture = (textureMap2);
     ADDRESSU = WRAP;
     ADDRESSV = WRAP;
     MAGFILTER = LINEAR;
     MINFILTER = LINEAR;
     MIPFILTER = LINEAR;
  };

We need to add the morphFactor parameter, and then we create a pixel shader in which we combine the two textures. We interpolate between the two colors, based on the morphFactor, just as we did previously in the vertex shader.

float morphFactor = 0.0f;
float4 PixelShaderMorph(in float2 uv : TEXCOORD0) : COLOR
{
 float4 color1 = tex2D(textureSampler1, uv);
 float4 color2 = tex2D(textureSampler2, uv);
 return lerp(color1, color2, morphFactor);
}

technique TextureMorph
{
    pass P0
    {
        pixelShader  = compile ps_3_0 PixelShaderMorph();
    }
}

For simplicity, we will use a SpriteBatch to render our interpolated texture. Because of this, the technique we defined, called TextureMorph, only contains a pixel shader; the vertex processing is done for us by the SpriteBatch code. The Sprite Effects sample also shows how to use SpriteBatch with custom shaders, but you do not need to study it in order to continue, because we will cover all the necessary code here.

Let’s move forward, and open the Game1.cs file. Add the following members to the class.

Effect morphEffect;
 RenderTarget2D morphRenderTarget;
 DepthStencilBuffer morphDepthBuffer;
 SpriteBatch morphSpriteBatch;

The morphEffect member will be used to load the effect file created earlier (TextureMorph.fx), while morphRenderTarget, morphDepthBuffer and morphSpriteBatch will be used for Render To Texture when creating the new heightmap. Next, in LoadGraphicsContent, we have to initialize them. We use 256 for the size, because this is the size of the heightmaps.

morphEffect = content.Load<Effect>("Shaders\\TextureMorph");
morphRenderTarget = new RenderTarget2D(graphics.GraphicsDevice, 256, 256, 1, SurfaceFormat.Single);
morphDepthBuffer = new DepthStencilBuffer(graphics.GraphicsDevice, 256, 256, graphics.GraphicsDevice.DepthStencilBuffer.Format);
morphSpriteBatch = new SpriteBatch(graphics.GraphicsDevice);

Now, let's create a new function that takes a float as a parameter. Inside this function, we will use Render To Texture to obtain the new heightmap.

protected void Render2TextureMorph(float morphFactor)
        {

First, we need to save the current RenderTarget and DepthStencilBuffer set on the device. In our case, the RenderTarget is null, but if we want to use this in a real game, there’s a good chance it would be some other RenderTarget, used for post-processing, letterboxing, or something else. So, we will assume we don’t know its value, and save it. After this, we set our own RenderTarget and DepthStencilBuffer.

RenderTarget2D oldRT = graphics.GraphicsDevice.GetRenderTarget(0) as RenderTarget2D;       //save old RenderTarget
DepthStencilBuffer oldDS = graphics.GraphicsDevice.DepthStencilBuffer;                                   // save old Depth Buffer
graphics.GraphicsDevice.DepthStencilBuffer = morphDepthBuffer;                                              // set our own
graphics.GraphicsDevice.SetRenderTarget(0, morphRenderTarget);

Next, we clear the render target, begin drawing, and set the effect parameters. We draw displacementTexture over the whole RenderTarget. While drawing, our pixel shader is applied, and the interpolated heightmap is created.

graphics.GraphicsDevice.Clear(Color.White);
 morphSpriteBatch.Begin(SpriteBlendMode.None, SpriteSortMode.Immediate, SaveStateMode.None);

 morphEffect.Parameters["textureMap1"].SetValue(displacementTexture);
 morphEffect.Parameters["textureMap2"].SetValue(displacementTexture2);
 morphEffect.Parameters["morphFactor"].SetValue(morphFactor);
 morphEffect.CurrentTechnique = morphEffect.Techniques["TextureMorph"];
 morphEffect.Begin();
 morphEffect.CurrentTechnique.Passes[0].Begin();

 morphSpriteBatch.Draw(displacementTexture, new Rectangle(0, 0, 256, 256), Color.White);
 morphEffect.CurrentTechnique.Passes[0].End();
 morphEffect.End();

 morphSpriteBatch.End();

Lastly, we need to resolve the RenderTarget, and set back the original values for the RenderTarget and DepthStencilBuffer.

graphics.GraphicsDevice.ResolveRenderTarget(0);
graphics.GraphicsDevice.SetRenderTarget(0, oldRT);
graphics.GraphicsDevice.DepthStencilBuffer = oldDS;
}

In the Draw function, we call this function and then restore the Render States that were modified by the SpriteBatch. For the gridEffect, we need to set the displacementMap parameter to the newly rendered texture, which can be accessed through morphRenderTarget.GetTexture().

protected override void Draw(GameTime gameTime)
       {
        Render2TextureMorph((float)Math.Sin(gameTime.TotalGameTime.TotalSeconds) * 0.5f + 0.5f);
        graphics.GraphicsDevice.Clear(Color.CornflowerBlue);
        graphics.GraphicsDevice.RenderState.CullMode = CullMode.None;
        graphics.GraphicsDevice.RenderState.DepthBufferEnable = true;
        [...]
        gridEffect.Parameters["displacementMap"].SetValue(morphRenderTarget.GetTexture());
        [...]
}

If you run the code now, you should get the same morphing terrain as before, but the performance is much better. Now let’s analyze the technique we used. The vertex texture fetches were kept to a minimum, yet the effect remained dynamic, which would otherwise have meant modifying the vertex buffer each frame. The morphing itself was done very quickly, in a pixel shader. Looking at this, we can see that we could theoretically add a lot more complexity to the Render To Texture step, and the performance would stay high. One possible scenario would be to use the render-to-texture step to add dynamic deformation to the terrain, like holes made by explosions, waves of land moved by earthquakes, and, with some effort, even moving dunes of sand. All you would have to do is manipulate the displacement map before feeding it to the terrain shader, as in the sketch below.
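As an illustration only (none of this is part of the chapter's code), a pixel shader along these lines could punch a crater into the morphed heightmap during the render-to-texture pass; craterCenter, craterRadius and craterDepth are made-up parameters, expressed in texture space:

// hypothetical parameters, set from the application just like morphFactor
float2 craterCenter = float2(0.5f, 0.5f);   // crater position in UV space
float  craterRadius = 0.1f;                 // crater radius in UV space
float  craterDepth  = 0.3f;                 // how much height to remove at the center

float4 PixelShaderMorphCrater(in float2 uv : TEXCOORD0) : COLOR
{
    // the same interpolation as PixelShaderMorph...
    float4 color = lerp(tex2D(textureSampler1, uv), tex2D(textureSampler2, uv), morphFactor);
    // ...minus a radial dent that fades out towards the crater's edge
    float dent = craterDepth * saturate(1.0f - distance(uv, craterCenter) / craterRadius);
    return color - dent;    // only the red channel matters for a SurfaceFormat.Single target
}

The terrain shader does not need to change at all; it simply displaces vertices by whatever ends up in the displacement map.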

        Bonus: Terrain Illumination

One complaint you might have about the terrain we have rendered so far is the lack of lighting. Since the shape is composed in the vertex shader, the normals would have to be computed there as well. If the terrain were static, we could create a normal map at the same time we create the heightmap, and then read from that normal map inside the shaders. But for dynamic terrain this is no longer an option, since the shape of the terrain can change in unexpected ways. In order to have normals on the terrain, we generate a normal map from the interpolated heightmap, using a Sobel filter. This technique is adapted from a shader found in a demo made by ATI. Open TextureMorph.fx, and add the following code:

// The value of normalStrength should be tweaked until the result is satisfying;
// a larger value will result in more pronounced lighting.
float normalStrength = 8.0f;
// The size of one texel; the heightmaps (and our render targets) are 256x256.
float texelSize = 1.0f / 256.0f;
float4 ComputeNormalsPS(in float2 uv:TEXCOORD0) : COLOR
{
    float tl = abs(tex2D (textureSampler1, uv + texelSize * float2(-1, -1)).x);   // top left
    float  l = abs(tex2D (textureSampler1, uv + texelSize * float2(-1,  0)).x);   // left
    float bl = abs(tex2D (textureSampler1, uv + texelSize * float2(-1,  1)).x);   // bottom left
    float  t = abs(tex2D (textureSampler1, uv + texelSize * float2( 0, -1)).x);   // top
    float  b = abs(tex2D (textureSampler1, uv + texelSize * float2( 0,  1)).x);   // bottom
    float tr = abs(tex2D (textureSampler1, uv + texelSize * float2( 1, -1)).x);   // top right
    float  r = abs(tex2D (textureSampler1, uv + texelSize * float2( 1,  0)).x);   // right
    float br = abs(tex2D (textureSampler1, uv + texelSize * float2( 1,  1)).x);   // bottom right

    // Compute dx using Sobel:
    //
    //           -1 0 1 
    //           -2 0 2
    //           -1 0 1
    float dX = -tl - 2.0f*l - bl + tr + 2.0f*r + br;

    // Compute dy using Sobel:
    //
    //           -1 -2 -1 
    //            0  0  0
    //            1  2  1
    float dY = -tl - 2.0f*t - tr + bl + 2.0f*b + br;

    // Compute the normalized Normal
    float4 N = float4(normalize(float3(dX, 1.0f / normalStrength, dY)), 1.0f);
    //convert (-1.0 , 1.0) to (0.0 , 1.0);
    return N * 0.5f + 0.5f;
}
technique ComputeNormals
{
    pass P0
    {
      pixelShader  = compile ps_3_0 ComputeNormalsPS();
    }
}

This code will be used to generate the normal map. Now, we need to add code for lighting in the terrain shader. Open VTFDisplacement.fx, where we need to add a direction for the light, and a texture and sampler for the normal map. Since we will access this in the pixel shader, we can use bilinear filtering.

float4 lightDirection = {1,-0.7,1,0};
texture normalMap;
sampler normalSampler = sampler_state
{
    Texture   = <normalMap>;
    MipFilter = Linear;
    MinFilter = Linear;
    MagFilter = Linear;
    AddressU  = clamp;
    AddressV  = clamp;
};

We need to modify the pixel shader to include the lighting calculations. We will use a simple N · L (normal dot light) formula for the light, plus an ambient term of 0.2f.

float4 PixelShader(in float4 uv : TEXCOORD0, in float4 weights : TEXCOORD2) : COLOR
        {
        [...]
        float4 finalColor =  sand * weights.x + grass * weights.y + rock * weights.z + snow * weights.w;
        // read the normal from the normal map and transform it into (-1, 1) range
        float4 normal = normalize( 2.0f * (tex2D(normalSampler,uv) - 0.5f));
        float4 light = normalize(-lightDirection);
        // dot product between light and normal
         float ldn = dot(light,normal);
         ldn = max(0,ldn);
         // add some ambient light, and multiply with the color
         return finalColor * (0.2f + ldn);
        }

The last bits we need to add are in Game1.cs. We need a new RenderTarget2D for the normal computations, and we need to initialize it.

RenderTarget2D normalRenderTarget;
protected override void LoadGraphicsContent(bool loadAllContent)
        {
            [...]
            normalRenderTarget = new RenderTarget2D(graphics.GraphicsDevice, 256, 256, 1, SurfaceFormat.Color);
            [...]
        }

The function used to compute the normals is very similar to Render2TextureMorph. We need to call it after updating the heightmap, and then the generated normal map has to be passed as a parameter to the terrain rendering shader.

protected void Render2TextureNormalCompute()
        {

            RenderTarget2D oldRT = graphics.GraphicsDevice.GetRenderTarget(0) as RenderTarget2D;
            DepthStencilBuffer oldDS = graphics.GraphicsDevice.DepthStencilBuffer;

            graphics.GraphicsDevice.DepthStencilBuffer = morphDepthBuffer;
            graphics.GraphicsDevice.SetRenderTarget(0, normalRenderTarget);

            graphics.GraphicsDevice.Clear(Color.White);

            morphSpriteBatch.Begin(SpriteBlendMode.None,
                              SpriteSortMode.Immediate,
                              SaveStateMode.None);
            // compute normals for the newly generated heightmap
            morphEffect.Parameters["textureMap1"].SetValue(morphRenderTarget.GetTexture());

            morphEffect.CurrentTechnique = morphEffect.Techniques["ComputeNormals"];      // select the technique for computing normals.
            morphEffect.Begin();
            morphEffect.CurrentTechnique.Passes[0].Begin();

            morphSpriteBatch.Draw(morphRenderTarget.GetTexture(), new Rectangle(0, 0, 256, 256), Color.White);
            morphEffect.CurrentTechnique.Passes[0].End();
            morphEffect.End();

            morphSpriteBatch.End();

            graphics.GraphicsDevice.ResolveRenderTarget(0);
            graphics.GraphicsDevice.SetRenderTarget(0, oldRT);
            graphics.GraphicsDevice.DepthStencilBuffer = oldDS;

        }
[...]
protected override void Draw(GameTime gameTime)
        {
            graphics.GraphicsDevice.Clear(Color.CornflowerBlue);
            Render2TextureMorph((float)Math.Sin(gameTime.TotalGameTime.TotalSeconds) * 0.5f + 0.5f);
            Render2TextureNormalCompute();
            [...]
            gridEffect.Parameters["normalMap"].SetValue(normalRenderTarget.GetTexture());
            [...]
        }

In the end, the normal mapped terrain should look like this:

[Image: the terrain lit using the computed normal map]

For further improvement, the normal map can be combined with some generated noise, to produce the impression of small bumps over the ground and add a bit more detail; a rough sketch of this idea follows.
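As a rough sketch only, assuming a small tiling noise texture bound to a hypothetical noiseSampler, the last lines of ComputeNormalsPS could become:

// hypothetical: a tiling noise texture, sampled at a higher frequency than the heightmap
float3 noise = tex2D(noiseSampler, uv * 32.0f).xyz * 2.0f - 1.0f;   // remap to (-1, 1)
// nudge the Sobel normal by a small amount of noise, then renormalize
float3 bumped = normalize(float3(dX, 1.0f / normalStrength, dY) + 0.1f * noise);
// convert (-1.0, 1.0) to (0.0, 1.0), as before
return float4(bumped, 1.0f) * 0.5f + 0.5f;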

This ends the second part of this tutorial. We saw how to create dynamically morphing terrain. Using Render To Texture can increase performance, and can open the way to lots of special effects, like deformable terrain. We also saw how normals can be computed in real time from the heightmap and applied to the terrain. In the next chapters, we will leave terrain rendering behind and cover other interesting effects.

The complete source code for this chapter can be found in Chapter2.zip.

  • David

    Hi!
    I was about to start using your terrain code for my game’s 3D engine, so I downloaded the code to see that it worked properly before implementing it in my own engine.
    But when I compiled it, everything looked good until it came to the Render2TextureMorph part of your code, where it says “Both a valid vertex shader and pixel shader (or valid effect) must be set on the device before draw operations may be performed”.

    Have you heard about this problem before, and do you know if it’s fixable, or is it that my graphics card doesn’t support this type of code?
    Your screens of what you’ve done look really great btw :)

    Regards
    David

  • http://www.catalinzima.com Catalin Zima

    It’s probably your video card.
    For VTF, you need either a GeForce newer than the 6600, or a newer generation (DX10 generation) ATI card.

  • David

    Ok :)
    Well, back to the net to find a terrain tutorial again then.
    Hopefully I can use your particle snow later on in my project, it looked very nice :)

    Regards
    David

  • http://oppc.free.fr/DotClear/ Michael

    Hello,

    I have a weird problem with the implementation of the lighting.

    As you can see in this screenshot: http://oppc.free.fr/XNA/heightmapBug.png, the shadows are drawn only on a few percent of the whole terrain, even if I change the CellSize property…

    I have an ATI card on my laptop, so maybe it comes from that, and I didn’t have time yet to try my “game” on another computer.

    But I just hoped that this problem would make you think of a possible bug in my code…

    Mike

  • http://www.catalinzima.com Catalin Zima

    That’s strange. Do you use SpriteBatch anywhere in the code? that might mess up some texture addressing variables.

  • CDarksight

    If you are using SpriteBatch, try changing how you call SpriteBatch.Begin();

    so for example

    spriteBatch.Begin(SpriteBlendMode.AlphaBlend, SpriteSortMode.Immediate, SaveStateMode.None);

    //your drawing code here

    spriteBatch.End();