Friday, November 08, 2013 Eric Richards

Rendering Water with Displacement Mapping

Quite a while back, I presented an example that rendered water waves by computing a wave equation and updating a polygonal mesh each frame.  This method produced fairly nice graphical results, but it was very CPU-intensive and relied on updating a vertex buffer every frame, so its performance was relatively poor.

We can use displacement mapping to approximate the wave calculation and modify the geometry all on the GPU, which can be considerably faster.  At a very high level, what we will do is render a polygon grid mesh, using two height/normal maps that we will scroll in different directions and at different rates.  Then, for each vertex that we create using the tessellation stages, we will sample the two heightmaps, and add the sampled offsets to the vertex’s y-coordinate.  Because we are scrolling the heightmaps at different rates, small peaks and valleys will appear and disappear over time, resulting in an effect that looks like waves.  Using different control parameters, we can control this wave effect, and generate either a still, calm surface, like a mountain pond at first light, or big, choppy waves, like the ocean in the midst of a tempest.
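As a rough illustration of the idea (a plain Python sketch, with made-up analytic functions standing in for the two heightmap textures, and hypothetical parameter names), the displaced height of a vertex is just the sum of two independently scaled, independently scrolled samples:

```python
import math

# Stand-ins for the two heightmap textures (any smooth periodic function works).
def height0(u, v):
    return 0.5 * (math.sin(6.0 * u) + math.cos(6.0 * v))

def height1(u, v):
    return 0.5 * math.sin(4.0 * (u + v))

def wave_height(x, z, t, scroll0=(0.01, 0.03), scroll1=(0.02, 0.05),
                scale0=2.0, scale1=1.0, height_scale0=0.4, height_scale1=0.7):
    # Each map gets its own tiling scale and its own scroll velocity; because
    # the two patterns drift at different rates, their sum produces peaks and
    # valleys that appear and disappear over time.
    u0 = x * scale0 + scroll0[0] * t
    v0 = z * scale0 + scroll0[1] * t
    u1 = x * scale1 + scroll1[0] * t
    v1 = z * scale1 + scroll1[1] * t
    return height_scale0 * height0(u0, v0) + height_scale1 * height1(u1, v1)
```

Tuning the scroll velocities and height scales moves the result between a calm pond and choppy seas, which is exactly what the control parameters on the Waves class below expose.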

This example is based on the final exercise of Chapter 18 of Frank Luna’s Introduction to 3D Game Programming with Direct3D 11.0.  The original code that inspired this example is not located with the other examples for Chapter 18, but rather in the SelectedCodeSolutions directory.  You can download my source code in full, under the 29-WavesDemo project.  One thing to note is that you will need a DirectX 11-compatible video card to run this example, as we will be using tessellation-stage shaders that are only available in DirectX 11.



To model waves, we will start with a regular XZ polygonal grid.  Then, we will use two normal/heightmap textures to modify the vertex heights and normals, taking the sum of samples from both textures.  We will scroll these normal/depth maps at independent rates, with different scales, producing the peaks and valleys of the wave effect.  We will develop a class to encapsulate the parameters of the wave model, as well as the underlying polygonal mesh, and a new shader effect, based on our displacement mapping effect.

Waves Class

Our Waves class encapsulates the underlying grid geometry, as well as the parameters that control the texture scrolling.  We will make use of our BasicModel and BasicModelInstance classes to store the wave geometry, since these classes already provide a lot of the functionality we will need.  I could have subclassed the BasicModel class instead, but I felt that was overkill for this demo.  For convenience, we will provide accessor properties to set the material, world transform and diffuse map of the underlying BasicModel, rather than just exposing the BasicModel and BasicModelInstance members publicly.

public class Waves : DisposableClass {
    // wave geometry
    private BasicModel _gridModel;
    private BasicModelInstance _grid;

    // offsets to scroll the wave textures for displacement mapping
    private Vector2 _wavesDispOffset0;
    private Vector2 _wavesDispOffset1;

    // offsets to scroll the wave textures for normal mapping
    private Vector2 _wavesNormalOffset0;
    private Vector2 _wavesNormalOffset1;
    // transform matrices to convert offsets into transformations we can feed into the shader
    private Matrix _wavesDispTexTransform0;
    private Matrix _wavesDispTexTransform1;
    private Matrix _wavesNormalTexTransform0;
    private Matrix _wavesNormalTexTransform1;

    private bool _disposed;

    // provides access to model material
    public Material Material {
        get { return _gridModel.Materials[0]; }
        set { _gridModel.Materials[0] = value; }
    }

    // provides access to model world transform
    public Matrix World {
        get { return _grid.World; }
        set { _grid.World = value; }
    }

    public BoundingBox BoundingBox { get { return _grid.BoundingBox; } }

    // normal/heightmap textures
    public ShaderResourceView NormalMap0 { get; set; }
    public ShaderResourceView NormalMap1 { get; set; }

    // parameters to modify texture scrolling rates
    public Vector2 DispFactor0 { get; private set; }
    public Vector2 DispFactor1 { get; private set; }
    public Vector2 NormalFactor0 { get; private set; }
    public Vector2 NormalFactor1 { get; private set; }

    // parameters to modify texture tiling
    public Vector3 DispScale0 { get; private set; }
    public Vector3 DispScale1 { get; private set; }
    public Vector3 NormalScale0 { get; private set; }
    public Vector3 NormalScale1 { get; private set; }

    // provides access to model diffuse map
    public ShaderResourceView DiffuseMap {
        get { return _gridModel.DiffuseMapSRV[0]; }
        set { _gridModel.DiffuseMapSRV[0] = value; }
    }
    // provides access to model diffuse texture transform
    public Matrix TexTransform {
        get { return _grid.TexTransform; }
        set { _grid.TexTransform = value; }
    }

We also expose the parameters that define the scaling and scrolling of our normal/depth maps publicly.  These are set to reasonable default values in our constructor, but are available for tweaking if you want to modify the effect.  Because our Waves class contains a BasicModel member, we will subclass our DisposableClass base class, in order to allow us to clean up DirectX resources easily.

public Waves() {
    _wavesDispOffset0 = new Vector2();
    _wavesDispOffset1 = new Vector2();
    _wavesNormalOffset0 = new Vector2();
    _wavesNormalOffset1 = new Vector2();

    DispFactor0 = new Vector2(0.01f, 0.03f);
    DispFactor1 = new Vector2(0.01f, 0.03f);
    NormalFactor0 = new Vector2(0.05f, 0.02f);
    NormalFactor1 = new Vector2(0.02f, 0.05f);

    DispScale0 = new Vector3(2, 2, 1);
    DispScale1 = new Vector3(1, 1, 1);
    NormalScale0 = new Vector3(22, 22, 1);
    NormalScale1 = new Vector3(16, 16, 1);
}

protected override void Dispose(bool disposing) {
    if (!_disposed) {
        if (disposing) {
            Util.ReleaseCom(ref _gridModel);
        }
        _disposed = true;
    }
    base.Dispose(disposing);
}

Creating a Waves Object

As you can see, the Waves constructor is very lightweight.  To fully initialize the object, you will need to call the Init() method.  This creates the polygon grid geometry with the specified dimensions, and loads the normal/height textures.  We also set a default bluish material, which you can replace via the Material property if you choose.

public void Init(Device device, TextureManager texMgr, float width, float depth, string texture1 = "Textures/", string texture2 = "textures/") {
    NormalMap0 = texMgr.CreateTexture(texture1);
    NormalMap1 = texMgr.CreateTexture(texture2);

    _gridModel = BasicModel.CreateGrid(device, width, depth, ((int)width) * 2, ((int)depth) * 2);
    Material = new Material {
        Ambient = new Color4(0.1f, 0.1f, 0.3f),
        Diffuse = new Color4(0.4f, 0.4f, 0.7f),
        Specular = new Color4(128f, 0.8f, 0.8f, 0.8f),
        Reflect = new Color4(0.4f, 0.4f, 0.4f)
    };

    _grid = new BasicModelInstance {
        Model = _gridModel
    };
    TexTransform = Matrix.Identity;
    World = Matrix.Translation(0, -0.2f, 0);
}

Updating the Waves

Each frame, we will need to update our waves to scroll the normal/depth maps.  We do this by updating the offset vectors for each texture, and then creating transformation matrices to translate and scale these offsets, according to our wave control parameters.

public void Update(float dt) {
    _wavesDispOffset0.X += DispFactor0.X * dt;
    _wavesDispOffset0.Y += DispFactor0.Y * dt;

    _wavesDispOffset1.X += DispFactor1.X * dt;
    _wavesDispOffset1.Y += DispFactor1.Y * dt;

    _wavesDispTexTransform0 = Matrix.Scaling(DispScale0) *
                                Matrix.Translation(_wavesDispOffset0.X, _wavesDispOffset0.Y, 0);

    _wavesDispTexTransform1 = Matrix.Scaling(DispScale1) *
                                Matrix.Translation(_wavesDispOffset1.X, _wavesDispOffset1.Y, 0);

    _wavesNormalOffset0.X += NormalFactor0.X * dt;
    _wavesNormalOffset0.Y += NormalFactor0.Y * dt;

    _wavesNormalOffset1.X += NormalFactor1.X * dt;
    _wavesNormalOffset1.Y += NormalFactor1.Y * dt;

    _wavesNormalTexTransform0 = Matrix.Scaling(NormalScale0) *
                                Matrix.Translation(_wavesNormalOffset0.X, _wavesNormalOffset0.Y, 0);
    _wavesNormalTexTransform1 = Matrix.Scaling(NormalScale1) *
                                Matrix.Translation(_wavesNormalOffset1.X, _wavesNormalOffset1.Y, 0);
}

Drawing the Waves

Our Draw() function is very similar to that of our BasicModelInstance class.  The bulk of the work performed is simply setting the necessary shader variables for the wave effect.  Then, we draw the grid geometry using our special wave shader.

public void Draw(DeviceContext dc, EffectTechnique waveTech, Matrix viewProj) {
    for (var p = 0; p < waveTech.Description.PassCount; p++) {
        var world = _grid.World;
        var wit = MathF.InverseTranspose(world);
        var wvp = world * viewProj;

        // set the per-object matrices and the wave-specific shader variables here,
        // e.g. the wave texture transforms, height scales and normal maps
        waveTech.GetPassByIndex(p).Apply(dc);
        _grid.Model.ModelMesh.Draw(dc, 0);
    }
}

Waves Shader

Our waves shader is very similar to our displacement mapping shader, except instead of a single normal/depth map, we use two.  This means that we will need to generate four sets of texture coordinates in our vertex shader (one set for each texture for normal mapping, and one set for each texture for displacement mapping, since we are scrolling the normal and displacement maps at different rates).

VertexOut VS(VertexIn vin)
{
    VertexOut vout;

    // Transform to world space.
    vout.PosW     = mul(float4(vin.PosL, 1.0f), gWorld).xyz;
    vout.NormalW  = mul(vin.NormalL, (float3x3)gWorldInvTranspose);
    vout.TangentW = mul(vin.TangentL, (float3x3)gWorld);

    // Output vertex attributes for interpolation across triangle.
    vout.Tex            = mul(float4(vin.Tex, 0.0f, 1.0f), gTexTransform).xy;
    vout.WaveDispTex0   = mul(float4(vin.Tex, 0.0f, 1.0f), gWaveDispTexTransform0).xy;
    vout.WaveDispTex1   = mul(float4(vin.Tex, 0.0f, 1.0f), gWaveDispTexTransform1).xy;
    vout.WaveNormalTex0 = mul(float4(vin.Tex, 0.0f, 1.0f), gWaveNormalTexTransform0).xy;
    vout.WaveNormalTex1 = mul(float4(vin.Tex, 0.0f, 1.0f), gWaveNormalTexTransform1).xy;
    float d = distance(vout.PosW, gEyePosW);

    // Normalized tessellation factor. 
    // The tessellation is 
    //   0 if d >= gMinTessDistance and
    //   1 if d <= gMaxTessDistance.  
    float tess = saturate( (gMinTessDistance - d) / (gMinTessDistance - gMaxTessDistance) );
    // Rescale [0,1] --> [gMinTessFactor, gMaxTessFactor].
    vout.TessFactor = gMinTessFactor + tess*(gMaxTessFactor-gMinTessFactor);

    return vout;
}
Our hull shader will be the same as for our previous displacement mapping shader.  In the domain shader, we need to sample the height maps in order to displace the final generated vertex position in the Y-direction.  We do this by sampling the two heightmaps, using the interpolated displacement texture coordinates, scaling the samples by the scale factor for each texture, and then summing the resulting displacements.  Remember that we have to do manual mipmap level selection in the domain shader, and use SampleLevel, rather than Sample to get the height samples.

// The domain shader is called for every vertex created by the tessellator.  
// It is like the vertex shader after tessellation.
DomainOut DS(PatchTess patchTess, 
             float3 bary : SV_DomainLocation, 
             const OutputPatch<HullOut,3> tri)
{
    DomainOut dout;
    // Interpolate patch attributes to generated vertices.
    dout.PosW           = bary.x*tri[0].PosW           + bary.y*tri[1].PosW           + bary.z*tri[2].PosW;
    dout.NormalW        = bary.x*tri[0].NormalW        + bary.y*tri[1].NormalW        + bary.z*tri[2].NormalW;
    dout.TangentW       = bary.x*tri[0].TangentW       + bary.y*tri[1].TangentW       + bary.z*tri[2].TangentW;
    dout.Tex            = bary.x*tri[0].Tex            + bary.y*tri[1].Tex            + bary.z*tri[2].Tex;
    dout.WaveDispTex0   = bary.x*tri[0].WaveDispTex0   + bary.y*tri[1].WaveDispTex0   + bary.z*tri[2].WaveDispTex0;
    dout.WaveDispTex1   = bary.x*tri[0].WaveDispTex1   + bary.y*tri[1].WaveDispTex1   + bary.z*tri[2].WaveDispTex1;
    dout.WaveNormalTex0 = bary.x*tri[0].WaveNormalTex0 + bary.y*tri[1].WaveNormalTex0 + bary.z*tri[2].WaveNormalTex0;
    dout.WaveNormalTex1 = bary.x*tri[0].WaveNormalTex1 + bary.y*tri[1].WaveNormalTex1 + bary.z*tri[2].WaveNormalTex1;

    // Interpolating normal can unnormalize it, so normalize it.
    dout.NormalW = normalize(dout.NormalW);
    // Displacement mapping.
    // Choose the mipmap level based on distance to the eye; specifically, choose
    // the next miplevel every MipInterval units, and clamp the miplevel in [0,6].
    const float MipInterval = 20.0f;
    float mipLevel = clamp( (distance(dout.PosW, gEyePosW) - MipInterval) / MipInterval, 0.0f, 6.0f);
    // Sample height map (stored in alpha channel).
    float h0 = gNormalMap0.SampleLevel(samLinear, dout.WaveDispTex0, mipLevel).a;
    float h1 = gNormalMap1.SampleLevel(samLinear, dout.WaveDispTex1, mipLevel).a;

    dout.PosW.y += gHeightScale0*h0;
    dout.PosW.y += gHeightScale1*h1;

    // Project to homogeneous clip space.
    dout.PosH = mul(float4(dout.PosW, 1.0f), gViewProj);

    return dout;
}

Likewise, in our pixel shader, we need to perform normal mapping using two samples from the normal maps, rather than just one. Once we have sampled the normal maps using the normal texture coordinates, we need to convert the normals from texture space to world space, as we did with displacement mapping.  Then, we sum the sampled normals and normalize the result to generate the final pixel normal.

    // Normal mapping (PS())

    float3 normalMapSample0 = gNormalMap0.Sample(samLinear, pin.WaveNormalTex0).rgb;
    float3 bumpedNormalW0 = NormalSampleToWorldSpace(normalMapSample0, pin.NormalW, pin.TangentW);

    float3 normalMapSample1 = gNormalMap1.Sample(samLinear, pin.WaveNormalTex1).rgb;
    float3 bumpedNormalW1 = NormalSampleToWorldSpace(normalMapSample1, pin.NormalW, pin.TangentW);
    float3 bumpedNormalW = normalize(bumpedNormalW0 + bumpedNormalW1);
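Conceptually, combining the two bump samples works like this (a plain Python sketch with hypothetical helper names; the tangent-space-to-world-space transform that NormalSampleToWorldSpace performs is omitted, but the [0,1] to [-1,1] unpacking mirrors what that helper does first):

```python
import math

def unpack_normal(rgb):
    # normal map texels store components in [0,1]; map them back to [-1,1]
    return tuple(2.0 * c - 1.0 for c in rgb)

def combine_normals(n0, n1):
    # sum the two bumped normals, then renormalize to get the final pixel normal
    summed = tuple(a + b for a, b in zip(n0, n1))
    length = math.sqrt(sum(c * c for c in summed))
    return tuple(c / length for c in summed)
```

Summing and renormalizing effectively averages the two perturbations, so where the scrolling maps agree the bumps reinforce each other, and where they disagree they partially cancel.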

WavesEffect Shader Wrapper

Since our WavesEffect is so similar to our displacement mapping shader, and uses all the same techniques, we can actually subclass our DisplacementMapEffect wrapper, and just add the additional shader variables necessary.  We will of course need to add additional code to our static Effects class to reference, create and destroy the new WavesEffect.

public class WavesEffect : DisplacementMapEffect {
    private readonly EffectMatrixVariable _waveDispTexTransform0;
    private readonly EffectMatrixVariable _waveDispTexTransform1;
    private readonly EffectMatrixVariable _waveNormalTexTransform0;
    private readonly EffectMatrixVariable _waveNormalTexTransform1;
    private readonly EffectScalarVariable _heightScale0;
    private readonly EffectScalarVariable _heightScale1;

    private readonly EffectResourceVariable _normalMap0;
    private readonly EffectResourceVariable _normalMap1;

    public void SetWaveDispTexTransform0(Matrix m) { _waveDispTexTransform0.SetMatrix(m); }
    public void SetWaveDispTexTransform1(Matrix m) { _waveDispTexTransform1.SetMatrix(m); }
    public void SetWaveNormalTexTransform0(Matrix m) { _waveNormalTexTransform0.SetMatrix(m); }
    public void SetWaveNormalTexTransform1(Matrix m) { _waveNormalTexTransform1.SetMatrix(m); }

    public void SetHeightScale0(float f) { _heightScale0.Set(f); }
    public void SetHeightScale1(float f) { _heightScale1.Set(f); }

    public void SetNormalMap0(ShaderResourceView srv) { _normalMap0.SetResource(srv); }
    public void SetNormalMap1(ShaderResourceView srv) { _normalMap1.SetResource(srv); }

    public WavesEffect(Device device, string filename) : base(device, filename) {
        _waveDispTexTransform0 = FX.GetVariableByName("gWaveDispTexTransform0").AsMatrix();
        _waveDispTexTransform1 = FX.GetVariableByName("gWaveDispTexTransform1").AsMatrix();
        _waveNormalTexTransform0 = FX.GetVariableByName("gWaveNormalTexTransform0").AsMatrix();
        _waveNormalTexTransform1 = FX.GetVariableByName("gWaveNormalTexTransform1").AsMatrix();
        _heightScale0 = FX.GetVariableByName("gHeightScale0").AsScalar();
        _heightScale1 = FX.GetVariableByName("gHeightScale1").AsScalar();
        _normalMap0 = FX.GetVariableByName("gNormalMap0").AsResource();
        _normalMap1 = FX.GetVariableByName("gNormalMap1").AsResource();
    }
}

The Result

To really appreciate the wave effect, you need to see it in motion, so here is a video.

Displacement Mapped Waves effect

Next Time…

Well, we are finally finished with Frank Luna’s Introduction to 3D Game Programming with Direct3D 11.0.  I have skipped over some chapters here, but the chapters that I have omitted are primarily theory-based, with less than spectacular demo applications.  For reference, these would be: Chapter 12: The Compute Shader, Chapter 13: The Tessellation Stages, and Chapter 24: Quaternions.  We’ve studied the tessellation stages quite heavily, between looking at displacement mapping, terrain rendering, and the current waves demo.  The chapter on quaternions, I believe, mostly exists to set up Luna’s chapter on character animation, but since we ended up implementing a very different set of loading and rendering code, it is not especially applicable.  Quaternions are very useful for some things, and we did use the Assimp flavor of quaternions in our character animation code, but it’s a little difficult to shoe-horn them into an interesting graphical demo.  The compute shader may be something I circle back to, but for now, it’s somewhat orthogonal to what I want to do.

Next up, I think that I will spend some time improving our Terrain class.  I have been working on incorporating the shadow mapping and ambient occlusion techniques into the terrain rendering, so that will probably be the next post.  I’ve also been looking at adapting the minimap implementation from Carl Granberg’s Programming an RTS Game with Direct3D into my code, so that should be forthcoming once I nail it down.  After that, I’d really like to spend some time on mouse picking the terrain, as well as implementing pathfinding using A*.

Possibly interspersed with all of this, I am going to begin going through Game Physics Engine Development by Ian Millington in earnest, so there will probably be a series of posts on writing a physics engine.
