
Creating Realistic Terrain for HTML5 Games With WebGL

Sponsored Content

This sponsored post features a product relevant to our readers while meeting our editorial guidelines for being objective and educational.

The first version of Flight Simulator shipped in 1980 for the Apple II and, amazingly, it was in 3D! That was a remarkable achievement. It’s even more amazing when you consider that all of the 3D was done by hand, the result of meticulous calculations and low-level pixel commands. When Bruce Atwick tackled the early versions of Flight Simulator, not only were there no 3D frameworks, but there were no frameworks at all! Those versions of the game were mostly written in assembly, just a single step away from ones and zeroes that flow through a CPU.

When we set out to reimagine Flight Simulator (or Flight Arcade as we call it) for the web and to demonstrate what’s possible in the new Microsoft Edge browser and EdgeHTML rendering engine, we couldn’t help but think about the contrast of creating 3D then and now—old Flight Sim, new Flight Sim, old Internet Explorer, new Microsoft Edge. Modern coding seems almost luxurious as we sculpt 3D worlds in WebGL with great frameworks like Babylon.js. It lets us focus on very high-level problems. 

In this article, I’ll share our approach to one of these fun challenges: a simple way to create realistic-looking large-scale terrain.

Note: Interactive code and examples for this article are also located at Flight Arcade / Learn.

Modeling and 3D Terrain

Most 3D objects are created with modeling tools, and for good reason. Creating complex objects (like an airplane or even a building) is hard to do in code. Modeling tools almost always make sense, but there are exceptions! One of those might be cases like the rolling hills of the Flight Arcade island. We ended up using a technique that we found to be simpler and possibly even more intuitive: a heightmap.

A heightmap is a way to use a regular two-dimensional image to describe the elevation relief of a surface like an island or other terrain. It’s a pretty common way to work with elevation data, not only in games but also in geographic information systems (GIS) used by cartographers and geologists.

To help you get an idea for how this works, check out the heightmap in this interactive demo. Try drawing in the image editor, and then check out the resulting terrain.

Screenshot of the heightmap demo

The concept behind a heightmap is pretty straightforward. In an image like the one above, pure black is the “floor” and pure white is the tallest peak. The grayscale colors in-between represent corresponding elevations. This gives us 256 levels of elevation, which is plenty of detail for our game. Real-life applications might use the full color spectrum to store significantly more levels of detail (256^4 = 4,294,967,296 levels of detail if you include an alpha channel).
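To make that mapping concrete, here's a small sketch (illustrative only, not from the Flight Arcade source) of how a grayscale pixel value converts to an elevation, given the terrain's minimum and maximum heights:

```javascript
// Map a grayscale byte (0–255) to an elevation between minHeight and maxHeight.
// 0 (pure black) is the floor; 255 (pure white) is the tallest peak.
function pixelToElevation(gray, minHeight, maxHeight) {
    return minHeight + (gray / 255) * (maxHeight - minHeight);
}

pixelToElevation(0, 0, 50);   // floor: 0
pixelToElevation(255, 0, 50); // tallest peak: 50
pixelToElevation(128, 0, 50); // mid-gray: roughly halfway up
```

The min and max heights of 0 and 50 match the values passed to Babylon later in this article; a mesh generator simply applies this function to every sampled pixel.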

A heightmap has a few advantages over a traditional polygonal mesh:

First, heightmaps are a lot more compact. Only the most significant data (the elevation) gets stored. It will need to be turned into a 3D object programmatically, but this is the classic trade: you save space now and pay later with computation. By storing the data as an image, you get another space advantage: you can leverage standard image compression techniques and make the data tiny (by comparison)!

Second, heightmaps are a convenient way to generate, visualize and edit terrain. It's pretty intuitive when you see one. It feels a little like looking at a map. This proved to be particularly useful for Flight Arcade. We designed and edited our island right in Photoshop! This made it very simple to make small adjustments as needed. When, for example, we wanted to make sure that the runway was completely flat, we just made sure to paint over that area in a single color.

You can see the heightmap for Flight Arcade below. See if you can spot the “flat” areas we created for the runway and the village.

Heightmap for Flight Arcade
The heightmap for the Flight Arcade island. It was created in Photoshop and it's based on the "big island" in a famous Pacific Ocean island chain. Any guesses?
A texture that gets mapped onto the resulting 3D mesh after the heightmap is decoded. More on that below.

Decoding the Heightmap

We built Flight Arcade with Babylon.js, and Babylon gave us a pretty straightforward path from heightmap to 3D. Babylon provides an API to generate a mesh geometry from a heightmap image:

var ground = BABYLON.Mesh.CreateGroundFromHeightMap(
    'your-mesh-name',
    '/path/to/heightmap.png',
    100, // width of the ground mesh (x axis)
    100, // depth of the ground mesh (z axis)
    40,  // number of subdivisions
    0,   // min height
    50,  // max height
    scene,
    false, // updateable?
    null // callback when mesh is ready
);

The amount of detail is determined by the subdivisions parameter. It’s important to note that this parameter refers to the number of subdivisions on each side of the heightmap image, not the total number of cells. So increasing this number slightly can have a big effect on the total number of vertices in your mesh.

  • 20 subdivisions = 400 cells
  • 50 subdivisions = 2,500 cells
  • 100 subdivisions = 10,000 cells
  • 500 subdivisions = 250,000 cells
  • 1,000 subdivisions = 1,000,000 cells
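The quadratic growth shown above is easy to verify. Here's a quick sketch (illustrative only) that computes the cell and vertex counts for a given subdivision value:

```javascript
// A ground mesh with n subdivisions per side has n × n cells.
function groundCells(subdivisions) {
    return subdivisions * subdivisions;
}

// Each side has one more row of corners than cells,
// so the mesh has (n + 1) × (n + 1) vertices.
function groundVertices(subdivisions) {
    return (subdivisions + 1) * (subdivisions + 1);
}

groundCells(20);    // 400
groundCells(1000);  // 1,000,000
groundVertices(40); // 1,681 vertices for the 40-subdivision mesh above
```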

In the next section we'll learn how to texture the ground, but when experimenting with heightmap creation, it's useful to see the wireframe. Here is the code to apply a simple wireframe texture so it’s easy to see how the heightmap data is converted into the vertices of our mesh:

// simple wireframe material
var material = new BABYLON.StandardMaterial('ground-material', scene);
material.wireframe = true;
ground.material = material;

Creating Texture Detail

Once we had a model, mapping a texture was relatively straightforward. For Flight Arcade, we simply created a very large image that matched the island in our heightmap. The image gets stretched over the contours of the terrain, so the texture and the height map remain correlated. This was really easy to visualize and, once again, all of the production work was done in Photoshop.

The original texture image was created at 4096x4096. That's pretty big! (We eventually reduced the size by a level to 2048x2048 in order to keep the download reasonable, but all of the development was done with the full-size image.) Here's a full-pixel sample from the original texture. 

A full-pixel sample of the original island texture
A full-pixel sample of the original island texture. The entire town is only around 300 px square.

Those rectangles represent the buildings in the town on the island. We quickly noticed a discrepancy in the level of texturing detail that we could achieve between the terrain and the other 3D models. Even with our giant island texture, the difference was distractingly apparent!

To fix this, we “blended” additional detail into the terrain texture in the form of random noise. You can see the before and after below. Notice how the additional noise enhances the appearance of detail in the terrain.

Before and after comparison of airport texture

We created a custom shader to add the noise. Shaders give you an incredible amount of control over the rendering of a WebGL 3D scene, and this is a great example of how a shader can be useful.

A WebGL shader consists of two major pieces: the vertex and fragment shaders. The principal goal of the vertex shader is to map vertices to a position in the rendered frame. The fragment (or pixel) shader controls the resulting color of the pixels.

Shaders are written in a high-level language called GLSL (Graphics Library Shader Language), which resembles C. This code is executed on the GPU. For an in-depth look at how shaders work, see this tutorial on how to create your own custom shader for Babylon.js, or see this beginner's guide to coding graphics shaders.

The Vertex Shader

We're not changing how our texture is mapped on to the ground mesh, so our vertex shader is quite simple. It just computes the standard mapping and assigns the target location.

precision mediump float;

// Attributes
attribute vec3 position;
attribute vec3 normal;
attribute vec2 uv;

// Uniforms
uniform mat4 worldViewProjection;

// Varying
varying vec4 vPosition;
varying vec3 vNormal;
varying vec2 vUV;

void main() {
    vec4 p = vec4( position, 1.0 );
    vPosition = p;
    vNormal = normal;
    vUV = uv;
    gl_Position = worldViewProjection * p;
}

The Fragment Shader

Our fragment shader is a little more complicated. It combines two different images: the base and blend images. The base image is mapped across the entire ground mesh. In Flight Arcade, this is the color image of the island. The blend image is the small noise image used to give the ground some texture and detail at close distances. The shader combines the values from each image to create a combined texture across the island.

The final lesson in Flight Arcade takes place on a foggy day, so the other task our pixel shader has is to adjust the color to simulate fog. The adjustment is based on how far the vertex is from the camera, with distant pixels being more heavily "obscured" by the fog. You'll see this distance calculation in the calcFogFactor function above the main shader code.

#ifdef GL_ES
precision highp float;
#endif

uniform mat4 worldView;
varying vec4 vPosition;
varying vec3 vNormal;
varying vec2 vUV;

// Refs
uniform sampler2D baseSampler;
uniform sampler2D blendSampler;
uniform float blendScaleU;
uniform float blendScaleV;

#define FOGMODE_NONE 0.
#define FOGMODE_EXP 1.
#define FOGMODE_EXP2 2.
#define FOGMODE_LINEAR 3.
#define E 2.71828

uniform vec4 vFogInfos;
uniform vec3 vFogColor;

float calcFogFactor() {

    // gets distance from camera to vertex
    float fogDistance = gl_FragCoord.z / gl_FragCoord.w;

    float fogCoeff = 1.0;
    float fogStart = vFogInfos.y;
    float fogEnd = vFogInfos.z;
    float fogDensity = vFogInfos.w;

    if (FOGMODE_LINEAR == vFogInfos.x) {
        fogCoeff = (fogEnd - fogDistance) / (fogEnd - fogStart);
    }
    else if (FOGMODE_EXP == vFogInfos.x) {
        fogCoeff = 1.0 / pow(E, fogDistance * fogDensity);
    }
    else if (FOGMODE_EXP2 == vFogInfos.x) {
        fogCoeff = 1.0 / pow(E, fogDistance * fogDistance * fogDensity * fogDensity);
    }

    return clamp(fogCoeff, 0.0, 1.0);
}

void main(void) {

    vec4 baseColor = texture2D(baseSampler, vUV);

    vec2 blendUV = vec2(vUV.x * blendScaleU, vUV.y * blendScaleV);
    vec4 blendColor = texture2D(blendSampler, blendUV);

    // multiply type blending mode
    vec4 color = baseColor * blendColor;

    // factor in fog color
    float fog = calcFogFactor();
    color.rgb = fog * color.rgb + (1.0 - fog) * vFogColor;

    gl_FragColor = color;
}

The final piece for our custom Blend shader is the JavaScript code used by Babylon. The primary purpose of this code is to prepare the parameters passed to our vertex and pixel shaders.

function BlendMaterial(name, scene, options) {
    this.name = name;
    this.id = name;

    this.options = options;
    this.blendScaleU = options.blendScaleU || 1;
    this.blendScaleV = options.blendScaleV || 1;

    this._scene = scene;
    scene.materials.push(this);

    var assets = options.assetManager;
    var textureTask = assets.addTextureTask('blend-material-base-task', options.baseImage);
    textureTask.onSuccess = _.bind(function(task) {

        this.baseTexture = task.texture;
        this.baseTexture.uScale = 1;
        this.baseTexture.vScale = 1;

        if (options.baseHasAlpha) {
            this.baseTexture.hasAlpha = true;
        }

    }, this);

    textureTask = assets.addTextureTask('blend-material-blend-task', options.blendImage);
    textureTask.onSuccess = _.bind(function(task) {
        this.blendTexture = task.texture;
        this.blendTexture.wrapU = BABYLON.Texture.MIRROR_ADDRESSMODE;
        this.blendTexture.wrapV = BABYLON.Texture.MIRROR_ADDRESSMODE;
    }, this);
}

BlendMaterial.prototype = Object.create(BABYLON.Material.prototype);

BlendMaterial.prototype.needAlphaBlending = function () {
    return (this.options.baseHasAlpha === true);
};

BlendMaterial.prototype.needAlphaTesting = function () {
    return false;
};

BlendMaterial.prototype.isReady = function (mesh) {
    var engine = this._scene.getEngine();

    // make sure textures are ready
    if (!this.baseTexture || !this.blendTexture) {
        return false;
    }

    if (!this._effect) {
        this._effect = engine.createEffect(

            // shader name
            "blend",

            // attributes describing topology of vertices
            [ "position", "normal", "uv" ],

            // uniforms (external variables) defined by the shaders
            [ "worldViewProjection", "world", "blendScaleU", "blendScaleV", "vFogInfos", "vFogColor" ],

            // samplers (objects used to read textures)
            [ "baseSampler", "blendSampler" ],

            // optional define string
            "");
    }

    if (!this._effect.isReady()) {
        return false;
    }

    return true;
};

BlendMaterial.prototype.bind = function (world, mesh) {

    var scene = this._scene;
    this._effect.setFloat4("vFogInfos", scene.fogMode, scene.fogStart, scene.fogEnd, scene.fogDensity);
    this._effect.setColor3("vFogColor", scene.fogColor);

    this._effect.setMatrix("world", world);
    this._effect.setMatrix("worldViewProjection", world.multiply(scene.getTransformMatrix()));

    // Textures
    this._effect.setTexture("baseSampler", this.baseTexture);
    this._effect.setTexture("blendSampler", this.blendTexture);

    this._effect.setFloat("blendScaleU", this.blendScaleU);
    this._effect.setFloat("blendScaleV", this.blendScaleV);
};

BlendMaterial.prototype.dispose = function () {

    if (this.baseTexture) {
        this.baseTexture.dispose();
    }

    if (this.blendTexture) {
        this.blendTexture.dispose();
    }

    this.baseDispose();
};

Babylon.js makes it easy to create a custom shader-based material. Our Blend material is relatively simple, but it really made a big difference in the appearance of the island when the plane flew low to the ground. Shaders bring the power of the GPU to the browser, expanding the types of creative effects you can apply to your 3D scenes. In our case, that was the finishing touch!

More Hands-On With JavaScript

Microsoft has a bunch of free learning resources on many open-source JavaScript topics, and we’re on a mission to create a lot more with Microsoft Edge.

And some free tools to get started: Visual Studio Code, Azure Trial, and cross-browser testing tools—all available for Mac, Linux, or Windows.

This article is part of the web dev tech series from Microsoft. We’re excited to share Microsoft Edge and the new EdgeHTML rendering engine with you. Get free virtual machines or test remotely on your Mac, iOS, Android, or Windows device @ http://dev.modern.ie/.
