Channel: Envato Tuts+ Game Development

Creating Toon Water for the Web: Part 3


Welcome back to this three-part series on creating stylized toon water in PlayCanvas using vertex shaders. In Part 2 we covered buoyancy & foam lines. In this final part, we're going to apply the underwater distortion as a post-process effect.

Refraction & Post-Process Effects

Our goal is to visually communicate the refraction of light through water. We've already covered how to create this sort of distortion in a fragment shader in a previous tutorial for a 2D scene. The only difference here is that we'll need to figure out which area of the screen is underwater and only apply the distortion there. 

Post-Processing

In general, a post-process effect is anything applied to the whole scene after it is rendered, such as a colored tint or an old CRT screen effect. Instead of rendering your scene directly to the screen, you first render it to a buffer or texture, and then render that texture to the screen, passing it through a custom shader.

In PlayCanvas, you can set up a post-process effect by creating a new script. Call it Refraction.js, and copy this template to start with:

//--------------- POST EFFECT DEFINITION------------------------//
pc.extend(pc, function () {
    // Constructor - Creates an instance of our post effect
    var RefractionPostEffect = function (graphicsDevice, vs, fs) {
        var fragmentShader = "precision " + graphicsDevice.precision + " float;\n";
        fragmentShader = fragmentShader + fs;

        // this is the shader definition for our effect
        this.shader = new pc.Shader(graphicsDevice, {
            attributes: {
                aPosition: pc.SEMANTIC_POSITION
            },
            vshader: vs,
            fshader: fs
        });
    };

    // Our effect must derive from pc.PostEffect
    RefractionPostEffect = pc.inherits(RefractionPostEffect, pc.PostEffect);

    RefractionPostEffect.prototype = pc.extend(RefractionPostEffect.prototype, {
        // Every post effect must implement the render method which
        // sets any parameters that the shader might require and
        // also renders the effect on the screen
        render: function (inputTarget, outputTarget, rect) {
            var device = this.device;
            var scope = device.scope;

            // Set the input render target to the shader. This is the image rendered from our camera
            scope.resolve("uColorBuffer").setValue(inputTarget.colorBuffer);

            // Draw a full screen quad on the output target. In this case the output target is the screen.
            // Drawing a full screen quad will run the shader that we defined above
            pc.drawFullscreenQuad(device, outputTarget, this.vertexBuffer, this.shader, rect);
        }
    });

    return {
        RefractionPostEffect: RefractionPostEffect
    };
}());

//--------------- SCRIPT DEFINITION------------------------//
var Refraction = pc.createScript('refraction');

Refraction.attributes.add('vs', {
    type: 'asset',
    assetType: 'shader',
    title: 'Vertex Shader'
});

Refraction.attributes.add('fs', {
    type: 'asset',
    assetType: 'shader',
    title: 'Fragment Shader'
});

// initialize code called once per entity
Refraction.prototype.initialize = function () {
    var effect = new pc.RefractionPostEffect(this.app.graphicsDevice, this.vs.resource, this.fs.resource);

    // add the effect to the camera's postEffects queue
    var queue = this.entity.camera.postEffects;
    queue.addEffect(effect);

    this.effect = effect;

    // Save the current shaders for hot reload
    this.savedVS = this.vs.resource;
    this.savedFS = this.fs.resource;
};

Refraction.prototype.update = function () {
    if (this.savedFS !== this.fs.resource || this.savedVS !== this.vs.resource) {
        this.swap(this);
    }
};

Refraction.prototype.swap = function (old) {
    this.entity.camera.postEffects.removeEffect(old.effect);
    this.initialize();
};

This is just like a normal script, but we define a RefractionPostEffect class that can be applied to the camera. This needs a vertex and a fragment shader to render. The attributes are already set up, so let's create Refraction.frag with this content:

precision highp float;

uniform sampler2D uColorBuffer;
varying vec2 vUv0;

void main() {
    vec4 color = texture2D(uColorBuffer, vUv0);

    gl_FragColor = color;
}

And Refraction.vert with a basic vertex shader:

attribute vec2 aPosition;
varying vec2 vUv0;

void main(void)
{
    gl_Position = vec4(aPosition, 0.0, 1.0);
    vUv0 = (aPosition.xy + 1.0) * 0.5;
}

Now attach the Refraction.js script to the camera, and assign the shaders to the appropriate attributes. When you launch the game, you should see the scene exactly as it was before. This is a blank post effect that simply re-renders the scene. To verify that this is working, try giving the scene a red tint.

In Refraction.frag, instead of simply returning the color, try setting the red component to 1.0, which should look like the image below.
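One way to do that (a minimal variant; any change that forces the red channel to 1.0 works) is to modify the end of main() in Refraction.frag:

```glsl
// Force the red channel to full strength, keeping the other channels as rendered
color.r = 1.0;
gl_FragColor = color;
```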

Scene rendered with a red tint

Distortion Shader

We need to add a time uniform for the animated distortion, so go ahead and create one in Refraction.js, inside the constructor for the post effect:

var RefractionPostEffect = function (graphicsDevice, vs, fs) {
    var fragmentShader = "precision " + graphicsDevice.precision + " float;\n";
    fragmentShader = fragmentShader + fs;

    // this is the shader definition for our effect
    this.shader = new pc.Shader(graphicsDevice, {
        attributes: {
            aPosition: pc.SEMANTIC_POSITION
        },
        vshader: vs,
        fshader: fs
    });

    // >>>>>>>>>>>>> Initialize the time here
    this.time = 0;
};

Now, inside the render function, we pass it to our shader and increment it:

RefractionPostEffect.prototype = pc.extend(RefractionPostEffect.prototype, {
    // Every post effect must implement the render method which
    // sets any parameters that the shader might require and
    // also renders the effect on the screen
    render: function (inputTarget, outputTarget, rect) {
        var device = this.device;
        var scope = device.scope;

        // Set the input render target to the shader. This is the image rendered from our camera
        scope.resolve("uColorBuffer").setValue(inputTarget.colorBuffer);
        /// >>>>>>>>>>>>>>>>>> Pass the time uniform here
        scope.resolve("uTime").setValue(this.time);
        this.time += 0.1;

        // Draw a full screen quad on the output target. In this case the output target is the screen.
        // Drawing a full screen quad will run the shader that we defined above
        pc.drawFullscreenQuad(device, outputTarget, this.vertexBuffer, this.shader, rect);
    }
});

Now we can use the same shader code from the water distortion tutorial, making our full fragment shader look like this:

precision highp float;

uniform sampler2D uColorBuffer;
uniform float uTime;

varying vec2 vUv0;

void main() {
    vec2 pos = vUv0;

    float X = pos.x * 15. + uTime * 0.5;
    float Y = pos.y * 15. + uTime * 0.5;
    pos.y += cos(X + Y) * 0.01 * cos(Y);
    pos.x += sin(X - Y) * 0.01 * sin(Y);

    vec4 color = texture2D(uColorBuffer, pos);

    gl_FragColor = color;
}

If it all worked out, everything should now look as if it's underwater, as below.

Underwater distortion applied to the whole scene
Challenge #1: Make the distortion only apply to the bottom half of the screen.

Camera Masks

We're almost there. All we need to do now is to apply this distortion effect just on the underwater part of the screen. The most straightforward way I've come up with to do this is to re-render the scene with the water surface rendered as a solid white, as shown below.

Water surface rendered as a solid white to act as a mask

This would be rendered to a texture that would act as a mask. We would then pass this texture to our refraction shader, which would only distort a pixel in the final image if the corresponding pixel in the mask is white.

Let's add a boolean attribute on the water surface to know if it's being used as a mask. Add this to Water.js:

Water.attributes.add('isMask', {type: 'boolean', title: "Is Mask?"});

We can then pass it to the shader with material.setParameter('isMask', this.isMask); as usual. Then declare it in Water.frag and set the color to white if it's true.

// Declare the new uniform at the top
uniform bool isMask;

// At the end of the main function, override the color to be white
// if the mask is true
if (isMask) {
    color = vec4(1.0);
}

Confirm that this works by toggling the "Is Mask?" property in the editor and relaunching the game. It should look white, as in the earlier image.

Now, to re-render the scene, we need a second camera. Create a new camera in the editor and call it CameraMask. Duplicate the Water entity in the editor as well, and call it WaterMask. Make sure "Is Mask?" is false on the Water entity and true on WaterMask.

To tell the new camera to render to a texture instead of the screen, create a new script called CameraMask.js and attach it to the new camera. We create a RenderTarget to capture this camera's output like this:

// initialize code called once per entity
CameraMask.prototype.initialize = function () {
    // Create a 512x512x24-bit render target with a depth buffer
    var colorBuffer = new pc.Texture(this.app.graphicsDevice, {
        width: 512,
        height: 512,
        format: pc.PIXELFORMAT_R8_G8_B8,
        autoMipmap: true
    });
    colorBuffer.minFilter = pc.FILTER_LINEAR;
    colorBuffer.magFilter = pc.FILTER_LINEAR;
    var renderTarget = new pc.RenderTarget(this.app.graphicsDevice, colorBuffer, {
        depth: true
    });

    this.entity.camera.renderTarget = renderTarget;
};

Now, if you launch, you'll see this camera is no longer rendering to the screen. We can grab the output of its render target in Refraction.js like this:

Refraction.prototype.initialize = function () {
    var cameraMask = this.app.root.findByName('CameraMask');
    var maskBuffer = cameraMask.camera.renderTarget.colorBuffer;

    var effect = new pc.RefractionPostEffect(this.app.graphicsDevice, this.vs.resource, this.fs.resource, maskBuffer);

    // ...
    // The rest of this function is the same as before
};

Notice that I pass this mask texture as an argument to the post effect constructor. We need to create a reference to it in our constructor, so it looks like:

//// Added an extra argument on the line below
var RefractionPostEffect = function (graphicsDevice, vs, fs, buffer) {
    var fragmentShader = "precision " + graphicsDevice.precision + " float;\n";
    fragmentShader = fragmentShader + fs;

    // this is the shader definition for our effect
    this.shader = new pc.Shader(graphicsDevice, {
        attributes: {
            aPosition: pc.SEMANTIC_POSITION
        },
        vshader: vs,
        fshader: fs
    });

    this.time = 0;
    //// <<<<<<<<<<<<< Saving the buffer here
    this.buffer = buffer;
};

Finally, in the render function, pass the buffer to our shader with:

scope.resolve("uMaskBuffer").setValue(this.buffer);

To verify that this is all working, I'll leave you with a challenge.

Challenge #2: Render the uMaskBuffer to the screen to confirm it is the output of the second camera.

One thing to be aware of is that the render target is set up in the initialize of CameraMask.js, and that needs to be ready by the time Refraction.js is called. If the scripts run the other way around, you'll get an error. To make sure they run in the right order, drag the CameraMask to the top of the entity list in the editor, as shown below.

PlayCanvas editor with CameraMask at top of entity list

The second camera should always be looking at the same view as the original one, so let's make it always follow its position and rotation in the update of CameraMask.js:

CameraMask.prototype.update = function (dt) {
    var pos = this.CameraToFollow.getPosition();
    var rot = this.CameraToFollow.getRotation();
    this.entity.setPosition(pos.x, pos.y, pos.z);
    this.entity.setRotation(rot);
};

And define CameraToFollow in the initialize:

this.CameraToFollow = this.app.root.findByName('Camera');

Culling Masks

Both cameras are currently rendering the same thing. We want the mask camera to render everything except the real water, and we want the real camera to render everything except the mask water.

To do this, we can use the camera's culling bit mask. This works similarly to collision masks, if you've ever used those. An object will be culled (not rendered) if the result of a bitwise AND between its mask and the camera's mask is zero.

Let's say the Water will have bit 2 set, and WaterMask will have bit 3. Then the real camera needs to have all bits set except for 3, and the mask camera needs to have all bits set except for 2. An easy way to say "all bits except N" is to do:

~(1 << N) >>> 0

You can read more about bitwise operators here.

To set up the camera culling masks, we can put this inside CameraMask.js's initialize at the bottom:

// Set all bits except for 2
this.entity.camera.camera.cullingMask &= ~(1 << 2) >>> 0;
// Set all bits except for 3
this.CameraToFollow.camera.camera.cullingMask &= ~(1 << 3) >>> 0;
// If you want to print out this bit mask, try:
// console.log((this.CameraToFollow.camera.camera.cullingMask >>> 0).toString(2));

Now, in Water.js, set the Water mesh's mask on bit 2, and the mask version of it on bit 3:

// Put this at the bottom of the initialize of Water.js

// Set the culling masks
var bit = this.isMask ? 3 : 2;
meshInstance.mask = 0;
meshInstance.mask |= (1 << bit);

Now, one view will have the normal water, and the other will have the solid white water. The left half of the image below is the view from the original camera, and the right half is from the mask camera.

Split view of mask camera and original camera

Applying the Mask

One final step now! We know the areas underwater are marked with white pixels. We just need to check whether we're at a white pixel and, if we're not, turn off the distortion in Refraction.frag:

// Make sure "uniform sampler2D uMaskBuffer;" is declared at the top of the shader

// Check original position as well as new distorted position
vec4 maskColor = texture2D(uMaskBuffer, pos);
vec4 maskColor2 = texture2D(uMaskBuffer, vUv0);
// We're not at a white pixel?
if (maskColor != vec4(1.0) || maskColor2 != vec4(1.0)) {
    // Return it back to the original position
    pos = vUv0;
}

And that should do it!

One thing to note is that since the texture for the mask is initialized on launch, if you resize the window at runtime, it will no longer match the size of the screen.

Anti-Aliasing

As an optional clean-up step, you might have noticed that edges in the scene now look a little sharp. This is because when we applied our post effect, we lost anti-aliasing. 

We can apply anti-aliasing on top of our effect as another post-process effect. Luckily, there's one available in the PlayCanvas store we can just use. Go to the script asset page, click the big green download button, and choose your project from the list that appears. The script will appear in the root of your asset window as posteffect-fxaa.js. Just attach it to the Camera entity, and your scene should look a little nicer!

Final Thoughts

If you've made it this far, give yourself a pat on the back! We covered a lot of techniques in this series. You should now be comfortable with vertex shaders, rendering to textures, applying post-processing effects, selectively culling objects, using the depth buffer, and working with blending and transparency. Even though we implemented all this in PlayCanvas, these are general graphics concepts you'll find in some form on whatever platform you end up using.

All these techniques are also applicable to a variety of other effects. One particularly interesting application I've found of vertex shaders is in this talk on the art of Abzu, where they explain how they used vertex shaders to efficiently animate tens of thousands of fish on screen.

You should now also have a nice water effect you can apply to your games! You could easily customize it now that you've put together every detail yourself. There's still a lot more you can do with water (I haven't even mentioned any sort of reflection at all). Below are a couple of ideas.

Noise-Based Waves

Instead of simply animating the waves with a combination of sine and cosines, you can sample a noise texture to make the waves look a bit more natural and unpredictable.

Dynamic Foam Trails

Instead of completely static water lines on the surface, you could draw onto that texture when objects move, to create a dynamic foam trail. There are a lot of ways to go about doing this, so this could be its own project.

Source Code

You can find the finished hosted PlayCanvas project here. A Three.js port is also available in this repository.

