    Min-Jae, Lee authored
    WebGL Tutorial

    Implement Deferred Rendering with Render to Texture and Multiple Render Targets in WebGL2


    Prerequisites

    • Basic knowledge of WebGL, at least enough to render a colored triangle
    • A browser that supports WebGL2

    In this tutorial, to build the G-buffers (a set of framebuffer attachments that store discrete information from the same geometry) in a single rendering pass, we use MRT (Multiple Render Targets).

    What is included in this tutorial

    • Render to texture
    • Multiple render targets
    • Deferred Rendering

    Define DOM & Load script from HTML

    <canvas id="view"></canvas>
    <script src="script.js"></script>

    In the body, define the canvas node and load the script at the end of the body, to ensure the DOM defined in the HTML is already loaded.

    Get canvas DOM from JavaScript and initialize WebGL2

    const dom = document.querySelector('canvas#view')

    Use document.querySelector to select the canvas.

    if ('undefined' === typeof WebGL2RenderingContext)
      throw new Error('This tutorial required WebGL version 2.');
    const gl = dom.getContext('webgl2');
    if (!gl)
      throw new Error('Failed WebGL Initialization.');

    Check the availability of WebGL2RenderingContext, then get the WebGL2 context.

    Create shader program

    const p = gl.createProgram();
    gl.attachShader(p, shaderFromCode(gl, gl.VERTEX_SHADER, vert));
    gl.attachShader(p, shaderFromCode(gl, gl.FRAGMENT_SHADER, frag));
    gl.linkProgram(p);
    if (!gl.getProgramParameter(p, gl.LINK_STATUS)) {
      throw new Error('Failed: link program.\n' + gl.getProgramInfoLog(p));
    }

    You can replace vert and frag with your own shader code. This tutorial uses two shader programs, one per stage: first, the G-buffer (Geometry buffer) building stage; second, the deferred rendering stage.
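    The snippet above calls a shaderFromCode helper that this excerpt does not show. A minimal sketch of what such a helper typically looks like (this is an assumption, not the tutorial's own code):

```javascript
// Hypothetical helper: compile a shader from source code.
// type is gl.VERTEX_SHADER or gl.FRAGMENT_SHADER.
function shaderFromCode(gl, type, code) {
  const shader = gl.createShader(type);
  gl.shaderSource(shader, code);
  gl.compileShader(shader);
  // Surface compile errors the same way the link check does
  if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS))
    throw new Error('Failed: compile shader.\n' + gl.getShaderInfoLog(shader));
  return shader;
}
```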

    1st stage: Building G-Buffers

    In this tutorial, the target result is the position information overlaid with edge lines derived from the normals, so we will build G-buffers containing position and normal.

    GLSL Shader program

    #version 300 es
    layout(location = 0) in vec4 iPosition;
    layout(location = 1) in vec4 iNormal;
    uniform vec2 resolution;
    out vec4 vPosition;
    out vec4 vNormal;
    
    // Refer from glMatrix
    mat4 persp(in float fovy, in float aspect, in float near, in float far) {
        float f = 1. / tan(fovy / 2.), nf = 1. / (near - far);
        return mat4(
            f / aspect, 0, 0, 0,
            0, f, 0, 0,
            0, 0, (far + near) * nf, -1,
            0, 0, (far * near) * nf * 2., 0);
    }
    mat4 lookAt(in vec3 eye, in vec3 center, in vec3 up) {
        vec3 z = eye - center;
        z /= length(z);
    
        vec3 x = cross(up, z);
        x /= length(x);
    
        vec3 y = cross(z, x);
        y /= length(y);
    
        return transpose(mat4(
            x, -dot(x, eye),
            y, -dot(y, eye),
            z, -dot(z, eye),
            0,0,0,1
        ));
    }
    
    void main() {
        mat4 projection = persp(radians(30.0), resolution.x / resolution.y, .1, 1e3);
        mat4 view = lookAt(
            vec3(1, 1, 1) * 5.,
            vec3(0, 0, 0),
            vec3(0, 1, 0)
        );
        mat4 model = mat4(
             1,0,0,0
            ,0,1,0,0
            ,0,0,1,0
            ,0,0,0,1
        );
    
        gl_Position = vPosition = projection * view * model * iPosition;
        vNormal = transpose(inverse(view * model)) * iNormal;
    }

    This tutorial uses GLSL ES 3.00, so the first line is #version 300 es.

    Vertex data passed via vertexAttribPointer is declared like layout(location = 0) in vec4 iPosition;.

    Uniform data passed via uniform[1234][if] is declared like uniform vec2 resolution;.

    Varyings are declared with the keywords in and out in GLSL ES 3.00: the vertex shader's out variables are passed to the fragment shader's matching in variables.

    This vertex shader copies glMatrix's perspective projection and look-at matrix construction to build the Model View Projection transform.
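    The same projection can be built on the CPU. A JavaScript sketch mirroring the GLSL persp() above (column-major order, matching glMatrix's mat4 layout; this is for checking values, not part of the tutorial's source):

```javascript
// CPU-side mirror of the shader's persp() for sanity-checking values.
// Column-major order, as glMatrix stores its mat4.
function persp(fovy, aspect, near, far) {
  const f = 1 / Math.tan(fovy / 2), nf = 1 / (near - far);
  return [
    f / aspect, 0, 0, 0,
    0, f, 0, 0,
    0, 0, (far + near) * nf, -1,
    0, 0, 2 * far * near * nf, 0,
  ];
}

// Example: 30° vertical FOV, 16:9 aspect, near 0.1, far 1000
const m = persp(30 * Math.PI / 180, 16 / 9, 0.1, 1e3);
```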

    The important point is the varyings vPosition and vNormal, which carry the transformed (clip-space) position and the transformed normal to the fragment shader.

    #version 300 es
    precision mediump float;
    layout(location = 0) out vec4 fPosition;
    layout(location = 1) out vec4 fNormal;
    in highp vec4 vPosition;
    in highp vec4 vNormal;
    void main() {
        fPosition = vec4((vPosition.xyz / vPosition.w + 1.) / 2., 1);
        fNormal = vec4(vNormal.xyz, 1);
    }

    The fragment shader receives vPosition and vNormal, and we need to write them into the framebuffer attachments that represent the position buffer and the normal buffer.

    So layout(location = 0) out vec4 fPosition; in the fragment shader indicates which framebuffer attachment stores each piece of discrete information.

    Create Textures that will be store G-Buffers

    This tutorial uses a 2D texture array to represent the multi-layered framebuffer.

    const GBuffers = gl.createTexture();
    gl.activeTexture(gl.TEXTURE0);
    gl.bindTexture(gl.TEXTURE_2D_ARRAY, GBuffers);
    gl.texParameteri(gl.TEXTURE_2D_ARRAY, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
    gl.texParameteri(gl.TEXTURE_2D_ARRAY, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
    gl.texImage3D(
        gl.TEXTURE_2D_ARRAY,
        0,
        gl.RGBA,
        dom.clientWidth, dom.clientHeight,
        2,
        0,
        gl.RGBA,
        gl.UNSIGNED_BYTE,
        null
    );

    Before calling bindTexture, you must call activeTexture to select which texture unit the subsequent texture methods in this code, such as texParameteri and texImage3D, will operate on.

    Define texture unit 0 as a 2D texture array with:

    gl.activeTexture(gl.TEXTURE0);
    gl.bindTexture(gl.TEXTURE_2D_ARRAY, GBuffers);

    The important call that you need to focus on is:

    gl.texImage3D(
        gl.TEXTURE_2D_ARRAY,
        0,
        gl.RGBA,
        dom.clientWidth, dom.clientHeight,
        2,
        0,
        gl.RGBA,
        gl.UNSIGNED_BYTE,
        null
    );

    This call looks like ordinary texture loading, but the last argument is null: no image data is uploaded, it only defines the pixel format, data type, resolution, and number of layers.

    If you want to increase the number of G-buffer layers, just edit the 6th argument (right after the resolution), which represents the layer count.

    const GFramebuffer = gl.createFramebuffer();
    gl.bindFramebuffer(gl.FRAMEBUFFER, GFramebuffer);
    const drawBuffers = [
        gl.COLOR_ATTACHMENT0,
        gl.COLOR_ATTACHMENT1,
    ];
    for (let i = 0; i < 2; ++i)
        gl.framebufferTextureLayer(gl.DRAW_FRAMEBUFFER, drawBuffers[i], GBuffers, 0, i);
    gl.drawBuffers(drawBuffers);

    Next, connect the framebuffer with the created texture.

    Create a framebuffer with createFramebuffer, then pass the returned framebuffer to bindFramebuffer to bind it in the current context.

    The COLOR_ATTACHMENT{0-15} value passed to framebufferTextureLayer corresponds to the layout location in the fragment shader.

    framebufferTextureLayer connects the fragment shader outputs with the texture. The three trailing arguments are texture, level, and layer. Since this defines a render target, mipmapping is not used and level is 0. The key arguments are texture and layer: the texture is the one created above with gl.createTexture(), and connecting each layer to a COLOR_ATTACHMENT{0-15} lets the fragment shader write to that layer through layout(location=N).

    drawBuffers defines which attachments will be written by the draw call; at this point, we pass the array of COLOR_ATTACHMENT{0-15} values that the texture layers are connected to.
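    The attach-and-drawBuffers pattern above generalizes to any layer count. A hypothetical helper (not in the tutorial's source; it assumes the COLOR_ATTACHMENT constants are consecutive, which WebGL guarantees):

```javascript
// Hypothetical helper: attach each layer of a TEXTURE_2D_ARRAY texture
// to its own color attachment, then enable them all for drawing.
function attachGBufferLayers(gl, fbo, tex, layerCount) {
  gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
  const attachments = [];
  for (let i = 0; i < layerCount; ++i) {
    // COLOR_ATTACHMENT0..COLOR_ATTACHMENT15 are consecutive constants
    const attachment = gl.COLOR_ATTACHMENT0 + i;
    attachments.push(attachment);
    gl.framebufferTextureLayer(gl.DRAW_FRAMEBUFFER, attachment, tex, 0, i);
  }
  gl.drawBuffers(attachments);
  return attachments;
}
```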

    Draw call: Render to Texture(G-Buffers)

    gl.bindFramebuffer(gl.FRAMEBUFFER, GFramebuffer);
    gl.useProgram(program1st);
    gl.uniform2f(gl.getUniformLocation(program1st, 'resolution'), dom.clientWidth, dom.clientHeight);
    
    gl.clearColor(0, 0, 0, 0);
    gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);
    
    gl.enable(gl.DEPTH_TEST);
    gl.enableVertexAttribArray(0);
    gl.enableVertexAttribArray(1);
    gl.drawArrays(gl.TRIANGLES, 0, vertices.length / 3);
    gl.disableVertexAttribArray(0);
    gl.disableVertexAttribArray(1);

    The point of this draw call is the first line: bindFramebuffer makes the draw call write into the texture-backed framebuffer defined above.

    The code below it is the common flow you already know if you can draw a colored triangle.

    But you need to check which shader program is loaded; for safety, call useProgram to ensure the 1st stage's shader will be used.

    G-buffer status at this point

    The 1st stage's result is a rendering of position and normal information at the same time.

    So, as the result of the draw call, the 2D texture array stores the data below.

    G-Buffer: Position

    G-Buffer: Normal

    2nd stage: Deferred rendering stage

    The 2nd stage composes the discrete information generated in the 1st stage onto the display.

    GLSL Shader program

    #version 300 es
    out highp vec2 vUV;
    void main() {
        vUV = vec2((gl_VertexID / 2) ^ (gl_VertexID % 2), gl_VertexID / 2);
        gl_Position = vec4(vec2(-1, -1) + 2. * vUV, 0, 1);
    }

    This vertex shader generates a quad from gl_VertexID, which acts like the integer index of a for loop: the shader generates vUV as (0, 0), (1, 0), (1, 1), and (0, 1), so a draw call with gl.drawArrays(gl.TRIANGLE_FAN, 0, 4); draws a quad that fills the whole screen.
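    The UV formula can be checked on the CPU; this sketch (illustration only, not part of the tutorial's source) reproduces the shader's integer arithmetic:

```javascript
// Reproduce the shader's vUV = ((gl_VertexID / 2) ^ (gl_VertexID % 2), gl_VertexID / 2)
// with the same integer arithmetic on the CPU.
function quadUV(vertexID) {
  const half = Math.floor(vertexID / 2); // integer division, as in GLSL
  const u = half ^ (vertexID % 2);       // bitwise XOR picks 0 or 1
  return [u, half];
}

const uvs = [0, 1, 2, 3].map(quadUV);
// uvs: [[0,0], [1,0], [1,1], [0,1]] — the fan order of a fullscreen quad
```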

    #version 300 es
    precision mediump float;
    precision highp sampler2DArray;
    layout(location = 0) out vec4 fColor;
    uniform vec2 resolution;
    uniform sampler2DArray gBuffers;
    in highp vec2 vUV;
    
    // Sobel operator kernel refer from https://en.wikipedia.org/wiki/Sobel_operator#Formulation
    float sobel(mat3 r) {
        mat3 t = transpose(r);
        vec3 v = (r[2] - r[0] + t[2] - t[0]) * vec3(1,2,1);
        return v.x+v.y+v.z;
    }
    
    void main() {
        vec2 texel = 1. / resolution;
        mat3 normalx = mat3(
             texture(gBuffers, vec3(vUV + texel * vec2(-1,-1), 1)).x
            ,texture(gBuffers, vec3(vUV + texel * vec2( 0,-1), 1)).x
            ,texture(gBuffers, vec3(vUV + texel * vec2( 1,-1), 1)).x
    
            ,texture(gBuffers, vec3(vUV + texel * vec2(-1, 0), 1)).x
            ,texture(gBuffers, vec3(vUV + texel * vec2( 0, 0), 1)).x
            ,texture(gBuffers, vec3(vUV + texel * vec2( 1, 0), 1)).x
    
            ,texture(gBuffers, vec3(vUV + texel * vec2(-1, 1), 1)).x
            ,texture(gBuffers, vec3(vUV + texel * vec2( 0, 1), 1)).x
            ,texture(gBuffers, vec3(vUV + texel * vec2( 1, 1), 1)).x
        ), normaly = mat3(
             texture(gBuffers, vec3(vUV + texel * vec2(-1,-1), 1)).y
            ,texture(gBuffers, vec3(vUV + texel * vec2( 0,-1), 1)).y
            ,texture(gBuffers, vec3(vUV + texel * vec2( 1,-1), 1)).y
    
            ,texture(gBuffers, vec3(vUV + texel * vec2(-1, 0), 1)).y
            ,texture(gBuffers, vec3(vUV + texel * vec2( 0, 0), 1)).y
            ,texture(gBuffers, vec3(vUV + texel * vec2( 1, 0), 1)).y
    
            ,texture(gBuffers, vec3(vUV + texel * vec2(-1, 1), 1)).y
            ,texture(gBuffers, vec3(vUV + texel * vec2( 0, 1), 1)).y
            ,texture(gBuffers, vec3(vUV + texel * vec2( 1, 1), 1)).y
        ), normalz = mat3(
             texture(gBuffers, vec3(vUV + texel * vec2(-1,-1), 1)).z
            ,texture(gBuffers, vec3(vUV + texel * vec2( 0,-1), 1)).z
            ,texture(gBuffers, vec3(vUV + texel * vec2( 1,-1), 1)).z
    
            ,texture(gBuffers, vec3(vUV + texel * vec2(-1, 0), 1)).z
            ,texture(gBuffers, vec3(vUV + texel * vec2( 0, 0), 1)).z
            ,texture(gBuffers, vec3(vUV + texel * vec2( 1, 0), 1)).z
    
            ,texture(gBuffers, vec3(vUV + texel * vec2(-1, 1), 1)).z
            ,texture(gBuffers, vec3(vUV + texel * vec2( 0, 1), 1)).z
            ,texture(gBuffers, vec3(vUV + texel * vec2( 1, 1), 1)).z
        );
        float G = sobel(normalx) + sobel(normaly) + sobel(normalz);
    
        fColor = G>5e-2 ? vec4(0,0,0,1) : texture(gBuffers, vec3(vUV, 0));
    }

    uniform sampler2DArray gBuffers; samples the texture bound to unit 0 with gl.activeTexture(gl.TEXTURE0) during texture creation in this tutorial. If you use more textures in the 1st stage, you must ensure the G-buffer texture is the one passed into the 2nd stage shader program.

    The G-buffers are accessed with texture(gBuffers, vec3(vUV, 0));, where the vec3 argument represents (x in 0-1, y in 0-1, layer of the 2D texture array).

    This tutorial uses the Sobel operator, commonly used for image-space edge detection, to draw edge lines from the normal information.
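    The sobel() helper in the shader combines a horizontal and a vertical 3x3 Sobel response. The same arithmetic can be checked on the CPU with a JavaScript sketch (matrices indexed like GLSL mat3 columns; this helper is for illustration and is not part of the tutorial):

```javascript
// Mirror of the GLSL sobel(): m is a 3x3 neighborhood indexed m[column][row],
// matching how the shader fills its mat3 in column-major order.
function sobel(m) {
  const w = [1, 2, 1]; // Sobel smoothing weights
  let g = 0;
  for (let i = 0; i < 3; ++i) {
    // (r[2] - r[0]) is the column difference, (t[2] - t[0]) the row difference
    g += (m[2][i] - m[0][i] + m[i][2] - m[i][0]) * w[i];
  }
  return g;
}
```

    On a flat neighborhood the response is zero, so only normal discontinuities (edges) pass the G > 5e-2 threshold.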

    The output fColor = G>5e-2 ? vec4(0,0,0,1) : texture(gBuffers, vec3(vUV, 0)); produces the position buffer (layer 0) overlaid with edge lines generated from the normals (layer 1).

    Draw call

    gl.bindFramebuffer(gl.FRAMEBUFFER, null);
    gl.useProgram(program2nd);
    gl.uniform2f(gl.getUniformLocation(program2nd, 'resolution'), dom.clientWidth, dom.clientHeight);
    
    gl.disable(gl.DEPTH_TEST);
    gl.drawArrays(gl.TRIANGLE_FAN, 0, 4);

    To draw to the default framebuffer connected to the canvas, call gl.bindFramebuffer(gl.FRAMEBUFFER, null);.

    Only one quad is rendered here, so the depth test is not needed and you can disable it.

    Ensure the 2nd stage shader program is in use, then just call gl.drawArrays(gl.TRIANGLE_FAN, 0, 4);, since the vertex information is generated in the vertex shader.

    Result

    The result shown in the canvas is the position information with edge lines detected from the normals.


    References