
Intro to WebGL using Angular- How to Setup and Compile Shaders (Part 2)

Prerequisites for Setting up and Compiling Shaders

  • This tutorial assumes that you’ve completed part 1 (setting up a scene).

Setting up Shaders

At the end of the tutorial in part 1, I mentioned that we were going to set up shaders and a triangle in WebGL using Angular. However, we’re not going to create a triangle, we’re going to create a SQUARE instead!

In this part of the tutorial, we’re going to create shaders and compile them into our framework as part of the loading cycle for rendering content to screen.

WebGL requires two shaders each time you wish to draw something to screen. I mentioned briefly in part 1 what vertex and fragment shaders were and what they do, but for clarity here’s the description again:

  • vertex shader
    • responsible for computing vertex positions – based on the positions, WebGL can then rasterize primitives including points, lines, or triangles.
  • fragment shader
    • when primitives are being rasterized, WebGL calls the fragment shader to compute a colour for each pixel of the primitive that’s currently being drawn.

There’s a fair bit of technical reading in regards to how a vertex and fragment shader go about their business, and how they function in unison to rasterize and colour objects on screen.

One thing to know is that the language both shaders are written in is called the OpenGL Shading Language (GLSL). It has features that aren’t common in JavaScript and is specialised for the math commonly needed to rasterise graphics.

Further reading is available here: WebGL Shaders and GLSL

Creating Fragment and Vertex Shaders

Let’s define two files in our Assets folder:

  • toucan-fragment-shader.glsl
  • toucan-vertex-shader.glsl

Populate toucan-fragment-shader.glsl with the following:

varying lowp vec4 vColor;
void main(void) {
    gl_FragColor = vColor;
}

The above code assigns a color to gl_FragColor from vColor to be presented on screen. vColor is assigned a value in the vertex shader below. We will go more in-depth on how this process occurs soon.

Populate toucan-vertex-shader.glsl with the following:

attribute vec4 aVertexPosition;
attribute vec4 aVertexColor;
uniform mat4 uModelViewMatrix;
uniform mat4 uProjectionMatrix;
varying lowp vec4 vColor;
void main(void) {
    gl_Position = uProjectionMatrix * uModelViewMatrix * aVertexPosition;
    vColor = aVertexColor;
}

The above code computes a gl_Position by multiplying the projection matrix, model view matrix and the current vertex’s position. It also assigns vColor a color from aVertexColor which is computed from our app as part of rendering. Again, more on this process later.

Loading the shaders in Typescript

We now have two .glsl files, which we need to load as strings into our Angular app. First, though, we need to install a few packages to enable this.

Since we’re using an Angular project, we need to extend angular-cli’s existing webpack configuration and add a loader that will compile and load .glsl shaders on demand.

Install the packages below:

npm i ts-loader --save-dev
npm i ts-shader-loader --save-dev
npm i @angular-builders/custom-webpack --save-dev

ts-loader is the typescript loader for webpack.
ts-shader-loader is a GLSL shader loader for webpack.
@angular-builders/custom-webpack is a framework that allows customizing build configuration without ejecting webpack configuration.

Once you’ve downloaded these packages into your project, you’ll need to do the following:

  • add a glsl.d.ts file to the src folder of the project and populate it with the following code:
declare module '*.glsl' {
  const value: string; 
  export default value;
}


This declaration will identify all .glsl files as a module where the value exported is the file itself, as a string.

This means that we can now do the following in our project:

import fragmentShaderSrc from '../../../assets/toucan-fragment-shader.glsl';
import vertexShaderSrc from '../../../assets/toucan-vertex-shader.glsl';

And the variables fragmentShaderSrc and vertexShaderSrc are immediately available as strings.

  • Next, create a webpack.config.js file at the root directory of the solution.

This additional webpack.config.js augments the webpack configuration that angular-cli already uses under the hood, so that .glsl files are loaded correctly.

Populate the webpack.config.js file with the following:

module.exports = {
    module: {
      rules: [
        // all files with a `.ts` or `.tsx` extension will be handled by `ts-loader`
        { test: /\.tsx?$/, loader: "ts-loader" },
        { test: /\.(glsl|vs|fs)$/, loader: "ts-shader-loader" }
      ]
    }
  };

The configuration above tells webpack which loader to use based on file type: ts-loader for TypeScript files and ts-shader-loader for shader files.

This is what allows us to compile and use glsl files via import statements, as described earlier.

We still need to do one more thing: update the angular.json configuration to make use of our extra webpack.config.js file.

Open angular.json, navigate to the "serve" definition of the config file and update:

"builder": "@angular-devkit/build-angular:dev-server" 

to

"builder": "@angular-builders/custom-webpack:dev-server"


Next, navigate to the "build" definition of the config file and update:

"builder": "@angular-devkit/build-angular:browser"

to

"builder": "@angular-builders/custom-webpack:browser"

Finally, add a customWebpackConfig property in the "options" definition with the following:

"options": {
  "customWebpackConfig": {
    "path": "./webpack.config.js"
  },
  ...
}
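Putting the three changes together, the relevant portion of angular.json ends up looking something like this (your project’s other options and names will differ; only the builder values and the customWebpackConfig entry matter here):

```json
"architect": {
  "build": {
    "builder": "@angular-builders/custom-webpack:browser",
    "options": {
      "customWebpackConfig": {
        "path": "./webpack.config.js"
      }
    }
  },
  "serve": {
    "builder": "@angular-builders/custom-webpack:dev-server"
  }
}
```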


Loading and Compiling the Shaders into WebGL

Great, we can now load shader scripts into our app using TypeScript, and Angular will compile them successfully. Now it’s time to load our shaders into memory so they can be associated with our WebGL context.

We need to do a few things first:

  1. Determine if the shader type we’re loading is supported
  2. Load the shaders into memory
  3. Check to see if the shaders were compiled/loaded successfully
  4. Finally, create a WebGLProgram, associate the shaders to it, and return the result.

For step 1, create a method called determineShaderType(shaderType: string): number.
Within this method, we simply check whether the type we supply matches the known MIME types for a vertex or fragment shader, like so:

private determineShaderType(shaderMimeType: string): number {
  if (shaderMimeType) {
    if (shaderMimeType === 'x-shader/x-vertex') {
      return this.gl.VERTEX_SHADER;
    } else if (shaderMimeType === 'x-shader/x-fragment') {
      return this.gl.FRAGMENT_SHADER;
    } else {
      console.log('Error: could not determine the shader type');
    }
  }
  return -1;
}

For step 2, create a method called loadShader(shaderSource: string, shaderType: string): WebGLShader.
We’ll create a shader based on the shader type (using the code determined from step 1), load the shader source code into it and compile it. Once it’s compiled, we run a check to see whether compilation succeeded and return the result.
e.g.

private loadShader(shaderSource: string, shaderType: string): WebGLShader {
  const shaderTypeAsNumber = this.determineShaderType(shaderType);
  if (shaderTypeAsNumber < 0) {
    return null;
  }
  // Create the gl shader
  const glShader = this.gl.createShader(shaderTypeAsNumber);
  // Load the source into the shader
  this.gl.shaderSource(glShader, shaderSource);
  // Compile the shader
  this.gl.compileShader(glShader);
  // Check the compile status and only return the shader if it succeeded
  return this.checkCompiledShader(glShader) ? glShader : null;
}

For step 3, create a method called checkCompiledShader(shader: WebGLShader): boolean.
This checks that we have an instance of a shader and queries its compile status. If compilation failed, it logs the error information for the shader we attempted to load into memory, deletes the shader and returns false. It returns true otherwise.
e.g.

private checkCompiledShader(shader: WebGLShader): boolean {
  if (!shader) {
    return false;
  }
  // Check the compile status
  const compiled = this.gl.getShaderParameter(shader, this.gl.COMPILE_STATUS);
  if (!compiled) {
    // shader failed to compile, get the last error
    const lastError = this.gl.getShaderInfoLog(shader);
    console.log("couldn't compile the shader due to: " + lastError);
    this.gl.deleteShader(shader);
    return false;
  }
  return true;
}

For step 4, create a method called initializeShaders(): WebGLProgram.

We’ll use this method to do the following:

  1. Create a WebGLProgram
  2. Compile the vertex and fragment shader scripts we defined earlier
  3. Attach the compiled vertex and fragment shaders to the WebGLProgram using our WebGLContext
  4. Link our WebGLContext to the WebGLProgram
  5. Do a check to ensure that the shaders have been loaded successfully
  6. Return the resultant WebGLProgram

e.g.

initializeShaders(): WebGLProgram {
    // 1. Create the shader program
    let shaderProgram = this.gl.createProgram();
    // 2. compile the shaders
    const compiledShaders = [];
    let fragmentShader = this.loadShader(
      fragmentShaderSrc,
      'x-shader/x-fragment'
    );
    let vertexShader = this.loadShader(
      vertexShaderSrc,
      'x-shader/x-vertex'
    );
    compiledShaders.push(fragmentShader);
    compiledShaders.push(vertexShader);
    // 3. attach the shaders to the shader program using our WebGLContext
    if (compiledShaders && compiledShaders.length > 0) {
      for (let i = 0; i < compiledShaders.length; i++) {
        const compiledShader = compiledShaders[i];
        if (compiledShader) {
          this.gl.attachShader(shaderProgram, compiledShader);
        }
      }
    }
    // 4. link the shader program to our gl context
    this.gl.linkProgram(shaderProgram);
    // 5. check if everything went ok
    if (!this.gl.getProgramParameter(shaderProgram, this.gl.LINK_STATUS)) {
      console.log(
        'Unable to initialize the shader program: ' +
          this.gl.getProgramInfoLog(shaderProgram)
      );
    }
    // 6. return shader
    return shaderProgram;
}

Finally, back in initialiseWebGLContext(canvas: HTMLCanvasElement): any, add the call to initializeShaders() at the end of it.

It should now look like this:

initialiseWebGLContext(canvas: HTMLCanvasElement): any {
  // Try to grab the standard context. If it fails, fallback to experimental.
  this.renderingContext =
    canvas.getContext('webgl') || canvas.getContext('experimental-webgl');
  // If we don't have a GL context, give up now... only continue if WebGL is available and working...
  if (!this.gl) {
    alert('Unable to initialize WebGL. Your browser may not support it.');
    return;
  }
  this.setWebGLCanvasDimensions(canvas);
  this.initialiseWebGLCanvas();
  // initialise shaders into WebGL
  let shaderProgram = this.initializeShaders();
}

Creating ProgramInfo for Shaders

Yay, we’ve managed to compile and initialise shaders within WebGL. But we’re still only halfway to actually getting something rendering on screen.

We’ve currently defined a means of displaying and colouring content, but we haven’t created the content to render, nor have we created the means to bind and supply the necessary info to our GPU to actually render the content.

The next step to getting something displaying on screen is to create an object which will contain a definition of our shaderProgram and reference the shader information we’ve exposed in our .glsl files (attribs and uniforms).

This object is typically called ProgramInfo and describes the shader program to use, and the attribute and uniform locations that we want our shader program to be aware of when rendering content.

First, define the following variables at the top of the WebGLService class so we can reference them throughout the article:

/**
 * Gets the {@link gl.canvas} as a {@link Element} client.
 */
private get clientCanvas(): Element {
  return this.gl.canvas as Element
}
private fieldOfView = (45 * Math.PI) / 180; // in radians
private aspect = 1;
private zNear = 0.1;
private zFar = 100.0;
private projectionMatrix = matrix.mat4.create();
private modelViewMatrix = matrix.mat4.create();
private buffers: any
private programInfo: any

We’ll cover the details of these variables more later.
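One quick aside on the fieldOfView value above: the (45 * Math.PI) / 180 expression is just a degrees-to-radians conversion, since the perspective matrix math works in radians. A throwaway sketch (the helper name is mine, not part of the service):

```typescript
// Hypothetical helper illustrating the fieldOfView calculation:
// convert degrees into the radians the perspective matrix expects.
function degreesToRadians(degrees: number): number {
  return (degrees * Math.PI) / 180;
}

const fieldOfView = degreesToRadians(45); // ≈ 0.7854 radians (π / 4)
```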

Now, at the bottom of initialiseWebGLContext(canvas: HTMLCanvasElement): any, add the following implementation:

this.programInfo = {
  program: shaderProgram,
  attribLocations: {
    vertexPosition: this.gl.getAttribLocation(
      shaderProgram,
      'aVertexPosition'
    ),
    vertexColor: this.gl.getAttribLocation(shaderProgram, 'aVertexColor'),
  },
  uniformLocations: {
    projectionMatrix: this.gl.getUniformLocation(
      shaderProgram,
      'uProjectionMatrix'
    ),
    modelViewMatrix: this.gl.getUniformLocation(
      shaderProgram,
      'uModelViewMatrix'
    ),
  },
};

Notice that aVertexPosition, aVertexColor, uProjectionMatrix and uModelViewMatrix were defined in the vertex shader we wrote earlier as:

attribute vec4 aVertexPosition;
attribute vec4 aVertexColor;
uniform mat4 uModelViewMatrix;
uniform mat4 uProjectionMatrix;

We’ve now referenced them within attribLocations and uniformLocations respectively.

Now what? Let’s create some buffers and data so we actually have content to render.
Then, we’ll set up a rendering scene to provide the foundation to render content.

Creating Content to Render (Buffers)

Let’s create a new method called initialiseBuffers(): any.

e.g.

initialiseBuffers(): any {
}

We’ll create two buffers to render content with:

  • one to store positional data (where to render)
  • the second to store colour data (what colour to render)

We’re going to keep things simple and limit our buffers to providing data on rendering a simple 2D square.

Initialising a buffer in WebGL is pretty simple: just call gl.createBuffer().

There are many different types of buffers available in WebGL. As such, we need to tell WebGL what we want to do with this buffer, how it should interpret it, and provide it the data to bind to.

Implement the following in initialiseBuffers():

// Create a buffer for the square's positions.
const positionBuffer = this.gl.createBuffer();
// bind the buffer to WebGL and tell it to accept an ARRAY of data
this.gl.bindBuffer(this.gl.ARRAY_BUFFER, positionBuffer);
// create an array of positions for the square.
const positions = new Float32Array([
   1.0,  1.0, 
  -1.0,  1.0, 
   1.0, -1.0, 
  -1.0, -1.0
]);
// Pass the list of positions into WebGL to build the
// shape. We do this by creating a Float32Array from the
// array, then use it to fill the current buffer.
// We tell WebGL that the data supplied is an ARRAY and
// to handle the data as a statically drawn shape.
this.gl.bufferData(
  this.gl.ARRAY_BUFFER,
  positions,
  this.gl.STATIC_DRAW
);

As you’ve probably noticed, bindBuffer tells WebGL which buffer we want to provide data for. Defining, binding and supplying buffer data is a procedural sequence. Whenever you’re adding buffers to WebGL, you need to stick to this order to ensure that the data you create is handled and assigned properly, which allows WebGL to render it correctly.

This approach isn’t limited to WebGL; it’s typical of OpenGL in general, and it’s all about memory management. It’s good practice to be strict when creating, binding and applying buffer data so as not to introduce memory leaks. It also encourages developers to build their apps around the available APIs.

But enough of that, lets create another buffer to store colour data.

// Set up the colors for the vertices
const colors = new Float32Array([
  1.0, 1.0, 1.0, 1.0, // white
  1.0, 0.0, 0.0, 1.0, // red
  0.0, 1.0, 0.0, 1.0, // green
  0.0, 0.0, 1.0, 1.0, // blue
]);
const colorBuffer = this.gl.createBuffer();
this.gl.bindBuffer(this.gl.ARRAY_BUFFER, colorBuffer);
this.gl.bufferData(
  this.gl.ARRAY_BUFFER,
  colors,
  this.gl.STATIC_DRAW
);

Pretty much the same deal with the colour buffer; create, bind and apply.

You’ll also notice that colours are defined as R, G, B, A values in the 0 to 1 range.

Tip: You can look up any typical RGB colour and divide each number by 255 to get it into a 0 to 1 range.

E.g. RGB – 100/255, 32/255, 178/255 ≈ 0.39, 0.13, 0.70 (approximately)

Finally, we can return the position and color buffers back as the result of this function.

return {
  position: positionBuffer,
  color: colorBuffer,
};

The final method should look like this:

initialiseBuffers(): any {
  // Create a buffer for the square's positions.
  const positionBuffer = this.gl.createBuffer();
  // bind the buffer to WebGL and tell it to accept an ARRAY of data
  this.gl.bindBuffer(this.gl.ARRAY_BUFFER, positionBuffer);
  // create an array of positions for the square.
  const positions = new Float32Array([
    1.0,  1.0, 
    -1.0,  1.0, 
    1.0, -1.0, 
    -1.0, -1.0
  ]);
  // set the list of positions into WebGL to build the
  // shape by passing it into bufferData.
  // We tell WebGL that the data supplied is an ARRAY and
  // to handle the data as a statically drawn shape.
  this.gl.bufferData(
    this.gl.ARRAY_BUFFER,
    positions,
    this.gl.STATIC_DRAW
  );
  // Set up the colors for the vertices
  const colors = new Float32Array([
    1.0, 1.0, 1.0, 1.0, // white
    1.0, 0.0, 0.0, 1.0, // red
    0.0, 1.0, 0.0, 1.0, // green
    0.0, 0.0, 1.0, 1.0, // blue
  ]);
  const colorBuffer = this.gl.createBuffer();
  this.gl.bindBuffer(this.gl.ARRAY_BUFFER, colorBuffer);
  this.gl.bufferData(
    this.gl.ARRAY_BUFFER,
    colors,
    this.gl.STATIC_DRAW
  );
  return {
    position: positionBuffer,
    color: colorBuffer,
  };
}

Go back to initialiseWebGLContext(canvas: HTMLCanvasElement): any and add in the call to initialiseBuffers() underneath programInfo like so:

// set up programInfo for buffers
this.programInfo = {
  ...
};
// initialise the buffers to define what we want to draw
this.buffers = this.initialiseBuffers();

Preparing the Scene for Rendering

To prepare the scene for rendering, we need to do the following:

  • Resize the WebGL canvas based on the browser’s size
  • Update the WebGL canvas to handle displaying content based on the browser’s size
  • Bind vertex position data
  • Bind vertex colour data
  • Tell WebGL to use the shader program for rendering
  • Set the vertex shader’s uniform matrices to be in sync with the projection and model-view matrices we’ve configured

The first two methods we will create will ensure that all content that is rendered on screen is correctly positioned and that the perspective of viewing content is maintained correctly whenever the browser’s size is changed on the fly.

Let’s create a method called resizeWebGLCanvas()

resizeWebGLCanvas() {
  const width = this.clientCanvas.clientWidth;
  const height = this.clientCanvas.clientHeight;
  if (this.gl.canvas.width !== width || this.gl.canvas.height !== height) {
    this.gl.canvas.width = width;
    this.gl.canvas.height = height;
  }
}

In the method above, we check the client canvas (the HTML canvas element), and if its width and height don’t match the WebGL canvas’s, we update the latter so that they do.

Next, create the updateWebGLCanvas() method:

updateWebGLCanvas() {
    this.initialiseWebGLCanvas();
    this.aspect = this.clientCanvas.clientWidth / this.clientCanvas.clientHeight;
    this.projectionMatrix = matrix.mat4.create();
    matrix.mat4.perspective(
      this.projectionMatrix,
      this.fieldOfView,
      this.aspect,
      this.zNear,
      this.zFar
    );
    // Set the drawing position to the "identity" point, which is the center of the scene.
    this.modelViewMatrix = matrix.mat4.create();
  }

Here, we make a call to initialiseWebGLCanvas() to ensure that the canvas is in a default state before updating it for rendering.
Next, we use the variables we defined earlier to configure a perspective projection matrix, which establishes the boundaries within which rendered content is viewed for the scene (effectively setting up a camera view).

Our field of view is 45 degrees, with an aspect ratio taken from the canvas’s current width and height. We only want to see objects between 0.1 units and 100 units away from the camera. The perspective projection matrix is a special matrix used to simulate the distortion of perspective in a camera.

We also reset the model-view matrix to what is known as an identity matrix. This places the camera at the center of the scene.
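For reference, the identity matrix returned by matrix.mat4.create() is a 4x4 matrix with 1s on the diagonal and 0s everywhere else, stored as a flat 16-element Float32Array in column-major order. A minimal sketch of what that looks like (not using gl-matrix itself):

```typescript
// Sketch: build the same identity matrix that mat4.create() returns —
// a flat, column-major 4x4 with 1s at flat indices 0, 5, 10 and 15.
function createIdentityMat4(): Float32Array {
  const m = new Float32Array(16); // Float32Array initialises to all zeros
  m[0] = 1;
  m[5] = 1;
  m[10] = 1;
  m[15] = 1;
  return m;
}
```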

Methods to Bind Vertex and Colour Buffers

The next two methods will reference the vertex position and colour buffers we created earlier and tell WebGL how to consume them in order to render the data.

First, create a method to bind vertex positions: bindVertexPosition(programInfo: any, buffers: any).

bindVertexPosition(programInfo: any, buffers: any) {
  const bufferSize = 2;
  const type = this.gl.FLOAT;
  const normalize = false;
  const stride = 0;
  const offset = 0;
  this.gl.bindBuffer(this.gl.ARRAY_BUFFER, buffers.position);
  this.gl.vertexAttribPointer(
    programInfo.attribLocations.vertexPosition,
    bufferSize,
    type,
    normalize,
    stride,
    offset
  );
  this.gl.enableVertexAttribArray(programInfo.attribLocations.vertexPosition);
}

We bind the position buffer we created before and then tell WebGL how it should consume the position buffer.

Note that bufferSize is 2: we’re telling WebGL that the ARRAY_BUFFER we’ve bound needs to be interpreted two elements at a time. We don’t want to normalize the data in any way, so we set that to false, and we set the stride and offset to 0 so the buffer is read from start to finish completely.

Remember earlier we defined the position data like so:

const positions = new Float32Array([
   1.0,  1.0, 
  -1.0,  1.0, 
   1.0, -1.0, 
  -1.0, -1.0
]);

We want WebGL to interpret the array in twos, with each pair defining an (x, y) coordinate before proceeding to the next pair in the array.

By doing this, we set the top-right, top-left, bottom-right and bottom-left corners of the square.

The vertexAttribPointer is set up to feed the vertex shader’s aVertexPosition via the programInfo.attribLocations.vertexPosition we defined earlier.

At the end of the method, we tell WebGL to enable the vertex attribute array for rendering via enableVertexAttribArray(...).

Next, create a method to bind the colour buffer: bindVertexColor(programInfo: any, buffers: any).

bindVertexColor(programInfo: any, buffers: any) {
  const bufferSize = 4;
  const type = this.gl.FLOAT;
  const normalize = false;
  const stride = 0;
  const offset = 0;
  this.gl.bindBuffer(this.gl.ARRAY_BUFFER, buffers.color);
  this.gl.vertexAttribPointer(
    programInfo.attribLocations.vertexColor,
    bufferSize,
    type,
    normalize,
    stride,
    offset
  );
  this.gl.enableVertexAttribArray(programInfo.attribLocations.vertexColor);
}

We do much the same for the colour buffer as we did for the vertex position, except that bufferSize is set to 4 instead of 2.

This is because colour data is interpreted in RGBA format, which has 4 components per vertex.

Here’s the colours we defined earlier for reference:

const colors = new Float32Array([
  1.0, 1.0, 1.0, 1.0, // white
  1.0, 0.0, 0.0, 1.0, // red
  0.0, 1.0, 0.0, 1.0, // green
  0.0, 0.0, 1.0, 1.0, // blue
]);

Putting it All Together to Prepare the Scene

Let’s now create another method called prepareScene() and tie everything together.

prepareScene() {
  this.resizeWebGLCanvas();
  this.updateWebGLCanvas();
  // move the camera position a bit backwards to a position where 
  // we can observe the content that will be drawn from a distance
  matrix.mat4.translate(
    this.modelViewMatrix, // destination matrix
    this.modelViewMatrix, // matrix to translate
    [0.0, 0.0, -6.0]      // amount to translate
  );
  // tell WebGL how to pull out the positions from the position
  // buffer into the vertexPosition attribute
  this.bindVertexPosition(this.programInfo, this.buffers);
  // tell WebGL how to pull out the colors from the color buffer
  // into the vertexColor attribute.
  this.bindVertexColor(this.programInfo, this.buffers);
  // tell WebGL to use our program when drawing
  this.gl.useProgram(this.programInfo.program);
  // set the shader uniforms
  this.gl.uniformMatrix4fv(
    this.programInfo.uniformLocations.projectionMatrix,
    false,
    this.projectionMatrix
  );
  this.gl.uniformMatrix4fv(
    this.programInfo.uniformLocations.modelViewMatrix,
    false,
    this.modelViewMatrix
  );
}

We call resizeWebGLCanvas() and this.updateWebGLCanvas() to ensure the canvas and WebGL canvas are matched.

Next, we setup the model-view matrix by performing a translation on it. We move it six units backwards along the Z axis. This allows us to observe any content that we draw on the scene at a distance.
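To see what that translation actually does to the matrix, here’s a minimal sketch assuming gl-matrix’s column-major layout (the helper is illustrative only; the real call is matrix.mat4.translate):

```typescript
// Sketch: translating an identity matrix by [tx, ty, tz] writes the
// offsets into the last column (flat indices 12, 13 and 14).
function translatedIdentity(tx: number, ty: number, tz: number): Float32Array {
  const m = new Float32Array(16);
  m[0] = 1; m[5] = 1; m[10] = 1; m[15] = 1; // identity
  m[12] = tx;
  m[13] = ty;
  m[14] = tz;
  return m;
}

const modelView = translatedIdentity(0.0, 0.0, -6.0);
// modelView[14] is -6: content is pushed six units away along the Z axis
```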

We then tell WebGL how to bind the vertex position and colour data via bindVertexPosition and bindVertexColor and then tell WebGL to use the shader program we built earlier.

Finally, we set the projection and model-view uniforms in our vertex shader from the projectionMatrix and modelViewMatrix maintained within this service, so the shader’s matrices stay in sync with them.

Go back to initialiseWebGLContext(...) method, update its signature to return WebGLRenderingContext, make the call to prepareScene() at the end of the method and then return the this.gl context.

The method should now look like this:

initialiseWebGLContext(canvas: HTMLCanvasElement): WebGLRenderingContext {
  // Try to grab the standard context. If it fails, fallback to experimental.
  this.renderingContext =
    canvas.getContext('webgl') || canvas.getContext('experimental-webgl');
  // If we don't have a GL context, give up now... only continue if WebGL is available and working...
  if (!this.gl) {
    alert('Unable to initialize WebGL. Your browser may not support it.');
    return;
  }
  this.setWebGLCanvasDimensions(canvas);
  this.initialiseWebGLCanvas();
  // initialise shaders into WebGL
  let shaderProgram = this.initializeShaders();
  // set up programInfo for buffers
  this.programInfo = {
    program: shaderProgram,
    attribLocations: {
      vertexPosition: this.gl.getAttribLocation(
        shaderProgram,
        'aVertexPosition'
      ),
      vertexColor: this.gl.getAttribLocation(shaderProgram, 'aVertexColor'),
    },
    uniformLocations: {
      projectionMatrix: this.gl.getUniformLocation(
        shaderProgram,
        'uProjectionMatrix'
      ),
      modelViewMatrix: this.gl.getUniformLocation(
        shaderProgram,
        'uModelViewMatrix'
      ),
    },
  };
  // initialise the buffers to define what we want to draw
  this.buffers = this.initialiseBuffers();
  // prepare the scene to display content
  this.prepareScene();
  return this.gl;
}

Displaying the Square!!

Head over to the scene.component.ts that we created in part 1.

Import the interval function from rxjs:

import { interval } from 'rxjs';

Define the two private class variables:

export class SceneComponent implements OnInit, AfterViewInit {
  ...
  private _60fpsInterval = 16.666666666666666667;
  private gl: WebGLRenderingContext
}

Create a method called drawScene() and implement the following:

private drawScene() {
  // prepare the scene and update the viewport
  this.webglService.updateViewport();
  this.webglService.prepareScene();
  // draw the scene
  const offset = 0;
  const vertexCount = 4;
  this.gl.drawArrays(
    this.gl.TRIANGLE_STRIP,
    offset,
    vertexCount
  );
}

This method does two things:

  • prepare the scene for rendering via prepareScene
  • draw all arrays that have been bound in the gl context.

NOTE: the vertexCount is 4. This matches the four vertices we supplied for the square in the position buffer.

TRIANGLE_STRIP is one of WebGL’s primitive drawing modes: after the first two vertices, each subsequent vertex forms a triangle with the two before it. OpenGL ultimately renders everything as triangles (any shape can be decomposed into triangles when rendering objects on screen).
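A throwaway sketch (again, illustrative only) of which triangles a strip produces from our four vertices:

```typescript
// Illustration: in a triangle strip, every vertex after the first two
// forms a triangle with the two preceding it, so n vertices yield
// n - 2 triangles.
function triangleStripIndices(vertexCount: number): number[][] {
  const triangles: number[][] = [];
  for (let i = 2; i < vertexCount; i++) {
    triangles.push([i - 2, i - 1, i]);
  }
  return triangles;
}

const squareTriangles = triangleStripIndices(4);
// → [[0, 1, 2], [1, 2, 3]] — two triangles that together cover the square
```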

Finally, to call drawScene() and animate it, we need to define a render loop and then call the method as desired.

In ngAfterViewInit() update the method with the following:

ngAfterViewInit(): void {
  if (!this.canvas) {
    alert('canvas not supplied! cannot bind WebGL context!');
    return;
  }
  this.gl = this.webglService.initialiseWebGLContext(
    this.canvas.nativeElement
  );
  // Set up to draw the scene periodically.
  const drawSceneInterval = interval(this._60fpsInterval);
  drawSceneInterval.subscribe(() => {
    this.drawScene();
  });
}

We use interval from rxjs to create a simple render loop and call drawScene() within the subscription. Of course, you could use the setInterval JS function to achieve the same thing, but it’s fun to use some rxjs conventions alongside rendering content in WebGL.

Run npm start and FINALLY, you can now see a SQUARE on the screen! Yay!

You can even resize the browser window and see that the rendering context updates and positions itself correctly!!

Congratulations! You made it! That was a lot of work… perhaps too much to set up something this simple. There are many third-party libraries that reduce the boilerplate needed to create a scene and start rendering objects, but the aim of this tutorial is to show you, in as much detail as possible, what needs to be done to set up and compile shaders in WebGL using Angular.

That’s it for part 2! In part 3, we’ll look into animating and displaying a spinning 3D cube!!!

Stay tuned!

As usual, the source code for this tutorial is available @ https://gitlab.com/MikeHewett/intro-webgl-part-2.git

Useful VS Code Extensions for GLSL

VS Code has some extensions that help to highlight and visualise GLSL code.
You can search up the following extensions in the Marketplace:

slevesque.shader
raczzalan.webgl-glsl-editor
circledev.glsl-canvas
boyswan.glsl-literal
