
How to Deploy Angular 8 App to Google App Engine

Hi there 👋. My name is Thomas Van Deun and I’m a certified Google Cloud Architect.

What we’ll do

  1. Set up a Google Cloud Repository on GCP 
  2. Scaffold an Angular 8 application 
  3. Integrate Cloud Build CI/CD for this project
  4. Automatically deploy to App Engine 
Architecture for deploying angular 8 to Google App Engine on GCP with CI/CD

What you’ll need

  • Google Cloud Platform account — free tier will do
    • Set up a new project if you’d like
  • Have the Cloud SDK installed (to connect with the repository)
  • Angular CLI tools installed
  • IDE of your choice

Google Cloud Repository

Log in on the Google Cloud Console

Open the hamburger menu and navigate to Source Repositories under Tools. On the next screen, you’ll find the button Add Repository.

Here we will create a new repository. Give it a name and choose the project you want to work within. 

Creating an Angular 8 Application in Cloud Source Repositories

When you are happy with your name, click Create.

Now we need to connect to the repository locally. I’d recommend setting this up using the gcloud SDK, as you will need it in a later step. If all goes well, you will have the repo cloned on your local machine. Step 1 complete 🎉
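If you go the gcloud SDK route, the clone boils down to something like the following (the project and repository names here are examples; substitute your own):

```shell
# Authenticate and point the SDK at your project (one-time setup)
gcloud auth login
gcloud config set project my-gcp-project

# Clone the Cloud Source Repository created above
gcloud source repos clone angular-on-gcp --project=my-gcp-project
```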

Scaffold Angular 8 project

Now, open the command line at the root of your cloned repository and scaffold an Angular project with the following command.

ng new angular-on-gcp

This will prompt you through the setup; change or accept the default settings.

Navigate to your newly created app and serve it up to see it locally.

cd angular-on-gcp
ng serve --open

Feel free to make more changes in this Angular project, but for tutorial purposes, we will leave it like this.

Website running on localhost: 

Running Angular on gcp app with a localhost

You can find more in-depth Angular documentation over here.

Cloud Build Trigger

Go back to your GCP console, and look for the tool Cloud Build. Start with enabling this API for your project. You should see this screen next: 

GCP console cloud build tool


On the next screen, fill in the name and select a repository. The name should represent what this pipeline does, so in this case, I will name it BuildAndDeployAngularApplication. Select the repository you created in step 1.

You can leave the trigger on the master branch; you can change this later to whatever fits your needs.

Leave the defaults for Build Configuration.

From now on, every push we make to master will trigger this pipeline. At the moment, no logic is defined, so nothing will actually happen. Let’s change that in the next step.

Creating a build trigger on gcp

CI / CD Pipeline

Google Cloud Build(ers)

Google Cloud Build makes use of Cloud Builders. Google provides a few of them out of the box over here.

Unfortunately, this list does not contain an Angular builder. We will create our own and store it in our project’s image registry, making use of the community-managed cloud builders.

Clone this repo to your local machine, build the Angular docker image and verify that it’s now part of your available containers:

git clone https://github.com/GoogleCloudPlatform/cloud-builders-community.git
cd cloud-builders-community/ng
gcloud builds submit --config cloudbuild.yaml .
gcloud container images list --filter ng

Now that we have the ng image in our project registry, we can make use of it in our pipeline.

Cloud Build Options

At the root of your Angular repo, create a new file named cloudbuild.yaml with the steps below and save the file. Replace {projectId} with your own project ID; the build step references the image you created before.


steps:
  # Install node packages
  - name: "gcr.io/cloud-builders/npm"
    args: ["install"]
    dir: "angular-on-gcp"

  # Build production package
  - name: "gcr.io/{projectId}/ng"
    args: ["build", "--prod"]
    dir: "angular-on-gcp"

  # Deploy to google cloud app engine
  - name: "gcr.io/cloud-builders/gcloud"
    args: ["app", "deploy", "--version=prod"]
    dir: "angular-on-gcp"

This file means the following:

  1. In the build environment, install all the npm dependencies defined in package.json
  2. Use the Angular cloud builder to create a production distribution of the app
  3. Use the gcloud builder to deploy the app to the App Engine service

Google App Engine 

Now that we have set up our automatic build pipeline, we need to give the builder instructions on how to deploy the Angular application. 

At the source root of your Angular project, create a new file named app.yaml and paste in the following contents (adjust the dist/angular-on-gcp paths if your application has a different name):

runtime: python27
api_version: 1
threadsafe: yes

handlers:
  - url: /(.*\.(gif|png|jpg|css|js)(|\.map))$
    static_files: dist/angular-on-gcp/\1
    upload: dist/angular-on-gcp/(.*)(|\.map)

  - url: /(.*)
    static_files: dist/angular-on-gcp/index.html
    upload: dist/angular-on-gcp/index.html

skip_files:
  - e2e/
  - node_modules/
  - src/
  - ^(.*/)?\..*$
  - ^(.*/)?.*\.json$
  - ^(.*/)?.*\.md$
  - ^(.*/)?.*\.yaml$

Create an App Engine

Enable the App Engine API (App Engine Admin API) on the console and spin up an App Engine by navigating to the App Engine tool in the console.
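If you prefer the CLI over the console, the equivalent is roughly the following (the region is an example; it cannot be changed after creation, so pick deliberately):

```shell
# Enable the App Engine Admin API for the current project
gcloud services enable appengine.googleapis.com

# Spin up the App Engine application in a region of your choice
gcloud app create --region=europe-west
```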

Enabling the App Engine API on gcp console

Permissions for App Engine Deploy

You will need to give the Cloud Build service account access to App Engine. Go to the settings menu of Cloud Build and enable the App Engine Admin role.

Providing service permissions in App Engine on gcp
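The same grant can also be made from the command line. A sketch, assuming a placeholder project ID and project number (substitute your own):

```shell
# Allow Cloud Build's service account to deploy to App Engine
gcloud projects add-iam-policy-binding my-gcp-project \
  --member=serviceAccount:123456789012@cloudbuild.gserviceaccount.com \
  --role=roles/appengine.appAdmin
```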

The Build History is a good starting point to debug any failures:

GCP Build history

The result 🥂

Now that you have everything set up, your Angular 8 app should be deployed and running on Google App Engine. Every run of the pipeline deploys a new version. In the App Engine dashboard, you will see your deployed instances and the address of your application:

App Engine Dashboard on GCP

Angular on gcp is running


Introduction to WebGL Using Angular - How to Set Up a Scene (Part 1)


WebGL has to be one of the most under-used JavaScript APIs within modern web browsers.

It offers rendering of interactive 2D and 3D graphics and is fully integrated with other web standards, allowing GPU-accelerated use of physics and image processing effects as part of the web page canvas (Wikipedia, 2020).

In this article, we’re going to set up WebGL within a typical Angular app by utilising the HTML5 canvas element.


Before starting, it’s worthwhile to ensure your system is set up with the following:

  • Node.js is installed
  • You have a new or existing Angular app set up
  • You are using a modern web browser (Chrome 56+, Firefox 51+, Opera 43+, Edge 10240+)

WebGL Fundamentals

There’s honestly a lot to take in regarding the fundamentals of WebGL (and, more specifically, OpenGL). It does mean that you’ll need some basic understanding of linear algebra and 2D/3D rendering in general. WebGL Fundamentals does a great job of providing an introduction, and I’ll be referencing their documentation as we step through setting up our Angular app to use WebGL.

Before going any further, it’s important that you understand the following at a minimum.

WebGL is not a 3D API. You can’t just use it to instantly render objects and models and get them to do some awesome magic.

WebGL is just a rasterization engine. It draws points, lines and triangles based on the code you supply.

If you want WebGL to do anything else, it’s up to you to write code that uses points, lines and triangles to accomplish the task you want.

WebGL runs on the GPU and requires that you provide code that runs on the GPU.
The code that we need to provide is in the form of pairs of functions.

They are known as:

  • a vertex shader
    • responsible for computing vertex positions – based on the positions, WebGL can then rasterize primitives including points, lines, or triangles.
  • a fragment shader
    • when primitives are being rasterized, WebGL calls the fragment shader to compute a colour for each pixel of the primitive that’s currently being drawn.

Each shader is written in GLSL which is a strictly typed C/C++ like language.
When a vertex and fragment shader are combined, they’re collectively known as a program.
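To make that concrete, here’s a minimal shader pair of the kind we’ll compile into a program in part 2, kept as GLSL source strings in TypeScript (the attribute name aVertexPosition is an illustrative choice, not something WebGL prescribes):

```typescript
// Vertex shader: computes gl_Position for every vertex it is fed.
const vertexShaderSource = `
  attribute vec4 aVertexPosition;

  void main() {
    gl_Position = aVertexPosition;
  }
`;

// Fragment shader: computes a colour (here, opaque white) for every
// pixel of the primitive being rasterized.
const fragmentShaderSource = `
  void main() {
    gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0);
  }
`;
```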

Nearly all of the WebGL API is about setting up state for these pairs of functions to run. For each thing you want to draw, you set up a bunch of state, then execute a pair of functions by calling gl.drawArrays or gl.drawElements, which executes your shaders on the GPU.

Any data you want those functions to have access to must be provided to the GPU. There are four ways a shader can receive data.

  • Attributes and buffers
    • Buffers are arrays of binary data you upload to the GPU. Usually buffers contain things like positions, normals, texture coordinates, vertex colours, etc although you’re free to put anything you want in them.
    • Attributes are used to specify how to pull data out of your buffers and provide them to your vertex shader. For example you might put positions in a buffer as three 32bit floats per position. You would tell a particular attribute which buffer to pull the positions out of, what type of data it should pull out (3 component 32 bit floating point numbers), what offset in the buffer the positions start, and how many bytes to get from one position to the next.
    • Buffers are not random access. Instead a vertex shader is executed a specified number of times. Each time it’s executed the next value from each specified buffer is pulled out and assigned to an attribute.
  • Uniforms
    • Uniforms are effectively global variables you set before you execute your shader program.
  • Textures
    • Textures are arrays of data you can randomly access in your shader program. The most common thing to put in a texture is image data but textures are just data and can just as easily contain something other than colours.
  • Varyings
    • Varyings are a way for a vertex shader to pass data to a fragment shader. Depending on what is being rendered, points, lines, or triangles, the values set on a varying by a vertex shader will be interpolated while executing the fragment shader.

(WebGL Fundamentals, 2015).
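As a small, runnable illustration of the buffers-and-attributes idea above, here is how three 2D positions might be packed into the kind of binary array a buffer holds, plus the stride and count arithmetic an attribute description relies on (the actual upload would need a live WebGL context; the numbers are the point here):

```typescript
// Three 2D positions (x, y) for a triangle, packed flat: this is exactly
// the binary layout that would be uploaded to a GPU buffer.
const positions = new Float32Array([
   0.0,  0.5,  // vertex 0
  -0.5, -0.5,  // vertex 1
   0.5, -0.5,  // vertex 2
]);

// The numbers an attribute description needs to pull data back out:
const componentsPerVertex = 2;                                  // x and y
const bytesPerComponent = Float32Array.BYTES_PER_ELEMENT;       // 4 bytes
const strideInBytes = componentsPerVertex * bytesPerComponent;  // 8: distance between vertices
const vertexCount = positions.length / componentsPerVertex;     // 3: times the vertex shader runs

console.log(strideInBytes, vertexCount, positions.byteLength);  // prints 8 3 24
```

These are exactly the kinds of values you would later hand to gl.vertexAttribPointer and gl.drawArrays.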

I’m glossing over a lot of technical detail here, but if you really want to know more, head over to WebGL Fundamentals lessons for more info.

Setting up a playground

Let’s set up a playground so we have something that we can use in order to continue setting up WebGL.

First, create a component. You can create one by executing the following command within your Angular root (src) directory. I’ve gone ahead and named mine scene.

E.g. ng generate component scene

PS X:\...\toucan-webgl> ng generate component scene
CREATE src/app/scene/scene.component.html (20 bytes)
CREATE src/app/scene/scene.component.spec.ts (619 bytes)
CREATE src/app/scene/scene.component.ts (272 bytes)
CREATE src/app/scene/scene.component.scss (0 bytes)
PS X:\...\toucan-webgl>

Let’s also create a service with the component and call it WebGL too.

E.g. ng generate service scene/services/webGL

PS X:\...\toucan-webgl> ng generate service scene/services/webGL
CREATE src/app/scene/services/web-gl.service.spec.ts (352 bytes)
CREATE src/app/scene/services/web-gl.service.ts (134 bytes)
PS X:\...\toucan-webgl>

If you’re using a new Angular app, hopefully you’ve already configured it to use App Routing. If you haven’t, follow the next couple of steps.

ng generate module app-routing --flat --module=app

You’ll now have an app-routing.module.ts file, if you haven’t got one already.

Update the contents of the file with the following:

import { NgModule } from "@angular/core";
import { Routes, RouterModule } from "@angular/router";
import { SceneComponent } from "./scene/scene.component";

const routes: Routes = [{ path: "", component: SceneComponent }];

@NgModule({
  imports: [RouterModule.forRoot(routes)],
  exports: [RouterModule],
})
export class AppRoutingModule {}

This will ensure that on app load, it’ll display the SceneComponent first.

Next, add the WebGLService to the SceneComponent‘s constructor like so:

import { Component, OnInit } from "@angular/core";
import { WebGLService } from "./services/web-gl.service";

@Component({
  selector: "app-scene",
  templateUrl: "./scene.component.html",
  styleUrls: ["./scene.component.scss"],
})
export class SceneComponent implements OnInit {
  // *** Update constructor here ***
  constructor(private webglService: WebGLService) {}

  ngOnInit(): void {}
}

Finally, run ng serve and check to see if the Angular app is running and displaying the SceneComponent.
It should look like this:

Now, let’s move on to adding a WebGL context.

Setting up the WebGL context

Setting up the WebGL context is a little involved, but once we get the foundation in place, we can start getting something on the screen.

Let’s start by opening up scene.component.html and adding an HTML5 canvas element.

<div class="scene">
  <canvas #sceneCanvas>
    Your browser doesn't appear to support the
    <code>&lt;canvas&gt;</code> element.
  </canvas>
</div>

Open up scene.component.scss (or equivalent) and add in the following styles:

.scene {
  height: 100%;
  width: 100%;
}

.scene canvas {
  height: 100%;
  width: 100%;
  border-style: solid;
  border-width: 1px;
  border-color: black;
}

This CSS just makes sure the canvas element extends to the size of the browser window. I added some border styling so you can explicitly see the element for yourself.

TIP: If you want, you can also update the global styles.scss so all content expands to the height of the window respectively.


/* You can add global styles to this file, and also import other style files */
html,
body {
  height: 99%;
}

We’ll now embark on doing the following:

  1. Resolve the canvas element in TypeScript via the #sceneCanvas template reference
  2. Bind the canvas element to a WebGL rendering context
  3. Initialise the WebGL rendering canvas

Resolving the canvas element

Open scene.component.ts and add the following property:

@ViewChild('sceneCanvas') private canvas: ElementRef<HTMLCanvasElement>;

Update the SceneComponent class to implement AfterViewInit; we’ll need to hook into this lifecycle hook to continue setting up the WebGL canvas.

Add in the following guard to the ngAfterViewInit method to ensure that we actually have the canvas element before attempting to bind it:

if (!this.canvas) {
  alert("canvas not supplied! cannot bind WebGL context!");
  return;
}
NOTE: If the alert is hit, it’s because the template reference ID you’re using in the TS class does not match the one defined in the HTML. You need to ensure they match.

Your component implementation should now look like this:

import { AfterViewInit, Component, ElementRef, OnInit, ViewChild } from "@angular/core";
import { WebGLService } from "./services/web-gl.service";

@Component({
  selector: "app-scene",
  templateUrl: "./scene.component.html",
  styleUrls: ["./scene.component.scss"],
})
export class SceneComponent implements OnInit, AfterViewInit {
  @ViewChild("sceneCanvas") private canvas: ElementRef<HTMLCanvasElement>;

  constructor(private webglService: WebGLService) {}

  ngAfterViewInit(): void {
    if (!this.canvas) {
      alert("canvas not supplied! cannot bind WebGL context!");
      return;
    }
  }

  ngOnInit(): void {}
}

Binding the canvas element to a WebGL rendering context

Open up the web-gl.service.ts file.

Create a method called initialiseWebGLContext with a parameter canvas: HTMLCanvasElement.

initialiseWebGLContext(canvas: HTMLCanvasElement) {
}

Go back to scene.component.ts and add in the following line after the guard check in ngAfterViewInit.

ngAfterViewInit(): void {
  if (!this.canvas) {
    alert('canvas not supplied! cannot bind WebGL context!');
    return;
  }

  this.webglService.initialiseWebGLContext(this.canvas.nativeElement);
}

Now, back in web-gl.service.ts, let’s retrieve a WebGL context from the canvas’s native element and assign it to a property that we’ll call gl.

private _renderingContext: RenderingContext;

private get gl(): WebGLRenderingContext {
  return this._renderingContext as WebGLRenderingContext;
}

constructor() {}

initialiseWebGLContext(canvas: HTMLCanvasElement) {
  // Try to grab the standard context. If it fails, fall back to experimental.
  this._renderingContext =
    canvas.getContext('webgl') || canvas.getContext('experimental-webgl');

  // If we don't have a GL context, give up now... only continue if WebGL is available and working...
  if (!this.gl) {
    alert('Unable to initialize WebGL. Your browser may not support it.');
    return;
  }
}

Once we’ve retrieved the WebGLRenderingContext, we can then set the WebGL canvas’s height and width, and then finally proceed to initialise the WebGL canvas.

Let’s add two methods that do what I described above:

setWebGLCanvasDimensions(canvas: HTMLCanvasElement) {
  // set width and height based on the canvas width and height -
  // it's good practice to use clientWidth and clientHeight
  this.gl.canvas.width = canvas.clientWidth;
  this.gl.canvas.height = canvas.clientHeight;
}

initialiseWebGLCanvas() {
  // Set clear colour to black, fully opaque
  this.gl.clearColor(0.0, 0.0, 0.0, 1.0);

  // Enable depth testing
  this.gl.enable(this.gl.DEPTH_TEST);

  // Near things obscure far things
  this.gl.depthFunc(this.gl.LEQUAL);

  // Clear the colour as well as the depth buffer
  this.gl.clear(this.gl.COLOR_BUFFER_BIT | this.gl.DEPTH_BUFFER_BIT);
}

Now finally call them at the end of the initialiseWebGLContext method.

initialiseWebGLContext(canvas: HTMLCanvasElement) {
  // Try to grab the standard context. If it fails, fall back to experimental.
  this._renderingContext =
    canvas.getContext('webgl') || canvas.getContext('experimental-webgl');

  // If we don't have a GL context, give up now... only continue if WebGL is available and working...
  if (!this.gl) {
    alert('Unable to initialize WebGL. Your browser may not support it.');
    return;
  }

  // *** set width, height and initialise the webgl canvas ***
  this.setWebGLCanvasDimensions(canvas);
  this.initialiseWebGLCanvas();
}

Run the app again; you should now see that the canvas is entirely black.

This shows that we’ve successfully initialised the WebGL context.

That’s it for part 1!

Next: Introduction to WebGL using Angular – Part 2 – Setting up shaders and a triangle

In part 2, we’ll proceed to add in shaders and start setting up content to render on screen!

Stay tuned!

The source code for this tutorial is available at