Lecture 08: Buffers, Attributes, Varyings & Uniforms (slides)

Learning Objectives

By the end of this lecture, you will be able to:
Our main goal for today is to be able to run the entire pipeline below from start to finish using WebGL. The notes here mostly serve as a reference for building complete WebGL programs. In class, we'll build our own application to render a cube in which the vertices have colors attached to them.
In last week's lab, we focused on implementing a fragment shader. Today, we'll also build a vertex shader which takes in certain inputs that we'll need to store in the GPU's memory. We'll also talk about how we pass data from the vertex shader to the fragment shader and control global variables that are common to our shader programs.
WebGL is a state machine: manage the state using the "context".

WebGL is a state machine in the sense that it is a large collection of variables that define how it should operate. The state is managed by the context we introduced in the last lecture. Recall that we can retrieve a context from the canvas like this:
let gl = canvas.getContext("webgl"); // retrieves a WebGL context
This context has a collection of functions and constants that we can use and/or modify to change the behavior of the rendering pipeline.
Recall that in Lab 6, we had to enable depth testing (gl.enable(gl.DEPTH_TEST)). This changed the state so that whenever we ran the rendering pipeline, WebGL would use depth testing and draw surfaces in the correct order. In this example, gl.enable was the function that allowed us to change the state.
In addition to global variables, we can also manage the state used to describe how data flows through the rendering pipeline using buffers and attributes. We can also manage which shader program is currently active in the rendering pipeline.
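To make this "state machine" idea concrete, here is a tiny plain-JavaScript sketch (no WebGL involved, all names hypothetical) of a context whose functions act on whatever is currently bound rather than on an argument you pass in:

```javascript
// A mock "context": operations apply to the currently bound object,
// just like WebGL's buffer functions apply to the currently bound buffer.
const makeContext = () => {
  const state = { boundBuffer: null, depthTest: false };
  return {
    enable: (cap) => { if (cap === "DEPTH_TEST") state.depthTest = true; },
    bindBuffer: (buffer) => { state.boundBuffer = buffer; },
    bufferData: (data) => { state.boundBuffer.data = data; }, // writes to the *bound* buffer
    isEnabled: (cap) => cap === "DEPTH_TEST" && state.depthTest,
  };
};

const gl = makeContext();
const pointBuffer = {};
const colorBuffer = {};
gl.bindBuffer(pointBuffer);
gl.bufferData([0.5, -0.5, 0.0]); // lands in pointBuffer: it is currently bound
gl.bindBuffer(colorBuffer);
gl.bufferData([1, 0, 0]);        // now colorBuffer is bound, so the data goes there
gl.enable("DEPTH_TEST");
```

Notice that bufferData never mentions which buffer to write to; the earlier bindBuffer call decided that. This is exactly the behavior that makes forgetting a bind such a common source of bugs.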
To better understand this concept of state, please open the following link. In the dropdown at the top-right, click on the "Rainbow Triangle" example.
https://webglfundamentals.org/webgl/lessons/resources/webgl-state-diagram.html
If you click the two rightwards-pointing arrows at the top right, the example will run and the diagram will display the resulting state. Please take a moment to investigate all the different boxes - we'll talk about the details of how this relates to the code we write in the next sections.
Usually, the first thing you'll do is create a program that represents how vertices and fragments are processed. Each shader (vertex, fragment) will be compiled individually and then attached to this shader program. To compile a shader:
const compileShader = (gl, shaderSource, type) => {
  let shader = gl.createShader(type);
  gl.shaderSource(shader, shaderSource);
  gl.compileShader(shader);
  if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
    let error = gl.getShaderInfoLog(shader);
    gl.deleteShader(shader);
    throw ("Unable to compile " + (type === gl.VERTEX_SHADER ? "vertex" : "fragment") + " shader: " + error);
  }
  return shader;
};
This returns a WebGLShader object that can be attached to a program (a WebGLProgram object):
const compileProgram = (gl, vertexShaderSource, fragmentShaderSource) => {
  let vertexShader = compileShader(gl, vertexShaderSource, gl.VERTEX_SHADER);
  let fragmentShader = compileShader(gl, fragmentShaderSource, gl.FRAGMENT_SHADER);
  let program = gl.createProgram();
  gl.attachShader(program, vertexShader);
  gl.attachShader(program, fragmentShader);
  gl.linkProgram(program);
  if (!gl.getProgramParameter(program, gl.LINK_STATUS))
    throw ("Unable to link the shader program: " + gl.getProgramInfoLog(program));
  gl.useProgram(program);
  return program;
};
At the end of the compileProgram function, note that gl.useProgram(program) will change the state of WebGL so that any rendering calls use this program. We will usually have a single program in our course, but it's possible to have multiple programs to create different effects for different objects in your scene.
Here is an example of a vertex shader and fragment shader. Don't worry about the attribute stuff for now, we'll talk about that soon. Right now, I just want to highlight the idea of a varying. This is a way to specify that we want to pass a variable called v_Color from the vertex shader to the fragment shader. Wait a second. If the vertex shader operates on every vertex and the fragment shader operates on every fragment, how can we get a value for v_Color at a fragment (which is some portion of a triangle that overlaps a particular pixel)? Interpolation! Using barycentric coordinates (again), the values of the varying at the three vertices of a triangle will be interpolated to create a value at the fragment. Again, WebGL does the interpolation for us - we just need to specify that a variable will be passed from the vertex shader to the fragment shader using the varying declaration.
Vertex Shader:
attribute vec3 a_Position;
attribute vec3 a_Color;
varying vec3 v_Color; // declare varyings before main()
void main() {
  gl_Position = vec4(a_Position, 1.0);
  v_Color = a_Color; // we need to assign this as an output of the vertex shader
}
Fragment Shader:
varying vec3 v_Color; // declare incoming interpolated varying
void main() {
  gl_FragColor = vec4(v_Color, 1); // use the varying
}
We can also specify qualifiers to tell WebGL how to do the interpolation. In the example above, we will use the default perspective-correct interpolation.
We need to tell WebGL where to look for our scene data. This means we need to upload the meshes in our scenes to the GPU. We do this with buffers. Think of a buffer as a chunk of memory where we will put our mesh data. Sometimes we need to upload floating-point data (for vertices), and sometimes it will be integer data (for triangle and/or edge indices).
Create the buffer (gl.createBuffer):

let pointBuffer = gl.createBuffer();
let triangleBuffer = gl.createBuffer();
Bind the buffer (gl.bindBuffer):

Binding a buffer refers to the idea of setting the state of which buffer WebGL considers "active". We will use gl.ARRAY_BUFFER for floating-point data (vertices) and gl.ELEMENT_ARRAY_BUFFER for integer data (triangle/edge indices).
gl.bindBuffer(gl.ARRAY_BUFFER, pointBuffer);
// OR (if doing an operation with triangle indices)
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, triangleBuffer);
A very important (and bug-prone) aspect of buffers is that every operation on a buffer will work with the currently bound buffer. This relates to the concept of a state. WebGL can only hold one state at a time for the ARRAY_BUFFER, and one for the ELEMENT_ARRAY_BUFFER, so we need to remember which one was the last one we bound.
The picture below shows the difference between when a positionBuffer and a colorBuffer are independently bound to the ARRAY_BUFFER.
Write data to the buffer (gl.bufferData):

Say we have vertices stored in a 1d array called vertices and triangle indices stored in a 1d array called triangles. We want to upload both of these arrays to the GPU so WebGL can use them when running through the pipeline.
Unfortunately, raw JavaScript arrays (Array() or []) contain no information about the type of each element in the array. The GPU needs to know how much data we are uploading, so we need to convert these to typed arrays. We will use Float32Array for vertex coordinates and either Uint16Array or Uint32Array for integer data. These typed arrays can be constructed directly from the original arrays (e.g. vertices_f32 = new Float32Array(vertices)).
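For example, the following runs in plain JavaScript (no WebGL needed) and shows why typed arrays carry the size information the GPU requires:

```javascript
// a plain JS array has no fixed element type; a typed array does
const vertices = [-0.5, -0.5, 0.5, -0.5, 0.0, 0.5]; // 3 vertices in 2d
const vertices_f32 = new Float32Array(vertices);

// each element is a 32-bit (4-byte) float, so the total byte size is known
console.log(vertices_f32.BYTES_PER_ELEMENT); // 4
console.log(vertices_f32.byteLength);        // 24 (6 elements * 4 bytes)

// triangle indices as unsigned 16-bit integers (2 bytes each)
const triangles = [0, 1, 2];
const triangles_u16 = new Uint16Array(triangles);
console.log(triangles_u16.BYTES_PER_ELEMENT); // 2
```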
Now the data can be written to the GPU using the gl.bufferData function. You can always set the last argument to gl.STATIC_DRAW:
// write vertex coordinates to the GPU
gl.bindBuffer(gl.ARRAY_BUFFER, pointBuffer);
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(vertices), gl.STATIC_DRAW);
// write triangle indices to the GPU
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, triangleBuffer);
gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, new Uint16Array(triangles), gl.STATIC_DRAW);
We're almost ready to draw. We first need to tell WebGL how the inputs of our vertex shader are associated with the buffers we created and wrote to. This is done using attributes. There are two steps we need to do here: (1) enable the attribute in the shader program, and (2) associate the attribute with a buffer. In order to enable the attribute, we need to know the "location" of the attribute in the program. Think of this as a pointer to the data. Here is a complete example that associates our pointBuffer with the a_Position attribute in the vertex shader. Assume the program is the same one we created earlier.
// assuming our vertex shader looks like this:
const vertexShaderSource = `
attribute vec3 a_Position; // declare the position attribute in the vertex shader (add "attribute" before declaring type and name of variable)
void main() {
gl_Position = vec4(a_Position, 1.0);
}`;
let a_Position = gl.getAttribLocation(program, "a_Position"); // the second argument (String) should match the name of the attribute in the vertex shader
gl.enableVertexAttribArray(a_Position); // enable the attribute
// now, associate the attribute with some buffer (here, we want to use the pointBuffer so we need to bind that one)
gl.bindBuffer(gl.ARRAY_BUFFER, pointBuffer);
gl.vertexAttribPointer(a_Position, 3, gl.FLOAT, false, 0, 0);
//                     ^           ^  ^
The first three arguments (underlined with ^) are the most important. The last three can always be set to false, 0, 0 (in this class).
The first argument is the location of the attribute in the shader program. Next we have the size: the number of components per vertex in the data bound to the pointBuffer. This is 3 because there are three coordinates (x, y, z) for every vertex. The third argument is the type of the data (gl.FLOAT because we buffered a Float32Array for the vertices when we did the call to gl.bufferData).
If you want to attach data to vertices (and use it in the pipeline), you'll need to remember the number of components per vertex. For 3d vertex normals and 3d vertex coordinates, this will be 3. For 2d texture coordinates (next week), this will be 2. You can also attach colors to each vertex, in which case the size will also be 3 for the RGB components.
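A quick sanity check you can run outside WebGL: the length of each flat array should be the number of vertices times the per-vertex component count (the lookup table below is just an illustration):

```javascript
// per-vertex component counts for common attributes
const size = { position: 3, normal: 3, color: 3, texcoord: 2 };

// a flat array of 4 vertices with (x, y, z) coordinates
const positions = [0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0];
const nVertices = positions.length / size.position;
console.log(nVertices); // 4

// 2d texture coordinates for the same mesh must have 4 * 2 = 8 entries
const texcoords = [0, 0, 1, 0, 1, 1, 0, 1];
console.log(texcoords.length === nVertices * size.texcoord); // true
```

If this check fails for one of your attribute arrays, the size argument passed to vertexAttribPointer (or the array itself) is probably wrong.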
Draw (gl.drawElements).

We're finally ready to invoke the rendering pipeline. We'll generally use gl.drawElements to draw our meshes. Again, before each call to gl.drawElements, we need to bind the buffer with the indices that will be used. For our triangleBuffer declared above, this will be:
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, triangleBuffer);
gl.drawElements(gl.TRIANGLES, triangles.length, gl.UNSIGNED_SHORT, 0);
The first argument specifies that we are drawing triangles. If you want to draw edges, this should be gl.LINES. The next argument corresponds to how much data there is to draw. You would think that WebGL would remember this length since we wrote it earlier! But it doesn't. This sounds annoying but it's actually convenient because we may not always want to draw all the data. The last parameter is the offset in the indexed data, which gives control over where to start the drawing call. Again, this allows us to draw a subset of the triangles. In our course, we'll always draw everything, so you can set this to 0.
The second-to-last argument can be a source of bugs. It refers to the type of the data that we are drawing. In our case, we wrote a Uint16Array to represent our triangle indices as unsigned 16-bit integers. These are also called unsigned shorts, hence, the use of gl.UNSIGNED_SHORT. Note that this means the maximum number of vertices we can index is 2^16 = 65,536. For larger meshes, use gl.UNSIGNED_INT (unsigned 32-bit integer) with triangle indices buffered as a Uint32Array (in WebGL 1, this also requires enabling the OES_element_index_uint extension).
Note there is another function called gl.drawArrays to invoke the rendering pipeline, but we won't really use it in our course.
Before drawing, we may also need to set some constants to control the global state. The ones we will use most frequently are gl.viewport, gl.enable, gl.polygonOffset and gl.clearColor.
We may also want to calculate variables on the JavaScript side and pass them into our shaders. For example, if the view and/or model matrices are modified when a user clicks and drags the mouse (we can apply a rotation to the model matrix), we need to transform points accordingly in the vertex shader. These variables can be passed into the pipeline as uniforms. We can pass int, float, vec3, vec4, mat3 and mat4. These uniforms can be used in either the vertex or fragment shader and need the uniform qualifier before declaring the type. For example:
uniform int u_shader; // as we saw in Lab 6
uniform float u_transparency; // in case you want to adjust the transparency value (4th component in gl_FragColor)
uniform vec3 u_km; // specify the material km on the JavaScript side
uniform mat4 u_ModelViewProjectionMatrix; // 4x4 matrix to transform points by the MVP matrix (in the vertex shader).
I usually prefix uniforms with u_ to remember that they are uniforms. This helps when your shaders have a lot of variables. Similar to attributes, we need to retrieve the "location" of the uniform in the shader program, this time using gl.getUniformLocation. Here are some examples for float, vec3 and mat4. Let the vertex and fragment shaders be:
Vertex Shader:
attribute vec3 a_Position;
uniform mat4 u_ModelViewProjectionMatrix;
void main() {
  gl_Position = u_ModelViewProjectionMatrix * vec4(a_Position, 1.0);
}
Fragment Shader:
uniform float u_transparency;
uniform vec3 u_km;
void main() {
  gl_FragColor = vec4(u_km, u_transparency);
}
We can set these uniforms from the JavaScript side like this:
let u_transparency = gl.getUniformLocation(program, "u_transparency");
gl.uniform1f(u_transparency, 0.6); // use gl.uniform1i if the uniform is an integer
let u_km = gl.getUniformLocation(program, "u_km");
gl.uniform3f(u_km, 0.6, 0.4, 0.5);
// OR
gl.uniform3fv(u_km, vec3.fromValues(0.6, 0.4, 0.5)); // the "v" at the end of uniform3fv means it is a vector
// assuming you have computed the model-view-projection (MVP) matrix somewhere
let u_mvp = gl.getUniformLocation(program, "u_ModelViewProjectionMatrix");
gl.uniformMatrix4fv(u_mvp, false, mvp); // assuming "mvp" is a mat4
Note the difference between writing scalars and vectors (uniform[1234][fi][v]) and how matrices are passed (uniformMatrix[234]fv). This is often a source of bugs (it's easy to forget the transpose option for matrices, which we have set to false in this example). For more information, please see the documentation for these functions.
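The naming scheme can be decoded mechanically. Here is a hypothetical lookup (plain JavaScript, names are illustrative only) from GLSL types to the corresponding setter:

```javascript
// hypothetical lookup: GLSL type -> name of the gl.uniform* setter
const uniformSetterName = (glslType) => {
  const table = {
    int:   "uniform1i",
    float: "uniform1f",
    vec3:  "uniform3f",          // or "uniform3fv" when passing a vector object
    vec4:  "uniform4f",
    mat3:  "uniformMatrix3fv",   // matrices always use the "Matrix...fv" form
    mat4:  "uniformMatrix4fv",   // (and take the extra transpose argument)
  };
  return table[glslType];
};

console.log(uniformSetterName("float")); // "uniform1f"
console.log(uniformSetterName("mat4"));  // "uniformMatrix4fv"
```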
After setting a uniform, we can use it in the shader program when the rendering pipeline runs.
In most of the exercises and labs we have done so far, the models have rotated when we clicked and dragged the mouse. This can be done by adding callbacks that are used when a certain "event" happens on the canvas. These events can be: mousedown (mouse button is clicked), mouseup (mouse button is released), mousemove (mouse is moved), mousewheel (mouse wheel is scrolled) and keydown (some key is pressed). Here are some common callbacks that were set in the labs/exercises:
const mouseMove = (event, renderer) => {
  if (!renderer.dragging) return;
  let R = rotation(
    (event.pageX - renderer.lastX) / renderer.canvas.width,
    (event.pageY - renderer.lastY) / renderer.canvas.height
  );
  mat4.multiply(renderer.modelMatrix, R, renderer.modelMatrix);
  renderer.draw();
  renderer.lastX = event.pageX;
  renderer.lastY = event.pageY;
};

const mouseDown = (event, renderer) => {
  renderer.dragging = true;
  renderer.lastX = event.pageX;
  renderer.lastY = event.pageY;
};

const mouseUp = (event, renderer) => {
  renderer.dragging = false;
};

const mouseWheel = (event, renderer) => {
  event.preventDefault();
  let scale = 1.0;
  if (event.deltaY > 0) scale = 0.9;
  else if (event.deltaY < 0) scale = 1.1;
  let direction = vec3.create();
  vec3.subtract(direction, renderer.eye, renderer.center);
  vec3.scaleAndAdd(renderer.eye, renderer.center, direction, scale);
  mat4.lookAt(
    renderer.viewMatrix,
    renderer.eye,
    renderer.center,
    vec3.fromValues(0, 1, 0)
  );
  renderer.draw();
};
let canvas = document.getElementById(canvasId);
let renderer = new Renderer(canvas);
renderer.dragging = false;
canvas.addEventListener("mousemove", (event) => {
  mouseMove(event, renderer);
});
canvas.addEventListener("mousedown", (event) => {
  mouseDown(event, renderer);
});
canvas.addEventListener("mouseup", (event) => {
  mouseUp(event, renderer);
});
canvas.addEventListener("mousewheel", (event) => {
  mouseWheel(event, renderer);
});
The rotation function can be customized based on your application. The idea is to return a 4x4 rotation matrix from the difference in x and y screen coordinates. One option is to assume these changes represent angles about the y- and x-axis, respectively. In other words, the relative difference dx from dragging along the width (x-direction) represents a small rotation about the y-axis and the relative difference dy from dragging along the height represents a rotation about the x-axis, which we can compound:
const rotation = (dx, dy) => {
  const speed = 4;
  const R = mat4.fromYRotation(mat4.create(), speed * dx);
  return mat4.multiply(
    mat4.create(),
    mat4.fromXRotation(mat4.create(), speed * dy),
    R
  );
};
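If it helps to see the underlying math without gl-matrix, here is the same composition written out with plain 3x3 arrays (helper names are hypothetical): the combined rotation is Rx(speed * dy) * Ry(speed * dx).

```javascript
// 3x3 rotation matrices about the y- and x-axes
const rotY = (t) => [
  [ Math.cos(t), 0, Math.sin(t)],
  [ 0,           1, 0          ],
  [-Math.sin(t), 0, Math.cos(t)],
];
const rotX = (t) => [
  [1, 0,            0           ],
  [0, Math.cos(t), -Math.sin(t)],
  [0, Math.sin(t),  Math.cos(t)],
];

// matrix-vector product to check the behavior
const matVec = (A, v) => A.map((row) => row.reduce((s, a, j) => s + a * v[j], 0));

// a quarter turn about y sends +z to +x
const p = matVec(rotY(Math.PI / 2), [0, 0, 1]);
console.log(p.map((x) => Math.round(x * 1e6) / 1e6)); // [1, 0, 0]
```

The composition order matters: applying Ry first and then Rx is not the same as the reverse, which is why the rotation function multiplies the matrices in a specific order.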
Keep in mind that if you want to center the rotation about a different point, you'll need to translate to the origin, rotate, and then translate back (as we did in Lecture 4).
Also remember that the HTML canvas has y pointing downwards, so a positive dy is a positive rotation about the x-axis (in the samples above).
In order to set key callbacks, we can use something similar to the example in Lecture 01.
canvas.addEventListener("keydown", (event) => {
  const key = event.key; // a character
  // do something when this key was pressed
});
Please see some information about the KeyboardEvent object here. In Thursday's lab, we will use the arrow keys, for which the event.key will be ArrowRight, ArrowLeft, ArrowUp or ArrowDown.
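As a sketch of how you might map those key values to an action (all names here are hypothetical):

```javascript
// hypothetical mapping from arrow-key names to a direction vector (dx, dy)
const arrowDirection = (key) => {
  const table = {
    ArrowRight: [ 1,  0],
    ArrowLeft:  [-1,  0],
    ArrowUp:    [ 0,  1],
    ArrowDown:  [ 0, -1],
  };
  return table[key] || [0, 0]; // ignore any other key
};

console.log(arrowDirection("ArrowLeft")); // [-1, 0]
console.log(arrowDirection("a"));         // [0, 0]
```

One caveat: a canvas element only receives keydown events when it can take focus (for example, by giving it a tabindex attribute and clicking on it), so keyboard listeners are often attached to window or document instead.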
To summarize, the main steps to follow when setting up a WebGL application are:

1. Create a WebGLProgram from a vertex shader and fragment shader (each a WebGLShader).
2. Write the mesh data to a WebGLBuffer (createBuffer, bindBuffer, bufferData).
3. Enable the attributes in the shader program (getAttribLocation, enableVertexAttribArray).
4. Associate buffers with attributes (bindBuffer, vertexAttribPointer).
5. Set up the global state (viewport, enable, clear).
6. Write the uniforms (getUniformLocation, uniform[1234][fi][v] and uniformMatrix[234]fv).
7. Draw (bindBuffer for the element indices, then drawElements).

Depending on your application and/or scene, some of these steps might be omitted or merged with others.
There are a lot of places bugs can happen in WebGL programs. If your application doesn't work, please go through the following checklist:

- Did you declare your attributes with the attribute qualifier? Did you enable them on the JavaScript side? Did you use the correct string name and location when enabling the attribute?
- Did you declare your varying variables in both shaders? Did you set the value of the varying in your vertex shader?
- Did you write the data to the GPU (bufferData)? Are you using the correct buffer type? Did you convert the arrays to the correct typed array? Are you using the intended buffer? Check for instances of accidentally typing gl.ARRAY_BUFFER when you intended gl.ELEMENT_ARRAY_BUFFER and vice versa.
- Double-check the parameters passed to vertexAttribPointer, bufferData and drawElements.
- Check for typos between your JavaScript and GLSL code.

The most common bug I see (and also produce) is the result of forgetting which buffer is currently bound. Bugs also arise when incorrect parameters are passed to bufferData, vertexAttribPointer and drawElements, so please double-check these thoroughly when your application doesn't work.
Here is a complete example that puts everything together to render a rotating colored cube:

<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8" />
<meta name="viewport" content="width=device-width" />
<title>Week 08: WebGL</title>
<script src="https://cdnjs.cloudflare.com/ajax/libs/gl-matrix/2.8.1/gl-matrix-min.js"></script>
</head>
<body>
<canvas id="renderer-canvas" width="500" height="500"></canvas>
<script type="text/javascript">
let mesh = {
vertices: [
-0.5, -0.5, -0.5, 0.5, -0.5, -0.5, -0.5, 0.5, -0.5, 0.5, 0.5, -0.5,
-0.5, -0.5, 0.5, 0.5, -0.5, 0.5, -0.5, 0.5, 0.5, 0.5, 0.5, 0.5,
],
triangles: [
0, 1, 2, 1, 3, 2, 1, 5, 3, 5, 7, 3, 4, 5, 0, 5, 1, 0, 5, 4, 7, 4, 6,
7, 4, 0, 6, 0, 2, 6, 3, 7, 6, 2, 3, 6,
],
colors: [
0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 1, 1, 0, 1, 0, 1, 0, 1, 1, 1, 1,
1,
],
};
let canvas = document.getElementById("renderer-canvas");
let gl = canvas.getContext("webgl");
const debugInfo = gl.getExtension("WEBGL_debug_renderer_info");
if (debugInfo) console.log(gl.getParameter(debugInfo.UNMASKED_RENDERER_WEBGL));
gl.clearColor(0.2, 0.2, 0.5, 0.6);
gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);
gl.enable(gl.DEPTH_TEST);
let vertexShaderSource = `
attribute vec3 a_Position;
attribute vec3 a_Color;
varying vec3 v_Color;
uniform mat4 u_ViewMatrix;
uniform mat4 u_ProjectionMatrix;
uniform mat4 u_ModelMatrix;
void main(void) {
gl_Position = u_ProjectionMatrix * u_ViewMatrix * u_ModelMatrix * vec4(a_Position, 1);
v_Color = a_Color;
}`;
let fragmentShaderSource = `
precision highp float;
varying vec3 v_Color;
void main() {
gl_FragColor = vec4(v_Color, 1);
}`;
let vertexShader = gl.createShader(gl.VERTEX_SHADER);
gl.shaderSource(vertexShader, vertexShaderSource);
gl.compileShader(vertexShader);
if (!gl.getShaderParameter(vertexShader, gl.COMPILE_STATUS)) {
console.log(gl.getShaderInfoLog(vertexShader));
}
let fragmentShader = gl.createShader(gl.FRAGMENT_SHADER);
gl.shaderSource(fragmentShader, fragmentShaderSource);
gl.compileShader(fragmentShader);
if (!gl.getShaderParameter(fragmentShader, gl.COMPILE_STATUS)) {
console.log(gl.getShaderInfoLog(fragmentShader));
}
// attach shaders to program
let program = gl.createProgram();
gl.attachShader(program, vertexShader);
gl.attachShader(program, fragmentShader);
gl.linkProgram(program);
gl.useProgram(program);
// write vertex data to the GPU
let vertexBuffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, vertexBuffer);
gl.bufferData(
gl.ARRAY_BUFFER,
new Float32Array(mesh.vertices),
gl.STATIC_DRAW
);
// write color data to the GPU
let colorBuffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, colorBuffer);
gl.bufferData(
gl.ARRAY_BUFFER,
new Float32Array(mesh.colors),
gl.STATIC_DRAW
);
let triangleBuffer = gl.createBuffer();
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, triangleBuffer);
gl.bufferData(
gl.ELEMENT_ARRAY_BUFFER,
new Uint16Array(mesh.triangles),
gl.STATIC_DRAW
);
// enable the a_Position attribute
let a_Position = gl.getAttribLocation(program, "a_Position");
gl.enableVertexAttribArray(a_Position);
// enable the a_Color attribute
let a_Color = gl.getAttribLocation(program, "a_Color");
gl.enableVertexAttribArray(a_Color);
// associate the buffer data for vertex coordinates
// with the attribute a_Position
gl.bindBuffer(gl.ARRAY_BUFFER, vertexBuffer);
gl.vertexAttribPointer(a_Position, 3, gl.FLOAT, false, 0, 0);
// associate the buffer data for vertex color
// with the attribute a_Color
gl.bindBuffer(gl.ARRAY_BUFFER, colorBuffer);
gl.vertexAttribPointer(a_Color, 3, gl.FLOAT, false, 0, 0);
gl.clearColor(0.2, 0.2, 0.5, 0.6);
gl.clear(gl.DEPTH_BUFFER_BIT | gl.COLOR_BUFFER_BIT);
let viewMatrix = mat4.lookAt(
mat4.create(),
vec3.fromValues(2, 1, 5),
vec3.create(),
vec3.fromValues(0, 1, 0)
);
let projectionMatrix = mat4.perspective(
mat4.create(),
Math.PI / 4,
1.0,
1e-3,
1000
);
let u_ViewMatrix = gl.getUniformLocation(program, "u_ViewMatrix");
let u_ProjectionMatrix = gl.getUniformLocation(
program,
"u_ProjectionMatrix"
);
gl.uniformMatrix4fv(u_ViewMatrix, false, viewMatrix);
gl.uniformMatrix4fv(u_ProjectionMatrix, false, projectionMatrix);
let dragging = false;
let mouseX;
let modelMatrix = mat4.create();
canvas.addEventListener("mousedown", (event) => {
dragging = true;
mouseX = event.pageX;
});
canvas.addEventListener("mouseup", () => {
dragging = false;
});
canvas.addEventListener("mousemove", (event) => {
if (!dragging) return;
const dx = (4 * (event.pageX - mouseX)) / canvas.width;
modelMatrix = mat4.multiply(
mat4.create(),
mat4.fromYRotation(mat4.create(), dx),
modelMatrix
);
mouseX = event.pageX;
draw();
});
const draw = () => {
gl.clear(gl.DEPTH_BUFFER_BIT | gl.COLOR_BUFFER_BIT);
let u_ModelMatrix = gl.getUniformLocation(program, "u_ModelMatrix");
gl.uniformMatrix4fv(u_ModelMatrix, false, modelMatrix);
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, triangleBuffer);
gl.drawElements(
gl.TRIANGLES,
mesh.triangles.length,
gl.UNSIGNED_SHORT,
0
);
};
draw();
</script>
</body>
</html>
© Philip Claude Caplan, 2023 (Last updated: 2023-11-02)