The primary goal of this lab is to implement a model for animating cloth. You'll also get some more practice with mesh indexing and setting up a renderer from start-to-finish.
The ClothAnimation class is the main driver for the cloth animation, which stores two arrays: points (an array of Point objects) and constraints (an array of Constraint objects). The point locations are iteratively updated in the ClothAnimation update method in two stages (as described in class): (1) move all points using Verlet integration with the external forces, and then (2) satisfy the constraints on the edges imposed by the spring force on each edge. Note that the ClothAnimation constructor sets an infinite mass (and hence a zero inverse mass) to three points at the top of the cloth, hence pinning them.
Note that the ClothAnimation constructor takes in the number of points to use in the horizontal (nx) and vertical (ny) directions along the cloth. This means the number of edges (constraints/springs) along the horizontal direction is nx - 1 and the number of edges along the vertical direction is ny - 1. The total number of points is nx * ny. Please see the figure below for more details:
There are also Point and Constraint class definitions which you'll complete in Part 1. A Point object stores the current location (a vec3), the previous location (also a vec3), the mass, and the inverse mass (inverseMass). Pinning a point amounts to setting its inverseMass to zero (hence an infinite mass).
Each Constraint stores the two endpoint Point objects (p and q), as well as the initial rest length of the spring. For a Constraint object called constraint, constraint.p.current is the current location of the first endpoint and constraint.q.current is the current location of the second endpoint.
For PART 1A, please complete the Point move function. The external force should just be gravity for now - this vec3 has been defined as fext at the beginning of the function. Use a fixed time step for the Verlet update. Make sure to update the previous and current positions of the point at the end of this function (i.e. the original current becomes the new previous, and the new current is the newly integrated position). When copying vec3s, I would recommend using the .slice() method. I would also suggest using this.inverseMass when computing the update so that fixed points do not experience a contribution from the external forces.
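As a rough reference, here is a minimal sketch of what the move method could look like. The gravity magnitude and time step value below are placeholders (use the values from the handout/template), and the exact field names may differ from your template.

```javascript
// Hypothetical sketch of PART 1A; the gravity value and time step are placeholders.
move() {
  const dt = 0.01;                 // time step (placeholder: use the value from the handout)
  const fext = [0.0, -9.81, 0.0];  // external force: gravity only, for now (placeholder magnitude)

  // acceleration = force * inverse mass (zero for pinned points)
  const a = [
    fext[0] * this.inverseMass,
    fext[1] * this.inverseMass,
    fext[2] * this.inverseMass,
  ];

  // Verlet integration: x_new = 2 * x - x_prev + a * dt^2
  const next = [
    2 * this.current[0] - this.previous[0] + a[0] * dt * dt,
    2 * this.current[1] - this.previous[1] + a[1] * dt * dt,
    2 * this.current[2] - this.previous[2] + a[2] * dt * dt,
  ];

  // the original current becomes the new previous, and the new current is the integrated position
  this.previous = this.current.slice();
  this.current = next;
}
```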
For PART 1B, please complete the Constraint satisfy function (to satisfy the spring constraints). See the pseudocode in the lecture notes. Note that mp = this.p.mass and mq = this.q.mass for the Constraint object.
Once this works, your cloth should move (but still pinned at the top).
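For reference, here is one way satisfy could look. This sketch weights the correction by each endpoint's inverse mass (equivalent to the mass-weighted form in the lecture pseudocode), so pinned points never move; restLength is an assumed name for the stored rest length, so adapt it to your template.

```javascript
// Hypothetical sketch of PART 1B; follow the lecture pseudocode for the graded version.
satisfy() {
  const p = this.p.current;
  const q = this.q.current;

  // current edge vector and its length
  const dx = q[0] - p[0], dy = q[1] - p[1], dz = q[2] - p[2];
  const length = Math.sqrt(dx * dx + dy * dy + dz * dz);

  const wp = this.p.inverseMass;
  const wq = this.q.inverseMass;
  if (length === 0 || wp + wq === 0) return; // degenerate edge or both endpoints pinned

  // fraction of the length error each endpoint absorbs
  const scale = (length - this.restLength) / (length * (wp + wq));

  p[0] += wp * scale * dx;  p[1] += wp * scale * dy;  p[2] += wp * scale * dz;
  q[0] -= wq * scale * dx;  q[1] -= wq * scale * dy;  q[2] -= wq * scale * dz;
}
```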
For PART 1C (not labelled in the template), please add a random z-component to the external force (fext) to create a wind effect. I would recommend computing this component with Math.random() (see the sketch below).
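For example (the magnitude here is just a guess to tune by eye):

```javascript
// Hypothetical wind: give fext a small random z-component each update.
fext[2] = 0.2 * (Math.random() - 0.5);
```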
WebGL

In this part, you will render the cloth using WebGL. Currently the cloth points and lines are rendered using the 2d context of the HTML canvas. Now, you will use the webgl context to render the cloth as a set of triangles. The rendering context can be switched to the WebGL context by clicking the button to the right of the "animate" button - you can always click this button again to return to using the 2d context.
We first need to set up the triangles that define the cloth. The body of the for-loop at PART 2A treats a particular square in the cloth (highlighted in green in the image above). The lower left Point of this green square has an index determined by the two loop variables (see the figure above), and each square contributes two triangles whose point indices should be appended to this.triangles. You can check if this part is correct by checking the length of this.triangles after the nested for-loop: there are 2 * (nx - 1) * (ny - 1) triangles, so this.triangles should contain 6 * (nx - 1) * (ny - 1) indices.
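A sketch of the nested loop, assuming points are stored row by row so that the point in column i and row j has index j * nx + i (check this convention, and the loop variable names, against the figure and your template):

```javascript
// Hypothetical sketch of PART 2A: two triangles per grid square.
for (let j = 0; j < ny - 1; j++) {
  for (let i = 0; i < nx - 1; i++) {
    const lowerLeft  = j * nx + i;   // lower-left corner of the green square
    const lowerRight = lowerLeft + 1;
    const upperLeft  = lowerLeft + nx;
    const upperRight = upperLeft + 1;

    // split the square into two triangles
    this.triangles.push(lowerLeft, lowerRight, upperRight);
    this.triangles.push(lowerLeft, upperRight, upperLeft);
  }
}
```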
For PART 2B, we need to extract the coordinates of each Point object in every draw call. Note that we are not using transform feedback as in the previous lab - we are indeed writing the point coordinates to the GPU during every draw call. Please complete the loop to extract the current coordinates of every Point object to the position array.
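A sketch of this loop, assuming position is a flat array with three entries per point:

```javascript
// Hypothetical sketch of PART 2B: copy the current point coordinates into the position array.
for (let k = 0; k < this.points.length; k++) {
  const x = this.points[k].current;
  position[3 * k + 0] = x[0];
  position[3 * k + 1] = x[1];
  position[3 * k + 2] = x[2];
}
```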
For PART 2C (not labelled in the template), please complete the rest of the initGL and drawGL functions to draw the cloth with WebGL. You should use the Phong reflection model to calculate the color in the fragment shader, which means you will also need to calculate a normal at every vertex and write these to the GPU (enable an attribute, point the buffer to this attribute, etc.). The initGL function should set up the shader program and any buffers that hold static data during the animation. For example, the triangles don't change, so you might want to create a buffer and write the triangle indices to the GPU here. The drawGL function will invoke the rendering pipeline after enabling & associating attributes with buffers, and will also write the current cloth particle positions and normals to the GPU.
Recommendation: Break up your implementation of PART 2C into steps. First draw the triangles (leave out the normals for now) and set gl_FragColor to a constant color. Then write the normals (be sure to calculate and write the normal matrix to a uniform) and add the Phong reflection model in the fragment shader.
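To make these steps concrete, here is a rough sketch of the per-frame part of drawGL. It assumes initGL has already compiled the shader program (this.program), created this.positionBuffer and this.normalBuffer, and uploaded the triangle indices (as 16-bit integers) to this.triangleBuffer; the attribute and uniform names (a_Position, a_Normal, u_ViewMatrix, u_ProjectionMatrix) are placeholders, not necessarily the names in your template.

```javascript
// Hypothetical drawGL sketch; all buffer, attribute, and uniform names are assumptions.
gl.useProgram(this.program);

// position was filled from the Point objects in PART 2B;
// recompute the vertex normals from the current geometry.
const normals = computeNormals(position, this.triangles);

// upload the positions and associate them with the position attribute
gl.bindBuffer(gl.ARRAY_BUFFER, this.positionBuffer);
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(position), gl.DYNAMIC_DRAW);
const aPosition = gl.getAttribLocation(this.program, "a_Position");
gl.enableVertexAttribArray(aPosition);
gl.vertexAttribPointer(aPosition, 3, gl.FLOAT, false, 0, 0);

// same for the normals
gl.bindBuffer(gl.ARRAY_BUFFER, this.normalBuffer);
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(normals), gl.DYNAMIC_DRAW);
const aNormal = gl.getAttribLocation(this.program, "a_Normal");
gl.enableVertexAttribArray(aNormal);
gl.vertexAttribPointer(aNormal, 3, gl.FLOAT, false, 0, 0);

// view and projection matrices are set up in the constructor;
// also compute and upload the normal matrix as a uniform here.
gl.uniformMatrix4fv(gl.getUniformLocation(this.program, "u_ViewMatrix"), false, this.viewMatrix);
gl.uniformMatrix4fv(gl.getUniformLocation(this.program, "u_ProjectionMatrix"), false, this.projectionMatrix);

// draw the triangles using the index buffer created in initGL
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, this.triangleBuffer);
gl.drawElements(gl.TRIANGLES, this.triangles.length, gl.UNSIGNED_SHORT, 0);
```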
Note that there is a computeNormals function in normals.js (which is loaded in index.html). Vertex normals can be calculated from the triangles and current point coordinates using normals = computeNormals(position, this.triangles). You'll need to write these normals to the GPU to use a shading model in the fragment shader.
Note that this.viewMatrix and this.projectionMatrix are set up in the ClothAnimation constructor.
In your shading model, should you use a bidirectional lighting model for the diffusion term? Or should the diffusion term be set to zero when the dot product between the normal and light direction is negative? Recall the notes in Chapter 4.
For a complete lab, please pick one of the following features to implement. Of course, you are free to propose your own extension.
a. Turn the cloth into a flag of your choice. This will involve writing texture coordinates for each point and using the image in the <img> element with the id "cloth-picture" in index.html as the texture. Hint on setting up the texture coordinates: this will be similar to how the points are initially set up. Feel free to upload a different picture for the flag! You can also change the fixed array defined in the ClothAnimation constructor if you want to pin the cloth on the left (instead of at the top).
b. Add user interaction. Add an event listener to the canvas so that when the user clicks on the screen (mousedown), the nearest cloth point jumps to this location. Similar to the wind force in Lab 10, you can cast a ray through the pixel where the mouse is clicked (event.offsetX, event.offsetY) into the world, and intersect it with the plane the cloth initially lies in (see how the points are created in the initialize method of the ClothAnimation).
c. Turn the cloth into a cylindrical sleeve. This will require changing how the initial coordinates for each Point object are created. Specifically, you can place the points on a cylinder instead of a plane (e.g. using cosine and sine of the horizontal parameter for two of the coordinates). You do not need to make the sleeve line up with itself at the seam, but it's a good challenge to try out!
d. Render the lines representing the constraints with WebGL (in the right-most image above). You'll need to extract the indices of each line endpoint - hint: see how the cloth constraints are set up. You'll then need an ELEMENT_ARRAY_BUFFER for the edge indices and a separate call to drawElements (using gl.LINES). The fragment color can be set to black when you are drawing lines, so you'll need a uniform to know whether you are drawing triangles or lines. I would also recommend enabling gl.POLYGON_OFFSET_FILL for this feature (and perhaps setting gl.polygonOffset(2, 3)); see the sketch after this list.
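A sketch of the drawing side of option (d), assuming the edge indices have already been uploaded to an ELEMENT_ARRAY_BUFFER called this.edgeBuffer and the fragment shader checks a uniform (here u_drawLines) to choose between the Phong color and black; the names and the index count are placeholders.

```javascript
// Hypothetical sketch for option (d); buffer, uniform, and count names are assumptions.
gl.enable(gl.POLYGON_OFFSET_FILL);
gl.polygonOffset(2, 3); // push the triangles back slightly so the lines stay visible

// draw the filled cloth triangles first
gl.uniform1i(gl.getUniformLocation(this.program, "u_drawLines"), 0);
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, this.triangleBuffer);
gl.drawElements(gl.TRIANGLES, this.triangles.length, gl.UNSIGNED_SHORT, 0);

// then overlay the constraint edges as black lines
gl.uniform1i(gl.getUniformLocation(this.program, "u_drawLines"), 1);
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, this.edgeBuffer);
gl.drawElements(gl.LINES, this.nEdgeIndices, gl.UNSIGNED_SHORT, 0);
```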
All labs are due on May 16th. I will regularly grade the labs as they are submitted on Gradescope. Please see the Setup page for instructions on how to commit and push your work to your GitHub repository and then submit your repository to Gradescope (in the Lab 11 assignment).
Remember, if you see a git error when you try to sync changes, you might need to run the following two commands in the Terminal before being able to sync (push) your changes:
git config --global pull.rebase true
git pull