Lab 11: Cloth Animation

Goals

  • Implement the cloth animation model we introduced in class, which consists of (1) updating particle positions according to the external forces and then (2) adjusting particle positions to satisfy the spring constraints.
  • Practice with mesh indexing.
  • Get some more practice writing a renderer from start to finish.

The primary goal of this lab is to implement a model for animating cloth. You'll also get some more practice with mesh indexing and setting up a renderer from start to finish.

The ClothAnimation class is the main driver for the cloth animation. It stores two arrays: points (an array of Point objects) and constraints (an array of Constraint objects). The point locations are iteratively updated in the ClothAnimation update method in two stages (as described in class): (1) move all points using Verlet integration with the external forces, and then (2) satisfy the constraint imposed by the spring force on each edge. Note that the ClothAnimation constructor assigns an infinite mass (and hence a zero inverse mass) to three points at the top of the cloth, thereby pinning them.
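
In code, the two-stage update might look like the following sketch, using the move and satisfy methods you'll write in Part 1 (the exact loop structure in the template may differ):

// Sketch of the two-stage ClothAnimation update described above.
update() {
  // Stage 1: move every point according to the external forces (Verlet integration).
  for (const point of this.points) {
    point.move();
  }
  // Stage 2: satisfy the spring constraint on every edge.
  for (const constraint of this.constraints) {
    constraint.satisfy();
  }
}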

Note that the ClothAnimation constructor takes in the number of points to use in the horizontal (nx) and vertical (ny) directions along the cloth. This means the number of edges (constraints/springs) along the horizontal direction is nx - 1 and the number of edges along the vertical direction is ny - 1. The total number of points is nx * ny. Please see the figure below for more details:

There are also Point and Constraint class definitions, which you'll complete in Part 1. A Point object stores the current location ($\vec p^k$) of a point in the cloth (a vec3), as well as the previous location ($\vec p^{k-1}$). It also stores the mass ($m$) and inverseMass ($\frac{1}{m}$) of the point. The latter can be used to "fix" points in the cloth by setting inverseMass to zero (hence an infinite mass).

Each Constraint stores the two endpoint Point objects, as well as the initial rest length of the spring ($\ell_0$). Thus, for a Constraint object called constraint, constraint.p.current is $\vec p$ in the equations we saw in class (for Stage 2 in the update), and constraint.q.current is $\vec q$.
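
For reference, a minimal skeleton of these two classes (consistent with the description above) is sketched below; plain arrays are used in place of the template's vec3s, and restLength is an assumed field name, so use whatever names the template provides.

// Sketch of the Point and Constraint data layout; restLength is an assumed name.
class Point {
  constructor(x, y, z, mass) {
    this.current = [x, y, z];       // current position p^k
    this.previous = [x, y, z];      // previous position p^(k-1)
    this.mass = mass;
    this.inverseMass = 1.0 / mass;  // set to 0 to pin the point (infinite mass)
  }
}

class Constraint {
  constructor(p, q) {
    this.p = p; // first endpoint (a Point)
    this.q = q; // second endpoint (a Point)
    const dx = q.current[0] - p.current[0];
    const dy = q.current[1] - p.current[1];
    const dz = q.current[2] - p.current[2];
    this.restLength = Math.sqrt(dx * dx + dy * dy + dz * dz); // initial rest length l0
  }
}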

Part 1: Update points and constraints.

For PART 1A, please complete the Point move function. The external force should just be gravity for now - this vec3 has been defined as fext at the beginning of the function. Use a time step of $\Delta t = 5\cdot 10^{-3}$ (you can adjust this later if you want). Apply the Verlet integration update to compute the new position of the point:

$$ \vec p^{k+1} = 2 \vec p^{k} - \vec p^{k-1} + \frac{1}{m} \Delta t^2 \sum \vec f_{\mathrm{ext}}. $$

Make sure to update the previous and current positions of the point at the end of this function (i.e. the original current is the new previous, and the new current is $\vec p^{k+1}$). When copying vec3s, I would recommend using the .slice() method. I would also suggest using this.inverseMass when computing the update so that fixed points do not experience a contribution from the external forces.
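
As a concrete (but hedged) sketch of PART 1A, assuming plain arrays for the positions and gravity along the negative y-axis (the template's fext definition takes precedence):

// Sketch of Point.move implementing the Verlet update above.
move() {
  const dt = 5e-3;                        // time step
  const fext = [0.0, -9.81, 0.0];         // external force (gravity); value assumed
  const next = this.current.slice();      // will hold p^(k+1)
  for (let d = 0; d < 3; d++) {
    next[d] = 2.0 * this.current[d] - this.previous[d]
            + this.inverseMass * dt * dt * fext[d];
  }
  this.previous = this.current.slice();   // old current becomes the new previous
  this.current = next;                    // store p^(k+1)
}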

For PART 1B, please complete the Constraint satisfy function (to satisfy the spring constraints). See the pseudocode in the lecture notes. Note that mp = this.p.mass and mq = this.q.mass for the Constraint object.
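
The lecture pseudocode is written in terms of mp and mq; the sketch below uses the equivalent inverse-mass weights wp = 1/mp and wq = 1/mq, which also handle pinned points cleanly. It is only a sketch, not necessarily the exact pseudocode from class.

// Sketch of Constraint.satisfy: project both endpoints toward the rest length.
satisfy() {
  const p = this.p.current;
  const q = this.q.current;
  const d = [q[0] - p[0], q[1] - p[1], q[2] - p[2]];               // edge vector
  const dist = Math.sqrt(d[0] * d[0] + d[1] * d[1] + d[2] * d[2]); // current length
  const diff = (dist - this.restLength) / dist;                    // relative stretch
  const wp = this.p.inverseMass;
  const wq = this.q.inverseMass;
  if (wp + wq === 0.0) return;                                     // both endpoints pinned
  for (let k = 0; k < 3; k++) {
    p[k] += (wp / (wp + wq)) * diff * d[k]; // heavier (or pinned) points move less
    q[k] -= (wq / (wp + wq)) * diff * d[k];
  }
}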

Once this works, your cloth should move (but remain pinned at the top).

For PART 1C (not labelled in the template), please add a random z-component to the external force (fext) to create a wind effect. I would recommend setting $f_{\mathrm{ext},z} = -0.5 + r$ where $r$ is a random value between 0 and 1 (from Math.random()).
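
For example, inside the move function (a one-line sketch; adapt it to however fext is defined in the template):

fext[2] = -0.5 + Math.random(); // random z-component in [-0.5, 0.5) for wind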

Part 2: Render the cloth using WebGL.

In this part, you will render the cloth using WebGL. Currently the cloth points and lines are rendered using the 2d context of the HTML canvas. Now, you will use the webgl context to render the cloth as a set of triangles. The rendering context can be switched to the WebGL context by clicking the button to the right of the "animate" button - you can always click this button again to return to using the 2d context.

We first need to set up the triangles that define the cloth. The body of the for-loop at PART 2A handles a particular square in the cloth (highlighted in green in the image above). The lower-left Point of this green square has index $j n_x + i$. Please determine the indices of the other three corners of this square and then push two triangles to this.triangles. You can verify this part by checking the length of this.triangles after the nested for-loop: there are $(n_x - 1) \times (n_y - 1)$ squares, each square is divided into two triangles, and each triangle is described by three point indices.
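
A sketch of this indexing is given below, assuming this.triangles is a flat array of point indices (three per triangle) and that i runs over the horizontal direction and j over the vertical direction, as in the figure:

// Sketch for PART 2A: two triangles per square of the cloth.
for (let j = 0; j < ny - 1; j++) {
  for (let i = 0; i < nx - 1; i++) {
    const ll = j * nx + i;           // lower-left corner
    const lr = ll + 1;               // lower-right corner
    const ul = (j + 1) * nx + i;     // upper-left corner
    const ur = ul + 1;               // upper-right corner
    this.triangles.push(ll, lr, ur); // first triangle
    this.triangles.push(ll, ur, ul); // second triangle
  }
}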

For PART 2B, we need to extract the coordinates of each Point object in every draw call. Note that we are not using transform feedback as in the previous lab - we are indeed writing the point coordinates to the GPU during every draw call. Please complete the loop to extract the current coordinates of every Point object to the position array.
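
A minimal sketch of this loop (the template may already allocate the position array for you):

// Sketch for PART 2B: copy the current point coordinates into a flat array.
const position = new Float32Array(3 * this.points.length);
for (let i = 0; i < this.points.length; i++) {
  position[3 * i + 0] = this.points[i].current[0];
  position[3 * i + 1] = this.points[i].current[1];
  position[3 * i + 2] = this.points[i].current[2];
}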

For PART 2C (not labelled in the template), please complete the rest of the initGL and drawGL functions to draw the cloth with WebGL. You should use the Phong reflection model to calculate the color in the fragment shader, which means you will also need to calculate a normal at every vertex and write these normals to the GPU (enable an attribute, point the buffer to this attribute, etc.). The initGL function should set up the shader program and any buffers that hold static data during the animation. For example, the triangles don't change, so you might want to create a buffer and write the triangle indices to the GPU here. The drawGL function will invoke the rendering pipeline after enabling & associating attributes with buffers, and will also write the current cloth particle positions and normals to the GPU.

Recommendation: Break up your implementation of PART 2C into steps. First draw the triangles (leave out the normals for now) and set gl_FragColor to a constant color. Then write the normals (be sure to calculate and write the normal matrix to a uniform) and add the Phong reflection model in the fragment shader.

Note that there is a computeNormals function in normals.js (which is loaded in index.html). Vertex normals can be calculated from the triangles and current point coordinates using normals = computeNormals(position, this.triangles). You'll need to write these normals to the GPU to use a shading model in the fragment shader.

Note that this.viewMatrix and this.projectionMatrix are set up in the ClothAnimation constructor.
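
Putting these pieces together, drawGL might follow the outline below. All buffer, attribute, and uniform names (positionBuffer, normalBuffer, triangleBuffer, a_Position, a_Normal, u_ViewMatrix, u_ProjectionMatrix) and the extractPositions helper are assumptions, so adapt them to your own shader program and initGL setup.

// Sketch of drawGL; names of buffers, attributes, and uniforms are assumptions.
drawGL() {
  const gl = this.gl;
  gl.useProgram(this.program);

  // extract current positions (PART 2B) and compute vertex normals
  const position = this.extractPositions(); // hypothetical helper wrapping the PART 2B loop
  const normals = computeNormals(position, this.triangles);

  // write positions and associate them with the position attribute
  gl.bindBuffer(gl.ARRAY_BUFFER, this.positionBuffer);
  gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(position), gl.DYNAMIC_DRAW);
  const a_Position = gl.getAttribLocation(this.program, "a_Position");
  gl.vertexAttribPointer(a_Position, 3, gl.FLOAT, false, 0, 0);
  gl.enableVertexAttribArray(a_Position);

  // write normals and associate them with the normal attribute
  gl.bindBuffer(gl.ARRAY_BUFFER, this.normalBuffer);
  gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(normals), gl.DYNAMIC_DRAW);
  const a_Normal = gl.getAttribLocation(this.program, "a_Normal");
  gl.vertexAttribPointer(a_Normal, 3, gl.FLOAT, false, 0, 0);
  gl.enableVertexAttribArray(a_Normal);

  // view and projection matrices from the ClothAnimation constructor
  gl.uniformMatrix4fv(gl.getUniformLocation(this.program, "u_ViewMatrix"), false, this.viewMatrix);
  gl.uniformMatrix4fv(gl.getUniformLocation(this.program, "u_ProjectionMatrix"), false, this.projectionMatrix);
  // remember to also compute and write the normal matrix uniform (see the recommendation above)

  // triangle indices were written to an ELEMENT_ARRAY_BUFFER (as a Uint16Array) in initGL
  gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, this.triangleBuffer);
  gl.drawElements(gl.TRIANGLES, this.triangles.length, gl.UNSIGNED_SHORT, 0);
}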

In your shading model, should you use a bidirectional lighting model for the diffusion term? Or should the diffusion term be set to zero when the dot product between the normal and light direction is negative? Recall the notes in Chapter 4.

Part 3: Choose your own adventure.

For a complete lab, please pick one of the following features to implement. Of course, you are free to propose your own extension.

a. Turn the cloth into a flag of your choice. This will involve writing texture coordinates in the $[0, 1]^2$ domain for each point, setting up a texture, and then sampling the texture in the fragment shader. Note the <img> element with the id "cloth-picture" in index.html. Hint on setting up the texture coordinates: this will be similar to how the points are initially set up. Feel free to upload a different picture for the flag! You can also change the fixed array defined in the ClothAnimation constructor if you want to pin the cloth on the left (instead of at the top).

b. Add user interaction. Add an event listener to the canvas so that when the user clicks on the screen (mousedown), the nearest cloth point jumps to this location. Similar to the wind force in Lab 10, you can cast a ray through the pixel where the mouse is clicked (event.offsetX, event.offsetY) into the world, and intersect it with the plane with normal $(0, 0, 1)$ centered on the point $(0, 0, 0)$. Then find the closest cloth point to the world-space mouse point and set this cloth point to the mouse point (the change will be pretty abrupt). I would suggest adding this event listener to the canvas in the initialize method of the ClothAnimation.

c. Turn the cloth into a cylindrical sleeve. This will require changing how the initial coordinates for each Point object are created. Specifically, you can set $x = 0.5 + R\cos\left(\frac{2\pi i}{n_x - 1}\right)$, $y = \frac{j}{n_y - 1}$ and $z = 0.5 + R \sin\left(\frac{2\pi i}{n_x - 1}\right)$ for some choice of the radius $R$ (a sketch of this setup is given after this list). All the points at the top of the sleeve should be pinned as in the following video (on the left):

You do not need to make the sleeve line up with itself at the seam, but it's a good challenge to try out!

d. Render the lines representing the constraints with WebGL (in the right-most image above). You'll need to extract the indices of each line endpoint - hint: see how the cloth constraints are set up. You'll then need an ELEMENT_ARRAY_BUFFER for the edge indices and a separate call to drawElements (using gl.LINES). The fragment color can be set to black when you are drawing lines, so you'll need a uniform to know whether you are drawing triangles or lines. I would also recommend enabling gl.POLYGON_OFFSET_FILL for this feature (and perhaps setting gl.polygonOffset(2, 3)).
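
For option (c), the initial coordinate setup might look like the following sketch, assuming the same nested i/j loops used to create the Point objects; the radius R is an assumed value:

// Sketch for option (c): initial cylindrical coordinates for the cloth points.
const R = 0.2; // assumed radius - pick whatever looks good
for (let j = 0; j < ny; j++) {
  for (let i = 0; i < nx; i++) {
    const theta = (2.0 * Math.PI * i) / (nx - 1);
    const x = 0.5 + R * Math.cos(theta);
    const y = j / (ny - 1);
    const z = 0.5 + R * Math.sin(theta);
    // create the Point at (x, y, z) as in the original setup
  }
}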

Submission

All labs are due on May 16th. I will regularly grade the labs as they are submitted on Gradescope. Please see the Setup page for instructions on how to commit and push your work to your GitHub repository and then submit your repository to Gradescope (in the Lab 11 assignment).

Remember, if you see a git error when you try to sync changes, you might need to run the following two commands in the Terminal before being able to sync (push) your changes:

git config --global pull.rebase true
git pull

© Philip Caplan, 2025