Lab 11: Cloth Animation

Goals

By the end of this lab, you will:

  • implement the cloth animation model we introduced in class, which consists of (1) updating particles according to external forces and then (2) updating particle positions to satisfy the spring constraints,
  • practice some more with mesh indexing,
  • practice some more writing a renderer from start to finish.

The primary goal of this lab is to implement a model for animating cloth. You'll also get some more practice with mesh indexing and setting up a renderer from start to finish.

The ClothAnimation class is the main driver for the cloth animation. It stores two arrays: points (an array of Point objects) and constraints (an array of Constraint objects). The point locations are iteratively updated in the ClothAnimation update method in two stages (as described in class): (1) move all points using Verlet integration with the external forces, and then (2) satisfy the constraints imposed by the spring force on each edge. Note that the ClothAnimation constructor assigns an infinite mass (and hence a zero inverse mass) to three points at the top of the cloth, thereby pinning them.

Note that the ClothAnimation constructor takes in the number of points to use in the horizontal (nx) and vertical (ny) directions along the cloth. This means the number of edges (constraints/springs) along the horizontal direction is nx - 1 and the number of edges along the vertical direction is ny - 1. The total number of points is nx * ny. Please see the figure below for more details:

There are also Point and Constraint class definitions which you'll complete in Part 1. The point stores the current location ($\vec p^k$) of a point in the cloth (a vec3), as well as the previous location ($\vec p^{k-1}$). It also stores the mass ($m$) and inverseMass ($\frac{1}{m}$) of the point. The latter can be used to "fix" points in the cloth, by setting inverseMass to zero (hence an infinite mass).

Each Constraint stores the two endpoint Point objects, as well as the initial rest length of the spring ($\ell_0$). Thus, for a Constraint object called constraint, constraint.p.current is $\vec p$ in the equations we saw in class (for Stage 2 in the update), and constraint.q.current is $\vec q$.

Some utilities for compiling a shader program, setting up a texture and computing normals from a mesh are provided in the utils.js file.

Part 0: Course Response Form.

Please complete the Course Response Form (CRF) if you were not in class on Tuesday. Your feedback is very important when designing future versions of this course!

If you have already completed the CRF, the initial lab template is included at the bottom of this page so you can get started reading through the components you will implement. As you read through the template, please have a look at the transformToScreen function at the top: why is there a division by q[3]? Hint: see the notes from Week 7.

Part 1: Update points and constraints.

For PART 1A, please complete the Point move function. The external force should just be gravity for now - this vec3 has been defined as fext at the beginning of the function. Use a time step of $\Delta t = 5\cdot 10^{-3}$ (you can adjust this later if you want). Apply the Verlet integration update to compute the new position of the point:

$$ \vec p^{k+1} = 2 \vec p^{k} - \vec p^{k-1} + \frac{1}{m} \Delta t^2 \sum \vec f_{\mathrm{ext}}. $$

Make sure to update the previous and current positions of the point at the end of this function (i.e. the original current is the new previous, and the new current is $\vec p^{k+1}$). When copying vec3s, I would recommend using the .slice() method. I would also suggest using this.inverseMass when computing the update so fixed points do not experience a contribution from the external forces.
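As a sanity check for your PART 1A implementation, here is a minimal standalone sketch of the Verlet update, written with plain arrays instead of glMatrix vec3s (the function name and argument order are my own choices, not part of the template):

```javascript
// Verlet step for one point: returns p^{k+1} given p^k, p^{k-1},
// the inverse mass (zero for pinned points), and the external force.
function verletStep(current, previous, inverseMass, fext, dt) {
  const next = new Array(3);
  for (let d = 0; d < 3; d++) {
    // p^{k+1} = 2 p^k - p^{k-1} + (1/m) dt^2 fext
    next[d] = 2 * current[d] - previous[d] + inverseMass * dt * dt * fext[d];
  }
  return next;
}

// A pinned point (inverseMass = 0) should not move:
const next = verletStep([0, 1, 0], [0, 1, 0], 0, [0, -0.4905, 0], 5e-3);
// next is [0, 1, 0]
```

In the Point move method itself, remember to finish by saving the old this.current into this.previous (with .slice()) before overwriting this.current with the new position.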

For PART 1B, please complete the Constraint satisfy function (to satisfy the spring constraints). See the pseudocode in the lecture notes. Note that mp = this.p.mass and mq = this.q.mass for the Constraint object.
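As a cross-check for PART 1B, here is one common form of the distance-constraint projection (the position-based dynamics version); please verify it against the pseudocode in the lecture notes before relying on it. The function below uses plain arrays and mutates p and q in place:

```javascript
// Project the endpoints p and q back toward the rest length,
// distributing the correction according to the inverse masses.
function satisfyConstraint(p, q, invMassP, invMassQ, restLength) {
  const d = [q[0] - p[0], q[1] - p[1], q[2] - p[2]]; // vector from p to q
  const dist = Math.hypot(d[0], d[1], d[2]); // current spring length
  const wSum = invMassP + invMassQ;
  if (wSum === 0 || dist === 0) return; // both endpoints pinned, or degenerate

  // signed stretch, spread over the two endpoints by inverse mass
  const s = (dist - restLength) / (wSum * dist);
  for (let k = 0; k < 3; k++) {
    p[k] += invMassP * s * d[k]; // move p toward q when stretched
    q[k] -= invMassQ * s * d[k]; // move q toward p when stretched
  }
}
```

For example, two equal-mass endpoints at distance 2 with rest length 1 each move half of the excess length, so the edge meets its rest length exactly after one projection.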

Once this works, your cloth should move (but still pinned at the top).

For PART 1C (not labelled in the template), please add a random z-component to the external force (fext) to create a wind effect. I would recommend setting $f_{\mathrm{ext},z} = -0.5 + r$ where $r$ is a random value between 0 and 1 (from Math.random()).
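A hypothetical sketch of this wind force (variable names are placeholders; in the template this happens inside the Point move function where fext is already defined):

```javascript
const mass = 0.05; // per-point mass used by the template
const r = Math.random(); // uniform in [0, 1)
// gravity in y, plus a random z-component in [-0.5, 0.5) for wind
const fext = [0.0, -9.81 * mass, -0.5 + r];
```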

Part 2: Render the cloth using WebGL (M status).

In this part, you will render the cloth using WebGL. Currently the cloth points and lines are rendered using the 2d context of the HTML canvas. Now, you will use the webgl context to render the cloth as a set of triangles. The rendering context can be switched to the WebGL context by clicking the button to the right of the "animate" button - you can always click this button again to return to using the 2d context.

We first need to set up the triangles that define the cloth. The body of the for-loop at PART 2A treats a particular square in the cloth (highlighted in green in the image above). The lower-left corner of this square has index $j n_x + i$. Please determine the indices of the other three corners of this square and then push two triangles to this.triangles. You can check this part by inspecting the length of this.triangles after the nested for-loop: there are $(n_x - 1) \times (n_y - 1)$ squares, each square is divided into two triangles, and each triangle is described by three point indices.
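One possible indexing sketch for PART 2A, assuming the row-major point layout from the constructor (the helper name and the choice of diagonal are my own):

```javascript
// Indices of the two triangles covering the square whose
// lower-left corner is point k0 = j * nx + i.
function squareTriangles(i, j, nx) {
  const k0 = j * nx + i; // lower left
  const k1 = j * nx + i + 1; // lower right
  const k2 = (j + 1) * nx + i + 1; // upper right
  const k3 = (j + 1) * nx + i; // upper left
  // split along the k0-k2 diagonal, counterclockwise winding
  return [k0, k1, k2, k0, k2, k3];
}

// For a grid with nx = 4, the square at i = 0, j = 0 uses points 0, 1, 5, 4:
// squareTriangles(0, 0, 4) gives [0, 1, 5, 0, 5, 4]
```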

For PART 2B, we need to extract the coordinates of each Point object in every draw call. Note that we are not using transform feedback as in the previous lab - instead, we write the point coordinates to the GPU during every draw call. Please complete the loop that extracts the current coordinates of every Point object into the position array.
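A sketch of this extraction, wrapped in a standalone function so it can be checked on its own (in the template, the loop body writes into the position array already declared in drawGL):

```javascript
// Flatten each point's current vec3 into a Float32Array of xyz triples,
// in the layout expected by gl.bufferData for the position attribute.
function extractPositions(points) {
  const position = new Float32Array(3 * points.length);
  for (let i = 0; i < points.length; i++) {
    position[3 * i + 0] = points[i].current[0];
    position[3 * i + 1] = points[i].current[1];
    position[3 * i + 2] = points[i].current[2];
  }
  return position;
}
```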

For PART 2C (not labelled in the template), please complete the rest of the initGL and drawGL functions to draw the cloth with WebGL. You should use the Phong reflection model to calculate the color in the fragment shader. The initGL function should set up the shader program and any buffers that hold static data during the animation. For example, the triangles don't change, so you might want to create a buffer and write the triangle indices to the GPU here. The drawGL function will invoke the rendering pipeline after enabling and associating attributes with buffers; it will also write the current cloth particle positions and normals to the GPU.

I recommend breaking up your implementation of PART 2C into steps. First draw the triangles (leave out the normals for now) and set gl_FragColor to a constant color. Then write the normals (be sure to calculate and write the normal matrix to a uniform) and add the Phong reflection model in the fragment shader.
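To illustrate the kind of shaders PART 2C calls for, here is a hedged sketch in the template's style of GLSL stored in a JS string. All attribute and uniform names (a_Position, u_NormalMatrix, etc.) are my own choices, the light direction is arbitrary, and only the diffuse term is shown - you will still need to add the ambient and specular terms of the Phong model and make your own decision on the bidirectional question:

```javascript
const vertexShaderSource = `
  precision mediump float;
  attribute vec3 a_Position;
  attribute vec3 a_Normal;
  uniform mat4 u_ModelViewProjectionMatrix;
  uniform mat3 u_NormalMatrix;
  varying vec3 v_Normal;
  void main() {
    gl_Position = u_ModelViewProjectionMatrix * vec4(a_Position, 1.0);
    v_Normal = u_NormalMatrix * a_Normal; // normal in camera space
  }`;

const fragmentShaderSource = `
  precision mediump float;
  varying vec3 v_Normal;
  void main() {
    vec3 n = normalize(v_Normal);
    vec3 l = normalize(vec3(0.0, 0.0, 1.0)); // arbitrary light direction
    float diffuse = abs(dot(n, l)); // one of the two options discussed in class
    gl_FragColor = vec4(diffuse * vec3(0.2, 0.4, 0.8), 1.0);
  }`;
```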

Note that the normals are calculated from the triangles and current point coordinates using normals = computeNormals(position, this.triangles). You'll need to write these normals to the GPU to use a shading model in the fragment shader.

Note that this.viewMatrix and this.projectionMatrix are set up in the ClothAnimation constructor.

In your shading model, should you use a bidirectional lighting model for the diffusion term? Or should the diffusion term be set to zero when the dot product between the normal and light direction is negative? Recall the notes from Week 3.

Part 3: Choose your own adventure (E status).

For E status, please pick one of the following features to implement. Of course, you are free to propose your own extension.

a. Turn the cloth into a flag of your choice. This will involve writing texture coordinates in the $[0, 1]^2$ domain for each point, setting up a texture, and then sampling the texture in the fragment shader. Hint on setting up the texture coordinates: this will be similar to how the points are initially set up. Feel free to upload a different picture for the flag! You can also change the fixed array defined in the ClothAnimation constructor if you want to pin the cloth on the left (instead of at the top).

b. Turn the cloth into a cylindrical sleeve. This will require changing how the initial coordinates for each Point object are created. Specifically, you can set $x = 0.5 + R\cos\left(\frac{2\pi i}{n_x - 1}\right)$, $y = \frac{j}{n_y - 1}$ and $z = 0.5 + R \sin\left(\frac{2\pi i}{n_x - 1}\right)$ for some choice of the radius $R$ (note that both trigonometric arguments use $n_x - 1$ so the sleeve wraps a full circle). All the points at the top of the sleeve should be pinned as in the following video (on the left):

You do not need to make the sleeve line up with itself at the seam, but it's a good challenge to try out!
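A sketch of the cylindrical initial coordinates for option (b), using $n_x - 1$ in both trigonometric arguments so that $i = 0$ and $i = n_x - 1$ land at the same angle (the helper name is my own):

```javascript
// Initial position of point (i, j) on a cylindrical sleeve of radius R,
// centered at x = z = 0.5 and spanning y in [0, 1].
function cylinderPoint(i, j, nx, ny, R) {
  const theta = (2 * Math.PI * i) / (nx - 1);
  return [
    0.5 + R * Math.cos(theta), // x
    j / (ny - 1),              // y
    0.5 + R * Math.sin(theta), // z
  ];
}

// cylinderPoint(0, 0, 5, 5, 0.25) gives [0.75, 0, 0.5]
```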

c. Render the lines representing the constraints with WebGL (in the right-most image above). You'll need to extract the indices of each line endpoint - hint: see how the cloth constraints are set up. You'll then need an ELEMENT_ARRAY_BUFFER for the edge indices and a separate call to drawElements (using gl.LINES). The fragment color can be set to black when you are drawing lines so you'll need a uniform to know whether you are drawing triangles or lines. I would also recommend enabling gl.POLYGON_OFFSET_FILL for this feature (and perhaps setting gl.polygonOffset(2, 3)).

d. Add user interaction. Add an event listener to the canvas so that when the user clicks on the screen (mousedown), the nearest cloth point jumps to this location. Similar to the wind force in Lab 10, you can cast a ray through the pixel where the mouse is clicked (event.offsetX, event.offsetY) into the world, and intersect it with the plane with normal $(0, 0, 1)$ centered on the point $(0, 0, 0)$. Then find the closest cloth point to the world-space mouse point and set this cloth point to the mouse point (the change will be pretty abrupt). I would suggest adding this event listener to the canvas in the initialize method of the ClothAnimation.
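For option (d), once you have a world-space ray through the clicked pixel (origin at the eye, direction through the unprojected mouse point), the plane intersection itself is simple. Here is a sketch with plain arrays; the unprojection step, which needs the inverse view and projection matrices, is not shown:

```javascript
// Intersect a ray (origin o, direction dir) with the plane z = 0,
// which has normal (0, 0, 1) and passes through (0, 0, 0).
function intersectPlaneZ0(o, dir) {
  if (dir[2] === 0) return null; // ray parallel to the plane
  const t = -o[2] / dir[2]; // solve o.z + t * dir.z = 0
  if (t < 0) return null; // plane is behind the ray origin
  return [o[0] + t * dir[0], o[1] + t * dir[1], 0];
}

// A ray from the eye at (0.5, 0.5, 2) straight down the -z axis
// hits the cloth plane at (0.5, 0.5, 0):
// intersectPlaneZ0([0.5, 0.5, 2], [0, 0, -1]) gives [0.5, 0.5, 0]
```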

Submission

Completed labs can be submitted anytime and I will review the lab as soon as I see the notification in replit. All final lab revisions are due on December 14th.

When you and your partner are ready, please submit the assignment on replit. I will then make comments (directly on your repl) and enter your current grade status at the top of the index.html file.

Please also remember to submit your reflection for this week in this Google Form.

cloth.js template
/** 
 * Transform a point from world space to screen space.
 * @param {vec3} p, point
 * @param {mat4} m, transformation from world space to screen space.
 */
const transformToScreen = (p, m) => {
  const ph = vec4.fromValues(p[0], p[1], p[2], 1);
  const q = vec4.transformMat4(vec4.create(), ph, m);
  return vec3.fromValues(q[0] / q[3], q[1] / q[3], q[2] / q[3]);
};

class Point {
  /** 
   * Saves the mass and initial position of the cloth particle.
   * @param {vec3} x, initial position
   * @param {Number} mass
   */
  constructor(x, mass) {
    this.current = x.slice(); // vec3
    this.previous = x.slice(); // vec3
    this.mass = mass; // mass at this point
    this.inverseMass = 1 / mass;
  }

  /**
   * Moves the point according to the external forces.
   * Recall the formula is: p^{k+1} = 2 * p^k - p^{k-1} + fext * dt^2 / m
   * where:
   *       p^k     = this.current (vec3)
   *       p^{k-1} = this.previous (vec3)
   *       m       = this.mass (scalar), or you can use this.inverseMass for 1/m
   */
  move() {
    // PART 1A
    const dt = 5e-3; // time step
    let fext = vec3.fromValues(0.0, -9.81 * this.mass, 0); // external force (gravity)
  }

  /** 
   * Draws the point using the 2d context.
   * @param {CanvasRenderingContext2D} context, 2d rendering context.
   * @param {mat4} transformation from world space to screen space.
   */
  draw(context, transformation) {
    const radius = 5;
    const twopi = Math.PI * 2.0;
    context.beginPath();
    const q = transformToScreen(this.current, transformation);
    context.arc(q[0], q[1], radius, 0, twopi, false);
    context.fill();
  }
}

class Constraint {
  /** 
   * Saves the two Point objects defining this constraint.
   * The initial spring length (restLength) is calculated as ||p - q||.
   * @param {Point} p, first endpoint
   * @param {Point} q, second endpoint
   */
  constructor(p, q) {
    this.p = p;
    this.q = q;
    this.restLength = vec3.distance(p.current, q.current); // initial spring length
  }
  
  /**
   * Attempts to satisfy the constraints on this edge by restoring the spring to its restLength.
   * The edge endpoint coordinates (this.p.current and this.q.current) should be updated.
   * Notation in the notes:
   *    L0 = this.restLength (scalar)
   *    p = this.p.current (vec3)
   *    q = this.q.current (vec3)
   *    mp = this.p.mass (or 1/mp = this.p.inverseMass)
   *    mq = this.q.mass (or 1/mq = this.q.inverseMass)
   */
  satisfy() {
    // PART 1B
  }

  /** 
   * Draws the constraint as a line using the 2d context.
   * @param {CanvasRenderingContext2D} context, 2d rendering context.
   * @param {mat4} transformation from world space to screen space.
   */
  draw(context, transformation) {
    context.beginPath();
    let q = transformToScreen(this.p.current, transformation);
    context.lineTo(q[0], q[1]);
    q = transformToScreen(this.q.current, transformation);
    context.lineTo(q[0], q[1]);
    context.stroke();
  }
}

class ClothAnimation {
  /** 
   * Sets up the point and constraints in the cloth animation.
   * The points and constraints are defined by a grid with nx points
   * in the horizontal direction and ny points in the vertical direction.
   * @param {String} canvasId, id of the HTML canvas
   * @param {Number} nx, number of points in the horizontal direction
   * @param {Number} ny, number of points in the vertical direction
   */
  constructor(canvasId, nx, ny) {
    // save the canvas and incoming parameters
    this.canvas = document.getElementById(canvasId);
    this.nx = nx;
    this.ny = ny;

    // initialize the array of point and constraints
    this.points = [];
    this.constraints = [];
    let dx = 1.0 / (this.nx - 1.0);
    let dy = 1.0 / (this.ny - 1.0);
    for (let j = 0; j < this.ny; j++)
      for (let i = 0; i < this.nx; i++) {
        let x = vec3.fromValues(i * dx, j * dy, 0);
        let point = new Point(x, 0.05); // mass = 0.05
        this.points.push(point);
      }

    // vertical constraints
    for (let j = 0; j < this.ny - 1; j++)
      for (let i = 0; i < this.nx; i++) {
        const p = j * this.nx + i;
        const q = (j + 1) * this.nx + i;
        this.constraints.push(new Constraint(this.points[p], this.points[q]));
      }

    // horizontal constraints
    for (let j = 0; j < this.ny; j++)
      for (let i = 0; i < this.nx - 1; i++) {
        const p = j * this.nx + i;
        const q = j * this.nx + i + 1;
        this.constraints.push(new Constraint(this.points[p], this.points[q]));
      }

    // any points listed here will be fixed
    const fixed = [
      this.nx * (this.ny - 1),
      this.nx * (this.ny - 1) + Math.round((this.nx - 1) / 2),
      this.nx * this.ny - 1,
    ];

    // set fixed points to have an infinite mass (inverse mass of zero)
    for (let i = 0; i < fixed.length; i++) {
      this.points[fixed[i]].mass = 1e20;
      this.points[fixed[i]].inverseMass = 0.0;
    }

    this.initialize(); // initialize the rendering context and view
  }

  /** 
   * Initialize the rendering context and set up the view.
   * The rendering context will be this.context after this function is called,
   * which will either be the "2d" or "webgl" context.
   * The transformation matrices from world space to screen space are also set up,
   * (this.viewMatrix, this.projectionMatrix, this.screenMatrix).
   * The total transformation from world space to screen space is this.transformation.
   *
   * If the WebGL context is to be used, the initGL() function is called which should
   * set up the buffers holding the static data throughout the animation.
   */
  initialize() {
    // initialize the context
    const button = document.getElementById("button-context");
    this.context = this.canvas.getContext(button.innerHTML);
    this.useWebGL = button.innerHTML == "webgl";

    // view matrix
    this.eye = vec3.fromValues(0.5, 0.5, 2);
    this.center = vec3.fromValues(0.5, 0.5, 0);
    const up = vec3.fromValues(0, 1, 0);
    this.viewMatrix = mat4.lookAt(mat4.create(), this.eye, this.center, up);

    // projection matrix
    const aspectRatio = this.canvas.width / this.canvas.height;
    const fov = Math.PI / 4.0;
    this.projectionMatrix = mat4.create();
    mat4.perspective(this.projectionMatrix, fov, aspectRatio, 1e-3, 1000);

    // screen (viewport) matrix
    const w = this.canvas.width;
    const h = this.canvas.height;
    // prettier-ignore
    this.screenMatrix = mat4.fromValues(
      w / 2, 0, 0, 0,
      0, -h / 2, 0, 0,
      0, 0, 1, 0,
      w / 2, h / 2, 0, 1
    );
    this.transformation = mat4.multiply(
      mat4.create(),
      this.screenMatrix,
      mat4.multiply(mat4.create(), this.projectionMatrix, this.viewMatrix)
    );

    if (this.useWebGL) this.initGL();
  }

  /** 
   * Updates and draws a frame in the cloth animation.
   */
  update() {
    let numIter = 2;
    for (let iter = 0; iter < numIter; iter++) {
      // move each point according to external forces
      for (let i = 0; i < this.points.length; i++) this.points[i].move();

      // move points to satisfy the constraints (spring forces) on the edges
      for (let i = 0; i < this.constraints.length; i++)
        this.constraints[i].satisfy();
    }
    // draw the cloth
    this.draw();
  }

  /** 
   * Draw the cloth, using either the 2d context (points + lines) or the WebGL context (triangles).
   */
  draw() {
    if (this.useWebGL) {
      // draw with webgl (part 2)
      this.drawGL();
      return;
    }

    // draw to the HTML canvas
    this.context.fillStyle = "rgba(0, 0, 255, 0.4)";
    this.context.clearRect(0, 0, this.canvas.width, this.canvas.height);
    this.context.rect(0, 0, this.canvas.width, this.canvas.height);
    this.context.fill();

    // draw the constraints
    this.context.fillStyle = "black";
    for (let i = 0; i < this.constraints.length; i++)
      this.constraints[i].draw(this.context, this.transformation);

    // draw the points
    for (let i = 0; i < this.points.length; i++)
      this.points[i].draw(this.context, this.transformation);
  }

  /** 
   * Initialize the WebGL buffers for the static data during the animation,
   * as well as the shader program and textures.
  */
  initGL() {
    let gl = this.context;

    const vertexShaderSource = `
      precision mediump float;
  
      void main() {
      }`;

    const fragmentShaderSource = `
      precision mediump float;
      void main() {
      }`;

    // create the shader program
    this.program = compileProgram(gl, vertexShaderSource, fragmentShaderSource);

    // the triangles array remains constant during the animation
    this.triangles = [];
    for (let j = 0; j < this.ny - 1; j++) {
      for (let i = 0; i < this.nx - 1; i++) {
        const k = j * this.nx + i;
        // PART 2A
      }
    }
  }

  /** 
   * Draws the cloth using the WebGL rendering context.
   */
  drawGL() {
    let gl = this.context;
    gl.clearColor(0, 0, 1, 0.4);
    gl.clear(gl.DEPTH_BUFFER_BIT | gl.COLOR_BUFFER_BIT);
    gl.enable(gl.DEPTH_TEST);
    gl.viewport(0, 0, this.canvas.width, this.canvas.height);

    // extract cloth particle positions to write to the GPU
    let position = new Float32Array(3 * this.points.length);
    for (let i = 0; i < this.points.length; i++) {
      // PART 2B
    }

    // calculate normals at each point
    let normals = computeNormals(position, this.triangles);

  }
}

© Philip Claude Caplan, 2023 (Last updated: 2023-12-14)