Techniques for cloth simulation have advanced greatly over the past few years. Faster simulation algorithms [1], resolution of friction and collision detection [4], and methods of dealing with wrinkles and buckling [5] have advanced the state of the art.
To date, however, there have been few efforts to measure the movement of real cloth. Such measurements could be used to infer parameters for a specific material type, or to verify the correctness of cloth simulators. Many authors fall back upon the old computer graphics adage, ``if it looks right, it is right'' [7]. Breen et al. [3] and Eberhardt et al. [6] have incorporated data from the Kawabata measurement system into their cloth simulation systems, but there is as yet no way to use Kawabata data with more modern simulators. Furthermore, the Kawabata device is expensive and special-purpose. It would be more convenient to determine parameters for simulation systems using conventional equipment such as video capture devices, and video also provides richer information about cloth movement.
In this paper, I present a technique for recovering information about the position and deformation of a rectangular sheet of cloth bearing a printed grid pattern. The cloth is first acquired using a Digiclops camera and stereo correspondence, producing a colour image and a depth map. Subsequently, the deformation of the cloth is measured by detecting features in the grid pattern.
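The paper does not specify the feature detector here, but the idea of locating grid features can be illustrated in a toy setting. The sketch below is a deliberately simplified, hypothetical example: it assumes a flat, undeformed cloth imaged head-on, so that grid lines appear as uniformly dark rows and columns whose crossings can be found by thresholding mean intensities. (Real deformed cloth bends the grid lines, so an actual system would need a local corner or crossing detector instead.)

```python
# Hypothetical sketch only: locate grid-line intersections in a grayscale
# image of a flat cloth with a printed grid. Names and the threshold value
# are illustrative assumptions, not the paper's actual detector.

def detect_grid_intersections(image, threshold=128):
    """Return (row, col) positions where dark grid lines cross.

    `image` is a 2D list of grayscale values (0 = black, 255 = white).
    A row or column whose mean intensity falls below `threshold` is
    taken to be a grid line; intersections are their Cartesian product.
    """
    h, w = len(image), len(image[0])
    row_means = [sum(row) / w for row in image]
    col_means = [sum(image[r][c] for r in range(h)) / h for c in range(w)]
    dark_rows = [r for r, m in enumerate(row_means) if m < threshold]
    dark_cols = [c for c, m in enumerate(col_means) if m < threshold]
    return [(r, c) for r in dark_rows for c in dark_cols]

# Tiny synthetic 9x9 image with grid lines at rows/columns 2 and 6.
img = [[255] * 9 for _ in range(9)]
for i in (2, 6):
    for j in range(9):
        img[i][j] = 0  # horizontal grid line
        img[j][i] = 0  # vertical grid line

print(detect_grid_intersections(img))  # -> [(2, 2), (2, 6), (6, 2), (6, 6)]
```

Each detected intersection could then be looked up in the depth map to give a 3D sample of the cloth surface at a known material point.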
The work most similar to mine is that of Jojic et al. [8], who attempted to estimate cloth draping parameters from range data. Their technique used only range data, without associated colour information. They treated the range data as a solid surface and used a cloth simulator to drape an imaginary cloth over that surface, from which they inferred cloth parameters. Many details of this technique remain unclear.
Louchet et al. [9] used synthetic data from a simulation to recover dynamic parameters for a mass-spring cloth simulator. The input to their system is not specified explicitly, but it was not images; a 3D mesh of cloth points was likely used instead. Their work therefore does not overlap with my own.