Capturing and modeling of deformable objects
Popa, Tiberiu
Abstract
Modeling the behavior of deformable virtual objects has important applications in computer graphics. There are two prevalent approaches to modeling deformable objects: an active one, which deforms existing virtual models, and a passive one, which captures the geometry and motion of real objects. This thesis explores the problem of modeling and acquiring objects undergoing deformation, and proposes a set of practical deformation and capture tools.

The first contribution is a new approach to model deformation that incorporates non-uniform materials into the geometric deformation framework. The technique provides a simple and intuitive way to control the deformation through material properties, which can either be specified by the user with an intuitive interface or be learned from a sequence of sample deformations, yielding realistic-looking results.

Some deformable objects, such as garments, exhibit complex behavior in motion and are therefore difficult to model or simulate, making them suitable targets for capture methods. Methods for capturing garments usually rely on special markers printed on the fabric to establish temporally coherent correspondences between frames. Unfortunately, this approach is tedious and prevents the capture of interesting, off-the-shelf fabrics. Chapter 3 presents a marker-free approach to capturing garment motion that avoids these problems. The method establishes temporally coherent parameterizations between the incomplete geometries extracted at each time step by a multiview stereo algorithm, and the missing geometry is filled in using a template.

Garment motion is characterized by dynamic high-frequency folds. These folds, however, tend to be shallow, making them difficult to capture. Chapter 4 presents a new method for reintroducing folds into the sequence using data-driven dynamic wrinkling: the method first estimates the folds from the video footage and then wrinkles the surface using a space-time deformation. The validity of the method is demonstrated on garments captured with several recent techniques.

While this markerless reconstruction method is tailored specifically to garments, the thesis also proposes a more general method for reconstructing a consistent frame sequence from a sequence of point clouds captured with multiple video streams. The method uses optical flow to guide a cross-parameterization technique based on local parameterizations, and it accumulates geometric information from all frames through a novel correction and completion mechanism.
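The material-aware deformation described in the first contribution can be illustrated with a short sketch. The following is a minimal, hypothetical example, not the thesis implementation: a linear least-squares Laplacian edit in which per-vertex stiffness weights scale the smoothness term, so stiff regions preserve their shape while soft regions absorb the user's edit. The uniform graph Laplacian, the soft positional constraints, the toy mesh, and all names are assumptions made for illustration.

    # Minimal sketch of material-weighted Laplacian deformation (illustrative only).
    import numpy as np
    import scipy.sparse as sp
    import scipy.sparse.linalg as spla

    def material_weighted_deform(verts, edges, stiffness, handles, handle_weight=10.0):
        """Least-squares Laplacian editing with per-vertex material stiffness.

        Minimizes ||W L v' - W L v||^2 plus soft handle constraints, so vertices
        with high stiffness preserve their differential coordinates (their local
        shape), while soft vertices take up most of the deformation.
        """
        n = len(verts)
        # Uniform graph Laplacian; a production system would use cotangent weights.
        L = sp.lil_matrix((n, n))
        for i, j in edges:
            L[i, i] += 1.0
            L[j, j] += 1.0
            L[i, j] -= 1.0
            L[j, i] -= 1.0
        W = sp.diags(np.asarray(stiffness, dtype=float))  # material weights
        A = (W @ L).tocsr()

        blocks = [A]
        rhs = [A @ np.asarray(verts, dtype=float)]  # weighted rest-pose Laplacians
        for idx, target in handles.items():         # soft positional constraints
            row = sp.lil_matrix((1, n))
            row[0, idx] = handle_weight
            blocks.append(row.tocsr())
            rhs.append(handle_weight * np.asarray(target, dtype=float)[None, :])

        A_full = sp.vstack(blocks).tocsr()
        b_full = np.vstack(rhs)
        # Solve one sparse least-squares problem per coordinate axis.
        return np.column_stack(
            [spla.lsqr(A_full, b_full[:, k])[0] for k in range(3)]
        )

    # Toy example: a 4-vertex strip whose left half is ten times stiffer.
    verts = np.array([[0.0, 0, 0], [1, 0, 0], [2, 0, 0], [3, 0, 0]])
    edges = [(0, 1), (1, 2), (2, 3)]
    stiffness = [10.0, 10.0, 1.0, 1.0]
    deformed = material_weighted_deform(
        verts, edges, stiffness, handles={0: [0, 0, 0], 3: [3, 1, 0]}
    )
    # The residual, and hence the bend, concentrates in the soft half of the strip.

In the thesis itself, the material properties can additionally be learned from a sequence of sample deformations rather than specified by hand.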
Item Metadata
Title | Capturing and modeling of deformable objects
Creator | Popa, Tiberiu
Publisher | University of British Columbia
Date Issued | 2009
Extent | 65098808 bytes
Genre |
Type |
File Format | application/pdf
Language | eng
Date Available | 2009-11-02
Provider | Vancouver : University of British Columbia Library
Rights | Attribution-NonCommercial-NoDerivatives 4.0 International
DOI | 10.14288/1.0051172
URI |
Degree |
Program |
Affiliation |
Degree Grantor | University of British Columbia
Graduation Date | 2010-05
Campus |
Scholarly Level | Graduate
Rights URI | http://creativecommons.org/licenses/by-nc-nd/4.0/
Aggregated Source Repository | DSpace