Open Collections
UBC Theses and Dissertations
Calibrating head-coupled virtual reality systems
Stevenson, Alexander
Abstract
Head-tracking virtual environments are difficult to implement because of the need to calibrate such systems accurately, as well as the difficulty of computing the correct off-axis image for a given eye location. The situation is further complicated by the use of multiple screens, the need to change the calibration for different users, and the desire to write portable software which can be reused on different hardware with varying screen configurations.

This thesis presents a solution to these problems, allowing greatly simplified development of head-tracking software. By making use of the head-tracking sensors built into the environment, we can quickly and accurately calibrate not only user-specific measurements, such as eye positions, but also system measurements, such as the size and locations of display screens. A method for performing this calibration is developed, along with a software library which reads a system configuration and integrates with OpenGL to compute correct off-axis projections for a user's viewing position. The calibration makes use of a novel "sighting" technique which has the great advantage of accurately finding the true rotational centre of a user's eyes. To complement this, the software library includes functions which predict the optical centre of a user's eye based on a given gaze point.

As a demonstration of both the calibration method and the utility library, a hardware rendering application is discussed. This application performs real-time rendering of view-dependent LaFortune reflectance functions in graphics hardware. As with all view-dependent lighting methods, both the viewing angle and the position of the light are taken into account while rendering. Head-coupling allows the system to use the user's true viewing direction in the lighting computation, and the position of the virtual light is controlled by a 3D sensor in the user's hand. The method by which the view-dependent lighting model is implemented in hardware is explained, along with possible improvements.

Throughout, the Polhemus FASTRAK is used as the tracking system, though all the results are easily applicable to any six degree-of-freedom tracking system.
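The central rendering task the abstract describes, computing a correct off-axis projection for a tracked eye, has a standard construction: treat each calibrated screen as a rectangle in tracker space and build an asymmetric frustum from the eye point to its corners. The sketch below uses fixed-function OpenGL, period-appropriate for this thesis; the Vec3 helpers and the function name are illustrative, not the thesis library's actual API.

```cpp
#include <GL/gl.h>
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3  sub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b)   { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3  cross(Vec3 a, Vec3 b) { return {a.y*b.z - a.z*b.y,
                                             a.z*b.x - a.x*b.z,
                                             a.x*b.y - a.y*b.x}; }
static Vec3  normalize(Vec3 v)     { float s = 1.0f / std::sqrt(dot(v, v));
                                     return {v.x*s, v.y*s, v.z*s}; }

// Off-axis projection for a screen given by its lower-left (pa),
// lower-right (pb), and upper-left (pc) corners, with the eye at pe.
// All four points are in the same (tracker) coordinate frame.
void offAxisProjection(Vec3 pa, Vec3 pb, Vec3 pc, Vec3 pe,
                       float nearD, float farD)
{
    Vec3 vr = normalize(sub(pb, pa));    // screen right axis
    Vec3 vu = normalize(sub(pc, pa));    // screen up axis
    Vec3 vn = normalize(cross(vr, vu));  // screen normal, towards the eye

    Vec3 va = sub(pa, pe), vb = sub(pb, pe), vc = sub(pc, pe);
    float d = -dot(va, vn);              // eye-to-screen distance

    // Frustum extents projected onto the near plane.
    float l = dot(vr, va) * nearD / d;
    float r = dot(vr, vb) * nearD / d;
    float b = dot(vu, va) * nearD / d;
    float t = dot(vu, vc) * nearD / d;

    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glFrustum(l, r, b, t, nearD, farD);

    // Rotate into the screen's basis, then put the frustum apex at the eye.
    float M[16] = { vr.x, vu.x, vn.x, 0,
                    vr.y, vu.y, vn.y, 0,
                    vr.z, vu.z, vn.z, 0,
                    0,    0,    0,    1 };
    glMultMatrixf(M);
    glTranslatef(-pe.x, -pe.y, -pe.z);
    glMatrixMode(GL_MODELVIEW);
}
```

Because the corner points come straight from the calibration, the same routine serves any number of screens: each display window loads its own corner triple and the tracked eye position before drawing.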
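The "sighting" technique can recover the eye's true rotational centre because every sight line the user aligns passes through that centre, whatever the head pose; intersecting the recorded lines in a least-squares sense recovers it. The solver below is an assumed implementation, not the thesis's code: each sighting contributes a point on the line and a unit direction, both expressed in the head sensor's frame, and the 3×3 normal equations are solved by Cramer's rule (reusing the Vec3 helpers above).

```cpp
// Least-squares intersection of n sight lines. p[i] is a point on
// line i (e.g. the hand-held sensor's tip), d[i] its unit direction;
// both are in head-sensor coordinates, so the recovered point is the
// eye's rotational centre in that frame. Requires at least two
// non-parallel lines.
// Solves  sum_i (I - d_i d_i^T) x = sum_i (I - d_i d_i^T) p_i.
Vec3 intersectSightLines(const Vec3* p, const Vec3* d, int n)
{
    double A[3][3] = {{0}}, b[3] = {0};
    for (int i = 0; i < n; ++i) {
        double dv[3] = { d[i].x, d[i].y, d[i].z };
        double pv[3] = { p[i].x, p[i].y, p[i].z };
        for (int r = 0; r < 3; ++r)
            for (int c = 0; c < 3; ++c) {
                double m = (r == c ? 1.0 : 0.0) - dv[r]*dv[c];
                A[r][c] += m;          // accumulate I - d d^T
                b[r]    += m * pv[c];  // accumulate (I - d d^T) p
            }
    }
    double det = A[0][0]*(A[1][1]*A[2][2] - A[1][2]*A[2][1])
               - A[0][1]*(A[1][0]*A[2][2] - A[1][2]*A[2][0])
               + A[0][2]*(A[1][0]*A[2][1] - A[1][1]*A[2][0]);
    double x[3];
    for (int k = 0; k < 3; ++k) {      // Cramer's rule: column k <- b
        double C[3][3];
        for (int r = 0; r < 3; ++r)
            for (int c = 0; c < 3; ++c)
                C[r][c] = (c == k) ? b[r] : A[r][c];
        x[k] = ( C[0][0]*(C[1][1]*C[2][2] - C[1][2]*C[2][1])
               - C[0][1]*(C[1][0]*C[2][2] - C[1][2]*C[2][0])
               + C[0][2]*(C[1][0]*C[2][1] - C[1][1]*C[2][0]) ) / det;
    }
    return { (float)x[0], (float)x[1], (float)x[2] };
}
```

Two non-parallel sightings already determine a point; taking more and checking the residual guards against noisy alignments. The same line-intersection idea extends naturally to locating screen corners, which can likewise be sighted from several positions.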
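Predicting the optical centre from a gaze point is then simple geometry: one plausible model, sketched below, places the optical centre a small fixed anatomical distance in front of the rotational centre, along the line towards whatever the user is fixating. This reuses the helpers above; the 6 mm offset is an illustrative anatomical value, not a constant taken from the thesis.

```cpp
// Hypothetical gaze-dependent eye model: offset the calibrated
// rotational centre towards the gaze point by a fixed anatomical
// distance. The 0.006 m default is illustrative only.
Vec3 opticalCentre(Vec3 rotationalCentre, Vec3 gazePoint,
                   float offset = 0.006f)
{
    Vec3 g = normalize(sub(gazePoint, rotationalCentre));
    return { rotationalCentre.x + g.x * offset,
             rotationalCentre.y + g.y * offset,
             rotationalCentre.z + g.z * offset };
}
```

In a head-coupled display the gaze point is typically taken to be the content of interest on screen, so the off-axis projection can be driven from the predicted optical centre rather than the rotational one.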
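The LaFortune reflectance model used in the demonstration is a sum of generalized cosine lobes in the light and view directions, plus a diffuse term. A single-lobe, single-channel evaluation is sketched below as a scalar reference; the thesis evaluates the same expression per pixel in graphics hardware, with the head-coupled viewing direction supplying v and the hand-held 3D sensor supplying the light position behind u.

```cpp
// One LaFortune lobe for one colour channel. u and v are unit light
// and view directions in the local surface frame (z = surface normal);
// rho_d is the diffuse albedo and Cx, Cy, Cz, n are fitted lobe
// parameters.
float lafortune(Vec3 u, Vec3 v,
                float rho_d, float Cx, float Cy, float Cz, float n)
{
    float s = Cx*u.x*v.x + Cy*u.y*v.y + Cz*u.z*v.z;
    float lobe = (s > 0.0f) ? std::pow(s, n) : 0.0f;
    return rho_d / 3.14159265f + lobe;
}
```

Fitted materials generally use a few such lobes per channel, summed; on 2002-era hardware the dot product and exponentiation map onto texture and combiner stages, which is plausibly the kind of mapping the thesis describes.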
Item Metadata
Title |
Calibrating head-coupled virtual reality systems
|
Creator |
Stevenson, Alexander
|
Publisher |
University of British Columbia
|
Date Issued |
2002
|
Extent |
6670680 bytes
|
Genre | |
Type | |
File Format |
application/pdf
|
Language |
eng
|
Date Available |
2009-08-14
|
Provider |
Vancouver : University of British Columbia Library
|
Rights |
For non-commercial purposes only, such as research, private study and education. Additional conditions apply, see Terms of Use https://open.library.ubc.ca/terms_of_use.
|
DOI |
10.14288/1.0051444
|
URI | |
Degree | |
Program | |
Affiliation | |
Degree Grantor |
University of British Columbia
|
Graduation Date |
2002-05
|
Campus | |
Scholarly Level |
Graduate
|
Aggregated Source Repository |
DSpace
|