ICICS
CONNECTING KNOWLEDGE
Innovations magazine
Fall/Winter 2011

IN THE SPIRIT OF MARCONI
EDGING UP TO ANIMATION'S UNCANNY VALLEY
INTERACTING WITH MUSIC
THE LEADING EDGE OF THE AEROSPACE INDUSTRY
SHADOW REACHING
BUILDING A BETTER 3D DISPLAY
and more...

a place of mind
THE UNIVERSITY OF BRITISH COLUMBIA

IN THE SPIRIT OF MARCONI
Using cellphones as mobile relay terminals would improve reception and help the environment.
WHO TO HELP FIRST IN A DISASTER?
Life-or-death decisions are made easier by a simulator that's
the envy of emergency operations centres everywhere.
innovations
fall/winter 2011
Production
Editor: Sharon Cavalier, ICICS Administrator
Writer: Craig Wilson, ICICS Communication Writer
Design: Industry Design, www.industrydesign.ca
Office: ICICS, University of British Columbia, 289-2366 Main Mall, Vancouver, BC, Canada V6T 1Z4
Tel: 604-822-6894
Fax: 604-822-9013
Email: info@icics.ubc.ca
SHADOW REACHING
Wall-sized display screens are great
for collaboration and teaching, but
only with the right "chalk".

director's desk

ICICS members are doing important and interesting work, but not
enough people beyond the academic community and related industry
partners know about it. With this issue of Innovations magazine (formerly
the newsletter Focus), we are reaching out to the wider community with a
fresh new look and a more accessible style.
We know from events we have held recently that people beyond UBC,
whether the general public, industry personnel, or other academics, wish to
connect with us, but often don't know where to begin. Redesigning our main
publication as a magazine that appeals to the general reader and broadening
its distribution base will help bridge this gap. ICICS is a forum, and we
encourage interested readers at all levels to engage with us.
In this issue, we highlight advances ICICS researchers are making
in aircraft manufacturing, electroacoustic music, robotic surgery, 3D
video, film animation, collaboration technology, and more. With over 150
researchers from departments across the campus collaborating on projects,
we have an embarrassment of riches when it comes to deciding which ones
to profile. We hope you will enjoy this slice.
Panos Nasiopoulos
ICICS Director
NOVEL GUIDANCE SYSTEM
FOR ROBOTIC SURGERY
Higher precision robotic surgery will mean
better outcomes for cancer patients.
EDGING UP TO ANIMATION'S UNCANNY VALLEY
Camera-based technique can produce a range of facial
expressions not yet seen in animated films.
CONVERTING 2D VIDEO TO
3D /BUILDING A BETTER
3D DISPLAY
Novel glasses-free display
technology and 2D to 3D
conversion process could change
the 3D experience.
In the Spirit of Marconi
IN DECEMBER 1901, GUGLIELMO MARCONI TRANSMITTED THE FIRST TRANSATLANTIC RADIO
SIGNAL, THE LETTER "S" IN MORSE CODE, FROM CORNWALL, ENGLAND TO SIGNAL HILL IN
ST. JOHN'S, NEWFOUNDLAND. SEVENTY-THREE YEARS LATER, HIS DAUGHTER FOUNDED THE
MARCONI SOCIETY TO PROMOTE AWARENESS OF MAJOR INNOVATIONS IN COMMUNICATIONS.
Recently, the society's antennae
picked up on the work of
Diomidis Michalopoulos, a
Killam Postdoctoral Fellow
at UBC supervised by Electrical and
Computer Engineering professor Robert
Schober. Michalopoulos was given
the society's Young Scholar Award in
2010 for his innovations in cooperative
wireless communications. Recipients are
considered to have already had an impact
in their field, and must be no older than 27,
Marconi's age when he made his landmark
transmission. They are also seen as
potential future candidates for the Marconi
Award, the equivalent of the Nobel Prize in
communications science. Only two other
young researchers worldwide were given
the award in 2010.
USING CELLPHONES
AS RELAY TERMINALS
In cooperative communications, relay
terminals are used to forward information
from source to destination terminals. In a
cellular network, these might be simpler
and consume less energy than large terminal
hubs. Michalopoulos' innovative research
looks at using mobile relay terminals in
the network, such as cellphones. He won
the Young Scholar Award for protocols he
developed for selecting relays, based on
average channel conditions and specified
energy consumption. A network using
these protocols would work well in areas of
low signal strength, and be able to re-route
around obstacles.
Fairness guides the selection
of individual phones as relay
terminals in Michalopoulos'
protocols; all phones involved
ultimately consume equal amounts
of power. By sacrificing a little, each
user gains a lot. Indirectly, so does the
environment: the network would operate
at reduced transmission power, without
the need for fixed relay stations.
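For readers who like to see the idea in code, here is a toy sketch of relay selection with a fairness twist. It only illustrates the principle described above, not Michalopoulos' actual protocols, and the channel gains and energy figures are invented.

# Illustrative sketch only: a toy relay-selection rule in the spirit of the
# fairness idea described above, not Michalopoulos' published protocols.
# Channel gains and energy figures are hypothetical placeholders.

def select_relay(candidates):
    """Pick the relay phone with the best end-to-end channel quality,
    discounted by how much energy it has already spent relaying."""
    def score(phone):
        # End-to-end quality is limited by the weaker hop.
        link_quality = min(phone["gain_to_source"], phone["gain_to_destination"])
        # Penalize phones that have already relayed a lot, so that
        # consumption evens out across users over time.
        return link_quality / (1.0 + phone["energy_spent"])
    return max(candidates, key=score)

phones = [
    {"id": "A", "gain_to_source": 0.9, "gain_to_destination": 0.4, "energy_spent": 2.0},
    {"id": "B", "gain_to_source": 0.6, "gain_to_destination": 0.7, "energy_spent": 0.5},
    {"id": "C", "gain_to_source": 0.8, "gain_to_destination": 0.2, "energy_spent": 0.1},
]
print(select_relay(phones)["id"])  # -> "B" for these made-up numbers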
Michalopoulos came to UBC in 2009
from Aristotle University of Thessaloniki,
Greece. Robert Schober is happy he did.
"I feel very privileged," he says, "to have
Diomidis in my group. He is a truly
original thinker, and his work could
move the industry in a new direction."
Earlier this year, Michalopoulos was
honoured once more when the Canadian
government awarded him a Banting
Postdoctoral Fellowship. Named after
the Canadian co-discoverer of insulin,
this is a highly competitive international
competition, with only 70 fellowships
awarded annually, worth $70,000 per year
for two years.
When you take a call in
your basement or on a
mountaintop in the next
few years, you may have
Diomidis Michalopoulos to thank for it.
For more information,
contact Diomidis Michalopoulos
at dio@ece.ubc.ca
IN LAPAROSCOPIC SURGERY, SURGICAL TOOLS
AND A CAMERA ARE INSERTED ON LONG ARMS
THROUGH SMALL INCISIONS. The surgeon performs the operation looking at a monitor, with
the camera controlled by an assistant. In robotic
surgery, also performed through small incisions,
the surgeon looks into a console and controls
three or four arms with "wrists" on the end that
have surgical tools and a 3D camera attached.
The wrists provide many more degrees of freedom than are available in laparoscopic surgery,
and the system is much more intuitive and precise.
ICICS researchers are dramatically improving the
guidance system of the state-of-the-art da Vinci
surgical robot. Led by electrical engineering professor Tim Salcudean, they are fusing preoperative ultrasound and MRI images with ultrasound
and X-ray images taken during surgery, for realtime tool guidance that will minimize tissue and
nerve damage. The system will also correct for
tissue and target movement and deformation
during the operation.
The team is focusing on prostate- and kidney-
cancer treatment, where minimally invasive
surgery is crucial. The techniques they are developing could be applied to a number of procedures in the future, with a profound impact on
healthcare: better surgical outcomes, shorter
hospital stays, faster recovery times. The supplier, Intuitive Surgical, clearly thinks so; they have
donated a second robot to the project that complements one purchased through an ICICS-led
infrastructure grant. With da Vinci robots now at
both UBC and Vancouver General Hospital, UBC
is one of only three centres in the world to have
two surgical robots dedicated for research and
teaching.
For more information, contact Tim Salcudean
at tims@ece.ubc.ca
An array of video cameras and strobe lights captures a wide range of facial expressions.
Edging Up to
Animation's
Uncanny Valley
WHEN CHILDREN WERE SHOWN EARLY VERSIONS
OF THE ANIMATED MOVIE SHREK, THEY LOVED THE
GREEN OGRE WITH THE HEART OF GOLD THE FILM
WAS NAMED AFTER.
But they started crying when his human love interest Princess Fiona showed up; the animators made
her less lifelike, and the kids loved her too. Fiona had
fallen into the "Uncanny Valley," a term from robotics
describing a narrow region where the robot is lifelike enough to
resemble a human, but with something wrong. The effect on the
viewer is revulsion, perhaps because it triggers an evolutionary
response related to mate selection or avoidance of disease. No
robot designer or animator has yet been able
to cross the uncanny valley with a creation
that's 100 percent lifelike.
The Dolby Research Chair in Computer
Science, Wolfgang Heidrich, may be getting
close. Heidrich developed the image-processing algorithms underlying high-dynamic
range (HDR) display technology invented
at UBC by physicist Lorne Whitehead. The
resulting display has a contrast ratio much
closer to what we perceive in the real world
than that of conventional displays. BrightSide Technologies Inc., the spinoff company
formed to commercialize HDR technology,
was acquired by Dolby Laboratories in 2007.
Capturing Detail and Emotion
Heidrich's adherence to lifelike rendering of
reality has guided his recent work on facial
capture for film animation. Along with PhD
student Derek Bradley, Heidrich has developed a novel technique for animating faces,
based on capturing real faces with high-definition video cameras. The subject (or actor)
sits facing a semi-circular array of paired
video cameras, zoomed in to the pore level.
Each stereo pair captures a detailed patch
of the subject's face. Pores, hair follicles, and
blemishes are used as reference points to create stereo depth images from the two cameras'
adjacent frames, which are then rendered as a
3D mesh of the actor's face.
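To make the stereo step concrete, here is a toy sketch of the textbook rectified-stereo relation that turns a matched feature, such as a pore seen by both cameras of a pair, into a depth value. The focal length, baseline, and pixel positions are invented, and this illustrates the principle rather than Bradley and Heidrich's actual pipeline.

# Illustrative sketch: depth from a single matched feature in a rectified
# stereo pair (e.g., the same pore seen by both cameras of one pair).
# All numbers are hypothetical; this is not the actual capture pipeline.

def depth_from_disparity(x_left_px, x_right_px, focal_length_px, baseline_m):
    """Classic rectified-stereo relation: Z = f * B / disparity."""
    disparity = x_left_px - x_right_px  # how far the feature shifts between views
    if disparity <= 0:
        raise ValueError("feature must appear further left in the left image")
    return focal_length_px * baseline_m / disparity

# A pore detected at x=1400 px in the left frame and x=920 px in the right frame,
# with an assumed 4000 px focal length and a 6 cm camera baseline:
z = depth_from_disparity(1400, 920, focal_length_px=4000, baseline_m=0.06)
print(f"estimated depth: {z:.3f} m")  # -> 0.500 m for these made-up values

Repeating this for every matched feature yields a depth image for one patch of the face; stitching the patches from all the camera pairs produces the full facial mesh.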
In traditional capture techniques, markers are worn by the actor as reference points, which is not only awkward but also produces animation based on geometry rather than detail. Heidrich's system is markerless, uses relatively inexpensive commercial video cameras,
and produces highly detailed animation that
can later be altered—virtual makeup can be
applied, for example.
"Current capture systems," Heidrich points
out, "are limited when it comes to facial animations. That's
why they mostly get used for creating aliens." His system, on
the other hand, with its level of detail, can capture a range of
expressions. A library of different expressions and positions
could be created for a given actor, and used in different scenes.
Merging Capture and Animation
Computer simulation is the other main approach currently
used in animation, for scenes that capture is unsuitable for, such
as a tsunami. Heidrich's colleague Robert Bridson is a pioneer
in physics-based animation, where simulations are guided by
the laws of physics. The water scenes in Avatar, for example,
were done using fluid simulation software developed by Bridson's startup company, Exotic Matter AB. Heidrich's long-term
goal is to merge his capture techniques with Bridson's physics-
based simulations, initially for fluids. "It would be the first systematic attempt," he says, "to tackle tight integration of visual
measurement and physical simulation." Both Bridson's work on
fluids and Heidrich's facial capture research are supported by
the GRAND Network of Centres of Excellence led by computer
scientist and ICICS member Kellogg Booth (see Focus, Spring
2010).
Rendering of the eyes remains an obstacle in facial capture,
because of their shine. Once that crucial hurdle is overcome,
Heidrich's system may become a key component in spanning
the uncanny valley.
For more information,
contact Wolfgang Heidrich at heidrich@cs.ubc.ca
Master of Software Systems (cs.ubc.ca/mss)
PURSUE A CAREER IN THE SOFTWARE INDUSTRY!
REALISTIC SIMULATIONS
FOR FILM
Image courtesy of Exotic Matter AB
Photorealistic simulations of smoke, liquids, and clothing
that obey the laws of physics can be created using software
developed by computer scientist Robert Bridson. The software has been used in films such as the Harry Potter series,
10,000 BC, Hellboy II: The Golden Army, The Dark Knight,
and Inkheart. A scene from Avatar created using Bridson's
techniques, in which one of the Na'vi drinks rain water from
a leaf, won the "Best Single Visual Effect of the Year" award
in 2010 from the Visual Effects Society. Eventually, it may be
possible to merge the capture method described in the adjacent article with these techniques to produce highly convincing simulations.
Watch for  an  article  on  Bridson's
start-up company, Exotic Matter AB,
in an upcoming issue.
DURING DISASTERS, TOUGH
DECISIONS NEED TO BE MADE.
IF AN EARTHQUAKE KNOCKS OUT MOST OF A CITY'S WATER SUPPLY, HOW MUCH SHOULD BE DIVERTED
FROM HOUSEHOLDS TO HOSPITALS? WHAT ROADS SHOULD BE REPAIRED FIRST? WHO NEEDS
ELECTRICITY THE MOST?
A multidisciplinary team led by power-systems expert Jose
Marti has developed a simulator to help infrastructure managers
prepare for disasters, and make the right decisions in the midst
of crisis. Disasters such as earthquakes make the linkages
among critical infrastructures (power, water, transportation,
communications, hospitals, etc.) painfully clear, yet coordinating
responses is problematic. For business and security reasons,
managers are reluctant to share their infrastructure data. They
also naturally want their damaged systems to get the most
attention.
Marti and his colleagues have devised techniques to
protect the privacy of infrastructure data while information
is exchanged to enable high-level, real-time decision making.
Scenarios can be run in advance to prepare for disasters, with the
system learning from the results. Human factors such as where
a manager's kids go to school can be taken into consideration.
These unique features have caught the attention of researchers
and security agencies around the world. Marti is now the key
player in an international effort involving 7 countries and 40
collaborators. Closer to home, his system was chosen from over
30 others to assist in planning and real-time decision support
during the 2010 Winter Olympics in Vancouver.
An Emergency Operations Centre now under construction
at UBC is expected to become a global centre for infrastructure
interdependencies research.
For more information,
contact Jose Marti at jrms@ece.ubc.ca
Interacting with Music

In 2000, ICICS broadened membership eligibility
to all researchers across the UBC campus. Bob
Pritchard and Keith Hamel from the School of
Music heeded the call, and were instrumental in
designing the sound studio that became part of
a major addition to the ICICS building.
The result is a state-of-the-art facility
that is one of the few studios in the world
capable of supporting 64-channel sound.
The two composers have used the studio
extensively since, producing an impressive
body of interactive works and novel music technologies. Hamel's
automated score-following work takes him to Paris regularly,
where he works with the score-following team at IRCAM (Institut
de Recherche et Coordination Acoustique/Musique), one of the
world's leading electroacoustic music research centres. Pritchard's
interactive works have won awards and been performed around
the world.
Award-Winning Interactive Piece
Strength is a 12-minute work composed by Pritchard for alto sax,
with interactive video and audio clips processed in real-time by
the Max/MSP/Jitter software package. In the work, the camera
pans up two rotating male bodies while water runs down them.
"Strength confronts life,
death, and resurrection."
Roses lie against various parts of their bodies, the petals of which
fall to their feet at the end of the piece.
These images, and the sounds of machinery, are tied together
by the saxophone, in a commentary on durability, impermanence,
and transformation. "Like most of my work," Pritchard says,
"Strength confronts life, death, and resurrection." With the help
of cinematographer Cathryn Robertson,
Pritchard shot Strength on a shoestring
budget, using available objects—a
wading pool, garden hose, whiskey
barrel, hockey sticks—to construct the
set, then edited the footage using the ICICS Video Edit Suite.
His artistic vision prevailed, and Strength was awarded a Unique
Award of Merit by the Canadian Society of Cinematographers in
2007.
Putting the Performer in Charge
Until fairly recently, the musician in Pritchard's interactive pieces
synchronized their performance to the audio and video clips. In
his current work, technological advances such as score-following
allow the performer to control the flow of the piece, which is in
keeping with Pritchard's philosophy of composition.
"The music comes first," he stresses, "then you choose the tools
to get at whatever emotion is coming across." Motion sensors are
one such tool. In a piece for alto flute he is currently composing,
the musician wears an accelerometer that looks like a watch on her
wrist. Certain gestures she makes while playing cue audio samples.
Other sounds as well as video clips are triggered by Max/MSP/
Jitter messages embedded in the score, which the computer follows
based on pitch, using Hamel's
score-following software. "The
performer can be as expressive
as she wants to be," Pritchard
points out, "providing a much
better musical experience for
both the performer and the
audience." He is now looking at creating a dance piece where the
dancer wears accelerometers on her ankles and wrists to trigger
interactive sounds and video.
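As a rough illustration of how a wrist gesture can cue a sample, the sketch below watches a stream of accelerometer readings and fires whenever the motion spikes above a threshold. It is a stand-in for what would actually be patched in Max/MSP/Jitter, and all the numbers are invented.

# Toy sketch of gesture-triggered playback: watch an accelerometer stream and
# fire a sample when a sharp wrist flick exceeds a threshold. This stands in
# for a Max/MSP/Jitter patch; thresholds and readings are invented.

def detect_triggers(samples_g, threshold_g=2.5, refractory=3):
    """Return indices where acceleration magnitude spikes above the threshold,
    ignoring readings that follow too soon after the previous trigger."""
    triggers, cooldown = [], 0
    for i, magnitude in enumerate(samples_g):
        if cooldown > 0:
            cooldown -= 1
        elif magnitude > threshold_g:
            triggers.append(i)
            cooldown = refractory
    return triggers

stream = [1.0, 1.1, 0.9, 3.2, 3.0, 1.0, 1.0, 2.8, 1.1]   # magnitudes in g
print(detect_triggers(stream))   # -> [3, 7]: two gestures, two cued samples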
DIVAs
Technology controlled by the performer is front and centre in
DIVAs (Digital Ventriloquized Actors), a current performance
project that builds on groundbreaking speech synthesis work
done at UBC by Sid Fels. Fels and his team developed a pair of
gloves that the wearer can manipulate to control frequency-based
speech synthesis parameters and create speech. Pritchard and Fels
then modified the gloves to support song and trigger additional
"The music comes first, then you choose
the tools to get at whatever emotion is
coming across."
sounds and video images, opening the door to a singer creating
her own audio-visual accompaniment.
Artisynth, a related project led by Fels, produces artificial
speech differently, by bringing together software models of the
vocal anatomy—jaw, lip, tongue, vocal cords, etc.—that we use
to talk. The Artisynth model
appears onscreen in DIVAs as
an animated face, and Pritchard
expects the DIVA team to have
a version working shortly that
will also generate the "voice"
produced by the gloves. With
sufficient processing power, the DIVA singer could control her
own animated, projected chorus, engaging the audience much
more effectively than a computer can.
Working with electroacoustic music technology, Pritchard
believes, "can inform the composer's ability to compose
acoustically. We teach it to undergraduates because it changes
the way you listen to sound and manipulate it." Bringing the
humanities into ICICS can also help us change the way we think
about technology.
For more information, contact Bob Pritchard
at dr.bob@ubc.ca
INTERACTIVE PERFORMANCE SOFTWARE
Max/MSP/Jitter is a highly sophisticated
but complex program for developing
interactive performance works. When Bob
Pritchard was asked to teach the music
technology module of an interdisciplinary
course in 2003, he wanted the students,
who had already done modules in creative
writing, dance, video, and circuitry, to work
with audio and video processing. He knew
they would never be able to master Max/
MSP/Jitter sufficiently in the short time
they had with him, so he teamed up with
fellow ICICS member Keith Hamel (Music)
and Nancy Nisbet of UBC Visual Arts to
develop a simplified set of modules, which
they called UBC Toolbox. They subsequently made the program available for free
on the Internet, and it is now in use around
the world. Not surprisingly, Pritchard was
awarded a Killam Teaching Prize in 2005.
AT THE
LEADING EDGE
OF THE
AEROSPACE
INDUSTRY
Helping Aircraft Manufacturers
Get It Right the First Time
THE AEROSPACE INDUSTRY faces some of the world's most
exacting manufacturing standards. Parts manufacturers need
to be certified by an international association. Detailed records
must be kept about a part's history, so that malfunctions can be
traced to the machine it was made on, the cutting tool used, and
the operator involved. Materials must be light and strong, and
can be very expensive: the raw material for a titanium part, for
instance, can cost hundreds of thousands of dollars. There is no
room for error, and the competition is fierce.
Mechanical engineering professor Yusuf Altintas has a long
history of meeting these challenges. In the early 1980s, he developed
software for automating the manufacture of propellers for the highly
successful de Havilland Dash 8. Since then, he has become one
of the world's foremost experts in virtual machining, or "milling"
parts in computer simulations that factor in the relevant physics:
characteristics and behaviour of the material, tool vibrations,
cutting mechanics, temperature, kinematics, etc. The software
packages he has developed allow engineers to test the interaction
between materials and cutting tools before machining takes place,
and make adjustments to maximize productivity. High-precision
parts can be made right, at the optimal speed, the first time.

UBC Expertise Flies High

Altintas' Manufacturing Automation Laboratory (MAL) at UBC
is considered the best virtual-machining centre in the world.
CutPro, a package of science-based virtual-machining algorithms
developed at the MAL, has been licensed by over 130 companies
and research centres worldwide. In recent years, Altintas' methods
have been used by Boeing, Airbus, and Bombardier to cut aluminum
wing panels; ASCO Aerospace Canada Ltd. to mill titanium flap and
slat tracks for a number of different aircraft; Pratt & Whitney
Canada to machine business-jet engine impellers; and Hitachi
to manufacture hydro turbines. Altintas has held the NSERC-
Pratt & Whitney Canada Chair in Virtual High-Performance
Machining since 2002.
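To give a flavour of what "milling in software" evaluates, here is a toy mechanistic cutting-force estimate. The coefficients and cut dimensions are invented for illustration; CutPro's models account for far more physics (vibrations, kinematics, temperature) than this single formula.

# Toy mechanistic cutting-force estimate, in the spirit of "milling parts in
# software before cutting metal". Coefficients and dimensions are invented for
# illustration; real packages such as CutPro model far more physics than this.

def tangential_cutting_force(cutting_coeff_mpa, axial_depth_mm, chip_thickness_mm):
    """F_t ~ K_t * a * h : force grows with depth of cut and chip thickness."""
    area_mm2 = axial_depth_mm * chip_thickness_mm   # uncut chip cross-section
    return cutting_coeff_mpa * area_mm2              # MPa * mm^2 = N

# Hypothetical titanium-like coefficient vs. a lighter aluminum-like one:
for material, k_t in [("titanium (assumed K_t)", 1900.0), ("aluminum (assumed K_t)", 800.0)]:
    f_t = tangential_cutting_force(k_t, axial_depth_mm=2.0, chip_thickness_mm=0.1)
    print(f"{material}: ~{f_t:.0f} N tangential force per tooth")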
Committed to Research
"Roughly eighty percent of the world's aerospace companies
use our algorithms," Altintas says. So do many other industries.
They often ask Altintas to consult on specific manufacturing
problems, but he usually turns them down. Instead, he turns
the problem over to one of his graduate students as a fundamental research question. In exchange for advising the company on
the use of his software, Altintas receives equipment and material for the MAL. The machine-tool company Mori Seiki, for
instance, donated two machining centres worth $800,000 that
Altintas and his students use for testing their algorithms. Top
cutting-tool companies such as Sandvik Coromant (Sweden),
Kennametal (USA), and Mitsubishi Materials (Japan) also use
his technology, and provide expensive tools in return. "I am
very committed to fundamental research," Altintas asserts,
"which in my case embodies teaching graduate students, with
the problems being defined by real-world scenarios. Teaching
and research are a professor's primary job."
Altintas' heart has always been in academia. He turned down
a job offer from General Motors Canada to come to UBC in
1986 because, he joked, the fishing was better in BC. A number
of years later he confessed to the GM executive who tried to
hire him that he'd only caught half a dozen trout since coming to BC. The man did some quick calculations and informed
Altintas that those fish had cost him several hundred thousand
dollars. His students and the aerospace industry would say they
were worth it.
For more information, contact
Yusuf Altintas at altintas@mech.ubc.ca
Keeping
Canadian
Manufacturing
Competitive
In his spare time, Yusuf Altintas is
Scientific Director of a Natural Sciences
and Engineering Research Council of
Canada (NSERC) Strategic Network that is
developing advanced, science-based virtual
machining technology for the Canadian
automotive, aerospace, machinery, energy,
biomedical, and die-mold industries.
With funding over 5 years of $5 million
from NSERC and $350,000 from industry,
the Canadian Network for Research and
Innovation in Machining Technology
(CANRIMT) is made up of 20 academic
researchers from 7 universities across
Canada, as well as researchers from the
National Research Council's Aerospace
Manufacturing Technology Centre and
industrial partners Pratt & Whitney
Canada, Bombardier Aerospace, ASCO
Aerospace Canada Ltd., Automation
Tooling Systems Inc., Memex Automation
Inc., Origin International Inc., and
Promation    Engineering    Ltd.    Altintas
and his colleagues have involved the
industrial partners from the start, to keep
the research on track. The software tools
they develop will be easily integrated into
existing CAD/CAM systems, so that parts
manufacturing can be optimized virtually,
without costly shop-floor trials. The end
goal is to keep the Canadian manufacturing
sector competitive.

CONVERTING 2D VIDEO TO 3D
Taking Cues from Nature
3D TVs, cellphones, games boxes, and movie theatres are
becoming more and more common, but 3D content creation is
lagging behind, since it is so difficult and expensive. Some 3D
TVs currently on the market can convert 2D content to 3D, but
with limited success. A new conversion technique invented in
the UBC 3D Innovation Lab, however, is extremely promising,
since it mimics nature.
The system extracts from a 2D video or image the range of cues
humans draw upon to build up a 3D view of the environment, along
with the relationships among those cues, and rejects any cues that
are inappropriate for the scene. The
end result is a highly accurate 3D version of a 2D original. The
conversion will happen in real-time on a TV, PVR, cellphone,
or any other display device where a 3D view is desirable.
Navigating using a 3D version of Google maps would be much
more intuitive, and who doesn't want to see Eartha Kitt as a 3D
Catwoman in the 1960s Batman series?
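As a rough illustration of cue fusion, the sketch below blends several per-pixel depth-cue maps into one depth map and skips cues judged unreliable for the scene. The cue maps, weights, and threshold are placeholders, not the 3D Innovation Lab's actual algorithm.

import numpy as np

# Illustrative sketch of cue fusion for 2D-to-3D conversion: combine several
# per-pixel depth estimates, dropping cues judged unreliable for the scene.
# The cue maps and weights below are placeholders, not the lab's method.

def fuse_depth_cues(cue_maps, weights, reliability_threshold=0.2):
    """Weighted average of per-pixel depth cues, skipping unreliable ones."""
    fused = np.zeros_like(next(iter(cue_maps.values())))
    total_weight = 0.0
    for name, depth_map in cue_maps.items():
        w = weights.get(name, 0.0)
        if w < reliability_threshold:   # e.g. motion parallax in a static shot
            continue
        fused += w * depth_map
        total_weight += w
    return fused / total_weight

h, w = 4, 6  # tiny frame for demonstration
cues = {
    "perspective_lines": np.tile(np.linspace(1.0, 0.0, w), (h, 1)),  # far at left
    "occlusion_order":   np.full((h, w), 0.5),
    "motion_parallax":   np.zeros((h, w)),
}
depth = fuse_depth_cues(cues, weights={"perspective_lines": 0.6,
                                       "occlusion_order": 0.4,
                                       "motion_parallax": 0.1})
print(depth.round(2))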
In related work, the researchers are developing a host of
techniques that will allow the viewer to comfortably sense
depth and benefit from the entire colour gamut, without having
to wear glasses. We will feature this work in an upcoming issue.
For more information,
contact Panos Nasiopoulos at panos@icics.ubc.ca
EVOLUTION HAS EQUIPPED US WITH TWO EYES FOR REDUNDANCY, BUT ALSO
FOR DEPTH PERCEPTION, SO WE CAN SUCCESSFULLY NEGOTIATE THE WORLD.
Two separate views of a scene
are projected onto our retinae, offset slightly because
of the distance between our
eyes, then fused by the brain to help us
perceive depth. Perspective lines, shadows, blocking of background objects
by foreground objects, and motion-
parallax cues, where objects in the foreground seem to shift faster than those
in the background as we move our head
from side to side, are also important
cues. Since about 1860, we have been
able to mimic this aspect of human vision by looking at two offset photographs of the same scene through a
prism stereoscope—with enough staring, they converge into a three-dimensional image.
By the mid-1950s, colour 3D movies were the craze. Two images of the
same movie were projected onto the
screen through polarizing filters. Audience members wore cheap glasses with
polarizing filters that let light through
according to the image appropriate for
each eye. Telescopes and other props
that swung out at the audience were
common features of these B movies;
character development and plot were a
little thin.
With recent movies such as Avatar
and Clash of the Titans (see animation
article on Pages 6-7), 3D has moved
into the mainstream. The audience still
has to wear glasses, and much of the
content is computer-generated, since
3D filming of real actors and scenes
is so difficult, requiring a number of
16   Fall/Winter 2011 different    cameras
shooting     simultaneously.    For    the
same   reason,    3D
televisions    already
on the market suffer from a dearth of content beyond
video games, which are computer-
generated. Viewers still need to wear
glasses, with lenses wirelessly synchronized to two overlapping images
on the screen, so the correct
eye sees the corresponding
image when it should. Glasses are not required with new
autostereoscopic designs;
however, these displays have
low resolution, since pairs
of pixels are required to display each set of offset images. They also have a limited
number of 3D viewpoints,
so motion-parallax cues are
of minimal use.
HIGH-RESOLUTION 3D
WITHOUT GLASSES
A team of researchers led
by Boris Stoeber are
drawing on their combined expertise to address these problems. The crux of
their approach is to use individual
pixels for up to 30 viewpoints,
thereby avoiding the loss of resolution inherent in autostereoscopic
displays. Stoeber, who holds a joint
appointment in Mechanical Engineering and Electrical and Computer Engineering (ECE), will design
microelectromechanical systems
(MEMS) that house optical elements
and rotate rapidly back and forth to
project multiple views of the same
"Combining MEMS technology
with the right optics components,
electronics, and rendering
approaches will lead to the next
generation of 3D displays."
scene.
System-on-a-chip expert
Shahriar Mirabbasi (ECE) will develop advanced micro-circuitry to
generate the voltage necessary to rotate the MEMS platforms. Computer graphics specialist Sid Fels (ECE)
will focus on rendering the images,
using techniques he developed in a
previous project that allow information for multiple viewpoints to be
delivered in a single frame of video.
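One simple way to picture "multiple viewpoints in a single frame" is spatial interleaving, where each viewpoint owns every n-th column of the frame. The sketch below shows that general idea only; it is not necessarily the encoding used in Fels' project.

import numpy as np

# Hedged sketch: pack several viewpoints into one frame by interleaving them
# column by column. This illustrates the general idea of multi-view frames,
# not necessarily the actual encoding used in the project described above.

def interleave_views(views):
    """views: list of equally sized (H, W) arrays, one per viewpoint.
    Returns an (H, W * n) frame whose columns cycle through the views."""
    n = len(views)
    h, w = views[0].shape
    frame = np.empty((h, w * n), dtype=views[0].dtype)
    for k, view in enumerate(views):
        frame[:, k::n] = view          # view k owns every n-th column
    return frame

views = [np.full((2, 3), fill_value=k) for k in range(4)]   # 4 dummy viewpoints
print(interleave_views(views))
# Columns read 0,1,2,3, 0,1,2,3, ... so a display (or its optics) can route
# each column to the matching viewing direction.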
Stoeber and his team are do-it-
yourselfers, and will manufacture
many of the display's novel components at UBC, in the Advanced
Materials and Process Engineering
Laboratory (AMPEL).
"Combining   MEMS   technology
with the right optics components,
electronics,     and    rendering    approaches," Stoeber says, "will lead to
the next generation of 3D displays." Since MEMS-based 3D
displays could be mass produced, Stoeber believes they
will one day also be used in
biomedical applications such
as     computed     tomography
(CT) and image-guided surgery, as well as in cell phones
and other handheld devices.
Filming 3D content for these
displays would also be much
easier, since it would require only
two   cameras   running   simultaneously.
For more information, contact
Boris Stoeber at
stoeber@mech.ubc.ca
ICICS/TELUS
People & Planet Friendly Home Initiative
An ICICS/TELUS consortium that promotes sustainability
while maintaining and improving quality of life.
Read more about this unique project in the next issue of Innovations.
SHADOW REACHING
IN THE EARLY NINETEENTH CENTURY, BLACKBOARDS BEGAN REPLACING SLATES IN THE CLASSROOM,
AND PUBLIC EDUCATION BECAME A MUCH MORE SOCIAL, COLLABORATIVE EXPERIENCE, MEDIATED BY
CHALK IN THE TEACHER'S HAND. WALL-SIZED DISPLAY SCREENS NOW IN LIMITED USE HAVE THE
POTENTIAL TO SIMILARLY REVOLUTIONIZE COLLABORATION, WHETHER IN THE CLASSROOM OR A
UTILITIES CONTROL ROOM. WHAT'S MISSING IS THE CHALK.
Traditional means of interacting with computers by
keyboard and mouse are not practical for large-
screen displays, which are designed for collaborative
work. Information needs to be graphically displayed
and manipulated on the screen in a way that makes
sense to the audience. With touch screens as large as the 5 m x 3
m display in ICICS' Interactive Workroom, a step ladder would
be needed to reach all areas of the screen. Because these screens
have been deployed without a suitable means of interaction,
they have remained largely in the realm of research.
Garth Shoemaker was Director of Research at Idelix
Software, a developer of mapping software, before coming
to UBC to pursue a PhD in human-computer interaction.
His supervisor, computer scientist Kellogg Booth, is an
internationally recognized expert in the field. Under Booth's
direction, Shoemaker has applied his practical experience to
the missing "chalk" problem. His working principle is that for
interaction with large screens to be useful, the audience needs
to be able to connect it to the user—to see the user's hand draw
a line around an island on a map, for example. Shoemaker
realized that if the user's shadow were projected onto the screen,
the closer they got to the apparent light source, the larger their
shadow would become; with it, they could reach all areas of the
screen. Having had his eureka moment, he set himself the task
of generating a virtual shadow of the user that could interact
with the screen.
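The geometry behind that eureka moment is simple enough to sketch: cast each tracked body point onto the wall plane from a virtual light source behind the user. The coordinates below are invented, and the real Shadow Reaching system drives this with visual and magnetic tracker data and a full rendering pipeline.

# Minimal geometric sketch of the Shadow Reaching idea: cast a body point onto
# the wall from a virtual light source behind the user. Coordinates are made up;
# the real system drives this with visual/magnetic tracker data.

def project_to_wall(light, body_point, wall_z=0.0):
    """Intersect the ray light -> body_point with the wall plane z = wall_z."""
    lx, ly, lz = light
    bx, by, bz = body_point
    t = (wall_z - lz) / (bz - lz)          # how far along the ray the wall lies
    return (lx + t * (bx - lx), ly + t * (by - ly))

light = (0.0, 2.0, 6.0)                     # virtual light, 6 m from the wall
hand  = (0.5, 1.5, 2.0)                     # user's hand, 2 m from the wall
print(project_to_wall(light, hand))         # shadow of the hand on the screen

# Step closer to the light (farther from the wall) and the same arm movement
# sweeps a larger region of the screen, which is what lets the user reach the top.
hand_near_light = (0.5, 1.5, 4.0)
print(project_to_wall(light, hand_near_light))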
Collaboration Technology
for Large Interactive
Wall Screens
"This is a recurring theme in my work," Shoemaker says,
"drawing from the real world and leveraging that in the virtual
world." After consulting with Booth, virtual-reality expert Sid
Fels, computer-animation specialist Michiel van de Panne,
psychologist Jocelyn Keillor (National Research Council),
and other psychology and kinesiology researchers, he devised
a technique for rendering a virtual shadow of the user. Using
visual and magnetic trackers and a virtual light source, the
shadow is "cast" onto the
screen as if by a real light.
Modified Nintendo Wiimote
controllers function as input
devices. Shoemaker calls
the technology "Shadow
Reaching."
Another advantage of rendering users as virtual
shadows is that screen detail can be seen through them. "It's a
delicate balance," Shoemaker points out. "You want something
that provides the information but doesn't get in the way." Icons
for different tools, such as a virtual pen, are located in various
places on the shadow. Data are stored in the shadow's "stomach,"
and shared by passing folders to other users' shadows. By using
more comprehensive tracking techniques, Shoemaker can
generate a much richer three-dimensional body model that can
be rendered on the screen in a variety of ways. Collaborators
in different cities, for example, may be identified by rendering
them in a more lifelike, recognizable way.
FUTURE TRACKING
SHOEMAKER BELIEVES THAT TRACKING WILL ONE
DAY BE DONE exclusively with cameras, so the user will not
need to wear tracking markers, nor use an input device like a
Wiimote. An instructor in a lecture hall, for example, will be
tracked by depth-sensing cameras and rendered on a large wall
screen. They will interact with it by simply pointing to the area
of the screen where they
want certain information
to appear, then touching
their finger to their thumb
as a "click". Static, sequential
PowerPoint presentations
may go the way of the slate.
Shoemaker's research
has been supported by
SMART Technologies of Calgary, manufacturers of interactive
whiteboards, and by Defence Research and Development
Canada. It is also part of the on-going research program of the
new GRAND Network of Centres of Excellence led by Booth
(see Focus, Spring 2010). Some form of it may be coming soon
to a classroom or utilities control centre near you.
For more information,
contact Garth Shoemaker at garths@cs.ubc.ca
Bring your friends list
to the TV screen.
Introducing Facebook on Optik TV.
A first in Canada, only from TELUS.
To learn more about Facebook on Optik TV™ and for
details about our latest offers visit telus.com/optiktv
^rtELUS
the future is friendly"
©TELUS 2011 11J0319
UBC's Institute for Computing, Information and Cognitive Systems (ICICS) is an umbrella
organization that promotes collaboration among researchers from the faculties of Applied
Science, Arts, Commerce, Education, Forestry, Medicine, and Science, and with industry.
ICICS facilitates the collaborative multidisciplinary research of approximately 150 faculty
members and 800 graduate students in these faculties.
Our members attract approximately $18 million annually in grants and contracts. Their work
strengthens Canada's strategic Science and Technology research priority areas, benefiting
all Canadians.
ICICS
a place of mind
THE UNIVERSITY OF BRITISH COLUMBIA
CONNECTING KNOWLEDGE
PUBLICATIONS MAIL AGREEMENT NO. 40049168
RETURN UNDELIVERABLE CANADIAN ADDRESSES TO:
ICICS, University of British Columbia
289-2366 Main Mall
Vancouver, BC V6T 1Z4
info@icics.ubc.ca
www.icics.ubc.ca
