ICICS
CONNECTING KNOWLEDGE

Innovations magazine
Fall/Winter 2011

IN THE SPIRIT OF MARCONI
EDGING UP TO ANIMATION'S UNCANNY VALLEY
INTERACTING WITH MUSIC
AT THE LEADING EDGE OF THE AEROSPACE INDUSTRY
SHADOW REACHING
and more...

a place of mind
THE UNIVERSITY OF BRITISH COLUMBIA

IN THE SPIRIT OF MARCONI
Using cellphones as mobile relay terminals would improve reception and help the environment.

WHO TO HELP FIRST IN A DISASTER?
Life-or-death decisions are made easier by a simulator that's the envy of emergency operations centres everywhere.

SHADOW REACHING
Wall-sized display screens are great for collaboration and teaching, but only with the right "chalk".

innovations
fall/winter 2011

Editor: Sharon Cavalier, ICICS Administrator
Writer: Craig Wilson, ICICS Communication Writer
Design: Industry Design, www.industrydesign.ca
Office: ICICS, University of British Columbia, 289-2366 Main Mall, Vancouver, BC, Canada V6T 1Z4
Tel: 604-822-6894
Fax: 604-822-9013
Email: info@icics.ubc.ca

director's desk
ICICS members are doing important and interesting work, but not enough people beyond the academic community and related industry partners know about it. With this issue of Innovations magazine (formerly the newsletter Focus), we are reaching out to the wider community with a fresh new look and a more accessible style.
We know from events we have held recently that people beyond UBC, whether the general public, industry personnel, or other academics, wish to connect with us, but often don't know where to begin. 
Redesigning our main publication as a magazine that appeals to the general reader and broadening its distribution base will help bridge this gap. ICICS is a forum, and we encourage interested readers at all levels to engage with us.
In this issue, we highlight advances ICICS researchers are making in aircraft manufacturing, electroacoustic music, robotic surgery, 3D video, film animation, collaboration technology, and more. With over 150 researchers from departments across the campus collaborating on projects, we have an embarrassment of riches when it comes to deciding which ones to profile. We hope you will enjoy this slice.
Panos Nasiopoulos
ICICS Director

NOVEL GUIDANCE SYSTEM FOR ROBOTIC SURGERY
Higher-precision robotic surgery will mean better outcomes for cancer patients.

EDGING UP TO ANIMATION'S UNCANNY VALLEY
Camera-based technique can produce a range of facial expressions not yet seen in animated films.

CONVERTING 2D VIDEO TO 3D / BUILDING A BETTER 3D DISPLAY
Novel glasses-free display technology and 2D-to-3D conversion process could change the 3D experience.

In the Spirit of Marconi
IN DECEMBER 1901, GUGLIELMO MARCONI TRANSMITTED THE FIRST TRANSATLANTIC RADIO SIGNAL, THE LETTER "S" IN MORSE CODE, FROM CORNWALL, ENGLAND TO SIGNAL HILL IN ST. JOHN'S, NEWFOUNDLAND. SEVENTY-THREE YEARS LATER, HIS DAUGHTER FOUNDED THE MARCONI SOCIETY TO PROMOTE AWARENESS OF MAJOR INNOVATIONS IN COMMUNICATIONS.
Recently, the society's antennae picked up on the work of Diomidis Michalopoulos, a Killam Postdoctoral Fellow at UBC supervised by Electrical and Computer Engineering professor Robert Schober. Michalopoulos was given the society's Young Scholar Award in 2010 for his innovations in cooperative wireless communications. 
Recipients are\nconsidered to have already had an impact\nin their field, and must be no older than 27,\nMarconi's age when he made his landmark\ntransmission. They are also seen as\npotential future candidates for the Marconi\nAward, the equivalent of the Nobel Prize in\ncommunications science. Only two other\nyoung researchers worldwide were given\nthe award in 2010.\nUSING CELLPHONES\nAS RELAY TERMINALS\nIn cooperative communications, relay\nterminals are used to forward information\nfrom source to destination terminals. In a\ncellular network, these might be simpler\nand consume less energy than large terminal\nhubs. Michalopoulos' innovative research\nlooks at using mobile relay terminals in\nthe network, such as cellphones. He won\nthe Young Scholar Award for protocols he\ndeveloped for selecting relays, based on\naverage channel conditions and specified\nenergy consumption. A network using\nthese protocols would work well in areas of\nlow signal strength, and be able to re-route\naround obstacles.\nFairness guides the selection\nof individual phones as relay\nterminals in Michalopoulos'\nprotocols; all phones involved\nultimately consume equal amounts\nof power. By sacrificing a little, each\nuser gains a lot. Indirectly, so does the\nenvironment: the network would operate\nat reduced transmission power, without\nthe need for fixed relay stations.\nMichalopoulos came to UBC in 2009\nfrom Aristotle University of Thessaloniki,\nGreece. Robert Schober is happy he did.\n\"I feel very privileged,\" he says, \"to have\nDiomidis in my group. He is a truly\noriginal thinker, and his work could\nmove the industry in a new direction.\"\nEarlier this year, Michalopoulos was\nhonoured once more when the Canadian\ngovernment awarded him a Banting\nPostdoctoral Fellowship. 
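The fairness principle behind these protocols can be illustrated with a short Python sketch. This is our toy version, not Michalopoulos' published algorithm: relay duty rotates toward the least-drained phones, and the best average channel is chosen among them, so power consumption evens out across users over time. All field names and numbers are invented for illustration.

```python
def select_relay(relays, tolerance=1.0):
    """Choose the next relay phone for a transmission.

    A toy sketch of fairness-based relay selection -- NOT Michalopoulos'
    published protocol. Each relay reports an average channel gain and
    the energy it has consumed so far. We shortlist the least-drained
    phones (within `tolerance` of the minimum), then pick the best
    average channel among them, so energy use evens out over time.
    """
    least_used = min(r["energy_used"] for r in relays)
    candidates = [r for r in relays if r["energy_used"] <= least_used + tolerance]
    return max(candidates, key=lambda r: r["avg_channel_gain"])

phones = [
    {"id": "A", "avg_channel_gain": 0.8, "energy_used": 5.0},
    {"id": "B", "avg_channel_gain": 0.9, "energy_used": 9.0},  # best channel, but over-used
    {"id": "C", "avg_channel_gain": 0.6, "energy_used": 5.2},
]
best = select_relay(phones)  # fairness rules out B; A beats C on channel quality
```

By sacrificing a little channel quality on any single hop, the network keeps every battery alive longer, which is the trade-off the article describes.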
Named after the Canadian co-discoverer of insulin, this is a highly competitive international competition, with only 70 fellowships awarded annually, worth $70,000 per year for two years.
When you take a call in your basement or on a mountaintop in the next few years, you may have Diomidis Michalopoulos to thank for it.
For more information, contact Diomidis Michalopoulos at dio@ece.ubc.ca

NOVEL GUIDANCE SYSTEM FOR ROBOTIC SURGERY
IN LAPAROSCOPIC SURGERY, SURGICAL TOOLS AND A CAMERA ARE INSERTED ON LONG ARMS THROUGH SMALL INCISIONS. The surgeon performs the operation looking at a monitor, with the camera controlled by an assistant. In robotic surgery, also performed through small incisions, the surgeon looks into a console and controls three or four arms with "wrists" on the end that have surgical tools and a 3D camera attached. The wrists provide many more degrees of freedom than are available in laparoscopic surgery, and the system is much more intuitive and precise.
ICICS researchers are dramatically improving the guidance system of the state-of-the-art da Vinci surgical robot. Led by electrical engineering professor Tim Salcudean, they are fusing preoperative ultrasound and MRI images with ultrasound and X-ray images taken during surgery, for real-time tool guidance that will minimize tissue and nerve damage. The system will also correct for tissue and target movement and deformation during the operation.
The team is focusing on prostate- and kidney-cancer treatment, where minimally invasive surgery is crucial. The techniques they are developing could be applied to a number of procedures in the future, with a profound impact on healthcare: better surgical outcomes, shorter hospital stays, faster recovery times. The supplier, Intuitive Surgical, clearly thinks so; they have donated a second robot to the project that complements one purchased through an ICICS-led infrastructure grant. 
With da Vinci robots now at both UBC and Vancouver General Hospital, UBC is one of only three centres in the world to have two surgical robots dedicated to research and teaching.
For more information, contact Tim Salcudean at tims@ece.ubc.ca

An array of video cameras and strobe lights captures a wide range of facial expressions.

Edging Up to Animation's Uncanny Valley
WHEN CHILDREN WERE SHOWN EARLY VERSIONS OF THE ANIMATED MOVIE SHREK, THEY LOVED THE GREEN OGRE WITH THE HEART OF GOLD THE FILM WAS NAMED AFTER.
But they started crying when his human love interest Princess Fiona showed up; the animators made her less lifelike, and the kids loved her too. Fiona had fallen into the "Uncanny Valley," a term from robotics describing a narrow region where the robot is lifelike enough to resemble a human, but with something wrong. The effect on the viewer is revulsion, perhaps because it triggers an evolutionary response related to mate selection or avoidance of disease. No robot designer or animator has yet been able to cross the uncanny valley with a creation that's 100 percent lifelike.
The Dolby Research Chair in Computer Science, Wolfgang Heidrich, may be getting close. Heidrich developed the image-processing algorithms underlying high-dynamic-range (HDR) display technology invented at UBC by physicist Lorne Whitehead. The resulting display has a contrast ratio much closer to what we perceive in the real world than that of conventional displays. Brightside Technologies Inc., the spinoff company formed to commercialize HDR technology, was acquired by Dolby Laboratories in 2007.

Capturing Detail and Emotion
Heidrich's adherence to lifelike rendering of reality has guided his recent work on facial capture for film animation. 
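Facial capture of this kind recovers 3D shape by triangulating features seen from two slightly offset cameras. As a back-of-envelope illustration of that stereo principle (not Heidrich and Bradley's actual pipeline, and with every number invented), depth follows directly from how far a feature shifts between the paired views:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Classic pinhole-stereo relation: depth = f * B / d.

    Illustrative of the principle only. In the system described here,
    the "features" matched between paired frames are pores, follicles,
    and blemishes; parameter values below are made up.
    """
    if disparity_px <= 0:
        raise ValueError("feature must appear in both views")
    return focal_px * baseline_m / disparity_px

# A facial feature shifted 400 px between two cameras 10 cm apart,
# imaged at a focal length of 2000 px, lies 0.5 m from the rig.
depth = depth_from_disparity(2000.0, 0.10, 400.0)
```

Repeating this for every matched feature in a patch yields the per-patch depth images that are stitched into a 3D mesh of the face.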
Along with PhD student Derek Bradley, Heidrich has developed a novel technique for animating faces, based on capturing real faces with high-definition video cameras. The subject (or actor) sits facing a semi-circular array of paired video cameras, zoomed in to the pore level. Each stereo pair captures a detailed patch of the subject's face. Pores, hair follicles, and blemishes are used as reference points to create stereo depth images from the two cameras' adjacent frames, which are then rendered as a 3D mesh of the actor's face.
In traditional capture techniques, markers are worn by the actor as reference points, which is not only awkward but also produces animation based on geometry rather than detail. Heidrich's system is markerless, uses relatively inexpensive commercial video cameras, and produces highly detailed animation that can later be altered; virtual makeup can be applied, for example.
"Current capture systems," Heidrich points out, "are limited when it comes to facial animations. That's why they mostly get used for creating aliens." His system, on the other hand, with its level of detail, can capture a range of expressions. A library of different expressions and positions could be created for a given actor, and used in different scenes.

Merging Capture and Animation
Computer simulation is the other main approach currently used in animation, for scenes that capture is unsuitable for, such as a tsunami. Heidrich's colleague Robert Bridson is a pioneer in physics-based animation, where simulations are guided by the laws of physics. The water scenes in Avatar, for example, were done using fluid simulation software developed by Bridson's startup company, Exotic Matter AB. 
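At its core, physics-based animation advances the state of a scene frame by frame under physical law. The toy sketch below (ours, and nothing like a production fluid solver) shows the idea for free-falling droplets:

```python
def step_particles(positions, velocities, dt, gravity=-9.81):
    """Advance free-falling 2D particles one frame with a symplectic
    Euler step (update velocity from gravity, then position from the
    new velocity).

    A toy illustration of the principle behind physics-based animation;
    production solvers such as Bridson's use far more sophisticated
    numerics (pressure projection, advection schemes), but the idea of
    stepping simulation state forward under physical law is the same.
    """
    new_vel = [(vx, vy + gravity * dt) for vx, vy in velocities]
    new_pos = [(x + vx * dt, y + vy * dt)
               for (x, y), (vx, vy) in zip(positions, new_vel)]
    return new_pos, new_vel

# One 1/10-second frame for a droplet released at (0, 10) moving right.
pos, vel = step_particles([(0.0, 10.0)], [(1.0, 0.0)], dt=0.1)
```

Everything a fluid solver adds (incompressibility, viscosity, surface tension) refines this same loop of forces updating velocities updating positions.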
Heidrich's long-term goal is to merge his capture techniques with Bridson's physics-based simulations, initially for fluids. "It would be the first systematic attempt," he says, "to tackle tight integration of visual measurement and physical simulation." Both Bridson's work on fluids and Heidrich's facial capture research are supported by the GRAND Network of Centres of Excellence led by computer scientist and ICICS member Kellogg Booth (see Focus, Spring 2010).
Rendering of the eyes remains an obstacle in facial capture, because of their shine. Once that crucial hurdle is overcome, Heidrich's system may become a key component in spanning the uncanny valley.
For more information, contact Wolfgang Heidrich at heidrich@cs.ubc.ca

REALISTIC SIMULATIONS FOR FILM
Image courtesy of Exotic Matter AB
Photorealistic simulations of smoke, liquids, and clothing that obey the laws of physics can be created using software developed by computer scientist Robert Bridson. The software has been used in films such as the Harry Potter series, 10,000 BC, Hellboy II: The Golden Army, The Dark Knight, and Inkheart. A scene from Avatar created using Bridson's techniques, in which one of the Na'vi drinks rain water from a leaf, won the "Best Single Visual Effect of the Year" award in 2010 from the Visual Effects Society. Eventually, it may be possible to merge the capture method described in the adjacent article with these techniques to produce highly convincing simulations.
Watch for an article on Bridson's start-up company, Exotic Matter AB, in an upcoming issue.

PURSUE A CAREER IN THE SOFTWARE INDUSTRY!
cs.ubc.ca/mss
a place of mind
THE UNIVERSITY OF BRITISH COLUMBIA

WHO TO HELP FIRST IN A DISASTER?
DURING DISASTERS, TOUGH DECISIONS NEED TO BE MADE. IF AN EARTHQUAKE KNOCKS OUT MOST OF A CITY'S WATER SUPPLY, HOW MUCH SHOULD BE DIVERTED FROM HOUSEHOLDS TO HOSPITALS? WHAT ROADS SHOULD BE REPAIRED FIRST? WHO NEEDS ELECTRICITY THE MOST?
A multidisciplinary team led by power-systems expert Jose Marti has developed a simulator to help infrastructure managers prepare for disasters, and make the right decisions in the midst of crisis. Disasters such as earthquakes make the linkages among critical infrastructures (power, water, transportation, communications, hospitals, etc.) painfully clear, yet coordinating responses is problematic. For business and security reasons, managers are reluctant to share their infrastructure data. They also naturally want their damaged systems to get the most attention.
Marti and his colleagues have devised techniques to protect the privacy of infrastructure data while information is exchanged to enable high-level, real-time decision making. Scenarios can be run in advance to prepare for disasters, with the system learning from the results. Human factors such as where a manager's kids go to school can be taken into consideration. These unique features have caught the attention of researchers and security agencies around the world. Marti is now the key player in an international effort involving 7 countries and 40 collaborators. 
Closer to home, his system was chosen from over 30 others to assist in planning and real-time decision support during the 2010 Winter Olympics in Vancouver.
An Emergency Operations Centre now under construction at UBC is expected to become a global centre for infrastructure interdependencies research.
For more information, contact Jose Marti at jrms@ece.ubc.ca

Interacting with Music
IN 2000, ICICS broadened membership eligibility to all researchers across the UBC campus. Bob Pritchard and Keith Hamel from the School of Music heeded the call, and were instrumental in designing the sound studio that became part of a major addition to the ICICS building.
The result is a state-of-the-art facility that is one of the few studios in the world capable of supporting 64-channel sound. The two composers have used the studio extensively since, producing an impressive body of interactive works and novel music technologies. Hamel's automated score-following work takes him to Paris regularly, where he works with the score-following team at IRCAM (Institut de Recherche et Coordination Acoustique/Musique), one of the world's leading electroacoustic music research centres. Pritchard's interactive works have won awards and been performed around the world.

Award-Winning Interactive Piece
Strength is a 12-minute work composed by Pritchard for alto sax, with interactive video and audio clips processed in real-time by the Max/MSP/Jitter software package. 
In the work, the camera pans up two rotating male bodies while water runs down them. Roses lie against various parts of their bodies, the petals of which fall to their feet at the end of the piece.
These images, and the sounds of machinery, are tied together by the saxophone, in a commentary on durability, impermanence, and transformation. "Like most of my work," Pritchard says, "Strength confronts life, death, and resurrection." With the help of cinematographer Cathryn Robertson, Pritchard shot Strength on a shoestring budget, using available objects (a wading pool, garden hose, whiskey barrel, hockey sticks) to construct the set, then edited the footage using the ICICS Video Edit Suite. His artistic vision prevailed, and Strength was awarded a Unique Award of Merit by the Canadian Society of Cinematographers in 2007.

Putting the Performer in Charge
Until fairly recently, the musician in Pritchard's interactive pieces synchronized their performance to the audio and video clips. In his current work, technological advances such as score-following allow the performer to control the flow of the piece, which is in keeping with Pritchard's philosophy of composition.
"The music comes first," he stresses, "then you choose the tools to get at whatever emotion is coming across." Motion sensors are one such tool. In a piece for alto flute he is currently composing, the musician wears an accelerometer that looks like a watch on her wrist. Certain gestures she makes while playing cue audio samples. Other sounds as well as video clips are triggered by Max/MSP/Jitter messages embedded in the score, which the computer follows based on pitch, using Hamel's score-following software. 
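Gesture cueing of this kind boils down to watching a sensor stream for a distinctive spike. The sketch below is purely illustrative (the actual piece routes messages through Max/MSP/Jitter and Hamel's score follower): a cue fires when the wrist accelerometer's reading jumps past a threshold, with a lockout so one gesture cannot fire twice.

```python
def detect_cues(samples, threshold=2.5):
    """Return indices where a wrist 'flick' should trigger an audio sample.

    Illustrative only -- not the Max/MSP/Jitter patch used in the piece.
    `samples` are acceleration magnitudes (in g, values invented); a cue
    fires on crossing `threshold`, then the detector disarms until the
    motion settles below half the threshold, so a single sustained
    gesture produces a single cue.
    """
    cues, armed = [], True
    for i, a in enumerate(samples):
        if armed and a > threshold:
            cues.append(i)
            armed = False      # wait for the gesture to finish
        elif a < threshold * 0.5:
            armed = True       # re-arm once motion settles
    return cues

# Two distinct flicks separated by a quiet stretch -> two cues.
cues = detect_cues([0.1, 0.2, 3.0, 3.2, 1.0, 0.3, 2.8])
```

In performance, each cue index would be mapped to a sample or video clip, leaving the tempo and phrasing entirely in the player's hands.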
"The performer can be as expressive as she wants to be," Pritchard points out, "providing a much better musical experience for both the performer and the audience." He is now looking at creating a dance piece where the dancer wears accelerometers on her ankles and wrists to trigger interactive sounds and video.

DIVAs
Technology controlled by the performer is front and centre in DIVAs (Digital Ventriloquized Actors), a current performance project that builds on groundbreaking speech synthesis work done at UBC by Sid Fels. Fels and his team developed a pair of gloves that the wearer can manipulate to control frequency-based speech synthesis parameters and create speech. Pritchard and Fels then modified the gloves to support song and trigger additional sounds and video images, opening the door to a singer creating her own audio-visual accompaniment.
Artisynth, a related project led by Fels, produces artificial speech differently, by bringing together software models of the vocal anatomy (jaw, lip, tongue, vocal cords, etc.) that we use to talk. The Artisynth model appears onscreen in DIVAs as an animated face, and Pritchard expects the DIVA team to have a version working shortly that will also generate the "voice" produced by the gloves. With sufficient processing power, the DIVA singer could control her own animated, projected chorus, engaging the audience much more effectively than a computer can.
Working with electroacoustic music technology, Pritchard believes, "can inform the composer's ability to compose acoustically. 
We teach it to undergraduates because it changes the way you listen to sound and manipulate it." Bringing the humanities into ICICS can also help us change the way we think about technology.
For more information, contact Bob Pritchard at dr.bob@ubc.ca

INTERACTIVE PERFORMANCE SOFTWARE
Max/MSP/Jitter is a highly sophisticated but complex program for developing interactive performance works. When Bob Pritchard was asked to teach the music technology module of an interdisciplinary course in 2003, he wanted the students, who had already done modules in creative writing, dance, video, and circuitry, to work with audio and video processing. He knew they would never be able to master Max/MSP/Jitter sufficiently in the short time they had with him, so he teamed up with fellow ICICS member Keith Hamel (Music) and Nancy Nisbet of UBC Visual Arts to develop a simplified set of modules they called UBC Toolbox. They subsequently made the program available for free on the Internet, and it is now in use around the world. Not surprisingly, Pritchard was awarded a Killam Teaching Prize in 2005.

AT THE LEADING EDGE OF THE AEROSPACE INDUSTRY
Helping Aircraft Manufacturers Get It Right the First Time
THE AEROSPACE INDUSTRY faces some of the world's most exacting manufacturing standards. Parts manufacturers need to be certified by an international association. Detailed records must be kept about a part's history, so that malfunctions can be traced to the machine it was made on, the cutting tool used, and the operator involved. Materials must be light and strong, and can be very expensive; the raw material for a titanium part, for instance, can cost hundreds of thousands of dollars. There is no room for error, and the competition is fierce.
Mechanical engineering professor Yusuf Altintas has a long history of meeting these challenges. In the early 1980s, he developed software for automating the manufacture of propellers for the highly successful de Havilland Dash 8. Since then, he has become one of the world's foremost experts in virtual machining, or "milling" parts in computer simulations that factor in the relevant physics: characteristics and behaviour of the material, tool vibrations, cutting mechanics, temperature, kinematics, etc. The software packages he has developed allow engineers to test the interaction between materials and cutting tools before machining takes place, and make adjustments to maximize productivity. High-precision parts can be made right, at the optimal speed, the first time.

UBC Expertise Flies High
Altintas' Manufacturing Automation Laboratory (MAL) at UBC is considered the best virtual-machining centre in the world. CutPro, a package of science-based virtual-machining algorithms developed at the MAL, has been licensed by over 130 companies and research centres worldwide. In recent years, Altintas' methods have been used by Boeing, Airbus, and Bombardier to cut aluminum wing panels; ASCO Aerospace Canada Ltd. to mill titanium flap and slat tracks for a number of different aircraft; Pratt & Whitney Canada to machine business-jet engine impellers; and Hitachi to manufacture hydro turbines. Altintas has held the NSERC-Pratt & Whitney Canada Chair in Virtual High-Performance Machining since 2002.

Committed to Research
"Roughly eighty percent of the world's aerospace companies use our algorithms," Altintas says. So do many other industries. They often ask Altintas to consult on specific manufacturing problems, but he usually turns them down. Instead, he turns the problem over to one of his graduate students as a fundamental research question. In exchange for advising the company on the use of his software, Altintas receives equipment and material for the MAL. 
The machine-tool company Mori Seiki, for instance, donated two machining centres worth $800,000 that Altintas and his students use for testing their algorithms. Top cutting-tool companies such as Sandvik Coromant (Sweden), Kennametal (USA), and Mitsubishi Materials (Japan) also use his technology, and provide expensive tools in return. "I am very committed to fundamental research," Altintas asserts, "which in my case embodies teaching graduate students, with the problems being defined by real-world scenarios. Teaching and research are a professor's primary job."
Altintas' heart has always been in academia. He turned down a job offer from General Motors Canada to come to UBC in 1986 because, he joked, the fishing was better in BC. A number of years later he confessed to the GM executive who tried to hire him that he'd only caught half a dozen trout since coming to BC. The man did some quick calculations and informed Altintas that those fish had cost him several hundred thousand dollars. 
His students and the aerospace industry would say they were worth it.
For more information, contact Yusuf Altintas at altintas@mech.ubc.ca

Keeping Canadian Manufacturing Competitive
In his spare time, Yusuf Altintas is Scientific Director of a Natural Sciences and Engineering Research Council of Canada (NSERC) Strategic Network that is developing advanced, science-based virtual machining technology for the Canadian automotive, aerospace, machinery, energy, biomedical, and die-mold industries. With funding over 5 years of $5 million from NSERC and $350,000 from industry, the Canadian Network for Research and Innovation in Machining Technology (CANRIMT) is made up of 20 academic researchers from 7 universities across Canada, as well as researchers from the National Research Council's Aerospace Manufacturing Technology Centre and industrial partners Pratt & Whitney Canada, Bombardier Aerospace, ASCO Aerospace Canada Ltd., Automation Tooling Systems Inc., Memex Automation Inc., Origin International Inc., and Promation Engineering Ltd. Altintas and his colleagues have involved industrial partners from the start, to keep the research on track. The software tools they develop will be easily integrated into existing CAD/CAM systems, so that parts manufacturing can be optimized virtually, without costly shop-floor trials. The end goal is to keep the Canadian manufacturing sector competitive.

CONVERTING 2D VIDEO TO 3D
Taking Cues from Nature
3D TVs, cellphones, games boxes, and movie theatres are becoming more and more common, but 3D content creation is lagging behind, since it is so difficult and expensive. Some 3D TVs currently on the market can convert 2D content to 3D, but with limited success. 
A new conversion technique invented in the UBC 3D Innovation Lab, however, is extremely promising, since it mimics nature.
The range of cues humans draw upon to build up a 3D view of the environment, and the relationship of cues to one another, is extracted by the system from a 2D video or image to create its 3D counterpart. Cues that are inappropriate are rejected. The end result is a highly accurate 3D version of a 2D original. The conversion will happen in real-time on a TV, PVR, cellphone, or any other display device where a 3D view is desirable. Navigating using a 3D version of Google Maps would be much more intuitive, and who doesn't want to see Eartha Kitt as a 3D Catwoman in the 1960s Batman series?
In related work, the researchers are developing a host of techniques that will allow the viewer to comfortably sense depth and benefit from the entire colour gamut, without having to wear glasses. We will feature this work in an upcoming issue.
For more information, contact Panos Nasiopoulos at panos@icics.ubc.ca

BUILDING A BETTER 3D DISPLAY
EVOLUTION HAS EQUIPPED US WITH TWO EYES FOR REDUNDANCY, BUT ALSO FOR DEPTH PERCEPTION, SO WE CAN SUCCESSFULLY NEGOTIATE THE WORLD.
Two separate views of a scene are projected onto our retinae, offset slightly because of the distance between our eyes, then fused by the brain to help us perceive depth. Perspective lines, shadows, blocking of background objects by foreground objects, and motion-parallax cues, where objects in the foreground seem to shift faster than those in the background as we move our head from side to side, are also important cues. Since about 1860, we have been able to mimic this aspect of human vision by looking at two offset photographs of the same scene through a prism stereoscope; with enough staring, they converge into a three-dimensional image.
By the mid-1950s, colour 3D movies were the craze. 
Two images of the same movie were projected onto the screen through polarizing filters. Audience members wore cheap glasses with polarizing filters that let light through according to the image appropriate for each eye. Telescopes and other props that swung out at the audience were common features of these B movies; character development and plot were a little thin.
With recent movies such as Avatar and Clash of the Titans (see animation article on pages 6-7), 3D has moved into the mainstream. The audience still has to wear glasses, and much of the content is computer-generated, since 3D filming of real actors and scenes is so difficult, requiring a number of different cameras shooting simultaneously. For the same reason, 3D televisions already on the market suffer from a dearth of content beyond video games, which are computer-generated. Viewers still need to wear glasses, with lenses wirelessly synchronized to two overlapping images on the screen, so the correct eye sees the corresponding image when it should. Glasses are not required with new autostereoscopic designs; however, these displays have low resolution, since pairs of pixels are required to display each set of offset images. They also have a limited number of 3D viewpoints, so motion-parallax cues are of minimal use.

HIGH-RESOLUTION 3D WITHOUT GLASSES
A team of researchers led by Boris Stoeber are drawing on their combined expertise to address these problems. The crux of their approach is to use individual pixels for up to 30 viewpoints, thereby avoiding the loss of resolution inherent in autostereoscopic displays. 
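Instead of dividing pixels spatially among viewpoints, an approach like this multiplexes views in time: as the optics sweep back and forth, successive subframes are aimed at successive viewing directions. The sketch below is a schematic of that idea only (the sweep pattern and the 30-view figure from the article are the sole inputs; the uniform back-and-forth timing is our assumption):

```python
def view_for_subframe(subframe_index, num_views=30):
    """Map a display subframe to the viewpoint it should show.

    Schematic only: assumes the optical element sweeps uniformly back
    and forth, so view indices run 0..29 then back 29..0. Every view
    gets the full pixel resolution of the panel, instead of pixels
    being divided among views as in conventional autostereoscopic
    displays.
    """
    period = 2 * (num_views - 1)          # one full back-and-forth sweep
    phase = subframe_index % period
    return phase if phase < num_views else period - phase
```

A renderer would draw the scene from the camera pose matching each returned view index, so a viewer moving side to side sees genuine motion parallax.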
Stoeber, who holds a joint appointment in Mechanical Engineering and Electrical and Computer Engineering (ECE), will design microelectromechanical systems (MEMS) that house optical elements and rotate rapidly back and forth to project multiple views of the same scene.

System-on-a-chip expert Shahriar Mirabbasi (ECE) will develop advanced micro-circuitry to generate the voltage necessary to rotate the MEMS platforms. Computer graphics specialist Sid Fels (ECE) will focus on rendering the images, using techniques he developed in a previous project that allow information for multiple viewpoints to be delivered in a single frame of video.

Stoeber and his team are do-it-yourselfers, and will manufacture many of the display's novel components at UBC, in the Advanced Materials and Process Engineering Laboratory (AMPEL).

"Combining MEMS technology with the right optics components, electronics, and rendering approaches," Stoeber says, "will lead to the next generation of 3D displays." Since MEMS-based 3D displays could be mass produced, Stoeber believes they will one day also be used in biomedical applications such as computed tomography (CT) and image-guided surgery, as well as in cell phones and other handheld devices. Filming 3D content for these displays would also be much easier, since it would require only two cameras running simultaneously.

For more information, contact Boris Stoeber at stoeber@mech.ubc.ca

ICICS/TELUS People & Planet Friendly Home Initiative
An ICICS/TELUS consortium that promotes sustainability while maintaining and improving quality of life.
Read more about this unique project in the next issue of Innovations.

SHADOW REACHING
IN THE EARLY NINETEENTH CENTURY, BLACKBOARDS BEGAN REPLACING SLATES IN THE CLASSROOM, AND PUBLIC EDUCATION BECAME A MUCH MORE SOCIAL, COLLABORATIVE EXPERIENCE, MEDIATED BY CHALK IN THE TEACHER'S HAND. WALL-SIZED DISPLAY SCREENS NOW IN LIMITED USE HAVE THE POTENTIAL TO SIMILARLY REVOLUTIONIZE COLLABORATION, WHETHER IN THE CLASSROOM OR A UTILITIES CONTROL ROOM. WHAT'S MISSING IS THE CHALK.

Traditional means of interacting with computers by keyboard and mouse are not practical for large-screen displays, which are designed for collaborative work. Information needs to be graphically displayed and manipulated on the screen in a way that makes sense to the audience. With touch screens as large as the 5 m x 3 m display in ICICS' Interactive Workroom, a step ladder would be needed to reach all areas of the screen. Because these screens have been deployed without a suitable means of interaction, they have remained largely in the realm of research.

Garth Shoemaker was Director of Research at Idelix Software, a developer of mapping software, before coming to UBC to pursue a PhD in human-computer interaction. His supervisor, computer scientist Kellogg Booth, is an internationally recognized expert in the field. Under Booth's direction, Shoemaker has applied his practical experience to the missing "chalk" problem. His working principle is that for interaction with large screens to be useful, the audience needs to be able to connect it to the user: to see the user's hand draw a line around an island on a map, for example. Shoemaker realized that if the user's shadow were projected onto the screen, the closer they got to the apparent light source, the larger their shadow would become; with it, they could reach all areas of the screen.
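The geometry behind that realization is simple similar triangles: a shadow cast by a point light onto a flat screen is magnified by the ratio of the light-to-screen distance to the light-to-user distance, so stepping toward the virtual light enlarges the shadow's reach. A minimal sketch of that relation (function names and numbers are illustrative, not the Shadow Reaching implementation):

```python
def shadow_scale(light_to_screen: float, light_to_user: float) -> float:
    """Magnification of a shadow cast by a point light onto a flat screen.

    By similar triangles, a point on the user at distance `light_to_user`
    from the light lands on the screen (at `light_to_screen`) scaled by
    the ratio of the two distances.
    """
    if not 0.0 < light_to_user <= light_to_screen:
        raise ValueError("user must stand between the light and the screen")
    return light_to_screen / light_to_user


def project(point_xy: tuple[float, float],
            light_to_screen: float, light_to_user: float) -> tuple[float, float]:
    """Project a 2D body point (relative to the light axis, in the user's
    plane) onto screen coordinates."""
    s = shadow_scale(light_to_screen, light_to_user)
    return (point_xy[0] * s, point_xy[1] * s)


# A user standing halfway between a virtual light and a screen 6 m away
# is magnified 2x, so a fingertip raised 1.8 m reaches 3.6 m up the
# screen -- past the top of the 3 m Interactive Workroom display.
print(shadow_scale(6.0, 3.0))          # 2.0
print(project((0.5, 1.8), 6.0, 3.0))   # (1.0, 3.6)
```

Walking closer to the apparent light source shrinks `light_to_user` and grows the scale without bound, which is exactly why no step ladder is needed.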
Having had his eureka moment, he set himself the task of generating a virtual shadow of the user that could interact with the screen.

Collaboration Technology for Large Interactive Wall Screens

"This is a recurring theme in my work," Shoemaker says, "drawing from the real world and leveraging that in the virtual world." After consulting with Booth, virtual-reality expert Sid Fels, computer-animation specialist Michiel van de Panne, psychologist Jocelyn Keillor (National Research Council), and other psychology and kinesiology researchers, he devised a technique for rendering a virtual shadow of the user. Using visual and magnetic trackers and a virtual light source, the shadow is "cast" onto the screen as if by a real light. Modified Nintendo Wiimote controllers function as input devices. Shoemaker calls the technology "Shadow Reaching."

Another advantage of rendering users as virtual shadows is that screen detail can be seen through them. "It's a delicate balance," Shoemaker points out. "You want something that provides the information but doesn't get in the way." Icons for different tools, such as a virtual pen, are located in various places on the shadow. Data are stored in the shadow's "stomach," and shared by passing folders to other users' shadows. By using more comprehensive tracking techniques, Shoemaker can generate a much richer three-dimensional body model that can be rendered on the screen in a variety of ways.
Collaborators in different cities, for example, may be identified by rendering them in a more lifelike, recognizable way.

FUTURE TRACKING
SHOEMAKER BELIEVES THAT TRACKING WILL ONE DAY BE DONE exclusively with cameras, so the user will not need to wear tracking markers, nor use an input device like a Wiimote. An instructor in a lecture hall, for example, will be tracked by depth-sensing cameras and rendered on a large wall screen. They will interact with it by simply pointing to the area of the screen where they want certain information to appear, then touching their finger to their thumb as a "click". Static, sequential PowerPoint presentations may go the way of the slate.

Shoemaker's research has been supported by SMART Technologies of Calgary, manufacturers of interactive whiteboards, and by Defence Research and Development Canada. It is also part of the on-going research program of the new GRAND Network of Centres of Excellence led by Booth (see Focus, Spring 2010).
Some form of it may be coming soon to a classroom or utilities control centre near you.

For more information, contact Garth Shoemaker at garths@cs.ubc.ca

Bring your friends list to the TV screen.
Introducing Facebook on Optik TV.
A first in Canada, only from TELUS.
To learn more about Facebook on Optik TV™ and for details about our latest offers visit telus.com/optiktv
TELUS
the future is friendly
©TELUS 2011 11J0319

UBC's Institute for Computing, Information and Cognitive Systems (ICICS) is an umbrella organization that promotes collaboration among researchers from the faculties of Applied Science, Arts, Commerce, Education, Forestry, Medicine, and Science, and with industry. ICICS facilitates the collaborative multidisciplinary research of approximately 150 faculty members and 800 graduate students in these faculties.

Our members attract approximately $18 million annually in grants and contracts. Their work strengthens Canada's strategic Science and Technology research priority areas, benefiting all Canadians.

ICICS
a place of mind
THE UNIVERSITY OF BRITISH COLUMBIA
CONNECTING KNOWLEDGE

PUBLICATIONS MAIL AGREEMENT NO. 40049168
RETURN UNDELIVERABLE CANADIAN ADDRESSES TO:
ICICS, University of British Columbia
289-2366 Main Mall
Vancouver, BC V6T 1Z4
info@icics.ubc.ca
www.icics.ubc.ca"@en . "Titled "Focus" from 1990 to 2010, and "Innovations" from 2010 onward."@en . "Periodicals"@en . "Vancouver (B.C.)"@en . "QA75.5 .F628"@en . "QA75_5_F628_2011"@en . "10.14288/1.0115160"@en . "English"@en . "Vancouver : University of British Columbia Library"@en . "Vancouver : University of British Columbia Institute for Computing, Information and Cognitive Systems (ICICS)"@en . "Images provided for research and reference use only.
Permission to publish, copy, or otherwise use these images must be obtained from The University of British Columbia Institute for Computing, Information and Cognitive Systems (ICICS): http://www.icics.ubc.ca/index.php"@en . "Original Format: University of British Columbia. Archives"@en . "University of British Columbia"@en . "Computer systems"@en . "Innovations"@en . "Text"@en . ""@en .