Immersive Integration of Physical and Virtual Environments
We envision future work and play environments in which the user's computing interface is more closely integrated with the physical surroundings than today's conventional computer display screens and keyboards. We are working toward realizable versions of such environments, in which multiple video projectors and digital cameras enable every visible surface to be both measured in 3D and used for display. If the 3D surface positions are transmitted to a distant location, collaboration at a distance may also come to feel more like working in adjacent offices connected by large windows. In one prototype, depth maps are calculated from streams of video images, and the resulting 3D surface points are displayed to the user in head-tracked stereo. Another prototype allows direct "painting" onto movable objects, such as a dollhouse. One long-term goal is advanced training for trauma surgeons through immersive replay of recorded procedures. More generally, we hope to demonstrate that the principal interface of a future computing environment need not be limited to a screen the size of one or two sheets of paper. Just as a useful physical environment is all around us, so too can the increasingly ubiquitous computing environment be all around us, integrated intuitively with our physical surroundings.
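As a minimal illustrative sketch (not drawn from the abstract itself), the step from a per-pixel depth map to the 3D surface points that are then rendered in head-tracked stereo can be as simple as back-projecting each pixel through a pinhole camera model; the intrinsics and image size below are hypothetical placeholders, not the parameters of the actual prototype cameras.

    import numpy as np

    def depth_to_points(depth, fx, fy, cx, cy):
        """Back-project a depth map (in meters) into 3D points in the camera frame.

        depth: (H, W) array of per-pixel depth values.
        fx, fy, cx, cy: pinhole intrinsics (focal lengths and principal point, in pixels).
        Returns an (N, 3) array of [X, Y, Z] points; pixels with zero depth are dropped.
        """
        h, w = depth.shape
        u, v = np.meshgrid(np.arange(w), np.arange(h))   # pixel coordinates
        z = depth
        x = (u - cx) * z / fx                            # X = (u - cx) * Z / fx
        y = (v - cy) * z / fy                            # Y = (v - cy) * Z / fy
        pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
        return pts[pts[:, 2] > 0]                        # keep only valid (nonzero) depths

    # Hypothetical intrinsics for a 640x480 camera, with synthetic depths for illustration.
    depth_map = np.random.uniform(0.5, 3.0, size=(480, 640))
    points = depth_to_points(depth_map, fx=525.0, fy=525.0, cx=319.5, cy=239.5)
    print(points.shape)  # (307200, 3)

The resulting point set is what a head-tracked stereo renderer would draw from the viewer's current eye positions; computing the depth map itself from the camera streams is the harder problem and is not sketched here.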
Bio:
Henry Fuchs is the Federico Gil Professor of Computer Science, Adjunct
Professor of Biomedical Engineering, and Adjunct Professor of Radiation
Oncology at UNC Chapel Hill. He has been active in computer graphics
since the 1970s, with work spanning rendering algorithms (BSP trees),
hardware (Pixel-Planes and PixelFlow), virtual environments, tele-immersion
systems, and medical applications. He is a member of the National Academy
of Engineering, a fellow of the American Academy of Arts and Sciences,
recipient of the ACM SIGGRAPH Achievement Award, and the Academic Award
of the National Computer Graphics Association.