
Interpolating vertical parallax for an autostereoscopic 3D projector array

Formal Metadata

Title
Interpolating vertical parallax for an autostereoscopic 3D projector array
Title of Series
Part Number
6
Number of Parts
29
Author
License
CC Attribution - NoDerivatives 2.0 UK: England & Wales:
You are free to use, copy, distribute and transmit the work or content in unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor.
Identifiers
Publisher
Release Date
Language

Content Metadata

Subject Area
Genre
Abstract
CONTEXT: We present a technique for achieving tracked vertical parallax for multiple users across a variety of autostereoscopic projector array setups, including front- and rear-projection and curved display surfaces. This “hybrid parallax” approach provides immediate horizontal parallax as viewers move left and right, and tracked parallax as they move up and down, allowing cues such as 3D perspective and eye contact to be conveyed faithfully.

OBJECTIVE: Projector arrays are well suited to 3D displays because of their ability to generate dense and steerable arrangements of pixels. We have developed a new autostereoscopic display using a single dense row of 69 pico projectors. The projectors are focused on a 30x30 cm vertically anisotropic screen that scatters the light from each lens into a vertical stripe while preserving horizontal angular variation. Each viewer's eye observes the combined effect of image stripes from multiple projectors, which merge into a seamless 3D image. Since every viewer sees a different 3D image, each view can be customized with a different vertical perspective. Given a sparse set of tracked viewer positions, the challenge is to create a continuous estimate of viewer height and distance across all potential viewing angles, so that both tracked and untracked viewers receive a consistent vertical perspective.

METHOD: Rendering to a dense projector display requires multiple-center-of-projection imagery, as adjacent projector pixels diverge to different viewer positions. If a constant viewer height and distance is assumed for each projector, viewers may see significant cross-talk and geometric distortion, particularly when multiple viewers are in close proximity. We solve this problem with a custom GPU vertex shader projection that dynamically interpolates multiple viewer heights and distances within each projector frame.
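The continuous viewer estimate described above can be sketched as a piecewise-linear interpolation over the sparse tracked head positions, falling back to a default eye height and distance outside the tracked range. This is an illustrative sketch; the function names, the (angle, height, distance) tuple layout, and the default values are assumptions, not the authors' implementation.

```python
import bisect

def viewer_profile(tracked, default_height=1.6, default_dist=1.5):
    """Build a continuous estimate of viewer eye height and distance as a
    function of horizontal viewing angle, from a sparse set of tracked
    head positions. `tracked` is a list of (angle_deg, height_m, dist_m)
    tuples. Between tracked viewers we interpolate linearly; beyond the
    outermost tracked viewer we clamp to that viewer's values, and with
    no tracked viewers at all we return the defaults."""
    tracked = sorted(tracked)
    angles = [t[0] for t in tracked]

    def estimate(angle_deg):
        if not tracked:
            return default_height, default_dist
        if angle_deg <= angles[0]:
            return tracked[0][1], tracked[0][2]
        if angle_deg >= angles[-1]:
            return tracked[-1][1], tracked[-1][2]
        # Locate the two tracked viewers bracketing this viewing angle.
        i = bisect.bisect_right(angles, angle_deg)
        a0, h0, d0 = tracked[i - 1]
        a1, h1, d1 = tracked[i]
        t = (angle_deg - a0) / (a1 - a0)
        return h0 + t * (h1 - h0), d0 + t * (d1 - d0)

    return estimate
```

A viewer standing midway between two tracked heads then sees a perspective blended from both, which keeps the vertical perspective consistent as untracked viewers move through the field of view.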
Thus, each projector's image is rendered in a distorted manner representing multiple centers of projection, and it might show an object from above on the left and from below on the right.

RESULTS: We use a low-cost RGB depth sensor to simultaneously track multiple viewer head positions in 3D and interactively update the imagery sent to the array. Even though each user sees slices from multiple projectors, the perceived 3D image is consistent and smooth from any vantage point, with reduced cross-talk. This rendering framework also frees us to explore different projector configurations, including front- and rear-mounted projector arrays and non-flat screens. Our rendering algorithm adds no significant overhead, enabling realistic dynamic scenes. Our display produces full-color autostereoscopic 3D imagery with zero horizontal latency and a wide 110° field of view that can accommodate numerous viewers.

NOVELTY: While user tracking has long been used with single-user glasses-based displays, and with a single-user autostereoscopic display [Perlin et al. 2000], to update both horizontal and vertical parallax, our system is the first autostereoscopic projector array to incorporate tracking for vertical parallax. Our method could be adapted to other projector arrays [Rodriguez et al. 2007; Kawakita et al. 2012; Kovacs and Zilly 2012; Yoshida et al. 2011]. Furthermore, our display is reproducible with off-the-shelf projectors, screen materials, graphics cards, and video splitters.
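The per-vertex, multiple-center-of-projection idea can be illustrated on the CPU: for the horizontal viewing angle associated with a projector pixel column, look up an interpolated eye position and intersect the eye-to-vertex ray with the screen plane. The flat-screen geometry, the coordinate conventions, and the `estimate` callable (mapping a viewing angle in degrees to an eye height and distance) are assumptions for illustration, not the authors' shader code.

```python
import math

def project_vertex(vertex, view_angle_deg, estimate, screen_z=0.0):
    """Project one scene vertex (x, y, z) for a projector pixel column
    that is seen from horizontal viewing angle `view_angle_deg`.
    The eye position for that angle comes from `estimate`, so vertices
    rendered by the same projector can use different centers of
    projection, producing the distorted per-projector image described
    above. Returns the (x, y) hit point on the screen plane z = screen_z."""
    height, dist = estimate(view_angle_deg)
    theta = math.radians(view_angle_deg)
    # Eye position on an arc of radius `dist` in front of the screen.
    eye = (dist * math.sin(theta), height, screen_z + dist * math.cos(theta))
    vx, vy, vz = vertex
    # Parametric intersection of the eye->vertex ray with the screen plane.
    t = (screen_z - eye[2]) / (vz - eye[2])
    return (eye[0] + t * (vx - eye[0]), eye[1] + t * (vy - eye[1]))
```

In the real system this logic runs in the vertex shader, so the interpolation across viewer heights and distances happens per vertex within each projector frame at no significant extra cost.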