Stereoscopy-based procedural generation of virtual environments

Video in TIB AV-Portal: Stereoscopy-based procedural generation of virtual environments

Formal Metadata

Stereoscopy-based procedural generation of virtual environments
CC Attribution - NoDerivatives 2.0 UK: England & Wales:
You are free to use, copy, distribute and transmit the work or content in unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor.
Content Metadata

Abstract

Procedural generation of virtual scenes (e.g., complex cities with buildings of different sizes and heights) is widely used in the CG movie and videogame industries. Although such scenes are often visualized using stereoscopy, to our knowledge stereoscopy is not currently used as a tool in the procedural generation itself, even though a more comprehensive integration of stereoscopic parameters can play a relevant role in the automatic creation and placement of virtual models. In this paper, we show how to use stereoscopic parameters to guide the procedural generation of a scene in an open-source modeling software. Virtual objects can be automatically placed inside the stereoscopic volume so as to reach the maximum amount of parallax on screen, given a particular interocular distance, convergence plane and display size. The proposed approach makes it possible to regenerate a virtual scene for a particular visualization context, avoiding problems related to excessive positive parallax in the final rendering. Moreover, the approach can also be used to automatically detect window violations, by determining overlaps in the negative parallax area between models and the view frustums of the stereoscopic camera, and to apply proper solutions, e.g. the automatic placement of a floating window. © 2016, Society for Imaging Science and Technology (IS&T).
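The placement rule described in the abstract can be sketched with the standard parallel-camera parallax relation p = i · (1 − C/z), where i is the interocular distance, C the convergence distance and z the object depth: parallax is zero at the convergence plane, positive behind it and negative in front of it. The following Python sketch illustrates the idea under assumed parameters; the function names, the normalized-coordinate window-violation test, and all numeric values are illustrative assumptions, not the authors' actual implementation.

```python
def screen_parallax_mm(z_mm, interocular_mm, convergence_mm):
    """On-screen parallax (mm) of a point at camera depth z_mm,
    using p = i * (1 - C / z): zero at the convergence plane,
    positive behind it, negative in front of it."""
    return interocular_mm * (1.0 - convergence_mm / z_mm)


def max_depth_for_parallax(p_max_mm, interocular_mm, convergence_mm):
    """Farthest placement depth whose positive parallax stays within
    p_max, obtained by inverting the relation above:
    z = C / (1 - p_max / i).  Requires p_max < i (parallax on screen
    must stay below the viewer's eye separation to avoid divergence)."""
    return convergence_mm / (1.0 - p_max_mm / interocular_mm)


def violates_stereo_window(z_mm, x_extent_norm, convergence_mm):
    """Simplified window-violation candidate test (an assumed stand-in
    for the frustum-overlap test in the paper): an object in front of
    the convergence plane (negative parallax) whose horizontal screen
    extent, in normalized [-1, 1] coordinates, reaches the frame edge."""
    return z_mm < convergence_mm and abs(x_extent_norm) >= 1.0


if __name__ == "__main__":
    # Assumed viewing setup: 65 mm interocular, screen 2 m away,
    # positive parallax budget of 30 mm on screen.
    i, c, p_max = 65.0, 2000.0, 30.0
    z_far = max_depth_for_parallax(p_max, i, c)
    print(f"farthest allowed depth: {z_far:.1f} mm")
    print(f"parallax there: {screen_parallax_mm(z_far, i, c):.1f} mm")
    print("violation:", violates_stereo_window(1500.0, 1.0, c))
```

A generator could place buildings only at depths up to `max_depth_for_parallax` for the target display, and flag any `violates_stereo_window` hit for a floating-window fix.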