Data-Driven Procedural Landscape Modeling
Kenny Mitchell (Disney Research Zurich)
Gwyneth Bradbury (University College London)
Tim Weyrich (Disney Research Zurich)
Improvements in computer graphics continue to make virtual worlds look and feel more believable and realistic. This advance, however, comes at a huge price: financial investment in artists and designers to carry out increasingly time-consuming and detailed tasks.

This work presents several distinct projects that address the problems involved in the partially automated reconstruction of virtual environments, with particular focus on landscapes and natural terrain features. Multi-spectral stereo image capture is investigated with the aim of extracting useful information about a natural scene from a small database of images. Aerial and satellite footage is then investigated in order to create natural-looking environments that incorporate the statistical distributions of different types of vegetation, which can then be edited (importantly, in an invertible procedural model), amplifying the artist's creativity. The random forest classifier allows us to retain the connectivity of regions separated by stochastic patterns (typically observed in vegetation distribution).

We are also investigating image synthesis techniques to analyse image patterns and reapply them to target landscapes with procedural models. Presently, promising results are arising from the use of pyramid histograms, which capture the frequency profile of the input terrain reasonably effectively.

We are incorporating this work into an interactive height-field editing system for large-scale landscape design. Using interviews and further feedback from artists, we are steering development with the goal of amplifying the artist's workflow efficiency. Procedural terrain editing features under development include: procedural copy and paste of similar features to target locations; ridge feature drawing; mountain shaping; and a library of distinctive terrain feature maps from which variations can be applied in place on the map.
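The per-pixel vegetation classification mentioned above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes scikit-learn's RandomForestClassifier and uses synthetic multi-spectral reflectance features (R, G, B, near-infrared) in place of real aerial imagery.

```python
# Hypothetical sketch: classifying vegetation vs. bare terrain per pixel
# with a random forest. All data here is synthetic; a real pipeline would
# extract these features from registered multi-spectral aerial images.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_pixels = 1000

# Vegetation reflects strongly in the near-infrared band (last feature);
# bare rock has a flatter spectral profile. Means are illustrative only.
vegetation = rng.normal([0.10, 0.40, 0.10, 0.70], 0.05, (n_pixels, 4))
bare_rock = rng.normal([0.50, 0.45, 0.40, 0.30], 0.05, (n_pixels, 4))

X = np.vstack([vegetation, bare_rock])
y = np.array([1] * n_pixels + [0] * n_pixels)  # 1 = vegetation

clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(X, y)

# A new pixel with high NIR reflectance should classify as vegetation (1).
pred = clf.predict([[0.12, 0.38, 0.11, 0.68]])
print(pred[0])
```

In practice the classifier would label every pixel of an aerial image, producing a vegetation map whose connected regions survive the stochastic gaps typical of plant cover.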
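The pyramid-histogram descriptor can be sketched as below. This is an assumed NumPy-only formulation, not the authors' exact method: it concatenates normalized height histograms over a simple 2x2-average image pyramid, so the combined vector reflects the terrain's frequency profile from fine to coarse scales.

```python
# Sketch of a pyramid histogram descriptor for a height field (NumPy only).
# Heights are assumed normalized to [0, 1); pyramid depth and bin count
# are illustrative parameters.
import numpy as np

def pyramid_histogram(height, levels=3, bins=8):
    """Concatenate normalized height histograms at several resolutions."""
    descriptor = []
    h = height.astype(float)
    for _ in range(levels):
        hist, _ = np.histogram(h, bins=bins, range=(0.0, 1.0))
        descriptor.append(hist / hist.sum())
        # Downsample by 2x2 averaging for the next (coarser) level.
        h = h[: h.shape[0] // 2 * 2, : h.shape[1] // 2 * 2]
        h = h.reshape(h.shape[0] // 2, 2, h.shape[1] // 2, 2).mean(axis=(1, 3))
    return np.concatenate(descriptor)

terrain = np.random.default_rng(1).random((64, 64))
desc = pyramid_histogram(terrain)
print(desc.shape)  # (24,): 3 levels x 8 bins
```

Comparing such descriptors between an exemplar landscape and a synthesized one gives a simple multi-scale similarity measure for steering procedural parameters.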