Proceedings: GI 2015

Dynamic on-mesh procedural generation

Cyprien Buron, Jean-Eudes Marvie, Gaël Guennebaud, Xavier Granier

Proceedings of Graphics Interface 2015: Halifax, Nova Scotia, Canada, 3–5 June 2015, pp. 17–24

DOI 10.20380/GI2015.03

  • BibTeX

    @inproceedings{Buron:2015:10.20380/GI2015.03,
    author = {Buron, Cyprien and Marvie, Jean-Eudes and Guennebaud, Ga{\"e}l and Granier, Xavier},
    title = {Dynamic on-mesh procedural generation},
    booktitle = {Proceedings of Graphics Interface 2015},
    series = {GI 2015},
    year = {2015},
    issn = {0713-5424},
    isbn = {978-1-4822-6003-8},
    location = {Halifax, Nova Scotia, Canada},
    pages = {17--24},
    numpages = {8},
    doi = {10.20380/GI2015.03},
    publisher = {Canadian Human-Computer Communications Society},
    address = {Toronto, Ontario, Canada},
    }

Abstract

We present a method to synthesize procedural models with global structures, such as growing plants, on existing surfaces at interactive rates. More generally, our approach extends shape grammars to enable context-sensitive procedural generation on the GPU. Central to our framework is the unified representation of external contexts as texture maps. These generic contexts can be spatially varying parameters controlling the grammar expansion through very fast texture fetches (e.g., a density map). External contexts also include the shape of the underlying surface itself, which we represent as a texture atlas of geometry images. Extrusion along the surface is then performed by a marching rule working in texture space using indirection pointers. We also introduce a lightweight deformation mechanism for the generated geometry that maintains C1 continuity between the terminal primitives while accounting for shape and trajectory variations. Our method is entirely implemented on the GPU and allows highly detailed models to be generated dynamically on surfaces at interactive rates. Finally, by combining marching rules and generic contexts, users can easily guide the growing process by painting directly on the surface with live feedback of the generated model. This provides user-friendly editing in production environments.
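To illustrate the idea of a spatially varying context gating grammar expansion, here is a minimal CPU sketch in Python. It is not the paper's GPU implementation; the grammar, the nearest-neighbour "texture fetch", and all names (`sample_density`, `expand`) are illustrative assumptions standing in for the texture-map contexts described in the abstract.

```python
def sample_density(density_map, u, v):
    """Nearest-neighbour fetch from a 2D density grid, given UV coords in [0,1].
    Stands in for the fast texture fetch used to query an external context."""
    h = len(density_map)
    w = len(density_map[0])
    x = min(int(u * w), w - 1)
    y = min(int(v * h), h - 1)
    return density_map[y][x]

def expand(symbol, u, v, density_map, depth=0, max_depth=3):
    """Expand a toy branching grammar; the density context decides whether
    a branch keeps growing at this surface location."""
    if depth >= max_depth or sample_density(density_map, u, v) < 0.5:
        return [symbol]  # terminal: growth stops here
    # non-terminal: spawn two child branches at nearby UV positions
    out = []
    for du in (-0.1, 0.1):
        cu = min(max(u + du, 0.0), 1.0)
        out += expand(symbol + "'", cu, v, density_map, depth + 1, max_depth)
    return out

# Toy density map: right half dense, left half empty.
density = [[0.0, 1.0],
           [0.0, 1.0]]
print(expand("A", 0.75, 0.5, density))  # branches repeatedly in the dense region
print(expand("A", 0.25, 0.5, density))  # immediately terminal in the empty region
```

Painting on the surface, as the abstract describes, amounts to editing `density_map` and re-running the expansion, which is why a texture representation gives live feedback.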