Title: Efficient conservative collision detection for populated virtual worlds
Author: Ramires Fernandes, A.
Deusdado, Leonel
Keywords: Three-dimensional graphics and realism
Virtual reality animation
Computational geometry
Object modeling
Geometric algorithms
Issue Date: 2007
Publisher: The Eurographics Association
Citation: Ramires Fernandes, A.; Deusdado, Leonel (2007) - Efficient conservative collision detection for populated virtual worlds. In SIACG - Ibero American Symposium in Computer Graphics. Santiago de Compostela.
Abstract: Large virtual worlds with a considerable level of detail are starting to emerge everywhere, from large areas of actual cities to archaeological reconstructions of large sites. Populating a virtual world adds an extra touch to its visualization, but unfortunately it also places an extra burden on the system. Adding animated characters to a virtual world requires several tasks, such as collision detection, path planning and other AI algorithms, and the rendering of dynamic geometry, amongst others. Here a method for efficient and scalable conservative collision detection is presented that is able to deal with large scenes and thousands of avatars. The method does not perform exact collision detection, hence it is conservative. It is suitable as a basis for path planning and other AI algorithms in which an avatar is often regarded as 'something' that can be bounded by a cylinder or a box. The algorithm is capable of dealing with arbitrarily complex 3D worlds and does not require any a priori knowledge of the geometry.
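The abstract's notion of conservative detection, with avatars bounded by a cylinder or box, can be illustrated with a minimal sketch. This is not the paper's actual algorithm or data structures; the names (`AABB`, `cylinder_bounds`) and the axis-aligned box representation are assumptions made here purely for illustration. The key property is that overlapping bounds may report a collision that an exact test would reject, but a real collision is never missed:

```python
# Minimal sketch (illustrative only, not the paper's method): each avatar's
# bounding cylinder is enclosed in an axis-aligned bounding box (AABB).
# Testing box overlap is conservative: it may report false positives, but
# it can never miss an actual collision between the enclosed cylinders.

from dataclasses import dataclass

@dataclass
class AABB:
    min_x: float
    min_y: float
    min_z: float
    max_x: float
    max_y: float
    max_z: float

    def overlaps(self, other: "AABB") -> bool:
        # Two boxes are disjoint iff they are separated along some axis,
        # so they overlap iff their intervals intersect on every axis.
        return (self.min_x <= other.max_x and other.min_x <= self.max_x and
                self.min_y <= other.max_y and other.min_y <= self.max_y and
                self.min_z <= other.max_z and other.min_z <= self.max_z)

def cylinder_bounds(cx: float, cz: float, radius: float,
                    base_y: float, height: float) -> AABB:
    """Conservative AABB enclosing an avatar's upright bounding cylinder."""
    return AABB(cx - radius, base_y, cz - radius,
                cx + radius, base_y + height, cz + radius)

# Two avatars standing 0.8 units apart, each with a 0.5-radius cylinder:
# the boxes overlap, so a possible collision is (correctly) reported.
a = cylinder_bounds(0.0, 0.0, 0.5, 0.0, 1.8)
b = cylinder_bounds(0.8, 0.0, 0.5, 0.0, 1.8)
print(a.overlaps(b))  # True
```

Note that a box-versus-box test can also fire when the boxes touch only at their corners while the cylinders inside do not; that over-reporting is exactly the trade-off the abstract labels "conservative".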
Peer review: yes
Appears in Collections: IC - Articles in Proceedings Not Indexed in ISI/Scopus

Files in This Item:
Efficient Conservative Collision Detection for Populated virtual Worlds.pdf (646,46 kB, Adobe PDF)

