Volume 15, number 1-3 (2007)

  • Item
    Computation of tunnels in protein molecules using Delaunay triangulation
    (Václav Skala - UNION Agency, 2007) Medek, Petr; Beneš, Petr; Sochor, Jiří; Skala, Václav
    This paper presents a new method for the analysis of specific cavities in protein molecules. Long-term biochemical research has led to the discovery that protein molecule behaviour depends on the existence of cavities (tunnels) leading from the inside of the molecule to its surface. Previous methods of tunnel computation were based on space rasterization. Our approach is based on computational geometry and uses the Voronoi diagram and Delaunay triangulation. Our method computes tunnels of better quality in reasonable computational time. The proposed algorithm was implemented and tested on several real protein molecules and is expected to be used in various applications in protein modelling and analysis. This is an interesting example of applying computational geometry principles to practical problems.
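    For illustration only, the following sketch shows the bottleneck-path idea behind such a Delaunay-based tunnel search, assuming a single uniform atom radius, a simplex-adjacency graph and a user-supplied interior starting point; the clearance measure and the function itself are our simplifications, not the authors' algorithm.

```python
# Illustrative sketch: widest (bottleneck) path through the Delaunay
# triangulation of atom centres, from an interior simplex to the hull boundary.
import heapq
import numpy as np
from scipy.spatial import Delaunay

def widest_tunnel(atom_centres, atom_radius, start_point):
    tri = Delaunay(atom_centres)
    # clearance of a simplex: distance from its centroid to the nearest atom
    centroids = atom_centres[tri.simplices].mean(axis=1)
    def clearance(s):
        return np.linalg.norm(atom_centres - centroids[s], axis=1).min() - atom_radius

    start = int(tri.find_simplex(start_point))
    # Dijkstra-like search maximising the minimum clearance along the path
    best = {start: clearance(start)}
    heap = [(-best[start], start, [start])]
    while heap:
        neg_w, s, path = heapq.heappop(heap)
        if -1 in tri.neighbors[s]:          # reached the molecular surface
            return -neg_w, path             # bottleneck radius, simplex path
        for n in tri.neighbors[s]:
            if n == -1:
                continue
            w = min(-neg_w, clearance(n))
            if w > best.get(n, -np.inf):
                best[n] = w
                heapq.heappush(heap, (-w, n, path + [n]))
    return None
```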
  • Item
    Light octree: global illumination fast reconstruction and realtime navigation
    (Václav Skala - UNION Agency, 2007) Vivanloc, Vicent; Hoelt, Jean-Christophe; Hong, Coong Binh; Paulin, Mathias; Skala, Václav
    We present a method to rapidly build an irradiance cache based on a local illumination environment approach. This cache is obtained by a stream simplification of a photon map. The photons in each voxel are clustered with K-means into sets of virtual directional lights. These lights are stored in an irradiance texture to provide real-time rendering of a globally illuminated scene. This method can be integrated into an existing GPU shader to obtain complex material rendering and can be accelerated by texture atlases.
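    As a rough illustration of the clustering step (our sketch, not the paper's implementation; the photon layout and power weighting are assumptions), the photons of one voxel can be grouped into a few virtual directional lights like this:

```python
import numpy as np

def virtual_directional_lights(directions, powers, k=4, iters=20, seed=0):
    """directions: (n,3) unit vectors, powers: (n,3) RGB flux per photon."""
    rng = np.random.default_rng(seed)
    centres = directions[rng.choice(len(directions), k, replace=False)]
    for _ in range(iters):
        # assign each photon to the closest cluster direction
        labels = np.argmax(directions @ centres.T, axis=1)
        for c in range(k):
            members = directions[labels == c]
            if len(members):
                m = members.mean(axis=0)
                centres[c] = m / np.linalg.norm(m)
    # one virtual directional light per cluster: mean direction, summed power
    return [(centres[c], powers[labels == c].sum(axis=0)) for c in range(k)]
```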
  • Item
    Non-iterative computation of contact forces for deformable objects
    (Václav Skala - UNION Agency, 2007) Spillmann, Jonas; Becker, M.; Teschner, M.; Skala, Václav
    We present a novel approach to handle collisions of deformable objects represented by tetrahedral meshes. The scheme combines the physical correctness of constraint methods with the efficiency of penalty approaches. For a set of collided points, a collision-free state is computed that is governed by the elasticities and impulses of the collided objects. In contrast to existing constraint methods we show how to decouple the resulting system of equations in order to avoid iterative solvers. By considering the time step of the numerical integration scheme, the contact force can be analytically computed for each collided point in order to achieve the collision-free state. Since predicted information on positions, impulses, and penetration depths of the subsequent time step is considered, a collision-free state is maintained at each simulation step which is in contrast to existing penalty methods. Further, our approach does not require a user-defined stiffness constant. Our scheme can handle various underlying deformable models and numerical integration schemes. To illustrate its versatility, we have performed experiments with linear and non-linear finite element methods.
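    A minimal sketch of the analytic idea, assuming a symplectic Euler integrator and a known collision-free target position for a single point (our simplification, not the paper's coupled formulation):

```python
# With a symplectic Euler step x' = x + h*v + (h*h/m)*f, the force that moves a
# collided point exactly onto its collision-free target position follows in
# closed form, without an iterative solver.
import numpy as np

def contact_force(x, v, m, h, target):
    """x, v, target: (3,) position, velocity, collision-free position; m mass; h time step."""
    free_motion = x + h * v              # where the point would end up unforced
    return m * (target - free_motion) / (h * h)

# example: push a point penetrating 1 mm back onto the surface in one step
f = contact_force(np.zeros(3), np.zeros(3), m=0.01, h=1e-3,
                  target=np.array([0.0, 0.001, 0.0]))
```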
  • Item
    Efficient and accurate rendering of vector data on virtual landscapes
    (Václav Skala - UNION Agency, 2007) Schneider, Martin; Klein, Reinhard; Skala, Václav
    In geographical information systems (GIS) vector data has important applications in the analysis and management of virtual landscapes. Therefore, methods that allow combined visualization of terrain and geo-spatial vector data are required. Such methods have to adapt the vector data to the terrain surface and to ensure a precise and efficient mapping. In this paper, we present a method that is based on the stencil shadow volume algorithm and allows high-quality real-time overlay of vector data on virtual landscapes. Since the method is a screen-space algorithm it is per-pixel exact and does not suffer from aliasing artifacts like texture-based techniques. In addition, since the method is independent of the underlying terrain geometry, its performance does not depend on the complexity of the data set but only on the complexity of the vector data.
  • Item
    Real-time rendering of planets with atmospheres
    (Václav Skala - UNION Agency, 2007) Schafhitzel, Tobias; Falk, Martin; Ertl, Thomas; Skala, Václav
    This paper presents a real-time technique for planetary rendering and atmospheric scattering effects. Our implementation is based on Nishita’s atmospheric model, which describes actual physical phenomena by taking into account air molecules and aerosols, and on a continuous level-of-detail planetary renderer. We obtain interactive frame rates by combining the CPU-bound spherical terrain rendering with the GPU computation of the atmospheric scattering. In contrast to volume rendering approaches, the parametrization of the light attenuation integral we use makes it possible to pre-compute it completely. The GPU is used to determine the texture coordinates of the pre-computed 3D texture, taking into account the actual spatial parameters. Our approach benefits from its independence of the rendered terrain geometry. We demonstrate its utility by showing planetary renderings of Earth and Mars.
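    A simplified sketch of the kind of precomputation this enables, tabulating the optical depth of an exponential atmosphere by start altitude and view angle (planet constants and the 2D table layout are our placeholders; the paper uses a 3D texture):

```python
import numpy as np

R_PLANET, ATM_HEIGHT, SCALE_HEIGHT = 6360e3, 80e3, 8e3   # placeholder constants

def optical_depth(altitude, cos_theta, steps=64):
    p = np.array([0.0, R_PLANET + altitude])                        # start point
    d = np.array([np.sqrt(max(0.0, 1 - cos_theta**2)), cos_theta])  # view ray
    tau, t, dt = 0.0, 0.0, 2e3
    for _ in range(steps):
        q = p + (t + 0.5 * dt) * d
        h = np.linalg.norm(q) - R_PLANET
        if h > ATM_HEIGHT or h < 0.0:       # left the atmosphere or hit ground
            break
        tau += np.exp(-h / SCALE_HEIGHT) * dt
        t += dt
    return tau

# precompute a small 2D slice of the lookup table (altitude x view angle)
table = np.array([[optical_depth(a, c)
                   for c in np.linspace(-1, 1, 32)]
                  for a in np.linspace(0, ATM_HEIGHT, 32)])
```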
  • Item
    Extracting separation surfaces of path line oriented topology in periodic 2D time-dependent vector fields
    (Václav Skala - UNION Agency, 2007) Shi, Kuangyu; Theisel, Holger; Weinkauf, Tino; Hauser, Helwig; Hege, Hans-Christian; Seidel, Hans Peter; Skala, Václav
    This paper presents an approach to extracting the separation surfaces from periodic 2D time-dependent vector fields based on a recently introduced path line oriented topology. This topology is based on critical path lines which repeat the same spatial cycle per time period. Around those path lines there are areas of similar asymptotic flow behavior (basins) which are captured by a 2D Poincaré map as a discrete dynamical system. Due to pseudo discontinuities in this map and the discrete integration scheme, separatrices between the basins can’t be obtained as integral curves. Instead we choose a point-wise approach to segment the Poincaré map and apply image analysis algorithms to extract the 2D separation curves. Starting from those curves we integrate separation surfaces which partition the periodic 2D time-dependent vector field into areas of similar path line behavior. We apply our approach to a number of data sets to demonstrate its utility.
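    A toy sketch of the underlying Poincaré map and the point-wise basin labelling (the vector field, period and attractor handling below are placeholders, not the paper's data or extraction pipeline):

```python
import numpy as np
from scipy.integrate import solve_ivp

T = 2 * np.pi                                    # assumed time period

def v(t, x):                                     # placeholder periodic 2D field
    return [-x[1] + 0.1 * np.sin(t), x[0] - 0.1 * x[1]]

def poincare_map(x0):
    # advance a path line by exactly one period of the field
    sol = solve_ivp(v, (0.0, T), x0, rtol=1e-8, atol=1e-10)
    return sol.y[:, -1]

def basin_label(x0, attractors, iters=50, eps=1e-2):
    x = np.asarray(x0, float)
    for _ in range(iters):
        x = poincare_map(x)
        d = [np.linalg.norm(x - a) for a in attractors]
        if min(d) < eps:                         # orbit settled near a critical path line
            return int(np.argmin(d))
    return -1                                    # undecided / diverging
```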
  • Item
    Recognizing human motion using eigensequences
    (Václav Skala - UNION Agency, 2007) Bottino, Andrea; De Simone, Matteo; Laurentini, Aldo; Skala, Václav
    This paper presents a novel method for motion recognition. The approach is based on 3D motion data. The captured motion is divided into sequences, which are sets of contiguous postures over time. Each sequence is then classified into one of the recognizable action classes by means of a PCA based method. The proposed approach is able to perform automatic recognition of movements containing more than one class of action. The advantages of this technique are that it can be easily extended to recognize many action classes and, most of all, that the recognition process is real-time. In order to fully understand the capabilities of the proposed method, the approach has been implemented and tested in a virtual environment. Several experimental results are also provided and discussed.
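    A minimal sketch of per-class eigenspace classification in the spirit of the approach, assuming each sequence is flattened into a single vector (data layout and error measure are our assumptions):

```python
import numpy as np

def train_class(sequences, n_components=8):
    """sequences: (n_samples, seq_len * n_dofs) flattened posture sequences."""
    mean = sequences.mean(axis=0)
    u, s, vt = np.linalg.svd(sequences - mean, full_matrices=False)
    return mean, vt[:n_components]               # mean and eigensequences

def classify(sequence, models):
    errors = []
    for mean, basis in models:
        centred = sequence - mean
        recon = basis.T @ (basis @ centred)      # project onto the class eigenspace
        errors.append(np.linalg.norm(centred - recon))
    return int(np.argmin(errors))                # class with smallest reconstruction error
```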
  • Item
    Refining single view calibration with the aid of metric scene properties
    (Václav Skala - UNION Agency, 2007) Lourakis, Manolis I. A.; Argyros, Antonis A.; Skala, Václav
    Intrinsic camera calibration using a single image is possible provided that certain geometric objects such as orthogonal vanishing points and metric homographies can be estimated from the image and give rise to adequate constraints on the sought calibration parameters. In doing so, however, any additional metric information that might be available for the imaged scene is not always straightforward to accommodate. This paper puts forward a method for incorporating into the calibration procedure metric scene information expressed in the form of known segment 3D angles, equal but unknown 3D angles and known 3D length ratios. Assuming the availability of an initial calibration estimate, the proposed method refines the former by numerically minimizing an error term corresponding to the discrepancy between the scene's known metric properties and the values measured with the aid of the calibration estimate. Sample experimental results demonstrate the improvements in the intrinsic calibration estimates that are achieved by the proposed method.
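    One way such a refinement can be set up, sketched here under the assumption of zero skew, square pixels and constraints given as known angles between scene directions with detected vanishing points (our formulation, not the paper's parameterization):

```python
# With omega = K^-T K^-1, the 3D angle between two scene directions can be
# measured from their vanishing points v1, v2; known angles then yield
# residuals that scipy minimises over the intrinsic parameters.
import numpy as np
from scipy.optimize import least_squares

def K_from(params):
    f, u0, v0 = params
    return np.array([[f, 0, u0], [0, f, v0], [0, 0, 1.0]])

def measured_angle(K, v1, v2):
    omega = np.linalg.inv(K).T @ np.linalg.inv(K)
    c = (v1 @ omega @ v2) / np.sqrt((v1 @ omega @ v1) * (v2 @ omega @ v2))
    return np.arccos(np.clip(c, -1.0, 1.0))

def refine(params0, constraints):
    """constraints: list of (v1, v2, known_angle) with homogeneous vanishing points."""
    def residuals(p):
        K = K_from(p)
        return [measured_angle(K, v1, v2) - a for v1, v2, a in constraints]
    return least_squares(residuals, params0).x
```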
  • Item
    Splat-based ray tracing of point clouds
    (Václav Skala - UNION Agency, 2007) Linsen, Lars; Müller, Karsten; Rosenthal, Paul; Skala, Václav
    Point-based surface representations have gained increasing interest in the computer graphics community within the last decade. Surface splatting has established itself as one of the main rendering techniques for point clouds. We present a ray-tracing approach for objects whose surfaces are represented by point clouds. Our approach is based on casting rays and intersecting them with splats. Since ray-tracing methods require smoothly changing surface normals for producing the desired photorealistic results, splat generation must include the derivation of such normals. We determine a neighborhood around each point of the point cloud, estimate the surface normal at each of the points, compute splats with varying radii that cover the surface, and use the normals of all points that are covered by each splat to generate a smoothly varying normal field for each splat. This part of the computation is view-independent and, thus, can be precomputed. During the rendering step, ray-splat intersections are performed, where the normal at the intersection point is interpolated using local coordinates of the splat’s normal field. Care has to be taken where splats overlap. We speed up the computation of ray-splat intersections using an octree data structure.
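    A simplified sketch of a single ray-splat intersection with a linearly varying normal field (the normal-field encoding and local frame construction below are our assumptions):

```python
import numpy as np

def intersect_splat(ray_o, ray_d, centre, normal, radius, normal_grad):
    """normal_grad: (2,3) change of the normal along two tangent axes of the splat."""
    denom = ray_d @ normal
    if abs(denom) < 1e-9:
        return None                               # ray parallel to splat plane
    t = ((centre - ray_o) @ normal) / denom
    if t < 0:
        return None
    p = ray_o + t * ray_d
    offset = p - centre
    if offset @ offset > radius * radius:
        return None                               # hit outside the splat disc
    # local splat coordinates in the tangent plane
    u_axis = np.cross(normal, [0.0, 0.0, 1.0])
    if np.linalg.norm(u_axis) < 1e-6:
        u_axis = np.cross(normal, [0.0, 1.0, 0.0])
    u_axis /= np.linalg.norm(u_axis)
    v_axis = np.cross(normal, u_axis)
    uv = np.array([offset @ u_axis, offset @ v_axis])
    n = normal + normal_grad.T @ uv               # interpolated shading normal
    return t, p, n / np.linalg.norm(n)
```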
  • Item
    New method for opacity correction in oversampled volume ray casting
    (Václav Skala - UNION Agency, 2007) Lee, Jong Kwan; Newman, Timothy S.; Skala, Václav
    A new opacity correction approach for oversampled volume ray casting is introduced. While the only existing opacity correction method in the literature is based on the assumption of dataset homogeneity, the new opacity correction method introduced in this paper is a faster, generalized voxel-by-voxel approach which does not assume dataset homogeneity. The new opacity correction avoids the dataset homogeneity assumption by introducing a new opacity correction factor for the samples in each voxel. Its performance improvement over the existing opacity correction approach is also exhibited for real volumetric datasets.
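    For reference, the standard homogeneity-based correction that this work generalizes rescales the stored opacity with the ratio of the new and reference sample distances (our restatement):

```python
def corrected_opacity(alpha, d, d0):
    # opacity of a sample of length d, given the opacity alpha stored for length d0
    return 1.0 - (1.0 - alpha) ** (d / d0)

# oversampling by 4x: four corrected samples accumulate to the original opacity
a = 0.4
a4 = corrected_opacity(a, d=0.25, d0=1.0)
accumulated = 1.0 - (1.0 - a4) ** 4     # equals 0.4 up to rounding
```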
  • Item
    Evaluation of a bricked volume layout for a medical workstation based on Java
    (Václav Skala - UNION Agency, 2007) Kohlmann, Petr; Bruckner, Stefan; Kanitsar, Armin; Gröller, M. Eduard; Skala, Václav
    Volumes acquired for medical examination purposes are constantly increasing in size. For this reason, the computer’s memory is the limiting factor for visualizing the data. Bricking is a well-known concept used for rendering large data sets. The volume data is subdivided into smaller blocks to achieve better memory utilization. Until now, the vast majority of medical workstations have used a linear volume layout. We implemented a bricked volume layout for such a workstation based on Java, as required by our collaborating company partner, to evaluate different common access patterns to the volume data. For rendering, we were mainly interested in how the performance would differ from the traditional linear volume layout when generating images of arbitrarily oriented slices via Multi-Planar Reformatting (MPR). Furthermore, we tested access patterns which are crucial for segmentation tasks, such as random access to data values and a simulated region growing. Our goal was to find out whether it makes sense to change the volume layout of a medical workstation to benefit from bricking. We were also interested in identifying the tasks where problems might occur if bricking is applied. Overall, our results show that it is feasible to use a bricked volume layout in the stringent context of a medical workstation implemented in Java.
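    The address arithmetic behind a bricked layout, sketched for cubic bricks of edge length B (our illustration, not the evaluated workstation code):

```python
def bricked_index(x, y, z, dim_x, dim_y, B=32):
    # volume split into B^3 bricks (padded at the borders); a voxel is addressed
    # by its brick index plus its offset inside the brick
    bricks_x = (dim_x + B - 1) // B
    bricks_y = (dim_y + B - 1) // B
    brick = (z // B) * bricks_y * bricks_x + (y // B) * bricks_x + (x // B)
    offset = (z % B) * B * B + (y % B) * B + (x % B)
    return brick * B * B * B + offset            # position in the bricked array

def linear_index(x, y, z, dim_x, dim_y):
    return z * dim_y * dim_x + y * dim_x + x     # traditional linear layout
```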
  • Item
    Hardware-accelerated ray-triangle intersection testing for high-performance collision detection
    (Václav Skala - UNION Agency, 2007) Kim, Sung-Soo; Nam, Seung-Woo; Kim, Do-Hyung; Lee, In-Ho; Skala, Václav
    We present a novel approach for hardware-accelerated collision detection. This paper describes the design of the hardware architecture for primitive intersection testing components implemented on a multi-FPGA Xilinx Virtex-II prototyping system. The paper focuses on accelerating the ray-triangle intersection operation, which is one of the most important operations in various applications such as collision detection and ray tracing. Moreover, the proposed hardware architecture is general enough to support intersection operations for other object pairs, such as sphere vs. sphere, oriented bounding box (OBB) vs. OBB, cylinder vs. cylinder and so on. The result is a hardware-accelerated ray-triangle intersection engine that is capable of out-performing a 2.8GHz Xeon processor, running a well-known high-performance software ray-triangle intersection algorithm, by up to a factor of seventy. In addition, we demonstrate that the proposed approach can be faster than current GPU-based as well as CPU-based algorithms for ray-triangle intersection.
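    For context, one widely used software ray-triangle intersection test is the Möller-Trumbore algorithm, sketched below as a reference point for what the hardware engine accelerates (our illustration, not the FPGA design):

```python
import numpy as np

def ray_triangle(orig, dir, v0, v1, v2, eps=1e-9):
    e1, e2 = v1 - v0, v2 - v0
    pvec = np.cross(dir, e2)
    det = e1 @ pvec
    if abs(det) < eps:
        return None                  # ray parallel to the triangle plane
    inv_det = 1.0 / det
    tvec = orig - v0
    u = (tvec @ pvec) * inv_det
    if u < 0.0 or u > 1.0:
        return None
    qvec = np.cross(tvec, e1)
    v = (dir @ qvec) * inv_det
    if v < 0.0 or u + v > 1.0:
        return None
    t = (e2 @ qvec) * inv_det        # ray parameter and barycentric coordinates
    return (t, u, v) if t > eps else None
```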
  • Item
    Optimized continuous collision detection for deformable triangle meshes
    (Václav Skala - UNION Agency, 2007) Hutter, Marco; Fuhrmann, Arnulph; Skala, Václav
    We present different approaches for accelerating the process of continuous collision detection for deformable triangle meshes. The main focus is upon the collision detection for simulated virtual clothing, especially for situations involving a high number of contact points between the triangle meshes, such as multi-layered garments. We show how the culling efficiency of bounding volume hierarchies may be increased by introducing additional bounding volumes for edges and vertices of the triangle mesh. We present optimized formulas for computing the time of collision for these primitives analytically, and describe an efficient iterative scheme that ensures that all collisions are treated in the correct chronological order.
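    A sketch of the analytic time-of-collision computation for a vertex-triangle pair moving linearly during one time step (our formulation; the paper's optimized formulas and the edge-edge case are not reproduced):

```python
# The four points are coplanar when the scalar triple product of the edge
# vectors vanishes, which is a cubic in t; sampling it at four times inside the
# step recovers the cubic's coefficients exactly.
import numpy as np

def collision_times(x, v, h):
    """x, v: (4,3) positions and velocities of triangle vertices 0-2 and vertex 3."""
    def triple(t):
        p = x + t * v
        return np.dot(np.cross(p[1] - p[0], p[2] - p[0]), p[3] - p[0])
    ts = np.linspace(0.0, h, 4)
    coeffs = np.polyfit(ts, [triple(t) for t in ts], 3)    # exact: cubic through 4 points
    roots = np.roots(coeffs)
    real = roots[np.abs(roots.imag) < 1e-9].real
    return np.sort(real[(real >= 0.0) & (real <= h)])      # candidate contact times
```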
  • Item
    Painterly rendering framework from composition
    (Václav Skala - UNION Agency, 2007) Chu, Chi; Shih, Zen-Chung; Skala, Václav
    Painterly rendering has recently drawn considerable attention from graphics researchers. However, the state of the art is neither systematic nor evaluative. This work presents a novel painterly rendering framework. The painting process is decomposed into three stages to satisfy the needs of developers and users of painterly rendering algorithms and programs. The framework comprises three systems, namely the primitive mapping, rendering and mark systems, and is inspired by John Willats’ perceptual decomposition of the painting process presented in [Wil97]. Moreover, the rendering system is further decomposed into four independent modules, namely initial point, path, cross-section and color. The independence of each module makes new styles easy to generate by combining existing styles, or by constructing complex styles from simple ones. The proposed framework demonstrates the power of the painterly rendering algorithm, which can not only imitate existing styles but also generate new ones. Furthermore, parameters in the rendering system are specified hierarchically. Users only need to specify the user parameters, which are then automatically converted into system parameters during rendering. This approach is crucial to facilitating the use of the program by end-users.
  • Item
    Normal mapping for surfel-based rendering
    (Václav Skala - UNION Agency, 2007) Holst, Mathias; Schumann, Heidrun; Skala, Václav
    On the one hand, normal mapping is a common technique to improve normal interpolation of coarsely tessellated triangle meshes for realistic lighting. On the other hand, today’s graphics hardware allows texturing of view-plane-aligned point primitives. In this paper we illustrate how to use textured points together with normal mapping to increase surfel splatting quality, especially when using larger splats at lower levels of detail. In combination with a silhouette refinement, this results in a significant reduction of the number of required surfels with only small visual disadvantages. Furthermore, we explain how to create a normal map for points within a point hierarchy.
  • Item
    Instant animated grass
    (Václav Skala - UNION Agency, 2007) Habel, Ralf; Wimmer, Michael; Jeschke, Stefan; Skala, Václav
    This paper introduces a technique for rendering animated grass in real time. The technique uses front-to-back compositing of implicitly defined grass slices in a fragment shader and therefore significantly reduces the overhead associated with common vegetation rendering systems. We also introduce a texture-based animation scheme that combines global wind movements with local turbulences. Since the technique is confined to a fragment shader, it can be easily integrated into any rendering system and used as a material in existing scenes.
  • Item
    Annotating images through adaptation: an integrated text authoring and illustration framework
    (Václav Skala - UNION Agency, 2007) Götzelmann, Timo; Götze, Marcel; Ali, Kamran; Hartmann, Knut; Strothotte, Thomas; Skala, Václav
    This paper presents concepts to support authors in illustrating their texts. Our approach incorporates content- and feature-based retrieval techniques in multimedia databases containing 2D images and 3D models. Moreover, we provide tools (i) to adapt the retrieval results to contextual requirements and (ii) to ease their integration into target documents. For 3D models the adaptation comprises aspects of the image composition (i. e., the selection of an appropriate view and the spatial arrangement of visual elements) and the selection of appropriate parameters for the rendering process. In addition, secondary elements (e. g., textual annotations or associated visualizations) are smoothly integrated into adapted 2D or 3D illustrations. These secondary elements reveal details about the semantic content of illustrations and the author’s communicative intentions. They can ease the retrieval, reuse, and adaptation of illustrations in multimedia databases and are explicitly stored in conjunction with the adapted illustrations. Moreover, we developed a novel technique to support the mental reconstruction of complex spatial configurations by shape icons. With this illustration technique, shape properties of salient objects can be conveyed using abstract-shaped models. We present retrieval techniques to determine appropriate 3D models to be displayed as shape icons. These shape icons, along with the other secondary elements, are smoothly integrated into the illustration, which can be interactively explored by the user.
  • Item
    Reducing artifacts in surface meshes extracted from binary volumes
    (Václav Skala - UNION Agency, 2007) Bade, Ragner; Konrad, Olaf; Preim, Bernhard; Skala, Václav
    We present a mesh filtering method for surfaces extracted from binary volume data which guarantees a smooth and correct representation of the original binary sampled surface, even if the original volume data is inaccessible or unknown. This method reduces the typical block and staircase artifacts but adheres to the underlying binary volume data yielding an accurate and smooth representation. The proposed method is closest to the technique of Constrained Elastic Surface Nets (CESN). CESN is a specialized surface extraction method with a subsequent iterative smoothing process, which uses the binary input data as a set of constraints. In contrast to CESN, our method processes surface meshes extracted by means of Marching Cubes and does not require the binary volume. It acts directly and solely on the surface mesh and is thus feasible even for surface meshes of inaccessible or unknown volume data. This is possible by reconstructing information concerning the binary volume from artifacts in the extracted mesh and applying a relaxation method constrained to the reconstructed information.
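    A simplified sketch of a constrained relaxation step in the spirit of the method, clamping each vertex to a band around its original position derived from the voxel spacing (the band constraint and neighbourhood handling are our assumptions):

```python
import numpy as np

def constrained_smooth(verts, neighbours, band, iters=20, lam=0.5):
    """verts: (n,3); neighbours: list of index lists; band: max offset per vertex."""
    original = verts.copy()
    v = verts.copy()
    for _ in range(iters):
        avg = np.array([v[nb].mean(axis=0) for nb in neighbours])
        v = v + lam * (avg - v)                    # Laplacian smoothing step
        # clamp back onto the allowed band around the original position
        offset = v - original
        norm = np.linalg.norm(offset, axis=1, keepdims=True)
        too_far = norm[:, 0] > band
        v[too_far] = original[too_far] + offset[too_far] * (band / norm[too_far])
    return v
```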
  • Item
    A scale invariant detector based on local energy model for matching images
    (Václav Skala - UNION Agency, 2007) Ancuti, Cosmin; Bekaert, Philippe; Skala, Václav
    Finding corresponding feature points has been a challenge for many decades and has received considerable attention in computer vision. In this paper we introduce a new method for matching images. Our detection algorithm is based on the local energy model, a concept that emulates the human visual system. For true scale invariance we extend this detector using the automatic scale selection principle. Thus, at every scale level we identify points where the Fourier components of the image are maximally in phase and then extract only the feature points that maximize a normalized derivative function through scale space. To find corresponding points, a new method based on the Normalized Sum of Squared Differences (NSSD) is introduced. NSSD is a classical matching measure but is limited to the small-baseline case. Our descriptor is adapted to the characteristic scale and is also rotation invariant. Finally, experimental results demonstrate that our algorithm is reliable under significant changes of scale, rotation and image illumination.
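    A minimal sketch of the classical NSSD comparison of two descriptor patches (our restatement; the paper's scale- and rotation-adapted sampling is not reproduced):

```python
import numpy as np

def nssd(patch_a, patch_b, eps=1e-12):
    a = patch_a.astype(float).ravel()
    b = patch_b.astype(float).ravel()
    a = (a - a.mean()) / (a.std() + eps)      # normalise for brightness/contrast
    b = (b - b.mean()) / (b.std() + eps)
    return np.sum((a - b) ** 2) / a.size

def best_match(patch, candidates):
    scores = [nssd(patch, c) for c in candidates]
    return int(np.argmin(scores)), min(scores)
```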
  • Item
    Efficient Compression of 3D Dynamic Mesh Sequences
    (Václav Skala - UNION Agency, 2007) Amjoun, Rachida; Straßer, Wolfgang; Skala, Václav
    This paper presents a new compression algorithm for 3D dynamic mesh sequences based on local principal component analysis (LPCA). The algorithm clusters the vertices into a number of clusters using the local similarity between the trajectories in a coordinate system that is defined in each cluster, and thus transforms the original vertex coordinates into the local coordinate frame of their cluster. This operation leads to a strong clustering behavior of the vertices and makes each region invariant to any deformation over time. Then, each cluster is efficiently encoded with principal component analysis. The appropriate number of basis vectors to approximate each cluster is chosen optimally using a bit allocation process. For further compression, quantization and entropy encoding are used. According to the experimental results, the proposed coding scheme provides a significant improvement in compression ratio over existing coders.
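    A simplified sketch of the clustering-plus-PCA idea (clustering in world coordinates rather than the paper's local frames; data layout and k-means initialization are our assumptions):

```python
import numpy as np

def compress(anim, n_clusters=8, n_basis=4, seed=0):
    """anim: (n_frames, n_verts, 3) vertex positions over time."""
    frames, verts, _ = anim.shape
    traj = anim.transpose(1, 0, 2).reshape(verts, frames * 3)   # one row per vertex
    # crude k-means on trajectories to form the clusters
    rng = np.random.default_rng(seed)
    centres = traj[rng.choice(verts, n_clusters, replace=False)]
    for _ in range(15):
        labels = np.argmin(((traj[:, None] - centres[None]) ** 2).sum(-1), axis=1)
        for c in range(n_clusters):
            if np.any(labels == c):
                centres[c] = traj[labels == c].mean(axis=0)
    # per-cluster PCA: store mean, a few basis vectors and per-vertex coefficients
    model = []
    for c in range(n_clusters):
        rows = traj[labels == c]
        if rows.size == 0:
            continue
        mean = rows.mean(axis=0)
        u, s, vt = np.linalg.svd(rows - mean, full_matrices=False)
        basis = vt[:n_basis]
        coeffs = (rows - mean) @ basis.T
        model.append((np.where(labels == c)[0], mean, basis, coeffs))
    return model
```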