Issue 3
Showing items 1-20 of 41, sorted by issue date.
Item: Conservative Visibility and Strong Occlusion for Viewspace Partitioning of Densely Occluded Scenes
(Blackwell Publishers Ltd and the Eurographics Association, 1998) Cohen-Or, Daniel; Fibich, Gadi; Halperin, Dan; Zadicario, Eyal
Computing the visibility of outdoor scenes is often much harder than that of indoor scenes. A typical urban scene, for example, is densely occluded, and it is effective to precompute its visibility space, since from a given point only a small fraction of the scene is visible. The difficulty is that although the majority of objects are hidden, some parts might be visible at a distance in an arbitrary location, and it is not clear how to detect them quickly. In this paper we present a method to partition the viewspace into cells, each containing a conservative superset of the visible objects. For a given cell the method tests the visibility of all the objects in the scene. For each object it searches for a strong occluder which guarantees that the object is not visible from any point within the cell. We show analytically that in a densely occluded scene the vast majority of objects are strongly occluded, and that the overhead of using conservative visibility (rather than exact visibility) is small. These results are further supported by our experiments. We also analyze the cost of the method and discuss its effectiveness.

Item: Dithered Color Quantization
(Blackwell Publishers Ltd and the Eurographics Association, 1998) Buhmann, J. M.; Fellner, Dieter W.; Held, M.; Ketterer, J.; Puzicha, J.
Image quantization and digital halftoning are fundamental problems in computer graphics which arise when displaying high-color images on non-truecolor devices. Both steps are generally performed sequentially and, in most cases, independently of each other.
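For contrast with the joint approach this abstract proposes, the conventional sequential pipeline it criticizes can be sketched in a few lines. This is a hedched illustration, not the authors' method: a simple uniform grayscale quantizer followed by classic Floyd-Steinberg error diffusion, with the two stages entirely independent of each other.

```python
def quantize_then_dither(img, levels=4):
    """Conventional two-stage pipeline: a fixed uniform quantizer,
    then Floyd-Steinberg error diffusion. img is a 2D list of
    grayscale values in [0, 255]."""
    h, w = len(img), len(img[0])
    work = [list(map(float, row)) for row in img]   # mutable working copy
    out = [[0.0] * w for _ in range(h)]
    step = 255.0 / (levels - 1)
    for y in range(h):
        for x in range(w):
            old = work[y][x]
            idx = min(levels - 1, max(0, round(old / step)))
            new = idx * step                 # nearest palette level
            out[y][x] = new
            err = old - new                  # diffuse the quantization error
            if x + 1 < w:
                work[y][x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    work[y + 1][x - 1] += err * 3 / 16
                work[y + 1][x] += err * 5 / 16
                if x + 1 < w:
                    work[y + 1][x + 1] += err * 1 / 16
    return out
```

Over a constant-gray region the dithered pattern's mean approaches the original gray level even though every output pixel takes one of only `levels` values; the point of the paper is that choosing the palette and the dot placement together, under one perceptual cost function, does better than this decoupled scheme.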
Color quantization with a pixel-wise defined distortion measure and the dithering process with its local neighborhood optimize different quality criteria or, frequently, follow a heuristic without reference to any quality measure. In this paper we propose a new method to simultaneously quantize and dither color images. The method is based on a rigorous cost-function approach which optimizes a quality criterion derived from a generic model of human perception. A highly efficient multiscale optimization algorithm is developed for the dithered color quantization cost function. The quality criterion and the optimization algorithms are evaluated on a representative set of artificial and real-world images as well as on a collection of icons. A significant improvement in image quality is observed compared to standard color reduction approaches.

Item: Progressive Iso-Surface Extraction from Hierarchical 3D Meshes
(Blackwell Publishers Ltd and the Eurographics Association, 1998) Grosso, Roberto; Ertl, Thomas
A multiresolution data decomposition offers a fundamental framework supporting compression, progressive transmission, and level-of-detail (LOD) control for large two- or three-dimensional data sets discretized on complex meshes. In this paper we extend a previously presented algorithm for 3D mesh reduction for volume data, based on multilevel finite element approximations, in two ways. First, we present efficient data structures which allow us to incrementally construct approximations of the volume data at lower or higher resolutions at interactive rates. An abstract description of the mesh hierarchy in terms of a coarse base mesh and a set of integer records offers a high compression potential, which is essential for efficient storage and progressive network transmission. Based on this mesh hierarchy we then develop a new progressive iso-surface extraction algorithm.
For a given iso-value, the corresponding iso-surface can be computed at different levels of resolution. Changing to a finer or coarser resolution updates the surface only in those regions where the volume data is being refined or coarsened. Our approach allows us to interactively visualize very large scalar fields, such as medical data sets, for which conventional algorithms would require at least an order of magnitude more resources.

Item: A Light Hierarchy for Fast Rendering of Scenes with Many Lights
(Blackwell Publishers Ltd and the Eurographics Association, 1998) Paquette, Eric; Poulin, Pierre; Drettakis, George
We introduce a new data structure, in the form of a light hierarchy, for efficiently ray-tracing scenes with many light sources. An octree is constructed over the point light sources in a scene, and each node represents all the light sources it contains by means of a virtual light source. We derive bounds on the error committed by this approximation when shading a point, for both diffuse and specular reflections. These bounds are then used to guide a hierarchical shading algorithm: if the current level of the light hierarchy provides shading of sufficient quality, the approximation is used, avoiding the cost of shading with all the light sources below this level; otherwise the descent into the light hierarchy continues. Our approach has been implemented for scenes without occlusion. The results show significant acceleration compared to standard ray tracing (up to 90 times faster) and a marked improvement over Ward's adaptive shadow testing.

Item: Multiresolution Isosurface Extraction with Adaptive Skeleton Climbing
(Blackwell Publishers Ltd and the Eurographics Association, 1998) Poston, Tim; Wong, Tien-Tsin; Heng, Pheng-Ann
An isosurface extraction algorithm which can directly generate multiresolution isosurfaces from volume data is introduced.
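The descend-or-approximate shading loop described in "A Light Hierarchy for Fast Rendering of Scenes with Many Lights" above can be sketched minimally. This is an illustration under stated simplifications, not the paper's algorithm: a binary median split stands in for the octree, unoccluded inverse-square falloff stands in for full shading, and a crude radius-to-distance ratio stands in for the paper's analytic error bounds.

```python
import math

class LightNode:
    """A node in a light hierarchy: either a single point light (leaf)
    or a cluster represented by one virtual light placed at the
    intensity-weighted centroid of its lights."""
    def __init__(self, lights):                  # lights: [((x,y,z), intensity)]
        self.intensity = sum(i for _, i in lights)
        self.pos = tuple(sum(p[a] * i for p, i in lights) / self.intensity
                         for a in range(3))
        self.radius = max(math.dist(p, self.pos) for p, _ in lights)
        self.children = []
        if len(lights) > 1:
            mid = len(lights) // 2               # simple median split, not an octree
            self.children = [LightNode(lights[:mid]), LightNode(lights[mid:])]

def irradiance(node, point, ratio=0.05):
    """Descend the hierarchy; use the virtual light when the cluster is
    small relative to its distance from the shaded point."""
    d = math.dist(node.pos, point)
    if not node.children or node.radius < ratio * d:
        return node.intensity / (d * d)          # unoccluded point-light falloff
    return sum(irradiance(c, point, ratio) for c in node.children)
```

With a tight `ratio` the result approaches the exact sum over all lights; loosening it trades accuracy for fewer shading evaluations, which is the trade-off the paper's error bounds make precise.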
It generates low-resolution isosurfaces, with 4 to 25 times fewer triangles than those generated by the marching cubes algorithm, in comparable running times. By climbing from vertices (0-skeleton) to edges (1-skeleton) to faces (2-skeleton), the algorithm constructs boxes which adapt to the geometry of the true isosurface. Unlike previous adaptive marching cubes algorithms, it does not suffer from the gap-filling problem. Although the triangles in its meshes may not be optimally reduced, the algorithm is much faster than postprocessing triangle reduction algorithms. Hence the coarse meshes it produces can be used as starting points for mesh optimization, if mesh optimality is the main concern.

Item: Fast Feature-Based Metamorphosis and Operator Design
(Blackwell Publishers Ltd and the Eurographics Association, 1998) Lee, Tong-Yee; Lin, Young-Ching; Sun, Y.N.; Lin, Leeween
Metamorphosis is a powerful visual technique for producing interesting transitions between two images or volume data sets. Image or volume metamorphosis using simple features provides flexible and easy control of visual effects. The feature-based image warping proposed by Beier and Neely is a brute-force approach. In this paper we first propose optimization methods that reduce its warping time without noticeable loss of image quality. Second, we extend our methods to 3D volume data and propose several interesting warping operators allowing global and local metamorphosis of volume data.

Item: Maximum Intensity Projection Using Splatting in Sheared Object Space
(Blackwell Publishers Ltd and the Eurographics Association, 1998) Cai, Wenli; Sakas, Georgios
In this paper we present a new Maximum Intensity Projection (MIP) algorithm implemented using splatting in a shear-warp context. The algorithm renders a MIP image by first splatting each voxel onto two intermediate spaces called the "worksheet" and the "shear image". Then the maximum value is evaluated between the worksheet and the shear image.
Finally, the shear image is warped onto the screen to generate the result image. Different footprints implementing different quality modes are discussed. In addition, we introduce a line-encoded indexing speed-up method to obtain interactive speed. The algorithm allows for a quantitative, predictable trade-off between interactivity and image quality.

Item: Perception Based Color Image Difference
(Blackwell Publishers Ltd and the Eurographics Association, 1998) Neumann, Laszlo; Matkovic, Kresimir; Purgathofer, Werner
A good image metric is often needed in digital image synthesis. It can be used to check the convergence behavior of progressive methods, to compare images rendered using various rendering methods, and so on. Since images are rendered to be observed by humans, an image metric should correspond to human perception as well. We propose a new algorithm which operates in the original image space; there is no need for Fourier or wavelet transforms. Furthermore, the new metric is viewing-distance dependent. The method uses the contrast sensitivity function. The main idea is to place a number of rectangles of various sizes in the images and to compute the CIE LUV average color difference between corresponding rectangles. Errors are then weighted according to the rectangle size and the contrast sensitivity function.

Item: A Bernstein-Bezier Based Approach to Soft Tissue Simulation
(Blackwell Publishers Ltd and the Eurographics Association, 1998) Roth, S.H.; Gross, Markus H.; Turello, Silvio; Carls, Friedrich R.
This paper discusses a Finite Element approach for volumetric soft tissue modeling in the context of facial surgery simulation. We elaborate on the underlying physics and address some computational aspects of the finite element discretization. In contrast to existing approaches, speed is not our first concern; we strive for the highest possible accuracy of simulation.
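A toy version of the rectangle-based comparison in "Perception Based Color Image Difference" above: average differences are computed over randomly placed rectangles and combined with a size-dependent weight. Everything specific here is an assumption for illustration: scalar grayscale values stand in for CIE LUV colors, and the square-root-of-area weight is a hypothetical stand-in for the contrast sensitivity function.

```python
import random

def rect_image_difference(a, b, n_rects=200, seed=0):
    """Compare two equally sized grayscale images by averaging the
    absolute difference of per-rectangle means over randomly placed
    rectangles, weighted by a (hypothetical) size-dependent factor."""
    h, w = len(a), len(a[0])
    rng = random.Random(seed)
    total, weight_sum = 0.0, 0.0
    for _ in range(n_rects):
        rw, rh = rng.randint(1, w), rng.randint(1, h)
        x0, y0 = rng.randint(0, w - rw), rng.randint(0, h - rh)
        area = rw * rh
        mean_a = sum(a[y][x] for y in range(y0, y0 + rh)
                             for x in range(x0, x0 + rw)) / area
        mean_b = sum(b[y][x] for y in range(y0, y0 + rh)
                             for x in range(x0, x0 + rw)) / area
        wgt = area ** 0.5               # stand-in for CSF-based weighting
        total += wgt * abs(mean_a - mean_b)
        weight_sum += wgt
    return total / weight_sum
```

Identical images score zero, and a constant offset of c scores c, while zero-mean noise scores much lower because large rectangles average it away. That insensitivity to imperceptible high-frequency noise is the behavior motivating the rectangle averaging.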
We therefore propose an extension of linear elasticity towards incompressibility and nonlinear material behavior, in order to describe the complex properties of human soft tissue more accurately. Furthermore, we incorporate higher-order interpolation functions using a Bernstein-Bezier formulation, which has various advantageous properties, such as its integral polynomial form of arbitrary degree, efficient subdivision schemes, and suitability for geometric modeling and rendering. In addition, the use of tetrahedral Finite Elements does not place any restriction on the geometry of the simulated volumes. Experimental results obtained from a synthetic block of soft tissue and from the Visible Human Data Set illustrate the performance of the envisioned model.

Item: Frontiers in User-Computer Interaction
(Blackwell Publishers Ltd and the Eurographics Association, 1998) Van Dam, Andries
In this age of (near-)adequate computing power, the power and usability of the user interface is as key to an application's success as its functionality. Most of the code in modern desktop productivity applications resides in the user interface. But despite its centrality, the user interface field is currently in a rut: the WIMP GUI (Windows, Icons, Menus, Point-and-click, based on keyboard and mouse) has evolved little since it was pioneered by Xerox PARC in the early '70s. Computer and display form factors will change dramatically in the near future, and new kinds of interaction devices will soon become available. Desktop environments will be enriched not only with PDAs such as the Newton and Palm Pilot, but also with wearable computers and large-screen displays produced by new projection technology, including office-based immersive virtual reality environments. On the input side, we will finally have speech recognition and force-feedback devices.
Thus we can look forward to user interfaces that are dramatically more powerful and better matched to human sensory capabilities than those dependent solely on keyboard and mouse. 3D interaction widgets controlled by mice or other interaction devices with three or more degrees of freedom are a natural evolution from their two-dimensional WIMP counterparts, and they can decrease the cognitive distance between widget and task for many tasks that are intrinsically 3D, such as scientific visualization and MCAD. More radical post-WIMP UIs are needed for immersive virtual reality, where keyboard and mouse are absent. Immersive VR provides good driving applications for developing post-WIMP UIs based on multimodal interaction, involving more of our senses through the combined use of gesture, speech, and haptics.

Item: A New Approach for Direct Manipulation of Free-Form Curve
(Blackwell Publishers Ltd and the Eurographics Association, 1998) Zheng, J.M.; Chan, K.W.; Gibson, I.
There is an increasing demand for more intuitive methods of creating and modifying free-form curves and surfaces in CAD modeling systems. Such methods should be based not only on changes to the mathematical parameters, such as control points, knots, and weights, but also on the user's specified constraints and shapes. This paper presents a new approach for directly manipulating the shape of a free-form curve, leading to better control of the curve deformation and a more intuitive CAD modeling interface. The user's intended deformation of a curve is automatically converted into a modification of the corresponding NURBS control points and knot sequence. The algorithm for this approach includes curve elevation, knot refinement, control point repositioning, and knot removal. Several examples in this paper demonstrate that the proposed method can be used to deform a NURBS curve into a desired shape. Currently, the algorithm concentrates on purely geometric considerations.
Further work will include the effect of material properties.

Item: Mass-Spring Simulation using Adaptive Non-Active Points
(Blackwell Publishers Ltd and the Eurographics Association, 1998) Howlett, P.; Hewitt, W.T.
This paper introduces an adaptive component to a mass-spring system as used in the modelling of cloth for computer animation. The new method introduces non-active points into the model, which can adapt the shape of the cloth at inaccuracies. This improves on conventional uniform mass-spring systems by producing more visually pleasing results when simulating the drape of cloth over irregular objects. The computational cost of simulation is decreased by reducing the complexity of collision handling and enabling the use of coarser mass-spring networks.

Item: Anisotropic Solid Texture Synthesis Using Orthogonal 2D Views
(Blackwell Publishers Ltd and the Eurographics Association, 1998) Dischler, J.M.; Ghazanfarpour, D.; Freydier, R.
Analytical approaches to automatic solid (3D) texture synthesis, based on digitised 2D texture models, have recently been introduced to computer graphics. However, these approaches cannot provide satisfactory solutions in the common case of natural anisotropic textures (wood grain, for example). Indeed, solid texture synthesis requires particular care, and sometimes external knowledge, to "guess" the internal structure of solid textures, because only 2D texture models are used for analysis. By making some basic assumptions about the internal structure of solid textures, we propose a very efficient method, based on a hybrid (spectral and histogram) analysis, for the automatic synthesis of solid textures.
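The conventional uniform mass-spring model that "Mass-Spring Simulation using Adaptive Non-Active Points" above improves on can be sketched as one integration step over Hooke-law springs with damping. All constants and names here are illustrative assumptions, not taken from the paper; velocities are updated before positions (symplectic Euler) for stability.

```python
import math

def step(positions, velocities, springs, masses,
         dt=0.005, k=200.0, damping=2.0, gravity=-9.81, pinned=()):
    """One symplectic Euler step of a point-mass/spring cloth model.
    positions/velocities: lists of [x, y, z]; springs: (i, j, rest_length);
    indices in `pinned` are held fixed (e.g. where cloth is attached)."""
    n = len(positions)
    forces = [[0.0, masses[i] * gravity, 0.0] for i in range(n)]
    for i, j, rest in springs:
        dx = [positions[j][a] - positions[i][a] for a in range(3)]
        length = math.sqrt(sum(c * c for c in dx))
        if length == 0.0:
            continue                          # degenerate spring, skip
        f = k * (length - rest) / length      # Hooke force per unit offset
        for a in range(3):
            forces[i][a] += f * dx[a]
            forces[j][a] -= f * dx[a]
    for i in range(n):
        if i in pinned:
            continue
        for a in range(3):
            forces[i][a] -= damping * velocities[i][a]
            velocities[i][a] += dt * forces[i][a] / masses[i]
            positions[i][a] += dt * velocities[i][a]
```

A uniform cloth is just a grid of such points with structural springs between neighbors; the paper's contribution is to add non-active points adaptively where this uniform grid is too coarse to drape well.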
This new method allows us to obtain high-precision solid textures (closely resembling the initial models) in a large number of cases, including the difficult case of anisotropic textures.

Item: Getting Rid of Links in Hierarchical Radiosity
(Blackwell Publishers Ltd and the Eurographics Association, 1998) Stamminger, M.; Schirmacher, H.; Slusallek, Ph.; Seidel, H.-P.
Hierarchical radiosity with clustering has positioned itself as one of the most efficient algorithms for computing global illumination in non-trivial environments. However, using hierarchical radiosity for complex scenes is still problematic due to the necessity of storing a large number of transport coefficients between surfaces in the form of links. In this paper we eliminate the need to store links through the use of a modified shooting method for solving the radiosity equation. By distributing only unshot radiosity in each step of the iteration, the number of links decreases exponentially. Recomputing these links instead of storing them increases computation time but reduces memory consumption dramatically; caching may be used to reduce the time overhead. We analyze the error behavior of the new algorithm in comparison with the normal gathering approach for hierarchical radiosity. In particular, we consider the relation between the global error of a hierarchical radiosity solution and the local error threshold for each link.

Item: Importance Driven Texture Coordinate Optimization
(Blackwell Publishers Ltd and the Eurographics Association, 1998) Sloan, Peter-Pike J.; Weinstein, David M.; Brederson, J.
Traditionally, texture coordinates have been generated based solely on the model's geometry, often even before a model's textures have been created. With the arrival of new technologies, such as 3D paint programs, the weaknesses of a static optimization pre-process are becoming apparent.
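The shooting idea in "Getting Rid of Links in Hierarchical Radiosity" above, distributing only unshot radiosity so transport coefficients need not be stored, can be illustrated in a flat (non-hierarchical) form. This is a sketch under assumptions: `form_factors[j][i]` is taken to be the fraction of patch i's shot radiosity arriving at patch j, and in the paper these coefficients correspond to links that are recomputed on the fly rather than held in a table.

```python
def shooting_radiosity(form_factors, emission, reflectance, eps=1e-6):
    """Progressive shooting: repeatedly pick the patch with the most
    unshot radiosity and distribute it to all receivers. Only the
    unshot energy is propagated, so each transport coefficient can be
    recomputed per shot instead of being stored."""
    n = len(emission)
    radiosity = emission[:]
    unshot = emission[:]
    while max(unshot) > eps:
        i = max(range(n), key=lambda p: unshot[p])
        b = unshot[i]
        unshot[i] = 0.0
        for j in range(n):
            received = reflectance[j] * form_factors[j][i] * b
            radiosity[j] += received
            unshot[j] += received      # newly received energy is shot later
    return radiosity
```

Because each shot only moves energy that has not been distributed before, the unshot total shrinks geometrically when reflectances are below one, which is the exponential decay in the number of active links that the abstract refers to.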
These weaknesses arise from constructing a parameterization based solely on the model's geometry, ignoring the fact that detail is not uniformly spaced throughout the texture space. In fact, certain regions of the texture are more important than others. In this paper we introduce the notion of the "importance map" and describe how importance values are derived both from intrinsic properties of the texture and from user-guided highlights. Furthermore, we describe how importance maps are used to drive the texture coordinate optimization. Finally, we show how this optimization process can be integrated into a 3D painting environment, enabling periodic optimization at any stage of texture design.

Item: Animation of Biological Organ Growth Based on L-systems
(Blackwell Publishers Ltd and the Eurographics Association, 1998) Durikovic, Roman; Kaneda, Kazufumi; Yamashita, Hideo
In contrast with the growth of plants and trees, human organs can undergo significant changes in shape during the growth period through a variety of global transformations, such as bending or twisting. In our approach, the topology of a human organ is represented by a skeleton in the form of a tree or cycled graph. The growth in length of the skeleton can be simulated by an algebraic L-system that also produces discrete events. The paper shows how to include global transformations in the formalism of L-systems to obtain a continuous process. The shape of the organ is approximated by a number of ellipsoidal clusters centred at points on the skeleton. The proposed growth model continually responds to the positional changes of surrounding organs, changing the organ shape locally.
In our study, the stomach of a human embryo is used to demonstrate organ development; the methodology employed is also applicable to the animation of animal organs and their development.

Item: Importance Driven Halftoning
(Blackwell Publishers Ltd and the Eurographics Association, 1998) Streit, L.; Buchanan, J.
Most halftoning techniques have been primarily concerned with achieving an accurate reproduction of local gray-scale intensities while avoiding the introduction of artifacts. A secondary concern in halftoning has been the preservation of edges in the halftoned image. In this paper we introduce a new halftoning technique that uses a bandpass pyramid to achieve an accurate reproduction of important attributes in the image. Ink is distributed through the bandpass pyramid primarily according to a user-defined importance function. This technique has three main characteristics. First, it can produce results similar to many other halftoning techniques by allowing a generic importance function to be specified: if the chosen importance function is average intensity, we obtain results similar to traditional halftoning, and we also show how the importance function can be changed to highlight areas with high variance. Second, in addition to changing the importance function, the drawing primitives can also be changed; by using line segments instead of single pixels as drawing primitives, we illustrate how edge enhancement can be achieved. Third, the technique allows the user to easily limit the number of drawing primitives used, which is useful in limited-resource rendering. In addition to providing a tailorable halftoning technique, our method can easily be adapted to produce two-tone non-photorealistic (NPR) images.
We illustrate this by showing how sketched effects can be achieved by aligning the drawing primitives according to different image attributes.

Item: A Vector Approach for Global Illumination in Ray Tracing
(Blackwell Publishers Ltd and the Eurographics Association, 1998) Zaninetti, Jacques; Serpaggi, Xavier; Peroche, Bernard
This paper presents a method for taking global illumination into account in a ray tracing environment. A vector approach is introduced which can handle all types of light paths and the directional properties of materials. Three types of vectors are defined: Direct Light Vectors, associated with light sources; Indirect Light Vectors, which correspond to light that has been diffusely reflected at least once; and Caustic Light Vectors, associated with light rays emitted by sources and reflected and/or transmitted by specular surfaces. These vectors are estimated at a small number of points in the scene. A weighted interpolation between known values reconstructs the vectors at the remaining points, with the help of a gradient computation for the indirect component. The approach also takes uniform area light sources (spherical, rectangular, and circular) into account for all types of vectors. Computed images are thus more accurate, and no discretization of the scene geometry is needed.

Item: Author Index
(Blackwell Publishers Ltd and the Eurographics Association, 1998)

Item: Molecular Dynamics Simulation in Virtual Environments
(Blackwell Publishers Ltd and the Eurographics Association, 1998) Ai, Z.; Frohlich, T.
A virtual environment for interactive molecular dynamics simulation has been designed and implemented at the Fraunhofer Institute for Computer Graphics. Different kinds of virtual reality devices are used in the environment for immersive display of, and interaction with, the molecular system. A parallel computer is used to simulate the physical and chemical properties of the molecular system dynamically.
A high-speed network exchanges data between the simulation program and the modeling program. The molecular dynamics simulation virtual environment provides scientists with a powerful tool for immersively studying the world of molecules. The dynamic interaction between an AIDS antiviral drug and the reverse transcriptase enzyme is illustrated in the paper.
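As a baseline for what the splatting algorithm in "Maximum Intensity Projection Using Splatting in Sheared Object Space" above ultimately computes, the defining operation of MIP is simply a maximum taken along each viewing ray. A minimal axis-aligned sketch over a voxel grid (not the paper's shear-warp splatting, which handles arbitrary view directions and footprint filters):

```python
def mip_along_z(volume):
    """Maximum intensity projection of a volume[z][y][x] grid along
    the z axis: each output pixel keeps the largest voxel value on
    its axis-aligned viewing ray."""
    depth, h, w = len(volume), len(volume[0]), len(volume[0][0])
    return [[max(volume[z][y][x] for z in range(depth))
             for x in range(w)] for y in range(h)]
```

The shear-warp formulation evaluates the same per-ray maximum, but accumulates it voxel by voxel into sheared intermediate images (the worksheet and shear image of the abstract) before the final 2D warp to the screen.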