VMV18
Browsing VMV18 by Subject "Applied computing"
Now showing 1 - 3 of 3
Item: Automatic Generation of Saliency-based Areas of Interest for the Visualization and Analysis of Eye-tracking Data (The Eurographics Association, 2018)
Authors: Fuhl, Wolfgang; Kuebler, Thomas; Santini, Thiago; Kasneci, Enkelejda
Editors: Beck, Fabian; Dachsbacher, Carsten; Sadlo, Filip
Abstract: Areas of interest (AOIs) are a powerful basis for the analysis and visualization of eye-tracking data. They make it possible to relate eye-tracking metrics to semantic stimulus regions and to perform further statistics. In this work, we propose a novel method for the automated generation of AOIs based on saliency maps. In contrast to existing state-of-the-art methods, which generate AOIs from eye-tracking data, our method generates AOIs based solely on the saliency of the stimulus, thus mimicking natural vision. This way, our method is not only independent of the eye-tracking data but also enables AOI-based analysis even for complex stimuli, such as abstract art, where a proper manual definition of AOIs is not trivial. For evaluation, we cross-validate support vector machine classifiers on the task of separating visual scanpaths of art experts from those of novices. The motivation for this evaluation is to use AOIs as projection functions and to evaluate their robustness on different feature spaces. A good AOI separation should result in different feature sets that enable a fast evaluation with a largely automated workflow. The proposed method, together with the data shown in this paper, is available as part of the software EyeTrace: http://www.ti.uni-tuebingen.de/Eyetrace.1751.0.html

Item: Painterly Rendering using Limited Paint Color Palettes (The Eurographics Association, 2018)
Authors: Lindemeier, Thomas; Gülzow, J. Marvin; Deussen, Oliver
Editors: Beck, Fabian; Dachsbacher, Carsten; Sadlo, Filip
Abstract: We present a painterly rendering method for digital painting systems as well as visual-feedback-based painting machines. It automatically extracts color palettes from images and computes mixture recipes for them from a set of real base paint colors based on the Kubelka-Munk theory. In addition, we present a new algorithm for distributing stroke candidates, which creates paintings with sharp details and contrasts. Our system is able to predict the dry compositing of thinned or thick paint colors using an evaluation scheme based on example data collected in a calibration step and on optical blending. We show results generated using a software stroke-based renderer and a painting machine.

Item: WithTeeth: Denture Preview in Augmented Reality (The Eurographics Association, 2018)
Authors: Amirkhanov, Aleksandr; Amirkhanov, Artem; Bernhard, Matthias; Toth, Zsolt; Stiller, Sabine; Geier, Andreas; Gröller, Eduard; Mistelbauer, Gabriel
Editors: Beck, Fabian; Dachsbacher, Carsten; Sadlo, Filip
Abstract: Dentures are prosthetic devices replacing missing or damaged teeth, often used for dental reconstruction. Dental reconstruction improves the functional state and aesthetic appearance of teeth. State-of-the-art methods used by dental technicians typically do not include an aesthetic analysis, which often leads to unsatisfactory results for patients. In this paper, we present a virtual mirror approach for a dental treatment preview in augmented reality. Different denture presets are visually evaluated and compared by switching them on the fly. Our main goals are to provide a virtual dental treatment preview to facilitate early feedback, and hence to build the confidence and trust of patients in the outcome. The workflow of our algorithm is as follows. First, the face is detected and 2D facial landmarks are extracted. Then, the 3D pose of the upper and lower jaws is estimated, and high-quality 3D models of the upper and lower dentures are fitted. The fitting uses the occlusal plane angle as determined manually by dental technicians. To provide a realistic impression of the virtual teeth, the dentures are rendered with motion blur. We demonstrate the robustness and visual quality of our approach by comparing results obtained with a webcam and with a DSLR camera under natural as well as controlled lighting conditions.
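Note on the first item (saliency-based AOIs): the abstract describes deriving AOIs from a saliency map and then using them as projection functions for scanpath features. The following Python sketch illustrates that idea under simplifying assumptions; the precomputed saliency map, the threshold value, and the helper names are illustrative and are not taken from the paper or from EyeTrace.

```python
# Minimal sketch (not the authors' implementation): derive AOIs from a
# precomputed saliency map and project fixations onto them.
import numpy as np
from scipy import ndimage

def saliency_to_aois(saliency, threshold=0.6):
    """Threshold a saliency map and label connected regions as AOIs.

    saliency  : 2D float array (higher = more salient)
    threshold : relative saliency cut-off (hypothetical default)
    Returns a label image (0 = background, 1..n = AOI ids) and the AOI count.
    """
    mask = saliency >= threshold * saliency.max()
    labels, n_aois = ndimage.label(mask)
    return labels, n_aois

def project_fixations(fixations, labels):
    """Map fixation coordinates (x, y) to AOI ids (0 = outside all AOIs)."""
    h, w = labels.shape
    hits = []
    for x, y in fixations:
        xi, yi = int(round(x)), int(round(y))
        hits.append(int(labels[yi, xi]) if 0 <= xi < w and 0 <= yi < h else 0)
    return hits

# An AOI-hit histogram built from these projections could serve as a feature
# vector for an SVM separating expert from novice scanpaths, in the spirit of
# the paper's evaluation.
```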
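Note on the second item (painterly rendering): the abstract mentions computing mixture recipes from a set of real base paints via Kubelka-Munk theory. The sketch below shows only the standard single-constant Kubelka-Munk mixing relation; the paper's calibrated model for thinned or thick paints and dry compositing is not reproduced here, and the function names are illustrative.

```python
# Minimal sketch of single-constant Kubelka-Munk mixing: predict the
# reflectance of a paint mixture from measured base-paint reflectances.
import numpy as np

def k_over_s(reflectance):
    """K/S ratio per wavelength from measured reflectance (0 < R <= 1)."""
    r = np.clip(reflectance, 1e-4, 1.0)
    return (1.0 - r) ** 2 / (2.0 * r)

def mix_reflectance(base_reflectances, concentrations):
    """Predict the mixture reflectance spectrum.

    base_reflectances : (n_paints, n_wavelengths) array of spectra
    concentrations    : (n_paints,) mixing weights, summing to 1
    """
    # K/S of the mixture is the concentration-weighted sum of the bases.
    ks_mix = np.average(k_over_s(base_reflectances), axis=0,
                        weights=concentrations)
    # Invert K/S back to reflectance (infinite optical thickness).
    return 1.0 + ks_mix - np.sqrt(ks_mix ** 2 + 2.0 * ks_mix)
```

A mixture recipe can then be found, for instance, by optimizing the concentrations so that the predicted spectrum matches a target palette color.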
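Note on the third item (WithTeeth): the workflow combines 2D facial landmark detection with 3D pose estimation of the jaws. One common way to realize the pose-estimation step is a PnP solve from 2D-3D landmark correspondences, sketched below with OpenCV; the generic 3D reference points and the pinhole camera intrinsics are placeholder assumptions, not the paper's calibrated values.

```python
# Minimal sketch of a landmark-based head/jaw pose estimate using OpenCV's
# solvePnP. Model points and intrinsics are illustrative placeholders.
import numpy as np
import cv2

# Generic 3D reference points (mm) for six facial landmarks
# (nose tip, chin, eye corners, mouth corners) -- illustrative only.
MODEL_POINTS = np.array([
    [0.0,    0.0,    0.0],   # nose tip
    [0.0,  -63.6,  -12.5],   # chin
    [-43.3,  32.7,  -26.0],  # left eye outer corner
    [43.3,   32.7,  -26.0],  # right eye outer corner
    [-28.9, -28.9,  -24.1],  # left mouth corner
    [28.9,  -28.9,  -24.1],  # right mouth corner
], dtype=np.float64)

def estimate_pose(image_points, frame_size):
    """Estimate rotation/translation of the face from six 2D landmarks.

    image_points : (6, 2) pixel coordinates matching MODEL_POINTS
    frame_size   : (height, width) of the camera frame
    """
    h, w = frame_size
    focal = w  # crude pinhole approximation
    camera_matrix = np.array([[focal, 0.0, w / 2.0],
                              [0.0, focal, h / 2.0],
                              [0.0, 0.0, 1.0]], dtype=np.float64)
    dist_coeffs = np.zeros((4, 1))  # assume no lens distortion
    ok, rvec, tvec = cv2.solvePnP(MODEL_POINTS,
                                  np.asarray(image_points, dtype=np.float64),
                                  camera_matrix, dist_coeffs,
                                  flags=cv2.SOLVEPNP_ITERATIVE)
    # rvec/tvec give the pose under which 3D denture models could be placed.
    return ok, rvec, tvec
```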