Browsing by Author "Bosch, Carles"
Now showing 1 - 5 of 5
Item: An Annotation Tool for Digital Restoration of Wall Paintings (The Eurographics Association, 2022)
Authors: Barreiro Díaz, Albert; Munoz-Pandiella, Imanol; Bosch, Carles; Andujar, Carlos; Ponchio, Federico; Pintus, Ruggero
Abstract: Antique paintings are essential to study and understand our past. Paintings, and specifically mural paintings, are delicate artworks that are affected by multiple deterioration conditions. Weathering and human interventions cause different damage problems, and physical and chemical changes degrade their visual color appearance. As a consequence, art historians and archaeologists must invest a huge effort to rebuild their original appearance. The annotation of digital images of the paintings is a valuable tool in this process. In this paper we analyze the major requirements of art historians concerning the annotation of painting regions from the point of view of digital restoration. We also describe a tool prototype (based on TagLab) intended to facilitate the annotation and segmentation of mural paintings. The tool assists art historians in formulating multiple hypotheses on the original appearance by supporting multiple annotation layers for degradation and color, providing both hand-drawn and semi-automatic segmentation, and offering web-based dissemination and sharing of the annotations through the W3C Web Annotation Data Model.

Item: Controllable Image-Based Transfer of Flow Phenomena (© 2019 The Eurographics Association and John Wiley & Sons Ltd., 2019)
Authors: Bosch, Carles; Patow, Gustavo; Chen, Min and Benes, Bedrich
Abstract: Modelling flow phenomena and their related weathering effects is often cumbersome due to their dependence on the environment, materials and geometric properties of objects in the scene. Example-based modelling provides many advantages for reproducing real textures, but little effort has been devoted to reproducing and transferring complex phenomena.
In order to produce realistic flow effects, it is possible to take advantage of the widespread availability of flow images on the Internet, which can be used to gather key information about the flow. In this paper, we present a technique that allows the transfer of flow phenomena between photographs, adapting the flow to the target image and giving the user flexibility and control through specifically tailored parameters. This is done through two types of control curves: a fitted theoretical curve to control the mass of deposited material, and an extended colour map for properly adapting to the target appearance. In addition, our method filters and warps the input flow to account for the geometric details of the target surface. This leads to a fast approach for easily transferring phenomena between images, providing a set of simple and intuitive parameters to control the process.

Item: EUROGRAPHICS 2017: Short Papers Frontmatter (Eurographics Association, 2017)
Authors: Peytavie, Adrien; Bosch, Carles

Item: Intensity-Guided Exposure Correction for Indoor LiDAR Scans (The Eurographics Association, 2021)
Authors: Comino Trinidad, Marc; Andújar, Carlos; Bosch, Carles; Chica, Antonio; Munoz-Pandiella, Imanol; Ortega, Lidia M. and Chica, Antonio
Abstract: Terrestrial Laser Scanners, also known as LiDAR, are often equipped with color cameras so that both infrared and RGB values are measured for each point sample. High-end scanners also provide panoramic High Dynamic Range (HDR) images. Rendering such HDR colors on conventional displays requires a tone-mapping operator, and obtaining a suitable exposure everywhere in the image can be challenging for 360° indoor scenes with a variety of rooms and illumination sources. In this paper we present a simple-to-implement tone-mapping algorithm for HDR panoramas captured by LiDAR equipment. The key idea is to choose, on a per-pixel basis, an exposure correction factor based on the local intensity (infrared reflectivity).
Since LiDAR intensity values for indoor scenes are nearly independent of the external illumination, we show that intensity-guided exposure correction often outperforms state-of-the-art tone-mapping operators on this type of scene.

Item: Neural Colorization of Laser Scans (The Eurographics Association, 2021)
Authors: Comino Trinidad, Marc; Andujar, Carlos; Bosch, Carles; Chica, Antonio; Muñoz-Pandiella, Imanol; Ortega, Lidia M. and Chica, Antonio
Abstract: Laser scanners enable the digitization of 3D surfaces by generating a point cloud in which each point sample includes an intensity (infrared reflectivity) value. Some LiDAR scanners also incorporate cameras to capture the color of the surfaces visible from the scanner location. Obtaining usable colors everywhere across 360° scans is a challenging task, especially for indoor scenes. LiDAR scanners lack flashes, and placing proper light sources for a 360° indoor scene is either unfeasible or undesirable. As a result, color data from LiDAR scans often lack adequate quality, either because of poor exposure (areas that are too bright or too dark) or because of severe illumination changes between scans (e.g. direct sunlight vs. overcast lighting). In this paper, we present a new method to recover plausible color data from the infrared data available in LiDAR scans. The main idea is to train an adapted image-to-image translation network using color and intensity values on well-exposed areas of scans. At inference time, the network is able to recover plausible color using the intensity values exclusively. The immediate application of our approach is the selective colorization of LiDAR data in scans or regions with missing or poor color data.
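The per-pixel correction idea behind the intensity-guided exposure paper above can be illustrated with a minimal sketch. This is only a rough illustration of the general concept, not the authors' published operator: the function name, the luminance weights, and the ratio-based heuristic for deriving the per-pixel exposure are all assumptions made for the example.

```python
import numpy as np

def intensity_guided_tonemap(hdr_rgb, intensity,
                             base_exposure=1.0, strength=0.5, gamma=2.2):
    """Hypothetical per-pixel tone mapping guided by LiDAR intensity.

    hdr_rgb   : (H, W, 3) linear HDR color panorama
    intensity : (H, W) infrared reflectivity, normalized to [0, 1]
    """
    # Observed luminance of each HDR pixel (Rec. 709 weights).
    luminance = hdr_rgb @ np.array([0.2126, 0.7152, 0.0722])
    eps = 1e-6
    # Heuristic (assumed, not from the paper): intensity is nearly
    # illumination-independent, so where reflectivity is high but the
    # recorded luminance is low the pixel is under-exposed; boost its
    # exposure, and attenuate the opposite case.
    ratio = (intensity + eps) / (luminance + eps)
    exposure = base_exposure * ratio ** strength
    # Apply the per-pixel exposure, then gamma-encode for display.
    ldr = np.clip((hdr_rgb * exposure[..., None]) ** (1.0 / gamma), 0.0, 1.0)
    return ldr
```

A real operator would likely smooth the exposure map spatially to avoid halo artifacts at reflectivity edges; the sketch omits that step for brevity.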