Surface reconstruction and artistic rendering of small paleontological specimens

Sebastian Schäfer, Carsten Heep
Goethe-Universität Frankfurt

Acknowledgement: We thank Dr. Thomas Lehmann and Dr. Krister Smith from the Senckenberg Research Institute and Natural History Museum Frankfurt for their expert opinions and support.

Outline
1. Reconstruction
   • Record image stack
   • Create heightmap
   • Create mesh and total-focus texture
2. Rendering
   • Render G-buffers
   • Create outlines
   • Render styles: realistic, Gooch shading, hatching, stippling

Abstract
An important domain of paleontological research is the creation of hand-drawn artistic images of small fossil specimens. In this paper we discuss the process of writing an adequate tool to semi-automatically create the desired images and export a reconstructed 3D model. First we reconstruct the three-dimensional surface entirely on the GPU from a series of images. Then we render the virtual specimen with non-photorealistic rendering algorithms that are adapted to recreate the impression of manually drawn images. These algorithms are parameterized with respect to the requirements of a user from a paleontological background.

Reconstruction
The first step is to reconstruct the 3D surface of the fossil. We concentrated our study on small paleontological specimens that have a diameter of a few millimeters. From a specimen we create an image stack using a motor-focus microscope. A method called Shape From Focus [2] generates a heightmap by using a focus-energy measure to identify in-focus pixels. We implemented the original algorithm on the GPU and extended it by an energy threshold [3] to filter out low-energy pixels. The resulting holes are filled with a two-step algorithm on the GPU as described in [1]. Compared to a CPU-based implementation we achieve a maximum speed-up factor of 30.
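The heightmap step can be sketched as follows. This is a minimal CPU sketch in NumPy of the Shape From Focus idea with an energy threshold; the poster's implementation runs on the GPU, and the modified-Laplacian focus measure used here is an assumption, since the exact energy function is not specified in the text.

```python
import numpy as np

def shape_from_focus(stack, threshold=0.0):
    """Estimate a heightmap from an image stack (illustrative sketch).

    stack: array of shape (n, h, w) with grayscale slices ordered by focus depth.
    Returns (heightmap, mask): the slice index of sharpest focus per pixel, and
    a mask of pixels whose best focus energy fell below `threshold` -- these
    become the holes that a later fill step must close.
    """
    def focus_energy(img):
        # Modified Laplacian: sum of absolute second derivatives in x and y,
        # a common focus measure for Shape From Focus (assumed here).
        lap_x = np.abs(2 * img - np.roll(img, 1, axis=1) - np.roll(img, -1, axis=1))
        lap_y = np.abs(2 * img - np.roll(img, 1, axis=0) - np.roll(img, -1, axis=0))
        return lap_x + lap_y

    energy = np.stack([focus_energy(s.astype(np.float64)) for s in stack])
    heightmap = np.argmax(energy, axis=0)   # index of the sharpest slice per pixel
    best = np.max(energy, axis=0)
    mask = best < threshold                 # low-energy pixels are marked as holes
    return heightmap, mask
```

On the GPU each slice's energy would be computed in a fragment shader and reduced across the stack, which is where the reported speed-up over a CPU implementation comes from.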
Artistic Rendering
We render the final image using a deferred rendering system:
• Depth and normal map, useful for outlines
• Albedo map, based on the total-focus texture
• Illumination map, based on a modified Lambertian illumination

Scientific drawings are usually drawn from five perspectives: dorsal, lateral left, lateral right, anterior and posterior, which we realized as presets for the camera position. Light source and camera can be positioned around the object on a hemisphere using polar coordinates. Currently there are three artistic render styles implemented:
• Gooch shading to visualize surface & structure [5]
• Hatching for an artistic illustration [4]
• Stippling for a scientific illustration

The user can choose to add an outline, which is created from the depth and normal map. We implemented hatching and stippling as screen-space algorithms. Both rely on textures that are generated in a preprocessing step. While the hatching textures are manually drawn, the stippling texture is generated automatically: points are placed randomly on the texture plane until it is filled, and for every point the iteration at which it was placed is encoded in its RGB value. Depending on the value stored in the illumination map, the number of stippling points shown in an area can be varied, so that darker image parts receive more stippling points. The final image is filtered with an adaptive blur for more pleasing results.

References:
[1] HEEP & SCHÄFER, 2010: 3D-Oberflächen-Rekonstruktion und plastisches Rendern [...]
[2] NAYAR & NAKAGAWA, 1994: Shape from Focus
[3] SCHÄFER & NAGL, 2010: 3D Surface Reconstruction of Arbitrary (Image) Layer
[4] PRAUN, HOPPE, WEBB & FINKELSTEIN, 2001: Real-Time Hatching
[5] GOOCH, GOOCH, SHIRLEY & COHEN, 1998: A Non-Photorealistic Lighting Model for Automatic Technical Illustration

Figure: First erase noisy areas via threshold, then fill them in two steps (Step 1, Step 2).
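The stipple-texture preprocessing and its use at shading time can be sketched as follows. This is a minimal NumPy sketch under stated assumptions: the function names are illustrative, a single integer channel stands in for the poster's RGB encoding of the placement iteration, and the linear mapping from illumination to point budget is an assumption about how the illumination map drives point density.

```python
import numpy as np

def make_stipple_texture(size, n_points, seed=0):
    """Preprocess step: randomly place points on the texture plane and store,
    for each point, the iteration at which it was placed (0 = earliest).
    Unoccupied texels stay -1."""
    rng = np.random.default_rng(seed)
    tex = np.full((size, size), -1, dtype=np.int32)
    placed = 0
    while placed < n_points:
        y, x = rng.integers(0, size, 2)
        if tex[y, x] == -1:        # keep only the first placement per texel
            tex[y, x] = placed
            placed += 1
    return tex

def stipple_shade(illum, tex, max_points):
    """Screen-space stippling: darker areas reveal more points.

    illum: illumination map in [0, 1] (1 = fully lit), same shape as tex.
    A point is drawn if its stored iteration falls below the local budget
    (1 - illum) * max_points, so low illumination admits more points.
    """
    budget = (1.0 - illum) * max_points
    return (tex >= 0) & (tex < budget)
```

In the actual renderer this comparison would run per fragment against the precomputed stipple texture, with the adaptive blur applied afterwards.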
NPAR: Non-photorealistic Animation and Rendering - 2011 - Vancouver/Canada
© 2011 by Sebastian Schäfer and Carsten Heep

[Figure labels: light source ☼ and hemisphere h for camera/light positioning; image stack slices S_0, S_i, S_(n-1)]
Specimen: Leptictidium nasutum, tooth, left upper M²