Plenoptic Stitching: A Scalable Method for Reconstructing 3D Interactive Walkthroughs
Daniel G. Aliaga ([email protected]) Ingrid Carlbom ([email protected])
Presented by Matthew McCrory
Plenoptic Stitching: A Scalable Method for Reconstructing 3D Interactive Walkthroughs
Daniel G. Aliaga
To reconstruct a continuous function, the entire open space should be sampled densely, which just isn’t practical!
A solution: sample the observer plane using an irregular grid of omnidirectional image sequences, forming image loops in the observer plane
Capture
Relatively inexpensive camera system used, built from off-the-shelf components. Camera placed on motorized cart with battery, computer, frame grabber, and fast RAID disk
Camera uses a convex paraboloidal mirror with an orthographic projection
Camera Pose Estimation
Calibration scheme developed by Aliaga and Carlbom that uses beacons
Beacons are placed in 2 corners of a region
Before recording, the user initializes pose estimation by identifying the projections of the beacons in the first captured image
As the camera moves, the beacons are tracked, and the camera position and orientation are derived by triangulation
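The triangulation step might look like the following minimal sketch. This is my simplification of the paper's actual scheme: it assumes beacon tracking already yields world-frame bearing angles from each beacon toward the camera, and the function name is mine.

```python
import numpy as np

def triangulate_camera(b1, b2, phi1, phi2):
    """Intersect two bearing lines to locate the camera in the observer plane.

    b1, b2     : known 2D beacon positions
    phi1, phi2 : world-frame bearing angles from each beacon toward the camera
    """
    d1 = np.array([np.cos(phi1), np.sin(phi1)])
    d2 = np.array([np.cos(phi2), np.sin(phi2)])
    # Solve b1 + t1*d1 = b2 + t2*d2 for (t1, t2): a 2x2 linear system.
    A = np.column_stack([d1, -d2])
    t = np.linalg.solve(A, np.asarray(b2, float) - np.asarray(b1, float))
    return np.asarray(b1, float) + t[0] * d1
```

With beacons at (0, 0) and (10, 0) and a camera at (5, 5), the bearings are 45° and 135°, and the intersection recovers (5, 5).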
Reconstruction
Given a set of image loops, create novel planar views of environment from arbitrary viewpoints inside a loop
Combine pixels from omnidirectional images in the forward-looking view frustum with pixels from omnidirectional images in the rear-looking frustum
Reconstruction
New view created via column-by-column reconstruction of pixels from the omnidirectional images
If the viewing direction for a column intersects the centers of projection (COPs) of 2 complementary omnidirectional images, then that direction maps directly to radial lines in the images.
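The direct mapping relies on the fact that, in a catadioptric image, every world azimuth projects to a radial line through the image center. A sketch of sampling such a radial (all parameter names are my assumptions):

```python
import numpy as np

def radial_line_pixels(center, r_min, r_max, theta, n):
    """Sample n pixel coordinates along the radial line at azimuth theta
    in an omnidirectional (catadioptric) image.  Because each world
    azimuth projects to one such radial, a column of the reconstructed
    planar view maps directly onto it."""
    r = np.linspace(r_min, r_max, n)
    x = center[0] + r * np.cos(theta)
    y = center[1] + r * np.sin(theta)
    return np.stack([x, y], axis=1)
```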
Reconstruction
Corresponding segments of each radial line are warped to the column in the reconstructed image
Geometry of omnidirectional camera must be considered
Using similar triangles…
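The similar-triangles relation the slide alludes to can be sketched as follows (notation is mine, not the paper's). For a scene point at height $y$ above the observer plane, seen at planar distance $d_{\text{omni}}$ from the omnidirectional camera and $d_{\text{new}}$ from the new viewpoint, the elevation angles satisfy

$$\tan\alpha_{\text{omni}} = \frac{y}{d_{\text{omni}}}, \qquad \tan\alpha_{\text{new}} = \frac{y}{d_{\text{new}}} \;\Longrightarrow\; \tan\alpha_{\text{new}} = \frac{d_{\text{omni}}}{d_{\text{new}}}\,\tan\alpha_{\text{omni}},$$

where the radius along the radial line encodes $\alpha_{\text{omni}}$ through the paraboloidal-mirror projection, and $v = f\tan\alpha_{\text{new}}$ gives the destination row in the reconstructed column for focal length $f$.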
Reconstruction
Because pixels from the image behind the viewpoint are generally stretched during the warp, pixels are drawn using fixed-size splats
Vertical disocclusions are filled in using longer than expected splats
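Drawing one such splat into a column of the reconstructed image could be sketched as below (function name and clamping logic are mine):

```python
import numpy as np

def draw_splat(column, v, color, size):
    """Write a fixed-size vertical splat centered at row v into a
    reconstructed-image column.  Overlapping splats cover the gaps
    left by stretched source pixels; enlarging `size` fills vertical
    disocclusions."""
    lo = max(0, v - size // 2)
    hi = min(len(column), v + size // 2 + 1)
    column[lo:hi] = color
    return column
```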
Usually, the viewing direction will not pass exactly through the COP of an omnidirectional image
The radial is then generated by blending 2 parallel radials
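The blend might look like the following minimal sketch; the paper does not spell out the exact scheme here, so the resampling step and the linear weighting are my assumptions.

```python
import numpy as np

def blend_radials(radial_a, radial_b, w, n_out):
    """Blend two parallel radial lines into the radial needed for a
    viewing direction that falls between their COPs.  Both radials are
    first resampled to a common length n_out, then mixed linearly with
    weight w (w = 0 -> radial_a, w = 1 -> radial_b)."""
    xs = np.linspace(0.0, 1.0, n_out)
    a = np.interp(xs, np.linspace(0.0, 1.0, len(radial_a)), radial_a)
    b = np.interp(xs, np.linspace(0.0, 1.0, len(radial_b)), radial_b)
    return (1.0 - w) * a + w * b
```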
Reconstruction
Difficult to reliably identify corresponding features in the radial lines from omnidirectional images on opposite sides of a loop
Temporal coherence of the entire image loop is used to identify the required features
Start with an arbitrary omnidirectional image and track features all the way around the loop
Optimization
Under ideal conditions, corresponding radial lines differ only by a radial displacement and should be easy to recover
In practice, feature tracking and camera pose estimation introduce errors into the mapping