Page 1: Lecture 15

CSE486, Penn State, Robert Collins

Lecture 15: Robust Estimation: RANSAC

Page 2: Lecture 15

RECALL: Parameter Estimation: General Strategy
• Least-squares estimation from point correspondences

Let's say we have found point matches between two images, and we think they are related by some parametric transformation (e.g., translation; scaled Euclidean; affine). How do we estimate the parameters of that transformation?

But there are problems with that approach....

Page 3: Lecture 15

Problem: Outliers
Loosely speaking, outliers are points that don't "fit" the model.

[Figure: line fit to a set of points, with two points labeled "outlier".]

Page 4: Lecture 15

Bad Data => Outliers
Loosely speaking, outliers are points that don't "fit" the model. Points that do fit are called "inliers".

[Figure: the same data with the two points labeled "outlier" and the remaining points labeled "inliers".]

Page 5: Lecture 15

Problem with Outliers

Least squares estimation is sensitive to outliers, so that a few outliers can greatly skew the result.

[Figure: compare least-squares regression with and without outliers.]

Solution: estimation methods that are robust to outliers.
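To make that sensitivity concrete, here is a tiny numpy sketch (my own illustration, not from the slides): fit a line to ten perfectly collinear points, then corrupt a single point and refit.

```python
import numpy as np

# Ten points exactly on the line y = 2x + 1.
x = np.arange(10, dtype=float)
y = 2 * x + 1

# Least-squares line fit (degree-1 polynomial): recovers the true line.
print(np.polyfit(x, y, 1))      # -> approximately [2. 1.]

# Replace one point with a gross outlier and refit.
y_bad = y.copy()
y_bad[9] = -50.0
print(np.polyfit(x, y_bad, 1))  # slope and intercept both badly skewed
```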

Page 6: Lecture 15

Outliers aren’t the Only Problem

Multiple structures can also skew the results. (The fit procedure implicitly assumes there is only one instance of the model in the data.)

Page 7: Lecture 15

Robust Estimation

• View estimation as a two-stage process:
  – Classify data points as outliers or inliers
  – Fit model to inliers while ignoring outliers

• Example technique: RANSAC (RANdom SAmple Consensus)

M. A. Fischler and R. C. Bolles (June 1981). "Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography". Comm. of the ACM 24: 381-395.
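The next several slides walk through exactly this loop for line fitting: sample two points, fit the line they define, and count supporting points. Here is a minimal Python sketch of that loop (the function, its defaults, and the random seed are mine, not the lecture's):

```python
import numpy as np

def ransac_line(points, N=100, d=1.0):
    """RANSAC line fit on an (n, 2) array of points.
    N = number of samples, d = inlier distance threshold."""
    rng = np.random.default_rng(0)
    best_inliers = np.zeros(len(points), dtype=bool)
    for _ in range(N):
        # 1. Randomly sample a minimal set: s = 2 points define a line.
        p1, p2 = points[rng.choice(len(points), size=2, replace=False)]
        # 2. Implicit line a*x + b*y + c = 0 through p1 and p2.
        a, b = p1[1] - p2[1], p2[0] - p1[0]
        c = -(a * p1[0] + b * p1[1])
        norm = np.hypot(a, b)
        if norm == 0:
            continue  # degenerate sample: the two points coincide
        # 3. Consensus set: all points within distance d of the line.
        dist = np.abs(points @ np.array([a, b]) + c) / norm
        inliers = dist < d
        # 4. Keep the hypothesis with the largest count.
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return best_inliers
```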

Page 8: Lecture 15

Ransac Procedure

Page 9: Lecture 15

Ransac Procedure

Page 10: Lecture 15

Count = 4

Ransac Procedure

Page 11: Lecture 15

Ransac Procedure

Page 12: Lecture 15

Count = 6

Ransac Procedure

Page 13: Lecture 15

Ransac Procedure

Page 14: Lecture 15

Count = 19

Ransac Procedure

Page 15: Lecture 15

Ransac Procedure

Page 16: Lecture 15

Count = 13

Ransac Procedure

Page 17: Lecture 15

Count = 4; Count = 6; Count = 19; Count = 13

Ransac Procedure

Page 18: Lecture 15

(Forsyth & Ponce)

s

s

s

N

N

d

dd

T

T

Page 19: Lecture 15

How Many Samples to Choose?

Solve the following for N:

1 - (1 - (1 - e)^s)^N = p

e = probability that a point is an outlier
s = number of points in a sample
N = number of samples (we want to compute this)
p = desired probability that we get a good sample

Where in the world did that come from? ….

Page 20: Lecture 15

How Many Samples to Choose?

1 - (1 - (1 - e)^s)^N = p

e = probability that a point is an outlier
s = number of points in a sample
N = number of samples (we want to compute this)
p = desired probability that we get a good sample

(1 - e): probability that choosing one point yields an inlier

Page 21: Lecture 15

How Many Samples to Choose?

1 - (1 - (1 - e)^s)^N = p

e = probability that a point is an outlier
s = number of points in a sample
N = number of samples (we want to compute this)
p = desired probability that we get a good sample

(1 - e)^s: probability of choosing s inliers in a row (sample only contains inliers)

Page 22: Lecture 15

How Many Samples to Choose?

1 - (1 - (1 - e)^s)^N = p

e = probability that a point is an outlier
s = number of points in a sample
N = number of samples (we want to compute this)
p = desired probability that we get a good sample

1 - (1 - e)^s: probability that one or more points in the sample were outliers (sample is contaminated)

Page 23: Lecture 15

How Many Samples to Choose?

1 - (1 - (1 - e)^s)^N = p

e = probability that a point is an outlier
s = number of points in a sample
N = number of samples (we want to compute this)
p = desired probability that we get a good sample

(1 - (1 - e)^s)^N: probability that all N samples were contaminated

Page 24: Lecture 15

How Many Samples to Choose?

1 - (1 - (1 - e)^s)^N = p

e = probability that a point is an outlier
s = number of points in a sample
N = number of samples (we want to compute this)
p = desired probability that we get a good sample

1 - (1 - (1 - e)^s)^N: probability that at least one sample was not contaminated (at least one sample of s points is composed of only inliers)

Page 25: Lecture 15

How many samples?

Choose N so that, with probability p, at least one random sample is free from outliers; solving the equation above for N gives N = log(1 - p) / log(1 - (1 - e)^s). E.g., for p = 0.99:

proportion of outliers e

 s     5%   10%   20%   25%   30%   40%   50%
 2      2     3     5     6     7    11    17
 3      3     4     7     9    11    19    35
 4      3     5     9    13    17    34    72
 5      4     6    12    17    26    57   146
 6      4     7    16    24    37    97   293
 7      4     8    20    33    54   163   588
 8      5     9    26    44    78   272  1177
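A few lines of Python reproduce these entries from the formula above (the function and the ceiling are my sketch, not the slides'):

```python
import math

def num_samples(p, e, s):
    """Number of RANSAC samples N so that, with probability p, at least
    one sample of size s is outlier-free, given outlier proportion e."""
    return math.ceil(math.log(1 - p) / math.log(1 - (1 - e) ** s))

# Reproduce a few table entries for p = 0.99:
print(num_samples(0.99, 0.20, 2))   # 5
print(num_samples(0.99, 0.30, 4))   # 17
print(num_samples(0.99, 0.50, 8))   # 1177
```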

Page 26: Lecture 15

Example: N for the line-fitting problem

• n = 12 points

• Minimal sample size s = 2
• 2 outliers: e = 2/12 = 1/6 ≈ 17%, so round up to the 20% column
• So N = 5 gives us a 99% chance of getting a pure-inlier sample

– Compared to N = 66 by trying every pair of points

from Hartley & Zisserman

Page 27: Lecture 15

Acceptable consensus set?

• We have seen that we don't have to exhaustively sample subsets of points; we just need to randomly sample N subsets.

• However, typically, we don't even have to sample N sets!

• Early termination: terminate as soon as a sample's consensus set reaches the expected number of inliers:

T = (1 - e) * (total number of data points)
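As a tiny illustration (the helper and the example numbers are mine), the test amounts to:

```python
def acceptable_consensus(num_inliers, e, num_points):
    """Early termination: stop sampling once a hypothesis's consensus
    set reaches the expected number of inliers, T = (1 - e) * n."""
    return num_inliers >= (1 - e) * num_points

# E.g., with 22 points and e = 1/6, T is about 18.3, so a count of 19 stops.
print(acceptable_consensus(19, 1/6, 22))   # True
```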

Page 28: Lecture 15

RANSAC: Picking Distance Threshold d
• Usually chosen empirically
• But... when measurement error is known to be Gaussian with mean 0 and variance σ²:
  – The sum of squared errors follows a χ² distribution with m DOF, where m is the DOF of the error measure (the codimension)
  – (dimension + codimension) = dimension of parameter space

• E.g., m = 1 for line fitting, because the error is perpendicular distance
• E.g., m = 2 for point distance

• Examples for probability p = 0.95 that a point is an inlier:

m    Model                         d²
1    Line, fundamental matrix     3.84 σ²
2    Homography, camera matrix    5.99 σ²
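These d² entries are just the 95th percentile of the χ² distribution with m degrees of freedom, scaled by σ². A quick check with scipy (the σ value below is an arbitrary example, not from the slides):

```python
from scipy.stats import chi2

sigma = 1.5  # assumed measurement-noise standard deviation (example value)
for m in (1, 2):
    # 95th percentile of chi-square with m DOF, scaled by sigma^2.
    print(m, round(chi2.ppf(0.95, df=m), 2), chi2.ppf(0.95, df=m) * sigma**2)
# prints 3.84 and 5.99 for m = 1 and m = 2, matching the table above
```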

Page 29: Lecture 15

After RANSAC
• RANSAC divides the data into inliers and outliers, and yields an estimate computed from the minimal set of inliers with the greatest support

• Improve this initial estimate with least-squares estimation over all inliers (i.e., standard minimization)

• Find inliers w.r.t. that L.S. line, and compute L.S. one more time.

from Hartley & Zisserman

Page 30: Lecture 15

Practical Example

• Stabilizing aerial imagery using RANSAC
  – find corners in two images
  – hypothesize matches using NCC
  – do RANSAC to find matches consistent with an affine transformation
  – take the inlier set found and estimate a full projective transformation (homography)

Page 31: Lecture 15

Stabilization Application
Input: two images from an aerial video sequence.

Note that the motion of the camera is “disturbing”

Page 32: Lecture 15

Stabilization Application
Step 1: extract Harris corners from both frames. We use a small threshold for R because we want LOTS of corners (fodder for our next step, which is matching).

[Figures: Harris corners for the first frame; detailed view.]

Page 33: Lecture 15

Stabilization Application
Step 2: hypothesize matches. For each corner in image 1, look for a matching intensity patch in image2 using NCC. Make sure matching pairs have the highest NCC match scores in BOTH directions.

[Figure: corners in image1 and image2 joined by arrows "best match from 1 to 2" and "best match from 2 to 1"; pairs that agree in both directions are marked "mutually compatible!"]
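A sketch of that two-way consistency test on a precomputed score matrix (the `ncc` layout, with rows indexing image-1 corners and columns indexing image-2 corners, is my assumption):

```python
import numpy as np

def mutual_matches(ncc):
    """Keep pairs (i, j) where j is i's best match from image 1 to 2
    AND i is j's best match from image 2 to 1."""
    best_12 = ncc.argmax(axis=1)   # best image-2 corner for each image-1 corner
    best_21 = ncc.argmax(axis=0)   # best image-1 corner for each image-2 corner
    return [(i, int(j)) for i, j in enumerate(best_12) if best_21[j] == i]

# Toy 3x3 score matrix: pairs (0,1) and (2,2) agree in both directions;
# corner 1's best match is not reciprocated, so it is dropped.
scores = np.array([[0.2, 0.9, 0.1],
                   [0.3, 0.2, 0.4],
                   [0.1, 0.2, 0.7]])
print(mutual_matches(scores))      # [(0, 1), (2, 2)]
```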

Page 34: Lecture 15

Stabilization Application
Step 2: hypothesize matches.

[Figure: hypothesized matches between the two frames. yikes!]

As you can see, a lot of false matches get hypothesized. The job of RANSAC will be to clean this mess up.

Page 35: Lecture 15

Stabilization Application
Step 3: Use RANSAC to robustly fit the best affine transformation to the set of point matches.

[Figure: unit square in (x, y) mapped by the affine transform to a parallelogram in (x', y').]

xi' = a xi + b yi + c
yi' = d xi + e yi + f

How many unknowns? How many point matches are needed?

Page 36: Lecture 15

Stabilization Application
Step 3: Use RANSAC to robustly fit the best affine transformation to the set of point matches.

An affine transformation has 6 degrees of freedom. We therefore need 3 point matches [each gives 2 equations].

Randomly sample sets of 3 point matches. For each, compute the unique affine transformation they define. How?

Page 37: Lecture 15

Stabilization Application
How to compute the affine transformation from 3 point matches? Use least squares! (renewed life for a nonrobust approach)

Then transform all points from image1 to image2 using that computed transformation, and see how many other matches confirm the hypothesis.

Repeat N times.
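A sketch of that inner step: stack the two equations per match into a 6x6 linear system and solve it with least squares (a hypothetical helper, not the lecture's code):

```python
import numpy as np

def affine_from_matches(src, dst):
    """Solve for (a, b, c, d, e, f) from 3 point matches via least squares:
    x' = a x + b y + c,  y' = d x + e y + f."""
    A, b = [], []
    for (x, y), (xp, yp) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0]); b.append(xp)   # equation for x'
        A.append([0, 0, 0, x, y, 1]); b.append(yp)   # equation for y'
    params, *_ = np.linalg.lstsq(np.array(A, float), np.array(b, float),
                                 rcond=None)
    return params

# Three matches under the pure translation (x+2, y-1):
src = [(0, 0), (1, 0), (0, 1)]
dst = [(2, -1), (3, -1), (2, 0)]
print(np.round(affine_from_matches(src, dst), 3))  # ~[1, 0, 2, 0, 1, -1]
```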

Page 38: Lecture 15

Stabilization Application

green: inliers, red: outliers

[Figures: original point matches; labels from RANSAC.]

Page 39: Lecture 15

Stabilization Example
Step 4: Take the inlier set labeled by RANSAC, and now use least squares to estimate a projective transformation that aligns the images. (We will discuss this ad nauseam in a later lecture.)

[Figure: unit square in (x, y) mapped by the projective transformation to a quadrilateral in (x', y').]

Page 40: Lecture 15

Stabilization Example
Step 4: estimate the projective transformation that aligns the images.

Now it is easier for people (and computers) to see the moving objects.

Page 41: Lecture 15

Stabilization Examples

Page 42: Lecture 15

Stabilization Examples

green: inliers, red: outliers

[Figures: original point matches; labels from RANSAC.]

Page 43: Lecture 15

Stabilization Examples

Page 44: Lecture 15

Stabilization Examples

Page 45: Lecture 15

Stabilization Examples

green: inliers, red: outliers

[Figures: original point matches; labels from RANSAC.]

Page 46: Lecture 15

Stabilization Examples