Robust fitting / Camera calibration



Presentation Transcript

1. Robust fitting

2. Camera calibration
Given a set of correspondences between 3D world points and image points:
Set up an optimization problem: minimize the reprojection error between the observed image points and the projected 3D points.
Solve for the camera projection matrix.

3. Homography estimation
Given a set of correspondences between 3D world points on a plane and image points:
Set up an optimization problem: minimize the reprojection error of the correspondences under the mapping.
Solve for the homography H.

4. Fundamental matrix / essential matrix estimation
Given a set of correspondences between image points in two images:
Set up an optimization problem: minimize the residuals of the epipolar constraint over the correspondences.
Solve for the fundamental matrix F (or the essential matrix E).
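
All three problems follow the same recipe: each correspondence contributes linear constraints on the unknown matrix, and the stacked homogeneous system is solved by least squares (typically via SVD). As a hedged illustration only (the function name is mine and coordinate normalization is omitted), here is a minimal 4-or-more-point homography fit in Python:

```python
import numpy as np

def fit_homography(src, dst):
    """Fit a 3x3 homography H with dst ~ H @ src (homogeneous coordinates),
    given src, dst as (N, 2) arrays with N >= 4, via the direct linear transform."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence gives two rows of the homogeneous system A h = 0.
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # h is the right singular vector with the smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]   # fix the scale (assumes H[2, 2] != 0)
```

In practice the point coordinates are usually normalized first (Hartley normalization) for numerical stability; that step is left out to keep the sketch short.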

5. What happens when correspondences are wrong?
Not just noisy: noise (e.g., ~1 pixel) can be handled, because we are solving a minimization problem rather than exactly satisfying the equations.
Wrong correspondences can be way off.

6. Impact of incorrect correspondences
(Figure: result with correct correspondences vs. result with incorrect correspondences.)

7. Outliers
(Figure: data points labeled as inliers and outliers.)

8. Fitting in general
Fitting: find the parameters of a model that best fit the data.
Other examples: least-squares linear regression.

9. Least squares: linear regression
Fit the line y = mx + b to the data points (x_i, y_i).

10. Linear regression
Residual error: the vertical offset y_i - (m x_i + b) between each data point and the line; least squares minimizes the sum of squared residuals.
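
As a quick illustration of the closed-form least-squares fit (a hedged sketch with my own toy data, not from the slides):

```python
import numpy as np

# Toy data along y = 2x + 1 with a little Gaussian noise.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=x.size)

# Least squares: solve for [m, b] minimizing the sum of squared residuals.
A = np.column_stack([x, np.ones_like(x)])
(m, b), *_ = np.linalg.lstsq(A, y, rcond=None)
print(f"m = {m:.2f}, b = {b:.2f}")
```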

11. Robustness
Problem: fit a line to these data points.
(Figure: the least-squares fit, pulled away from most of the data by the outliers.)

12. How do we find the best line in the presence of outliers?
Unlike least squares, there is no simple closed-form solution.
Hypothesize-and-test: try out many lines, keep the best one.
Which lines? How do we measure which is best?

13. Choosing lines to hypothesize
Randomly choose lines? Not optimal: highly unlikely to run into the correct line by chance.
Idea: randomly sample data points from the dataset and fit a line to them.
How many data points should we sample? Any point we pick might be an outlier.
We want to maximize the chance that all sampled points are inliers, so that we get the true line.
Idea: sample the minimum number of points necessary to define a line.
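
To make the last idea concrete, a hedged back-of-the-envelope example (numbers mine, not from the slides): if half of the data points are outliers, a 2-point sample is all inliers with probability 0.5^2 = 0.25, whereas a 10-point sample is all inliers with probability 0.5^10 ≈ 0.001. Smaller samples are therefore far more likely to be outlier-free, which is why we sample only the minimum number of points the model needs.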

14. Choosing lines to hypothesize

15. Choosing lines to hypothesize

16. Choosing lines to hypothesize

17. Choosing lines to hypothesize

18. Measuring goodness of a line
Given a hypothesized line, count the number of points that “agree” with the line.
“Agree” = within a small distance of the line, i.e., the inliers to that line.
Among all the hypothesized lines, select the one with the largest number of inliers.

19. Counting inliers

20. Counting inliers
Inliers: 3

21. Counting inliers
Inliers: 20
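
A minimal sketch of the inlier test used in these counts (the threshold value and function name are my own assumptions):

```python
import numpy as np

def count_inliers(points, m, b, threshold=1.0):
    """Count the points (Nx2 array) within `threshold` of the line y = m*x + b.

    Uses the perpendicular point-to-line distance |m*x - y + b| / sqrt(m^2 + 1).
    """
    x, y = points[:, 0], points[:, 1]
    dist = np.abs(m * x - y + b) / np.sqrt(m**2 + 1)
    inliers = dist < threshold
    return int(inliers.sum()), inliers
```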

22. RANSAC (Random Sample Consensus)
Line fitting example. Algorithm:
1. Sample (randomly) the number of points required to fit the model (here, 2).
2. Solve for the model parameters using the samples.
3. Score by the fraction of inliers within a preset threshold of the model.
Repeat 1-3 until the best model is found with high confidence.
(Illustration by Savarese.)

23. RANSAC
Line fitting example. (Algorithm steps repeated from slide 22.)

24. RANSAC
Line fitting example. (Algorithm steps repeated from slide 22.)

25. RANSAC
(Algorithm steps repeated from slide 22.)
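
Putting the sampling, scoring, and inlier counting together, a hedged sketch of the whole line-fitting loop (the hyperparameter defaults, names, and the final least-squares refit over the winning inliers are my own choices; degenerate vertical samples are simply skipped):

```python
import numpy as np

def ransac_line(points, n_iters=100, threshold=1.0, rng=None):
    """Fit a line y = m*x + b to `points` (Nx2) with RANSAC: repeatedly sample
    2 points, fit the line through them, and keep the hypothesis with the most
    inliers; finally refit by least squares on the winning inlier set."""
    rng = np.random.default_rng() if rng is None else rng
    best_inliers = np.zeros(len(points), dtype=bool)
    for _ in range(n_iters):
        i, j = rng.choice(len(points), size=2, replace=False)
        (x1, y1), (x2, y2) = points[i], points[j]
        if np.isclose(x1, x2):          # skip degenerate (vertical) samples
            continue
        m = (y2 - y1) / (x2 - x1)
        b = y1 - m * x1
        dist = np.abs(m * points[:, 0] - points[:, 1] + b) / np.sqrt(m**2 + 1)
        inliers = dist < threshold
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Final step: least-squares refit on the inliers of the best hypothesis.
    x, y = points[best_inliers, 0], points[best_inliers, 1]
    A = np.column_stack([x, np.ones_like(x)])
    (m, b), *_ = np.linalg.lstsq(A, y, rcond=None)
    return m, b, best_inliers
```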

26. RANSAC
Idea: all the inliers will agree with each other on the translation vector; the (hopefully small) number of outliers will (hopefully) disagree with each other.
RANSAC only has guarantees if there are < 50% outliers.
“All good marriages are alike; every bad marriage is bad in its own way.” – Tolstoy via Alyosha Efros

27. Translations

28. RAndom SAmple Consensus
Select one match at random, count inliers.

29. RAndom SAmple Consensus
Select another match at random, count inliers.

30. RAndom SAmple Consensus
Output the translation with the highest number of inliers.

31. Final step: least-squares fit
Find the average translation vector over all inliers.
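
A hedged sketch of slides 28-31 for the translation case (the function name, threshold, and iteration count are my assumptions): each match votes for a translation, one random vote is taken as the hypothesis, and the winner is refined by averaging the translations of its inliers.

```python
import numpy as np

def ransac_translation(pts1, pts2, n_iters=100, threshold=2.0, rng=None):
    """Estimate a 2D translation from matched points pts1 -> pts2 (both Nx2).

    Each correspondence alone defines a translation hypothesis (k = 1), so each
    round samples one match, counts the matches that agree with it, and the
    best hypothesis is refined by averaging the translations of its inliers.
    """
    rng = np.random.default_rng() if rng is None else rng
    diffs = pts2 - pts1                       # per-match translation votes
    best_inliers = np.zeros(len(diffs), dtype=bool)
    for _ in range(n_iters):
        t = diffs[rng.integers(len(diffs))]   # hypothesis from one random match
        inliers = np.linalg.norm(diffs - t, axis=1) < threshold
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Final step (slide 31): least-squares fit = average translation over inliers.
    return diffs[best_inliers].mean(axis=0), best_inliers
```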

32. RANSAC - hyperparameters
Inlier threshold: related to the amount of noise we expect in inliers. Often model the noise as Gaussian with some standard deviation (e.g., 3 pixels).
Number of rounds: related to the percentage of outliers we expect and the probability of success we'd like to guarantee.
Suppose there are 20% outliers and we want to find the correct answer with 99% probability. How many rounds do we need?

33. How many rounds?
If we have to choose k samples each time, with an inlier ratio p, and we want the right answer with probability P = 0.99, the number of rounds needed is:

k \ p    95%   90%   80%   75%   70%   60%   50%
  2       2     3     5     6     7    11    17
  3       3     4     7     9    11    19    35
  4       3     5     9    13    17    34    72
  5       4     6    12    17    26    57   146
  6       4     7    16    24    37    97   293
  7       4     8    20    33    54   163   588
  8       5     9    26    44    78   272  1177

(P = 0.99; source: M. Pollefeys)

34. (The iteration table from the previous slide, repeated.)
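
Where these numbers come from (standard RANSAC reasoning, stated here as a hedged gloss on the slide): a single k-point sample is all inliers with probability p^k, so the chance that all S samples contain at least one outlier is (1 - p^k)^S. Requiring success with probability P gives 1 - (1 - p^k)^S >= P, i.e., S >= log(1 - P) / log(1 - p^k). For example, with P = 0.99, p = 50%, and k = 2: S >= log(0.01) / log(0.75) ≈ 16.01, so 17 rounds, matching the table.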

35. How big is k?
For alignment, it depends on the motion model.
Here, each sample is a correspondence (a pair of matching points).

36. RANSAC pros and cons
Pros: simple and general; applicable to many different problems; often works well in practice.
Cons: parameters to tune; sometimes too many iterations are required; can fail for extremely low inlier ratios.

37. RANSAC
An example of a “voting”-based fitting scheme: each hypothesis gets voted on by each data point, and the best hypothesis wins.
There are many other types of voting schemes, e.g., Hough transforms.

38. RANSAC - Setup
Given:
A dataset. Example 1, line fitting: a set of 2D points. Example 2, homography fitting: a set of point correspondences.
A set of parameters that need to be fitted. Line fitting: the slope and intercept (m, b). Homography estimation: the matrix H.
A cost function C. Line fitting: the squared residual error. Homography estimation: the reprojection error.
A minimum number of data points needed, k. Line fitting: 2. Homography estimation: 4.

39. RANSAC - Setup
Given: a dataset, a set of parameters to be fitted, a cost function C, and the minimum number of points k.
Problem: outliers.

40. RANSAC - Algorithm
Given: a dataset, a cost function C, and the minimum sample size k.
For i = 1, ..., S:
  Sample k points.
  Minimize C over these k points to get candidate parameters.
  Compute the set of inliers for these parameters.
  If this inlier set is larger than the best inlier set so far, keep it.
Finally, minimize C over the best inlier set to obtain the final parameters.
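
A hedged, model-agnostic sketch of the algorithm on slide 40; the callback signatures (fit_fn, cost_fn) and variable names are my own framing of the slide's steps:

```python
import numpy as np

def ransac(data, fit_fn, cost_fn, k, n_iters, threshold, rng=None):
    """Generic RANSAC, following the structure of slide 40.

    fit_fn(subset)        -> model parameters fitted to `subset`
    cost_fn(data, params) -> per-datum cost (e.g., distance or reprojection error)
    """
    rng = np.random.default_rng() if rng is None else rng
    best_params, best_inliers = None, np.zeros(len(data), dtype=bool)
    for _ in range(n_iters):
        sample = data[rng.choice(len(data), size=k, replace=False)]
        params = fit_fn(sample)                    # minimize C over the k samples
        inliers = cost_fn(data, params) < threshold
        if inliers.sum() > best_inliers.sum():     # keep the largest inlier set
            best_params, best_inliers = params, inliers
    # Final refit: minimize C over all inliers of the best hypothesis.
    if best_inliers.any():
        best_params = fit_fn(data[best_inliers])
    return best_params, best_inliers
```

For line fitting, fit_fn would be a 2-point (or least-squares) line fit and cost_fn the point-to-line distance; for homography fitting, fit_fn could be the 4-point DLT sketched earlier and cost_fn the reprojection error.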

41. RANSAC: how many iterations do we need?
p = inlier fraction
k = minimum number of data points per sample
S = number of iterations
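
A small helper (a hedged sketch; the function name is mine) implementing the iteration count S = ceil(log(1 - P) / log(1 - p^k)) derived under the table above:

```python
import math

def ransac_rounds(p, k, P=0.99):
    """Smallest S such that 1 - (1 - p**k)**S >= P, i.e. enough rounds that at
    least one all-inlier sample is drawn with probability P."""
    return math.ceil(math.log(1 - P) / math.log(1 - p ** k))

# Sanity check against the table: 80% inliers, line fitting (k = 2) -> 5 rounds.
print(ransac_rounds(0.8, 2))
```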