Industry tracking
Common Algorithms (Machine Vision)
[The following algorithms were encountered in the course of studying this subject. These notes will be updated from time to time as learning progresses; corrections of any mistakes are welcome. Thank you.]
Catalog
1. GMM
2. AHP
3. HMM
4. KNN
5. K-Means
6. Mean-shift
7. Covariance matrix
8. SIFT
9. SFM
10. Regression problem
11. SVM
12. Haar classifier
13. RBM (Restricted Boltzmann Machine)
14. Sparse coding
15. Radial Basis Function (RBF) interpolation
16. Camera calibration
18. Least squares method
19. RANSAC algorithm
20. Bundle adjustment
21. Epipolar geometry
1. GMM: http://blog.pluskid.org/?p=39
2. AHP: https://jingyan.baidu.com/article/cbf0e500eb95582eaa28932b.html
3. HMM: http://www.cnblogs.com/skyme/p/4651331.html
4. KNN: http://www.cnblogs.com/ybjourney/p/4702562.html
5. K-Means: a data clustering algorithm. It works by repeatedly assigning each point to the nearest of K seed points (cluster centers) and recomputing each center as the mean of the points assigned to it.
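The two alternating steps (assign to nearest center, move each center to its cluster mean) can be sketched as follows; the 1-D data and K = 2 are made up for illustration:

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Minimal K-Means: assign each point to the nearest center,
    then move each center to the mean of its assigned points."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # Assignment step: index of the nearest center for each point.
        labels = [min(range(k), key=lambda j: (p - centers[j]) ** 2)
                  for p in points]
        # Update step: each center becomes the mean of its cluster.
        for j in range(k):
            cluster = [p for p, l in zip(points, labels) if l == j]
            if cluster:
                centers[j] = sum(cluster) / len(cluster)
    return sorted(centers)

data = [1.0, 1.2, 0.8, 9.0, 9.5, 8.8]
print(kmeans(data, 2))  # two centers, one per cluster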
6. Mean-shift: generally refers to an iterative procedure: first compute the shifted mean at the current point, move the point to that mean, then take it as the new starting point and continue moving until a stopping condition is met.
http://www.cnblogs.com/liqizhou/archive/2012/05/12/2497220.html
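The iteration above can be sketched in 1-D with a Gaussian kernel; the data, starting point, and bandwidth are made up for illustration:

```python
import math

def mean_shift_point(x, data, bandwidth=2.0, tol=1e-6, max_iter=200):
    """Move x to the kernel-weighted mean of the data, then repeat
    from the new location until the shift becomes tiny."""
    for _ in range(max_iter):
        weights = [math.exp(-((x - p) / bandwidth) ** 2 / 2) for p in data]
        new_x = sum(w * p for w, p in zip(weights, data)) / sum(weights)
        if abs(new_x - x) < tol:
            break
        x = new_x
    return x

data = [1.0, 1.2, 0.8, 9.0, 9.5, 8.8]
# Starting near the left cluster, the point converges to that cluster's mode.
print(mean_shift_point(0.0, data, bandwidth=1.0))
```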
7. Covariance matrix: covariance reflects the correlation between two random variables; the covariance matrix collects the pairwise covariances of all components of a random vector.
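A small numpy illustration with made-up data: `y` rises with `x` (positive covariance) and `z` falls with `x` (negative covariance):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.0, 4.0, 6.0, 8.0])   # perfectly increasing with x
z = np.array([8.0, 6.0, 4.0, 2.0])   # perfectly decreasing with x

# np.cov treats each row as one variable; result is the 3x3 covariance
# matrix with the unbiased (divide by n-1) normalization.
C = np.cov(np.vstack([x, y, z]))
print(C)
```

The diagonal holds each variable's variance; off-diagonal signs show the direction of the linear relationship.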
8. SIFT (Scale-Invariant Feature Transform): an algorithm for detecting local features.
- The SIFT algorithm was proposed by D. G. Lowe in 1999 and improved in 2004. Later, Y. Ke improved the descriptor by replacing the histogram with PCA (PCA-SIFT).
- The SIFT algorithm extracts local features: it finds extreme points in scale space and extracts their location, scale, and rotation invariants.
- SIFT features are local features of an image: they are invariant to rotation, scale change, and brightness change, and remain stable to a certain extent under viewpoint change, affine transformation, and noise.
9. SfM: Structure from Motion
10. Regression problem: given several independent variables, one dependent variable, and training samples representing the relationship between them, how do we determine that relationship?
Modeling: the goal is to find a function of the dependent variable in terms of the independent variables that accurately expresses the relationship between the dependent variable and the multiple independent variables.
http://blog.csdn.net/vshuang/article/details/5512853
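A minimal sketch of the modeling step for linear regression, solved by least squares; the training data here are made up to follow y = 2·x1 + 3·x2 + 1 exactly:

```python
import numpy as np

# Toy training samples: two independent variables, one dependent variable.
X = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0],
              [2.0, 1.0]])
y = 2 * X[:, 0] + 3 * X[:, 1] + 1

# Add an intercept column and solve the least-squares problem min ||A w - y||.
A = np.hstack([X, np.ones((len(X), 1))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(coef)  # recovers [2, 3, 1]: the function relating y to x1, x2
```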
11. SVM: the support vector machine is often regarded as one of the best off-the-shelf supervised learning algorithms.
12. Haar classifier:
(1) Boosting has become a general method for improving the accuracy of classifiers.
(2) A cascade of strong classifiers.
http://blog.csdn.net/zouxy09/article/details/7922923
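The real Haar classifier boosts weak classifiers built on Haar-like rectangle features and then cascades the resulting strong classifiers. The sketch below shows only the boosting idea, point (1): AdaBoost over simple threshold stumps on a made-up 1-D feature, not actual Haar feature extraction:

```python
import math

def train_stump(xs, ys, w):
    """Weak classifier: best threshold stump minimizing weighted error."""
    best = None
    for t in sorted(set(xs)):
        for pol in (1, -1):
            preds = [pol if x > t else -pol for x in xs]
            err = sum(wi for wi, p, y in zip(w, preds, ys) if p != y)
            if best is None or err < best[0]:
                best = (err, t, pol)
    return best

def adaboost(xs, ys, rounds=5):
    """AdaBoost: reweight samples so later stumps focus on mistakes."""
    n = len(xs)
    w = [1.0 / n] * n
    ensemble = []
    for _ in range(rounds):
        err, t, pol = train_stump(xs, ys, w)
        err = max(err, 1e-10)                      # avoid log(0)
        alpha = 0.5 * math.log((1 - err) / err)    # stump's vote weight
        ensemble.append((alpha, t, pol))
        for i in range(n):                         # boost misclassified samples
            pred = pol if xs[i] > t else -pol
            w[i] *= math.exp(-alpha * ys[i] * pred)
        s = sum(w)
        w = [wi / s for wi in w]
    return ensemble

def predict(ensemble, x):
    """Strong classifier: weighted vote of all weak stumps."""
    score = sum(a * (pol if x > t else -pol) for a, t, pol in ensemble)
    return 1 if score >= 0 else -1

xs = [0.5, 1.0, 1.5, 4.0, 4.5, 5.0]
ys = [-1, -1, -1, 1, 1, 1]
model = adaboost(xs, ys)
print([predict(model, x) for x in xs])
```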
13. RBM (Restricted Boltzmann Machine):
An RBM is typically used in one of two ways: to encode the data, which is then passed to a supervised learning method for classification or regression; or to obtain a weight matrix and biases for initializing (pre-training) a BP neural network.
http://blog.sciencenet.cn/blog-110554-876316.html
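The weight matrix and biases mentioned above are typically learned with contrastive divergence (CD-1). A minimal numpy sketch for a binary RBM, with made-up 4-bit patterns and 2 hidden units:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(v0, W, b, c, lr=0.1):
    """One CD-1 update. v0: batch of visible vectors; W: weights;
    b: visible bias; c: hidden bias. Returns reconstruction error."""
    ph0 = sigmoid(v0 @ W + c)                     # up: hidden probabilities
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    pv1 = sigmoid(h0 @ W.T + b)                   # down: reconstruct visibles
    ph1 = sigmoid(pv1 @ W + c)                    # up again
    # Gradient approximation: data statistics minus model statistics.
    W += lr * (v0.T @ ph0 - pv1.T @ ph1) / len(v0)
    b += lr * (v0 - pv1).mean(axis=0)
    c += lr * (ph0 - ph1).mean(axis=0)
    return ((v0 - pv1) ** 2).mean()

data = np.array([[1, 1, 0, 0], [0, 0, 1, 1]], dtype=float)
W = 0.1 * rng.standard_normal((4, 2))
b = np.zeros(4)
c = np.zeros(2)
errs = [cd1_step(data, W, b, c) for _ in range(500)]
print(errs[0], errs[-1])  # reconstruction error drops as the RBM learns
```

After training, `W`, `b`, and `c` are exactly the quantities one would hand to a BP network for initialization.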
15. Radial Basis Function (RBF) interpolation:
The RBF methods are a family of exact interpolation methods; that is, the surface must pass through every measured sample value. There are five basis functions:
Thin-plate spline
Spline with tension
Completely regularized spline
Multiquadric function
Inverse multiquadric function
Each basis function has a different shape and yields a different interpolated surface. The RBF methods are a special case of splines.
Conceptually, RBF interpolation is similar to fitting a rubber membrane through the measured sample values while minimizing the total curvature of the surface.
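The exact-interpolation property (the surface passes through every sample) can be sketched in 1-D. The Gaussian basis used here is an assumption for simplicity, not one of the five functions listed above, but the principle is the same:

```python
import numpy as np

def rbf_interpolate(xs, ys, x_new, eps=1.0):
    """Exact 1-D RBF interpolation with a Gaussian basis:
    solve for weights so the interpolant hits every sample exactly."""
    phi = lambda r: np.exp(-(eps * r) ** 2)
    A = phi(np.abs(xs[:, None] - xs[None, :]))   # interpolation matrix
    w = np.linalg.solve(A, ys)                   # weights with A @ w == ys
    return phi(np.abs(np.asarray(x_new)[:, None] - xs[None, :])) @ w

xs = np.array([0.0, 1.0, 2.0, 3.0])
ys = np.array([0.0, 1.0, 0.0, 1.0])
# Evaluated at the sample points, the interpolant reproduces the data.
print(rbf_interpolate(xs, ys, xs))
```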
16. Camera calibration:
Camera calibration is, simply put, the process of mapping from the world coordinate system to the image coordinate system, that is, the process of finding the final projection matrix.
http://www.cnblogs.com/Tigerwang1218/p/7251311.html
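The projection matrix in question is P = K [R | t], combining intrinsics and extrinsics. A sketch with a hypothetical camera (the focal length, principal point, and pose are made-up values):

```python
import numpy as np

# Hypothetical intrinsics K and extrinsics [R | t].
K = np.array([[800.0,   0.0, 320.0],   # fx,  0, cx
              [  0.0, 800.0, 240.0],   #  0, fy, cy
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                          # camera axes aligned with the world
t = np.array([[0.0], [0.0], [5.0]])    # world origin 5 units in front

P = K @ np.hstack([R, t])              # 3x4 projection matrix

def project(P, X_world):
    """World point (3,) -> pixel (u, v) via homogeneous coordinates."""
    x = P @ np.append(X_world, 1.0)
    return x[:2] / x[2]

# The world origin lands on the principal point (cx, cy).
print(project(P, np.array([0.0, 0.0, 0.0])))
```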
18. Least squares method:
The method of least squares is a mathematical optimization technique. It finds the best-fitting function for the data by minimizing the sum of squared errors, and unknown parameters can be estimated easily with it.
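Minimizing the sum of squared errors has a closed form for a line y = a·x + b; the sample points below are made up to lie exactly on y = 3x + 2:

```python
def least_squares_line(xs, ys):
    """Closed-form least squares fit of y = a*x + b,
    minimizing sum((a*x + b - y)^2) over all samples."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# Points lying exactly on y = 3x + 2 are recovered exactly.
print(least_squares_line([0, 1, 2, 3], [2, 5, 8, 11]))
```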
19. RANSAC algorithm:
RANSAC is an abbreviation of "RANdom SAmple Consensus". It iteratively estimates the parameters of a mathematical model from a set of observed data that contains outliers. It is a non-deterministic algorithm: it produces a reasonable result only with a certain probability, and to raise that probability the number of iterations must be increased.
The basic assumptions of RANSAC are:
(1) The data consist of "inliers", i.e., data whose distribution can be explained by some set of model parameters.
(2) "Outliers" are data that do not fit the model.
(3) Any remaining data are noise.
As an illustration of the RANSAC algorithm, consider a set of points of which some clearly lie on a straight line while the rest are pure noise. The aim is to find the equation of the line despite the large amount of noise, where the noisy points are three times as numerous as the points on the line.
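That line-fitting scenario can be sketched directly (the specific data and thresholds are made up for illustration): sample two points, fit a line, count inliers, and keep the model with the largest consensus set:

```python
import random

def ransac_line(points, iters=200, threshold=0.1, seed=0):
    """Fit y = a*x + b by RANSAC: repeatedly fit a line through two
    random points and keep the model with the most inliers."""
    rng = random.Random(seed)
    best_model, best_inliers = None, []
    for _ in range(iters):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        if x1 == x2:
            continue                      # vertical line; skip this sample
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        inliers = [(x, y) for x, y in points if abs(a * x + b - y) < threshold]
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = (a, b), inliers
    return best_model, best_inliers

# 10 points on y = 2x + 1 mixed with 20 pure-noise points.
data_rng = random.Random(1)
line_pts = [(x, 2 * x + 1) for x in range(10)]
noise_pts = [(data_rng.uniform(0, 10), data_rng.uniform(-20, 40))
             for _ in range(20)]
model, inliers = ransac_line(line_pts + noise_pts)
print(model, len(inliers))  # recovers the line despite 2:1 noise
```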
20. Bundle adjustment:
Bundle Adjustment (BA) is in essence an optimization model whose objective is to minimize the reprojection error.
http://blog.csdn.net/OptSolution/article/details/64442962
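The reprojection error being minimized is the distance between an observed pixel and the projection of the estimated 3-D point. A sketch of that cost term, with a hypothetical camera and point (BA itself would sum this over all cameras and points and optimize):

```python
import numpy as np

def reprojection_error(P, X_world, uv_observed):
    """Squared reprojection error for one camera/point pair:
    project the 3-D point with P and compare to the observed pixel."""
    x = P @ np.append(X_world, 1.0)
    uv = x[:2] / x[2]
    return float(np.sum((uv - uv_observed) ** 2))

K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
P = K @ np.hstack([np.eye(3), np.array([[0.0], [0.0], [4.0]])])

X = np.array([1.0, -1.0, 2.0])
x_h = P @ np.append(X, 1.0)
uv = x_h[:2] / x_h[2]                 # a perfect (noise-free) observation

print(reprojection_error(P, X, uv))                            # zero cost
print(reprojection_error(P, X + np.array([0.1, 0.0, 0.0]), uv))  # cost grows
```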
21. Epipolar geometry:
Epipolar geometry describes the intrinsic projective relationship between two views. It is independent of the external scene and depends only on the camera parameters and the relative pose between the two views.
Epipolar geometry is always defined with respect to a pair of images.
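The relationship between the two views is captured by the epipolar constraint x2ᵀ E x1 = 0, where E = [t]ₓ R is the essential matrix (in normalized image coordinates). A sketch with a hypothetical camera pair, translated along x:

```python
import numpy as np

def skew(t):
    """Cross-product matrix [t]_x, so that skew(t) @ v == np.cross(t, v)."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

# Relative pose of camera 2 with respect to camera 1 (made-up values).
R = np.eye(3)
t = np.array([1.0, 0.0, 0.0])
E = skew(t) @ R                       # essential matrix

# A 3-D point observed by both cameras (normalized coordinates, z = 1 plane).
X = np.array([0.5, 0.2, 3.0])
x1 = X / X[2]                         # projection in camera 1
X2 = R @ X + t                        # same point in camera-2 coordinates
x2 = X2 / X2[2]                       # projection in camera 2

# The epipolar constraint holds for any true correspondence.
print(x2 @ E @ x1)
```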