Eigen
3.4.99 (git rev 69adf26aa3e853418002562f623c42a9c7008271)

This page describes how to solve linear least squares systems using Eigen. An overdetermined system of equations, say Ax = b, has no solution in general. In that case, it makes sense to search for the vector x which is closest to being a solution, in the sense that the difference Ax - b is as small as possible. This x is called the least-squares solution (if the Euclidean norm is used).
The three methods discussed on this page are the SVD decomposition, the QR decomposition and the normal equations. Of these, the SVD decomposition is generally the most accurate but the slowest, the normal equations approach is the fastest but least accurate, and the QR decomposition is in between.
The solve() method in the BDCSVD class can be directly used to solve linear least squares systems. It is not enough to compute only the singular values (the default for this class); you also need the singular vectors, but the thin SVD decomposition suffices for computing least squares solutions:
Example:

#include <iostream>
#include <Eigen/Dense>
using namespace std;
using namespace Eigen;
int main()
{
  MatrixXf A = MatrixXf::Random(3, 2);
  cout << "Here is the matrix A:\n" << A << endl;
  VectorXf b = VectorXf::Random(3);
  cout << "Here is the right hand side b:\n" << b << endl;
  cout << "The least-squares solution is:\n"
       << A.bdcSvd(ComputeThinU | ComputeThinV).solve(b) << endl;
}
Output:
Here is the matrix A:
     1 0.0827
 0.737 0.0655
 0.511  0.562
Here is the right hand side b:
0.906
0.358
0.359
The least-squares solution is:
0.464
0.043
This example is taken from the page Linear algebra and decompositions.
The solve() method in QR decomposition classes also computes the least squares solution. There are three QR decomposition classes: HouseholderQR (no pivoting, so fast but unstable), ColPivHouseholderQR (column pivoting, thus a bit slower but more accurate) and FullPivHouseholderQR (full pivoting, so slowest and most stable). Here is an example with column pivoting:
Example:

MatrixXf A = MatrixXf::Random(3, 2);
VectorXf b = VectorXf::Random(3);
cout << "The solution using the QR decomposition is:\n"
     << A.colPivHouseholderQr().solve(b) << endl;
Output:
The solution using the QR decomposition is:
0.464
0.043
Finding the least squares solution of Ax = b is equivalent to solving the normal equations A^{T}Ax = A^{T}b. This leads to the following code:
Example:

MatrixXf A = MatrixXf::Random(3, 2);
VectorXf b = VectorXf::Random(3);
cout << "The solution using normal equations is:\n"
     << (A.transpose() * A).ldlt().solve(A.transpose() * b) << endl;
Output:
The solution using normal equations is:
0.464
0.043
If the matrix A is ill-conditioned, then this is not a good method, because the condition number of A^{T}A is the square of the condition number of A. This means that you lose roughly twice as many digits of accuracy using the normal equations as when using the other methods.