Calculating the eigenvalues of the matrix in the attached file Matrix.txt with the attached program in main.cpp results in what is saved in EigenvaluesFromEigen.txt. These can't be correct, because from theoretical arguments I know that the spectrum of the matrix needs to be symmetric around zero (the matrix originates from a physics problem). Moreover, calculating the eigenvalues using e.g. Octave yields the spectrum in CorrectEigenvalues.txt.

Created attachment 213 [details] Small program to calculate the eigenvalues of the matrix

Created attachment 214 [details] The files mentioned in the bug description. I added all the files to this archive because of Bugzilla's size constraint on attachments.

I forgot to mention: This was tested using Eigen 3.0.2

I can reproduce this. The eigenvalue computation does not converge, as you can check by calling solve.info(); see http://eigen.tuxfamily.org/dox/classEigen_1_1SelfAdjointEigenSolver.html#a0c06d5c2034ebb329c54235369643ad2 . Of course, the next question is, why? The matrix is a symmetric 2970-by-2970 sparse matrix where every row has approximately six 1s (the rest is 0). At the point where the eigensolver terminates, the subdiagonal entry is about 1e-16 and the diagonal entries above and beside it are similar. It looks like the iterations make no progress, except for the first two QR steps.

While the entries of the tridiagonal matrix at the end are all of the order 1e-16, the first subdiagonal entry is of order unity. So it probably has something to do with round-off error: either we do something unstable somewhere in our implementation, or our termination criterion is too strict. I'll see whether I can track down something in the literature.

I see that LAPACK's routine SSTEQR ( http://www.netlib.org/lapack/explore-html-old/ssteqr.f.html ) implements two variants, one in which the bulge is chased down (as in our implementation) and one in which it is chased upwards. Perhaps we also need to do this. On a quick reading, LAPACK Working Note 17, "Experiments with QR/QL Methods for the Symmetric Tridiagonal Eigenproblem" by A. Greenbaum and J. Dongarra, seems to support this.

Ahum, I was barking up the wrong tree ... I now think it's much simpler: it is a difficult matrix and we need to allow for more iterations. At the moment, we terminate if an eigenvalue does not converge in 30 iterations. However, we need 36 iterations for one of the eigenvalues (all the other eigenvalues converge in less than 10 iterations). I will change the termination criterion so that we allow 30*n iterations in total (where n is matrix size) instead of 30 iterations per eigenvalue. This is the same test as LAPACK uses. I'll also change the documentation to emphasize that users should test info(). Just give me a couple of days ...

(In reply to comment #7)
> I will change the termination criterion so that we allow 30*n iterations in
> total (where n is matrix size) instead of 30 iterations per eigenvalue. This
> is the same test as LAPACK uses.

Sounds good!

This fix has now been pushed (changesets 4fb0f2f5ca65 and f95322d996f2). Commit message: Allow for more iterations in SelfAdjointEigenSolver (bug 354). Add an assert to guard against using eigenvalues that have not converged. Add call to info() in tutorial example to cover non-convergence.

I think this fix is quite dangerous, since it added an eigen_assert() that relies on sufficient numeric convergence and a "magic number" of maximum iterations. As a consequence (and because eigen_assert defaults to checking the assertion even in release builds), this assert is likely to break (and has broken, in my case) existing code. If the library user is supposed to check info() to test convergence, the eigen_assert() seems superfluous anyway. I wouldn't want the application to abort before being able to check info(). Can the eigen_assert() be removed again?

(In reply to comment #10)
> I think this fix is quite dangerous, since it added an eigen_assert() that
> relies on sufficient numeric convergence and a "magic number" of maximum
> iterations.

Indeed, we shouldn't do that. Assertions should only be used to guard against bad memory accesses, or stuff that would otherwise likely crash us.

> As a consequence (and because eigen_assert defaults to checking the assertion
> even in release builds)

That, on the other hand, is not true: usually a 'release build' means one where NDEBUG is defined, in which case assertions (including eigen_assert) are omitted.

> , this assert is likely to break (and has broken, in my
> case) existing code.

A test case would be interesting, if that means you know a case where even the current version doesn't converge.

> If the library user is supposed to check info() to test
> convergence, the eigen_assert() seems superfluous anyway. I wouldn't want the
> application to abort before being able to check info().
> Can the eigen_assert() be removed again?

+1

I just had a look at Macros.h. Seems like NDEBUG is not automatically defined on one of my target platforms for a release build. Manually setting EIGEN_NO_DEBUG is my current workaround. If I have some time, I will try and get a test case for non-convergence.

(In reply to comment #11) > (In reply to comment #10) > > I think this fix is quite dangerous, since it added an eigen_assert() that > > relies on sufficient numeric convergence and a "magic number" of maximum > > iterations. > > Indeed, we shouldn't do that. assertions should only be used to guard against > bad memory accesses, or stuff that would otherwise likely crash us. I'm happy to remove the assert. Indeed, I can see that that is more consistent with the behaviour of solve() which happily returns rubbish if the matrix is singular. However, I wonder what the difference is with the asserts against m_isInitialized in say JacobiSVD.h ; if the memory is allocated but not initialized then reading it will not cause a crash. > >, this assert is likely to break (and has broken, in my > > case) existing code. I would say that the existing code was already broken if the assert is triggered; it is just a bit more obvious now. > > If the library user is supposed to check info() to test > > convergence, the eigen_assert() seems superfluous anyway. I wouldn't want the > > application to abort before being able to check info(). But you are able to check info() before the application aborts. See for instance the example at http://eigen.tuxfamily.org/dox-devel/TutorialLinearAlgebra.html#TutorialLinAlgEigensolving : SelfAdjointEigenSolver<Matrix2f> eigensolver(A); if (eigensolver.info() != Success) abort(); cout << "The eigenvalues of A are:\n" << eigensolver.eigenvalues() << endl; The assert is in the eigenvalues() function, but you can check info() immediately after computing the eigenvalue decomposition.

(In reply to comment #13)
> I'm happy to remove the assert. Indeed, I can see that that is more consistent
> with the behaviour of solve(), which happily returns rubbish if the matrix is
> singular. However, I wonder what the difference is with the asserts against
> m_isInitialized in, say, JacobiSVD.h; if the memory is allocated but not
> initialized, then reading it will not cause a crash.

Maybe I should have phrased the rule as: asserts are used to guard against clear user programming mistakes that may have severe consequences, like, depending on implementation details, crashes or memory corruption. Asserts are not to be used with conditions that depend on floating-point computations. A m_isInitialized assert qualifies, as it guards against plain programming mistakes; it's very deterministic. Additionally, I'd say that it's only an implementation detail whether memory is already allocated even in the uninitialized case.

> I would say that the existing code was already broken if the assert is
> triggered; it is just a bit more obvious now.

That the assert is triggered might mean either that the user's code is broken, or that our implementation is broken, or that the matrix is such that the eigenvalues are really hard to compute by this algorithm. Hard to tell.

> But you are able to check info() before the application aborts. [...] The
> assert is in the eigenvalues() function, but you can check info() immediately
> after computing the eigenvalue decomposition.

Good point; please still remove any assertion that depends on results of floating-point computations.

(In reply to comment #14)
> A m_isInitialized assert qualifies, as it guards against plain programming
> mistakes; it's very deterministic.

Yes, I can see that there is a difference between asserts triggered deterministically by errors in programming logic and asserts that may or may not be triggered depending on implementation details and the vagaries of floating-point arithmetic. I removed the asserts; see changesets 840187d7f9b8 (for 3.0) and 3b3e844f16be (for the dev branch).

-- GitLab Migration Automatic Message -- This bug has been migrated to gitlab.com's GitLab instance and has been closed from further activity. You can subscribe and participate further through the new issue at this link to our GitLab instance: https://gitlab.com/libeigen/eigen/issues/354.