It appears that the internal resize methods of SparseMatrix spuriously throw std::bad_alloc when the SparseMatrix's StorageIndex is a wider integer type than Eigen::Index. For example, consider the following snippet, which constructs a SparseMatrix with a 64-bit StorageIndex in a context where Eigen::Index is a 32-bit integer:

```
#define EIGEN_DEFAULT_DENSE_INDEX_TYPE int32_t

#include <cstdint>
#include <Eigen/Core>
#include <Eigen/Sparse>

int main() {
  Eigen::SparseMatrix<double, Eigen::ColMajor, int64_t> mat(5, 5);
  for (int i = 0; i < 5; i++) {
    for (int j = 0; j < 5; j++) {
      mat.coeffRef(i, j) = 1;
    }
  }
  return 0;
}
```

Compiling this with:

```
g++ main.cpp -o main -Ieigen-3.3.7 -std=c++11
```

(assuming the Eigen sources live in `./eigen-3.3.7`) and running `./main` aborts with:

```
terminate called after throwing an instance of 'std::bad_alloc'
  what():  std::bad_alloc
Aborted (core dumped)
```

The issue appears to stem from the following line in `SparseCore/CompressedStorage.h`:

```
Index realloc_size = (std::min<Index>)(NumTraits<StorageIndex>::highest(), size + Index(reserveSizeFactor*double(size)));
```

When `StorageIndex` is wider than `Index`, `NumTraits<StorageIndex>::highest()` is truncated to a negative value by the conversion to `Index` inside the `std::min<Index>` call, so `realloc_size` ends up as -1, which triggers the `internal::throw_std_bad_alloc()` on the next line.
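For what it's worth, the truncation can be reproduced without Eigen at all. The following standalone sketch mimics what I believe happens on that line; the type aliases and the reserve factor value are illustrative stand-ins, not Eigen's actual definitions:

```
#include <algorithm>
#include <cstdint>
#include <iostream>
#include <limits>

int main() {
  using Index = int32_t;         // plays the role of Eigen::Index here
  using StorageIndex = int64_t;  // wider StorageIndex, as in the repro above

  Index size = 5;
  double reserveSizeFactor = 0.5;  // illustrative value only

  // std::min<Index> forces the 64-bit highest() through a conversion to the
  // 32-bit Index; on typical platforms that truncation yields -1, so the
  // minimum itself is -1.
  Index realloc_size = (std::min<Index>)(
      std::numeric_limits<StorageIndex>::max(),
      size + Index(reserveSizeFactor * double(size)));

  std::cout << realloc_size << std::endl;  // prints -1 on common platforms
  return 0;
}
```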
This looks like an uncovered border case indeed. We could replace all `Index` with `size_t` or `ptrdiff_t` in the line you found, and additionally make sure (at compile time) that StorageIndex is not larger than size_t, of course.
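For illustration only, a standalone sketch of that idea (not a patch against `CompressedStorage.h`; the helper name and parameters are made up): do the clamping arithmetic in `size_t`, guarded by a compile-time check that StorageIndex fits in `size_t`.

```
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <iostream>
#include <limits>

// Hypothetical helper mirroring the reallocation-size computation, but with
// the clamp carried out in size_t so highest() of a wider StorageIndex can
// never be truncated into a negative value.
template <typename Index, typename StorageIndex>
std::size_t realloc_size(Index size, double reserveSizeFactor) {
  static_assert(sizeof(StorageIndex) <= sizeof(std::size_t),
                "StorageIndex must not be wider than size_t");
  const std::size_t cap =
      static_cast<std::size_t>(std::numeric_limits<StorageIndex>::max());
  const std::size_t wanted =
      static_cast<std::size_t>(size) +
      static_cast<std::size_t>(reserveSizeFactor * double(size));
  return (std::min)(cap, wanted);
}

int main() {
  // Same configuration as the report: 32-bit index, 64-bit storage index.
  // On a 64-bit platform this prints 7 instead of ending up at -1.
  std::cout << realloc_size<int32_t, int64_t>(5, 0.5) << std::endl;
  return 0;
}
```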
I'm actually leaning towards just disallowing sizeof(StorageIndex) > sizeof(DenseIndex). Using StorageIndex = int64_t is barely useful on 32-bit systems, and setting DenseIndex to int32_t is barely useful on 64-bit systems.
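Roughly what such a compile-time restriction could look like, sketched on a hypothetical stand-in class rather than on Eigen::SparseMatrix itself:

```
#include <cstddef>
#include <cstdint>

// Hypothetical stand-in for a sparse matrix class; only the static_assert
// expressing the proposed restriction is the point here.
template <typename Scalar, typename StorageIndex,
          typename DenseIndex = std::ptrdiff_t>
struct SparseMatrixLike {
  static_assert(sizeof(StorageIndex) <= sizeof(DenseIndex),
                "StorageIndex must not be wider than the dense index type");
  // ... storage and interface omitted ...
};

int main() {
  SparseMatrixLike<double, int32_t> ok;  // compiles
  // SparseMatrixLike<double, int64_t, int32_t> rejected;  // would fail the static_assert
  (void)ok;
  return 0;
}
```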
-- GitLab Migration Automatic Message -- This bug has been migrated to gitlab.com's GitLab instance and has been closed from further activity. You can subscribe and participate further through the new bug through this link to our GitLab instance: https://gitlab.com/libeigen/eigen/issues/1706.