This Bugzilla service is closed. All entries have been migrated to Eigen's GitLab instance.
Bug 1175 - SparseMatrix with non-default Index type fails when converting to Matrix (throws bad_alloc because it changes internal index from int to non-default index)
Alias: None
Product: Eigen
Classification: Unclassified
Component: Sparse
Version: 3.2
Hardware: All
OS: All
Importance: Normal Unknown
Assignee: Nobody
Depends on:
Reported: 2016-03-01 11:12 UTC by Dario Villar
Modified: 2019-12-04 15:30 UTC
CC: 3 users

Test case to reproduce (736 bytes, text/x-c++src)
2016-03-01 11:12 UTC, Dario Villar

Description Dario Villar 2016-03-01 11:12:58 UTC
Created attachment 662
Test case to reproduce

When converting a SparseMatrix with a non-default Index type to a dense Matrix, the dense matrix's internal Index type changes, and the conversion throws std::bad_alloc.

Attached test case to reproduce.
Comment 1 Gael Guennebaud 2016-03-01 14:04:30 UTC
This is already fixed in the devel branch, because rows()/cols() always return the Index type there.

For 3.2:
Summary:     Bug 1175: fix Index type conversion from sparse to dense.
Comment 2 Nobody 2019-12-04 15:30:07 UTC
-- GitLab Migration Automatic Message --

This bug has been migrated to Eigen's GitLab instance and has been closed from further activity.

You can subscribe to and participate in the new bug through the following link to our GitLab instance:
