This bugzilla service is closed. All entries have been migrated to GitLab.
Bug 668 - Extend PastixSupport to support MPI (optionally)
Summary: Extend PastixSupport to support MPI (optionally)
Status: NEW
Alias: None
Product: Eigen
Classification: Unclassified
Component: Sparse
Version: unspecified
Hardware: All
OS: All
Importance: Normal enhancement
Assignee: Nobody
Depends on:
Reported: 2013-10-02 22:57 UTC by Gael Guennebaud
Modified: 2019-12-04 12:39 UTC
CC: 1 user


Description Gael Guennebaud 2013-10-02 22:57:45 UTC
See this thread for some hints:

A proper patch is welcome.
Comment 1 Darcy 2017-12-31 15:33:59 UTC
I would also like this feature. I think it would be worth implementing the distributed matrix interface, especially in terms of parallel efficiency. However, this requires a loc2glob mapping as an additional function call to PaStiX, as well as some reordering of the matrix, and it will likely require some MPI calls inside the wrapper.

If this is something you think will be useful to have in Eigen then I would have some follow up questions about the design.
Comment 2 Nobody 2019-12-04 12:39:33 UTC
-- GitLab Migration Automatic Message --

This bug has been migrated to Eigen's GitLab instance and has been closed from further activity.

You can subscribe and participate further in the new bug through this link to our GitLab instance:

Note: You need to log in before you can comment on or make changes to this bug.