This bugzilla service is closed. All entries have been migrated to https://gitlab.com/libeigen/eigen

Bug 668

Summary: Extend PastixSupport to support MPI (optionally)
Product: Eigen          Reporter: Gael Guennebaud <gael.guennebaud>
Component: Sparse       Assignee: Nobody <eigen.nobody>
Status: NEW
Severity: enhancement   CC: darcy
Priority: Normal
Version: unspecified
Hardware: All
OS: All
Whiteboard:

Description Gael Guennebaud 2013-10-02 22:57:45 UTC
See this thread for some hints: http://forum.kde.org/viewtopic.php?f=74&t=117721&p=293609#p293609

A proper patch is welcome.
Comment 1 Darcy 2017-12-31 15:33:59 UTC
I would also like this feature. However, I think it would be worthwhile to implement the distributed matrix interface, especially for parallel efficiency. This requires passing a loc2glob mapping in an additional call to PaStiX, some reordering of the matrix, and will likely require some MPI calls inside the wrapper.

If this is something you think will be useful to have in Eigen then I would have some follow up questions about the design.
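To illustrate the kind of mapping mentioned above, here is a minimal, hypothetical sketch of how a wrapper might build a 1-based loc2glob array for a block-row distribution of a global n x n sparse matrix across MPI ranks. This is not the PaStiX API and not an Eigen feature; the sizes, distribution scheme, and variable names are assumptions made purely for illustration.

```cpp
// Hypothetical sketch (NOT the PaStiX API): build a 1-based loc2glob array
// for a block-row distribution of an n x n matrix over MPI ranks, the kind
// of mapping a distributed PastixSupport wrapper would need to provide.
#include <mpi.h>
#include <algorithm>
#include <vector>
#include <Eigen/Sparse>

int main(int argc, char** argv)
{
  MPI_Init(&argc, &argv);
  int rank = 0, size = 1;
  MPI_Comm_rank(MPI_COMM_WORLD, &rank);
  MPI_Comm_size(MPI_COMM_WORLD, &size);

  const int n      = 1000;                      // global problem size (example value)
  const int chunk  = (n + size - 1) / size;     // rows per rank (block distribution)
  const int first  = rank * chunk;              // first global row owned by this rank
  const int last   = std::min(n, first + chunk);
  const int nLocal = std::max(0, last - first); // number of locally owned rows

  // loc2glob[i] = global index (1-based, Fortran-style) of local row i.
  std::vector<int> loc2glob(nLocal);
  for (int i = 0; i < nLocal; ++i)
    loc2glob[i] = first + i + 1;

  // The local piece of the matrix could be assembled as an Eigen sparse
  // matrix holding only the owned rows; its compressed arrays plus loc2glob
  // would then be handed to the distributed solver entry point.
  Eigen::SparseMatrix<double, Eigen::RowMajor> localA(nLocal, n);
  // ... fill localA with the rows [first, last) of the global matrix ...

  MPI_Finalize();
  return 0;
}
```

How the reordering computed by the solver is communicated back to the Eigen side, and which MPI calls end up inside the wrapper, would be part of the design questions raised in this comment.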
Comment 2 Nobody 2019-12-04 12:39:33 UTC
-- GitLab Migration Automatic Message --

This bug has been migrated to gitlab.com's GitLab instance and has been closed from further activity.

You can subscribe and participate further in the new bug via this link to our GitLab instance: https://gitlab.com/libeigen/eigen/issues/668.