This bugzilla service is closed. All entries have been migrated to https://gitlab.com/libeigen/eigen
Bug 668 - Extend PastixSupport to support MPI (optionally)
Summary: Extend PastixSupport to support MPI (optionally)
Status: NEW
Alias: None
Product: Eigen
Classification: Unclassified
Component: Sparse
Version: unspecified
Hardware: All
OS: All
Importance: Normal enhancement
Assignee: Nobody
URL:
Whiteboard:
Keywords:
Depends on:
Blocks:
 
Reported: 2013-10-02 22:57 UTC by Gael Guennebaud
Modified: 2019-12-04 12:39 UTC
CC List: 1 user

Description Gael Guennebaud 2013-10-02 22:57:45 UTC
See this thread for some hints: http://forum.kde.org/viewtopic.php?f=74&t=117721&p=293609#p293609

A proper patch is welcome.
Comment 1 Darcy 2017-12-31 15:33:59 UTC
I would also like this feature. I think it would be worth implementing the distributed matrix interface, especially in terms of parallel efficiency. However, this requires a loc2glob mapping as an additional function call to PaStiX, as well as some reordering of the matrix, and it will likely require some MPI calls inside the wrapper.

If this is something you think would be useful to have in Eigen, then I would have some follow-up questions about the design.
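For context on the approach sketched above, here is a minimal illustration, assuming a 1D block-column distribution of the global matrix. It only shows how a loc2glob array and the local CSC arrays could be derived from an Eigen::SparseMatrix; the distributed PaStiX entry point itself is not reproduced here, and the partitioning scheme and variable names are made up for illustration.

    // Sketch: loc2glob for a contiguous block-column distribution.
    #include <mpi.h>
    #include <vector>
    #include <Eigen/Sparse>

    int main(int argc, char** argv)
    {
      MPI_Init(&argc, &argv);
      int rank = 0, size = 1;
      MPI_Comm_rank(MPI_COMM_WORLD, &rank);
      MPI_Comm_size(MPI_COMM_WORLD, &size);

      // Global problem, assembled redundantly on every rank for illustration only.
      const int n = 1000;
      Eigen::SparseMatrix<double> A(n, n);
      A.setIdentity();

      // Contiguous range of columns owned by this rank.
      const int firstCol = (n *  rank)      / size;
      const int lastCol  = (n * (rank + 1)) / size;  // exclusive
      const int nLocal   = lastCol - firstCol;

      // loc2glob: global index of each local column, 1-based
      // (PaStiX defaults to 1-based indexing).
      std::vector<int> loc2glob(nLocal);
      for (int j = 0; j < nLocal; ++j)
        loc2glob[j] = firstCol + j + 1;

      // Local CSC arrays for the owned columns; these, together with
      // loc2glob, are what a distributed solver call would consume.
      Eigen::SparseMatrix<double> Aloc = A.middleCols(firstCol, nLocal);
      Aloc.makeCompressed();
      // Aloc.outerIndexPtr(), Aloc.innerIndexPtr(), Aloc.valuePtr()

      MPI_Finalize();
      return 0;
    }

With a contiguous block distribution the mapping is trivial; a general distribution (for example, one produced by a graph partitioner) would carry an arbitrary permutation in loc2glob, and the wrapper would also need MPI communication to redistribute the right-hand side and the solution.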
Comment 2 Nobody 2019-12-04 12:39:33 UTC
-- GitLab Migration Automatic Message --

This bug has been migrated to gitlab.com's GitLab instance and has been closed from further activity.

You can subscribe and participate further in the new bug through this link to our GitLab instance: https://gitlab.com/libeigen/eigen/issues/668.
