- From: Loriano Storchi <redo AT thch.unipg.it>
- To: charm AT cs.uiuc.edu
- Subject: Re: [charm] AMPI and ScaLAPACK
- Date: Thu, 11 Feb 2010 09:07:16 +0100 (CET)
- List-archive: <http://lists.cs.uiuc.edu/pipermail/charm>
- List-id: CHARM parallel programming system <charm.cs.uiuc.edu>
Dear Aaron and dear Phil,
thanks a lot for the answer. This, I guess, explains why the problem shows up only with big matrices, and I guess there is no simple solution to it.
all the best
loriano
On Wed, 10 Feb 2010, Phil Miller wrote:
The following response was accidentally sent only to our internal list:
---------- Forwarded message ----------
From: Aaron Becker <abecker3 AT illinois.edu>
Date: Sat, Feb 6, 2010 at 16:01
Some BLACS functions, including dgebr2d and dgebs2d, which are used in
pzhegvx, create a new MPI datatype each time they are called and destroy
it at the end of the function. AMPI does not handle this pattern
correctly: it never reclaims the memory of destroyed datatypes, so every
call leaks a little more. I think that is the most likely cause of the
problem.
Aaron
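
For illustration only, the call pattern described above looks roughly like the
following MPI sketch. This is not the actual BLACS source; broadcast_submatrix
and all the sizes are hypothetical stand-ins. It shows a derived datatype being
created, committed, used, and freed on every call, which is harmless under an
MPI layer that reclaims freed datatypes but accumulates memory under one that
does not.

/* Sketch of the per-call create/commit/free pattern described above.
 * broadcast_submatrix is a made-up stand-in for what a BLACS broadcast
 * routine does internally; it is not the real implementation. */
#include <mpi.h>

/* Broadcast an m-by-n column-major submatrix with leading dimension lda. */
static void broadcast_submatrix(double *a, int m, int n, int lda,
                                int root, MPI_Comm comm)
{
    MPI_Datatype subm;
    /* A fresh column-strided datatype is built on every call. */
    MPI_Type_vector(n, m, lda, MPI_DOUBLE, &subm);
    MPI_Type_commit(&subm);
    MPI_Bcast(a, 1, subm, root, comm);
    /* Freed here; an MPI layer that never reclaims freed datatype
     * storage still grows with the number of calls. */
    MPI_Type_free(&subm);
}

int main(int argc, char **argv)
{
    static double a[100 * 100];   /* zero-initialized example matrix */
    MPI_Init(&argc, &argv);
    /* Many calls, as a ScaLAPACK solver would make on a large matrix. */
    for (int iter = 0; iter < 10000; ++iter)
        broadcast_submatrix(a, 100, 100, 100, 0, MPI_COMM_WORLD);
    MPI_Finalize();
    return 0;
}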
- [charm] AMPI and ScaLAPACK, Loriano Storchi, 02/06/2010
- [charm] AMPI and ScaLAPACK, Phil Miller, 02/10/2010
- Re: [charm] AMPI and ScaLAPACK, Loriano Storchi, 02/11/2010