Re: [charm] AMPI and ScaLAPACK

  • From: Loriano Storchi <redo AT thch.unipg.it>
  • To: charm AT cs.uiuc.edu
  • Subject: Re: [charm] AMPI and ScaLAPACK
  • Date: Thu, 11 Feb 2010 09:07:16 +0100 (CET)
  • List-archive: <http://lists.cs.uiuc.edu/pipermail/charm>
  • List-id: CHARM parallel programming system <charm.cs.uiuc.edu>


Dear Aaron and dear Phil,
thanks a lot for the answer. This, I guess, explains why the problem shows up only when using big matrices, and I suppose there is no simple solution to it.

all the best
loriano

On Wed, 10 Feb 2010, Phil Miller wrote:

The following response was accidentally sent to only our internal list:

---------- Forwarded message ----------
From: Aaron Becker
<abecker3 AT illinois.edu>
Date: Sat, Feb 6, 2010 at 16:01

Some BLACS functions, including dgebr2d and dgebs2d, which are used in
pzhegvx, create a new MPI datatype each time they're called, and
destroy it at the end of the function. AMPI does not handle this
situation correctly at all, and it never reclaims memory from
destroyed datatypes. I think that's the most likely cause of the
problem.

Aaron
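
[Editor's note: for illustration only, a minimal sketch of the pattern Aaron describes; it is not from the thread, and the routine name, matrix sizes, and iteration count are made up. A derived datatype is created, committed, used, and freed on every call, so an MPI layer that never reclaims storage in MPI_Type_free grows its memory use with each call.]

#include <mpi.h>

/* Hypothetical stand-in for what dgebr2d/dgebs2d-style BLACS routines do
 * internally: build a vector type for a column-major submatrix, broadcast
 * with it, then free it before returning. */
static void broadcast_submatrix(double *a, int rows, int cols, int lda,
                                int root, MPI_Comm comm)
{
    MPI_Datatype subm;
    MPI_Type_vector(cols, rows, lda, MPI_DOUBLE, &subm); /* new datatype on every call */
    MPI_Type_commit(&subm);
    MPI_Bcast(a, 1, subm, root, comm);
    MPI_Type_free(&subm); /* if the MPI layer never reclaims this, memory grows per call */
}

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);
    static double a[100 * 100]; /* zero-initialized block to broadcast */
    /* Many calls, as in a large eigensolve: any per-datatype leak adds up. */
    for (int i = 0; i < 10000; ++i)
        broadcast_submatrix(a, 100, 100, 100, 0, MPI_COMM_WORLD);
    MPI_Finalize();
    return 0;
}

With an MPI implementation that reclaims freed datatypes this loop runs in constant memory; the behaviour described above would make memory use grow with the number of calls.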
