Re: [charm] Is AMPI support MPI_Waitall?


  • From: Phil Miller <mille121 AT illinois.edu>
  • To: 张凯 <zhangk1985 AT gmail.com>
  • Cc: charm AT cs.uiuc.edu
  • Subject: Re: [charm] Is AMPI support MPI_Waitall?
  • Date: Thu, 28 Jan 2010 11:42:50 -0600
  • List-archive: <http://lists.cs.uiuc.edu/pipermail/charm>
  • List-id: CHARM parallel programming system <charm.cs.uiuc.edu>

After I modified the program to construct its own periodic domain
directly on MPI_COMM_WORLD, rather than through a Cartesian communicator
(MPI_Cart_create, MPI_Cart_shift), it ran successfully. The modified
code is attached, and the nonblocking exchange that hangs is sketched at
the end of this message, after the quoted output. I'll investigate why
MPI_Cart_create and MPI_Cart_shift are going awry.
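
For reference, here is a minimal sketch of that change, assuming a 1-D
periodic decomposition; this is a hypothetical reconstruction, and the
actual attached code may differ:

#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, size, left, right;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Original approach (the one reported to hang under AMPI here):
     *   int dims[1] = { size }, periods[1] = { 1 };
     *   MPI_Comm ring;
     *   MPI_Cart_create(MPI_COMM_WORLD, 1, dims, periods, 1, &ring);
     *   MPI_Cart_shift(ring, 0, 1, &left, &right);
     */

    /* Workaround: compute the same periodic neighbors by modular
     * arithmetic and communicate directly on MPI_COMM_WORLD. */
    left  = (rank + size - 1) % size;
    right = (rank + 1) % size;

    /* ... pipeline exchange with left/right as before ... */

    MPI_Finalize();
    return 0;
}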

Phil

On Thu, Jan 28, 2010 at 10:54, Phil Miller
<mille121 AT illinois.edu>
wrote:
> On Thu, Jan 28, 2010 at 07:50, 张凯
> <zhangk1985 AT gmail.com>
> wrote:
>> Hi:
>>
>> I am a beginner with AMPI and am trying to run an MPI program using
>> it, but I have found a small problem.
>>
>> Here
>> (http://www.mcs.anl.gov/research/projects/mpi/usingmpi/examples/advmsg/nbodypipe_c.htm)
>> you can find an example MPI program. I have successfully built and
>> run it using both MPICH and Intel MPI.
>>
>> However, when I run it with AMPI, the program blocks in the
>> MPI_Waitall call and never returns.
>>
>> I ran it with just the ++local +p2 +vp2 options. Did I miss other
>> options, or misconfigure AMPI?
>
> I'm seeing the same effect as you describe on a net-linux-x86_64 build
> of AMPI from the latest charm sources. We'll look into this and get
> back to you.
>
> For reference, the attached code (with added prints) produces the following:
>
> $ ./charmrun nbp +vp 4 20 +p4
> Charm++: scheduler running in netpoll mode.
> Charm++> cpu topology info is being gathered.
> Charm++> Running on 1 unique compute nodes (8-way SMP).
> Iteration 9
> Iteration 9:0 a
> Iteration 9
> Iteration 9:0 a
> Iteration 9
> Iteration 9:0 a
> Iteration 9
> Iteration 9:0 a
> Iteration 9:0 b
> Iteration 9:0 b
> Iteration 9:0 b
> Iteration 9:0 b
>
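
For anyone hitting the same hang: the communication core of the linked
nbodypipe example follows this nonblocking ring-exchange pattern. This
is a paraphrase (buffer sizes, tags, and names are illustrative, not
the exact code from the ANL page):

#include <mpi.h>

#define MAX_PARTICLES 1024

/* One pipeline step: post a nonblocking receive from the left neighbor
 * and a nonblocking send to the right, then block in MPI_Waitall,
 * which is the call the report above shows never returning under AMPI. */
void pipeline_step(double *sendbuf, double *recvbuf, int count,
                   int left, int right, MPI_Comm comm)
{
    MPI_Request reqs[2];
    MPI_Status  stats[2];

    MPI_Irecv(recvbuf, MAX_PARTICLES, MPI_DOUBLE, left, 0, comm, &reqs[0]);
    MPI_Isend(sendbuf, count, MPI_DOUBLE, right, 0, comm, &reqs[1]);

    /* ... compute with the locally held particles while messages move ... */

    MPI_Waitall(2, reqs, stats);
}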


