charm - Re: [charm] Is AMPI support MPI_Waitall?



  • From: Phil Miller <mille121 AT illinois.edu>
  • To: 张凯 <zhangk1985 AT gmail.com>
  • Cc: charm AT cs.uiuc.edu
  • Subject: Re: [charm] Is AMPI support MPI_Waitall?
  • Date: Fri, 29 Jan 2010 11:12:53 -0600
  • List-archive: <http://lists.cs.uiuc.edu/pipermail/charm>
  • List-id: CHARM parallel programming system <charm.cs.uiuc.edu>

The attached patch is generated from and applies cleanly to charm-6.1.3.

If you're familiar with the 'git' version control system, our
repository can be cloned from git://charm.cs.uiuc.edu/charm.git

If you don't mind my asking, what work are you doing with AMPI? We'd
be very interested to hear about your experiences.

Phil

On Fri, Jan 29, 2010 at 08:03, Phil Miller <mille121 AT illinois.edu> wrote:
> 2010/1/29 张凯 <zhangk1985 AT gmail.com>:
>> I am using charm-6.1.3.  Can that patch fix the problem on charm-6.1.3?
>>
>> When I patch it to ampi.C by using command "patch
>> src/libs/ck-libs/ampi/ampi.C  AMPI_Cart_shift.patch", it reports:
>>
>> patching file src/libs/ck-libs/ampi/ampi.C
>> Hunk #1 succeeded at 5158 (offset -185 lines).
>> Hunk #2 FAILED at 5198.
>> 1 out of 2 hunks FAILED -- saving rejects to file
>> src/libs/ck-libs/ampi/ampi.C.rej
>>
>> After I tried to read the patch file and fix it myself, I found that
>> there may be a lot of differences between the released version and the
>> development version of the charm source code.
>>
>> Could you help me build a patch for charm-6.1.3? Or tell me where I can
>> find a stable development version that I can patch myself?
>
> I'll send you the patch against 6.1.3 shortly, when I'm in the office.
>
> Phil
>
>
>>
>> Zhang Kai
>>
>> 2010/1/29 Phil Miller <mille121 AT illinois.edu>
>>>
>>> The bug has been fixed in the development version of charm. If you use
>>> a pre-built development binary, the fix will be in tonight's autobuild
>>> for whatever platform you use. If you're building it from development
>>> source yourself, the patch is attached. If you are using the released
>>> Charm 6.1.x, we can port that fix over for you if you're not
>>> comfortable doing so yourself.
>>>
>>> Phil
>>>
>>> 2010/1/28 张凯 <zhangk1985 AT gmail.com>:
>>> >
>>> > I think you got the problem I suffered from.
>>> >
>>> > Thanks for your reply.
>>> >
>>> > Best regards.
>>> >
>>> > Zhang Kai
>>> >
>>> > 2010/1/29 Phil Miller <mille121 AT illinois.edu>
>>> >>
>>> >> On Thu, Jan 28, 2010 at 07:50, 张凯 <zhangk1985 AT gmail.com> wrote:
>>> >> > Hi:
>>> >> >
>>> >> > I am a beginner with AMPI and am trying to run an MPI program
>>> >> > using it, but I found a little problem.
>>> >> >
>>> >> > Here
>>> >> > (http://www.mcs.anl.gov/research/projects/mpi/usingmpi/examples/advmsg/nbodypipe_c.htm)
>>> >> > you can find an example of an MPI program. I have successfully
>>> >> > built and run it using both MPICH and Intel MPI.
>>> >> >
>>> >> > However, when I run it with AMPI, I find that the program blocks
>>> >> > in the MPI_Waitall function and never returns.
>>> >> >
>>> >> > I just run it with the ++local +p2 +vp2 options. Did I miss other
>>> >> > options, or misconfigure AMPI?
>>> >>
>>> >> I'm seeing the same effect as you describe on a net-linux-x86_64 build
>>> >> of AMPI from the latest charm sources. We'll look into this and get
>>> >> back to you.
>>> >>
>>> >> For reference, the attached code (with added prints) produces the
>>> >> following:
>>> >>
>>> >> $ ./charmrun nbp +vp 4 20 +p4
>>> >> Charm++: scheduler running in netpoll mode.
>>> >> Charm++> cpu topology info is being gathered.
>>> >> Charm++> Running on 1 unique compute nodes (8-way SMP).
>>> >> Iteration 9
>>> >> Iteration 9:0 a
>>> >> Iteration 9
>>> >> Iteration 9:0 a
>>> >> Iteration 9
>>> >> Iteration 9:0 a
>>> >> Iteration 9
>>> >> Iteration 9:0 a
>>> >> Iteration 9:0 b
>>> >> Iteration 9:0 b
>>> >> Iteration 9:0 b
>>> >> Iteration 9:0 b
>>> >
>>> >
>>
>>
>
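For readers following the thread: the hang reported above occurs in a nonblocking ring exchange. Below is a minimal sketch of that pattern; it is not the attached test case or the linked nbodypipe source, and the buffer sizes, iteration count, and hand-rolled ring arithmetic are illustrative assumptions added here.

/* Minimal sketch (illustrative, not the attached test case) of the
 * nonblocking pipeline exchange described above as hanging in
 * MPI_Waitall under AMPI: each rank posts an MPI_Irecv from its left
 * neighbour, an MPI_Isend to its right neighbour, then waits on both
 * requests. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, size, iter;
    double sendbuf[4] = {0.0}, recvbuf[4] = {0.0};
    MPI_Request reqs[2];
    MPI_Status stats[2];

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Ring neighbours computed by hand for this sketch. The linked
     * nbodypipe example appears to derive its neighbours from the
     * communicator topology (the MPI_Cart_shift path that the attached
     * AMPI_Cart_shift.patch is named after); that path is not
     * exercised here. */
    int left  = (rank + size - 1) % size;
    int right = (rank + 1) % size;

    for (iter = 0; iter < 10; iter++) {
        MPI_Irecv(recvbuf, 4, MPI_DOUBLE, left,  0, MPI_COMM_WORLD, &reqs[0]);
        MPI_Isend(sendbuf, 4, MPI_DOUBLE, right, 0, MPI_COMM_WORLD, &reqs[1]);
        MPI_Waitall(2, reqs, stats);   /* the call reported to block */
    }

    if (rank == 0)
        printf("done\n");
    MPI_Finalize();
    return 0;
}

Under AMPI, a program like this would normally be compiled with the ampicc wrapper from the charm build and launched through charmrun, as in the "./charmrun nbp +vp 4 20 +p4" invocation quoted above.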


