charm - [charm] MPI_Barrier(MPI_COMM_WORLD) equivalent



  • From: Evghenii Gaburov <e-gaburov AT northwestern.edu>
  • To: "charm AT cs.uiuc.edu" <charm AT cs.uiuc.edu>
  • Subject: [charm] MPI_Barrier(MPI_COMM_WORLD) equivalent
  • Date: Fri, 23 Sep 2011 02:15:11 +0000

Dear All,

As a new user who is porting his MPI code to Charm++, I have the following
question:

I have a snippet of code in which a chare requests data from remote chares, and
those chares need to send data back to the requesting chare. There is no way to
know in advance how many such requests a given chare will receive. In other
words, a given chare may need to export (different) data to many remote chares
that ask for it, and it does not know how many remote chares will request the
data.
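
To make this concrete, here is a rough sketch of the pattern I have in mind
(the Worker array, the DataMsg type, and the helper functions below are made
up, just to illustrate the exchange):

    // .ci (illustrative only):
    //   array [1D] Worker {
    //     entry Worker();
    //     entry void requestData(int requester);
    //     entry void receiveData(DataMsg *msg);
    //   };

    void Worker::requestData(int requester) {
      // Serve an incoming request: export the piece of data this
      // requester asked for (packDataFor() is a made-up helper).
      thisProxy[requester].receiveData(packDataFor(requester));
    }

    void Worker::receiveData(DataMsg *msg) {
      // Import the data. A chare cannot know in advance how many of
      // these messages it will receive, so it cannot simply count
      // replies down to zero.
      store(msg);   // made-up helper
      delete msg;
    }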

For logical consistency it is not possible to proceed with further
computations until all of the requested data has been imported/exported. This
leads me to a global barrier, the equivalent of which,
MPI_Barrier(MPI_COMM_WORLD), I use in my MPI code (there does not seem to be a
way around such a global barrier, since this step establishes the
communication graph between MPI tasks, or between chares in Charm++, which is
later used for point-to-point communication).
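
In the MPI code this step looks roughly as follows (a sketch;
exchange_requests_and_replies() and next_phase() stand for my actual routines):

    exchange_requests_and_replies();   // point-to-point sends/receives
    MPI_Barrier(MPI_COMM_WORLD);       // nobody starts the next phase early
    next_phase();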

Regretfully, I have failed to find an optimal way to issue such a barrier.
Using CkCallbackResumeThread() won't work, because the calling code (in the
MainChare, from a [threaded] entry) also sends messages to remote chares with
requests to import/export data, and those chares in turn recursively send
messages to other remote chares to export data until a closure condition is
satisfied (the depth of recursion is 2 or 3 calls to the same function).
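
That is, the usual idiom only covers the first round of messages (a sketch;
workers and startExchange() are placeholders for my actual proxy and entry
method):

    // Inside the [threaded] entry method of the MainChare:
    workers.startExchange(CkCallbackResumeThread());
    // The thread suspends here until the callback is invoked (e.g. by a
    // reduction over the array), but the recursively generated export
    // messages may still be in flight at that point.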

For now I use CkWaitQD() in the MainChare as a global synchronization point. I
was wondering whether there is a more elegant way to issue a barrier, so that
all previously issued messages have completed before proceeding further.
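
My current approach looks roughly like this (a sketch; startExchange() and
proceedWithComputation() are placeholders):

    // Inside the [threaded] entry method of the MainChare:
    workers.startExchange();    // kick off the recursive request/export phase
    CkWaitQD();                 // suspend until quiescence: no messages in
                                // flight and none being processed anywhere
    proceedWithComputation();   // now it is safe to continue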

Thanks!

Cheers,
Evghenii






