
Re: [charm] [sync] entry method


  • From: Gengbin Zheng <zhenggb AT gmail.com>
  • To: Haowei Huang <huangh AT in.tum.de>
  • Cc: Phil Miller <mille121 AT illinois.edu>, charm AT cs.illinois.edu
  • Subject: Re: [charm] [sync] entry method
  • Date: Tue, 27 Jul 2010 14:56:59 -0500
  • List-archive: <http://lists.cs.uiuc.edu/pipermail/charm>
  • List-id: CHARM parallel programming system <charm.cs.uiuc.edu>



When B returns its message to A, it is not a fixed-size buffer of the kind you would see in MPI. From the user's point of view, chare A does not need to prepare a receive buffer to store the message from chare B; in Charm++, chare A literally gets the message object that B sent. You can put whatever data fields you need in the message, for example a field recording the buffer length, so that A knows how to handle the reply message.
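As a minimal sketch (with hypothetical names such as ReplyMsg, B, getData, bProxy, and B's member data), a [sync] entry method can return a varsize message whose payload length travels inside the message itself, so the caller never pre-posts a buffer:

  // reply.ci -- hypothetical interface file
  module reply {
    message ReplyMsg {
      double data[];          // varsize array; actual size chosen at allocation
    };
    array [1D] B {
      entry B();
      entry [sync] ReplyMsg* getData();
    };
  };

  // C++ side of the varsize message
  class ReplyMsg : public CMessage_ReplyMsg {
  public:
    int length;               // tells the caller how much data arrived
    double* data;             // varsize payload declared in the .ci file
  };

  // B's side: build and return the reply
  ReplyMsg* B::getData() {
    int n = numLocalValues;                 // however many values B holds
    ReplyMsg* msg = new (n) ReplyMsg;       // allocate room for n doubles
    msg->length = n;
    for (int i = 0; i < n; ++i) msg->data[i] = localValues[i];
    return msg;                             // delivered to the blocked caller
  }

  // A's side: must run in a [threaded] entry method, since the sync call blocks
  void A::fetchFromB() {
    ReplyMsg* reply = bProxy[0].getData();  // blocks until B's message arrives
    // reply->length says how large the payload is; no receive buffer needed
    delete reply;
  }

The point is that ReplyMsg carries its own length field, so A learns the size when the message arrives rather than having to ask for it beforehand.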

Gengbin

On Tue, Jul 27, 2010 at 9:31 AM, Haowei Huang <huangh AT in.tum.de> wrote:
 Hi,

      If I want to define a synchronous entry method on a chare, the return value must be a message. For example, chare A wants some information from chare B, so A calls a sync entry method of B which returns a message. Before this communication happens, A must know the size of the message that B wants to return, so A must first send a message requesting the size of the returned message, which might require another sync entry method of B. Must we define the communication pattern like that if we want to use a sync entry method? That implementation would be somewhat similar to an MPI communication pattern. Thanks a lot.

--
Haowei Huang
Ph.D. student
Technische Universitaet Muenchen
Institut fuer Informatik, I10
Boltzmannstr. 3
D-85748 Garching
Room 01.06.061
Phone: +49 (89) 289 18477
mailto: huangh AT in.tum.de

_______________________________________________
charm mailing list
charm AT cs.uiuc.edu
http://lists.cs.uiuc.edu/mailman/listinfo/charm





