
charm - Re: [charm] 1 more node than needed on Ranger



  • From: Junchao Zhang <junchao.zhang AT gmail.com>
  • To: charm AT cs.uiuc.edu
  • Subject: Re: [charm] 1 more node than needed on Ranger
  • Date: Wed, 10 Aug 2011 16:20:54 -0500
  • List-archive: <http://lists.cs.uiuc.edu/pipermail/charm>
  • List-id: CHARM parallel programming system <charm.cs.uiuc.edu>

I avoided the problem by building Charm++ on Ranger with ./build charm++ mpi-linux-x86_64 mpicxx --no-build-shared -O3,
and by substituting ibrun for mpirun in the generated charmrun script under the ChaNGa directory.
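For the record, a minimal sketch of that workaround (assuming the generated charmrun is a plain shell script, and using GNU sed's in-place flag):

# build the MPI-based Charm++ target on Ranger
./build charm++ mpi-linux-x86_64 mpicxx --no-build-shared -O3
# in the ChaNGa directory, swap mpirun for Ranger's ibrun launcher
sed -i 's/\bmpirun\b/ibrun/g' charmrun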

Thanks
--Junchao Zhang

On Wed, Aug 10, 2011 at 10:06 AM, Junchao Zhang <junchao.zhang AT gmail.com> wrote:
Hi,
  When I ran Charm++ on Ranger (TeraGrid), I ran into some problems. I built Charm++ with ./build charm++ net-linux-x86_64 ibverbs icc.
  I need 432 cores to run an application, ChaNGa. Each node has 16 cores, so 27 nodes should be enough, but Charm++ reports that it is running on 28 nodes.

charmnodes >nodelist
./charmrun ++scalable-start +p 432 ./ChaNGa $HOME/input/lambb.param  +balancer OrbLB
Charmrun> scalable start enabled.
Charmrun> IBVERBS version of charmrun
Warning> Randomization of stack pointer is turned on in kernel, thread migration may not work! 
Charm++: scheduler running in netpoll mode.
Charm++> Running on 28 unique compute nodes (16-way SMP).
...
Chip #22: 352 353 354 355 356 357 358 359 360 361 362 363 364 365 366 367 
Chip #23: 368 369 370 371 372 373 374 375 
Chip #24: 376 377 378 379 380 381 382 383
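A quick sanity check on the generated nodelist (a sketch, assuming the usual Charm++ nodelist format of one "host <name>" line per entry):

grep -c '^host' nodelist                    # total host entries handed to charmrun
grep '^host' nodelist | sort -u | wc -l     # distinct hosts actually listed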
 
Also, I see many messages like:
OrbLB Info> PE 6 with 52.525892 background load will have 0 object. 
OrbLB Info> PE 8 with 55.647333 background load will have 0 object. 
OrbLB Info> PE 10 with 85.024853 background load will have 0 object.
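If the balancer itself is suspect, one cheap experiment (a sketch only; GreedyLB is another strategy that ships with Charm++) is to rerun with a different +balancer argument:

./charmrun ++scalable-start +p 432 ./ChaNGa $HOME/input/lambb.param +balancer GreedyLB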

 I also find that the execution time is unreasonably long.
 Does this imply that something is wrong? Could someone help me?

Thanks a lot
-- Junchao Zhang





