Re: [charm] Building PETSc on top of Charm++'s AMPI


  • From: Phil Miller <mille121 AT illinois.edu>
  • To: Dominik Heller <dominik.heller1 AT gmail.com>
  • Cc: "charm AT cs.uiuc.edu" <charm AT cs.uiuc.edu>
  • Subject: Re: [charm] Building PETSc on top of Charm++'s AMPI
  • Date: Fri, 30 Jan 2015 12:36:10 -0600

I tried to do this several years ago, as part of a side project, but put it on the back burner after only a little bit of progress. Here are the (very terse) notes I made at the time:
  • ./configure --with-cc=$CB/ampicc --with-cxx=$CB/ampicxx --without-fc --CFLAGS='-default-to-aout -G' --CXXFLAGS='-default-to-aout -G' --CC_LINKER_FLAGS="-default-to-aout,-G" (here $CB is the Charm++ bin directory, where ampicc and ampicxx live)
  • Modified the configure system to use int main(int argc, char** argv) instead of int main(void), for compatibility with AMPI_Main's prototype (a probe in that form is sketched just after this list)
  • Huge volume of warnings about redefinition of MPI_* functions between our mpi.h and PETSc's headers, since both rewrite the same names with the preprocessor. Disabled PETSc's ifdef-guarded call-counting instrumentation to avoid this (a schematic of the clash follows the list)
  • Disabled MPI_Init_thread usage, since AMPI defines no MPI_THREAD_FUNNELED (a guarded fallback is sketched below)
  • Disabled MPI_Win_create in packages/MPI.py, since AMPI doesn't fully implement it
  • Modified the basic configuration test in MPI.py to look for AMPI_Init and AMPI_Comm_create, with no options for additional libraries; otherwise it would try to pick up the system's libmpi.a, which caused all sorts of problems (an illustrative probe is the last sketch below)
  • A single-rank test passed; the first 2-rank test failed.
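
To illustrate the main() change: AMPI arranges for the user's main to become AMPI_Main, whose prototype takes argc/argv, so configure probes written as int main(void) fail to compile under ampicc. A minimal probe in the compatible form (the body is illustrative, not PETSc's actual test):

    #include <mpi.h>

    /* Declaring argc/argv even when unused matches AMPI_Main's
       prototype; int main(void) breaks once the build has rewritten
       main into AMPI's entry point. */
    int main(int argc, char **argv)
    {
        MPI_Init(&argc, &argv);
        MPI_Finalize();
        return 0;
    }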
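
For context on the redefinition warnings: both our mpi.h and PETSc's logging layer rewrite MPI_* names with the preprocessor, so each wrapped function ends up defined twice. A self-contained schematic of the clash (every identifier below is an illustrative stand-in, not the real header contents):

    #include <stdio.h>

    static long send_count = 0;
    static int AMPI_Send(int x) { return x; }

    /* AMPI-style renaming of the standard name: */
    #define MPI_Send AMPI_Send

    /* A call-counting wrapper later clobbers the same name; the
       compiler warns that MPI_Send is redefined, once per wrapped
       function, hence the huge volume of warnings: */
    #define MPI_Send(x) (send_count++, AMPI_Send(x))

    int main(void)
    {
        MPI_Send(42);
        printf("sends counted: %ld\n", send_count);
        return 0;
    }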
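
The MPI_Init_thread usage could also be kept behind a guard instead of removed outright. A sketch, assuming MPI_THREAD_FUNNELED is visible to the preprocessor when it exists (true for some mpi.h implementations but not all, so treat the probe as an approximation):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
    #ifdef MPI_THREAD_FUNNELED
        /* Full MPI: request funneled threading support. */
        int provided;
        MPI_Init_thread(&argc, &argv, MPI_THREAD_FUNNELED, &provided);
    #else
        /* AMPI, which defines no MPI_THREAD_FUNNELED: plain init. */
        MPI_Init(&argc, &argv);
    #endif
        MPI_Finalize();
        return 0;
    }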
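
The modified check in MPI.py amounts to an autoconf-style link probe against AMPI's own entry points, so that success cannot come from accidentally resolving the system's libmpi.a. Roughly (the char prototypes are the usual link-test trick, not AMPI's real signatures):

    /* Compiling and linking this succeeds only if the AMPI
       libraries actually provide both symbols. */
    char AMPI_Init(void);
    char AMPI_Comm_create(void);

    int main(int argc, char **argv)
    {
        return (int)AMPI_Init() + (int)AMPI_Comm_create();
    }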

If this is of substantial interest to you, we can probably arrange to have a member of our lab work with you a bit further to get PETSc working on AMPI.



On Fri, Jan 30, 2015 at 11:20 AM, Dominik Heller <dominik.heller1 AT gmail.com> wrote:
Hi,

I'm trying to get PETSc to build on top of Charm++'s Adaptive MPI.
So far I've had no success trying a variety of parameters for the PETSc configuration script.
Will this be possible at all, given that AMPI implements only a subset of MPI 1.1 and brings its own suite of compile scripts as well as charmrun?

Thanks,
Dominik